7 votes

Given a set of N points in 3D space, I am trying to find the best-fitting plane using SVD and Eigen.

My algorithm is:

  1. Center the data points around (0,0,0).
  2. Form a 3xN matrix of the point coordinates.
  3. Calculate the SVD of the matrix.
  4. Take the left singular vector corresponding to the smallest singular value as the normal of the plane.
  5. Set the distance from the origin to the plane as normal∙centroid.

I can't figure out how to use Eigen's SVD module to find the left singular vector corresponding to the smallest singular value of the point coordinates matrix.

So far I have this code (steps 1, 2 and 5 of the algorithm):

Eigen::Matrix<float, 3, 1> mean = points.rowwise().mean();
const Eigen::Matrix3Xf points_centered = points.colwise() - mean;

int setting = Eigen::ComputeThinU | Eigen::ComputeThinV;
Eigen::JacobiSVD<Eigen::Matrix3Xf> svd = points_centered.jacobiSvd(setting);

Eigen::Vector3f normal = ???;  // <-- how do I get this from svd?

double d = normal.dot(mean);
What code have you tried so far? - Avi Ginsburg
Added code to the question. - dustymax
See slides 97-99 of this presentation. - ggael
@ggael I would like to use SVD; I read that it should be more precise in some cases. - dustymax

2 Answers

3 votes

Denoting U = svd.matrixU(), the vectors U.col(0) and U.col(1) define a basis of your plane, and U.col(2) is the normal to your plane.

U.col(0) also defines the direction with the greatest standard deviation.

You should use the flag ComputeFullU instead of ComputeThinU to have the correct dimensions even if your points are coplanar.
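For example, a minimal sketch along those lines (assuming, as in the question, that points is an Eigen::Matrix3Xf holding one point per column; this is my reading of the answer, not verbatim code from it):

#include <Eigen/Dense>

// Center the points on their centroid so the fitted plane passes through the origin.
Eigen::Vector3f mean = points.rowwise().mean();
Eigen::Matrix3Xf centered = points.colwise() - mean;

// ComputeFullU makes U a full 3x3 matrix, so its third column exists even for degenerate inputs.
Eigen::JacobiSVD<Eigen::Matrix3Xf> svd(centered, Eigen::ComputeFullU);

// Eigen sorts singular values in decreasing order, so the last column of U is the
// left singular vector of the smallest singular value, i.e. the plane normal.
Eigen::Vector3f normal = svd.matrixU().col(2);
float d = normal.dot(mean);  // plane equation: normal.dot(x) = d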

1 vote

Your problem is basically how to do a least-squares fit using Eigen's JacobiSVD module. Here's a link with a more helpful example. The basic idea of least-squares fitting is to first take the vector difference of each of the other N-1 points with one of the N points, and then approximate all of those N-1 difference vectors as linear combinations of two basis vectors, which define the 2D plane.
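A rough sketch of that idea with Eigen (again assuming points is an Eigen::Matrix3Xf with one point per column; the variable names are illustrative and not taken from the linked example):

#include <Eigen/Dense>

// Differences of the remaining N-1 points against the first point: a 3 x (N-1) matrix.
Eigen::Matrix3Xf diffs = points.rightCols(points.cols() - 1).colwise() - points.col(0);

// The two dominant left singular vectors span the best-fit plane through points.col(0);
// the remaining one is orthogonal to both and can serve as the plane normal.
Eigen::JacobiSVD<Eigen::Matrix3Xf> svd(diffs, Eigen::ComputeFullU);
Eigen::Vector3f basis1 = svd.matrixU().col(0);
Eigen::Vector3f basis2 = svd.matrixU().col(1);
Eigen::Vector3f normal = svd.matrixU().col(2);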