0
votes

I have a set of linear algebraic equations in matrix form, Ax = By, where A is a 36x20 matrix, x is a vector of size 20, B is 36x13, and y is 13x1. Rank(A) = 20. Because the system is overdetermined (there are more equations than unknowns), a least-squares solution is possible, i.e., x = (A^T A)^-1 A^T B y. I want the solution such that the residual error e = Ax - By is minimized.

Using the Eigen/Dense library in C++, I have set up all the matrices. I tried the method described on this page: Eigen Tutorial!

I guess the method described on that page is only for square matrices, because when I try to run this it gives an error.

 x = A.jacobiSvd( ComputeThinU | ComputeThinV ).solve(B*y);

Error

 /usr/include/eigen3/Eigen/src/SVD/JacobiSVD.h: In member function 'const    
 Eigen::internal::solve_retval<Eigen::JacobiSVD<MatrixType, QRPreconditioner>, Rhs> 
 Eigen::JacobiSVD<MatrixType, QRPreconditioner>::solve(const 
 Eigen::MatrixBase<OtherDerived>&) const [with Rhs = 
 Eigen::GeneralProduct<Eigen::Matrix<float, 36, 13>, Eigen::Matrix<double, -1, 1>, 4>; 
 _MatrixType = Eigen::Matrix<float, 36, 20>; int QRPreconditioner = 2]':
 /usr/include/eigen3/Eigen/src/SVD/JacobiSVD.h:658:5: warning: control reaches end of  
 non-void function [-Wreturn-type]
 make[2]: *** [src/CMakeFiles/spacebot_actuationKinematics.dir 
 /ActuationKinematics.cpp.o] Error 1
 make[1]: *** [src/CMakeFiles/spacebot_actuationKinematics.dir/all] Error 2
 make: *** [all] Error 2
2
The page favors the QR decomposition. Did you try it? It also works with rectangular matrices A for least squares, as does the SVD method. Your error is from a missing return statement in some special case of the reported function; check the bug reports for Eigen. - Lutz Lehmann
What do you mean by y? The LS method finds the parameters of a linear combination of the observed x values that best fits the observed y values, so you substitute the x and y values into the equation. Do you have 36 observations? - 4pie0
@LutzL Yes, I tried it. Here is the output; the assertion rows() == cols() failed: ActuationKinematicsTest: /usr/include/eigen3/Eigen/src/LU/Inverse.h:334: const Eigen::internal::inverse_impl<Derived> Eigen::MatrixBase<Derived>::inverse() const [with Derived = Eigen::Matrix<double, 36, 20>]: Assertion `rows() == cols()' failed. Aborted (core dumped) - Wafeeq
Yes, of course, that is to be expected if you try to compute the inverse matrix using the LU factorization; that is only possible for regular square matrices. Use QR as the fast factorization and SVD as the slower but more stable one, and do not compute inverse matrices, only solve linear systems. - Lutz Lehmann
This is a rather strong statement. If A has full rank and a reasonable condition number, QR is numerically as stable as SVD. LU only comes close if full, expensive pivoting is employed. If the matrix A is rank-deficient or nearly so, you will need SVD to decide on the effective rank. But any solution of a linear system will depend dramatically on the threshold where you decide whether a singular value is only small or effectively zero. - Lutz Lehmann

2 Answers

0
votes

It seems that it is having a problem with the matrix multiplication in your solve(B*y); part. Try computing B*y separately and using solve(result); instead.

Eigen::GeneralProduct<Eigen::Matrix<float, 36, 13>, Eigen::Matrix<double, -1, 1>, 4>

That line gave me this suspicion. It says that the y variable came in with a size of -1x1, so your program won't run no matter what, since it can't multiply with the matrix.

Also, the tutorial says...

A.jacobiSvd(ComputeThinU | ComputeThinV).solve(b) << endl;

I don't know exactly how Eigen works, but that seems to be the problem.

0
votes

As explained in the documentation, the ComputeThin* options are only for dynamic-size matrices. For fixed sizes, you must use ComputeFull*. Nevertheless, in your case it is better to use dynamic-size matrices, i.e., MatrixXf. Fixed-size matrices only make sense for very small ones.

Finally, ColPivHouseholderQR is probably a better choice for least-squares solving; SVD is a bit overkill.