I am looking to find, or rather to build, a matrix X of common eigenvectors between 2 matrices A and B, such that:

AX = Xa, with "a" the diagonal matrix of corresponding eigenvalues
BX = Xb, with "b" the diagonal matrix of corresponding eigenvalues

where A and B are square and diagonalizable matrices.
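To make the target concrete, here is a minimal toy sketch (random 3x3 matrices, not my actual Fisher matrices) of what I mean: both A and B are built from the same eigenvector basis X, so X is a common eigenvector matrix for both.

% Toy sketch: build two matrices sharing the eigenvector basis X.
% With eigenvectors stored as columns of X, the relation reads A*X = X*a.
X = orth(randn(3));      % common eigenvector basis (orthonormal here)
a = diag([1 2 3]);       % eigenvalues of A
b = diag([4 5 6]);       % eigenvalues of B
A = X*a/X;               % A = X*a*inv(X)
B = X*b/X;               % B = X*b*inv(X)
norm(A*X - X*a)          % ~0, so A*X = X*a
norm(B*X - X*b)          % ~0, so B*X = X*b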
I took a look at a similar post but did not manage to reach a conclusion, i.e. to get valid results when I build the final wanted endomorphism F defined by F = P D P^-1. I have also read the Wikipedia topic and this interesting paper, but couldn't extract from them a method easy enough to implement.
In particular, I am interested in Matlab's eig(A,B) function. I tried to use it like this:
% Search for a common eigenvector basis between FISH_sp and FISH_xc
[V, D] = eig(FISH_sp, FISH_xc);
% Diagonalize inv(B)*A: if A*x = lambda*B*x, then inv(B)*A*x = lambda*x
[eigenv, eigen_final] = eig(FISH_xc \ FISH_sp);   % backslash instead of inv()
% Compute the final endomorphism: F = P*D*inv(P)
FISH_final = V * eigen_final / V                  % /V instead of *inv(V); the stray eye(7).* was a bug
But the matrix FISH_final doesn't give good results: when I carry out further computations from this matrix FISH_final (this is actually a Fisher matrix), those results are not valid. So surely I must have made an error in the code snippet above. At first, I prefer to get this working in Matlab, as a prototype; afterwards, if it works, I will look at redoing this synthesis with MKL or with Python functions. Hence also tagging python.

How can I build these common eigenvectors and also find the associated eigenvalues? I am a little lost among all the potential methods that exist to carry this out.
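In the meantime, here is the residual check I use (a sketch with my FISH_sp/FISH_xc names) to test whether the columns of V are really common eigenvectors of both matrices:

% Sketch: measure eigen-residuals of each generalized eigenvector.
% v is a common eigenvector of A and B only if both residuals are ~0.
A = FISH_sp;  B = FISH_xc;
[V, D] = eig(A, B);
for k = 1:size(V, 2)
    v  = V(:, k) / norm(V(:, k));   % normalized candidate vector
    ra = norm(A*v - (v'*A*v)*v);    % residual w.r.t. A (Rayleigh quotient)
    rb = norm(B*v - (v'*B*v)*v);    % residual w.r.t. B
    fprintf('v%d: resA = %.3e, resB = %.3e\n', k, ra, rb);
end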
A screen capture (not reproduced here) showed that the kernel of the commutator has to contain more than the null vector.
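In Matlab, that condition can be tested directly with null (a sketch; note that with noisy numerical Fisher matrices the computed kernel may well be empty because of rounding):

% Sketch: any common eigenvector of A and B must lie in ker(A*B - B*A),
% so a trivial kernel rules out exact common eigenvectors.
C = FISH_sp*FISH_xc - FISH_xc*FISH_sp;  % commutator [A,B]
K = null(C);                            % orthonormal basis of the kernel
if isempty(K)
    disp('ker([A,B]) = {0}: no exact common eigenvector exists');
else
    disp(K)                             % candidate common-eigenvector directions
end

Note that lying in the kernel is only a necessary condition: a kernel vector still has to be checked as an eigenvector of A and of B separately.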
EDIT 1: On Maths Stack Exchange, someone advised using the Singular Value Decomposition (SVD) of the commutator [A,B]:

"If v is a common eigenvector, then ‖(AB−BA)v‖ = 0. The SVD approach gives you a unit vector v that minimizes ‖(AB−BA)v‖ (with the constraint that ‖v‖ = 1)."
So, in Matlab, I extract the approximate eigenvectors V from:
[U,S,V] = svd(A*B-B*A)
Is there a way to increase the accuracy, i.e. to minimize ‖(AB−BA)v‖ as much as possible?
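For now, what I do (a sketch, still with my FISH_sp/FISH_xc names) is take the right singular vector attached to the smallest singular value, since it is exactly the unit minimizer of ‖Cv‖, and then measure the residual:

% Sketch: the last column of V (smallest singular value) is the unit
% vector minimizing ||C*v||, where C = A*B - B*A is the commutator.
C = FISH_sp*FISH_xc - FISH_xc*FISH_sp;
[U, S, V] = svd(C);
v = V(:, end);                  % minimizer of ||C*v|| subject to ||v|| = 1
fprintf('min ||C*v|| = %.3e (smallest singular value = %.3e)\n', ...
        norm(C*v), S(end, end));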
IMPORTANT REMARK: Maybe some of you didn't fully understand my goal. Concerning the common basis of eigenvectors, I am looking for a combination (vector or matrix) of V1 and V2, or directly for the use of the null operator on the 2 input Fisher matrices, to build this new basis "P" in which, with eigenvalues other than the known D1 and D2 (noted D1a and D2a), we could have:

F = P (D1a + D2a) P^-1
To compute the new Fisher matrix F, I need to know P, assuming that D1a and D2a are respectively equal to the diagonal matrices D1 and D2 (coming from the diagonalization of the A and B matrices).
If I know the common basis of eigenvectors P, I could deduce D1a and D2a from D1 and D2, couldn't I?
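Here is the construction I have in mind once P is known (a sketch; D1a and D2a are just the representations of A and B in the basis P, so in exact arithmetic this F reduces to A + B):

% Sketch: given a common eigenvector basis P, represent A and B in that
% basis and build F = P*(D1a + D2a)*inv(P).
D1a = P \ (FISH_sp * P);        % ~diagonal if P really diagonalizes FISH_sp
D2a = P \ (FISH_xc * P);        % ~diagonal if P really diagonalizes FISH_xc
F   = P * (D1a + D2a) / P;      % the new Fisher matrix
norm(F - (FISH_sp + FISH_xc))   % sanity check: ~0 in exact arithmetic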
The 2 Fisher matrices are available at these links:
Comments:

"eig? I don't see where eig(A,B) gives common eigenvectors/values of A and B." – Argyll

"[V,D] = eig(A,B) returns diagonal matrix D of generalized eigenvalues and full matrix V whose columns are the corresponding right eigenvectors, so that A*V = B*V*D. How to make the link, between the 2 matrices, from this relation A*V = B*V*D to a common eigenvector basis? Regards" – youpilat13

"A*V = B*V*D and common eigenvectors? I can write an answer highlighting Matlab's eigenvector methods. The answer will work on small matrices; otherwise I do not wish to devise an efficient algorithm on the spot. As far as I can tell, there is no standard numerical method to find common eigenvectors. If you are trying to understand Matlab, perhaps what I suggest would help. Otherwise, I think either you need to provide an algorithm or this question needs to be asked on a different Stack Exchange site." – Argyll