5
votes

I noticed that after running the eigs() function multiple times, it gives a different but approximately equal result each time.

Is there a way to make it return the same result every time? The output sometimes comes with a "+" sign and sometimes with a "-" sign.

Content of M:

[2, 1]  =  1.0
[3, 1]  =  0.5
[1, 2]  =  1.0
[3, 2]  =  2.5
[1, 3]  =  0.5
[2, 3]  =  2.5

M = M+M'
(d, v) = eigs(M, nev=1, which=:LR)
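
For reference, a self-contained sketch of this setup (the construction of M is not shown above, so the lines below are an assumption: the listed entries are treated as the final symmetric matrix, which is why the M = M+M' step is skipped; on Julia 1.0 and later eigs comes from the Arpack.jl package, while older versions provided it without an import):

using SparseArrays   # sparse()
using Arpack         # eigs() (not needed on older Julia versions)

# Symmetric sparse matrix with the entries listed above
rows = [2, 3, 1, 3, 1, 2]
cols = [1, 1, 2, 2, 3, 3]
vals = [1.0, 0.5, 1.0, 2.5, 0.5, 2.5]
M = sparse(rows, cols, vals)

# Eigenpair with the largest real part; the eigenvalue matches the
# posted output below (about 2.892), but the sign of the eigenvector
# can differ from run to run.
d, v = eigs(M, nev=1, which=:LR)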

I tried running the same function on the same sparse matrix in Python. Although the matrix looks a bit different, I think it is the same; the indices just start at 0 in Python, whereas in Julia they start at 1. I do not know if that is a big difference. The values are approximately the same in Julia and Python, but in Python they are identical after every evaluation. Also, the values returned in Python are complex numbers, while in Julia they are real.

Python code:

from scipy.sparse import linalg

eigenvalue, eigenvector = linalg.eigs(M.T, k=1, which='LR')

Content of M.T:

(1, 0)  1.0
(2, 0)  0.5
(0, 1)  1.0
(2, 1)  2.5
(0, 2)  0.5
(1, 2)  2.5

Any idea why this behavior is occurring?

Edit:

These are the results of four evaluations of eigs:

==========eigvalues==============
[2.8921298144977587]
===========eigvector=============
[-0.34667468634025667
-0.679134250677923
-0.6469878912367839]
=================================

==========eigvalues==============
[2.8921298144977596]
===========eigvector=============
[0.34667468634025655
0.6791342506779232
0.646987891236784]
=================================

==========eigvalues==============
[2.8921298144977596]
===========eigvector=============
[0.34667468634025655
0.6791342506779233
0.6469878912367841]
=================================

==========eigvalues==============
[2.8921298144977583]
===========eigvector=============
[0.3466746863402567
0.679134250677923
0.646987891236784]
=================================
I tried raising the number of iterations. I am not so concerned that the numbers are slightly different. I am concerned about the "-" sign. In Python, no matter how many times I run it, there are never "-" signs in the result for this input data. - M.Puk
Correction: after the edit, it seems the number of iterations is not the problem. The algorithm is probably oscillating towards the solution; sometimes it approaches it from above and sometimes from below, and it stops once maximum precision is reached (the difference becomes smaller than eps, i.e. unmeasurable). - maraca
I'm fairly sure that eigs is supposed to be deterministic. Eventually, the goal is for eigs to be in pure Julia, but I think it still depends on ARPACK, which has caused problems in the past. Relevant Google Groups reading and issues pages are here and here. As suggested at those pages, older versions of ARPACK are a bit buggy, and all the fixes may not have propagated through yet depending on your distro. This might be the source of your problem... - Colin T Bowers
The eigenvectors of a matrix are only defined up to a scalar. That is, if v is an eigenvector, so are 2v and -v. The big variations in the output above are due to this freedom of choice in the algorithm. If the eigenvectors are normalized (scaled to have norm 1 and, say, a positive first component), then the vectors should be the same. Caveat: there is some more freedom when eigenvalues coincide (have multiplicity greater than one). - Dan Getz
If you file an issue on the Julia issue tracker, this will get fixed: github.com/JuliaLang/julia/issues. As @maraca said, this isn't wrong, it's just annoying that it's non-deterministic. - StefanKarpinski
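
A minimal sketch of the sign normalization Dan Getz describes in the comment above (fixsign is a hypothetical helper, not something eigs provides):

# Hypothetical helper: eigenvectors are only defined up to a scalar, so
# flip the vector whenever its first nonzero component is negative.
function fixsign(v::AbstractVector{<:Real})
    i = findfirst(x -> x != 0, v)
    return (i === nothing || v[i] > 0) ? v : -v
end

d, v = eigs(M, nev=1, which=:LR)
v = fixsign(vec(v))   # eigs returns an n×1 matrix; vec() flattens it

With this normalization, the four eigenvectors printed in the edit above would all come out with the same (positive) signs.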

1 Answer

6
votes

The result of eigs depends on the initial vector for the Lanczos iterations. When it is not specified, it is random, so even though all the returned vectors are correct, the phase is not guaranteed to be the same across runs.

If you want the result to be the same every time, you can set v0 in eigs, e.g.

eigs(M, nev=1, which=:LR, v0 = ones(3))

As long as v0 doesn't change you should get deterministic results.
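
For instance (continuing from the question's setup), two calls with the same starting vector should then return the same eigenpair, sign included:

v0 = ones(3)
d1, v1 = eigs(M, nev=1, which=:LR, v0=v0)
d2, v2 = eigs(M, nev=1, which=:LR, v0=v0)
d1 == d2 && v1 == v2   # expected to hold, since the iteration now starts from a fixed vector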

Note that if you want a deterministic result for testing purposes, you might want to consider a testing scheme that allows for phase shifts, since the phase can flip under the smallest perturbations. For example, if you link a different BLAS or change the number of threads, the result might change again.
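
A sketch of such a phase-tolerant comparison, assuming unit-norm vectors as returned by eigs (same_up_to_phase, v1 and v2 are hypothetical names; v1 and v2 stand for eigenvectors from two separate runs):

using LinearAlgebra: dot, norm

# Check that two eigenvectors agree up to a global phase factor
# (for real vectors this is just a possible sign flip).
function same_up_to_phase(v, w; tol=1e-8)
    c = dot(v, w)            # ≈ phase * ||v||^2 when w ≈ phase * v
    phase = c / abs(c)       # unit-modulus factor, ±1 in the real case
    return norm(v .- w ./ phase) <= tol
end

same_up_to_phase(vec(v1), vec(v2))   # should hold across repeated eigs runs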