
I am trying to learn how to use numpy to determine eigenvalues and eigenvectors in a simple example, but the results do not look correct. Here is my code:

import numpy as np
import numpy.linalg as la

# create the matrix
matrix = np.array([[-2, 1, 0], [1, -2, 1], [0, 1, -2]])
print("Matrix:\n", matrix)

# calculate the eigenvalues and vectors
vals, vecs = np.linalg.eigh(matrix)

# print the eigenvalues and vectors
print("vals:\n", vals)
print("vecs:\n", vecs)

# get the eigenvectors
v1 = vecs[:,0]
v2 = vecs[:,1]
v3 = vecs[:,2]

print("v1:", v1)
print("v2:", v2)
print("v3:", v3)

# compute dot
dot1 = np.dot(matrix, v1)
dot2 = np.dot(matrix, v2)
dot3 = np.dot(matrix, v3)

# is the dot collinear to the eigenvectors?
print("dot1 / v1", dot1 / v1)
print("dot2 / v2", dot2 / v2)
print("dot3 / v3", dot3 / v3)

Here is the output:

Matrix:
 [[-2  1  0]
 [ 1 -2  1]
 [ 0  1 -2]]
vals:
 [-3.41421356 -2.         -0.58578644]
vecs:
 [[  5.00000000e-01  -7.07106781e-01  -5.00000000e-01]
 [ -7.07106781e-01   4.88509860e-17  -7.07106781e-01]
 [  5.00000000e-01   7.07106781e-01  -5.00000000e-01]]
v1: [ 0.5        -0.70710678  0.5       ]
v2: [ -7.07106781e-01   4.88509860e-17   7.07106781e-01]
v3: [-0.5        -0.70710678 -0.5       ]
dot1 / v1 [-3.41421356 -3.41421356 -3.41421356]
dot2 / v2 [-2.         -4.54534541 -2.        ]
dot3 / v3 [-0.58578644 -0.58578644 -0.58578644]

When I calculate the eigenvectors with an online calculator (http://www.arndt-bruenner.de/mathe/scripts/engl_eigenwert2.htm) I get:

Real Eigenvalues: { -3.414213562373095 ; -2 ; -0.585786437626905 }

Eigenvectors:

for Eigenvalue -3.414213562373095: [ 1 ; -1.4142135623730954 ; 1 ]

for Eigenvalue -2: [ -1 ; 0 ; 1 ]

for Eigenvalue -0.585786437626905: [ 1 ; 1.4142135623730954 ; 1 ]

The eigenvalues match but the eigenvectors don't match.

Questions:

  1. Is numpy scaling the eigenvectors?
  2. How can I show that (matrix dot eigenvector) is collinear to the eigenvector, to prove that I have the correct eigenvector? I was thinking that dividing the dot product by the eigenvector would show a constant ratio, but that doesn't happen for eigenvector 2. That said, eigenvector 2 doesn't really look correct to me when I compare numpy to the online calculator. Is this a correct way to show collinearity?
You are dividing dot2 by v2 element-wise. The second elements of dot2 and v2 are approximately 0 (I get approximately -2.6e-16 and 2.08e-17, respectively). If the computations were exact, those elements would be exactly 0, and your division would not be defined. So you are dividing small numerical noise by small numerical noise, resulting in amplified noise. – Warren Weckesser
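A more robust way to check each eigenpair, which avoids the element-wise division entirely, is to compare `matrix @ v` against `eigenvalue * v` with `np.allclose` (a sketch):

```python
import numpy as np

matrix = np.array([[-2, 1, 0], [1, -2, 1], [0, 1, -2]])
vals, vecs = np.linalg.eigh(matrix)

# For each eigenpair, matrix @ v should equal eigenvalue * v up to
# floating-point noise; np.allclose handles the near-zero entries safely.
for i in range(3):
    v = vecs[:, i]
    print(i, np.allclose(matrix @ v, vals[i] * v))  # True for every i
```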

2 Answers


Numpy is calculating the eigenvectors and eigenvalues correctly. You can check this (answering your question 2) by running:

print(np.dot(vecs,np.dot(np.diag(vals),vecs.T)) - matrix)
print(np.dot(vecs,vecs.T))

The first output tells you how closely the eigenvalue decomposition reconstructs the original matrix. The second output shows that the eigenvectors are orthonormal. Together, these two conditions are exactly what the eigendecomposition of a symmetric matrix must satisfy.
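The same two checks can be made pass/fail with `np.allclose`, rather than eyeballing the printed residuals (a sketch):

```python
import numpy as np

matrix = np.array([[-2, 1, 0], [1, -2, 1], [0, 1, -2]])
vals, vecs = np.linalg.eigh(matrix)

# Reconstruction: V @ diag(vals) @ V.T should recover the original matrix
print(np.allclose(vecs @ np.diag(vals) @ vecs.T, matrix))  # True
# Orthonormality: V @ V.T should be the identity
print(np.allclose(vecs @ vecs.T, np.eye(3)))               # True
```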

Question 1: Yes, numpy normalizes the eigenvectors to unit length:

for i in range(3):
    print(la.norm(vecs[:, i]))

Note: You can also confirm that the eigenvectors from the two methods are equivalent by calculating their dot product:

print(np.dot(vecs[:,0],[ -1, 0, 1 ]))

is (approximately) zero, while

np.dot(vecs[:,0],[ 1, -1.4142135623730954, 1 ]) 

is 2, which is the product of the two vectors' lengths (1 × 2), as expected for parallel vectors.
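Putting those two dot products together, and allowing for the fact that eigh may return an eigenvector with its sign flipped, the comparison with the online calculator's (unnormalized) vectors can be written as a sketch:

```python
import numpy as np

matrix = np.array([[-2, 1, 0], [1, -2, 1], [0, 1, -2]])
vals, vecs = np.linalg.eigh(matrix)

# Eigenvector of a *different* eigenvalue: dot product ~ 0 (orthogonal)
print(np.isclose(np.dot(vecs[:, 0], [-1, 0, 1]), 0))  # True

# Same eigenvalue: |dot product| equals the product of the lengths
# (1 * 2 = 2), i.e. the vectors are parallel, possibly up to sign
print(np.isclose(abs(np.dot(vecs[:, 0], [1, -1.4142135623730954, 1])), 2))  # True
```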


The documentation for numpy.linalg.eig (and likewise numpy.linalg.eigh, which you are using) states that it returns

The normalized (unit “length”) eigenvectors

So yes, the vectors are normalized.

The eigenvector for eigenvalue -2 has a second component very close to zero. If you treat that component as zero, the answer is correct and is collinear with the eigenvector you got from the online calculator.

One easy way to check if two 3D vectors are collinear is to take their cross product. If the resulting vector is the zero vector or very close to it, the vectors are collinear.
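A sketch of that cross-product test, applied to the disputed second eigenvector:

```python
import numpy as np

matrix = np.array([[-2, 1, 0], [1, -2, 1], [0, 1, -2]])
vals, vecs = np.linalg.eigh(matrix)

v2 = vecs[:, 1]
# The cross product of two collinear 3D vectors is the zero vector,
# so (matrix @ v2) x v2 should vanish up to floating-point noise.
cross = np.cross(np.dot(matrix, v2), v2)
print(np.allclose(cross, 0))  # True: matrix @ v2 is collinear with v2
```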