12 votes

I have two rotation matrices (let's call them A and B) where:

A = 1  0  0
    0  0 -1
    0  1  0

and

B = -1  0  0
     0  0 -1
     0 -1  0

This is basically just a rotation where the camera spins around to look behind itself. Obviously I can't just interpolate the matrix entries directly, because the result looks weird (the in-between matrices aren't rotations at all). I have tried converting each matrix to Euler angles, which gives two sets of X, Y, Z angles, and picking which set to interpolate based on the minimum distance between the corresponding angle components. That does produce the kind of rotation I want, but I can't find a reliable way to choose which set of angles to use: sometimes the set with the least error produces a rotation about the wrong axis or axes. I also tried quaternions, but that gave me essentially the same result. Can anyone point me in the right direction?
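(For illustration, a minimal numpy sketch, not part of the original question, showing why blending the matrix entries directly fails: the halfway blend of A and B is singular, so it isn't a rotation at all.)

```python
import numpy as np

A = np.array([[1, 0,  0],
              [0, 0, -1],
              [0, 1,  0]], dtype=float)

B = np.array([[-1, 0,  0],
              [ 0, 0, -1],
              [ 0, -1,  0]], dtype=float)

# Naive element-wise blend at t = 0.5
M = 0.5 * A + 0.5 * B
print(M)                 # [[0,0,0],[0,0,-1],[0,0,0]]
print(np.linalg.det(M))  # 0.0 -- a valid rotation matrix has determinant 1
```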


2 Answers

25 votes

Use quaternions (SLERP). Neither rotation matrices nor Euler angles are appropriate for interpolation.

See 45:05 here (David Sachs, Google Tech Talk).
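A minimal sketch of this suggestion, assuming SciPy's scipy.spatial.transform module is available (Rotation and Slerp; A and B are the matrices from the question):

```python
import numpy as np
from scipy.spatial.transform import Rotation, Slerp

A = np.array([[1, 0,  0],
              [0, 0, -1],
              [0, 1,  0]], dtype=float)
B = np.array([[-1, 0,  0],
              [ 0, 0, -1],
              [ 0, -1,  0]], dtype=float)

# Key orientations at t = 0 and t = 1
key_rots = Rotation.from_matrix(np.stack([A, B]))
slerp = Slerp([0.0, 1.0], key_rots)

# Interpolated orientation at any t in [0, 1]
for t in np.linspace(0.0, 1.0, 5):
    R_t = slerp([t]).as_matrix()[0]  # still a proper rotation matrix
    print(t)
    print(np.round(R_t, 3))
```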

10 votes

My personal opinion is that using quaternions for this type of thing makes more sense. That said, you can do it without using quaternions.

The thing to notice is that the "difference" matrix, i.e. the matrix which takes "orientation" A into "orientation" B, can be calculated as T = A.transpose() * B (assuming you are multiplying on the right). Once you have the rotation matrix T, you can convert it to an axis-angle representation (see for instance http://en.wikipedia.org/wiki/Axis-angle_representation).
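A small sketch of that step, assuming numpy and SciPy's Rotation for the matrix-to-axis-angle conversion (rolling your own conversion is possible too, but needs care near 180°, which is exactly the case here):

```python
import numpy as np
from scipy.spatial.transform import Rotation

A = np.array([[1, 0,  0],
              [0, 0, -1],
              [0, 1,  0]], dtype=float)
B = np.array([[-1, 0,  0],
              [ 0, 0, -1],
              [ 0, -1,  0]], dtype=float)

# Relative rotation that takes orientation A to orientation B
T = A.T @ B

# Axis-angle (rotation vector) representation of T
rotvec = Rotation.from_matrix(T).as_rotvec()  # axis * angle
angle = np.linalg.norm(rotvec)
axis = rotvec / angle
print(axis, np.degrees(angle))                # here: a 180-degree rotation
```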

Finally, since you now know a rotation axis which takes A to B, you can linearly interpolate the angle from zero to the angle calculated from T.

This is equivalent to using SLERP.
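Putting it together, a sketch under the same assumptions (numpy + SciPy's Rotation; the helper name interpolate is mine) that scales the angle from 0 to the full angle of T and rebuilds the in-between orientations:

```python
import numpy as np
from scipy.spatial.transform import Rotation

A = np.array([[1, 0,  0],
              [0, 0, -1],
              [0, 1,  0]], dtype=float)
B = np.array([[-1, 0,  0],
              [ 0, 0, -1],
              [ 0, -1,  0]], dtype=float)

# Relative rotation from A to B, as an axis-angle rotation vector
rotvec = Rotation.from_matrix(A.T @ B).as_rotvec()

def interpolate(t):
    """Orientation at fraction t in [0, 1]: apply a scaled-down
    version of the relative rotation on top of A."""
    return A @ Rotation.from_rotvec(t * rotvec).as_matrix()

for t in (0.0, 0.25, 0.5, 0.75, 1.0):
    print(t)
    print(np.round(interpolate(t), 3))

# interpolate(0) recovers A and interpolate(1) recovers B
# (up to floating-point error), and every in-between matrix
# stays a proper rotation -- the same result SLERP gives.
```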