I have a camera that needs to orbit locally around an object. The object has an arbitrary orientation, described by a normal vector. Imagine a spherical planet, with a camera looking down at a certain triangle on that planet's surface.
My current implementation is to use the classic vector-crossing method to generate a rotation matrix from the triangle's normal, then use that matrix as the basis for a standard orbit camera. This works fine near the equator of the planet, but once the camera gets near the poles it starts blowing up, behaving increasingly erratically the closer it gets to the very center of the pole.
I've determined that this is due to the first vector cross: the two input vectors are nearly parallel in that case, so the cross product degenerates toward zero length (I'm not sure what the technical name for the phenomenon is). If the first vector is (0, 1, 0), the craziness happens when the normal is close to (0, 1, 0) or (0, -1, 0).
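For reference, the degenerate case itself is easy to detect with a dot product; a minimal sketch, with an arbitrary cutoff chosen by eye:

// The cross product collapses as triNormal approaches +/- origin,
// i.e. as |dot(triNormal, origin)| approaches 1.
float alignment = Math.Abs(Vector3.Dot(triNormal, origin));
bool nearSingularity = alignment > 0.99f; // threshold is arbitrary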
I've found quite a few descriptions of this problem, but no working solutions. The closest I've come was here: http://xboxforums.create.msdn.com/forums/p/13278/13278.aspx It mentions that to handle the 'singularity', you should use a different reference vector when it is detected. I can easily determine when the camera is on a planet face that will cause this to happen (my planet sphere is generated from 6 quadtrees projected to spherical coordinates), but there is a very noticeable snap when I switch to the new vector.
Here's the current code:
Vector3 triNormal;                 // the current normal of the target vertex
Vector3 origin = Vector3.Forward;  // fixed reference vector
Matrix orientation = Matrix.Identity;
orientation.Forward = origin;
orientation.Up = triNormal;
// First cross: this is what degenerates when triNormal is nearly parallel to origin.
orientation.Right = Vector3.Cross(orientation.Up, orientation.Forward);
orientation.Right.Normalize();
// Second cross re-orthogonalizes Forward against the new Right/Up pair.
orientation.Forward = Vector3.Cross(orientation.Right, orientation.Up);
orientation.Forward.Normalize();
I've experimented with detecting when triNormal is on one of the pole faces and setting 'origin' to something else, such as Vector3.Right. The camera then behaves properly once it is on that face, but it immediately snaps to a new rotation as it crosses over. This makes sense, since its reference vector has just changed, but the snap needs to be eliminated for a smooth user experience. I tried to counteract the new coordinate system by offsetting the orbit camera's yaw, but the offset doesn't seem to be a constant value; it depends on where on the sphere the camera is currently aiming, and I can't work out how to calculate the difference (my best guess is sketched below).
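Since both bases share the same up vector (triNormal), I assume the two candidate orientations differ only by a rotation about that up vector, so the correction should be the signed angle between the two forward vectors. Something like the following, where oldOrigin and newOrigin are the two reference vectors; this is a sketch of what I think I need, not working code:

// Build the forward vector each reference vector would produce,
// using the same construction as the code above.
Vector3 fwdOld = Vector3.Normalize(Vector3.Cross(
    Vector3.Normalize(Vector3.Cross(triNormal, oldOrigin)), triNormal));
Vector3 fwdNew = Vector3.Normalize(Vector3.Cross(
    Vector3.Normalize(Vector3.Cross(triNormal, newOrigin)), triNormal));

// Signed angle from fwdOld to fwdNew around triNormal; presumably this
// would be applied as a yaw offset when swapping reference vectors.
float cosA = Vector3.Dot(fwdOld, fwdNew);
float sinA = Vector3.Dot(Vector3.Cross(fwdOld, fwdNew), triNormal);
float yawOffset = (float)Math.Atan2(sinA, cosA);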
Also note that since this is XNA and C#, I'm using a right-handed coordinate system.