
I want to compute the camera pitch and yaw values based on the position and target points the camera gets when it is instantiated. If I initialize pitch to 0 and yaw to -90 (as in the LearnOpenGL Camera Tutorial), then when the first rotation occurs, the camera suddenly jumps; after that, rotation works correctly. So, starting from the equations given in that tutorial, I first tried to get the pitch and yaw values from the forward vector:

float pitch = glm::degrees(glm::asin(forward.y));
float yaw = glm::degrees(glm::acos((float)x/glm::cos(glm::asin(y))));

But also

float yaw = glm::degrees(glm::asin((float)z/glm::cos(glm::asin(y))));

And cos(asin(y)) should not be 0, so y should not be 1 or -1. I tried to implement those, but the two values of yaw are not the same; also, the first value of yaw seems to be what I am looking for, but the pitch value does not. After that I tried a simpler approach: knowing that pitch is the angle between the forward vector and the y-axis, and yaw is the angle between the forward vector and the x-axis, I tried to compute (on paper) the following:

const glm::vec3 yUnit(0, 1, 0);
const glm::vec3 xUnit(1, 0, 0);
float pitch = glm::degrees(glm::acos(glm::dot(forward, yUnit)));
float yaw = glm::degrees(glm::acos(glm::dot(forward, xUnit)));

With the following two inputs: position = (0, 0, 0) and target = (0, 1, 2.5), so forward = target - position = (0, 1, 2.5) and normalised forward ~ (0, 0.37, 0.926), the results are pitch ~ 68 degrees and yaw = 90 degrees. I also printed the pitch and yaw values inside the application, and the expected values should be pitch = -20 and yaw = -90.

Can you explain where and why I made the mistake?


2 Answers

1 vote

I can't point at any single line to change, but here are some suggestions:

when the first rotation occurs, the camera suddenly jumps

This is a sign that something is not being initialized to what you think it is. In this case, the pitch and yaw are derived values that can be calculated from the forward vector, right? When you have derived values you should never initialise them directly, because there's a chance you'll get it wrong. If two values that are supposed to be "the same" are different, weird things will happen. Instead initialise the forward vector and immediately calculate pitch and yaw from it.
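For example, a minimal sketch of what I mean, assuming the LearnOpenGL-style convention where yaw = -90 looks down -Z (the class and member names are just illustrative):

#include <cmath>
#include <glm/glm.hpp>

struct Camera {
    glm::vec3 forward;
    float pitch; // degrees
    float yaw;   // degrees

    Camera(const glm::vec3& position, const glm::vec3& target)
        : forward(glm::normalize(target - position))
    {
        // Derive the angles from forward instead of hard-coding 0 / -90,
        // so the first rotation starts from values that match the initial view.
        pitch = glm::degrees(glm::asin(forward.y));
        yaw   = glm::degrees(std::atan2(forward.z, forward.x));
    }
};

That way the stored angles and the stored forward vector can never disagree at start-up.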

And cos(asin(y)) should not be 0

We all make this mistake from time to time. But in this case I don't think it matters for reasons later on.

However, you might want to test for what happens when the forward vector is (0, 0, 0). It's surprisingly common to somehow get an all zero vector in graphics programming.
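Something like this before normalising (a sketch; the fallback direction is an arbitrary assumption):

glm::vec3 delta = target - position;
// glm::normalize of a zero vector produces NaNs, so guard against it.
if (glm::dot(delta, delta) < 1e-12f)
    delta = glm::vec3(0.0f, 0.0f, -1.0f); // fall back to some default direction
glm::vec3 forward = glm::normalize(delta);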

float yaw = glm::degrees(glm::acos((float)x/glm::cos(glm::asin(y))));

Have you decided what kind of Euler angle representation you are using? In the simple case you never need to use more than one single axis coordinate value to calculate an angle, but here you are using two.

The simple case is where yaw (heading), pitch, and roll are independent angles, so yaw doesn't change if pitch does and vice versa. Your last code block with the xUnit and yUnit vectors seems to be doing that.

However, in flight simulators and aerospace calculations yaw, pitch, and roll are a bit more complicated because they're not independent. The yaw angle might be measured in the plane of the pitch, not in the absolute XZ plane. And aerospace yaw is often measured from "north" (the Z axis), not from the X axis. So you need to be clear about what kind of yaw and pitch you are measuring, and be consistent throughout. And you need to study any textbook examples or code to figure out how they're using pitch, yaw, and roll and whether it's consistent with yours.
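To make the difference concrete, here is a rough sketch of the two measuring conventions, assuming forward is normalised and not pointing straight up or down:

// Yaw measured in the absolute XZ plane from the +X axis (the simple case):
float yawFromX = glm::degrees(std::atan2(forward.z, forward.x));

// Aerospace-style heading measured from "north", i.e. the +Z axis:
float yawFromZ = glm::degrees(std::atan2(forward.x, forward.z));

The same forward vector gives two different numbers, which is why you have to pick one convention and stick with it.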

I suggest sticking to the simple single coordinate measures for now.

float pitch = glm::degrees(glm::acos(glm::dot(forward, yUnit)));

Again, are you sure forward is normalised? LookAt is usually coded to be more forgiving than math libraries, so it's an easy mistake to make.

And, have you checked what your math library does with values outside -180 to 180 degrees? One more thing to worry about.
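If the library or your own code does expect a particular range, a tiny helper can wrap angles back into it (a sketch, not part of GLM):

// Wrap an angle in degrees into the (-180, 180] range.
float wrapDegrees(float angle) {
    angle = std::fmod(angle + 180.0f, 360.0f);
    if (angle <= 0.0f)
        angle += 360.0f;
    return angle - 180.0f;
}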

Hope this helps. If you find Euler angles to be fiddly and annoying, you are not alone! That's why many 3D books and tutorials recommend learning about quaternions.

1 vote

Based on your expected values, it seems like your pitch should vary between -90 and 90 degrees, so you should use asin instead of acos for pitch.

When calculating the yaw, you need to project the forward vector onto the XZ plane. You can do this by setting the y component to zero and then normalizing.

The yaw will need to vary between 0 and 360 degrees, but acos only returns values between 0 and 180 degrees. The yaw will be correct for the first 180 degrees, but as yaw increases from 180 to 360, the acos result will decrease from 180 back to zero. When the dot product between the forward vector and (0, 0, 1) is greater than zero, the yaw is greater than 180 degrees, so the acos value should be adjusted by subtracting it from 360.

Based on your expected values, my guess for the correct yaw and pitch calculation is:

pitch = degrees(-asin(dot(forward, y_unit)))
forward.y = 0
forward = normalize(forward)
yaw = degrees(acos(dot(forward, x_unit)))
if(dot(forward, z_unit) > 0)
    yaw = 360 - yaw
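
In GLM terms that could look roughly like this (a sketch, not tested against your camera code):

const glm::vec3 xUnit(1, 0, 0), yUnit(0, 1, 0), zUnit(0, 0, 1);

float pitch = glm::degrees(-glm::asin(glm::dot(forward, yUnit)));

// Project forward onto the XZ plane before measuring yaw.
glm::vec3 flat = glm::normalize(glm::vec3(forward.x, 0.0f, forward.z));
float yaw = glm::degrees(glm::acos(glm::dot(flat, xUnit)));
if (glm::dot(flat, zUnit) > 0.0f)
    yaw = 360.0f - yaw; // acos only covers 0..180, so mirror the back half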