I want to compute the camera's pitch and yaw values from the position and target points the camera receives when it is instantiated. If I initialize pitch to 0 and yaw to -90 (as in the LearnOpenGL Camera Tutorial), the camera suddenly jumps when the first rotation occurs; after that, rotation works correctly. So, first, starting from the equations given in that tutorial, I tried to derive pitch and yaw from the forward vector.
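For reference, the tutorial builds the camera's front vector from yaw and pitch roughly like this (quoted from memory, so the exact sign conventions may differ slightly; frontFromAngles is just a name I use for this post):

#include <cmath>
#include <glm/glm.hpp>

// Front vector as the LearnOpenGL camera tutorial constructs it (from memory).
glm::vec3 frontFromAngles(float pitchDeg, float yawDeg)
{
    glm::vec3 front;
    front.x = std::cos(glm::radians(yawDeg)) * std::cos(glm::radians(pitchDeg));
    front.y = std::sin(glm::radians(pitchDeg));
    front.z = std::sin(glm::radians(yawDeg)) * std::cos(glm::radians(pitchDeg));
    return glm::normalize(front);
}

Inverting those equations, I got: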
float pitch = glm::degrees(glm::asin(forward.y));
float yaw = glm::degrees(glm::acos(forward.x / glm::cos(glm::asin(forward.y))));
But also:
float yaw = glm::degrees(glm::asin(forward.z / glm::cos(glm::asin(forward.y))));
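Put together, the first attempt looks like this (a minimal, self-contained sketch; anglesFromForward is just a name made up for this post, and forward is assumed to be already normalised):

#include <cmath>
#include <cstdio>
#include <glm/glm.hpp>

// First attempt: invert the tutorial's equations to recover pitch and the
// two candidate yaw values from a normalised forward vector.
void anglesFromForward(const glm::vec3& forward)
{
    float pitch    = glm::degrees(std::asin(forward.y));
    float cosPitch = std::cos(std::asin(forward.y)); // must not be 0
    float yawFromX = glm::degrees(std::acos(forward.x / cosPitch));
    float yawFromZ = glm::degrees(std::asin(forward.z / cosPitch));
    std::printf("pitch = %f, yawFromX = %f, yawFromZ = %f\n", pitch, yawFromX, yawFromZ);
}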
And cos(asin(forward.y)) must not be 0, so forward.y must not be 1 or -1. I implemented both versions, but the two yaw values are not the same; the first yaw value seems to be what I am looking for, but the pitch does not.

After that, I tried a simpler approach: taking pitch to be the angle between the forward vector and the y-axis, and yaw to be the angle between the forward vector and the x-axis, I computed (on paper) the following:
const glm::vec3 yUnit(0, 1, 0);
const glm::vec3 xUnit(1, 0, 0);
float pitch = glm::degrees(glm::acos(glm::dot(forward, yUnit)));
float yaw = glm::degrees(glm::acos(glm::dot(forward, xUnit)));
With the following two inputs: position = (0, 0, 0), target = (0, 1, 2.5), forward = target - position = (0, 1, 2.5), normalised forward ≈ (0, 0.371, 0.928), the results are pitch ≈ 68 degrees and yaw = 90 degrees. Also, I printed the pitch and yaw values inside the application, and the values I expect are pitch = -20 and yaw = -90.
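Here is a minimal repro of that second approach with exactly those inputs (main and the variable names are only for this post):

#include <cstdio>
#include <glm/glm.hpp>

int main()
{
    const glm::vec3 position(0.0f, 0.0f, 0.0f);
    const glm::vec3 target(0.0f, 1.0f, 2.5f);
    const glm::vec3 forward = glm::normalize(target - position); // ~ (0, 0.371, 0.928)

    const glm::vec3 yUnit(0.0f, 1.0f, 0.0f);
    const glm::vec3 xUnit(1.0f, 0.0f, 0.0f);

    // Angles between the forward vector and the y and x unit vectors.
    float pitch = glm::degrees(glm::acos(glm::dot(forward, yUnit))); // prints ~68.2
    float yaw   = glm::degrees(glm::acos(glm::dot(forward, xUnit))); // prints 90
    std::printf("pitch = %f, yaw = %f\n", pitch, yaw);
    return 0;
}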
Can you explain whether I am wrong, and if so, why and where I made the mistake?