I'm trying to use glm::unProject() to convert my SDL mouse coordinates into a world-position vector on the x/z plane. Essentially, I want to figure out which x/z coordinate the user clicked on with the mouse.
From other Stack Overflow answers I gathered that I need to call glm::unProject(). I think I'm passing it the wrong arguments, because the world-position values I get back (printed to std::cerr) aren't what I would expect.
Am I constructing the arguments to glm::unProject() correctly below? Specifically, should I be combining the camera's world position with the view matrix (computed using glm::lookAt) to produce the modelview matrix passed into glm::unProject()?
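For reference, here is my mental model of what glm::unProject() does internally, paraphrased from the GLM source (so treat it as a sketch rather than the exact implementation): it maps the window coordinates into [-1, 1] NDC, multiplies by the inverse of projection * modelview, and does the perspective divide.

glm::vec4 tmp(win.x, win.y, win.z, 1.0f);
tmp.x = (tmp.x - viewport[0]) / viewport[2]; // window x -> [0, 1]
tmp.y = (tmp.y - viewport[1]) / viewport[3]; // window y -> [0, 1]
tmp = tmp * 2.0f - 1.0f;                     // [0, 1] -> [-1, 1] NDC (w stays 1)
glm::vec4 obj = glm::inverse(projection * modelview) * tmp;
obj /= obj.w;                                // perspective divide
// obj.xyz is the unprojected point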
#include &lt;iostream&gt;
#include &lt;glm/glm.hpp&gt;
#include &lt;glm/gtc/matrix_transform.hpp&gt; // glm::lookAt, glm::perspective, glm::unProject
#include &lt;glm/gtx/string_cast.hpp&gt;      // glm::to_string

struct Dimensions {
    int x, y, w, h;
};

glm::mat4
Camera::view_matrix() const
{
    // The VIEW matrix is created by looking from the camera's world
    // position at some target member.
    auto const& target = target_->translation;
    auto const position_xyz = world_position();
    glm::vec3 const UP{0, 1, 0};
    return glm::lookAt(position_xyz, target, UP);
}

glm::mat4
Camera::projection_matrix() const
{
    auto const fov = glm::radians(90.0f);
    return glm::perspective(fov, 4.0f / 3.0f, 0.1f, 200.0f);
}

glm::vec3
calculate_worldpos(Camera const& camera, int const mouse_x, int const mouse_y)
{
    float const width = 1024.0f, height = 768.0f;
    glm::vec4 const viewport = glm::vec4(0.0f, 0.0f, width, height);
    glm::mat4 const modelview = camera.view_matrix();
    glm::mat4 const projection = camera.projection_matrix();

    // Flip y, since SDL's origin is the top-left corner and OpenGL's is the bottom-left.
    float const z = 0.0f;
    glm::vec3 const screenPos = glm::vec3(mouse_x, height - mouse_y - 1, z);
    std::cerr << "screenpos: xyz: '" << glm::to_string(screenPos) << "'\n";

    glm::vec3 const worldPos = glm::unProject(screenPos, modelview, projection, viewport);
    std::cerr << "worldpos: xyz: '" << glm::to_string(worldPos) << "'\n";
    return worldPos;
}
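For context, this is roughly how I call it from my SDL event loop (a simplified sketch, not my exact code):

SDL_Event event;
while (SDL_PollEvent(&event)) {
    if (event.type == SDL_MOUSEBUTTONDOWN) {
        // event.button.x/y are window coordinates with the origin at the top-left
        glm::vec3 const world = calculate_worldpos(camera, event.button.x, event.button.y);
        // ... I then want world.x and world.z ...
    }
}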
In the image below, I have the following setup:
camera lookAt target = (0, 0, 0)
camera world position = (-0.009, 5.107, -0.368)
(mouse_x, mouse_y, mouse_z) = (286, 393, 0)
If you look at the image below, you can see that my mouse is hovering over the world position (3, 0, 0), as shown by the grid. I would expect calculating the world position of my mouse to return the vector (3, 0, 0). Instead, I get the vector (0.049, 5.007, -0.360), which is suspiciously close to my camera's world position of (-0.009, 5.107, -0.368).
Does anyone see where I might be going wrong? I assume I'm making an incorrect assumption somewhere.
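In case it's relevant, one direction I've been considering is to unproject the cursor at both depth 0 and depth 1 and intersect the resulting ray with the ground plane y = 0, but I don't know whether that's necessary or whether a single unprojected point should be enough. An untested sketch (sx and sy stand for the same flipped screen coordinates as screenPos above):

// Untested sketch: build a picking ray from the near/far unprojections
// and intersect it with the plane y = 0.
glm::vec3 const near_pt = glm::unProject(glm::vec3(sx, sy, 0.0f), modelview, projection, viewport);
glm::vec3 const far_pt  = glm::unProject(glm::vec3(sx, sy, 1.0f), modelview, projection, viewport);
glm::vec3 const dir     = glm::normalize(far_pt - near_pt);
float const t = -near_pt.y / dir.y;           // ray parameter where y == 0
glm::vec3 const ground  = near_pt + t * dir;  // should be (3, 0, 0) in my example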