
I'm trying to port an old SDL game I wrote some time ago to OpenGL 3.3+, and I'm having a few issues getting a proper orthographic matrix and its corresponding view.

    glm::mat4 proj = glm::ortho( 0.0f, static_cast<float>(win_W), static_cast<float>(win_H), 0.0f,-5.0f, 5.0f);

If I simply use this projection and try to render say a quad inside the boundaries of win_W and win_H, it works. However, if instead I try to simulate a camera using:

   glm::mat4 view = glm::lookAt(
    glm::vec3(0.0f, 0.0f, 1.0f),//cam pos
    glm::vec3(0.0f, 0.0f, 0.0f),//looking at
    glm::vec3(0.0f, 0.0f, 1.0f)//floored
);

I get nothing, even when positioning the quad in the center. The same happens if I instead center the view:

  glm::mat4 view = glm::lookAt(
    glm::vec3(static_cast<float>(win_W), static_cast<float>(win_H), 1.0f),//cam pos
    glm::vec3(static_cast<float>(win_W), static_cast<float>(win_H), 0.0f),//looking at
    glm::vec3(0.0f, 0.0f, 1.0f)//floored
);

Considering that SDL-style 2D code usually subtracts the camera position from the vertices to simulate a view, is it simply better to replace my MVP operation in the shader like this:

    #version 150

    in vec2 position;
    in vec3 color;

    out vec3 Color;

    uniform mat4 model;
    uniform mat4 view;
    uniform mat4 projection;
    uniform vec4 camera; // camera offset (a vec4, so the subtraction below is valid)

    void main()
    {
        Color = color;
        //faulty view
        //gl_Position = projection * view * model * vec4(position, 0.0, 1.0);
        //new SDL style camera
        vec4 newPosition = projection * model * vec4(position, 0.0, 1.0);
        gl_Position = newPosition - camera;
    }

Unfortunately the source I'm learning from (the ArcSynthesis book) doesn't cover a view that involves a coordinate system other than the default NDC, and for some reason even today most discussion about OpenGL is about deprecated versions. So, in short: what's the best way of setting up the view for an orthographic projection matrix for simple 2D rendering in modern OpenGL?

I wouldn't bother using lookAt for 2D rendering; a translation and scale matrix is much simpler and works just as well. – Colonel Thirty Two

I've been playing about with the same problem off and on for the last few days. Is there any real benefit in moving from, say, OpenGL 2.1 to OpenGL 3.x for 2D games? – Zammalad

1 Answer


Your lookAt call actually does not make sense at all, since your up vector (0,0,1) is collinear with the viewing direction (0,0,-1). This will not produce a usable matrix.
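To see why, note that a lookAt implementation builds the camera's right vector as the cross product of the viewing direction and the up vector; when the two are collinear, that cross product is the zero vector and the resulting basis is degenerate. A minimal sketch of the arithmetic (plain C++, no glm; the helper names are mine):

```cpp
#include <array>
#include <cassert>
#include <cmath>

using Vec3 = std::array<float, 3>;

// Cross product, as a lookAt implementation uses to build the right vector.
Vec3 cross(const Vec3& a, const Vec3& b) {
    return { a[1]*b[2] - a[2]*b[1],
             a[2]*b[0] - a[0]*b[2],
             a[0]*b[1] - a[1]*b[0] };
}

float length(const Vec3& v) {
    return std::sqrt(v[0]*v[0] + v[1]*v[1] + v[2]*v[2]);
}

// For the camera in the question: forward = normalize(center - eye) = (0,0,-1),
// up = (0,0,1). They are collinear, so cross(forward, up) is the zero vector.
```

With up = (0,1,0) instead, the cross product has unit length and the basis is well-defined.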

You probably want (0,1,0) as the up vector. However, even if you change that, you might be surprised by the result: in combination with the projection matrix you set up, the point you specified as the lookAt target will not appear in the center, but at a corner of the screen. Your projection does not map (0,0) to the center, which one typically assumes when using a LookAt function.
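You can check where the projection sends points by applying the same mapping an ortho(0, W, H, 0, near, far) matrix performs on x and y. A sketch using the standard orthographic formula (plain C++, glm not required; the 800x600 window size in the usage below is an assumption):

```cpp
#include <cassert>
#include <utility>

// NDC mapping performed by an ortho(0, W, H, 0, near, far) projection:
// x: [0, W] -> [-1, 1], and y: [0, H] -> [1, -1] (the y axis is flipped,
// so window (0,0) lands at NDC (-1, 1), the top-left corner).
std::pair<float, float> orthoToNdc(float x, float y, float W, float H) {
    return { 2.0f * x / W - 1.0f, 1.0f - 2.0f * y / H };
}
```

So the lookAt target (0,0) ends up in a corner, while only the window center (W/2, H/2) maps to NDC (0,0).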

I agree with Colonel Thirty Two's comment: don't use LookAt in this scenario unless you have a very good reason to.
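Following that advice, a 2D camera is just a view matrix that translates by the negated camera position, multiplied in as usual (projection * view * model) rather than subtracted in the shader. A minimal sketch without glm (glm::translate(glm::mat4(1.0f), glm::vec3(-camX, -camY, 0.0f)) builds the same matrix; the helper names are mine):

```cpp
#include <array>
#include <cassert>

using Mat4 = std::array<float, 16>; // column-major, as glm and OpenGL expect

// View matrix for a 2D camera at (camX, camY): translate the world
// by the opposite of the camera position.
Mat4 makeCameraView(float camX, float camY) {
    return { 1.0f,  0.0f,  0.0f, 0.0f,
             0.0f,  1.0f,  0.0f, 0.0f,
             0.0f,  0.0f,  1.0f, 0.0f,
            -camX, -camY,  0.0f, 1.0f };
}

// Apply the matrix to a point (x, y, 0, 1); returns the transformed x and y.
std::array<float, 2> apply(const Mat4& m, float x, float y) {
    return { m[0]*x + m[4]*y + m[12],
             m[1]*x + m[5]*y + m[13] };
}
```

For example, with the camera at (100, 50), a vertex at world position (120, 80) ends up at (20, 30) before projection, which is exactly the SDL-style camera subtraction, done where it belongs: in the view matrix.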