
I am finally graduating to working with my own matrices in OpenGL and am having a bit of an issue setting up an orthographic projection. At the moment, I have a method which takes the left, right, top, and bottom and throws them into a matrix like this (from here):

2(right-left)   0               0                   -right + (left * right) - left
0               2(top-bottom)   0                   -top + (top * bottom) - bottom
0               0               -2(farVal-nearVal)  -far + (far * near) - near
0               0               0                   1

I then multiply that matrix by the projection matrix, and the output makes no sense at all. This leads me to believe that I am 1) not using the proper orthographic matrix, 2) not multiplying the orthographic matrix by the correct matrix, or 3) both.

I doubt it is 2 because whenever I worked with the deprecated functions, glOrtho was called when the current matrix was GL_PROJECTION. So how exactly do you calculate the orthographic matrix from the left, right, top, and bottom values?

I think you need 2 / width, 2 / height (you seem to be multiplying rather than dividing here, unless I've missed something). – Robinson
I feel stupid now... I finally found a wiki page on the orthographic projection matrix, and that is one of the problems. The other is what datenwolf said. – CoderTheTyler

1 Answer


Well, technically you don't have to multiply your ortho matrix with anything. Using the fixed-function pipeline you'd normally load an identity matrix on the projection matrix stack, so multiplying the ortho matrix with that would leave just the ortho matrix on the projection matrix stack.
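The identity-multiplication point above can be sketched in plain C (a minimal illustration, not OpenGL itself; the `mat4_mul` helper name and layout are my own assumptions):

```c
#include <assert.h>

/* out = a * b for column-major 4x4 matrices (element m[col*4 + row]),
 * the layout OpenGL uses.  If b is the identity, out is exactly a --
 * which is why multiplying the ortho matrix onto a freshly loaded
 * identity leaves just the ortho matrix on the stack. */
static void mat4_mul(float out[16], const float a[16], const float b[16])
{
    for (int col = 0; col < 4; ++col)
        for (int row = 0; row < 4; ++row) {
            float sum = 0.0f;
            for (int k = 0; k < 4; ++k)
                sum += a[k * 4 + row] * b[col * 4 + k];
            out[col * 4 + row] = sum;
        }
}
```

Multiplying by anything other than identity changes the result, which is why loading the matrix directly is the safer habit.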

In your case, whatever you multiply the ortho matrix with is probably wrong (i.e. not identity). Suggestion: don't multiply, just load it.
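For reference, here is a sketch of the standard glOrtho-style matrix built in C, with the divisions the question's version was missing. It assumes column-major storage, the layout glLoadMatrixf and glUniformMatrix4fv (with transpose = GL_FALSE) expect; the `ortho` helper name is my own:

```c
#include <assert.h>

/* Build a column-major 4x4 orthographic projection matrix
 * (element m[col*4 + row]), matching the glOrtho formula. */
static void ortho(float m[16],
                  float left, float right,
                  float bottom, float top,
                  float nearVal, float farVal)
{
    /* Zero everything first so the untouched elements are defined. */
    for (int i = 0; i < 16; ++i) m[i] = 0.0f;

    /* Scale: note the division -- 2/(right-left), not 2*(right-left). */
    m[0]  =  2.0f / (right - left);
    m[5]  =  2.0f / (top - bottom);
    m[10] = -2.0f / (farVal - nearVal);

    /* Translation lives in the last column in column-major storage. */
    m[12] = -(right + left) / (right - left);
    m[13] = -(top + bottom) / (top - bottom);
    m[14] = -(farVal + nearVal) / (farVal - nearVal);
    m[15] = 1.0f;
}
```

With left = 0, right = 800, bottom = 0, top = 600, near = -1, far = 1 this maps pixel coordinates to the [-1, 1] clip cube, the same result `glOrtho(0, 800, 0, 600, -1, 1)` would produce.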