I have an OpenGL program that uses the OpenGL 3.2 Core profile on Mac OS X and OpenGL ES 2.0 on iOS.
Part of my application renders point sprites by writing to gl_PointSize in the vertex shader. Unfortunately, the points render roughly 50x larger under OpenGL 3.2 than under OpenGL ES 2.0. The documentation for both APIs states that gl_PointSize specifies the point size in pixels, so I am unsure why this would be the case. Is there perhaps some default OpenGL state that modifies the output of gl_PointSize? Is there anything else that could cause such a vast difference in size?
Both platforms use exactly the same shader (the desktop build relies on ARB_ES2_compatibility). I have also verified that all uniform inputs are identical and that both render at the same resolution.
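
For reference, a stripped-down sketch of the kind of vertex shader involved; the names a_position, u_mvp, and u_pointSize are placeholders rather than my exact code:

```c
/* Shared vertex shader source (illustrative sketch only).
 * It compiles under ES 2.0 and, via ARB_ES2_compatibility,
 * under the 3.2 Core context as well. */
static const char *pointSpriteVS =
    "uniform mat4  u_mvp;\n"
    "uniform float u_pointSize;\n"
    "attribute vec4 a_position;\n"
    "void main() {\n"
    "    gl_Position  = u_mvp * a_position;\n"
    "    gl_PointSize = u_pointSize; // documented as a size in pixels\n"
    "}\n";
```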
Outside of the shader, the only point-sprite-related call I make is glEnable(GL_VERTEX_PROGRAM_POINT_SIZE). On each platform, taken independently, I can adjust the point size just fine.
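
Roughly, the draw path looks like this (a sketch; the program handle, uniform location, and point count are placeholders, and the glEnable call is desktop-only since GL_VERTEX_PROGRAM_POINT_SIZE does not exist in ES 2.0, where the shader-written size is always in effect):

```c
/* Desktop (OpenGL 3.2 Core) path: opt in to the shader-written point size. */
glEnable(GL_VERTEX_PROGRAM_POINT_SIZE);

glUseProgram(pointSpriteProgram);          /* placeholder program handle   */
glUniform1f(u_pointSizeLocation, 16.0f);   /* placeholder uniform location */
glDrawArrays(GL_POINTS, 0, pointCount);    /* placeholder point count      */
```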