I'm trying to implement a nearest neighbor search for points using OpenGL and GLSL shaders. The NN calculation works correctly, and the result is drawn into a 1024x1024 texture (using a viewport of the current screen size). Each texel simply holds a vec4 with the position of the found neighbor.
Now the important part: the texel holding this vec4 is located exactly where the query point (the point whose neighbor I am searching for) is projected to. So in theory, to access the neighbor of an arbitrary point, I project its world position to screen coordinates and use those to access the texture (e.g. with texelFetch).
This works as long as I do the point projection in a vertex shader and use gl_FragCoord to access the texture in my fragment program. But now I have a new situation where the points are only available in the fragment shader (accessed through a texture/buffer), so I have to calculate the screen position manually.
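For reference, the working variant looks roughly like this (shortened; the uniform and variable names are placeholders):

// Vertex shader (working variant): project the query point as usual,
// so the fragment lands on the same texel the NN pass wrote to.
#version 330
uniform mat4 matProjectionOrtho;
in vec4 pointPos;
void main()
{
    gl_Position = matProjectionOrtho * pointPos;
}

// Fragment shader (working variant): gl_FragCoord already contains the
// window-space pixel position, so it can index the neighbor texture directly.
#version 330
uniform sampler2D neighborTex;
out vec4 outColor;
void main()
{
    outColor = texelFetch(neighborTex, ivec2(gl_FragCoord.xy), 0);
}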
I tried the following to compute the equivalent of gl_FragCoord on my own, but it doesn't work (I only get blank results):
vec4 pointPos = ... // texture lookup of the query point
// project with the same orthographic matrix used in the NN pass
vec4 transformedPos = matProjectionOrtho * pointPos;
// perspective divide, then map from NDC [-1, 1] to [0, 1]
transformedPos.xy /= transformedPos.w;
transformedPos.xy = transformedPos.xy * 0.5f + 0.5f;
// scale to texel coordinates of the neighbor texture
transformedPos.xy = vec2(transformedPos.x * textureWidth,
                         transformedPos.y * textureHeight);
The projection matrix matProjectionOrtho is the same for all rendering passes, simply an orthographic projection. textureWidth and textureHeight are the dimensions of the texture holding the neighbor data (usually 1024x1024).
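For completeness, the full fragment-shader lookup I am attempting looks roughly like this (pointTex, neighborTex and the conversion to ivec2 at the end are placeholders/assumptions on my part, not tested code):

#version 330
uniform sampler2D pointTex;     // texture/buffer holding the query points (placeholder name)
uniform sampler2D neighborTex;  // 1024x1024 texture written by the NN pass (placeholder name)
uniform mat4 matProjectionOrtho;
uniform float textureWidth;
uniform float textureHeight;
out vec4 outColor;

void main()
{
    // fetch the query point (the lookup coordinates here are a placeholder)
    vec4 pointPos = texelFetch(pointTex, ivec2(gl_FragCoord.xy), 0);

    // project with the same orthographic matrix as the NN pass
    vec4 transformedPos = matProjectionOrtho * pointPos;

    // perspective divide and mapping from NDC [-1, 1] to [0, 1]
    transformedPos.xy /= transformedPos.w;
    transformedPos.xy = transformedPos.xy * 0.5 + 0.5;

    // scale to texel coordinates of the neighbor texture
    vec2 texelPos = transformedPos.xy * vec2(textureWidth, textureHeight);

    // look up the neighbor at the computed position
    outColor = texelFetch(neighborTex, ivec2(texelPos), 0);
}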
Is this calculation of the screen/texture position correct?