
I'm having a bit of trouble getting a depth value that I'm storing in a float texture (or rather, I don't understand the values). Essentially I am creating a deferred renderer, and in one of the passes I store the depth in the alpha component of a floating-point render target. The code for that shader looks something like this.

Define the clip position as a varying:

 varying vec4 clipPos;

... In the vertex shader, assign the position:

clipPos = gl_Position;

Now in the fragment shader I store the depth:

 gl_FragColor.w = clipPos.z / clipPos.w;
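
Pieced together, the depth-writing pass looks roughly like this (the attribute/matrix names here are placeholders, and the real shader writes more than just the depth):

    // --- Depth-writing pass: vertex shader ---
    attribute vec3 position;
    uniform mat4 modelViewMatrix;
    uniform mat4 projectionMatrix;
    varying vec4 clipPos;

    void main() {
        gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );
        clipPos = gl_Position;                    // pass the clip-space position along
    }

    // --- Depth-writing pass: fragment shader (floating point render target) ---
    precision highp float;                        // mediump may not be precise enough for depth
    varying vec4 clipPos;

    void main() {
        // ... gl_FragColor.rgb gets the usual G-buffer data ...
        gl_FragColor.w = clipPos.z / clipPos.w;   // store "depth" in the alpha channel
    }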

This approach by and large works. When I access this render target in any subsequent shader I can get the depth, i.e. something like this:

float depth = depthMap.w;

Am I right to assume that 0.0 is right in front of the camera and 1.0 is in the distance? I am doing some fog calculations based on this, but they don't seem to be correct.

fogFactor = smoothstep( fogNear, fogFar, depth );

fogNear and fogFar are uniforms I send to the shader. When fogNear is set to 0, I would have expected a smooth transition of fog from right in front of the camera out to the draw distance. However, this is what I see:

[screenshot]

When I set fogNear to 0.995, I get something closer to what I'm expecting:

[screenshot]

Is that correct? It just doesn't seem right to me. (The scale of the geometry is not unusually small or large, and neither are the camera near and far planes. All the values are pretty reasonable.)
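
For completeness, the fog is applied in the final pass roughly like this (fogColor and texCoord are just stand-in names for the uniforms/varyings I actually use):

    precision highp float;
    uniform sampler2D depthMap;   // the floating point render target from the earlier pass
    uniform vec3 fogColor;        // stand-in name
    uniform float fogNear;
    uniform float fogFar;
    varying vec2 texCoord;        // screen-space UV

    void main() {
        vec4 texel = texture2D( depthMap, texCoord );
        float depth = texel.w;                                   // the stored value
        float fogFactor = smoothstep( fogNear, fogFar, depth );
        gl_FragColor = vec4( mix( texel.rgb, fogColor, fogFactor ), 1.0 );
    }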

what are your zNear and zFar set to in your perspective matrix? – gman
You might find this answer useful: stackoverflow.com/a/21106656/128511 – gman

1 Answer


There are two issues with your approach:

1. You assume the depth is in the range [0,1], but what you store is clipPos.z / clipPos.w, which is the NDC z coordinate and lies in the range [-1,1]. You might be better off writing the window-space z coordinate directly to your depth texture; it is in [0,1] and is simply gl_FragCoord.z.

2. The more serious issue is that you assume a linear depth mapping, which is not the case: neither the NDC nor the window-space z value is a linear function of the distance to the camera plane, so it is not surprising that everything you see in the screenshot is very close to 1. Typically, fog calculations are done in eye space. Since you only need the z coordinate here, you could simply store the clip-space w coordinate instead, which for a standard perspective projection is just -z_eye (look at the last row of your projection matrix). The resulting value will not lie in any normalized range, but in the [near, far] range of your projection matrix; however, specifying fog distances in eye-space units (which are normally identical to world-space units) is more intuitive anyway. A sketch of both options follows below.
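
As a rough sketch of both options (the sampler, uniform and varying names are only illustrative, adapt them to your own pipeline):

    // Depth-writing pass, fragment shader:
    precision highp float;
    varying vec4 clipPos;

    void main() {
        // ... write the usual G-buffer data to gl_FragColor.rgb ...

        // Option 1: window-space depth, already in [0,1] but still non-linear:
        // gl_FragColor.w = gl_FragCoord.z;

        // Option 2 (nicer for fog): eye-space distance. For a standard
        // perspective projection, clip-space w is just -z_eye:
        gl_FragColor.w = clipPos.w;
    }

    // Fog pass, fragment shader:
    precision highp float;
    uniform sampler2D depthMap;
    uniform vec3 fogColor;
    uniform float fogNear;   // now in eye/world units, e.g. 1.0
    uniform float fogFar;    // e.g. 100.0
    varying vec2 texCoord;

    void main() {
        vec4 texel = texture2D( depthMap, texCoord );
        float eyeDepth = texel.w;                                   // linear distance to the camera plane
        float fogFactor = smoothstep( fogNear, fogFar, eyeDepth );  // behaves linearly with distance
        gl_FragColor = vec4( mix( texel.rgb, fogColor, fogFactor ), 1.0 );
    }

With this, fogNear = 0.0 really does mean right at the camera, and fogFar is simply a distance in the same units as your scene.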