
I'm trying to mix raymarching and usual geometry rendering, but I don't understand how to correctly compare the distance along the ray with the depth value I stored in my buffer.

This drawing should make it clear:

On the raymarching side, I have rays starting from the eye (red dot). If you only render fragments at a fixed ray distance 'T', you will see a curved line (in yellow on my drawing). I understand this because if you start from the ray origins Oa and Ob, and follow the directions Da and Db for T units (Oa + T * Da, and Ob + T * Db), you see that only the ray at the middle of the screen reaches the blue plane.

Now on the geometry side, I stored values directly from gl_FragCoord.z. But I don't understand why we don't see this curved effect there. I edited this picture in GIMP, using the 'posterize' feature to make it clear.


We see straight lines, not curved lines.

I am OK with converting this depth value to a distance (taking the linearization into account). But then, when comparing both distances, I get problems at the sides of my screen.

There is something I'm missing about projection and how depth values are stored... I assumed the depth value was the (remapped) distance from the near plane along the direction of rays starting from the eye.

EDIT: It seems writing this question helped me a bit. I see now why we don't see the curve effect in the depth buffer: because the distance between Near and Far is larger for rays at the sides of the screen. Thus even if the depth values are the same (middle or side of the screen), the distances are not.

Thus it seems my problem comes from the way I convert depth to distance. I used the following:

float z_n = 2.0 * mydepthvalue - 1.0;  // window-space depth [0,1] -> NDC z [-1,1]
float z_e = 2.0 * zNear * zFar / (zFar + zNear - z_n * (zFar - zNear));  // NDC z -> eye-space depth

However, after the conversion I still don't see the curve effect that I need for comparing with my ray distance. Note that this code doesn't take gl_FragCoord.x and .y into account, which seems weird to me...

In OpenGL, the surface of invariant depth is a plane perpendicular to the camera direction, not a spherical shell (this is called rectilinear projection). Therefore, the distance of a ray from the camera to a point on the plane varies hyperbolically with its offset distance from the camera axis. – meowgoesthedog

1 Answer


You're overcomplicating the matter. There is no "curve effect" because a plane isn't curved. The term "z value" literally describes what it is: the z coordinate in some Euclidean coordinate space. And in such a space, z = C forms a plane parallel to the plane spanned by the xy axes of that space, at distance C.

So if you want the distance to some point, you need to take the x and y coordinates into account as well. In eye space, the camera is typically at the origin, so the distance to the camera boils down to length(vec3(x_e, y_e, z_e)) (which is of course a non-linear operation, and is exactly what creates the "curve" you seem to expect).
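Putting this together, here is a minimal GLSL sketch of the idea: reconstruct the full eye-space position (not just z_e) by undoing the projection, then take its length. The uniform names `invProjection` and `viewportSize` are placeholders for whatever your engine provides; it assumes a standard perspective projection and the default glDepthRange of [0, 1]:

```glsl
uniform mat4 invProjection;  // inverse of the projection matrix (assumed available)
uniform vec2 viewportSize;   // viewport width/height in pixels (assumed available)

// Distance from the eye to the fragment, comparable to a ray-march distance T
// along a ray starting at the eye.
float eyeDistance(vec2 fragCoordXY, float depth)
{
    // Window coordinates -> NDC in [-1, 1] on all three axes
    vec3 ndc = vec3(2.0 * fragCoordXY / viewportSize - 1.0,
                    2.0 * depth - 1.0);

    // NDC -> eye space: unproject, then undo the perspective divide
    vec4 eye = invProjection * vec4(ndc, 1.0);
    eye /= eye.w;

    // The camera sits at the eye-space origin, so the distance along the
    // view ray through this fragment is just the length of the position.
    return length(eye.xyz);
}
```

Unlike the z-only linearization, this uses gl_FragCoord.x and .y (via `fragCoordXY`), which is precisely why the result varies across the screen the way the marched distance T does.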