
I render animated geometry. In each frame, I want to texture-map the geometry with a screen-space texture from the previous frame, projected onto the geometry as it was in the previous frame. The result should look as if the screen-space texture had been projected onto the geometry one frame ago and then carried along by the geometry's animation to the current frame.

Calculating the proper texture coordinates per vertex is not difficult. In GLSL that's simply:

void main(void)
{
   vPos = currentMVP  * vec4(position,1);
   gl_Position = vPos;
   vec4 oldPos = previousMVP * vec4(position,1);
   vec2 UV = vec2(((oldPos.x/oldPos.w)+1)*0.5f, ((oldPos.y/oldPos.w)+1)*0.5f);
   ...
}

But getting the texture coordinates to interpolate correctly over the geometry is trickier than I thought. Normally, texture coordinates for projection should be interpolated linearly in screen space. To achieve this, one would multiply them by vPos.w in the vertex shader and divide them by vPos.w again in the fragment shader. However, that is only correct if the texture is projected from the camera's view. In this case I need something else: an interpolation that accounts for forward perspective-correct interpolation in the previous frame and backward perspective-correct interpolation in the current frame.
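That multiply/divide trick can be sketched like this (just a sketch; vLinUV and vCurW are made-up varying names):

// Vertex shader (excerpt): pre-multiplying by vPos.w cancels the
// rasterizer's perspective correction, so vLinUV / vCurW ends up
// interpolated linearly in screen space.
vLinUV = uv * vPos.w;
vCurW  = vPos.w;

// Fragment shader (excerpt): undo the pre-multiplication.
vec2 uv = vLinUV / vCurW;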

This graphic illustrates three different cases (image omitted):

- Case A is simple: here I could keep the normal perspective-corrected interpolation of the texture coordinates, as performed by default by the rasterizer.

- In Case B, however, I would need linear interpolation of the texture coordinates to get the proper result (either by multiplying by vPos.w in the vertex shader and dividing by vPos.w in the fragment shader, or, in newer GLSL versions, by using the "noperspective" interpolation qualifier).

- And in Case C I would need perspective-corrected interpolation, but according to the oldPos.w value. So I would have to linearize the interpolation of u' = u/oldPos.w and v' = v/oldPos.w by multiplying u' by vPos.w in the vertex shader and dividing the interpolated value by vPos.w in the fragment shader. I would also need to interpolate w' = 1/oldPos.w linearly in the same way, and then compute the final u'' in the fragment shader by dividing the interpolated u' by the interpolated w' (and likewise for v'').
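In GLSL, the Case C scheme described above would look roughly like this (a sketch of the idea only; vProjUV and vCurW are made-up varying names):

// Vertex shader (excerpt): make (u/oldPos.w, v/oldPos.w, 1/oldPos.w)
// interpolate linearly in screen space by pre-multiplying with vPos.w.
vec4 oldPos = previousMVP * vec4(position, 1.0);
vec2 uv = (oldPos.xy / oldPos.w) * 0.5 + 0.5;
vProjUV = vec3(uv, 1.0) / oldPos.w * vPos.w;
vCurW   = vPos.w;

// Fragment shader (excerpt): undo the pre-multiplication, then divide by
// the interpolated 1/oldPos.w to finish the projective interpolation.
vec3 q  = vProjUV / vCurW;
vec2 UV = q.xy / q.z;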

So the question now is: what is the proper math to yield the correct result in all of these cases?

Again, calculating the correct UVs for the vertices is not the problem; it's about achieving the correct interpolation over the triangles.

Maybe relevant: in the same pass I also want to do some regular, non-projective, perspective-corrected texturing of the object. This means I must not alter the gl_Position.w value.

Comment: Please explain (briefly) "Normally, texture coordinates for projection should be interpolated linearly in screen space" and "non-projective, perspective-corrected texturing". – Stefan Hanke

1 Answer

vec2 UV = vec2(((oldPos.x/oldPos.w)+1)*0.5f, ((oldPos.y/oldPos.w)+1)*0.5f);

Wrong. You need the w component; you don't want to divide yet. What you want is this:

vec4 oldPos = previousMVP * vec4(position,1);
oldPos = clipToTexture * oldPos;
vec3 UV = oldPos.xyw;

The clipToTexture matrix is a 4x4 matrix that does the scale and translation needed to go from clip space to texture space. That's what your scale by 0.5 and offset were doing. Here, it's in matrix form; normally, you'd just left-multiply previousMVP by it, so it would all be a single matrix multiply.
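Such a matrix might look like this (a sketch; note that GLSL mat4 constructors are column-major):

// Maps clip space to texture space before the divide:
// x' = 0.5x + 0.5w, y' = 0.5y + 0.5w (and likewise for z).
mat4 clipToTexture = mat4(
    0.5, 0.0, 0.0, 0.0,   // column 0
    0.0, 0.5, 0.0, 0.0,   // column 1
    0.0, 0.0, 0.5, 0.0,   // column 2
    0.5, 0.5, 0.5, 1.0);  // column 3

// Typically you'd precompute this once on the CPU:
// mat4 previousMVPTex = clipToTexture * previousMVP;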

In your fragment shader, you need to do projective texture lookups. In GLSL 1.20 that's texture2DProj; the 1.30+ function is:

vec4 color = textureProj(samplerName, UV.stp);

It is this function which will do the necessary division-by-W step.
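Put together, the fragment-shader side might look like this (a sketch; projTex is an assumed sampler name):

#version 130

in vec3 UV;                  // oldPos.xyw from the vertex shader,
                             // interpolated perspective-correctly as usual
uniform sampler2D projTex;   // screen-space texture from the previous frame
out vec4 fragColor;

void main()
{
    // textureProj divides UV.st by UV.p before the lookup, combining the
    // old frame's projection with the current frame's perspective correction.
    fragColor = textureProj(projTex, UV.stp);
}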