
How are texture coordinates interpolated in a GLSL shader?

I'm trying to downsample an image from screen size to 1/4 of its original size. This should be a pretty simple procedure, but the result doesn't look correct. I have an FBO with a single colour attachment, 1/4 of the screen size, to use as the render target.
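For reference, the FBO setup looks roughly like this (a simplified sketch; the texture format and the screenWidth/screenHeight variables are placeholders rather than my exact code):

GLuint fbo, colourTex;

glGenTextures(1, &colourTex);
glBindTexture(GL_TEXTURE_2D, colourTex);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, screenWidth / 4, screenHeight / 4, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, NULL);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

glGenFramebuffers(1, &fbo);
glBindFramebuffer(GL_FRAMEBUFFER, fbo);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, colourTex, 0);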

In order to downsample the texture, I draw a simple full-screen quad with the following coordinates (drawn as a triangle strip, as sketched below):

v[0].v.position.x = -1.f; v[0].v.position.y = -1.f; v[0].v.position.z = 0.f;
v[1].v.position.x = -1.f; v[1].v.position.y =  1.f; v[1].v.position.z = 0.f;
v[2].v.position.x =  1.f; v[2].v.position.y = -1.f; v[2].v.position.z = 0.f;
v[3].v.position.x =  1.f; v[3].v.position.y =  1.f; v[3].v.position.z = 0.f;
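The draw call is roughly this (quadVao is a placeholder for whatever VAO holds those positions):

glBindVertexArray(quadVao);
glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);   // the vertex order above forms two triangles covering the screen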

Now, my vertex shader simply scales and biases the input positions to generate texture coordinates, passing the vertices through untransformed, so I don't need to supply texture coordinates with the geometry:

#version 420

layout(location = 0) in vec3 attrib_Position;

out vec2 attrib_Fragment_Texture;

void main()
{
    attrib_Fragment_Texture = attrib_Position.xy * 0.5 + 0.5; 

    gl_Position = vec4(attrib_Position.xy, 0.0, 1.0);
}

So the texture coordinates output as *attrib_Fragment_Texture* should go from 0.0 to 1.0 as the vertex coordinates go from -1 to 1. A quick test with the following shader, however, shows that uv.x and uv.y only go from 0.0 to 0.25. I expected them to interpolate from 0.0 to 1.0!

#version 420

in vec2 attrib_Fragment_Texture;

out vec4 Out_Colour;

void main(void)
{
    vec2 uv = attrib_Fragment_Texture.xy;

    Out_Colour = vec4(uv.x, uv.y, 1.0, 1.0);
}

Can anyone spot what my obviously simple mistake/misunderstanding might be?


1 Answer


I suspect you're rendering with the original viewport, which is 4x the size of your FBO in each dimension. The viewport transform still maps the quad across the full screen-sized region, so only its lower-left corner lands inside the FBO; the rest of the primitive falls outside the render target and is discarded. That corner is exactly the region where the interpolated coordinates run from 0.0 to 0.25. Call glViewport with the FBO's dimensions before drawing the downsampling pass.
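A minimal sketch of the fix, assuming the FBO and its dimensions are held in variables like downsampleFbo, fboWidth and fboHeight (names are placeholders):

glBindFramebuffer(GL_FRAMEBUFFER, downsampleFbo);
glViewport(0, 0, fboWidth, fboHeight);            // match the 1/4-size colour attachment
// ... draw the full-screen quad here ...
glBindFramebuffer(GL_FRAMEBUFFER, 0);
glViewport(0, 0, screenWidth, screenHeight);      // restore the viewport for the default framebuffer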