13 votes

I have two textures with different coordinates and sizes in my fragment shader:

varying highp vec2 v_currentTextureCoords;
varying highp vec2 v_backgroundTextureCoords;
uniform sampler2D u_currentTexture;
uniform sampler2D u_backgroundTexture;

void main()
{
    vec4 currentColor = texture2D(u_currentTexture, v_currentTextureCoords);
    vec4 backgroundColor = texture2D(u_backgroundTexture, v_backgroundTextureCoords);

    gl_FragColor = backgroundColor;
}

Here is the corresponding vertex shader:

attribute vec4 a_position;
attribute vec2 a_currentTextureCoords;
attribute vec2 a_backgroundTextureCoords;
varying vec2 v_currentTextureCoords;
varying vec2 v_backgroundTextureCoords;

void main()
{
    gl_Position = a_position;
    v_currentTextureCoords = a_currentTextureCoords;
    v_backgroundTextureCoords = a_backgroundTextureCoords;
}

Those shaders are responsible for rendering u_currentTexture.

EDIT: updated the post with my real application (an Android app) and the actual problem.

As you can read above, the two textures are:

  • u_backgroundTexture: a video stream, rendered full screen, 1080x1920
  • u_currentTexture: any image, 469x833 (smaller, but with the same aspect ratio).

To keep it simple, for now I don't want to blend anything; I just want to display the pixels of u_backgroundTexture in the shader program of u_currentTexture.

Application Screenshot - Texture not scaled

As you can see, the image rendered with the shaders above (in the top left corner, not the whole image) is the same as the background image, but scaled down to fit a smaller rectangle. That's not what I want.

I want to display the pixels that are "behind" u_currentTexture (those of u_backgroundTexture), so that in the end one wouldn't even notice there are two textures.

But since the textures have different sizes and coordinates, it doesn't give this result at all (what you see above is the current result).

Then, in my fragment shader, I managed to "scale" the texture so that the image in the top left corner has the same "zoom" as the background image:

Application Screenshot - Texture scaled

To do this, I modified my fragment shader:

varying highp vec2 v_currentTextureCoords;
varying highp vec2 v_backgroundTextureCoords;
uniform sampler2D u_currentTexture;
uniform sampler2D u_backgroundTexture;
uniform vec2 u_scaleRatio; // Notice here

void main()
{
    vec4 currentColor = texture2D(u_currentTexture, v_currentTextureCoords);
    vec4 backgroundColor = texture2D(u_backgroundTexture, v_backgroundTextureCoords * u_scaleRatio); // And here

    gl_FragColor = backgroundColor;
}

I set u_scaleRatio in my program with glUniform2fv(). The values are basically (pseudo-code):

u_scaleRatio = vec2(currentTextureWidth / backgroundTextureWidth, currentTextureHeight / backgroundTextureHeight);
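With the sizes given above (469x833 for u_currentTexture, 1080x1920 for u_backgroundTexture), that works out to roughly:

u_scaleRatio = vec2(469.0 / 1080.0, 833.0 / 1920.0); // ≈ vec2(0.434, 0.434)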

As you can see, it's almost working, but it looks like there is an offset on the X axis: the image rendered in the top left corner actually shows what appears in the top right corner of the background... I can't find a way to correct it.

How can I modify my shaders so I can correct this offset, knowing that the textures have different sizes and coordinates?

EDIT 2:

To answer a comment below, here is a screenshot with:

vec4 backgroundColor = texture2D(u_backgroundTexture, v_currentTextureCoords);

Application screenshot

and with:

vec4 backgroundColor = texture2D(u_backgroundTexture, v_currentTextureCoords * u_scaleRatio);

Application screenshot

EDIT 3: to reply to the answer below, here is a screenshot with:

uniform vec2 texelSize; // (1.0/windowResolution.x, 1.0/windowResolution.y)

vec2 uv = gl_FragCoord.xy * texelSize;
vec4 backgroundColor = texture2D(u_backgroundTexture, uv);

I understand what you're trying to do; I think it would have worked if the vertex positions and texture coordinates were the same for both textures, but... they're not. Do you have an idea how to fix it?

Application screenshot

You could post your result with the different-size textures. – eldo
I don't think your fragment shader gives you the correct colors. The result of multiplying red and blue is black: vec4(1, 0, 0, 1) * vec4(0, 0, 1, 1) = vec4(0, 0, 0, 1). You should use another blending method, like the mix function. – MarGenDo
@eldo I added more explanations and a screenshot in my original post. – GuiTeK
Can you please provide the code for calculating a_currentTextureCoords? That is probably where the distortion comes from. – MarGenDo
Textures don't really have sizes in the sense that you think they do here. The texture lookup functions take normalized coordinates (0–1 is the same whether the texture is 256x256 or 16x16). You can always clamp the coordinates into some range and, if the coordinates fall outside that range, treat the texture as translucent. That's effectively what border texels (deprecated) are. – Andon M. Coleman

1 Answer

0 votes

If I understand your problem correctly, you should be fine calculating the background coordinates like this:

uniform vec2 texelSize; // (1.0/windowResolution.x, 1.0/windowResolution.y)

vec2 uv = gl_FragCoord.xy * texelSize;
vec4 backgroundColor = texture2D(u_backgroundTexture, uv);

The built-in variable gl_FragCoord "contains the window relative coordinate (x, y, z, 1/w) values for the fragment" (https://www.opengl.org/sdk/docs/man/html/gl_FragCoord.xhtml). Since you basically need the fragment's texture coordinates as if it were part of a full-screen quad, this should give you the expected result. You only have to pass the reciprocal of the window resolution (the size of one texel relative to the full window) as a uniform (see the comment in the code above for the calculation).
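For completeness, here is a minimal sketch of what the question's full fragment shader might look like with this approach. The precision declaration and the example texelSize value are assumptions (the value corresponds to the full-screen 1080x1920 video described in the question):

precision highp float; // assumed; use whatever your shader already declares

varying highp vec2 v_currentTextureCoords; // v_backgroundTextureCoords is no longer needed here
uniform sampler2D u_currentTexture;
uniform sampler2D u_backgroundTexture;
uniform vec2 texelSize; // (1.0/windowResolution.x, 1.0/windowResolution.y), e.g. (1.0/1080.0, 1.0/1920.0)

void main()
{
    // gl_FragCoord.xy is the fragment's position in window pixels;
    // multiplying by texelSize maps it to [0, 1] over the whole screen.
    vec2 uv = gl_FragCoord.xy * texelSize;

    // Kept for later blending; only the background is displayed for now.
    vec4 currentColor = texture2D(u_currentTexture, v_currentTextureCoords);

    vec4 backgroundColor = texture2D(u_backgroundTexture, uv);
    gl_FragColor = backgroundColor;
}

Note that gl_FragCoord's origin is the bottom-left corner of the window, so depending on how the background texture is oriented you may also need to flip the vertical coordinate (uv.y = 1.0 - uv.y).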