
A few days ago I stumbled upon a particularly tricky bug. I have since reduced my code to a very simple and direct set of examples.

This is the Processing code I use to call my shaders:

PGraphics test;
PShader testShader;
PImage testImage;

void setup() {
  size(400, 400, P2D);
  testShader = loadShader("test.frag", "vert2D.vert");
  testImage = loadImage("test.png");
  testShader.set("image", testImage);
  testShader.set("size", testImage.width);
  shader(testShader);
}

void draw() {
  background(0, 0, 0);
  shader(testShader);
  beginShape(TRIANGLES);
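  // Two triangles forming a fullscreen quad; positions are given directly in
  // clip space (-1 to 1), and the last two arguments of vertex() are the
  // texture coordinates.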
  vertex(-1, -1, 0, 1);
  vertex(1, -1, 1, 1);
  vertex(-1, 1, 0, 0);
  vertex(1, -1, 1, 1);
  vertex(-1, 1, 0, 0);
  vertex(1, 1, 1, 0);
  endShape();
}

Here is my vertex shader:

attribute vec2 vertex;
attribute vec2 texCoord;
varying vec2 vertTexCoord;
void main() {
    gl_Position = vec4(vertex, 0, 1);
    vertTexCoord = texCoord;
}

When I call this fragment shader:

uniform sampler2D image;
varying vec2 vertTexCoord;
void main(void) {
    gl_FragColor = texture2D(image, vertTexCoord);
}

I get this:
[Screenshot: the test image rendered correctly across the whole window]
This is the expected result. However, when I render the texture coordinates to the red and green channels instead with the following fragment shader:

uniform sampler2D image;
uniform float size;
varying vec2 vertTexCoord;
void main(void) {
    gl_FragColor = vec4(vertTexCoord, 0, 1);
}

I get this:

[Screenshot: mostly black output, with only two small triangles showing any color]

As you can see, most of the screen is black, which would suggest that the texture coordinates at those fragments are (0, 0). That can't be the case, though, because when the same coordinates are passed into texture2D they map to the correct positions in the image. To verify that exactly the same texture-coordinate values were being used in both cases, I combined the two with the following shader.

uniform sampler2D image;
uniform float size;
varying vec2 vertTexCoord;
void main(void) {
    gl_FragColor = texture2D(image, vertTexCoord) + vec4(vertTexCoord, 0, 0);
}

This produced:
[Screenshot: the test image with the red/green texture-coordinate gradient blended over it]

This is exactly what you would expect if the texture coordinates did vary smoothly across the screen. So I tried a completely black image, expecting to see the variation more clearly without the face. When I did that, I got the image with the two triangles again. After playing around some more, I found that if I use an entirely black image whose top-left pixel is transparent, I get this:

[Screenshot: a smooth red/green gradient across the whole window]

This is finally the image I would expect with smoothly varying coordinates. It has completely stumped me. Why does the texture lookup work properly, but rendering the actual coordinates gives me mostly junk?
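For completeness, such a test image can also be generated directly in Processing rather than loaded from disk. Here is a rough sketch of one way to do it (the 400x400 size and the makeTestImage name are arbitrary choices for illustration):

PImage makeTestImage() {
  // Entirely opaque black image, except for a single transparent pixel
  // in the top-left corner (pixel index 0).
  PImage img = createImage(400, 400, ARGB); // size is an arbitrary choice
  img.loadPixels();
  for (int i = 0; i < img.pixels.length; i++) {
    img.pixels[i] = color(0, 0, 0, 255);    // opaque black
  }
  img.pixels[0] = color(0, 0, 0, 0);        // transparent top-left pixel
  img.updatePixels();
  return img;
}

In setup(), testImage = makeTestImage(); could then replace the loadImage() call.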

EDIT:
I found a solution, which I have posted below, but I am still unsure why the bug exists in the first place. I also came across an interesting test case that might provide a little more information about what is happening.

Fragment Shader:

varying vec2 vertTexCoord;
void main(void) {
    gl_FragColor = vec4(vertTexCoord, 0, 0.5);
}

Result:
[Screenshot: result of rendering the coordinates with 0.5 alpha]
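One more check that might help narrow this down (purely hypothetical; I have not run this exact shader) would be to flag the fragments where the varying is exactly zero:

varying vec2 vertTexCoord;
void main(void) {
    // Red where the interpolated coordinates are exactly (0, 0), green elsewhere.
    if (vertTexCoord == vec2(0.0)) {
        gl_FragColor = vec4(1.0, 0.0, 0.0, 1.0);
    } else {
        gl_FragColor = vec4(0.0, 1.0, 0.0, 1.0);
    }
}

If most of the screen came out red, that would confirm the coordinates really are (0, 0) at those fragments rather than merely being rendered as black for some other reason.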

Comments:

Spektre: What graphics card and driver do you have? (It could be a bug on the driver side, or a "feature".) What is the size uniform for? My bet is that the driver optimizes out the interpolation of the texture coordinates when you never actually sample the texture. As a silly test, try using texture2D(image, vertTexCoord) only when the fragment coordinates are (0, 0) and see if that helps.

Joshua Dotson: I was using gl_FragCoord / size instead of the texture coordinates in some earlier tests to see whether that showed the same behavior; I left the uniform in so I could switch back and forth between those tests and compare them. I tried your suggestion and several variations of it, and I still got the image with the two triangles. I did, however, find a particularly puzzling solution, which I have posted below. It seems to indicate that the bug originates in Processing's internals rather than in GLSL, though I don't understand how its nature seems to change mid-pipeline.
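For reference, the kind of variation discussed in these comments would look roughly like the following (a reconstruction rather than the exact shader that was tested; the guard condition is just one way to keep the sampler from being optimized away):

uniform sampler2D image;
uniform float size;
varying vec2 vertTexCoord;
void main(void) {
    // Earlier tests derived coordinates from gl_FragCoord and the size uniform
    // instead of the varying, e.g. vec2 coord = gl_FragCoord.xy / size;
    vec4 c = vec4(vertTexCoord, 0.0, 1.0);
    // Spektre's "silly test": keep a texture2D() call in the shader so the
    // compiler cannot strip the sampler or the varying, while leaving the
    // visible output unchanged (gl_FragCoord is never exactly (0, 0)).
    if (gl_FragCoord.xy == vec2(0.0)) {
        c += texture2D(image, vertTexCoord);
    }
    gl_FragColor = c;
}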

1 Answer


I have found two different solutions. Both involve changes only in the Processing code. I have no idea how or why these changes make it work.

Solution 1:
Pass down screen-space coordinates instead of clip-space coordinates, and use the transform matrix that Processing supplies to convert them to clip space in the vertex shader.
Processing code:

PGraphics test;
PShader testShader;
PImage testImage;

void setup() {
  size(400, 400, P2D);
  testShader = loadShader("test.frag", "vert2D.vert");
  testImage = loadImage("test.png");
  testShader.set("image", testImage);
  testShader.set("size", testImage.width);
  shader(testShader);
}

void draw() {
  background(0, 0, 0);
  shader(testShader);
  beginShape(TRIANGLES);
  //Pass down screen space coordinates instead.
  vertex(0, 400, 0, 1);
  vertex(400, 400, 1, 1);
  vertex(0, 0, 0, 0);
  vertex(400, 400, 1, 1);
  vertex(0, 0, 0, 0);
  vertex(400, 0, 1, 0);
  endShape();
}

Vertex Shader:

attribute vec2 vertex;
attribute vec2 texCoord;
uniform mat4 transform;
varying vec2 vertTexCoord;
void main() {
    //Multiply transform matrix.
    gl_Position = transform * vec4(vertex, 0, 1);
    vertTexCoord = texCoord;
}

Result:
[Screenshot: the texture rendered correctly, with a stroke line running through the center of the screen]
Notice the line through the center of the screen. It is there because we haven't called noStroke() in the Processing code. Still, the texture coordinates are interpolated properly.

Solution 2:
If we just call noStroke() in setup(), we can pass the clip-space coordinates down without any issues, and everything works exactly as expected. No shader changes are needed.

PGraphics test;
PShader testShader;
PImage testImage;

void setup() {
  size(400, 400, P2D);
  //Call noStroke()
  noStroke();
  testShader = loadShader("test.frag", "vert2D.vert");
  testImage = loadImage("test.png");
  testShader.set("image", testImage);
  testShader.set("size", testImage.width);
  shader(testShader);
}

void draw() {
  background(0, 0, 0);
  shader(testShader);
  beginShape(TRIANGLES);
  vertex(-1, -1, 0, 1);
  vertex(1, -1, 1, 1);
  vertex(-1, 1, 0, 0);
  vertex(1, -1, 1, 1);
  vertex(-1, 1, 0, 0);
  vertex(1, 1, 1, 0);
  endShape();
}

Result:
[Screenshot: the texture rendered correctly, with no stroke line]
A pretty easy fix. How this one change manages to affect whether the texture coordinates are interpolated correctly in the fragment shader is beyond me.

If anyone more familiar with how Processing wraps OpenGL has any insight into why these bugs exist, I'd be interested to hear it.