
I am seeing problems with the accuracy of varying interpolation in my shaders. I get a distorted blue line, shown below, apparently caused by inaccurate interpolation of a varying variable (height in this case). How do I fix it?

[screenshot: distorted blue line]

Here is how I feed the data to shaders:

    uint8_t *oglData = CVPixelBufferGetBaseAddress(_pixelBuffer);

    for (NSInteger i = 0; i < 256; i++) {
        /* Only the first two columns are blue in the BGRA 256x1 buffer */
        if (i < 2) {
            oglData[4 * i]     = 255;  // B
            oglData[4 * i + 1] = 0;    // G
            oglData[4 * i + 2] = 0;    // R
            oglData[4 * i + 3] = 255;  // A (0-255 scale, so 255 = opaque, not 1)
        } else {
            oglData[4 * i]     = 0;
            oglData[4 * i + 1] = 0;
            oglData[4 * i + 2] = 0;
            oglData[4 * i + 3] = 255;
        }
    }

And here are shaders:

Vertex shader:

    attribute vec4 position;
    attribute vec4 inputTextureCoordinate;

    varying vec2 textureCoordinate;
    varying float height;

    void main()
    {
        gl_Position = position;
        // Sample along x only; the texture is a single row (y = 0).
        textureCoordinate = vec2(inputTextureCoordinate.x, 0.0);
        // Reuse the y coordinate as the height threshold for the fragment shader.
        height = inputTextureCoordinate.y;
    }

Fragment Shader:

    varying highp vec2 textureCoordinate;
    varying highp float height;

    uniform sampler2D inputImageTexture;
    uniform lowp vec4 backgroundColor;

    void main() {
        lowp vec3 colorChannels = texture2D(inputImageTexture, textureCoordinate).rgb;

        // Blue wherever the texel's blue channel reaches the interpolated height.
        if (colorChannels.b >= height) {
            gl_FragColor = vec4(0.0, 0.0, 1.0, 1.0);
        } else {
            gl_FragColor = vec4(0.0, 0.0, 0.0, 1.0);
        }
    }

Here is the drawing code:

    glActiveTexture( GL_TEXTURE0 );
    // (the texture object is assumed to be created and bound elsewhere)
    glTexImage2D( GL_TEXTURE_2D, 0, GL_RGBA, 256, 1, 0, GL_RGBA, GL_UNSIGNED_BYTE, oglData );

    glUniform1i( _inputImageTexture, 0 );

    // Set texture parameters
    glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR );
    glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR );
    glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE );
    glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE );

    glVertexAttribPointer( ATTRIB_VERTEX, 2, GL_FLOAT, 0, 0, squareVertices );
    glEnableVertexAttribArray( ATTRIB_VERTEX );

    glVertexAttribPointer( ATTRIB_TEXTURECOORDINATE, 2, GL_FLOAT, 0, 0, textureCoordinates );
    glEnableVertexAttribArray( ATTRIB_TEXTURECOORDINATE );

    glDrawArrays( GL_TRIANGLE_STRIP, 0, 4 );

    glBindRenderbuffer( GL_RENDERBUFFER, _colorBufferHandle );
    [_oglContext presentRenderbuffer:GL_RENDERBUFFER];

EDIT: Changing GL_LINEAR to GL_NEAREST fixes the issue. Does that mean varyings are interpolated differently depending on the texture's min & mag filters? Using GL_NEAREST also produces non-smooth curves, though.

1 Answer


> How do I fix it?

You can't (at least not directly): varying interpolation is done by fixed-function hardware inside the GPU, so you'll just have to live with its precision.

> Changing GL_LINEAR to GL_NEAREST fixes the issue. Does that mean varyings are interpolated differently depending on the texture's min & mag filters?

No - varying interpolation has nothing to do with the texture min/mag filters.

Your issue is that the inaccuracy in the varying interpolation causes the sample point fed into your texture2D() call to drift away from the texel center, so the GL_LINEAR filtering starts blending in the neighboring texel. By using GL_NEAREST you effectively quantize the sampling points, so as long as the error is less than half a texel wide you snap back to the texel you want. With a 256x1 texture, texel centers sit at (i + 0.5)/256, so "half a texel" is only 1/512 ≈ 0.002 in texture-coordinate units.
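
If you want that snapping while keeping explicit control over it, you can quantize the coordinate yourself in the fragment shader instead of relying on the filter mode. The following is only a sketch based on your 256x1 texture; the hardcoded texWidth constant is my assumption and would normally be passed in as a uniform:

    varying highp vec2 textureCoordinate;
    varying highp float height;

    uniform sampler2D inputImageTexture;

    // Width of the 256x1 lookup texture. Hardcoded for illustration only;
    // in real code pass it in as a uniform.
    const highp float texWidth = 256.0;

    void main() {
        // Snap the interpolated x coordinate to the nearest texel center, so
        // interpolation error can no longer drag the sample point across a
        // texel boundary (the same quantization GL_NEAREST performs).
        highp float snappedX = (floor(textureCoordinate.x * texWidth) + 0.5) / texWidth;
        lowp vec3 colorChannels = texture2D(inputImageTexture, vec2(snappedX, 0.0)).rgb;

        if (colorChannels.b >= height) {
            gl_FragColor = vec4(0.0, 0.0, 1.0, 1.0);
        } else {
            gl_FragColor = vec4(0.0, 0.0, 0.0, 1.0);
        }
    }

If the hard steps from nearest sampling bother you, one option is to take two snapped taps at adjacent texel centers and mix() them in highp yourself, rather than relying on the texture unit's filtering precision.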