9 votes

I created a class that renders video frames (on Mac) to a custom framebuffer object. As input I have a YUV texture, and I successfully created a fragment shader that takes three rectangle textures as input (one each for the Y, U and V planes; the data is uploaded via glTexSubImage2D using GL_TEXTURE_RECTANGLE_ARB, GL_LUMINANCE and GL_UNSIGNED_BYTE). Before rendering I set the active texture to three different texture units (0, 1 and 2) and bind one texture to each, and for performance reasons I use GL_APPLE_client_storage and GL_APPLE_texture_range. I then render with glUseProgram(myProg), glBegin(GL_QUADS) ... glEnd().
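
Roughly, the per-frame upload path looks like this (simplified; frameWidth, frameHeight and the yPlane/uPlane/vPlane pointers stand in for the actual frame data, and the U/V planes are half the size of the Y plane since the video is 4:2:0):

    // Upload one decoded frame into the three rectangle textures,
    // one plane per texture unit.
    glActiveTexture(GL_TEXTURE0);
    glBindTexture(GL_TEXTURE_RECTANGLE_ARB, texID_Y);
    glTexSubImage2D(GL_TEXTURE_RECTANGLE_ARB, 0, 0, 0,
                    frameWidth, frameHeight,
                    GL_LUMINANCE, GL_UNSIGNED_BYTE, yPlane);

    glActiveTexture(GL_TEXTURE1);
    glBindTexture(GL_TEXTURE_RECTANGLE_ARB, texID_U);
    glTexSubImage2D(GL_TEXTURE_RECTANGLE_ARB, 0, 0, 0,
                    frameWidth / 2, frameHeight / 2,
                    GL_LUMINANCE, GL_UNSIGNED_BYTE, uPlane);

    glActiveTexture(GL_TEXTURE2);
    glBindTexture(GL_TEXTURE_RECTANGLE_ARB, texID_V);
    glTexSubImage2D(GL_TEXTURE_RECTANGLE_ARB, 0, 0, 0,
                    frameWidth / 2, frameHeight / 2,
                    GL_LUMINANCE, GL_UNSIGNED_BYTE, vPlane);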

That worked fine, and I got the expected result (apart from a flickering effect, which I suspect comes from using two different GL contexts on two different threads that get in each other's way at some point; that's a topic for another question later). Anyway, I decided to improve my code further by adding a vertex shader as well, so that I can skip glBegin/glEnd, which I read is outdated and should be avoided anyway.

So as a next step I created two buffer objects, one for the vertices and one for the texture coordinates:

    const GLsizeiptr posSize = 4 * 4 * sizeof(GLfloat);
    const GLfloat posData[] =
    {
        -1.0f, -1.0f, -1.0f, 1.0f,
         1.0f, -1.0f, -1.0f, 1.0f,
         1.0f,  1.0f, -1.0f, 1.0f,
        -1.0f,  1.0f, -1.0f, 1.0f
    };

    const GLsizeiptr texCoordSize = 4 * 2 * sizeof(GLfloat);
    const GLfloat texCoordData[] =
    {
        0.0f, 0.0f,
        1.0f, 0.0f,
        1.0f, 1.0f,
        0.0f, 1.0f
    };

    glGenBuffers(1, &m_vertexBuffer);
    glBindBuffer(GL_ARRAY_BUFFER, m_vertexBuffer);
    glBufferData(GL_ARRAY_BUFFER, posSize, posData, GL_STATIC_DRAW);

    glGenBuffers(1, &m_texCoordBuffer);
    glBindBuffer(GL_ARRAY_BUFFER, m_texCoordBuffer);
    glBufferData(GL_ARRAY_BUFFER, texCoordSize, texCoordData, GL_STATIC_DRAW);

Then after loading the shaders I try to retrieve the locations of the attributes in the vertex shader:

    m_attributeTexCoord = glGetAttribLocation(m_shaderProgram, "texCoord");
    m_attributePos = glGetAttribLocation(m_shaderProgram, "position");

This gives me 0 for texCoord and 1 for position, which seems fine.

After getting the attributes I also call

    glEnableVertexAttribArray(m_attributePos);
    glEnableVertexAttribArray(m_attributeTexCoord);

(I am doing that only once. Does it have to be done before every glVertexAttribPointer and glDrawArrays call? Does it need to be done per texture unit, or while my shader is active via glUseProgram? Or can I do it just anywhere?)

After that I changed the rendering code to replace the glBegin/glEnd:

        glActiveTexture(GL_TEXTURE0);
        glBindTexture(GL_TEXTURE_RECTANGLE_ARB, texID_Y);

        glActiveTexture(GL_TEXTURE1);
        glBindTexture(GL_TEXTURE_RECTANGLE_ARB, texID_U);

        glActiveTexture(GL_TEXTURE2);
        glBindTexture(GL_TEXTURE_RECTANGLE_ARB, texID_V);

        glUseProgram(myShaderProgID);

        // new method with shaders and buffers
        glBindBuffer(GL_ARRAY_BUFFER, m_vertexBuffer);
        glVertexAttribPointer(m_attributePos, 4, GL_FLOAT, GL_FALSE, 0, NULL);
        glBindBuffer(GL_ARRAY_BUFFER, m_texCoordBuffer);
        glVertexAttribPointer(m_attributeTexCoord, 2, GL_FLOAT, GL_FALSE, 0, NULL);
        glDrawArrays(GL_QUADS, 0, 4);

        glUseProgram(0);

But since changing the code to this, I only ever get a black screen. So I suppose I am missing some simple step, maybe a glEnable/glDisable or some state I need to set properly, but as I said, I am new to this, so I don't really have an idea yet. For your reference, here is the vertex shader:

#version 110
attribute vec2 texCoord;
attribute vec4 position;

// the tex coords for the fragment shader
varying vec2 texCoordY;
varying vec2 texCoordUV;

//the shader entry point is the main method
void main()
{   
    texCoordY = texCoord;
    texCoordUV = texCoordY * 0.5; // U and V are only half the size of Y texture
    gl_Position = gl_ModelViewProjectionMatrix * position;
}

My guess is that I am missing something obvious here, or that I just don't have a deep enough understanding of the processes involved yet. I also tried using OpenGLShaderBuilder, which helped me get the original fragment shader code right (which is why I haven't posted it here), but since adding the vertex shader it doesn't give me any output either (I was wondering how it could know how to produce output if it doesn't know the position/texCoord attributes anyway).

For texture rectangles the coordinates are not supposed to be normalized, but you appear to be passing in normalized coordinates. – harold

Oh, so for texCoordData[] instead of 0.0, 1.0 etc. I pass in, for example, 1920.0, 1080.0 (i.e. the actual size of the video frame)? – Bjoern

The black screen of death is so common in graphics. I presume you tried changing the clear color just to ensure that you're actually drawing anything? If you are, then it's a shader issue; if not, a vertex array issue. – imallett

Ok, yes, my clear color was a horrible yellow :-) I just figured out my two problems: 1. I had to change the texture coords to be not normalized (thanks harold!), and 2. glEnableVertexAttribArray seems to have been called in the wrong place before. Now I finally got it to work - thanks everyone! – Bjoern
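
For reference, the coordinate fix from the comments amounts to replacing the normalized texCoordData with texel coordinates, e.g. for the 1920x1080 frame mentioned above (a sketch; 1920x1080 stands in for the actual frame size):

    // Rectangle textures are sampled with texel coordinates, not [0, 1].
    const GLfloat texCoordData[] =
    {
           0.0f,    0.0f,
        1920.0f,    0.0f,
        1920.0f, 1080.0f,
           0.0f, 1080.0f
    };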

1 Answer

2 votes

I haven't closely studied every line, but I think your logic is mostly correct. What I see missing is glEnableVertexAttribArray: you need to enable both vertex attributes before the glDrawArrays call.
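
For illustration, a minimal sketch of the render path with the enables in place, reusing the identifiers from the question (the exact placement is flexible, as long as both arrays are enabled when glDrawArrays executes):

    glUseProgram(myShaderProgID);

    // Position attribute from the vertex buffer.
    glBindBuffer(GL_ARRAY_BUFFER, m_vertexBuffer);
    glEnableVertexAttribArray(m_attributePos);
    glVertexAttribPointer(m_attributePos, 4, GL_FLOAT, GL_FALSE, 0, NULL);

    // Texture coordinate attribute from the texcoord buffer.
    glBindBuffer(GL_ARRAY_BUFFER, m_texCoordBuffer);
    glEnableVertexAttribArray(m_attributeTexCoord);
    glVertexAttribPointer(m_attributeTexCoord, 2, GL_FLOAT, GL_FALSE, 0, NULL);

    glDrawArrays(GL_QUADS, 0, 4);

    // Clean up so later draws are not affected by these arrays.
    glDisableVertexAttribArray(m_attributePos);
    glDisableVertexAttribArray(m_attributeTexCoord);
    glUseProgram(0);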