0
votes

After my last post, where someone recommended using pBuffers, I dug around on Google and found some good examples of offscreen rendering with pbuffers. One example, available on nVidia's website, does simple offscreen rendering: it renders into the pbuffer context, reads the pixels back into an array, and then calls glDrawPixels to display them.

I changed this example to create a texture from the pixels read back: render offscreen, read the pixels into an array, and then initialize a texture with that color array. But this looks very redundant to me: we render the image, copy it from graphics card memory into system memory (the array), and then copy it back to the graphics card to display it on screen, just in a different rendering context. All these copies just to display the rendered texture seem wasteful, so I tried a different approach using glCopyTexImage2D(), which unfortunately doesn't work. Here is the code with explanations:

mypbuffer.Initialize(256, 256, false, false);

- The false values are for sharing the context and sharing objects. They are false because this fantastic graphics card doesn't support either.

Then I perform the usual initializations to enable blending and GL_TEXTURE_2D.

    CreateTexture();
    mypbuffer.Activate();

        int viewport[4];
        glGetIntegerv(GL_VIEWPORT,(int*)viewport);
        glViewport(0,0,xSize,ySize);


        DrawScene(hDC);

        //save data to texture using glCopyTexImage2D
        glBindTexture(GL_TEXTURE_2D,texture);

        glCopyTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA,
                         0,0, xSize, ySize, 0);
        glClearColor(0.0f, 0.5f, 0.5f, 1.0f);              // Set the clear color to teal
        glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

        glViewport(viewport[0],viewport[1],viewport[2],viewport[3]);
        // glBindTexture(GL_TEXTURE_2D,texture);
        first = false;
        mypbuffer.Deactivate();

- The DrawScene function is very simple; it just renders a triangle and a rectangle, which are supposed to be rendered offscreen (I HOPE). CreateTexture() creates an empty texture. The function should work, as it was tested in the previous approach I described and it works there.

After this, in the main loop, I just do the following:

            glClear(GL_COLOR_BUFFER_BIT);
            glBindTexture(GL_TEXTURE_2D,texture);

            glRotatef(theta, 0.0f, 0.0f, 0.01f);
            glBegin(GL_QUADS);
            //Front Face
            glTexCoord2f(0.0f, 0.0f);
            glVertex3f(-0.5, -0.5f,  0.5f);
            glTexCoord2f(1.0f, 0.0f);
            glVertex3f( 0.5f, -0.5f,  0.5f);
            glTexCoord2f(1.0f, 1.0f);
            glVertex3f( 0.5f,  0.5f,  0.5f);
            glTexCoord2f(0.0f, 1.0f);
            glVertex3f(-0.5f,  0.5f,  0.5f);
            glEnd();
            SwapBuffers(hDC);

            theta += 0.10f;
            Sleep (1);

The final result is just a window with a blue background; nothing actually gets rendered. Any idea why this is happening? My graphics card doesn't support the wgl_ARB_render_texture extension, but that shouldn't be a problem when calling glCopyTexImage2D(), right?

My card doesn't support FBOs either.

2
I've also tried glCopyTexSubImage2D() and nothing happens either. Thanks. – filipehd
Why is the Z coordinate of the quad 0.5f, instead of the 0 usually used in Ortho2D projections? – Luca
Luca, I've tried different values: 0, 1, -1, and it doesn't make any difference at all. This is just the final quad onto which the texture should be rendered. – filipehd

2 Answers

2
votes

What you must do is "connect" your two OpenGL contexts so that the textures of your pbuffer context also show up in the main render context. The term to look for is "display list sharing". On Windows you connect the contexts retroactively using wglShareLists; on X11 and Mac OS X you must supply the handle of the context to be shared at context creation.
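On Windows that connection is a single call. A minimal sketch, assuming `hMainRC` and `hPBufferRC` are the handles of your main and pbuffer rendering contexts (both names are placeholders for whatever your framework gives you):

```c
#include <windows.h>
#include <GL/gl.h>

/* Share textures, display lists and other objects between the two
 * contexts. Best called right after the second context is created,
 * before any objects exist in it; some drivers fail otherwise. */
BOOL ConnectContexts(HGLRC hMainRC, HGLRC hPBufferRC)
{
    /* After a successful call, a texture created in either context
     * is visible in both, so the pbuffer's rendering can be used
     * as a texture in the main window's context. */
    return wglShareLists(hMainRC, hPBufferRC);
}
```

Note that wglShareLists shares textures and buffer objects as well, not just display lists, despite the name.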

An entirely different possibility, which works just as well, is reusing the same context on the pbuffer. It is a little-known fact that an OpenGL render context can be used not only on the drawable it was first created with, but on any drawable with compatible settings. So if your pbuffer matches your main window's pixel format, you can detach the render context from the main window and attach it to the pbuffer. Of course you then need low-level access to the main window's device context/drawable, which is normally hidden behind a framework.
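A minimal sketch of that second approach on Windows, assuming `hRC` is the main window's render context, `hWindowDC` its device context, and `hPBufferDC` the device context obtained from the pbuffer (e.g. via `wglGetPbufferDCARB`); all three names are placeholders, and the window and pbuffer pixel formats must be compatible:

```c
/* Drive the pbuffer with the main window's context. */
wglMakeCurrent(hPBufferDC, hRC);    /* same context, different drawable */
DrawScene(hPBufferDC);              /* offscreen pass */

glBindTexture(GL_TEXTURE_2D, texture);
glCopyTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, 0, 0, xSize, ySize, 0);

/* Reattach the context to the window for the visible pass. */
wglMakeCurrent(hWindowDC, hRC);
/* 'texture' is valid here without any sharing, because it lives
 * in the one and only context. */
```

Since there is only one context, no sharing is needed at all, which sidesteps the unsupported sharing flags from the question.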

0
votes

You should check whether your OpenGL implementation supports framebuffer objects: these objects can serve as render targets, and they can have textures attached as color buffers, so you render directly into a texture.
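A minimal sketch of rendering into a texture with framebuffer objects, assuming the GL_EXT_framebuffer_object entry points are available and `DrawScene` is the scene-drawing function from the question (size 256x256 is a placeholder):

```c
GLuint fbo, texture;

/* Create the target texture, initially empty. */
glGenTextures(1, &texture);
glBindTexture(GL_TEXTURE_2D, texture);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, 256, 256, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, NULL);

/* Attach it as the color buffer of an FBO. */
glGenFramebuffersEXT(1, &fbo);
glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, fbo);
glFramebufferTexture2DEXT(GL_FRAMEBUFFER_EXT, GL_COLOR_ATTACHMENT0_EXT,
                          GL_TEXTURE_2D, texture, 0);

if (glCheckFramebufferStatusEXT(GL_FRAMEBUFFER_EXT)
        == GL_FRAMEBUFFER_COMPLETE_EXT) {
    DrawScene(hDC);                            /* renders into 'texture' */
}
glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, 0);   /* back to the window */
```

No copies and no second context are needed; the texture is the render target itself.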

This is the way to go where it is supported; otherwise your pbuffer method is the alternative.