
As the subject states, does GDI support an alpha buffer for textures?

If my application runs in a Windows Remote Desktop session and I try to keep a copy of the back buffer in a texture, the alpha channel seems to be ignored.

The alpha buffer is already set up in the OpenGL context. To check that it works, I copied the back buffer to main memory (glReadPixels) and counted the pixels with the expected alpha value; the result was correct.

When I try the texture method, all pixels come out fully opaque.

Let me give you an example. I won't write out every detail, like glBindTexture etc.

I generate the texture and create an image:

glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width(), height(), 0, GL_RGBA, GL_UNSIGNED_BYTE, 0);

And here's how I save the back buffer to texture:

glCopyTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, 0, 0, width(), height());

Then I create a buffer:

GLvoid *buf = (GLvoid *)malloc(4 * width() * height() * sizeof(unsigned char));
glGetTexImage(GL_TEXTURE_2D, 0, GL_RGBA, GL_UNSIGNED_BYTE, buf);

Then I loop over buf and check the alpha values. All of them are opaque (i.e. 255).

On the other hand, if I use buf with glReadPixels:

glReadPixels(0, 0, width(), height(), GL_RGBA, GL_UNSIGNED_BYTE, buf);

Everything is fine: pixels drawn with various alpha values are read back correctly, which means the alpha buffer works.

Any ideas?

1 Answer

Copying from the main window framebuffer (front or back) is highly unreliable; you're at the mercy of the underlying graphics/windowing system, since the main window framebuffers are the windowing system's property. Don't rely on it, and make no assumptions about its contents.

If you need the result of some rendering in a texture, use an FBO (framebuffer object), which gives you well-defined framebuffer and content-retention behaviour. If you need the rendering results to show up in the main window as well, first render to the FBO and then blit to the main window framebuffer.
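A minimal render-to-texture setup along those lines might look like this. It is a sketch assuming an OpenGL 3.0+ (or ARB_framebuffer_object) context; fbo and fboTex are placeholder names and error handling is mostly omitted:

```c
GLuint fbo, fboTex;

/* Create the texture that will receive the rendering. */
glGenTextures(1, &fboTex);
glBindTexture(GL_TEXTURE_2D, fboTex);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, width(), height(), 0,
             GL_RGBA, GL_UNSIGNED_BYTE, 0);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);

/* Attach it to a framebuffer object. */
glGenFramebuffers(1, &fbo);
glBindFramebuffer(GL_FRAMEBUFFER, fbo);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                       GL_TEXTURE_2D, fboTex, 0);

if (glCheckFramebufferStatus(GL_FRAMEBUFFER) != GL_FRAMEBUFFER_COMPLETE) {
    /* handle error */
}

/* ... render the scene here; the alpha lands in fboTex ... */

/* Blit the result into the window's back buffer for display. */
glBindFramebuffer(GL_READ_FRAMEBUFFER, fbo);
glBindFramebuffer(GL_DRAW_FRAMEBUFFER, 0);
glBlitFramebuffer(0, 0, width(), height(), 0, 0, width(), height(),
                  GL_COLOR_BUFFER_BIT, GL_NEAREST);
glBindFramebuffer(GL_FRAMEBUFFER, 0);
```

With this arrangement the texture is the primary render target, so no glCopyTexSubImage2D from the window framebuffer is needed at all.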

Anything else is a road into territory that will make you pull your hair out.