As the subject states: does the GDI Generic renderer support an alpha buffer for textures?
When my application runs in a Windows Remote Desktop session and I copy the back buffer into a texture, the alpha channel appears to be ignored.
An alpha buffer is requested in the OpenGL context. To verify that it works, I copied the back buffer to main memory with glReadPixels and counted the pixels that had the expected alpha value; the result was correct.
With the texture method, however, every pixel comes back fully opaque.
Here's an example. I'll skip details such as glBindTexture.
I generate the texture and create an image:
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width(), height(), 0, GL_RGBA, GL_UNSIGNED_BYTE, 0);
And here's how I save the back buffer to texture:
glCopyTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, 0, 0, width(), height());
Then I create a buffer:
GLvoid *buf = (GLvoid *)malloc(4 * width() * height() * sizeof(unsigned char));
glGetTexImage(GL_TEXTURE_2D, 0, GL_RGBA, GL_UNSIGNED_BYTE, buf);
Then I loop over buf and check the alpha values. All of them are fully opaque (i.e. 255).
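For reference, the check I run over buf is essentially the following (count_opaque is just a name I'm using here; in my code it's an inline loop):

```c
#include <stddef.h>

/* Count pixels whose alpha byte is 255 in a tightly packed RGBA8 buffer,
 * i.e. the layout produced by GL_RGBA / GL_UNSIGNED_BYTE. */
static size_t count_opaque(const unsigned char *buf, int width, int height)
{
    size_t opaque = 0;
    for (size_t i = 0; i < (size_t)width * (size_t)height; ++i) {
        if (buf[4 * i + 3] == 255)  /* byte 3 of each RGBA pixel is alpha */
            ++opaque;
    }
    return opaque;
}
```

When buf is filled by glGetTexImage, this count always equals width() * height(); when the same buf is filled by glReadPixels, it matches the number of pixels I actually drew opaque.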
On the other hand, if I use buf with glReadPixels:
glReadPixels(0, 0, width(), height(), GL_RGBA, GL_UNSIGNED_BYTE, buf);
everything is fine: pixels drawn with various alpha values are read back correctly, which means the alpha buffer itself works.
Any ideas?