1 vote

I am having issues reading back the alpha channel from my OpenGL texture on Android. I call glReadPixels this way:

glReadPixels(0, 0, width, height, GL_RGBA, GL_UNSIGNED_BYTE, buffer);

and just as a test, I made a fragment shader looking like this:

gl_FragColor = vec4(0.5, 0.5, 0.5, 0.5);

For each pixel, I get back { 64, 64, 64, -1 }.

I've tried all sorts of things to work out why the alpha is not returned the same way as the RGB values, including this before each render:

glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT | GL_STENCIL_BUFFER_BIT);
//Tried various blend functions
glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);

But no matter what I do, it comes back -1.

Is this an Android limitation, or what am I missing?

Just as a note, I am rendering to an off-screen FBO, in case that makes a difference.

EDIT: If I do this:

int[] bits = new int[4];
glGetIntegerv(GL_RED_BITS, bits, 0);
glGetIntegerv(GL_GREEN_BITS, bits, 1);
glGetIntegerv(GL_BLUE_BITS, bits, 2);
glGetIntegerv(GL_ALPHA_BITS, bits, 3);

I get back { 8, 8, 8, 0 }.
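As an aside on the -1 above: Java's `byte` is signed, so an unsigned 255 written by `glReadPixels` with `GL_UNSIGned_BYTE` prints as -1, while 64 fits in the positive range and prints unchanged. A minimal sketch (plain Java, no GL context needed) of recovering the 0..255 value with a mask:

```java
public class UnsignedBytes {
    public static void main(String[] args) {
        // glReadPixels with GL_UNSIGNED_BYTE writes values 0..255, but a
        // Java byte is signed, so 255 stored in a ByteBuffer reads back as -1.
        byte alpha = (byte) 255;       // what the buffer actually holds
        int unsigned = alpha & 0xFF;   // mask to recover the 0..255 range

        System.out.println(alpha);     // prints -1
        System.out.println(unsigned);  // prints 255
        System.out.println((byte) 64); // prints 64: values < 128 are unaffected
    }
}
```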

This looks like you're using the NDK, or native OpenGL library. If so, please tag it as well. - m0skit0
Is buffer actually a GLubyte array? Because it's not possible for a GLubyte to have a negative number. So unless you're doing it wrong, you cannot possibly get { 64, 64, 64, -1 } back. - Nicol Bolas
@NicolBolas That's fine. They don't need to be unsigned for what I need to do. They are put in a Java ByteBuffer. - BlueVoodoo
@BlueVoodoo: No, it's not fine. How can you expect to get accurate values when you and OpenGL are in disagreement over how to interpret the data? If you are reading unsigned bytes, you need to interpret them as unsigned bytes. It's not optional. - Nicol Bolas
@NicolBolas - Yes, I understand the concern, but in my case they are turned into signed bytes via the ByteBuffer. For what I need to do, this is fine. I just need the alpha channel to be returned correctly and my problem will be solved. The rest of the code in the pipeline already works. - BlueVoodoo

2 Answers

1 vote

Alpha channels control blending, and they're usually consumed the moment they are used: the data that was in the alpha channel has been used to blend your colours with the colours already in the framebuffer. You cannot use:

glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);

and get your alpha channel back. Note also that this is why you're getting 64 rather than 128 for your RGB values: each original value (128) was blended 50/50 with the existing black, producing 64.
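The 50/50 arithmetic can be checked without any GL at all. A small sketch of the formula that `glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA)` applies, assuming a cleared (black) destination:

```java
public class BlendMath {
    // result = src * srcAlpha + dst * (1 - srcAlpha), which is the blend
    // equation selected by glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA).
    static double blend(double src, double dst, double srcAlpha) {
        return src * srcAlpha + dst * (1.0 - srcAlpha);
    }

    public static void main(String[] args) {
        double src = 0.5; // the shader's output of 0.5
        double dst = 0.0; // black framebuffer after glClear
        double out = blend(src, dst, 0.5);
        System.out.println(Math.round(out * 255)); // prints 64
    }
}
```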

0 votes

Worked out what the problem was: my renderbuffer storage was set to a 24-bit (RGB-only) format, which is why GL_ALPHA_BITS reported 0. Changing it to a 32-bit RGBA format made it work:

glRenderbufferStorage(GL_RENDERBUFFER, GLES11Ext.GL_RGBA8_OES, w, h);
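For context, here is a sketch of how the full attachment might look using the GLES 2.0 Java bindings. This assumes a current GL context and that the device exposes the OES_rgb8_rgba8 extension (which supplies `GL_RGBA8_OES`); completeness and error checks are omitted. The variable names (`fbo`, `rbo`, `w`, `h`) are illustrative:

```java
// Framebuffer configuration sketch: attach an RGBA8 renderbuffer to an
// off-screen FBO so the framebuffer actually stores 8 alpha bits.
int[] fbo = new int[1];
int[] rbo = new int[1];
GLES20.glGenFramebuffers(1, fbo, 0);
GLES20.glGenRenderbuffers(1, rbo, 0);

GLES20.glBindRenderbuffer(GLES20.GL_RENDERBUFFER, rbo[0]);
// GL_RGBA8_OES gives 8 bits per channel *including* alpha; a 24-bit RGB
// format reports GL_ALPHA_BITS == 0 and glReadPixels returns alpha = 1.0.
GLES20.glRenderbufferStorage(GLES20.GL_RENDERBUFFER,
        GLES11Ext.GL_RGBA8_OES, w, h);

GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, fbo[0]);
GLES20.glFramebufferRenderbuffer(GLES20.GL_FRAMEBUFFER,
        GLES20.GL_COLOR_ATTACHMENT0,
        GLES20.GL_RENDERBUFFER, rbo[0]);
```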