6 votes

I've been trying to figure out how glTexImage2D works and am seeing some odd results from some pretty clear-cut code. My code simply draws a rough circle into a 256*256 array of unsigned ints and then sends that data out to become a texture. However, the displayed texture comes out as variations of red and orange no matter what values I select inside my image creation loop:

unsigned* data = new unsigned[256*256];
for (int y = 0; y < 256; ++y)
    for (int x = 0; x < 256; ++x)
        if ((x - 100)*(x - 100) + (y - 156)*(y - 156) < 75*75)
            data[256*y + x] = ((156 << 24) | (256 << 16) | (156 << 8) | (200 << 0));
        else
            data[256*y + x] = 0;  // I'd expect this to be transparent and the above to be slightly transparent and green, but it's red somehow.

glBindTexture(GL_TEXTURE_2D, texid);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, 256, 256, 0, GL_RGBA, GL_UNSIGNED_BYTE, (GLvoid*)data);

OpenGL options:

glEnable(GL_TEXTURE_2D);
glShadeModel(GL_SMOOTH);
glClearColor(0.0f, 0.0f, 0.0f, 0.5f);
glClearDepth(1.0f);
glEnable(GL_DEPTH_TEST);
glDepthFunc(GL_LEQUAL);
glHint(GL_PERSPECTIVE_CORRECTION_HINT, GL_NICEST);
//glBlendFunc(GL_SRC_ALPHA, GL_ONE);
//glEnable(GL_BLEND);
//glDisable(GL_CULL_FACE);

glGenTextures(1, &leaf[0]);
createLeaf(leaf[0]);  // createLeaf(GLuint& texid) is posted entirely above

The rest of the code does nothing but display the texture on a single quad in a window. (x64 win7)

Edit: I tried Rickard's solution exactly and I'm still getting a purple circle.

2 – When you draw the texture, does it look like a circle? I'm curious whether it's just the texture's colors that are off, or whether the entire texture is coming out as garbage. – user457812

2 Answers

16 votes
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, 256, 256, 0, GL_RGBA, GL_UNSIGNED_BYTE, (GLvoid*)data);

First the positive things. You use a sized internal format (GL_RGBA8, rather than GL_RGBA). This is very good; keep doing that. You have a clear understanding of the difference between the internal format (GL_RGBA8) and the pixel transfer format (GL_RGBA). This is also good.
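As a quick reference, here is that same call with each argument labeled:

glTexImage2D(GL_TEXTURE_2D,    // target
             0,                // mipmap level
             GL_RGBA8,         // internal format: how the GPU stores the texels
             256, 256,         // width, height
             0,                // border (must be 0)
             GL_RGBA,          // pixel transfer format: channel order of your data
             GL_UNSIGNED_BYTE, // pixel transfer type: element type of your data
             (GLvoid*)data);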

The problem is this. You told OpenGL that your data was a stream of unsigned bytes. But it's not a stream of unsigned bytes; it's a stream of unsigned integers. That's how you declared data, that's how you filled data. So why are you lying to OpenGL?

To see why this matters, look at your colors. This is one of your color values:

 ((156 << 24) | (256 << 16) | (156 << 8) | (200 << 0))

First, 256 is not a valid color component: a single byte holds 0 through 255, and 256 in hex is 0x100, which takes two bytes, not one.

The unsigned integer you would get from this is:

 0x9D009CC8

If these are intended to be RGBA colors in that order, then the red is 0x9D, green is 0x00, blue is 0x9C, and alpha is 0xC8.

Now, because you're probably working on a little-endian computer, those 4 bytes are stored in reverse order in memory, like this:

 0xC89C009D

When you tell OpenGL to pretend that this is a byte array (which it is not), you lose the little-endian conversion. OpenGL reads the bytes in memory order, so the first byte it sees is 0xC8, and that becomes the red value. And so on.
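If you want to see this for yourself, here is a small standalone check (a minimal sketch, separate from the code above) that prints the packed value and the bytes as they sit in memory:

#include <cstdint>
#include <cstdio>

int main()
{
    // The question's packed color; note how 256 carries into the byte above it.
    uint32_t color = (156u << 24) | (256u << 16) | (156u << 8) | (200u << 0);
    std::printf("value: 0x%08X\n", (unsigned)color);  // prints value: 0x9D009CC8

    // Dump the bytes in memory order. On a little-endian machine this
    // prints C8 9C 00 9D, so reading byte-by-byte starts at 0xC8.
    const unsigned char* bytes = reinterpret_cast<const unsigned char*>(&color);
    for (int i = 0; i < 4; ++i)
        std::printf("%02X ", bytes[i]);
    std::printf("\n");
}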

You need to tell OpenGL what you're actually doing: you're storing four 8-bit unsigned values in a single unsigned 32-bit integer. To do this, use the following:

glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, 256, 256, 0, GL_RGBA, GL_UNSIGNED_INT_8_8_8_8, (GLvoid*)data);

The GL_UNSIGNED_INT_8_8_8_8 says that you're feeding OpenGL an array of unsigned 32-bit integers (which you are). The most significant 8 bits of each integer are red, the next 8 are green, then blue, and the least significant 8 bits are alpha.

So, to completely fix your code, you need this:

GLuint* data = new GLuint[256*256]; //Use OpenGL's types
for (int y = 0; y < 256; ++y)
    for (int x = 0; x < 256; ++x)
        if ((x - 100)*(x - 100) + (y - 156)*(y - 156) < 75*75)
            data[256*y + x] = ((0x9C << 24) | (0xFF << 16) | (0x9C << 8) | (0xC8 << 0)); // R=0x9C, G=0xFF (256 is not a valid byte), B=0x9C, A=0xC8
        else
            data[256*y + x] = 0;  // All channels zero: black with zero alpha.

glBindTexture(GL_TEXTURE_2D, texid);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_BASE_LEVEL, 0);  //Always set the base and max mipmap levels of a texture.
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAX_LEVEL, 0);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, 256, 256, 0, GL_RGBA, GL_UNSIGNED_INT_8_8_8_8, (GLvoid*)data);

// I'd expect this to be transparent and the above to be slightly transparent and green, but it's red somehow.

Alpha doesn't mean transparent; it means nothing at all until you give it a meaning. Alpha only represents transparency if you enable blending and set up a blend mode in which a low alpha makes things transparent. Otherwise it is just a fourth channel.
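For example, to make alpha behave as transparency, you would enable blending with the standard over blend function (a minimal sketch; the question has a blend setup commented out, but with GL_ONE where GL_ONE_MINUS_SRC_ALPHA is wanted):

glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
// Now a fragment with alpha 0xC8 (200/255, about 78%) is mixed as
// 78% source color + 22% of whatever is already in the framebuffer.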

2 votes

If I were to do the same thing as you, I would use an array of unsigned chars with 4 times the length instead of an array of unsigned ints.

unsigned char* data = new unsigned char[256*256*4];
for (int y = 0; y < 256; ++y)
    for (int x = 0; x < 256; ++x)
        if ((x - 100)*(x - 100) + (y - 156)*(y - 156) < 75*75){
            data[(256*y + x)*4+0] = 156;
            data[(256*y + x)*4+1] = 256;  // Note: 256 overflows an unsigned char to 0, so this pixel ends up (156, 0, 156, 200) – purple.
            data[(256*y + x)*4+2] = 156;
            data[(256*y + x)*4+3] = 200;
        }else{
            data[(256*y + x)*4+0] = 0;
            data[(256*y + x)*4+1] = 0;
            data[(256*y + x)*4+2] = 0;
            data[(256*y + x)*4+3] = 0;
        }

glBindTexture(GL_TEXTURE_2D, texid);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, 256, 256, 0, GL_RGBA, GL_UNSIGNED_BYTE, (GLvoid*)data);

But your code looks right to me and I'm not sure the code above will change anything. If it gives the same result, then try changing GL_RGBA8 to just GL_RGBA. Also, what is the variable type of texid? I always call glBindTexture with a GLuint by reference (&texid), but if your texid is a pointer to a GLuint (GLuint *texid;) then I guess that part is ok. (Edit: Just realized I'm wrong on the last part; I was thinking about glGenTextures, not glBindTexture.)
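For reference, a minimal sketch of the usual pattern: glGenTextures takes a pointer because it writes the newly generated name into your variable, while glBindTexture takes the name by value:

GLuint texid = 0;
glGenTextures(1, &texid);             // writes a new texture name into texid
glBindTexture(GL_TEXTURE_2D, texid);  // binds by value; no pointer needed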