8
votes

I'm trying to figure out texture mapping in OpenGL and I can't get a simple example to work.

The polygon is being drawn, but it's not textured, just a solid color. The bitmap is being loaded correctly into sprite1[], as I was successfully using glDrawPixels with it up until now.

I use glGenTextures to get my texture name, but I notice it doesn't change texName1; the GLuint stays at whatever I initialized it to, even after the call to glGenTextures...

I have enabled GL_TEXTURE_2D.

Here's the code:

GLuint texName1 = 0;

glGenTextures(1, &texName1);
glBindTexture(GL_TEXTURE_2D, texName1);
glPixelStorei(GL_UNPACK_ALIGNMENT, 1);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_REPEAT);
glTexParameteri (GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_REPEAT);
glTexParameteri (GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameteri (GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexEnvf(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE);
glTexImage2D(GL_TEXTURE_2D, 0, GL_BGRA_EXT, sprite1[18], sprite1[22], 0, GL_BGRA_EXT, GL_UNSIGNED_BYTE, &sprite1[54]);

glColor3f(1, 1, 0);
glBindTexture(GL_TEXTURE_2D, texName1);
glBegin(GL_QUADS);
    glTexCoord2f (0.0, 0.0);
    glVertex3f (0.0, 0.0, -5.0f);
    glTexCoord2f (1.0, 0.0);
    glVertex3f (.5, 0.0, -5.0f);
    glTexCoord2f (1.0, 1.0);
    glVertex3f (.5, .5, -5.0f);
    glTexCoord2f (0.0, 1.0);
    glVertex3f (0.0, .5, -5.0f);
glEnd();

UPDATE: I'm at a loss. Here's everything I've tried:

  1. Turns out I was initializing my texture before OpenGL was initialized. The texture is created (glGenTextures->glTexImage2D) in a class constructor and drawn (glBegin->glEnd) in a member function that is called every frame. glGenTextures appears to be working correctly now and I'm getting a name of 1.

  2. Every possible combination of GL_RGBA8, GL_BGRA_EXT (GL_BGRA doesn't work on my system; I need the _EXT), and I even removed the alpha channel from the bitmap and tried all combinations of GL_RGB, GL_BGR_EXT, etc. No luck.

  3. Tried procedurally creating a bitmap and using that instead.

  4. Made sure GL_COLOR_MATERIAL isn't enabled.

  5. Changed bitmap size to 32x32.

  6. Tried glTexEnvi instead of glTexEnvf.

If texName1 is remaining 0 after glGenTextures you should check your error conditions... – Jason Coco
Is the width & height stored as a byte or an int in sprite1[]? – Maurice Gilden
Width and height are stored as bytes, little endian. I've used sprite1[18] and sprite1[22] successfully before with glDrawPixels; they are correct. The bitmap is only 29x20 pixels, so it works. – Tony R
Also, how do I check error conditions? From the reference pages I don't see that glGenTextures generates any errors that would be useful to me via glGetError(). – Tony R

7 Answers

7
votes

In addition to mentat's note that you might have a problem with non-power-of-two texture dimensions, you mention that the texture name generation isn't changing the name.

That sounds as if you're calling glGenTextures() too early, i.e. before initializing OpenGL. If you're not, then I suggest adding code just after the call to glGenTextures() that checks the OpenGL error state by calling glGetError().
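
For example, a minimal sketch of such a check (assuming <GL/gl.h> and <stdio.h> are already included):

GLuint texName1 = 0;
glGenTextures(1, &texName1);

GLenum err = glGetError();
if (err != GL_NO_ERROR) {
    /* the call failed; err holds a GL_* error code such as GL_INVALID_OPERATION */
    fprintf(stderr, "glGenTextures error: 0x%x\n", err);
}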

4
votes

In your comments, you say your bitmap is 29x20 pixels. AFAIK, to generate a valid texture, OpenGL requires that the image size in each dimension be a power of 2. It doesn't need to be square; a rectangle is fine. You can overcome this by using OpenGL extensions such as GL_ARB_texture_rectangle.
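
As a rough illustration (this helper is hypothetical, not from the question's code), you could round each dimension up to the next power of two and pad the bitmap to that size before calling glTexImage2D:

/* round a dimension up to the next power of two */
static unsigned int next_pow2(unsigned int v)
{
    unsigned int p = 1;
    while (p < v)
        p <<= 1;
    return p;
}

/* next_pow2(29) == 32 and next_pow2(20) == 32, so the 29x20 bitmap
   would be padded into a 32x32 texture */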

2
votes

I'll put this here as I had the same issue and found another post explaining it. The iPhone does support GL_BGRA (GL_EXT_BGRA), but seemingly only as an input format and not as an internal format. So if you change the glTexImage2D call to use an internal format of GL_RGBA, it works.

glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0, GL_BGRA, GL_UNSIGNED_BYTE, &sprite1[54]);

I hope this helps someone else that stumbles upon this post.

1
votes

Some random ideas:

  • GL_COLOR_MATERIAL might be enabled
  • change "glTexEnvf" to "glTexEnvi" and see if that helps
  • if texName1 is 0 after glGenTextures you might not have an active OpenGL context

For error checking I recommend writing a small function that prints readable output for the most common results of glGetError() and using it to find the line that creates the error. Another possibility would be to use something like GLIntercept, BuGLe or gDEBugger.
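
A minimal sketch of such a helper (the function name and message format are just placeholders; assumes <GL/gl.h> and <stdio.h> are included):

static void check_gl_error(const char *where)
{
    GLenum err = glGetError();
    const char *name;
    switch (err) {
        case GL_NO_ERROR:          return;
        case GL_INVALID_ENUM:      name = "GL_INVALID_ENUM"; break;
        case GL_INVALID_VALUE:     name = "GL_INVALID_VALUE"; break;
        case GL_INVALID_OPERATION: name = "GL_INVALID_OPERATION"; break;
        case GL_OUT_OF_MEMORY:     name = "GL_OUT_OF_MEMORY"; break;
        default:                   name = "unrecognized GL error"; break;
    }
    fprintf(stderr, "%s: %s\n", where, name);
}

/* usage: check_gl_error("after glTexImage2D"); */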

1
votes

My OpenGL is rusty, but I remember having the same problems with glTexImage2D. Finally I managed to make it work, but I always had more luck with gluBuild2DMipmaps, so I ended up with

gluBuild2DMipmaps (
  GL_TEXTURE_2D, type, i.width, i.height, type, GL_UNSIGNED_BYTE, i.data
);

which replaced

glTexImage2D (
  GL_TEXTURE_2D, 0, type, i.width, i.height, 0, type, GL_UNSIGNED_BYTE, i.data 
);
1
votes

I found the problem. My call to glEnable was glEnable(GL_BLEND | GL_TEXTURE_2D). Using glGetError I saw I was getting GL_INVALID_ENUM for this call, so I moved GL_TEXTURE_2D to its own glEnable call and bingo. I guess combining flags with bitwise OR isn't allowed for glEnable?
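
For reference, a sketch of the corrected calls: glEnable takes a single capability enum, not a bitmask, so OR-ing two enums together produces a value that is generally not a valid capability (hence the GL_INVALID_ENUM).

glEnable(GL_TEXTURE_2D);  /* each capability gets its own call */
glEnable(GL_BLEND);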

0
votes

The first thing I'd check is the colour material setting, as mentioned by ShadowIce, then check your texture file to ensure it's a reasonable size (e.g. something like 256x256) and an RGB bitmap. If the file has even a slight problem it WILL NOT render correctly, no matter what you try.

Then, I'd stop trying to debug just that code and instead see what you're doing differently from the tutorial on the NeHe website.

NeHe is always a good place to check if you're trying to do stuff in OpenGL. Textures are probably the hardest thing to get right, and they only get more difficult as the rest of your GL skills increase.