1 vote

I'm writing a 2D game using SDL and OpenGL in the D programming language. At the moment it simply tries to render a texture-mapped quad to the screen. The problem is that the texture mapping doesn't work at all: although the texture apparently loads fine (it gets assigned a nonzero texture name, and glGetError never returns anything other than zero), the quad is rendered in the last color set with glColor, ignoring the texture entirely.

I've looked for common reasons for texture mapping to fail, including this question, to no avail. The image file being loaded is 64x64, a valid power-of-2 size.

Please don't get scared off because this is in D—it's almost entirely C-style SDL and OpenGL calls.

SDL initialization code:

if (SDL_GL_SetAttribute(SDL_GL_DOUBLEBUFFER, 1) == -1 ||
    SDL_GL_SetAttribute(SDL_GL_DEPTH_SIZE, 0) == -1)
    throw new Exception("An OpenGL attribute could not be set!");

uint videoFlags = SDL_OPENGL | SDL_HWSURFACE | SDL_ANYFORMAT;

if (threadsPerCPU() > 1)
    videoFlags |= SDL_ASYNCBLIT;

SDL_Surface* screen = SDL_SetVideoMode(800, 600, 32, videoFlags);

if (screen == null)
    throw new Exception("SDL_SetVideoMode failed!");

OpenGL initialization code:

glMatrixMode(GL_PROJECTION);
glLoadIdentity();
glOrtho(0., 800, 600, 0., 0., 1.);
glMatrixMode(GL_MODELVIEW);

glDisable(GL_DEPTH_TEST);
glDisable(GL_LIGHTING);

glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);

glEnable(GL_TEXTURE_2D);

glClearColor(0.f, 0.f, 0.f, 0.f);

Texture loading code:

SDL_Surface* s = IMG_Load(toStringz("hello.png"));

if (s == null)
    throw new Exception("Image file could not be loaded!");

uint texFormat;

switch (s.format.BytesPerPixel)
{
case 4:
    texFormat = (s.format.Rmask == 0x000000ff ? GL_RGBA : GL_BGRA);
    break;
case 3:
    texFormat = (s.format.Rmask == 0x000000ff ? GL_RGB : GL_BGR);
    break;
default:
    throw new Exception("Bad pixel format!");
}

if ((s.w & (s.w - 1)) != 0)
    throw new Exception("Width must be a power of 2!");
if ((s.h & (s.h - 1)) != 0)
    throw new Exception("Height must be a power of 2!");

uint glName;

glGenTextures(1, &glName);

glBindTexture(GL_TEXTURE_2D, glName);
glPixelStorei(GL_UNPACK_ALIGNMENT, 1);
glTexImage2D(GL_TEXTURE_2D, 0, s.format.BytesPerPixel, s.w, s.h, 0,
    texFormat, GL_UNSIGNED_BYTE, s.pixels);

SDL_FreeSurface(s);

Rendering code:

glClear(GL_COLOR_BUFFER_BIT);

glBegin(GL_QUADS);
    glColor4ub(255, 255, 255, 255);
    glBindTexture(GL_TEXTURE_2D, glName);
    glTexCoord2i(0, 0); glVertex2i(0, 0);
    glTexCoord2i(1, 0); glVertex2i(64, 0);
    glTexCoord2i(1, 1); glVertex2i(64, 64);
    glTexCoord2i(0, 1); glVertex2i(0, 64);
glEnd();

SDL_GL_SwapBuffers();

My program uses Derelict2, a library that provides SDL and OpenGL bindings for D.

Any ideas on exactly what is going awry here?

Edit: For future reference, here's the solution to my problem:

http://www.opengl.org/wiki/Common_Mistakes#Creating_a_Texture

I needed to set a couple of texture parameters prior to calling glTexImage2D, or else my texture was technically incomplete. Adding the following four lines just before that call did the trick:

glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_REPEAT);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_REPEAT);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
The 3rd parameter of glTexImage2D, "internalFormat", IS NOT bytes per pixel. Historically the values 1, 2, 3, 4 were accepted by OpenGL 1.0. Since OpenGL 1.1 the parameter only accepts well-defined tokens, like GL_RGBA, GL_RGB, etc., whose numeric values have no relation to the actual format. – datenwolf
Please forget about the concept of "OpenGL initialization". Yes, you load textures and such, but in any real-world application this is not a one-time thing; it happens on demand. All the rendering state, i.e. projection, clear color, viewport, etc., should be set right before it's needed in the rendering procedure. Everything you've placed in the "OpenGL initialization code" section belongs at the start of the "Rendering code" section. – datenwolf
@datenwolf: Points taken. Regarding your first comment, the docs I was using must have been outdated, so I'll update the call to use one of the OpenGL enum values instead. Regarding the second comment, I think it's appropriate in my particular app to lump things like the projection, viewport, etc. into the init code, but only because my particular app only ever uses one perspective of the scene (just a static 2D view so I can do hardware-accelerated 2D drawing). I definitely see your point though. – jgottula
Perhaps I will split out those portions of the so-called "init" code into a separate function for added clarity, and also since I now realize that resizing of the video output will likely require changing the viewport after init time. – jgottula

4 Answers

7 votes

One common mistake is to create/use an incomplete texture, see here.

Your code does not show any call to:

glTexParameteri (GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR/*or GL_NEAREST*/);
glTexParameteri (GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR/*or GL_NEAREST*/);

and also does not show any mipmap creation.

Your problem may be related to your texture being incomplete.

1 vote

Just a vague theory here: have you tried drawing the vertices counter-clockwise instead of clockwise?

glTexCoord2i(0, 0); glVertex2i(0, 0);
glTexCoord2i(0, 1); glVertex2i(0, 64);
glTexCoord2i(1, 1); glVertex2i(64, 64);
glTexCoord2i(1, 0); glVertex2i(64, 0);
0 votes

Don't you need a glOrtho(0., 800, 600, 0., 0., 1.); in the paintGL function?

0 votes

Try enabling GL_COLOR_MATERIAL? I haven't used the fixed-functionality pipeline in a while, but I think that might do it.