There is SDL_WM_ToggleFullScreen. However, on Mac, its implementation destroys the OpenGL context, which destroys your textures along with it. OK, annoying, but I can reload my textures. However, when I reload them after toggling, it crashes on certain textures inside a memcpy called from glTexImage2D. Huh, it sure didn't crash when I loaded those same textures the first time around. I even try deleting all my textures before the toggle, but I get the same result.

As a test, I reload the textures without toggling fullscreen, and they reload fine. So toggling fullscreen does something funny to OpenGL. (And by "toggling" I mean going in either direction, windowed->fullscreen or fullscreen->windowed - I get the same result both ways.)
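
For reference, the sequence I'm describing looks roughly like this (deleteAllTextures and reloadAllTextures are stand-ins for my own texture bookkeeping, not SDL or GL calls):

SDL_Surface *screen = SDL_GetVideoSurface();

deleteAllTextures();              // glDeleteTextures on everything I own
SDL_WM_ToggleFullScreen(screen);  // on Mac, this tears down the GL context
reloadAllTextures();              // crashes in memcpy inside glTexImage2D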

As an alternative, this code seems to toggle fullscreen as well:

SDL_Surface *surface = SDL_GetVideoSurface();
Uint32 flags = surface->flags;
flags ^= SDL_FULLSCREEN;   // flip the fullscreen bit
SDL_SetVideoMode(surface->w, surface->h, surface->format->BitsPerPixel, flags);

However, calling SDL_GetError after this code shows that an "Invalid window" error was set during SDL_SetVideoMode. And if I ignore it and try to load textures, there is no crash, but my textures don't show up either (perhaps OpenGL is immediately returning from glTexImage2D without actually doing anything). As a test, at startup time, I call SDL_SetVideoMode twice in a row to perform a toggle right before I load my textures the first time. In that particular case, the textures do actually show up. Huh, what?
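
That startup-time experiment looks roughly like this (the double call is the point; the error check is added here for clarity):

SDL_Surface *surface = SDL_GetVideoSurface();
int w = surface->w, h = surface->h, bpp = surface->format->BitsPerPixel;
Uint32 flags = surface->flags;   // cache these; SDL_SetVideoMode replaces the surface

SDL_ClearError();                // so any error seen below comes from the toggle

// Toggle to the opposite mode, then straight back, before any glTexImage2D.
SDL_SetVideoMode(w, h, bpp, flags ^ SDL_FULLSCREEN);
SDL_SetVideoMode(w, h, bpp, flags);

if (SDL_GetError()[0] != '\0')
    printf("SDL error: %s\n", SDL_GetError());

// Textures loaded after this point actually show up.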

I am using SDL 1.3.0-6176 (posted on the SDL website January 7, 2012).

Update:

My texture uploading code is below (nothing surprising here). For clarification, this application (including the code below) is already working, without any issues, as a finished application on iOS, Android, PSP, and Windows. Windows is the only version besides Mac that uses SDL under the hood.

unsigned int Texture::init(const void *data, int width, int height)
{
    unsigned int textureID;
    glGenTextures(1, &textureID);               // allocate a texture name
    glBindTexture(GL_TEXTURE_2D, textureID);    // make it the current 2D texture
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, data);  // upload 32-bit RGBA pixels
    return textureID;
}
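
For completeness, a typical call site looks like this (hypothetical; it assumes init is a static member and uses throwaway pixel data, a 2x2 opaque white square):

unsigned char pixels[2 * 2 * 4];
memset(pixels, 0xFF, sizeof(pixels));           // illustrative data only
unsigned int id = Texture::init(pixels, 2, 2);  // id is a GL texture name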

1 Answer

it crashes on certain textures inside of a memcpy being called by glTexImage2D.

This shouldn't happen. There's a bug somewhere, either in your code or in the OpenGL implementation, and I'd say it's in yours. So please post your texture allocation and upload code for us to see.

However, on Mac, its implementation destroys the OpenGL context, which destroys your textures along with it.

Yes, unfortunately this is the way it works on Mac OS X. And due to the design of its OpenGL driver model, there's little you can do about it (did I mention that Mac OS X sucks? Oh yes, I did. On occasion. Like over a hundred times). The same badly designed driver model is also why Mac OS X is so slow in catching up with OpenGL development.
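
If you have to live with it, treat every mode switch as a full context loss. A minimal sketch, assuming hypothetical releaseAllTextures/recreateAllTextures hooks in your engine:

void toggleFullscreen(SDL_Surface *screen)
{
    // The handles die with the old context; drop them first so nothing
    // tries to use a stale texture name afterwards.
    releaseAllTextures();

    SDL_WM_ToggleFullScreen(screen);

    // The fresh context starts with default state: re-apply global state,
    // then re-upload every texture from its source data.
    glEnable(GL_TEXTURE_2D);
    recreateAllTextures();
}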