0 votes

I've been trying to start a new SDL + GLEW + OpenGL project, and the setup has been difficult (MinGW-w64-32 on Windows 8 64-bit with an Intel i7-4700MQ CPU and NVidia GTX 745M GPU).

If I set the GL context-creation attributes to request OpenGL 4.2, the color and depth bit sizes get reported as 0. However, if I request a 2.1 context (which is also the default), I get the requested bit depths (8 bits per color channel, 24 bits for depth). In either case, though, glClearColor has no effect (the background stays black).

In both cases, a few glGetString calls report the same thing: a 4.2 context, which suggests SDL's reported attribute values are far from correct.
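For reference, here is a sketch of the kind of checks described above; the helper name `reportContext` and the header paths are just illustrative, and the actual project code may differ:

#include <SDL.h>
#include <GL/glew.h>
#include <iostream>

// Assumes a current GL context; glGetString is core GL 1.1, so it works
// even before any extension loading.
void reportContext() {
    // What the driver actually gave us:
    std::cout << "GL_VERSION:  " << reinterpret_cast<const char*>(glGetString(GL_VERSION))  << '\n';
    std::cout << "GL_VENDOR:   " << reinterpret_cast<const char*>(glGetString(GL_VENDOR))   << '\n';
    std::cout << "GL_RENDERER: " << reinterpret_cast<const char*>(glGetString(GL_RENDERER)) << '\n';

    // What SDL reports for the framebuffer attributes:
    int red = 0, depth = 0;
    SDL_GL_GetAttribute(SDL_GL_RED_SIZE, &red);
    SDL_GL_GetAttribute(SDL_GL_DEPTH_SIZE, &depth);
    std::cout << "SDL_GL_RED_SIZE: " << red << ", SDL_GL_DEPTH_SIZE: " << depth << '\n';
}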

The entirety of the code can be found here; it's mostly boilerplate for a larger project at this point. The most relevant sections are:

if(SDL_Init(SDL_INIT_EVERYTHING) != 0) {
    std::cerr << "Error initializing SDL.\n";
    exit(1);
}

SDL_GL_SetAttribute(SDL_GL_ACCELERATED_VISUAL, 1);
SDL_GL_SetAttribute(SDL_GL_CONTEXT_MAJOR_VERSION, 2);
SDL_GL_SetAttribute(SDL_GL_CONTEXT_MINOR_VERSION, 1);
//SDL_GL_SetAttribute(SDL_GL_CONTEXT_FLAGS, SDL_GL_CONTEXT_FORWARD_COMPATIBLE_FLAG);
SDL_GL_SetAttribute(SDL_GL_RED_SIZE, 8);
SDL_GL_SetAttribute(SDL_GL_GREEN_SIZE, 8);
SDL_GL_SetAttribute(SDL_GL_BLUE_SIZE, 8);
SDL_GL_SetAttribute(SDL_GL_ALPHA_SIZE, 8);
SDL_GL_SetAttribute(SDL_GL_DEPTH_SIZE, 24);
SDL_GL_SetAttribute(SDL_GL_DOUBLEBUFFER, 1);

SDL_Window* window = SDL_CreateWindow("RenderSystem", SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED, 640, 480, SDL_WINDOW_OPENGL|SDL_WINDOW_RESIZABLE);

if(!window) {
    std::cerr << "Window creation failed.\n";
    SDL_Quit();
    exit(2);
}

SDL_GLContext context = SDL_GL_CreateContext(window);

if(!context) {
    std::cerr << "OpenGL Context creation failed.\n";
    SDL_DestroyWindow(window);
    SDL_Quit();
    exit(3);
}

SDL_GL_MakeCurrent(window, context);

and

glClearColor(1.0f, 1.0f, 1.0f, 1.0f);

SDL_Event evt;
bool run = true;

while(run) {

    SDL_PollEvent(&evt);

    switch(evt.type) {
        case SDL_KEYDOWN:
            if(evt.key.keysym.sym == SDLK_ESCAPE) {
                run = false;
            }
            break;

        case SDL_QUIT:
            run = false;
            break;
    }

    SDL_GL_SwapWindow(window);

}
How are you determining said values? In OpenGL 3.0, it is not valid to query GL_RED_BITS, GL_DEPTH_BITS, etc. on the default framebuffer. A compatibility profile should deliver consistent results, but core profiles have different behavior. – Andon M. Coleman
You don't seem to call glClear() anywhere. glClearColor() just latches some state; it doesn't actually clear any buffers. – genpfault
@AndonM.Coleman I'm using SDL's built-in function SDL_GL_GetAttribute, which returns the values that SDL uses to create the context, not the actual context attributes (as I understand it). – Cave Dweller
@genpfault I was worried more about the 0 values from the SDL_GL_GetAttribute calls and completely skipped over that. Adding glClear() does solve the issue for the 2.1 context request, which lets me set the bit depths, and oddly it also solves it for the 4.2 context, even though SDL still reports zero-bit buffer depths. – Cave Dweller
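For completeness, here is a minimal sketch of the render loop with the glClear() call genpfault suggested, assuming the window and context setup from the question (the event poll is also wrapped in a loop to drain the queue):

glClearColor(1.0f, 1.0f, 1.0f, 1.0f);

SDL_Event evt;
bool run = true;

while(run) {
    // Drain all pending events each frame.
    while(SDL_PollEvent(&evt)) {
        switch(evt.type) {
            case SDL_KEYDOWN:
                if(evt.key.keysym.sym == SDLK_ESCAPE) {
                    run = false;
                }
                break;

            case SDL_QUIT:
                run = false;
                break;
        }
    }

    // glClearColor only sets the clear value; this call actually clears the buffers.
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

    SDL_GL_SwapWindow(window);
}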

2 Answers

2 votes

I'm using SDL's built-in function SDL_GL_GetAttribute which returns the values that SDL uses to create the context, not the actual context attributes (as I understand it).

This is incorrect. I took a look at the SDL implementation of SDL_GL_GetAttribute (...) (see src/video/SDL_video.c), and it does what I described in my comment: it queries the values from the GL context itself, and you cannot query them on a core profile context because they are not defined for the default framebuffer.

Here is where the problem comes from:

int
SDL_GL_GetAttribute(SDL_GLattr attr, int *value)
{
  // ...

  switch (attr) {
  case SDL_GL_RED_SIZE:
    attrib = GL_RED_BITS;
    break;
  case SDL_GL_BLUE_SIZE:
    attrib = GL_BLUE_BITS;
    break;
  case SDL_GL_GREEN_SIZE:
    attrib = GL_GREEN_BITS;
    break;
  case SDL_GL_ALPHA_SIZE:
    attrib = GL_ALPHA_BITS;
    break;
  }

  // ...

  glGetIntegervFunc(attrib, (GLint *) value);
  error = glGetErrorFunc();
}

That code actually generates a GL_INVALID_ENUM error on a core profile, and the return value of SDL_GL_GetAttribute (...) should be non-zero as a result.
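As a side note, a caller can detect this from the return value; a small sketch, assuming SDL2's documented behavior of returning a negative value and setting an error string on failure:

int red_bits = 0;
if (SDL_GL_GetAttribute(SDL_GL_RED_SIZE, &red_bits) != 0) {
    // On a core profile the underlying glGetIntegerv(GL_RED_BITS, ...) fails,
    // so SDL reports an error here instead of a meaningful size.
    std::cerr << "SDL_GL_GetAttribute failed: " << SDL_GetError() << '\n';
}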

If you need meaningful bit-depth values from SDL_GL_GetAttribute (...), then you must use a compatibility profile. SDL2 does not extract this information from the pixel format it selected (smarter frameworks like GLFW do), but naively tries to query it from GL.
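If a core profile is required anyway, the default framebuffer's bit depths can still be queried in a core-profile-legal way with glGetFramebufferAttachmentParameteriv (GL 3.0+). SDL does not do this for you, so this is just a sketch of the manual approach:

// Query the default framebuffer's back buffer and depth buffer sizes.
// Valid in core profiles, unlike GL_RED_BITS / GL_DEPTH_BITS.
GLint red_size = 0, depth_size = 0;
glBindFramebuffer(GL_FRAMEBUFFER, 0);
glGetFramebufferAttachmentParameteriv(GL_FRAMEBUFFER, GL_BACK_LEFT,
                                      GL_FRAMEBUFFER_ATTACHMENT_RED_SIZE, &red_size);
glGetFramebufferAttachmentParameteriv(GL_FRAMEBUFFER, GL_DEPTH,
                                      GL_FRAMEBUFFER_ATTACHMENT_DEPTH_SIZE, &depth_size);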

-3 votes

If memory serves, SDL_GL_DEPTH_SIZE has to be the sum of all the color channel sizes:

Using four color channels:

SDL_GL_SetAttribute (SDL_GL_DEPTH_SIZE, 32);

If you were using three color channels, it would be:

SDL_GL_SetAttribute (SDL_GL_DEPTH_SIZE, 24);

I've had problems with this before, so it might be the cause here. Sorry for my English.