1 vote

I am using SDL2 for OS interfacing and GLEW for OpenGL function loading. Initially I had my SDL_GL_SetAttribute() calls in between window creation and context creation. I noticed that my SDL_GL_SetAttribute() calls had little effect, so I re-read the documentation, which states: "The requested attributes should be set before creating an OpenGL window". If I do that, I only manage to get an OpenGL 1.1 context (the one Microsoft ships). If I don't call SDL_GL_SetAttribute() at all, I get a 4.4 context with no issue.

One strange thing I've noted is my calls to SDL_GL_GetAttribute() after I create the context seem to return inaccurate values. If I don't set any attributes I get:

SDL GL Context Major Version: 2
SDL GL Context Minor Version: 1
SDL GL Context Profile: Unknown
SDL GL Accelerated Visuals: True
SDL GL Double Buffering: True
SDL GL Depth Size: 24
----------------------------------------------------------------
Graphics Successfully Initialized
OpenGL Info
    Version: 4.4.0
     Vendor: NVIDIA Corporation
   Renderer: GeForce GT 650M/PCIe/SSE2
    Shading: 4.40 NVIDIA via Cg compiler
----------------------------------------------------------------

If I do set all the attributes I get:

SDL GL Context Major Version: 3
SDL GL Context Minor Version: 1
SDL GL Context Profile: Compatibility
SDL GL Accelerated Visuals: True
SDL GL Double Buffering: True
SDL GL Depth Size: 32
----------------------------------------------------------------
Graphics Successfully Initialized
OpenGL Info
    Version: 1.1.0
     Vendor: Microsoft Corporation
   Renderer: GDI Generic
    Shading: (null)
----------------------------------------------------------------

Note how in the first case, SDL says I have a 2.1 context, while I actually get a 4.4. In the second case it says I have a 3.1 context (what I requested), when I actually have a 1.1.

Here is my current setup code (which follows what is outlined in this tutorial); error checking is omitted for brevity.

SDL

SDL_Init(SDL_INIT_VIDEO);

// Request compatibility because GLEW doesn't play well with core contexts.
SDL_GL_SetAttribute(SDL_GL_CONTEXT_PROFILE_MASK, SDL_GL_CONTEXT_PROFILE_COMPATIBILITY);
SDL_GL_SetAttribute(SDL_GL_CONTEXT_MAJOR_VERSION, 3); 
SDL_GL_SetAttribute(SDL_GL_CONTEXT_MINOR_VERSION, 1); 
SDL_GL_SetAttribute(SDL_GL_ACCELERATED_VISUAL, 1);
SDL_GL_SetAttribute(SDL_GL_DOUBLEBUFFER, 1);
SDL_GL_SetAttribute(SDL_GL_DEPTH_SIZE, 32); 

SDL_Window * window = SDL_CreateWindow("Title", SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED, 800, 600, SDL_WINDOW_OPENGL);

SDL_GLContext context = SDL_GL_CreateContext(window);

SDL_GL_MakeCurrent(window, context);

// Print what the OS actually gave us for each value we requested.
SDL_GL_GetAttribute(/* ... */);
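
Expanded, that attribute query looks roughly like this (a sketch, not my exact printing code; SDL_GL_GetAttribute() writes the current value into an int out-parameter):

int major = 0, minor = 0, profile = 0, accelerated = 0, doubleBuffered = 0, depthSize = 0;
SDL_GL_GetAttribute(SDL_GL_CONTEXT_MAJOR_VERSION, &major);
SDL_GL_GetAttribute(SDL_GL_CONTEXT_MINOR_VERSION, &minor);
SDL_GL_GetAttribute(SDL_GL_CONTEXT_PROFILE_MASK, &profile);
SDL_GL_GetAttribute(SDL_GL_ACCELERATED_VISUAL, &accelerated);
SDL_GL_GetAttribute(SDL_GL_DOUBLEBUFFER, &doubleBuffered);
SDL_GL_GetAttribute(SDL_GL_DEPTH_SIZE, &depthSize);

printf("SDL GL Context Major Version: %d\n", major);
printf("SDL GL Context Minor Version: %d\n", minor);
// The profile attribute is a bitmask of SDL_GL_CONTEXT_PROFILE_{CORE,COMPATIBILITY,ES}.
printf("SDL GL Context Profile: %s\n",
       profile == SDL_GL_CONTEXT_PROFILE_CORE          ? "Core" :
       profile == SDL_GL_CONTEXT_PROFILE_COMPATIBILITY ? "Compatibility" :
       profile == SDL_GL_CONTEXT_PROFILE_ES            ? "ES" : "Unknown");
printf("SDL GL Accelerated Visuals: %s\n", accelerated ? "True" : "False");
printf("SDL GL Double Buffering: %s\n", doubleBuffered ? "True" : "False");
printf("SDL GL Depth Size: %d\n", depthSize);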

Directly followed by GLEW

glewExperimental = GL_TRUE;
GLenum result = glewInit();
if(result != GLEW_OK)
{
  // Handle & print error
}

if(GLEW_VERSION_1_1)
{
  printf("----------------------------------------------------------------\n");
  printf("Graphics Successfully Initialized\n");
  printf("OpenGL Info\n");
  printf("    Version: %s\n", glGetString(GL_VERSION));
  printf("     Vendor: %s\n", glGetString(GL_VENDOR));
  printf("   Renderer: %s\n", glGetString(GL_RENDERER));
  printf("    Shading: %s\n", glGetString(GL_SHADING_LANGUAGE_VERSION));
  printf("----------------------------------------------------------------\n");
}
else
{
  printf("Unable to get any OpenGL version from GLEW!");
}

if(!GLEW_VERSION_3_1)
{
  // Handle & print error
}
You try to get a 32-bit depth buffer, while by default a 24-bit one is selected. You might get that fallback to the MS renderer for that reason. Another thing is that you request a 3.1 compatibility profile context, which does not exist at all - profiles were introduced in 3.2. I would recommend testing with GL 3.2, or setting the profile mask to 0 for anything <= 3.1. – derhass
Even GPUs that support 32-bit depth buffers generally will not expose those pixel formats for use with the default framebuffer on Microsoft Windows. If you really need a 32-bit depth buffer, the most portable way to go about this is to use an FBO. – Andon M. Coleman
@derhass I completely forgot that core/compatibility are a 3.2+ construct, thanks! It does appear as though SDL2 is OK with requesting one regardless. It was the depth buffer request. – Peter Clark
@AndonM.Coleman That was it! Thanks for responding with the detail you did; that actually helps fill in the blanks on an issue I am having with my integrated Intel card. I am getting a 16-bit depth buffer, but have an FBO for my deferred renderer that has a 32-bit depth buffer. I had thought that if it wasn't supported, my FBO shouldn't be able to get one. Feel free to post your comment as an answer and I'll accept it. – Peter Clark
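
For anyone finding this later: as the comments point out, the actual culprit was the 32-bit depth request. A sketch of an attribute set along the lines suggested above (not necessarily my exact final code) would be:

// Profiles only exist for GL 3.2+, so pair the profile mask with a 3.2+ version
// (or drop both of these attributes when requesting 3.1 or lower).
SDL_GL_SetAttribute(SDL_GL_CONTEXT_PROFILE_MASK, SDL_GL_CONTEXT_PROFILE_COMPATIBILITY);
SDL_GL_SetAttribute(SDL_GL_CONTEXT_MAJOR_VERSION, 3);
SDL_GL_SetAttribute(SDL_GL_CONTEXT_MINOR_VERSION, 2);

SDL_GL_SetAttribute(SDL_GL_ACCELERATED_VISUAL, 1);
SDL_GL_SetAttribute(SDL_GL_DOUBLEBUFFER, 1);

// 24-bit depth: what Windows drivers actually expose for hardware-accelerated
// default-framebuffer pixel formats; the 32-bit request triggered the GDI fallback.
SDL_GL_SetAttribute(SDL_GL_DEPTH_SIZE, 24);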

1 Answer

2 votes

First, to be completely clear, GL 3.1 technically does have a notion of compatibility, in the form of an extension called GL_ARB_compatibility. If that extension is present in your extension string, then features marked as deprecated in the GL 3.0 specification continue to be usable. If it is absent, then the things deprecated in GL 3.0 are actually removed in 3.1. To that end, there is a forked version of the GL 3.1 specification that deals specifically with this extension.

However, this is not the same thing as a context profile. You cannot actively request compatibility in 3.1, you can only detect this behavior after the fact. So, as derhass points out, it does not make sense to request a 3.1 "compatibility profile" (or any profile for that matter) context.
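
If you need to know which of those two behaviors you got, you can only check after the context exists, for example by testing for the extension. With GLEW already initialized, a minimal sketch:

// After glewInit(): does this 3.1 context still export the functionality
// deprecated in GL 3.0 (i.e. is GL_ARB_compatibility advertised)?
if (glewIsSupported("GL_ARB_compatibility"))
{
  printf("GL_ARB_compatibility present: deprecated features remain usable\n");
}
else
{
  printf("GL_ARB_compatibility absent: features deprecated in 3.0 are removed\n");
}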


More importantly, 32-bit depth buffers are not widely supported for the default framebuffer on Microsoft Windows. Some drivers allow you to choose a pixel format with a 32-bit depth buffer and still get a hardware implementation, but more often than not you wind up falling back to Microsoft's software-based GDI implementation.

This behavior occurs even on GPUs that actually support 32-bit depth buffers; if your application truly requires a 32-bit depth buffer, the most portable approach is to render into a Framebuffer Object.
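
A minimal sketch of that route, assuming a GL 3.0+ context (so GL_DEPTH_COMPONENT32F renderbuffers are available) and using the window size from the question as a placeholder:

GLuint fbo, colorTex, depthRbo;
const int width = 800, height = 600;  // placeholder; match your render target size

// Color attachment (a plain RGBA8 texture).
glGenTextures(1, &colorTex);
glBindTexture(GL_TEXTURE_2D, colorTex);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, width, height, 0, GL_RGBA, GL_UNSIGNED_BYTE, NULL);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);

// 32-bit (floating-point) depth attachment as a renderbuffer.
glGenRenderbuffers(1, &depthRbo);
glBindRenderbuffer(GL_RENDERBUFFER, depthRbo);
glRenderbufferStorage(GL_RENDERBUFFER, GL_DEPTH_COMPONENT32F, width, height);

glGenFramebuffers(1, &fbo);
glBindFramebuffer(GL_FRAMEBUFFER, fbo);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, colorTex, 0);
glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT, GL_RENDERBUFFER, depthRbo);

if (glCheckFramebufferStatus(GL_FRAMEBUFFER) != GL_FRAMEBUFFER_COMPLETE)
{
  // Handle an incomplete framebuffer (e.g. an unsupported format combination).
}

// Render to the FBO as usual; bind 0 to return to the default framebuffer.
glBindFramebuffer(GL_FRAMEBUFFER, 0);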