2
votes

In my OpenGL application on Ubuntu (12.10), I've issued the following GLFW window hints when creating a window:

glfwWindowHint(GLFW_CLIENT_API, GLFW_OPENGL_API);
glfwWindowHint(GLFW_CONTEXT_VERSION_MAJOR, 4);
glfwWindowHint(GLFW_CONTEXT_VERSION_MINOR, 2);
glfwWindowHint(GLFW_OPENGL_PROFILE, GLFW_OPENGL_CORE_PROFILE);
glfwWindowHint(GLFW_VISIBLE, GL_TRUE);
glfwWindowHint(GLFW_SAMPLES, 0);
glfwWindowHint(GLFW_RED_BITS, 24);
glfwWindowHint(GLFW_GREEN_BITS, 24);
glfwWindowHint(GLFW_BLUE_BITS, 24);
glfwWindowHint(GLFW_ALPHA_BITS, 8);

// Create the OpenGL window
window = glfwCreateWindow(width, height, windowTitle.c_str(), NULL, NULL);

centerWindow();
glfwMakeContextCurrent(window);

But this results in an OpenGL context whose window reports a pixel color bit depth of R-G-B-A = 8-8-8-8. To check this, I used the following code after creating my GLFW window:

int count;
GLFWmonitor * monitor = glfwGetPrimaryMonitor();
const GLFWvidmode * mode = glfwGetVideoMode(monitor);

cout << "Current video mode: " <<
        mode->redBits << "-" <<
        mode->greenBits << "-" <<
        mode->blueBits << endl;

cout << "All possible video modes: " << endl;
const GLFWvidmode * modes = glfwGetVideoModes(monitor, &count);
for(int i = 0; i < count; i++) {
    cout << modes[i].redBits << "-" <<
            modes[i].greenBits << "-" <<
            modes[i].blueBits << endl;
}
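(Side note for anyone checking this the same way: glfwGetVideoMode reports the desktop's video mode, not the bit depth of the context's framebuffer. A sketch of querying the framebuffer directly instead, assuming a current GLFW 3 / OpenGL 3.0+ context, i.e. placed after glfwMakeContextCurrent:)

```cpp
// Query the actual bit depth of the default framebuffer's back buffer.
// GL_FRAMEBUFFER_ATTACHMENT_*_SIZE queries require OpenGL 3.0 or later.
GLint redBits = 0, greenBits = 0, blueBits = 0, alphaBits = 0;
glGetFramebufferAttachmentParameteriv(GL_FRAMEBUFFER, GL_BACK_LEFT,
        GL_FRAMEBUFFER_ATTACHMENT_RED_SIZE,   &redBits);
glGetFramebufferAttachmentParameteriv(GL_FRAMEBUFFER, GL_BACK_LEFT,
        GL_FRAMEBUFFER_ATTACHMENT_GREEN_SIZE, &greenBits);
glGetFramebufferAttachmentParameteriv(GL_FRAMEBUFFER, GL_BACK_LEFT,
        GL_FRAMEBUFFER_ATTACHMENT_BLUE_SIZE,  &blueBits);
glGetFramebufferAttachmentParameteriv(GL_FRAMEBUFFER, GL_BACK_LEFT,
        GL_FRAMEBUFFER_ATTACHMENT_ALPHA_SIZE, &alphaBits);
cout << "Framebuffer: " << redBits << "-" << greenBits << "-"
     << blueBits << "-" << alphaBits << endl;
```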

Surprisingly, I get 8-8-8-8 for my current video mode, and for all possible video modes. I'm sure this cannot be the case, as my monitor (a Samsung S23B550) can display mono-colored gradients without any Mach banding, which means it should have at least 16-24 bits of depth per color channel. I'm also using a modern graphics card (an Nvidia GT 650M), whose framebuffer should support 24-32 bits per channel.

The only odd thing to take into account is that my setup uses Nvidia Optimus technology, so to use the dedicated graphics card on Linux I have to launch my OpenGL applications through optirun (Bumblebee), which performs the integrated-to-dedicated switch.

The manual for that display says it only does 8-bit color ("Display Color: 16.7M", page 79), not 16- or even 10-bit. - genpfault
It is still possible that it is a 10-bit panel, though very unlikely given the price (for that price, it might even be a 6-bit panel in disguise). Any higher-bit image processing it does will be just that, image processing; the source is always 8-bit. - Andon M. Coleman
@genpfault Doesn't 16.7M colors = 2^24 so a bit depth of 24? - Dustin Biser
24 bits per pixel, which is 8 bits per R/G/B channel. - genpfault

1 Answer

0
votes

Nvidia says you have to shell out for a Quadro to use 10-bit color in OpenGL.

Not that AMD is any better in that regard.
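For completeness, here is roughly what asking GLFW for a 10-bits-per-channel (R10G10B10A2) framebuffer would look like. Note these hints are only requests: on a consumer GeForce card the driver will typically fall back to an 8-8-8-8 format anyway, per the linked Nvidia statement.

```cpp
// Request a 10-10-10-2 framebuffer before window creation (a hint, not a guarantee;
// the driver may substitute the closest format it actually supports).
glfwWindowHint(GLFW_RED_BITS,   10);
glfwWindowHint(GLFW_GREEN_BITS, 10);
glfwWindowHint(GLFW_BLUE_BITS,  10);
glfwWindowHint(GLFW_ALPHA_BITS, 2);
window = glfwCreateWindow(width, height, windowTitle.c_str(), NULL, NULL);
```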