
Some computers have more than one graphics card/chipset installed, even when (as with many laptops) they drive only a single monitor.

I'm having trouble with a laptop system that's got both Intel and Nvidia graphics hardware. Intel's drivers are notoriously awful in their OpenGL support, and my code is running up against an inexplicable rendering bug, because it seems to default to the Intel system, not the Nvidia one, when creating the rendering context.

Is there any way to prevent this at startup? To say something like "poll for all available graphics drivers, avoid Intel drivers if possible, and build me an OpenGL rendering context with the driver that will work"?

In the NVidia control panel you can select the default GPU. – Michael IV

@MichaelIV: Thanks, but that's not what I'm asking. – Mason Wheeler

It is not only about the drivers but about the GPU too. Many of today's notebooks have 2 cards: Intel and a dedicated one (NVidia or ATI). So you must set the hardware first via the BIOS or a Windows interface like the NVidia panel. If you don't switch the hardware first, trying to select different drivers is meaningless. – Michael IV

@MichaelIV: That's still not what I'm asking. I want to know how my program can do this. Telling me how the user (who is not necessarily me and does not necessarily have my level of technical knowledge) can do this is meaningless. – Mason Wheeler

@MasonWheeler: There's no standard API for this. The best you could do is trying to emulate the switch the user would do (Windows registry entry). – datenwolf

1 Answer


There's no portable way to do what you're asking, but this document describes how to force "High Performance Graphics Render" on systems with NVIDIA Optimus technology:

http://developer.download.nvidia.com/devzone/devcenter/gamegraphics/files/OptimusRenderingPolicies.pdf

Specifically, refer to the section "Global Variable NvOptimusEnablement (new in Driver Release 302)", which says:

Starting with the Release 302 drivers, application developers can direct the Optimus driver at runtime to use the High Performance Graphics to render any application – even those applications for which there is no existing application profile. They can do this by exporting a global variable named NvOptimusEnablement. The Optimus driver looks for the existence and value of the export. Only the LSB of the DWORD matters at this time. A value of 0x00000001 indicates that rendering should be performed using High Performance Graphics. A value of 0x00000000 indicates that this method should be ignored.

Example Usage:

extern "C" {
    // Exported from the .exe; the Optimus driver checks for this symbol at load time.
    // Requires <windows.h> for DWORD; note the correct MSVC keyword is __declspec.
    __declspec(dllexport) DWORD NvOptimusEnablement = 0x00000001;
}

Another possibility is the WGL_NV_gpu_affinity extension, but your WGL implementation needs to expose it (in practice it is only available on Quadro hardware) and I'm not sure if it works on mixed Intel/NVIDIA systems:

http://developer.download.nvidia.com/opengl/specs/WGL_nv_gpu_affinity.txt
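For reference, a minimal sketch of how the affinity extension is used, per the spec above: enumerate NVIDIA GPUs with wglEnumGpusNV, build an affinity DC for the chosen GPU, and create the context on that DC. This assumes a dummy OpenGL context is already current (so wglGetProcAddress returns valid pointers), omits most error handling, and only compiles on Windows against a driver that exposes the extension.

```cpp
#include <windows.h>
#include <GL/gl.h>
#include <GL/wglext.h>  // HGPUNV, PFNWGLENUMGPUSNVPROC, PFNWGLCREATEAFFINITYDCNVPROC

// Create an OpenGL context pinned to the first NVIDIA GPU, or nullptr on failure.
HGLRC CreateAffinityContext()
{
    auto wglEnumGpusNV =
        (PFNWGLENUMGPUSNVPROC)wglGetProcAddress("wglEnumGpusNV");
    auto wglCreateAffinityDCNV =
        (PFNWGLCREATEAFFINITYDCNVPROC)wglGetProcAddress("wglCreateAffinityDCNV");
    if (!wglEnumGpusNV || !wglCreateAffinityDCNV)
        return nullptr;  // driver does not expose WGL_NV_gpu_affinity

    HGPUNV gpu = nullptr;
    if (!wglEnumGpusNV(0, &gpu))  // index 0: first NVIDIA GPU in the system
        return nullptr;

    HGPUNV gpuList[] = { gpu, nullptr };  // list must be null-terminated
    HDC affinityDC = wglCreateAffinityDCNV(gpuList);
    if (!affinityDC)
        return nullptr;

    // Choose a pixel format on the affinity DC, then create the context on it.
    PIXELFORMATDESCRIPTOR pfd = { sizeof(pfd) };
    pfd.dwFlags = PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER;
    pfd.iPixelType = PFD_TYPE_RGBA;
    pfd.cColorBits = 32;
    SetPixelFormat(affinityDC, ChoosePixelFormat(affinityDC, &pfd), &pfd);
    return wglCreateContext(affinityDC);
}
```

Note that a context created this way renders only on the listed GPU(s); if the extension is absent (as it typically is on consumer GeForce parts), you would fall back to the NvOptimusEnablement export above.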