17
votes

I am part of a team developing an application using C++ with SDL and OpenGL.

On laptops, when the application is run, the dedicated graphics card is not used and GL context creation fails because the integrated graphics card does not support the version of GL we want.

I have a feeling that this problem is specific to the laptop in question and not something we can solve through code. But if anyone knows of a solution, that would be great.

2
Do you mean laptops with dual graphics cards? Does manually switching to the dedicated card help (in the NVIDIA settings or wherever it is)? – riv
You might be able to use the target platform's specific API(s) to enumerate the available devices and then pick which one to create the active context on. Though I have a feeling you are right, and the inactive graphics device will not show up until it is turned on in the laptop's settings, as suggested by @riv. – kc7zax
@riv Yes, it is a laptop with dual graphics cards. We can of course add the application to the list of applications that use the dedicated card in the NVIDIA/ATI settings, but we would prefer that end users don't have to do that. – Connor Hollis
The replies with __declspec(dllexport) are old and specific to the NVIDIA Optimus driver. Windows 10 now has its own way to configure the high-performance GPU (see pureinfotech.com/set-gpu-app-windows-10). Are the replies still up to date, or is there a vendor-neutral way to achieve this in Windows 10 in the meantime? – jcm

2 Answers

29
votes

The easiest way from C++ to ensure that the dedicated graphics card is used instead of the integrated/switchable graphics under Windows is to export the following symbols (MSVC sample code):

Enable dedicated graphics for NVIDIA:

extern "C" 
{
  __declspec(dllexport) unsigned long NvOptimusEnablement = 0x00000001;
}

Enable dedicated graphics for AMD Radeon:

extern "C"
{
  __declspec(dllexport) int AmdPowerXpressRequestHighPerformance = 1;
}

Caveat: if the user has created a profile for the application to use the integrated chipset, these exports will not work.

I am unsure whether this would work similarly under Linux / macOS (unlikely).
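
As a quick sanity check (a minimal sketch, not part of the original answer, assuming SDL2 on a Windows/MSVC build as in the question): combine both exports, create the context, and print GL_VENDOR / GL_RENDERER to see which GPU actually answered.

#include <SDL.h>
#include <SDL_opengl.h>
#include <cstdio>

extern "C"
{
  // Same exports as above: request the dedicated GPU from both vendors' drivers.
  __declspec(dllexport) unsigned long NvOptimusEnablement = 0x00000001;
  __declspec(dllexport) int AmdPowerXpressRequestHighPerformance = 1;
}

int main(int argc, char** argv)
{
  (void)argc; (void)argv;
  SDL_Init(SDL_INIT_VIDEO);
  SDL_Window* window = SDL_CreateWindow("GPU check",
      SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED,
      640, 480, SDL_WINDOW_OPENGL | SDL_WINDOW_HIDDEN);
  SDL_GLContext context = SDL_GL_CreateContext(window);

  // The renderer string names the GPU that actually created the context.
  std::printf("Vendor:   %s\n", reinterpret_cast<const char*>(glGetString(GL_VENDOR)));
  std::printf("Renderer: %s\n", reinterpret_cast<const char*>(glGetString(GL_RENDERER)));

  SDL_GL_DeleteContext(context);
  SDL_DestroyWindow(window);
  SDL_Quit();
  return 0;
}

On a dual-GPU laptop the renderer string should name the NVIDIA/AMD chip rather than the integrated one once the exports take effect.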

3
votes

Does it use NVIDIA dedicated graphics? AFAIK, the automatic switching from integrated to dedicated graphics is based on application profiles. Your application is not in the driver's list of known 3D applications, so the user has to switch to the dedicated GPU manually.

Try changing your application's executable name to something the driver looks for, for example "Doom3.exe". If that works, you've found your problem.

If that doesn't help, try following the instructions in this video on how to make the driver add your application to its list of 3D apps:

http://www.frequency.com/video/how-to-whitelist-game-with-nvidias/24814032

But the above is only for verifying that this is indeed your problem. For an actual solution, you should check with the graphics driver vendors (AMD and NVIDIA) on the best way to insert a profile for your application into their lists. NVIDIA provides NVAPI and AMD has ADL and AGS; they're definitely worth studying.
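
To give a rough idea of the NVAPI route (my own sketch, not taken from the vendor docs verbatim): the Driver Settings (DRS) part of NVAPI lets a program create a profile for its own executable and set a GPU-selection option on it. The SHIM_RENDERING_MODE_* identifier and value below are assumptions to be checked against NvApiDriverSettings.h in your NVAPI SDK; error handling is trimmed.

#include <nvapi.h>
#include <NvApiDriverSettings.h>

static void copyToNvString(NvAPI_UnicodeString dst, const wchar_t* src)
{
  // NvAPI_UnicodeString is an NvU16 array; wchar_t is 16-bit on Windows.
  size_t i = 0;
  for (; src[i] != L'\0' && i < NVAPI_UNICODE_STRING_MAX - 1; ++i)
    dst[i] = static_cast<NvU16>(src[i]);
  dst[i] = 0;
}

bool addDedicatedGpuProfile(const wchar_t* profileName, const wchar_t* exeName)
{
  if (NvAPI_Initialize() != NVAPI_OK)
    return false;  // no NVIDIA driver present

  NvDRSSessionHandle session = 0;
  NvAPI_DRS_CreateSession(&session);
  NvAPI_DRS_LoadSettings(session);

  // Create a profile and attach our executable to it.
  NVDRS_PROFILE profile = {};
  profile.version = NVDRS_PROFILE_VER;
  copyToNvString(profile.profileName, profileName);

  NvDRSProfileHandle hProfile = 0;
  NvAPI_DRS_CreateProfile(session, &profile, &hProfile);

  NVDRS_APPLICATION app = {};
  app.version = NVDRS_APPLICATION_VER;
  copyToNvString(app.appName, exeName);
  NvAPI_DRS_CreateApplication(session, hProfile, &app);

  // Ask for the dedicated GPU; this setting ID/value pair is an assumption.
  NVDRS_SETTING setting = {};
  setting.version = NVDRS_SETTING_VER;
  setting.settingId = SHIM_RENDERING_MODE_ID;
  setting.settingType = NVDRS_DWORD_TYPE;
  setting.u32CurrentValue = SHIM_RENDERING_MODE_ENABLE;
  NvAPI_DRS_SetSetting(session, hProfile, &setting);

  NvAPI_DRS_SaveSettings(session);
  NvAPI_DRS_DestroySession(session);
  return true;
}

A profile written this way is persisted by the driver, so it is more of an installer-time step than something to run at every launch.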