Ability to control GPU preference on Windows and Linux/BSD #323
base: master

Conversation
Using the `NvOptimusEnablement` and `AmdPowerXpressRequestHighPerformance` executable-local symbols on Windows and the `DRI_PRIME` environment variable on Linux.
A potential equivalent solution for macOS: https://github.com/CodySchrank/gSwitch
Tested yesterday with my Intel Radeon RX Vega M on Windows.
I stumbled across this today after having trouble getting the Primitives example working locally (it works fine on your website). I've gotten things working using the master branch and setting `NvOptimusEnablement=1`. Let me know if you need help testing this branch. I have an older Dell XPS laptop with an Optimus chipset (GeForce GT 640M) running on Windows. Using the latest Dell video drivers resulted in a crash after the Phong shader failed to compile. After upgrading to the latest drivers from NVIDIA, the example ran but selected the Intel driver (HD Graphics 4000) and rendered a flat coloured cube. Exporting `NvOptimusEnablement=1` resulted in the correct driver being selected and the example rendering correctly.
Oh well. What's the Intel driver version? I was doing some patching and workarounds for Intel drivers recently and managed to iron out all driver bugs on the recent GPUs (Intel 530–630), but I have nowhere to test the older ones. Can you paste the engine output log here? It should show which extensions and driver workarounds it is using. As a random guess, can you try running the example with

For `NvOptimusEnablement`, the problem is that even setting it to 0 makes it choose the NVIDIA GPU, which means it is basically a compile-time option and thus useless.
I'm pretty sure Intel HD Graphics 4000 is total rubbish and no amount of driver workarounds will change that. It's been a long time since I did any graphics programming (I'm liking Magnum BTW) and I don't recognize any of the modern GL extensions but I doubt the Intel GPU has the required features to run basic shaders. Here is the output when running without the NVIDIA GPU:
I'm guessing this output means

Not being able to set `NvOptimusEnablement` at runtime sucks; it seems like a pretty poor design decision on NVIDIA's behalf. I'm glad I found this pull request though, I could have spent days trying to figure out how to turn it on. Thanks
Oh, right, this one is pretty old. It still should be capable of doing at least the WebGL 1-level things. If you have some free time, it would be great to see what the tests say on the Intel card. You can enable them using the

Thank you!
With f7d7390, I don't expect SDL/GLFW to implement EGL-based device selection anytime soon, so in the future I might be looking into replacing SDL/GLFW's own context creation with an EGL-based implementation where supported.
It's using the `NvOptimusEnablement` and `AmdPowerXpressRequestHighPerformance` executable-local symbols on Windows and the `DRI_PRIME` environment variable on Linux to control whether to use the integrated or the dedicated GPU. The `DRI_PRIME` part works as expected. Problem is, according to our tests, simply adding the `NvOptimusEnablement` symbol to application sources will force the app to use the dedicated GPU, no matter what the value is. (Seems that the value does have an effect after all (Sept 2021), did the drivers get fixed since?) Which ... makes this quite useless, as the switch between the integrated and dedicated GPU is done at compile time.

Things to do:
- isolate the `DRI_PRIME` part and commit it to `master`, since that works correctly
- besides `DRI_PRIME`, `MESA_LOADER_DRIVER_OVERRIDE=zink|iris|...` can switch to some other driver as well, but ugh :/
- `__GLX_VENDOR_LIBRARY_NAME=nvidia __NV_PRIME_RENDER_OFFLOAD=1` env vars specific to the NV driver make it possible to switch between the Intel and NV card at runtime: https://gitlab.freedesktop.org/glvnd/libglvnd/-/issues/205#note_553296 ... well, it's basically what `prime-run` does internally, setting those env variables (together with a Vulkan one) and running the application
- a `WindowlessApplication` that chooses between GLX and EGL based on `--magnum-device glx|egl|0|1...` and enables either depending on `WITH_WINDOWLESS[GLX,EGL,..]APPLICATION` being enabled
- `flextGLInit()` being present, one for GLX and one for EGL
- `TARGET_HEADLESS` and `TARGET_DESKTOP_GLES` defines that were only confusing