
gfxCardStatus not setting discrete only

Install gfxCardStatus (Marcia's link) and verify that your Mac supports dynamic GPU switching; only recent Macs do. On older Macs you can only force one GPU or the other, and you can verify that by looking at the gfxCardStatus menu: only on supporting Macs does it offer the default/auto-switch option. Make sure you're in the default auto-switch mode and that the currently in-use GPU is the integrated one (gfxCardStatus's icon should be 'i'). If the currently used GPU is the discrete one (the icon is then 'd'), try quitting applications until it switches back to 'i'.
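If you would rather check from code than from the menu bar, CGL can enumerate the renderers the system exposes; on a dual-GPU Mac running on the integrated GPU, the discrete GPU normally shows up as an additional renderer that is not currently online. This is only an illustrative sketch, not part of gfxCardStatus, and error handling is minimal (build with: cc list_renderers.c -framework OpenGL; expect OpenGL deprecation warnings on recent macOS):

#include <stdio.h>
#include <OpenGL/OpenGL.h>

int main(void) {
    CGLRendererInfoObj info;
    GLint count = 0;

    /* 0xFFFFFFFF: consider renderers for every display. */
    if (CGLQueryRendererInfo(0xFFFFFFFF, &info, &count) != kCGLNoError)
        return 1;

    for (GLint i = 0; i < count; ++i) {
        GLint id = 0, online = 0, accelerated = 0;
        CGLDescribeRenderer(info, i, kCGLRPRendererID, &id);
        CGLDescribeRenderer(info, i, kCGLRPOnline, &online);
        CGLDescribeRenderer(info, i, kCGLRPAccelerated, &accelerated);
        printf("renderer 0x%06x: online=%d accelerated=%d\n",
               (unsigned)id, online, accelerated);
    }

    CGLDestroyRendererInfo(info);
    return 0;
}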


Not sure if the underlying idea has been written down before, so here it is. Except possibly in corner cases that we could detect if needed, only one GPU is used at a time. So if we create a dummy OpenGL context without AllowOfflineRenderers, the system switches immediately to the discrete GPU and stays there as long as that context is alive. The usefulness of such a dummy context comes from the fact that it doesn't have to share resources with any other context, so we are free to choose its attributes.
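To make "a dummy context without AllowOfflineRenderers" concrete, here is a rough CGL-level sketch. It is not the actual Gecko code; the function name is made up and error handling is minimal:

#include <OpenGL/OpenGL.h>

/* Create a throwaway CGL context whose pixel format deliberately omits
   kCGLPFAAllowOfflineRenderers. As described above, while such a context
   is alive the system runs on the discrete GPU. */
CGLContextObj CreateDummyDiscreteContext(void) {
    CGLPixelFormatAttribute attribs[] = {
        kCGLPFAAccelerated,
        /* kCGLPFAAllowOfflineRenderers is intentionally NOT listed. */
        (CGLPixelFormatAttribute)0
    };
    CGLPixelFormatObj pix = NULL;
    GLint npix = 0;
    CGLContextObj ctx = NULL;

    if (CGLChoosePixelFormat(attribs, &pix, &npix) != kCGLNoError || !pix)
        return NULL;
    /* No resource sharing: the dummy context exists only to pin the GPU choice. */
    CGLCreateContext(pix, NULL, &ctx);
    CGLDestroyPixelFormat(pix);
    return ctx;  /* CGLDestroyContext(ctx) later lets the system switch back. */
}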


So the plan is: during WebGL context creation, just before creating the actual WebGL context's OpenGL context, create a dummy context without AllowOfflineRenderers. That should be a global refcounted object. Then, for the actual WebGL context's OpenGL context, do not change anything (keep sharing, keep AllowOfflineRenderers).


Apple asked us to debounce GPU switching. That would be done by delaying the destruction of the dummy context when its refcount hits 0.
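A sketch of what the refcounting plus debounced destruction could look like. The function names, the use of libdispatch, and the 2-second delay are all made up for illustration; the real code in Gecko would also have to be thread-safe:

#include <dispatch/dispatch.h>
#include <OpenGL/OpenGL.h>

CGLContextObj CreateDummyDiscreteContext(void);  /* from the sketch above */

static CGLContextObj gDummyContext = NULL;
static int gRefCount = 0;

/* Each live WebGL context would hold one reference. */
void AcquireDiscreteGPU(void) {
    if (gRefCount++ == 0 && !gDummyContext)
        gDummyContext = CreateDummyDiscreteContext();
}

void ReleaseDiscreteGPU(void) {
    if (--gRefCount > 0)
        return;
    /* Refcount hit 0: delay destruction rather than destroying immediately,
       so quick destroy/create cycles don't bounce between GPUs.
       The 2-second delay is an arbitrary illustrative value. */
    dispatch_after(dispatch_time(DISPATCH_TIME_NOW, 2 * NSEC_PER_SEC),
                   dispatch_get_main_queue(), ^{
        if (gRefCount == 0 && gDummyContext) {
            CGLDestroyContext(gDummyContext);
            gDummyContext = NULL;
        }
    });
}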










