Chrome browser not using eGPU

tonyburn
(@tonyburn)
Active Member
Joined: 2 years ago
 

Hi all, I have a MacBook Air with an Akitio 2/GTX 970 installed and working as an eGPU. My typical setup is a large 34" display plugged into the eGPU, with the laptop lid closed and the laptop display disabled using DisableMonitor. This setup is great: the eGPU happily drives my large display, which the integrated HD 4000 cannot, and I can play Steam games etc.

The issue I have is getting Chrome to use the eGPU. When researching WebGL it became apparent that Chrome was not leveraging my eGPU but the HD 4000 instead (low framerate; 2-4 fps). Doing some digging, I found that Chrome provides information about the GPU setup; mine lists the HD 4000 as active and not the GTX 970.

[Screenshot: chrome://gpu showing the HD 4000 as the active GPU - Screen Shot 2017-05-21 at 09.37.45]

I am baffled as to how to force Chrome to use the eGPU (GTX 970). Can someone please advise?

nando4
(@nando4)
Noble Member Admin
Joined: 3 years ago
 

There is a bug where Chrome does not use NVIDIA GPUs when running with Optimus, with no solution as yet:

https://bugs.chromium.org/p/chromium/issues/detail?id=476203

eGPU Setup 1.35    •    eGPU Port Bandwidth Reference Table


shalbob
(@shalbob)
New Member
Joined: 2 years ago
 

I am also seeing this problem: none of my web browsers will utilize my eGPU! Whether it's Firefox, Chrome or Safari, I can't get any of them to use the eGPU for WebGL. And this is important to me since I am designing a WebGL-based product.

My setup:

When I run Boot Camp, I can get eGPU acceleration for WebGL when either the RX 580 or the 1070 is connected, but not in macOS...

Any tips/advice??

dani29m
(@dani_tx)
Estimable Member
Joined: 3 years ago
 

Forcing specific GPU

In multi-GPU systems, Chromium automatically detects which GPU should be used for rendering (discrete or integrated). This works 99% of the time, except when it doesn't: if an unavailable GPU is picked (for example, discrete graphics on VFIO GPU passthrough-enabled systems), chrome://gpu will complain about not being able to initialize the GPU process. On the same page, below Driver Information, multiple GPUs are shown (GPU0, GPU1, ...). There's no user-friendly way to switch between them, but you can read the device/vendor IDs listed there and configure Chromium to use a specific GPU with flags:

$ chromium --gpu-testing-vendor-id=0x8086 --gpu-testing-device-id=0x1912

...where 0x8086 and 0x1912 are replaced by the IDs of the GPU you want to use (as shown on the chrome://gpu page).
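If you're unsure which IDs to plug in, macOS can also list them without opening chrome://gpu. A minimal check, assuming a stock macOS install (the vendor and device IDs appear in the Graphics/Displays section):

$ system_profiler SPDisplaysDataType | grep -E 'Chipset Model|Vendor|Device ID'

Match the pair belonging to the eGPU against what chrome://gpu reports before passing them to the flags above.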

ASUS A53SV: i7-2860QM, 16GB DDR3 running WINDOWS 10 Enterprise LTSB 2016
SSD: SAMSUNG 850 PRO(256GB)+SAMSUNG 850 EVO(120GB)
eGPU: EVGA GTX 1080 FTW+ PE4C V3.0(mPCIe)+EVGA 650 G3 PSU (internal display ONLY)


(@majkelos)
New Member
Joined: 2 years ago
 

$ chromium --gpu-testing-vendor-id=0x8086 --gpu-testing-device-id=0x1912

Hi, when I try to paste this into Terminal (with the IDs changed), I get:
sh: chromium: command not found

Is this possible in macOS?

MacBook Pro 13" 2017 Touchbar i7 3.5Ghz #16GB RAM #512GB PCIe SSD #Mantiz Venus with AMD Radeon RX 580 #macos 10.13.4


dani29m
(@dani_tx)
Estimable Member
Joined: 3 years ago
 

How to specify command line flags

Let's say that you want to add two command line flags to chrome: --foo and --bar=2.

 Mac OS X
  1. Quit any running instance of chrome.
  2. Launch /Applications/Utilities/Terminal.app
  3. At the command prompt enter:
    /Applications/Google\ Chrome.app/Contents/MacOS/Google\ Chrome --foo --bar=2
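Applied to the GPU-selection flags from the earlier post, the full command would look something like this (the IDs below are placeholders for an NVIDIA card; substitute whatever your chrome://gpu page shows):

$ /Applications/Google\ Chrome.app/Contents/MacOS/Google\ Chrome --gpu-testing-vendor-id=0x10de --gpu-testing-device-id=0x13c2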

Too lazy to Google it or.......?

ASUS A53SV: i7-2860QM, 16GB DDR3 running WINDOWS 10 Enterprise LTSB 2016
SSD: SAMSUNG 850 PRO(256GB)+SAMSUNG 850 EVO(120GB)
eGPU: EVGA GTX 1080 FTW+ PE4C V3.0(mPCIe)+EVGA 650 G3 PSU (internal display ONLY)


shalbob
(@shalbob)
New Member
Joined: 2 years ago
 

Dani, I really appreciate the tips on forcing GPU. Unfortunately I haven't been able to get that command to work (iGPU remains active).

Do you know if I can execute the command you provided in a way that outputs any errors that might be occurring? I'm wondering if some hidden error is preventing progress on this problem; if so, how can I make that error reveal itself?

dani29m
(@dani_tx)
Estimable Member
Joined: 3 years ago
 

Don't overthink it. It's the NVIDIA driver, and it's probably even more restricted than on Windows; even under Windows, WebGL is accelerated by your iGPU. You can visit this website (under Windows), scroll all the way to the bottom, and you'll see something like: ANGLE (Intel(R) HD Graphics 3000 Direct3D11 vs_5_0 ps_5_0), which means the iGPU does most of the work. NVIDIA's excuse is that if they allowed their driver to accelerate the browser on switchable-graphics systems, it would waste too much battery (I guess this is a good place to say: F#@%K NVIDIA).

You can try these flags combined with the two mentioned above to see if it's possible to force it; if it's not, you'll just have to live with it: --supports-dual-gpus  --ignore-gpu-blacklist  --disable-gpu-driver-bug-workarounds  --use-angle=gl  --enable-gpu-scheduler

P.S. Remember to insert two (2) hyphen characters (dashes) before each switch. For some reason, when I post it here it shows only one.

ASUS A53SV: i7-2860QM, 16GB DDR3 running WINDOWS 10 Enterprise LTSB 2016
SSD: SAMSUNG 850 PRO(256GB)+SAMSUNG 850 EVO(120GB)
eGPU: EVGA GTX 1080 FTW+ PE4C V3.0(mPCIe)+EVGA 650 G3 PSU (internal display ONLY)


dani29m
(@dani_tx)
Estimable Member
Joined: 3 years ago
 

@shalbob I'm not sure if you're still checking the replies here, but if you are, I think I found the proper command-line switches:

--gpu-active-vendor-id=

--gpu-active-device-id=

Unfortunately I can't test it, as I'm stuck on the 378.92 driver, which is the last one that properly accelerates Chrome with my eGPU, so try it and let us know.

ASUS A53SV: i7-2860QM, 16GB DDR3 running WINDOWS 10 Enterprise LTSB 2016
SSD: SAMSUNG 850 PRO(256GB)+SAMSUNG 850 EVO(120GB)
eGPU: EVGA GTX 1080 FTW+ PE4C V3.0(mPCIe)+EVGA 650 G3 PSU (internal display ONLY)


Ethan Richardson
(@ethanx94)
New Member
Joined: 2 years ago
 

This is a big issue for me. I'm using a GTX 970, and as soon as I open Chrome the internal display becomes completely disabled and the performance of my entire system on the external display is extremely sluggish. This also affects Chrome/Electron-based apps such as Hyper and VSCode.

I was able to fix these by running the apps' binaries inside their .app/Contents/MacOS directory with only the --supports-dual-gpus=true flag. From there I made a pull request (pending approval) to do this automatically: https://github.com/zeit/hyper/pull/2608/files
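For anyone wanting to try the same workaround, the invocation looks roughly like this (assuming Hyper is installed in /Applications; the binary name under Contents/MacOS can differ between Electron apps):

$ /Applications/Hyper.app/Contents/MacOS/Hyper --supports-dual-gpus=true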

However, this same fix does not work on Chrome itself, and I've just fallen back to turning Settings -> Advanced -> System -> Use Hardware Acceleration off.

Everything I've read about this issue seems to be a lot of finger-pointing: Chrome people say NVIDIA drivers are at fault, NVIDIA people say Apple is at fault. From the gfxCardStatus utility I can see that my iGPU is being completely disabled each time Chrome is started. But even disabling System Preferences -> Energy Saver -> Automatic Graphics Switching produces the same issue.

I hope this gets fixed soon. In the meantime I'll keep throwing flags at the Chrome executable and hope something sticks.

 

Ethan Richardson
(@ethanx94)
New Member
Joined: 2 years ago
 

Hey, I hate to double post but I'm unable to edit my last post. After much snooping I've almost found a solution. Apparently Electron uses an older Chromium version that doesn't have this problem and accepts the --supports-dual-gpus=true flag with no issue. I'm not sure how to patch this into the app itself, but here's how I got it to work, and maybe it will bring a solution to your issue as well.

In Terminal, run the following.

If you've already got a newer version of Chromium, be sure to uninstall it first:

$ brew cask uninstall mac-chromium

$ brew cask install https://raw.githubusercontent.com/DomT4/homebrew-chromium/86f2725972b95a0bcfe0192472b4736e4719e6ca/Casks/mac-chromium.rb

This will install a version of Chromium, 60.0.3100.0, that is one version newer than what Electron uses. (Oddly enough, Electron's build crashes entirely.) After installing, pass in the flag like the following:

$ /Applications/Chromium.app/Contents/MacOS/Chromium --supports-dual-gpus=true

Check your chrome://gpu/. It will say hardware accelerated, but the GL renderer is still set to the Intel graphics. Although I had more issues than you (I think you actually began at this point anyway), I feel like this is a step in the right direction, and maybe with a couple more flags we can set the other GPU as active and have it not disable the iGPU entirely.

Let me know if you get any better performance with this method or if you have 'Hardware Accelerated' on all the items at the top of the page now.

UPDATE:

So I might've found a way around the issue. Unfortunately it only works if you have a MacBook with a dGPU. To quote myself...

Posted by: ethanx94

I found out something interesting today. eGPU/iGPU switching was giving me a lot of issues, mainly because NVIDIA drivers on macOS are currently problematic and have no Optimus support (and might never get it). Because of this, certain Chrome/Electron-based apps would cause my iGPU to get disabled and my laptop display to go black, while leaving the external display intact (albeit sluggish).

That being said, my MacBook also has an AMD 460 dGPU, but it would never fall back to it for graphics switching when I booted with the eGPU attached. I don't know if I just didn't read the guides thoroughly enough, but apparently when you're booting macOS you should NOT plug the eGPU in until you're at the login screen; it will take a few seconds to register the external monitor, THEN log in. With this the order of the graphics cards gets rearranged, and I'm having no issues when my iGPU decides to switch to the dGPU. As far as I can tell my eGPU still does its job correctly and accelerates everything on my external display. So you're correct, just as Apple planned it.

For Windows, it's always been the opposite. Plug it in once you hit the bootcamp menu.

 

TL;DR: dGPU owners, plug your eGPU in at the login screen; you will save yourself a lot of headache. Don't boot with the eGPU attached, and don't plug it in at the Boot Camp screen.

 

stasilo
(@stasilo)
New Member
Joined: 1 year ago
 

Hm, when I start the Chromium version that @ethanx94 linked with the following command-line args, my eGPU (GeForce GTX 1070) at least shows up as the active one in the chrome://gpu listing:

/Applications/Chromium.app/Contents/MacOS/Chromium --gpu-active-vendor-id=0x10de --gpu-active-device-id=0x1b81 --supports-dual-gpus=false --ignore-gpu-blacklist --disable-gpu-driver-bug-workarounds --use-angle=gl --enable-gpu-scheduler --gpu-testing-vendor-id=0x10de --gpu-testing-device-id=0x1b81

However, the renderer listed still says "Intel" and the eGPU does not appear to be used; performance is as shitty as with the integrated card 🙁 I'm using a 2017 13" MacBook Pro with High Sierra 10.13.4. OpenGL works fine in other apps - I just ran the Heaven benchmark, for example.

This is driving me mad; one of my main reasons for getting an eGPU is WebGL development work, and macOS is my primary dev environment :/ It doesn't work in Safari or Firefox either. I'd treat whoever figures this out before NVIDIA/Google/Apple get their shit together to a couple of beers 🙂

(@sebastien_jourdain)
New Member
Joined: 1 year ago
 

You just need to close the lid when you start Chrome/Firefox from the screen attached to the eGPU, and you are good to go. No flags needed.
Tested on macOS 10.13.6 with ParaView-Glance.

dani29m
(@dani_tx)
Estimable Member
Joined: 3 years ago
 

For macOS (ONLY), run Chrome with the "--force_discrete_gpu" argument...
...(also try "--force_discrete_gpu=1" or "2").
If that doesn't work, try "--gpu-driver-bug-workarounds=48" or "228".
If that doesn't work, try "--enable-features=force_discrete_gpu".

Post some feedback guys and good luck. 🙂 

P.S. For Windows: NO LUCK...YET  😕

ASUS A53SV: i7-2860QM, 16GB DDR3 running WINDOWS 10 Enterprise LTSB 2016
SSD: SAMSUNG 850 PRO(256GB)+SAMSUNG 850 EVO(120GB)
eGPU: EVGA GTX 1080 FTW+ PE4C V3.0(mPCIe)+EVGA 650 G3 PSU (internal display ONLY)


dbz.io
(@dbz-io)
New Member
Joined: 1 year ago
 

None of these solutions have worked for me.

I'm running macOS 10.13.5 (17F77) on the 2016 13-inch MacBook Pro. My eGPU is a GTX 1080. Interestingly, I'm able to get Chrome to detect and use the eGPU if I boot into Windows via Boot Camp (I followed these instructions: https://egpu.io/bootcamp-setup-guide-tb3-macbook-pro/).

Has anyone else been able to get Chrome (or any browser for that matter) to use the eGPU?

tuner
(@tuner)
New Member
Joined: 4 months ago
 

Hi!

This topic hasn't had any posts for a while...

I'm doing WebGL software development. Because my 13" MacBook Pro 2018 is a bit slow with my 4K display, I'm considering getting an eGPU.

So, could someone confirm that Chrome (and/or Firefox & Safari) is able to utilise an eGPU on macOS Mojave?

Blackmagic eGPU is probably going to be my choice. It's silent and has enough power for my applications.

(@jay_weeks)
New Member
Joined: 4 months ago
 

I've tested Chrome (and an Electron app) with a Razer Core X, and it doesn't use the eGPU when running on the main display; it only uses the eGPU when an external monitor is plugged directly into the enclosure.

Actually, editing to add that I've had mixed results: sometimes Chrome detects the eGPU, other times it uses the Intel integrated graphics chip.
