2019 16" MacBook Pro (RP5500M) [9th,8C,H] + RTX 2080 @ 32Gbps-TB3 (Razer Core X Chroma) + Win10 1909 [Ningauble77] // inc Radeon VII @32Gbps-TB3 (Razer Core V2) macOS 10.15.1
I was toying with the idea of buying a Windows gaming laptop as my primary eGPU host / auxiliary work machine, but then Apple released the 16" MBPro and it seemed I could use it for both purposes; it will also be cheaper to sell the 13" MBPro and replace it than to buy a Razer or Alienware outright with the specs I wanted. I've been running this dual-enclosure eGPU-swapping setup with the 13" for a month or so, ever since I decided I couldn't stay on the 295 Windows 10 build (which I needed to use the Radeon in Windows) because of missing security patches.
2019 16″ MacBook Pro – i9-9880H/UHD Graphics 630 iGPU & Radeon Pro 5500M dGPU/16GB RAM/1TB SSD
Both video cards are connected via DisplayPort to a Samsung CRG9 5120x1440 120Hz monitor
Unigine Valley macOS, Radeon VII:
Superposition Win10, RTX 2080 FE:
Time Spy, RTX 2080 FE:
Fire Strike, RTX 2080 FE:
Port Royal, RTX 2080 FE:
Time Spy, Radeon Pro 5500M:
Fire Strike, Radeon Pro 5500M:
Eventually, if and when AMD and Microsoft get AMD eGPUs running in fully patched Windows, I will consider going back to one eGPU, but I have to admit that while the Radeon VII was tolerable in W10, the games I play run noticeably better on the RTX 2080. When I do downsize to one enclosure I will sell the Core V2: it works well, and my Radeon VII's sleep mode behaves more or less as it should, but it is extremely noisy, even at idle.
This is the first account I've heard of an eGPU functioning under 1909. Is that exclusive to Nvidia cards?
death to self
@haoshiro my GTX 1070 also works fine in BootCamp on 1909 (2017 13-inch using the automate-egpu.efi method; without it, I get a bluescreen on hotplug and a frozen Windows logo on the internal screen when cold-booting with the eGPU connected). Haven't tried the 1060 yet, but I don't expect any issue with that one either.
I'm just curious about those 16" MacBook BootCamp builds: when did Apple decide to make the firmware work with eGPUs in Windows without any EFI hacks? Or do you still need to use something on those? (None of the builds I've read so far mention a specific boot method.)
2017 13-inch MacBook Pro TB [7th,2C,U], macOS 11 (not in regular use with eGPU, mostly for testing)
2019 Intel NUC10i7FNK [10th,6C,U], Windows 10 20H2
Aorus Gaming Box 1070 (EVGA GTX1060 3G) - silent mod #1
Aorus Gaming Box 1080 (Gigabyte GTX1070 ITX OC) - Custom Case solution (TBD)
I'm trying to build a machine for VR and stumbled upon your thread. Since I don't know enough about benchmarks, I have to ask: what does the 130 fps average in Superposition tell me? That's at 1080p, if I read it correctly, right? So I might need either an RTX 2070 Super or even a 2080 Ti to get roughly that frame rate at 4K, right?
First I really appreciate the great information you're sharing here, thanks a lot you're both awesome.
Bottom line up front: I am having only partial success with a setup very similar to the OP's (16" MBP and a 2080 eGPU); the card isn't working properly and I would love some insight. I'm trying to get this up and running because I very much prefer Macs, but I want to play with a Valve Index.
I am working with a 16" MBP base model (i7, 5300m dGPU) and a Gigabyte Nvidia RTX 2080 Super enclosed in a Sonnet 550W eGPU box. Before going any further... could it be as simple as I need a 650W power supply like the Sonnet 650W or the Razer Core X for the 2080?
Continuing on... I installed the latest version of Windows 10 via Boot Camp and installed the latest Nvidia drivers. The 2080 is recognized by Device Manager, and both the external display (connected directly to the eGPU) and the internal display work. However, the card itself doesn't actually seem to be used while running benchmarks, so I am very confused.
My Heaven benchmark averaged a low 30 fps (exactly the same as the 5300M, so I assume that is what was actually used for the test, even though the benchmark says 2080 in the corner). 3DMark got a low score of about 3,000, and my MBP fans were going crazy the entire time while the eGPU was almost dead silent. However, when I run the Geekbench 5 compute test and isolate the 2080, I get a score around 100,000 versus only 25,000 when I isolate the 5300M, so the card does seem to work for that specific test.
Any ideas how I can get my card working properly? Any thoughts on why it would register and work in some ways but not the important ones? Could it be because I have an i7 and a 5300M, or because I am using a Sonnet instead of a Razer Core X? I have a Razer Core X arriving next week that I can try, but that explanation doesn't make sense to me.
@tsygna If you are running the RTX 2080 eGPU with an external monitor, make sure to set the external monitor as the primary display. This will make sure all apps/games launch through the eGPU-connected monitor and therefore use the eGPU. The Sonnet Breakaway Box 550 has plenty of juice for the RTX 2080 and 87W Power Delivery to the 2019 16-in MacBook Pro.
Thanks - I will give it a shot shortly and let you know!
So basically what you're saying is that the 2080 is installed correctly and recognized by the computer but not being used, and I need to tell the software specifically to use it instead of defaulting to the 5300M? I also read here that in Windows 10 you may have to select, app by app, which GPU profile to use for best performance. Do you think that applies to this situation?
eGPU setup on Windows 10
- Method 1: Connect a second monitor to the video output on the eGPU card and then set that monitor as your main display. For more details, please see https://support.microsoft.com/en-us/help/4340331/windows-10-set-up-dual-monitors .
- Method 2: If the software supports it (e.g. DaVinci Resolve), go to the preferences of that software and add the eGPU to the list of GPUs that can be used by that software. That way, the eGPU can be used to help render your videos for example.
- Method 3: In Windows Settings under System > Display > Graphics settings, add the programs you are using to the list of apps and then set the graphics preference for these apps to "High performance".
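If you'd rather script Method 3 than click through Settings, the same per-app preference can, as far as I know, be written directly to the registry: Windows stores it under `HKCU\Software\Microsoft\DirectX\UserGpuPreferences`, with the app's full .exe path as the value name. This is a sketch, not a tested recipe, and the game path below is just a made-up example — substitute your own:

```shell
:: Per-app GPU preference, written where the Graphics settings UI stores it.
:: Value name = full path to the .exe (example path, replace with yours);
:: "GpuPreference=2;" means "High performance" (the eGPU/dGPU).
reg add "HKCU\Software\Microsoft\DirectX\UserGpuPreferences" ^
    /v "C:\Games\MyGame\MyGame.exe" /t REG_SZ /d "GpuPreference=2;" /f
```

The app usually needs to be restarted before the new preference takes effect, and you can confirm the entry appeared in the Graphics settings page afterward.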
I got it working thanks to your help itsage! I am very excited because the setup is now ready for my Valve Index arriving next week (hello Pistol Whip, Boneworks, Gorn, and next year Half-Life: Alyx). In addition to the initial setup, this is what I did to finally get it working:
First, in Windows display settings I extended the desktop to my 70" 4K HDR TV instead of screen-mirroring as I was doing before. The TV is connected via HDMI to the eGPU, and the eGPU is connected to my 16" MBP via TB3.
Second, I opened the Nvidia Control Panel and, under 3D Settings > Set PhysX Configuration, set the 2080 as the PhysX processor (it had been set to the 5300M).
I am not sure whether one or both of the changes above were necessary, because I did both steps at the same time; it doesn't matter to me as long as it works.
Thanks again for your help, hope this helps anybody else searching. I will probably start a separate post with my specific situation just in case others are looking.
@tsygna Glad to hear you have the setup working! Assigning each app/game in Windows Graphics Option to use the "High Performance GPU" is the surest way to make use of the eGPU regardless of monitor configuration. Of course the eGPU powering an external monitor yields the best performance.