2019 16" MacBook Pro (RP5500M) [9th,8C,H] + RTX 2080 @ 32Gbps-TB3 (Razer Core X Chroma) + Win10 1909 [Ningauble77] // inc Radeon VII @32Gbps-TB3 (Razer Core V2) macOS 10.15.1
I was toying with the idea of buying a Windows gaming laptop as my primary eGPU host and auxiliary work machine, but then Apple released the 16" MBPro and it seemed I could use it for both purposes. Selling the 13" MBPro and replacing it will also be cheaper than buying a Razer or Alienware outright with the specs I wanted. I've been running this double-enclosure eGPU-swapping setup with the 13" for a month or so, ever since I decided I couldn't stay on the 295 Windows 10 build (which I needed to use the Radeon in Windows) because of security patches.
2019 16″ MacBook Pro – i9-9880H/UHD Graphics 630 iGPU & Radeon Pro 5500M dGPU/16GB RAM/1TB SSD
Both video cards are connected via DisplayPort to a Samsung CRG9 5120x1440 120Hz monitor
Unigine Valley MacOS, Radeon VII:
Superposition Win10, RTX 2080 FE:
Time Spy, RTX 2080 FE:
Fire Strike, RTX 2080 FE:
Port Royal, RTX 2080 FE:
Time Spy, Radeon Pro 5500M:
Fire Strike, Radeon Pro 5500M:
Eventually, if and when AMD and Microsoft get AMD eGPUs running in fully patched Windows, I will consider going back to one eGPU. I have to admit, though, that while the Radeon VII was tolerable in W10, the games I play run noticeably better on the RTX 2080. When I do downsize to one enclosure I will sell the Core V2: it works well, and Radeon VII sleep mode behaves more or less as it should, but the enclosure is extremely noisy, even at idle.
@haoshiro my GTX 1070 also works fine in Boot Camp on 1909 (2017 13-inch using the automate-egpu.efi method; without it, I get a bluescreen on hotplug and a freeze at the Windows logo on the internal screen on a connected cold boot). Haven't tried the 1060 yet, but I don't expect any issues with that either.
I'm just curious about those 16" MacBook Boot Camp builds: when did Apple make the firmware work with eGPUs in Windows without any EFI hacks? Or do you still need something on those? (None of the builds I've read so far mention a specific boot method.)
I'm trying to build a machine for VR and stumbled upon your thread. Since I don't know enough about benchmarks, I have to ask: what does the average of 130 fps in Superposition tell me? It's just 1080p if I read that correctly, right? So I might need an RTX 2070 Super or even a 2080 Ti to get roughly that frame rate in 4K, right?
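As a rough sanity check on that question, GPU-bound frame rates tend to scale inversely with pixel count. A minimal sketch, assuming perfect GPU-bound scaling (real games rarely scale this cleanly; CPU limits and memory bandwidth intervene):

```python
# First-order fps estimate across resolutions: assumes fps is inversely
# proportional to rendered pixel count. This is an upper-level approximation
# only; actual scaling depends on the engine and where the bottleneck sits.

def scaled_fps(fps_at_base: float, base_res: tuple, target_res: tuple) -> float:
    base_pixels = base_res[0] * base_res[1]
    target_pixels = target_res[0] * target_res[1]
    return fps_at_base * base_pixels / target_pixels

# 130 fps at 1080p projected to 4K (4x the pixels)
print(round(scaled_fps(130, (1920, 1080), (3840, 2160)), 1))  # 32.5
```

So a naive projection of 130 fps at 1080p gives roughly a quarter of that at 4K, which is why a step up in GPU tier is usually needed to hold the same frame rate at 4K.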
First, I really appreciate the great information you're sharing here. Thanks a lot, you're both awesome.
Bottom line up front: I'm having only partial success with a setup very similar to the OP's (16" MBP and a 2080 eGPU). The card isn't working properly, and I would love some insight. I'm trying to get this up and running because I much prefer Macs but want to play with a Valve Index.
I am working with a 16" MBP base model (i7, 5300M dGPU) and a Gigabyte Nvidia RTX 2080 Super in a Sonnet 550W eGPU enclosure. Before going any further: could it be as simple as needing a 650W power supply, like the Sonnet 650W or the Razer Core X, for the 2080?
Continuing on: I installed the latest version of Windows 10 via Boot Camp and installed the latest Nvidia drivers. The 2080 is recognized by Device Manager, and both the external display connected directly to the eGPU and the internal display work. However, the card itself doesn't actually seem to be used while running benchmarks, so I am very confused.
My Heaven benchmark averaged a low 30 fps, exactly the same as the 5300M, so I'm assuming that's what was actually used for the test, even though the overlay says 2080 in the corner while the benchmark runs. 3DMark got a low score of about 3,000, and my MBP fans were going crazy the entire time while the eGPU was almost dead silent. However, when I run the Geekbench 5 compute test and isolate the 2080, I get a score around 100,000 versus only 25,000 when I isolate the 5300M, so the card does seem to work for that specific test.
Any ideas how I can get the card working properly? Any thoughts on why it would register and work in some ways but not the important ones? Is it because I have an i7 and a 5300M, or because I'm using a Sonnet instead of a Razer Core X? I have a Razer Core X arriving next week that I can try, but that explanation doesn't make sense to me.
@tsygna If you are running the RTX 2080 eGPU with an external monitor, make sure to set the external monitor as the primary display. This ensures all apps/games launch on the eGPU-connected monitor and therefore use the eGPU. The Sonnet Breakaway Box 550 has plenty of juice for the RTX 2080 and 87W Power Delivery for the 2019 16-in MacBook Pro.
Thanks - I will give it a shot shortly and let you know!
So basically you're saying the 2080 is installed correctly and recognized by the computer but not being used, so I need to tell the software specifically to use it instead of defaulting to the 5300M? I also read here that in Windows 10 you may have to select, app by app, which GPU profile to use for best performance. Do you think that applies to this situation?
eGPU setup on Windows 10
- Method 1: Connect a second monitor to the video output on the eGPU card and then set that monitor as your main display. For more details, please see https://support.microsoft.com/en-us/help/4340331/windows-10-set-up-dual-monitors .
- Method 2: If the software supports it (e.g. DaVinci Resolve), go to the preferences of that software and add the eGPU to the list of GPUs that can be used by that software. That way, the eGPU can be used to help render your videos for example.
- Method 3: In the system settings under Display settings > Graphics settings, add the programs you are using to the list of apps and then set the graphics preference for these apps to "High performance".
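For Method 3, the per-app preference Windows writes from that settings page lives in the registry, so it can also be set directly. A sketch of the equivalent .reg fragment (the game path below is a hypothetical example; `GpuPreference=2;` selects the high-performance GPU, `1` power saving, `0` system default):

```reg
Windows Registry Editor Version 5.00

; Per-app GPU preference, as written by Settings > Display > Graphics settings.
; The value name is the full path to the executable (backslashes escaped),
; and the data string "GpuPreference=2;" requests the high-performance GPU.
; "C:\Games\Example\game.exe" is a hypothetical path for illustration.
[HKEY_CURRENT_USER\Software\Microsoft\DirectX\UserGpuPreferences]
"C:\\Games\\Example\\game.exe"="GpuPreference=2;"
```

Editing through the Settings UI does the same thing; the registry form is mainly useful for checking what preference an app actually ended up with.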
I got it working thanks to your help itsage! I am very excited because the setup is now ready for my Valve Index arriving next week (hello Pistol Whip, Boneworks, Gorn, and next year Half-Life: Alyx). In addition to the initial setup, this is what I did to finally get it working:
First, in Windows display settings, I extended the desktop to my 70" 4K HDR TV instead of screen-mirroring, which I was doing before. The TV is connected via HDMI to the eGPU, and the eGPU is connected to my 16" MBP via TB3.
Second, in the Nvidia Control Panel under 3D Settings, I went to Set PhysX Configuration and set the 2080 as the processor (it was set to the 5300M).
I am not sure whether one or both of the changes above were necessary, because I did both at the same time, and it doesn't matter to me as long as it works.
Thanks again for your help, hope this helps anybody else searching. I will probably start a separate post with my specific situation just in case others are looking.
@tsygna Glad to hear you have the setup working! Assigning each app/game in Windows Graphics Option to use the “High Performance GPU” is the surest way to make use of the eGPU regardless of monitor configuration. Of course the eGPU powering an external monitor yields the best performance.
Hi guys sorry to revive this thread,
I seem to be in the same position as @tsygna: I have a Sonnet Breakaway Box 550W, but with an RTX 2080 Ti and the i9 edition of the MacBook Pro. I can get the eGPU working on the external display only (extended, of course), but never on the internal display, which is what I bought it for lol.
I have Windows 1903 installed, which seems the most eGPU-friendly; on 1909 the AMD GPU was used for everything regardless of the eGPU. I have changed the Nvidia settings to use the RTX 2080 Ti for OpenGL. I still need to check the PhysX option, which I'll do after work.
Is there any way I can just attach an HDMI dummy plug to the GPU and use the internal screen only? At the moment, with an actual display attached, I can't make it primary; the option is greyed out. It only lets me show on display 1 or display 2, extend, or duplicate. When I choose display 2 only, the eGPU works perfectly on the external screen, since the dGPU is cut out of the equation. Why can't Apple make it easy and disable the dGPU and enable the iGPU on eGPU detection? Working with Intel graphics is bread and butter for me.
I just want it working lol. I went through the effort of putting liquid metal on the MacBook Pro three hours after getting it (fairly easy to disassemble; a tutorial should be up on the NBR forums soon) and liquid-metalling the RTX 2080 Ti, as well as doing TDP and vBIOS mods on it, to make the bottleneck practically unnoticeable with the amount of overclocking I'm doing.
Edit: Even with high performance selected in Windows graphics options, the internal display still utilises the Radeon 5500M instead of the 2080 Ti.
I saw your GTX 1080 Ti setup with the MacBook and was quite impressed, to say the least.
Sorry for the long post, I hope I can get the problem across.
@damafiagamer The only way to accelerate the internal display with an eGPU is through Windows hybrid graphics. Go to Graphics settings and manually assign the High performance GPU to the game's EXE.
Thanks for replying quickly,
Yep, I did that, but the Radeon GPU is still utilised on the internal display. I tried 3DMark, Unigine, FurMark and a couple of games, all set to high performance, while the display is extended and NOT mirrored. I’ve reinstalled the Nvidia drivers too; I don’t know what I’m doing wrong...
If I go into display settings and choose only display 2 the external gpu works completely fine and the internal goes black (as it should).
Maybe the laptop isn’t compatible with the RTX 2080 Ti? Or there must be some driver setting I haven’t messed with yet.
If you render with the eGPU onto a display not connected to it (such as the internal display), there will be activity on both GPUs: the eGPU as well as the GPU connected to the display, because data is copied across GPU framebuffers. Is your issue that the RTX is not utilized at all on the internal display after setting high performance?
Hey brother, maybe this info will help, even though it’s basic and you sound like you know what you’re doing. For me the problems were all basic Windows settings and easy to fix, but because this is my first eGPU it took me a while to realize.
Software-wise, I used Boot Camp to auto-install the most recent version of Windows, then updated Windows and downloaded the newest drivers from Nvidia, so everything is the straightforward, newest software with no specific versions. Hardware-wise, I have to have the eGPU powered on but not connected to my MBP until Windows has booted; then I connect it and it is recognized properly. After that:
- Go to display options and choose to extend displays.
- Scroll back up to where the displays are labeled 1 and 2, and click the display you want as primary so it is highlighted.
- Scroll back down and check the box that says to use it as the main display.
Everything then worked fine. As a second measure, in Graphics settings I manually add the .exe file for each program I want and choose the high performance setting.
It should work for you too, since we now have almost the same setup: I took back my base model MBP and got an i9 version like yours, and we both have a 2080, though yours is a Ti and mine is a Super. For me it was just user error and lack of knowledge, since this is my first eGPU.
Hope this helps; sorry if I misunderstood your question.
Thank you both for your help. I got it working in the end, but the bottleneck was so severe (we're talking a 40% loss in performance) that I just went out and bought a used 28" 4K monitor off Gumtree.
I have that connected, and I've got to say I'm impressed by Samsung's 4K displays; calibrated, it gets close to even the MacBook Pro screen! Well, I guess I now have the portability factor and the gaming factor too.
You are 100% correct, and this GPU framebuffer issue is what causes the performance loss. I wish Apple made a 16" MacBook with just the i9 CPU and no dedicated graphics; that option alone would let the CPU draw more wattage and sustain higher clock speeds, making it a true productivity laptop. Something that could get close to the 9900K, like my HP Omen 17" with the same i9 CPU as the MacBook.
Here's my final setup, slightly cramped but it will do. Damn, I love this new monitor lol
You are 100% correct and this gpu frame buffer issue is what causes the performance loss.
Your decision to get an external display was a good one.
The performance loss is actually due to Thunderbolt latency, which comes into play because of the framebuffer copy in this scenario. Copying memory across framebuffers is generally fine in itself (Nvidia Optimus does exactly this), but when the copy runs between two GPUs connected via Thunderbolt, as here, the latency of the back-and-forth communication is simply too great.
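A quick back-of-envelope calculation supports this: the raw copy bandwidth alone comfortably fits in a TB3 link, so throughput is not the constraint; the round-trip latency is. A sketch, assuming an uncompressed 32-bit framebuffer and ignoring protocol overhead and the command/texture traffic sharing the same link:

```python
# Raw one-way bandwidth needed to copy an uncompressed framebuffer across
# the Thunderbolt link every frame (eGPU renders, frame is copied back to
# the GPU driving the internal panel). Ignores overhead and return traffic.

def copy_gbps(width: int, height: int, fps: int, bytes_per_pixel: int = 4) -> float:
    bits_per_second = width * height * bytes_per_pixel * 8 * fps
    return bits_per_second / 1e9

# 16" MBP internal panel (3072x1920) at 60 Hz
print(round(copy_gbps(3072, 1920, 60), 1))  # 11.3 (Gbit/s, one way)
```

Roughly 11 Gbit/s one way fits within TB3's nominal 32 Gbps of PCIe data bandwidth, which is why the observed ~40% loss is better explained by per-frame round-trip latency than by the link running out of throughput.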
The CPU now runs cooler too, as the internal GPU isn't utilised as much when the external display is active. I can get around 10-15 W more out of the CPU, so games should run smoother as well.
An external display was the best option to be honest. Thanks for your help bro.