RTX 3080, Thunderbolt 3, and Throughput Saturation?
The gist of my question is: am I already saturating TB3 throughput with my 2080, and therefore wouldn't see any improvement with an RTX 3080?
- This year's Dell XPS 15 9500
- Razer Chroma eGPU w/ an RTX 2080 Super in it
- Attached to laptop via TB3
- Attached to external monitor (3440 x 1440) via Displayport 1.2
Naturally I get the ~30% performance drop compared to a desktop with the same card. From what I understand, this is due to a throughput limitation of the TB3 connection. Could someone comment on how that TB3 limitation would affect an RTX 3080 card? Would I see an overall performance improvement, or would there be no improvement because of the TB3 bandwidth limitation?
I know there's a math equation at work here, and I'm curious to learn what it is. Thanks!
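Not the OP, but the rough math is just per-lane transfer rate × lane count. Here's a quick sketch using nominal PCIe figures (TB3 tunnels a PCIe 3.0 x4 link, and real-world usable data over TB3 is lower still, around 22 Gbps, so treat these as upper bounds):

```python
# Nominal PCIe throughput: per-lane rate (GT/s) x lanes x 128b/130b encoding.
# Real-world TB3 caps usable PCIe data lower (~22 Gbps), so these are
# theoretical ceilings, not measured figures.
def pcie_bandwidth_gbps(gen: int, lanes: int) -> float:
    rate_gt_per_s = {3: 8.0, 4: 16.0}[gen]  # transfer rate per lane
    return rate_gt_per_s * lanes * (128 / 130)

egpu_link = pcie_bandwidth_gbps(3, 4)    # TB3 eGPU: PCIe 3.0 x4
desktop   = pcie_bandwidth_gbps(3, 16)   # desktop slot: PCIe 3.0 x16

print(f"eGPU over TB3: {egpu_link:.1f} Gbps")   # ~31.5 Gbps
print(f"Desktop x16:   {desktop:.1f} Gbps")     # ~126.0 Gbps, 4x the eGPU link
```

So on paper the desktop slot has 4x the bandwidth; how much of that gap shows up in FPS depends on how much the game actually moves over the bus.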
In a reddit thread somewhere (r/nvidia?), I think someone from Nvidia responded to some questions. I remember someone asking about running the RTX 30-series in PCIe 3.0 slots and the performance hit. Nvidia responded that in their testing, current games only see a low single-digit % decrease in performance. We could extrapolate that to mean an eGPU could very well see quite a bit more of a performance drop, since the new cards are so much more powerful. Supposedly.
I agree that a 30% drop seems steep. CPU, RAM, and SSD could all be factors.
The number I have seen thrown around here is about 5-10%, assuming no CPU bottleneck and playing at 4K or a very high resolution; at 1080p the loss will be insane.
@mike_caputo, somebody has done an extensive review of the different options and their impact on performance. Not sure if it was mini i5. When I get on my laptop I'll try to find it.
If I remember correctly, the 25% figure sounds right.
@gakkou, if it were only 5-10%, everybody would go the eGPU route.
A) 2020 MacBook Pro, i7-1068NG7, 32GB RAM, 1TB, eGPU Razer Core X, Gigabyte OC 3080 10GB, Samsung 49" 1440p UltraWide C49RG
macOS Catalina 10.15.7, internal Bootcamp Windows 10 latest update (previously W10 2004 pci.sys swap)
B) 2018 MacBook Pro 13" TB3, 2.7 GHz i7 4 cores, 16GB, 1TB, eGPU Razer Core X, Nitro+ RX 5700 XT 8GB, LG 32UK550
macOS Catalina 10.15.2, ext SSD Windows 10 1903 V1 .295
In that thread it says the drop at 4K is around 5% compared with the same CPU in the same desktop PC. Of course the difference at 1080p is huge, but as the resolution goes higher, the drop gets smaller. No point buying a 3080 anyway if you play at 1080p; my iMac's internal card can handle that resolution at ultra settings easily.
@tsakal, that’s a great post I forgot all about. Would be great to sticky it, actually (if it isn’t already? Itsage?).
Something I often forgot when comparing FPS between my setup and a PC benchmark was that my laptop's CPU was a bigger bottleneck than TB3, so the performance loss was higher.
2017 13" MacBook Pro Touch Bar
GTX1060 + AKiTiO Thunder3 + Win10
GTX1070 + Sonnet Breakaway Box + Win10
GTX1070 + Razer Core V1 + Win10
Vega 56 + Razer Core V1 + macOS + Win10
Vega 56 + Mantiz Venus + macOS + W10
While high-end 2000-series cards could theoretically bottleneck on a PCIe 3.0 x8 slot, for the most part they run fine. This means an eGPU setup, using at best a 3.0 x4 connection, isn't affected as much. 3000-series cards, on the other hand, are made for PCIe 4.0, which is double the speed of PCIe 3.0. So in some niche cases they could even be bottlenecked by a PCIe 3.0 x16 slot, and thus perform much worse on an x4 link.
I wouldn't pull the trigger on a 3080 until eGPU enclosures that run on PCIe 4.0 emerge, along with a Thunderbolt, USB, or M.2 connection that also uses PCIe 4.0. Of course this is just simple math and theorycrafting; the best way to know for sure is when someone tests it out.
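To put rough numbers on the "double the speed" point: PCIe 4.0 runs 16 GT/s per lane vs gen 3's 8 GT/s, so on paper a hypothetical PCIe 4.0 x4 eGPU link would match today's desktop PCIe 3.0 x8. A quick sketch with nominal line rates only:

```python
# Per-lane line rates: PCIe 3.0 = 8 GT/s, PCIe 4.0 = 16 GT/s,
# both using 128b/130b encoding. Nominal figures, not real-world throughput.
ENCODING = 128 / 130  # usable data fraction per transfer

def link_gbps(rate_gt: float, lanes: int) -> float:
    return rate_gt * lanes * ENCODING

gen3_x4 = link_gbps(8, 4)    # today's TB3 eGPU link
gen3_x8 = link_gbps(8, 8)
gen4_x4 = link_gbps(16, 4)   # hypothetical PCIe 4.0 eGPU link

print(f"PCIe 3.0 x4: {gen3_x4:.1f} Gbps")
print(f"PCIe 3.0 x8: {gen3_x8:.1f} Gbps")
print(f"PCIe 4.0 x4: {gen4_x4:.1f} Gbps")  # same ceiling as 3.0 x8
```

Whether any Thunderbolt successor actually exposes a full gen 4 x4 link is another question entirely.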
Actually, even the 30 series won't take much advantage of PCIe 4.0, as explained in the NVIDIA answers thread on Reddit:
Will customers find a performance degradation on PCIE 3.0?
System performance is impacted by many factors and the impact varies between applications. The impact is typically less than a few percent going from a x16 PCIE 4.0 to x16 PCIE 3.0. CPU selection often has a larger impact on performance. We look forward to new platforms that can fully take advantage of Gen4 capabilities for potential performance increases.
Full thread available here - https://www.reddit.com/r/nvidia/comments/ilhao8/nvidia_rtx_30series_you_asked_we_answered/
I think the 3070 will be great for an eGPU. I might actually buy that one, or the 3080 when it comes out, but I'm not sure how much better the 3080 will be as an eGPU.