RTX 3080, Thunderbolt 3, and Throughput Saturation?
 

Mike Caputo
(@mike_caputo)
New Member
Joined: 4 weeks ago
 

The gist of my question: am I already saturating TB3 throughput with my 2080, and would I therefore see no improvement from an RTX 3080?

I'm running:

  • This year's Dell XPS 15 9500
  • Razer Chroma eGPU w/ an RTX 2080 Super in it
  • Attached to laptop via TB3
  • Attached to an external monitor (3440 x 1440) via DisplayPort 1.2

Naturally I get the ~30% performance drop compared to a desktop with the same card. From what I understand, this is due to a throughput limitation of the TB3 connection. Could someone comment on how that TB3 limitation would affect an RTX 3080 card? Would I see an overall performance improvement, or would there be no improvement because of the TB3 bandwidth limitation? 

I know there's a math equation at work here, and I'm curious to learn what it is. Thanks!
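
Here's as far as I've gotten with the numbers myself, as a quick Python sketch. The PCIe figures are nominal per-spec rates, and the ~22 Gbps of actual PCIe data over TB3 is just the figure commonly cited around here, not something I've measured, so treat both as assumptions:

    # Nominal link bandwidths in Gbps. Real-world TB3 throughput varies
    # by host controller; ~22 Gbps usable is the commonly cited figure.
    def pcie_gbps(gt_per_s, lanes, encoding=128 / 130):
        """Nominal PCIe throughput after 128b/130b encoding overhead."""
        return gt_per_s * lanes * encoding

    pcie3_x16 = pcie_gbps(8, 16)  # desktop slot: ~126 Gbps
    pcie3_x4 = pcie_gbps(8, 4)    # best case behind a TB3 enclosure: ~31.5 Gbps
    tb3_tunnel = 22               # assumed usable PCIe data over TB3

    print(f"PCIe 3.0 x16: {pcie3_x16:6.1f} Gbps")
    print(f"PCIe 3.0 x4 : {pcie3_x4:6.1f} Gbps")
    print(f"TB3 tunnel  : {tb3_tunnel:6.1f} Gbps (same pipe for a 2080 or a 3080)")

If that's right, the pipe stays the same no matter which card is in the enclosure, so the real question is how much of each frame's work has to cross it.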

 

Update: Amazon has a landing page for AIB RTX 3080 cards [link]. Newegg also has listings for AIB cards [link]. Best Buy will have the RTX 3080 FE [link].


Gakkou
(@gakkou)
Eminent Member
Joined: 2 months ago
 

You get a 30% performance drop compared to a desktop? That's quite a drop. Have you checked that the laptop's processor isn't bottlenecking you?

DrEGPU
(@dregpu)
Estimable Member
Joined: 2 years ago
 

In a Reddit thread somewhere (r/nvidia?), I think someone from Nvidia responded to some questions. I remember someone asking about running the RTX 30-series in PCIe 3.0 slots and the performance hit. Nvidia responded that in their testing, current games only see a low single-digit % decrease in performance. But that was comparing x16 to x16; an eGPU link is far narrower, so you could still see quite a bit of performance drop in an eGPU, especially since the new cards are so much more powerful. Supposedly.

I agree that a 30% drop seems steep. CPU, RAM, and SSD could all be factors.

MBP 2018 16 inch + Razer Core X@40Gbps-TB1 (Zotac RTX 2080 Ti Amp) + Ubuntu 20.04, Win10

 
2011 13" MacBook Pro [2nd,2C,M] + RTX 2080 Ti @ 10Gbps-TB1>TB3 (Razer Core X) + Linux Ubuntu 18.04.02 LTS [build link]  


Gakkou
(@gakkou)
Eminent Member
Joined: 2 months ago
 

@dregpu

 

The number I have seen thrown around here is about 5-10%, assuming no CPU bottleneck and playing at 4K or a similarly high resolution. At 1080p the loss will be insane.
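
A quick way to see why low resolutions hurt more: the TB3 pipe is fixed, so the higher the frame rate, the less bus budget each frame gets. Here's a rough illustration in Python; the ~22 Gbps usable link is the commonly cited assumption, and the frame rates are made-up but plausible numbers, not measurements:

    # Per-frame bus budget over a fixed ~22 Gbps (~2.75 GB/s) TB3 tunnel.
    LINK_MB_PER_S = 22 / 8 * 1024  # ~2816 MB/s

    for res, fps in [("1080p", 200), ("3440x1440", 100), ("4K", 60)]:
        budget = LINK_MB_PER_S / fps
        print(f"{res:>9} @ {fps:3d} fps -> ~{budget:3.0f} MB of bus traffic per frame")

At 4K the GPU itself is the limiter and the frame rate is low, so the link has headroom. At 1080p a big card wants to render frames faster than the link can feed it data.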

 

tsakal
(@tsakal)
Reputable Member
Joined: 2 years ago
 

@mike_caputo, somebody has done an extensive review of the different options and their impact on performance. Not sure if it was mini i5. When I get on my laptop I'll try to find it.

If I remember correctly, the 25% figure sounds right.

 

@gakkou, if it were only 5-10%, everybody would go the eGPU route.

@mike_caputo There you go: https://egpu.io/forums/mac-setup/pcie-slot-dgpu-vs-thunderbolt-3-egpu-internal-display-test/

A) 2020 MacBook Pro, i7-1038NG7, 32GB RAM, 1TB, EGPU Razer Core X, Nitro+ RX5700 xt 8Gb, Samsung 65 Q70R
Mac OS Catalina 10.15.5, Internal Bootcamp Windows 10 2004 pci.sys swap.
B) 2.7 GHz I7 4 Cores, 16Gb, 1TB MBP 13 2018 TB3 , EGPU Razer Core X, Nitro+ RX5700 xt 8Gb, LG 32UK550
Mac OS Catalina 10.15.2, Ext SSD Windows 10 1903 V1 .295

C) 2.7 GHz I7 4 Cores, 16Gb, 1TB MBP 13 2018 TB3 , EGPU Gigabyte Gaming Box RX580 8Gb, Mac OS Catalina 10.15.2, Ext SSD Windows 10 1803

D) 3.1 GHz I7, 16Gb, 1TB MBP 13 2015 TB2 , EGPU Gigabyte Gaming Box RX580 8Gb


Gakkou
(@gakkou)
Eminent Member
Joined: 2 months ago
 

@tsakal

In that thread it says that at 4K the drop is around 5% compared with the same CPU in a desktop PC. Of course the 1080p difference is huge, but as the resolution goes higher, the drop becomes smaller. No point buying a 3080 anyway if you play at 1080p; my iMac's internal card can handle that resolution at ultra settings easily.

 

Eightarmedpet
(@eightarmedpet)
Noble Member Moderator
Joined: 4 years ago
 

@tsakal, that's a great post I forgot all about. Would be great to sticky it, actually (if it isn't already? itsage?).

Something I often forgot when comparing FPS between my setup and a PC benchmark: my laptop CPU was a bigger bottleneck than TB3, so the performance loss was higher.

After these new Nvidia cards are released, I'm going to be looking at a Mac mini paired with an RTX 3070. Should be a beast!

 

2017 13" MacBook Pro Touch Bar
GTX1060 + AKiTiO Thunder3 + Win10
GTX1070 + Sonnet Breakaway Box + Win10
GTX1070 + Razer Core V1 + Win10
Vega 56 + Razer Core V1 + macOS + Win10
Vega 56 + Mantiz Venus + macOS + W10

---

LG 5K Ultrafine flickering issue fix

 
2017 13" MacBook Pro [7th,2C,U] + RX 5700 XT @ 32Gbps-TB3 (Mantiz Venus) + macOS 10.15.4 & Win10 2004 [build link]  


tilchev
(@tilchev)
Eminent Member
Joined: 5 months ago
 

While high-end 2000-series cards could theoretically be bottlenecked by a PCIe 3.0 x8 slot, for the most part they run fine, which means an eGPU setup on an (at best) 3.0 x4 connection isn't hurt as badly as you might expect. 3000-series cards, on the other hand, are made for PCIe 4.0, which is double the per-lane speed of PCIe 3.0. So in some niche cases they could even be bottlenecked by a PCIe 3.0 x16 slot, and would thus perform much worse on an x4 link.

I wouldn't pull the trigger on a 3080 until eGPU enclosures that run on PCIe 4.0 emerge, with a Thunderbolt, USB, or M.2 connection that also carries PCIe 4.0. Of course this is just simple math and theorycrafting; the only way to know for sure is for someone to test it out. :)
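
To put numbers on that "simple math": this only compares nominal link ceilings, and a 3080 doesn't necessarily need its full native bandwidth (per the Nvidia Reddit answers mentioned above), so it's a ceiling comparison, not a performance prediction:

    # Fraction of each card's native slot bandwidth that survives a
    # PCIe 3.0 x4-class link (nominal rates; TB3 in practice is lower).
    def pcie_gbps(gt_per_s, lanes):
        return gt_per_s * lanes * 128 / 130  # 128b/130b encoding

    link_x4 = pcie_gbps(8, 4)        # ~31.5 Gbps, enclosure best case
    native_2080 = pcie_gbps(8, 16)   # RTX 2080: PCIe 3.0 x16, ~126 Gbps
    native_3080 = pcie_gbps(16, 16)  # RTX 3080: PCIe 4.0 x16, ~252 Gbps

    print(f"2080 on x4 link: {link_x4 / native_2080:.1%} of native bandwidth")
    print(f"3080 on x4 link: {link_x4 / native_3080:.1%} of native bandwidth")

Nominally that's 25% of native for the 2080 but only 12.5% for the 3080 over the same link, which is why I'd rather wait for PCIe 4.0-capable enclosures.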

2015 15" Dell Latitude E5570 (R7 M370) [6th,4C,H] + GTX 1060 @ 32Gbps-M2 (ADT-Link R43SG) + Win10 [build link]  

Arizor
(@arizor)
Trusted Member
Joined: 7 months ago
 

Actually, even the 30-series won't take much advantage of PCIe 4.0, as explained in the NVIDIA Q&A thread on Reddit:

 

PCIE Gen4

Will customers find a performance degradation on PCIE 3.0?

System performance is impacted by many factors and the impact varies between applications. The impact is typically less than a few percent going from a x16 PCIE 4.0 to x16 PCIE 3.0. CPU selection often has a larger impact on performance. We look forward to new platforms that can fully take advantage of Gen4 capabilities for potential performance increases.

 

Full thread available here - https://www.reddit.com/r/nvidia/comments/ilhao8/nvidia_rtx_30series_you_asked_we_answered/  

2019 16-inch MacBook Pro Retina (2.3ghz Intel i9) // Radeon 5500M (8GB).
MacOS Catalina // Bootcamp Win10 2004 (Build 19041).
Razer Core X // Vega 64 // 2m Active Thunderbolt cable.


Gakkou
(@gakkou)
Eminent Member
Joined: 2 months ago
 

@eightarmedpet

I think the 3070 will be great for eGPU use. I might actually buy that one, or the 3080 when it comes, but I'm not sure how much better the 3080 will be as an eGPU.

 
