Ah, fair, so there is a notable performance increase with the 2080 Ti setup then?
I thought Thunderbolt 3 would completely bottleneck GPU and CPU performance.
Also, have you run into any stuttering while playing games? I've heard quite a lot of people complain about that when using an eGPU.
How much of a temperature decrease do you get in the laptop overall? Does it affect the CPU temperature as well?
Thanks for your benchmarks!
With the increase in VR performance, do you see a notable increase in non-VR games as well?
Also, have you tried running it at 4K, since the 3080 is designed for that resolution?
Yes, there's a significant increase for me with the 2080 Ti.
I'm definitely being bottlenecked in my RB15 by the TB3 port, but if I had an Ice Lake or one of the soon-to-be-released Tiger Lake processors, the bottleneck would be much smaller, since on those the TB3 ports connect directly to the CPU.
I had stuttering at first; it was solved by installing the drivers correctly (there are guides on this website) and by disabling my dGPU whenever I'm plugged into the eGPU.
As for temps, I went from 90-95C highs on the CPU while gaming to 80C highs, and down to 75C highs if I force my laptop's fans to run at full blast (I've found Razer's fan profile is way, way more conservative than it should be at gaming temps).
Laptop: Razer Blade Advanced 15" w/ RTX 2060, i7-8750H, 16GB RAM
eGPU: Razer Core X Chroma w/ RTX 2080 Ti
Yeah, I've read that the new Tiger Lake CPUs will have Thunderbolt 4.
But I also read that Thunderbolt 4 won't be much faster than 3, with the same 40Gb/s transfer rate. I assume Intel will keep the same architecture in Tiger Lake as in the Ice Lake CPUs, so we can actually see how much of an improvement TB4 is.
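Back-of-the-envelope, the raw link rate isn't the whole story anyway: on TB3 only around 32Gb/s of the 40Gb/s link is usable for PCIe data (roughly PCIe 3.0 x4), versus ~126Gb/s for a desktop PCIe 3.0 x16 slot. A quick sketch of that comparison (nominal spec figures, not measurements from my setup):

```python
# Back-of-the-envelope eGPU bandwidth comparison (nominal spec figures, not measured).
TB_PCIE_GBPS = 32        # approx. max PCIe payload over TB3 (~PCIe 3.0 x4)
PCIE3_LANE_GBPS = 7.877  # PCIe 3.0 effective rate per lane (8 GT/s, 128b/130b)

desktop_x16 = 16 * PCIE3_LANE_GBPS
print(f"eGPU PCIe budget: {TB_PCIE_GBPS} Gb/s (~{TB_PCIE_GBPS / 8:.0f} GB/s)")
print(f"Desktop x16 slot: {desktop_x16:.0f} Gb/s (~{desktop_x16 / 8:.1f} GB/s)")
print(f"eGPU link is ~{TB_PCIE_GBPS / desktop_x16:.0%} of a desktop slot")
```

So even if TB4 keeps the same 40Gb/s link, the practical question is how much of it the PCIe tunnel actually gets.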
Interesting, I usually run a cooler when gaming and set the fans to balanced, and that keeps the temperature around the low 80s and high 70s. I'm curious how much of a difference the eGPU could make in that case! Also, what are your temperatures when running the eGPU?
I don't play non-VR games these days, sorry; I focus on VR games.
I ran Time Spy and Time Spy Extreme (4K).
dGPU RTX 2070, Time Spy: 7524 (graphics 7804, CPU 6255)
https://www.3dmark.com/3dm/51945385?
dGPU RTX 2070, Time Spy Extreme: 3473 (graphics 3673, CPU 2657)
https://www.3dmark.com/3dm/51945385?
RTX 3080, Time Spy: 11792 (graphics 13544, CPU 6805)
https://www.3dmark.com/3dm/51459819
RTX 3080, Time Spy Extreme: 6142 (graphics 7846, CPU 2754)
https://www.3dmark.com/3dm/51946576?
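Putting those scores into percentages (just arithmetic on the numbers above):

```python
# Uplift of the RTX 3080 eGPU over the dGPU RTX 2070, from the scores above.
scores = {
    "Time Spy (overall)":          (7524, 11792),
    "Time Spy (graphics)":         (7804, 13544),
    "Time Spy Extreme (overall)":  (3473, 6142),
    "Time Spy Extreme (graphics)": (3673, 7846),
}
for test, (dgpu, egpu) in scores.items():
    print(f"{test}: +{(egpu - dgpu) / dgpu:.0%}")
```

That works out to roughly +57%/+74% in Time Spy and +77%/+114% in Time Spy Extreme.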
Do you think an EVGA 3080 FTW3, which takes three 8-pin connectors, will work with the Razer Core X Chroma?
Razer Blade 15 Advanced (2019) 32GB/1TB SSD, RTX 2070 Max-Q w/ Razer Core X Chroma (EVGA RTX 3080 FTW3 Ultra)
The Razer Core X Chroma's power supply has two 6+2 (8-pin) PCIe connectors, so it can't natively feed a card that needs three 8-pin connectors.
@mo_oi, so you're saying the 700W Razer eGPU won't work with the EVGA FTW3 and its third 8-pin connector? It needs a PSU with three 8-pin connectors?
Razer Blade 15 Advanced (2019) 32GB/1TB SSD, RTX 2070 Max-Q w/ Razer Core X Chroma (EVGA RTX 3080 FTW3 Ultra)
Get an 8-pin to dual 8-pin cable. If the power supply itself is single rail, then it should work.
https://www.amazon.com/COMeap-Express-Adapter-Braided-Y-Splitter/dp/B072JR4H3N
Just put that on one of the 8-pin cables... obviously at your own risk, but I see no reason why it wouldn't work. The Core X Chroma's power supply is 700W (plenty of overhead for a 3080), and if it really is a single-rail supply you shouldn't have to worry about over-current protection.
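For what it's worth, a rough power-budget sketch under nominal PCIe spec limits; the ~400W card draw is my assumption for an FTW3 Ultra, not a measured figure:

```python
# Rough power budget for a triple-8-pin card in the Core X Chroma via a splitter.
# Spec figures are nominal; the 400 W card draw is an assumed worst case.
SLOT_W = 75         # PCIe x16 slot limit (spec)
EIGHT_PIN_W = 150   # per 8-pin PCIe connector (spec)
CARD_MAX_W = 400    # assumed worst-case draw of an EVGA 3080 FTW3 Ultra
PSU_W = 700         # Core X Chroma PSU rating

card_side = SLOT_W + 3 * EIGHT_PIN_W  # what the card can pull by spec
split_lead = 2 * EIGHT_PIN_W          # the splitter puts two connectors' load
                                      # on one PSU lead, hence the OCP caveat
print(f"Spec power to card: {card_side} W vs ~{CARD_MAX_W} W card draw")
print(f"Split lead carries up to {split_lead} W")
print(f"PSU headroom: {PSU_W - CARD_MAX_W} W (less ~100 W if the laptop is charging)")
```

On paper the 700W supply covers it; the real question is concentrating two connectors' load on one lead, which is exactly why single rail matters.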
LG Gram 17 | Sonnet Breakaway Box 550 | Asus Strix RTX 2070 OC Edition | Win 10 Pro 20H2 + Fedora 32 Dual Boot
2018 17" LG Gram 17 [8th,4C,U] + RTX 2070 @ 32Gbps-TB3 (Sonnet Breakaway 550) + Win10 [build link]