2018 17" Lenovo ThinkPad P72 [8th,6C,H] + GTX 1080 Ti @ 32Gbps-TB3 (Razer Core X) + Win10 [bobbie424242] // Detailed 4K benchmarks comparison vs my i7-8700K gaming PC  


bobbie424242
(@bobbie424242)
Active Member
Joined: 3 months ago
 

 

Introduction

(Preamble: this post is adapted from an /r/egpu post presenting benchmark comparison results between an eGPU and a gaming PC at 4K, hence it does not follow the template. It is also a build, but the main purpose is benchmarking.)

I have both a gaming PC and a powerful laptop (specs below), and I've been thinking of replacing the PC with my laptop + eGPU.

One particularity of my setup is that I have a 4K / 60Hz G-Sync monitor (the Acer Predator XB271HK) and I game exclusively at 4K.
While researching the viability of this, I found that it is virtually impossible to find eGPU 4K benchmarks, let alone a comparison against a PC with the same graphics card and a CPU of comparable performance.

Thus I decided to buy a Razer Core X and do the comparison myself. This post presents the results using several games that I own, or benchmark demos of games. There are 3DMark synthetic benchmarks as well.

TL;DR

This benchmarking shows that the performance loss at 4K with an eGPU is greater than I thought it would be, which is somewhat disappointing. In AAA games it is not rare to see a 30% drop in FPS.

Some games that are within G-Sync range (36-60 fps, e.g. Assassin's Creed: Origins and KC:D) on the PC, and playable that way, fall below the G-Sync range with an eGPU and require reducing graphics settings or resolution. Games that run above 60 fps, or well above (e.g. DOOM), can still stay above 60, or drop just below and remain within G-Sync range, thus still being OK.

Keep in mind, though, that these benchmarks were run at maxed-out graphics settings (or close to it), which is rarely the most effective way to play: slightly lower settings can perform far better for minimal (or unnoticeable) visual differences.

Methodology

PC specs

* Motherboard: MSI Z370 SLI PLUS
* CPU: Intel Core i7-8700K, no overclock
* RAM: 2x16GB DDR4 2666 MHz, dual channel

Laptop specs

* Model: Lenovo ThinkPad P72
* CPU: Intel Core i7-8850H, -135 mV undervolt
* RAM: 2x8GB DDR4 2400 MHz, dual channel
* Razer Core X connected with 4-lane TB3

Common

* MSI GTX 1080 Ti GAMING X 11G, no overclock
* Windows 10 1903
* NVIDIA driver v431.36
* Geforce Experience overlay disabled
* G-Sync disabled in NVIDIA CP
* Power management set to "Optimal Power" in NVIDIA CP

Games

* DOOM 2016 (Vulkan)
* F1 2016 (in-game benchmark)
* Final Fantasy XV Demo (in-game benchmark)
* Assassin's Creed: Origins (in-game benchmark)
* Assetto Corsa (in-game benchmark)
* Kingdom Come: Deliverance
* Forza Horizon 4 demo (in-game benchmark)

ALL TESTS PERFORMED AT 4K ON AN EXTERNAL 4K MONITOR, v-sync off

For games with no built-in benchmark, I ran through the same one-minute section and recorded frame rates with the MSI Afterburner benchmark tool.

Geekbench CPU benchmarks

PC (i7-8700K): single core: 5429, multicore: 24532
Laptop (i7-8850H): single core: 5215, multicore: 22493

=> Both CPUs are fairly close in performance, which is not too surprising as both have 6 cores. This is good for this GPU vs eGPU comparison, as the CPU should not significantly affect the results.
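
For reference, the gap works out to (5429 - 5215) / 5429 ≈ 3.9% single-core and (24532 - 22493) / 24532 ≈ 8.3% multi-core in favor of the desktop.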

Synthetic GPU benchmarks

PC

Time Spy: 9405 / Graphics: 9957 / CPU: 7160
Time Spy Extreme: 4405 / Graphics: 4640 / CPU: 3426
Fire Strike: 22063 / Graphics: 28435 / Physics: 18107 / Combined: 9378
Fire Strike Ultra: 7099 / Graphics: 7055 / Physics: 18078 / Combined: 3810

eGPU

Time Spy: 8049 / Graphics: 8432 (-15.32%) / CPU: 6402
Time Spy Extreme: 3996 / Graphics: 4325 (-6.79%) / CPU: 2795
Fire Strike: 17205 / Graphics: 19855 (-30.17%) / Physics: 16695 / Combined: 8801
Fire Strike Ultra: 6426 / Graphics: 6331 (-10.26%) / Physics: 16717 / Combined: 3549

[Screenshots: Time Spy, Time Spy Extreme, Fire Strike, and Fire Strike Ultra score comparisons, PC vs eGPU]
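
For anyone who wants to double-check the loss figures, the percentages above are plain relative drops computed from the graphics sub-scores. A minimal Python sketch of that arithmetic (scores copied from this post):

# Relative graphics-score loss, PC vs eGPU, from the 3DMark runs above
scores = {
    "Time Spy":          (9957, 8432),
    "Time Spy Extreme":  (4640, 4325),
    "Fire Strike":       (28435, 19855),
    "Fire Strike Ultra": (7055, 6331),
}
for test, (pc, egpu) in scores.items():
    print(f"{test}: -{(pc - egpu) / pc * 100:.2f}%")

Running it reproduces -15.32%, -6.79%, -30.17% and -10.26%.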

DOOM 2016 / Vulkan / Ultra / Kadingir Sanctum

PC

Average framerate : 111.3 FPS
Minimum framerate : 91.2 FPS
Maximum framerate : 136.8 FPS
1% low framerate : 88.3 FPS
0.1% low framerate : 81.8 FPS

eGPU

Average framerate : 84.8 FPS
Minimum framerate : 67.7 FPS
Maximum framerate : 105.8 FPS
1% low framerate : 66.0 FPS
0.1% low framerate : 59.0 FPS

=> Average framerate 23.81% decrease

F1 2016 / built-in benchmark / Ultra High preset / SMAA+TAA / Aniso 16x / Clear weather

PC: AVG: 81, MIN:68, MAX:95
eGPU: AVG: 56, MIN:44, MAX:73

=> 30.86% AVG FPS decrease

Final Fantasy XV Demo / built-in benchmark / High Quality / Fullscreen

PC: Score: 4819
eGPU: Score: 4007

=> 16.85% decrease

Assassin's Creed: Origins / built-in benchmark / Ultra-High preset / FOV 115 / max fps 60

PC FPS: 49
eGPU FPS: 34

=> 30.61% FPS decrease

Assetto Corsa / in-game benchmark / graphics settings maxed out

PC: FPS: AVG=160 MIN=92 MAX=194 VARIANCE=0 CPU=55% POINTS: 23543
eGPU: AVG=125 MIN=14 MAX=262 VARIANCE=5 CPU=48% POINTS: 18429

=> 21.72% AVG decrease

Kingdom Come: Deliverance / Very High preset

PC

Average framerate : 39.7 FPS
Minimum framerate : 34.1 FPS
Maximum framerate : 44.1 FPS
1% low framerate : 33.4 FPS
0.1% low framerate : 31.6 FPS

eGPU

Average framerate : 26.7 FPS
Minimum framerate : 19.2 FPS
Maximum framerate : 31.7 FPS
1% low framerate : 18.8 FPS
0.1% low framerate : 17.1 FPS

=> Average framerate 32.74% decrease

Forza Horizon 4 demo (in-game benchmark)

PC: GPU FPS: AVG:80.7 MIN:71.3 MAX:84.7
eGPU: GPU FPS: AVG:51.4 MIN:42.6 MAX:63

=> AVG:36.3% decrease
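
Summing the per-game average drops above for a rough overall picture: (23.81 + 30.86 + 16.85 + 30.61 + 21.72 + 32.74 + 36.3) / 7 ≈ 27.6%, i.e. on average the eGPU loses a bit over a quarter of the desktop's frame rate in these games at 4K.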

Leon Woodward
(@leon_woodward)
Active Member
Joined: 5 months ago
 

Fantastic write-up, thanks! How did you find the setup? Any issues switching from the dGPU to the eGPU?

Laptop: Lenovo X1 Extreme Gen1 w/ i7-8750H GTX 1050 Max-Q dGPU
eGPU Enclosure: Razer Core X Chroma
eGPU: GTX 960
Cable: Caldigit 2m TB3 Active Cable


OliverB
(@oliverb)
Noble Member
Joined: 12 months ago
 
Posted by: @bobbie424242

Synthetic GPU benchmarks

PC

Time Spy: 9405 / Graphics: 9957 / CPU: 7160
Time Spy Extreme: 4405 / Graphics: 4640 / CPU: 3426
Fire Strike: 22063 / Graphics: 28435 / Physics: 18107 / Combined: 9378
Fire Strike Ultra: 7099 / Graphics: 7055 / Physics: 18078 / Combined: 3810

eGPU

Time Spy: 8049 / Graphics: 8432 (-15.32%) / CPU: 6402
Time Spy Extreme: 3996 / Graphics: 4325 (-6.79%) / CPU: 2795
Fire Strike: 17205 / Graphics: 19855 (-30.17%) / Physics: 16695 / Combined: 8801
Fire Strike Ultra: 6426 / Graphics: 6331 (-10.26%) / Physics: 16717 / Combined: 3549

Thank you for this verification. It confirms both theory and experience regarding the following rule: at higher resolutions, an eGPU performs better relative to a desktop.

At 5K it is even better: only about a 4% performance loss, which means 5K gamers can use an eGPU.

2018 15" MBP & 2015 13" MBP connected to RTX2080Ti GTX1080Ti GTX1080 Vega56 RX580 R9-290 GTX680


bobbie424242
(@bobbie424242)
Active Member
Joined: 3 months ago
 

@leon_woodward

Setup was super easy. Since my P72 has a Quadro P600 dGPU and uses Lenovo-packaged drivers for it, I had to install the latest drivers from NVIDIA, which to my surprise installed just fine from GeForce Experience and recognized both the Quadro and the 1080 Ti.

As for switching, I usually booted with the eGPU either connected or disconnected. I vaguely remember that the one time I tried to hot-plug the eGPU it wasn't detected, but I really did not try hard (since my purpose was benchmarking, I preferred to start from a clean boot).

Overall it was a painless experience and worked out of the box.

Posted by: @oliverb

Thank you for this verification. It confirms both theory and experience regarding the following rule: at higher resolutions, an eGPU performs better relative to a desktop.

At 5K it is even better: only about a 4% performance loss, which means 5K gamers can use an eGPU.

3DMark 4K benchmarks show a limited loss of performance, but that did not translate to the game benchmarks (at least for the games I tested), where the loss was 15-35%, which is enormous and disappointing. Kingdom Come: Deliverance, which is perfectly playable at 4K on my PC (35-45 fps) thanks to G-Sync, becomes unplayable on the eGPU (20-30 fps) unless I downgrade graphics significantly.
The conclusion of my eGPU vs PC 4K test is that I will keep my gaming PC, because under-using that 1080 Ti in an eGPU setup feels like such a letdown and downgrade. If I had no point of comparison, I would probably be happy with the eGPU setup.

OliverB
(@oliverb)
Noble Member
Joined: 12 months ago
 
Posted by: @bobbie424242

3DMark 4K benchmarks show a limited loss of performance, but that did not translate to the game benchmarks (at least for the games I tested), where the loss was 15-35%, which is enormous and disappointing. Kingdom Come: Deliverance, which is perfectly playable at 4K on my PC (35-45 fps) thanks to G-Sync, becomes unplayable on the eGPU (20-30 fps) unless I downgrade graphics significantly.
The conclusion of my eGPU vs PC 4K test is that I will keep my gaming PC, because under-using that 1080 Ti in an eGPU setup feels like such a letdown and downgrade. If I had no point of comparison, I would probably be happy with the eGPU setup.

@bobbie424242

I do not concur; I have a different opinion and experience in this matter. A large range of tests with an EVGA GTX 1080 Ti SC2 showed that this card performs extraordinarily well at 4K and 5K in an eGPU setup. In most games there is no notable difference compared to a desktop.

If you have a look at those benchmarks, it becomes clear that the GTX 1080 Ti performs great as an eGPU, especially in 4K and 5K setups. Quite a number of current games are actually playable at 5K and highest settings. On a desktop PC the numbers are not much better, less than 5%.

For 4K and 5K gamers there is absolutely no need to keep a desktop PC when there is a setup like this. If you go for 1440p and 144 Hz, that is a different story. For very high FPS an eGPU setup falls behind, but we are talking about 4K/5K at 60 Hz.

 


2018 15" MBP & 2015 13" MBP connected to RTX2080Ti GTX1080Ti GTX1080 Vega56 RX580 R9-290 GTX680


bobbie424242
(@bobbie424242)
Active Member
Joined: 3 months ago
 

@oliverb

Well, my game benchmarks tell otherwise. And I took great care to test with setups as identical as possible (same drivers, same Windows version and settings, same NVIDIA Control Panel settings, same game settings...).

OliverB
(@oliverb)
Noble Member
Joined: 12 months ago
 
Posted by: @bobbie424242

@oliverb

Well, my game benchmarks tell otherwise. And I took great care to test with setups as identical as possible (same drivers, same Windows version and settings, same NVIDIA Control Panel settings, same game settings...).

@bobbie424242

Why don't we double-check this apparent contradiction: you pick three of the games tested in my benchmarks.

Then you run the benchmarks at 4K, or better 5K if you can, for both the desktop and the eGPU setup. Important: max details, everything maxed out. Comparing those results with mine will probably enlighten us.
Thank you very much.

2018 15" MBP & 2015 13" MBP connected to RTX2080Ti GTX1080Ti GTX1080 Vega56 RX580 R9-290 GTX680


psonice
(@psonice)
Estimable Member
Joined: 2 years ago
 

Some general thoughts on this (I'm a graphics programmer, not a gamer):

- The system difference might have more of an effect than you expect. There's going to be non-rendering work to do which will be similar on both systems, and the remaining time gets spent rendering. The slightly faster system can end up with a lot more time left, meaning more frames can be rendered.

- RAM will make a difference too - probably not with Geekbench, but a game might use >16GB, and each time it has to wait for disk (even with a fast SSD) it's going to stall things.

Basically, you want to plug the eGPU into the desktop for a proper comparison, or the results won't mean much.

- The reason why the performance drop is much lower at 4k or 5k: the only performance difference with an eGPU over an internal GPU is bandwidth. In a game, the CPU has to copy a bunch of stuff to the GPU for every frame, and depending on the game architecture the GPU might stall waiting for it or the game code might have to wait for the GPU work to complete, which takes longer if it's waiting for that data to transfer. More frames per second means more waiting on that slower thunderbolt link.

So if you crank the resolution up, you get a lower frame rate, and the lower frame rate means fewer transfers and less bandwidth hit. So less performance drop on the eGPU.
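
To make that concrete, here is a toy frame-time model (a sketch in Python; the fixed 2 ms per-frame transfer/sync overhead is an assumed illustrative number, not a measured Thunderbolt 3 figure). The same absolute overhead eats a much bigger share of the frame budget at high frame rates:

# Toy model: eGPU frame time = desktop render time + a fixed per-frame transfer/sync cost.
# The 2 ms overhead is an assumed illustrative value, not a measured TB3 number.
OVERHEAD_MS = 2.0

def egpu_fps(desktop_fps):
    render_ms = 1000.0 / desktop_fps           # per-frame render time on the desktop
    return 1000.0 / (render_ms + OVERHEAD_MS)  # same render work plus the link overhead

for fps in (144, 60, 40):
    drop = (fps - egpu_fps(fps)) / fps * 100
    print(f"{fps:>3} fps desktop -> {egpu_fps(fps):5.1f} fps eGPU ({drop:.0f}% drop)")

With these made-up numbers a 144 fps game loses about 22%, a 60 fps game about 11%, and a 40 fps game about 7% - the overhead doesn't change, only its share of each frame does.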

OliverB
(@oliverb)
Noble Member
Joined: 12 months ago
 
Posted by: @psonice

So if you crank the resolution up, you get a lower frame rate, and the lower frame rate means fewer transfers and less bandwidth hit. So less performance drop on the eGPU.

This is very precise and correct.

2018 15" MBP & 2015 13" MBP connected to RTX2080Ti GTX1080Ti GTX1080 Vega56 RX580 R9-290 GTX680


bobbie424242
(@bobbie424242)
Active Member
Joined: 3 months ago
 

@oliverb

I don't need to run any more game benchmarks to know that the eGPU incurs a massive hit, even at 4K. I mean, look again at the results I posted: ACO: 49 fps vs 34, FH4 demo: 80 fps vs 51, KCD: 39 fps vs 26.
I also monitored the laptop's TB3 PCH temperature in case it would overheat and degrade performance, but it stayed at a normal 70 degrees Celsius.

OliverB
(@oliverb)
Noble Member
Joined: 12 months ago
 
Posted by: @bobbie424242

@oliverb

I don't need to run any more game benchmarks to know that the eGPU incurs a massive hit, even at 4K. I mean, look again at the results I posted: ACO: 49 fps vs 34, FH4 demo: 80 fps vs 51, KCD: 39 fps vs 26.
I also monitored the laptop's TB3 PCH temperature in case it would overheat and degrade performance, but it stayed at a normal 70 degrees Celsius.

@bobbie424242
If you want to verify your claims, some benchmarks for the same games and the same settings would help a lot. I don't understand such abbreviations; I take the time to write out titles.
Of course, monitoring the TB3 PCH temperature doesn't help a lot here; I am talking about a notebook with a direct connection between TB3 and CPU, like my 15-inch MBP 2018.

So, let me make my statement more precise: if you use a good notebook with a direct link between TB3 and CPU, you won't see a notable performance loss in 4K, or even better 5K, gaming. Notebooks where the TB3 data goes through the PCH are not recommended if you want good performance.

EDIT: Of course, you are probably correct that with the setup/notebook from this thread a desktop solution is notably better than an eGPU solution, but this may not be the case in general. It was my mistake that I didn't consider it. I wasn't aware that the difference is this big.


2018 15" MBP & 2015 13" MBP connected to RTX2080Ti GTX1080Ti GTX1080 Vega56 RX580 R9-290 GTX680


bobbie424242
(@bobbie424242)
Active Member
Joined: 3 months ago
 
Posted by: @psonice

Some general thoughts on this (I'm a graphics programmer, not a gamer):

- The system difference might have more of an effect than you expect. There's going to be non-rendering work to do which will be similar on both systems, and the remaining time gets spent rendering. The slightly faster system can end up with a lot more time left, meaning more frames can be rendered.

- RAM will make a difference too - probably not with Geekbench, but a game might use >16GB, and each time it has to wait for disk (even with a fast SSD) it's going to stall things.

Basically, you want to plug the eGPU into the desktop for a proper comparison, or the results won't mean much.

- The reason why the performance drop is much lower at 4k or 5k: the only performance difference with an eGPU over an internal GPU is bandwidth. In a game, the CPU has to copy a bunch of stuff to the GPU for every frame, and depending on the game architecture the GPU might stall waiting for it or the game code might have to wait for the GPU work to complete, which takes longer if it's waiting for that data to transfer. More frames per second means more waiting on that slower thunderbolt link.

So if you crank the resolution up, you get a lower frame rate, and the lower frame rate means fewer transfers and less bandwidth hit. So less performance drop on the eGPU.

1. I don't think the system (CPU and RAM) made too much of a difference: most benchmarked games were using the GPU at close to 100%, indicating that the GPU is the limiting factor. Also, the laptop CPU did not throttle (I checked that), as it is in a 17" chassis and well cooled.

2. The RAM is nothing spectacular on the PC: it is DDR4 2133 (and not 2666 as I wrongly mentioned). There was 32GB on the PC, but I really doubt any game went over 16GB.

3. The results are still interesting, I think, even without comparing the eGPU connected to the PC. At least they are interesting to me for my decision whether to keep my gaming PC.

4. Yes, the performance loss is game-dependent, but I did not expect it to be this significant (again: for the games I tested).

In any case, I really wish there were other PC vs eGPU comparisons at high resolutions (4K and possibly 5K).

@oliverb

MacBooks notoriously have the best TB3 connection possible, so it is quite possible it would be faster than my P72 (assuming the same CPU). I don't think there is a PC laptop with a TB3 connection as efficient as the MacBooks'. Maybe Alienware laptops with the Graphics Amplifier port, but that is something else.

OliverB
(@oliverb)
Noble Member
Joined: 12 months ago
 
Posted by: @bobbie424242

@oliverb

MacBooks notoriously have the best TB3 connection possible, so it is quite possible it would be faster than my P72 (assuming the same CPU). I don't think there is a PC laptop with a TB3 connection as efficient as the MacBooks'. Maybe Alienware laptops with the Graphics Amplifier port, but that is something else.

@bobbie424242
It was my mistake; I didn't consider this difference.
Yes, I hear that the Graphics Amplifier port is clearly better than TB3.

2018 15" MBP & 2015 13" MBP connected to RTX2080Ti GTX1080Ti GTX1080 Vega56 RX580 R9-290 GTX680


psonice
(@psonice)
Estimable Member
Joined: 2 years ago
 
Posted by: @bobbie424242

1. I don't think the system (CPU and RAM) made too much of a difference: most benchmarked games were using the GPU at close to 100%, indicating that the GPU is the limiting factor. Also, the laptop CPU did not throttle (I checked that), as it is in a 17" chassis and well cooled.

You won't be able to check how busy the GPU is like that. Yes, it's 100% busy... but doing what? Transferring data over the bus, or rendering? Waiting for data or sync? Honestly, this really is a pretty meaningless number unless you have a proper GPU trace and can see what it's actually doing.

For some idea of what I mean, I recently made a rendering pipeline ~10% faster. It was "100% busy" according to basic tools at the start, and the same at the end, and the workload didn't change at all! But looking closely at a full GPU trace I could see that part of the time the GPU wasn't actually rendering, it was transferring data or waiting for other work. Just changing the way things are scheduled can have a big difference.

With an eGPU's added latency and lower bandwidth, your GPU might be 100% 'busy' but only actually rendering stuff for 10% of the time!

So trying to compare without the same CPU and memory isn't going to tell you much at all.

bobbie424242
(@bobbie424242)
Active Member
Joined: 3 months ago
 
Posted by: @psonice
Posted by: @bobbie424242

1. I don't think the system (CPU and RAM) made too much of a difference: most benchmarked games were using the GPU at close to 100%, indicating that the GPU is the limiting factor. Also, the laptop CPU did not throttle (I checked that), as it is in a 17" chassis and well cooled.

You won't be able to check how busy the GPU is like that. Yes, it's 100% busy... but doing what? Transferring data over the bus, or rendering? Waiting for data or sync? Honestly, this really is a pretty meaningless number unless you have a proper GPU trace and can see what it's actually doing.

For some idea of what I mean, I recently made a rendering pipeline ~10% faster. It was "100% busy" according to basic tools at the start, and the same at the end, and the workload didn't change at all! But looking closely at a full GPU trace I could see that part of the time the GPU wasn't actually rendering, it was transferring data or waiting for other work. Just changing the way things are scheduled can have a big difference.

With an eGPU's added latency and lower bandwidth, your GPU might be 100% 'busy' but only actually rendering stuff for 10% of the time!

So trying to compare without the same CPU and memory isn't going to tell you much at all.

If the GPU is at 100%, of which only 50% is actually spent rendering, it doesn't matter much: it is at its maximum capacity on this setup, possibly bottlenecked by TB3, and it cannot do more. At least this is how I understand it.

"So trying to compare without the same CPU and memory isn't going to tell you much at all."

Except that I wanted to evaluate replacing my PC with my laptop + eGPU, not switching my PC to an eGPU.
Sure, benchmarks of PC + eGPU would be interesting to tell the full story (to better understand possible bottlenecks), but not practically useful, as nobody is ever going to use a PC + eGPU if the GPU fits inside the PC case.

psonice
(@psonice)
Estimable Member
Joined: 2 years ago
 

What I mean: the GPU might be at 100% load on both machines, but doing 30fps on one machine and 60 on the other. It's clearly not "100% busy" 🙂 You need some pretty serious dev tools (and enough knowledge to interpret the graphs) to know what's actually going on, but it's not as simple as "the thunderbolt bus is slowing it down". That will be a factor, but different CPU speeds will have an effect too, and it's likely the different RAM amount will too.

Basically, it's complicated, and without comparing on the same system the numbers won't mean much 🙂

bobbie424242
(@bobbie424242)
Active Member
Joined: 3 months ago
 

@psonice

Ok, thanks for clarifying. Sure, nothing is really simple on these complex topics.

Unfortunately my PC does not have TB3, otherwise I would run PC + eGPU benchmarks just for science :).
