2020 13" Dell XPS 13 9310 [11th,4C,G] + RX 480 @ 32Gbps-TB4 (Mantiz Saturn Pro) + Win10 // Disappointing Performance
XPS 13 9310
i7-1165G7 @ 2.8GHz
External 1080p Monitor (Dell)
- Installed Arch Linux over the pre-installed Windows, leaving some space for dual-boot Windows 10
- Installed a vanilla Windows 10 Home
- Ran Windows Update and waited for it to complete (needs a couple of reboots)
- Used Dell SupportAssist to update all Dell-specific drivers; this also performed a Dell BIOS update.
- Hot-plugged the eGPU and let Windows do some automatic installations
- Installed the newest AMD drivers and software
Preset    FPS avg   Score
-------------------------
Extreme     16.86    2254
High        26.61    3558
Medium      23.50    3141
Low         23.69    3167
The Superposition benchmark gives disappointing results, lower than expected, and lower than my desktop PC with the same RX480 GPU.
My gaming PC, which used to run the RX480 until now, has these specs:
CPU: Intel Core i5-6500
Mainboard: ASRock H110M-ITX/ac
PSU: Be quiet! Pure Power L8 500W
RAM: Crucial DDR4, 8 GB (2 x 4 GB)
Same Gigabyte Radeon RX480 G1 Gaming 8G
Same 1080p Dell monitor
I just took the RX480 out of the PC and slotted it into the Mantiz, expecting similar performance. People talk about a ~20 % performance loss over Thunderbolt, but I seem to be losing a lot more than that.
Superposition benchmark, 1080p Medium preset, fullscreen
PC: FPS avg 59.37 max 75.20 score 7937
Laptop/eGPU: FPS avg 23.49 max 48.28 score 3140
Heaven benchmark, Extreme preset, 1600x900 windowed
PC: FPS avg 68.8 max 149.5 score 1734
Laptop/eGPU: FPS avg 65.1 max 136.1 score 1641
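To put numbers on that 20 % expectation, here is some rough arithmetic (the 20 % figure is just the ballpark people quote, not something I measured):

```python
# Rough arithmetic: compare the commonly quoted ~20 % eGPU loss against
# what I actually measured (FPS values copied from the runs above).
pc_fps_by_test   = {"Superposition 1080p Medium": 59.37, "Heaven Extreme 900p": 68.8}
egpu_fps_by_test = {"Superposition 1080p Medium": 23.49, "Heaven Extreme 900p": 65.1}

for test, pc_fps in pc_fps_by_test.items():
    egpu_fps = egpu_fps_by_test[test]
    expected = pc_fps * 0.80                    # ballpark ~20 % Thunderbolt penalty
    loss = (1 - egpu_fps / pc_fps) * 100
    print(f"{test}: expected ~{expected:.1f} FPS, measured {egpu_fps:.1f} FPS ({loss:.0f} % loss)")
```

Heaven lands roughly where the rule of thumb predicts (about 5 % loss), while Superposition loses about 60 % of the desktop performance.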
The AIDA64 GPGPU benchmark gives me a Memory Write of 2534 MB/s.
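For what it's worth, that figure does not look unusually low to me for a Thunderbolt enclosure. A rough sanity check, assuming the nominal 32 Gbps PCIe tunnel from the thread title (both the overhead estimate and the "typical" range below are assumptions on my part, not measurements):

```python
# Back-of-the-envelope check of the AIDA64 host-to-device write figure.
# Assumption: ~32 Gbps nominal PCIe tunnel over TB4 (per the thread title);
# real-world Thunderbolt protocol overhead eats a large chunk of that.
nominal_gbps  = 32
nominal_mb_s  = nominal_gbps / 8 * 1000   # = 4000 MB/s raw, before any overhead
measured_mb_s = 2534                      # AIDA64 GPGPU "Memory Write"

print(f"measured / nominal = {measured_mb_s / nominal_mb_s:.0%}")   # ~63 %
```

Roughly 2.3-2.6 GB/s is the kind of value I have seen quoted for TB3/TB4 enclosures, so the link itself seems to be doing what it should.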
Dark Souls 3: My PC runs Dark Souls 3 at a rock-solid 60 FPS on Max settings, 1080p, in all situations. The laptop with eGPU fluctuates between 40-60 FPS on Max settings. Weirdly enough, even on High and Medium settings I get the same 40-60 FPS; there is no noticeable performance difference. Only on Low settings do I get better performance, and even then there are noticeable dips to around 55 FPS. Dark Souls 3 is kind of playable, even on Max settings, but the FPS dips are frequent and annoying; I was expecting more.
Hitman 2: Similar behavior here. My PC runs Hitman 2 on Medium settings, 1080p, at mostly 60 FPS, with occasional dips to 55 FPS. The laptop struggles to reach 60 FPS; I mostly get around 45-50 FPS.
Things I have already tried
Installed Dell Power Manager and set the CPU to Maximum Performance. Makes no measurable difference in either Dark Souls or Superposition; if anything, Dark Souls feels less responsive at Maximum Performance.
Made sure that the latest Dell drivers are installed using Dell's SupportAssist app; it even performed a BIOS upgrade. Makes no difference.
I have disabled the integrated graphics card (Intel Iris Xe) and made sure that the laptop's built-in screen is disabled. Makes no difference.
I made some more in-depth measurements with MSI Afterburner; here are the results.
1. Dark Souls 3
On both systems, the PC and the laptop/eGPU, I just stood at the place seen in the screenshot and let the game render the scene, lowering the graphics quality every 30 seconds. The links are screenshots, plots of the MSI Afterburner log data, and the log data itself.
The plots show some noise from starting the game. Then comes:
30 seconds on Max settings
30 seconds on High settings
30 seconds on Medium settings
30 seconds on Low settings
The PC maintains a steady 60 FPS, while the laptop maintains around 45 FPS on Max, High, and Medium settings. Only at the end when I went to Low did the framerate rise to 60.
GPU load on the PC fluctuates heavily between 0 and 100 %. On the laptop it fluctuates between 80 and 100 %, decreasing as I lower the settings.
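If you want to reproduce these plots from the attached log data, something along these lines works (a sketch only; it assumes the Afterburner log has been exported to CSV and that the columns are called "Framerate" and "GPU usage" -- the file name and column names depend on your own export):

```python
# Sketch: plot framerate and GPU load over time from an MSI Afterburner log
# that has been exported to CSV. Column names and file name are assumptions;
# adjust them to match your own export.
import pandas as pd
import matplotlib.pyplot as plt

log = pd.read_csv("darksouls3_laptop.csv")        # hypothetical file name

fig, ax_fps = plt.subplots()
ax_fps.plot(log.index, log["Framerate"], color="tab:blue", label="FPS")
ax_fps.set_xlabel("sample")
ax_fps.set_ylabel("FPS")

ax_gpu = ax_fps.twinx()                           # second y-axis for GPU load
ax_gpu.plot(log.index, log["GPU usage"], color="tab:red", label="GPU usage %")
ax_gpu.set_ylabel("GPU usage (%)")

fig.legend(loc="upper right")
plt.show()
```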
2. Unigine Heaven
Throughout the benchmark, PC and laptop show similar GPU load and framerates. The benchmark scores are very similar.
3. Unigine Superposition
There was a longer loading time on the PC, so please ignore the initial noise in the plot. On the PC, framerate hovers around 60 FPS, while GPU load is almost always at 100 %. On the laptop, framerate is between 15 and 45, and GPU load fluctuates heavily between 30 and 100 %.
The GPU load when using the RX480 as eGPU is high in Dark Souls 3 and Unigine Heaven, but fluctuates wildly in Unigine Superposition. Why does Superposition not cause 100 % GPU load on the laptop? It does on the PC.
In Dark Souls 3, it is the other way around: the PC does not fully load the GPU, but the laptop does. Any ideas why this could be the case?
I'm getting very inconsistent results in different benchmarks. Please help me interpret the numbers. Any ideas what's wrong with my setup?
I'm unhappy with the performance of my setup. Let's call this configuration "laptop".
I'm using the RX480 that used to sit in my gaming PC (i5-6500). The laptop
sometimes gives lower performance than the PC, sometimes it gives comparable
performance. I made some structured measurements and created some graphs.
Superposition DirectX vs OpenGL
Summary: When using the DirectX renderer, graphics settings have no impact on benchmark results. When using the OpenGL renderer, benchmark results scale with graphics settings as expected.
For these measurements I ran Superposition with the extreme, high, medium, and low presets, on the laptop and on the PC. All presets run at 1080p, except low, which runs at 720p. I swapped the RX480 between the two systems.
The following table shows average FPS / Score as reported by Superposition at the end of the benchmark.
preset         Laptop DirectX   Laptop OpenGL    PC DirectX       PC OpenGL
------------------------------------------------------------------------------
extreme         16.86 /  2254    14.89 /  1991    17.77 /  2375    14.91 /  1992
high            26.61 /  3558    32.83 /  4388    42.02 /  5617    34.76 /  4647
medium          23.50 /  3141    44.05 /  5889    59.44 /  7947    47.86 /  6398
low (720p)      23.69 /  3167   105.56 / 14112   128.07 / 17122    89.99 / 12031
- On PC, performance scales with graphics settings as expected: lower settings give higher scores. This is the case for both the DirectX and OpenGL renderer.
- On the laptop with DirectX, graphics settings, except for extreme, have no impact on performance. This is especially surprising on low, because it runs at 720p.
- On the laptop with OpenGL, performance scales with graphics settings as expected.
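To make the anomaly more concrete, here is the laptop score as a fraction of the PC score for each preset and renderer (values copied straight from the table above):

```python
# Laptop score divided by PC score, per preset and renderer.
# Scores are copied from the Superposition table above.
scores = {
    # preset:      (laptop DX, laptop OGL, PC DX, PC OGL)
    "extreme":     (2254, 1991, 2375, 1992),
    "high":        (3558, 4388, 5617, 4647),
    "medium":      (3141, 5889, 7947, 6398),
    "low (720p)":  (3167, 14112, 17122, 12031),
}

for preset, (lap_dx, lap_ogl, pc_dx, pc_ogl) in scores.items():
    print(f"{preset:>11}:  DirectX {lap_dx / pc_dx:5.0%}   OpenGL {lap_ogl / pc_ogl:5.0%}")
```

Under OpenGL the laptop is never more than about 10 % behind the PC (and is actually ahead at low), while under DirectX it drops from ~95 % of the PC score at extreme to ~18 % at low. The lighter the preset, the further DirectX falls behind.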
Here are the graphs belonging to the benchmark runs, measured with MSI Afterburner.
- Laptop DirectX does not fully utilize the GPU, and CPU utilization across all cores is higher than on the PC, even though the laptop's CPU is much newer.
- Laptop OpenGL fully utilizes the GPU in all cases, and CPU utilization is very low, between 5 and 25 % across all cores.
- PC DirectX fully utilizes the GPU with relatively low CPU usage. Only on low settings does it seem to become CPU-bound, as CPU usage rises but GPU usage becomes inconsistent.
- PC OpenGL also fully utilizes the GPU, with one CPU core showing higher usage than the others. Low settings seem to become CPU-bound, as CPU usage rises but GPU usage becomes inconsistent.
- GPU temperatures are missing from the laptop graphs for some reason, but I set the Radeon software to aggressively cool the GPU to 50 degrees Celsius. GPU temperatures are stable around this value in all cases.
Doom 2016 OpenGL vs Vulkan
Summary: On the laptop, when using the OpenGL renderer, graphics settings have no impact on framerate. When using the Vulkan renderer, framerate scales with graphics settings as expected.
I ran Doom 2016 with the OpenGL and the Vulkan renderer for 30 seconds on each of the following settings: 1080p ultra, high, medium, low, and finally 720p low.
The following table shows FPS at the various graphics presets. These average values are my estimates based on the graphs and Steam's in-game FPS counter.
preset      Laptop OpenGL   Laptop Vulkan   PC OpenGL   PC Vulkan
------------------------------------------------------------------
ultra                  45             120          80         150
high                   45             130          90         160
medium                 45             140         100         170
low                    45             150         110         175
low 720p               45             200         120         200
Here are the corresponding graphs, created with MSI Afterburner while running in circles in one of the starting areas of Doom. Every 30 seconds I lowered the graphics settings to the next lower level.
- Laptop OpenGL: Framerate is lower compared to PC OpenGL, and the GPU is not fully utilized. Framerate does not scale with graphics settings: on all settings, even 720p low, it hovers around 45 FPS.
- Laptop Vulkan: Framerate is similar to PC Vulkan, and the GPU is fully utilized. Also, framerate scales with graphics settings: higher settings give lower framerates.
What could cause the bad performance, especially in the Superposition DirectX case at medium settings? That is close to how I usually play games: in both Dark Souls 3 and Hitman, at 1080p medium, framerate hovers around 45-55 FPS.
So far I have seen that the Superposition benchmark fully utilizes neither my GPU nor my CPU. Some games show the same behavior, but let's focus on Superposition for now.
After some experiments today, I think I can rule out thermal throttling of the CPU. HWinfo can show whether individual cores are being thermally throttled. See the following screenshot of the HWinfo sensors pane, which I took immediately after running the Superposition DirectX medium benchmark.
- Cores 1 and 3 experienced thermal throttling for a split second during the loading screen; this can be seen in the Maximum column. During the benchmark itself, no thermal throttling happened.
- The core temperatures were quite normal, 59-60 degrees Celsius in the Average column.
- CPU core clocks are around 4.7 GHz, because TurboBoost was enabled, which allows the CPU to go to higher clock speeds than the base 2.8 GHz.
Next, I disabled TurboBoost in the BIOS.
Without TurboBoost, the Superposition medium benchmark gave almost exactly the same result, just 0.5 FPS lower. The following screenshot shows the HWinfo sensors after the run without TurboBoost.
- Average core temperatures are a little bit lower than with TurboBoost, 53 - 56 degrees Celsius.
- Core clock speeds do not exceed 2.8 GHz
- Thermal throttling did not happen even once
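As a quick sanity check on what a clock-speed-limited result would have looked like (rough arithmetic, assuming performance scaled linearly with clock speed, which it never exactly does):

```python
# If the Superposition medium run were limited by CPU clock speed, dropping
# from ~4.7 GHz (TurboBoost) to the 2.8 GHz base clock should cost roughly
# 40 % of the framerate. In reality it cost about 2 %.
boost_clock, base_clock = 4.7, 2.8        # GHz
fps_with_boost    = 23.50                 # medium preset, TurboBoost enabled
fps_without_boost = fps_with_boost - 0.5  # "almost exactly the same, 0.5 FPS lower"

expected_if_clock_bound = fps_with_boost * base_clock / boost_clock
print(f"expected if clock-bound: ~{expected_if_clock_bound:.1f} FPS")
print(f"actually measured:       {fps_without_boost:.1f} FPS "
      f"({(1 - fps_without_boost / fps_with_boost):.0%} drop)")
```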
From these observations, I think I can safely say that CPU throttling is not my problem. Something else bottlenecks my system. The GPU is not put under full load. What else could it be?
But the RX480 is not such a top video card! If you want more FPS, go with a GTX 1080 Ti, RTX 2070 or so. And I'm sure this card is a bottleneck for that i7-1165G7, and it not working at full speed (only 4x) is a total mess for that CPU.
@aurelius_pontius, Thanks for your answer. I'm aware that my graphics card is not top-notch, but that's not the problem I'm facing. The problem is that the Superposition benchmark should be able to put 100% load on the card, but it doesn't. The benchmark should be GPU-bound, which means the same card should give similar results with different CPUs. My desktop CPU is slower, it's an i5-6500.
With the combination PC i5-6500 + RX480, Superposition puts 100% load on the GPU, resulting in the maximum score this card is able to achieve: ~60 FPS, score ~7300.
With the combination Laptop i7-1165G7 + RX480 as eGPU, Superposition does not 100% utilize the GPU. The score and framerate is lower than what this card should be able to achieve: ~23 FPS, score ~3100.
On the other hand, the Heaven and Valley benchmarks do put 100% load on the GPU, leading to similar scores in both setups.
Why does the same card with a faster CPU give lower framerate? If the card is not utilized 100% then it is not the bottleneck. Something else is, but what?
Looking further into the issue that my GPU is sometimes not fully utilized, I went through my Steam library and randomly installed some old and new games. Here are my findings. Since my CPU is much faster than my GPU, I would expect most games, at least the more recent ones, to be GPU-bound and not CPU-bound.
The signs of being GPU-bound are:
- 100 % GPU utilization, with low CPU utilization, as reported for example by MSI Afterburner
- Lowering GPU-intensive graphics settings increases framerate. Two important examples are resolution and texture quality.
Signs of being CPU-bound are:
- High CPU utilization. However, because modern schedulers shift a busy process quickly across all cores, you rarely see a flat 100 % CPU utilization, so this is harder to spot.
- Irregular, less than 100 % GPU utilization
- Lowering CPU-intensive graphics settings, often shadow detail, increases framerate
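As a rough illustration of how I check the Afterburner logs against these criteria (a sketch only: the thresholds are arbitrary cut-offs I chose, and the column name and file name are assumptions about the CSV export, not fixed names):

```python
# Sketch: crude GPU-bound heuristic over an MSI Afterburner log exported to CSV.
# The "GPU usage" column name, the file name, and the 90 % / 95 % thresholds
# are my own assumptions, not anything official.
import pandas as pd

def classify(csv_path: str) -> str:
    log = pd.read_csv(csv_path)
    gpu = log["GPU usage"]
    mean_load = gpu.mean()
    pct_pegged = (gpu >= 95).mean() * 100     # share of samples at (almost) full load

    if mean_load >= 90:
        return f"looks GPU-bound (avg {mean_load:.0f} % load, pegged {pct_pegged:.0f} % of the time)"
    return f"GPU is not the limit (avg {mean_load:.0f} % load); the bottleneck is elsewhere"

print(classify("monster_hunter_world.csv"))   # hypothetical log file
```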
The following table lists my findings. I consider an entry to be good if it is either GPU-bound or reaches some kind of maximum framerate (some games are locked at 60). An entry is bad if it has a low framerate while the GPU is not 100 % utilized. (Sorry for posting this as a screenshot, egpu.io does not support attaching spreadsheets.)
The goal was to find some kind of pattern in which games are good and which are bad, but from my experiments today I can't see one. Death Stranding, for example, is considered good because it fully utilizes the GPU and reaches the expected (okay-ish) performance. Monster Hunter World, on the other hand, runs at 30-35 FPS on low, medium, and high settings, at 1080p and at 720p. Nothing improves performance.
There still is some bottleneck in my system that prevents some games from fully utilizing my GPU.