
The TB1 Low-Power eGPU Review that No One Asked for: AMD Radeon RX550 vs NVIDIA Quadro K1200  


Noble Member Moderator
Joined: 5 years ago


Coronavirus is upon us. We are stuck at home, and we are bored. This household is no exception. Thankfully, ebay still exists, and toys can still be delivered. In this case the toy is a PNY Quadro K1200 4GB video card. I've wanted one for a while to pair with my ATTO Thunderlink units: Since it is a single-slot, low-profile, slot-powered card it is a match made in heaven. Or at least in my bored mind.

56$ + shipping were parted with and a week later I am the proud owner of a 1st gen Maxwell card in 2020. Now, what to do with it? Well, the answer is obvious, compare it to my Radeon RX550 in a battle of single-slot, low-profile, Thunderbolt1 eGPUs, of course.

Why? Because I'm bored, and because I might learn something in the process.

Who asked for it? Absolutely no one!

Who is it useful for? Hopefully someone.


The Base System

For those familiar with my literal zoo of eGPUs and base systems, this is not a new laptop. My Lenovo T430s is a 2012 machine which I bought second-hand sometime in 2016 for 119$. It is equipped with two batteries (one in place of the DVD drive), two SSDs (one in place of the cellular WWAN modem), 16GB of DDR3 2133MHz memory (which I paid way too much for) in dual channel, and sports a 3rd gen dual-core Intel i7 chip (the i7-3520M). There is no discrete graphics card in this T430s, and it has both ExpressCard and Thunderbolt1 as potential eGPU interfaces (and mPCIe as well, but there is no real need to use it, ever).

It is about the most capable eGPU platform of the late dual-core laptop era, shortly before (and somewhat in parallel with) high-clocked quad-cores becoming the norm in laptops. Today, this isn't saying much. A modern 25W quad-core CPU runs cooler, clocks faster, and allows for longer battery life, all while blowing the i7-3520M out of the water. A modern 45W laptop CPU is far, far more powerful.

That said, based on the number of people inquiring about an eGPU for their low-cost dual-core system on /r/eGPU, this system is still surprisingly relevant in 2020. It is also a joy to work with: It is still the only laptop I personally own, even if my work-provided ZBook G4 gets most of my computing (including gaming) time.

In this review, we'll be using Thunderbolt1 to connect the two competing cards, each housed in its own ATTO Thunderlink enclosure.


The Methodology

The usual plethora of synthetic benchmarks will be used for this comparison: 3DMark Advanced, the various Unigine Engine benchmarks, as well as Final Fantasy XV. None of these are terribly new, but let's face it: The cards I am comparing will choke on just about any high-end modern game. No, this is not a high-powered gaming setup. This is a 100-120$ shot in the arm for an old, trusty system. The goals for such a setup are gaining playable framerates in older or less demanding games, allowing for high-resolution display output from a system not, by itself, capable of such, and adding the video decoding capability the HD4000 lacks.

We'll compare the cards to each other across the full benchmark suite, and also use a sub-set to compare the iGPU against the eGPU performance gains.

The eGPU performance is captured on an external monitor, in this case, a 43" 1080p Sansui TV that is nothing special and runs a 60Hz panel. The iGPU tests were done on the internal monitor, and only utilized tests that render properly on its 1600x900 display, or that render at full resolution on the GPU and downscale the output to fit the lower-resolution panel.


The GPUs

We are pitting the red camp against the green camp in this one. The AMD contender is a Dell OEM RX550 4GB. Pulled out of some unknown system and setting me back 75$ on ebay about half a year ago, it is a tiny single-slot, half-height card with a small active heatsink. Launched in 2017, the RX550 is a low-end card that typically competes against the NVIDIA GT1030 in the e-sports segment. The NVIDIA opponent is a PNY Quadro K1200 4GB. This is a low-end workstation card, based on a 1st generation Maxwell chip - the same one that gave birth to the legendary GTX750Ti. The K1200 is essentially a GTX750 (non-Ti) with Quadro features and 4GB of memory in a low-profile, single-slot form factor. Launched in 2015, it is a lot older than the AMD chip.


The enclosures used for this comparison are a pair of ATTO Thunderlink Fibre-Channel enclosures. Each originally housed a dual-port Fibre-Channel card, designed to connect a Thunderbolt system to a Storage Area Network (SAN). These used to cost hundreds to thousands of dollars back in the day. I got mine for a total of about 80$ off ebay (~45$ for one, ~35$ for the other). They were connected to the T430s with a 2m Apple Thunderbolt cable. Both cards worked without issue in 64-bit Windows 10 Professional (version 1909). In both cases, a Code 12 error occurs on initial installation, but disabling one of the sound devices in the system (either the iGPU's or the eGPU's), then disabling and re-enabling the eGPU itself (and rebooting), leads it to work flawlessly on future boots. Since the iGPU and the eGPU are never both used for sound output at the same time, this is a rather painless workaround. In my case, I use a USB DAC which is then connected via an optical connection to a stereo receiver, so I just disabled both the Intel and the eGPU sound devices - I don't need them.

Both the AMD and the NVIDIA drivers detect the connected eGPU and display a safe removal option, as if this were an official, fancy, Thunderbolt3 eGPU unit.


Something interesting can be spotted in this table: The two GPUs are extremely well matched in specs. Both have 512 computation cores, 32 texture mapping units, and 16 render output processors. They operate at very similar clocks and have similar theoretical performance figures. Both have a 128-bit memory bus, and both sport 4GB of GDDR5 memory. This comparison becomes a little more interesting as a result: Not only is it a comparison of two low-powered eGPUs, it is a rather even comparison between NVIDIA's Maxwell and AMD's Polaris architectures.

A few more things stand out: Both graphics chips are made in TSMC foundries, but where the 1st generation Maxwell chip is made on TSMC's 28nm process, the AMD Polaris chips (as befits a two-year newer design) are manufactured on TSMC's 14nm wafers instead. Interestingly, however, the AMD card has a 50W TDP compared to NVIDIA's 45W. The AMD chip has about 17% more transistors on the die, but I would still have expected it to consume less power than the much older NVIDIA part. The answer may lie in the GDDR memory: The 5GHz memory chips on the NVIDIA card might be significantly less power-hungry than the 7GHz chips on the AMD. Or it might be that NVIDIA is that much more efficient. The AMD card enjoys a large memory bandwidth advantage, however. Will it matter? We'll see.
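That bandwidth gap is easy to quantify from the spec sheet: both cards use a 128-bit bus, so the difference comes down entirely to the effective memory transfer rate. A quick back-of-the-envelope sketch (the 5GHz and 7GHz effective GDDR5 rates are the commonly quoted figures for these cards, not my own measurements):

```python
# Peak GDDR5 bandwidth = bus width (in bytes) * effective transfer rate.
def gddr5_bandwidth(bus_width_bits, effective_rate_ghz):
    """Theoretical peak memory bandwidth in GB/s."""
    return (bus_width_bits / 8) * effective_rate_ghz

k1200 = gddr5_bandwidth(128, 5.0)  # Quadro K1200: 5GHz effective
rx550 = gddr5_bandwidth(128, 7.0)  # Dell RX550: 7GHz effective

print(f"K1200: {k1200:.0f} GB/s")  # 80 GB/s
print(f"RX550: {rx550:.0f} GB/s")  # 112 GB/s
```

At roughly 112 GB/s against 80 GB/s, the RX550 carries a ~40% theoretical bandwidth advantage on the same bus width.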

Of final note are the output arrangements on these cards. The K1200 comes with an array of four mini-DisplayPort 1.2 outputs, good enough for driving four 4K monitors, or a pair of 5K ones. This output setup is standard for the K1200 and many other low-profile, single-slot professional graphics cards. The Dell RX550 is very different from a typical RX550, however, as it sports one DisplayPort and two mini-DisplayPort outputs (a typical RX550 has one DisplayPort, one HDMI, and one DVI). These should be DisplayPort 1.4 outputs, each capable of driving a DisplayPort 1.4 5K display. You're not going to game at these resolutions, but both these cards do make for a fine multi-screen setup for work.


iGPU vs eGPU

Our first results graph is in Full-RGB! Red is AMD, Green is NVIDIA, and Blue is for our outmatched Intel participant, the HD4000 (configured to consume up to 2GB of system memory). The benchmarks selected for this comparison were the lowest-demanding ones - mostly to maintain my sanity. Even though heavier benchmarks would run on the HD4000, the performance would be so horrible I'd be staring at a slideshow and slowly going insane.

The results are normalized to the K1200's score, with the other two cards' results displayed as a percentage thereof. This avoids graph scaling issues due to the very different scoring scales in use between the benchmarks.
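For the curious, the normalization itself is trivial; a minimal sketch with placeholder scores (the numbers below are purely illustrative, not the measured results):

```python
# Normalize every result to the K1200's score in the same benchmark,
# so all three cards plot on a single percentage scale regardless of
# how each benchmark happens to score. Scores here are illustrative only.
scores = {
    "Cloud Gate": {"K1200": 10000, "RX550": 11000, "HD4000": 2500},
    "Sky Diver":  {"K1200": 8000,  "RX550": 8800,  "HD4000": 1600},
}

normalized = {
    bench: {card: 100.0 * s / results["K1200"] for card, s in results.items()}
    for bench, results in scores.items()
}

print(normalized["Cloud Gate"])
# {'K1200': 100.0, 'RX550': 110.0, 'HD4000': 25.0}
```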


We can see that in these lighter benchmarks, the RX550 is a pretty clear winner, falling behind only in Ice Storm Extreme. But with both cards providing in excess of 300 FPS, this is a moot point. The Intel HD4000 obviously falls behind. The average performance gain of the eGPU solution is four times the iGPU's performance. Averages are tricky, however, and this one is skewed by the Ice Storm Extreme result. Even the HD4000 manages well over 100 FPS there, and we're likely hitting a Thunderbolt bottleneck, a CPU bottleneck, or both, since the eGPUs were not under full load in these tests. As they say: lies, damned lies, and statistics. Omitting this benchmark results in a 5x speedup from the addition of the eGPU. Not bad.
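To illustrate how a single bottlenecked benchmark drags the average down, here is a small sketch (again, with made-up speedup factors, not the measured ones):

```python
from statistics import mean

# Per-benchmark eGPU-over-iGPU speedup factors. Illustrative numbers:
# one CPU/Thunderbolt-bound outlier pulls the overall average well
# below what every other benchmark shows.
speedups = {
    "Ice Storm Extreme": 1.5,  # bottlenecked: even the HD4000 runs fast here
    "Cloud Gate":        5.2,
    "Sky Diver":         4.8,
    "Night Raid":        5.0,
}

overall = mean(speedups.values())
trimmed = mean(v for k, v in speedups.items() if k != "Ice Storm Extreme")

print(f"with outlier: {overall:.1f}x, without: {trimmed:.1f}x")
# with outlier: 4.1x, without: 5.0x
```

Whether to report 4x or 5x is a judgment call; the honest move is to show both and explain the outlier, which is what the text above does.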

The RX550 is on average 8% faster than the K1200, or 11% faster with Ice Storm Extreme omitted. Looks like AMD suffers a little worse from the bottleneck we're encountering.



The full benchmark suite comparison results in a 6% average advantage in favor of the AMD card, but the results are too interesting to leave it at that. But first, the graph!


Again, the graph is normalized against the K1200's result, and is sorted in decreasing order by the RX550 result, clearly showing where the RX550 is faster, and where it isn't.

Initially, I suspected the difference would be in DX11 and DX12 benchmarks, with the K1200 prevailing in the former, and the RX550 in the latter. But this is not the case, so we're going to have a longer discussion here. The RX550 decisively defeats the K1200 in all 3DMark benchmarks except for Ice Storm Extreme (which, as mentioned above, seems to be extremely CPU/Thunderbolt-bound). It loses to the K1200 in both Valley and Heaven, both of which are older benchmarks. Both Userbenchmark and Final Fantasy XV have the cards essentially tied.

The one benchmark where things get interesting is Superposition, where the results vary by settings, and by a lot. With the 1080p Medium preset, which sets both shaders and textures to medium, the two cards are tied. At the 720p preset, which sets both shaders and textures to low, and lowers the resolution, the RX550 is about 9% faster. The real oddball is the custom preset I used, with the shaders set to low and the textures to high. Setting the textures to high raises the benchmark's video memory usage to 3.3GB, making use of both cards' 4GB buffers. One would expect the RX550, with its much higher memory bandwidth, to prevail here. Alas, that is not the case: The K1200 is 25% faster in this configuration. I am not sure if NVIDIA has superior texture compression algorithms, or if the AMD drivers create more overhead (in computation or in data transfer) when large textures are used - both seem like plausible hypotheses.

Thermals are also interesting. The RX550 is the much hotter card of the two, especially constrained in the ATTO Thunderlink with its puny fan. It tops out at around 85°C with a 22°C ambient temperature after an extended period of 3D loads. The K1200 in the Thunderlink tops out at around 75°C at the same ambient temperature. It might be that the RX550's cooling solution is worse, but upon inspection, both cards seem rather similar: A thin aluminum-and-copper combo heatsink with a small fan. The K1200 has a smaller, blower-style fan, while the RX550 has a slightly larger fan driving air through both sides of the heatsink. Going to have to give the thermals battle to the K1200 in this one: Despite the two-year advantage and the 14nm vs 28nm foundry process, AMD cannot beat the older NVIDIA card on thermal efficiency.

Neither card allows for overclocking: MSI Afterburner settings are rejected outright, and BIOS modding would be needed to enable overclocking.

Overall, I am quite pleasantly surprised by the performance of both contenders. As for gaming, both cards happily run games like Stellaris, Cities: Skylines, Ultimate Admiral: Dreadnoughts and World of Warships on a mix of medium and high settings at 1080p. Considering the pain that is trying to do the same with the HD4000, this is pretty nice to see.


Conclusions and Observations

The RX550 is the better card overall in my opinion, but that is not terribly surprising given that it is much newer. There are several reasons why I consider it superior:
1) It enjoys OS X support, so TB1/TB2 Mac users can utilize this setup.
2) It can drive higher resolution monitors on account of its DisplayPort 1.4 outputs.
3) It has HEVC decoding capabilities in hardware, compared to the hybrid SW-HW approach used by 1st generation Maxwell.
4) While only 6% faster on average, it performs better in the benchmarks I subjectively deem more important: The newer and more computationally intensive ones.

That said, the K1200 has a few things going for it as well:
1) It is cheaper. These low-profile, single-slot OEM RX550 cards are not trivial to find, and tend to cost around 70-75$. The K1200 is readily available at around 60$ or even less.
2) It runs cooler.
3) It can drive 4 monitors without having to utilize a DisplayPort hub.
4) As a Quadro card, it provides better support for CAD applications. A K1200 is not a powerhouse by any means, but it does offer a taste of what the Quadro line offers in that regard. The RX550 is a common consumer card.

Overall, though, the RX550 is the better bet.

Both cards will allow you to play older or lighter games with acceptable performance. Not bad for an old Thunderbolt1 or Thunderbolt2 based laptop. While I am not chucking out my GTX1080Ti and ZBook G4 combo any time soon, when I have to use the T430s, its eGPU suffices.



Want to output 4K@60Hz out of an old system on the cheap? Read here.
Give your Node Pro a second Thunderbolt3 controller for reliable peripherals by re-using a TB3 dock (~50$).

"Always listen to experts. They'll tell you what can't be done, and why. Then do it."- Robert A. Heinlein, "Time Enough for Love."

2012 Mac Mini [3rd,4C,Q] + RX 480 @ 10Gbps-TB1 (Atto Thunderlink) + macOS 10.15.7 [build link]  

Mini i5
Prominent Member
Joined: 2 years ago


Thanks for that. 😁

I couldn't imagine a more artful way of saying Apple is trolling us with its current 2020 iMac lineup...




2018 Mac Mini [8th,6C,B] + RP W5500 @ 32Gbps-TB3 (Sonnet Breakaway 650) + macOS 11.6 & Win10 21H1 [build link]  

Noble Member Moderator
Joined: 5 years ago
@mini-i5 - Well, Apple is always trolling with their prices. It is literally part of the business model. That said, $1,300 for these specs isn't absolutely horrible. Considering the AIO nature of the iMac and the quality of the display, that is. Which is a surprising statement from the most Anti-Apple poster at, bar none. Not something I'd personally buy, but I've definitely seen worse in Apple-land, time and time again.
If I ever get bored, though, I am going to Hackintosh this T430s. That should be fun.
