Enclosure & Hardware Discussions
Why does AMD get away with selling cards (Vega series) that require 2x the power of similarly performing cards?


motoic
(@tsh3721)
Eminent Member
Joined: 2 years ago
 

I currently have an Akitio Node with a GTX 1070, and if I wanted to upgrade to the Vega 64 I would have to sell my Node, since it only provides 300W of power. However, I have heard that the 300W the Node supplies works just fine for the Radeon VII, a card which outperforms the Vega 64. That being the case, how has AMD been getting away with selling the Vega 64, which costs twice as much electricity to power for a similar level of performance? With everyone worried about climate change, carbon, and energy efficiency, this seems incredibly irresponsible and unnecessary. Also, Apple is a company that outwardly signals how much it cares about the environment, so how can it justify forcing consumers to use such wasteful cards?
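To put a rough number on the electricity claim (a back-of-envelope sketch in Python; the wattages, hours, and price below are assumptions, not measurements):

# Back-of-envelope sketch of the energy delta. All numbers are assumptions:
# ~295 W board power for a Vega 64, ~150 W for a GTX 1070, 20 h/week under
# load, and $0.13/kWh -- adjust for your own card, usage, and tariff.
VEGA_64_W = 295
GTX_1070_W = 150
HOURS_PER_WEEK = 20
USD_PER_KWH = 0.13

extra_kwh_per_year = (VEGA_64_W - GTX_1070_W) / 1000 * HOURS_PER_WEEK * 52
print(f"Extra energy per year: {extra_kwh_per_year:.0f} kWh")             # ~151 kWh
print(f"Extra cost per year:   ${extra_kwh_per_year * USD_PER_KWH:.2f}")  # ~$19.60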

2018 15" MBP w/ Radeon Pro 560x
Nvidia GTX 1070
Akitio Node
macOS Mojave, Bootcamp Windows 10


Eightarmedpet
(@eightarmedpet)
Noble Member
Joined: 2 years ago
 

Ok, let’s just check... you’re concerned about the environment, right?
Is your energy provider green/carbon neutral?
Do you use single-use products (plastic straws, takeaway coffee cups)?
Do you eat meat?

This post was modified 2 months ago

2017 13" MacBook Pro Touch Bar
GTX1060 + AKiTiO Thunder3 + Win10
GTX1070 + Sonnet Breakaway Box + Win10
GTX1070 + Razer Core V1 + Win10
Vega 56 + Razer Core V1 + macOS + Win10
Vega 56 + Mantiz Venus + macOS + W10

---

LG 5K Ultrafine flickering issue fix


OliverB liked
motoic
(@tsh3721)
Eminent Member
Joined: 2 years ago
 
Posted by: Eightarmedpet

Ok, let’s just check... you’re concerned about the environment, right?
Is your energy provider green/carbon neutral?
Do you use single-use products (plastic straws, takeaway coffee cups)?
Do you eat meat?

This thread is not about me personally or my political beliefs. I recommend you work on your reading comprehension and, while you're at it, go troll elsewhere.

2018 15" MBP w/ Radeon Pro 560x
Nvidia GTX 1070
Akitio Node
macOS Mojave, Bootcamp Windows 10


Nomadante liked
Eightarmedpet
(@eightarmedpet)
Noble Member
Joined: 2 years ago
 

Hardly trolling; pretty weak go-to there...
You expressed concern about the environment while focusing on something that has very little effect in the grand scheme of things.
I also never mentioned politics; I’m talking ethics. Understanding the difference isn’t rocket surgery.

2017 13" MacBook Pro Touch Bar
GTX1060 + AKiTiO Thunder3 + Win10
GTX1070 + Sonnet Breakaway Box + Win10
GTX1070 + Razer Core V1 + Win10
Vega 56 + Razer Core V1 + macOS + Win10
Vega 56 + Mantiz Venus + macOS + W10

---

LG 5K Ultrafine flickering issue fix


OliverB liked
odin
(@odin)
Eminent Member
Joined: 12 months ago
 

You did go on a rant about how unethical it is for them to build a card with these power requirements. I won't sit here and counter-attack you, though, and ask whether you run your life in a 100% carbon-friendly way; that's probably not even possible in this day and age. Are you just upset that you can't buy a Vega 64, which is cheaper than a Radeon VII? Blindly attacking the power requirements and AMD's ethics without some research into how they are trying to build that product and compete with nVidia is a little irrational.

There was a time when AMD (ATi) had very perf-per-watt-friendly cards compared to nVidia. For the last 4 or 5 years at least, if not longer, that has definitely not been the case. They moved to a 7nm process for the Radeon VII, so that's where they are gaining most of their efficiency; the older Vega cards use 14nm, and 7nm is simply a better process that allows lower power draw. It's still the same old Vega arch, though. Hopefully we will see AMD step it up with Navi and increase their efficiency and performance-per-watt.

The Radeon VII (Vega 20) is still a 300W card, where the Vega 64 is a 345W card, so they didn't gain MUCH in absolute power by moving to a new process node. An RTX 2080 Ti still uses less power than a Radeon VII, so AMD has quite a way to go here. But calling them unethical is a bit harsh; they just have to catch up. High-end video cards have been using gobs of power for a LONG time. The 300W PSU in your Node should be considered extremely borderline for high-end cards, if not simply too low. If you are going Radeon VII because you also want to use it in macOS, I'd perhaps just wait for Navi and see what that brings.
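A quick hedged sketch of what the node change buys, using the wattages above; the ~1.3x Radeon VII performance uplift over the Vega 64 is an illustrative assumption, not a benchmark result:

# Perf-per-watt sketch. Wattages are the ones quoted above; the 1.3x
# uplift is assumed for illustration only, not a measured figure.
vega64_w, radeon7_w = 345, 300
perf_uplift = 1.3  # assumed Radeon VII performance relative to Vega 64

power_drop = 1 - radeon7_w / vega64_w
ppw_gain = perf_uplift * vega64_w / radeon7_w
print(f"Absolute power drop: {power_drop:.0%}")  # only ~13%
print(f"Perf-per-watt gain:  {ppw_gain:.2f}x")   # ~1.5x despite that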

This post was modified 2 months ago

LG Gram 17 | Sonnet Breakaway Box 550 | Asus Strix RTX 2070 OC Edition | Win 10 Pro 1903 + Manjaro Deepin Dual Boot
Build Link


(@gareth_rees)
Eminent Member
Joined: 3 months ago
 

Vega 64 undervolts very well. I had mine performing better than stock @ 200W.

Your complaint is funny, though: the 2080 Ti costs $1,200, yet it can't perform to its full potential due to the wattage limit set by Nvidia. The card is literally gimped out of the gate. Would you rather pay more for less?
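For intuition on why undervolting saves so much: at a fixed clock, dynamic power scales roughly with the square of voltage. A minimal sketch, assuming illustrative (not measured) Vega 64 voltages:

# Dynamic power approximation: P = C * f * V^2, so at the same clock the
# power scales by (V_new / V_old)^2. Voltages below are illustrative.
stock_v, undervolt_v = 1.20, 1.00
stock_power_w = 295  # approximate stock board power

scale = (undervolt_v / stock_v) ** 2
print(f"Power scale factor: {scale:.2f}")                             # ~0.69
print(f"Estimated undervolted power: {stock_power_w * scale:.0f} W")  # ~205 W

That lines up reasonably well with the ~200W figure reported above.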

Dell Latitude 5491 14" BIOS 1.8.1 | Core i7 8850H + liquid metal - https://valid.x86.fr/z6xi8n | 32GB DDR4 2400 | Samsung 512GB PM981 AS SSD 4600 score | MX130 + liquid metal | Logitech Z-2300 | Razer DeathAdder Chroma | Corsair K70 Rapidfire | HP Omen Accelerator 700W PSU + STRIX Vega 56 8GB HBM2


craftsman liked
OliverB
(@oliverb)
Noble Member
Joined: 9 months ago
 

@eightarmedpet,
great insights.

People tend to forget that when a GPU draws more power, it also heats the room, and this saves the environment because you need less central heating. (Not joking, this is actually correct.)
Ok, in summer it's a different case, but don't ever drink coffee or tea in summer!
Actually, the environment profits most if people use bicycles or walk instead of driving their cars, or take the stairs instead of the lift (you wouldn't believe how much power that one consumes). Or in plain English: if they move their lazy butts by themselves.

 

2018 15" MBP & 2015 13" MBP connected to RTX2080Ti GTX1080Ti GTX1080 Vega56 RX580 R9-290 GTX680


motoic
(@tsh3721)
Eminent Member
Joined: 2 years ago
 
Posted by: Eightarmedpet

Hardly trolling; pretty weak go-to there...
You expressed concern about the environment while focusing on something that has very little effect in the grand scheme of things.
I also never mentioned politics; I’m talking ethics. Understanding the difference isn’t rocket surgery.

I said that Apple advertises itself as caring about the environment and then chooses the company whose cards require 2x the wattage to achieve the same performance. That seems pretty wasteful, so I'm curious what exactly can justify it.

2018 15" MBP w/ Radeon Pro 560x
Nvidia GTX 1070
Akitio Node
macOS Mojave, Bootcamp Windows 10


Ningauble77
(@ningauble77)
Estimable Member
Joined: 2 years ago
 

I think the best guess put forth at the moment is that Apple doesn't want to fragment their software base by having CUDA capability available. They've already announced the deprecation of OpenCL and OpenGL in favor of Metal, and having another third-party compute API just means there will be software that requires an Nvidia card to run, which they don't supply in any of their machines out of the box. My RTX 2080 is a better card by almost every measure than my Radeon VII (outside of FP64 compute, which I have no need of); I was hoping drivers would eventually arrive, but I've (mostly) given up hope. The only possible twist to any of this would be if they announce the replacement Mac Pro in June and it has an Nvidia card inside, but I think this is also very unlikely. Aside from any environmental concerns, the RTX cards work in a much broader selection of eGPU enclosures, create less heat, and put less stress on enclosure power supplies as well.

2018 13 Macbook Pro + Core v2 + Radeon VII Win10 1809/MacOS 10.14.5 Beta
ASUS X99 Deluxe+Core v2 + Radeon VII Win10 1809


mac_editor
(@mac_editor)
Noble Member Moderator
Joined: 2 years ago
 

@timothyov Apple-configured AMD chips (found embedded in their Macs) are fairly good in terms of performance per watt, IMO. Since AMD allows design tweaks to their GPUs (changing core counts, etc., such as with the Vega architecture), Apple has ended up with a decent perf/watt GPU lineup (albeit an expensive one).

I have noticed that full-fledged cards are very generously overvolted for uniform performance across low-to-mediocre silicon. In fact, I was even able to undervolt the Radeon Pro 560X (which is already tightly configured). As mentioned above, undervolting yields significant savings, so the architecture isn’t really “inefficient” per se; rather, AMD seems to be compensating for low-to-mediocre silicon that needs more voltage to reach stock performance, which means higher yield (every chip is different). At 200W, the Vega 64 is fairly good (and great for some compute). In the grand scheme of things, the environmental impact of the relatively small delta is a non-issue (more appropriately, there are many other environmental factors that deserve priority).

To add, however, NVIDIA’s chips and processes are definitely better, as they achieve better efficiency than AMD at a larger transistor size. But I don’t think they allow changing core counts; they instead provide their own specific laptop designs (like Max-Q, which Apple doesn’t want) - basically gimped, and still beyond Apple’s TDP budget for the MBPs.

This post was modified 2 months ago

purge-wrangler.sh | purge-nvda.sh | set-eGPU.sh | automate-eGPU EFI Installer
----
Troubleshooting eGPUs on macOS
Command Line Swiss Knife
eGPU Hardware Chart
Multiple Build Guides
----
Current: MacBook Pro RP560X + 480/R9 Fury/Vega 64 | GTX 780/1070
Previous: 2014 MacBook Pro 750M + 480/R9 Fury | GTX 780/980 Ti/1070


craftsman
(@craftsman)
Trusted Member
Joined: 3 months ago
 

@timothyov

The Radeons in Macs are undervolted, including in the Mac Pro 6,1, where their power consumption was cut by more than half.

Apple had always used a mix of GeForce and Radeon. They gave up on GeForce for several reasons.

i. Apple needed OpenCL performance for Final Cut. Nvidia had crippled or slow OpenCL performance in order to push CUDA.

ii. They had a legal battle over defective GeForce GPUs in laptops. This cost Apple a lot of money.

iii. System-wide 10-bit color support was introduced in El Capitan. Apple needed to implement 10-bit color system-wide for video and photo professionals. GeForce doesn't support 10-bit wide color in anything but full-screen DirectX mode.

iv. At the time all of the above was happening, GeForce cards were also power hogs, so there was no performance or energy-consumption advantage over Radeon. Radeon was, and still is, better at some compute applications involving scientific computation.

 

This post was modified 2 months ago

MacBook Pro 2018, Razer Core Chroma, Power Color Radeon VII


motoic
(@tsh3721)
Eminent Member
Joined: 2 years ago
 
Posted by: mac_editor

I have noticed that full-fledged cards are very generously overvolted for uniform performance across low-to-mediocre silicon. [...]

What does "across low-to-mediocre silicon" mean?

2018 15" MBP w/ Radeon Pro 560x
Nvidia GTX 1070
Akitio Node
macOS Mojave, Bootcamp Windows 10


(@gareth_rees)
Eminent Member
Joined: 3 months ago
 
Posted by: timothyov

What does "across low-to-mediocre silicon" mean?

TSMC 16nm > Samsung 14nm

Dell Latitude 5491 14" BIOS 1.8.1 | Core i7 8850H + liquid metal - https://valid.x86.fr/z6xi8n | 32GB DDR4 2400 | Samsung 512GB PM981 AS SSD 4600 score | MX130 + liquid metal | Logitech Z-2300 | Razer DeathAdder Chroma | Corsair K70 Rapidfire | HP Omen Accelerator 700W PSU + STRIX Vega 56 8GB HBM2


Defoler
(@defoler)
Eminent Member
Joined: 6 months ago
 

Apple switched to AMD because:

  1. Nvidia kept delivering drivers at the last minute, and it pissed Apple off.
  2. Nvidia wanted a lot of money for the custom GPUs for the cMP.
  3. A bad batch of Nvidia GPUs cost Apple money.

Nvidia's OpenCL performance was better than AMD's; performance was not why Apple switched. It was because Nvidia cost Apple more money, simple as that. AMD were willing to sell much cheaper and jump through hoops to satisfy Apple and make much-needed revenue.

The first one is what actually broke the camel's back. For the 2014 MacBook Pro, Nvidia almost missed the release date for drivers, forcing Apple to delay the launch. That is why they dropped them.

Pending: Add my system information and expected eGPU configuration to my signature to give context to my posts


Defoler
(@defoler)
Eminent Member
Joined: 6 months ago
 
Posted by: Gareth Rees

TSMC 16nm > Samsung 14nm

You mean GlobalFoundries. Not Samsung.

This post was modified 2 months ago

Pending: Add my system information and expected eGPU configuration to my signature to give context to my posts


Yukikaze
(@yukikaze)
Prominent Member Moderator
Joined: 3 years ago
 

how has AMD been getting away with selling the Vega 64, which costs twice as much electricity to power for a similar level of performance?

The answer to this question is always the same: Because people were buying them.
</thread>

My eGPU Zoo - Link to my Implementations.
Want to output [email protected] out of an old system on the cheap? Read here.
Give your Node Pro a second Thunderbolt 3 controller for reliable peripherals by re-using a TB3 dock (~$50).

"Always listen to experts. They'll tell you what can't be done, and why. Then do it."- Robert A. Heinlein, "Time Enough for Love."


motoic
(@tsh3721)
Eminent Member
Joined: 2 years ago
 
Posted by: Gareth Rees

TSMC 16nm > Samsung 14nm

Still don't understand

2018 15" MBP w/ Radeon Pro 560x
Nvidia GTX 1070
Akitio Node
macOS Mojave, Bootcamp Windows 10


OliverB
(@oliverb)
Noble Member
Joined: 9 months ago
 
Posted by: Defoler

Apple switched to AMD because:

  1. Nvidia kept delivering drivers at the last minute, and it pissed Apple off.
  2. Nvidia wanted a lot of money for the custom GPUs for the cMP.
  3. A bad batch of Nvidia GPUs cost Apple money.

[...]

@defoler
You are not a relative of @defiler, btw?

On topic: there was a strange decision to use Kepler chips in their MacBook Pros, and it was a bad one. Kepler chips are amazing and outstanding, but not in power efficiency. So they had to use the GK107, the weakest of all Kepler chips, in their MacBooks, and this just flopped.
It looks as if they couldn't find anyone else to blame, so they blamed each other, and this was the end of the Apple-nVidia partnership.

This post was modified 2 months ago

2018 15" MBP & 2015 13" MBP connected to RTX2080Ti GTX1080Ti GTX1080 Vega56 RX580 R9-290 GTX680


mac_editor
(@mac_editor)
Noble Member Moderator
Joined: 2 years ago
 
Posted by: timothyov

What does "across low-to-mediocre silicon" mean?

@timothyov Not all chips are created equal, meaning the potential performance of two different Vega 64s will most likely differ. If you watch overclocking guides, the reviewer will mention that the clocks they achieved may or may not be possible on your card (there could be no headroom, or lots more headroom). In general, a better chip needs less voltage (within the same GPU model) to reach the same performance, so people who win these chips can undervolt significantly - this is affectionately called the silicon lottery. To address the discrepancy, all chips have their voltages set with enough margin to ensure a good yield - so out of the box, all GPUs of the same model perform the same, but after tweaking, things can be very different.
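A toy simulation of the silicon lottery (all numbers are made up for illustration): each chip has its own minimum stable voltage, and the vendor picks a single stock voltage high enough that nearly every chip passes, which is exactly why most chips ship with undervolt headroom:

# Silicon-lottery sketch. The distribution parameters are invented.
import random

random.seed(42)
chip_vmin = [random.gauss(1.05, 0.04) for _ in range(10_000)]  # per-chip minimum stable voltage
stock_v = sorted(chip_vmin)[int(0.99 * len(chip_vmin))]        # stock voltage set near the 99th percentile for yield

headroom_mv = [(stock_v - v) * 1000 for v in chip_vmin]
print(f"Stock voltage chosen for yield: {stock_v:.3f} V")
print(f"Average undervolt headroom: {sum(headroom_mv) / len(headroom_mv):.0f} mV")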

purge-wrangler.shpurge-nvda.shset-eGPU.shautomate-eGPU EFI Installer
----
Troubleshooting eGPUs on macOS
Command Line Swiss Knife
eGPU Hardware Chart
Multiple Build Guides
----
Current: MacBook Pro RP560X + 480/R9 Fury/Vega 64 | GTX 780/1070
Previous: 2014 MacBook Pro 750M + 480/R9 Fury | GTX 780/980 Ti/1070


Defoler
(@defoler)
Eminent Member
Joined: 6 months ago
 
Posted by: OliverB

@defoler
You are not a relative of @defiler, btw?

On topic: there was a strange decision to use Kepler chips in their MacBook Pros, and it was a bad one. Kepler chips are amazing and outstanding, but not in power efficiency. So they had to use the GK107, the weakest of all Kepler chips, in their MacBooks, and this just flopped.
It looks as if they couldn't find anyone else to blame, so they blamed each other, and this was the end of the Apple-nVidia partnership.

No relation.

But I disagree that it was a bad decision.
First off, AMD had no answer for years in that department. Their GPUs drew a lot more power than Nvidia's and were much slower, so it was not as if Apple had a better option.

Secondly, because of how Nvidia acted, Apple skipped the 800M series in 2014, which was faster than AMD's new series in 2015, and Nvidia's 900M series was even faster and less power-hungry.

Power-wise and performance-wise, Nvidia were and remain the choice for mobile even today.
The M370X was a 50W-TDP GPU, just like the 750M. It was two generations newer, yet brought only a 5% performance increase outside of Apple's systems. Understand this: 5%.
If Apple had switched in 2014 on the MacBook Pros, they would have been using the M275X or some other variation of it, which drew as much power as the 750M but was about 10% slower.
Even today, the mobile 1060 Max-Q draws less power (50-60W) than the RP560X (60-80W), and it is 20-50% faster (depending on usage).

The only reason Apple was able to pull off the 15% performance increase in 2015 when they switched to AMD was using Metal instead of OpenGL. Metal had a huge impact on performance, and the older MacBook Pros were intentionally kept on OpenGL. Only a year later did they update the drivers to support Metal on Nvidia as well.

This is why I think the Nvidia/Apple issues were solely down to Nvidia acting out.

Pending: Add my system information and expected eGPU configuration to my signature to give context to my posts


motoic
(@tsh3721)
Eminent Member
Joined: 2 years ago
 
Posted by: mac_editor

@timothyov Not all chips are created equal, meaning the potential performance of two different Vega 64s will most likely differ. [...]

Thanks! So how do you know which card is good? Brand? Is there some known hierarchy, like:

Sapphire > Asus > MSI > Gigabyte > XFX?

2018 15" MBP w/ Radeon Pro 560x
Nvidia GTX 1070
Akitio Node
macOS Mojave, Bootcamp Windows 10


OliverB
(@oliverb)
Noble Member
Joined: 9 months ago
 
Posted by: timothyov

Thanks! So how do you know which card is good? Brand? Is there some known hierarchy, like:

Sapphire > Asus > MSI > Gigabyte > XFX?

Why would you put XFX at the end? I like XFX; they have the coolest colours on the DVI ports. Seriously: I think none of the vendors is outstanding in any direction, except for EVGA, but that's a non-AMD topic.

2018 15" MBP & 2015 13" MBP connected to RTX2080Ti GTX1080Ti GTX1080 Vega56 RX580 R9-290 GTX680


mac_editor
(@mac_editor)
Noble Member Moderator
Joined: 2 years ago
 

@timothyov I don't know, to be honest. The point I was making was that regardless of brand, silicon can be different (within the same brand as well). Also, lottery means luck - so it's a gamble as to whether you will get a great overclockable chip or not. This is a non-issue for most consumers. Buy what you like. I personally like the build quality of Sapphire models.

@oliverb In early eGPU years, XFX GPUs were considered to be more problematic IIRC. This should be a non-issue today for the most part.

This post was modified 2 months ago

purge-wrangler.sh | purge-nvda.sh | set-eGPU.sh | automate-eGPU EFI Installer
----
Troubleshooting eGPUs on macOS
Command Line Swiss Knife
eGPU Hardware Chart
Multiple Build Guides
----
Current: MacBook Pro RP560X + 480/R9 Fury/Vega 64 | GTX 780/1070
Previous: 2014 MacBook Pro 750M + 480/R9 Fury | GTX 780/980 Ti/1070


goalque
(@goalque)
Noble Member Admin
Joined: 3 years ago
 
Posted by: mac_editor

@oliverb In early eGPU years, XFX GPUs were considered to be more problematic IIRC. This should be a non-issue today for the most part.

AFAIK, custom XFX vBIOSes are not macOS compatible; only the reference designs are recommended.

There are lots of reports, here’s one:

https://egpu.io/forums/mac-setup/2018-mac-mini-sonnet-egfx-breakaway-box-rx-580-crashes-when-connected-help/

This post was modified 2 months ago

automate-eGPU EFI | apple_set_os.efi
--
2018 13" MacBook Pro + Radeon [email protected] + Win10 1809


motoic
(@tsh3721)
Eminent Member
Joined: 2 years ago
 
Posted by: OliverB

Why would you put XFX at the end? I like XFX; they have the coolest colours on the DVI ports. Seriously: I think none of the vendors is outstanding in any direction, except for EVGA, but that's a non-AMD topic.

XFX have almost always been the budget cards, for as long as I can remember. The highest-MSRP cards were always from Asus, MSI, and a couple of other companies I can't remember now.

2018 15" MBP w/ Radeon Pro 560x
Nvidia GTX 1070
Akitio Node
macOS Mojave, Bootcamp Windows 10


OliverB
(@oliverb)
Noble Member
Joined: 9 months ago
 
Posted by: timothyov

XFX have almost always been the budget cards, for as long as I can remember. The highest-MSRP cards were always from Asus, MSI, and a couple of other companies I can't remember now.

@timothyov
This is interesting.
Most of the people who follow my posts know that I collect GPUs. In my personal opinion, XFX cards are rather interesting. In terms of production quality they surely are not low-budget. @goalque said the XFX vBIOSes were not Mac compatible. This may be correct, as an XFX Radeon HD 7770 I have just won't work in macOS, but I wouldn't call that problem "low budget".
As said before, the only brand that is outstanding in quality is EVGA, but they only do nVidia. Asus and MSI are middle-tier brands like many others, in my opinion.

 

2018 15" MBP & 2015 13" MBP connected to RTX2080Ti GTX1080Ti GTX1080 Vega56 RX580 R9-290 GTX680


OliverB
(@oliverb)
Noble Member
Joined: 9 months ago
 

FYI... I made a list of my collection and this is the result:

EVGA 5
Gainward 5
Sapphire 3
Gigabyte 3
XFX 3
Asus 3
Zotac 3
Inno3D 2
MSI 2
Sparkle 2
Club 3D 1
Nvidia 1
ATI 1
PNY 1
Palit 1

Additionally, I had about 100 other GPUs, which I don't have anymore.

This post was modified 2 months ago

2018 15" MBP & 2015 13" MBP connected to RTX2080Ti GTX1080Ti GTX1080 Vega56 RX580 R9-290 GTX680

