Sonnet Puck 560 Shader / CU count (unlock?)
I was checking out the Sonnet Puck RX 560 specifications and found some confusing info regarding the shader / CU count of this MXM card.
I know that some 560 cards are potentially locked to 896 shaders / 14 CUs instead of 1024 shaders / 16 CUs. At least that is my impression from the info I found on the web.
However, I did not know that there might be a difference within the same model (Sonnet Puck RX 560). I still do not know - maybe it’s just software - but these two reviews seem to show one card with fully unlocked units and the other locked:
In case you have this eGPU - could you please check the info and share a GPU-Z screenshot? Would it also be possible to save the BIOS of the card with https://www.techpowerup.com/download/ati-atiflash
...and share it for testing purposes?
@susurs, if possible, post the month/year you purchased the Breakaway Puck. I'm wondering if all the newer units are 896 c / 14 CUs. My media sample unit is from the very first production run in summer 2017.
Thanks! I’ll check details tomorrow when I get home.
For now I only noticed that @xPhil posted a screenshot (in the review above) where it says 896 / 14. And I remember that my unit scored around 2400 points in Unigine on W10 at 1080p medium, while you achieved a score of 2900 with the same configuration.
I do not know for sure - maybe my unit will report 1024 / 16 tomorrow as well and it is just my ‘theory’, ...we’ll see.
In case it reports less - I’d like to try flashing it with a BIOS from a 1024-shader card.
P.S. Is ATIFlash able to extract a .rom on W10 from this eGPU?
@susurs, ATIFlash will most definitely be able to extract the .ROM from it. Make sure to keep the original .ROM in case the 1024 c / 16 CU flash doesn't work.
I have found that:
1) My card was manufactured on 2019-04-17 - at least that’s what GPU-Z reads from the BIOS;
2) There seem to be a few versions of this card, and later ones have 896 shaders (128 cut by AMD) according to the info here: https://www.techpowerup.com/gpu-specs/amd-baffin.g796
I can confirm that I was able to ‘unlock’ the shaders on my card (if someone wants to try something similar, you do it fully at your own risk), however the expected performance gain is a bit confusing.
From a technical perspective the results of the experiment are positive, with no issues at all.
I was initially unsure whether it would even work, as it is an eGPU sitting in the Sonnet Puck enclosure, and W10 is on a Mac Mini via Boot Camp, which makes the eGPU setup a bit trickier as such. However, no issues.
I used the procedure of modifying a hex value (see here: https://www.techpowerup.com/forums/threads/can-rx-560-mobile-baffin-xt-with-896-shaders-be-unlocked-to-1024-update-it-can.276514/ ) and after saving the file with PGE I was even able to flash it with the ATIFlash GUI without needing to force anything in the terminal.
Within a few seconds the card was flashed, and after a restart it was ‘live’. On the first restart I thought there was some W10 / driver issue, as the card showed a yellow triangle next to it in Device Manager with ‘Code 43’. I thought it was an eGPU issue again, since some tinkering is needed to avoid things like this in a Boot Camp / external graphics setup. However, it just needed the atikmdag patch again. After running it, all was good. GPU-Z reported 1024 shaders (see screenshots before and after). Some info is missing in GPU-Z, but I took that screenshot while the yellow exclamation mark was still showing in Device Manager.
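For anyone repeating the hex-edit step: one reason a raw hex edit can refuse to flash is the legacy option-ROM checksum, which editors fix for you on save. Below is a minimal sketch of that fixup in Python, under the common assumptions that the image length is the byte at offset 0x02 (in 512-byte units) and the checksum byte sits at offset 0x21 - verify these offsets against your own ROM before flashing anything real:

```python
# Recompute the legacy VBIOS checksum after a manual hex edit.
# Assumptions (verify against your own ROM): image length = byte at
# offset 0x02 in 512-byte units; checksum byte at offset 0x21; the
# whole image must sum to 0 mod 256.

def fix_checksum(rom: bytearray) -> bytearray:
    size = rom[0x02] * 512                        # image size in bytes
    rom[0x21] = 0                                 # clear the old checksum first
    rom[0x21] = (256 - sum(rom[:size])) % 256     # image now sums to 0 mod 256
    return rom

# Demo on a synthetic 512-byte image (a real ROM would come from an
# ATIFlash save, not from this script)
rom = bytearray(512)
rom[0x00], rom[0x01] = 0x55, 0xAA                 # option-ROM signature
rom[0x02] = 1                                     # 1 * 512 bytes
rom[0x10:0x20] = bytes(range(16))                 # pretend "edited" payload
fix_checksum(rom)
print(sum(rom[:512]) % 256)                       # -> 0
```

This is only the checksum part of the procedure; the actual shader-unlock byte and its offset come from the TechPowerUp thread linked above.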
Restarted in macOS - all is good as well.
There are no errors, no artifacts, etc. Temperatures are the same, around 45-47 °C at the desktop (MSI Afterburner).
However, I am a bit confused about the performance gains and their causes. I expected a gain of around 15-20%, but in reality it is 2-5%.
I ran Unigine on W10 - the gain was 50 points on average (‘Medium’), up from 2370 to around 2420. At first I thought there was no gain at all and those 50 points were just software / regular fluctuations.
The AIDA GPU benchmark even reported lower results.
Then I went to macOS and tested BasemarkGPU. There the score went up from 17000/17200 to around 17800/17900 points and it was a constant gain.
There was no real gain in Unigine Heaven or Novabench on macOS - maybe 1% for the latter.
My first thought was that the units are laser-cut and reporting 1024 is just a fiction, so I reflashed back to the default BIOS and retested.
It seems that the marginal gain is not a software fluctuation after all. Basemark GPU in macOS instantly went back to 16900/17000, and Unigine in W10 lost those 50 points.
I really thought the additional 128 shaders would make more of a difference. Could someone comment a bit on these small performance gains?
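To put rough numbers on the question: the mod adds ~14% more shaders, but the measured gains in this thread are far smaller - the classic signature of a card bottlenecked by something else (power limit, memory bandwidth) rather than fake CUs. A quick back-of-the-envelope check using the scores from the posts above (the Basemark figure uses the midpoints of the quoted ranges):

```python
# Compare the theoretical shader-count uplift with the measured
# benchmark uplift (scores taken from this thread).

def pct_gain(before, after):
    return (after / before - 1) * 100

shader_gain   = pct_gain(896, 1024)      # ~14.3% more shaders
unigine_gain  = pct_gain(2370, 2420)     # W10, medium: ~2.1%
basemark_gain = pct_gain(17100, 17850)   # macOS, midpoints: ~4.4%

print(f"shaders:  +{shader_gain:.1f}%")
print(f"Unigine:  +{unigine_gain:.1f}%")
print(f"Basemark: +{basemark_gain:.1f}%")
# Gains well below +14.3% suggest the extra CUs mostly sit idle behind
# a power/bandwidth limit rather than being disabled in silicon.
```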
P.S. I uploaded the original .rom to the database via GPU-Z, however I cannot find it now (even in the ‘unverified’ section). Will try again a bit later.
I don't have my RX 560 in my eGPU any more, but I've messed with this quite a bit and this is what I think is happening:
- around +5% in unigine is about what I would expect
- the patch that you used to get rid of error 43, I think, rolls you back to an older driver version that performs worse. I flashed a real RX 560 16 CU BIOS from TechPowerUp (just make sure the memory spec is the same) to avoid having to do this
EDIT: just remembered that this is an MXM card, so you probably can't just use any BIOS - you would have to use one from an existing Puck card
- this card is limited more by the power limit and memory speed. If you can/want, try increasing the memory speed by 50-100 MHz and the power limit by 10%. This should increase performance more than the shader count mod did
Can I use AMD WattMan to increase the power limit? Sorry, I haven’t used AMD cards before : )
P.S. Do you perhaps remember exactly which BIOS you used for flashing?
I did some more tests. I increased the power limit / GPU clock with software first and confirmed it was fine, then edited the already-modified 1024-shader BIOS and raised the TDP from the default 46 W to 55 W, TDC from 43 A to 48 A, and Max Power Limit from 40 W to 48 W. I also increased the top GPU clock state to 1300 MHz (1200 MHz default).
As a result, AIDA64 GPGPU reported 1965 SP GFLOPS instead of 1399 SP GFLOPS. Not sure if AIDA is correct, but 1399 GFLOPS does not really correspond to any mobile 560 card in this list: https://en.m.wikipedia.org/wiki/List_of_AMD_graphics_processing_units
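For reference, the theoretical peak single-precision throughput of a GCN part is shaders × 2 FLOPs/cycle × clock, so the AIDA numbers can be sanity-checked against theory (a rough calculation only; measured GFLOPS always land below the theoretical peak, especially on a power-limited card):

```python
# Theoretical peak SP GFLOPS for a GCN GPU: shaders * 2 FLOPs/cycle * clock.

def peak_sp_gflops(shaders, clock_mhz):
    return shaders * 2 * clock_mhz / 1000

print(peak_sp_gflops(896, 1200))    # stock:    2150.4 GFLOPS peak
print(peak_sp_gflops(1024, 1300))   # modified: 2662.4 GFLOPS peak
# AIDA's measured values (1399 and 1965 GFLOPS) sit below these peaks,
# as expected for a card held back by its power limit.
```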
Temperatures stayed within a normal range, maybe a total increase of 4-5 °C. While running some moderate benchmarks, temperatures did not go past ~71 °C, and stayed around 45 °C idle.
No issues, no artifacts.
However, it seems that most GPU benchmarks - Basemark GPU in macOS, Unigine in W10 - similarly gained around 11% performance.
Basemark GPU (medium) went up from 17000/17200 to 19100, and (high) from 1350 to 1515. Unigine in W10 went up from ~2350 to almost 2600 (medium).
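The ~11% gain showing up consistently across different benchmarks lines up with the clock and power-limit changes rather than the shader count, which fits the earlier power/bandwidth-limit theory. A quick check with the numbers above:

```python
# Benchmark uplift after the power-limit / clock mod, vs the core-clock bump.

def pct_gain(before, after):
    return (after / before - 1) * 100

clock_gain    = pct_gain(1200, 1300)     # core clock: ~8.3%
basemark_med  = pct_gain(17100, 19100)   # ~11.7%
basemark_high = pct_gain(1350, 1515)     # ~12.2%
unigine       = pct_gain(2350, 2600)     # ~10.6%

for name, g in [("clock", clock_gain), ("Basemark medium", basemark_med),
                ("Basemark high", basemark_high), ("Unigine", unigine)]:
    print(f"{name}: +{g:.1f}%")
# Benchmark gains slightly above the raw clock bump are plausible once the
# higher power limit lets the card hold its boost state longer.
```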
It might be interesting for testing purposes, however I do not think those ~10% are reason enough for me to leave the modified BIOS on the card.
P.S. Could someone comment a bit on increasing the power limits, TDP, TDC, MPL? How far could I go from the defaults? I looked through the TechPowerUp database to see similar cards and their power settings before I increased those on my card.