2017 13" MacBook Pro [7th,2C,U] + RX Vega 56 @ 32Gbps-TB3 (Mantiz Venus) + macOS 10.14.3 & Win10 [Eightarmedpet]
Going to refrain from too much waffle (maybe).
I was getting frustrated with the amount of noise from my Razer Core V1; even when idle the fans are audible to the extent that I wasn't happy leaving it plugged in when not in use. I also may have mentioned I've had issues with my Ultrafine - primarily I'm unable to drive it with any GPU (still waiting for one with TB3).
With those two considerations in mind I decided it was time for a new enclosure. My dream of a high-end mini enclosure is never happening, so time to accept it and go the other way - an enclosure that is a full-on docking solution with the ability to upgrade/mod it.
OH hai Mantiz Venus.
System specs (model inc screen size, CPU, iGPU, dGPU, operating system)
13 Inch MacBook Pro, i5, 500GB SSD, 16GB ram - to be upgraded this year?
Mods coming soon - trying to hunt out a PCIe riser that actually works.
It appears that unlike my Core V1, boot-up is much more predictable (famous last words), requiring nothing more than rEFInd, where previously I sometimes had to use the EFI Boot Manager as well, or instead. I can only assume it's somehow related to the TI83 vs TI82 chip (the Core having the latter).
I created another custom rEFInd theme based on an old Clover theme I designed for Corpnewt (if anyone dabbles in hacks they will prob know the guy) dropbox download here: https://www.dropbox.com/s/e1w4id31yy7glim/refind.zip?dl=0
- sudo mkdir /Volumes/ESP
(you'll be asked for your password)
- sudo mount -t msdos /dev/disk0s1 /Volumes/ESP
- sudo bless --mount /Volumes/ESP --setBoot --file /Volumes/ESP/efi/refind/refind_x64.efi --shortform
Reboot, and you should be good.
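One caveat on the mount step: the ESP isn't guaranteed to be /dev/disk0s1 on every machine, so it's worth confirming the identifier first with `diskutil list`. A minimal sketch of pulling the identifier out of that output (the sample line below is illustrative; on a real Mac you'd pipe `diskutil list` itself through the same grep/awk):

```shell
# Sample line from `diskutil list` output, for illustration only.
# The EFI System Partition's identifier is the last field.
sample='   1:                        EFI EFI                     209.7 MB   disk0s1'
echo "$sample" | grep -i 'EFI' | awk '{print $NF}'
# → disk0s1
```

If that prints something other than disk0s1, substitute it into the `sudo mount` command above.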
I've disabled the webcams in W10 too but not sure if that's needed.
PUBG gets a good solid 60fps @1440.
The Mantiz Venus is better than I expected...
The USB ports all appear to work (unlike the Core which was temperamental)
It's quieter than expected, as it appears Mantiz tweaked the design to include a 120mm fan rather than the original 80mm (best I double-check the size - checked, it's 120mm).
The build quality is higher than expected, but maybe my expectations were low; I actually think it's on par with the Core, though maybe not as nicely made as my Node Lite.
Due to the above I'm able to keep the Mantiz plugged in at all times, managing boot with rEFInd and keeping my headphones plugged into the Venus, which helps hide cables when not in use. I have a Noctua fan which I had planned on popping in instead of the stock one, but the fans on my MacBook Pro are actually louder, so I may not even bother. I had also planned on getting a semi-passive PSU, but again the stock one is quieter than expected. I'll see how the 5700 XT works out and reassess then. Still planning on getting the TB3 add-in card installed in the enclosure to directly drive my Ultrafine.
Couple of benchmark tests - looks to be underperforming? But H2D is oldish?
Thanks for the build guide. Really looking forward to hearing how the 5700 XT works out in the Venus. They recommended this setup themselves in a support case I had with them after my Vega 64 Nitro+ got consistently undervolted in my setup (and I lost my patience).
Would love to know when you're planning on testing the 5700 XT?
@stoffer88 hey buddy, no worries!
What was happening with your Vega?
I actually sent the 5700xt back, for two reasons..
1. No macOS support as of yet
2. Loopback performance loss is far greater than with Polaris and Vega (info courtesy of @itsage).
The second is an issue that only affects folks like me who are unable to connect their monitor directly to their GPU.
In retrospect, despite the daft bargain I found, I think sending it back was the right choice; my current blower card gets so damn loud (although so does my MacBook) that I don't really want another. The leaked MSI dual-fan cooler cards look really nice, and maybe by the time they hit, macOS drivers may be ready, and as W10 drivers mature problem 2 should be fixed too.
Updated with a couple of benchmarks, worried my set up is under performing...
These were done on my 4K TV connected directly to my GPU, W10 resolution @1440 for gaming but Valley @1080 fullscreen.
Interestingly, no noticeable FPS increase in PUBG vs loopback.
So my Windows install just got corrupted again and it couldn't find the restore points or whatever it's called.
Nuke and restart.
1903 and all the updates, and the eGPU wasn't detected, but no crashes as before.
Boot with EFI boot loader and it appeared.
Driver install crashed at 40%.
Unable to boot Windows again.
Nuke and restart.
Same thing happened with 1809: Boot Camp drivers crashed the system and corrupted the Windows install to the extent it doesn't even boot.
Hopefully my old 1607 build that I usually set up with works.
Sorry to spam, back up and running thanks to my antique 1607 ISO, I am so glad I kept that ISO!
Also installed the AMD auto-detect software, which picked up my Vega 56 with no issues and didn't crash on driver install.
Tested on my TV briefly (didn't want to risk another corruption) and all I needed was to boot using the EFI Bootloader, no tweaks, no disabling, all good.
Also to note - later versions of Windows didn't detect the eGPU when I hot-plugged; I had to boot with it already plugged in using the EFI Bootloader.
I hate Windows.
File link here if it helps anyone.
@eightarmedpet Windows Boot Camp with eGPU can be a silly mess. I have good luck with Win10 1809 on my 2017 13″ MacBook Pro but was never able to resolve the boot loop when I have Radeon drivers installed in Win10 1903.
@itsage Argh, tell me about it! Getting error 12 with the efi bootloader now... going to update to 1809 and see if that changes things.
Edit: update failed but it appears to be working now with rEFInd after I disabled the iSight camera. I love my old windows build.
I’m not sure... I was able to install, but when I tried to install the Boot Camp drivers things went south quickly; same with 1809, so it may be my model of MacBook and the specific build of Boot Camp drivers.
Microsoft has refused to join the Thunderbolt bandwagon because Intel and Apple make all the rules. You would see the #1 complaint of loyal Surface users is the lack of Thunderbolt connectivity. Until there’s a Microsoft laptop with Thunderbolt 3, they won’t bother testing compatibility prior to releasing these updates.
I think my two brain cells just bumped into each other and have sparked an idea, which may lead to me finally being able to drive my 5K AND have a single cable solution, the best part, I have all the parts already...
I've somehow always overlooked daisy-chaining... when daisy-chaining eGPUs through a single port, the bandwidth is split between the two enclosures (source), but if the first item in the chain uses virtually no bandwidth, the eGPU at the end shouldn't have any issues.
What I'm thinking is... MacBook Pro > Akitio Node lite with TB3 add in card > Mantiz venus
Plug the 5K into the TB3 card in the Node Lite, and use DP cables to route the GPU's video output into the Node Lite's TB3 card.
It may be a month before I can dig everything out (house renovations) but wanted to get it down before I forgot. If anyone fancies sense checking that would be appreciated but I'll give it a go myself asap....
Seems like that should work. @itsage already proved that an eGPU doesn't need to be first in the chain (is that true in Windows too?) when he connected 4 eGPUs to a Mac. However, I don't see how this is an improvement to the case where the TB3 AIC is not in the chain (if the TB3 AIC is working otherwise). I think your method eliminates a power switch that the AIC would require if it were not in a Thunderbolt 3 enclosure? And maybe one day your method will allow PCIe tunneling through the AIC that is inside a Thunderbolt enclosure (would require somebody to make a driver - maybe a USB4 solution will exist before then).
That would work but would this not increase latency (CPU-GPU)? Or maybe this is negligible. I am interested in seeing performance differences between a direct-connected eGPU vs. daisy-chained eGPU. There is probably some reasoning regarding the recommendation of plugging in GPUs directly.
@joevt would appreciate your thoughts on this as well. I also wanted to understand “latency” in terms of “gaming” FPS. Does higher latency = lower FPS, or lower frame time, or both, or something else?
I haven't made any measurements myself. I have seen benchmarks from others showing SSD performance drop in Thunderbolt chains but I don't know how that translates to GPUs. I expect there to be a difference but I don't know how much. Is latency similar to a change in PCIe link width or speed? Maybe in some ways but I don't think it should be as bad as halving the bandwidth (like reducing PCIe link width or speed would).
@joevt I found this test to be enlightening regarding bandwidth: https://egpu.io/forums/thunderbolt-enclosures/a-call-for-measurements-isolating-the-thunderbolt-effect/paged/5/#post-14987
It would seem that bandwidth is far from being the primary constraint on thunderbolt. I am hoping to understand the real constraint.
Perhaps it is this multiplexing and de-multiplexing that induces latency - from a short excerpt from Wikipedia:
Thunderbolt controllers multiplex one or more individual data lanes from connected PCIe and DisplayPort devices for transmission via two duplex Thunderbolt lanes, then de-multiplex them for use by PCIe and DisplayPort devices on the other end.
Wow, x4 Thunderbolt is worse than x1 PCIe 3.0... That could mean latency is a bigger factor than I thought. Additional columns to that chart would be very interesting: "x4 TB3 eGPU chain 2", "x4 TB3 eGPU chain 3", "x4 TB3 eGPU chain 4". Use Thunderbolt expansion boxes with no PCIe devices installed as the intermediary Thunderbolt device in the chain.
The thing about multiplexing - I don't see how it's different than a PCIe switch that has to move packets from an upstream link to downstream link. The links may have different speeds and widths like Thunderbolt but PCIe switches seem to do the job well enough. Maybe there's things that USB4 and Thunderbolt have to deal with that a PCIe switch does not so a PCIe switch can do it more efficiently.
From the USB4 spec there appears to be a tradeoff between throughput and latency for USB4 (page 330) and PCIe (page 422). It might apply to Thunderbolt as well.
@eightarmedpet That’s a solid plan. The Thunderbolt 3 AIC has always worked for me in terms of producing Thunderbolt 3 monitor output as long as there’s sufficient power.
Wow, x4 Thunderbolt is worse than x1 PCIe 3.0… That could mean latency is a bigger factor than I thought. Additional columns to that chart would be very interesting: “x4 TB3 eGPU chain 2”, “x4 TB3 eGPU chain 3”, “x4 TB3 eGPU chain 4”. Use Thunderbolt expansion boxes with no PCIe devices installed as the intermediary Thunderbolt device in the chain.
This is a great suggestion. I will find time and equipment to give these tests a shot.
@itsage look forward to your findings because most of this is way over my head, I'm just after a single cable solution and the most FPS possible...