Redshift + Houdini nvidia GPU

Estimable Member
Joined: 2 years ago

@mac_editor Thanks for filling in more details for members. Yes, the mix of APIs on Windows makes it difficult for developers to be consistent.

Regarding Radeon colour depth in Windows, simply enable 10-bit colour in Adrenalin. Alternatively, the Radeon Pro software enables extra pro features on the consumer/prosumer cards. On macOS, Radeons have displayed 10-bit by default since El Capitan, as you probably know.

I received my Razer Core X Chroma today and am waiting for 10.14.5 to be released before I purchase a Radeon VII. I have a 1080 Ti SC2 but will sell it soon.

2019 16" MacBook Pro (RP5500M) [9th,8C,H] + RX 5700 XT @ 32Gbps-TB3 (Razer Core X Chroma) + macOS 10.15.1 [build link]  

Famed Member Moderator
Joined: 4 years ago

@craftsman thank you for elaborating.

@oliverb I had read through the thread before @itsage appropriately removed the posts, and no, you were not attacked. Rather, your response was far too strong for what I thought was a casual reminder to stay on-topic.

For the last time: just because your tests with the Radeon VII in an eGPU configuration show poor results does not mean the card itself is slower than the 1080 Ti in games. That conclusion can only be drawn if the VII is slower in a desktop as well. You are testing correctly but drawing the wrong conclusions. Your facts are not the only facts.

If, hypothetically, AMD releases a driver/firmware update and the same card starts outperforming the 1080 Ti in an eGPU (as it does in a desktop), are you going to go back and correct each and every one of your posts? No, because when you wrote them, they were valid. In the future, they may not be, which happens all the time. Also, thank you for understanding my perspective and not taking it the wrong way.

Thanks to me, this thread is also going off-topic, so I shall say no more.

Insights Into macOS Video Editing Performance

Master Threads:
2014 15-inch MacBook Pro 750M
2018 15-inch MacBook Pro

2019 13" MacBook Pro [8th,4C,U] + RX Vega 64 @ 32Gbps-TB3 (Mantiz Venus) + macOS 10.14.6 & Win10 [build link]  

mark hensley
Active Member
Joined: 3 years ago
Posted by: jean-luc

Hi Mark

That sounds like a great deal. Is it a US power supply? I'm in NZ, so the voltage is different. Also, can you send me the exact model/brand of the 1080 and the Sonnet, please? I found a bunch of options while I was looking.
For some reason I'm not allowed to message you directly, sorry.


It's this one:

And the power supply is rated 100-240 volts.


MacBook Pro 2011 15"
Sonnet Breakaway
GTX 1080
High Sierra 10.13.6

New Member
Joined: 1 year ago
Posted by: @mac_editor

Chiming in to keep this discussion civil as well as to learn from it.

@oliverb Regardless of what is correct and what is not, you are being rude. I request you to mind your manners while posting on this forum. I recall correcting you on a multitude of things when you were less experienced with eGPU in general. I recall others advised and corrected me (still do politely when needed) when I started posting here. No one personally attacked anyone in these scenarios, as far as I recall. Comments like “... but the way you’re telling them, it’s just false. Like everything you said..” don’t fly. I could say a lot more, but hopefully this is enough to convince you to reconsider (for good) the way you address things you think are incorrect (or post in general). Just write one nice post with evidence to counter any claims and say something like “in my experience, I found...” instead of berating/attacking. Thank you in advance for your understanding. 

@craftsman I’m certainly no video editing expert (just a hobbyist), but I also wanted some clarity with respect to 10-bit support. I was under the following impression (for Windows):
In NVIDIA Control Panel, I was able to set my display output to 10-bit, and sure enough I could see a visible difference on the Windows 10 desktop (NVIDIA can also display 10-bit on the desktop and in fullscreen DirectX, as already indicated). The only place where it doesn’t is in OpenGL 10-bit buffers, as you correctly pointed out, but I thought even non-Pro AMD cards had this limitation, and with specific applications (the Adobe suite). So is my understanding correct here? Some references:

One of the reddit responses seems to explain the differences nicely, but I’m not sure what is correct. The second link, interestingly, points out that non-Pro Radeon cards can do OpenGL 10-bit, just not for Adobe apps. I believe that is the big advantage for AMD you are referring to?
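To put numbers on why 10-bit matters at all, here is a minimal sketch (my own illustration, not from any of the linked references) comparing per-channel precision at 8-bit versus 10-bit. The gradient-banding example assumes a hypothetical 3840-pixel-wide display:

```python
# Per-channel precision for 8-bit vs 10-bit display output.
# Pure arithmetic from the definition of bit depth; the gradient
# example is hypothetical, not a measurement.

def channel_stats(bits: int) -> dict:
    levels = 2 ** bits            # distinct values per colour channel
    step = 1.0 / (levels - 1)     # quantization step on a 0..1 signal
    return {"bits": bits, "levels": levels, "step": step}

eight = channel_stats(8)    # 256 levels per channel
ten = channel_stats(10)     # 1024 levels per channel

# A smooth 0..1 horizontal gradient across a 3840-pixel-wide display:
# when there are fewer levels than pixels, neighbouring pixels share
# the same value and visible bands appear.
width = 3840
print(f"8-bit : {eight['levels']} levels -> ~{width // eight['levels']} px per band")
print(f"10-bit: {ten['levels']} levels -> ~{width // ten['levels']} px per band")
```

On a 4K-wide gradient, 8-bit output repeats each of its 256 values over roughly 15 adjacent pixels, which is where visible banding comes from; 10-bit cuts that to around 3-4 pixels, usually below the visibility threshold.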

Things are certainly different on macOS, where AMD is superior. I have personally not given much thought to that, since I’ve never used an NVIDIA GPU on macOS for long (just for script work/testing) and never looked at bit depth. Regardless, AMD performance on macOS exceeds modern NVIDIA cards (plus newer NVIDIA cards aren’t supported in Mojave).

Hi, you seem very knowledgeable. Can you help me?

I'm a newbie and have just begun using Houdini on my MacBook Pro (Retina, 13-inch, Mid 2014). I already bought the Akitio Node Thunderbolt 3 eGPU from Amazon (that exact link), but now I'm confused about which card to use. I saw this thread where someone had a problem with the RX Vega 64: Cant make my setup work - Akitio Node + Vega 64 LC + MBP 15" 2017 macOS 10.14.3

Can you please help me choose the best card to use with the Akitio Node Thunderbolt 3 eGPU?



I changed my mind; I will just buy an enclosure that supports NVIDIA cards. I read a thread on the SideFX forum saying that Redshift, which I want to use to speed up rendering, won't run on AMD hardware. Unfortunately, my Akitio Node only supports AMD on macOS :(.
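Since Redshift was CUDA-only at the time, a quick sanity check before committing to hardware is whether an NVIDIA GPU and driver are visible to the OS at all. A minimal sketch (the function name is mine; `nvidia-smi` ships with the NVIDIA driver, so this only succeeds on a machine with the driver installed):

```python
import subprocess

def nvidia_gpu_present() -> bool:
    """Return True if the NVIDIA driver tool `nvidia-smi` runs and
    reports at least one GPU; False on any failure (no driver, no
    GPU, or the tool not installed). A rough availability check,
    not a guarantee of Redshift compatibility."""
    try:
        result = subprocess.run(
            ["nvidia-smi", "--list-gpus"],
            capture_output=True, text=True, timeout=10,
        )
    except (FileNotFoundError, subprocess.TimeoutExpired):
        return False
    return result.returncode == 0 and "GPU" in result.stdout

print(nvidia_gpu_present())
```

On a Mac running Mojave this prints `False` regardless of the enclosure, since NVIDIA web drivers are unavailable there, which is exactly the limitation being discussed.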


To do: Create my signature with system and expected eGPU configuration information to give context to my posts. I have no builds.

