Redshift + Houdini NVIDIA GPU
 


jean-luc
(@jean-luc)
Active Member
Joined: 7 months ago
 

Hi All

So I'm about to make the jump and get an eGPU for my good old Mac Pro 6,1.
I'm getting started with Houdini and it's definitely not as fast as it should be at the moment. I have more or less decided on a Razer Core X enclosure (mostly because I can easily find one locally in NZ).
I'll use a TB3-to-TB2 adaptor to plug it in, and I'm now trying to figure out which card to buy. From the many posts I have read, the GTX 1080 Ti seems like the obvious choice, but there are a few models out there and I don't know what the differences are. Some have 2 fans, some have 3; brands include ASUS, EVGA, AORUS, and Gigabyte. The price range varies, with about a $700 difference between the cheapest and the most expensive.
What should I be looking for in the specs? I will not be using it for gaming; it's only for pro apps like Redshift + Houdini and The Foundry's Nuke.

Any advice appreciated.
Thanks!
jean-luc

Mac OS High Sierra 10.13.6
Mac Pro 6,1
12-core
64GB RAM
AMD FirePro D700 x2

I want to add an NVIDIA eGPU to use with Redshift in Houdini


craftsman
(@craftsman)
Trusted Member
Joined: 7 months ago
 

Be aware that GeForce cards display 8-bit colour and aren't suitable for accurate HDR workflows. Rendering performance will be good, but if you are working in a multi-person workflow you will have to be careful that colour stays consistent. The D700s you already have support 10-bit colour. One option is to visualise and work on your D700s and use the external GPU just for rendering.
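
If you go that route, Redshift lets you choose which device it renders on, so the card driving your displays can stay out of the render. A rough Python sketch of launching a command-line render pinned to one GPU (assuming redshiftCmdLine is on your PATH; the -gpu flag and scene path here are my assumptions, so check the Redshift docs):

    # Launch a Redshift command-line render on a specific device,
    # leaving the display GPU(s) free for the viewport.
    import subprocess

    def render_on_gpu(scene_path, gpu_index=0):
        # "-gpu" selects the device index (assumed flag; verify in your docs)
        subprocess.run(
            ["redshiftCmdLine", scene_path, "-gpu", str(gpu_index)],
            check=True,
        )

    render_on_gpu("shot_010.rs", gpu_index=0)  # 0 = the eGPU in this sketch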

MacBook Pro 15" 2018, Radeon 5700 XT 50th Anniversary Edition, Razer Core X Chroma


jean-luc
(@jean-luc)
Active Member
Joined: 7 months ago
 

Ah, that's good to know, thanks!
I have a very strange issue with my Mac Pro at the moment. The viewport in Houdini is extremely slow. I get 10x better performance on my iMac 2013 (which has a GTX 780M).
All other apps (Modo, Nuke) have no issues on the Mac Pro, with great interactivity and fast renders, but Houdini is totally sticky, with the refresh rate dropping below 1fps as soon as I do anything. I was hoping that an NVIDIA card would fix that and allow me to use Redshift to render quicker too.
Would I have to connect my monitor to the eGPU, or can I keep using it the same way (connected with TB2) and just select the NVIDIA card for Houdini/Redshift?
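
In case it's useful, this is how I've been checking which GPU a plain OpenGL context lands on (a rough sketch, assuming the glfw and PyOpenGL packages are installed; it isn't Houdini-specific):

    # Create a hidden OpenGL context and print which GPU backs it.
    import glfw
    from OpenGL.GL import glGetString, GL_VENDOR, GL_RENDERER

    glfw.init()
    glfw.window_hint(glfw.VISIBLE, glfw.FALSE)  # no window on screen
    window = glfw.create_window(64, 64, "gpu probe", None, None)
    glfw.make_context_current(window)

    print("Vendor:  ", glGetString(GL_VENDOR).decode())
    print("Renderer:", glGetString(GL_RENDERER).decode())

    glfw.terminate()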


Mac OS High Sierra 10.13.6
Mac Pro 6,1
12-core
64GB RAM
AMD FirePro D700 x2

I want to add an NVIDIA eGPU to use with Redshift in Houdini


OliverB
(@oliverb)
Noble Member
Joined: 1 year ago
 

@jean-luc
Please note that NVIDIA cards actually can display 10-bit color.

See also:
https://nvidia.custhelp.com/app/answers/detail/a_id/3011/~/10-bit-per-color-support-on-nvidia-geforce-gpus

Please don't believe everything you are told here.

 


2018 15" MBP & 2015 13" MBP connected to RTX2080Ti GTX1080Ti GTX1080 Vega56 RX580 R9-290 GTX680


jean-luc
(@jean-luc)
Active Member
Joined: 7 months ago
 

Cool, that's great then 🙂

Can't wait!

Mac OS High Sierra 10.13.6
Mac Pro 6,1
12-core
64GB RAM
AMD FirePro D700 x2

I want to add an NVIDIA eGPU to use with Redshift in Houdini


craftsman
(@craftsman)
Trusted Member
Joined: 7 months ago
 

@oliverb

Your link refers to full-screen DirectX gaming, not professional windowed applications. Video and photography professionals have known about this for many years.

'NVIDIA Geforce graphics cards have offered 10-bit per color out to a full screen Direct X surface since the Geforce 200 series GPUs.  Due to the way most applications use traditional Windows API functions to create the application UI and viewport display, this method is not used for professional applications such as Adobe Premiere Pro and Adobe Photoshop.  These programs use OpenGL 10-bit per color buffers which require an NVIDIA Quadro GPU with DisplayPort connector.  '

When using a GeForce graphics card, check the NVIDIA Control Panel in Windows or System Profiler in macOS: both will show that the driver only enables 8-bit colour per pixel.
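
If you want to verify this empirically, a 16-bit gradient ramp makes it obvious: on a true 10-bit path the ramp stays smooth, while an 8-bit path shows visible banding. A rough sketch, assuming numpy and OpenCV are installed (view the result full screen in an application that actually uses a 10-bit surface):

    # Write a horizontal 16-bit grayscale ramp as a PNG.
    # Smooth gradient = 10-bit display path; visible steps = 8-bit.
    import numpy as np
    import cv2

    width, height = 3840, 400
    ramp = np.linspace(0, 65535, width).astype(np.uint16)
    img = np.tile(ramp, (height, 1))            # repeat the row vertically
    img = np.stack([img, img, img], axis=-1)    # grayscale -> 3-channel
    cv2.imwrite("ramp_16bit.png", img)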

MacBook Pro 15" 2018, Radeon 5700 XT 50th Anniversary Edition, Razer Core X Chroma


mac_editor
(@mac_editor)
Famed Member Moderator
Joined: 3 years ago
 

Chiming in to keep this discussion civil as well as to learn from it.

@oliverb Regardless of what is correct and what is not, you are being rude. I request you to mind your manners while posting on this forum. I recall correcting you on a multitude of things when you were less experienced with eGPU in general. I recall others advised and corrected me (still do politely when needed) when I started posting here. No one personally attacked anyone in these scenarios, as far as I recall. Comments like “... but the way you’re telling them, it’s just false. Like everything you said..” don’t fly. I could say a lot more, but hopefully this is enough to convince you to reconsider (for good) the way you address things you think are incorrect (or post in general). Just write one nice post with evidence to counter any claims and say something like “in my experience, I found...” instead of berating/attacking. Thank you in advance for your understanding. 

@craftsman I’m certainly no video editing expert (hobbyist) but I also wanted some clarity with respect to 10-bit support. I was under the following impression (for Windows):
In NVIDIA Control Panel, I was able to set my display output to 10-bit, and sure enough I could see a visible difference on the Windows 10 desktop (NVIDIA can display 10-bit on the desktop and in fullscreen DirectX, as already indicated). The only place it doesn't is in OpenGL 10-bit buffers, as you correctly pointed out, but I thought even AMD's non-Pro cards have this limitation with specific applications (the Adobe suite). So is my understanding correct here? Some references:

https://community.amd.com/thread/227740
https://forums.adobe.com/message/9978119#9978119
https://www.reddit.com/r/Amd/comments/7vd3ln/does_vega_support_10bit_color_output_in_windows/

One of the Reddit responses seems to explain the differences nicely, but I'm not sure what is correct. The second link interestingly points out that non-Pro Radeon cards can do OpenGL 10-bit, just not for Adobe apps. Is that the big advantage for AMD you are referring to?
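
For anyone who wants to poke at the OpenGL side directly, you can request a 10-bit default framebuffer and then check what the driver actually granted. A minimal sketch, assuming glfw and PyOpenGL are installed (on GeForce/non-Pro cards the driver may silently fall back to 8 bits, which is the whole point of the test):

    # Request a 10-bit-per-channel framebuffer and report what we got.
    import glfw
    from OpenGL.GL import glGetIntegerv, GL_RED_BITS

    glfw.init()
    glfw.window_hint(glfw.RED_BITS, 10)
    glfw.window_hint(glfw.GREEN_BITS, 10)
    glfw.window_hint(glfw.BLUE_BITS, 10)
    window = glfw.create_window(64, 64, "10-bit probe", None, None)
    glfw.make_context_current(window)

    # GL_RED_BITS reflects the default framebuffer depth (legacy GL query)
    print("Red bits granted:", glGetIntegerv(GL_RED_BITS))
    glfw.terminate()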

Things are different on macOS, certainly, where AMD is superior. I have personally not given much thought to that since I've never used any NVIDIA GPU on macOS for long (just for script work/testing) and never looked at bit depth; irrespective, AMD performance on macOS exceeds modern NVIDIA cards (plus newer NVIDIA cards aren't supported in Mojave).

purge-wrangler.sh | purge-nvda.sh | set-eGPU.sh | automate-eGPU EFI Installer
2018 MacBook Pro 15" RP560X + RX 5700 XT (Mantiz Venus)


OliverB
(@oliverb)
Noble Member
Joined: 1 year ago
 

@mac_editor you are right, my temperament got the better of me, and I apologise for any bad tone on my part.

At the same time, you should also see that it was not unprovoked. It started when I criticised the Radeon VII as overrated, and I then got a lot of flames and lecturing with incorrect facts, as if it were forbidden to criticise that card. Most of those posts have since been deleted, but there was so much nonsense written that I had to respond with energy.

I may add that I am probably the only one who has had both cards, the Radeon VII and the GTX 1080 Ti, at the same time for a longer period, testing and comparing them extensively. So yes, I am entitled to my opinion and to facts others cannot have. If this were accepted, I would never reply in any ruder form.

2018 15" MBP & 2015 13" MBP connected to RTX2080Ti GTX1080Ti GTX1080 Vega56 RX580 R9-290 GTX680


(@mark_hensley)
Active Member
Joined: 1 year ago
 

FYI, I am selling a Sonnet enclosure + GTX 1080 eGPU combo for $500.

https://egpu.io/forums/mac-setup/script-enable-egpu-on-tb1-2-macs-on-macos-10-13-4/#post-33102

MacBook Pro 2011 15"
Sonnet Breakaway
GTX 1080
High Sierra 10.13.6


jean-luc
(@jean-luc)
Active Member
Joined: 7 months ago
 

Hi Mark,

That sounds like a great deal. Is it a US power supply? I'm in NZ, so the voltage is different. Also, can you send me the exact model/brand of the 1080 and the Sonnet, please? I found a bunch of options while looking.
For some reason I'm not allowed to message you directly, sorry.

Cheers

Mac OS High Sierra 10.13.6
Mac Pro 6,1
12-core
64GB RAM
AMD FirePro D700 x2

I want to add an NVIDIA eGPU to use with Redshift in Houdini


craftsman
(@craftsman)
Trusted Member
Joined: 7 months ago
 

@mac_editor Thanks for filling out more details for members. Yes, the mix of APIs makes it difficult for developers to be consistent on Windows.

Regarding Radeon colour depth in Windows, simply enable 10-bit colour in the Adrenalin driver settings. Alternatively, the Radeon Pro software enables extra pro features on the consumer/prosumer cards. On macOS, Radeons have displayed 10-bit by default since El Capitan, as you probably know.
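
You can confirm what macOS is actually driving with a quick script (a rough sketch; the exact label in the output, e.g. "Framebuffer Depth" vs "Pixel Depth", varies by macOS version, so treat the field names as assumptions):

    # Ask macOS what bit depth each display framebuffer is running at.
    import subprocess

    report = subprocess.run(
        ["system_profiler", "SPDisplaysDataType"],
        capture_output=True, text=True, check=True,
    ).stdout

    for line in report.splitlines():
        # e.g. "Framebuffer Depth: 30-Bit Colour (ARGB2101010)"
        if "Depth" in line:
            print(line.strip())

A 30-bit entry means the 10-bit path is active.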

I received my Razer Core X Chroma today and am waiting for 10.14.5 to be released before I purchase a Radeon VII. I have a 1080 Ti SC2 but will sell it soon.


MacBook Pro 15" 2018, Radeon 5700 XT 50th Anniversary Edition, Razer Core X Chroma


mac_editor
(@mac_editor)
Famed Member Moderator
Joined: 3 years ago
 

@craftsman thank you for elaborating.

@oliverb I had read through the thread before @itsage appropriately removed the posts, and no, you were not attacked. Rather, your response was far too strong for what I thought was a casual reminder to stay on-topic. For the last time: just because your tests with the Radeon VII in an eGPU configuration show poor results does not mean the card itself is slower than the 1080 Ti in games. That conclusion can only be drawn if the VII is slower in a desktop as well. You are testing right, but drawing conclusions wrong. Your facts are not the only facts. If, hypothetically, AMD releases a driver/firmware update and the same card starts performing better than a 1080 Ti in eGPU (like it does on the desktop), are you going to go back and correct each and every post of yours? No, because when you wrote them they were valid. In the future they may not be, which happens all the time. Also, thank you for understanding my perspective and not taking it the wrong way.

Thanks to me, this thread is also going off-topic, so I shall say no more.


purge-wrangler.sh | purge-nvda.sh | set-eGPU.sh | automate-eGPU EFI Installer
2018 MacBook Pro 15" RP560X + RX 5700 XT (Mantiz Venus)


(@mark_hensley)
Active Member
Joined: 1 year ago
 
Posted by: jean-luc

Is it a US power supply? I'm in NZ, so the voltage is different. Also, can you send me the exact model/brand of the 1080 and the Sonnet, please?

It's this one:

http://www.sonnettech.com/product/egfx-breakaway-box.html

And the power supply is rated 100-240 volts, 50-60 Hz.

Mark

https://egpu.io/forums/mac-setup/script-enable-egpu-on-tb1-2-macs-on-macos-10-13-4/#post-33102

MacBook Pro 2011 15"
Sonnet Breakaway
GTX 1080
High Sierra 10.13.6


(@linda_f)
New Member
Joined: 3 days ago
 
Posted by: @mac_editor

Chiming in to keep this discussion civil as well as to learn from it. [...]

Hi, you seem very knowledgeable. Can you help me?

I'm a newbie and have just started using Houdini on my MacBook Pro (Retina, 13-inch, Mid 2014). I already bought an Akitio Node Thunderbolt 3 eGPU from Amazon (that exact link), but now I'm confused about which card to use. I saw this thread where someone had problems with the RX Vega 64: Cant make my setup work - Akitio Node + Vega 64 LC + MBP 15" 2017 macOS 10.14.3

Can you please suggest the best card to use with the Akitio Node Thunderbolt 3 eGPU?

 

Update:

I changed my mind; I will just buy an enclosure that supports NVIDIA cards. I read a thread on the SideFX forum saying that Redshift, which I wanted for faster rendering, won't run on AMD hardware. Unfortunately, my Akitio Node only supports AMD on macOS :(.
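
(For anyone who lands here later: Redshift only renders on CUDA devices, so once an NVIDIA card is connected you can sanity-check that the driver sees it. A rough sketch, assuming the NVIDIA driver and its nvidia-smi tool are installed:)

    # List the CUDA-capable GPUs the NVIDIA driver exposes;
    # Redshift can only render on devices that appear here.
    import subprocess

    result = subprocess.run(["nvidia-smi", "-L"], capture_output=True, text=True)
    print(result.stdout or "No NVIDIA devices found")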


Pending: Add my system information and expected eGPU configuration to my signature to give context to my posts

