Redshift + Houdini Nvidia GPU
So I'm about to make the jump and get an eGPU for my good old Mac Pro 6,1.
I'm getting started with Houdini and it's definitely not as fast as it should be at the moment. I have more or less decided on a Razer Core X enclosure (mostly because I can easily find one locally in NZ).
I'll use a TB3-to-TB2 adaptor to plug it in, and I'm now trying to figure out which card to buy. From the many posts I have read, the GTX 1080 Ti seems like the obvious choice, but there are a few models out there and I don't know what the differences are. Some have 2 fans, some have 3: ASUS, EVGA, Aorus, Gigabyte. The price range varies, with about a $700 difference from the cheapest to the most expensive.
What should I be looking for in the specs? I will not be using it for gaming; it's only for pro apps like Redshift + Houdini and The Foundry's Nuke.
Any advice appreciated.
Be aware that GeForce cards display 8-bit colour and aren't suitable for accurate HDR workflows. The rendering performance will be good, but if you are working in a multi-person workflow then you will have to be careful that colour is consistent. The D700s you already have support 10-bit colour. It might be possible for you to visualise and work with your D700s and then use an external GPU just for rendering out.
Ah that's good to know, thanks!
I have a very strange issue with my Mac Pro at the moment. The viewport in Houdini is extremely slow. I get 10x better performance on my 2013 iMac (which has a GTX 780M).
All other apps (Modo, Nuke) have no issues on the Mac Pro, with great interactivity and fast renders, but Houdini is totally sticky, with the refresh rate dropping below 1 fps as soon as I do anything. I was hoping that an Nvidia card would fix that and allow me to use Redshift to render quicker too.
Would I have to connect my monitor to the eGPU? Or can I keep using it the same way (connected with TB2) and just select the Nvidia card for Houdini/Redshift?
Please note that Nvidia cards actually can display 10-bit colour.
Please don't believe everything you are told here.
Cool, that's great then.
Your link refers to full-screen DirectX gaming, not professional windowed applications. Video and photography professionals have known about this for many years.
'NVIDIA Geforce graphics cards have offered 10-bit per color out to a full screen Direct X surface since the Geforce 200 series GPUs. Due to the way most applications use traditional Windows API functions to create the application UI and viewport display, this method is not used for professional applications such as Adobe Premiere Pro and Adobe Photoshop. These programs use OpenGL 10-bit per color buffers which require an NVIDIA Quadro GPU with DisplayPort connector. '
When using a GeForce graphics card, check the Nvidia Control Panel in Windows or System Profiler in macOS. It will show that the driver only enables 8-bit colour per pixel in both operating systems.
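On macOS you can also check this from the terminal by parsing the output of `system_profiler SPDisplaysDataType`, which lists the framebuffer depth the driver has enabled. A minimal sketch (the `Pixel Depth:` line format is an assumption based on typical System Profiler output; the exact wording can vary between macOS versions):

```python
import re

def report_pixel_depths(profiler_output: str) -> list[str]:
    """Extract the 'Pixel Depth' values from system_profiler text output."""
    return re.findall(r"Pixel Depth:\s*(.+)", profiler_output)

# Sample of what SPDisplaysDataType output can look like (hypothetical excerpt):
SAMPLE = """\
Displays:
    Color LCD:
      Pixel Depth: 24-Bit Color (ARGB8888)
"""

print(report_pixel_depths(SAMPLE))  # ['24-Bit Color (ARGB8888)'] -> 8 bits per channel

# On a real Mac, feed it live data instead, e.g.:
#   import subprocess
#   out = subprocess.run(["system_profiler", "SPDisplaysDataType"],
#                        capture_output=True, text=True).stdout
#   print(report_pixel_depths(out))
```

A 10-bit pipeline would report something like `30-Bit Color (ARGB2101010)` instead.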
Chiming in to keep this discussion civil as well as to learn from it.
@oliverb Regardless of what is correct and what is not, you are being rude. I request you to mind your manners while posting on this forum. I recall correcting you on a multitude of things when you were less experienced with eGPU in general. I recall others advised and corrected me (still do politely when needed) when I started posting here. No one personally attacked anyone in these scenarios, as far as I recall. Comments like “... but the way you’re telling them, it’s just false. Like everything you said..” don’t fly. I could say a lot more, but hopefully this is enough to convince you to reconsider (for good) the way you address things you think are incorrect (or post in general). Just write one nice post with evidence to counter any claims and say something like “in my experience, I found...” instead of berating/attacking. Thank you in advance for your understanding.
@craftsman I’m certainly no video editing expert (hobbyist) but I also wanted some clarity with respect to 10-bit support. I was under the following impression (for Windows):
In NVIDIA Control Panel, I was able to set my display output to 10-bits, and sure enough I could see a visible difference on the Windows 10 desktop (NVIDIA can display 10-bit also on the desktop and fullscreen DirectX - as already indicated). The only place where it doesn’t is in OpenGL 10-bit buffers as you correctly pointed out, but I thought even AMD cards (non-Pro cards) have this limitation, and with specific applications (Adobe Suite). So is my understanding correct here? Some references:
One of the reddit responses seems to explain the differences nicely. But I’m not sure what is correct. The second link interestingly points out that the non-Pro Radeon cards can do OpenGL 10-bit but not for Adobe apps. Which I believe is the big advantage for AMD you are referring to?
Things are different on macOS, certainly, where AMD is superior. I have personally not given much thought to that since I've never used any NVIDIA GPU on macOS for long (just for script work/testing) and never looked at bit depth. Regardless, AMD performance on macOS exceeds modern NVIDIA cards (plus newer NVIDIA cards aren't supported in Mojave).
@mac_editor you are right, my temper got the better of me, and I apologise for any bad tone on my part.
At the same time, you should see that it was not unprovoked. It started when I criticised the Radeon VII as overrated, and then I got a lot of flames and lecturing with incorrect facts, as if it were forbidden to criticise that card. Most of those posts have since been deleted, but there was so much nonsense written that I had to respond with energy.
I may add that I am probably the only one here who had both cards at the same time for an extended period, the Radeon VII and the GTX 1080 Ti, testing and comparing them extensively. So yes, I am entitled to my opinion and to first-hand facts others cannot have. If that were accepted, I would never have replied in a ruder tone.
That sounds like a great deal. Is it a US power supply? I'm in NZ, so the voltage is different. Also, can you send me the exact model/brand of the 1080 and the Sonnet, please? I found a bunch of options while I was looking.
For some reason I'm not allowed to message you directly, sorry.