Tensorflow to use eGPU (Akitio Node | NVIDIA 1070), OSX
 

(@mark_stephenson)
Active Member
Joined: 2 years ago
 

Hi,

I've finally got my Mac to recognise the NVIDIA card (via my Akitio Node), thanks to this wonderful forum.

Now I need to figure out how to get my Mac to USE it. My only real use case is TensorFlow; if I can use it for other things then great, but my immediate aim is to get TensorFlow running on the eGPU-housed 1070.

Has anyone achieved this?  Can anyone point me in the right direction?

Thanks,

Mark.



Morv
(@morv)
Eminent Member
Joined: 3 years ago
 

You have to compile it yourself if you want to use a version of TensorFlow newer than 1.0/1.1, since official GPU support on macOS was dropped after that.

There's a guide: https://gist.github.com/smitshilu/53cf9ff0fd6cdb64cca69a7e2827ed0f
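
Once the build finishes, a quick sanity check (standard TF 1.x, nothing specific to this setup) is to list the devices TensorFlow can see and make sure the 1070 shows up:

import tensorflow as tf
from tensorflow.python.client import device_lib

# If the build picked up CUDA correctly, a GPU device should be listed
# alongside the CPU.
print(tf.__version__)
print(device_lib.list_local_devices())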



(@mark_stephenson)
Active Member
Joined: 2 years ago
 

I have now managed to do this. Not a straightforward process at all.

Even after getting TF working via the command line, it was a struggle to get Jupyter to use it.

I'm now getting a strange CUDA out-of-memory error in the console, even when just outputting the session.
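
If it turns out to be the usual "TF grabs nearly all the VRAM at session creation" behaviour, the standard TF 1.x knob I'm going to try is on-demand allocation (no idea yet whether it behaves any differently over Thunderbolt):

import tensorflow as tf

# Allocate GPU memory on demand instead of claiming nearly all of it
# the moment the session is created.
config = tf.ConfigProto()
config.gpu_options.allow_growth = True
# Or cap the fraction of VRAM TF is allowed to use:
# config.gpu_options.per_process_gpu_memory_fraction = 0.7

sess = tf.Session(config=config)
print(sess)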



Morv
(@morv)
Eminent Member
Joined: 3 years ago
 

I'm not familiar with TF and CUDA on macOS, but I think I ran into a similar CUDA issue when compiling Deeplearning4j with CUDA support. I did something wrong, but don't ask me what it was. Check that the CUDA, cuDNN, and driver versions all match what your TF build expects.

No idea why TF GPU support has been dropped on macOS. Maybe the audience is too small and the effort therefore too big, but it's Google behind TensorFlow, so yeah... it's strange.



Chippy McChipset
(@chippy-mcchipset)
Reputable Member
Joined: 2 years ago
 

Would the new Tensor Cores built into NVIDIA's Volta architecture (Titan V) help you with this? If so, and if the Node's PSU can handle that card, it might be worth trying to borrow one while you're going to the trouble of setting this up.

Thunderbolt 3 Macs, Sonnet and OWC eGPUs, 4K Displays, etc


psonice
(@psonice)
Estimable Member
Joined: 2 years ago
 

The performance boost for 8/16-bit maths alone would help massively, but with no Titan V drivers on macOS it'd be a big, expensive paperweight 😉

I'm currently using TF 1.4 on macOS 10.13.3 with a 1080 Ti, so it is possible to get it running. The main issue I had was that it needed (IIRC) CUDA 9.0, not 9.1. There are also a few precompiled wheels with CUDA support available online that you can install from.
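
Once it's installed, turning on device placement logging should confirm ops are actually landing on the card rather than silently falling back to the CPU; roughly something like:

import tensorflow as tf

# Pin a small matmul to the GPU. log_device_placement prints where each
# op actually runs; allow_soft_placement falls back to CPU instead of
# erroring if an op can't be placed on the GPU.
with tf.device('/gpu:0'):
    a = tf.random_normal([1000, 1000])
    b = tf.random_normal([1000, 1000])
    c = tf.matmul(a, b)

config = tf.ConfigProto(log_device_placement=True,
                        allow_soft_placement=True)
with tf.Session(config=config) as sess:
    print(sess.run(c).shape)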



Chippy McChipset
(@chippy-mcchipset)
Reputable Member
Joined: 2 years ago
 

Ah, didn't realize they weren't including Titan V drivers in their usual Mac web driver updates. That would indeed = paperweight.

Thunderbolt 3 Macs, Sonnet and OWC eGPUs, 4K Displays, etc

