TensorFlow with an eGPU (Akitio Node | NVIDIA 1070) on macOS
I've finally got my Mac to recognise the NVIDIA card (via my Akitio Node), with thanks to this wonderful forum.
Now I need to figure out how to get my Mac to actually USE it. My only real use case is TensorFlow; if I can use it for other things then great, but my immediate aim is to get TensorFlow running on the eGPU-housed 1070.
Has anyone achieved this? Can anyone point me in the right direction?
You have to compile it yourself if you want a version of TensorFlow newer than 1.0/1.1.
There's a guide: https://gist.github.com/smitshilu/53cf9ff0fd6cdb64cca69a7e2827ed0f
I have now managed to do this, though it's not a straightforward process at all.
Even after I got TF working from the command line, it was a real chore to get Jupyter to use it as well.
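A quick way to check whether a given Python environment's TensorFlow build can see the GPU at all (a sketch using TF's `device_lib` helper; it falls back gracefully if TensorFlow isn't importable). Running it both from the command line and inside a Jupyter cell will show whether the notebook kernel is pointed at a different environment:

```python
# List the compute devices TensorFlow can see. If the command-line Python
# shows a GPU but the Jupyter kernel doesn't, the kernel is using a
# different Python environment than the one with the CUDA-enabled build.
try:
    from tensorflow.python.client import device_lib
    devices = [d.name for d in device_lib.list_local_devices()]
except ImportError:
    devices = []  # no TensorFlow in this environment

print(devices)
gpu_visible = any("GPU" in name for name in devices)
print("GPU visible to TensorFlow:", gpu_visible)
```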
I'm now getting a strange CUDA out-of-memory error in the console, even when just printing the session.
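One common cause of CUDA out-of-memory errors right at session creation is that TF 1.x tries to grab nearly all GPU memory up front by default; telling it to allocate on demand sometimes clears this. A minimal sketch, assuming a TF 1.x build (it degrades gracefully if one isn't installed):

```python
# Ask TF 1.x to allocate GPU memory on demand instead of claiming it all
# at session creation. Guarded so the script still runs without TF 1.x.
session_ok = False
try:
    import tensorflow as tf

    config = tf.ConfigProto()
    config.gpu_options.allow_growth = True  # allocate GPU memory as needed
    # Alternatively, cap the fraction of GPU memory TF may claim:
    # config.gpu_options.per_process_gpu_memory_fraction = 0.8
    sess = tf.Session(config=config)
    print(sess.run(tf.constant("session created OK")))
    sess.close()
    session_ok = True
except (ImportError, AttributeError) as exc:
    print("Skipping: this sketch needs a TensorFlow 1.x build.", exc)
```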
I'm not familiar with TF and CUDA on macOS, but I think I ran into a similar CUDA issue when compiling Deeplearning4J with CUDA support. I did something wrong, but don't ask me what it was. Check that the CUDA version, cuDNN, and the rest of the toolchain match what your TF build expects.
No idea why TF GPU support has been dropped on macOS. Maybe the audience is too small and the effort therefore too big, but it's Google behind TensorFlow, so yeah... it's strange.
Would the new Tensor cores built into NVIDIA's Volta architecture (Titan V) help you with this? If so and if the Node's PSU can handle that card, might be worth trying to borrow one as long as you're going to the trouble to set this up.
The performance boost for 8/16-bit maths alone would be huge, but with no Titan V drivers on macOS it'd be a big, expensive paperweight 😉
I'm currently using TF 1.4 on macOS 10.13.3 with a 1080 Ti, so it is possible to get it running. The main issue I had was that it needed (IIRC) CUDA 9.0, not 9.1. There are also a few precompiled wheels with CUDA support floating around on the net that you can install instead.
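If you're not sure which toolkit you ended up with, you can ask `nvcc` directly. A small sketch; `/usr/local/cuda` is assumed to be the default macOS install prefix, so adjust the path if your toolkit lives elsewhere:

```python
# Print the installed CUDA toolkit version by running nvcc --version.
# /usr/local/cuda is the typical macOS install prefix (an assumption).
import os
import subprocess

nvcc = "/usr/local/cuda/bin/nvcc"
if os.path.exists(nvcc):
    result = subprocess.run([nvcc, "--version"], capture_output=True, text=True)
    version_info = result.stdout
else:
    version_info = "nvcc not found at " + nvcc

print(version_info)
```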