Node Pro as eGPU... out of luck?
As noted above, the lack of drivers for the Titan V is the problem. On Linux/UNIX, NVIDIA's drivers are up to CUDA 390.*; the Mac isn't there yet, and NVIDIA won't (or can't) comment on when the drivers will be available.
For deep learning, the eGPU works great. I have a Titan Xp that hums right along. You do have to build TensorFlow from source, which is a fussy, time-consuming process, but it can be done.
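For anyone attempting the same thing, the from-source build follows roughly this shape. This is a hedged sketch, not the poster's exact recipe: the release branch, CUDA/cuDNN versions, and flags are assumptions for the TF 1.x era, and it presumes Bazel, the CUDA toolkit, and cuDNN are already installed and on your paths.

```shell
# Rough sketch of a TensorFlow-from-source build with CUDA support (TF 1.x era).
# Branch name and versions below are assumptions, not a tested recipe.
git clone https://github.com/tensorflow/tensorflow.git
cd tensorflow
git checkout r1.8              # pick the release branch you want
./configure                    # answer "y" for CUDA; point it at your CUDA/cuDNN install
bazel build --config=opt --config=cuda \
    //tensorflow/tools/pip_package:build_pip_package
./bazel-bin/tensorflow/tools/pip_package/build_pip_package /tmp/tensorflow_pkg
pip install /tmp/tensorflow_pkg/tensorflow-*.whl
```

The `./configure` step is interactive and is where most of the fussiness lives: it has to find the exact CUDA, cuDNN, and compiler versions on your machine, and a mismatch anywhere means starting the (long) Bazel build over.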
This is a video demonstrating the speed on a relatively complex model.
Thanks for the video. The group we are working with is studying breast cancer detection rates using deep learning vs. traditional radiologists. They are using 60k images and finding higher detection rates with deep learning. I'm just starting to apply these techniques to my research and see a lot of potential. We will be doing the heavy training at supercomputer centers, but I want to be able to explore and tweak the models locally on limited datasets. I think we are at the start of an exponential growth curve in the use of these techniques in medical imaging, particularly now that imaging is almost entirely digital rather than film. Access to relatively cheap, scalable GPU and storage capability makes a huge difference. It's light years away from the mid-'80s, when we were working with NMRI images.
MacBook Pro (Retina, 15-inch, Mid 2014), 2.8 GHz Intel Core i7, 16 GB RAM, NVIDIA GeForce GT 750M 2048 MB, Intel Iris Pro 1536 MB
MacBook Pro (Retina, 15-inch, 2018), 2.9 GHz Intel Core i9, 16 GB RAM, Radeon Pro 560X 4 GB
Sonnet BA 650W (upgraded) with Titan V & Sonnet BA 650W with NVIDIA GTX 1080
Mantiz Venus with RX Vega 8 GB
iMac Pro, 64 GB RAM, Vega 54 using Sonnet BA
That's where we are going with our research and why I built a couple of eGPU systems. We have some IT requirements that make bringing up a rogue UNIX box hard. We have a couple of enterprise GPU clusters, but there's much work to be done before we turn the training loose on those.
@Tico: What a great project! Oh, how things have come on since my day... I remember writing my own neural net and learning system in Java for skin cancer diagnosis. No fancy software or hardware for the job, just pure code! But the input dataset was just a bunch of 1s and 0s for 17 different features from a laser, far simpler than the images you're working with. I had to let the thing run overnight and hope it was doing the right thing. Needless to say, it often wasn't, meaning it took a llllooonnngggg time to get it working properly!
Good luck with your application.
Great discussion here. I've always felt as though eGPU is going to be a major component of ushering in a new age of discovery. All that power sitting under so many clever people's desks. It's exciting.
MP 6,1 | 4c | d700
MP 6,1 | 6c | d500
Absolutely amazing to learn how people are making use of eGPU.
There are differences in the Thunderbolt (TBT) firmware of an eGPU enclosure vs. a plain PCIe expansion enclosure. While we can install a graphics card inside the Node Pro, ...
What are those differences?
(This is also the topic of this thread, https://www.reddit.com/r/eGPU/comments/92ndul/do_egpu_boxes_no_daisychain_have_special_firmware/ .)