2018 15" Samsung 900X5T (MX150) [8th,4C,U] + RTX 2080 Ti @ 32Gbps-TB3 (AKiTiO Node Pro) + Win10 & Linux Ubuntu 18.04 [chrisfrg]
- Samsung 900X5T 15" (i7-8550U, MX150, Intel UHD Graphics 620)
- Windows 10, Ubuntu 18.04
- Dell 24" monitor (non-gaming)
|with laptop screen
||with external monitor
- pretty much plug-and-play
- The NVIDIA driver was already installed for the MX150
- Using the NVIDIA 410 driver
- must turn on Akitio Node Pro before booting into Ubuntu
- after each boot, 2 commands have to be run for the eGPU to be detected (got this from here)
$ sudo sh -c 'echo 1 > /sys/bus/thunderbolt/devices/0-1/authorized'
$ sudo modprobe nvidia
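The two commands above can be wrapped in a small helper so there is one thing to run after each boot. This is a minimal sketch: the function name `authorize_egpu` is mine, and the port number `0-1` matches this build — check yours with `ls /sys/bus/thunderbolt/devices`.

```shell
# Hypothetical post-boot helper; run as root.
authorize_egpu() {
    # default to the Thunderbolt port used in this build
    auth_file="${1:-/sys/bus/thunderbolt/devices/0-1/authorized}"
    if [ ! -e "$auth_file" ]; then
        echo "no Thunderbolt device at $auth_file (is the Node Pro powered on?)" >&2
        return 1
    fi
    echo 1 > "$auth_file"    # authorize the eGPU enclosure
    modprobe nvidia 2>/dev/null \
        || echo "modprobe nvidia failed (needs root)" >&2
}
```

A udev rule or the `bolt`/`boltctl` daemon can make the authorization permanent, but the manual approach above is what this build uses.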
- to use external monitor connected through eGPU, xorg configuration file has to be created (got this from here)
- First create /etc/X11/xorg.conf.d folder
- Then create nvidia.conf in that folder with the following content
Section "Device"
    Identifier "Videocard0"
    BusID "PCI:08:0:0"    # This must be what your lspci command gave you
    Driver "nvidia"
    VendorName "NVIDIA Corporation"
    Option "AllowEmptyInitialConfiguration"
    Option "AllowExternalGpus"
EndSection
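The folder and file creation can be scripted; a sketch, assuming the paths from this guide (the function name `write_nvidia_conf` is mine, and the BusID is this build's — replace it with your own `lspci` output):

```shell
# Hypothetical helper: writes the nvidia.conf drop-in into the given directory.
write_nvidia_conf() {
    dir="$1"            # e.g. /etc/X11/xorg.conf.d (needs root for that path)
    mkdir -p "$dir"
    cat > "$dir/nvidia.conf" <<'EOF'
Section "Device"
    Identifier "Videocard0"
    BusID "PCI:08:0:0"
    Driver "nvidia"
    VendorName "NVIDIA Corporation"
    Option "AllowEmptyInitialConfiguration"
    Option "AllowExternalGpus"
EndSection
EOF
}
```

Usage on the real system would be something like `sudo sh -c '. ./helpers.sh; write_nvidia_conf /etc/X11/xorg.conf.d'`, after restarting X (or rebooting) the external monitor on the eGPU should come up.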
Interestingly, performance numbers are better on Ubuntu across the board.
|CUDA-Z on Windows 10
||CUDA-Z on Ubuntu 18.04
For Unigine Heaven, I made some mistakes and the settings are not identical on Windows and Ubuntu.
|Unigine Heaven on Windows 10 (External Monitor)
||Unigine Heaven on Ubuntu 18.04 (External Monitor)
|Unigine Valley on Windows 10 (External Monitor)
||AIDA64 GPGPU on Windows 10
I got into eGPUs to do deep learning. The flexibility to take my laptop anywhere and plug it into the Node Pro only when I need the CUDA power is liberating. It is basically impossible to train, or even run inference on, modern large neural network models with the MX150 that comes with the laptop. Most of the time, I don't connect the external monitor; I don't think this degrades performance, since I am not using the eGPU to drive graphics but to do computation.
I started with cloud GPUs on Azure. But I wanted to do computer vision with a webcam, and streaming the webcam video to the cloud, running inference, and streaming it back was just too complicated. For that, an eGPU is much simpler.
Pending: Add my system information and expected eGPU configuration to my signature to give context to my posts