eGPU on Linux - Easy-to-use setup script
Hey, thanks for the post. I did actually build it on my PPA for Disco (19.04) as well as Bionic (18.04 LTS).
It should actually work on any Linux distro, as long as it supports an X server, systemd, and Bash scripts. You just have to build it yourself with `make`.
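For anyone building from source, the steps are roughly the following (a sketch only; the repository URL and make targets are assumptions based on the usual project layout, so check the README first):

```shell
# Hedged example of a from-source build; adjust the URL for your fork/mirror.
git clone https://github.com/hertg/egpu-switcher
cd egpu-switcher
make                      # build the script/binary
sudo make install         # install to the system
sudo egpu-switcher setup  # interactive selection of iGPU/eGPU
```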
It's worth mentioning that some things, particularly eGPU hot-plug support, have gotten better with kernel 5.0+ and GNOME 3.32+, both of which ship in Ubuntu 19.04. So for Ubuntu eGPU users it's worth being on 19.04.
Edit: Oh yeah, and I can confirm that I was able to build the script on Manjaro too, so that's nice.
I registered an account simply to say thanks for creating this script, it made this process so simple.
For others' reference, this script worked seamlessly for me on a Dell XPS 15 9560 running Ubuntu 18.04.3 LTS with an Akitio Node and an Nvidia GT 740 (the only spare card I had to test with). It detected the iGPU on the i7, the GTX 1050 dGPU in the laptop, and the GT 740 in the Node on the first try.
On my Ubuntu 19.04 machine, when I finish typing my password and press Enter, the whole system seems to go to sleep. If I switch sessions and finally get in, the resolution is off and everything is very laggy. The only way around this is to press Enter and click my mouse button rapidly until I can see the desktop. I am running a Dell Precision 5520 with an RTX 2070 eGPU on Ubuntu 19.04 with the proprietary drivers. Anyone else have this issue?
A couple of questions:
1. In /etc/gdm3/custom.conf, is WaylandEnable=false commented out or not? Uncommenting that line, or installing a purely X-based login manager like LightDM, would be the first thing I would try.
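For reference, forcing GDM onto Xorg only requires making sure this line is present and not commented out (it lives under the [daemon] section):

```
# /etc/gdm3/custom.conf
[daemon]
WaylandEnable=false
```

Then restart GDM (or reboot) for it to take effect.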
2. What's your display configuration? Do you have an external display connected to the eGPU or are you using the laptop's internal display?
Edit: Actually, before all that, re-run `egpu-switcher setup` and post the names and IDs of the outputs you are selecting.
@hertg Your script still works great on Ubuntu and Pop!_OS 19.10! The login screen also got some enhancements in this version. I hot-plugged a Thunderbolt 3 NVMe M.2 adapter and saw this notification.
Having some really weird issues with this when using external displays connected to the eGPU.
Using NVIDIA PRIME, I can get the internal screen to render on the eGPU by setting the eGPU's BusID in /usr/share/X11/xorg.conf.d/10-nvidia.conf. However, my laptop remains completely unaware of external screens connected to the eGPU.
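For context, the PRIME setup described here boils down to a Device section along these lines (the BusID is a placeholder; find yours with `lspci | grep -i nvidia` and note that X expects the numbers in decimal, not hex):

```
Section "Device"
    Identifier "nvidia"
    Driver     "nvidia"
    BusID      "PCI:10:0:0"   # placeholder; replace with your eGPU's bus ID
EndSection
```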
Using this eGPU switcher, if I remove all NVIDIA- and Intel-related files from /usr/share/X11/xorg.conf.d, I do get output on the external screen connected to the eGPU. However, in that case nothing is actually rendered by the eGPU according to nvidia-smi, and I get massive mouse lag, about 2 seconds: the cursor moves smoothly but with a 2-second delay, and everything freezes up while it's moving. A little more investigation shows that this setup falls back to software rendering (llvmpipe), so it's not working either.
My preference would of course be to use external screens and render everything on the eGPU. This is 18.04 LTS with nvidia-435 drivers.
Does anybody have any ideas as to why this may be happening?
This turned out to be caused by https://bugs.launchpad.net/ubuntu/+source/gdm3/+bug/1716857 , even though TB16 dock monitors and DP/HDMI ports on the laptop are not affected by it. Looks like it's lightdm life for me going forward.
To those with lag issues: this can happen when X is running in a bandwidth-inefficient configuration. For example, if X is running on the iGPU but uses the eGPU for rendering before outputting to an eGPU-connected display, each frame crosses the Thunderbolt link unnecessarily; that wasted bandwidth is exactly the problem these scripts are written to address.
If you're using an NVIDIA card with the proprietary drivers, I would start by purging all old drivers and configuration, then try just the latest drivers (as installed from, say, the Ubuntu software update tool) with this script or the gswitch script. This seems to yield good results in many cases.
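As a sketch of the "purge everything" step on Ubuntu (the package globs below are the common ones, but double-check what is actually installed on your system first):

```shell
# Remove all installed nvidia packages and their configuration
sudo apt purge 'nvidia-*' 'libnvidia-*'
sudo apt autoremove
# Then reinstall the recommended driver, e.g. via the driver tool:
sudo ubuntu-drivers autoinstall
```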
Side note: if you're using the open-source drivers, only use the external setting of these scripts when an external display is attached. When using the internal display with the eGPU, you should switch to internal mode and use the DRI_PRIME=1 environment variable instead.
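To illustrate the internal-display case with the open-source drivers: after switching to internal mode, you offload individual applications to the eGPU per process, e.g.:

```shell
# Check which GPU renders by default vs. with offloading
glxinfo | grep "OpenGL renderer"              # should report the iGPU
DRI_PRIME=1 glxinfo | grep "OpenGL renderer"  # should report the eGPU
# Run a specific application on the eGPU
DRI_PRIME=1 glxgears
```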
Edit: Here's a list of the exact packages I was using when testing:
nvidia-compute-utils-430* nvidia-dkms-430* nvidia-driver-430* nvidia-kernel-common-430* nvidia-kernel-source-430* nvidia-prime* nvidia-settings* nvidia-utils-430*
I wanted to post this in case the script does not work for people. I had to figure these steps out the hard way:
Using Ubuntu 19.10, running the eGPU with the internal display: after enabling Thunderbolt, blacklisting nouveau, installing the drivers, and configuring the Xorg file (basic steps that can be found online), I was still stuck because the screen would hang at boot. I had to do these additional steps (which involved booting to a terminal, and removing the Xorg file whenever I needed to boot into the desktop temporarily):
With the Xorg file configured (adding the BusID, AllowEmptyInitialConfiguration, AllowExternalGpus, etc.; you can find templates for this in many forum posts)...
Once you save that file, reinstall the NVIDIA driver manually from the website (you need at least two different versions of the installer on hand, so that you can always install a version different from the one currently present), and let the NVIDIA installer recreate the Xorg file for you (this is an option in the installer). The template the NVIDIA software created this way was different: if you don't create an Xorg file yourself first from the forum-post templates, you get a generic Xorg file that doesn't work. So it is very important to create the Xorg file yourself and have the NVIDIA installer work off of that; I was then able to tweak it. Here was the output (I added the BusID and also made sure the eGPU video card was listed under Screen; you will see what was added if you follow the steps):
I have a Lenovo ThinkPad X1 Extreme (1st gen) laptop with a dGPU (1050 Ti Max-Q) as well as an eGPU (2080 Ti). It hung again after about a week of use, so I booted to a terminal and installed the driver again (remember that there are two driver downloads, so I switched back to the other installer file), had the NVIDIA installer recreate the Xorg file from the previous one I had tweaked, and got this:
I had to tweak the Xorg file (you will see where if you follow those steps), and the other thing I had to do was tweak the BIOS settings. I had to do more than just allow the Thunderbolt connection: switch to hybrid graphics, no boot-time delay, diagnostic boot, no Thunderbolt security, and pre-boot ACL (authorization with user intervention). Basically, the timing of the boot sequence has to be exactly right...
So I hadn't seen anybody posting these things, which is why I am doing so. It was not as easy for me as just using a basic Xorg file.
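For anyone trying to reproduce this, a minimal Xorg file of the shape described above looks roughly like the following (a sketch, not the exact file from my machine; the BusID is a placeholder and your values will differ):

```
Section "Device"
    Identifier "egpu"
    Driver     "nvidia"
    BusID      "PCI:6:0:0"    # placeholder; use your eGPU's bus ID (lspci, in decimal)
    Option     "AllowEmptyInitialConfiguration"
    Option     "AllowExternalGpus" "true"
EndSection

Section "Screen"
    Identifier "egpu-screen"
    Device     "egpu"          # the eGPU device must be the one referenced here
EndSection
```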
So I used this script and it seems to be working fine, but I have a question. When I am disconnected from my eGPU and have my internal GPU set to Intel integrated, the nvidia kernel module is still loaded and my dGPU is still powered up; it's just not being used. I believe it was in the P0 power state (full power), if I remember correctly (will check tomorrow and edit this post). This obviously kills the battery-life benefit of the integrated Intel GPU if the dGPU is just going to be burning the midnight oil doing nothing. Is there a way around this?
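For reference, the two things I plan to try (both are guesses about my setup, and the PCI address below is a placeholder):

```shell
# 1) Unload the proprietary driver stack when the eGPU is detached
#    (this fails if anything still holds the GPU, e.g. nvidia-persistenced):
sudo modprobe -r nvidia_drm nvidia_modeset nvidia_uvm nvidia

# 2) Or let the kernel power the card down when idle via runtime PM
#    (replace the PCI address with your dGPU's; check it with lspci):
echo auto | sudo tee /sys/bus/pci/devices/0000:01:00.0/power/control
```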