2019 13" Dell XPS 13 7390 [10th,6C,U] + GTX 1660 Super @ 32Gbps-TB3 (ADT-Link R43SG * M.2-TB3) + Win10
This has been my first time using an eGPU. Here is the build:
- 2019 13" Dell XPS 13 7390
- Intel Core i7-10710U 1.1-4.7 GHz, 6c/12t
- Intel UHD 620 iGPU
- 512GB SSD / 16GB RAM
- Windows 10 Home 19041.508
- External Monitors: Dell U2518D, Dell U2412M
- ADT-Link R43SG (non-TU)
- JEYI LEIDIAN M.2 to TB3 adapter
- EVGA GTX 1660 Super SC Ultra
- Dell DA-2 220w PSU
Installation steps (Windows 10)
- Connect the graphics card to the R43SG, the R43SG to the JEYI LEIDIAN, the power cable from the R43SG to the graphics card, and the Dell DA-2 to the R43SG and a wall outlet
- Connect graphics card display output(s) to external monitor(s)
- Download and install latest NVIDIA graphics drivers (456.38 at time of writing) and restart
- Connect JEYI LEIDIAN to laptop TB3 Port
- Turn off link state power management
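Once the drivers are installed and the enclosure is connected, one quick sanity check is the `nvidia-smi` utility that ships with the NVIDIA driver (on most installs it is added to the PATH; this is a hedged suggestion on my part, not a required step):

```shell
:: Run from Command Prompt or PowerShell after installing the NVIDIA driver.
:: nvidia-smi is bundled with the driver package.
nvidia-smi
:: If the eGPU is detected and working, the output should list the
:: GTX 1660 Super along with the installed driver version.
```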
Avoiding error 12
The system recognized the PCIe connection and graphics card in Thunderbolt Control Center and Device Manager, but Device Manager returned Code 12 and the eGPU would not function. I attempted to replace pci.sys with the version from Windows 10 1903 V1 (18362.30), but this caused a boot loop upon restart.
Error 12 was resolved by plugging the eGPU into the TB3 port closer to the laptop display panel rather than the port farther from the display, port 2 rather than 3. I'm still not sure why one port would work over another identical port, but this seems to be the case.
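For anyone troubleshooting the same thing: rather than clicking through Device Manager, PowerShell's `Get-PnpDevice` can show which display adapter is reporting a problem code (a sketch, assuming a Windows 10 build where the `Problem` property is exposed on PnP device objects):

```powershell
# List display adapters with their status and PnP problem code.
# Code 12 corresponds to "This device cannot find enough free resources".
Get-PnpDevice -Class Display |
    Select-Object FriendlyName, Status, Problem
```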
After resolving error 12, the eGPU was able to drive both external monitors. However, during light usage the system froze every 5-10 minutes; the screen would lock up, accompanied by a harsh buzzing if audio was playing at the time, and a hard reboot was the only way out.
The freezes were resolved by changing the Link State Power Management (LSPM) setting in power options. By default, LSPM puts the PCIe link into a low-power state when no data is being transferred, which can disrupt the connection to a TB3 device. Here are the steps I took to change that setting:
- Since PCI Express did not at first appear in my power options (Control Panel -> Hardware and Sound -> Power Options -> Change plan settings -> Change advanced power settings), I followed the instructions under "Option Two" on this site to add the option to toggle the setting.
- I navigated to the setting and changed both the "On battery" and "Plugged in" values under PCI Express -> Link State Power Management from "Maximum power savings" to "Off"
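The steps above can also be done from an elevated Command Prompt with `powercfg`; the `SUB_PCIEXPRESS` and `ASPM` aliases below are the standard ones listed by `powercfg /aliases` (value 0 = Off, 2 = Maximum power savings). I did it through the GUI, so treat this as an untested equivalent:

```shell
:: Unhide "Link State Power Management" in advanced power options
powercfg -attributes SUB_PCIEXPRESS ASPM -ATTRIB_HIDE
:: Set LSPM to Off (0) for plugged-in (AC) and on-battery (DC) on the active plan
powercfg /setacvalueindex SCHEME_CURRENT SUB_PCIEXPRESS ASPM 0
powercfg /setdcvalueindex SCHEME_CURRENT SUB_PCIEXPRESS ASPM 0
:: Re-apply the active scheme so the change takes effect
powercfg /setactive SCHEME_CURRENT
```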
I have not experienced crashes or freezes since altering that setting, and my experience has been plug-and-play in that hot-plug and cold-plug both work perfectly. One new issue is that headphones are not recognized unless cold-plugged.
All iGPU and eGPU benchmarks were run on the U2518D external monitor.
| Benchmark | iGPU (score / avg fps) | eGPU (score / avg fps) |
| --- | --- | --- |
| Heaven Ultra 1080p | 116 / 4.6 fps | 1825 / 72.4 fps |
| Valley High 1080p | 549 / 13.1 fps | 4407 / 105.3 fps |
| Superposition High 1080p | 590 / 4.42 fps | 8311 / 62.17 fps |
The eGPU has made huge improvements in my experience. The extra graphics horsepower offered by the GPU allows me to work with video and 3D rendering much more efficiently and have a better overall experience with my system as a whole. Big thanks to itsage for giving me the inspiration and the confidence to carry through with this build.
I'm from India and I really need help with building an eGPU for my Dell XPS 13 7390. I am trying to find a way to create a DIY eGPU of my own but I cannot figure it out. My concern is: if I buy an HDMI to Type-C cable for this installation, will that work? And why do I need an extra M.2 NVMe adapter?
I mean, I could have a dock (with an HDMI port on it) for my GPU and then connect it directly via an HDMI to Type-C cable and power supply; won't that work?
Please help me with the process and the information, I would be very grateful to you.
@kanha_agrawal, do you mean that you will use an HDMI to Type-C cable to connect your GPU to a Type-C monitor? If so, I don't see any reason why that wouldn't work, since the cable and the monitor are built to handle that connection properly regardless of how they connect to your system. Using a dock (presumably a TB3 dock with an HDMI output) to connect an eGPU is not ideal, because the extra ports on the dock and the longer cable/controller chain can eat up bandwidth and hurt performance. I believe that using the HDMI output on a TB3 dock, whether or not that dock is connected to an eGPU, will use the laptop's integrated graphics to extend the display; for that reason, you'll want to use the outputs directly on the graphics card.
You don't need an extra M.2 NVMe adapter. I used one with the M.2-output R43SG, instead of simply using the TB3-output R43SG-TB3 or another TB3 eGPU enclosure, just to save some money and to allow attaching an M.2 SSD if I ever need to.
I hope this helps. Please feel free to ask any more questions that may come up.