- Lenovo ThinkPad P52
- Intel Core i7-8850H (6C, 12T, 2.6GHz)
- Intel UHD Graphics 630 (Integrated)
- NVIDIA Quadro P3200
- 64 GB DDR4 @ 2667 MHz
- Boot: Samsung 970 Pro 512GB
- Data: Samsung 860 QVO 1TB
- Windows 10 Pro Version 20H2 (Build 19042.546)
If you want the Quadro card and the GeForce card working together, follow these instructions. If your eGPU is also a Quadro card, it should be plug and play, requiring only drivers and software for the enclosure. These instructions should work for anyone wanting to run a system with a Quadro dGPU and a GeForce eGPU. In theory it should also work the other way around, but your mileage may vary.
- Download and install the NVIDIA Studio drivers on the laptop.
- This step can likely be skipped, but better safe than sorry.
- If using a Quadro eGPU with a laptop that has a Quadro GPU, this step should be skipped.
- If using a GeForce eGPU with a laptop that has a GeForce GPU, this step should be skipped.
- If using any eGPU on any laptop that has only an Intel iGPU, this step must be skipped.
- The Studio drivers from NVIDIA (which can be found when looking for regular GeForce drivers) contain the drivers for both the GeForce cards and the Quadro cards.
- GeForce cards are supported starting with the Pascal (10 Series) architecture and up.
- Quadro cards from 2012 (!) to the present are supported.
- Source: https://nvidia.custhelp.com/app/answers/detail/a_id/4931/~/nvidia-studio-faqs
- Plug in the eGPU and authorize connecting to the device with the Thunderbolt control software.
- If mixing Quadro and GeForce GPUs/eGPUs, the enclosure should be detected with a GPU present, but the drivers for the GPU won't be loaded (yet).
- Disable the Quadro dGPU in the device manager and reboot the computer.
- This forces the computer to detect the new eGPU so the correct drivers can be installed for it.
- Install the NVIDIA Studio drivers for the GeForce eGPU.
- GeForce Experience serves no purpose here, so don't install it.
- Re-enable the Quadro dGPU in the device manager and reboot the computer.
- This will force the computer to load the Studio drivers for both cards and start both GPUs at the same time.
- Both GPUs should now load correct drivers and be visible.
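For those who prefer the command line, the Device Manager steps above can also be scripted from an elevated PowerShell prompt. This is only a sketch using the built-in PnpDevice cmdlets; the `*Quadro*` name filter is an assumption, so verify the matched device against your own adapter list before disabling anything.

```shell
# Sketch only: run from an elevated PowerShell prompt on Windows 10.
# Verify what the "*Quadro*" filter matches before disabling anything.

# List all display adapters so you can confirm what is detected.
Get-PnpDevice -Class Display | Format-Table FriendlyName, Status, InstanceId

# Grab the Quadro dGPU's instance ID (adjust the filter to your dGPU's name).
$dgpu = Get-PnpDevice -Class Display | Where-Object FriendlyName -like "*Quadro*"

# Disable the dGPU, then reboot and install the Studio driver for the eGPU.
Disable-PnpDevice -InstanceId $dgpu.InstanceId -Confirm:$false

# After the eGPU driver install, re-enable the dGPU and reboot once more.
Enable-PnpDevice -InstanceId $dgpu.InstanceId -Confirm:$false
```

This mirrors the manual procedure exactly; the reboots between the disable and enable steps are still required.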
None are included here, as this build really only verifies that a Quadro dGPU can run alongside a GeForce eGPU with both cards enabled.
This procedure allows the Quadro GPU and the GeForce GPU to run at the same time on the same system, and verifies that doing so is possible.
@malik_zenon, Better graphics switching, for one, as you don't need to disable the Quadro in Device Manager. If gaming, you can also dedicate the Quadro to PhysX to take some strain off the eGPU.
The main reason I went that route is so that I could just unplug the device and have the dGPU in the laptop still ready to go and not have to worry about making sure the device is disabled. The Studio Drivers make mixing a Quadro and GeForce GPU truly plug and play.
@joseph_hawker, Thank you for the guide!
If you use the two GPUs simultaneously, do you still use your internal display, or does everything run via external displays to avoid bottlenecks?
@vrtx_void, I do use the internal display paired with two external monitors connected to the eGPU; the primary monitor is one of the external displays. That said, the biggest motivations for finding this approach were:
1) not being able to drop money on a Quadro GPU for the graphics enclosure and
2) not being satisfied with having to disable the laptop's Quadro in order to use a GeForce eGPU.
I wanted a true plug and play solution (after dealing with drivers, of course), since I have a history of using the laptop's capable dGPU on the go (gaming and school once we're done with this pandemic). Plus, the external display connections on the P52 in particular are wired only to the dGPU. If you disable the dGPU, you cannot use an external display connected to the laptop alongside one connected to the eGPU (a niche use case, but still one to consider).
Hopefully this answer is to your satisfaction.
@joseph_hawker, Thank you for your answer.
I understand your motivation, but doesn't that significantly impact your performance?
I only ask because I am going to use my P53 alongside an external RTX 3080 for real-time rendering, but from what I read in other threads, the internal display would definitely result in bottlenecks.
@vrtx_void, The answer is: it depends on the workload. Here is what my experience has been (on multiple machines, I might add). Most of these experiences are gaming, which from a software workload perspective counts as real-time rendering. Aside from gaming, and with the exception of MATLAB (which offloads to the GPU), most of my workloads are CPU- and RAM-bound (Vivado Design Tools and Visual Studio/IntelliJ).
- In gaming, the bottleneck when using the internal display at 4K on a Razer Blade Stealth is around a 20-25% FPS loss with a GTX 1070. It is around 15% on my P52 with a 1080p internal display and a GTX 1080 Ti. This is because the rendered frames have to be fed back over the Thunderbolt bus alongside the data for the GPU to process.
- On an external display, the bottleneck in gaming becomes negligible, around a 5-10% loss at 4K on both the Razer Blade Stealth and the P52. The P52 fared better thanks to a much better CPU keeping the GPU fed. Again, this was with a GTX 1070 and a GTX 1080 Ti.
- In MATLAB, there was no bottleneck (although that program, while GPU-bound, isn't nearly as demanding): data was sent to the GPU, processed, and returned.
For what you're planning, you should see an increase in performance (likely a significant one) with the RTX 3080, even on the internal display, but keep a few things in mind.
- Thunderbolt 3 is equivalent to a PCIe 3.0 x4 connection. The NVIDIA GeForce RTX 3080 uses a PCIe 4.0 (!) x16 connection (although it doesn't saturate all of that bandwidth). PCIe 3.0 x4 is equivalent to PCIe 4.0 x2 in terms of bandwidth. This is why an external display matters so much for very demanding applications: the internal display has to use some of that PCIe/TB3 bandwidth to send frames back to the laptop. Higher resolutions and refresh rates take a larger performance hit. Keep that in mind.
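To put rough numbers on the bandwidth comparison above, here is a back-of-the-envelope calculation assuming the nominal per-lane line rates and 128b/130b encoding for PCIe 3.0/4.0 (real-world Thunderbolt 3 PCIe tunneling tops out somewhat lower than the raw link figure):

```python
def pcie_bandwidth_gbps(gen: int, lanes: int) -> float:
    """Approximate one-direction bandwidth in Gbit/s for a PCIe link."""
    line_rate = {3: 8.0, 4: 16.0}[gen]   # GT/s per lane; PCIe 4.0 doubles 3.0
    efficiency = 128 / 130               # 128b/130b encoding overhead
    return line_rate * efficiency * lanes

tb3_link = pcie_bandwidth_gbps(3, 4)     # Thunderbolt 3 ~= PCIe 3.0 x4
gen4_x2  = pcie_bandwidth_gbps(4, 2)     # same effective bandwidth
gen4_x16 = pcie_bandwidth_gbps(4, 16)    # what an RTX 3080 desktop slot offers

print(f"PCIe 3.0 x4 (TB3): {tb3_link:.1f} Gbit/s")
print(f"PCIe 4.0 x2:       {gen4_x2:.1f} Gbit/s")
print(f"PCIe 4.0 x16:      {gen4_x16:.1f} Gbit/s")
```

This confirms the x4 Gen3 = x2 Gen4 equivalence stated above: the eGPU sees roughly one-eighth of the slot bandwidth an RTX 3080 gets in a desktop, which is why frames sent back to the internal display eat into a budget that is already tight.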
- The other issue is having enough CPU power to prepare data for the GPU to process. Since even modern desktop Ryzen and Threadripper chips (and their Intel equivalents) have a hard time keeping a 3080 fed, so to speak, you can expect a CPU bottleneck as well, especially if you decide to render on both GPUs using the Studio Drivers (even though there are enough PCIe lanes for both the dGPU and the eGPU).
- All applications running on an external monitor connected to the eGPU will run exclusively on that GPU. There are no exceptions to this (from what I can tell in my workloads).
All in all, this is roughly what you can expect based on my experience. Even with a performance hit compared to an equivalent desktop with the same GPU installed natively, the eGPU still comes out as a net performance gain.
TL;DR: Expect around a 20-25% FPS drop for any real-time-rendering application (like gaming) on the internal display, and a negligible drop on a display connected to the eGPU.
Hi, any chance you tried connecting the eGPU via a PCIe x4 NVMe port?
ThinkPad P53 [9th,6C,H] + 1060 6GB @ 32Gbps-TB3 (ADT-LINK R43SG + JEYI TB3 Enclosure) + Win10 20H2