
2018 15" Lenovo ThinkPad P52 (Q P3200) [8th,6C,H] + GTX 1080 Ti @ 32Gbps-TB3 (Razer Core) + Win10  


Joseph Hawker
(@joseph_hawker)
Active Member
Joined: 1 year ago
 

System Specs

eGPU Hardware

Installation Steps

If you want the Quadro card and the GeForce card working together, follow these instructions. If your eGPU is also a Quadro card, it should be plug and play, requiring only drivers and the software for the enclosure. These instructions should work for anyone wanting to run their system with a Quadro dGPU and a GeForce eGPU. In theory, it should also work the other way around, but your mileage may vary.

  1. Download and install the NVIDIA Studio drivers on the laptop.
    • This step can likely be skipped, but better safe than sorry.
    • If using a Quadro eGPU with a laptop that has a Quadro GPU, this step should be skipped.
    • If using a GeForce eGPU with a laptop that has a GeForce GPU, this step should be skipped.
    • If using any eGPU with a laptop that has only an Intel iGPU, this step must be skipped.
    • The Studio drivers from NVIDIA (found in the same place as the regular GeForce drivers) include the drivers for both GeForce and Quadro cards.
  2. Plug in the eGPU and authorize connecting to the device with the Thunderbolt control software.
    • If mixing Quadro and GeForce GPUs/eGPUs, the enclosure should be detected with a GPU present, but the drivers for the GPU won't be loaded (yet).
  3. Disable the Quadro dGPU in the device manager and reboot the computer.
    • This forces the computer to load the new eGPU so the correct drivers can be installed.
  4. Install the NVIDIA studio drivers for the GeForce eGPU.
    • GeForce Experience serves no purpose in this setup, so skip installing it.
  5. Re-enable the Quadro dGPU in the device manager and reboot the computer.
    • This will force the computer to load the Studio drivers for both cards and start both GPUs at the same time.
  6. Both GPUs should now load the correct drivers and be visible in Device Manager.

Benchmarks

None included here, as this build only verifies that a Quadro dGPU mixed with a GeForce eGPU works with both cards enabled.

Comments

This procedure allows running both the Quadro GPU and the GeForce GPU at the same time, on the same system, and confirms that doing so is possible.

Thinkpad P52 (P3200) + 1080 Ti, both GPUs working

 
2018 15" Lenovo ThinkPad P52 (Q P3200) [8th,6C,H] + GTX 1080 Ti @ 32Gbps-TB3 (Razer Core) + Win10 [build link]  


Malik Zenon
(@malik_zenon)
New Member
Joined: 12 months ago
 

Could you explain the point of having both the Quadro GPU and the GeForce GPU active at the same time?

To do: Create my signature with system and expected eGPU configuration information to give context to my posts. I have no builds.


Joseph Hawker
(@joseph_hawker)
Active Member
Joined: 1 year ago
 

@malik_zenon, Better graphics switching, for one, as you don't need to disable the Quadro in Device Manager. If gaming, you can also dedicate the Quadro to PhysX to take some strain off the eGPU.

The main reason I went that route is so that I could just unplug the device and have the dGPU in the laptop still ready to go and not have to worry about making sure the device is disabled. The Studio Drivers make mixing a Quadro and GeForce GPU truly plug and play.

 



vrtx_void
(@vrtx_void)
New Member
Joined: 9 months ago
 

@joseph_hawker, Thank you for the guide! 🙂

If you use the two GPUs simultaneously, do you still use your internal display, or does everything run via external displays to avoid bottlenecks?

 


Lenovo P53 Intel i7 9850H 9th Gen. / NVIDIA QUADRO RTX 3000 6GB dGPU
Razer Core X 650W / GIGABYTE GEFORCE RTX 3080 GAMING OC 10GB eGPU


Joseph Hawker
(@joseph_hawker)
Active Member
Joined: 1 year ago
 

@vrtx_void, I do use the internal display, paired with two external monitors connected to the eGPU; the primary monitor is one of the external displays. That said, the biggest motivations for finding this approach were:

1) not being able to drop money on a Quadro GPU for the graphics enclosure and

2) not being satisfied with having to disable the laptop's Quadro in order to use a GeForce eGPU.

I wanted a true plug-and-play solution (after dealing with drivers, of course), since I have a history of using the laptop's capable dGPU on the go (gaming and school, once we're done with this pandemic). Plus, the external display outputs on the P52 in particular are wired only to the dGPU. If you disable it, you cannot use an external display connected to the laptop alongside one connected to the eGPU (a niche use case, but still one to consider).

Hopefully this answer is to your satisfaction.

 



vrtx_void
(@vrtx_void)
New Member
Joined: 9 months ago
 

@joseph_hawker, Thank you for your answer.

I understand your motivation, but doesn't that significantly impact your performance?

 

I only ask because I am going to use my P53 alongside an external RTX 3080 for real-time rendering, but from what I've read in other threads, the internal display would definitely result in bottlenecks.

 



aurelius pontius
(@aurelius_pontius)
Active Member
Joined: 1 year ago
 

I am sure there is going to be a CPU bottleneck with two video cards!

2019 15" Clevo P775TM1-G [9th,8C] + RTX 2070 @ 32Gbps-M.2 (ADT-Link R43SG) + Win10 // custom enclosure [build link]  

Joseph Hawker
(@joseph_hawker)
Active Member
Joined: 1 year ago
 

@vrtx_void, The answer is that it depends on the workload. Here is what my experience has been (on multiple machines, I might add). Most of it is gaming, which from a software workload perspective counts as real-time rendering. Aside from gaming, and with the exception of MATLAB (which offloads to the GPU), most of my workloads are CPU and RAM bound (Vivado Design Tools and Visual Studio/IntelliJ).

  • In gaming, the bottleneck when using the internal display at 4K on a Razer Blade Stealth is around a 20-25% FPS loss with a GTX 1070; it is around 15% on my P52's 1080p internal display with a GTX 1080 Ti. This is because the video signal has to be fed back over the Thunderbolt bus alongside the data the GPU is processing.
  • On an external display, the bottleneck in gaming becomes negligible, around a 5-10% loss at 4K on both the Razer Blade Stealth and the P52. The P52 fared better because its much stronger CPU could keep the GPU fed. Again, this was with a GTX 1070 and a GTX 1080 Ti respectively.
  • In MATLAB, there was no bottleneck, as data was sent to the GPU for processing and then returned (although that program, while GPU bound, isn't nearly as demanding).
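The percentage losses above translate into a quick worked example; this is a minimal sketch in which the 100 fps baseline is a made-up illustration, not a measurement from this build:

```python
# Illustrative only: applies the quoted TB3 loss percentages to a
# hypothetical 100 fps desktop baseline (not a benchmark result).

def effective_fps(native_fps, loss_pct):
    """Frame rate after a given percent loss from eGPU overhead."""
    return native_fps * (100 - loss_pct) / 100

baseline = 100  # hypothetical fps with the same card in a desktop

# Internal laptop display: ~20-25% loss
print(effective_fps(baseline, 20), effective_fps(baseline, 25))  # 80.0 75.0

# External display wired to the eGPU: ~5-10% loss
print(effective_fps(baseline, 5), effective_fps(baseline, 10))   # 95.0 90.0
```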

For what you're planning, you should see an increase in performance (likely a significant one) with the RTX 3080, even on the internal display, but keep a few things in mind.

  • Thunderbolt 3 is equivalent to a PCIe 3.0 x4 connection. The NVIDIA GeForce RTX 3080 uses a PCIe 4.0 (!) x16 connection (although it doesn't saturate all of that bandwidth). PCIe 3.0 x4 is equivalent to PCIe 4.0 x2 in bandwidth. This is why an external display matters so much for very demanding applications: the internal display consumes some of that PCIe/TB3 bandwidth sending display data back to the laptop, and higher resolutions and refresh rates take a correspondingly bigger hit.
  • The other issue is having enough CPU power to prepare the data for the GPU. Since even modern desktop Ryzen and Threadripper chips (and their Intel equivalents) have a hard time keeping a 3080 fed, so to speak, you can also expect a CPU bottleneck, especially if you decide to render on both GPUs using the Studio drivers (despite having enough PCIe lanes for both the dGPU and the eGPU).
  • All applications running on an external monitor connected to the eGPU run exclusively on that GPU; I have found no exceptions in my workloads.
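The lane arithmetic above is easy to sanity-check. A minimal sketch using nominal per-lane rates and 128b/130b encoding (it ignores Thunderbolt protocol overhead, which reduces real-world throughput further):

```python
# Nominal one-direction PCIe bandwidth: GT/s per lane x lane count,
# minus the 128b/130b encoding overhead used by PCIe 3.0 and later.

def pcie_bandwidth_gbps(gt_per_s, lanes):
    """Usable bandwidth of a PCIe link in Gbit/s (one direction)."""
    return gt_per_s * lanes * 128 / 130

gen3_x4 = pcie_bandwidth_gbps(8.0, 4)     # what a TB3 eGPU link offers
gen4_x2 = pcie_bandwidth_gbps(16.0, 2)    # identical bandwidth, as stated
gen4_x16 = pcie_bandwidth_gbps(16.0, 16)  # the RTX 3080's native slot

print(round(gen3_x4, 1))          # 31.5 (Gbit/s, roughly the "32Gbps" figure)
print(gen3_x4 == gen4_x2)         # True
print(round(gen4_x16 / gen3_x4))  # 8
```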

All in all, this is roughly what you can expect based on my experience. Even with a performance hit compared to an equivalent desktop running the same card as a native internal GPU, you still come out with a net performance gain.

TL;DR: Expect around a 20-25% FPS drop for real-time rendering applications (like gaming) on the internal display, and a negligible drop on a display connected to the eGPU.

 




ronald_loulan
(@ronald_loulan)
New Member
Joined: 11 months ago
 

Hi, any chance you tried connecting the eGPU via a PCIe x4 NVMe port?

Thinkpad P53 [9th,6C,H] + 1060 6GB @ 32Gbps-TB3 (ADT-LINK R43SG + Jeyi TB3 Enclosure) + Win10 20H2

