2019 17" Lenovo Legion Y740-17ICH (RTX2070) [8th,6C,H] + RTX 3080 @ 32Gbps-TB3 (Razer Core X Chroma) + Win10
Laptop in use: Lenovo Legion Y740-17ICH
CPU: Intel i7-8750H (6 cores, 12 threads; turbo mostly hits 3.99 GHz)
dGPU: NVIDIA RTX 2070 Max-Q
iGPU: Unknown; it's disabled by default on this machine due to the built-in G-SYNC display.
Memory: 32GB DDR4-2666
Storage: 512GB NVMe SSD + 2TB SATA SSD
OS: Windows 10 Pro x64 20H2 (Build 19042)
3 monitor setup (extended):
- LG 34" Gaming Curved 3440 x 1440 21:9 IPS (34GK950F-B) 144Hz Freesync
- Dell 24" P2418D 2560x1440 60Hz
- Dell 21" P2211H (old, dying 1080p monitor)
eGPU: Gigabyte GeForce RTX 3080 EAGLE 10G
1. Connected the Core X Chroma to the laptop over TB3, with the GPU already installed in the enclosure
2. That's it
Benchmarks (External Monitor)
During all tests, the CPU seemed to run as it always has: clocks were no higher or lower, sitting around 3.9GHz under full load.
After testing all of the games listed below, these are the minimums and maximums I reached:
Game performance differences (3440x1440)
I did not change any in-game settings, so the differences are directly comparable.
I got about 20-40 more FPS (50-70 total) compared to what I had previously, with DLSS enabled at Quality mode.
Grand Theft Auto V
During the in-game benchmark run I saw a 10-50 FPS increase (avg around 30) depending on the scene.
Previously I managed around 30 FPS on Medium-High settings; now I can run High at 50-60 FPS, or Ultra at 40-50.
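To put those GTA V numbers into perspective, the relative gain is easy to compute (a throwaway sketch; the FPS figures are just the rough values quoted above):

```python
def uplift_pct(old_fps: float, new_fps: float) -> float:
    """Percentage FPS gain going from the old setup to the new one."""
    return (new_fps - old_fps) / old_fps * 100

# Rough GTA V numbers from above: ~30 FPS before, 50-60 on High now.
print(round(uplift_pct(30, 50)))  # low end of the High-settings range -> 67
print(round(uplift_pct(30, 60)))  # high end of the range -> 100
```

So even the low end of the High-settings range is roughly a two-thirds improvement over the laptop's internal RTX 2070 Max-Q.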
Options: Resolution: 3440 x 1440; DirectX: DirectX 12; Quality: Ultra; Texture filtering: AF 16X; Motion Blur: Normal; Tessellation: Off; Advanced PhysX: On; Ray Trace: High; DLSS: On; Hairworks: On; Shading Rate: 100;
- Average Framerate (99th percentile): 39.17
- Max. Framerate (99th percentile): 57.54
- Min. Framerate (99th percentile): 23.55
Options: Resolution: 3440 x 1440; DirectX: DirectX 12; Quality: High; Texture filtering: AF 16X; Motion Blur: Low; Tessellation: Off; Advanced PhysX: On; Ray Trace: High; DLSS: On; Hairworks: On; Shading Rate: 100;
- Average Framerate (99th percentile): 45.35
- Max. Framerate (99th percentile): 64.87
- Min. Framerate (99th percentile): 26.91
Options: Resolution: 3440 x 1440; DirectX: DirectX 12; Quality: High; Texture filtering: AF 16X; Motion Blur: Low; Tessellation: Off; Advanced PhysX: On; Ray Trace: Off; DLSS: Off; Hairworks: On; Shading Rate: 100;
- Average Framerate (99th percentile): 52.04
- Max. Framerate (99th percentile): 82.67
- Min. Framerate (99th percentile): 26.46
With the original hardware I had to run around Medium (RTX off) if I wanted to get 40-ish FPS.
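For anyone curious how figures like the percentile framerates above are typically derived, here is a rough sketch computing them from a per-frame frametime log. The data is synthetic and the nearest-rank percentile method is an assumption on my part; the benchmark tool's exact method may differ:

```python
def fps_stats(frametimes_ms):
    """Average FPS plus 1st/99th-percentile FPS from per-frame times in ms."""
    fps = sorted(1000.0 / t for t in frametimes_ms)
    def pct(p):  # nearest-rank percentile over the sorted FPS values
        return fps[min(len(fps) - 1, int(p / 100 * len(fps)))]
    avg = sum(fps) / len(fps)
    return avg, pct(1), pct(99)

# Synthetic log: 98 frames at 20 ms (50 FPS), one 40 ms stutter, one 10 ms frame.
times = [20.0] * 98 + [40.0, 10.0]
avg, low, high = fps_stats(times)  # avg 50.25, p1 50.0, p99 100.0
```

The point of the percentile cut is that a single stutter frame (the 40 ms one here) gets trimmed out of the reported minimum instead of dominating it.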
Everything related to NVIDIA AI does not seem to be working (DLSS, NVIDIA Broadcast voice).
When benchmarking Cyberpunk, I noticed I cannot enable DLSS. After exiting the game I also noticed that NVIDIA Broadcast throws an error that it's unable to start microphone noise removal:
This is extremely odd, as both of my GPUs are capable of it. Even disconnecting the eGPU no longer helps. Just in case, I used DDU to uninstall all drivers and clean-installed the latest one again.
I seemed to have the same issue as mentioned here: https://egpu.io/forums/pc-setup/microstuttering-when-running-on-external-monitor/ , however simply disabling and re-enabling the dGPU in Device Manager seems to fix it.
Reading most of the posts here, I expected to run into more troubleshooting and more issues. The Thunderbolt cable bundled with the Core is too short, so I have a new cable ordered from Amazon (none of our local resellers sold a 2m 40Gbps cable; all were 20Gbps).
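For context on why the cable rating matters: a TB3 link is nominally 40Gbps, of which roughly 32Gbps is available for PCIe traffic to the eGPU (the figure in this build's title), and a 20Gbps cable halves the link. A back-of-the-envelope conversion to GB/s, ignoring protocol overhead beyond that PCIe allocation:

```python
def gbps_to_gb_per_s(gbps: float) -> float:
    """Convert gigabits per second to gigabytes per second (8 bits/byte)."""
    return gbps / 8

pcie_on_full_link = gbps_to_gb_per_s(32)  # ~4.0 GB/s for the GPU on a 40Gbps link
half_rate_cable   = gbps_to_gb_per_s(20)  # 2.5 GB/s total link on a 20Gbps cable
```

Either way this is well below the ~15.75 GB/s of a desktop PCIe 3.0 x16 slot, which is part of why eGPU results trail a desktop with the same card.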
I think for the price I paid (around 900€ for the RTX 3080 and 400€ for the Core X Chroma), and given that in some cases I got double the performance or even more, this investment was worth it. For me it was clearly a mostly plug-and-play experience.
Update on the NVIDIA AI (DLSS and RTX Voice issue)
Thanks to this post I found out that disabling my dGPU via Device Manager, at least in Cyberpunk's case, seems to fix DLSS being disabled. This may actually be a Cyberpunk-specific issue when you have more than one GPU. Downloading Metro Exodus to run more tests (that's the only other DLSS-capable game I have).
//EDIT2: Yes, it seems to be a Cyberpunk issue. Metro Exodus allows DLSS to be enabled normally (with both GPUs enabled in Device Manager).
Sadly this fix does not seem to bypass the NVIDIA Broadcast microphone issue. This may also be completely unrelated to the eGPU; looking around the NVIDIA forums, it appears to be a more widespread issue. I also opened a support thread on the NVIDIA forums about it: https://www.nvidia.com/en-us/geforce/forums/broadcasting/18/426476/unable-to-start-microphone-noise-removal-error-aft/
I was hoping this could be bypassed by downgrading back to NVIDIA RTX Voice, which is now enabled for all GTX and RTX cards, but it turns out it doesn't play nice with 3000-series cards.
//EDIT: However, thanks to this I found that downgrading to version 10.0.0.25 of NVIDIA Broadcast fixed the issue.
As the shipped TB3 cable is way too short for me, I tried ordering a 2m cable off Amazon (sadly none of our local sellers stock TB3 cables that support 40Gbps; the few that do exist are 20Gbps versions). I ordered this Maxonar cable: https://www.amazon.de/gp/product/B08HN22G8P/
The cable arrived and felt quite high quality. However, once I connected it I noticed reduced picture quality on my main ultrawide monitor (especially noticeable with red text on a black background, e.g. in my consoles, where the text was extremely blurry), and the screen even occasionally flickers to black.
Reverting back to original cable, all of the issues disappear.
Has anybody stumbled upon this issue (maybe even with this same cable)? Was I just unlucky with a faulty cable, or is it genuinely poorer quality? Does anybody have recommendations for 2m cables (preferably available on amazon.de), or is 2m simply too long for TB3 and I should look at 1m cables at most?
Found this one from my local resellers, trying this next: https://i-tec.pro/en/produkt/tb3cbl150cm-2/
I just bought a Core X, and I am also thinking of buying a 3080, but I have a problem. There are two 8-pin connectors on the eGPU, but all of the 3080 graphics cards I looked at have 24 pins. I wonder what solution you found. Did you use a converter cable?