Post results of your eGPU score in Heaven Benchmark
2. Run two benchmarks with the settings in the picture - one with an External Display (Monitor) and one with the Internal Display (if you can):
EXAMPLE 1: AKiTiO Thunder 2 + EVGA GTX 1060 6GB SC / MacBook Pro (Retina, 15-inch, Late 2013) - Thunderbolt 2 @16Gbps
External Display (Monitor):
EXAMPLE 2: AKiTiO Thunder 3 + EVGA GTX 1060 6GB SC / MacBook Pro (Retina, 13-inch, 2016) - Thunderbolt 3 @32Gbps
External Display (Monitor):
To run on the Internal Display there are 4 ways, depending on your laptop:
For Windows, if you don't have a dGPU - https://egpu.io/forums/implementation-guides/2013-15-mbpr-iris-only-gtx106016gbps-tb2-akitio-thunder2-win10-internal-display-optimus-enjoy
For Windows, if you have a dGPU - https://egpu.io/forums/mac-setup/how-to-keep-mbps-irisiris-pro-activated-when-booting-into-windows-boot-camp/#post-1458
For Windows, if you have a dGPU but can't activate Iris (iGPU) - https://egpu.io/forums/implementation-guides/2012-15-mbpr-gt650m-gtx106010gbps-tb1-akitio-thunder2-win10-enjoy/
ϟ AKiTiO Thunder2 + EVGA GTX 1060 6GB SC Gaming (macOS Sierra 10.12.4 and Windows 10)
MacBook Pro (Retina, 15-inch, Late 2013) 3.2GHz Quad Core Intel i7-4750HQ / 8 GB 1600 MHz DDR3 / 256GB SSD + 1TB
✪ mini eGPU ● PCI Express vs. Thunderbolt ● Mac CAN game ● Gaming Laptops vs. MacBook Pro with eGPU
AKiTiO Thunder 2 + Gigabyte GTX 1050 Ti / Lenovo T430s - Thunderbolt 1 @ 8Gbps
I cannot run 1080p on the internal display as it is a 1600x900 panel and unlike 3DMark benchmarks, Heaven reverts to 1600x900 if I select a higher resolution (3DMark renders at the selected resolution, then downscales to the display).
"Always listen to experts. They'll tell you what can't be done, and why. Then do it." - Robert A. Heinlein, "Time Enough for Love"
Akitio Node + GTX 1080Ti 11GB / MacBook Pro (Retina, 15-inch, Late 2016) – Thunderbolt 3 (Windows)
Windows NT 6.2 (build 9200) 64bit
Intel(R) Core(TM) i7-6920HQ CPU @ 2.90GHz (2903MHz) x4
Intel(R) HD Graphics 530 / NVIDIA GeForce GTX 1080 Ti (4095MB) x1
1920x1080 8xAA fullscreen
Here are some 4K benchmarks using a MacBook Pro 2.9GHz, Akitio Thunder3, and Zotac GTX1080 mini. I've placed links underneath to reviewers posting benchmarks from their desktop GTX1080 setups for comparison.
4K | Ultra | 8xAA | Tessellation: Extreme | Overclocked
Akitio Node | GTX 1070 | Mid-2012 Retina MacBook Pro 15" with 2.6 GHz processor | 27" Dell 1440p external monitor
Similar to how there is an implementation database and page on here, it would be great to have a benchmark database. It would need information on Windows vs. macOS, machine (e.g. which model), processor type/speed, Thunderbolt version, enclosure, video card, internal or external screen, and resolution. With that, it would be fascinating to compare results and see where the bottlenecks occur for different cards and different Thunderbolt versions. For instance, scanning through this thread, my 1070 on TB1 is performing almost as well as a 1070 on TB3 in a 2016 13" MBP (72 vs. 66 FPS), and almost as well as a 980 Ti in a TB2 MBP (72 vs. 66), in macOS at 1080p on an external monitor.
I think it would also be super helpful for people deciding which card to buy: it would help answer whether a 1070 is worth it in a TB1 computer compared to a 1060 (as far as I can tell, yes), or whether the extra power is wasted because of the bandwidth limitation. Also, some people will want to game on their internal screen and will want to know which card can do that at a minimum FPS; others will want 1440p or 4K, etc.
Not sure who would have to create the database and page, but if there's other interest I'd love to see it put together and we could have a simple data input sheet and way to attach screenshots as proof. What do you think?
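To make the idea concrete, here's a minimal Python sketch of what a database record and a simple comparison query could look like. All field names are made up for illustration, and the two sample entries just reuse the 72 vs. 66 FPS numbers mentioned above:

```python
from dataclasses import dataclass

@dataclass
class BenchmarkEntry:
    """One Heaven benchmark submission (field names are illustrative)."""
    os: str            # "Windows" or "macOS"
    machine: str       # e.g. "MBP 13\" 2016"
    tb_version: int    # Thunderbolt version: 1, 2, or 3
    enclosure: str
    gpu: str
    display: str       # "internal" or "external"
    resolution: str
    avg_fps: float

# Sample entries using the numbers quoted above (72 vs. 66 FPS).
entries = [
    BenchmarkEntry("macOS", "Mid-2012 rMBP 15\"", 1, "AKiTiO Node",
                   "GTX 1070", "external", "1920x1080", 72.0),
    BenchmarkEntry("macOS", "MBP 13\" 2016", 3, "AKiTiO Thunder3",
                   "GTX 1070", "external", "1920x1080", 66.0),
]

def compare_by_tb(entries, gpu):
    """Map Thunderbolt version -> average FPS for one GPU model."""
    return {e.tb_version: e.avg_fps for e in entries if e.gpu == gpu}

print(compare_by_tb(entries, "GTX 1070"))  # {1: 72.0, 3: 66.0}
```

Even a shared spreadsheet with these columns would make the TB1/TB2/TB3 bottleneck comparisons much easier to pull out.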
Unigine Heaven Benchmark 4.0
Windows 7 (build 7601, Service Pack 1) 64bit
Intel(R) Core(TM) i7-2630QM CPU @ 2.00GHz (1995MHz) x4
NVIDIA GeForce GTX 1060 3GB (3072MB) x1
1920x1080 8xAA fullscreen
I have some very interesting results to share. If you look at my activity, you'll see I have a 2015 iMac 4K (iGPU only, Iris Pro 6200 Graphics). The tl;dr summary is that I could not find a way to engage Nvidia Optimus: the eGPU does not see the internal iMac display. The only way I could find to engage the eGPU was to use a headless HDMI adapter and then use the Dual Display option in Windows Display Settings. There are some exceptions - some applications, such as Fortnite, will engage the eGPU without the HDMI adapter plugged in; others won't. The only 100% reliable way to use the eGPU is to run in Dual Display mode.
If you want to take a look at my hardware info, check out my setup on Userbenchmark.
Onto the results, then something very interesting at the bottom ;-).
Dual Display Mode (1080p) with Headless HDMI attached
Dual Display Mode (1080p) with Headless HDMI attached (also recording with the GeForce Experience overlay)
1080p eGPU Internal Display - Unplugged Headless HDMI adapter while application was running
Dual Display Mode (1080p) with Headless HDMI, retest - plugged the HDMI adapter back in while the application was still running
3D performance in Task Manager while the Heaven application is running - 1080p Dual Display Mode using the HDMI adapter
3D performance in Task Manager while the Heaven application is running - 1080p Internal Display only (unplugged the HDMI adapter while the application was open)
The most telling information is in the last two images: when running in Dual Display mode, the eGPU is at 100% while the iGPU is at 20%. When rendering without the HDMI adapter plugged into the eGPU, the eGPU shows 66% and the iGPU 14%. I assume this is why the Dual Display benchmark shows better performance than internal-display-only mode.
As to why this is happening, I have no idea! Does it have something to do with the Intel 3D graphics settings? Is there a way to get Nvidia Optimus working as designed on my system (doubtful)?
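For anyone who wants to log these utilization numbers outside Task Manager, nvidia-smi can report them from the command line on Windows too. Here's a small Python sketch that parses the CSV output of `nvidia-smi --query-gpu=utilization.gpu,memory.used --format=csv,noheader,nounits`; the sample line is made up for illustration, and actual values will depend on your system and driver:

```python
def parse_gpu_utilization(csv_line):
    """Parse one line of nvidia-smi CSV output into (gpu_percent, mem_mib).

    Expected input (with --format=csv,noheader,nounits): e.g. "100, 2048"
    """
    gpu_pct, mem_mib = (field.strip() for field in csv_line.split(","))
    return int(gpu_pct), int(mem_mib)

# Illustrative sample, roughly matching the 100% eGPU load seen above:
print(parse_gpu_utilization("100, 2048"))  # (100, 2048)

# To collect live data (requires an NVIDIA GPU and driver installed):
# import subprocess
# out = subprocess.check_output(
#     ["nvidia-smi", "--query-gpu=utilization.gpu,memory.used",
#      "--format=csv,noheader,nounits"], text=True)
# print(parse_gpu_utilization(out.strip()))
```

Logging this once per second during a Heaven run would show whether the eGPU utilization drop in internal-only mode is constant or happens in bursts.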
Edit to add CUDA-Z information: