Total War: Three Kingdoms benchmarks
Never thought this day would arrive: an AAA title released for Mac only a few days after the Windows release!
I love the Total War series, and it's rock solid and stable (albeit slightly unoptimised on the Mac).
Playing it on a 2018 MBP 15" (i7 2.6/32GB/1TB), and it was surprisingly playable. Some benches from battle_benchmark:
|Resolution @ Preset|Average FPS|
|---|---|
|3440x1440 @ Ultra (Full Screen)|20.0|
|3440x1440 @ High (Full Screen)|30.6|
|3440x1440 @ Medium (Full Screen)|53.7|
|2560x1080 @ Ultra (Windowed)|23.7|
|2560x1080 @ High (Windowed)|39.6|
|2560x1080 @ Medium (Windowed)|72.0|
I think I'll be playing at 3440x1440 High or 2560x1080 High, full screen in both cases. At 3440x1440 High, switching from TAA to FXAA increased the average fps by about 10% (3 fps).
So far I think Sega and Feral Interactive did a great job on the Mac port! I believe the fps would be higher in Windows, but heck, if I just want to play for half an hour in macOS, it's possible!
If any of you have tips on optimising the fps in macOS, or would like to share benchmarks, please do, thanks!
Feral are by far the best developer for Mac games IMO. I finally started playing the 2013 Tomb Raider and it's fantastic: full controller support, and it even runs on my iGPU. Fantastic game too.
2017 13" MacBook Pro Touch Bar
GTX1060 + AKiTiO Thunder3 + Win10
GTX1070 + Sonnet Breakaway Box + Win10
GTX1070 + Razer Core V1 + Win10
Vega 56 + Razer Core V1 + macOS + Win10
Vega 56 + Mantiz Venus + macOS + W10
We (my wife and I, we only play together) played Tomb Raider 2013 all the way through. It's a really good game. Unfortunately Rise of the Tomb Raider is a little repetitive (not much of a change compared to TR 2013), so we switched to Far Cry 5, which has too much dumb shooting, so we switched to Far Cry 4, which starts out fine, but then it's again too much dumb shooting.
We will give Hitman 2 a try next. I would like to play Resident Evil 2, but my wife doesn't like zombies.
I will be purchasing my first eGPU setup next week; I ordered the Razer Core X. I am now debating between the Vega 56 and 64, and whether the 64 is worth the money. I have been a big Total War fan since Shogun in 2001, so I am really looking forward to playing it on my MacBook Pro 2016 15"/i7 2.6GHz/16GB/512SSD/Radeon Pro 460. I have a 34" LG 3440x1440 monitor. My question is whether you think the Vega 56 will be powerful enough to play Three Kingdoms at the full 3440x1440 on High settings.
Also, did you need to install any scripts from this forum for Three Kingdoms to use the eGPU? Or is it just a matter of enabling the setting in the Get Info panel in Finder?
@nevrozel Hey, I was going to reply earlier but wanted to run some benchmarks for you first. I think buying the Vega 64 is a great choice: people usually baby their components, so used or new doesn't really matter much, and that V64 pricing is a no-brainer.
|Resolution @ Settings|Radeon VII + Mantiz Venus|Vega 56 (V64 BIOS) + Razer Core X Chroma|
|---|---|---|
|3440x1440 @ Ultra|20.0|17.3|
|3440x1440 @ High|30.6|25.1|
|3440x1440 @ Medium|53.7|43.0|
Have fun, and good luck wiping out the yellow turbies!
Edit: the average fps in TW:3K under Boot Camp is also very similar to the macOS port, which means the port is quite optimised (the game does take a significant hit over the eGPU connection, though).
Thanks for the benchmarks! I am now just waiting for the Razer Core X to arrive. It's interesting that Boot Camp did not show a considerable improvement; it usually does. This speaks to the quality of the port imho, which is great.
Do you have any tips on how to improve the performance of the Vega 64 that I got? Is there a way to update its BIOS, or update some macOS drivers? I think the macOS drivers are baked in and a BIOS update will only work from Windows, but you seem more knowledgeable.
What are you referring to when you say "tuned Vega 56"? Is it just BIOS related, or are there other tweaks?
The Gigabyte Vega 64 is one of the standard-clocked Vega 64s (1630/945). In macOS we can only do one thing: flash the BIOS of a higher-clocked version (flashing the BIOS can only be done in Windows, but the benefits are reaped in macOS too).
When the card arrives, don't do anything yet; just run some benchmarks, look at the temps, and see if everything is okay. That way you will at least have a baseline. Be sure the BIOS switch is not in the low-power position.
Once you have your baseline reference, go to techpowerup.com and grab, say, a Sapphire BIOS clocked at 1750/945. Make a backup of the original BIOS, flash, then bench again to see how much better it performs and whether the cooling can take the heat.
If you intend to keep the Vega 64 for a while, consider getting a Morpheus II, or even invest in a water-cooled setup. Or just stick with it and enjoy the performance it provides. It's a good card for 3440x1440 @ 50 to 60fps in most games on high settings. And if you are like me, the games we play (strategy, etc.) don't really need high fps, haha.
In Boot Camp you can tweak all sorts of things that you can't in macOS. One of the key tweaks is undervolting (Windows only, not macOS), which can easily net a good 10% performance increase.
The card I got is exactly this one, going by the QR code on the box. They say it's an OC edition at 1560/945 MHz. I usually play RTS games on PC and FPS on my PlayStation, so it should work all right. About the undervolting: if we do it in Windows, will it stick in macOS?
Off-topic: how do you mention someone on this forum, like @nevrozel? I can't seem to find the option.
I don't often buy AAA gaming titles close to release (or at all, to be honest), so I am not often useful for benchmarks on the latest games, but I love the Total War games, and Three Kingdoms has rave reviews, so I just had to buy it. As a result, I have some Three Kingdoms benchmarks from the Windows side of things, comparing my old Asus GTX980Ti Strix and a shiny new Palit Game Rock GTX1080Ti, which I just picked up for $470 to replace it. I am not overclocking the cards beyond their factory settings. The Strix is a massively overclocked card with a 20% core overclock over a regular GTX980Ti, while the Game Rock runs at nearly the same clocks as the reference design, so the difference between the two is smaller than between two reference cards of each type.
battle_benchmark - All of these are on external monitors, full-screen, with a second monitor active in Windows. The same driver (430.97 - the latest one available) is used for both cards:
|Asus GTX980Ti Strix|Ultra Avg (Min)|High Avg (Min)|Medium Avg (Min)|Low Avg (Min)|
|---|---|---|---|---|
|1920x1080|40 (31)|53 (39)|85 (59)|147 (96)|
|2560x1440|30 (24)|38 (30)|61 (48)|111 (78)|
|3840x2160|17 (14)|21 (18)|34 (29)|64 (49)|
|5120x2880|14 (12)|21 (17)|25 (22)|40 (35)|
1080p is playable at High (Ultra is doable at 30fps).
1440p is playable at Medium (High is doable at 30 fps).
4K is barely bearable at Medium.
5K looks like crap and isn't worth it.
|Palit GTX1080Ti Game Rock|Ultra Avg (Min)|High Avg (Min)|Medium Avg (Min)|Low Avg (Min)|
|---|---|---|---|---|
|1920x1080|54 (40)|71 (46)|103 (64)|165 (101)|
|2560x1440|42 (33)|54 (40)|86 (61)|147 (96)|
|3840x2160|25 (21)|31 (26)|52 (42)|93 (70)|
|5120x2880|16 (13)|21 (17)|33 (28)|61 (49)|
1080p is playable at High. Ultra is doable with FPS dips.
1440p runs decently on High and very well on Medium.
4K is doable on Medium. Low is too ugly to be worth it. This is a pretty good setting, as Medium looks good and 4K on a 27" monitor looks very crisp.
5K is doable at ~30 FPS on Medium. Not worth it, to be honest.
There are three interesting things here:
1) The High preset behaves weirdly, giving the same result at 4K and 5K on the GTX980Ti, and an identical 5K result on both cards, which no other preset does.
2) The best GPU scaling is on the medium and high presets at 2K and 4K. Ultra settings scale worse and I suspect that a lot of the extra eye-candy (probably things like unit sizes) is either CPU bound, or uses extra CPU computation, which means that the TB3 latency has a greater effect. With all due respect to my i7-7820HK, it isn't as fast as a desktop 8700K or 8600K, which is what most of the desktop benchmarks are done with.
3) The minimum frame rates in the benchmarks are nearly universally achieved at the end of the scene when a lot of foliage is on screen. In other words, if you're not staring at trees, but rather at your units, your FPS is likely going to be higher than this.
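Observation (2) can be cross-checked with a short Python sketch that computes the GTX1080Ti-over-GTX980Ti speedup per preset and resolution from the two tables above. One assumption: I'm reading the four Avg columns as Ultra, High, Medium, Low, left to right, which matches the playability notes in the post.

```python
# Cross-check of observation (2): GTX1080Ti-over-GTX980Ti speedup per
# preset, computed from the two battle_benchmark tables above.
# Assumption: the four Avg columns are Ultra, High, Medium, Low,
# left to right (this matches the playability notes in the post).

gtx980ti = {
    "1920x1080": (40, 53, 85, 147),
    "2560x1440": (30, 38, 61, 111),
    "3840x2160": (17, 21, 34, 64),
    "5120x2880": (14, 21, 25, 40),
}
gtx1080ti = {
    "1920x1080": (54, 71, 103, 165),
    "2560x1440": (42, 54, 86, 147),
    "3840x2160": (25, 31, 52, 93),
    "5120x2880": (16, 21, 33, 61),
}

presets = ("Ultra", "High", "Medium", "Low")
for res in gtx980ti:
    ratios = (b / a for a, b in zip(gtx980ti[res], gtx1080ti[res]))
    row = "  ".join(f"{p} {r:.2f}x" for p, r in zip(presets, ratios))
    print(f"{res}: {row}")  # e.g. 2560x1440: Ultra 1.40x  High 1.42x ...
```

Running this shows the biggest per-preset gains clustering at 1440p/4K on the middle presets, and noticeably weaker scaling at 1080p Low, where the CPU (and TB3 overhead) rather than the GPU is the limit.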
Let's compare to desktop results. Surprisingly, there isn't a lot of good data I could find on GTX1080Ti performance, but we can extrapolate from a GTX1080. The GTX1080Ti is ~30% faster than the GTX1080 on average, so let's see how I fared. This is on the High preset (desktop GTX1080 vs my eGPU GTX1080Ti):
1080p: 91 (71) vs 71 (46)
1440p: 56 (48) vs 54 (40)
4K: 27 (23) vs 31 (26)
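The extrapolation above can be sanity-checked in a few lines of Python. Note that the 1.3x factor and the desktop GTX1080 averages are the figures quoted in this post, not independently measured values:

```python
# Project a desktop GTX1080Ti from the desktop GTX1080 High-preset
# averages (assumed ~30% faster), then compute my eGPU's deficit
# against that projection.

SPEEDUP = 1.30  # assumed GTX1080Ti-over-GTX1080 factor

# resolution: (desktop GTX1080 avg fps, my eGPU GTX1080Ti avg fps)
high_preset = {
    "1080p": (91, 71),
    "1440p": (56, 54),
    "4K":    (27, 31),
}

for res, (desktop_1080, egpu_1080ti) in high_preset.items():
    projected = desktop_1080 * SPEEDUP       # projected desktop 1080Ti avg
    deficit = (1 - egpu_1080ti / projected) * 100
    print(f"{res}: projected desktop ~{projected:.0f} fps, "
          f"eGPU {egpu_1080ti} fps, deficit ~{deficit:.0f}%")
```

Under these assumptions the deficit shrinks from roughly 40% at 1080p to roughly a quarter at 1440p and just over 10% at 4K, which is consistent with the takeaways that follow.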
Likewise, here the GTX1080Ti results are missing, and the same is true for the GTX1080 (why they wouldn't test two of the most popular cards out there is beyond me), but a GTX1070Ti makes an appearance, and it is very close to a GTX1080 in performance, so we can use it instead:
At 1080p, the 1070Ti result is 64fps on average on Ultra (no minimum specified) vs my 54.
At 1440p, they have 55fps on average on High (again, no minimum specified) vs my 54. On Ultra, the performance is all but identical to mine, with 42 average FPS and a minimum around 37-ish; I have 42 (33).
At 4K on medium, the 1070Ti has 43 FPS vs my 52. At high the frame rate is 26fps (identical to the GTX1080 result above) vs my 31.
1) At 1080p, a desktop with a GTX1080/1070Ti outperforms my GTX1080Ti eGPU setup by a lot, posting minimum frame rates that are equal to my averages. Getting a high-end card for a 1080p eGPU is a fool's errand, but that's nothing we weren't already aware of.
2) At 1440p on High, I am suffering about a 30% performance hit compared to the desktop, with a bigger hit on the minimum frame rates, as my card becomes essentially equivalent to a GTX1080 or so.
3) At 4K on Medium, my card pulls ahead closer to its expected performance advantage, being ~20% faster.
4) At 4K on High the GTX1080Ti pulls ahead in both average and minimum frame-rates, and the performance hit compared to the desktop is about 15%, but the performance of both cards is poor.
5) On Low graphics the game looks like crap. Medium looks pretty good. The gains from Ultra over High are very hard to spot and are probably not worth it.
It is hard to say how much of this is down to the TB3 connection and how much to the weaker CPU compared to the faster desktop benchmarks, since the Total War games are CPU intensive both on the campaign map and in the battle view, but the eGPU fares better on the Medium setting and at higher resolutions. Medium at 4K is a particular sweet spot, and gives a hint about what I should focus on when tweaking settings.
There is also a helpful article about the performance impact of various graphics settings over here. Spoilers: shadows and AA reduce frame rates a lot, and unit sizes are also an important consideration. The impact of settings on eGPU performance may differ from a desktop and is worth looking into, so:
I plan to compare the effect of the various settings on the performance of my setup, so we can tell the eGPU crowd which settings to avoid. I will use the GTX1080Ti and try to narrow down the effect of each setting between Medium and the higher presets (Low is not worth my time). Stay tuned!
As usual: I buy a game, and then spend more time benchmarking it than playing it, but I can't help it: FOR SCIENCE!
"Always listen to experts. They'll tell you what can't be done, and why. Then do it."- Robert A. Heinlein, "Time Enough for Love."