Light Gamers - do you still need eGPU after Geforce Now is released?

Eminent Member
Joined: 8 months ago

I started my eGPU journey last August and I'm pretty happy with it. I only have very limited time for gaming every day - 1-2 hours at most if I'm lucky. With the eGPU setup (used enclosure, with a used RX 580, just to try its capability), my MBP 15.4" mid-2014 can run the Tomb Raider trilogy at 1440p (high settings on TR 2013, high-medium on ROTTR, and medium on SOTTR), and I was finally able to finish these incredible games!

Then I moved on to an RX 5700, as my friend told me I had to try AC Odyssey, which is quite a demanding game. This is the very first time I've invested in a brand-new GPU (computer parts depreciate very fast, which is why I usually buy secondhand). I'm super happy and enjoying the game. Last night I just hit the 100-hour gameplay mark.

Last week GeForce Now was released to the public. I tried the free tier (one-hour gameplay sessions; you have to reconnect when the time is up) and it actually blew me away! AC Odyssey can be played at a solid 60 FPS at 1080p Max settings, which is impossible for my eGPU setup because of the bandwidth loss from TB3 to TB2, and the latency (for single-player games like this) is totally negligible as far as I'm concerned. I'm starting to wonder if I should sell my eGPU system to get $400 back...

Have you tried GFN? What are your thoughts?


New Member
Joined: 2 months ago

The lack of Rockstar products killed it for me. The whole reason I set up Boot Camp on my 2018 MacBook (just finished doing it) is to play GTA V, which I used to play on my old Windows desktop. It would have been nice to transfer my character from PC to XB1, but they don't allow it.


Eminent Member
Joined: 8 months ago

I went the "Founders" route to try GeForce Now at its best, and I will quit before the end of the free trial (I'm actually not using it anymore).


To give you context, I'm no hardcore gamer; I play something like 10 hours per week, and these days it's mostly FPS (BF, Apex, Destiny 2, Overwatch) and The Witcher (2, but 3 is waiting for me). What I want is for the gaming experience to be "involving", so when motion lacks fluidity or the picture gets distorted, I'm out.

Typically, my preferred settings are the screen's native resolution or "pure dividers" of it (1440p on a 5K screen, 1080p on a 4K screen) to avoid the blur that resampling adds. I'll then lower the settings to High or even Medium to get a solid 60 fps, and I'll prefer a nice 60 Hz screen with vibrant colors (and decent input lag and response time) over some washed-out 144 Hz screen.
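The "pure divider" idea is just integer scaling: pick a render resolution where each output pixel maps to an exact 2x2 (or 3x3, 4x4...) block of native pixels, so no fractional resampling is needed. A quick sketch of the arithmetic (the function name is mine, just for illustration):

```python
def pure_dividers(native_w, native_h, max_div=4):
    """List resolutions that divide the native panel resolution evenly,
    i.e. candidates for blur-free integer scaling."""
    return [(native_w // d, native_h // d)
            for d in range(1, max_div + 1)
            if native_w % d == 0 and native_h % d == 0]

# 5K iMac panel (5120x2880): divider 2 gives 1440p
print(pure_dividers(5120, 2880))
# 4K panel (3840x2160): divider 2 gives 1080p
print(pure_dividers(3840, 2160))
```

Note that 1080p is a clean divider of 4K but not of 5K, which is why the "right" streaming resolution depends on the panel you're playing on.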

I don't really care much about the screen size itself, but for FPS, a 15" MBPro screen is a little too small for spotting elements and other players onscreen. A 27" iMac screen is overkill, and it has to sit at the right distance from your eyes to take in the whole game scene.

Last but not least, I've got a very nice Internet connection (1 Gbps, 3 ms ping, 0.2 ms jitter) and I connect over Ethernet.


So the first time you open GeForce Now and the first frames appear, it looks nice.
Latency is fine, as it only adds around 30 ms (which is fully acceptable for non-pro players, let's be honest).
There can be some freezes, but they're so infrequent they can be mistaken for GPU frame stuttering.
The aspect ratio can match your screen's (e.g. 1920x1200 on a 15" MBPro).
However, compared to local gaming, the picture is greyed out, the contrast is heavily reduced (blacks turn into a dark green), there are artifacts around edges, and everything is blurred. These are effects of video compression and decompression, which are not visible on movies (today's video compressors have been designed for movies) but are on games.
For FPS, it's already far from perfect when you need to quickly detect tiny objects and players onscreen.
For The Witcher 3 (which I use as a benchmark), it's not acceptable at all. First, you cannot run 1080p Ultra and get 60 fps; you have to go down to High settings, and in some cases you'll only get 45 fps. I know this is a demanding game, but come on, GeForce Now is supposed to deliver the gaming experience of a "high-end PC". More importantly, the picture is so degraded that the game loses far too much. We play The Witcher for the graphics too; it's not the same experience when there is visible graphical noise around every character, or when everything blurs while you're riding your horse.


If you're playing, on a small screen, games that aren't involving "by their graphics", then maybe GeForce Now is a good deal. But then maybe Apple Arcade will do the job instead (especially on an iPad). If you are, like me, really into video gaming, then GeForce Now is still not at the right level... but neither are the other services, and frankly GeForce Now is the best of them imho, in terms of latency, stability, and picture quality.
The technology has to be pushed further; I'll wait a little bit longer.

2018 15" Macbook Pro i9 / Vega 20 + Razer Core X + GTX 1660 Ti + External QHD display + Win10
