New XG Mobile eGPU from Asus (64Gbps - x8 3.0)
 

AquaeAtrae
(@aquaeatrae)
Eminent Member
Joined: 4 years ago
 

Asus ROG Flow X13 (Ryzen 9 5900HS / 5980HS, GTX 1650) + XG Mobile (RTX 3070 / RTX 3080 mobile)

I've carried the Intel Thunderbolt torch for years, but its restrictive bandwidth issues persist with little hope in sight. Alienware's PCIe-based Graphics Amplifier will likely be retired soon. But I'm growing very curious about Asus' new ultra-portable gaming solution, which is also PCIe-based and combined with an (independently usable) USB-C connection. It's exactly what I've been wanting, aside from the lack of Thunderbolt and Intel's other minor perks.

I've only used Intel for decades, and the promise of Thunderbolt has kept a firm grip. But it doesn't seem like Thunderbolt 4 or future versions will ever give eGPUs room to flourish. After growing similarly disenchanted with poor SLI scaling, I'm seriously considering jumping ship and giving this Asus option a try. The key problem, of course, is the proprietary port and the dependency on Asus to give us an RTX 4080, etc. in the future. Theoretically, we could hack the cable and build our own PCIe enclosures, perhaps.
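
For a rough sense of the bandwidth gap, here's the back-of-the-envelope math behind the "64Gbps" in the title (my own figures, not Asus'): PCIe 3.0 runs at 8 GT/s per lane with 128b/130b encoding, so x8 lands at roughly 63 Gbps, while Thunderbolt 3 only allocates around 22 Gbps of its 40 Gbps link to PCIe traffic.

# Back-of-the-envelope link comparison (theoretical maxima, not measured
# throughput; the ~22 Gbps TB3 PCIe figure is the commonly cited cap).
def pcie_gbps(lanes, gtps=8.0, encoding=128 / 130):
    """One-direction bandwidth of a PCIe link in Gbps."""
    return lanes * gtps * encoding

xg_mobile = pcie_gbps(8)   # PCIe 3.0 x8, as used by the XG Mobile
tb3_pcie = 22.0            # approx. usable PCIe data over Thunderbolt 3

print(f"XG Mobile (x8 3.0): {xg_mobile:.1f} Gbps")  # ~63.0 Gbps
print(f"TB3 PCIe tunnel:    {tb3_pcie:.1f} Gbps")
print(f"Ratio:              ~{xg_mobile / tb3_pcie:.1f}x")  # ~2.9x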

Thoughts? Benchmarks?


Asus ROG Marketing Hype & Video

U.S. Store Page (on sale at $2,999) - coming soon
Model: ASUS ROG GV301QH-DS96 (basic edition with 5900HS, 120Hz WUXGA, RTX 3080)
Part Number: 90NR06C1-M05750

I'm really hoping NotebookCheck.net gets to review it soon, but I'd be interested in any detailed reviews you can share.

 

PS: It's worth noting that the included RTX 3080 is a custom-built mobile variant running at 150 watts. 

Dave2D showed a bit of the internals in his video here.
Asus Flow X13 Review - AMD + RTX 3080!

This post was modified 1 month ago

To do: Create my signature with system and expected eGPU configuration information to give context to my posts. I have no builds.

.

itsage liked
AquaeAtrae
(@aquaeatrae)
Eminent Member
Joined: 4 years ago
 
Posted by: @itsage

One more crucial piece of information is no hot-plug support (similar to AGA and M.2).

To be clear, it can be plugged / unplugged without powering down. One of the video demos I saw showed a pop-up prompt and a delay of about 5 seconds. We can't just rip it out, but a few seconds isn't bad.

Also thanks for finding and merging my post here. I searched here for "Flow X13" but only saw some spammer. 😉

I'm very encouraged to learn about OCuLink2! If we can confirm it's not actually proprietary, that probably seals the deal for me. I just want to know there's a flexible future, one way or another.

Now, I wonder whether they'll later offer Intel variants of the Flow X13 where that USB-C port also carries Thunderbolt 4 to a future I/O hub. I'm just dreaming now.

This post was modified 1 month ago

To do: Create my signature with system and expected eGPU configuration information to give context to my posts. I have no builds.

.

itsage and odin liked
odin
(@odin)
Estimable Member
Joined: 3 years ago
 
Posted by: @aquaeatrae
Posted by: @itsage

One more crucial piece of information is no hot-plug support (similar to AGA and M.2).

To be clear, it can be plugged / unplugged without powering down. One of the video demos I saw showed a pop-up prompt and a delay of about 5 seconds. We can't just rip it out, but a few seconds isn't bad.

Also thanks for finding and merging my post here. I searched here for "Flow X13" but only saw some spammer. 😉

I'm very encouraged to learn about OCuLink2! If we can confirm it's not actually proprietary, that probably seals the deal for me. I just want to know there's a flexible future, one way or another.

Now, I wonder whether they'll later offer Intel variants of the Flow X13 where that USB-C port also carries Thunderbolt 4 to a future I/O hub. I'm just dreaming now.

I saw it being hot-plugged and recognized as well, but I forgot to come back here and say so. Thanks for that!

LG Gram 17 | Sonnet Breakaway Box 550 | Asus Strix RTX 2070 OC Edition | Win 10 Pro 20H2 + Fedora 32 Dual Boot
Build Link

 
2018 17" LG Gram 17 [8th,4C,U] + RTX 2070 @ 32Gbps-TB3 (Sonnet Breakaway 550) + Win10 [build link]  


thecoolestname36
(@thecoolestname36)
Active Member
Joined: 11 months ago
 

I was so excited for this product when I first heard about it in early January. I'm still happy it exists, but at this point I'm pulling my finger back from the trigger as the details come out. I've been a huge eGPU guy since I ordered my first PCB for an ExpressCard hacked-together Nvidia 460 eGPU in 2010. I gave a Thunderbolt 3 + 2080 Ti eGPU a spin on my old Razer Blade 15 Base with only bottlenecks to show for it. My dream configuration is a laptop with a 15" screen, upgradable RAM, 2x M.2 slots, and an x8/x16 eGPU connection that allows for GPU upgradability.

The Flow X13 had my purchase, despite the non-upgradable GPU, until I found out that the system uses PCIe 3.0, the RAM is soldered on, and the eGPU has only one DisplayPort.

I'm still on the fence about the product at this point, which sucks after being so hyped about it for 3 weeks straight. What will really make me decide is whether the port can be adapted or used in a custom eGPU configuration with an actual desktop GPU, hacky or not... If that happens, I'll buy.

To do: Create my signature with system and expected eGPU configuration information to give context to my posts. I have no builds.

.

AquaeAtrae
(@aquaeatrae)
Eminent Member
Joined: 4 years ago
 

@thecoolestname36, I understand the sentiment. Although, to your concerns: my understanding is that PCIe 3.0 won't perform much slower than 4.0 here. SSD speeds are probably where it's most noticeable, but just as with higher-RPM SATA drives, those higher speeds would drain the battery faster, so there's some sense in the CPU using PCIe 3.0.

As for the DisplayPort, you should still be able to connect at least one more DisplayPort monitor via the USB-C ports. When using the eGPU, those frame buffers should be sent back over PCIe just as for the internal screen, no different from USB-C-connected monitors on a desktop (see the rough numbers below).

I'm less confident that a third DisplayPort monitor could be attached via the USB-A 3.2 Gen 1 hub, since USB-A doesn't carry DisplayPort Alt Mode. So I think HDMI 2.0 may be your only option there.

Am I mistaken? 
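
To put rough numbers on the assumptions above (mine, not Asus' specs, so happy to be corrected): PCIe 4.0 simply doubles the per-lane rate over 3.0, and even uncompressed frame-buffer traffic back to the internal 120Hz WUXGA panel would eat only a modest slice of the x8 3.0 link.

# Rough numbers behind the two claims above (assumptions, not specs).
ENC = 128 / 130                 # PCIe 3.0 and 4.0 use 128b/130b encoding

gen3_x8 = 8 * 8.0 * ENC         # lanes * GT/s * encoding ≈ 63 Gbps
gen4_x8 = 8 * 16.0 * ENC        # PCIe 4.0 doubles the per-lane rate

# Worst case for eGPU-to-internal-display traffic: uncompressed 4-byte
# pixels for the 1920x1200 panel at 120 Hz, sent back over the link.
frame_gbps = 1920 * 1200 * 4 * 120 * 8 / 1e9  # ≈ 8.8 Gbps

print(f"PCIe 3.0 x8: {gen3_x8:.0f} Gbps | PCIe 4.0 x8: {gen4_x8:.0f} Gbps")
print(f"Frame return at 1200p120: ~{frame_gbps:.1f} Gbps "
      f"(~{frame_gbps / gen3_x8:.0%} of the x8 3.0 link)")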

 

To do: Create my signature with system and expected eGPU configuration information to give context to my posts. I have no builds.

.

odin
(@odin)
Estimable Member
Joined: 3 years ago
 

The thing that's killing it for me is the lack of a second NVMe slot; one isn't enough. Hopefully they make a 16" laptop with room for two slots and include the x8 port on it, but I'm not going to hold my breath. It seems like a niche product within what is already the niche of eGPU usage itself, but if it sells decently, maybe they'll expand the product line or put the port on more laptops in general.

I'm thinking that with this particular product, I'd still be better served by a better laptop plus TB3 and its drawbacks. If I could find something powerful that was actually quiet (I don't think it's possible, given physics...), I would probably ditch eGPU altogether, but I gather anything with a decent CPU and an RTX 3080 Laptop GPU is going to be loud.

LG Gram 17 | Sonnet Breakaway Box 550 | Asus Strix RTX 2070 OC Edition | Win 10 Pro 20H2 + Fedora 32 Dual Boot
Build Link

 
2018 17" LG Gram 17 [8th,4C,U] + RTX 2070 @ 32Gbps-TB3 (Sonnet Breakaway 550) + Win10 [build link]  


AquaeAtrae
(@aquaeatrae)
Eminent Member
Joined: 4 years ago
 

@odin, it's not just that the X13 is limited to a single NVMe slot, but also that the M.2 slot is the smaller 2230 size. I shopped third-party upgrades, and the highest capacity the X13 could currently be upgraded to would be a single 2TB drive (doubling the stock 1TB). [I may have misread that 2TB option. It looks like it's the larger 2242 size and wouldn't fit. 🙁 ]

The physics of cooling noise vs. power vs. size are unavoidable. Improvements in chip manufacturing processes are the only real solution, and those generally arrive gradually, every decade or so. The only alternative might be game-streaming services, which have gotten better but still require good internet service.

 

This post was modified 1 month ago

To do: Create my signature with system and expected eGPU configuration information to give context to my posts. I have no builds.

.

odin liked
musschrott
(@musschrott)
Active Member
Joined: 1 month ago
 

The single, short M.2 slot is also the #1 point that makes me doubt buying it. I haven't even found a 2TB drive. AFAIK there is a single 2TB drive smaller than 2280, but it's a 2242 from Sabrent. I wonder if, using an M.2 extender cable, you could place a 2280 somewhere else, but most likely it would be too thick.

To do: Create my signature with system and expected eGPU configuration information to give context to my posts. I have no builds.

.

odin
(@odin)
Estimable Member
Joined: 3 years ago
 
Posted by: @aquaeatrae

@odin, it's not just that the X13 is limited to a single NVMe slot, but also that the M.2 slot is the smaller 2230 size. I shopped third-party upgrades, and the highest capacity the X13 could currently be upgraded to would be a single 2TB drive (doubling the stock 1TB).

The physics of cooling noise vs. power vs. size are unavoidable. Improvements in chip manufacturing processes are the only real solution, and those generally arrive gradually, every decade or so. The only alternative might be game-streaming services, which have gotten better but still require good internet service.

 

I've had 6-6.5 lb laptops whose larger fans were "quieter" because they could spin a little more slowly. But yep, as with a lot of optimization in software performance or hardware, there are three aspects and you can realistically pick two. I do like that fan designs keep iterating on perceptible dBA, but it's just realism that a thinner notebook with 150-200W of components in it isn't going to be quiet under load.

That's the thing I really like about my Gram + eGPU setup: I can barely hear it unless it's a quiet room; it's just that CPU and GPU performance are greatly sacrificed for that. It's starting to show its weaknesses with today's modern games, which just need more CPU power to be fluently playable. CP2077 is the first time it's been downright unacceptable. We'll see if further patches can optimize some of that away, but I'm thinking we're at a turning point where 4c/8t at 15W finally isn't enough anymore, and it's time to start thinking about what to change in my setup. At least I'll have about 6-9 months to think about it, since everything is going to be hard to obtain for about that long. 🙂

Posted by: @musschrott

The single, short M.2 slot is also the #1 point that makes me doubt buying it. I haven't even found a 2TB drive. AFAIK there is a single 2TB drive smaller than 2280, but it's a 2242 from Sabrent. I wonder if, using an M.2 extender cable, you could place a 2280 somewhere else, but most likely it would be too thick.

IMO it's not even close to worth the trouble unless you're hell-bent on that x8 port. The other disadvantage is that you're stuck with the 3080 Laptop SKU in that eGPU; without complicated mods (if it's even possible), you can't use a full-size enclosure with an actual desktop GPU. It's basically a snapshot in time, with zero flexibility as an eGPU solution. I'm really hoping they expand the product line.

 

 

LG Gram 17 | Sonnet Breakaway Box 550 | Asus Strix RTX 2070 OC Edition | Win 10 Pro 20H2 + Fedora 32 Dual Boot
Build Link

 
2018 17" LG Gram 17 [8th,4C,U] + RTX 2070 @ 32Gbps-TB3 (Sonnet Breakaway 550) + Win10 [build link]  


musschrott
(@musschrott)
Active Member
Joined: 1 month ago
 
Posted by: @odin  

That's the thing I really like about my Gram + eGPU setup: I can barely hear it unless it's a quiet room; it's just that CPU and GPU performance are greatly sacrificed for that.

[...]

IMO it's not even close to worth the trouble unless you're hell-bent on that x8 port. The other disadvantage is that you're stuck with the 3080 Laptop SKU in that eGPU; without complicated mods (if it's even possible), you can't use a full-size enclosure with an actual desktop GPU. It's basically a snapshot in time, with zero flexibility as an eGPU solution. I'm really hoping they expand the product line.

 

I don't mind the noise too much; if in doubt, I'd rather not sacrifice CPU performance when I'm working or gaming. And CPU performance is something this notebook promises to be pretty good at. That said, I saw a report earlier that seems to indicate some odd-looking throttling is going on with the X13: https://www.kitguru.net/lifestyle/mobile/laptops/luke-hill/ryzen-9-5980hs-hands-on-asus-rog-flow-x13-performance-and-throttling/

Well, it's not only the x8 link; it's also the only AMD notebook with eGPU capability (and no, I'm not counting M.2 as a real option here). I also don't find 'being stuck with a 3080m' a real issue. The eGPU part isn't about flexible upgrading, but about having more GPU power and a docking station when needed. By the time a replacement is due, both notebook and GPU are likely due for an upgrade anyway. Besides, how much of a future would any existing TB3/TB4 eGPU really have? I could also see myself travelling with the X13's eGPU, but certainly not with a traditional one. 🙂
If I don't end up with the X13, I'd most likely get some 15-inch notebook with a dGPU, which is larger than I'd like but, as far as I can tell, the best balance for what I'm looking for.

 

 

To do: Create my signature with system and expected eGPU configuration information to give context to my posts. I have no builds.

.

odin liked