Splitting one of the PCIe 8-pin cables on the Core X Chroma? (EVGA FTW3 3090 + Razer Core X Chroma)


birrbles
(@birrbles)
Active Member
Joined: 2 months ago
 

A question with many loaded assumptions

 

I have an EVGA FTW3 Ultra 3090. The Lite-On PSU in the Core X Chroma explicitly supports up to 500 W to the GPU (the PSU itself being 700 W). However, it only has two PCIe 8-pin cables, on what appears to be a single rail (this could be me misunderstanding the label on the PSU). Without swapping the PSU for the considerably-sold-out SF750, how safe is it to split one of those 8-pins into two 8-pins with an 8-pin (NOT a 6-pin) splitter using 18-gauge copper? And if so, which two of the three 8-pin inputs on the EVGA 3090 have the least power draw? Or is this not known?

 

There are a LOT of assumptions going into my post. Basically, I'm trying to find the safest way to split one of the PCIe 8-pin lines into two, e.g. by using the split line to feed the two least power-hungry inputs on the card. I'm willing to chance it since the PSU itself is rated for 500 W to the card, but obviously this isn't ideal.
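For my own sanity, here is the rough back-of-envelope math I'm working from: a PCIe 8-pin connector carries power on three 12 V conductors, so the real question is how many amps each conductor on the split cable would see. The wire rating below is my own conservative assumption for 18 AWG, not anything from Razer or Lite-On, and the load figure is purely illustrative.

```python
# Back-of-envelope check (my own assumptions, not from any datasheet):
# a PCIe 8-pin carries power on three 12 V conductors, and 18 AWG wire
# is often conservatively treated as good for roughly 7 A per conductor.

RAIL_VOLTAGE = 12.0    # V, nominal PCIe 12 V rail
LIVE_CONDUCTORS = 3    # 12 V wires in a PCIe 8-pin connector
WIRE_RATING_A = 7.0    # assumed conservative per-wire rating for 18 AWG

def amps_per_wire(watts_on_line: float) -> float:
    """Current through each 12 V conductor on the PSU side of the split."""
    return watts_on_line / RAIL_VOLTAGE / LIVE_CONDUCTORS

# Hypothetical example: the split cable feeding two card inputs at ~90 W each.
load_w = 2 * 90.0
print(f"{load_w:.0f} W -> {amps_per_wire(load_w):.1f} A per 12 V wire "
      f"(assumed limit ~{WIRE_RATING_A:.0f} A for 18 AWG)")
```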

 

(I wasn't expecting this to happen, by the way. I originally had a 2080 Super that I bought mid-August along with another for a sibling. When the 3000 series launched, only the 3090 had an SLI equivalent, so we paid the step-up upgrade cost to 3090s in order to have the option of handing one of the cards down when one of us upgrades. But alas, here I am, with a card that technically works with the Core X Chroma but is missing an entire power output from the PSU.)

 

---

 

My current area of research is cabling docs via GPU mining resources. I'll edit anything else I find into here, but at least initially, it seems the PSU might support outputting an absolute maximum of 288 W through one of the lines and under 150 W through the remaining line. If the power inputs are still GPU, GPU, memory, then I might just go the route of feeding GPU and memory with the split line and GPU with the remaining one. I just don't know which inputs are which on the card.
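To make that plan concrete, here's a tiny sketch of the allocation I'm weighing, using the per-line limits I think I'm reading out of those docs (288 W and 150 W) and made-up per-input draws, since I don't actually know the real numbers yet:

```python
# Sketch of the candidate allocation, using the line limits I *think* the
# Lite-On PSU has (288 W on one line, 150 W on the other). The per-input
# draws are guesses purely for illustration.

LINE_LIMITS_W = {"split line": 288, "single line": 150}

inputs_w = {"gpu_1": 110, "gpu_2": 110, "memory": 90}  # hypothetical draws

plan = {
    "split line":  inputs_w["gpu_1"] + inputs_w["memory"],  # two plugs on the split cable
    "single line": inputs_w["gpu_2"],                       # one plug on the unsplit cable
}

for line, load in plan.items():
    status = "ok" if load <= LINE_LIMITS_W[line] else "OVER LIMIT"
    print(f"{line}: {load} W of {LINE_LIMITS_W[line]} W -> {status}")
```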

itsage
(@itsage)
Illustrious Member Admin
Joined: 4 years ago
 

@birrbles, I've seen success reports from members using a Y-adapter to power a GPU with three PCIe power plugs. The Razer Core X Chroma stock PSU is multi-rail and should be able to handle the RTX 3090.

 


 
2020 14" MSI Prestige 14 EVO [11th,4C,G] + RTX 3080 @ 32Gbps-TB4 (AORUS Gaming Box) + Win10 2004 [build link]  


birrbles
(@birrbles)
Active Member
Joined: 2 months ago
 

This was helpful, thanks.

The EVGA 3090 FTW3 has three 8-pin inputs side by side, so I made the following assumptions:

  • If I plug the split line into the first and third inputs and the unsplit line into the second input, I've got very good odds of feeding memory with the split line, which should keep its load low enough that I'm not feeding two GPU-core inputs off one cable and potentially overloading the wiring. But I'm not an electrician, and I could be making some very stupid assumptions.
  • I decided against letting the Core X Chroma power my laptop; I'm drawing power directly from the laptop's own adapter instead. This should save 100 W of load on the PSU.

 

So far as I can tell, it works, but I haven't tested it under load yet.
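For whenever I do get around to a proper load test, here's a minimal sketch of one way to log whole-board power from the command line, assuming nvidia-smi is installed and on the PATH. It only reports total board draw; per-8-pin numbers would have to come from a tool that reads the card's rail sensors (HWiNFO, GPU-Z, etc.), so treat it as a coarse check only.

```python
# Coarse load-monitoring sketch: poll total board power via nvidia-smi
# while a game or benchmark runs. Assumes nvidia-smi is available; this
# does NOT show per-connector draw.

import subprocess
import time

def board_power_w() -> float:
    """Return the GPU's current total board power draw in watts."""
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=power.draw", "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    )
    return float(out.stdout.strip().splitlines()[0])

peak = 0.0
try:
    while True:
        peak = max(peak, board_power_w())
        print(f"peak board power so far: {peak:.1f} W", end="\r")
        time.sleep(1)
except KeyboardInterrupt:
    print(f"\npeak board power: {peak:.1f} W")
```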

birrbles
(@birrbles)
Active Member
Joined: 2 months ago
 

Quick updates:

  • Splitting one of the 8-pins into the 1st and 3rd GPU 8-pin inputs seems to have done the trick. Of the three, the 2nd input seems on average to consume the most power. Between Minecraft Bedrock with RTX, Doom Eternal, and Death Stranding, all at maximum settings, Minecraft induced the greatest power draw.
  • Maximums for the inputs across all the games I tested:
    • 8-pin #1: 90.3 W (Minecraft)
    • 8-pin #2: 98.5 W (Minecraft)
    • 8-pin #3: 83.8 W (Minecraft)
  • Without knowing any better, I guessed that one of the edge 8-pins (either #1 or #3) would be for memory, and that seems to have been right. It looks like it's #3, and its power consumption rather closely aligned with the MVDDC power draw (86.1 W).
  • Now that I'm reasonably sure power draw is still within safe levels (though a bit beyond spec: 174 W on the line I'm splitting versus its 150 W rating), I may at some point test drawing power from the Core X Chroma to power the laptop, but I'm not set on it, and I'm likely not testing it until I'm appropriately equipped to safely monitor my setup. (A quick arithmetic check of these numbers follows this list.)
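Here is that check, mostly to show where the 174 W figure comes from. The per-wire math assumes three 12 V conductors per 8-pin; whether roughly 4.8 A per wire is acceptable for a given 18-gauge splitter is my own judgment call, not a spec.

```python
# The split cable feeds inputs #1 and #3, so it carries their combined load.
# Assumes three 12 V conductors per 8-pin; peaks are the measured maximums above.

peaks_w = {1: 90.3, 2: 98.5, 3: 83.8}    # measured maximums per 8-pin input

split_line_w = peaks_w[1] + peaks_w[3]   # ~174.1 W vs the 150 W connector spec
amps_per_wire = split_line_w / 12.0 / 3  # three 12 V conductors per 8-pin

print(f"split line: {split_line_w:.1f} W, ~{amps_per_wire:.1f} A per 12 V wire")
```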

Your mileage may vary, but you should be able to reliably run a 3090 on the Razer Core X Chroma, provided you use an appropriately gauged 8-pin to 2x 8-pin splitter and plug the split line into the 1st and 3rd inputs on the card. (That's for the EVGA cards; I can't speak for other 3090s, but I'd be surprised if they were any different, since it would be odd for a board to feed the GPU core from the 1st and 3rd power inputs.) If you shop for a splitter, do not use a 6-pin to 2x 8-pin splitter. Those tend to be the first result when you Google, and you'd be putting more strain on fewer wires.

Also, knowing what I know, I would not try to run a 3090 with the Razer Core X non-Chroma given that the non-Chroma has a weaker PSU.

 

Hope this helps. It seems to run just fine for me; I may do updates every few months just to check in for safety.
