 
Splitting one of the pcie 8 pin cables on the Core X Chroma? (evga ftw3 3090 + razer core x chroma)  


birrbles
(@birrbles)
Active Member
Joined: 6 months ago
 

A question with many loaded assumptions

 

I have an EVGA FTW3 Ultra 3090. The Lite-On PSU in the Core X Chroma explicitly supports up to 500W to the GPU (the PSU itself being 700W). However, it only has two PCIe 8-pin cables on what appears to be a single rail (I could be misreading the label on the PSU). Without swapping the PSU for the considerably-sold-out SF750, how safe is it to split one of those 8-pins into two 8-pins with an 8-pin (NOT a 6-pin) splitter using 18-gauge copper? And if so, which two of the three 8-pin inputs on the EVGA 3090 have the least power draw? Or is this not known?

 

There are a LOT of assumptions going into my post. Basically, I'm trying to find the safest way to split one of the PCIe 8-pin lines into two, e.g. by using the split line to feed the two least power-hungry inputs on the card. I'm willing to chance it since the PSU itself is rated for 500W to the card, but obviously this isn't ideal.
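For anyone who wants to check the math I'm relying on, here's a rough sketch. The 12V rail, the three current-carrying 12V conductors per 8-pin, and the ~10A continuous rating for 18 AWG are my assumptions, not anything from the PSU datasheet:

```python
# Rough per-wire current check for splitting one PCIe 8-pin cable.
# Assumptions (mine, not from any datasheet): 12 V rail, 3 current-carrying
# 12 V conductors per 8-pin, and ~10 A continuous rating for 18 AWG wire.

RAIL_VOLTAGE = 12.0
CONDUCTORS_PER_8PIN = 3
AWG18_MAX_AMPS = 10.0

def amps_per_conductor(watts: float) -> float:
    """Current each 12 V conductor carries for a given total draw on the cable."""
    return watts / RAIL_VOLTAGE / CONDUCTORS_PER_8PIN

# PCIe 8-pin spec is 150 W per cable; a split cable could see roughly double,
# and the PSU label suggests one line might push up to 288 W.
for load in (150, 288, 300):
    a = amps_per_conductor(load)
    verdict = "OK" if a <= AWG18_MAX_AMPS else "over"
    print(f"{load:>3} W -> {a:.1f} A per wire ({verdict} vs ~{AWG18_MAX_AMPS:.0f} A for 18 AWG)")
```

Under those assumptions even 300W on one cable stays under 10A per conductor, which is why I'm willing to chance the splitter at all.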

 

(I wasn't expecting this to happen, by the way. I originally had a 2080 Super that I bought mid-August along with another for a sibling. When the 3000 series launched, only the 3090s had an SLI equivalent, so we paid the step-up upgrade cost to 3090s in order to have the option of handing one of the cards down when one of us upgrades. But alas, here I am, with a card that technically works with the Core X Chroma but is missing an entire power output from the PSU.)

 

---

 

My current area of research is cabling docs via GPU mining resources. I'll edit anything else I find into here, but at least initially, it seems the PSU might support outputting an absolute max of 288 watts through one of the lines and under 150 through the remaining line. If the power inputs are still GPU, GPU, memory, then I might just feed GPU and memory with the split line and GPU with the remaining one. I just don't know which inputs are which on the card.

This topic was modified 6 months ago

To do: Create my signature with system and expected eGPU configuration information to give context to my posts. I have no builds.


itsage
(@itsage)
Founder Admin
Joined: 4 years ago
 

@birrbles, I've seen success reports from members using a Y-adapter to power a GPU with three PCIe power plugs. The Razer Core X Chroma stock PSU is multi-rail and should be able to handle the RTX 3090.

 

LinkedIn | Twitter | Facebook | Youtube | Instagram
 
external graphics card builds
best laptops for external GPU
eGPU enclosure buyer's guide

 
2020 13" MacBook Pro [10th,4C,G] + RX 6700 XT @ 32Gbps-TB3 (CM MasterCase EG200) + Win10 1903 [build link]  


birrbles
(@birrbles)
Active Member
Joined: 6 months ago
 

This was helpful, thanks.

The EVGA 3090 FTW3 has three 8 pin inputs side by side, so I made the following assumptions:

  • If I plug the split line into the first and third inputs and the unsplit line into the second input, I've got very good odds of feeding memory with the split line. That should cut power consumption enough that I'm not feeding just the GPU core with the split line and potentially overloading the wiring. But I'm not an electrician, and I could be making some very stupid assumptions.
  • I decided against allowing the Core X Chroma to power my laptop; I'm drawing power directly from the adapter instead. This should save 100W load on the PSU.

 

So far as I can tell, it works, but I haven't tested it with load yet.


birrbles
(@birrbles)
Active Member
Joined: 6 months ago
 

Quick updates:

  • Splitting one of the 8-pins into the 1st and 3rd GPU 8-pin inputs seems to have done the trick. Of the three, the 2nd input seems on average to consume the most power. Between Minecraft Bedrock with RTX, Doom Eternal, and Death Stranding, all at maximum settings, Minecraft induced the greatest power draw.
  • Maximums for inputs across all the games I tested:
    • 8pin #1: 90.3W (Minecraft)
    • 8pin #2: 98.5W (Minecraft)
    • 8pin #3: 83.8W (Minecraft)
  • Without knowing any better, it seems my guess was right that one of the edge 8-pins (either 1 or 3) is for memory. It appears to be #3, and its power consumption aligned rather closely with MVDDC power draw (86.1W).
  • Now that I'm reasonably sure power draw is still within safe levels (though a bit beyond spec: 174W vs. the 150W spec for the line I'm splitting), I may at some point test drawing power from the Core X Chroma to power the laptop. But I'm not set on it, and I'm likely not testing it until I'm appropriately equipped to safely monitor my setup.
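To sanity-check the 174W figure, here's the arithmetic behind it. The 12V rail and three-conductors-per-cable assumptions are mine, as is the mapping of inputs 1 and 3 onto the split cable:

```python
# Max measured draw per 8-pin input (Minecraft RTX, from the numbers above).
measured = {"8pin_1": 90.3, "8pin_2": 98.5, "8pin_3": 83.8}

# In my wiring, inputs 1 and 3 share the split cable; input 2 has its own.
split_line_watts = measured["8pin_1"] + measured["8pin_3"]
print(f"split line: {split_line_watts:.1f} W vs 150 W per-cable spec")
# -> 174.1 W, about 16% beyond the 150 W spec for that single upstream cable

# Per-conductor current on the split cable, assuming a 12 V rail and
# 3 current-carrying conductors per cable:
amps = split_line_watts / 12.0 / 3
print(f"~{amps:.1f} A per wire on the split cable")
```

So each conductor on the split cable is carrying under 5A at peak, which is comfortably below what 18 AWG is typically rated for, even though the cable as a whole exceeds the 150W spec.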

Your mileage may vary, but you should be able to reliably run a 3090 on the Razer Core X Chroma, provided you use an appropriately gauged 8-pin to 2x 8-pin splitter and feed the 1st and 3rd inputs on the card with the line you're splitting (for the EVGA cards; I can't speak for other 3090s, but I'd be surprised if they differed, since it would be strange for a board to power the GPU core from the 1st and 3rd inputs). If you shop for a splitter, do not use a 6-pin to 2x 8-pin splitter. These tend to be the first result when you Google, and you'd be putting more strain on fewer wires.

Also, knowing what I know, I would not try to run a 3090 with the Razer Core X non-Chroma given that the non-Chroma has a weaker PSU.

 

Hope this helps. It seems to run just fine for me; I may do updates every few months just to check in for safety.

This post was modified 6 months ago


diego
(@diego)
Active Member
Joined: 2 years ago
 

Hey @birrbles, just checking in on how your build has been from a stability perspective. Has it been running stable all these months? I'm considering putting a FTW3 3090 in my Razer Core X with a splitter. Thanks!

2017 15" MacBook Pro (RP560) [7th,4C,H] + RTX 2070 @ 32Gbps-TB3 (Razer Core X) + Win10 1803 [build link]  

Jeffrey Matheson
(@jeffrey_matheson)
Active Member
Joined: 4 weeks ago
 

@birrbles, Any chance of following up? How’s it running? I just got an EVGA 3080 FTW3 Ultra and it seems to be working fine (haven’t been able to test it with a higher res monitor yet), except that all the fans kick into max when I shut the computer down and I’m wondering if power is playing some sort of part in the issue.

Did you experience anything like that?

 


birrbles
(@birrbles)
Active Member
Joined: 6 months ago
 

@jeffrey_matheson, @diego, sorry for the belated reply.

 

I've had a few challenges with it, but I can't tell whether they're driver-related or hardware-related. Notably, once in a blue moon the video driver will crash and be dynamically reloaded (the screens freeze, go black, and everything comes back within 10 seconds or so) on Windows 10. This has never happened to me in the midst of gameplay, though it did once happen while Minecraft was minimized, and Minecraft crashed thereafter. I'd say it happens once every week or two.

I'd say it's stable enough. There's the aforementioned wonkiness once in a while, but I can't tell if that's an artifact of my configuration or of the fact that my laptop has a Quadro alongside the external 3090. I did have some stability issues with the case closed, but none of it was thermal and I could never figure out what was causing it. So I've been running it with the case open just to keep an eye on it, and in that time I haven't had any issues beyond the ones above. The closed-case stability issues included hard laptop hangs at least once a week, though no artifacting or other indications of actual graphics performance problems.

You absolutely must not run hard loads on this for seriously extended periods of time (mining) unless you're willing to risk melting the splitter or worse. I'm using it exclusively for gaming and video loads rather than mining, and it's treated me fine.

 

tldr: works fine with the case open. for some reason, didn't work as fine with the case closed. No hours-long 100% loads (well, you can, but I didn't test that). See attached for how I wired the splitter from the PSU into the GPU. Your mileage may vary.

 

 


birrbles
(@birrbles)
Active Member
Joined: 6 months ago
 

@jeffrey_matheson, Also, yes to the fans ramping up. Seems the card just assumes the worst with cooling unless it's connected to an active machine. Makes sense. I just unplug it after power-down and plug it back in before power-up.

 


olyvers26
(@olyvers26)
Active Member
Joined: 2 years ago
 

@birrbles, what is the exact splitter you bought? There are so many variants on Amazon; I don't understand the difference, or whether one like this will work:

https://www.amazon.co.uk/dp/B08Z8FXR1H

or 

https://www.amazon.co.uk/dp/B08KP9R92V

and what does this mean: *This is a GPU 8 pin to GPU 8 pin splitter cable (GPU to GPU), not a CPU 8 pin to 2 X PCIe 8 cable (CPU to GPU); both look very similar.*

which one will work?

@jeffrey_matheson

How is your card working? Any difficulties or artefacts? Have you tested with some load on the card?

I have the Razer Core X and an offer for a 3080 Suprim (which is very similar in PCB and power connectors to the FTW3 Ultra), and I wonder if it would work.

Thank you for your answers, and all in this great egpu community.

 

 

 

 

Dell G3 + AORUS GTX 1080 Gaming Box, Lenovo Thinkpad X230 + EXP GDC 8.5C + GTX 460


olyvers26
(@olyvers26)
Active Member
Joined: 2 years ago
 

@Jeffrey Matheson

Is it this variant of 3080 that you have?

Did you manage to make it work in a Razer Core X or Core X Chroma?

Thank you for your answer.

 

Dell G3 + AORUS GTX 1080 Gaming Box, Lenovo Thinkpad X230 + EXP GDC 8.5C + GTX 460

