2011 15" HP 6560b + GTX760@4Gbps-EC2 (EXP GDC 8.4d) + Win10 [metallus84] // hand-made cable

 

metallus84
(@metallus84)
Active Member
Joined:12 months  ago
Posts: 9
March 6, 2017 7:53 pm  

Hello to everyone, first post here. I want to ask your opinion on my system.

  • System:
    • Hardware
      • HP ProBook 6560b
      • Core i3-2310M
      • 8GB RAM (4+4)
      • iGPU Intel HD 3000
    • Software
      • Win 10 Pro 64-bit 

  Additions:

    • Hardware
      • Nvidia GTX 760 Gainward Phantom
      • EXP-GDC v8.4d Express Card version
      • 650W ATX PSU
    • Software
      • Latest nVidia and Intel Graphics Drivers
      • nando’s DIY eGPU Setup 1.3x
      • DSDT Editor + Compiler pack

 

No big issues so far, and I'm very happy to have made a self-soldered HDMI cable that allows me to run in Gen2 ExpressCard mode (GPU-Z reports PCIe x16 2.0 @ x1 2.0 during 3DMark and Fraps benchmarks).
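For reference, the "4Gbps" in the thread title is exactly what an x1 Gen2 link delivers once 8b/10b encoding overhead is subtracted. A quick sketch of the arithmetic (the function name is mine, purely for illustration):

```python
def pcie_effective_gbps(gen, lanes):
    """Effective PCIe bandwidth in Gb/s after encoding overhead."""
    raw_gt_s = {1: 2.5, 2: 5.0, 3: 8.0}[gen]      # raw signalling rate per lane
    encoding = 8 / 10 if gen <= 2 else 128 / 130  # Gen1/2: 8b/10b, Gen3: 128b/130b
    return raw_gt_s * encoding * lanes

# ExpressCard x1 Gen2 link as reported by GPU-Z:
print(pcie_effective_gbps(2, 1))  # 4.0 Gb/s -> the "4Gbps" in the title
print(pcie_effective_gbps(1, 1))  # 2.0 Gb/s if the link falls back to Gen1
```

So forcing the link down to Gen1 halves the available bandwidth, which matches the score drops described later in the thread.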

 

  • Performance
    • 3Dmark 06
      • 10143 (SM2.0 4079, HDR/SM3.0 5030, CPU 2507)
    • 3Dmark (steam DEMO version)
      • Firestrike 3886 (24.81 FPS Gt1, 20.92 Gt2, 8.7 FPS Physx, 7.9 FPS Combined)
      • Icestorm 22488 (98 FPS Gt1, 98 FPS Gt2, 71 FPS Physx)

 

Unfortunately I get poor benchmark results (except Firestrike) when compared with other systems. I don't know whether Optimus 1.2 uses PCIe compression through the HD 3000, or whether I'm simply hitting a CPU limit. All tests were made on the internal LCD only (that is my focus for now).

 

Does anyone have an idea why? Any suggestions?

 

System screenshots and pics follow.

 

Self made cable

 

System (3dMark06 score)

Edited: 12 months  ago

nando4
(@nando4)
Noble Member Admin
Joined:1 year  ago
Posts: 1580
March 6, 2017 8:07 pm  

Here's a 2570P with an i7-3520M + GTX960 + 376.88 driver. It scores 3dmark06=20825, doubling your SM2.0/SM3.0 results, with a CPU nearly twice as fast.

It's difficult then to isolate what is causing your reduced performance, but tentatively you may be correct: Nvidia may have disabled PCIe compression on HD3000-equipped systems, which has a dramatic effect on DX9 results, particularly 3dmark06. Consider that I saw 3dmark06 results > 15k with a Core 2 Duo CPU.

 

Edited: 12 months  ago

eGPU Port Bandwidth Reference Table | eGPU Setup 1.35


ehcoboy
(@ehcoboy)
New Member
Joined:1 year  ago
Posts: 3
March 6, 2017 8:14 pm  

Hi Metallus,

I want to make a home-made EC->HDMI cable for my EXP GDC V8.4 (unstable @1.2). How did you build your own?

Thanks 

 

 


metallus84
(@metallus84)
Active Member
Joined:12 months  ago
Posts: 9
March 6, 2017 9:32 pm  

Hello Nando

thank you a lot for the fast answer. If it's a driver-related problem, do you think that with a DDU clean + downgrade I'll find a driver that works? Or should I try to mod some notebook driver instead? How can I check whether compression is really enabled or disabled? Only with benchmarks?

I have read somewhere that without the battery plugged in the system "seems" to run without compression, and with the battery plugged in it "seems" to run fine. I have no battery (used laptop) but one is en route; I'll try it and let you know if something changes.

Do you have a suggestion for which Nvidia driver version to try?

 

@ehcoboy: I bought a high-quality HDMI cable for TVs (the kind with a ferrite near the connector – you can see one in my picture), cut it short, and soldered it onto an ExpressCard32 connector (not pretty, but effective). Now, with the problematic EXP-GDC 8.4d, I have a fully stable PCIe x1 @ Gen2 (2.0) link. I made a lot of tests and I'm 100% sure I'm on 2.0, because if I force the system to run at 1.0 my scores drop below the numbers I posted before.

Some soldering skill is required, but with some googling you can find the pinout, and with a continuity (beep) tester you can work out the EXP-GDC pinout and solder it.

Bye all


nando4
(@nando4)
Noble Member Admin
Joined:1 year  ago
Posts: 1580
March 6, 2017 9:57 pm  

3dmark06 is the best indicator of whether PCIe compression is engaged or not. Keep in mind the compression has a large effect on DX9 titles; DX11 sees little performance benefit from it. Your 3dmark11 score is comparable to older Core 2 Duo desktop systems, as seen here.

If you absolutely want the PCIe compression then the best way may be to load the oldest compatible driver you can find and re-run 3dmark06, looking for a noticeable jump in the performance figures.

metallus84
(@metallus84)
Active Member
Joined:12 months  ago
Posts: 9
March 7, 2017 1:00 pm  

OK, thank you a lot. I have a GTX 760; I'll try Win 8.1 drivers from 2014-2015 and see. I use my system for some old games (like CS:GO, so DX9, where I need a little FPS boost) and newer ones (trying Battlefield 1, but maybe I'll replace the i3 with a higher-GHz i7 if I'm lucky).

I’ll post relevant results here for future!

Bye

Edited: 12 months  ago

metallus84
(@metallus84)
Active Member
Joined:12 months  ago
Posts: 9
March 15, 2017 9:31 pm  

Little update on my tests with an older driver:

  • Nvidia Driver 331.82:
    • 3Dmark 06
      • 12173 (SM2.0 4952, HDR/SM3.0 6665, CPU 2541)
    • 3Dmark 11 Performance
      • n/a – too old drivers?
    • 3Dmark (steam DEMO version)
      • Firestrike (n/a – too old drivers?)
      • Icestorm 29365 (152 FPS Gt1, 136 FPS Gt2, 66 FPS Physx)

 

New driver:

  • Nvidia Driver 378.66:
    • 3Dmark 06
      • 11666 (SM2.0 4651, HDR/SM3.0 6380, CPU 2493)
    • 3Dmark 11 Performance
      • 4855
    • 3Dmark (steam DEMO version)
      • Firestrike 3692 (25 FPS Gt1, 21 FPS Gt2, 8.3 Physx, 6.46 Comb)
      • Icestorm 22473 (98 FPS Gt1, 98 FPS Gt2, 71 FPS Physx)

I also downgraded the Intel HD driver from 15.28.24.64.4229 (9.17.10.4229) to 15.28.23.4101 (9.17.10.4101).

The DX9 boost with older drivers is noticeable. Unfortunately I want to use the latest driver for game compatibility, so I hope the new i5 I just found (it should be compatible with my laptop) will increase DX9 performance.

Maybe I will try an Nvidia 350.xx driver, but I don't expect better results. It's a shame there is no way to force Optimus compression back on with the HD3000.
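The DX9 delta between the two drivers can be put in percentage terms with some throwaway arithmetic (the helper below is mine, not part of any benchmark tool):

```python
def pct_gain(baseline, result):
    """Percentage gain of `result` over `baseline`."""
    return (result - baseline) / baseline * 100

# 3dmark06 totals from the two runs above: old 331.82 driver vs new 378.66
print(round(pct_gain(11666, 12173), 1))  # old driver ~4.3% ahead overall
print(round(pct_gain(4651, 4952), 1))    # ~6.5% ahead in the SM2.0 sub-score
```

A few percent overall, concentrated in the DX9 sub-scores, which is consistent with a bandwidth/compression difference rather than a CPU one.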

 

Bye!

 


nando4
(@nando4)
Noble Member Admin
Joined:1 year  ago
Posts: 1580
March 16, 2017 2:54 am  

@metallus, the difference in results between 331.82 and 378.66 is minor. Pls consider redoing the 3dmark06 benchmarking when the i5 CPU arrives. Optimus x1 compression 3dmark06 results are heavily influenced by CPU performance.

neoHannibal
(@neohannibal)
Active Member
Joined:12 months  ago
Posts: 15
March 20, 2017 9:07 pm  

@metallus84 Did you also have an unstable Gen2 connection on the EXP GDC? And you are saying a home-made cable solved the problem? Where did you get the ExpressCard34 connector from? Also, where can I find the pinout for all that?


nando4
(@nando4)
Noble Member Admin
Joined:1 year  ago
Posts: 1580
March 21, 2017 4:54 am  
Posted by: neoHannibal

 

@metallus84 Did you also have an unstable Gen2 connection on the EXP GDC? And you are saying a home-made cable solved the problem? Where did you get the ExpressCard34 connector from? Also, where can I find the pinout for all that?

   

Until metallus84 responds, consider the following to reduce EMI noise and signalling spikes/interference:

– isolate the PSU for the eGPU on its own dedicated power outlet/point
– do not attach any USB devices to your system, particularly to the EXP GDC
– place aluminium cooking foil over the EXP GDC cable
– set the SLIM LINE switch to Ultra, which supposedly disables the EXP GDC USB port per here

neoHannibal
(@neohannibal)
Active Member
Joined:12 months  ago
Posts: 15
March 21, 2017 4:16 pm  
Posted by: nando4

Until metallus84 responds, consider the following to reduce EMI noise and signalling spikes/interference:

– isolate the PSU for the eGPU on its own dedicated power outlet/point
– do not attach any USB devices to your system, particularly to the EXP GDC
– place aluminium cooking foil over the EXP GDC cable
– set the SLIM LINE switch to Ultra, which supposedly disables the EXP GDC USB port per here

   

I have tried all of the above except the SLIM LINE switch, because I don't have one: my EXP GDC is a V7.2C ARES – I have disassembled the adapter and that is what it says on the PCB, even though the housing says Beast. I intentionally bought the adapter and GTX 660 from a person who claims it was all working on a Lenovo X230, since there was a rumor that recent EXP GDCs have problems with a stable Gen2 connection. I also looked at the PCB and the connections. I noticed that one of the pins of the PCIe x16 connector has a solder blob (see picture: https://goo.gl/photos/ChsArenpBDLksqHXA), and some of the other pins aren't soldered well either. I have an electronics background and am thinking about reflowing some of the connections. However, I am not sure where all the pins go, because some of them might be unused. Does anybody have a schematic/pinout of the connector?

Edited: 11 months  ago

nando4
(@nando4)
Noble Member Admin
Joined:1 year  ago
Posts: 1580
March 21, 2017 5:49 pm  
Posted by: neoHannibal  

I have noticed that one of the pins of the PCIe x16 connector has a solder blob (see picture: https://goo.gl/photos/ChsArenpBDLksqHXA), and some of the other pins aren't soldered well either. I have an electronics background and am thinking about reflowing some of the connections. However, I am not sure where all the pins go, because some of them might be unused. Does anybody have a schematic/pinout of the connector?

neoHannibal, we don't have a pinout of the EXP GDC. You can however use a digital multimeter to do continuity probes and identify what goes where, though I'm not sure the effort is worth it.

I did a fair bit of testing with BPlus socketed versus soldered products and found the latter far better at maintaining a Gen2 link. Gen2 signals at 5 GT/s, where impedance matching is very important to avoid reflective noise – which is probably where your EXP GDC falls apart.
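Some back-of-the-envelope numbers show why reflections are so punishing at Gen2 rates over a cable of this length (the ~2×10^8 m/s propagation velocity is a typical cable assumption, not a measured figure):

```python
SYMBOL_RATE = 5e9   # Gen2 signalling rate: 5 GT/s
PROP_SPEED = 2e8    # m/s, assumed propagation velocity in the cable (~2/3 c)
CABLE_LEN = 0.30    # m, roughly the hand-made cable length in this thread

unit_interval_ps = 1e12 / SYMBOL_RATE            # duration of one bit on the wire
flight_time_ps = CABLE_LEN / PROP_SPEED * 1e12   # one-way travel time down the cable

print(unit_interval_ps)                    # 200 ps per bit
print(flight_time_ps / unit_interval_ps)   # cable holds ~7.5 bits in flight
```

With several bits in flight at once, any energy reflected off an impedance discontinuity arrives back in the middle of later symbols, so even a small mismatch can corrupt the eye.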

You could take a chance on another EXP GDC, or grab a PE4C 3.0. Both are linked at their best prices at:

https://egpu.io/external-gpu-buyers-guide-2017/#expresscard2-interface

Edited: 11 months  ago

neoHannibal
(@neohannibal)
Active Member
Joined:12 months  ago
Posts: 15
March 21, 2017 6:01 pm  
Posted by: nando4

I did a fair bit of testing with BPlus socketed versus soldered products and found the latter far better at maintaining a Gen2 link. Gen2 signals at 5 GT/s, where impedance matching is very important to avoid reflective noise – which is probably where your EXP GDC falls apart.

You could take a chance on another EXP GDC, or grab a PE4C 3.0. Both are linked at their best prices at:

https://egpu.io/external-gpu-buyers-guide-2017/#expresscard2-interface

   

Are you talking about impedance matching at the HDMI connection, or on the PCIe x16 slot itself? What impedance are the lines? At 5 GHz speeds this should probably be an impedance-controlled microstrip all the way, not some kind of HDMI cable crap. Do you know how many signal lines go through that HDMI cable? Do you have any specification/datasheet for PCIe Gen2?

Edited: 11 months  ago

nando4
(@nando4)
Noble Member Admin
Joined:1 year  ago
Posts: 1580
March 21, 2017 6:20 pm  
Posted by: neoHannibal

 

Are you talking about impedance matching at the HDMI connection, or on the PCIe x16 slot itself? What impedance are the lines? At 5 GHz speeds this should probably be an impedance-controlled microstrip all the way, not some kind of HDMI cable crap. Do you know how many signal lines go through that HDMI cable? Do you have any specification/datasheet for PCIe Gen2?

   

For PCIe Gen2 signalling specs, look up the PCI-SIG documentation. A PCIe lane is formed by two differential pairs: one RX and one TX.
Here's an archived schematic of BPlus' EC2C expresscard-to-mHDMI cable that may help you:

http://web.archive.org/web/20160317183717/http://www.hwtools.net/PDF/EC2C_Schematic.pdf

Edited: 11 months  ago

neoHannibal
(@neohannibal)
Active Member
Joined:12 months  ago
Posts: 15
March 21, 2017 10:38 pm  

I also found this: http://www.phasure.com/index.php?topic=2862.0 (there is a better-quality pinout in a PDF there). From what I can see on my EXP GDC V7.2C, there is an RX+ and RX- coming out of what I assume to be one of the HDMI's four twisted pairs. Then there are two lines coming out of what I again assume to be a twisted pair from the HDMI, going to what look like two resistive voltage dividers. What could those be – TX+ and TX-, or rather CLK+ and CLK-? Why would you lower the voltage? Is ExpressCard not 12V? Probably it is not 12V – can anybody confirm? But that would mean RX+ and RX- are at a lower voltage than should go to the PCIe port. There is also another pair coming off the HDMI connector, which I assume carries the other signals that don't go through the voltage dividers. Finally, what could be on the last twisted pair – the two card-presence detection lines? There is also an additional line which could be the CLKREQ signal. What concerns me the most is that I can't see the twisted pairs' shields being grounded. That could be a huge factor in bad Gen2 performance.


metallus84
(@metallus84)
Active Member
Joined:12 months  ago
Posts: 9
March 22, 2017 7:49 pm  

Hello neoHannibal

I followed the information found here:

http://forum.notebookreview.com/threads/diy-egpu-experiences.418851/page-876#post8356280

sonny_mv's posts (you'll find some pics and a wiring diagram).

What I used is a standard HDMI TV cable with a ferrite core embedded at each end, cut to max 30 cm and soldered onto a defective ExpressCard USB 3.0 board (I desoldered only the ExpressCard connector and used it to solder my wires to, but you can do what sonny_mv did).

What I found is a cable with shielded twisted pairs and an external shield. What is important, I think, is that I tied all these shields to GND (so no floating shields). Also the cable is round, not flat, so I think all the twisted pairs are twisted around each other along the cable length.

Maybe it's the cable, maybe it's the good shielding, but I have a fully stable PCIe 2.0 x1 link (so happy).

I cannot show you the final work because I used hot glue to mechanically fix everything 🙂 but I can show you the other half of the cut cable that I no longer need.

nando4:

I have my i5 now; benchmarks are better (16k 3dmark06), but what I noticed now is that with the internal LCD it seems impossible to break 100 FPS.

For example, I achieved similar results on the 3DMark Ice Storm benchmark (24k instead of 22k). Do you think the HD3000 is at its maximum and the internal LCD is capped at 100 FPS? Or could it be related to PCIe Optimus compression (maybe) no longer being available?

Bye


nando4
(@nando4)
Noble Member Admin
Joined:1 year  ago
Posts: 1580
March 22, 2017 8:09 pm  
Posted by: metallus84

I have my i5 now; benchmarks are better (16k 3dmark06), but with the internal LCD it seems impossible to break 100 FPS. [...] Do you think the HD3000 is at its maximum and the internal LCD is capped at 100 FPS? Or could it be related to PCIe Optimus compression (maybe) no longer being available?

   

Your 3dmark06 results are within the expected range. If PCIe Optimus compression wasn't working then you would be seeing a 3dmark06 of < 10k. For comparison, here's an i5-2540M + GTX660Ti under Win7 that saw a 3dmark06 score of 17893:
http://www.3dmark.com/3dm06/16899445

It has been reported that Optimus internal LCD mode caps games at < 60 Hz. Yours is the first mention of a 100 Hz cap.

metallus84
(@metallus84)
Active Member
Joined:12 months  ago
Posts: 9
March 22, 2017 8:28 pm  

So,

it seems I'm in better shape now. I tried CS:GO and it is good: it runs around 100 FPS without v-sync, 60 with v-sync (and that is enough for me). BF1 also tries to run, but I think it is too much for my system. The LCD resolution is low but the game is heavy (a 2016 game vs 2011/2013 hardware).

thank you!


neoHannibal
(@neohannibal)
Active Member
Joined:12 months  ago
Posts: 15
March 22, 2017 8:30 pm  

@metallus84 It would be great if you could show me the other half of the HDMI cable. What I noticed while examining the EXP GDC board (I have a V7.2C) is that I am not sure whether the shields of the twisted pairs are grounded – it doesn't look like it at the HDMI connector. Unfortunately the pins at the HDMI connector are so close together that I can't use a digital multimeter to check the connections.


metallus84
(@metallus84)
Active Member
Joined:12 months  ago
Posts: 9
March 23, 2017 8:19 pm  

Here is the cable

and

Remember that my EXP GDC is an 8.4d; I do not know what changed between your version and mine.


nando4 addition>> https://egpu.io/forums/expresscard-mpcie-m-2-adapters/exp-gdc-hdmi-to-mpcie-wiring-diagram/ tells us:

Courtesy of @wimpzilla here. It helps tremendously for users fixing or making their own EXP GDC cables, as done here (HDMI to ExpressCard).

HDMI Connector Pinout

Here you go!

BEWARE: assess the pin numbers on the HDMI connector carefully – you will notice that the pic has two rows of pin numbers.

The upper row is for the HDMI cable connector view; the row under it is for the EXP GDC HDMI connector, since when you look at it and plug it in, the pin numbering is mirrored.

Edited: 7 months  ago

neoHannibal
(@neohannibal)
Active Member
Joined:12 months  ago
Posts: 15
March 23, 2017 9:47 pm  

@metallus84 – thanks for the photos. I experimented today with a cut-in-half HDMI cable; I just need an ExpressCard connector now. One thing I noticed is that on my EXP GDC V7.2C there are resistive voltage dividers on the CLK+/CLK- lines, and capacitors on the TX+/TX- lines. This is weird, because neither this schematic http://web.archive.org/web/20160317183717/http://www.hwtools.net/PDF/EC2C_Schematic.pdf nor http://forum.notebookreview.com/attachments/adapterpinout-jpg.76140/ shows anything about voltage dividers or capacitors. There is nothing with voltage dividers or capacitors on the PE4L V2.1b either: https://drive.google.com/file/d/0B9WZhsesdvd5QnFPSFdMZlZUWVk/view?usp=sharing The only place where I found something about capacitors is http://www.phasure.com/index.php?topic=2862.0 and there the capacitors are on the receiver (RX+/RX-), not the transmitter (TX+/TX-). I am really confused by the EXP GDC design. Moreover, the EXP GDC has floating shields on all but one of the twisted pairs.

@metallus84 did you tie the shields on both sides of the cable (HDMI and EC) to ground?

And perhaps the most important question: what kind of problems did you have with the original EXP GDC cable?

Edited: 11 months  ago

metallus84
(@metallus84)
Active Member
Joined:12 months  ago
Posts: 9
March 24, 2017 6:15 pm  

Wow, you found schematics – they are very useful.

However, yes, I had problems with the original ExpressCard cable provided with the EXP GDC, but to clear up our doubts an 8.4d schematic would be helpful.

My shields are all tied together (the external one with the twisted-pair ones) to GND. On the EXP GDC side I did not touch anything; it is all original. So the PE4C 3.0 directly-soldered-cable technique is not necessary, because my system works through the HDMI connector.

A good cable with matched impedance and good shielding (with a ferrite clamp near each connector) seems to be the solution. Maybe the resistors are used to adapt the cable impedance, maybe the capacitors are too big for higher-speed signals – I don't know. Only with the EXP GDC 8.4d schematics can we be sure of what I have.

 

 


neoHannibal
(@neohannibal)
Active Member
Joined:12 months  ago
Posts: 15
March 24, 2017 6:33 pm  

I have an EXP GDC V7.2C which I bought second-hand with a GTX 660 and a Dell D220P-01 from a person who claimed it all worked at Gen2 speeds. I did that intentionally because I was concerned about the recent EXP GDC Gen2 speed problems, but as it turns out I have those problems anyway.


neoHannibal
(@neohannibal)
Active Member
Joined:12 months  ago
Posts: 15
March 27, 2017 8:55 pm  

Today I had a chance to try other graphics cards besides the GTX 660 with my EXP GDC v7.2C.

First was a Radeon HD4850, which worked plug-and-play; I tested it with the GPU-Z render test, which showed Gen2 working. Since the HD4850 supports only DX10, I ran 3dmark06 (obviously on an external monitor) and scored 9102 for Gen1 and 10457 for the Auto (Gen2) BIOS setting. I assume that a 15% performance increase indicates Gen2 worked properly?

Another card I tried was a GeForce 8600, but I couldn't get the T430 to detect it, and I can't use Setup 1.35 because I installed Win 8.1 Pro in UEFI mode in an attempt to get Gen2 working (the theory being that UEFI works better than BIOS). Finally, I tried a Radeon HD7950, but my laptop would freeze every time I attempted to install drivers. However, that particular HD7950 could have been defective.

I also tried 3dmark06 on the GTX 660 and scored about 20650 on both Gen1 and Auto (this was the first time a benchmark completed on Auto with the GTX 660). I followed with 3dmark11 and scored (combined) about 4185, again for both Gen1 and Auto BIOS settings. From those results I assume the GTX 660 operates at Gen1 speeds even on the Auto setting.

Overall I am really confused now. I know nando4 said that different GPUs can have different impedance, but the person I bought the entire eGPU setup from claims it all worked at Gen2 speeds with a Lenovo X230. I will try to test my EXP GDC v7.2C with other GPUs and laptops if I get a chance. Does anyone have any suggestions?

Edited: 11 months  ago

Vetjo
(@vetjo)
Active Member
Joined:11 months  ago
Posts: 8
March 29, 2017 7:50 pm  

Hi, I'm trying to redo the HDMI cable that came with my V8.4 EXP GDC to solder it to an M.2 PCIe PCB I have around.

I've done some ohming-out here after Nando pointed it out to me, but I have some questions. I understand you have been making cables for ExpressCard, but maybe the HDMI port carries the same number of connections for ExpressCard? The cable I've got has 13 separate leads.

Three of the leads from the HDMI don't seem to connect to any component on the eGPU's PCB, or at least I wasn't able to find what they connect to. A fourth is connected to the "Slim line" button.

If I read the PCIe pinouts correctly (honestly I'm not sure which side is A/B when looking at it), these are the ones I've found – maybe you can help clarify whether these are all the ones I need to actually get an x1 PCIe lane going:

Ground (x2)
TCK
TDI
WAKE# (x2)
Ground
HSOn(0)
HSOp(0)
REFCLK-
PRSNT2#

 


neoHannibal
(@neohannibal)
Active Member
Joined:12 months  ago
Posts: 15
April 5, 2017 8:04 pm  

@metallus84 Do you remember how you connected the SMCLK and SMDAT lines? On the schematics you said you followed, https://drive.google.com/file/d/0B9WZhsesdvd5RjZxMHVLMS1GVjA/view?usp=sharing, the SMCLK and SMDAT lines are swapped on the ExpressCard connector compared to what I ohm-measured on my EXP GDC cable. If you could answer that would be great. I am trying to solder my own cable but don't know how to connect SMCLK and SMDAT – according to the above schematics, or according to my original EXP GDC cable?

Edited: 11 months  ago

metallus84
(@metallus84)
Active Member
Joined:12 months  ago
Posts: 9
April 14, 2017 12:45 pm  

Sorry for the delay. I did not connect them, because the EXP GDC does not use them on the PCIe/HDMI connectors. Try without – you shouldn't damage anything; at worst it just won't work!

Good luck


amoeba
(@amoeba)
Eminent Member
Joined:3 months  ago
Posts: 26
November 24, 2017 11:46 pm  

Same as the OP – when I don't run my ExpressCard at Gen1 speed, I get nasty crashes very quickly.

Has anyone opened up the cable and soldered the connections on the ExpressCard side of the standard cable? Or should I just buy an HDMI cable and solder that instead?

 

Edited: 3 months  ago

  