Anyone doing 40g xl710 dac between two computers?


boe

New Member
Apr 7, 2019
20
1
3
I would like to know if anyone has tried this yet. I don't see why it wouldn't work, but since I've never done it, I'm curious. I'm trying to avoid purchasing a switch, even a $50 one.
 

boe

New Member
Apr 7, 2019
20
1
3
For anyone wondering, my curiosity did get the best of me and I bought two used XL710 cards on eBay, plus a 40G QSFP+ DAC with Intel-compatible connectors on each end. I got them talking to each other with no real effort beyond installing the cards and drivers and putting an IP address on each; it shows connected at 40G. Unfortunately my array is rebuilding (I knocked out a power connector to a drive while moving things or installing the card), so I should know by Sunday whether I get any more speed out of them than my 10G NICs. My 10G bandwidth is maxed out when copying between the two PCs (each with a 16-drive internal array). I tried SMB with two ports on each NIC and briefly saw about 13Gbps, but never more than that and usually just 10Gbps.
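
If anyone wants to sanity-check the raw link while the disks are out of commission, a quick TCP push between the two boxes takes the arrays out of the picture. Here's a minimal sketch (the address, port, and sizes are just placeholder assumptions, not what I actually ran); note that a single Python stream probably won't fill 40G by itself, so something like iperf3 with parallel streams is the usual tool for that:

```python
# Minimal point-to-point TCP throughput check between the two PCs.
# Run "python tput.py server" on one box and
# "python tput.py client 10.0.0.1" on the other
# (address and port are placeholders for the direct link).
import socket
import sys
import time

PORT = 5201                 # placeholder port (iperf3's default)
CHUNK = 4 * 1024 * 1024     # 4 MiB per send/recv
TOTAL = 10 * 1024 ** 3      # push 10 GiB to smooth out ramp-up

def server() -> None:
    with socket.create_server(("", PORT)) as srv:
        conn, addr = srv.accept()
        with conn:
            received = 0
            start = time.perf_counter()
            while received < TOTAL:
                data = conn.recv(CHUNK)
                if not data:
                    break
                received += len(data)
            secs = time.perf_counter() - start
            # 125e6 bytes/s == 1 Gbit/s
            print(f"{received / secs / 125e6:.2f} Gbit/s from {addr[0]}")

def client(ip: str) -> None:
    payload = b"\0" * CHUNK
    with socket.create_connection((ip, PORT)) as conn:
        sent = 0
        while sent < TOTAL:
            conn.sendall(payload)
            sent += len(payload)

if __name__ == "__main__":
    if sys.argv[1] == "server":
        server()
    else:
        client(sys.argv[2])
```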


I don't expect even double the speed, as I'm sure my disk array will be the bottleneck, but I couldn't resist trying. I couldn't find another single idiot like myself who had tried this yet, so I thought I'd post that it can be done.

For anyone curious, I'm running Win10, and I'm pretty sure the Dell drivers are just the Intel drivers for the card.

I'm not suggesting anyone else do this instead of 10G; I'm just letting people know it can be done, for anyone who wonders about the same things I did.

Unfortunately the only quad-port 40G NIC I could find was Silicom's at nearly $2K, which is a shame because the used dual-port cards run $150 to $200.
 


boe

New Member
Apr 7, 2019
20
1
3
OK - so my array finally finished rebuilding. While it is about 70% full, I don't think that will have a huge impact on performance, since it is an array rather than a single drive.

I occasionally got close to the speeds in the screenshot below with SMB across dual-port 10G NICs (both ports connected on each PC), but nowhere near as consistently.

[Attachment: Capture.JPG - transfer-speed screenshot]

So the bottleneck is either the disk array or my motherboard not having enough PCIe lanes: I have one x16 card and two x8 cards, and while I have x16 slots galore, I think I only have about 24 lanes available at most.
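
For the lane math, here's a quick back-of-envelope check, assuming the XL710's PCIe 3.0 x8 interface:

```python
# PCIe 3.0 runs 8 GT/s per lane with 128b/130b encoding,
# so usable bandwidth is ~7.88 Gbit/s per lane.
lane_gbps = 8 * 128 / 130

for lanes in (8, 4):
    link_gbps = lane_gbps * lanes
    verdict = "enough" if link_gbps >= 40 else "NOT enough"
    print(f"x{lanes}: ~{link_gbps:.0f} Gbit/s -> {verdict} for one 40G port")

# x8: ~63 Gbit/s (enough), x4: ~32 Gbit/s (NOT enough).
# So the card only chokes if the board drops its slot to electrical x4;
# at a true x8 the 40G port has headroom and the array is the suspect.
```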

For anyone new wondering about my environment: a single NVMe-to-NVMe copy between the two PCs only reached about 800MB/s, so my array is much faster for transfers, although the SSD is more consistent in copy speed. The extra bandwidth only matters when transferring large files, e.g. over 1GB, as small files come nowhere near full bandwidth utilization.
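
To put that NVMe number in perspective:

```python
# The single NVMe-to-NVMe copy peaked around 800 MB/s.
mb_s = 800
print(f"~{mb_s * 8 / 1000:.1f} Gbit/s")   # ~6.4 Gbit/s
# That wouldn't even saturate a 10G link (~1.25 GB/s ceiling),
# which is why the multi-drive arrays show the 40G link off better.
```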

While I haven't really shifted any paradigms for my home network, it was fun to test. I just completed a replication of the files that changed in the last two weeks between the two PCs: 626GB (3,959 files, large and very small) in 15 minutes and 43 seconds. That might help anyone who wants a mixed-file number rather than purely large-file speeds.
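
Working that out as an average rate:

```python
# Average throughput of the 626GB mixed-file replication above.
size_gb = 626                 # data moved, in GB
secs = 15 * 60 + 43           # 15 minutes 43 seconds
rate_gb_s = size_gb / secs
print(f"~{rate_gb_s:.2f} GB/s (~{rate_gb_s * 8:.1f} Gbit/s)")
# ~0.66 GB/s (~5.3 Gbit/s): the small files pull the average well
# below the large-file peaks in the screenshot.
```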
 