40GbE/56Gb direct attach Ethernet or InfiniBand - low transfer rates - please help!


ebarnes02

New Member
Sep 7, 2024
Hello everyone - I am hoping someone can help me!

I am not new to networking, but I am new to high-speed networking and very new to InfiniBand.

I have 2 dual-port Mellanox ConnectX-3 MCX354A-FCBT adapters. I am only using a single port on each, and I have them direct attached.
The computers they are installed in for testing are Dell OptiPlex 7000 series desktops: one has an i7-12700 CPU, the other an i7-12700K. Both have 32GB RAM and a 1TB Sabrent Rocket 4.0 NVMe SSD, and both are clean, fresh installations of Windows 10 Pro 64-bit. I have updated both cards to the latest firmware (as far as I can tell) and have the latest WinOF drivers. I have configured them as 40GbE Ethernet and as 56Gb InfiniBand, and I've tested with both a passive copper QSFP+ DAC and an active fiber QSFP+ FDR cable for 56G.

If I run iperf3 on them, the best I can get is 6.5Gb/s to 7Gb/s. If I test with 10 parallel streams, I can get about 20Gb/s, but that's it; more streams or fewer, that's the max. If I try to transfer files across that connection, Windows Explorer gives me about 1.5GB/s max.
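
For reference, the iperf3 tests were basically along these lines (the IP address and run time are just placeholders for my setup):

    # on one machine
    iperf3 -s

    # on the other, a single stream for 30 seconds
    iperf3 -c 192.168.100.2 -t 30

    # same thing with 10 parallel streams
    iperf3 -c 192.168.100.2 -t 30 -P 10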

Is this the best I can expect with this hardware, or am I missing something? It just seems like I should be able to get more transfer speed out of this setup.
 

i386

Well-Known Member
Mar 18, 2016
Germany
- Explorer is not optimized for high-speed transfers; try robocopy or other tools that support parallel transfers and multiple threads
- iperf3 is not the best choice for testing on Windows platforms; try ntttcp instead (example commands below)

These CPUs have a maximum of 20 PCIe lanes, and Google says these OptiPlex models have an x16 and an x4 slot. With the x4 slot, expect ~4GByte/s at best.
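
Example commands, if it helps (share name, paths, IP, and thread counts are just placeholders):

    # robocopy with a multi-threaded copy (16 threads) to an SMB share on the other box
    robocopy C:\testdata \\192.168.100.2\share /E /MT:16

    # ntttcp receiver side: 8 threads, 30 second run (IP is the receiver's address)
    ntttcp.exe -r -m 8,*,192.168.100.2 -t 30

    # ntttcp sender side, pointed at the same receiver address
    ntttcp.exe -s -m 8,*,192.168.100.2 -t 30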
 
  • Like
Reactions: nexox

ebarnes02

New Member
Sep 7, 2024
Thank you for the suggestions. I have the cards in the PCIe 4.0 x16 slots.

I will try those tools and see what happens.
 

BackupProphet

Well-Known Member
Jul 2, 2014
Stavanger, Norway
intellistream.ai
For fast file transfers you need to use RDMA; you won't get anywhere near line rate without it. For Windows, your only option is SMB Direct. For Linux you have NFSoRDMA, NVMe-oF, SRP, and maybe a few other options.
For the ConnectX-3 generation, RDMA works best with InfiniBand, where you use IPoIB to initiate the RDMA connection. RoCE, both v1 and v2, can work, but I've wasted enough time trying to get it to run without weird problems. ConnectX-4 and newer are better if you want to use RoCE.
I only use the inbox drivers on Linux; the Mellanox drivers are just too complicated.
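
If you try SMB Direct, a quick sanity check on the Windows side looks something like this in an elevated PowerShell (the adapter name is just an example, and note that as far as I know the SMB Direct client is only in Windows 10/11 Pro for Workstations, Enterprise, and Server, not plain Pro):

    # confirm the adapter reports RDMA capability and is enabled for it
    Get-NetAdapterRdma
    Enable-NetAdapterRdma -Name "Ethernet 3"

    # while a file copy to the other machine is running, check that SMB is actually using RDMA
    Get-SmbClientNetworkInterface
    Get-SmbMultichannelConnection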
 
  • Like
Reactions: nexox

ebarnes02

New Member
Sep 7, 2024
Thanks! I’m not familiar with RDMA or RoCE so I will do some research and give it a try. Thanks for your help!