What switch for ConnectX-4 100 Gbit SIOMs?

vanfront

New Member
Jun 5, 2020
I am building a four-node hyperconverged infrastructure cluster based on Windows Server 2019 with Storage Spaces Direct (S2D) and Hyper-V. I have this Supermicro X11 system in place, supporting 24 NVMe disks: SYS-2029BT-HNR. The machine has four nodes, each equipped with an AOC-MHIBE-m1CGM SIOM module based on the Mellanox ConnectX-4 VPI EDR controller, with a single QSFP28 port supporting speeds up to 100 Gbit/s. Each node also has a single Gigabit Ethernet port.

Besides that, I have three more Supermicro X9-generation servers that I'd like to equip with compatible NICs as well. They don't need full feature parity with the ConnectX-4 VPI EDR controllers, so I was thinking of single-port ConnectX-3 cards; I believe 10 Gbit will suffice. The storage in those X9 boxes is HDD with some SSD caching, so it will be considerably slower than the NVMe beast, and the actual traffic to and from the X9 boxes will be plain SMB, i.e. file access and file copies.

As S2D will be the major traffic source on the 4-node box, supporting several tens of running VMs, I'd like to purchase a decent switch that can handle this. I prefer Ethernet over InfiniBand, as some additional Ethernet traffic might need to go through those ports, such as an MS SQL Server cluster running in VMs. I absolutely need RDMA (S2D uses SMB Direct, so the fabric has to support RoCE or iWARP). On the other hand, I don't think I need full 100 Gbit; 40 Gbit should be more than enough. I am not looking into a fully redundant design for now.
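For what it's worth, once the hardware is in place you can verify from PowerShell that RDMA is actually enabled and that SMB is really using it, before trusting S2D to the fabric. A minimal sketch using the standard NetAdapter/SMB cmdlets (the adapter name is a placeholder for whatever your ConnectX port shows up as):

```powershell
# Show per-adapter RDMA state; the Mellanox ports should report Enabled = True
Get-NetAdapterRdma

# Enable RDMA on a specific port if it is off ("Ethernet 2" is a placeholder name)
Enable-NetAdapterRdma -Name "Ethernet 2"

# SMB's view of the interfaces: "RDMA Capable" must be True for SMB Direct to be used
Get-SmbClientNetworkInterface

# After pushing some SMB traffic between nodes, confirm the connections
# actually negotiated RDMA rather than falling back to plain TCP
Get-SmbMultichannelConnection | Format-Table -AutoSize
```

If you end up on RoCE (rather than iWARP), remember that you will also need DCB/PFC configured consistently on both the NICs and the switch, which is where most of the pitfalls live.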

The switch would carry server-side traffic only, connecting the 4-node box plus the remaining X9-based systems, so something with 12-16 ports should do. Traffic from workstations accessing SMB, RDP, etc., and internet access would go through the 1 Gbit Ethernet ports connected to a standard gigabit switch.

I was looking at some cheap, small Mellanox switches, but most of what I can see on eBay has EMC firmware. I have no experience with Mellanox or optical networking, so I'd really appreciate it if someone could point me in the right direction to avoid unnecessary pitfalls.

So, what would you recommend in terms of:
  • switch
  • cabling
  • NICs for those X9 boxes