Mellanox Adapter Help


PithyChats

Active Member
Need a bit of help wading through the Mellanox adapter sea.

I am looking for a 40GbE adapter that supports RDMA. Ideally I would like to use it with a Gnodal GS4008 as a 40Gb Ethernet adapter. As I understand it, ConnectX VPI models do both IB and Ethernet. So will this card work?

Mellanox Certified Refurbished HCA-30024 ConnectX-2 VPI Adapter Card Dual-Port 40Gb/S QSFP PCIe2.0 X8 5.0Gt/S Tall Bracket RoHS R6 - Mellanox Store

Or do I need something else?

(I have no intention of buying it from Mellanox at that price. I can get these for around $70 each, so I wanted to check whether they will suffice before I pay more.)

Thanks for the help!
 

Chuckleb

Moderator
The QDR cards come in either a 10GbE or a 40GbE model, even though they can all do 40Gb IB.

I have a pair of the 10Gb and a pair of the 40Gb cards. With the first pair I tried to get 40GbE working and failed; then I realized the mistake and bought the 56Gb IB/40GbE cards: MCX354A-FCBT on eBay. Joe @Atlantic trading is good.

I'm selling my cards as well if you wanted to PM me.
 

Patrick

Administrator
Staff member
For 40GbE I would also get the 56Gb IB/40GbE ConnectX-3 VPI cards. For 10GbE I like the ConnectX-2 / ConnectX-2 EN.
 

Chuckleb

Moderator
I agree. After realizing that I really didn't need 40GbE and had stopped playing with IB at home (I get more than enough of it at work), I switched over to the CX2 cards and now have four CX3s to sell.

Anyway, it looks like they are going for about $375.
New Mellanox MCX354A FCBT Infiniband FDR 40GBE HCA MCX354A FCBS Connectx 3 VPI | eBay

Just make sure you get the FCBT or an equivalent model that does 40GbE; the TCBT is the 10GbE version, but still in a QSFP form factor (for the IB side). That's why I have two of those and two of the 40GbE models.

One last point of reference: it doesn't matter much whether you get one or two ports. The PCIe slot can only handle about 50Gbps, so you can only drive one 40GbE port at full rate; the second port is really meant for failover, and both Intel and Mellanox say this. So if you can get a single-port card cheaper than a dual-port, go for that.
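
Rough math behind that ~50Gbps figure, if it helps (back-of-envelope nominal numbers, not from a spec sheet):

# Back-of-envelope PCIe bandwidth, one direction, ignoring TLP/flow-control
# overhead (real-world usable throughput is a bit lower than these numbers).

def pcie_gbps(gt_per_s, encoding, lanes):
    """Nominal usable bandwidth in Gb/s for a PCIe link."""
    return gt_per_s * encoding * lanes

# PCIe 2.0 x8 (the ConnectX-2 in the original link): 5 GT/s, 8b/10b encoding
print("PCIe 2.0 x8:", pcie_gbps(5.0, 8 / 10, 8), "Gb/s")               # 32.0

# PCIe 3.0 x8 (ConnectX-3 MCX354A): 8 GT/s, 128b/130b encoding
print("PCIe 3.0 x8:", round(pcie_gbps(8.0, 128 / 130, 8), 1), "Gb/s")  # ~63.0

Either way the slot tops out well short of 2 x 40Gb/s, which is why the second port only really earns its keep as failover.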
 

Chuckleb

Moderator
Finally got onto a computer. The specs for that card match the MHQH29C-XTR, so that's what you should compare against when looking at spec sheets. It's a 40Gb IB / 10GbE card.

PithyChats said:
Need a bit of help wading through the Mellanox adapter sea.

I am looking for a 40GbE adapter that supports RDMA. Ideally I would like to use it with a Gnodal GS4008 as a 40Gb Ethernet adapter. As I understand it, ConnectX VPI models do both IB and Ethernet. So will this card work?

Mellanox Certified Refurbished HCA-30024 ConnectX-2 VPI Adapter Card Dual-Port 40Gb/S QSFP PCIe2.0 X8 5.0Gt/S Tall Bracket RoHS R6 - Mellanox Store
 

PithyChats

Active Member
Thanks, this is most helpful. I suspected this card could not do 40GbE. At $375, the 56Gb IB/40GbE ConnectX-3 VPI cards might be a bit heavy. I will stick with 10GbE for now, as RDMA is more important to me and I already have some ConnectX-2 EN cards.
 

PigLover

Moderator
EN or VPI - you need ConnectX-3 cards for RDMA over Ethernet. The CX2 cards will do RDMA over IB, but to get RDMA over 'converged Ethernet' (RoCE) you have to have CX3.

Also - RDMA is only supported on Windows Server builds; it is crippled out of Win 8/8.1. Annoying as hades. I didn't discover this until after I had swapped all my cards for CX3.
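
On Linux you can check what a given card actually exposes: every RDMA-capable device shows up under /sys/class/infiniband, and each port's link_layer file says whether that port is running InfiniBand or Ethernet (RoCE). A quick sketch - the device names (mlx4_0 and friends) will differ per host:

# List RDMA devices and the link layer of each port via standard Linux sysfs.
# Requires the mlx4/mlx5 (or other RDMA) drivers to be loaded.
import os

RDMA_ROOT = "/sys/class/infiniband"

if not os.path.isdir(RDMA_ROOT):
    print("No RDMA devices found (driver not loaded, or card acting as a plain NIC)")
else:
    for dev in sorted(os.listdir(RDMA_ROOT)):               # e.g. mlx4_0
        ports_dir = os.path.join(RDMA_ROOT, dev, "ports")
        for port in sorted(os.listdir(ports_dir)):           # 1, 2, ...
            with open(os.path.join(ports_dir, port, "link_layer")) as f:
                print(f"{dev} port {port}: {f.read().strip()}")  # InfiniBand / Ethernet

Roughly speaking, if a device is listed and a port reports Ethernet, there is a RoCE path to try; if nothing is listed at all, the card is only running as a plain NIC.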
 

PithyChats

Active Member
PigLover said:
EN or VPI - you need ConnectX-3 cards for RDMA over Ethernet. The CX2 cards will do RDMA over IB, but to get RDMA over 'converged Ethernet' (RoCE) you have to have CX3.

Also - RDMA is only supported on Windows Server builds; it is crippled out of Win 8/8.1. Annoying as hades. I didn't discover this until after I had swapped all my cards for CX3.

So just to clarify: RDMA, or more particularly RoCE, will not work with the MNPH29D-XTR?

That is really annoying about 8/8.1. As I understand it, however, it is fully supported in CentOS, which I intend to use. Have you heard anything to the contrary?
 

cesmith9999

Well-Known Member
RoCE v2 technically requires switches with DCB and PFC.

When it is configured server-side only, we still see RoCE working.

Note: we actually saw copy times go up by 20% with RoCE disabled.

If you get into a saturation situation, you will want your switches to behave properly, meaning you will want DCB and PFC configured. We are using Arista switches at work; Cisco Nexus will work as well.
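
For what it's worth, the host side of the PFC piece is only a couple of settings. A rough sketch using the mlnx_qos tool from Mellanox OFED - the interface name (eth1) and the priority used for RoCE traffic (3) are just placeholders, and the flags vary by OFED version, so check mlnx_qos --help on your install:

# Sketch: enable Priority Flow Control on one traffic class for the RoCE
# interface. The switch ports need matching PFC/DCB settings, which this
# does not touch. Interface name and priority are placeholders.
import subprocess

IFACE = "eth1"        # hypothetical RoCE interface name
ROCE_PRIORITY = 3     # common convention, but purely a local choice

# Build a per-priority mask like "0,0,0,1,0,0,0,0" (1 = PFC on for that priority)
pfc_mask = ",".join("1" if p == ROCE_PRIORITY else "0" for p in range(8))

# Apply, then print the resulting QoS/PFC state for the interface
subprocess.run(["mlnx_qos", "-i", IFACE, "--pfc", pfc_mask], check=True)
subprocess.run(["mlnx_qos", "-i", IFACE], check=True)

The switch side (matching DCB/PFC on the relevant ports) is the part this does not cover, and it only matters once you actually hit congestion.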

Chris
 

PigLover

Moderator
cesmith9999 said:
If you get into a saturation situation, you will want your switches to behave properly, meaning you will want DCB and PFC configured. We are using Arista switches at work; Cisco Nexus will work as well.
Without a doubt true - but the chances of two servers in a small/home lab saturating even the lamest 10Gb switch are pretty close to zero.

In any case, I agree - I have not seen any material gain using RoCE @ 10Gb.
 

cesmith9999

Well-Known Member
We are using 40Gb Mellanox cards. It was actually cheaper for us to go 40Gb: roughly a 20% increase in switch cost and a 70% decrease in NIC cost for new 40Gb gear versus 10Gb gear.

I agree with PigLover: in a home environment you will probably not see saturation issues. We also skipped from a traditional 1/10Gb network straight to a 40Gb network, so I do not know if there is any benefit to RoCE @ 10Gb.

Unless you have multiple subnets, I recommend RoCE v1 over RoCE v2; there is more documentation available for setting it up.
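
(For background: RoCE v1 runs directly in Ethernet frames, so it cannot cross subnets; v2 wraps the traffic in UDP/IP, which is what makes it routable.) If you do end up on a kernel and HCA that support RoCE v2, there is a configfs knob for which version rdma_cm defaults to. A sketch - whether the path exists at all depends on your kernel/OFED and on the card actually supporting v2, so treat it as something to verify rather than a recipe:

# Sketch: read (and optionally set) the default RoCE version rdma_cm uses
# for a device, via the kernel's rdma_cm configfs interface. The device
# name is a placeholder; availability depends on kernel/OFED and the HCA.
import os

DEVICE = "mlx4_0"   # hypothetical device name, see /sys/class/infiniband
PORT = "1"

cfg_dir = os.path.join("/sys/kernel/config/rdma_cm", DEVICE)
mode_file = os.path.join(cfg_dir, "ports", PORT, "default_roce_mode")

try:
    os.makedirs(cfg_dir, exist_ok=True)   # creating the dir populates ports/
    with open(mode_file) as f:
        print(f"{DEVICE} port {PORT} default RoCE mode: {f.read().strip()}")
    # To switch the default (as root):
    # with open(mode_file, "w") as f:
    #     f.write("RoCE v2")
except OSError as exc:
    print("rdma_cm configfs not available here:", exc)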

Chris
 

PithyChats

Active Member
cesmith9999 said:
We are using 40Gb Mellanox cards. It was actually cheaper for us to go 40Gb: roughly a 20% increase in switch cost and a 70% decrease in NIC cost for new 40Gb gear versus 10Gb gear.

I agree with PigLover: in a home environment you will probably not see saturation issues. We also skipped from a traditional 1/10Gb network straight to a 40Gb network, so I do not know if there is any benefit to RoCE @ 10Gb.

Unless you have multiple subnets, I recommend RoCE v1 over RoCE v2; there is more documentation available for setting it up.

Chris
If I may ask, what cards and switch are you using?
 

cesmith9999

Well-Known Member
Mellanox MCX314A-BCCT in the servers

Arista 7050X switches - 32 * 40 Gb ports.

Chris
 