Mellanox purchase - need a sanity check

brianmat

Member
Dec 11, 2013
Before I pull the trigger on some hardware, I just wanted to get a sanity check from those of you who have used the ConnectX-2 stuff. It gets a little confusing with the different product lines and SKUs, so here's what I am planning:

1. OmniOS Napp-It ZFS appliance:
MHRH2A XSR Mellanox Connectx 2 QDR 40g IB Networking Adapter | eBay

2. ESX 5.5 servers (x3):
MHRH2A XSR Mellanox Connectx 2 QDR 40g IB Networking Adapter | eBay
or
Mellanox Connectx 2 VPI 10GbE Dual Port Adapter Card MHQH29C XTR | eBay

Note: 1 ESX server is a Napp-It all-in-one ZFS server

3. Mellanox switch
New F/S Mellanox MIS5025Q-1SFC 36 Port QDR InfiniBand Switch 2.88Tb/s IS5025 | eBay

4. Cables
This is where I am really unsure of what I need. Everything will run in the same cage, so nothing over 6 feet will be necessary.

I am looking to run 10Gb for the NFS connections to my storage appliances and for vMotion. I'm looking at the dual-port cards since the price difference between a single and a dual is not that significant, so I went for future expansion.

Now, it looks to me like all of this will work together including the OS support. I still need to sort out #4 though.

We'll be at ESX 5.5 for a while, but I would like to go to 6 at some point, and it looks like ESX 6 support for the ConnectX-2 cards may be hit or miss. It would not be a deal killer to be stuck at 5.5 for a year if it means saving a few grand over copper 10Gb. By then, hopefully, the prices will come down a bit.

Please let me know if you see any red flags or if I am completely off base with the hardware and my environment.
 

brianmat

Member
Dec 11, 2013
FWIW I punted a bit and went with a Voltaire 4036 switch instead, since it has a built-in subnet manager and that's one less thing I have to worry about. I was able to get the switch, three dual-port VPI cards, a single-port VPI card, and cables for about $600, so it's worth giving it a shot. It seems there is enough info here on the Voltaire to help muddle through this process a bit.
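
Once it's cabled up, you can sanity-check the 4036's embedded subnet manager from any Linux host on the fabric using the standard infiniband-diags tools. A minimal sketch (assumes the infiniband-diags package is installed; exact output varies by OFED version):

```
ibstat        # local HCA status; the port should report "State: Active"
              # once a subnet manager has brought the fabric up
sminfo        # shows the LID/GUID and priority of the active subnet manager
ibswitches    # the 4036 should be listed as a switch node on the fabric
```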
 

PnoT

Active Member
Mar 1, 2015
Texas
I have the MIS5024 (the unmanaged version), and you'll like the switch for sure, other than the amount of noise it puts out. The switch is the loudest thing in my rack :D

If you run into any issues, pipe up, as a lot of us here can help out with configuration and whatnot. I've enjoyed being on IB, and I hadn't even heard of it until I started visiting these forums.
 

T_Minus

Build. Break. Fix. Repeat
Feb 15, 2015
I too am now doing the EXACT same thing with the EXACT same hardware :)
Looking forward to putting this together, and seeing how you do it too :)

Keep us updated!
 

brianmat

Member
Dec 11, 2013
Thanks for the links. I had already picked up those cables, and luckily I already have three of the Cisco cables, so it looks like I got lucky with this purchase. Hopefully this will not be too hard to get set up and configured.
 

T_Minus

Build. Break. Fix. Repeat
Feb 15, 2015
Awesome.

I'll be doing the same as you so I look forward to what you find too :)
 

brianmat

Member
Dec 11, 2013
I am sure I will be posting several follow-up questions. One of the curses of a small shop - no fleet of admins just waiting for projects.

I am seeing mostly iSCSI and RDMA posts, but I assume we can use IPoIB to run NFS with this configuration. Hopefully someone doesn't come along and tell me we have to start moving back to iSCSI again. NFS is just so much easier to deal with.
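
For what it's worth, IPoIB on the OmniOS side is just a partition datalink plus an IP address, and then NFS is shared as usual. A rough sketch, assuming the ConnectX-2 shows up as phys link ibp0, the default partition key 0xFFFF, and a hypothetical tank/vmstore dataset (all names will differ on a real box):

```
# list the IB phys links and partition keys the HCA presents
dladm show-ib

# create an IPoIB partition datalink on the default pkey
dladm create-part -l ibp0 -P 0xFFFF p0.ibp0

# plumb it and assign a static address on the storage subnet
ipadm create-if p0.ibp0
ipadm create-addr -T static -a 10.10.10.10/24 p0.ibp0/v4

# export the datastore over NFS as usual
zfs set sharenfs=on tank/vmstore
```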
 

brianmat

Member
Dec 11, 2013
Looks like our switch came with 3.6.2 already installed, and a regular null modem cable worked like a champ. We're connected now through SSH, and this week we get to test some configurations.

We also followed the install procedures at https://forums.servethehome.com/index.php?threads/connectx-2-and-esxi-6-0.5382/#post-46805 for the Mellanox cards on ESX 5.5, and that worked as posted, except the MFT tools zip file gave an error on installation. We tried two different versions of the file from the Mellanox site, and no dice. We did install the individual .vib files, though, and everything looks fine in ESX. Interestingly enough, one of our ConnectX-2 cards only displayed one NIC in ESX until we went through the setup procedures, and now we see both ports.
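
In case it helps anyone else going the .vib route: the fallback is just installing each driver package directly from the ESXi shell. A hedged sketch (the actual file names depend on which Mellanox driver bundle you downloaded; <version> is a placeholder):

```
# copy the .vib files to the host (e.g. /tmp), then install each one
esxcli software vib install -v /tmp/net-mlx4-core-<version>.vib
esxcli software vib install -v /tmp/net-ib-core-<version>.vib
esxcli software vib install -v /tmp/net-ib-ipoib-<version>.vib

# reboot, then confirm both ports of each card are visible
esxcli network nic list
```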

Are the steps at https://forums.servethehome.com/ind...aire-4036-subnet-manager-partition-conf.5244/ necessary, or is that only for configuring VLANs? Our environment is small, with only 5 ESX servers running, so we're not quite as elaborate as other configurations out there. We just want one 10GbE link for NFS and one for vMotion on each server. Heck, that is probably even overkill, and we could probably just assign both NIC ports to one vSwitch.
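
Either way, it's quick to stand up from the ESXi shell. A minimal sketch of the single-vSwitch variant, with made-up vmnic names, portgroup names, and addresses (vMotion itself still gets enabled on the vmkernel port through the vSphere client):

```
# new standard vSwitch with both ConnectX-2 ports as uplinks
esxcli network vswitch standard add -v vSwitch1
esxcli network vswitch standard uplink add -v vSwitch1 -u vmnic4
esxcli network vswitch standard uplink add -v vSwitch1 -u vmnic5

# separate portgroups, plus a vmkernel port for the NFS traffic
esxcli network vswitch standard portgroup add -v vSwitch1 -p NFS
esxcli network vswitch standard portgroup add -v vSwitch1 -p vMotion
esxcli network ip interface add -i vmk1 -p NFS
esxcli network ip interface ipv4 set -i vmk1 -t static -I 10.10.10.21 -N 255.255.255.0
```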

If our OmniOS/Napp-It installation goes without a hitch then Christmas came early.