Mellanox 40Gbps InfiniBand NIC


pc-tecky

Active Member
May 1, 2013
202
26
28
[edited]
I forgot about, and then rediscovered via my PayPal order history, that I purchased 2x 40Gbps(?) dual-port QDR(?) InfiniBand NICs/HBAs back in ~2013.

Where do I find a fairly priced cable?
-or-
Cut my losses and move on? But then what do I use instead? There's lots of talk about 10GbE with SFP+.

So, I noticed that Natex.us has a few different Cisco Catalyst switches and a few Intel X520-DA2/-DA1 10GbE network cards as well.
$150 for WS-C3560E-48TD-S Catalyst 3560E-48TD
$125 for Cisco Catalyst 3560G-24TS
$120 for Cisco WS-C3750E-24TD-S
$90 for Intel/Dell Ethernet Server Adapter X520-DA2
$85 for Intel Ethernet Server Adapter X520-DA2 w/Intel Hologram
$30 for Intel Ethernet Server Adapter X520-DA1 w/Intel Hologram

Where do I go from here, especially after the thread about counterfeit/fake hardware?
 
Last edited:

briandm81

Active Member
Aug 31, 2014
300
68
28
42
I think we need more information to really help you. How many systems are you connecting? What will you be using the network for?

InfiniBand is somewhat complex but super fast. So I think once we have more information we can help.
 

britinpdx

Active Member
Feb 8, 2013
367
184
43
Portland OR
So I think once we have more information we can help
Indeed, we are missing some information in order to move forward.

If you have Mellanox 40Gbps Infiniband cards, and you simply want to direct connect the cards, then you need a QSFP-QSFP cable of the appropriate length.

If it's 2 devices in the same rack, then look for passive cables, such as this one (3m is perhaps overkill, there are shorter ones) ..

That will allow you to physically connect the cards, but Infiniband needs a Subnet Manager to be running "somewhere" on the fabric in order to have devices communicate with each other. In the case of 2 direct connected devices, one of them needs to run a software based manager.

Not sure what OS you are running, but here's a starting point on the Mellanox Web Site.
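
For example, on a Linux box (Debian used here purely as an illustration, package and service names may vary), getting the stock OpenSM subnet manager going is roughly:

Code:
sudo apt-get install opensm infiniband-diags   # opensm = the subnet manager, infiniband-diags provides ibstat
sudo opensm -B                                 # -B runs it as a daemon on the first HCA port
ibstat                                         # check on both hosts; the port State should go to Active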

I've had windows boxes direct connected and it's really not that hard to get going. Here's a great link for Server 2012 implementation to give you a flavor for what's involved.
 

pc-tecky

Active Member
May 1, 2013
202
26
28
Well, I got them to experiment with moving larger (ISO) files and folders around fast. I only have the two cards and no cables. I have been wanting to build a proper FreeNAS box with ECC RAM and enable file deduplication. Perhaps I'd set up a high-speed connection between the (old) FreeNAS box and the (newer) FreeNAS box at first, and then move it over to the ESXi, Core i5, or Phenom II machine.

Potential candidate computers:
Windows 7 on Phenom II (6-core) 1090T w/ 16GB (4x 4GB) DDR3 12800
Windows 7/10 on Core i5 2400 12GB (2x 2GB, 2x 4GB) DDR3 mixed PC3-10600? & PC3-12800
FreeNAS (8.x?) on Core 2 Duo E6320 w/ 8GB (4x 2GB) DDR2 PC2-5300
undetermined Core 2 Duo E6320 w/ 8GB (4x 2GB) DDR2 PC2-6400
ESXi on dual E5320 1.86GHz and 8GB (4x 2GB) FB DDR2 PC2-5300F
undetermined X8DA3 w/ 2x E5504 & 24GB ECC DDR3

Potential OSs:
Windows 7/8/10
Windows Server 2003/2008/2012
Linux - any of them, might try Debian, not a huge fan of Ubuntu, but viable
 
Last edited:

britinpdx

Active Member
Feb 8, 2013
367
184
43
Portland OR
I don't think FreeNAS currently has InfiniBand support; maybe it will make it into the next release.

I use an Intel X520 10G Ethernet card in my FreeNAS box; Chelsio is also supported, but there's no support for Mellanox 10G Ethernet.
 

ttabbal

Active Member
Mar 10, 2016
747
207
43
47
FreeNAS 9.10 was recently released. It's based on FreeBSD 10, which does support Mellanox 10G cards. I believe you currently have to compile a driver, but everything you need is there; you just have to run a few commands. Hopefully they include it natively soon.
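
If you want to try it by hand, it's roughly something like this (untested on my end, and the module name varies between FreeBSD versions -- I've seen both mlxen and mlx4en):

Code:
kldload mlx4en     # Mellanox ConnectX ethernet driver; on some FreeBSD 10.x builds the module is mlxen
ifconfig           # a new mlxen0 interface should show up once the driver binds to the card
# To make it persistent, add a loader tunable (in FreeNAS use System -> Tunables rather than editing files):
# mlx4en_load="YES"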

Deduplication is one of those things that sounds like a great idea, but in practice it doesn't work all that well for most people. It also needs a TON of RAM or performance takes a very steep nosedive. I wouldn't try it with <64GB RAM. Make sure you read up on it thoroughly before trying it. Another important note: you can't just flip a switch to disable it. Once the deduplication tables are on the disks, you have to destroy the dataset to get rid of them. So you need to move all the files to a new dataset, then "zfs destroy" the old one.
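
The usual escape route looks something like this (dataset names are made up; make sure the copy completed before destroying anything):

Code:
zfs create tank/data-new                        # fresh dataset, dedup left off
rsync -a /mnt/tank/data/ /mnt/tank/data-new/    # copy everything across
zfs destroy -r tank/data                        # old dataset and its dedup table entries go away
zfs rename tank/data-new tank/data              # optional: take over the old name and mountpoint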

Compression works great with just about any CPU you would use in a server. I use it for non-media storage areas. Media files are already compressed, so it's just a waste of CPU to use it there.
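
Turning it on is a one-liner per dataset (example dataset name only; lz4 is the usual choice since it's nearly free on modern CPUs):

Code:
zfs set compression=lz4 tank/documents
zfs get compression,compressratio tank/documents   # see what it's actually saving you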
 

ttabbal

Active Member
Mar 10, 2016
747
207
43
47
Yeah, FreeNAS 10 was taking too long, so they released 9.10 with the newer OS but the same web UI, etc.
 

JustinClift

Member
Oct 5, 2014
36
15
8
@pc-tecky Out of curiosity, how'd you go with this?

Asking just because, if you haven't gotten around to doing anything with the cards yet, the nightly builds of FreeNAS 9.10 now support Mellanox cards. It's only "ethernet mode" support, but that's still at least 10GbE (or more, depending on the card).
 

pc-tecky

Active Member
May 1, 2013
202
26
28
I didn't get very far with this, but I managed to get a few of the under-$5.00-each tall brackets. However, the funds seem to run away faster than I can get and keep them.
 

JustinClift

Member
Oct 5, 2014
36
15
8
No worries at all. :)

Btw, the support for 10/40/56/100GbE Mellanox adapters is in FreeNAS 9.10-STABLE now (any ISO from 9.10-STABLE-201606270534 and onwards).

So, all you'd need is a cable... ;)
 

pc-tecky

Active Member
May 1, 2013
202
26
28
Yep, pretty much waiting (to get a cable or two). Who knows, maybe going with 10GbE twisted pair or 10GbE SFP+ is the more viable option.