I purchased an LSI HBA on eBay but was shipped a Mellanox Infiniband card instead. I'm returning it, but not before giving it a test run. I got some nice results, especially considering the price.
The Infiniband card was listed on eBay for $67 including shipping, so I didn't expect much. It's a dual-port Mellanox card at DDR speed (20 Gbit/s), and it claims to be a VPI ConnectX-2, but the part number isn't a current Mellanox one. It exists on the Mellanox site, but just barely. The part number, by the way, is MHRH2A-XSR. Does anyone know anything about this card?
I installed the card in a Dell c6100 node running Windows Server 2008 R2 using the Mellanox Ethernet driver and 100% default settings. The card shows up as a 16 Gbit Ethernet card - that's 20 Infiniband Gbits minus the overhead of the 8b/10b encoding. 40 Gbit QDR cards show up as 32 Gbit when put into Ethernet mode, so this looks normal so far. After the install, I assigned the card an IP address, plugged just one of the ports into my Infiniband network, and spun up a RAM disk using StarWind software. The RAM disk is capable of something like 7 GB/s, so it won't be a bottleneck.
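The encoding arithmetic above works out exactly; a quick sketch of the math (plain Python, just arithmetic):

```python
# 8b/10b line encoding sends every 8 data bits as 10 bits on the wire,
# so 20% of the signaling rate is encoding overhead.
def effective_gbits(signaling_gbits: float) -> float:
    return signaling_gbits * 8 / 10

print(effective_gbits(20))  # DDR: 16.0 Gbit/s, what Windows reports
print(effective_gbits(40))  # QDR: 32.0 Gbit/s in Ethernet mode
```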
I then used simple Windows file sharing to mount the RAM disk on another c6100 node, which is connected to the Infiniband network with a faster QDR card.
With the above in place, I ran IOMeter on the QDR node, across the IB/IP network, against the RAM disk on the DDR node. I tested 1 MB random reads and 4 KB random reads with one worker and a queue depth of 32. I also tested writes, but those stress StarWind, not the IB card.
Surprise: throughput was 1,920 MB/s. Maximum IOPS was 43,200. I really did not expect Windows Server 2008 R2 to do so well over a single Infiniband DDR port, especially with a card that is not a current model. Perhaps I should test dual ports on Windows Server 2012 where, hopefully, SMB3 will use both channels and saturate the PCIe 2.0 bus at 3.4 GB/s or more.
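For context, 1,920 MB/s is remarkably close to the theoretical ceiling of the link. A quick back-of-envelope check (assuming the benchmark reports decimal megabytes, as most tools do):

```python
# A 16 Gbit/s effective link tops out at 16,000 / 8 = 2,000 MB/s
# before IP/SMB protocol overhead.
link_ceiling_mb_s = 16_000 / 8   # 2000.0 MB/s
measured_mb_s = 1_920

print(measured_mb_s / link_ceiling_mb_s)  # 0.96 -> 96% of line rate
```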