I've been experimenting for quite some time, trying to figure out the correct combination of firmware and drivers for these ConnectX-2 dual-port QDR cards (MT26428) on ESXi 6.0 that gives the best results.
Currently running firmware 2.10.720 on all adapters (also tried 2.9.1200 and 2.9.1000).
Currently testing the 1.8.2.5 OFED drivers for ESXi (tried 1.8.2.4 a lot; just moved on to 1.8.2.5).
ESXi list of installed driver VIBs:
net-ib-cm 1.8.2.5-1OEM.600.0.0.2494585 MEL PartnerSupported 2016-11-05
net-ib-core 1.8.2.5-1OEM.600.0.0.2494585 MEL PartnerSupported 2016-11-05
net-ib-ipoib 1.8.2.5-1OEM.600.0.0.2494585 MEL PartnerSupported 2016-11-05
net-ib-mad 1.8.2.5-1OEM.600.0.0.2494585 MEL PartnerSupported 2016-11-05
net-ib-sa 1.8.2.5-1OEM.600.0.0.2494585 MEL PartnerSupported 2016-11-05
net-ib-umad 1.8.2.5-1OEM.600.0.0.2494585 MEL PartnerSupported 2016-11-05
net-memtrack 1.8.2.5-1OEM.600.0.0.2494585 MEL PartnerSupported 2016-11-05
net-mlx4-core 1.8.2.5-1OEM.600.0.0.2494585 MEL PartnerSupported 2016-11-05
net-mlx4-ib 1.8.2.5-1OEM.600.0.0.2494585 MEL PartnerSupported 2016-11-05
scsi-ib-srp 1.8.2.5-1OEM.600.0.0.2494585 MEL PartnerSupported 2016-11-05
I've got the links up and stable (via a 4036E switch running the Subnet Manager).
My Windows Server 2012 R2 box is working perfectly (it's the NAS).
My Windows 10 desktop is working perfectly and gets amazing performance from the NAS.
- (Yeah, I put a QDR IB card in my desktop for faster NAS access.)
My ESXi 6.0 U2 server, however, has massive performance issues.
I originally tried an NFS mount from the Windows 2012 R2 server to the ESXi server.
- I get about 3 MB/s, roughly 24 Mbit/s of bandwidth.
- vMotions, OVF deployments, etc. all time out.
The storage is basically unusable, so my entire ESXi lab is unusable. Any advice would be greatly appreciated.
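For anyone wanting to compare setups, here's a rough sketch of the checks I'd run from the ESXi shell to rule out the usual IPoIB suspects (MTU mismatch and module parameters). The vmknic name (vmk1) and the NAS IP (10.0.0.10) are placeholders for my lab, substitute your own:

```shell
# Confirm the Mellanox OFED VIBs that are actually loaded
esxcli software vib list | grep -i -e mlx -e "net-ib"

# Check the MTU on the IPoIB vmkernel interface (vmk1 is a placeholder)
esxcli network ip interface list

# Test whether large frames actually pass end-to-end without fragmentation
# (-d = don't fragment, -s = payload size; lower -s until it succeeds)
vmkping -d -s 4000 10.0.0.10

# Inspect mlx4_core module parameters (e.g. mtu_4k) in case the driver
# is capping the IPoIB MTU below what the switch/SM allows
esxcli system module parameters list -m mlx4_core
```

If the large vmkping fails while a default-size one works, an MTU mismatch between the vmknic, the IPoIB MTU advertised by the Subnet Manager, and the NAS side would explain throughput collapsing to a few MB/s.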