Mellanox ConnectX-3 FDR/EN 56G 2-Port Adapter | HP 649282-B21 (MCX354A-FCBT)


TD_Trader

Member
Feb 26, 2013
I just purchased a few of the Mellanox/HP 649282-B21 (MCX354A-FCBT) HCAs from here last week: HP 649282 B21 Infiniband FDR En 10 40GB2P 544FLR QSFP Adapter Dual Port Spare | eBay

The plan was to put one of these cards in each of the eight nodes across my two C6100 servers, plus an extra in a PowerEdge T110 (which I've been using as a file server), and possibly one in a Mac Pro as well.

I received the first card today and opened up the box. I'm currently working on getting the card installed and (hopefully) working. I plan to use this thread to log my steps and experiences, and whether or not I can get the cards working in the PowerEdge T110 and the eight Dell C6100 nodes.


The good news is that if you remove the metal bracket and the second blue plastic bracket (across the top), the cards do fit in a normal PCIe slot.


https://www.dropbox.com/s/xj3y4wjb2wcply4/2014-01-27 19.00.28.jpg

https://www.dropbox.com/s/p9uxrru9kkiznv2/2014-01-27 19.01.06.jpg

https://www.dropbox.com/s/hbnxdkm6pgsfji9/2014-01-27 19.00.54.jpg




I already stuck one into a PowerEdge T110 server earlier tonight, and I powered it on. No smoke yet.


https://www.dropbox.com/sc/y9zg6u8cmxlz52u/yVHv_A6CmU


It doesn't smell like anything is burning... (just yet...)

ESXi 5.5 booted up fine, but I didn't see the card as being detected in ESXi 5.5.


https://www.dropbox.com/s/dx2tjbr9zhwu92g/Screenshot 2014-01-27 18.49.17.png


The PowerEdge T110 doesn't have any x16 slots, but that shouldn't matter, since this appears to be an x8 card/adapter anyway.

I have the Mellanox/HP 649282-B21 (MCX354A-FCBT) HCA in slot 1 (x8 slot)
I have a ServeRAID M5014/5015 (46M0916) in slot 2 (x8 slot)
I have a MHGH29-XTC (Rev X4) dual 20G ConnectX HCA in slot 3 (x4 slot)

Any suggestions/ideas as to how I can get the Mellanox/HP 649282-B21 (MCX354A-FCBT) HCA adapter to appear in ESXi 5.5.0?

I managed to find Windows Server 2012 R2 drivers, as well as the latest firmware (dated 2014-JAN-15) on HP's website here: Drivers, Software & Firmware for HP InfiniBand/Ethernet Adapters - HP Support Center

I'm trying to get the adapter cards working with ESXi 5.5.0, but HP's website only lists Microsoft Windows 2008/2008R2/2012/2012R2, RHEL 5/6, and SUSE SLES 10/11. Any ideas as to whether it's possible to get these cards working with ESXi 5.5.0?

Has anyone managed to get one of these Mellanox ConnectX-3 FDR/EN 56G 2-port adapters (HP 649282-B21) working with ESXi 5.5.0?
 

TD_Trader

Member
Feb 26, 2013
HP's website doesn't seem to list ESXi 5.5.0 as a supported OS:
Drivers, Software & Firmware for HP InfiniBand/Ethernet Adapters - HP Support Center

I found installation instructions for Windows Server 2012 R2 here: http://h20566.www2.hp.com/portal/site/hpsc/template.PAGE/public/psi/swdDetails/?sp4ts.oid=5194975&spf_p.tpst=swdMain&spf_p.prp_swdMain=wsrp-navigationalState%3Didx%253D2%257CswItem%253DMTX_8502c97bc5714a0da86b2bc396%257CswEnvOID%253D4168%257CitemLocale%253D%257CswLang%253D%257Cmode%253D4%257Caction%253DdriverDocument

I found RHEL 5 Server instructions here: Drivers, Software & Firmware for HP InfiniBand/Ethernet Adapters - HP Support Center

I found RHEL 6 Server instructions here:
Drivers, Software & Firmware for HP InfiniBand/Ethernet Adapters - HP Support Center

I'm debating whether to just do a clean bare-metal install of Windows Server 2012 R2 on the PowerEdge T110, to see whether it recognizes the Mellanox ConnectX-3 FDR/EN 56G 2-port adapter (HP 649282-B21). If it does, then maybe install the latest CentOS and see if I can get it working there as well.

If that works, I may try to figure out a way to get it working under ESXi 5.5.0 (custom VIB?). My ultimate goal is to get all the adapters working under ESXi 5.5.0. I've never built a custom VIB, and I'm not exactly sure how I would even get it working under ESXi 5.5.0; does anyone have suggestions for an easier/better way to approach this?
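For reference, from what I've read the driver-bundle route on ESXi 5.x would look roughly like the sketch below. This is a sketch only: the bundle filename is hypothetical (I haven't tried this yet), the esxcli commands shown as comments would run on the ESXi host over SSH, and the only thing that executes here is the grep demo on a made-up sample line.

```shell
# On the ESXi host (over SSH), installing a Mellanox offline bundle would be roughly:
#   esxcli system maintenanceMode set --enable true
#   esxcli software vib install -d /vmfs/volumes/datastore1/MLNX-OFED-ESX-bundle.zip   # hypothetical filename
#   reboot
# After the reboot, confirm the driver VIB is present:
#   esxcli software vib list | grep -i mlx
#
# Simulated line of what a hit from that last grep might look like
# (name/version are illustrative, not real output):
sample="net-mlx4-en   1.9.10.0-1OEM.550.0.0.1331820   Mellanox   VMwareCertified"
echo "$sample" | grep -i mlx
```

Obviously none of that matters until the card is actually visible on the PCI bus in the first place.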
 

Biren78

Active Member
Jan 16, 2013
I would try WS2012R2. Here's why: it should work easily. If it doesn't, you can suspect a firmware/hardware issue. If it does, then building a custom VIB becomes an option.

If you don't try WS2012R2, a known-good OS, you'll be stuck wondering whether it's a VIB issue or an issue with the cards in the system.

BTW - http://www.servethehome.com/install-vmware-esxi-5x-intel-i210-intel-i350-ethernet-adapters/ You can probably follow that guide and just substitute the Mellanox drivers for the Intel ones.
 

Chuckleb

Moderator
Mar 5, 2013
If that card's a VPI card, then the drivers for the MHGH29 would probably work with it; Mellanox only has one set of drivers for all their ConnectX cards. You're either running in IB mode (MLNX-OFED-ESX-*.zip) or EN mode (mlx4_en-mlnx-16.1.2-471530.zip is the latest). If you know how to reinstall the driver over the SSH connection, you could try switching over. Interestingly, I think ESX 5.5 comes with its own EN driver that is newer than what's on the website.

Since you have a lot of C6100 nodes, what I would do is slide the card into another node, reboot, and install a fresh copy of ESX 5.5. During the install it should detect the card and add the latest EN drivers, and you'll be able to see it in the network configuration when done. That's what I just did for a test card; it was quicker and easier than playing around with the drivers, and I wanted to see if the new EN drivers were any better (not sure if they are).

But as Biren78 said, I'd at least get it working in a known system to see if the card is detected. Even a live CD and the output of lspci would tell you that the card can be seen and what the OS thinks it is.
 

lmk

Member
Dec 11, 2013
Chuckleb said:
If that card's a VPI card, then the drivers for the MHGH29 would probably work with it; Mellanox only has one set of drivers for all their ConnectX cards. You're either running in IB mode (MLNX-OFED-ESX-*.zip) or EN mode (mlx4_en-mlnx-16.1.2-471530.zip is the latest). If you know how to reinstall the driver over the SSH connection, you could try switching over. Interestingly, I think ESX 5.5 comes with its own EN driver that is newer than what's on the website.

Since you have a lot of C6100 nodes, what I would do is slide the card into another node, reboot, and install a fresh copy of ESX 5.5. During the install it should detect the card and add the latest EN drivers, and you'll be able to see it in the network configuration when done. That's what I just did for a test card; it was quicker and easier than playing around with the drivers, and I wanted to see if the new EN drivers were any better (not sure if they are).

But as Biren78 said, I'd at least get it working in a known system to see if the card is detected. Even a live CD and the output of lspci would tell you that the card can be seen and what the OS thinks it is.

Good idea on leaving the OS intact and swapping the card for quick testing back and forth. Since it's a C6100, I would suggest just swapping the whole sled and saving time on all the screws :)

I replied to the other thread also suggesting the lspci command via the ESXi install that's already there. However, Biren78's tip to just try Windows 2012 R2 is a good alternative. I have it installed, and even without manually installing the new drivers, it showed the devices under Network Adapters and System Devices in Device Manager.
 

TD_Trader

Member
Feb 26, 2013
Biren78 said:
I would try WS2012R2. Here's why: it should work easily. If it doesn't, you can suspect a firmware/hardware issue. If it does, then building a custom VIB becomes an option. If you don't try WS2012R2, a known-good OS, you'll be stuck wondering whether it's a VIB issue or an issue with the cards in the system.

BTW - http://www.servethehome.com/install-vmware-esxi-5x-intel-i210-intel-i350-ethernet-adapters/ You can probably follow that guide and just substitute the Mellanox drivers for the Intel ones.

Thanks Biren78! I'll try a WS2012R2 bare-metal install later tonight or sometime tomorrow (I'm incredibly swamped at the moment), just to see whether the card is recognized.

Chuckleb said:
If that card's a VPI card, then the drivers for the MHGH29 would probably work with it; Mellanox only has one set of drivers for all their ConnectX cards. You're either running in IB mode (MLNX-OFED-ESX-*.zip) or EN mode (mlx4_en-mlnx-16.1.2-471530.zip is the latest). If you know how to reinstall the driver over the SSH connection, you could try switching over. Interestingly, I think ESX 5.5 comes with its own EN driver that is newer than what's on the website.

@Chuckleb - Yes, it's listed as a ConnectX-3 VPI HCA. I'll probably install WS2012R2 first just to see whether it's physically recognized (and as what: Mellanox? HP?), and then work on installing the Windows 2012 R2 drivers from the HP website. After that, the next step will be to put a second card (once the others arrive) into one C6100 node, do a clean install of ESXi 5.5.0 onto a 4GB USB stick, and see whether a fresh boot/install recognizes the card and installs the proper drivers.

Chuckleb said:
Since you have a lot of C6100 nodes, what I would do is slide the card into another node, reboot, and install a fresh copy of ESX 5.5. During the install it should detect the card and add the latest EN drivers, and you'll be able to see it in the network configuration when done. That's what I just did for a test card; it was quicker and easier than playing around with the drivers, and I wanted to see if the new EN drivers were any better (not sure if they are).

Sounds like a plan. :)

Chuckleb said:
But as Biren78 said, I'd at least get it working in a known system to see if the card is detected. Even a live CD and the output of lspci would tell you that the card can be seen and what the OS thinks it is.

I'll try to get something fired up tonight (or tomorrow night at the latest) just to make sure the card works and is detected, and see what the OS thinks it is. I'll post some pictures and update the thread as I go, for anyone who reads/follows this thread in the future.

lmk said:
Good idea on leaving the OS intact and swapping the card for quick testing back and forth. Since it's a C6100, I would suggest just swapping the whole sled and saving time on all the screws :)

I replied to the other thread also suggesting the lspci command via the ESXi install that's already there. However, Biren78's tip to just try Windows 2012 R2 is a good alternative. I have it installed, and even without manually installing the new drivers, it showed the devices under Network Adapters and System Devices in Device Manager.

I'll try the lspci command first on the PowerEdge T110 (where I already have the card temporarily installed). I'll post screenshots if the card is detected (and as what). If it's not detected, I'll set up a fresh 4GB USB stick with a clean ESXi 5.5.0 install and see whether the card is detected during a fresh install of ESXi 5.5.0.

Hopefully the HCA will be detected, and hopefully the existing Mellanox drivers in ESXi 5.5.0 will be sufficient. If not, then I'll have to try and find other drivers and/or dig deeper. :(

I really wish that HP's website listed VMware ESXi as a supported OS. I'll work on it tonight, and post any new findings/results/pictures later tonight (or sometime tomorrow).
 

TD_Trader

Member
Feb 26, 2013
I SSH'd into the ESXi 5.5.0 install and ran lspci to see what shows up.

Unfortunately the Mellanox ConnectX-3 FDR/EN 56G 2-port Adapter (HP 649282-B21) card/adapter did not show up. :(

Chuckleb said:
But as Biren78 said, I'd at least get it working in a known system to see if the card is detected. Even a live CD and the output of lspci would tell you that the card can be seen and what the OS thinks it is.

I SSH'd into ESXi 5.5.0 (on the PowerEdge T110 where the card/HCA is installed), and the output of lspci did not show the adapter.

Here is a screenshot of ESXi 5.5.0 when I typed the "lspci" command:
https://www.dropbox.com/s/rcx8qcvxgvpnpu8/Screenshot%202014-01-28%2019.37.11.png

The LSI MegaRAID adapter shows up, the Mellanox MHGH29-XTC Rev X4 (dual 20Gbps DDR ConnectX-1 HCA) shows up, but the new ConnectX-3 FDR/EN 56G 2-port Adapter (HP 649282-B21) doesn't show up when I type the "lspci" command in ESXi 5.5.0.
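For anyone following along, the check itself is just lspci piped through grep. A sketch only: the sample line below is what I'd expect a detected ConnectX-3 to roughly look like (the PCI address and exact wording are illustrative), and Mellanox's PCI vendor ID 15b3 is a handy thing to grep for when the name string isn't resolved.

```shell
# On the ESXi shell (or a Linux live CD):
#   lspci | grep -i mellanox
# On a Linux live CD you can also grep the numeric IDs (Mellanox's vendor ID is 15b3):
#   lspci -nn | grep 15b3
#
# Simulated line for a detected ConnectX-3 (address/wording illustrative):
sample="0000:04:00.0 Network controller: Mellanox Technologies MT27500 Family [ConnectX-3]"
echo "$sample" | grep -i mellanox
```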

I guess the next step is to install WS2012R2 on baremetal, and see if it detects the adapter. If not, then I guess I'm out a fairly large chunk of money on some Mellanox adapters that don't seem to work in a normal PCIe slot. :(

Before I resort to installing WS2012R2, I might just create a new 4GB USB stick with a fresh ESXi 5.5.0 install on it; maybe the fresh install will detect the new Mellanox adapter. Once I get the fresh install set up/configured, I'll see if it detects the new adapter. If not, then I'll SSH into the new install and run "lspci" to see whether it's even detected.

If that doesn't work, then I guess doing a baremetal install of WS2012R2 might be the only option left (just to see if WS2012R2 can detect the adapter).

If that doesn't work, then I'm guessing the cards might just be headed right back to another eBay posting, unless I can think of something different. :(

lmk said:
I replied to the other thread also suggesting the lspci command via the ESXi install that's already there. However, Biren78's tip to just try Windows 2012 R2 is a good alternative. I have it installed, and even without manually installing the new drivers, it showed the devices under Network Adapters and System Devices in Device Manager.

I'm going to try a fresh install of ESXi 5.5.0, and maybe that will load the correct drivers (though wouldn't an lspci command show the physical hardware it sees, even if the drivers weren't loaded?). If a fresh install of ESXi 5.5.0 doesn't work, I'll try the bare-metal install of WS2012R2, and if that doesn't work... then I might have some useless cards/adapters on my hands. :(

Unless it's an HP firmware issue? Maybe there's a way to reflash the cards with a different Mellanox firmware? I'll start with doing a fresh install of ESXi 5.5.0 and then a baremetal install of WS2012R2, and we'll see what happens and take it from there.
 

parawizard

New Member
Jan 28, 2014
You bought multiple? Have you tried more than one of them? It's possible the first one is DOA.

So it's not showing up in your file server or your ESXi server?

Have you checked the BIOS to make sure those PCI Express slots are enabled?

Maybe try pulling out the other add-on cards and trying it in a few different slots.
 

lmk

Member
Dec 11, 2013
parawizard said:
You bought multiple? Have you tried more than one of them? It's possible the first one is DOA.

So it's not showing up in your file server or your ESXi server?

Have you checked the BIOS to make sure those PCI Express slots are enabled?

Maybe try pulling out the other add-on cards and trying it in a few different slots.

^ More good points, all related to the suggested testing under both ESXi AND Windows, even though lspci should show it.

There could always be something really quirky going on. IRQ conflicts, or a particular combination of hardware (other add-on cards, mobo, BIOS/UEFI, etc.) and software (firmware, drivers, OS), can make or break it :)

And as parawizard suggested, try another card
 

TD_Trader

Member
Feb 26, 2013
lmk said:
^ More good points, all related to the suggested testing under both ESXi AND Windows, even though lspci should show it.

There could always be something really quirky going on. IRQ conflicts, or a particular combination of hardware (other add-on cards, mobo, BIOS/UEFI, etc.) and software (firmware, drivers, OS), can make or break it :)

And as parawizard suggested, try another card

Yes, I agree IRQ conflicts are worth ruling out, and I've already tried pulling all the other cards out of the machine(s); there is nothing in the machine other than the Mellanox/HP card. I've spent the past two days scratching my head. I'm stumped. I've tried every single PCIe slot. I've tried both ESXi 5.5.0 and Windows Server 2012 R2. I've installed the Mellanox software (from HP's website) that is specifically for these cards.

The cards are not showing up in Device Manager.

Here: https://www.dropbox.com/s/crh6ol557d5v9l4/Screenshot%202014-01-29%2016.37.13.png

Here: https://www.dropbox.com/s/iu0p05ulby52490/Screenshot%202014-01-29%2016.37.20.png

I'm wondering if the cards ship with some very strange/specific firmware that is incompatible with Windows 2012 or ESXi 5.5.0? (But wouldn't the "MLNX_VPI_WinOF-4_60_All_win2012R2_x64" package automatically check/update the adapter firmware during installation?)

Any ideas as to what I should try next?

I've seen where people have had problems with the older generation MHGH28-XTC's running firmware older/newer than 2.9.1000 (or the 2.7.x or 2.6.x issues).

These are all brand-new cards, still sealed in their boxes, never opened. The other eleven cards showed up today.

I've tried three different brand-new cards. I'm going to spend a few more hours trying to figure it out, and if I still can't get a card to work (or even show up), I might contact HP/Mellanox and see if they have any suggestions. I highly doubt that the three cards I've tried are ALL bad. Maybe it's the firmware that comes pre-installed on the cards? Maybe it's specifically designed for HP hardware/systems? Maybe the PCIe pinouts are different?

I would have thought that these cards should/would have worked in a normal PCIe slot?

I ran the Mellanox Windows Snapshot Utility.
Here: https://www.dropbox.com/s/xx0ktfo8q3bloe7/Screenshot%202014-01-29%2016.33.32.png

The Mellanox Windows System Snapshot generated HTML document with full System details is here:
https://www.dropbox.com/s/n4rwlv3rdqqvf7j/system_snapshot_W2K12SVR_1-29-2014_11-59-10_AM.html

I have TWELVE brand new cards (from two different vendors) and I can't seem to even get ONE of them to work. :(

I'm guessing it's either a firmware issue, or maybe the cards/pinouts are different on the PCIe bus? Do these cards not work in a standard PCIe slot?

Here's the unboxing of the NINE additional brand new cards that I received today.

Here: https://www.dropbox.com/s/z8597lw6j3foi3i/2014-01-29%2016.59.57.jpg

Here: https://www.dropbox.com/s/ryi1b9d24jg5fcj/2014-01-29%2016.50.51.jpg

Here: https://www.dropbox.com/s/0pe44fqb86lmh76/2014-01-29%2016.51.12.jpg

Here: https://www.dropbox.com/s/13h13md557j88xi/2014-01-29%2016.51.44.jpg

They're brand new, never opened, and all of the HP seals are still intact. :(

I've tried all three PCIe slots; the instructions said to use an x16 mechanical slot if available (even though it's an x8 card).

HP's website says that it's a PCIe x8 adapter card:

Product dimensions (W x D x H): 2 x 7 x 8.5 in
Weight: 1 lb (0.45 kg)
Host interface: PCI Express x8
Ports: 2 x QSFP+
Form factor: plug-in card
Device supported: PC


Are there any specific commands I can run to query the adapter cards (using the Mellanox software/drivers)? I'll need to flip through the various manuals on Mellanox's website and see if I can find anything about querying/reflashing HP OEM adapters.
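From skimming the Mellanox Firmware Tools (MFT) docs, the query flow looks something like the sketch below. The device path and PSID values here are made up for illustration, the commands themselves are shown as comments, and of course none of this will work while the card isn't even visible on the PCI bus:

```shell
# With the Mellanox Firmware Tools (MFT) package installed:
#   mst start                                  # load the MST access driver
#   mst status                                 # list device paths, e.g. /dev/mst/mt4099_pci_cr0 (illustrative)
#   flint -d /dev/mst/mt4099_pci_cr0 query     # shows FW version, device ID, and PSID
#
# OEM cards carry a vendor-specific PSID, and flint normally refuses to burn an
# image with a different PSID unless you pass --allow_psid_change.
#
# Simulated 'flint query' output (all values illustrative):
sample="FW Version:      2.30.3000
Device ID:       4099
PSID:            HP_XXXXXXXXXX"
echo "$sample" | grep "PSID"
```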

Any additional thoughts/ideas?
 


OBasel

Active Member
Dec 28, 2010
How much is your time worth, and how badly do you need this done today? It seems like you're going to spend a lot of time trying to figure this out. At this point I'd sell the cards and wait out a deal on the standard 354A's.
 

TD_Trader

Member
Feb 26, 2013
parawizard said:
You bought multiple? Have you tried more than one of them? It's possible the first one is DOA.

Yep, I bought TWELVE of them. All of them are brand-new in the box, sealed, never opened.

parawizard said:
Have you tried more than one of them? It's possible the first one is DOA.

Yep, I've tried THREE of them. I highly doubt that all three are DOA. :(

parawizard said:
So it's not showing up in your file server or your ESXi server?

Nope, they don't come up in ESXi 5.5.0 or in Windows Server 2012 R2. The system doesn't even detect/recognize the card; it doesn't appear in Device Manager at all. Nothing! :(

You'd think you'd at least get the card to appear (with a yellow exclamation point) or something.

parawizard said:
Have you checked the BIOS to make sure those PCI Express slots are enabled?

Yes, other cards work just fine in the system. I have no problems with the system detecting (and using) the LSI/IBM ServeRAID M5014/M5015 adapter; I've tried it in all three slots and it works. I've also tried the Mellanox MHGH29-XTC adapter in all three PCIe slots and it works (it's detected by both Windows Server 2012 R2 and ESXi 5.5.0). So why would the HP card not be detected?

parawizard said:
Maybe try pulling out the other add-on cards and trying it in a few different slots.

Already tried that. There is absolutely NOTHING in the system other than the ONE single Mellanox MCX354A-FCBT (HP 649282-B21) adapter.

I would think that even if the drivers didn't support the card, an lspci command would at least show the adapter sitting in the PCIe slot (and connected to the system).

It seems strange that neither ESXi nor Windows Server 2012 R2 even detects the card/adapter. I've stuck it into three different servers, as well as a Mac Pro, and none of them see the card.

Any ideas?
 

TD_Trader

Member
Feb 26, 2013
OBasel said:
How much is your time worth, and how badly do you need this done today? It seems like you're going to spend a lot of time trying to figure this out. At this point I'd sell the cards and wait out a deal on the standard 354A's.

It's just for a personal home lab. I have time, and I don't mind messing around with them. I might wait another week or two, and if I can't get them working then I might just dump them. But for now, I'd at least like to try to get them running.
 

lmk

Member
Dec 11, 2013
Given all the different things you have tried, I would say it comes down to the pinouts being different.

The giant C6100 thread had Rimblock and Solid (and some others) figure out a pinout on some board (see page 75 and on - http://forums.servethehome.com/proc...xs23-ty3-2u-4-node-8-cpu-cloud-server-75.html). Maybe they could help? You would need the requisite tools.

I'll try to find the resources online that suggested HP is using a custom port that can't be used with regular PCIe cards.
 

lmk

Member
Dec 11, 2013
Found it!

Okay, so lots of documentation says HP is using PCIe 3.0 x8 with the Mellanox ConnectX-3 HCA. However, I managed to find an official HP document that explicitly describes the interface as a "custom implementation".

Networking flexibility in HP ProLiant Gen8 servers with FlexibleLOM technology (White Paper/4AA4-6708ENW.pdf)

Page 2:

FlexibleLOM technology uses a custom implementation of the PCIe 3.0 x8 interface that maintains the close-coupled nature of an embedded NIC and preserves the HP ProActive Insight architecture of HP ProLiant Gen8 servers. FlexibleLOM technology does not require additional CPU resources over standard LOM architecture and does not occupy a regular PCI slot.
 

TD_Trader

Member
Feb 26, 2013
lmk said:
Found it!

Okay, so lots of documentation says HP is using PCIe 3.0 x8 with the Mellanox ConnectX-3 HCA. However, I managed to find an official HP document that explicitly describes the interface as a "custom implementation".

Networking flexibility in HP ProLiant Gen8 servers with FlexibleLOM technology (White Paper/4AA4-6708ENW.pdf)

Page 2:

Well, that answers all of my questions (although it would have been nice if HP had published the pinouts for their "FlexibleLOM" implementation). A lot of the sales material (as well as three different sales reps) ALL said the adapter cards use a standard PCIe 3.0 x8 interface and would work in a standard PCIe slot (at x8 performance). Apparently there is quite a bit of confusion about what "FlexibleLOM" is, why it uses a standard PCIe 3.0 connector/interface, and whether a "FlexibleLOM" adapter will work in a standard PCIe 3.0 slot; there doesn't seem to be any documentation of the exact pinouts.

Based on my trials, it won't, despite my being told that these are PCIe 3.0 x8 adapters that DO work in standard PCIe 3.0 x8 slots. HP probably shouldn't use the term "PCIe 3.0 x8" here; it would have been better to stick with a custom mezzanine connector, instead of a standard PCIe 3.0 connector with off-the-wall custom pinouts, to avoid this kind of confusion.

I guess that since I can't get ANY of the brand-new Mellanox/HP cards to work, all the brand-new FDR/EN 649282-B21 cards will probably be heading to eBay. :(

I wonder how difficult it would be to create an "adapter" board that converts the HP "FlexibleLOM" pinout back to the standard PCIe 3.0 pinout?

I was hoping that maybe they could just be re-flashed with standard Mellanox firmware, but if the pinouts are wired differently, then that's probably not an option.

I guess I'll probably just have to sell off ALL of these brand-new HP cards and go back and buy some different Mellanox cards. A nice chunk of money up in smoke, but at least we know that these HP cards don't work in a standard PCIe slot. :(

[/ END OF STORY] ;(
 

TD_Trader

Member
Feb 26, 2013
Well, I removed the Mellanox MCX354A-FCBT (HP 649282-B21) adapter card(s) from the Dell PowerEdge T110, but the T110 is now continuously rebooting. I'm not exactly sure why, but apparently it didn't like the HP "standard PCIe 3.0 x8" FlexibleLOM adapter cards, which evidently are not wired the same as a standard PCIe 3.0 socket, and now the PowerEdge T110 won't even POST properly.

I shut the PowerEdge T110 down and powered it off, removed the Mellanox MCX354A-FCBT (HP 649282-B21), and turned it back on. The Dell logo appears, it shows the BIOS version and the POST information, and then suddenly the monitor shuts off (no video signal), the lights on the front of the T110 flash, and it reboots itself.

So something is now wrong with the PowerEdge T110. Apparently HP's FlexibleLOM adapter cards didn't "play nice" with the standard PCIe sockets; the T110 is now stuck in a reboot loop. I'm hoping I didn't damage/destroy the mainboard.

Here's a video: [video]https://www.dropbox.com/s/h4odlf5hl435uza/2014-01-29%2020.01.02.3gp[/video]

Download the video clip here: https://www.dropbox.com/s/h4odlf5hl435uza/2014-01-29%2020.01.02.3gp

The "Number 1" light flashes first, then the "Number 3" and "Number 4" lights come on (so 1, 3, and 4 are lit). Then the "Number 1" and "Number 4" lights shut off, leaving only "Number 3" on, and after a few seconds lights 1, 3, and 4 come back on (as the fans quiet down). Then lights 1 and 3 shut off and lights 2 and 4 turn on. After about 2 seconds, light 1 shuts off and lights 2, 3, and 4 turn on. Then the display comes on, you can see the Dell screen loading, and the system begins to POST; it shows the BIOS version and the SATA controller, and right when it gets to the point where it checks the PCIe bus, the power cuts out for a split second and the PowerEdge T110 resets itself and reboots.

This is continuous (in an endless loop) and seems to do this over and over and over again.

I guess the PowerEdge T110 is now toast. :(

It was definitely a very EXPENSIVE experiment. Several thousand dollars' worth of cards up in smoke, and a PowerEdge T110 up in smoke with them. So there must be a very different pinout configuration on those Mellanox/HP "PCIe 3.0 x8" adapter cards.

{LESSON}
These FlexibleLOM cards use what looks like a standard PCIe 3.0 connector, but they are NOT meant to be inserted into a standard PCIe 3.0 slot. The pinouts are in fact different, and inserting one CAN damage your mainboard and/or adapter cards.
{/LESSON}

[/END OF STORY] ;(
 

TD_Trader

Member
Feb 26, 2013
Yep, the PowerEdge T110 is toast.

Not sure whether it can be fixed, but after trying THREE different Mellanox/HP FlexibleLOM adapters in the PowerEdge T110, something must have happened, because when I removed the last one and powered it back on, it now just continuously reboots.

I guess the PowerEdge T110 is now headed to eBay as scrap/parts. The processor, memory, and other components are probably fine, but something definitely happened to the mainboard, and it apparently did NOT like the Mellanox/HP FlexibleLOM [MCX354A-FCBT (HP 649282-B21)] adapter cards.

So clearly these adapter cards are for HP ProLiant Gen8 servers ONLY. Don't EVER stick one in a Dell, or you'll definitely be sorry. :(

Lots of "colorful" words were coming out of my mouth tonight...
:(

It was definitely an "interesting" experiment...
[No happy ending here...]