Windows 10 failed to initialise Mellanox MCX311A-XCAT :(


Floggedhorse

New Member
Nov 18, 2019
Hi

Bought two cards:

Mellanox MCX311A-XCAT CX311A ConnectX-3 EN 10G Ethernet 10GbE SFP Network Card

for a peer-to-peer link (Ubuntu to Windows 10).

At first it didn't initialise - then it initialised - now it doesn't :(

Tried swapping the cards over - no change
Tried without screwing the card down, just in case the screw affected it - no change

Motherboard is an MSI 370-A Pro

Mellanox FlexBoot v3.4.467
iPXE (open source boot firmware) 03:00.0 5D00 PCI3.00 PnP PMM+15403000+15

MLNX FlexBoot 3.4.467 (PCI 03:00.0) starting execution...ok
MLNX FlexBoot 3.4.467 (PCI 03:00.0) initialising devices...
ConnectX3 0x28544 command 0x3b failed with status 01:
ConnectX3 0x28544 command 0x3b failed with status 01:
ConnectX3 0x28544 command 0x3b failed with status 01:
ConnectX3 0x28544 command 0x3b failed with status 01:
I can only ASSuME it's something to do with clashes between the network card, graphics card and M.2 - but that's a desperate guess

Any guidance please?
 

Falloutboy

Member
Oct 23, 2011
That is the PXE boot stuff on startup of the card. What exactly are you running on this MSI 370-A Pro, and what is the CPU? Complete hardware list, please.
 

Falloutboy

Member
Oct 23, 2011
O.K. What CPU is this, and what is the make and model of the motherboard? The first thing I want to check is that you have enough PCIe lanes to do what you are trying to do.
 

Falloutboy

Member
Oct 23, 2011
From your motherboard manual Page 15 Specifications:
1x PCIe 3.0 x16 slot (PCI_E1, supports x16 mode)
1x PCIe 3.0 x16 slot (PCI_E4, supports x4 mode)
4x PCIe 3.0 x1 slots

Your video card takes 16 PCIe lanes.
The Mellanox card is 8 lanes and you have it in a 4-lane slot - I don't know that it can work in that configuration.

I believe that your CPU is LGA1151 and there is a DisplayPort output on your backplane...

Test my hypothesis:

Remove your video card - the 1080 Ti.
Run your video off the connection in the IO area; make sure you can boot and get a picture.
Shut down after removing the Nvidia driver.
Install the Mellanox card in the first PCIe x16 slot.
Boot up and see if your problem is now fixed.

If it is - it's because your board has insufficient PCIe lanes, so you can have the Nvidia card or the Mellanox card, but not both, without investing in a motherboard upgrade.

Note: the above may not work either, depending on how many PCIe lanes the on-CPU GPU uses.

Hope this helps.
 

Floggedhorse

New Member
Nov 18, 2019
I feel like a fool :(

Now it works perfectly - I guess I need a better motherboard ... more money :(

Many thanks though, you saved me a lot of hair pulling ... especially when hair is a rare commodity these days.
 

Falloutboy

Member
Oct 23, 2011
Hey, no problem. I only know of this type of issue because I was trying to do the exact same thing on a Rampage V Edition 10 and couldn't figure out why my M.2 drive would cut out completely. It was because the slot I was using had some of its lanes tied to the M.2: even though there were more than enough lanes on paper, put the Mellanox card in, no boot; take it out, boot. I was WTF until I figured it out. Hey, now you have an excuse to upgrade :) just need to read those specifications a bit more closely, aye.

What part of the world are you in?
 

Floggedhorse

New Member
Nov 18, 2019
Upgrade :O don't tell the wife ... and here was me thinking I had an all-singing, all-dancing machine....

Mind me asking what mobo you went for?
 

Falloutboy

Member
Oct 23, 2011
I'm go big or go home, so I dropped a bomb on my gear. I actually have 6 boards in total, all Supermicro: the X10DRX is the top-end one, plus two X9DR3-LN4F's, two X8DRi's and one X7DWN+. My main PCs (i.e. not servers) are an ASUS Prime X399-A running a Threadripper 1900X, a Zenith Extreme Alpha running a 2950X, and the aforementioned Rampage V Edition 10 running a 6900K. My older machine is a Rampage Extreme Z68 that runs a 2600K, but even though it has a PLX chip it is a bit limited hardware-wise: I can run a RAID controller in it, or a Mellanox card, but not both.
 

mlc130104

New Member
Feb 26, 2017
Crap. I had tried to install a 544QSFP (actually an MCX354A-FCBT) in the first PCIe x4 slot of my ASUS M11G, but it never ran correctly, reporting Code 43. I wish I had read this post earlier.
 

thigobr

Member
Apr 29, 2020
Is this a Windows issue? I just tried an MCX312 dual SFP+ (IBM OEM) on a very old DFI nF4-Ultra (PCIe x2, v1.0) and it worked fine under Linux...
 

Freebsd1976

Active Member
Feb 23, 2018
Both the 544QSFP and 544+QSFP have an SMBus issue. I tried two 544+QSFPs on my Z230 SFF (x16) and my ASUS Z170-A (x16); both failed - the machine beeps and can't boot up. But they work normally on a C246-WU4.
 

Floggedhorse

New Member
Nov 18, 2019
Just for completion - one new MSI MPG Z390 Gaming Pro Carbon AC motherboard.

Sacrificed 8 lanes from the graphics card (a 1% reduction in performance, supposedly).

Don't forget to set up Samba access :) if accessing Linux.

Tweak your Windows hosts file.

Now what can I do with the spare 4 lanes?
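For anyone following along, those two steps look roughly like this (the share name, path, IP address and hostname below are made-up examples, not from this thread):

```
# /etc/samba/smb.conf on the Ubuntu box - a minimal extra share; restart
# Samba afterwards (sudo systemctl restart smbd)
[tank]
   path = /srv/tank
   read only = no

# C:\Windows\System32\drivers\etc\hosts on the Windows box - pin the 10GbE
# peer's address so you can reach it by name over the direct link
10.0.0.2   ubuntu10g
```

With a point-to-point link there is no DNS on that subnet, which is why the hosts-file entry matters.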
 

mlc130104

New Member
Feb 26, 2017
Just curious: is this a Windows issue or an Intel issue?
I plugged an HP 544SFP card into a Ryzen V1500B-based Synology DS1621+ and it was reported to run at an x4 link without issue.
The very same card reported Code 43 on a Windows 10 / ASUS M11G machine (x8 slot with x4 link).


 

Floggedhorse

New Member
Nov 18, 2019
It's an Intel chipset issue - I guess!... Mine worked, then didn't, then did ... try the experiment and remove the graphics card.
 

pqrst

New Member
Dec 27, 2020
The Mellanox ConnectX-3 MCX311A-XCAT with a single SFP+ port is an x4 card according to their documentation (ConnectX-3.book (mellanox.com), Appendix A, page 34).

I am using an ASUS ROG H370-F board, which has one x16 slot tied to the CPU (occupied by a GPU) and one x16 (x4 mode) slot tied to the chipset, which the Mellanox is sitting in.

On Windows, the Mellanox card alternates between working and reporting Code 43 across reboots. It will work for several reboots in a row, then report Code 43 for several reboots before working again.

On Linux, it will ALWAYS work.

On my Unraid machine the same card, in the same kind of secondary x4 slot connected to the chipset (as the x16/x8 slot is taken up by an HBA card), ALWAYS works.

This shouldn't be a PCIe lane issue, since it's an x4 card and I'm getting x4 lanes from the chipset on the slot it's in. And again, it always works on Linux.

Makes me think it's a Windows issue? I dunno. It's quite annoying when you have to reboot and the card doesn't work.

Any other thoughts?
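One way to double-check the "I'm getting x4 lanes" part on the Linux side: lspci reports both what the card is capable of (LnkCap) and what was actually negotiated at boot (LnkSta). A quick sketch - the bus address below is just an example, so find yours first:

```shell
# Find the Mellanox card's bus address, then compare link capability vs. status.
# 03:00.0 is an example address; substitute whatever lspci reports on your system.
ADDR="03:00.0"
lspci 2>/dev/null | grep -i mellanox
lspci -vv -s "$ADDR" 2>/dev/null | grep -E 'LnkCap:|LnkSta:' \
  || echo "no device at $ADDR (run as root for full -vv output)"
# A healthy chipset x4 slot on a Gen3 board should show something like:
#   LnkSta: Speed 8GT/s, Width x4
```

If LnkSta shows a narrower width or lower speed than LnkCap, the slot (or a flaky negotiation at boot) is the place to look.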
 

xaster

New Member
Feb 21, 2021
...has anybody found a solution for the Mellanox ConnectX-3 cards with Windows?
I own two HP 649281-B21 / Mellanox MCX354A-FCBT (PCIe x8) cards and run them on two Z490 mainboards.
I have support for PCIe x4 via the PCH chipset. Regular PCIe is backward compatible, so a PCIe 3.0 x8 card should work in an x4 slot with reduced bandwidth.
One system runs Server 2019 Essentials and the other one is Win10 Pro.
Both machines work so far, but I sometimes get the same problems described here, with Code 43 in Device Manager!
After a reboot the problem is gone. It seems it can be a Windows or Intel issue initialising the hardware at bootup :oops:
I am looking for driver, firmware or BIOS tweaks to get this solved.

 

MichalPL

Active Member
Feb 10, 2019
I can confirm that all the Mellanox cards I have (ConnectX-2, 3 and 4) work with PCIe 2.0 and 3.0 at link widths x1, x4, x8 and x16 (x16 needed for 100GbE only). Currently I have a 10GbE ConnectX-2 in my desktop connected via an x4 link, and it works at full speed (of a single SFP+; almost full on dual).

They work fine on Linux or Windows with limited PCIe, sometimes slower - for example, the max speed when using a 40GbE card on x4 lanes of PCIe 3.0 is about 3.7 GBytes/s (instead of 4.1 GBytes/s when using x8).

Here we are talking about 10GbE, so an x4 link is perfect even with PCIe 2.0 or PCIe 3.0, and x1 PCIe 3.0 is almost OK (~85% of the max speed).
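For anyone wanting to sanity-check those figures, here is a back-of-the-envelope sketch of the theoretical PCIe ceilings (raw transfer rate times encoding efficiency; real-world numbers like the 3.7 GBytes/s above land a bit lower due to protocol overhead):

```shell
# Theoretical PCIe throughput ceilings in GB/s.
# PCIe 3.0: 8 GT/s per lane with 128b/130b encoding.
# PCIe 2.0: 5 GT/s per lane with 8b/10b encoding.
gen3_x1=$(awk 'BEGIN { printf "%.2f", 1 * 8 * 128/130 / 8 }')
gen3_x4=$(awk 'BEGIN { printf "%.2f", 4 * 8 * 128/130 / 8 }')
gen2_x4=$(awk 'BEGIN { printf "%.2f", 4 * 5 * 8/10  / 8 }')
echo "PCIe 3.0 x1: ${gen3_x1} GB/s"   # ~0.98 GB/s, just under the 1.25 GB/s 10GbE needs
echo "PCIe 3.0 x4: ${gen3_x4} GB/s"   # ~3.94 GB/s, plenty for 10GbE
echo "PCIe 2.0 x4: ${gen2_x4} GB/s"   # ~2.00 GB/s, still fine for a single 10GbE port
```

That ~0.98 GB/s against the ~1.25 GB/s a saturated 10GbE link needs is where the "~85% of the max speed on x1" figure comes from.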


But: sometimes these cards are set to IB mode, and you need to switch them to ETH or re-flash them using the Mellanox tools from their webpage.
Also, ConnectX-3 cards (only?) are very sensitive to damage - please check that you still have all the capacitors on the PCB.
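On the IB-vs-ETH point: with the Mellanox Firmware Tools (MFT) installed, switching ConnectX-3 ports to Ethernet is a short mlxconfig invocation. A sketch, not a definitive procedure - the /dev/mst device name below is an example (run `mst status` to list yours), and you should check the MFT docs for your firmware first:

```shell
# Switch both ConnectX-3 ports from auto/InfiniBand to Ethernet using Mellanox MFT.
# /dev/mst/mt4099_pciconf0 is an example device name; `mst status` shows the real one.
if command -v mlxconfig >/dev/null 2>&1; then
    sudo mst start
    sudo mlxconfig -d /dev/mst/mt4099_pciconf0 set LINK_TYPE_P1=2 LINK_TYPE_P2=2   # 2 = ETH
    echo "Reboot for the new port type to take effect."
else
    echo "Mellanox MFT not installed; get it from the NVIDIA/Mellanox website."
fi
```

After the reboot the ports should enumerate as Ethernet NICs rather than IB adapters in both Windows Device Manager and `ip link`.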


 