Windows 10, link down: HPE InfiniBand EDR/Ethernet 840QSFP28 Adapter


lihp

OK, I simply have no clue what is going on - please help:
  • 2x HPE InfiniBand EDR/Ethernet 840QSFP28 Adapter
  • 1x 100G QSFP28 Passive Direct Attach Copper Twinax Cable (2m)
  • One very old 2012 Workstation (PCIe 3.0) - Windows 10
  • New AMD server - RHEL 8.4
From the logs, both cards look fine in the RHEL server; I updated the firmware to the newest available HPE firmware, 12.28.1002 (the newest I could find on the HPE website).
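(For reference, this is roughly how the firmware version can be double-checked on the RHEL side - just a sketch, assuming the mstflint package from the RHEL repos and that the card sits at PCI address 41:00.0; replace that with whatever lspci actually shows for the card:

lspci | grep -i mellanox
mstflint -d 41:00.0 query

The query prints the running firmware version and the board PSID, which is how I'd confirm the 12.28.1002 image really took.)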

Yet it is impossible to establish a link. When both cards were set to IB mode, a yellow glow could be seen at the back of the card; in Ethernet mode I now see nothing on the card at all. Under Windows 10 it basically behaves as if the cable were not attached: no link.

"mlx5cmd -stat" results in:
...
port_state=PORT_DOWN
port_phys_state=DISABLED
---
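The DISABLED phys state makes me wonder whether both ends are even set to the same port protocol (IB vs. Ethernet). A rough sketch of how that can be checked/changed on the Windows side with mlxconfig from the Mellanox MFT tools - the device name mt4115_pciconf0 is just an assumption, "mst status" lists the real one:

mlxconfig -d mt4115_pciconf0 query
mlxconfig -d mt4115_pciconf0 set LINK_TYPE_P1=2

(LINK_TYPE_P1: 1 = InfiniBand, 2 = Ethernet; a change only takes effect after a reboot.)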

"mlx5cmd -dbg -pddrinfo"
...
Troubleshoot Info
Status Opcode : 39
Group Opcode : 0
Message : Negotiation failure
...

So obviously the link is not available.
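Since "Negotiation failure" points at the physical link negotiation, it probably also makes sense to look at what the RHEL end sees. A minimal sketch, assuming the port shows up there as ens1f0 (substitute the real name from "ip link"):

ip -br link show ens1f0
ethtool ens1f0
ethtool -m ens1f0

(ethtool shows whether a link is detected and at what speed; ethtool -m dumps the cable/module EEPROM, which at least confirms the DAC is being recognized.)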

More:
  • In Windows Device Manager the card looks OK at first glance, except for link speed, which is empty - see the attached log cx455a.
  • Despite showing as OK, the driver seems to be running wild... (Mellanox WinOF-2)
  [screenshot attached: 1623682515062.png]
  • Do I need different drivers since it's an HPE-branded card?
  • ...
I have zero spare parts, so I can't troubleshoot by swapping anything. I also lack experience with these cards, which makes it worse... The cable should be OK: brand new, from FS,... The cards were tested and show OK under RHEL. So it seems I am missing something easy, maybe even stupidly easy?
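To rule out the stupidly-easy case of the two ends simply being configured for different protocols, the server side can be queried the same way - a sketch, again assuming mstflint and the PCI address from lspci:

mstconfig -d 41:00.0 query | grep LINK_TYPE

That shows LINK_TYPE_P1 for the RHEL card; it has to match whatever the Windows card is set to.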

Any idea, hint,...?
 


lihp

ADD: on my workstation I don't see the card's firmware/boot banner come up, but on my server I do.

Any ideas, or do I simply have to wait for my new workstation and assume my current one is too old?
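One way to tell whether the card's firmware is alive in the old workstation even without a boot banner would be to query it directly - a sketch, assuming the Mellanox MFT tools for Windows are installed and that the card shows up as mt4115_pciconf0 ("mst status" gives the real name):

mst status
flint -d mt4115_pciconf0 query

If the query returns the firmware version, the card is responding over PCIe, and the missing banner may just mean the option ROM / FlexBoot is disabled or not loaded, not that the slot is too old.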