Gigabyte MZ32-AR0 + 2x U.2 = No drives detected, but if 1 is plugged in then it works?


hbhbhb

New Member
May 7, 2024
On all three of my Gigabyte MZ32-AR0 Rev 1.0 boards, a single U.2 drive works when plugged in, but if I plug in a second, neither shows up. Really weird. I'm checking via the BIOS.

Using these SlimSAS 4x to U.2 NVMe (SFF-8654 to SFF-8639) cables (https://www.amazon.com/dp/B099DYK9CW), plus a Molex-to-single-SATA adapter for power.

Does anything come to mind that could be causing this? The board has 4x SlimSAS SFF-8654 ports. It's not a cable issue, since either drive shows up when connected individually. Each drive is powered from a separate strand off my PSU.
 

name stolen

Active Member
Feb 20, 2018
If you're not booting from them and can still get into the OS, do they show up there? Outside of a weird power or cable edge case, I'd say it's just a BIOS/firmware thing.
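A quick way to act on this suggestion, assuming a Linux OS (a hypothetical diagnostic sketch, not commands from the thread): check whether the drives are visible on the PCIe bus at all, or only missing from the BIOS list.

```shell
# Count NVMe controllers the kernel enumerated. The PCI class string
# "Non-Volatile memory controller" identifies NVMe devices in lspci output.
command -v lspci >/dev/null && lspci -nn | grep -c 'Non-Volatile memory controller'

# List NVMe block devices (requires the nvme-cli package).
command -v nvme >/dev/null && nvme list

# Look for link-training or probe errors in the kernel log.
dmesg 2>/dev/null | grep -i nvme

# Exit cleanly even if nothing matched.
true
```

If the controllers appear in `lspci` but not in `nvme list`, that points at the driver/firmware; if they're absent from `lspci` too, the link itself isn't coming up.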
 

hbhbhb

New Member
May 7, 2024
name stolen said:
"If you're not booting from them and can still get into the OS, do they show up there? Outside of a weird power or cable edge case, I'd say it's just a BIOS/firmware thing."
Nope, they don't show up in the OS either. I updated the BIOS to the latest version (R28_F04, from the MZ32-AR0 (rev. 1.x) | Server Motherboard page on GIGABYTE U.S.A.'s site) and got the same result. Very strange.

Could be these cables are unreliable, but a single drive always works when plugged in on its own. That said, it does have to be in the U2_0 port.
 

Tech Junky

Active Member
Oct 26, 2023
First thing that comes to mind is not enough PCIe lanes to support 8 lanes with 2 drives, while you do have enough for a single 4-lane drive.

Looking at the board info / CPU options, though, this shouldn't be an issue.

Things get funky when you see "shared with xxx" noted next to the slots on the spec sheet:
Slot_7: PCIe x16 (Gen4 x16) slot, shared with 4 x NVMe

So, do you have anything in slot 7?

I went with a dumb M.2-to-OCuLink cable on my setup and it's been flawless for over a year now. Since you want to run dual drives, there are apparently PCIe cards with a dual OCuLink output; with an OCuLink split cable you'd be all in for ~$65. There's also an x16 card with two OCuLink ports on it that split to four drives total.

I also find it odd that the slots are Gen4 while the M.2 sockets are Gen3, but then again, companies tend to do weird stuff when allocating lanes. The only one I found that mostly didn't was ASRock.

10 x SlimSAS connectors -- that's what I count as well, but under storage it shows:

4 x SlimSAS connectors for 4 x Gen3 NVMe
2 x SlimSAS connectors for 8 x SATA 6Gb/s

So my guess is that four of the connectors can optionally be used for something else? I'm not going to RTFM, but maybe you should, to figure it out.
 

hbhbhb

New Member
May 7, 2024
Tech Junky said:
"First thing that comes to mind is not enough PCIe lanes to support 8 lanes with 2 drives, while you do have enough for a single 4-lane drive.

Looking at the board info / CPU options, though, this shouldn't be an issue.

Things get funky when you see 'shared with xxx' noted next to the slots on the spec sheet:
Slot_7: PCIe x16 (Gen4 x16) slot, shared with 4 x NVMe"
The manual doesn't mention this specific aspect (in these specific terms), but I *think* I solved this by setting bifurcation on Slot 7 to x4x4x4x4.

Also, the DiLink cables may not be fully seated even if a "click" is heard; they need an extra push to ensure a solid connection.
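To sanity-check a fix like this after the fact (a hypothetical sketch, assuming Linux; the sysfs paths are standard but not mentioned in the thread): once the slot is bifurcated to x4x4x4x4, each drive should show up as its own endpoint with an x4 link, rather than the slot presenting a single x16 device.

```shell
# Show the PCIe topology; with bifurcation active, the slot's root port
# fans out into multiple downstream NVMe endpoints.
command -v lspci >/dev/null && lspci -tv | grep -i -A1 nvme

# Confirm each NVMe controller negotiated an x4 link width via sysfs.
for dev in /sys/class/nvme/nvme*/device; do
  [ -e "$dev/current_link_width" ] && echo "$dev: x$(cat "$dev/current_link_width")"
done

# Exit cleanly even if no NVMe devices are present.
true
```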
 

Tech Junky

Active Member
Oct 26, 2023
hbhbhb said:
"bifurcation on Slot 7 to x4x4x4x4"
That always helps.

I've had clicking and force issues as well with my AMD board. The slots seem to be tighter and need more force.

Cables for NVMe drives are finicky and have distance issues over 50 cm.