DL380 G6 with multiple LSI controllers POSTing but not booting from a drive.


RimBlock

Active Member
Hi,

I have a DL380 G6 that I have been using as a SAN for my SSDs, and it has been running fine for a month or so.

I have now attached my Dell MD1000 disk shelf to a second LSI controller, and the server will no longer boot to the OS. It just goes to 'sleep mode' when it gets to the actual OS boot stage.

Current setup:
  • The internal P410 is disabled in the BIOS.
  • An LSI 9211-8i connects to the 8x 2.5" front drive bays.
  • The OS boots from a WD Black 500GB 2.5" drive in the first front bay of the server.
  • Bays 1-6 (assuming the first bay is 0) are filled with SSDs.
  • Bay 7 is empty.
  • The MD1000 has 9x 15k SAS drives (146GB) and 6x 2TB Ent SATA drives.
  • The MD1000 is connected to an LSI 9200-8e controller.
  • All drives seem to be detected on server POST.
  • The LSI 9211-8i is in PCIe slot 2 (x8 in x16 slot).
  • The LSI 9200-8e is in PCIe slot 1 (x8 in x16 slot).
  • A Mellanox ConnectX-2 card is in another slot.
  • BIOS controller priority has PCIe 2 as order 1, PCIe 1 as order 2.
  • After the POST screens ('Sea of Sensors' etc.), the monitor goes into power save and the server just seems to sit there.
  • The OS is Solaris 11.1.


Any ideas?

Thanks
RB
 

dba

Moderator
I have had quite a few odd problems when using multiple (up to 14) LSI HBAs on a server. I've run out of BIOS memory, had weird conflicts with other cards, had an old version of the BIOS on one card not work with a newer version on another card, and I have had LSI cards halt because they found foreign disks or a server configuration they didn't like. I've never had a "sleep" as the result, but perhaps that's a symptom and not a cause.

The no-brainer steps are these:
1) Update the firmware and BIOS on all of your LSI cards to the same level. Verify with sas2flash -listall (see the sketch below).
2) Enable boot support only on the one card you'll actually be booting from, or on none of them.
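
For reference, here is a minimal sketch of that check-and-update flow. It assumes the Linux build of LSI's sas2flash utility, and the file names (2118it.bin, mptsas2.rom) are just the 9211-8i IT-mode package examples; substitute the images that match each of your cards:

    sas2flash -listall                                  # firmware and BIOS versions for every LSI SAS2 card
    sas2flash -c 0 -list                                # detailed info for controller 0
    sas2flash -o -c 0 -f 2118it.bin -b mptsas2.rom      # advanced mode: flash firmware and boot BIOS on controller 0

Run -listall again afterwards and check that every card reports the same firmware and x86-BIOS levels.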

If that doesn't fix it, you could probably still slog through all of the possibilities and find a fix, but there may be an easier way: change the mode to "Disabled" in the LSI BIOS (option ROM) on one or more of the cards. Assuming you are in IT mode, you don't really need the BIOS, and it can't get in the way if it doesn't exist. Since you know that your old card was working OK, just disable the new card. You won't be able to boot from that card, or create arrays using the now-disabled BIOS, but it will show up properly and work just fine in the OS. In my experience, this almost always fixes LSI boot problems.
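
If you would rather do that from the flasher than from the Ctrl-C config utility, a card flashed with firmware only and no -b option ends up with no option ROM at all, which achieves the same thing. A sketch under the same assumptions as above; the firmware file name is a placeholder for whatever IT image matches the card:

    sas2flash -o -c 1 -e 6                      # erase the card's flash (do NOT reboot before re-flashing)
    sas2flash -o -c 1 -f <it_firmware.bin>      # firmware only, no -b: no boot BIOS is written back

Either way, the card contributes nothing at POST but remains fully visible to the OS driver.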
 

RimBlock

Active Member
I finally put my LSI 9202-16e in the server instead of the LSI 9200-8e, and it boots fine. The server also booted fine with the LSI 9200-8e when no drives were attached.

I suspect it is a firmware issue, so I will get around to updating it at some point.

Thanks
RB
 

cptbjorn

Member
Somewhat off topic, but what's the noise like on this host with the LSI cards installed? I'm thinking about getting one, but I'd put a crossflashed M1015 in it, and I'm curious whether that would cause it to crank up the fans.