IBM M1015 flashed to IR mode rebuilds RAID after each host reboot.


snerran

New Member
Oct 31, 2013
Stockholm, Sweden
So I've got some issues with my new IBM M1015 controller...

It is flashed to IR mode. That operation was successful without any hiccups (although I had to do it via UEFI instead of DOS).

The controller has three 3TB WD Reds connected to it. The card is inserted into my spanking new Supermicro X10SLM+-F running ESXi 5.5, and the M1015 is passed through to my virtual file server running Windows Server 2012 R2.

Two of the disks are configured as a RAID1 logical volume. This was done via LSI's MegaRAID Storage Manager, which is installed on the virtual file server. It took about two days for the disks to initialize, and then it did a rebuild. After that I migrated all my data to it. When all this was finally completed, I rebooted the ESXi host to check that everything works as it should. Unfortunately the card started yet another rebuild (!?). In MegaRAID Storage Manager one of the disks is listed as "degraded" and the other is listed as "rebuild". The third disk is fine and is visible in the file explorer on the virtual file server.

I know the disks are working fine, no data corruption. So the question is why it rebuilds my RAID1 array after every host reboot. The disks are now offline and I can't access my data until the rebuild finishes... Has anyone else had this problem?

EDIT:
The two HDDs were previously used in another RAID1 array on my Asus mobo (old workstation). That shouldn't affect the new RAID1 array, since creating it wipes the data on the disks, or does it? Do I have to zero the disks first? I'd rather not, as it takes ages on two 3TB disks...
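If it comes to that, I assume I wouldn't need a full zero anyway: from what I've read, the leftover metadata from onboard RAID sits in the first and last few MB of each disk. Something like this rough Python sketch (untested; assumes a Linux live environment, and /dev/sdb is just a placeholder, so triple-check the device because this is destructive) should clear just those regions. On Windows, diskpart's "clean" command does something similar.

Code:
import os

# Rough sketch: zero only the first and last 16 MiB of a disk, where RAID
# metadata normally lives (partition table at the start, Intel RST / DDF
# style metadata near the end), instead of zeroing the whole 3 TB drive.
# DESTRUCTIVE: triple-check the device name before running this as root.
DEVICE = "/dev/sdb"   # placeholder, pick the right disk on your system
CHUNK = 1024 * 1024   # write in 1 MiB chunks
WIPE = 16 * CHUNK     # clear 16 MiB at each end of the disk

zeros = b"\x00" * CHUNK
with open(DEVICE, "r+b", buffering=0) as disk:
    disk.seek(0)                      # wipe the start of the disk
    for _ in range(WIPE // CHUNK):
        disk.write(zeros)
    size = disk.seek(0, os.SEEK_END)  # block devices report their size here
    disk.seek(size - WIPE)            # wipe the end of the disk
    for _ in range(WIPE // CHUNK):
        disk.write(zeros)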
 

nodlo

New Member
Nov 6, 2014
Hi snerran,
Sorry to be a bit late replying on this one, but I only just saw your post (after searching the web for a couple of months!). I have exactly the same issue and it's driving me nuts.
I have an Asus Z87 mobo, two LSI 9211-8i cards flashed to IR mode, and four 3TB WD Red disks... and it doesn't matter which card, or which combination of two disks (including two of the 'old' ones) I use to build my RAID1 volume: I get this exact problem, every time.
It doesn't matter whether I use MSM under Windows 8.1 or the BIOS utility to build the array. And I'm using the disks as storage only, NOT booting from any of them.
PLEASE let me know if you eventually found a solution. I'm on the latest LSI firmware (P20).
The weird thing is that this setup USED to work OK until a few months ago, when I had to swap one of my WD disks under the manufacturer's warranty, and no changes whatsoever have been made since then.
 

nodlo

New Member
Nov 6, 2014
Thanks for the quick response, gea... but I originally had the problem under P19!
I guess I should try P18? Do you have any evidence to support this suggestion, or is it a gut feeling?
I'm not being argumentative, I'd just like to know before I try rolling back.
 

gea

Well-Known Member
Dec 31, 2010
DE
If the problem appeared right after a firmware update, the update could be the cause.
But since you already had the problem under P19, that is not probable.

I would look for a disk/cabling/controller/RAM problem.
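
If you want to rule out the disks first, smartmontools can often read SMART data through LSI controllers with its megaraid passthrough. A small Python sketch of the idea (assumptions on my part: a Linux host with smartctl installed, the volume visible as /dev/sda, and physical drive device IDs 0-2 as shown in MSM; whether the passthrough works on an IR-flashed card you would have to test):

Code:
import subprocess

# Sketch: print SMART health (-H) and attributes (-A) for each physical
# disk behind an LSI controller via smartmontools' megaraid passthrough.
# Assumptions: Linux, smartctl installed, the logical volume appears as
# /dev/sda, and the member drives have device IDs 0..2 (MSM shows the
# real IDs for a given setup).
for dev_id in range(3):
    print(f"=== physical drive {dev_id} ===")
    subprocess.run(
        ["smartctl", "-H", "-A", "-d", f"megaraid,{dev_id}", "/dev/sda"],
        check=False,  # smartctl uses nonzero exit codes for warnings
    )

Rising reallocated or pending sector counts on one drive would point at that disk; if all drives look clean, test cabling and RAM next.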