Hey all,
I have been using an HP H220 controller flashed to IT mode in one of my HP servers running Ubuntu 16.04 LTS for some time.
One port had a single SAS cable going to the backplane (which had a built-in SAS expander), and the other was connected to a SATA breakout cable with my boot drives on it (two Samsung SSDs mirrored in ZFS).
I recently retired this server and decided to move the H220 controller over to my other server. As soon as I installed and connected it, ZFS freaked out, found lots of errors on the boot SSDs (same type of SSD mirror boot config on this one), and wanted to resilver.
I panicked, shut down, and then:
- Checked all cable connections; no issues found, and it did not solve the problem.
- Replaced the SAS breakout cable in case it was damaged; that did not solve the problem either.
Then I moved the SATA drives to another known-good machine to check and repair the ZFS mirror. Luckily, no data was lost.
After this, I gave up and plugged the drives into the on-board SATA ports on the server, where they work just fine without any errors reported.
The old server being retired, where the H220 worked just fine, was an HP DL180 G6 running Ubuntu 16.04 LTS.
The new server the H220 went into, where I had the corruption issues, is a custom build around a Supermicro X9DRI-F running Proxmox VE 6 (Debian-based).
Anyone have any ideas why it would work perfectly in one but cause corruption issues in another? If possible, I'd like to keep using this card, since it is PCIe Gen 3 and offers more bandwidth than the older SAS2008 controllers I am using right now, but I don't trust it after this last issue.
Could this be a driver/firmware "phase" mismatch?
How do I check the driver and firmware phase on each machine to make sure they match?
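For what it's worth, here's a rough sketch of what I was planning to run on both machines to compare versions. I'm assuming the H220 (SAS2308-based) shows up under the mpt2sas/mpt3sas kernel driver and that LSI's sas2flash utility may or may not be installed; paths and tool names could differ on other setups:

```shell
#!/bin/sh
# Sketch: compare driver and firmware versions for an LSI/Broadcom
# SAS2308-based card like the H220 in IT mode. Nothing here writes to
# the controller; it only reads version info.

# Kernel driver version (newer kernels fold mpt2sas into mpt3sas)
for drv in mpt3sas mpt2sas; do
    ver=$(modinfo -F version "$drv" 2>/dev/null) && echo "$drv driver version: $ver"
done

# Firmware version as reported by the driver through sysfs
for host in /sys/class/scsi_host/host*; do
    [ -r "$host/version_fw" ] && echo "$(basename "$host") firmware: $(cat "$host/version_fw")"
done

# Full firmware/BIOS details via LSI's flash utility, if it happens to be installed
command -v sas2flash >/dev/null 2>&1 && sas2flash -listall || echo "sas2flash not installed; skipping"
```

The idea would be to run this on the old DL180 and the new Supermicro box and diff the output, since a phase mismatch should show up as different major driver/firmware versions between the two.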
Much obliged,
Matt