Supermicro 4U 24 bay 846 chassis SAS2 with rails/motherboard - $300


BlueFox

Legendary Member Spam Hunter Extraordinaire
Oct 26, 2015
Seeing as the SAS2 backplane alone tends to go for $300, I'd consider this to be pretty decent. Rails normally fetch near $100 too. Board is a little dated, but appears to be the following: Super Micro Computer, Inc. - Products | Motherboards | Xeon Boards | X8DAH+-F

The only catch is no drive trays, but with a little effort you should be able to get them for $1-2 each on eBay. Correction: drive trays are included! 10 available currently, and they ship to Canada. My guess is they'll probably take $250?

SuperMicro Storage Array JBOD 4U SAS2 24x Caddies 2x PSU Qty Available | eBay
 

markarr

Active Member
Oct 31, 2013
It looks like the trays are included in the description.

SuperMicro Storage Array JBOD 4U SAS2 24x Caddies 2x PSU Qty Available

Motherboard X8

Backplane: SuperMicro BPN-SAS2-846EL1

24 x Caddies 3.5"

2 x Power Supply

Complete Rails Kit

Not included: CPU, RAM, hard drives, RAID controller

30-day warranty
 

cyantist

New Member
Apr 5, 2017
The listing specifically states 24x caddies are included. It's also in the title. So an even better deal.
 

BlueFox

Legendary Member Spam Hunter Extraordinaire
Oct 26, 2015
Guess I overlooked that! Wonder why they didn't put them in the photos.
 

poto

Active Member
May 18, 2013
Nice find - the power supplies aren't the worst either (80 Plus Gold). Best offer at 10% below asking was accepted.
 

BlueFox

Legendary Member Spam Hunter Extraordinaire
Oct 26, 2015
Looks like two of you got lucky as the seller relisted these at $380 OBO. Still isn't a bad price though.
 

Fleat

New Member
Feb 6, 2016
Lowest he seems to want to go now is $330 plus shipping. Claims the listing and accepting of lower offers yesterday was an error.

"That is mistake of other guys. That's why we have to end and relist."

What do you guys think about $415 shipped considering that it has the rails and caddies?
 

BLinux

cat lover server enthusiast
Jul 7, 2016
Fleat said:
Lowest he seems to want to go now is $330 plus shipping. Claims the listing and accepting of lower offers yesterday was an error.

"That is mistake of other guys. That's why we have to end and relist."

What do you guys think about $415 shipped considering that it has the rails and caddies?
It kind of depends if that motherboard is useful and worth something to you. I've gotten 846A backplane systems w/ rails + caddies + PWS-1K21P-1R PSUs for $453 total including shipping, no motherboard.

This seller's shipping is high for me, even though I'm in California. With the $330 offer, it would end up being $415 + CA tax which would end up being about $450 for me. For that price, I would rather get another one with a 846A backplane.
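The landed-cost arithmetic above can be sketched out. Note the shipping figure below is back-solved from the quoted totals, and the CA sales tax rate is an assumption; the poster only says "about $450":

```python
# Rough landed cost for the $330 offer discussed above.
# Shipping is back-solved from "$330 plus shipping" => ~$415;
# the CA sales tax rate of ~8.25% is an assumption (varies by locality).
offer = 330.00
shipping = 415.00 - offer        # ~$85, implied by the post
subtotal = offer + shipping
ca_tax_rate = 0.0825             # assumed rate
total = subtotal * (1 + ca_tax_rate)
print(f"${total:.2f}")           # about $450, matching the post
```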
 

Fleat

New Member
Feb 6, 2016
BLinux said:
It kind of depends if that motherboard is useful and worth something to you. I've gotten 846A backplane systems w/ rails + caddies + PWS-1K21P-1R PSUs for $453 total including shipping, no motherboard.

This seller's shipping is high for me, even though I'm in California. With the $330 offer, it would end up being $415 + CA tax which would end up being about $450 for me. For that price, I would rather get another one with a 846A backplane.
Thanks for the reply. I would think that the single SAS connector on the BPN-SAS2-846EL1 would be more desirable. What is your reason for going with the 846A instead?
 

BLinux

cat lover server enthusiast
Jul 7, 2016
Fleat said:
Thanks for the reply. I would think that the single SAS connector on the BPN-SAS2-846EL1 would be more desirable. What is your reason for going with the 846A instead?
It might come down to personal preference. I prefer the 846A for performance reasons (a direct SAS lane to each drive rather than a SAS expander) and for compatibility across SAS-1, SAS-2, and SAS-3, which helps future-proof my chassis. I'm hoping that when the time comes I can simply upgrade the HBA/motherboard/CPU and keep the same chassis. The drawbacks are obviously more SFF-8087 connectors and the added cost of more controller ports and cables, but it seems worth it to me.

In my case, I run VMs directly on the system, not just serve files across the network, so I can use the extra I/O capacity. For those planning a pure NAS where I/O will always be limited by the network, the benefits I see may not matter, and the simplicity and lower cost might be more attractive. Either way, at its current price the system in this thread isn't cheaper than an 846A system, so there's no cost benefit in choosing it.
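A rough way to see the bandwidth trade-off described above, using nominal SAS-2 line rates (these are link speeds, not measured throughput, and the worst-case sharing assumes all 24 drives are busy at once):

```python
# Back-of-the-envelope comparison of the two 846 backplane options
# discussed above, using nominal SAS-2 line rates.

SAS2_LANE_GBPS = 6.0   # nominal 6 Gb/s per SAS-2 lane
DRIVES = 24

# BPN-SAS2-846EL1: one SFF-8087 uplink (4 lanes) feeds an expander
# that fans out to all 24 drives, so uplink bandwidth is shared.
el1_uplink_gbps = 4 * SAS2_LANE_GBPS
el1_per_drive_gbps = el1_uplink_gbps / DRIVES

# BPN-SAS-846A: six SFF-8087 connectors, one dedicated lane per drive.
a_per_drive_gbps = SAS2_LANE_GBPS

print(f"EL1 shared uplink: {el1_uplink_gbps:.0f} Gb/s total, "
      f"{el1_per_drive_gbps:.1f} Gb/s per drive if all 24 are busy")
print(f"846A direct:       {a_per_drive_gbps:.0f} Gb/s per drive, dedicated")
```

In practice spinning drives rarely saturate even the shared uplink simultaneously, which is why the expander board is fine for most NAS duty and the direct-attach board pays off mainly under heavy concurrent I/O.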
 

Fleat

New Member
Feb 6, 2016
BLinux said:
It might come down to personal preference. I prefer the 846A for performance reasons (a direct SAS lane to each drive rather than a SAS expander) and for compatibility across SAS-1, SAS-2, and SAS-3, which helps future-proof my chassis. I'm hoping that when the time comes I can simply upgrade the HBA/motherboard/CPU and keep the same chassis. The drawbacks are obviously more SFF-8087 connectors and the added cost of more controller ports and cables, but it seems worth it to me.

In my case, I run VMs directly on the system, not just serve files across the network, so I can use the extra I/O capacity. For those planning a pure NAS where I/O will always be limited by the network, the benefits I see may not matter, and the simplicity and lower cost might be more attractive. Either way, at its current price the system in this thread isn't cheaper than an 846A system, so there's no cost benefit in choosing it.
Thanks for the great info; that should be useful to anyone who is interested. I am running Unraid right now, where disk I/O performance is already garbage, so I can't imagine I would notice a difference with the one connector.

I do however agree with you that it seems a bit pricey at the moment.
 

BLinux

cat lover server enthusiast
Jul 7, 2016
Fleat said:
Thanks for the great info; that should be useful to anyone who is interested. I am running Unraid right now, where disk I/O performance is already garbage, so I can't imagine I would notice a difference with the one connector.
A bit off-topic, but I've never used Unraid, only read about it. Why is the disk I/O performance garbage?
 

Fleat

New Member
Feb 6, 2016
BLinux said:
A bit off-topic, but I've never used Unraid, only read about it. Why is the disk I/O performance garbage?
The software RAID parity writes add overhead that tends to slow things down. You can use SSD cache drives to help speed things up in certain circumstances, but I have also seen poor performance in VMs running directly on the cache drives in testing.

Frankly, none of that really matters for my use case, and Unraid fits my needs nicely for what it is. I migrated from a ZFS build a couple of years ago and haven't looked back.
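The parity overhead described above comes from the read-modify-write cycle that single-parity schemes use: updating one data block needs two reads and two writes, because the new parity is computed from the old data, old parity, and new data. A toy XOR sketch of the idea (illustrative only, not Unraid's actual code):

```python
# Single-parity read-modify-write: new_parity = old_parity ^ old_data ^ new_data.
# One logical write => 2 reads (old data, old parity) + 2 writes
# (new data, new parity), which is where the slowdown comes from.

def xor_blocks(a: bytes, b: bytes) -> bytes:
    """Byte-wise XOR of two equal-length blocks."""
    return bytes(x ^ y for x, y in zip(a, b))

def rmw_update(old_data: bytes, new_data: bytes, old_parity: bytes) -> bytes:
    """Return the new parity after rewriting one data block."""
    return xor_blocks(old_parity, xor_blocks(old_data, new_data))

# Toy 3-drive array: parity = d0 XOR d1
d0, d1 = b"\x0f\x0f", b"\xf0\x01"
parity = xor_blocks(d0, d1)

new_d0 = b"\xaa\xaa"
parity = rmw_update(d0, new_d0, parity)
assert parity == xor_blocks(new_d0, d1)   # parity stays consistent
```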
 

Fritz

Well-Known Member
Apr 6, 2015
Fleat said:
The software RAID parity writes add overhead that tends to slow things down. You can use SSD cache drives to help speed things up in certain circumstances, but I have also seen poor performance in VMs running directly on the cache drives in testing.

Frankly, none of that really matters for my use case, and Unraid fits my needs nicely for what it is. I migrated from a ZFS build a couple of years ago and haven't looked back.
What was it that made you switch?
 

talsit

Member
Aug 8, 2013
Same here, I'm on ZFS now.

I was on Unraid for years, but it got messy; it seemed like it was ALWAYS doing parity checks on my 20-disk array.

I swapped to Server 2012 Storage Spaces and lost the entire array to a silent double disk failure: no indication of failing drives, then one day the array wouldn't start and reported it was two drives short (this happened after a major Server 2012 update).

Now I'm on a poorly constructed ZFS on Linux array (8 x 3 TB drives; I should have gone with two arrays of 4, I think). I really like being able to see what is happening with ZFS, and that I can get disk and data information without looking in a lot of different places.
 

Fritz

Well-Known Member
Apr 6, 2015
I've gone through several HDD failures with FreeNAS/ZFS and recovered from all of them without issue. I stopped using Storage Spaces because, as talsit found, it's all too easy to lose everything without warning. ZFS, on the other hand, will warn you in time to act.
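The warning behavior described here comes from ZFS checksumming every block and verifying those checksums on every read and during scrubs, so silent corruption is flagged instead of returned as good data. A toy sketch of the idea (ZFS actually uses fletcher4 or SHA-256 checksums stored in a Merkle tree of block pointers, not this standalone scheme):

```python
# Toy illustration of checksum-on-read, the mechanism that lets ZFS
# detect silent corruption that plain arrays pass through unnoticed.
import hashlib

def store(block: bytes):
    """Write a block and keep a checksum of it separately."""
    return block, hashlib.sha256(block).hexdigest()

def read(block: bytes, checksum: str) -> bytes:
    """Verify the block against its stored checksum before returning it."""
    if hashlib.sha256(block).hexdigest() != checksum:
        raise IOError("checksum mismatch: silent corruption detected")
    return block

data, cksum = store(b"important bits")
assert read(data, cksum) == b"important bits"   # clean read passes

corrupted = b"important bats"                   # bit rot on disk
try:
    read(corrupted, cksum)
except IOError as e:
    print(e)   # flagged on read/scrub instead of returned as good data
```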