I never want to run out of hard drive bays again - starting build.

Discussion in 'DIY Server and Workstation Builds' started by kapone, Dec 20, 2017.

  1. kapone

    kapone Active Member

    Joined:
    May 23, 2015
    Messages:
    389
    Likes Received:
    116
    Background - I've had various "storage servers" over the years, all based on off-the-shelf chassis/systems. While they've all worked fine for the most part, the noise is always there. The intent of this build is a seriously quiet, rackmount storage server with lots of bays.

    This is mostly media/personal data/other stuff storage, so throughput/speed etc. are not that critical. Gigabit connectivity is fine for now, and 10GbE can be added down the road if need be.

    I acquired a failed company years ago and still have a bunch of crap sitting in my warehouse. The other day I roamed through it and picked out a few pieces. I had 7 of these 2U chassis (made by CI Design, apparently) with 12 hot-swap bays each, just sitting there collecting dust. These are weird lil chassis that take two motherboards (one on top of the other, with a single PCI slot each) and ColdWatt power supplies. Not usable as-is, but hmm...they'll make a nice storage array.

    So, the plan is to take the Dremel to these, hack and stack 'em, and come up with a 14U storage enclosure with 84 hard drive bays. :) I recently bought 10x of the Intel/LSI 16-port SAS2 expanders, which will be perfect for this. It will run off one or maybe two M1015 adapters.

    Stay tuned... :)
     
    #1
    Last edited: Dec 20, 2017
    T_Minus likes this.
  2. T_Minus

    T_Minus Moderator

    Joined:
    Feb 15, 2015
    Messages:
    6,392
    Likes Received:
    1,304
    Sounds like an interesting build.

    Pictures :)
     
    #2
    gigatexal likes this.
  3. kapone

    Too early... :) Waiting on stuff to arrive.

    I thought that since the holidays are coming up...well...build something!
     
    #3
  4. K D

    K D Well-Known Member

    Joined:
    Dec 24, 2016
    Messages:
    1,363
    Likes Received:
    286
    Looking forward to seeing this.
     
    #4
  5. kapone

    Fan "wall" planning...

    These chassis each came with a midplane of 4x 80mm fans. The good thing is that the drive trays are pretty open, with good airflow through them. While I could have 28x 80mm fans running... :) that'd defeat the purpose. My server room, aka part of my unfinished basement, is relatively cool, so I don't need to go crazy, but I've gotta come up with something.

    The chassis are just about 17" wide (and of course 2U each). Having hacked them up so that only the bottommost chassis has a "floor" and the topmost has a "cover", that gives me 24.5" (7x 2U = 14U) of vertical space.

    17" = 431.8mm
    24.5" = 622.30mm

    I could use a fan wall of:
    - 200mm fans - 2 across x 3 high. That would leave a bit of clearance on the sides to snake the cables through.
    - 140mm fans - 3 across x 4 high. That leaves some extra room at the top, but with blanking panels it should be workable.

    Hmm....I think I need to go shopping for 200mm fans.
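    The fit math above, as a quick back-of-the-envelope in Python (fan sizes are treated as exact nominal square frames, which real frames and mounting holes only approximate):

    ```python
    # How many fans tile a 17" x 24.5" (431.8mm x 622.3mm) fan wall?
    # Fan sizes are treated as exact square frames - an approximation.
    WALL_W_MM = 17 * 25.4    # chassis width
    WALL_H_MM = 24.5 * 25.4  # 7 chassis x 2U of vertical space

    def grid(fan_mm):
        """Fans across/up that fit, plus the leftover slack in mm."""
        across = int(WALL_W_MM // fan_mm)
        up = int(WALL_H_MM // fan_mm)
        return across, up, WALL_W_MM - across * fan_mm, WALL_H_MM - up * fan_mm

    for size in (200, 140):
        a, u, sw, sh = grid(size)
        print(f"{size}mm: {a} across x {u} high, "
              f"{sw:.1f}mm horizontal slack, {sh:.1f}mm vertical slack")
    ```

    That reproduces the 2x3 (200mm) and 3x4 (140mm) layouts, with the leftover margin available for cable runs or blanking panels.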
     
    #5
  6. kapone

    Been thinking about the engineering side of cutting up the 7 chassis, as I'm not physically at home right now. I think securing the fan wall to all 7 chassis and making sure it's a tight seal is gonna be a challenge.

    Also been thinking about the power supply(ies) for this. I could go with a single Supermicro 1620W or so, and that "should" be enough, as I'll never write to all 84 disks at the same time (writing draws the most amps on the +5V rail). This will be running Windows Server 2012 R2 with StableBit DrivePool, and I'll be slicing the disks up into 4-5 pools.
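    For a rough sanity check on the single-PSU idea, a Python sketch of the steady-state budget; the per-drive wattages are my assumptions for a typical 3.5" 7200rpm drive, not figures from any datasheet here:

    ```python
    # Steady-state power budget for 84 spinning drives on one supply.
    # Per-drive figures are assumed typicals; check the actual drive datasheet.
    DRIVES = 84
    W_12V_ACTIVE = 5.5  # assumed +12V draw per drive during read/write, watts
    W_5V_ACTIVE = 3.5   # assumed +5V draw per drive during read/write, watts

    total_rw = DRIVES * (W_12V_ACTIVE + W_5V_ACTIVE)
    print(f"All {DRIVES} drives active: ~{total_rw:.0f}W, before fans/expanders")
    ```

    On these assumed numbers, steady state fits a 1620W unit easily; simultaneous spin-up is the harder case.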
     
    #6
  7. pricklypunter

    pricklypunter Well-Known Member

    Joined:
    Nov 10, 2015
    Messages:
    1,359
    Likes Received:
    368
    You're sailing close to the wind on the power supply if you want things to run cool; best to aim for 60-70% of full rated load, max. Oh, and assuming you will never write to all of the disks at the same time is going to come back and bite yer ass. Build and rate stuff based on full-load requirements :)
     
    #7
  8. K D

    How about 1 PSU per 4U of drives, instead of a couple of huge ones?
     
    #8
  9. kapone

    Agreed. That's why this is challenging.

    However, even without staggered spin-up (I'm shooting for with), once you get beyond a certain number of drives, they're not all gonna start up at the exact same time. Looking at examples of hardware for lots of drives (45 Drives, the Supermicro 90-drive chassis), I think they go with the same thinking as well.

    The Supermicro 90-bay chassis "only" has a redundant 2000W PSU (which is rated at far less on 110V vs 230V).
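    A quick sketch of why staggered spin-up changes the picture; the per-drive numbers (roughly 2A at 12V while spinning up, ~8W once at speed) are assumed typicals for 3.5" drives, not measured:

    ```python
    # Worst-case +12V draw with and without staggered spin-up.
    # SPINUP_W and IDLE_W are assumed typical values for 3.5" drives.
    DRIVES = 84
    SPINUP_W = 2.0 * 12  # ~2A at 12V per drive while spinning up (assumed)
    IDLE_W = 8.0         # per drive once at speed (assumed)

    def peak_watts(group):
        """Peak draw if drives start in groups of `group`; worst case is
        the last group spinning up while all earlier drives idle."""
        starting = min(group, DRIVES)
        return starting * SPINUP_W + (DRIVES - starting) * IDLE_W

    print(f"All 84 at once: ~{peak_watts(84):.0f}W on +12V alone")
    print(f"Groups of 12  : ~{peak_watts(12):.0f}W")
    ```

    With everything starting at once, the +12V peak alone blows past a 1620W unit on these assumptions; in groups of 12 it stays under 900W, which would explain how the big chassis get away with "only" 2000W redundant.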
     
    #9
  10. kapone

    I'm actually thinking more along the lines of two relatively big power supplies, one for each ~42 drives (assuming a single 1620W in a redundant setup doesn't work out).
     
    #10
  11. kapone

    After doing some rough calculations...I may have to break this up into two servers instead of one big-ass one. The math is just not working out (and racking one big-ass 14U server is gonna be a bit***).

    Since I'm using the IBM 46M0997 expanders, each can do 20 drives (hopefully, still need to test), and I have seven 2U chassis with 12 bays each. What I don't wanna do is cascade expanders (even though I have enough), so with a typical M1015-type card, that's 8 ports that can be expanded to 40 by using two expanders, one on each 8087 port.

    But...the chassis are 12 bays each...so three of them equal 36 bays. If I combine four chassis on one server, I get 48 bays, but then I run out of ports on the HBA and will need to cascade expanders or leave 8 bays disconnected, which seems a waste.

    I could of course get a new HBA that has more than 8 internal ports, or get different expanders, or get a new motherboard/CPU combo with enough PCIe slots to plug in three HBAs...decisions, decisions. It's not that I don't want to spend money, but storage is simple...it shouldn't cost an arm and a leg. I'd much rather spend the money on hard drives and/or compute nodes.
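    The port arithmetic above, spelled out in a few lines of Python (numbers straight from the post):

    ```python
    # HBA/expander/bay arithmetic for the layout described above.
    HBA_LANES = 8            # M1015: two 4-lane SFF-8087 connectors
    EXPANDERS_PER_HBA = 2    # one expander per SFF-8087, no cascading
    DRIVES_PER_EXPANDER = 20 # IBM 46M0997, per the post (still to be tested)
    BAYS_PER_CHASSIS = 12

    drives_per_hba = EXPANDERS_PER_HBA * DRIVES_PER_EXPANDER
    print(f"One HBA + two expanders: {drives_per_hba} drives max")
    for chassis in (3, 4):
        bays = chassis * BAYS_PER_CHASSIS
        stranded = max(0, bays - drives_per_hba)
        print(f"{chassis} chassis = {bays} bays, {stranded} bays stranded")
    ```

    Three chassis (36 bays) fit under one HBA with room to spare; four chassis (48 bays) strand 8 bays without cascading.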

    Hmm...
     
    #11
  12. kapone

    Well...fan decision solved. :) These should be almost silent at the speeds I'm intending them to run at.

    [screenshot attached]
     
    #12
  13. kapone

    Power supply(ies) decision made. I'll be using 4x HP Common Slot 460W Platinum power supplies. Each gives me 20A on the +5V rail and more than enough on the +12V rail. The problem in powering this many drives is not just the +12V output; the +5V is critical too. KD was right: multiple PSUs was the answer.

    [screenshots attached]

    Each of these will power 20x HDDs, which should be within the parameters of the PSU. And oh...these PSUs and backplanes/PDBs are dirt cheap.
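    A rough check on the +5V rail per supply; the per-drive +5V current here is an assumption (labels vary, ~0.6-1A is common for 3.5" drives), not a number from the post:

    ```python
    # +5V rail budget per HP Common Slot 460W supply (20A on +5V per the post).
    RAIL_5V_A = 20.0
    DRIVES_PER_PSU = 20
    A_5V_PER_DRIVE = 0.8  # assumed average +5V draw; check the drive label

    draw_a = DRIVES_PER_PSU * A_5V_PER_DRIVE
    headroom_pct = (1 - draw_a / RAIL_5V_A) * 100
    print(f"+5V: ~{draw_a:.0f}A of {RAIL_5V_A:.0f}A ({headroom_pct:.0f}% headroom)")
    ```

    On these assumed numbers the rail sits around 80% under a full simultaneous write, so the real-world margin depends on how many drives per pool are actually active at once.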
     
    #13
  14. kapone

    Pictures of the actual build coming soon...It's biting cold right now, and I'm not in the mood for wearing layers of clothing to go out and start using the angle grinder...Because I KNOW someone is gonna say...

    [image]
     
    #14
  15. kapone

    Will be using a Chenbro Chassis Management Board for fan control.

    [image attached]

    (This hacked-up, whatever-you-wanna-call-it is essentially a big-ass JBOD enclosure with 84 bays, 80 of them functional. Four SAS2 expanders inside, attached to 80 bays, going out to four SFF-8088 connectors for connectivity to the actual storage server.)
     
    #15
  16. kapone

    HBAs and SAS to SATA cables coming...

    [screenshots attached]
     
    #16
  17. kapone

    Other bits and pieces...

    - 4x Add2PSUs (the enclosure will turn on and off with the server)
    - Bare wire for custom harnesses
    - PCIe powered risers for the expanders (they don't have a Molex...)

    [images attached]
     
    #17
  18. kapone

    Project shelved. The "Supreme Commander" aka my wife said "You have two toddlers...WTF are you thinking? Shut it down. Now."

    I couldn't argue.
     
    #18
  19. maze

    maze Active Member

    Joined:
    Apr 27, 2013
    Messages:
    451
    Likes Received:
    61
    Sad... why don't women get us?
     
    #19
  20. kapone

    I know....

    Well, she wouldn't let me build it, so....:rolleyes::rolleyes:, I'll buy something.

    [image: Supermicro CSE-846BE2C-R1K03JBOD]
     
    #20
    William and maze like this.