
HP SL6500 4U 8-Node (16 CPU) Cloud Servers | 256 Threads | 2,048 GB RAM ~$2000

Discussion in 'Processors and Motherboards' started by Zankza, Jun 30, 2016.

  1. raileon

    raileon Active Member

    Joined:
    Jun 22, 2016
    Messages:
    134
    Likes Received:
    33
    I should be able to start posting this type of info by Tuesday night. I've got extra CPUs and RAM, plus my PSUs came in already. Just need the servers, which are sitting at the airport a few miles from me...
     
    #21
  2. Fredrik

    Fredrik New Member

    Joined:
    Mar 28, 2016
    Messages:
    5
    Likes Received:
    1
    I brought up 6 of 8 nodes last night. They came with iLO 1.22 and the Advanced key pre-installed. I was able to upgrade to version 2.40 without issue.
    BTW, one of my nodes arrived DOA - may be related to the screws I found floating around in the chassis. It wouldn't be a bad idea to check for loose screws before applying power. They appear to belong to the PCIe brackets, which arrived unsecured.
     
    #22
  3. Zankza

    Zankza Active Member

    Joined:
    Feb 1, 2014
    Messages:
    66
    Likes Received:
    13
    Strange. None of my nodes have loose screws. It appears they rushed removing the RAID card, and honestly I didn't care too much. But thanks for sharing your experience.

    Hell yes. I hope I get pre-installed keys as well!
     
    #23
  4. cookiesowns

    cookiesowns Member

    Joined:
    Feb 12, 2016
    Messages:
    85
    Likes Received:
    21
    Any idea what onboard SATA/SAS connectivity there is? Enough room to fit 6x SSDs with velcro and some creativity?

    I think I'm going to go in on these too.
     
    #24
  5. Fredrik

    Fredrik New Member

    Joined:
    Mar 28, 2016
    Messages:
    5
    Likes Received:
    1
    2x SFF-8087 (H220i) + 2 SATA. Physically, 8x SSDs will fit without issue. However, you'll need to be creative in sourcing power for the additional drives.
     
    #25
  6. Zankza

    Zankza Active Member

    Joined:
    Feb 1, 2014
    Messages:
    66
    Likes Received:
    13
    That was my first DIY plan, but after a quick look, power seems to be the big problem. The cables appear to be custom, and I don't think there are any spare power connectors, so you'd need to rig a custom cable with extra connectors. The space is definitely vast, though; there's more than enough room to fit maybe 16 SSDs if carefully designed.
     
    #26
  7. cookiesowns

    cookiesowns Member

    Joined:
    Feb 12, 2016
    Messages:
    85
    Likes Received:
    21
    #27
  8. Zankza

    Zankza Active Member

    Joined:
    Feb 1, 2014
    Messages:
    66
    Likes Received:
    13
    (three photos showing SSDs test-fitted inside the chassis)
    As you can see, I can already fit three SSDs, and there is even room for a fourth, maybe a fifth?! I hope those pictures show why I think 16 SSDs is a very real possibility; 20 would be pushing it.
     
    #28
  9. cookiesowns

    cookiesowns Member

    Joined:
    Feb 12, 2016
    Messages:
    85
    Likes Received:
    21
    Looks like HP is using traditional SATA power. Definitely room to add 1-2 of these: FrozenCPU ConnectRight DIY SATA EZ Crimp Connector - Black - 90° (M-SCA-16F-BK) - FrozenCPU.com

    But you'll definitely need to be creative in re-arranging the existing layout.

    You can also go with SL250s and get 4x 2.5" hot-swap + 4x 2.5" cold-swap with an adapter kit.

    Side note: Anyone have these in a cabinet yet? I still need to purchase some racks, so I was wondering how these will fit in a normal-depth rack, or if a "deep" style rack is necessary.
     
    #29
    Last edited: Jul 3, 2016
  10. Zankza

    Zankza Active Member

    Joined:
    Feb 1, 2014
    Messages:
    66
    Likes Received:
    13
    Of course that's possible, but those things are HUGE; they'd need to be a lot smaller. Additionally, it would be 10x safer and better space-wise if we could somehow use one of those small SATA data/power brackets, I think.

    Even something like this would be a far better bet:

    https://www.amazon.com/CableDeconn-SFF-8087-SFF-8482-Connectors-Power/dp/B010CMW6S4/

    I think even a deep-style rack might not be deep enough :\
     
    #30
  11. cookiesowns

    cookiesowns Member

    Joined:
    Feb 12, 2016
    Messages:
    85
    Likes Received:
    21
    Yeah that's what I was thinking.

    As for the deep-style rack... these things are about 900mm deep, right? There are Supermicro chassis that are about that deep, so I'm thinking a 1200mm rack with the rear rails pushed slightly back is a requirement?
     
    #31
  12. Zankza

    Zankza Active Member

    Joined:
    Feb 1, 2014
    Messages:
    66
    Likes Received:
    13
    Code:
    The HP ProLiant s6500 Chassis is a modular server hardware system that is optimized for HP Rack 10000
    Series and HP Intelligent Rack Series models that are at least 47.24 in (1200 mm) deep.
     
    #32
  13. cookiesowns

    cookiesowns Member

    Joined:
    Feb 12, 2016
    Messages:
    85
    Likes Received:
    21
    Yeah... that's a bummer... this will more than likely rule out the S6500s for me. 1200mm + extra-width racks might not work in the space I had planned.

    https://h20565.www2.hpe.com/hpsc/do...177941&docId=emr_na-c03050390&docLocale=en_US
     
    #33
  14. GuybrushThreepwood

    Joined:
    Aug 2, 2015
    Messages:
    51
    Likes Received:
    19
    #35
  15. cookiesowns

    cookiesowns Member

    Joined:
    Feb 12, 2016
    Messages:
    85
    Likes Received:
    21
    There's an extension kit that you have to use for the HP Rack 10000, at least according to the documentation I linked above.

    Either way, I think an 1100 or 1200mm rack should be sufficient; it's just a matter of whether these racks can recess the chassis... The Eaton RS I had planned doesn't seem to be able to, so one of the rack doors will need to be taken off, unfortunately.

    I guess I'll be the guinea pig in about a month or two, unless I find some good deals on C6220s or other chassis setups.
     
    #36
  16. raileon

    raileon Active Member

    Joined:
    Jun 22, 2016
    Messages:
    134
    Likes Received:
    33
    I ordered my 2670s, but no delivery of the servers today. UPS shows an odd set of timestamps in today's tracking, according to which I apparently delayed it.

    For those of you who got your shipments already, did UPS call beforehand, or was it like a regular delivery?
     
    #37
  17. Layla

    Layla New Member

    Joined:
    Jun 21, 2016
    Messages:
    12
    Likes Received:
    1
    What do you plan to do? Did you manage to bend the SAS cables for all 8 nodes? Do you plan to replace them with 90° elbows? It appears that SFF-8087 female-female couplers are $50/ea, which would be around $400 in couplers alone, so that doesn't seem like a great option... :/
     
    #38
  18. raileon

    raileon Active Member

    Joined:
    Jun 22, 2016
    Messages:
    134
    Likes Received:
    33
    The servers came in today. I visually inspected all nodes, and they look to be in pretty good and mostly clean shape. I'll have to dig a bit of thermal gunk out of some places (CPU sockets...), but that's about it. I'll fire up a few tonight with all the extra E5 CPUs I have and post some software details.

    I ordered the listing that didn't mention rails. They came with rails. I now own 4 sets of rails. Don't order rails until after you see what you get.

    The hard drive cages in the back are a scam. If you use SSDs, those are begging to be modded. There are plenty of extra power headers on the power distro board, but I doubt you'd need them.

    Why is there a PCIe x16 slot above the RAM for the rear CPU? I don't see it in the manual. GPU?

    http://i.imgur.com/fuyGnXq.jpg

    @Layla The SAS cables can be bent enough to get the nodes in, although it doesn't seem like the best way to handle it.
     
    #39
  19. raileon

    raileon Active Member

    Joined:
    Jun 22, 2016
    Messages:
    134
    Likes Received:
    33
    I have three nodes up; that's all the CPUs I have at the moment. No problems getting them to run.

    Some quick notes:

    iLO 4 Advanced on all three. The iLO IPs were set to static addresses in the 172.24.0.0/24 subnet, and the passwords were all default, as written on the service tag of each node. Switching their IPs over to my subnet was pretty simple: find their IPs with nmap, then just SSH in.
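
    For anyone who hasn't gone hunting for iLOs before, a minimal sweep looks something like this (the target address below is just a placeholder for whatever the scan turns up):

    Code:
    # ping-scan the factory iLO subnet for live hosts
    nmap -sn 172.24.0.0/24

    # SSH to a discovered iLO; the default user is Administrator and the
    # password is on the node's service tag
    ssh Administrator@172.24.0.50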

    If you don't know iLO-speak (I didn't) then check out this quick run through of how to change the IP: Change ILO IP address via CLI on HP C7000 Blade Chassis - Systems Administration Problem Solvers

    I basically ran two commands:

    Code:
    set /map1/enetport1/lanendpt1/ipendpt1 IPv4Address=<your new ilo static ip> SubnetMask=<subnet mask of your network>
    *iLO reboots automatically here, the SSH session freezes, and the fans attempt liftoff.

    Then SSH into the new iLO IP and run:

    Code:
    set /map1/gateway1 AccessInfo=<your gateway ip>
    *Another auto-reboot.

    Give it a minute then just browse to your new iLO web gui.

    If you screw up any of the steps and can't reach your iLO over the LAN like I did, you can connect to it through the serial port. After you've connected with PuTTY or whatever, hit "Esc + Shift + 9" and you get dropped to the iLO login, where you can fix the network setup.
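
    If you're on Linux, something like this works in place of PuTTY (assuming your USB-serial adapter shows up as /dev/ttyUSB0 and the port is at 9600 baud; adjust if your serial settings differ):

    Code:
    # open the iLO serial console, then press Esc + Shift + 9 for the login prompt
    screen /dev/ttyUSB0 9600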

    The nodes were at:
    iLO FW: 1.22 and 1.05
    System ROM: P75 04/04/2012

    I ran SPP version 2016.04.0 on each node with no problems. It updated iLO, the System ROM, some Intel thing, and I'm not sure what else. I think I'm still missing some updates, like the integrated LSI firmware, but I'm not sure. I'm firmwared out for the night, though.

    Be sure to thank hightail.com for dumping these so cheap =)
     
    #40
    gigatexal and Zankza like this.
