Married Guys Can Have Labs Too! My cluster...


briandm81

Active Member
So I'm not an IT professional anymore, but I still love to stay current as a hobby. I used to be an IT guy, then a DBA, and I'm now a consultant specializing in Oracle's EPM offering. I like to have several versions up and running all the time, so that means a lot of capacity.

My current cluster started off as a single server towards the end of 2014. My wife was frustrated that she couldn't figure out what to get me for my birthday...so she asked what I wanted. I suggested a server...and after I got done explaining why I needed a new server, or a server at all, I built the first server in the cluster. I was actually able to reuse the storage from my old server to start with:

Build’s Name: HyperionESXi
Operating System/ Storage Platform: ESXi 5.5
CPU: Dual Xeon E5-2620 V2 @ 2.1 GHz
Motherboard: ASRock EP2C602-4L/D16
Chassis: Norco 4224 (from the server before this one)
Drives: Samsung 840 500GB SSD (from the server before this one), 2x 840 EVOs, 4x 840 Pros, 8x 2TB Deskstars in RAID 6 (from the server before this one)
RAM: 128GB (8x16GB Kingston @ 1600MHz)
Add-in Cards: Areca 1880i (from the server before this one), LSI 9265-8i, HP SAS Expander (from the server before this one)
Power Supply: Thermaltake 1200W
Other Bits: Norco Open Rack

This was a great start, but I eventually ran out of storage and memory. So I upgraded the server to 256GB of RAM, added another pair of 500GB SSDs, and expanded the RAID array from 8 drives to 12. The final specs of the server ended up like this:

Build’s Name: HyperionESXi
Operating System/ Storage Platform: ESXi 5.5
CPU: Dual Xeon E5-2620 V2 @ 2.1 GHz
Motherboard: ASRock EP2C602-4L/D16
Chassis: Norco 4224
Drives: Samsung 840 500GB SSD, 2x 840 EVOs, 2x 950 EVOs, 12x 2TB Deskstars in RAID 6
RAM: 256GB (16x16GB Kingston @ 1600MHz)
Add-in Cards: Areca 1880i (from the server before this one), LSI 9265-8i, HP SAS Expander
Power Supply: Thermaltake 1200W
Other Bits: Norco Open Rack

Next I decided that I needed a separate server for some performance testing of a database product named Hyperion Essbase. This is one of the foundational pieces of the Oracle EPM stack that I work with. This was around the same time the E5-2670s hit eBay...great timing! Or bad timing if you are a hardware junkie... So here's what I put together:

Build’s Name: HyperionESXi2
Operating System/ Storage Platform: ESXi 6.0
CPU: Dual Xeon E5-2670 @ 2.6 GHz
Motherboard: ASRock EP2C602-4L/D16
Chassis: Supermicro SC846
Drives: 4x Samsung 850 EVO 250GB SSDs in RAID 0, SanDisk Ultra II 960GB SSD, 2x Samsung 850 EVO 250GB SSDs to run physical Windows for some performance comparisons
RAM: 128GB (16x8GB Crucial @ 1600MHz)
Add-in Cards: Areca 1880i, Intel 750 400GB
Power Supply: Supermicro 1200W
Other Bits: Norco Open Rack

This server is being used for performance testing for now, but it will eventually just be another box in the cluster once I finish some physical-versus-virtual baseline testing.

While I was getting that server together...I ran out of capacity on my original box. So I decided it was time to go ahead and take advantage of the E5 deals again (even cheaper this time). So here's the last box in the cluster:

Build’s Name: HyperionESXi3
Operating System/ Storage Platform: ESXi 6.0
CPU: Dual Xeon E5-2670 @ 2.6 GHz
Motherboard: ASRock EP2C602-4L/D16
Chassis: Random 4U case I had laying around
Drives: 4x 840 Pros in RAID 0, SanDisk Ultra II 960GB SSD, 2TB spinner just because
RAM: 256GB (16x16GB Kingston @ 1333MHz)
Add-in Cards: Areca 1880i
Power Supply: Seasonic 750W
Other Bits: Norco Open Rack

So there's my cluster. I'm burning in the latest addition and will be installing it into the rack pretty soon. In the meantime, I am planning my next project: VSAN. I'd love some opinions. I've started by getting some 10GbE gear together. I've installed X520-DA1s in two of the servers and a DA2 in the other. I will probably connect them directly for now until I get a 10GbE switch. Once that happens, I will likely swap out all of the DA1s for DA2s. I also need a couple of additional UPSes for the two new servers, and then I just have to pick out storage hardware for the VSAN itself.
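
Once the links are up, tagging a vmkernel port for VSAN traffic on each host should be the easy part. Here's a rough pyVmomi sketch of what I have in mind; the vCenter address, the credentials, and the assumption that a vmk1 port already exists on each host are just placeholders, not my actual setup:

# Rough sketch: tag an existing vmkernel port (vmk1) for VSAN traffic on every host.
# The vCenter address, credentials, and vmk1 are placeholders; requires pyVmomi.
import ssl
from pyVim.connect import SmartConnect, Disconnect
from pyVmomi import vim

ctx = ssl._create_unverified_context()  # lab only: skip cert checks for self-signed certs
si = SmartConnect(host="vcenter.lab.local", user="administrator@vsphere.local",
                  pwd="changeme", sslContext=ctx)
content = si.RetrieveContent()
view = content.viewManager.CreateContainerView(content.rootFolder, [vim.HostSystem], True)
for host in view.view:
    # "vsan" is one of the standard vmkernel traffic types (management, vMotion, vsan, ...)
    host.configManager.virtualNicManager.SelectVnicForNicType("vsan", "vmk1")
Disconnect(si)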

The plan for VSAN is to performance test it in an Essbase environment. I have some 15K 300GB SAS drives that I can use for this, but I think I'd like to test both all-flash and hybrid. For the controllers, I thought I would just pick up three of the LSI 9211s; they are on the 5.5 supported list for VSAN. Then all I will need is the SSD part. I'm considering the Samsung enterprise drives that pop up pretty cheap periodically. I'm guessing I would need 6 or so of them.
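
To sanity check my own guess on drive counts, here's some quick back-of-the-envelope Python. The inputs are all assumptions on my part (four of the 300GB SAS drives per host, the usual ~10% cache rule of thumb, FTT=1), not anything from an official sizing guide:

# Back-of-the-envelope VSAN sizing for a 3-node cluster; every input is a guess.
hosts = 3
capacity_per_host_tb = 4 * 0.3          # assume 4x 300GB 15K SAS per host as the capacity tier

raw_tb = hosts * capacity_per_host_tb
usable_ftt1_tb = raw_tb / 2             # FTT=1 mirrors every object, so roughly half the raw space
cache_per_host_gb = capacity_per_host_tb * 0.10 * 1000   # ~10% of consumed capacity as cache

print(f"Raw: {raw_tb:.1f} TB, usable at FTT=1: {usable_ftt1_tb:.1f} TB")
print(f"Cache SSD per host: >= {cache_per_host_gb:.0f} GB")
# With 2 SSDs per host (6 total), each host could use one SSD as cache plus one as
# capacity for the all-flash test, or one SSD as cache in front of the SAS drives
# for the hybrid test.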

Finally, I'm also considering a storage server of some sort (virtualized). I was thinking about using my original Norco box, getting rid of some of the drives, and starting a new array with either shucked Toshiba 5TB drives or the new 8TB WD Reds. Even a four-drive array would expand my backup capacity to 15-24TB, which I need: I'm down to 2TB free on my backup array, and I keep 14 days of backups of almost every VM I run.
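
That 15-24TB range is just single-parity math on four drives (assuming something like RAID 5); a quick check:

# Quick check on the backup-array numbers above, assuming single parity (e.g. RAID 5);
# RAID 6 on only four drives would leave just two data drives.
def usable_tb(drive_count, drive_size_tb, parity_drives=1):
    return (drive_count - parity_drives) * drive_size_tb

for size_tb in (5, 8):
    print(f"4x {size_tb}TB -> {usable_tb(4, size_tb)} TB usable")
# 4x 5TB -> 15 TB usable, 4x 8TB -> 24 TB usable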

So many things to play with...not enough hours in the day. This forum has been a great resource. I saw a final bachelor build thread and had been meaning to post, so I figured now was as good a time as any. Here's a diagram:

Lab Layout.png

I'll take pictures once I get the ESXi3 box in the rack and everything cleaned up a bit. Here's a shameless plug for my blog, which you guys will almost certainly have no use for:

Hyperion EPM

And here's a link to the page on my blog devoted to the lab itself:

The Lab

If you've made it this far into the thread, I'd like to say I'm sorry for such a long post. I just re-read it...
 

Patrick

Administrator
Staff member
Great stuff! I knew a few teams that deployed Hyperion at PwC when I was there.

BTW - I think we need lab pictures!
 

briandm81

Active Member
Pictures will be coming. Right now my office is a mess and I still have ESXi3 sitting on my desk! I'm trying to get it all cleaned up so I can take pictures!
 

nthu9280

Well-Known Member
Brian, great setup & EPM blog. I'm in a similar boat, but now consulting in the SAP BI/BPC area. Started as a Novell admin in the '90s. Now I have a hodgepodge of desktops and a new build with an E5-2670 / P4000 chassis. One of these days I'll organize them enough to post a pic. 22-25U racks are almost as expensive as the full-size ones.
 

briandm81

Active Member
Cool nthu9280. I'm a fan of SAP's BI offering. I'm not going to lie, I'm not a huge BPC fan. But I think that is largely a function of competition. ;) I'm also bitter that they are finally getting rid of MSAS, which I have a sentimental attachment to. The last time I touched BPC...it was still called OutlookSoft!

And pricklypunter, it requires around 2ft x 3ft of floor space. As for time...just stop sleeping. Overrated in my opinion. Money...yes...this hobby is expensive! But I enjoy it a lot and it has a lot of professional value for me personally.
 

briandm81

Active Member
I finally had a chance to get it all cleaned up! So here's a picture:

And this is what it has to look like most of the time to keep tiny fingers away from it:
 