VMware ESXi whitebox build

MiniKnight (Well-Known Member, joined Mar 30, 2012, NYC)
Do you have the motherboard already?

I'd say it will work but it will be hard to administer. You would need to connect a GPU to see the local console.

Even selling that board to get a real server board would give you:
  • IPMI, probably with iKVM and serial-over-LAN
  • more DIMM capacity, since you can use RDIMMs
  • better NICs than the Extreme4's onboard ones, and maybe more of them
  • better hardware compatibility
I'd personally get something more useful for VMware. Most of us on STH have used consumer gear as ESXi server hardware, but you'll end up selling the consumer gear for server gear soon.
 
OP (joined Jul 2, 2016, Iceland)
Thank you for your reply.
I haven't bought the motherboard or the CPU yet (still planning the build). I'm looking for a motherboard that will fit into an ATX case and is decently quiet (for example the Fractal Design Define R5 ATX case). ESXi isn't the only use case for this build; I'll probably also try setting up Azure Stack (minimum of 96 GB of memory, 128 GB recommended; minimum of 12 CPU cores, 16 cores recommended; minimum of 4 disks of 140 GB each, 4x 250 GB disks recommended for Azure Stack). BTW, I live in Iceland and a lot of the stuff sold on this forum isn't shipped to Iceland (I would probably need to order from Wiredzone and pay an extra $150 in shipping just for the motherboard, but I might have to do that anyway).

Also looking for a decent 16-port gigabit switch with PoE capability and LACP support (planning on using a Qotom Q190G4 Intel Celeron 3215U dual-core barebone mini PC as a pfSense box for routing the fiber optic connection to my ISP, and Ubiquiti UniFi for Wi-Fi access). I might build a FreeNAS box sooner rather than later and would probably need the LACP feature for the FreeNAS networking.
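For what it's worth, LACP on FreeNAS is a FreeBSD lagg(4) interface under the hood. A minimal sketch of the rc.conf side, assuming two Intel NICs named igb0/igb1 and DHCP addressing (both assumptions; FreeNAS normally manages this through its GUI rather than rc.conf directly):

```shell
# FreeBSD rc.conf fragment for an LACP bundle (lagg0) over two NICs.
# igb0/igb1 are assumed interface names; check `ifconfig` for yours.
ifconfig_igb0="up"
ifconfig_igb1="up"
cloned_interfaces="lagg0"
ifconfig_lagg0="laggproto lacp laggport igb0 laggport igb1 DHCP"
```

The switch side needs a matching 802.3ad/LACP port-channel on the same two ports, which is why the switch itself has to support LACP.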
 
OP
Found a motherboard with integrated IPMI 2.0 with KVM and a dedicated LAN port (RTL8211E): the ASRock EP2C602-4L/D16.
Looks like Amazon ships these motherboards to Iceland for $72 (I can live with that). I could buy two used E5-2670 (C2) processors from eBay for that motherboard :). If you know of two used E5-2670 CPUs for a good price (shipped to Iceland), please let me know.
 

ttabbal (Active Member, joined Mar 10, 2016)
One more thing: will this ASRock EP2C602-4L/D16 motherboard fit into a normal ATX case (Fractal Design Define R5 ATX case)?

Form factor: SSI EEB
Dimensions: 12" x 13" (30.5 cm x 33 cm)
Probably not. It only lists up to ATX. SSI-EEB is huge. For a consumer case, you need something that can handle at least E-ATX. And the I/O port area might be in the wrong place. I haven't seen that be a problem myself, but people mention it...

Having been there, skip LACP if you are trying to get more speed. It only helps with lots of clients. And even then, it's kind of random. You are far better off getting a switch with a 10Gb port and connecting the server to that. The LB4M is available for not much more than a managed 1Gb switch. LACP can also cause odd network behavior. I had it set up on FreeNAS and working well locally, but the server couldn't get to the Internet for some reason until I disabled LACP.

I would pass on that motherboard myself. Realtek LAN is only kind of supported on server operating systems and tends to fall over with lots of clients. For 1Gb, Intel should be the only NIC considered. For 10Gb, Mellanox and Chelsio are also well supported.
 
OP
Noted, thank you for your input.

I might go with my first choice and keep the case on/under my desk, connected to my monitor via VGA (to reach the ESXi console). I could buy an extra PCIe Intel network adapter for the ASRock X99 Extreme4 motherboard. This is a basic lab computer. Good to know about LACP on FreeNAS, though. I'm actually going to try out a site-to-site tunnel from a pfSense home router to a pfSense appliance in Azure (budget hybrid cloud). Some workloads might end up on a FreeNAS box later on.

[attachment: work.jpg]
 
OP
A little change of plan: I will build a smaller ESXi whitebox from parts I bought used in Iceland.

Mobo: ASRock Z97 Extreme
CPU: Intel Core i7-4790K
HDD: 4x 1 TB 7200 RPM Seagate drives (used as PCI-passthrough storage for a FreeNAS virtual machine)


[attachment: DSC_0383_Easy-Resize.com.jpg]
 
OP
This is only a test lab machine (decided to build it from parts I picked up for a good price).

Now I need:

32 GB (4x 8 GB) of DDR3 DIMM RAM
An SSD for the ESXi datastore
An LSI 9210-8i (for the SATA drives, passed through via PCI passthrough to the FreeNAS virtual machine running on ESXi)
A power supply
A CPU heatsink
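For the HBA passthrough step, a rough sketch from the ESXi shell, assuming the card shows up at PCI address 0000:02:00.0 (an assumption; on older 6.x builds this is instead a passthrough checkbox in the vSphere client followed by a host reboot):

```shell
# Find the LSI HBA's PCI address; the Address: line sits a few lines
# above the device name in `esxcli hardware pci list` output.
esxcli hardware pci list | grep -i -B 6 "LSI"

# On newer ESXi releases passthrough can be toggled from the CLI
# (0000:02:00.0 is an assumed address; substitute yours):
esxcli hardware pci pcipassthru set -d 0000:02:00.0 -e true
```

After a reboot the device can be added to the FreeNAS VM as a PCI device, which hands it the SATA disks raw.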
 
OP
I bought a couple of items for this ESXi whitebox build yesterday:

Thermaltake Berlin 630W PSU
ADATA 32 GB DDR3 1600 MHz (4x 8 GB)
Noctua NH-L9x65 CPU cooler

[attachment: rsz_build.jpg]



Waiting for packages I ordered from eBay:

1x Intel DC S3710 Enterprise 400 GB 2.5" SSD
1x LSI SAS 9210-8i 8-port 6 Gb/s PCIe HBA SATA controller card
1x 85 cm Mini-SAS SFF-8087 36-pin to 4x SATA 7-pin 90-degree breakout cable

Might buy:
An extra SSD for Plex video storage (if I decide to install the Plex plugin on a FreeNAS virtual machine running in the ESXi environment)
2x 1 TB 7200 RPM HDDs (spare drives in case any of the 4x 1 TB drives in the RAID-Z pool on the FreeNAS virtual machine dies)
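For reference, the pool those spare drives would back is a single 4-disk RAID-Z1 vdev. A sketch of what gets created, assuming device names ada0-ada3 (assumptions; the FreeNAS GUI is the supported path and also lays down GPT and swap partitions first):

```shell
# Create a 4-disk RAID-Z1 pool named "tank" inside the FreeNAS VM.
# ada0..ada3 are assumed device names; verify with `camcontrol devlist`.
zpool create tank raidz ada0 ada1 ada2 ada3
zpool status tank        # all four members should report ONLINE

# If a member dies, a spare drive steps in with:
#   zpool replace tank ada2 ada4
```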
 
OP
I might do that, and use the Intel DC S3710 Enterprise 400 GB 2.5" SSD for the ESXi datastore.
The 4x 1 TB 7200 RPM SATA disks will only be used for one virtual machine (a RAID-Z pool on FreeNAS).

I'm also setting up a small physical server (a FreeNAS backup server) with 4x 1 TB 7200 RPM SATA disks to try out ZFS replication from the FreeNAS virtual machine to the physical FreeNAS server, do VMware snapshots, and test CrashPlan for on-site backup.
I also want to try going from 4x 1 TB drives to 4x 2 TB drives on different hardware (import the ZFS pool on the new hardware and expand the pool with the 4x 2 TB drives).

This is just for learning purposes at the moment, but I will hopefully be able to use what I learn from this setup in a production environment.
I'm more of a hands-on learner (I've read up and watched FreeNAS and ESXi videos, but I think I'll need hands-on experience with the hardware to put what I've learned in context).
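The replication and expansion experiments above boil down to a handful of ZFS commands. A sketch, assuming pool/dataset names tank/data and backuppool, and a backup host reachable as backupbox (all assumptions; FreeNAS wraps this in its GUI replication tasks):

```shell
# Snapshot-based replication from the FreeNAS VM to the backup server.
zfs snapshot tank/data@rep1
zfs send tank/data@rep1 | ssh backupbox zfs recv -F backuppool/data
# Later runs send only the changes since the last snapshot:
#   zfs send -i @rep1 tank/data@rep2 | ssh backupbox zfs recv backuppool/data

# Growing 4x 1 TB -> 4x 2 TB: replace one disk at a time and let each
# resilver finish; the pool expands once every member is the larger size.
zpool set autoexpand=on tank
zpool replace tank ada0 ada4   # repeat for each member, wait for resilver
zpool status tank              # check resilver progress
```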
 

Rand__ (Well-Known Member, joined Mar 6, 2014)
I would recommend installing ESXi on a USB stick. I am not sure which way you are going.
I'd recommend that only if you don't intend to use vSphere later: those two don't mingle well in my experience. But for your standalone ESXi it should be fine :)

I might do that, and use the Intel DC S3710 Enterprise 400 GB 2.5" SSD for the ESXi datastore.
You can use the disk as a datastore even if you install ESXi on it.
 
OP
Now I have an extra 5x 2 TB 7200 RPM SATA drives (for the physical FreeNAS server) and a 500 GB Samsung EVO 850 SSD (for the Plex storage on the FreeNAS virtual machine, which will run on the ESXi host machine). I also bought sound-absorbing material for the physical FreeNAS backup server.


[attachment: rsz_dsc_0394.jpg]