> it is not possible to run windows on these ( you can get HyperV core running ) as it is bios disabled and the graphic chip was disabled, there is a bios hack to make this work but I did not take it with me once I left

Are you sure about that? I had Windows 10 and Server 2019 running on this without any issues.
> i used to work at zynstra when they were still selling these, if anyone has any questions ask away

How were they usually bundled as shipped to customers, hardware-wise? I remember reading their case studies where one of the deployment profiles was a clustered setup with 64GB of RAM (maxed) and Windows Server VMs that consumed up to 56GB of RAM (with 8GB for the Zynstra orchestration). How does that jibe with HPE's own sizing guide?
> Are you sure about that? I had Windows 10 and Server 2019 running on this without any issues.

The BIOS could've been updated/unlocked after a while, so I am not surprised if they can run Win10/WinS19 nowadays.
> also they were awful and overheated constantly

The fan goes full jet engine at max speed, so I'm trying to swap in another fan. Since that won't be enough by itself, I'll cut open the cover to have a big fan blowing fresh air in from the top. But first I need to fool iLO's PWM tach reading, I believe; otherwise it just says no fan connected.
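For anyone attempting the tach-spoofing part of the mod above: a fake tach signal is usually just a square wave at the right frequency. This little sketch computes what frequency (and GPIO toggle interval, for a microcontroller) a spoofer would need to report a given RPM. It assumes the common 4-pin PC fan convention of 2 tach pulses per revolution, which you should verify against the stock fan's datasheet before trusting the numbers.

```python
# Sketch: frequency a fake tach signal must run at to report a given RPM
# to iLO. Assumes the typical PC fan convention of 2 tach pulses per
# revolution -- an assumption, check your fan's datasheet.

PULSES_PER_REV = 2  # common convention for 3/4-pin PC fans

def tach_frequency_hz(reported_rpm: int, pulses_per_rev: int = PULSES_PER_REV) -> float:
    """Square-wave frequency needed to fake `reported_rpm`."""
    return reported_rpm * pulses_per_rev / 60.0

def half_period_us(reported_rpm: int) -> float:
    """GPIO toggle interval in microseconds (two toggles per period)."""
    return 1_000_000 / (2 * tach_frequency_hz(reported_rpm))

if __name__ == "__main__":
    for rpm in (1500, 3000, 6000):
        print(f"{rpm} RPM -> {tach_frequency_hz(rpm):.0f} Hz "
              f"(toggle every {half_period_us(rpm):.0f} us)")
```

A cheap ATtiny or similar toggling a pin at that interval into the tach line should keep iLO from reporting "no fan connected", though whether iLO also sanity-checks the reported RPM range is untested here.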
> I didn't mean it selling out, but just more people looking at/buying them. Based on when the article was released until now, more total units have been sold between all the sellers than prior to that article.

Yeah well, there's only, like, what, 6-10 of those fully boxed machines with HDD brackets, while there are probably around 60-100 of those EC200a machines without brackets.

Also, the post you quoted was for the one with the HDD brackets in them, which did sell quickly after the initial person posted the link in this thread.
> For anyone interested, I have a copy of the March 2020 HPE Service Pack disc I can share privately. My EC200A came with the original 2016 v1.00 system ROM. This brings it up to the latest v2.66 available via entitlement. I can confirm that both Debian and FreeBSD installed without error via remote .iso attach from the iLO Remote Console.
>
> Stock firmware upon receipt:
>
> Firmware Name | Firmware Version | Location
> iLO | 2.40 Dec 02 2015 | System Board
> Intelligent Platform Abstraction Data | 10.7 | System Board
> Intelligent Provisioning | N/A | System Board
> Redundant System ROM | U26 v1.00 (04/14/2016) | System Board
> Server Platform Services (SPS) Firmware | 3.0.3.9.1 | System Board
> System Programmable Logic Device | Version 0x07 | System Board
> System ROM | U26 v1.00 (04/14/2016) | System Board
>
> Firmware post-update:
>
> Firmware Name | Firmware Version | Location
> HPE Ethernet 1Gb 4-port 366i Adapter - NIC | 1.2529.0 | Embedded
> iLO | 2.73 Feb 11 2020 | System Board
> Intelligent Platform Abstraction Data | 12.02 | System Board
> Intelligent Provisioning | N/A | System Board
> Redundant System ROM | U26 v1.00 (04/14/2016) | System Board
> Server Platform Services (SPS) Firmware | 3.0.3.9.1 | System Board
> System Programmable Logic Device | Version 0x07 | System Board
> System ROM | U26 v2.66 (07/19/2019) | System Board

Could you PM the shared link?
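Side note on the remote .iso attach: it can also be scripted over iLO's Redfish API instead of clicking through the Remote Console. The endpoint below follows the generic DMTF Redfish VirtualMedia action; older iLO 4 firmware exposes a slightly different pre-standard schema, so treat the path as an assumption and browse `/redfish/v1/Managers/1/VirtualMedia/` on your own unit first. The hostname and ISO URL are hypothetical.

```python
# Sketch: building a Redfish "InsertMedia" request to attach a remote .iso
# via iLO. Endpoint path follows the DMTF Redfish standard; your iLO 4
# firmware revision may differ, so verify the VirtualMedia collection first.
import json

def build_insert_media_request(ilo_host: str, iso_url: str,
                               media_id: str = "2") -> tuple[str, str]:
    """Return (URL, JSON body) for a Redfish InsertMedia POST."""
    url = (f"https://{ilo_host}/redfish/v1/Managers/1/VirtualMedia/"
           f"{media_id}/Actions/VirtualMedia.InsertMedia")
    body = json.dumps({"Image": iso_url})
    return url, body

if __name__ == "__main__":
    url, body = build_insert_media_request(
        "ilo.example.lan",                           # hypothetical iLO hostname
        "http://fileserver.lan/debian-netinst.iso")  # hypothetical ISO URL
    print(url)
    print(body)
    # To actually send it (self-signed cert on most iLOs):
    # requests.post(url, data=body, auth=(user, password), verify=False)
```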
> Yeah well, there's only, like, what 6-10 of those fully-boxed machines with HDD brackets, while there are probably around 60-100 of those EC200a machines without brackets.

Yeah, I don't think the link to them has been posted over there yet or on their discord, so they probably don't know about it. Maybe I'll swing over there and post it.
I am surprised the forums at serverbuild.net didn't jump on this one yet. These machines seem to be right up their alley...
> Yeah, I don't think the link to them has been posted over there yet or on their discord, so they probably don't know about it. Maybe I'll swing over there and post it.

Do it. At least it'll keep them from using ex-OEM CSE-815s for firewalls (seriously, how many 1U Nehalem Xeons or Magny-Cours Opterons do you need?) or buying up obsolete pre-Haswell TinyMiniMicro boxes.
> Do it. At least it'll keep them from using ex-OEM CSE-815s for firewalls (seriously, how many 1U Nehalem Xeons or Magny-Cours Opterons do you need?) or buying up obsolete pre-Haswell TinyMiniMicro boxes.

Yeah, their big kick now is the HP 290s with the G4900t processors.
> Out of interest, what are some of the applications folks here are using this machine for? Looking for a bit of inspiration.

I plan to use mine for various tasks I want to have available 24/7: smart home platform, Plex server, WireGuard endpoint, SMB shares, data backup.
> The fan goes full jet engine at max speed. So I'm trying to swap in another fan. But since that won't be enough by itself, I'll cut open the cover to have a big fan blowing fresh air in from the top. But first I need to fool iLO's PWM tach reading, I believe. Otherwise it just says no fan connected.

The only thing I'm reserving my judgement on is the fan noise... Really hopeful this can stay in the living room. Otherwise, I might look at a fan mod. Does anyone have successful examples of a fan swap?
> Yeah, their big kick now is the HP 290s with the G4900t processors.

That's been their kick for the past 9 months.
> Out of interest, what are some of the applications folks here are using this machine for? Looking for a bit of inspiration.

I have 3 of these running Proxmox, with a total of 9 VMs across them. Each has 32GB. I run Nextcloud, Proxmox Mail Gateway, Pi-hole, a Ubiquiti CloudKey, Plex, CrashPlan, and an Ubuntu and a Windows VM. They seem to handle all of them fine. I could use additional memory in them, though; that would let me roughly double the number of VMs I could run.
> How were they usually bundled as shipped to customers, hardware-wise? I remember reading their case studies where one of the deployment profiles was a clustered setup with 64GB of RAM (maxed) and Windows Server VMs that consumed up to 56GB of RAM (with 8GB for the Zynstra orchestration). How does that jibe with HPE's own sizing guide?
>
> Who were their target customers, and how much was this Zynstra bundle? My guess is that they were trying to keep CapEx low and OpEx consistent?
>
> How were they generally deployed? Is it replacing the usual front-office in-store processor for retail locations, or is it mostly a back-office inventory/warehouse supply-chain thing?
>
> Is there a way to restore the originally shipped Zynstra disk images back to the machine, just to give this thing a test drive?
>
> Did anyone in the partnership manage to make money from the EasyConnect line, or was it mostly a dud? I didn't see NCR pushing the product after their buyout of Zynstra in late '19.

There were two configs (all used the same processor; HPE did make others with beefier processors, but I'm unsure if they ever did).
The updated BIOS can be downloaded from the link below with a free HPE account.
techlibrary.hpe.com (404 Error)
I was given a brand new one about 9 months ago and it died. When I press the power button, the power light flashes white and nothing comes on the screen. There is a bank of mini green LEDs on the motherboard flashing what I believe is an error code, but I haven't been able to find that info.

There were two configs (all used the same processor; HPE did make others with beefier processors, but I'm unsure if they ever did):
Standard:
- 2x 2TB disk
- 1x 256GB SSD
- 32GB RAM

Premium:
- 2x 4TB disk
- 1x 512GB SSD
- 64GB RAM
You could also purchase the extension with the extra ports, and a fancy stand (I think I still have those at home, I'll have to look).
There was also a wall mount, so you could hang it on the wall and use it as a WiFi router plus server. These things (if they had ever worked properly) would have been killer.
However, the Zynstra solution was unique and is still going to this day, so I can't provide you with an image due to a whole lotta laws. It currently runs a whole lotta other large-scale companies, as it is essentially a self-"healing", self-"building" hypervisor: it gives you a web page where you put in details such as how many Windows VMs and Linux VMs you want, the domain name, and DNS/DHCP settings, and it automatically builds the VMs for you. This is what makes it popular in retail: all they have to do is ship a box, it builds a load of VMs for you, and done.
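To make the "self-building" idea above concrete, here is a toy sketch of the concept: a small declarative spec (the sort of thing that web form would collect) expanded into per-VM build definitions. The field names are invented for illustration; Zynstra's real orchestration is proprietary and certainly looks nothing like this internally.

```python
# Toy illustration of a declarative "self-building" flow: a site spec
# (VM counts, domain, DNS) is expanded into concrete VM definitions that
# a hypervisor layer could then provision. Field names are hypothetical.

def build_plan(spec: dict) -> list[dict]:
    """Expand a site spec into a list of per-VM definitions."""
    vms = []
    for i in range(spec.get("windows_vms", 0)):
        vms.append({"name": f"win-{i}", "os": "windows",
                    "domain": spec["domain"], "dns": spec["dns"]})
    for i in range(spec.get("linux_vms", 0)):
        vms.append({"name": f"linux-{i}", "os": "linux",
                    "domain": spec["domain"], "dns": spec["dns"]})
    return vms

if __name__ == "__main__":
    site = {"windows_vms": 2, "linux_vms": 1,
            "domain": "store42.example", "dns": "10.0.0.1"}
    for vm in build_plan(site):
        print(vm)
```

The appeal for retail is exactly what the post describes: the person on site only fills in the spec, and everything downstream is generated.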
HPE thought this would be a gold mine; however, they failed to count on the fact that people actually want to control their own hypervisor and such. Zynstra does not let people into it; you had to pay for their service pack, which is where the trouble came from.
The target customers were mostly SMBs, from schools to small GP clients (UK); they also sold larger HPE rack-mounted systems, but only to larger clients. These things were not cheap and started at around £120 a month; new, they were around £1300 to buy, so it made sense for a lot of small companies to buy their servers and their support at the same time.
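A quick sanity check on those figures: at roughly £120/month rented versus roughly £1300 to buy outright, the hardware break-even lands just under a year. (The monthly price presumably bundled support, so this is only a rough floor, not a fair comparison.)

```python
# Break-even arithmetic for the pricing quoted above:
# ~£120/month subscription vs ~£1300 one-off purchase.

MONTHLY_COST = 120     # GBP/month, figure quoted in the post above
PURCHASE_PRICE = 1300  # GBP one-off, figure quoted in the post above

break_even_months = PURCHASE_PRICE / MONTHLY_COST
print(f"Hardware break-even after about {break_even_months:.1f} months")
```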
The reason you had to reserve 8GB for the hypervisor was because there were a lot more VMs running.
With this solution you also got a firewall and a router (hence the WAN and LAN ports).
The retail side is a newer thing, and they have moved away from EC200As due to their crapness with overheating and reliability. However, the software is still the same.
We used to have hundreds of these things in the server room as test build boxes, purely because we had so many that HPE gave us, as they couldn't sell them.
The buyout was purely for the software, as they were the first to the punch with a "self-building server" that was commercially viable. People had been trying to buy them out for years, and when a deal was finally made, it was for more than 5x the company's worth, just due to the other bids coming in. VMware had something similar if you want to try it, but they gave up after NCR backed Zynstra and they took off.