So... need help getting into a Quanta Windmill in 2018 :-P

Should I be looking for single nodes or those dual node/single PSU systems?

So I'm hoping to finally join the ranks of Quanta Windmill owners.


I know the last year or so I've seemed a little waffly, asking a lot of diverse questions in different directions and not yet jumping on deals offered by others - cash wasn't available during the struggle to get into college. This time financial aid should exceed immediate bills, so I'm looking to buy in, hopefully around Feb-Mar of 2018. I'm looking to spend around $1200-2000 on computer upgrades between now and August, and I want them to have an extended lifespan (not future-proof, but future-ready), lasting me all through college and into grad school even if RAM/GPU (or even CPU) upgrades are needed by then. I'm drawn to the Quanta Windmill's massive RAM capacity and expect I only need to plan around known needs to keep it useful into the early 2020s.

However, this $1.2k-2k is not for one computer; it was planned to cover the basic components of 3 TO 6 workstations built with cross-compatible hardware - I need multiple seats within that budget. That's because both my GF and I are in video/graphic design, a friend who wants to code just needs a PC to learn on, we want backup hardware (including at an out-of-town location we all visit), and we're tired of hauling computers carefully over slippery winter ice fearing a drop, etc. etc. etc.
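Rough per-seat math to frame that constraint (the budget range and seat counts are from the paragraph above; the splits are just a worked example, before monitors, drives, or GPUs):

```python
# Per-seat budget split for the $1200-$2000 / 3-6 workstation plan above.
# The totals and seat counts come from the post; the splits are illustrative.
for budget in (1200, 2000):
    for seats in (3, 6):
        print(f"${budget} over {seats} seats -> ~${budget / seats:.0f}/seat")
# Works out to roughly $200-$670 per seat, which is why cheap DDR3-era
# server boards are attractive here.
```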

More than 6 could even happen if the board ends up usable for midrange gaming (I have a deal with someone: if I can build them an affordable PC on this hardware for that, I get a little side cash for future builds... which will then buy more hardware for me), or if I can use it as an always-on NAS server by throwing a SAS card in there, provided the power use isn't too obscene.


Definite uses the system will see: heavy Adobe CC use (mostly Premiere for 4K and 6K video, probably 8K later, plus After Effects), DaVinci Resolve, and 3D Studio Max/Maya. I don't code, but one station is for someone who will, on Unity/UE4 for VR headset stuff, as we want to go into game development (no Windows Mixed Reality yet; I'm aware that requires Haswell). Those are the important demanding uses anyway. I also hope to learn about ESX/other virtualization because it sounds incredibly useful to know for the future. I expect 64-128GB per workstation, and it's primarily the low cost of used RAM that's driving me to this generation of hardware - by the time DDR4 gets this cheap I'll be out of school. :(


Anyway, I would like to start with just one computer. Once I learn that, and if it all goes well, I'll hopefully buy into the rest of the hardware by summer, since it will be familiar by then and I'll finally have the time to build them all up. I have to have it ready for the fall 2018 semester.

So I've got two main questions to start: should I be looking at buying single nodes (and modifying ATX PSUs to run them), or should I be getting one of those dual-node chassis (with the single 220VAC PSU) because the extras make it worth it?

Next, is there any reading you recommend, whether in the forums, elsewhere online, or in articles? Or a big how-to to get everything up and running? I read through the big 33-page thread I saw, taking notes, and was still left with questions that (despite Google and admonishments to use it more) didn't turn up as answered for me.


Things I think I will want (so I'm not hunting for them in the future):
- The PCIe risers that allow dual x16 cards
- The 10GbE mezzanine cards (at least eventually, or right away if not too expensive)
- A suggested SAS card that will work with Win 7, Linux, FreeBSD, and hopefully ESX (if there are issues with fitment and such; otherwise is it just the same ones recommended elsewhere? See the quick detection check sketched after this list.)
- The included heatsinks, since I'm just running stock speeds
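For the SAS card item above, here is a minimal sketch of how the card could be sanity-checked under Linux once it arrives; it assumes an LSI/Broadcom-family HBA (the ones usually recommended) and that `lspci` is available, so adjust the keywords for whatever card is actually bought:

```python
#!/usr/bin/env python3
# Minimal sketch: confirm a SAS HBA shows up on the PCIe bus under Linux.
# Assumes an LSI/Broadcom-family card and that lspci is installed.
import subprocess

def find_sas_controllers():
    # lspci lists all PCI/PCIe devices; SAS HBAs typically appear as
    # "Serial Attached SCSI controller" entries.
    result = subprocess.run(["lspci"], capture_output=True, text=True, check=True)
    keywords = ("Serial Attached SCSI", "LSI", "Broadcom")
    return [line for line in result.stdout.splitlines()
            if any(k in line for k in keywords)]

if __name__ == "__main__":
    controllers = find_sas_controllers()
    if controllers:
        print("Likely SAS controller(s) found:")
        for line in controllers:
            print("  " + line)
    else:
        print("No SAS controller detected - reseat the card or check the riser.")
```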
 

StammesOpfer

I would go with the dual nodes. Hacking together power is more work. The fans included with the chassis are actually pretty good and quiet under normal conditions. The only downside is it won't run on 120V power, but I ran mine on a cheap 120V>240V transformer for almost a year and never had an issue. The Quantas came with the 10GbE mezzanine card (at least mine did; no riser, though).

The LSI SAS cards recommended everywhere else will work, but you might need to add a fan since there isn't a ton of airflow at the back (or is it considered the front on these?).

Stock heatsinks in the dual chassis with the black air shrouds are a good solution. If running outside the chassis or without the shroud, then you need aftermarket heatsinks with fans on them.
 
Right, I've read that far, about people suggesting the transformer and such... What I was curious about was that there seemed to be a pretty decent cost difference - $60/board vs. $200 for the pair ($120 for two bare boards) - so I was wondering what the extra $80 got you: whether it was only the shared quirky PSU, or a bunch of little things (like 10GbE) that would cost you more to track down on your own.
 

StammesOpfer

The PSU and an engineered cooling solution. 10GbE is more about what the seller you buy from includes. Basically, not having to rig something together for power and cooling.
 
So if I WERE willing to hack up an ATX PSU for it, that reduces the difference a bit... because I assume the cooling could be standard LGA2011 coolers and fans? (Even if it may sit in an open case.) I haven't yet investigated the PSU hack in depth, but I'm not afraid of a bit of soldering if that's all it is.


I mean, in part, if I want to use a GPU with an external PSU I'm going to need to add a PSU anyway, and it's going to be hacky on some level no matter what... I'm okay with that; it just makes me think it's worth learning to do the PSU mod, since it's something I'll need multiples of in all likelihood. :)
 
Okay, I think I'm starting to see the logic... Looking around, those heatsinks are like $30 apiece, so two CPUs and a $60 mobo becomes a $120 mobo just to add coolers... even if I didn't use the shared PSU it wouldn't matter. I'm assuming the heatsinks are passive, cooled by the included fans ducting air through the enclosure (insofar as there is one), so this is a fair comparison?
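Putting those numbers side by side (all figures are the rough prices quoted in this thread, not current listings):

```python
# Per-node cost comparison using the rough prices mentioned in this thread:
# bare board ~$60, aftermarket LGA2011 coolers ~$30 each, dual-node chassis
# (with PSU, fans, and shrouds) ~$200 for the pair.
bare_board = 60
coolers = 2 * 30                   # two CPUs, $30 per aftermarket cooler
single_node_total = bare_board + coolers     # still needs an ATX PSU hack on top
dual_chassis_per_node = 200 / 2    # chassis price split across its two nodes

print(f"Single board + coolers:     ~${single_node_total} per node (plus PSU hack)")
print(f"Dual chassis w/ PSU + fans: ~${dual_chassis_per_node:.0f} per node")
```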


And with the wonky PSU connector there's an additional barrier - adding a 'second' PSU just for the graphics card could potentially be easier than powering the whole thing from ATX.


So the 'dual node' setups are starting to seem like a better idea (guess that question was answered quicker than I originally thought), especially to just get something up and working ASAP without the higher-power GPU option.


Step two, I guess, becomes who the main sellers are (are there really that many choices?) and what expectations I should have. I.e., is it easy or hard to find the 10GbE card and such later if it's not included? (Someone linked the PCIe dual-slot riser elsewhere.) Is there anything else that helps distinguish a better vs. worse deal in terms of what I want in the package?

Is the time to buy 'now'/ASAP, or is there any reason to wait for cyclic availability of better offers? Any reason to believe prices will drop, availability/extras in packages will improve, and so on?

==========
EDIT: Well, crap, a perusal of my previous assumptions is starting to skew the economics a bit. The $200 dual-node systems are now up to $300 shipped with a terrible return policy (must test within 72 hours - but if I don't have CPUs and RAM handy, am I testing those at the same time from other sellers?), and the 8-core CPUs that were $60 last year and $100 in December are now showing at $140. What was a good deal is losing some luster at 50% more per node with a 3-day warranty, when I'm starting to see other ways to put up 3-6 seats with a lot of RAM on Nehalem/Westmere systems with far less hassle and more PCIe slots... especially if I can't use most of those extra cores for anything useful anymore. Did I miss the boat on the really great Windmill deals, or does anyone know of gear sources out there below current/closer to previous eBay prices?

Does anyone think I'd regret going LGA 1366 instead of LGA 2011 for my stated needs of 6K/8K video editing and all the rest, assuming I'm running just as much RAM (128GB or more) in the right boards (and despite half the cores)?
 

StammesOpfer

I don't think I would go 1366 at this point. But yes, these are not as cheap as they used to be. I would guess we'll see another big dump of hardware this year, but that's just based on how long it has been since these came on the market. I've got one of these dual-node systems that I could sell (no RAM or CPUs), but it looks like you're looking for multiple units. It may just not be a great time right now.
 
It's more the warranty term that bothered me; searching eBay, I either get used for $175 shipped or new for $300 shipped with a crap 72-hour warranty. Plus I'm still thinking about those little hassles everyone talked about in 35 pages of reading elsewhere, when I'm already as busy as I am and won't have much tinker time until late summer.

That said, if another Windmill buying opportunity presents itself AND CPU prices get a similar forcing-down in value from market dumping, hopefully I'll be ready to hit the ground running. If I get DDR3 for some LGA 1366 gear, I hope it will still work fine in the Windmill (RAM used to seem a lot more sensitive to which system it was compatible with, but isn't it a lot less persnickety and more generally compatible now?). If I spend very little on LGA 1366 mobos and CPUs because they're ultra-cheap, that doesn't deprive me of the ability to get yet more Windmill boards and tinker with them later. It's more about postponing the big six-workstation gearing-up plan to wait and see if another dump creates a buying opportunity. My reason for considering a 1366 mobo over 2011 is the faster-clocked quad cores available for less money in a board that still takes the cheap ECC Reg DDR3 that's really the centerpiece of anything I'm eyeing as a build option.

This thread could still be valid. :) I just said "Windmill in 2018"; it might get necroposted next December if there's a new price drop happening by then!
 

StammesOpfer

Not terrible logic if you think you will be happy with performance. Not crazy expensive to give it a shot. Keep in mind that if a new set of equipment drops it will be newer and that would probably mean DDR4. But it also might mean a bunch of us are dropping our DDR3 equipment even cheaper too. I don't know what the right answer is for you. I've gone through about 6 different setups in the last 3 years. Try a few things and figure out what works best for your budget and needs.
 

Evan

E5 v2 is 4 years old in 2018, so this will likely be the last year of big disposals before we see E5 v3 (DDR4) next year.
 
StammesOpfer said: Not terrible logic if you think you will be happy with performance. Not crazy expensive to give it a shot. Keep in mind that if a new set of equipment drops it will be newer and that would probably mean DDR4. But it also might mean a bunch of us are dropping our DDR3 equipment even cheaper too. I don't know what the right answer is for you. I've gone through about 6 different setups in the last 3 years. Try a few things and figure out what works best for your budget and needs.
Good point! At some point the secondhand-equipment dumpers will put all their Windmills back onto the market because something new catches their eye; I just don't know if it will drive the market down as much, or be as 'all at once', as the big decommissioners.

Mostly I'm just trying to eliminate bottlenecks - not chase performance. If something only gives me 10% of the speed I need, I have to eliminate it (like disk thrash - eliminated with RAM and an SSD). But if it's just "50% more clock speed for 50% more performance", that's not currently worth the money to me this early in my career. I just need a capability - not to do it exceptionally well, just to do it at all... so I can accept jobs, get paid, and upgrade as I go.

I found this video while searching around, and it mirrored my philosophy exactly:


My biggest point, if any, is that I'm not risking much money (therefore a mistake is not expensive, as it would be if I overspent on the wrong gear) and it gets me through - like duct tape and baling wire on a commercial vehicle, not to the point of being unsafe, just patching something that isn't essential to fix today - until the paying jobs give you what you need to do it right, all at once.

As it is, I'm continuing to research in parallel in basically three directions: upgrading old workstations; buying old workstation motherboards (just the boards, for far cheaper shipping and perhaps better value - Supermicro and Tyan tend to be sold as boards rather than whole units); and keeping the Quanta Windmill as a topic to revisit if there's suddenly a new dump and prices go even lower, or to revisit down the road... because I'll still probably have 400GB of DDR3 lying around that can slap right in if I get some new-old-stock mobos for next to nothing then and the relevant CPUs are back down.

The only thing that shuffles around is what gets bought when - the best-prioritized value of this moment to help do the jobs I keep missing out on so far. I might just get a pair of old Intel 5520 boards to start with something, watch local ads for old workstation gear so I don't have to mess about with shipping, and wait for the next Windmill buying opportunity to present itself, since with a few paying jobs I'll have more cash in hand by then as well, so it's less about splitting pennies to begin with!

At some point I'll want to have a plan for the DDR4 era, mostly for when I'm doing enough work that the power bill starts significantly eating into profit; I can easily afford upgrades then. Yet CPUs have sorta plateaued for a while, only on a slow rise ever since the Core 2 Duo era and especially leveling off since Sandy Bridge... if I get some Ivy Bridges into those Quanta boards, I'm pretty sure I'd be 8K-ready and possibly still able to use them in 2025, limited only by the electric bill.
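For a ballpark on that power bill (both the wall draw and the electricity rate below are assumptions, not measurements):

```python
# Rough annual power cost for one always-on dual-socket node.
# Assumptions: ~250 W average wall draw and $0.12/kWh; substitute real figures.
watts = 250
rate_per_kwh = 0.12
kwh_per_year = watts * 24 * 365 / 1000
print(f"~{kwh_per_year:.0f} kWh/year -> ~${kwh_per_year * rate_per_kwh:.0f}/year per node")
# ~2190 kWh and ~$263/year per node at these assumptions; scale by node count.
```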
 
Evan said: E5 v2 is 4 years old in 2018, so this will likely be the last year of big disposals before we see E5 v3 (DDR4) next year.
That's a good point; aren't workstation cycles normally considered about five years? So the last of the Westmere mobos and CPUs, and the DDR3 that goes with them, should in all likelihood be getting dumped.

Any idea if used server DDR3 will likely continue to drop? Have you ever seen DDR2 get cheaper now that it's obsolete, or did it just kind of hit a floor and stay there, or even bounce back up a little? I'm just curious to ask those who have watched these markets longer than me, I guess.
 

Evan

Generally most leases are 36, 48, or 60 months, with (as far as I know) a strong weighting toward 48.
The cloud providers are also more likely to have 36 months, combined with early access to CPUs, etc.
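A quick worked version of that timeline (the launch date is approximate; the lease lengths are the ones quoted above):

```python
# Lease-cycle math behind "last year of big disposals": E5 v2 (Ivy Bridge-EP)
# launched around Q3 2013, so typical leases end roughly:
launch = 2013.75                      # ~Q3 2013, approximate
for months in (36, 48, 60):
    print(f"{months}-month lease -> ends around {launch + months / 12:.1f}")
# ~2016.8, ~2017.8, ~2018.8 - i.e. the bulk of off-lease E5 v2 gear lands
# on the used market through 2017-2018.
```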
 