Centralized liquid cooling has been used in some data centers for years, so others have certainly taken notice. I have a sort of centralized water cooling setup myself. I have yet to connect my new server to it, but my desktop is on it. I have a couple of heater cores, a huge 5' hydronic radiator, and about 40' of tubing, currently driven by an Iwaki MD-20 (Japanese motor). Personally I wouldn't screw with a bunch of pumps and loops. Just use manifolds, like is done for hydronic heating. The technology for this has been around for a long time in the HVAC industry.
The Koolance is a nice idea but is still not cheap at US$1,700. I would imagine a PC watercooling setup with a couple of video cards could come in at less than that quite easily, and that solution still does not include water blocks for the CPU / GPUs or pipe / connectors outside of the unit. I would also imagine that a lot of people like myself who do have a rack at home don't actually have it filled, so using a chunk of space at the bottom for an external pump and res may not be so bad. If the solution were to be moved into a data center then the Koolance may be the better option, although I don't see any redundancy / leak monitoring on it. It does look good though. I may look to go that way for phase 2 of my own watercooling rack project.

A tip for those stuck in the PC world: stop wasting money on expensive PC-based WC options! If you are stuck, then maybe some 3-5RU solutions may be in order:
https://www.google.com.au/search?q=...m_3k4u_water_cooled_server_chassis%2F;500;272
Koolance ERM-3K4U - water-cooled server chassis - www.nordichardware.com
Yep, car or motorcycle rads are an interesting option. Leak detection and protection for the watercooled servers and the rack are also preferable, but it really depends on the scope of the solution as to their importance. S0lid is building for his own home rack with a few servers and lots of spare space; it has different requirements to an enterprise solution and is really just a proof of concept for him, as the one I am doing is for me. Hopefully once the base build is up he will look at enhancements and protection.

I see from the images that a lot of wasted rack space is about to follow for no good reason, along with some serious potential issues. Placing WC gear above the fronts of expensive servers is not a prudent idea either.
Options to look at: bigger rads, in the form of those designed for vehicles and other larger applications. Place the rads in the top of the rack where the normal top vent fans would be; if not venting up, place the rads at the rear of the rack, or situate them somewhere else entirely to get rid of the heat rather than keeping it in the area.
Always use multiple pumps with check-valves and flow monitors; a failure can be messy or costly.
Always monitor the coolant level in the res to make sure that there are no leaks. Just for the record, evaporation is not possible if the loops are sealed.
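The flow and level monitoring above could be sketched as a simple alarm check. Everything here is a hypothetical placeholder: the thresholds and the sensor readings would come from whatever flow meter and reservoir level sensor are actually fitted, not from any specific product.

```python
# Hypothetical sketch of loop monitoring: flow-rate and reservoir-level
# alarms for a multi-pump loop. Thresholds are assumed example values;
# readings would come from real sensors (e.g. an inline flow meter and
# a float or ultrasonic level sensor on the res).

MIN_FLOW_LPM = 1.0    # below this, assume a pump or check-valve failure
MIN_LEVEL_MM = 40.0   # below this, assume a slow leak - a sealed loop
                      # should not lose coolant to evaporation

def check_loop(flow_lpm, level_mm):
    """Return a list of alarm strings for the given sensor readings."""
    alarms = []
    if flow_lpm < MIN_FLOW_LPM:
        alarms.append(f"LOW FLOW: {flow_lpm:.2f} L/min - check pumps")
    if level_mm < MIN_LEVEL_MM:
        alarms.append(f"LOW RES LEVEL: {level_mm:.1f} mm - possible leak")
    return alarms
```

A healthy loop (say 2 L/min, 60 mm in the res) returns no alarms; a blocked or failed pump trips the low-flow check even while the level is still fine.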
The Muc supercomputer's copper pipework in the blades looks great, but the pipes are of small bore (easier to block with corrosion etc.). It would also be interesting to see what sort of wattage pipes of that size can cope with, and to get an idea of what sort of pressures they are using. Going along these lines, how about a feed into a manifold in the server, splitting a 1/4" external connection into two 1/8" internal loops, one for each CPU, possibly with one also taking in the chipset?

Well, the plan is to use one CPU per server in the cpu0 socket, so it doesn't matter if cpu1's RAM gets blocked, at least in my case.
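The wattage question above can be roughly bounded with the basic heat-transport relation Q = mass flow × specific heat × temperature rise. The bore, flow velocity, and allowed coolant temperature rise below are assumed example figures for illustration, not measurements from any actual loop:

```python
import math

# Rough estimate of how much heat a small-bore water loop can carry:
# Q = mass_flow * specific_heat * delta_T.
CP_WATER = 4186.0    # J/(kg*K), specific heat of water
RHO_WATER = 998.0    # kg/m^3, density of water near room temperature

def loop_capacity_watts(bore_m, velocity_m_s, delta_t_k):
    """Heat (W) a pipe of the given internal bore can carry at a given
    flow velocity and allowed coolant temperature rise."""
    area = math.pi * (bore_m / 2.0) ** 2          # cross-section, m^2
    mass_flow = RHO_WATER * area * velocity_m_s   # kg/s
    return mass_flow * CP_WATER * delta_t_k

# Example: a 1/8" (3.175 mm) bore loop at 1 m/s with a 10 K rise
q = loop_capacity_watts(0.003175, 1.0, 10.0)
print(f"{q:.0f} W")  # roughly 330 W
```

So even at a modest 1 m/s, a single 1/8" loop can in principle move a few hundred watts, which suggests one small-bore loop per CPU is thermally plausible; the practical limits would be pump head and corrosion blocking the bore, as noted above.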
Copper pipes, now that's an interesting idea. I was kinda wondering how to do the tubing so it doesn't go outside the node's height, and copper pipes would be just perfect for the task. Now I need to ask around to find out what kind of fittings work with copper pipe.