Phi and GRID K1/K2: Liquid Cooling (Alternatives?)

Aug 17, 2021
Sorry for the botched initial post. Got distracted and hit enter by mistake.

I've got a bunch of Xeon Phi and NVIDIA GPU cards, and I'm wondering whether they're worth hanging onto for a new project, or whether I should sell them used and buy newer CPU/GPU resources. I have five NVIDIA GRID K1 cards and two GRID K2 cards, plus five (maybe six) Phi 3120 cards and one (maybe two) Phi 5000-series cards. They came in seven 2U servers, purchased as a batch of five and then an additional two. The first five came with Phi 3120 and GRID K1 cards. I don't remember exactly what came in the final two; I believe they both have K2 cards, and at least one of their Phi cards is a 5000-series.


I'm ditching all of my dual-Xeon 5600 rack hardware and the racks themselves, and I'm building a couple of new tower machines in Fractal Node 804 cases I've already purchased. In the end I'd like to have four Node 804 machines: 2x NAS/DAS, 2x CPU/GPU, and then I'll figure something out for a few ESXi homelab hosts.

Are the Phi cards worth trying to use, or should I sell them? I'm hanging onto the GRID K2 cards because I haven't done the ESXi 7 upgrade yet, so I can still use them for vGPU. That leaves 5x GRID K1 cards as GPU compute resources. Is it worth trying to use the K1s? They're worth roughly $100 each used (~$500 for the lot), and I'd match that out of pocket to buy better cards. For ~$500/card, is there something that would be significantly better than a GRID K1 or K2?


The liquid cooling part is just an idea and totally new to me; I've never done it and know nothing about it. The logic behind it comes down to two things: size/space savings, and eight paws/two tails. The CPU/GPU cards I have are all passively cooled (no fans of their own), our home doesn't have hot/cold aisles, the cases I've chosen make two GPU cards a tight fit (or impossible), and finally, we have dogs, one of which I'd swear sheds her entire coat of fur on a daily basis. By liquid cooling the machines I gain the space I want/need to run CPU + GPU (or dual GPU) in each machine, and I don't have to worry as much about fans clogging with dog hair.


I've looked online and don't see much about people liquid cooling Phi cards or NVIDIA GRID K1/K2 cards. I did find one place that seems to have a cooling plate for (dumb luck) the exact Phi 3120 cards I happen to have, but I have no clue whether the parts are actually available or what they cost. I'm wondering if the cost of the cold plates would far exceed the value of the Phi cards themselves (especially since I believe the 3120s are better suited to sequential than random workloads).

My last question is about motherboard CPU sockets, data access, and Optane-style flash resources. I've never used Optane or similar persistent/flash storage, but I assume it would be a welcome, positive addition. Is the LGA 3647 socket a solid platform that'll get me a couple of years of work, or should I consider something else? And what about 10GbE networking for data access? I have a mess of InfiniBand gear that goes along with the 7x 2U servers that I'd really like to get rid of. Is 10GbE enough of a pipe that I won't grow old and die waiting to move things around?
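For a rough sense of scale, here's a minimal back-of-envelope sketch. The ~9.4 Gbit/s usable figure is my assumption for a reasonably tuned 10GbE link after protocol overhead; real throughput depends on the disks and NICs on both ends:

```python
# Back-of-envelope: how long does a bulk copy take over 10GbE?
# Assumption (not measured): ~9.4 Gbit/s usable after TCP/IP overhead,
# and storage on both ends can keep up.

def transfer_hours(dataset_tb: float, usable_gbit_s: float = 9.4) -> float:
    """Hours to move dataset_tb terabytes at usable_gbit_s gigabits/sec."""
    bits = dataset_tb * 1e12 * 8            # decimal TB -> bits
    seconds = bits / (usable_gbit_s * 1e9)  # bits / (bits per second)
    return seconds / 3600

for tb in (1, 10, 50):
    print(f"{tb:>3} TB over 10GbE: ~{transfer_hours(tb):.1f} h")
# ~0.24 h per TB, so 10 TB is ~2.4 h and 50 TB is ~11.8 h
```

So 10GbE moves roughly 4 TB per hour at best. That's slower than the InfiniBand gear on paper, but for typical homelab-scale moves it means hours, not days.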

Thanks for reading, and thanks in advance for any replies/feedback.
 