Hardware pron thread

manxam

Active Member
Jul 25, 2015
235
49
28
Nothing to see here move alo.. Oh, wait.. Orange! Lots and lots of orange cool stuff. Everyone crowd around.

P.S. I, too, work out right beside the server rack, but the vacuum hose is on the opposite wall. The vacuum, however, sits right above the rack. Makes cleaning the bin out... interesting...
 

Tom5051

Active Member
Jan 18, 2017
291
50
28
43
What's with the TV next to the rack? Can you even hear it over the fan noise?
 

Dawg10

Associate
Dec 24, 2016
220
113
43
manxam said: Makes cleaning the bin out... interesting...
There is a certain pucker factor there...

Tom5051 said: What's with the TV next to the rack? Can you even hear it over the fan noise?
Loudest thing in the rack is the UPS. Normal ops = 1x R320/8c/48GB/4x4TB/16 VMs + a SuperMicro D525/4x4TB storage barge + 1x Nortel 5520-48PWR with upgraded fans + a 2-disk Synology + (router.r.pi.thingies). I usually have the radio up loud and the TV on the security cams, eBay auto-refreshing, or the Playmate Party at 0930 on channel 780.
 

funkywizard

mmm.... bandwidth.
Jan 15, 2017
847
400
63
USA
ioflood.com
It works surprisingly well. Before modding, the cards cool very well "stand alone" but hit maximum temperatures within a minute or two when crammed next to other cards. The Gigabyte performs the worst in that crammed-together environment, so I mounted it on the end where it gets some access to airflow.

First off, the branded logo on the cards blocks a large percentage of the heatsink's exhaust fins, so I ripped that off right away. I tried quite a few fan configurations to see what would perform adequately without excessive power use. I settled on four "VH"-model 40 mm Delta fans (around 2 W each) plus a fifth "SH"-model fan (6 W, give or take), for about 14 W total.

Letting the Linux driver decide its own fan speed, at full workload (mining ZEC at the full 250 W TDP), the cards maintained an adequate temperature. It's been a while since I put that together, so I don't recall the exact figures, but I would not be satisfied with a steady-state full-load temperature unless it was below 80 °C.

If anyone cares I can track down my notes and additional details.
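For anyone replicating this, a quick way to check the sub-80 °C target across a stack of cards is to poll the driver and flag outliers. A minimal sketch below; the sample output string is made up for illustration, and on a real box you'd capture it from `nvidia-smi --query-gpu=temperature.gpu --format=csv,noheader` instead:

```python
# Sketch: flag cards running at or above a steady-state temperature target.
# The sample_output string is illustrative; on real hardware, capture it with:
#   subprocess.check_output(["nvidia-smi", "--query-gpu=temperature.gpu",
#                            "--format=csv,noheader"], text=True)

TARGET_C = 80  # per the post above: steady-state full load should stay below 80 °C

def overheating_cards(nvidia_smi_output: str, limit_c: int = TARGET_C) -> list:
    """Return the 0-based indices of GPUs at or above limit_c."""
    temps = [int(line.strip()) for line in nvidia_smi_output.splitlines() if line.strip()]
    return [i for i, t in enumerate(temps) if t >= limit_c]

sample_output = "71\n76\n83\n68\n"   # four cards, one over the limit
print(overheating_cards(sample_output))  # -> [2]
```

Run from cron or a loop, this is enough to catch a card that only overheats once its neighbors warm up.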
 

funkywizard

mmm.... bandwidth.
Jan 15, 2017
847
400
63
USA
ioflood.com
Speaking of Gigabyte, here's a before and after. A little less proud of this one: the stock cooling works better when there is sufficient airflow, but this mod works better when packing cards next to each other. Unlike the last post, this removes the stock fans completely. Unfortunately, the way the Gigabyte card's air shroud, fans, and heatsinks are placed doesn't give me much option; it's very much take it or leave it.

SkypePhoto_20170916_001751.jpg SkypePhoto_20170916_001746.jpg
 

moblaw

Member
Jun 23, 2017
77
13
8
36
Aren't you guys worried about pump MTBF?

An air-cooled GPU's fans won't spin below 60 °C, which reduces wear slightly, but with a GPU AIO the pump spins all of the time. Of course, "modern" pumps are rated at 5 years MTBF at 50 °C, from what I could gather.

A failed pump would let the GPU overheat and shut down / BSOD the system. As for the worst-case scenario, I don't want to speculate.
 

funkywizard

mmm.... bandwidth.
Jan 15, 2017
847
400
63
USA
ioflood.com
moblaw said: Aren't you guys worried about pump MTBF?

An air-cooled GPU's fans won't spin below 60 °C, which reduces wear slightly, but with a GPU AIO the pump spins all of the time. Of course, "modern" pumps are rated at 5 years MTBF at 50 °C, from what I could gather.

A failed pump would let the GPU overheat and shut down / BSOD the system. As for the worst-case scenario, I don't want to speculate.
Yeah, there is some concern there, but so far so good (fingers crossed).

I've found that the CPU all-in-one cooling is relatively pointless. If you use 2u heatsinks or desktop heatsinks, the temperatures are excellent and the power use is low.

For GPUs, however, it can be a mission and a half to keep them from overheating. For the water cooled cards, there is zero issue with that.

Two very similar cards:

1080 Ti, EVGA, air cooled, fans at 100%, max TDP reduced to 190 W: runs at 59 °C without any major obstructions to airflow. That temperature is acceptable but not ideal, especially considering I'm at under 80% of the power limit. If 2-3 cards like this are packed closely together, all bets are off: without monster supplemental fans, you'll hit 85 °C within a couple of minutes and throttle the clocks way down.

1080 Ti, EVGA SC2 Hybrid, low-power fan + pump, max TDP reduced to 190 W: runs at 29 °C, and it stays ridiculously cool even if I pack three of these cards back to back with one another.

edit: Another point is that the AIO GPUs are much, much quieter than the air-cooled ones, so that's a big plus as well. For home use it can be a different story: gaming doesn't push temperatures or fan speeds nearly as hard as mining or compute, and with a single GPU in a desktop chassis the cooling is generally adequate. In a server, however, only the "blower fan" models and the water-cooled models have any hope of cooling adequately, and the blower-fan models are very loud.
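Collecting the numbers from this comparison in one place (all temperatures are the ones quoted above at the 190 W cap; the 85 °C throttle point is the packed-air-cooled behaviour described in the post, not a spec I'm citing):

```python
# Steady-state temperatures quoted above, all at a 190 W power limit.
# Purely illustrative bookkeeping, not measurements of your hardware.
THROTTLE_C = 85  # packed air-cooled cards hit this and throttle, per the post

quoted_temps_c = {
    "1080 Ti air, standalone":      59,
    "1080 Ti air, packed 2-3 wide": 85,
    "1080 Ti SC2 Hybrid (AIO)":     29,
}

def will_throttle(temp_c: int, limit_c: int = THROTTLE_C) -> bool:
    """True if a card at temp_c has reached the throttle point."""
    return temp_c >= limit_c

for setup, temp in quoted_temps_c.items():
    print(f"{setup}: {temp} °C, throttles: {will_throttle(temp)}")
```

The gap between the packed air-cooled case and the AIO case is the whole argument: the hybrid card has over 50 °C of headroom before it would ever throttle.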
 

maze

Active Member
Apr 27, 2013
571
91
28
100G Transceiver (yellow cables)

Maybe it's not a big deal today, but three years ago when it arrived it was very exciting!
100G is still way cool :) even if it's pretty common these days.

People actually loading up 100G interfaces on average... now that's awesome ;)
 

whitey

Moderator
Jun 30, 2014
2,774
869
113
39
Right, I cannot even load up my 40GbE, certainly not on the stg side of the house. :-D