Deeplearning010 - 8x GTX 1080 Ti Build (Deep learning and crypto mining)


Patrick

Administrator
Staff member
Dec 21, 2010
12,511
5,792
113
Since I often ask others to make build threads, and we are about to take off, I figured I would make a quick build thread.

The goal of this system is to get 8x GTX 1080 Ti's into a Supermicro SYS-4028GR-TR (not the TR2 model).

Ideally, I would be using the TR2 model with its single-root PCIe layout, but this is good enough. The goal is to have two of these systems up for DemoEval, STH deep learning (until Volta), and crypto mining (to help pay for power and rack space). I am planning to put this on the main site, so this may just be an interim build piece.

Phase 1: The first four 1080 Ti's (three of which will hopefully stay):



Phase 2: 8x GPUs

There are a few items I wanted to note thus far:
  • I ended up buying two of the "hump" tops, which add a raised hump for the GTX power cables. That makes the unit essentially 4.5U, or really 5U. I plan to use the lost U for an IPMI switch or something similar. Cost was around $110 each.
  • The baffle in the picture is out. I was told that both baffles are really unnecessary.
  • The network card is currently a Mellanox ConnectX-3 VPI. The lab mostly runs on 40GbE, so this way I can use it for 40GbE but also swap to InfiniBand if I can get a switch.
  • You can see one of the GTX 1080 Ti's is a 3-fan Gigabyte (this one: Amazon.com: Gigabyte GeForce GTX 1080 Ti GAMING OC 11GB Graphic Cards N108TGAMINGOC-11GD: Computers & Accessories). I really like that card, but it will be removed from this system in a future revision. In dense GPU situations, you need blower fans if you cannot get the passively cooled GPUs.
  • Total power consumption is sub-1300W while mining on four GPUs.
This is nothing like many mining/gaming rigs for a few reasons.

First, I wanted bandwidth for deep learning; the normal mining builds with PCIe x1 connections would not work. Second, it is too loud for an office, and filled with GPUs it will be a 2.5kW+ constant load, so it needed to live in the data center. I also needed both machines to be relatively compact: two of them will occupy 9-10U total. Finally, there is something to be said for systems that were designed for this type of load and application rather than Frankenstein builds.
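To put a rough number on that bandwidth point, here is a minimal host-to-device copy timing sketch (assuming a machine with PyTorch and CUDA installed; the function name and sizes are just illustrative). A full x16 Gen3 slot should land around 10-12 GB/s with pinned memory, while an x1 riser tops out at a small fraction of that, which is fine for mining but not for feeding training data.

```python
# Rough host-to-device copy bandwidth check (minimal sketch, assumes PyTorch + CUDA).
import torch

def h2d_bandwidth_gbs(size_mb=256, iters=20, device="cuda:0"):
    # Pinned source buffer so the copy measures the PCIe link, not the allocator.
    src = torch.empty(size_mb * 1024 * 1024, dtype=torch.uint8, pin_memory=True)
    dst = torch.empty(src.numel(), dtype=torch.uint8, device=device)
    start = torch.cuda.Event(enable_timing=True)
    end = torch.cuda.Event(enable_timing=True)
    torch.cuda.synchronize(device)
    start.record()
    for _ in range(iters):
        dst.copy_(src, non_blocking=True)
    end.record()
    torch.cuda.synchronize(device)
    seconds = start.elapsed_time(end) / 1000.0  # elapsed_time() is in milliseconds
    return (size_mb / 1024.0) * iters / seconds

if __name__ == "__main__":
    print(f"Host-to-device: {h2d_bandwidth_gbs():.1f} GB/s")
```

CUDA events are used for the timing so the figure reflects the copies themselves rather than Python overhead.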

Total cost:
  • Barebones with PSUs: $3,300
  • 2x E5 V3 CPUs: $1,000
  • 128GB RAM: $800 (will be adding more)
  • 4x GTX 1080 Ti: $2,700
  • Mellanox ConnectX-3 VPI: $120
  • "Hump" top for GTX cards: $110
Overall, it is an expensive build at around $8k thus far. My hope is that this turns into a build that pays for the lab racks each month by crypto mining when it is not being used for other purposes.
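As a quick sanity check on that running total, the line items above sum as follows (figures copied straight from the list):

```python
# Quick sum of the build cost line items listed above.
costs = {
    "Barebones with PSUs": 3300,
    "2x E5 V3 CPUs": 1000,
    "128GB RAM": 800,
    "4x GTX 1080 Ti": 2700,
    "Mellanox ConnectX-3 VPI": 120,
    '"Hump" top for GTX cards': 110,
}
print(f"Total: ${sum(costs.values()):,}")  # Total: $8,030
```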
 

T_Minus

Build. Break. Fix. Repeat
Feb 15, 2015
7,625
2,043
113
How much bandwidth do you expect this to use while mining at 100% utilization?
 

voxadam

Member
Apr 21, 2016
107
14
18
Portland, Oregon
Jesus! That thing's a tank.

One quick question: what exactly is the "hump" that you mention? I'm guessing that it's some sort of power bus for the GPUs, though I could be totally off base. I'm having a difficult time parsing this statement:
I ended up buying two of the "hump" tops that a hump for the GTX power cables.
Though, it's after 0300 here and I just got back from having a couple pints with some friends so there's a fairly good chance that the "parse error" is a client side issue.

Also, you mention that you'd rather have used the SYS-4028GR-TR2 with its single PCIe root complex, and I'm curious in what ways such a layout would benefit your use case(s). Unfortunately, I only have a limited understanding of modern GPU memory addressing schemes. A lot has changed in the past few years with advances such as unified memory architectures, peer-to-peer addressing, GPUDirect, RDMA, and whatnot. I imagine I wouldn't be the only one here interested to hear your thoughts on these topics, especially as they relate to machine learning and cryptocurrency mining.
 

msvirtualguy

Active Member
Jan 23, 2013
494
244
43
msvirtualguy.com
I have that Gigabyte 1080 Ti FE card and it runs hot! Hopefully that chassis blows some nice air across those FE cards. I'm actually looking to get the EVGA water cooling kit for mine.
 

Patrick

Administrator
Staff member
Dec 21, 2010
12,511
5,792
113
I have that Gigabyte 1080 Ti FE card and it runs hot! Hopefully that chassis blows some nice air across those FE cards. I'm actually looking to get the EVGA water cooling kit for mine.
Yea, this is a chassis designed to cool 3kW in 4U, including passive 300W cards. I was encouraged by how many folks I see running these with 8x Titan X, 1080, and 1080 Ti cards.

The Ti is actually a bit better since the DVI port is removed, allowing better blower fan cooling.
 

Patrick

Administrator
Staff member
Dec 21, 2010
12,511
5,792
113
@T_Minus bandwidth or power? Bandwidth is negligible. Power is substantial.

@voxadam there will be a main site post on the hump soon. Single root is less exciting for me now but it helps when you have big distributed workloads and RDMA IB. Too much to type while walking on my phone.
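In the meantime, here is a minimal sketch (assuming the NVIDIA driver and PyTorch with CUDA are installed) that dumps the PCIe topology matrix and flags GPU pairs without direct peer-to-peer access. On a dual-root board like this -TR, pairs split across the two CPUs typically show up as "SYS" and fail the peer check, which is the gap the single-root TR2 layout closes.

```python
# Minimal sketch: inspect the GPU/PCIe topology and peer-to-peer reachability.
# Assumes the NVIDIA driver (nvidia-smi) and PyTorch with CUDA are installed.
import subprocess
import torch

# Topology matrix: how each GPU pair is connected (PIX/PXB/PHB/SYS) plus NIC affinity.
print(subprocess.run(["nvidia-smi", "topo", "-m"],
                     capture_output=True, text=True).stdout)

# Flag GPU pairs that cannot reach each other with direct P2P copies.
count = torch.cuda.device_count()
for i in range(count):
    for j in range(count):
        if i != j and not torch.cuda.can_device_access_peer(i, j):
            print(f"GPU {i} -> GPU {j}: no direct peer-to-peer access")
```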

Three more 1080 Ti FE cards are already en route. I need to pick up two more (replacing the 3-fan model) to finish this build off.
 
Reactions: voxadam

T_Minus

Build. Break. Fix. Repeat
Feb 15, 2015
7,625
2,043
113
Bandwidth.

I've only read a little about mining, but nothing about bandwidth, and with 8 GPUs I was wondering how much it uses now?

Thanks! :)
 

Vondra

New Member
Jun 4, 2017
1
1
3
44
Hi Patrick,

wow, great build. I was thinking about a similar concept but ended up with an open-frame rig on an Asus X99 board with 7x GTX 1070 on PCIe x1 risers.
How satisfied are you with the cooling? The GPUs are very close together; is there enough airflow?
If I see it correctly, you connected the standard 8-pin and 6-pin power from above. Do they actually fit?
 
Reactions: Patrick

Patrick

Administrator
Staff member
Dec 21, 2010
12,511
5,792
113
@Vondra the big reason for this over the x1 risers is that the cards have bandwidth for other purposes.

Cooling is fine. The mid-plane fans are meant to cool 8x (10x in newer chassis) passive GPUs, so adding the blowers helps. The FE cards also have the right heatsink fin orientation and openings at the correct end. I did talk to someone about the possibility of pulling off the air shrouds and running the cards passive, which may work better.

@T_Minus I am trying to get red SLI connector covers on all of the GPUs in the CPU2 PCIe slots and black SLI covers on all of the CPU1 GPUs. The 3-fan card is from when I thought I was going to do a single GPU. That outlook has... changed.
 
Reactions: T_Minus and Klee

Patrick

Administrator
Staff member
Dec 21, 2010
12,511
5,792
113
Thus far peak power is 2461W with all four PSUs plugged in and the GPUs all going (CPUs at low load).

Doing some mining right now. The EWBF Zcash miner is doing around 5.4 kH/s.

Assuming $300 per card of depreciation over 180 days, it is roughly a 60-day payback for the cards, including data center power and pool fees. Someone pointed out to me that the resale value of the cards is not necessarily zero if you can sell them off in a few months.
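For anyone who wants to sanity check the payback math with their own numbers, here is a back-of-the-envelope sketch. Only the hashrate and wall power figures come from this thread; the daily yield, coin price, pool fee, and power rate are placeholder assumptions, so swap in current values.

```python
# Back-of-the-envelope payback math (minimal sketch, placeholder inputs).
# Only the ~5.4 kH/s hashrate and ~2461 W wall power come from this thread;
# the yield, coin price, pool fee, and power rate below are assumptions.
hashrate_kh = 5.4              # EWBF Zcash on 8x 1080 Ti (from the post)
wall_watts = 2461              # peak wall power (from the post)
zec_per_kh_day = 0.035         # hypothetical ZEC mined per kH/s per day
zec_price_usd = 250.0          # hypothetical ZEC price
pool_fee = 0.01                # hypothetical 1% pool fee
power_usd_per_kwh = 0.10       # hypothetical data center power rate
to_recover_usd = 8 * 300       # the $300/card depreciation assumption

revenue_day = hashrate_kh * zec_per_kh_day * zec_price_usd * (1 - pool_fee)
power_day = wall_watts / 1000 * 24 * power_usd_per_kwh
net_day = revenue_day - power_day
print(f"Net ${net_day:.2f}/day; ~{to_recover_usd / net_day:.0f} days "
      f"to cover ${to_recover_usd} of depreciation")
```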
 

gotchu

New Member
Apr 25, 2016
25
4
3
34
Patrick said:
Since I often ask others to make build threads, and we are about to take off, I figured I would make a quick build thread. …
Any cheap case/motherboard combo for a 4-GPU setup? I want to put one in the lab to do some DL and to mine while idle. 8 GPUs may just be too loud for the lab, I guess (may I ask how loud it gets under load?).
 

gotchu

New Member
Apr 25, 2016
25
4
3
34
Hi Patrick,

I am really concerned about the noise level. My plan was to place one such machine (maybe with fewer cards, depending on the noise level) in the lab for my DL projects and use the free electricity for mining :). If it gets too loud, my labmates may complain about it. Could you give me a rough estimate of how loud it gets with 8 cards (or 4 cards) at full load? Thanks.