Since I often ask others to make build threads, and we are about to take off, I figured I would make a quick one of my own.
The goal of this system is to get 8x GTX 1080 Ti's into a Supermicro SYS-4028GR-TR (not the TR2 model).
Ideally, I would be using the TR2 model with its single-root PCIe design, but this is good enough. The goal is to have two of these systems up for DemoEval, STH deep learning (until Volta), and crypto mining (to help pay for power and rack space). I am planning to put this on the main site, so this may just be an interim build piece.
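For anyone curious what single root versus dual root actually looks like on a box like this, a minimal sketch in Python (assuming a Linux host with the NVIDIA driver installed; `nvidia-smi topo -m` is a standard driver tool, not anything specific to this build) dumps the topology matrix:

```python
import subprocess

# Dump the GPU/PCIe topology matrix. On a dual-root board like the
# SYS-4028GR-TR, the GPUs split across the two CPUs' root complexes
# (SYS/NODE hops between the halves); the single-root TR2 design
# keeps all GPUs under one root for faster peer-to-peer transfers.
print(subprocess.check_output(["nvidia-smi", "topo", "-m"], text=True))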
Phase 1: The first four 1080 Ti's (three of which will hopefully stay):
Phase 2: 8x GPUs
There are a few items I wanted to note thus far:
- I ended up buying two of the "hump" tops that add clearance for the GTX power cables. That makes the unit essentially 4.5U, or really 5U. I plan to use the lost U for an IPMI switch or something similar. Cost was around $110 each.
- The air baffle in the picture has been removed. I was told that both baffles are really unnecessary.
- The NIC is currently a Mellanox ConnectX-3 VPI card. The lab mostly runs on 40GbE, so I can use it for 40GbE now and swap to InfiniBand if I can get a switch.
- You can see one of the GTX 1080 Ti's is a 3-fan Gigabyte card (the GeForce GTX 1080 Ti GAMING OC 11GB, N108TGAMINGOC-11GD, from Amazon). I really like that card, but it will be removed from this system in a future revision. In dense GPU situations, you need blower-style coolers if you cannot get passively cooled GPUs.
- Total power consumption is sub-1300W while mining on four GPUs (see the quick check after this list).
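As a quick sanity check on that power number, a few lines of Python can poll `nvidia-smi` for per-GPU draw and the negotiated PCIe link width (both are standard query fields); the wall reading is higher than the GPU sum because it also covers CPUs, fans, and PSU losses:

```python
import subprocess

# Per-GPU power draw and negotiated PCIe link width via standard
# nvidia-smi query fields; the ~1300W wall figure additionally
# includes CPUs, fans, drives, and PSU conversion losses.
out = subprocess.check_output([
    "nvidia-smi",
    "--query-gpu=index,name,power.draw,pcie.link.width.current",
    "--format=csv,noheader",
], text=True)
print(out)
```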
First, I wanted bandwidth for deep learning; the normal mining builds with PCIe x1 risers would not work. Second, the system is too loud for an office, and filled with GPUs it will be a 2.5kW+ constant load, so it needed to be located in the data center. I also needed both machines to be relatively compact; two of them will occupy 9-10U total. Finally, there is something to be said for systems that were designed for this type of load and application rather than Frankenstein builds.
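To put a rough number on that first point, here is a minimal host-to-device transfer sketch, assuming PyTorch with CUDA (the 1 GiB buffer size is an arbitrary choice of mine): an x16 Gen3 slot typically lands around 10-12 GB/s, while an x1 mining riser is well under 1 GB/s, which starves training jobs that constantly stream batches to the GPU:

```python
import time
import torch

# Time pinned host-to-device copies; the gap between PCIe 3.0 x16
# (~10-12 GB/s) and a mining riser's x1 link (<1 GB/s) is why those
# builds are fine for mining but a bottleneck for deep learning.
x = torch.empty(256, 1024, 1024, dtype=torch.float32).pin_memory()  # 1 GiB
torch.cuda.synchronize()
start = time.time()
for _ in range(10):
    x.to("cuda", non_blocking=True)
torch.cuda.synchronize()
elapsed = time.time() - start
gib_moved = 10 * x.numel() * x.element_size() / 2**30
print(f"Host-to-device: {gib_moved / elapsed:.1f} GiB/s")
```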
Total cost:
- Barebones with PSUs: $3300
- 2x E5 V3 CPUs: $1000
- 128GB RAM: $800 (will be adding more)
- 4x GTX 1080 Ti - $2700
- Mellanox ConnectX-3 VPI - $120
- "Hump" top for GTX cards - $110