Workstation: Dual Xeon Gold 6254 Workstation


pluralform

New Member
Apr 11, 2019
Build Nomenclature: Malkovich... Malkovich
Distro: Linux - Arch
CPU: (2x) Xeon Gold 6254
RAM: 384GB Crucial DDR4-2933 ECC (12x 32GB modules)
Chassis: Phanteks Elite Black
Drives: (1x) Samsung 970 Pro 1TB [boot], (2x) Samsung 970 EVO 2TB [cache]
GPU: (2x) Titan RTX, (1x) RTX 2080 Ti
Motherboard: Supermicro X11DPG-QT
PSU: Corsair AX 1600i
Coolers: (2x) Noctua NH-U12S DX-3647
Extra: NVLink bridge (Titan RTX)


Focus: 3D VFX, Simulation & Rendering
 

pluralform

New Member
Apr 11, 2019
I have had some great advice elsewhere, but I'm still a tad undecided on the motherboard. As far as expanding M.2 drives, isn't the Supermicro a bit limited to PCIe expansion cards, versus the ASUS having 4x U.2 connectors (which adapt to M.2), VROC, etc.? Plus, I'd have to "slightly" modify that Elite case to fit the Supermicro board's proprietary form factor.

Curious as to everyone's thoughts in general, but tips and insights on the motherboard decision are greatly appreciated. Also, the power supply: if I just start with the two Titan RTX cards, it should be fine, but if I add the 2080 Ti, or eventually watercool four cards, that is going to need more juice, I think. Is it possible to safely and properly run two PSUs simultaneously, maybe one dedicated to the GPUs, for example?

Have some other things I'd like to ask if some of you smart folks are willing to engage with me in getting this right.

Thanks a ton
 

Patrick

Administrator
Staff member
Dec 21, 2010
Why have the 2080 Ti and 1080 Ti too?

Beyond getting the PSUs, you also need to get power to them for that setup. Not all circuits can handle that much draw.
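For a rough sense of scale, here is a back-of-the-envelope sketch assuming a typical North American 15A/120V branch circuit and the common 80% continuous-load guideline (both assumptions, not numbers from this build):

```python
# Rough circuit-headroom estimate (assumed values, not from this build).
BREAKER_AMPS = 15        # typical North American 15A branch circuit
LINE_VOLTS = 120
CONTINUOUS_DERATE = 0.8  # common guideline: keep continuous loads to ~80% of rating

usable_watts = BREAKER_AMPS * LINE_VOLTS * CONTINUOUS_DERATE
print(f"Usable continuous draw on one circuit: ~{usable_watts:.0f} W")  # ~1440 W
```

A 1,600W PSU pulling hard can exceed that at the wall on its own once you account for PSU efficiency, which is why the circuit itself matters here.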
 

William

Well-Known Member
May 7, 2015
Yeah, not sure why the mix of GPUs?
Just a note: my bench system is based on the ASUS Sage, and I have run 2x Titan RTXs on it; see the review.
If you install all those GPUs in one case, temps are going to be a real issue if you load them all up.
Power draw for just the two Titan RTXs under load was 633 watts, and I saw total system power jump to 850 watts!
I saw 290 watts on the RTX 2080 Ti, and my GTX 1080 Ti will pull 230 watts.
So you are looking at roughly 1,370 watts total system power (CPUs + RAM + all four GPUs)... not counting storage.
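Just to show where that estimate comes from, a quick sum of the figures above (the 850 watts already includes the CPUs and RAM with the two Titan RTXs loaded):

```python
# Reproducing the rough ~1,370 W estimate from the numbers quoted above.
system_with_two_titans = 850  # total system power, 2x Titan RTX under load (CPUs + RAM included)
rtx_2080_ti = 290             # measured draw for the RTX 2080 Ti
gtx_1080_ti = 230             # measured draw for the GTX 1080 Ti

total_watts = system_with_two_titans + rtx_2080_ti + gtx_1080_ti
print(f"Estimated total system power: ~{total_watts} W (storage not counted)")  # ~1370 W
```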
 

pluralform

New Member
Apr 11, 2019
Patrick said:
Why have the 2080 Ti and 1080 Ti too? Beyond getting the PSUs, you also need to get power to them for that setup. Not all circuits can handle that much draw.
It will be fine with just the two RTX cards on 1,600 watts. I mentioned in my post I was only using two and wasn't sure what the plan was yet for the other two. Edit: I misread your response from another post, but I'm curious how https://forums.servethehome.com/index.php?threads/dual-xeon-gold-6154-workstation.19051/ doesn't have issues with only a 1,500W PSU.

But yeah, running all those cards, even watercooled, is going to be tricky, eh? How do other people manage four-card super workstations with 2,200W titanium redundant PSUs? Guessing they have better-provisioned outlets for that.

As far as the reason I own the other two: they are from my current machine, and they have matching 11GB VRAM for Redshift/Octane rendering. I'd like to run a third card for monitors and secondary programs while the two Titans stay free for OpenCL use etc., or maybe four watercooled at some point to split up render workloads. Regardless, those cards won't go unused for too long; I'll scale out and build a render node with them eventually, once 2080 Ti prices plummet.
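(For what it's worth, for CUDA-based renderers one common way to keep the display card out of the compute pool is the CUDA_VISIBLE_DEVICES environment variable; the GPU indices and launch script below are just hypothetical placeholders, not anything specific to Redshift or Octane.)

```python
# Sketch: expose only specific GPUs to a CUDA-based render job so another
# card stays free for displays/other apps. Indices are hypothetical; check
# your actual ordering with `nvidia-smi -L`.
import os
import subprocess

env = dict(os.environ, CUDA_VISIBLE_DEVICES="0,1")  # e.g. the two Titan RTXs

# "render_job.sh" is a placeholder for whatever actually launches the renderer.
subprocess.run(["./render_job.sh"], env=env, check=True)
```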
 

pluralform

New Member
Apr 11, 2019
William said:
If you install all those GPUs in one case, temps are going to be a real issue if you load them all up. ... So you are looking at roughly 1,370 watts total system power... not counting storage.
Thanks for the link. Will read up momentarily. I don't know if I just typed this weirdly, but I don't feel like I implied I was going to run four cards air-cooled in one case. I'm running two Titan RTXs to start, and MAYBE a third for monitors and designated apps to free up the main cards while simming/rendering. That case is huge, doesn't have a wall of HDDs to obstruct the front intake, and has a top-panel exhaust and bottom intake as well, so I'll get really good airflow to experiment with IF I do.

The other two cards are from my current machine for GPU rendering; they have matching 11GB VRAM.
 

pluralform

New Member
Apr 11, 2019
Really great write-up, btw! I've actually read it about five times before today. I would have liked to see some other comparisons, as mentioned in the comments, but I appreciated and gained a lot from the review.
 

William

Well-Known Member
May 7, 2015
Yup, many of our readers are far more advanced than I am when it comes to deep learning. I literally started doing DL benchmarks with the first Titan RTX review; there is a huge amount to learn and I am just starting, so I expect more benchmarks will follow. I did scan around and could only find one site that tested with and without NVLink, and frankly it was a huge time investment for what I did in those reviews. There is only so much time in the day, and reviews need to go out. And then there are still more GPUs to review to add to the data, and 25 benchmarks for each GPU takes a lot of time. Commenters on the review complain about so many things, all from different areas; many probably do this work daily. I do not, but it feels like it. Others just seem to complain to complain, LOL.

And thank you!
 

pluralform

New Member
Apr 11, 2019
I certainly understand about the investment of time. Just in my own research for this build, it can be frustrating: getting reliable information, waiting around for parts, not being told when components may or may not be released, and reading up on all the pieces of the puzzle with a big investment on the line. So... I appreciated that article very much.

Still have a mountain of things to do once I get my motherboard, apps, and everything else up and running, but I feel like I might have a slightly unique build on the higher end with these Titan RTXs and Cascade Lake SPs used for rendering and such. I could maybe throw some results around eventually, with some guidance on benchmarks and whatnot. I feel like there is going to be some really neat stuff ahead with NVLink in that realm once devs catch up a bit, and then a frenzy of activity once EPYC 2 arrives. Very curious how they manage clock speeds with the flagship.

Any thoughts on that Sage mobo? Any troubles? My gut is telling me to just figure out how to fit that Supermicro board, but there are some nice advantages with the ASUS as well.
 

William

Well-Known Member
May 7, 2015
The benchmarks I did came from an NVIDIA guide; anyone could run them and get similar results. Nonetheless, just starting out, it took me a fair amount of time to troubleshoot issues... like why I get a black screen and cannot proceed when I install Ubuntu on the SAGE board with any RTX card; that took me a few days to figure out. Then downloading that huge 150GB+ ImageNet dataset was a pain.

The ASUS SAGE board has been running here for over a year now. I have several sets of SSDs with different OSes installed that I simply swap out and reboot into for whatever I am doing. So far it's been rock solid <knock on wood>. My main workstation uses the Z10PE-D16, and that's been running for years now. I do not like the ASUS BMC; Supermicro's IPMI is much better, but I hardly ever use either, so it's no biggie to me.
 

pluralform

New Member
Apr 11, 2019
I've ordered the Supermicro X11DPG-QT. Appreciate the feedback on the ASUS C621E Sage. I'm going to at least get the Supermicro board in hand to see if I can make it fit without too much difficulty in that Elite chassis.

Out of curiosity, how did you arrive at the ~1,370-watt power draw number? Were you using the Gold 6134's TDP for that, based on your review article? Thank you.
 

pluralform

New Member
Apr 11, 2019
William said:
1,370 watts? I think I maxed out around 400ish watts total system power.

Oh, sorry. No, you had mentioned ~1,370 watts above for the system I was potentially building, with the third card added in. I'm just not quite sure how you arrived at that.
 

William

Well-Known Member
May 7, 2015
Oh sorry, yeah, that was for the same review system I used for the Titan RTXs, plus the other GPUs you listed in your first post; so the 4x GPUs we thought you planned on running.
 

larrysb

Active Member
Nov 7, 2018
I've been running a pair of Titan RTXs on a Xeon E5 X99 WS board, built more or less similar to the DGX Station, which uses the same board. I also used the same EVGA 1600 T2 and recommend it; I was using another brand of PSU but had some issues with the intense transient loads that DL workloads impose. It makes the room lights flicker and really warms up the room when it is running a workload.

Generally, you can throttle the cards down to fit the available power with the nvidia-smi utility and still get good performance. FWIW, the standard RTX 2080 Ti-FW cards run hotter even though they're supposedly pulling 20 fewer watts.

The deal with the RTX cards is that they only accept a single NVLink bridge, so you can only really take full advantage of two cards. They also don't do P2P over PCIe; the Volta cards do. I'm not certain whether PCIe P2P is enabled on the Quadro Turing cards.

I had to locate the several workstations in different parts of the building, because two of them (with 2x cards each) running hard will pop the breaker on a standard office or home circuit.
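If it helps, here is a minimal sketch of the power-cap approach mentioned above; the GPU index and 200W value are just example numbers (valid limits vary per card, and setting them needs root):

```python
# Sketch: cap a GPU's power draw with nvidia-smi, per the note above.
# Example values only; query each card's allowed range with `nvidia-smi -q -d POWER`.
import subprocess

GPU_INDEX = "0"      # hypothetical: first Titan RTX
POWER_CAP_W = "200"  # example cap, below the card's default limit

subprocess.run(["nvidia-smi", "-i", GPU_INDEX, "-pm", "1"], check=True)          # enable persistence mode
subprocess.run(["nvidia-smi", "-i", GPU_INDEX, "-pl", POWER_CAP_W], check=True)  # apply the power limit
```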