Rack server for machine learning - multi-user GPUs

Thun

New Member
Jan 12, 2019
Hi,

I have successfully built and used a home server for machine learning with two GeForce GTX 1060 (6GB) cards. I am now looking to invest in a rack server to run some models and do further research, but I am quite stuck.

In general, does anyone have a suggestion for a mid/low-range rack server of similar or somewhat higher GPU capabilities, preferably DELL, of around the cost of $3K-$4K?

In particular, while looking at the options, I found that some GPUs are marketed as multi-user (e.g. the Tesla M10). Is that the same thing as vGPU? Would that allow one GPU to process multiple models without any slowdown?

Thank you!

chilipepperz

Active Member
Mar 17, 2016
There are cards like the GRID M40, but in 2019 the lowest-end card you'll want for a server is the 1080 Ti with a blower cooler. Building a new server around Maxwell for ML isn't the best idea.

You can have multiple GPUs and assign one or more to a user without using vGPU. vGPU is for virtualization.
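To illustrate the point about assigning whole GPUs to users without vGPU: on Linux, the `CUDA_VISIBLE_DEVICES` environment variable restricts which cards a process can see. A minimal sketch (the `train.py` script name is hypothetical, just standing in for any training job):

```shell
# User A's job sees only GPU 0; CUDA renumbers it as device 0 inside the process.
CUDA_VISIBLE_DEVICES=0 python train.py

# User B's job sees only GPU 1, so the two jobs never contend for the same card.
CUDA_VISIBLE_DEVICES=1 python train.py
```

Each process gets exclusive use of its card with no virtualization layer involved, which is usually what you want for ML training anyway.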

The Dell R740 has more cooling for hotter GPUs, but they're expensive.

Thun

New Member
Jan 12, 2019
Thanks! What are the considerations for installing a fan-cooled PCI card in a 1U/2U server? Will the air flow be OK as long as the card fits?

chilipepperz

Active Member
Mar 17, 2016
212
64
28
51
Thun said:
Thanks! What are the considerations for installing a fan-cooled PCI card in a 1U/2U server? Will the air flow be OK as long as the card fits?
Generally you want what's called a "blower"-style cooler. The NVIDIA 10x0 Founders Edition cards had them. Now it's aftermarket makers like Zotac that offer them, though not as much for the 2080 Ti. You can find a 2080 with a blower, e.g. https://www.amazon.com/ZOTAC-GeForce-256-bit-Backplate-Graphics/dp/B07GJ8X6Q2/

The backplate is good there.

Then you need to get power to them, so you'll need the right power cables. That varies a lot, since Tesla cards use different power connectors than GTX and RTX cards.