3080 deep learning rig questions

balnazzar
Active Member · Mar 6, 2019
My humble and limited experience:

1. Always prefer VRAM capacity over speed.
2. Get blower cards or Founders Edition cards for the 3000 series. All the others are more or less poor quality and dump a lot of heat inside your case.
3. If you just need two cards, X570 and Ryzen 3000 provide a lot of value (x8/x8 at PCIe 4.0).
4. Consider that used Xeon Scalable CPUs are quite cheap these days. I got an 8260M QS for 350 EUR and it's like new.
5. Some EPYCs are cheap too (these days the 7282 can be had for less than $600/EUR).
6. I found experimentally that three 2060 Supers are on par with a single 3090 in terms of performance. Four of them (~$1500/EUR) are superior and have more VRAM. *And* they are available.
7. If you have a lot of cards in a tight setup, use PCIe riser cables to space them out.
8. Best cases in my experience for multi-GPU setups. Tower: Fractal Define 7 XL and Phanteks Enthoo Pro 2. Rack: Chenbro RM41300-F81.
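Point 1 can be made concrete with a back-of-the-envelope VRAM estimate. This is a hypothetical helper using common rule-of-thumb numbers (fp32 weights plus gradients plus two Adam optimizer states, times a crude activation fudge factor), not a measurement of any real model:

```python
def estimate_train_vram_gb(n_params, bytes_per_param=4, optimizer_states=2,
                           activation_overhead=1.0):
    """Rough training-VRAM estimate: weights + gradients + optimizer
    states, scaled by an activation fudge factor. Rule of thumb only."""
    # weights (1x) + gradients (1x) + optimizer states (e.g. 2x for Adam)
    base = n_params * bytes_per_param * (2 + optimizer_states)
    return base * (1 + activation_overhead) / 1024**3

# e.g. a 350M-parameter model with Adam in fp32:
print(round(estimate_train_vram_gb(350e6), 1))  # → 10.4
```

With numbers like these it is easy to see why a 10GB 3080 gets tight well before a 24GB 3090 does.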
 
Reactions: Patrick and Marsh

larrysb
Active Member · Nov 7, 2018
balnazzar said:
My humble and limited experience: …
Ditto the memory recommendation.

I've not put hands upon the 3000 series yet. I saw the RTX A6000 just became available at around $4600.00. :rolleyes:
 

balnazzar
Active Member · Mar 6, 2019
larrysb said:
Ditto the memory recommendation. I've not put hands upon the 3000 series yet. I saw the RTX A6000 just became available at around $4600.00. :rolleyes:
I'm considering waiting for the alleged (and probable) 3080 with 20GB. They say Feb/March 2021. Since my budget is limited to ~3000 EUR, I'm hoping to get three of them.
 

William
Well-Known Member · May 7, 2015
Yeah Patrick, let's grab an RTX A6000 also, looks like a beast of a card! ;)
Another impossible card to get :(
 

balnazzar
Active Member · Mar 6, 2019
I read that the 3090s are intentionally handicapped on the driver level but not sure if it extends to 3080 and 3070.
They are. And, for example, the Titan RTX is not.
But no matter: the 3090 is still superior to the Titan in DL applications, albeit not by much. Considering that it costs 1,000 bucks less and supports new data types, I think the 3090 is still the way to go, since there are just no better alternatives.
 

josh
Active Member · Oct 21, 2013
balnazzar said:
My humble and limited experience: …
1. There's only one blower model for the 3000 series and it's a 3090, which is actually out of budget.
2. Reviewers seem to suggest that the TUF and the Gaming X Trio have some really good thermals (~61–62°C at full load). Not sure if this holds for deep learning workloads.
3. Electricity is expensive over here, so spending more on one power-efficient card is more cost-effective than multiple cheaper cards that don't run as efficiently.
4. 3000-series cards are pretty available for close to MSRP where I'm from. The 3070s and 3090s are basically walk-in stock guaranteed, while the 3080s might take a little more patience to obtain. There's always the Amazon option that sells at US MSRP and takes about half a month to ship.
5. Thanks for the case suggestions. Currently using a Fractal Define XL and definitely considering switching to the Define 7 XL.

balnazzar said:
…the 3090 still is the way to go, since there are just no better alternatives.
Even with the upcoming 20G 3080s? Hold out a couple of months and there could be 3090 level VRAM at 3/4 the price.
 

balnazzar
Active Member · Mar 6, 2019
josh said: …
Good points.

1. Actually, apart from Gigabyte's, there is one model from Asus and one from Colorful. Both are 3090s.

2. Yes, I think so. But those were tested with only one card in the case.

3. True. I'm tempted by the A6000 too... But then, how much electricity could you buy with the price premium you pay for the card? And being able to experiment with parallelism is a good thing regardless.

5. Just ordered one more Define 7 XL! Great case, but heavy as hell.
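The electricity question in point 3 can be sanity-checked with a quick break-even calculation. The figures below (price premium, watts saved, tariff) are placeholders for illustration, not real card data:

```python
def breakeven_hours(price_premium_eur, watts_saved, eur_per_kwh=0.30):
    """Hours of full load before a pricier, more efficient card
    pays for itself through the power bill. Rough sketch only."""
    if watts_saved <= 0:
        raise ValueError("the pricier card must actually save power")
    eur_per_hour = watts_saved / 1000 * eur_per_kwh  # kW * EUR/kWh
    return price_premium_eur / eur_per_hour

# e.g. a 2000 EUR premium that saves 300 W at 0.30 EUR/kWh:
print(round(breakeven_hours(2000, 300)))  # → 22222
```

At roughly 22,000 hours (about 2.5 years of 24/7 load) before break-even, the premium rarely pays for itself on power alone; the case for a bigger card usually rests on VRAM and simplicity instead.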


josh said:
Even with the upcoming 20G 3080s? Hold out a couple of months and there could be 3090 level VRAM at 3/4 the price.

But no NVLink.
 

balnazzar
Active Member · Mar 6, 2019
Yea if the 3080TI has NVLink it'd be perfect.

Do you know if it's worth it to WC the 3080s?
I know nothing about watercooling, but I reckon it depends. The FEs never overheat (3080 and 3090), and Puget tested four 2-slot blower 3090s stacked tightly inside a case, and even then they reached 80°C at most. If that holds, I think WC isn't worth the hassle.
 

larrysb
Active Member · Nov 7, 2018
The cards will reduce their performance when they get too hot or draw too much current. Even a poorly ventilated blower card will not exceed its temperature limits.

The point of water cooling is to keep the card in its highest boost mode all the time. It will then only reduce clocks or throttle based on power consumption (which the card measures and monitors).
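One way to see whether a card is actually holding boost or bumping into its thermal limit is to poll temperatures during a training run. A minimal sketch, assuming `nvidia-smi` is on the PATH; it returns None on machines without an NVIDIA driver:

```python
import subprocess

def gpu_temps_c():
    """Read current GPU core temperatures via nvidia-smi, one entry
    per card, or return None when the tool is unavailable."""
    try:
        out = subprocess.run(
            ["nvidia-smi", "--query-gpu=temperature.gpu",
             "--format=csv,noheader,nounits"],
            capture_output=True, text=True, check=True)
    except (FileNotFoundError, subprocess.CalledProcessError):
        return None
    return [int(line) for line in out.stdout.split()]

print(gpu_temps_c())  # e.g. [62, 79] on a two-GPU box, None without a driver
```

Logging this every few seconds alongside clock speeds makes throttling easy to spot: clocks sag while the temperature plateaus at the limit.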

All of the consumer cards today are designed to dump heat inside the PC case, with larger, low-RPM fans to keep noise under control. It is then up to the enclosure to bring cool air in and remove the hot air.

The "Quadro" cards (or their equivalent, now that the brand is going away) are still all blower type, meaning they push the heated air out the back of the card. That lets them be packed densely adjacent to each other. The tradeoff is that the blower is noisier at high power levels.

The former "Tesla"-level cards deleted the display connectors and the fans altogether, relying on the server enclosure to push airflow through the cards.

This is Nvidia's solution for the DGX Station A100


Notice the SXM-format cards with a refrigerant loop on a bespoke AMD motherboard.

The Volta-generation DGX Station used four proprietary V100s on an Asus X99-E WS 10G motherboard, with a water-cooling loop whose radiator sat up top, and a separate AIO-water-cooled Xeon E5-2699 v4.
 

balnazzar
Active Member · Mar 6, 2019
The thing about watercooling is that it often introduces a lot of headaches (leaks, ruptures, required maintenance, etc.). If you buy an Nvidia DGX product (and those sell for >50K bucks), you get maintenance and technical assistance.

As for the blower-style Amperes, I was initially skeptical, since a 2-slot blower design didn't seem capable of cooling a 350W monster like the 3090, but initial reviews and user feedback seem to indicate the contrary. See for example:


https://www.reddit.com/r/MachineLearning/comments/jhof1z/_/ggsf50r
The cards should throttle as soon as they reach their throttle temperature, which for Ampere is, AFAIK, 83°C. But these cards stay well below that, even when stacked tightly...
 

balnazzar
Active Member · Mar 6, 2019
@larrysb thanks for the reminder. I want to do a DGX Station review
I'm not sure Nvidia will send such an expensive item around for review, but in case I'm wrong, it will be quite an interesting review.
In case I'm right, however, would you please ask for a PCIe A100? Such a review would set a baseline against which to evaluate any other GPU. Thanks!
 

LukeP
Member · Feb 12, 2017
The main bottleneck is storage. NVMe really helps and that might be enough reason to make the upgrade.
Why are you bottlenecked by storage? If you are swapping to storage during training, aren't you doing something wrong?
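Whether storage really is the bottleneck is easy to check before buying NVMe. A crude sketch that times a sequential read of a scratch file; the OS page cache will flatter the number, so treat it as an upper bound rather than a benchmark:

```python
import os
import tempfile
import time

def read_throughput_mb_s(size_mb=64, chunk=1 << 20):
    """Write a scratch file of size_mb MB, then time a sequential
    read of it in 1 MB chunks. Crude upper-bound estimate only."""
    with tempfile.NamedTemporaryFile(delete=False) as f:
        f.write(os.urandom(chunk) * size_mb)  # note: builds size_mb MB in RAM
        path = f.name
    try:
        start = time.perf_counter()
        with open(path, "rb") as f:
            while f.read(chunk):
                pass
        elapsed = time.perf_counter() - start
    finally:
        os.unlink(path)
    return size_mb / elapsed

print(f"{read_throughput_mb_s():.0f} MB/s sequential read")
```

If this number comfortably exceeds what the data loader actually consumes per second (batch size times sample size times batches per second), storage is not the bottleneck and the money is better spent elsewhere.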