Supermicro M12SWA-TF with RTX-3080 GPU


freemarket

Member
Nov 29, 2015
Florida
Hi,
Has anyone had any luck with the following components:
- Supermicro M12SWA-TF
- Supermicro chassis SC743-1000-SQ
- Zotac RTX 3080 10GB (x2)
- 1000W PSU
This was a complete build purchased last week from Newegg with a Supermicro case, 128GB of RAM, and two drives. The issue is that the machine doesn't POST with either GPU installed, in any of the four PCIe x16 slots.
It works fine without the GPUs, and I've had no problem loading AlmaLinux 8.5 off a USB stick. I've spent a day emailing Supermicro support and had a Zoom call that concluded the GPUs are bad. I tried the same GPUs in my old Supermicro SYS-7038A-i and had no luck there either; that one has an Nvidia P4000 in it.
I have nearly 10 years of experience with Supermicro and double that with other headless servers (Sun, Tyan), so I know that if only the GPU had an issue I could still shell into the server or use IPMI. Neither works here, so the failure to POST keeps the OS from booting. Has anyone gotten this GPU working on this motherboard?

Thanks,
Henry
 

NablaSquaredG

Layer 1 Magician
Aug 17, 2020
Do you have another, non-Supermicro system (e.g. an Asus consumer board) to confirm that the GPUs aren't faulty?
 

freemarket

Member
Nov 29, 2015
Florida
So, to complete the mystery: both Supermicro and Zotac concluded that the GPUs were defective and issued me an RMA. I am waiting for the replacements and currently running on my old GPU.
 

JShort

New Member
Jun 11, 2016
Henry,

I'm having good luck with a similar build:
- Supermicro M12SWA-TF
- Supermicro chassis SC743 (SQ of some sort)
- Upgraded from 2x SQ mid-case fans (silent, but not enough to cool a single 3-fan RTX 3090) to 4x Supermicro FAN-0074L4 fans
- Zotac RTX 3090 24GB OC - currently using an 8-pin to 2x 6+2-pin cable that came with another Supermicro workstation (a Xeon system, but that really doesn't matter). I was on the phone with Supermicro for 30 minutes and they have no idea how to get more of these cables, so I might have to make some myself!
- 1200W PSU (pretty similar to your 1000W)

Having two bad brand-new video cards sounds unlikely to me, so maybe what's wrong has something to do with the power cables for the RTX 3080s?

Cheers,

Jim
 

freemarket

Member
Nov 29, 2015
36
6
8
Florida
Jim,

Sorry for the long-delayed response. Both video cards were bad according to Zotac, and they sent replacements. One small caveat: they claimed that my cards had minor defects in several spots on the plastic casing (which they call the frame), and thus they would not replace the frame. In addition, the replacement cards they sent arrived with a bent PCI slot bracket, making it impossible to insert the card, so I had to remove that detachable part and await a replacement. Only after that, and a careful manual installation reusing the previously cracked frame, did the machine POST successfully.

Have you used any of the internal M.2 slots on your mobo yet?

Henry
 

Gnome-O-Copter

New Member
Jun 8, 2023
Wait, they wanted you to return two defective cards (cards don't get that way on their own, and shipping or handling damage is a pretty likely reason) and then used that as an excuse not to replace... something that was meant to match up with the cards / case?

Then they sent another one with damage that might mean the HDMI / DisplayPort outputs on the GPU won't work? Torsion on those metal brackets puts ugly amounts of stress on the card PCB; if it was bad enough to prevent the card from being installed, I'd be returning it immediately. Don't accept garbage or fix anything for them; the only reason whatsoever to buy a pre-built, new system is so you don't have to screw around with janky, unreturnable parts. Most eBay sellers would take that card back in a second, IME.

I'm using three of the four internal M.2 slots: two with 2TB Gen 4 drives (one as the boot drive) and one with a 2TB Gen 3. I've never had any issues with them, and they all run at their rated speeds.
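
If anyone wants to double-check that on their own board, here's a minimal sketch (assuming Linux and the standard sysfs layout; adjust paths if your distro differs) that prints the negotiated PCIe link speed and width for each NVMe controller next to what the device itself supports:

```python
#!/usr/bin/env python3
# Minimal sketch: show negotiated vs. maximum PCIe link for each NVMe controller.
# Assumes Linux with the standard sysfs layout (/sys/class/nvme/nvmeN).
import glob
import os

def read(path):
    try:
        with open(path) as f:
            return f.read().strip()
    except OSError:
        return "n/a"

for ctrl in sorted(glob.glob("/sys/class/nvme/nvme[0-9]*")):
    name = os.path.basename(ctrl)
    model = read(os.path.join(ctrl, "model"))
    pci = os.path.realpath(os.path.join(ctrl, "device"))  # underlying PCIe device
    current = f"{read(os.path.join(pci, 'current_link_speed'))} x{read(os.path.join(pci, 'current_link_width'))}"
    maximum = f"{read(os.path.join(pci, 'max_link_speed'))} x{read(os.path.join(pci, 'max_link_width'))}"
    print(f"{name}: {model} | running {current} (device max {maximum})")
```

A Gen 4 drive in one of these slots should report something like 16.0 GT/s PCIe x4.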

Also, @JShort... the 2x 6+2 GPU-end connectors usually aren't rated for quite what 2x dedicated 8-pins might be; they're more of a convenience for the generation of GPUs that needed 1x 8-pin plus 1x 6-pin, or 2x 6-pin. I wouldn't trust them on a 3090 with how high those cards can spike.

I was always running AMD cards until recently, but from what I've read the 4090 fixed some severe over-draw issues the 3090 had, and the 7900 XTX did the same thing vs. the 6900 XT on AMD's side. That makes sense, since both of those older cards came out when bitcoin miners were buying up almost everything, and since the miners were some of the most monumentally stupid people on the planet regarding hardware, investment, cost vs. profit, and pretty much anything else you can name, the card companies didn't need to worry as much about completely off-spec, out-of-control numbers. Why would they, when your best customers are running the cards off of theoretically 2200W PSUs plugged into 15A circuits?

You'll need to look at the specs of the PSU (if it's modular) and the wire gauge to determine whether that cable can carry 300W... and didn't the 3090 have 3x 8-pin connectors? If it's a Supermicro power supply, it was probably only designed for whatever they sold that case with. Something like a dual Xeon Scalable build might have had the 1000W PSU primarily to keep the CPUs and a bunch of RAM fed, keep one of the lower-powered GPUs of the gen-1 Scalable era (before nearly as much GPU compute was being done) happy, and keep the whole mess in the middle 80% of the output curve, where the PSU is most efficient, most of the time. If it came with a mid-tower, it probably wasn't a dual-processor system, though.

Nvidia cards should detect vdroop on the inputs and downclock to prevent problems, so you'd have to monitor for that; it might be hiding any issues from you. If it's not doing that, and the cables aren't warm after running something that'll stress a 3090 (Stable Diffusion or Renderman XPU / Karma XPU come to mind), you're probably fine.
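
If it helps, here's a rough sketch of the kind of monitoring I mean (assuming the Nvidia driver and nvidia-smi are installed; the query fields are the standard ones listed by nvidia-smi --help-query-gpu). Run it in one terminal while something heavy is loading the card and watch for the SM clock sagging or the throttle-reason bitmask going nonzero:

```python
#!/usr/bin/env python3
# Rough sketch: poll nvidia-smi once a second and print power draw, SM clock,
# temperature, and the active throttle-reason bitmask, so clock drops under a
# heavy load are easy to spot. Assumes the Nvidia driver / nvidia-smi is present.
import subprocess
import time

FIELDS = "timestamp,power.draw,clocks.sm,temperature.gpu,clocks_throttle_reasons.active"

while True:
    result = subprocess.run(
        ["nvidia-smi", f"--query-gpu={FIELDS}", "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    )
    print(result.stdout.strip())  # one CSV line per installed GPU
    time.sleep(1)
```

If the clocks sag while the reported power draw is well under the card's limit, that would point at something on the supply side rather than the card just hitting its normal power limit.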


Random aside follows:
They didn't include any split GPU cable, so when I installed a 4090 I skipped the stupid card-side octopus cable it came with and bought a manufacturer cable with two PSU-side 8-pins and a 600W-capable PCIe 5.0 12-pin on the card side, so there's no weight dangling off the thing.

BTW, after installing one, I absolutely agree with Nvidia that the melted connectors were user error (and I don't like Nvidia very much), but like most companies, that statement comes with the huge caveat that absolutely nobody told people HOW to install the connector: it has to be inserted so hard that you expect you're going to break the socket off the card before you get the little snap from the retention clip and everything sits visibly flush. The insane-pressure retention levers on X99 boards that make you think you're crushing the CPU (the cute little crunchy sound as the pins all flex at once really adds to the horror) and the 24-pin ATX power connector are a joke compared to seating one of these things. The card came with absolutely no instructions about that part (or anything else), and the online instructions treated it like it was no different from the old connectors. And they're simply amazed that people got it wrong? How lulzy. They were sending out replacements, so I can't give them too much crap, but maybe sticking some instructions in the box would have helped avoid that in the future? Nope. I bought mine like 8 months later.