Supermicro AI development platform $131,999.99


Fritz

Well-Known Member
A deal if you can use it to generate income?

 

CyklonDX

Well-Known Member
4x A100 80G = 96K
1x A4000 = 1K
2x Gold 6444Y = 7.4K
6x P5520 = 2.4K
16x32G 4800MHz ECC RDIMM = 5.6K
NV CX-6 2x 25GbE = 1.2K
2x 2.2kW PSU Titanium = 1.1K


Total:
$114.7K

Around 16.9K for:
Case, 8x 2.5" NVMe/SATA3 backplane, liquid cooling for the A100's, liquid cooling for the 2x CPUs, 335x160mm radiator, 2x 120mm 4700RPM fans, mobo, IPMI + assembly work. (The mobo is probably around 1.2-2K.)
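Quick sanity check on those numbers, as a throwaway Python sketch (all figures are the rough estimates above, not quotes; the list price is rounded to $132K):

```python
# Rough street-price estimates from the post, in thousands of USD.
parts_k = {
    "4x A100 80G":               96.0,
    "1x A4000":                   1.0,
    "2x Xeon Gold 6444Y":         7.4,
    "6x P5520 NVMe":              2.4,
    "16x 32G 4800MHz ECC RDIMM":  5.6,
    "ConnectX-6 2x 25GbE":        1.2,
    "2x 2.2kW Titanium PSU":      1.1,
}
list_price_k = 132.0  # ~$131,999.99

components_k = sum(parts_k.values())        # -> 114.7
remainder_k = list_price_k - components_k   # left over for case, cooling, mobo, assembly
print(f"components ~ ${components_k:.1f}K, remainder ~ ${remainder_k:.1f}K")
```

So roughly $17K of the sticker goes to the chassis, cooling loop, board and assembly listed above.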


Well, it's not a bad workstation - I like it, just not for my finances. If you earn 500-600K per year, this might be something you would buy.


It's a nice motherboard; probably costs around 2K.
Each slot is x16 Gen5 (7 total)

Supports up to 4TB of RAM (3DS ECC RDIMM 4800MHz), with up to 256GB sticks (quick math below)

Built-in SATA controller is a joke tho (SATA3...)
Built-in dual 10GbE (Intel X550)
7x USB 3.0 ports
IPMI
Up to 12 fans

Crap that it only supports up to 60 cores...
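(Quick math on the memory spec above, assuming the 16 DIMM slots implied by "4TB max with 256GB sticks":)

```python
# Hypothetical back-of-envelope check; the slot count is an assumption from the spec.
dimm_slots         = 16
max_stick_gb       = 256
installed_sticks   = 16   # the 16x 32G RDIMMs in the price breakdown
installed_stick_gb = 32

print("max capacity :", dimm_slots * max_stick_gb, "GB")              # 4096 GB = 4 TB
print("as configured:", installed_sticks * installed_stick_gb, "GB")  # 512 GB shipped
```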

For normal use, I would only want 2x A100's, and have other cards like a SAS3/4 controller (maybe an external one as well), some NVMe PCIe card... and more cores. Really, 16c per CPU is so yesterday.
 

Wasmachineman_NL

Wittgenstein the Supercomputer FTW!
If I was Bob Page, and had coding/AI knowledge, I would buy one of these for the sole purpose of...

running an AI chatbot in VRChat, to give the middle finger to a certain individual I have an intense disdain for.
 

unwind-protect

Active Member
I am never quite sure about these machines that have both high CPU core count and multiple GPUs. Wouldn't you have a workload that demands one but not the other?

Also, most machine learning models are fed by Python, so single-core speed is of utmost importance. I'd rather do a 13900KS + 1 GPU, and put more GPUs in different machines.
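A toy illustration of that point, assuming PyTorch is installed (the dataset is made up and just burns CPU in Python per sample, to show how a slow single-threaded feed starves the GPU no matter how many cards are in the box):

```python
import time
import torch
from torch.utils.data import DataLoader, Dataset

class SlowDataset(Dataset):
    """Fake dataset whose __getitem__ does CPU-bound Python work per sample."""
    def __len__(self):
        return 512

    def __getitem__(self, idx):
        x = 0.0
        for i in range(50_000):  # stand-in for Python-side decoding/augmentation
            x += (i % 7) * 0.001
        return torch.randn(3, 224, 224), idx % 10

def feed(num_workers):
    loader = DataLoader(SlowDataset(), batch_size=64, num_workers=num_workers)
    t0 = time.time()
    for images, labels in loader:
        pass  # a real run would push the batch to the GPU here
    return time.time() - t0

if __name__ == "__main__":
    print("num_workers=0:", round(feed(0), 2), "s")  # bound by one Python core
    print("num_workers=8:", round(feed(8), 2), "s")  # spreads the Python work out
```

More DataLoader workers hide some of it, but each worker is still ordinary Python, so per-core speed sets the ceiling.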
 

CyklonDX

Well-Known Member
Well, AI is one thing, but these offer vGPU and even multi-instancing. You could make a home server or small-office system offering 8 KVMs per GPU, and leave one GPU for 'AI', say if you want to dev games that use AI to converse with you or make choices in real time. When you go with KVMs, Docker and such, cores matter, as you want each core isolated to a specific service.
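For reference, a sketch of what splitting one of the A100s up could look like, assuming root, a MIG-capable NVIDIA driver and the stock nvidia-smi CLI. (MIG itself tops out at 7 slices per A100; getting 8 guests per GPU would be a licensed vGPU profile instead.)

```python
# Hypothetical admin sketch: put GPU 0 into MIG mode, split it into seven
# 1g.10gb instances, and list the resulting devices; the printed UUIDs are
# what you hand to KVM guests or containers (e.g. NVIDIA_VISIBLE_DEVICES).
import subprocess

def run(cmd):
    print("+", " ".join(cmd))
    subprocess.run(cmd, check=True)

# enable MIG mode on GPU 0 (takes effect once the GPU is idle / after a reset)
run(["nvidia-smi", "-i", "0", "-mig", "1"])

# create seven 1g.10gb GPU instances, each with a default compute instance (-C)
run(["nvidia-smi", "mig", "-i", "0",
     "-cgi", ",".join(["1g.10gb"] * 7), "-C"])

# enumerate GPUs and MIG devices with their UUIDs
run(["nvidia-smi", "-L"])
```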