I am looking for a colocation facility and have a few questions.
Background and requirements:
1. This will be my first time colocating.
2. I need a rack of around 10 kW, which I will fill with 4U, 1 kW inference servers (hardware discussion here...
I am about to build some inference GPU servers. This will be my first time dealing with server-grade hardware and I have some questions.
Requirements and background info:
1. The servers should support Epyc CPUs and multiple double-slot blower-style Nvidia consumer GPUs (e.g., the 2080 Ti).
2. Price...
Does anyone here know if it's possible to place low-profile Tesla T4 graphics cards into 2u4n servers such as the Gigabyte H261-Z61 or the Supermicro 2123BT-HNR?
Tesla T4: NVIDIA T4 Tensor Core GPUs for Accelerating Inference
Gigabyte H261-Z61: H261-Z61 (rev. 100) | Hyper-Converged System -...
This is my first post here and my first computer build (as will soon become apparent :)). I work in deep learning and, after renting various servers for several years, have built the following machine for a specific workflow centered around cuDNN:
Build’s Name: Server-1
Operating System/...