Search results

  1. Questions about colocating

    I am looking for a colocation facility and have a few questions. Background and requirements: 1. This will be my first time colocating. 2. I need a rack of around 10 kW, which I will fill with 4U 1 kW inference servers (hardware discussion here...
  2. Choosing a server/chassis for GPU workload

    I am about to build some inference GPU servers. This will be my first time dealing with server-grade hardware and I have some questions. Requirements and background info: 1. The servers should support EPYC CPUs and multiple double-slot blower-style Nvidia consumer GPUs (e.g. the 2080 Ti). 2. Price...
  3. Can Tesla T4 cards be used in 2U4N servers?

    Does anyone here know whether it's possible to fit low-profile Tesla T4 graphics cards into 2U4N servers such as the Gigabyte H261-Z61 or the Supermicro 2123BT-HNR? Tesla T4: NVIDIA T4 Tensor Core GPUs for Accelerating Inference Gigabyte H261-Z61: H261-Z61 (rev. 100) | Hyper-Converged System -...
  4. Need advice about quad GPU build for cuDNN

    This is my first post here and my first computer build (as will soon become apparent :)). I work in deep learning and, after renting various servers for several years, have built the following machine for a specific workflow centered on cuDNN: Build's Name: Server-1 Operating System/...