Can Tesla T4 cards be used in 2u4n servers?

fragar

Member
Feb 4, 2019
Does anyone here know if it's possible to place low-profile Tesla T4 graphics cards into 2u4n servers such as the Gigabyte H261-Z61 or the Supermicro 2123BT-HNR?

Tesla T4: NVIDIA T4 Tensor Core GPUs for Accelerating Inference

Gigabyte H261-Z61: H261-Z61 (rev. 100) | Hyper-Converged System - GIGABYTE Global
Supermicro 2123BT-HNR: 2123BT-HNR | 2U | A+ Servers | Products - Super Micro Computer, Inc.

As far as I can tell, the cards would fit physically and the power supplies could handle them, but every write-up of those servers that I've seen states or implies that the low-profile PCIe x16 slots are intended for drives and I/O, not for GPUs.
 

Patriot

Moderator
Apr 18, 2011
Power-wise, if they are x16 electrically they can support them; x8, not so much.
Cooling... is probably designed around 25 W cards, and the T4s have a habit of being picky.
Contact Gigabyte and SM, and, well, buy 1 to try... but I certainly wouldn't buy enough to outfit everything.
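
If you do buy one to try, a quick way to sanity-check the cooling is to poll the card's temperature while it's under load. A minimal sketch, assuming `nvidia-smi` from the NVIDIA driver is installed; the ~85 °C limit is an assumption about where the T4 starts to throttle, not a vendor-confirmed figure:

```shell
# Flag a GPU temperature reading against an assumed throttle point (~85 C).
check_temp() {
  local temp="$1" limit="${2:-85}"
  if [ "$temp" -ge "$limit" ]; then
    echo "HOT: ${temp}C >= ${limit}C"
  else
    echo "OK: ${temp}C"
  fi
}

# If the driver is present, sample the first GPU's current temperature.
if command -v nvidia-smi >/dev/null 2>&1; then
  t=$(nvidia-smi --query-gpu=temperature.gpu --format=csv,noheader,nounits | head -n1)
  check_temp "$t"
fi
```

Run it in a loop (e.g. `watch -n 5`) while a burn-in job is active to see whether the chassis airflow keeps the passively cooled T4 in range.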
 

fragar

Member
Feb 4, 2019
Patriot said:
> Powerwise if they are x16 electrically they can support them, x8, not so much.
> Cooling... is probably based around 25w cards and the T4s have a habit of being picky.
> Contact Gigabyte and SM, and well, buy 1 to try... but I certainly wouldn't buy enough to outfit everything.
Both systems have 2 x16 slots with enough space for LP cards.

There should be plenty of airflow through the card, but the air will already be slightly warm after passing over the two CPUs.

I can't easily try this as I don't have either of these servers and won't get one unless I am confident that I can put at least one Turing card per node in there.

Here is what Gigabyte wrote:

Dear Customer,

We cannot support using T4 inference card in our Computing designed system.
It is of course applicable but please understand we have not done any thermal testing and cannot provide support if issues arise related with the T4 card inside a H261-Z60.
Test at your own risk.

Best Regards,