WTB: 100gb infiniband cards and cable

Discussion in 'For Sale/ For Trade/ Want to Buy' started by Myth, May 14, 2018.

  1. Myth

    Myth Member

    Joined:
    Feb 27, 2018
    Messages:
    138
    Likes Received:
    7
    Hi Guys,

Anyone have any links, or know where I can buy two 100Gb InfiniBand cards plus a cable to direct-connect two servers?

    I believe that all 100Gb cards are x16 PCIe, correct?

    Best,
    Myth
     
    #1
  2. Rand__

    Rand__ Well-Known Member

    Joined:
    Mar 6, 2014
    Messages:
    2,482
    Likes Received:
    324
If you can afford new, Mellanox will be happy to sell you some ;)

    And yes
     
    #2
  3. i386

    i386 Well-Known Member

    Joined:
    Mar 18, 2016
    Messages:
    1,384
    Likes Received:
    317
No, there are PCIe 4.0 versions (IBM uses them on POWER8/9 systems), and there are PCIe 3.0 cards that use two x8 PCIe slots.
    Search for "mellanox (ccat,ecat,edat,hcat,hdat)" on eBay; these are the 100 Gbit/s (xCAT) and 200 Gbit/s (xDAT) cards from Mellanox.
     
    #3
  4. Rand__

    Rand__ Well-Known Member

    Joined:
    Mar 6, 2014
    Messages:
    2,482
    Likes Received:
    324
    Ah you are right, forgot about those (2x x8) - cx5's right?
     
    #4
  5. i386

    i386 Well-Known Member

    Joined:
    Mar 18, 2016
    Messages:
    1,384
    Likes Received:
    317
    cx4 I think, didn't find a cx5 on ebay yet :p
     
    #5
  6. Rand__

    Rand__ Well-Known Member

    Joined:
    Mar 6, 2014
    Messages:
    2,482
    Likes Received:
    324
I thought that splitting function was a CX5 thing. Never found a working pair on eBay though, always only half (for a price I could get an x16 card for).

    But got a CX5 EN once... seller mistake, I think ;)
     
    #6
  7. Myth

    Myth Member

    Joined:
    Feb 27, 2018
    Messages:
    138
    Likes Received:
    7
Yes, I've been searching on eBay for 100Gb InfiniBand, but haven't found anything cost-effective. I remember the last time I posted about a 40GbE card, someone gave me a link for a dual-port card for like $35. Just hoping something like that exists for 100Gb IB as well.
     
    #7
  8. i386

    i386 Well-Known Member

    Joined:
    Mar 18, 2016
    Messages:
    1,384
    Likes Received:
    317
    maybe in 5 years :p
     
    #8
  9. Myth

    Myth Member

    Joined:
    Feb 27, 2018
    Messages:
    138
    Likes Received:
    7
Do you guys think I could bond/team four 40GbE ports to get the same performance as a 100Gb IB link?

    We did a bit of testing in the lab, but when we teamed two ports together the speeds dropped...
     
    #9
  10. Rand__

    Rand__ Well-Known Member

    Joined:
    Mar 6, 2014
    Messages:
    2,482
    Likes Received:
    324
    Depends on your workload type .... parallel streams might work with some optimization, single stream - nope.
     
    #10
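The single-stream limitation Rand__ mentions follows from how link aggregation distributes traffic: a bond hashes each flow's address/port tuple to exactly one slave port, so one TCP stream can never exceed a single port's line rate, while many parallel flows spread out. A minimal sketch of that idea (the hash function and slave count here are hypothetical, not any particular driver's policy):

```python
import hashlib

def slave_for_flow(src_ip: str, dst_ip: str, src_port: int, dst_port: int,
                   n_slaves: int = 4) -> int:
    """Pick a bond slave index by hashing the flow tuple (illustrative only)."""
    key = f"{src_ip}:{src_port}-{dst_ip}:{dst_port}".encode()
    return int.from_bytes(hashlib.sha256(key).digest()[:4], "big") % n_slaves

# A single flow always hashes to the same slave -> capped at one 40Gb port.
flow = ("10.0.0.1", "10.0.0.2", 49152, 5201)
assert slave_for_flow(*flow) == slave_for_flow(*flow)

# Many flows (e.g. parallel iperf streams) spread across the slaves,
# so the aggregate can approach 4x40Gb.
slaves = {slave_for_flow("10.0.0.1", "10.0.0.2", 40000 + i, 5201) for i in range(64)}
print(sorted(slaves))
```

This is also why the lab test with two teamed ports showed no gain for a single stream: the bond only helps once there are enough distinct flows to occupy multiple slaves.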
  11. i386

    i386 Well-Known Member

    Joined:
    Mar 18, 2016
    Messages:
    1,384
    Likes Received:
    317
Ethernet or IB? FDR ~ 56 Gbit/s -> 2 FDR ports ~ 112 Gbit/s (bottlenecked at about ~80 Gbit/s when using a dual-port x8 card)
     
    #11
  12. _alex

    _alex Active Member

    Joined:
    Jan 28, 2016
    Messages:
    848
    Likes Received:
    88
PCIe bandwidth will limit you here. If you bond two FDR ports from different cards, maybe, but no way on a single x8.
    I tried with VXLAN on a leaf/spine setup on a single card and ended up somewhere around 64 Gbit/s. Dual cards were faster (one port from each card) but also didn't reach 100 Gbit/s. Which reminds me, I should have another look at this 'problem'.
     
    #12
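The back-of-envelope arithmetic behind those ceilings, using nominal line rates and encoding only (real throughput is further reduced by PCIe TLP/protocol overhead, which is why measured numbers land a few Gbit/s below these):

```python
def pcie_gbits(gts_per_lane: float, lanes: int, encoding: float) -> float:
    """Raw per-direction PCIe bandwidth in Gbit/s after line encoding."""
    return gts_per_lane * lanes * encoding

# FDR InfiniBand: 4 lanes x 14.0625 Gbit/s, 64b/66b encoding -> ~54.5 Gbit/s
fdr_port = 14.0625 * 4 * (64 / 66)

# PCIe 3.0: 8 GT/s per lane, 128b/130b encoding
pcie3_x8 = pcie_gbits(8.0, 8, 128 / 130)    # ~63 Gbit/s
pcie3_x16 = pcie_gbits(8.0, 16, 128 / 130)  # ~126 Gbit/s

print(f"FDR port:  {fdr_port:.1f} Gbit/s")
print(f"PCIe3 x8:  {pcie3_x8:.1f} Gbit/s")
print(f"PCIe3 x16: {pcie3_x16:.1f} Gbit/s")
```

The ~63 Gbit/s raw ceiling of a PCIe 3.0 x8 slot lines up with the ~64 Gbit/s _alex measured on a single card, and shows why two FDR ports (~109 Gbit/s combined) need either an x16 slot or two separate cards.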
  13. Myth

    Myth Member

    Joined:
    Feb 27, 2018
    Messages:
    138
    Likes Received:
    7
    @_alex how did you perform your benchmark, or what program did you use?
     
    #13
  14. _alex

    _alex Active Member

    Joined:
    Jan 28, 2016
    Messages:
    848
    Likes Received:
    88
iperf with a sufficient number of parallel streams.
    Setup was quagga + pimd on three PVE nodes acting as leaves within the HV, and 2x SX6012 with OSPF enabled on the ports as spines, all /32 nets. Used global pause for flow control, as I had no intention of spending days getting PFC sorted.
    Then I put VXLAN over the fabric/loopback addresses and measured from endpoints on the hosts and also within/between VMs.

    At a certain point the virtio process that serves KVM saturates; I guess multiqueue could help (if a single VM really needed that amount of bandwidth).

    In the end I'm not so unhappy about what I got from this PoC setup, tbh; I expected it to be much worse.
     
    #14
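For anyone wanting to repeat this kind of measurement, the parallel-stream iperf run can be scripted. A sketch assuming iperf3 (run with e.g. `iperf3 -c <server> -P 16 --json > run.json`), pulling the summed receiver throughput out of its JSON report:

```python
import json

def aggregate_gbits(iperf3_json: str) -> float:
    """Summed receiver throughput (Gbit/s) from an `iperf3 --json` report."""
    report = json.loads(iperf3_json)
    return report["end"]["sum_received"]["bits_per_second"] / 1e9

# Stub report standing in for a real run's output file:
sample = json.dumps({"end": {"sum_received": {"bits_per_second": 64.2e9}}})
print(f"{aggregate_gbits(sample):.1f} Gbit/s")
```

Summing the receiver side (rather than sender) avoids counting retransmitted bytes; with `-P` high enough to spread flows across queues, the bottleneck reported should be the PCIe/virtio limits discussed above rather than a single CPU core.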