Search results

  1. FS: Mini HPC Cluster, 108 cores

    For sale is my mini-HPC cluster setup. I built it for running CFD and astrophysics simulations, but don't have any room for it in my current place. I know roughly what I could part it out for, but it has some value as a ready-to-go cluster, so I'd like to try to sell it as a whole first...
  2. FS: 80+ Intel Xeon Phi 7-ES, 71S1P, 7110P Co-processor PCIe cards - used

    For sale are 80+ Intel Xeon Phi Co-processors. The available model numbers are 7-series ES (engineering sample), 71S1P, and 7110P. I also have many of the extension brackets and PCIe brackets. Conditions range from tested/good to for-parts/broken. Prices range from $15 to $179. See THE TABLE for...
  3. QDR Mellanox Performance Test Results with Varying BAR-space Size

    System: Desktop (headnode): i7-5960X, GA-X99-SLI motherboard. Server (slave nodes): 6027TR-HTR (four X9DRT-HF), 2x E5-2667 v2; only using one node for this testing. OS: CentOS 7.5 with InfiniBand support and various other packages. No custom OFED installs. HCAs: One Sun/Oracle X4242A QDR...
    (A Python sketch for reading an HCA's BAR sizes from sysfs follows this list.)
  4. Infiniband PCIe card preventing boot in one server but not another

    Hi, I have a weird problem. I purchased 4x used identical Sun QDR X4242A InfiniBand cards (Mellanox MHQH29B rebrands) for my 4-node Supermicro 6027TR-HTR (X9DRT-HF) server. Three of the four cause boot to hang at POST code 91, which is when the PCI devices are initialized. The fourth one seems to let the node boot...
  5. Mellanox IS5030 Speed Issues: Only DDR?

    Hi, I'm having speed issues that I've isolated to my IS5030 switch. System: I have two InfiniBand switches: 1. A Mellanox IS50XX with 36 ports enabled and the FabricIT internal subnet manager running, making it an IS5030, with the latest firmware (IBM P/N: 98Y3756). 2. A Sun 36-port QDR InfiniBand...
    (A Python sketch for reading negotiated port rates from sysfs follows this list.)
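
The BAR-space thread (result 3) is about how large a memory aperture the QDR HCAs actually get. As a rough, illustrative check of what BAR sizes a Mellanox card ended up with after a BIOS change, the sketch below reads the standard Linux sysfs files; the /sys/bus/pci layout and the 0x15b3 vendor ID are standard, but the script itself is not from any of the threads above, just a minimal sketch assuming a Linux host with sysfs.

```python
#!/usr/bin/env python3
"""Report PCI BAR sizes for Mellanox devices via Linux sysfs (illustrative sketch)."""
from pathlib import Path

MELLANOX_VENDOR_ID = "0x15b3"  # PCI-SIG vendor ID for Mellanox

def bar_sizes(device: Path):
    """Yield (bar_index, size_in_bytes) for each populated BAR of a PCI device."""
    lines = (device / "resource").read_text().splitlines()
    for index, line in enumerate(lines[:6]):            # entries 0-5 are the BARs
        start, end, _flags = (int(field, 16) for field in line.split())
        if start:                                        # "0x0 0x0 0x0" means an unused slot
            yield index, end - start + 1

def main():
    for device in sorted(Path("/sys/bus/pci/devices").iterdir()):
        if (device / "vendor").read_text().strip() != MELLANOX_VENDOR_ID:
            continue
        print(device.name)
        for index, size in bar_sizes(device):
            print(f"  BAR{index}: {size} bytes ({size / 2**20:g} MiB)")

if __name__ == "__main__":
    main()
```

`lspci -vv` reports the same apertures as "Region N: Memory at ... [size=...]" if pciutils output is easier to compare against the BIOS settings.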
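For the IS5030 "only DDR" question (result 5), a quick first check is whether each local HCA port actually negotiated 4X QDR (40 Gb/sec). The sketch below reads the rate and state attributes that the Linux InfiniBand drivers expose under /sys/class/infiniband; it is an illustrative sketch only, not taken from the thread.

```python
#!/usr/bin/env python3
"""Print the state and negotiated rate of every local InfiniBand port (illustrative sketch)."""
from pathlib import Path

def main():
    root = Path("/sys/class/infiniband")
    if not root.is_dir():
        raise SystemExit("no /sys/class/infiniband: is the mlx4/mlx5 driver loaded?")
    for hca in sorted(root.iterdir()):
        for port in sorted((hca / "ports").iterdir(), key=lambda p: int(p.name)):
            state = (port / "state").read_text().strip()  # e.g. "4: ACTIVE"
            rate = (port / "rate").read_text().strip()    # e.g. "40 Gb/sec (4X QDR)"
            print(f"{hca.name} port {port.name}: {state}, {rate}")

if __name__ == "__main__":
    main()
```

ibstat and iblinkinfo (from infiniband-diags) report the same information, the latter for every link in the fabric, which helps tell a cable or switch-port problem apart from an HCA problem.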