Recent content by OkiieDoe

  1.

    Qotom Denverton fanless system with 4 SFP+

    I've had the Qotom Q20331G9 1U version for more than a month running OPNsense 13.2. Today I did some tuning to get better 10Gb performance. Settings used and iperf3 results: { sysctl -a | grep -i "hw.machine\|hw.model\|hw.ncpu"; sysctl -n hw.physmem | awk '{ byte =$1 /1024/1024/1024; print byte "...
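The truncated snippet above appears to convert the `hw.physmem` byte count to gigabytes with awk. A minimal, portable sketch of that conversion (using a hypothetical 32 GiB byte count in place of the live sysctl output):

```shell
# Hypothetical byte count standing in for `sysctl -n hw.physmem` output.
# awk divides by 1024 three times to convert bytes -> GiB.
echo 34359738368 | awk '{ printf "%.1f GiB\n", $1/1024/1024/1024 }'
```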
  2.

    Which container manager do you guys use?

    I use Cockpit on Fedora to manage Podman containers, all running on a Proxmox server. I stopped using Portainer Business: when I was using it with Podman/Debian it did not work right. With Fedora/Podman and Cockpit, no more problems.
  3.

    HGST ULTRASTAR SSD800MM HUSMM8080ASS200 800GB 12Gb/s SAS MLC $650 OBO

    12Gb/s, 36.5PB write endurance, low latency, high IOPS. I think I have an addiction; now I need two.
  4.

    Mellanox MIS5035Q-1BRC - Managed 36-Port QDR Infiniband Switch $350

    The newest firmware version is 1.1.3004 (12/9/2014).
  5.

    Mellanox unhappy w/ vtD passthru in vSphere

    Oh, and the test server is an X9SRL-F board running ESXi 6 build 4510822 with OFED 2.4.0.0.
  6.

    Mellanox unhappy w/ vtD passthru in vSphere

    You can update your card firmware with an intermediate update found here; release notes here. In the .tgz file, look for your card's .ini file (MHQH19B-XTR_A1-A3). This will update your card's firmware to Rev 2.9.1200.
  7.

    Mellanox unhappy w/ vtD passthru in vSphere

    I would try updating the IB card firmware; if it's an OEM card, do a custom OEM update. I also tested this on an X9SRL-F board running ESXi 6 build 4510822 with an IBM OEM ConnectX-2 on firmware ver 2.9.1200. Custom Firmware for Mellanox OEM Infiniband Cards - WS2012 RDMA
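A sketch of how such a firmware update is typically done with the Mellanox Firmware Tools (mstflint/MFT); the device path and firmware filename here are hypothetical placeholders, not values from the posts above:

```shell
# Sketch, assuming MFT is installed; adjust the device path to your card.
mst start                                      # load the MST kernel modules
mst status                                     # list /dev/mst/* device nodes
flint -d /dev/mst/mt26428_pci_cr0 query        # show current firmware version/PSID
flint -d /dev/mst/mt26428_pci_cr0 -i fw.bin burn   # burn the new firmware image
```

A reboot (or driver reload) is needed afterward for the new firmware to take effect.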
  8.

    Deep breath...my infiniband journey begins

    All you would need is a Mellanox InfiniScale IV switch with Mellanox ConnectX-2/3 VPI cards running in IB mode (the default) for IPoIB.
  9.

    Deep breath...my infiniband journey begins

    If the Mellanox card is VPI, it defaults to InfiniBand, not Ethernet, so it should be plug and play, and you would run the IB network separate from the Ethernet network. In my home lab every server has dual 1Gb for the front end and a dual 40Gb IB connection for the back end.
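Bringing up the separate IPoIB back-end network on a Linux host might look like the following sketch; the interface name `ib0` and the 10.0.40.0/24 subnet are hypothetical choices, not taken from the posts:

```shell
# Sketch, assuming a Linux host with Mellanox IB drivers loaded.
modprobe ib_ipoib                  # load the IPoIB kernel module (creates ib0)
ip addr add 10.0.40.10/24 dev ib0  # address on the dedicated back-end subnet
ip link set ib0 up
ibstat                             # verify the port state is Active / LinkUp
```

Each server gets its own address in the IB subnet, kept entirely separate from the 1Gb Ethernet front end.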
  10.

    Mellanox MIS5035Q-1BRC - Managed 36-Port QDR Infiniband Switch $350

    I upgraded from an IS5022 8-port. Having the subnet manager is very nice, plus the IS5035 is short depth and not much louder.
  11.

    Mellanox MIS5035Q-1BRC - Managed 36-Port QDR Infiniband Switch $350

    Saw this on the bay and got one at $300. It has a built-in subnet manager, the condition is like brand new, and it came with a console cable and rails. Mellanox MIS5035Q-1BRC 36-Port InfiniScale IV IS5035 QDR InfiniBand Switch | eBay
  12.

    Docker on PhotonOS?

    I'm running a test setup in my home lab, but I'm using vSphere Integrated Containers Engine with VMware Admiral to manage and VMware Harbor to store my Docker images.
  13.

    Quanta LB6M (10GbE) -- Discussion

    I'm looking to get a 10Gb switch but wanted to know if this combination would work: a ConnectX-3 354A-FCBT with a Mellanox MC2609130-003 passive QSFP to 4xSFP+ breakout cable.