Recent content by RCS

  1. Epyc 7B13 + Supermicro motherboard

    I just bought one of these CPUs (7B13) and can confirm it boots in my ASRock Rack motherboard without issue. I was quite surprised to see it list 64 cores/128 threads when I went into the BIOS. It's listed as a 60-core/60-thread CPU on PassMark's benchmark comparison site.
  2. SlimSAS LP to SlimSAS 8i Cables? Hard to source...

    Hey guys, apologies if this is in the wrong forum... I'm working with an ASRock ROME2D32GM-2T motherboard that uses SlimSAS LP ports for a number of the PCIe lanes. The problem is I'm having a really hard time finding SlimSAS LP to regular SlimSAS 8i cables... I found these on a Dell forum but...
  3. Samsung PM1733 NVMe Drive - Very Slow

    I messed around with block size and number of jobs more afterwards and got totally different numbers. (I didn't finish all the randread and write tests, as I was less interested in those.) As you can see, I started getting over 7 GB/s (slightly over the rated performance) on jobs with larger block...
  4. EPYC 7452 CPUs - Showing Very Slow L3 Cache and RAM in memtest86

    FYI - for RAM speeds, sysbench in Ubuntu is showing double the speed (32 GB/s read) with only 4 DIMMs currently installed (so dual channel). It shows approximately the same speed for writes. I still don't understand the odd L3 cache speed, but I'm assuming it has something to do with how the NUMA nodes...
  5. EPYC 7452 CPUs - Showing Very Slow L3 Cache and RAM in memtest86

    Is there a reason that L3 cache and RAM would show speeds this slow in memtest86? I have two of these systems and both show exactly the same results. Gigabyte G482-Z52 systems, 2x EPYC 7452 CPUs, 16 DIMMs of DDR4-2933 (1 DIMM per channel for each CPU, so should be at full 8-channel...
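For reference, the theoretical peak for this memory configuration can be worked out from the DIMM speed. This is only a back-of-envelope sketch; real memtest86 or sysbench numbers will come in well under it, but it gives a ceiling to compare the slow readings against:

```shell
# DDR4-2933 moves 2933 MT/s x 8 bytes per transfer on each channel.
# With 8 channels populated per socket (1 DIMM per channel), the
# theoretical peak bandwidth per socket is:
awk 'BEGIN { printf "%.1f GB/s per socket\n", 2933 * 8 * 8 / 1000 }'
# -> 187.7 GB/s per socket
```

Even allowing for typical real-world efficiency, measured bandwidth should land far above the figures memtest86 was reporting.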
  6. Samsung PM1733 NVMe Drive - Very Slow

    So I did some benchmarking with fio using the PM1733 drive as well as another 970 EVO I had lying around. Here are the results. As info, this is an EPYC 7002 system, so the PM1733 is running at PCIe 4.0. The 970 EVO does very well with 64K blocks and (surprisingly to me) poorly with 4K...
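A fio job file along these lines drives the kind of sequential-read sweep described above. This is a sketch, not the poster's exact job: the device path and the bs/iodepth/numjobs values are placeholders to adjust for your own hardware.

```ini
; seqread.fio - hypothetical job file; /dev/nvme0n1 is a placeholder path
[global]
ioengine=libaio    ; async I/O, as is typical for NVMe benchmarking
direct=1           ; bypass the page cache (O_DIRECT)
runtime=30
time_based=1
group_reporting=1

[seqread]
filename=/dev/nvme0n1
rw=read
bs=128k            ; sweep this (4k, 64k, 128k, 1m) to see the block-size effect
iodepth=32
numjobs=4
```

Run with `fio seqread.fio` (destructive-read safe, but double-check the filename before pointing it at a drive with data on it).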
  7. Samsung PM1733 NVMe Drive - Very Slow

    Timing O_DIRECT disk reads: 4614 MB in 3.00 seconds = 1537.71 MB/sec

    Possibly a slight bump in performance, but around the same.
  8. Samsung PM1733 NVMe Drive - Very Slow

    Thanks for that. Results are around the same:

    1024 MB in 3.00 seconds = 341.26 MB/sec
    1024 MB in 3.00 seconds = 341.12 MB/sec
    1022 MB in 3.00 seconds = 340.45 MB/sec
    1026 MB in 3.00 seconds = 341.73 MB/sec
    1024 MB in 3.00 seconds = 341.06 MB/sec

    So roughly 1705 MB/s in aggregate. Still underperforming by a...
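The aggregate figure comes from summing the five per-instance results (values taken from the output above; awk is used here just for the arithmetic):

```shell
# Sum the throughput column from the five parallel hdparm runs above.
printf '%s\n' 341.26 341.12 340.45 341.73 341.06 \
  | awk '{ sum += $1 } END { printf "%.2f MB/s aggregate\n", sum }'
# -> 1705.62 MB/s aggregate
```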
  9. Samsung PM1733 NVMe Drive - Very Slow

    I purchased a Samsung PM1733 7.68TB NVMe drive for a build I'm working on. This morning I installed it and quickly threw Ubuntu 18.04 onto it to do some testing/benchmarking for the whole system. Speeds are WELL below the stated 7000MB/s read that Samsung advertises. hdparm returns the...
  10. Supermicro 2x NVMe Add-On Card - Will this work in a Supermicro 4027/28 GPU Server?

    Looking at picking one of these up for a build I'll be working on, but I understand that PCIe bifurcation is required for both M.2 slots to work with this add-on card. Card Part#: AOC-SLG3-2M2. Does anyone know or have experience using one of these cards with a Supermicro 4027/28GR GPU server? Thanks
  11. Supermicro IPMI Fan Speed Control - GPU System

    Update: did some more reading and found some commands that work on this board. Maybe it's an X9-generation thing? The system seems to have 4 zones, one for each fan pair. The following works after setting the fan mode to "Full":

    # set fans in "Zone 1" to 30%
    # ipmitool raw 0x30 0x91 0x5A 0x3...
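A sketch of how those raw commands fit together, assuming the common X9-era convention of zone selector bytes 0x10-0x13 and a 0x00-0xFF duty-cycle scale. Both of those are assumptions, not confirmed for this specific board, so the command is echoed rather than executed:

```shell
# Assumed X9-era Supermicro raw fan-control sketch. The zone bytes
# (0x10-0x13) and the 0-255 duty scale are assumptions; verify against
# your board's documentation before running anything for real.

# 1. Put the BMC in "Full" fan mode so manual duty cycles stick:
#      ipmitool raw 0x30 0x45 0x01 0x01
# 2. Convert a percentage to a duty byte and build the zone command:
pct=30
duty=$(awk -v p="$pct" 'BEGIN { printf "0x%02X", int(p * 255 / 100) }')
echo "ipmitool raw 0x30 0x91 0x5A 0x3 0x10 $duty"   # zone 1 at ~30%
```

Swap the `echo` for the real ipmitool call once the zone byte and duty scale are confirmed for your board.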
  12. Supermicro IPMI Fan Speed Control - GPU System

    Hi guys, I'm having issues setting fan speeds on a 4027GR-TR Supermicro system. The system has 8 fans for CPU/GPU/chassis cooling, all controlled via the IPMI. I've followed this guide from PigLover but it doesn't seem to work with this system...
  13. GPU Server - Exxact Tensor TS4-264546-DP2?... Is this an old April fools joke?

    Thanks for the replies! Makes sense. It would have been one hell of a machine, but I'm sure also very, very expensive.
  14. GPU Server - Exxact Tensor TS4-264546-DP2?... Is this an old April fools joke?

    I stumbled upon this article on Exxact's blog that covers the advantages of using PLX chips and single-root PCIe complexes for GPU deep learning, etc. Example 4 highlights their new "Tensor TXR414-1000R" system, which can apparently take up to 20 GPUs, all while using 5 PLX switches to...