Anyone else with troubles on ASRock ep2c602-4ld16?

Discussion in 'DIY Server and Workstation Builds' started by Jez van den Oever, Jul 13, 2017.

  1. Jez van den Oever

    Jez van den Oever New Member

    Joined:
    Jul 13, 2017
    Messages:
    10
    Likes Received:
    0
    Hi all,

    Been reading a lot of threads here, including an older thread on Windows 10 freezing with the ASRock EP2C602-4LD16, but I think I have an issue with the latest versions of Windows 10 and Windows Server 2016. Sorry for the long post, but I need to ask around, as the issues I'm experiencing make Windows 10 and Windows Server 2016 completely unusable on fairly standard enthusiast server hardware.

    Background
    Wanted to offload server duties from a game-oriented 6-core 4930K machine and free up its 980 Ti card, which had become redundant because the machine was doing server duties. The 4930K will become a secondary games machine (the primary is an overclocked 5960X with a 1080 Ti).

    My Background
    I work as a chief architect for a large international company; I've been building my own computers since I was 12 and programming since I was 7.

    Purpose
    I have 48TB of home movies, media, game images, etc. → DAS and DIY SAN.
    I run home services: DHCP, DNS, AD via Samba.
    I need workstation capabilities: virtualisation for the AI and other memory-hungry, thread-intensive applications I build.

    Setup
    Bought the ASRock EP2C602-4LD16 for its 16 RDIMM slots, and I needed the 4 x PCI-e x16 3.0 slots for:

    2 x LSI 9207-8e SAS HBA cards (external drives)
    1 x LSI 9207-8i SAS HBA card (8 internal drives)
    1 x old 780 Ti card I had in reserve (the on-board graphics is rubbish)
    1 x PCI-e USB 3.0 card (in the x4 slot)

    256GB of 1066MHz ECC RDIMMs (bought a while ago at a not-to-be-missed price).
    2 x Xeon E5-2670v2 (bought QS chips after researching to ensure maximum compatibility).
    1050W Gold-certified PSU (spare).

    The Problem
    After assembling the computer, the Windows 10 install proceeded OK. Once in Windows 10, and about to start my scripted auto-install of all the software I use, I attached the PCI-e cards (as above), and that's when all the troubles began. Random hard freezes. No BSOD, no crash, no reboot - just a hard freeze (no NumLock response, reset button not working; only a hold-down-the-power-button reset).

    After reset, the system would last between 30 seconds and 20 minutes before freezing.

    I removed all but 4 sticks of RAM, no effect.
    I removed one CPU and 2 further sticks of RAM, no effect.
    Removed all the pci-e cards, no effect.

    Ran bootable memtest86+ for many hours, no issues.

    Now I got more drastic and acquired 2 x retail Xeon E5-2670 (v1) chips - not ES/QS - to rule out the QS chips I have.

    This made matters worse - the crash now occurs almost immediately after booting into Windows 10.

    Repeated above (removing all RAM, one CPU, all pci-e cards) - no effect.

    Got my UEFI USB Windows 10 install, and it freezes during that too!!!

    Remembering an earlier thread about lockups in Windows 10, I tried Windows Server 2016. That won't make it past the install either.

    What did work?
    Getting desperate as I eliminated component after component, I had two suspects left, following the wise words "Once you eliminate the impossible, whatever remains, however improbable, must be the truth":

    * Motherboard?
    * OS?

    A new motherboard means RMAing the existing one and proving the fault, so I attacked the OS first.

    Installed latest Ubuntu linux bootable. That worked.
    Did various light tasks in linux. That worked.
    Got memory-test and CPU-test packages (stress) to stress the RAM and CPUs. Ran the CPU test all day - max temp 50C, no problems, across all 16 cores / 32 threads. Ran the memory test all night - 16GB chunks across all 256GB. No issues.
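    For anyone wanting to repeat the soak test, the runs looked roughly like this. This is a sketch from memory using stress's CPU and VM workers, so treat the exact flags as assumptions and tune the counts to your own core and RAM totals:

```shell
# install the tool (Ubuntu/Mint repos)
sudo apt-get install -y stress

# CPU soak: load all 32 threads for 12 hours (43200 seconds)
stress --cpu 32 --timeout 43200

# memory soak: 16 workers, each allocating and patterning 16GB
# (16 x 16GB covers the full 256GB)
stress --vm 16 --vm-bytes 16G --timeout 43200
```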

    Theories
    Windows 10 & Windows 2016 Server share many components and development. Clearly something in the latest builds is causing issues on my machine. I have tried many combinations, but I only have the Samsung memory and the one server motherboard, so I can't vary those to validate.

    Anyone else got this problem?

    *Edited for minor spelling mistakes.
     
    #1
    Last edited: Jul 13, 2017
  2. i386

    i386 Active Member

    Joined:
    Mar 18, 2016
    Messages:
    941
    Likes Received:
    207
    No, I'm using supermicro :p

    Stupid jokes aside, are firmware/bios up to date?
    Did you use the latest Windows 10 / Server 2016 ISO? (Verify the checksums - we had a corrupted but normal-sized ISO of Server 2016 at work that would install fine in ESXi but didn't boot after installation.)
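    For example - the filename below is just a placeholder for whatever your ISO is called - hash it and compare by eye against the value Microsoft publishes for that release:

```shell
# Linux: print the SHA-256 of the downloaded ISO
sha256sum en_windows_server_2016_x64_dvd.iso

# Windows: certutil is built in and does the same job
certutil -hashfile en_windows_server_2016_x64_dvd.iso SHA256
```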
     
    #2
  3. Jez van den Oever

    Jez van den Oever New Member

    Joined:
    Jul 13, 2017
    Messages:
    10
    Likes Received:
    0
    Yes, latest version, had to ensure that because the first set of CPUs were ES/QS. Windows 10 checksum is good, haven't checked 2016 ISO, so I'll check that.

    I am not familiar with Linux microcode updates - when Linux booted, the Ubuntu distro asked to install an "Intel microcode update" driver - is that the equivalent of "firmware" for the CPU? I wonder, if I tried Windows 10 again, whether anything would change.

    I am thinking about virtualising in Linux and trying a Windows 10 install again today in a VM. I'd lose some performance, but if it works, the loss would be mitigated by going from 8 to 10 cores per socket - 16 vs 20 cores in total (2670 vs 2670v2).
     
    #3
  4. i386

    i386 Active Member

    Joined:
    Mar 18, 2016
    Messages:
    941
    Likes Received:
    207
    Before you try to virtualize windows 10/server 2016, can you try to install server 2012 r2?
    It's listed as a supported os and generally well tested, so it should work.
     
    #4
  5. Jez van den Oever

    Jez van den Oever New Member

    Joined:
    Jul 13, 2017
    Messages:
    10
    Likes Received:
    0
    I think that's a worthwhile suggestion except that I've rather got my heart set on Windows 10. I looked at Server 2016 because it is the same architecture.

    Not that I'm a Windows die-hard (I grew up on Linux, starting at 0.996 with Slackware :) ). It's just that I have an awful lot of auto-install PowerShell scripts and packages all configured for Windows 10 (and I've built a comprehensive network-based auto-update system for the other 8 computers in the house); it's too much hard work to port it all to Linux, and besides, Linux doesn't have everything.

    So I'm going to have a go at virtualisation (and struggle a bit with VT-d to get pass-throughs for all the HBAs) just for the experience factor.

    I might quickly try an installation of 2012 R2 as you suggest, just to validate (and get closure) that it's the Windows 10 / Server 2016 platform that's the problem...
     
    #5
  6. Jez van den Oever

    Jez van den Oever New Member

    Joined:
    Jul 13, 2017
    Messages:
    10
    Likes Received:
    0
    Curious - with Windows Server 2012 R2, once phase 1 of the installation starts (copying all files to disk after re-partitioning, then rebooting), the server freezes. At approximately the same place every time.

    Thoughts from anyone? Is this a duff motherboard? But if so why does Linux work flawlessly?
    (Reminder: I used memtest86+, and 'stress' on Linux for many, many hours with no errors or freezes).

    Failures so far:

    Server 2012 R2 (install crashes on 2 x E5-2670)
    Server 2016 (install crashes on 2 x E5-2670, can't even get to the partitioning screen)
    Windows 10 (install crashes on 2 x E5-2670, can't even get to the partitioning screen).

    I am going to swap one more time with the ES CPUs (E5-2670v2) and see what's going on - when I first got this machine and built it I got to install Windows 10 faultlessly before all the problems started.

    In the meanwhile I am thinking of returning the board (I can do that in the Netherlands, and besides, I have enough proof to show it is not fit for purpose) and getting a Supermicro board.

    If anyone has any other thoughts, I would appreciate it.
     
    #6
  7. briandm81

    briandm81 Active Member

    Joined:
    Aug 31, 2014
    Messages:
    290
    Likes Received:
    65
    I have three of this motherboard. Two are running ESXi and one swaps back and forth between ESXi and Windows Server 2012 R2. My original board is about 4 years old now and has been solid in ESXi since day 1. I am using 2620v2 processors. I will be replacing it soon, however, as the IPMI stopped working.

    The windows box has been running solid for 18 months and is running 2670v1s.
     
    #7
    Last edited: Jul 14, 2017
  8. i386

    i386 Active Member

    Joined:
    Mar 18, 2016
    Messages:
    941
    Likes Received:
    207
    Memtest tests the memory; "stress" (and Prime95 on Windows) are for CPU + cooling testing.

    What happens when you reset the BIOS / energy settings in the BIOS?
     
    #8
  9. Jez van den Oever

    Jez van den Oever New Member

    Joined:
    Jul 13, 2017
    Messages:
    10
    Likes Received:
    0
    I know :) I was just pointing out that with bootable memtest86+, and with Linux running memory *and* CPU tests, there is no freezing whatsoever; but as I said, it only takes a simple install of Windows 10, Server 2016 or Server 2012 R2 for my setup to freeze. Gotta be the motherboard.

    I've just retried by replacing the 2 x E5-2670 (v1) with the 2 x ES E5-2670v2 as you suggested, and (once again) reloaded all defaults for the BIOS (only modification: remove Windows Boot Manager to be able to re-install the OS):

    Windows 10 USB UEFI Install: Get to the "Select languages/locale/etc" screen and move the mouse around, within 10-20 seconds it hard freezes.
    Windows 2016 USB UEFI Install: As above
    Windows 2012R2 USB Boot: Can partition and install Windows 2012R2, but on the first reboot after install, I now get WHEA_UNCORRECTABLE_ERROR. Going from bad to worse...
     
    #9
  10. squidman

    squidman Member

    Joined:
    Jul 8, 2017
    Messages:
    52
    Likes Received:
    1
    Am beginning to be unsure about buying the ASRock WS version now on reading this... there aren't a lot of threads/posts out there as it is. Asus may be a lot more unstable, but at least a lot of enthusiasts use it. Ditto Supermicro - supposed to be more stable and more ES/QS friendly, but not a lot of love for them out there on the internet. And I don't even know Linux distros - strictly Windows!
     
    #10
  11. i386

    i386 Active Member

    Joined:
    Mar 18, 2016
    Messages:
    941
    Likes Received:
    207
    This sounds to me like a hardware problem. Windows can't boot because it tries to start/initialize all hardware components on the mainboard, but Linux works because it runs in a "minimal" mode and only uses the hardware it really needs.
    A quick Google search for "windows doesn't boot but linux works" points to Spiceworks, Super User and other websites mentioning mainboard problems.
     
    #11
  12. Jez van den Oever

    Jez van den Oever New Member

    Joined:
    Jul 13, 2017
    Messages:
    10
    Likes Received:
    0
    Same conclusion I am coming to. I've been in touch with ASRock support (very good, by the way) and they are reaching the same conclusion - they've asked me to try different variations and they've all failed.

    Edit to the above: changing the CPUs from 2670 to 2670v2 simply moves the freeze point. Whereas 1 x 2670 or 2 x 2670 (v1) freezes during install, the 2670v2 freezes on first boot into the newly installed operating system.

    It would be unfair to be biased against ASRock because of a single duff board. I have to say, when I got this from the supplier here in the Netherlands, the box was scuffed and didn't look new - so it's even more suspect.

    The only thing they did say is that Windows 10 and Windows Server 2016 are unsupported on my board, which I guess is disappointing but fair enough; although I would expect this kind of hardware to work with modern operating systems.

    Their tech support was same-day and they respond to my replies in e-mail within a few hours - very good.
     
    #12
  13. Jez van den Oever

    Jez van den Oever New Member

    Joined:
    Jul 13, 2017
    Messages:
    10
    Likes Received:
    0
    Update as of today:

    I went out and bought a Supermicro board (X9DAI) and to my surprise had exactly the same issues as above.

    So I returned to my ASRock board to figure out what else I could do. I decided to explore Linux, as it had passed all the soak testing and appeared stable as a rock on the ASRock.

    I've been working with it (Linux) for 4 weeks now and not a single crash bar a known freeze issue with Avast in a VM (nested hyper-visor issue). Points for Linux there vs. Windows.

    Cutting a long story short: with all PCI-e devices removed, it seems stable for a while under Windows Server 2012 and even Windows 10. But I didn't do any soak testing or other testing to prove this, because it's a pointless setup if I can't use any PCI-e cards.

    Since I was getting grumpy trying to figure out what is going on, I installed Linux Mint + KVM and struggled for a week to get VT-d passthrough to work with my LSI 9207-8i and 2 x LSI 9207-8e adapters, plus passthrough of the nVidia GTX 780 Ti.

    For the record, the QEMU and KVM documentation sucks, and most of it is "try and see" when it comes to VT-d and passthrough. Also, libvirt/virt-manager and the QEMU CLI are infuriating, as they are two completely different systems - but luckily I found a way to convert the XML files from the former into the CLI of the latter and vice versa.
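    For anyone hunting for the same thing: the conversion I mean is built into virsh itself (the domain name "win10" here is just my example):

```shell
# dump the libvirt XML for an existing domain
virsh dumpxml win10 > win10.xml

# libvirt XML -> equivalent qemu command line
virsh domxml-to-native qemu-argv win10.xml

# and the reverse: qemu command line -> libvirt XML
virsh domxml-from-native qemu-argv qemu-cmdline.txt
```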

    Using VirtIO and VT-d passthrough of all the above PCI-e devices, I am now running Windows 10 Professional happily, with native performance on the graphics card; the 3 Blu-ray drives (passed through) all appear to work with AnyDVD, and I am about to test the PCI-e performance of the LSI (SAS) cards for the file services.

    My only concern now is whether this virtualised, semi-passed-through environment can keep up the 300MB/s transfer rates on the file servers. Given the LSI cards are passed through to the guest OS, I have *high hopes*.

    I still have *no* clue what is causing the issues on the motherboard (the latest patch levels of W10, Server 2012 and Server 2016, perhaps?), but since everything is now working fine, I am happier. I would be completely *happy* if I knew why the L1 hypervisor is needed to make everything work, but hey-ho.

    Hoping I'll get my refund soon on the 2nd motherboard I shelled out for :)

    If anyone else gets into this position, I can give step-by-step instructions on how to get a Linux level-1 hypervisor working with passthrough of PCI-e cards, Blu-ray drives and USB.
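    Until I write that up properly, here is the skeleton of the VFIO side as I have it. The 1000:0087 vendor:device ID is what lspci -nn reports for my 9207 cards - substitute your own IDs:

```shell
# 1. enable the IOMMU on the kernel command line:
#    edit /etc/default/grub so GRUB_CMDLINE_LINUX_DEFAULT contains
#    "intel_iommu=on", then run: sudo update-grub

# 2. claim the HBA for vfio-pci before the mpt3sas driver grabs it
echo "options vfio-pci ids=1000:0087" | sudo tee /etc/modprobe.d/vfio.conf

# 3. rebuild the initramfs and reboot
sudo update-initramfs -u

# 4. after the reboot, the card should show "Kernel driver in use: vfio-pci"
lspci -nnk -d 1000:0087
```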
     
    #13
  14. Jez van den Oever

    Jez van den Oever New Member

    Joined:
    Jul 13, 2017
    Messages:
    10
    Likes Received:
    0
    And another update:

    I was struggling to get Blu-ray passthrough working in the Linux VM (I have 3 x BR drives to encode the vast DVD/Blu-ray library we have), so I decided to be brave and install Windows 7.

    I had put off Windows 7 because I've built an entire software-update suite in PowerShell for all my Windows 10 workstations (we have 12 computers + laptops here, all installed identically). I didn't want to go through the hassle of refactoring the PowerShell scripts (all modules together add up to over 5,000 lines of code) to make them compatible with Windows 7.

    However, it appears Windows 7 has none of the above problems. Machine is now stable as a rock, and I'm busy now refactoring my software update suite to work in Windows 7.

    So after all of this headache, I have come to the following conclusion:

    1. Despite reading in other forums about Windows 10 working on this motherboard, I wonder if those people have tried the very latest version of Windows 10 (Creators Update), because I refuse to believe they have a working system on the motherboard I have.
    2. Avoiding the Windows driver install and Windows Update, as some people report doing, is a non-solution for me (I want my computer to be up to date with drivers, software patches, etc.).
    3. It appears that from the Windows 8.1 / Windows Server 2012 family onwards, this motherboard has some critical software incompatibility. Here is what I've tried:

    Windows 10 (Family: Windows 10 / Windows Server 2016): Won't get past the rotating dots when loading the Windows installer with the PCI-e cards fitted; installs with on-board graphics and no PCI-e cards, but crashes again once the PCI-e cards are added back into the motherboard.

    Windows 2016 Server (Family: Windows 10 / Windows 2016 Server): As above.

    Windows 2012 Server (Family: Windows 8 / Windows Server 2012): Won't make it past the first install screen (asking for language and input-device locale) with the PCI-e cards fitted (locks up every time); installs with on-board graphics and no PCI-e cards, but crashes again once the PCI-e cards are added back into the motherboard.

    Windows 7 (Family: Windows 7 / Windows 2008 Server): Works perfectly with all PCI-e cards. Not a single crash since installing and now happily refactoring my software update suite.

    Windows 10 in a VM (Hypervisor: KVM, Emulation: QEMU, Host OS: Linux Mint 18.2): Installs fine and generally works well, but AnyDVD can kill the VM and host OS because I can't get the on-board M1/M2/M3/M4 SATA passthrough working, and using SCSI VirtIO to pass in the Blu-ray players, HandBrake will crash the guest OS.

    I have spent countless hours now trying every combination (motherboards, CPUs, memory, PCI-e cards, hours on BIOS options) and one thing is immutable: No matter how I do it, with:

    1 x PCI-e Gigabyte Windforce OC GTX 780Ti
    1 x LSI PCI-e 9207-8i
    2 x LSI PCI-e 9207-8e
    1 x Silverstone USB 3.0 PCI-e
    256GB 1066MHz ECC RDIMM
    2 x E5 2670 Xeon

    Windows 10, Windows Server 2016 and Windows Server 2012 R2 just won't make it past the first few screens of boot-up with all the PCI-e cards, and will just about install the OS (sometimes with the dreaded WHEA error) with no PCI-e cards - but inevitably, the installed OS will freeze within 5 to 35 minutes of operation.

    I have not tried older versions of Windows 10 but as stated above, this isn't really an option for me.

    What is odd is that the same thing occurs on both the ASRock EP2C602-4LD16 and the Supermicro X9DAI motherboards, so if I were so inclined, I'd start by looking at / disabling the Intel C602 chipset drivers to see if they are causing issues in the newer operating systems.
     
    #14
    Last edited: Aug 16, 2017
  15. briandm81

    briandm81 Active Member

    Joined:
    Aug 31, 2014
    Messages:
    290
    Likes Received:
    65
    I'm running three of these boards now. One runs Windows Server 2012 R2 without issue. The other two run ESXi 6.5 (previously 5.5 and 6.0). So I know the board is perfectly compatible with 2012 R2. I did experience issues on one build with memory: I would get the purple screen of death on ESXi and freezes during a Windows install. I took out all but 4 DIMMs and it all went away. I sent back the whole set and got a new one - flawless ever since.
     
    #15
  16. briandm81

    briandm81 Active Member

    Joined:
    Aug 31, 2014
    Messages:
    290
    Likes Received:
    65
    Windows2012Mobo.png

    For reference. This server has two X520-DA2's, an LSI 9265-8i and an Intel P3605 for PCIe devices.
     
    #16
  17. Jez van den Oever

    Jez van den Oever New Member

    Joined:
    Jul 13, 2017
    Messages:
    10
    Likes Received:
    0
    I did a 48-hour memory soak test (memtest86) and heavy memory tests with Linux stress, but nothing popped out. I too have tried just 4 RDIMMs - made no difference. The only thing that made a difference was no PCI-e cards; then I could install Windows 2012 R2, but it wouldn't stay stable.

    Is this the original 2012 R2 version + all the latest patches? I tried W2012 as of May this year, with all the latest updates.

    Also what PCI-e cards are you running?
     
    #17
  18. Jez van den Oever

    Jez van den Oever New Member

    Joined:
    Jul 13, 2017
    Messages:
    10
    Likes Received:
    0
    Ignore that - you've already posted that answer :)
     
    #18
  19. briandm81

    briandm81 Active Member

    Joined:
    Aug 31, 2014
    Messages:
    290
    Likes Received:
    65
    I've not updated this particular system. I'll run the updates and see what happens. It's a benchmark box that is backed up, so if it all goes poorly, I'll let you know. I do have two other ESXi hosts running Windows 2012 R2 guests that are 100% fully updated. The PCIe devices are:
    X520-DA2
    LSI 9265-8i
    Intel P3605 1.6 TB

    I'm about to add a Supermicro AOC-3008 and a ioDriveII as well for benchmarking purposes. I'm actually planning on phasing out my ASRockRack motherboards in favor of Supermicro equivalents, but this one will be the last one to be phased out.
     
    #19
  20. briandm81

    briandm81 Active Member

    Joined:
    Aug 31, 2014
    Messages:
    290
    Likes Received:
    65
    Fully updated...
    Windows2012MoboUpdated.png

    I also ran memtest86 on my bad RAM - for 4 days, thinking it had to eventually figure it out... nope. I didn't try any tests in Linux, just memtest86+. It said the RAM was just fine, but once I swapped it out, no issues since.
     
    #20