64GB LRDIMM DDR3 ECC RAM for $20 or less


zachj

Active Member
Apr 17, 2019
Yeah, there's a certain point where motherboards will wear. Luckily I haven't had any issues so far, but I also haven't pushed it at all like I originally planned.

I'm pretty curious too, as the machine wasn't even supposed to support RDIMMs or LRDIMMs at all.
I too suspect the max memory possible on the Z420 (at least using RDIMMs) would require mixing two different DIMM sizes on each channel. Using LRDIMMs, I think 512GB is possible on the E5-2600 v2 series with 8x64GB modules.

I personally think you should buy a single module and see if it boots at all. If it does, then throw in your existing 32GB DIMMs and check that it still boots. If it does, buy one more 64GB module and see if it boots with 320GB. If it does, I'd say you're almost certain to succeed with 512GB booting.

Doing it that way doesn't waste any more than $40.
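For anyone following the math on that staged approach, here's a quick sketch of the capacity at each step. The 8-slot count, the starting 8x32GB config, and the $20-per-module price from the thread title are assumptions for illustration, not specs from this post:

[CODE]
# Quick arithmetic behind the staged test plan above (a sketch, not a spec).
# Assumptions: HP Z420 with 8 DIMM slots, currently holding 8x32GB LRDIMMs,
# and 64GB modules at roughly the $20 deal price in the thread title.

SLOT_COUNT = 8
PRICE_64GB = 20  # USD per module, approximate

def total_gb(n_64: int, n_32: int) -> int:
    assert n_64 + n_32 <= SLOT_COUNT, "more modules than slots"
    return 64 * n_64 + 32 * n_32

steps = [
    ("Step 1: one 64GB module by itself", total_gb(1, 0)),         # 64GB
    ("Step 2: that module plus 7 existing 32GB", total_gb(1, 7)),  # 288GB
    ("Step 3: two 64GB modules plus 6x32GB", total_gb(2, 6)),      # 320GB
    ("Goal: all eight slots with 64GB modules", total_gb(8, 0)),   # 512GB
]
for label, gb in steps:
    print(f"{label}: {gb}GB")

# Money at risk before committing to the full set: two modules.
print(f"Outlay before the full buy: ${2 * PRICE_64GB}")
[/CODE]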
 

Samir

Post Liker and Deal Hunter Extraordinaire!
Jul 21, 2017
Without bothering to reference ARK, I'm pretty sure the E5-2600 v2 series supports LRDIMMs whereas the E5-1600 v2 does not.
That would make more sense, but I seem to recall this person using the v2 of an E5-26xx series.

So my memory sucks (pun intended). It was an E5-1650 v2 that the person was trying, and LRDIMM support was dropped in the v2 of that processor.

I think you're right that with the v2 of an E5-26xx, the HP Z420 will be able to hit 512GB easily, and it may even be possible with a v1 if that was simply never fully tested by Intel.
zachj said:
I personally think you should buy a single module and see if it boots at all. If it does, then throw in your existing 32GB DIMMs and check that it still boots. If it does, buy one more 64GB module and see if it boots with 320GB. If it does, I'd say you're almost certain to succeed with 512GB booting.

Doing it that way doesn't waste any more than $40.
The ultimate test would be to get two modules as you suggest, but if I'm getting any, I would get enough to max it out, since shipping is what kills deals when things are this cheap. And if it doesn't work, I have other servers I can put the RAM in, but I've already maxed those out using 32GB modules, so I don't want to lose that investment atm.

Once they come down in price, I'll try it if someone else doesn't read this and beat me to it. ;)
 

wildpig1234

Well-Known Member
Aug 22, 2016
There is the advantage of being able to overclock the E5-1600 compared to the 2600, but I guess the disadvantage would be not being able to use more RAM.
 

wildpig1234

Well-Known Member
Aug 22, 2016

According to that, the Z420 does take 32GB LRDIMMs. I bet that you would be OK with the 64GB LRDIMMs too, but it could act weird.

My Asus Z9PE-D16's official specs only go up to 256GB per CPU, or 8x32GB. However, it booted with the 64GB LRDIMMs. The only weird thing is that inside Windows 10 it wouldn't load the driver for my graphics card, treating it like a generic display adapter. But if I take the RAM back down to 256GB, then the gfx driver loads.
 

Samir

Post Liker and Deal Hunter Extraordinaire!
Jul 21, 2017
wildpig1234 said:
There is the advantage of being able to overclock the E5-1600 compared to the 2600, but I guess the disadvantage would be not being able to use more RAM.
Yep, and another disadvantage is not being able to run duals, as the 16xx series is single-socket only. Different processors for different workloads. :)
 

Samir

Post Liker and Deal Hunter Extraordinaire!
Jul 21, 2017

wildpig1234 said:
According to that, the Z420 does take 32GB LRDIMMs. I bet that you would be OK with the 64GB LRDIMMs too, but it could act weird.

My Asus Z9PE-D16's official specs only go up to 256GB per CPU, or 8x32GB. However, it booted with the 64GB LRDIMMs. The only weird thing is that inside Windows 10 it wouldn't load the driver for my graphics card, treating it like a generic display adapter. But if I take the RAM back down to 256GB, then the gfx driver loads.
Haha, that never existed before I published my findings. Looks like others paid attention to my work and profited from it.

I think it would take the 64GB module, but only one of them per channel, since two of them would exceed 96GB as well as 12 ranks. (With 2x 32GB modules currently, I'm only at 64GB in a channel and 8 ranks, hence why I'm also not at the 384GB limit.)

That's an interesting problem. Did you boot with a full load of 64GB modules or just one? It sounds like it has something to do with the RAM's registers overwriting the GPU's registers so the card 'disappears'. That's one thing I was thinking could happen with the Z420 at 512GB: it is recognized, but 'loops back around' to overwrite itself once you start to use >384GB. But I would expect Intel to have several protections for this in the CPU, and it would simply not allow the RAM to register, and then the system won't boot.
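To make that channel math easier to follow, here's a small sketch using the numbers quoted in this thread: 4 channels, a 96GB and 12-rank ceiling per channel, 4Rx4 32GB modules and 8Rx4 64GB modules. Those figures come from the posts here, not from a verified spec sheet:

[CODE]
# Per-channel capacity/rank bookkeeping described above (posters' figures,
# not verified platform specs).
CHANNELS = 4
MAX_GB_PER_CHANNEL = 96
MAX_RANKS_PER_CHANNEL = 12

DIMM_32GB = (32, 4)  # (size in GB, ranks) for a 4Rx4 32GB LRDIMM
DIMM_64GB = (64, 8)  # (size in GB, ranks) for an 8Rx4 64GB LRDIMM

def check_channel(modules):
    gb = sum(size for size, _ in modules)
    ranks = sum(r for _, r in modules)
    fits = gb <= MAX_GB_PER_CHANNEL and ranks <= MAX_RANKS_PER_CHANNEL
    return fits, gb, ranks

for label, mods in [
    ("2x32GB per channel (current)", [DIMM_32GB, DIMM_32GB]),
    ("1x64GB + 1x32GB per channel",  [DIMM_64GB, DIMM_32GB]),
    ("2x64GB per channel",           [DIMM_64GB, DIMM_64GB]),
]:
    fits, gb, ranks = check_channel(mods)
    print(f"{label}: {gb}GB / {ranks} ranks per channel, "
          f"{gb * CHANNELS}GB system-wide -> {'fits' if fits else 'over the limit'}")
[/CODE]

Under those assumed limits the mixed 64GB+32GB combo lands exactly at 96GB and 12 ranks per channel (384GB total), while 2x64GB overshoots both, which matches the reasoning above.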
 

wildpig1234

Well-Known Member
Aug 22, 2016
Samir said:
Haha, that never existed before I published my findings. Looks like others paid attention to my work and profited from it.

I think it would take the 64GB module, but only one of them per channel, since two of them would exceed 96GB as well as 12 ranks. (With 2x 32GB modules currently, I'm only at 64GB in a channel and 8 ranks, hence why I'm also not at the 384GB limit.)

That's an interesting problem. Did you boot with a full load of 64GB modules or just one? It sounds like it has something to do with the RAM's registers overwriting the GPU's registers so the card 'disappears'. That's one thing I was thinking could happen with the Z420 at 512GB: it is recognized, but 'loops back around' to overwrite itself once you start to use >384GB. But I would expect Intel to have several protections for this in the CPU, and it would simply not allow the RAM to register, and then the system won't boot.
My Asus Z9PE board will load the gfx driver with 1 CPU and 512GB, or 2 CPUs and 960GB. But as soon as I add that one last 64GB stick, it fails to load the gfx driver. I don't have a 32GB LRDIMM to see what happens at 992GB... So weird, lol. Must be a limitation or bug of that MB.

Even if it doesn't load the gfx driver, I guess the only way to see that it actually accesses all of that 1TB of RAM is by running a RAM test?
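For a crude check from inside the OS, something like the sketch below at least shows the system will hand out and read back that much memory (the chunk sizes are arbitrary placeholders). It's no substitute for a proper tester like MemTest86 run from boot:

[CODE]
# Crude memory "touch" sketch: allocate large chunks, write a pattern,
# and spot-check it. A real bootable tester (e.g. MemTest86) is the proper
# tool; this only shows the OS can allocate and read back the space.
import numpy as np

CHUNK_GB = 8     # allocation unit; placeholder value
TARGET_GB = 900  # stay below physical RAM to avoid heavy swapping

chunks = []
try:
    for i in range(TARGET_GB // CHUNK_GB):
        a = np.empty(CHUNK_GB * 1024**3, dtype=np.uint8)
        a[:] = i % 251                            # fill with a simple pattern
        if not (a[::4096] == i % 251).all():      # spot-check one byte per page
            print(f"mismatch in chunk {i}")
            break
        chunks.append(a)
        print(f"touched {(i + 1) * CHUNK_GB} GB")
except MemoryError:
    print(f"allocation failed after {len(chunks) * CHUNK_GB} GB")
[/CODE]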
 

wildpig1234

Well-Known Member
Aug 22, 2016
It's incredible how much memory prices have plummeted in the past year! DDR4 now costs as much as DDR3 used to, and DDR3 now costs as much as DDR2 used to.
Downward pressure on DDR4 due to DDR5 ;)
DDR4 is actually very useful. It has had the longest lifespan so far, I think, and can be used on quite a lot of platforms! A lot more than DDR3.
 

wildpig1234

Well-Known Member
Aug 22, 2016
Will the Phanteks Enthoo Pro work?
It may not fit, since that case only goes up to E-ATX, not EE-ATX. I have a Phanteks Enthoo Pro and it's a tight fit for the Asus Z9PE, which is E-ATX.
The size of the SM board is 13.68" (L) x 13.05" (W) (347.47mm x 331.47mm) per the manual.

An E5 v2 CPU can take a max memory of 768GB. So for dual CPUs, how much max RAM can we fit in the X9DRi-LN4F+?
That MB is spec'd for up to 1.5TB of RAM (24x64GB LRDIMMs). That is the max RAM supported by dual CPUs.
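The slot arithmetic behind that 1.5TB figure, using the 24-slot count and the 768GB-per-CPU limit quoted above:

[CODE]
# Arithmetic behind the 1.5TB maximum quoted above.
DIMM_SLOTS = 24        # X9DRi-LN4F+ slot count, per the spec cited here
LRDIMM_GB = 64
PER_CPU_LIMIT_GB = 768 # E5-26xx v2 per-CPU limit quoted in the question
CPUS = 2

board_max = DIMM_SLOTS * LRDIMM_GB   # 1536 GB
cpu_max = CPUS * PER_CPU_LIMIT_GB    # 1536 GB
print(f"board limit: {board_max}GB, CPU limit: {cpu_max}GB, "
      f"usable max: {min(board_max, cpu_max)}GB (= 1.5TB)")
[/CODE]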
 

wildpig1234

Well-Known Member
Aug 22, 2016
49
Supermicro X9DRi-LN4F+ EE-ATX LGA 2011 X79 Motherboard w/ Test CPU Memory | eBay

An LGA 2011 dual-CPU motherboard with 1.5TB capacity for $71, so you can fill it up... lol. This is the rev 1.2 board, which will take the v2 CPUs.
An E5 v2 CPU can take a max memory of 768GB. So for dual CPUs, how much max RAM can we fit in the X9DRi-LN4F+?
This is a great find! Finding a chassis for these EE-ATX motherboards is challenging, though.
I am testing out the board I got. As stated in the description, test CPUs and RAM were included; they consisted of two 2650 v2 CPUs and two 4GB DDR3 DIMMs ;)

The board was already upgraded to the latest BIOS, 3.4. Also, I don't know how they did it, but CPU fan speed control is already enabled!!! Yay. I didn't have to mess with the IPMI for this myself. I actually wish I knew how to do it, because I need to do it for my X10 board too!
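For the X10 (and these X9 boards), the fan profile can usually be flipped over IPMI. Below is a sketch using the raw command bytes commonly reported for Supermicro X9/X10 BMCs; the netfn/cmd values and mode numbers are community lore rather than anything from this thread, so verify them for your exact board before running it:

[CODE]
# Sketch: switch a Supermicro X9/X10 fan profile via ipmitool.
# The raw command (netfn 0x30, cmd 0x45) and the mode values are the ones
# commonly reported for these BMCs; treat them as assumptions and confirm
# against your board's documentation first.
import subprocess

FAN_MODES = {"standard": 0x00, "full": 0x01, "optimal": 0x02, "heavy_io": 0x04}

def set_fan_mode(host: str, user: str, password: str, mode: str) -> None:
    cmd = [
        "ipmitool", "-I", "lanplus", "-H", host, "-U", user, "-P", password,
        "raw", "0x30", "0x45", "0x01", f"0x{FAN_MODES[mode]:02x}",
    ]
    subprocess.run(cmd, check=True)

# Example with a hypothetical BMC address and default-style credentials:
# set_fan_mode("192.168.1.50", "ADMIN", "ADMIN", "optimal")
[/CODE]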
 

Puppetfreek

New Member
Nov 17, 2014
I'm running an X9DRi-LN4F+ with 1.5TB of RAM (24x64GB). It's been stable since I did the upgrade (summer), but I needed to increase the airflow over the RAM sticks to keep the temperature in check, so that's something to consider. You might be better off only populating half of the sticks, to provide more space between them and better cooling, as well as less heat generation.
 

wildpig1234

Well-Known Member
Aug 22, 2016
Puppetfreek said:
I'm running an X9DRi-LN4F+ with 1.5TB of RAM (24x64GB). It's been stable since I did the upgrade (summer), but I needed to increase the airflow over the RAM sticks to keep the temperature in check, so that's something to consider. You might be better off only populating half of the sticks, to provide more space between them and better cooling, as well as less heat generation.

Samir said:
The Z420 wasn't even supposed to support ECC Reg, lol. I found out by accident when I tried some 32GB LRDIMMs and it booted. I then ran memtest and it passed. Apparently I set the world record at 256GB for a Z420 when I did this. :) What's interesting is that the v2 processors apparently dropped LRDIMM support, at least on the Z420, since someone trying to replicate my setup ran into a problem with just beeps. The modules were the exact same part number as mine, known working, but his still wouldn't boot with them even though RDIMMs would. We then saw in the spec sheets how LRDIMM support was removed in v2, or at least that's what I recall now.

It is interesting that the v2 says 768GB, though. I guess the number of ranks allowed in a channel must have doubled, so instead of just 96GB there is a total of 192GB possible in a channel, and that would be achievable via 6x memory slots with 32GB modules or just 3x slots with 64GB modules, but then those would have to be LRDIMMs. With just regular RDIMMs, you'd need 24 memory slots to get 768GB, and for a dual-processor system to max out at 1.5TB, a whopping 48 memory slots, which I don't think exists.

I have no doubt that the memory limit on my E5-2630L v1 might be real, but now what I'm wondering is how I can hit that 384GB number. A 64GB 8Rx4 module and a 32GB 4Rx4 module together would total 96GB in a single channel, but that means that the number of ranks supported by the processor has to be 12 vs 8. And 12 is a bit of an oddball number, so I wonder if it is more like 16, which would allow 128GB per channel. Only one way to find out, but it's too pricey right now 'for science!'
So I found out something new today after loading up the X9DRi with 1.5TB. Although the board recognized all the RAM, it does not want to load the video driver with any amount of RAM equal to or above 1TB. This is also what happened on the Asus Z9PE. I am wondering if it's some limitation of the C602 chipset.

Puppet, are you also using an external video card in a PCIe slot, or are you using the ASPEED onboard gfx? Any chance you can slide in a PCIe gfx card and confirm what I am seeing? Thanks.
 


wildpig1234

Well-Known Member
Aug 22, 2016
Samir said:
The Z420 wasn't even supposed to support ECC Reg, lol. I found out by accident when I tried some 32GB LRDIMMs and it booted. I then ran memtest and it passed. Apparently I set the world record at 256GB for a Z420 when I did this. :) What's interesting is that the v2 processors apparently dropped LRDIMM support, at least on the Z420, since someone trying to replicate my setup ran into a problem with just beeps. The modules were the exact same part number as mine, known working, but his still wouldn't boot with them even though RDIMMs would. We then saw in the spec sheets how LRDIMM support was removed in v2, or at least that's what I recall now.

It is interesting that the v2 says 768GB, though. I guess the number of ranks allowed in a channel must have doubled, so instead of just 96GB there is a total of 192GB possible in a channel, and that would be achievable via 6x memory slots with 32GB modules or just 3x slots with 64GB modules, but then those would have to be LRDIMMs. With just regular RDIMMs, you'd need 24 memory slots to get 768GB, and for a dual-processor system to max out at 1.5TB, a whopping 48 memory slots, which I don't think exists.

I have no doubt that the memory limit on my E5-2630L v1 might be real, but now what I'm wondering is how I can hit that 384GB number. A 64GB 8Rx4 module and a 32GB 4Rx4 module together would total 96GB in a single channel, but that means that the number of ranks supported by the processor has to be 12 vs 8. And 12 is a bit of an oddball number, so I wonder if it is more like 16, which would allow 128GB per channel. Only one way to find out, but it's too pricey right now 'for science!'
I have an HP Z420 my workplace gave me for remote work from home. I updated the BIOS to the latest today. It has a 1620 v1 CPU. I put in 4x64GB and it was able to boot into the BIOS showing 256GB DDR3 LRDIMM under the total RAM section in line 2. But interestingly, the section below that, which should show the size of each DIMM, shows blank instead of displaying the sizes.

I then put in a 1607 v2 CPU, leaving in the 256GB, and I got a red light warning. I guess, as we suspected, 1600 v2 support for LRDIMMs was removed. Then I went back to the 1620 with the 256GB and it booted fine again. But with 2 more 64GB sticks added it won't boot, which I guess is because the 1620 v1 only supports 256GB of RAM.

So I put in a 2609 v2 with one 64GB stick. It does not boot at all. After a while the fans just sped up to max and stayed there. I removed the 64GB and replaced it with a 4GB RDIMM and it booted fine.

So the result so far for me is that LRDIMM support on the Z420 is incomplete. The saddest part is not being able to get it to boot with a 2609 v2 with even one 64GB stick... ;(
 