Looking for an M.2 carrier card with PLX (4x M.2 on x16, no RAID)

Sep 4, 2017
Thanks, but am I missing something? It looks like they only make a $400 RAID card, not an M.2 carrier card. I can't use anything that requires an out-of-box device driver; it has to work as straight NVMe for Storage Spaces. Can the HighPoint work with inbox NVMe drivers if it's in JBOD mode?
 

Morphers

Member
Nov 24, 2017
It's not a RAID card, it's a straight JBOD, so it will work.

[QUOTE="KC@Gadgetblues, post: 178958, member: 13998"]Thanks but am I missing something, it looks like they only make a $400 RAID card, not an M.2 carrier card. I can't use anything that requires an out-of-box device driver, it has to work as straight NVMe for Spaces. Can the Highpoint work with inbox NVMe drivers if it's in JBOD mode?[/QUOTE]
 

foureight84

Well-Known Member
Jun 26, 2018

D8V1D

New Member
Aug 30, 2018
That card works fine. You just need to make sure you have BIOS control to turn on bifurcation for the PCIe slot it's connected to. It has to be an x16 slot with the bifurcation setting at x4x4x4x4. I am currently using it on my ASUS Z10PE-D8 WS with a BIOS settings tweak via RU (RU.EXE + RU.EFI).
I have the same motherboard. Could you please provide the steps for the bios tweaks?
Thanks
 

Myth

Member
Feb 27, 2018
Update the BIOS to the most recent version from the manufacturer's website. Then, after rebooting, go into the PCIe settings and look for the bifurcation option. If your motherboard doesn't have it, you will have to buy the HighPoint controller card, which uses the PLX chip and can give you full speed in an x16 PCIe slot even without bifurcation.

You don't need the HighPoint RAID card driver installed for it to pass the NVMe drives through to the system. The driver is only there to support RAID, and you can RAID 0 them in Windows Disk Management instead. I'd stay with the default Windows drivers even though they're not as fast. The ASUS card is much cheaper, and you can install the Samsung drivers on a Windows 10 machine, but on a Windows Server machine you are stuck with the standard Windows NVMe driver.

The HighPoint is faster on Windows Server with its own drivers, but it does do some preboot shit to the system if you install the HighPoint driver.
 

foureight84

Well-Known Member
Jun 26, 2018
I have the same motherboard. Could you please provide the steps for the bios tweaks?
Thanks
Here you go: Will Intel add VROC support for older CPUs? (Virtual RAID On CPU) : intel

Since we have the same motherboard: address 0x533 is PCIE-5 and 0x537 is PCIE-7.

"Once RU.efi launches, type ALT+= (hold ALT and press the = key). It should give you a list of options; look for IntelSetup. Remember those 4 addresses we looked for earlier? 0x536 0x537 0x538 0x539. You will notice that the leftmost column reads something like 0200, 0210... This corresponds to the first two digits of the address, and the top row corresponds to the third digit. So hold CTRL and hit PageDn until you reach the desired address. For me, I scroll down until I see 0530, then press right until 06 is highlighted. This is the first address I need to modify. From the text file earlier you will see the different option values; to put the slot in x4x4x4x4, change the existing value to 00. Do that for one of those 4 addresses, then hit CTRL+W to write the change and ALT+Q to quit. Once you've quit the RU BIOS utility, type reset in the UEFI shell to reboot the system. If done correctly, you should see all 4 NVMe drives when the system POSTs. If not, repeat this step with the next address until the NVMe drives show up on POST. When they do, write that address down, as it is the one corresponding to your PCIe slot. For the ASUS Z10PE-D8 WS, it looks like 0x537 is slot 7 and 0x533 is slot 5. Once you're done, make sure to reset the BIOS settings back to default to undo all of the accumulated changes, and then change only the specific slot you need."
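For anyone following along, the row/column lookup described above can be sketched in a few lines of Python. This is purely illustrative; the offsets and the 00 = x4x4x4x4 value come from the post and are specific to this board's BIOS:

```python
def ru_grid_position(addr: int) -> tuple[str, str]:
    """Map a setup-variable offset to RU's hex grid: the leftmost
    column shows the 16-byte row base, the top row shows the low
    hex digit (the column)."""
    row = addr & ~0xF   # e.g. 0x536 -> row 0530
    col = addr & 0xF    # e.g. 0x536 -> column 06
    return f"{row:04X}", f"{col:02X}"

# The four candidate offsets mentioned above:
for addr in (0x536, 0x537, 0x538, 0x539):
    print(hex(addr), "->", ru_grid_position(addr))
```

So for 0x536 you scroll to row 0530 and arrow over to column 06, exactly as described in the quoted walkthrough.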
 

Citizen03212

New Member
Sep 7, 2018
A project that's releasing FPGAs on M.2 is also selling a four-slot x16 PCIe carrier card to match, out in a few weeks and only $149 USD:

AcornNest X4

Has its own PCIe switch, so it will work on any motherboard and doesn't need BIOS bifurcation support.
 

foureight84

Well-Known Member
Jun 26, 2018
A project that's releasing FPGAs on M.2 is also selling a four-slot x16 PCIe carrier card to match, out in a few weeks and only $149 USD:

AcornNest X4

Has its own PCIe switch, so it will work on any motherboard and doesn't need BIOS bifurcation support.
That's actually really good to know. It is quite annoying having to edit your BIOS via RU; sometimes you forget which address you need to modify.
 

sth

Active Member
Oct 29, 2015
I was hoping Supermicro would come out with a half-height quad M.2 card when they went quad. Amfeltec is still the only half-height quad available, at $$$$$.
 

foureight84

Well-Known Member
Jun 26, 2018
and it is relatively cheap.

https://www.amazon.com/s/ref=nb_sb_noss?url=search-alias=aps&field-keywords=AOC-SHG3-4M2P

Yes, there is a PLX switch that connects all 4 M.2 SSDs to an x8 PCIe bus.

In raw throughput the ASUS card will be faster; in compatibility, this card can work on motherboards that do not have bifurcation support.

Chris
Yeah, they're relatively cheap. I've seen them for as low as ~$120. http://www.tigerdirect.com/applications/SearchTools/item-details.asp?EdpNo=282796
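To put rough numbers on the speed-vs-compatibility trade-off, here's a quick back-of-the-envelope sketch, assuming PCIe 3.0 links. These are link ceilings after 128b/130b encoding, not measured throughput:

```python
# PCIe 3.0: 8 GT/s per lane with 128b/130b encoding -> ~0.985 GB/s usable
GB_PER_LANE = 8e9 * 128 / 130 / 8 / 1e9

# ASUS-style card: x16 slot bifurcated into four x4 drives
bifurcated_x16 = 16 * GB_PER_LANE
# PLX-style card (e.g. the Supermicro above): four x4 drives
# sharing an x8 uplink to the host
plx_uplink_x8 = 8 * GB_PER_LANE

print(f"x16 bifurcated: ~{bifurcated_x16:.1f} GB/s aggregate")
print(f"x8 PLX uplink:  ~{plx_uplink_x8:.1f} GB/s aggregate")
```

So with all four drives hammered at once, the x8 uplink caps you at roughly half the aggregate bandwidth; with one or two drives active at a time, the switch is not the bottleneck.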
 

Toddh

Member
Jan 30, 2013
122
10
18
Has anyone tried one of the Supermicro adapters? Specifically, can you see more than one NVMe drive on a motherboard without bifurcation?