[SOLVED] Unable to boot FreeNAS with Intel P3700


K D

Well-Known Member
Dec 24, 2016
I am having trouble getting FreeNAS to boot with a P3700 AIC. This is an AIO system running ESXi 6.5d.

Specs -
Board - X10SDV-4C-7TPF
RAM - 64 GB ECC
PCIe 1 - Dell H310 flashed to LSI
PCIe 2 - Intel DC P3700 400GB AIC
M.2 - Crucial SATA drive
Chassis - SC846

The system had been running stably for a few months. I recently acquired a P3700 and swapped out the S3700 I had been using as a SLOG. Now the VM does not boot; the error is attached. I've tried updating FreeNAS to the latest 11.0-U1 and still have the same issue, and also tried a clean new FreeNAS VM.

I tried passing the P3700 through to Windows 10 and Windows Server 2016 VMs, and it is recognized in both. Intel SSD Toolbox shows the drive as clean.

Any thoughts?

(Attached screenshot of the boot error: m2stor.kd11.net-2017-07-09-15-01-40.png)
 

T_Minus

Build. Break. Fix. Repeat
Feb 15, 2015
Hmmm, was this issue resolved or am I thinking of OmniOS?
 

K D

Well-Known Member
Dec 24, 2016
This is an ESXi VM. I will change the VM settings to use UEFI and try again. The host itself is already set to UEFI.
 

K D

Well-Known Member
Dec 24, 2016
This is the error I get when I change the boot mode to EFI in the VM settings.

(Attached screenshot: Screen Shot 2017-07-11 at 5.07.42 PM.png)
 

Rand__

Well-Known Member
Mar 6, 2014
Have you updated the P3700 to the latest firmware?
I have no issues whatsoever with mine in FreeNAS...

FreeNAS-9.10.2-U5 (561f0d7a1)

Have not tried 11 though
 

T_Minus

Build. Break. Fix. Repeat
Feb 15, 2015
Have you updated the P3700 to the latest firmware?
I have no issues whatsoever with mine in FreeNAS...

FreeNAS-9.10.2-U5 (561f0d7a1)

Have not tried 11 though
In an ESXi VM too?
 

Rand__

Well-Known Member
Mar 6, 2014
Yep, ESXi 6.5 with the latest update.
Upgrading FreeNAS to 11 now...

Edit: No issues on FreeNAS-11.0-U1 (aa82cc58d)
 

K D

Well-Known Member
Dec 24, 2016
Have you updated the P3700 to the latest firmware?
I have no issues whatsoever with mine in FreeNAS...

FreeNAS-9.10.2-U5 (561f0d7a1)

Have not tried 11 though
Intel SSD Toolbox says to contact the OEM for the latest firmware. I'm not sure which OEM the drive came from; the markings on it suggest a stock Intel drive.
 

K D

Well-Known Member
Dec 24, 2016
  1. Identified the drive as a Dell-branded NVMe drive and updated it to the latest firmware from Dell.
  2. The BIOS was not on the latest version; updated it to v1.0b.
  3. Enabled the legacy OPROM for the PCIe slot in the BIOS.
  4. The card shipped with a half-height bracket and I had fitted an old full-height one, which left it looking poorly seated. Removed the bracket and inserted the card into the PCIe slot bare.
  5. Disabled and re-enabled passthrough for all devices in ESXi.
Not sure which of these steps fixed it, but it works now. Thanks all.
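For anyone hitting the same symptoms, steps 1 and 5 can be sketched roughly as below. This is a hedged sketch rather than the exact commands used here: `isdct` (Intel's SSD Data Center Tool) only applies if the drive is a stock Intel unit rather than an OEM-branded one like mine turned out to be, and the drive index and the PCI class string to grep for are assumptions to verify on your own system.

```shell
# Step 1 (firmware): on a stock Intel drive, Intel's SSD Data Center
# Tool (isdct) can check and load firmware. The drive index 0 is an
# assumption -- confirm it with the first command. OEM-branded drives
# (as mine turned out to be) need the OEM's own update package instead.
isdct show -intelssd        # list Intel NVMe drives and current firmware
isdct load -intelssd 0      # load newer firmware onto drive 0, if any

# Step 5 (passthrough): on the ESXi host, confirm the P3700 is still
# visible on the PCI bus before toggling passthrough in the vSphere UI.
esxcli hardware pci list | grep -i "Non-Volatile"
```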
 

Rand__

Well-Known Member
Mar 6, 2014
Well, you can run the tests I used here (ZeusRAM vs. Intel P3700) so we'd have comparability.
Mine were not entirely satisfactory for @T_Minus though ;) - I was missing a long-running 4K random write test (to get the drives into steady state and see whether the P3700 or the Zeus would drop off sooner, if at all).
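For reference, a long-running 4K random write pass of the kind I mean could look roughly like this with fio. This is a sketch only: the device path, queue depth and 4-hour runtime are placeholder assumptions, and note that it writes to the raw device and destroys its contents.

```shell
# Illustrative long 4K random-write run with fio to push a drive toward
# steady state. /dev/nvme0n1, iodepth and the 4h runtime are assumptions.
# WARNING: this writes to the raw device and destroys all data on it.
# On FreeBSD/FreeNAS the device would be /dev/nvd0 with --ioengine=posixaio.
fio --name=ss-randwrite --filename=/dev/nvme0n1 \
    --rw=randwrite --bs=4k --iodepth=32 --numjobs=4 \
    --direct=1 --ioengine=libaio --time_based --runtime=14400 \
    --group_reporting
```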

On the other hand, it would still be great if we could come up with an STH standard for testing HDDs/SSDs/NVMe drives so that results are comparable and benchmarking is easy (i.e. a post that provides the commands to run and the transformations needed to produce charts and interpret the results).

Additionally, it would need to cope with various criteria: drive condition (new vs. secure-erased; empty, so the drive is usable as a raw device, vs. [partially] filled, usable with file output only), and surely we'd need one set of tests per drive type (HDD/SSD/NVMe)... The drive access/configuration would also need to be noted (direct access to a single drive, RAID(-Z)/mirror, number of drives, iSCSI/NFS/Samba, (a)sync).

The 'lifetime cost' (TBW) should also be calculated - you wouldn't want to lose 20% of a new drive's total lifetime just to run a couple of benchmarks because you ran a "48h steady-state all-out write test" :)
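As a back-of-the-envelope check on that lifetime cost, assuming (hypothetically) a drive that sustains 400 MB/s of writes and carries a 7300 TBW endurance rating, roughly in line with a 400GB P3700's spec sheet:

```shell
# TB written by 48h of sustained 400 MB/s writes, as a share of an
# assumed 7300 TBW endurance rating (hypothetical round numbers).
awk 'BEGIN {
    tb = 400 * 3600 * 48 / 1e6            # MB/s * seconds -> TB
    printf "%.1f TB written, %.2f%% of rated endurance\n", tb, 100 * tb / 7300
}'
# → 69.1 TB written, 0.95% of rated endurance
```

So even a 48h all-out write test costs on the order of 1% of a high-endurance NVMe drive's rated life - real, but nowhere near the 20% worst case.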

I am totally aware that this is a matter of almost faith-like proportions (fio vs. iometer vs. bonnie vs. sysbench vs. various Windows tools) ;) but maybe we could start with one of them.

I would imagine this could be very popular content for the site: there are many pages providing guides on running a benchmark, but few have end-to-end copy/paste-ready content, and even fewer have actual numbers.

To circle back - that's why I originally ended up using Benjamin's setup, so I could compare against his (popular) results :)
 

realtomatoes

Active Member
Oct 3, 2016
On the other hand, it would still be great if we could come up with an STH standard for testing HDDs/SSDs/NVMe drives so that results are comparable and benchmarking is easy (i.e. a post that provides the commands to run and the transformations needed to produce charts and interpret the results).
awesome idea.
 