Shared storage ideas...vSphere 6.5


BSDguy

Member
Besides the napp-it Free edition there is a Pro edition with support and some extra features. After first setup there is a 30-day trial of the additional Pro features (no OS/ZFS restrictions; mainly comfort features).

There is currently no VAAI support in the free Solaris forks besides NexentaStor (there is a free Community Edition, forbidden for commercial use and with capacity restrictions), but for VM storage NFS is just as fast and much easier than iSCSI, and you get concurrent SMB access for copy/clone/move/backup, with snaps available as Windows Previous Versions.
Damn, I was afraid someone was going to say that about VAAI.

I really don't want to be without VAAI, so what are my options here?
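(For what it's worth, here is a minimal sketch of the NFS route gea describes, assuming a Solarish (OmniOS/napp-it) storage box at 10.0.0.10 and a hypothetical dataset tank/vmstore; all names and addresses below are illustrative only:)

Code:
# On the storage box: share a ZFS dataset over NFS (and SMB for the
# concurrent copy/clone/backup access mentioned above)
zfs create tank/vmstore
zfs set sharenfs=on tank/vmstore
zfs set sharesmb=on tank/vmstore

# On each ESXi host: mount the share as a datastore
esxcli storage nfs add -H 10.0.0.10 -s /tank/vmstore -v vmstore-nfs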
 

BSDguy

Member
Does FreeNAS 11 support VAAI?

Also, has anyone experimented with XPEnology for vSphere storage? heh
 

whitey

Moderator
Does FreeNAS 11 support VAAI?

Also, has anyone experimented with XPEnology for vSphere storage? heh
I already addressed this...yes, since the 9.3 release, and for iSCSI/block only. Please see previous references.
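(A quick way to confirm that on the ESXi side; a hedged example, where the naa.* device ID is a placeholder for your actual iSCSI LUN:)

Code:
# Find the device ID of the iSCSI LUN
esxcli storage core device list | grep -i '^naa.'
# Show which VAAI primitives (ATS/Clone/Zero/Delete) it supports
esxcli storage core device vaai status get -d naa.xxxxxxxxxxxxxxxx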
 

BSDguy

Member
An update! I rebuilt my SAN server with FreeNAS 11, set up a RAID 10 volume with my four Samsung SM863 SSDs, enabled iSCSI, and presented the storage to one of my ESXi hosts. I then migrated a single VM to the datastore on the RAID 10 volume, and a quick benchmark gave these results (I'm using 10Gb for the storage network):
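(For anyone following along, a rough sketch of the ESXi side of presenting an iSCSI target like this; the adapter name vmhba64 and the portal address are assumptions for illustration:)

Code:
# Enable the software iSCSI initiator on the ESXi host
esxcli iscsi software set --enabled=true
# Point dynamic discovery at the FreeNAS portal
esxcli iscsi adapter discovery sendtarget add -A vmhba64 -a 10.0.0.10:3260
# Rescan so the new LUN shows up for datastore creation
esxcli storage core adapter rescan --all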

[Screenshot: upload_2017-11-4_13-5-28.png, benchmark results]

[Screenshot: upload_2017-11-4_13-9-10.png, benchmark results]

This looks really good from what I can tell! It's completely saturating the 10Gb links, which is great.

And VAAI is fully supported and enabled:

[Screenshot: upload_2017-11-4_13-10-2.png, VAAI status]
 

whitey

Moderator
Good stuff, glad you made a sound and reliable choice there; I'm sure you'll be happy w/ that storage platform. Gotta tip your hat to FreeNAS for supporting VAAI.
 

BSDguy

Member
Heh, thanks! I'm amazed I tolerated StarWind Virtual SAN for so long, but time will tell how well this setup works, I guess. Doing a Storage vMotion is insanely fast.
 

whitey

Moderator
Heh, thanks! I'm amazed I tolerated StarWind Virtual SAN for so long, but time will tell how well this setup works, I guess. Doing a Storage vMotion is insanely fast.
Can we get a 'zpool iostat -v poolname 2' w/ a few readouts while svmotioning?
 

BSDguy

Member
Can we get a 'zpool iostat -v poolname 2' w/ a few readouts while svmotioning?
Sure thing:

[Screenshot: upload_2017-11-4_14-2-27.png, zpool iostat readouts]

I just svMotioned a VM that is about 20GB in size from an NVMe drive to the RAID 10 iSCSI storage, and it took 20 seconds...

I just need to test UNMAP to make sure it is working, as that is a really nice feature to take advantage of.
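(A hedged note on testing that: on vSphere 6.5, VMFS6 datastores reclaim space automatically, while VMFS5 needs a manual kick; the datastore label below is a placeholder:)

Code:
# VMFS5: manually reclaim dead space (run from an ESXi shell)
esxcli storage vmfs unmap -l SSD-R10-DS
# VMFS6: reclamation is automatic; check the setting with
esxcli storage vmfs reclaim config get -l SSD-R10-DS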

Is there somewhere in FreeNAS where I can see the IOPS on the zpool?

FYI: I am using jumbo frames throughout the storage network.
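(Side note: a quick sanity check that jumbo frames really pass end-to-end from ESXi; the target IP is an example:)

Code:
# 8972-byte payload + 28 bytes of headers = 9000, with don't-fragment set
vmkping -d -s 8972 10.0.0.10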
 

whitey

Moderator
You're missing -v on that zpool iostat. As for IOPS: in FreeNAS, see the Reporting tab -> Disks; you can also see it on the vSphere side under Datastore -> Monitor.
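(For clarity, the full command being asked for, with an interval and count so it prints a few readouts and exits; 'poolname' is a placeholder:)

Code:
# Per-vdev stats every 2 seconds, 5 readouts
zpool iostat -v poolname 2 5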

Here's a sample of datastore IOPS from an NFS datastore of mine for reference.

[Screenshot: husmm-rz-nfs-iops.png, datastore IOPS graph]
 

BSDguy

Member
VERY nice! What is the src NVMe? On another host? What device/class?
Thanks! ;-) 10Gb is looking slow now, haha. In theory I should get the read speed of all four SSDs, but the 10Gb NIC is the bottleneck now.

The NVMe drive is a Samsung 960 EVO 250GB (got it last week).
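(Rough back-of-envelope, assuming roughly 520 MB/s sequential read per SM863: a four-disk RAID 10 can read from all four disks at once, so about 4 x 520 = ~2080 MB/s, while 10GbE tops out at 1250 MB/s raw, around 1.1 GB/s after protocol overhead, so the NIC is indeed the ceiling.)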
 

whitey

Moderator
Thanks! ;-) 10Gb is looking slow now, haha. In theory I should get the read speed of all four SSDs, but the 10Gb NIC is the bottleneck now.

The NVMe drive is a Samsung 960 EVO 250GB (got it last week).
I hear 40GbE is 'cheapish' for the insane folks around here :-D eh hem
 

BSDguy

Member
I hear 40GbE is 'cheapish' for the insane folks around here :-D eh hem
I almost went with a 40Gb InfiniBand setup for my storage but decided to stick with 10Gb for now. I may be adding a third host later in the year, so I will add a second NIC to the FreeNAS server for that.

I have lots of reading/testing to do with FreeNAS now!!
 

whitey

Moderator
Heck w/ IB, too much of a PITA; stick w/ Ethernet for sure is my vote. Juniper/Arista 40GbE switches can be had in the $300-600 range...if you're patient and stalk eBay listings.

EDIT: Ohh, are you direct-connecting between NICs/ESXi hosts with no switch in the mix for 10GbE?
 

BSDguy

Member
Yeah, that was my conclusion as well re: IB. Still waiting for a practical (i.e. quiet) 10Gb switch to become available...
 

whitey

Moderator
I think the only semi-quiet 10G switch I can think of would be one of those Ubiquiti 10G switches; at least, of the quiet brands, that's the only one I would trust.
 

whitey

Moderator
In a sad attempt to get close to where you are, I just blew away my raidz of four 400GB husmm's and rebuilt it as striped mirrors w/ a matching 200GB husmm SLOG. Wonder why I can't make these sing; hell, raidz was just as good w/ these devices.

Code:
                                           capacity     operations    bandwidth
pool                                    alloc   free   read  write   read  write
--------------------------------------  -----  -----  -----  -----  -----  -----
husmm1640-r10                           32.9G   707G      0  3.70K  3.99K   247M
  mirror                                17.5G   352G      0    763  2.00K  48.2M
    gptid/6afb105f-c16e-11e7-a22f-0050569a060b      -      -      0    685  2.00K  48.3M
    gptid/6b3068e0-c16e-11e7-a22f-0050569a060b      -      -      0    684      0  48.3M
  mirror                                15.4G   355G      0    749  2.00K  47.8M
    gptid/947df593-c16f-11e7-a22f-0050569a060b      -      -      0    670  2.00K  47.8M
    gptid/94af626d-c16f-11e7-a22f-0050569a060b      -      -      0    669      0  47.8M
logs                                        -      -      -      -      -      -
  gptid/fd87bbee-c16f-11e7-a22f-0050569a060b   252M   186G      0  2.22K      0   151M
--------------------------------------  -----  -----  -----  -----  -----  -----

                                           capacity     operations    bandwidth
pool                                    alloc   free   read  write   read  write
--------------------------------------  -----  -----  -----  -----  -----  -----
husmm1640-r10                           33.2G   707G      0  3.52K  4.00K   265M
  mirror                                17.6G   352G      0    660  4.00K  56.7M
    gptid/6afb105f-c16e-11e7-a22f-0050569a060b      -      -      0    598  4.00K  56.7M
    gptid/6b3068e0-c16e-11e7-a22f-0050569a060b      -      -      0    596      0  56.7M
  mirror                                15.5G   354G      0    671      0  56.8M
    gptid/947df593-c16f-11e7-a22f-0050569a060b      -      -      0    584      0  56.9M
    gptid/94af626d-c16f-11e7-a22f-0050569a060b      -      -      0    582      0  56.9M
logs                                        -      -      -      -      -      -
  gptid/fd87bbee-c16f-11e7-a22f-0050569a060b   252M   186G      0  2.22K      0   151M
--------------------------------------  -----  -----  -----  -----  -----  -----

                                           capacity     operations    bandwidth
pool                                    alloc   free   read  write   read  write
--------------------------------------  -----  -----  -----  -----  -----  -----
husmm1640-r10                           33.4G   707G      0  3.83K  4.00K   252M
  mirror                                17.7G   352G      0    807  2.00K  48.3M
    gptid/6afb105f-c16e-11e7-a22f-0050569a060b      -      -      0    694  2.00K  48.3M
    gptid/6b3068e0-c16e-11e7-a22f-0050569a060b      -      -      0    693      0  48.3M
  mirror                                15.6G   354G      0    764  2.00K  47.8M
    gptid/947df593-c16f-11e7-a22f-0050569a060b      -      -      0    669  2.00K  47.9M
    gptid/94af626d-c16f-11e7-a22f-0050569a060b      -      -      0    669      0  47.9M
logs                                        -      -      -      -      -      -
  gptid/fd87bbee-c16f-11e7-a22f-0050569a060b   252M   186G      0  2.29K      0   156M
--------------------------------------  -----  -----  -----  -----  -----  -----
I can try iSCSI, I guess; may get a 10-20% gain. Starting to get a bit dejected about these husmm's.
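(One hedged diagnostic, assuming sync writes are in play on that pool: the log vdev in the readouts above is absorbing ~150 MB/s, so the single SLOG device may be the write ceiling. Temporarily disabling sync isolates that; don't leave it off, since it risks data loss on power failure:)

Code:
# Check how sync writes are handled on the pool
zfs get sync husmm1640-r10
# Diagnostic only: bypass the SLOG to see if write throughput jumps
zfs set sync=disabled husmm1640-r10
# ...rerun the test, then restore the default
zfs set sync=standard husmm1640-r10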