Home Setup - Design changes


Rand__

Well, there's some of your theoretical speed getting lost...

With "iperf3 server setup on freenas / client on ubuntu 18.4", your bandwith 12,8 GB/s , thats going to be ~1300 MB/s (o/c the bandwith is no static value and may change every few seconds for unknown reasons, especially on esxi).

And I just realised that your read performance was significantly lower than write, which is exactly a scenario I have seen with CDM/FreeNAS/SSD pools before. I never found a solution, but @gea had provided a possible explanation at some point. I think I was testing with S3700s back then; it might be this thread: https://forums.servethehome.com/index.php?posts/164693/ - unfortunately no time to actually look it up, sorry.
 

marcoi

OK, so I've been messing with two Ubuntu images using iperf3, both VMs on the same host. I enabled TSO, LRO, and jumbo frames. Testing using vmxnet3 NICs on the storage vSwitch that FreeNAS is set up to use.
Info from: VMware Knowledge Base

[attached screenshots: iperf3 results]

So far this is the best VM-to-VM result.
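For reference, a minimal sketch of how the guest-side part of that tuning is usually done on Ubuntu (the interface name, peer address, and 9000-byte MTU are assumptions; the vSwitch/port-group MTU on the ESXi side has to be raised separately):

```python
# Sketch: enable TSO/LRO and jumbo frames on a vmxnet3 interface inside an
# Ubuntu guest, then run a quick VM-to-VM iperf3 test.  Interface name and
# peer IP are hypothetical; run as root.
import subprocess

IFACE = "ens192"      # assumed vmxnet3 interface name
PEER = "10.0.0.2"     # assumed address of the other VM (iperf3 -s running there)

def run(cmd):
    print("+", " ".join(cmd))
    subprocess.run(cmd, check=True)

run(["ethtool", "-K", IFACE, "tso", "on", "lro", "on"])   # segmentation/receive offload
run(["ip", "link", "set", "dev", IFACE, "mtu", "9000"])   # jumbo frames on the guest NIC
run(["iperf3", "-c", PEER, "-t", "10"])                   # 10-second throughput test
```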
 

marcoi

Same VMs as above, but running it locally so only one VM is involved.
[attached screenshots: iperf3 results]
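A loopback run like that can be reproduced by starting server and client inside the same guest; a minimal sketch, assuming iperf3 is installed on the VM:

```python
# Sketch: iperf3 against localhost inside one VM.  This bypasses the virtual
# network, so it mostly shows what the guest's CPU/memory can sustain.
import subprocess, time

server = subprocess.Popen(["iperf3", "-s"])                      # server in the same VM
time.sleep(1)                                                    # let it start listening
subprocess.run(["iperf3", "-c", "127.0.0.1", "-t", "10"], check=True)
server.terminate()
```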


Seems like there should still be room for improvement on the VM-to-VM side?
 

Rand__

Have you looked at CPU load at that time?

Don't think that this is it, but just to be sure.
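One way to check that while a test is running, assuming the psutil package is installed on the guest, is to sample per-core utilisation; a single vCPU sitting near 100% would point at a CPU-bound stream:

```python
# Sketch: sample per-core CPU utilisation for ~10 seconds while iperf3 runs
# in another terminal.  Requires the psutil package.
import psutil

for _ in range(10):
    per_core = psutil.cpu_percent(interval=1, percpu=True)
    print(f"busiest core: {max(per_core):5.1f}%  all: {per_core}")
```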
 

marcoi

At three threads it failed; with two I got the results below.
I disabled the 9k MTU on the VM and ESXi hosts and enabled TSO and LRO.

[attached screenshots: iperf3 results]
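For reference, the thread counts above correspond to iperf3's parallel-stream option; a minimal two-stream sketch, with the peer address assumed:

```python
# Sketch: two parallel iperf3 streams (-P 2) against the peer VM.  A single
# TCP stream is often limited by one vCPU, so a couple of extra streams can
# help - until the vCPUs or the vSwitch become the bottleneck.
import subprocess

subprocess.run(["iperf3", "-c", "10.0.0.2", "-P", "2", "-t", "10"], check=True)
```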
 

Rand__

So it's not really scaling well.

Still, you should be able to reach higher values with that network bandwidth...
 

Rand__

Well, the vSwitch is a quick test, but I'd be surprised if a reinstall would help. Not sure whether we discussed that, but are all BIOS settings/energy-saving options off?
 

marcoi

I have the BIOS set to the optimized-RAM setting and the Dell performance-per-watt power profile. I'll go back in and re-check.
 

marcoi

Slightly better performance with 2 threads with the performance profile set in the BIOS and ESXi. I also removed the NIC from the vSwitch.

[attached screenshots: iperf3 results, 2 threads]


Single-thread performance:
[attached screenshots: iperf3 results, single thread]

Not sure it's worth the extra 5-60 watts of power, since it isn't that much more than the prior setting.
 

marcoi

I deleted the test storage, but I'll try again.

I did some other testing on my 2nd host. VM to VM on the 2nd host:
[attached screenshots: iperf3 results]



ESXi host to ESXi host seems to hit the 10Gb limit:
[attached screenshots: iperf3 results]
 

marcoi

I recreated my stripe pool: 8x 800 GB, no SLOG, and sync=standard.
Datastore presented to the local ESXi host over iSCSI, with a 200 GB drive added to the W7 VM.

[attached screenshot: benchmark results]
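For reference, a minimal sketch of what an 8-disk stripe pool with default sync behaviour looks like from the shell (pool and device names are assumptions; the FreeNAS UI does the equivalent, and a pure stripe has no redundancy at all):

```python
# Sketch: create an 8-disk stripe pool and leave sync at the default.
# Device names and pool name are hypothetical; a stripe loses everything if
# any single disk fails.
import subprocess

disks = [f"da{i}" for i in range(8)]                              # assumed 8x 800 GB SSDs
subprocess.run(["zpool", "create", "tank", *disks], check=True)   # stripe = plain list of vdevs
subprocess.run(["zfs", "set", "sync=standard", "tank"], check=True)
```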