Any performance difference between 16 / 24 port expanders?


MasterCATZ

New Member
Jun 8, 2011
I am planning on getting another 24 bay rack, and I'm also thinking about using an expander in the main case for its 16 bays so I can try and run everything from one PCIe slot via an LSI SAS9201. Ideally I wanted 8e8i, but I'm not finding any such beast going cheap.

LSI SAS9201-16e 16 Port 6Gb/s SAS SATA PCI-E HBA Adapter JBOD RAID H3-25577-00A | eBay

Should I get a 16 port expander or a 24 port expander? There's only a few dollars' difference between them.

I am assuming there's no degraded performance using a 24 port expander to run 16 drives?
At least this way I'd have a backup part if a 24 port expander fails, and I could jerry-rig my M1015s to run the 16 internal bays again.

I am just not sure how the lanes are divided; I assume they get shared as more drives are added? My back-of-envelope take is sketched below the links.

New HP SAS Expender Card 24-Port SAS PCI-E Expander Board 468405-001 | eBay


IBM 46M0997 ServeRAID Expansion Adapter 16-Port SAS Expander | eBay
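
My rough understanding of the sharing, sketched out (assuming a single 4-lane 6Gb/s uplink from the HBA to the expander; nominal line rates, not measured figures):

Code:
# Per-drive share of a SAS2 expander uplink (nominal numbers, not measured).
# Assumes a single 4-lane SFF-8087 uplink from the HBA to the expander.
LANES = 4
SAS2_GBPS_PER_LANE = 6.0     # raw line rate, 8b/10b encoded
PAYLOAD_FRACTION = 0.8       # 8b/10b leaves roughly 80% for data

uplink_mbs = LANES * SAS2_GBPS_PER_LANE * PAYLOAD_FRACTION * 1000 / 8  # ~2400 MB/s

for drives in (8, 16, 24):
    print(f"{drives} drives active: ~{uplink_mbs / drives:.0f} MB/s each")

# A 16 or 24 port expander has the same 4-lane uplink; more ports just
# means more drives contending for it, not a slower uplink.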
 

K D

Well-Known Member
Dec 24, 2016
I don't think a 16 or 24 port expander would be the bottleneck in itself. Since you plan to connect 16+24 drives to a single x8 card, you have about 40 Gb/s raw (roughly 3.2 GB/s usable) to split across 40 drives. The single PCIe x8 link is going to be the bottleneck.

Whether that is acceptable would depend on the workload: whether you plan to use SSDs, whether you use 10GbE or faster links, etc.
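
Back-of-envelope, that split looks like this (assuming PCIe 2.0 x8 and all 40 drives streaming at once, which real workloads rarely do):

Code:
# Host-side ceiling: every drive behind the HBA shares one PCIe 2.0 x8 link.
MBS_PER_PCIE2_LANE = 500          # MB/s per lane after 8b/10b encoding
lanes = 8
drives = 16 + 24                  # internal bays + external rack

theoretical_mbs = lanes * MBS_PER_PCIE2_LANE   # 4000 MB/s
practical_mbs = theoretical_mbs * 0.8          # ~3.2 GB/s after protocol overhead

print(f"~{practical_mbs / drives:.0f} MB/s per drive with all {drives} busy")
# -> ~80 MB/s each: fine for streaming media, tight for a full-pool scrub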
 

MasterCATZ

New Member
Jun 8, 2011
I just want to know if the 24 port expander would be slower than the 16 port expander when running 16 bays.

My main bottleneck is whether the PCIe slot will be running at x1 speed (I killed the x8 slot):
GA-F2A85X-UP4 (rev. 1.0) | Motherboard - GIGABYTE Global

If I use one card, the PCIe slot will run at x4 speed; if I use the x1 slots, the x4 slot will drop back to x1.

So my options are: getting a quad-port 6Gb/s card and using it with expanders at x4 bandwidth,
or
running 2x M1015s from the x1 slots to keep the 16 bays running, and getting another card with external ports to connect to an expander in the 24 bay rack,
or
getting some more quad-port cards and seeing if I can connect 4 ports to an expander (I know it can take 2 incoming ports), possibly doubling the lanes available for the 16 bay, though they would be at x1 PCIe speed until I get a new mainboard.
Being PCIe 2.0, though, x1 is capped at 500 MB/s and x4 at 2 GB/s anyhow...

Code:
PCI Express
Version  Introduced  Line code  Transfer rate  Throughput
                                (per lane)     x1          x2          x4          x8          x16
1.0      2003        8b/10b     2.5 GT/s       250 MB/s    0.50 GB/s   1.0 GB/s    2.0 GB/s    4.0 GB/s
2.0      2007        8b/10b     5.0 GT/s       500 MB/s    1.0 GB/s    2.0 GB/s    4.0 GB/s    8.0 GB/s
3.0      2010        128b/130b  8.0 GT/s       984.6 MB/s  1.97 GB/s   3.94 GB/s   7.88 GB/s   15.8 GB/s
4.0      2017        128b/130b  16.0 GT/s      1969 MB/s   3.94 GB/s   7.88 GB/s   15.75 GB/s  31.5 GB/s
5.0      2019        128b/130b  32.0 GT/s      3938 MB/s   7.88 GB/s   15.75 GB/s  31.51 GB/s  63.0 GB/s
The 24 bay will be running SnapRAID for multimedia storage.
The 16 bay will be ZFS (which I need the most bandwidth for, though it's rarely used these days; it's just for my valued data).

That's until I get myself a new Threadripper system anyhow; still waiting for my old APU system to die and for memory prices to darn drop.
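
Tallying those options with the same nominal PCIe 2.0 numbers from the table (hypothetical totals, ignoring protocol overhead):

Code:
# Aggregate host bandwidth for each wiring option (nominal PCIe 2.0 rates).
MBS_PER_LANE = 500    # MB/s per PCIe 2.0 lane

options = {
    "one quad-port card in the x4 slot":        4 * MBS_PER_LANE,  # 2000 MB/s
    "2x M1015 in x1 slots (16 internal bays)":  2 * MBS_PER_LANE,  # 1000 MB/s
    "a single HBA forced down to x1":           1 * MBS_PER_LANE,  #  500 MB/s
}
for option, mbs in options.items():
    print(f"{option}: ~{mbs} MB/s total")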
 

MasterCATZ

New Member
Jun 8, 2011
Anyhow, thanks for your input.
I might just get a quad-port 3Gb/s card for now and a faster one later when the mainboard is upgraded; the main reason I don't use the ZFS pool much is that the scrubs took days.

Also, I am pretty sure my other PCIe slot is killing hardware, so a cheap $30 part will not hurt the budget: 2x RevoDrives died within a week of each other, then another M1015, and now it seems the backup one is getting a faulty port. It's pretty doubtful that 4 HDDs are all getting CRC errors all of a sudden.

I could possibly have something shorting on the dead x8 PCIe slot, but the video card and other devices in the other PCIe slots seem OK...