Hi Guys,
Spent all weekend working on this new build: a Supermicro X8DAH motherboard with the Intel 5520 chipset and dual IOH-36D PCI-E hubs. The mobo has the latest BIOS and IPMI firmware, dual X5650 CPUs, 192GB RAM (16x16GB DIMMs), multiple Emulex 10Gb adapters, and an Adaptec RAID adapter.
Anyway, to cut to the chase: I am getting really bad network performance with this system, which I suspect is a PCI-E bottleneck. I have multiple similar setups on X8DT3 motherboards that can easily hit line-rate performance on the 10Gb adapters.
The main difference between this new system and the existing ones is that Intel Turbo Boost does not kick in to max CPU speed like it does on the systems with the X8DT3 mobo (power settings are set to max performance and the BIOS setup is similar).
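To watch the clocks during a run I've just been using the stock Windows tooling, something along these lines (assuming the usual 2012R2 counter names, adjust if yours differ):

powercfg /getactivescheme
typeperf "\Processor Information(_Total)\% Processor Performance" -si 1 -sc 30

Values consistently above 100% on that counter mean turbo is actually engaging.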
Max throughput I can get on a dual-port card is about 5Gbps, while on a system with the X8DT3 I was able to get close to 18Gbps when saturating both links on the same card.
Since I have multiple dual-port 10Gb NICs in this system, I tried running several parallel iperf tests, with a very disturbing result: it would seem the whole PCI-E bus is limited to about 5Gbps of throughput.
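For reference, the parallel tests were basically along these lines (iperf 2 syntax; the addresses below are just placeholders for the boxes on the far end of each link, with one client instance per local port running at the same time):

iperf -s (on each remote box)
iperf -c 10.0.1.10 -P 4 -t 60
iperf -c 10.0.2.10 -P 4 -t 60

Even with multiple streams per port across several cards at once, the combined total tops out at roughly the same 5Gbps.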
I've also tried testing with different brands of 10Gb NICs (Emulex, Intel, QLogic), with negligible difference.
Just curious if anyone here has a system with dual Intel IOH-36D PCI-E hubs and is able to do some testing for comparison. All testing was done on Windows 2012R2 updated to the latest patches, the same as the other systems I am running iperf from.
Let me know if you guys have any suggestions on what other tests I should run. I wonder if having the dual IOH-36D PCI-E hubs is truly what is causing this performance issue.
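One thing I still want to rule out myself is the cards negotiating a narrower PCIe link on this board. On 2012R2 I believe the negotiated speed/width can be pulled per adapter with something like this (property names from memory, so double-check them):

Get-NetAdapterHardwareInfo | Format-Table Name, Slot, PcieLinkSpeed, PcieLinkWidth

If the cards are training at a lower speed or narrower width than expected behind the 36D hubs, that could explain the ceiling.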
Thanks