Brocade ICX Series (cheap & powerful 10gbE/40gbE switching)

fohdeesha

Kaini Industries
Nov 20, 2016
fohdeesha.com
Updated the ICX7xxx firmware in the guide from 8080ca to the latest 8080d. Interestingly, this release included new bootloaders for both the ICX7150 and ICX7250, but the changelog makes no mention of bootloader changes.

Just bugfixes: Imgur
 
Jan 10, 2019
blog.azureinfra.com
Does anyone have NIC tips — ones that work with Server 2019 and this switch and aren't too expensive?
I have the HP NC523SFP, but there are no native 2016/2019 drivers, and using the 2012R2 drivers results in a bluescreen as soon as I change settings (like MTU). My connection uses SFP+ direct-attach cables.
 

ljvb

Member
Nov 8, 2015
Does anyone have NIC tips — ones that work with Server 2019 and this switch and aren't too expensive?
I have the HP NC523SFP, but there are no native 2016/2019 drivers, and using the 2012R2 drivers results in a bluescreen as soon as I change settings (like MTU). My connection uses SFP+ direct-attach cables.
Try the 571; it's a newer, Solarflare-based 10G card still being sold by HP. I just picked one up from eBay for $35 (I got the FLR version since I have HP servers, but there's a regular non-FLR PCIe version). I'll know more when it arrives tomorrow, but it's still actively supported by HP, so drivers should be available.
 

ljvb

Member
Nov 8, 2015
Energy must be very cheap where you live!
Heh... uh, no. That's why it's turned off and sitting in the corner of my garage while I try to figure out how to sell a 300-pound brick. My wife did not like the $300-a-month power bill... I've tried local pickup a few times on Craigslist and eBay, but no luck. At this point, given that it's about 5°F today and my house is freezing, I'm debating cranking that sucker up inside the house; it generates enormous amounts of heat :)
 

mrgstiffler

New Member
Jan 20, 2019
Does anyone have NIC tips — ones that work with Server 2019 and this switch and aren't too expensive?
I have the HP NC523SFP, but there are no native 2016/2019 drivers, and using the 2012R2 drivers results in a bluescreen as soon as I change settings (like MTU). My connection uses SFP+ direct-attach cables.
Mellanox cards are cheap and work. I have a single port CX-2 in my Server 2019 machine and a dual port in my Unraid.
 

fohdeesha

Kaini Industries
Nov 20, 2016
fohdeesha.com
Mellanox has been perfectly supported in FreeBSD/FreeNAS for a very long time. The Chelsio cards you'll find for 20 or 30 dollars (the price of a much newer ConnectX-3) are going to be ancient. Mellanox also has very open firmware tools that let you do a lot of stuff.
 

747builder

Active Member
Dec 17, 2017
Mellanox has been perfectly supported in FreeBSD/FreeNAS for a very long time. The Chelsio cards you'll find for 20 or 30 dollars (the price of a much newer ConnectX-3) are going to be ancient. Mellanox also has very open firmware tools that let you do a lot of stuff.
I'll have to check them out. Any preferred models in that price range that are dual SFP+? I have a few servers that take low-profile cards only, and I have a mix of FreeBSD/FreeNAS and VMware in the rack, so a model that can go in any of them would be awesome. Hardware is a mix of Dell and Supermicro.
 

juey

Member
Oct 1, 2018
Germany
Jumbo frames have very few advantages; the speed increase is below 10-15%, and they cause a heck of a lot of trouble with many devices anyway.
 

fohdeesha

Kaini Industries
Nov 20, 2016
fohdeesha.com
Yeah, I honestly had no clue people were still using jumbo frames. It's been a long time since I've seen CPU/hardware old enough that increasing the MTU gave me more than a 4 or 5% increase, at the expense of 50% more headache. Even my 40gbE storage is at 1500 MTU.

I'll add them to the OP if you can confirm they don't fragment and work properly linked at 10gbE (from your post in that thread, it seems they were linked at 1gbps when you ran the test).
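For anyone who does want to test this themselves: on FastIron, jumbo support is a global toggle (it takes effect only after a save and reload), and a do-not-fragment ping from a Linux host is a quick way to confirm full-size frames pass without fragmenting. The host address below is a placeholder, and you should double-check the CLI against your code version's configuration guide:

```
! On the ICX (FastIron) - global jumbo toggle, needs write mem + reload:
ICX(config)# jumbo
ICX(config)# write memory
ICX(config)# exit
ICX# reload

# From a Linux host, ping with the don't-fragment bit set, sized for 9000 MTU:
# 9000 bytes MTU - 20 (IP header) - 8 (ICMP header) = 8972 bytes of payload
ping -M do -s 8972 -c 3 10.0.0.2
```

If the path MTU is smaller anywhere along the way, the ping fails with a "message too long" error instead of silently fragmenting, which is exactly what you want the test to catch.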
 
Jan 10, 2019
blog.azureinfra.com
The funny thing I noticed is that my iPerf maxed out at 5Gbps, but copying a file between two Windows servers actually pushed the adapter to 9.7Gbps. Not sure how/why that's possible, but the iPerf numbers alone clearly aren't everything.
 

DanielWood

Member
Sep 14, 2018
The funny thing I noticed is that my iPerf maxed out at 5Gbps, but copying a file between two Windows servers actually pushed the adapter to 9.7Gbps. Not sure how/why that's possible, but the iPerf numbers alone clearly aren't everything.

I’m guessing SMB multichannel was at play, so you were no longer maxing out a single CPU thread.
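That matches the arithmetic: one CPU-bound TCP stream tops out well below line rate, while SMB Multichannel opens several streams and aggregates them up to the link rate. A rough back-of-the-envelope sketch (the per-stream and link figures here are illustrative, not measured):

```python
def aggregate_throughput_gbps(streams: int, per_stream_gbps: float,
                              link_gbps: float) -> float:
    """Aggregate throughput of N parallel streams, capped by the link rate."""
    return min(streams * per_stream_gbps, link_gbps)

# One CPU-bound stream: roughly what a default single-stream iPerf run shows.
print(aggregate_throughput_gbps(1, 5.0, 10.0))  # 5.0
# Two SMB Multichannel streams saturate the 10gbE link.
print(aggregate_throughput_gbps(2, 5.0, 10.0))  # 10.0
```

The cap at the link rate is why the file copy lands at ~9.7Gbps rather than 2 × 5Gbps.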
 

gslavov

New Member
Jun 7, 2017
The funny thing I noticed is that my iPerf maxed out at 5Gbps, but copying a file between two Windows servers actually pushed the adapter to 9.7Gbps. Not sure how/why that's possible, but the iPerf numbers alone clearly aren't everything.

It depends on what parameters you run iPerf with. It is notoriously bad on Windows with defaults. Try it with at least 4 threads, change the window size to 1M, and see what you get.
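The window-size advice comes down to the bandwidth-delay product: a single TCP stream can never move more than window / RTT. A quick sanity check (the RTT value is an illustrative LAN figure, not measured on this setup) shows why a small default window can cap a run at roughly the 5Gbps seen above, while a 1M window stops being the bottleneck:

```python
def max_throughput_gbps(window_bytes: int, rtt_seconds: float) -> float:
    """Upper bound on single-stream TCP throughput: window / RTT, in Gbps."""
    return window_bytes * 8 / rtt_seconds / 1e9

# 64 KB window at a 0.1 ms LAN RTT -> ~5.2 Gbps ceiling.
print(round(max_throughput_gbps(64 * 1024, 0.0001), 1))  # 5.2
# 1 MB window at the same RTT -> ~83.9 Gbps bound; the window no longer limits.
print(round(max_throughput_gbps(1024 * 1024, 0.0001), 1))  # 83.9
```

With the window out of the way, the remaining single-stream limit is usually CPU, which is what the multiple-thread (`-P 4`) suggestion addresses.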