Hello experts,
I'm building a compute cluster out of retired R610 and R710 servers.
Many of them have dual QLogic 2460 fibre channel cards, and I have also gotten a bunch of cables.
InfiniBand is probably the best network for a compute cluster / HPC, but it is doubtful that I will ever get the funds to add InfiniBand.
Right now I have a 1 Gbit network with an HP 2510-48G, and an Intel quad-port Gbit NIC for storage/frontend in the cluster. Since the cluster is going to be around 40 compute nodes, I will probably have problems pushing both inter-node communication and shared storage (NFS shares) through 1 Gbit, and it would probably help a lot to build a fibre channel target with e.g. Linux SCST, perhaps with a dedicated OS such as https://code.google.com/p/enterprise-storage-os/
I could use one of the R710 for this.
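In case it helps frame the question, here is roughly what I have in mind for the SCST side. This is only a sketch from reading the docs, not a tested setup; the WWN and the /dev/sdb path are placeholders for whatever the actual QLE2460 port and RAID volume on the R710 turn out to be:

```
# /etc/scst.conf (sketch only; WWN and device path are placeholders)
HANDLER vdisk_blockio {
    DEVICE disk0 {
        filename /dev/sdb        # RAID volume on the R710 target box
    }
}

TARGET_DRIVER qla2x00t {
    TARGET 21:00:00:24:ff:00:00:01 {   # port WWN of the QLE2460 in target mode
        enabled 1
        LUN 0 disk0
    }
}
```

As I understand it, the qla2x00t driver flips the QLogic HBA into target mode, and each initiator then sees LUN 0 as a plain SCSI disk.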
It seems fibre channel switches are really difficult to spec, since they seem to need licenses for active ports, trunking, etc., plus GBIC modules. I'm afraid I'll buy a dud that is missing some license, so if somebody here is willing to give me some advice on buying a fibre channel switch, I would really appreciate it.
E.g. would it make more sense to buy 2x 32-port switches, or should I go for e.g. a 48- or 64-port one?
Any that I should avoid?
Something that is a safe bet?
I also wonder whether it is possible to bond/aggregate interfaces on the target/storage server, e.g. find a couple of dual-port QLE2462 cards and get an effective 4x4 Gb of bandwidth, or is this not possible?
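From what I've read so far, FC doesn't do LACP-style bonding the way Ethernet does; instead you would export the same LUN through all four target ports and let dm-multipath on each initiator spread I/O across the paths. Something like this in /etc/multipath.conf is my current understanding, but I haven't tested it, so please correct me if this is wrong:

```
# /etc/multipath.conf on each compute node (sketch, untested)
defaults {
    user_friendly_names yes
    path_grouping_policy multibus      # put all paths in one active group
    path_selector "round-robin 0"      # round-robin I/O across the paths
}
```

If that's right, a single initiator still only gets 4 Gb per path per command stream, but aggregate throughput across the cluster should scale with the number of target ports.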
Any help is appreciated, and if you live nearby I'll gladly trade beers for advice.