10 Gb point-to-point connection via ASUS PEB-10G/SFP PLUS/DUAL or InfiniBand?


Kristian

Active Member
Jun 1, 2013
Hello,

I would like to establish a point-to-point 10 Gb network connection between my ESXi home lab and my storage/file/backup server.
The distance is approximately 2 meters.

As I am on a budget I came across this card:
ASUS PEB-10G/SFP PLUS Dual Network Adaptor: Amazon.co.uk: Computers & Accessories

They sell for 180 EUR each (new) in my home country.

Does anyone here have experience with these cards?
Would it be possible to daisy-chain them? (for an upcoming upgrade in the more distant future)

InfiniBand would also be an option, but I am not sure which cards to choose.
And they don't seem to be very common in Europe.

Some of the available ones are:
Voltaire InfiniBand 4X DDR PCIe 2-port HCA 500Ex-D - 99 EUR
Cisco InfiniBand 74-4538-01 2-port SFS-HCA-X2T7-A1 - 126 EUR
HP InfiniBand 4X DDR PCIe dual-port HCA 448397-B21 / 452372-001 - 69 EUR
HP InfiniBand HCA 4X DDR dual-port PCIe 483513-B21 / 487504-001 - 99 EUR
HP InfiniBand 4X QDR PCIe G2 dual-port HCA 583211-B21 - 239 EUR
 

mervincm

Active Member
Jun 18, 2014
My solution for this was a couple of used IBM 42C1790 cards (Broadcom BCM57710 based, about $100 US each) with transceivers and an LC/LC multimode fiber. Total solution: less than $250 US.
 

Kristian

Unfortunately those cards do not seem to be available in Europe very often. I can buy them new for about 340 EUR each.
Does anyone have a different idea?
 

RTM

Well-Known Member
Jan 26, 2014
How about the Brocade 1020?

The product description even suggests that they offer combined shipping, so perhaps you can talk to them to save a little (at the moment they want 2× $40.50 to ship two NICs to Denmark).
 

s0lid

Active Member
Feb 25, 2013
Tampere, Finland
Yeah, the Brocade 1020 is a cheap solution. For a 2-meter distance I would suggest using the Brocade active SFP+ cables; they're about 20-30 USD apiece.
OS support is OK-ish: Windows, Solaris, ESXi, and the newest Linux kernels.

Cable PNs:
1M: 58-1000026-01
3M: 58-1000027-01
5M: 58-1000023-01
Cables from other brands WILL NOT WORK, and neither do non-Brocade MM SFP+ optics.
 

Kristian

That sounds really interesting. Do you happen to know whether it is possible to daisy-chain those cards (in case a third node ever needs to be connected to the 10 Gb network without buying a 10 Gb switch)?
 

Clownius

Member
Aug 5, 2013
Out of interest, is there a reason you're going for these solutions over a simple 10GBase-T standard Cat6/6a cable?

I saw dual-port 10 Gbps Ethernet cards for around the $300 mark on eBay. You seem to be getting close to that anyway with the expensive cables and SFP+ modules plus the PCIe cards.
 

Kristian

No other reason than the budget.
I ordered the Brocade cards and a 3 m cable today.
2 × $300 = $600 vs. 2 × $47.49 (PCIe cards) + $35 (cable) = $129.98

So that's about a $470 difference...
and that leaves the power consumption aspect out of the calculation.
The Brocade cards draw about 8.5 watts, and from what I have read, 10GBase-T will draw more than twice the power.
And the electricity bill is always something you want to consider in Europe.
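The arithmetic above can be sanity-checked with a quick back-of-the-envelope script. The assumptions here are mine, not from the thread: an electricity price of 0.30 EUR/kWh, 24/7 operation, 8.5 W per Brocade card, and roughly 18 W per 10GBase-T card (reading "more than twice the power" as a bit over 2×):

```python
# Back-of-the-envelope comparison of the two options discussed above.
# Assumed (not from the thread): 0.30 EUR/kWh, 24/7 uptime, and
# per-card draw of 8.5 W (Brocade 1020) vs. ~18 W (10GBase-T).

CARDS = 2

# Hardware cost
cost_10gbaset = CARDS * 300.00        # ~$300 per dual-port 10GBase-T card
cost_brocade = CARDS * 47.49 + 35.00  # two Brocade 1020s plus a 3 m cable
hw_diff = cost_10gbaset - cost_brocade

# Extra power cost per year of 24/7 operation
HOURS_PER_YEAR = 24 * 365
PRICE_PER_KWH = 0.30                  # assumed European rate, EUR

watts_brocade = CARDS * 8.5
watts_10gbaset = CARDS * 18.0         # assumption: "more than twice" 8.5 W

kwh_diff = (watts_10gbaset - watts_brocade) * HOURS_PER_YEAR / 1000
power_diff = kwh_diff * PRICE_PER_KWH

print(f"hardware difference: ${hw_diff:.2f}")
print(f"extra energy per year: {kwh_diff:.1f} kWh (~{power_diff:.2f} EUR)")
```

Under these assumptions the 10GBase-T route costs about $470 more up front and roughly 50 EUR more per year in electricity.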
 

Clownius

Fair enough.

I was considering some 10GBase-T to run between my HTPCs (2 of them) and my storage server to stop them cluttering up the normal household network. Mainly I want to get away from constantly sorting out HDDs on the HTPCs, hence moving everything storage-wise to my storage server. But of course that requires high-speed access between server and HTPCs (it's not unusual to have 4 HD TV streams recording at once here) without, hopefully, cutting the rest of the household's computers off from access if needed.

So yeah, I'm up for around a grand on cards. That said, the cost of running fibre to the HTPCs would negate any savings in my case, I would think. Ethernet is already in place (and all runs are Cat6a, thankfully) and I just need some patch cables (although my Cat6 ones may work; I will test).

Power-wise I won't comment. It's far from cheap in Australia too, but I don't stress over 10 W on anything. The drives I'm spinning are my biggest power draw in reality, although I plan to start spinning 2.5" disks and SSDs more in future as my older drives die off (I'm still spinning a few 160 GB SATA and 73 GB 15k SAS drives). So I guess I am kinda thinking about it. I just have this thing about not throwing out/replacing working hardware.
 

Kristian

Based on the scenario you are dealing with, I would most probably have chosen to go down the 10GBase-T road too.
The HTPC running in my scenario doesn't even saturate a single 1 Gb link.
All recording happens directly on the ESXi node.
And the only fast connection I need right now is the one between the 12 TB ESXi node and the storage server.