Networking options


halfelite

Member
Oct 10, 2014
I started out with a 40TB server using an Areca hardware RAID controller. I have outgrown this setup and am in the process of building a new ZFS-based 24-bay server to replace it. The problem is finding the best and cheapest way to transfer everything over in a reasonable amount of time. The Areca box has one gigabit port, the ZFS box has two gigabit ports, and my ProCurve 1400 series switch is unmanaged, so no link aggregation.

Anyone have any suggestions on a cheap/fast way to move all this data over?
 

whitey

Moderator
Jun 30, 2014
My vote...a MikroTik or LB4M switch with 2 10G ports, hook each SAN host up via 10G and off to the races (if you've got NICs/optics/cables).
 

bds1904

Active Member
Aug 30, 2013
I recommend a Quanta LB4M for the switch. You can find one on eBay with 2 SFP+ modules for cheap.

Get 2x ConnectX-2 NICs & 2x Finisar FTLX8571D3BCL SFP+ modules.

Get 2x OM3 LC-LC duplex jumpers.

eBay is your friend for this project; you should be under $200 for everything.

You could get by without the switch, but I think you'll like having the 10GbE option for the fileserver and main workstation after the transfer is done.
 

petree77

New Member
Mar 10, 2015
If your dataset is mostly larger files rather than millions of small ones (for this exercise, a small file is anything under 50MB), you can make this work. With millions of small files you'll spend all your time seeking, which will kill your speed. Everything below assumes large files:

The theoretical numbers put the transfer at about four days over a single gigabit interface (40TB at roughly 115MB/s is ~3.7 days), but you'll have to be careful not to use any encryption during the copy process (rsync over ssh is out).
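One way to avoid ssh's encryption overhead is to run a plain rsync daemon on the destination. A minimal sketch of the daemon config; the module name "dst" and its path are assumptions, adjust to your layout:

```
# /etc/rsyncd.conf on the destination (hypothetical minimal config)
[dst]
    path = /dst
    read only = false
```

Start it with `rsync --daemon` on the destination, then push from the source using `rsync://` URLs instead of `host:path` (which would go over ssh).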

Link aggregation probably won't help you too much unless you're doing some non-standard hashing and running multiple streams.

The cheapest option monetarily is to invest in some decent 1GbE NICs and cable the two systems together 1-to-1, assigning a different network range to each interface, e.g.:

192.168.1.1->192.168.1.2 255.255.255.0
192.168.2.1->192.168.2.2 255.255.255.0
192.168.3.1->192.168.3.2 255.255.255.0
etc..
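On Linux, those point-to-point addresses can be assigned with `ip`. A sketch for the source box; the `eth1`-`eth3` interface names are assumptions, yours will differ, and the destination box gets the matching .2 addresses:

```
# Source box: one /24 per direct-attached NIC (interface names hypothetical)
ip addr add 192.168.1.1/24 dev eth1
ip addr add 192.168.2.1/24 dev eth2
ip addr add 192.168.3.1/24 dev eth3

# Destination box: the matching .2 addresses
ip addr add 192.168.1.2/24 dev eth1
ip addr add 192.168.2.2/24 dev eth2
ip addr add 192.168.3.2/24 dev eth3
```

With distinct subnets per cable, the kernel routes each destination address out the right NIC without any bonding or switch configuration.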

Split your directory tree into chunks:

/src/dir1
/src/dir2
/src/dir3
etc.

Then run an rsync on each source to its neighbor interface:

rsync -a /src/dir1/ 192.168.1.2:/dst/dir1/
rsync -a /src/dir2/ 192.168.2.2:/dst/dir2/
rsync -a /src/dir3/ 192.168.3.2:/dst/dir3/

It's not very glamorous, but you can move a hell of a lot of data this way. This all assumes that you can actually do enough random I/O on the source volume to keep the streams running.
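The per-interface streams above can also be launched in parallel from one script. A hedged sketch, assuming the destination runs an rsync daemon exporting a module named "dst" (to avoid ssh encryption, per the earlier caveat); paths and the module name are assumptions:

```shell
#!/bin/sh
# One rsync stream per point-to-point link, all running at once.
# --whole-file skips the delta algorithm, which only wastes CPU on a
# first-time copy to an empty destination.
for i in 1 2 3; do
  rsync -a --whole-file "/src/dir$i/" "rsync://192.168.$i.2/dst/dir$i/" &
done
wait   # return only after every stream has finished
```

Each iteration targets `192.168.$i.2`, so the per-subnet routes set up earlier pin each stream to its own cable.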
 

PigLover

Moderator
Jan 26, 2011
Just get two cheap 10G nics on ebay and wire them back to back. Don't even need a switch. When you are done resell them on ebay for the same price. Total cost: your eBay value fees on the resale.
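Back to back means no switch and no DHCP, so both ends need static addresses. A sketch of bringing up the link on Linux; the `ens2` interface name and the 192.168.10.0/30 range are assumptions:

```
# Host A (interface name hypothetical)
ip link set ens2 up
ip addr add 192.168.10.1/30 dev ens2

# Host B
ip link set ens2 up
ip addr add 192.168.10.2/30 dev ens2

# Sanity check from host A
ping -c 3 192.168.10.2
```

A /30 leaves exactly two usable addresses, which is all a point-to-point link needs.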
 

halfelite

Member
Oct 10, 2014
Just get two cheap 10G nics on ebay and wire them back to back. Don't even need a switch. When you are done resell them on ebay for the same price. Total cost: your eBay value fees on the resale.
Seems this would be the cheapest way. Is there something better than just going with two HP 671798-001 MNPA19-XTR 10GbE PCI-E network interface cards? They're at $39 apiece.
 

PigLover

Moderator
Jan 26, 2011
Unfortunately I can't comment on cards I haven't used. I do know that the ConnectX-2 EN cards work well and can usually be found for ~$50/each + the cost of short DAC cable to connect them. They should also resell well so you shouldn't get stuck with them. The oddball cards might be cheaper but they also might not resell quickly when you are done.
 

NeverDie

Active Member
Jan 28, 2015
I recommend a Quanta LB4M for the switch. You can find one on eBay with 2 SFP+ modules for cheap.

Get 2x ConnectX-2 NICs & 2x Finisar FTLX8571D3BCL SFP+ modules.

Get 2x OM3 LC-LC duplex jumpers.

eBay is your friend for this project; you should be under $200 for everything.

You could get by without the switch, but I think you'll like having the 10GbE option for the fileserver and main workstation after the transfer is done.
Thanks for posting that. I didn't realize it could be done for so little. When I'm a bit further along I'll probably want to do this!

@halfelite: Please post an update letting us know what you decided to do and how it went.
 

NeverDie

Active Member
Jan 28, 2015
So, is what halfelite picked the total extent of what's needed? Nothing else? A fully functional PC-to-PC 10GbE link for $115, including shipping. Wow! What's not to like about that? I should think that would be awesome for connecting a NAS box to a box of VMs, which (out of an abundance of caution) was the road I was heading down, just using regular gigabit Ethernet.
 

Chuckleb

Moderator
Mar 5, 2013
I bought about 60 units of that from that seller, he's great and the cards work well. The cable should work fine. Native support in most operating systems.
 

soapbox

New Member
Feb 24, 2014
I bought about 60 units of that from that seller, he's great and the cards work well. The cable should work fine. Native support in most operating systems.
Will the above Mellanox ConnectX-2 setup work on Windows 8.1 64-bit as well?

If not, what's the best way to connect 2 Windows 8.1 64-bit PCs into a 10GbE Local Network for transferring files?

I recently tried 2 StarTech ST10000SPEX NICs, which cost me $600, but the connection is very unreliable; it keeps failing and freezing up both computers.
 

Scott Laird

Active Member
Aug 30, 2014
Any SFP+ DAC cable should work. It doesn't need to say Mellanox. On the other hand, that price isn't that bad. The only cables I see on eBay that are cheaper ship from China and will probably take 2+ weeks to arrive.