Intel XL710-QDA2 - $549


New Member
May 5, 2015
Hey Lance,
I'd be very interested in 8x10 configs of any cards you can get your hands on. Performance is secondary for me, since only one specific server here can serve at 1GB/sec anyway ;)
It's more about using the NIC as a soft switch for a 10G home network.


Active Member
Apr 2, 2015
You guys are right on time. I've been curious about this and I'd just started setting up a test bed at work today.
One Mellanox ConnectX-3 dual 40G card connected by a breakout cable to a couple of systems with Intel X540-DA2 cards.
My hope was to create an Interface Group in pfSense and to do some testing between the 10G cards with iperf.
I'm particularly interested in finding out how the processor behaves when doing NIC performance tests.
I may either switch from a Mellanox to a Solarflare SFN7142Q or order a sample of the XL710 cards.
I'll follow up in this thread next week when I get a chance to resume testing.
@Lance Joseph: Any chance you got this to work?
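For anyone setting up a similar test bed, a minimal client-to-client run might look like this (a sketch assuming iperf3 and hypothetical addresses; classic iperf has slightly different flags):

```shell
# On the first 10G client (hypothetical address 10.0.0.1), start a server:
iperf3 -s

# On the second client, run a 30-second test with 4 parallel streams,
# which usually saturates a 10G link better than a single stream:
iperf3 -c 10.0.0.1 -t 30 -P 4

# Repeat in the reverse direction without swapping server/client roles:
iperf3 -c 10.0.0.1 -t 30 -P 4 -R
```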

Lance Joseph

Oct 5, 2014
@Lance Joseph: Any chance you got this to work?
I'd tried the Mellanox ConnectX-3 cards with the 4-to-1 (QSFP+ -> SFP+) breakout cable, but only got link on the first SFP+ leg.
It may be possible to get this to work, but that's not a rabbit hole I'm prepared to go down at the moment.

I was going to pick up a pair of XL710-QDA2 cards, but my vendor said there was a hardware issue with the currently available revision.
That may have been fixed since I last inquired, but I haven't had the time or inclination to follow up just yet.

I've created a new testbed with three systems running various flavors of ConnectX-3 cards.
I'm doing some tests with Storage Spaces Direct, but once I finish, I'll bridge the adapters in pfSense.
From what I've read, there shouldn't be any major issues in getting these Mellanox cards working on FreeBSD.

I have a pfSense box running an i5-3330S, 4GB of RAM, and two X520-DA2s.
You might find some mention of it in one of my previous posts on this forum.
Iperf is able to move traffic across the bridge (going client to client) at around 9.5Gb/s.
If I run an iperf server on the pfSense box itself, three clients peg throughput at around 26Gb/s.
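For anyone wanting to reproduce the bridge setup by hand on a FreeBSD/pfSense box, it can be sketched roughly like this (ix0/ix1 are assumed names for the X520-DA2 ports; in pfSense itself you'd normally build this through Interfaces > Assignments > Bridges rather than the shell):

```shell
# Create a bridge interface and add both 10G ports as members
# (ix0/ix1 are hypothetical ix(4) driver names for the X520-DA2):
ifconfig bridge0 create
ifconfig bridge0 addm ix0 addm ix1 up

# Verify members and link state:
ifconfig bridge0
```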


Staff member
Dec 21, 2010
I think the 40Gb cards can only run one connection per port, 10Gb or 40Gb.


New Member
Mar 23, 2021
Was this ever figured out?

I want to get the XL710-QDA2 card, toss it into my Linux NAS (TrueNAS SCALE), use a breakout cable to a MikroTik CRS317-1G-16S+RM switch, LAG them up, and let the NAS have 40Gbit.

Is this possible? Or should I just go with a dual 10Gbit Intel card, LAG 'em up, and have 20Gbit?
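On a plain Linux box, an LACP (802.3ad) LAG over the four breakout links could be sketched like this with iproute2 (the enp3s0fX interface names and the address are assumptions; on TrueNAS SCALE you'd normally create the LAGG through the web UI instead, and the CRS317 side needs a matching LACP bond):

```shell
# Create an 802.3ad (LACP) bond (interface names are hypothetical):
ip link add bond0 type bond mode 802.3ad

# Enslave the four 10G breakout interfaces (must be down first):
ip link set enp3s0f0 down && ip link set enp3s0f0 master bond0
ip link set enp3s0f1 down && ip link set enp3s0f1 master bond0
ip link set enp3s0f2 down && ip link set enp3s0f2 master bond0
ip link set enp3s0f3 down && ip link set enp3s0f3 master bond0

# Bring the bond up and address it:
ip link set bond0 up
ip addr add 192.168.1.10/24 dev bond0
```

Worth noting: LACP hashes per flow, so a single TCP stream still tops out at 10Gbit; you only approach 40Gbit aggregate with multiple clients or parallel streams.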


New Member
Mar 23, 2021
You need the Intel QSFP+ Configuration Utility. Linux version here: Download Intel® QSFP+ Configuration Utility - Linux* - Final Release

You can then tell it to run 4x10 or 2x2x10. You won't get 8x10GbE from it, as the XL710 is limited in that regard by its PCIe x8 interface, so the tool won't let you choose it.
I didn't want to chance it, so I went with straight 40Gbit cables and just use the 40Gbit port on the switch. But now that I know it's possible, I might try it on other servers.
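For anyone else trying this, the QCU invocation goes roughly like the following (a sketch from memory; the exact flag spellings and NIC numbering can vary by version, so check the README shipped with the utility):

```shell
# List the XL710 adapters the tool can see:
./qcu64e /devices

# Put NIC 1 into 4x10G breakout mode (a reboot/power cycle is
# typically needed before the new port configuration takes effect):
./qcu64e /nic=1 /set=4x10
```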