Unmanaged, I got a cheap TP-Link that came with rack-mount brackets. For your LAN, if you're not doing anything tricky, it should do the job. Obviously if you want managed for some reason it's going to cost more.
Unmanaged, I have a 16-port version of this. It just sits quietly in the corner and does...
How can a drive change the ability of a NAS to support multiple drives?
To me it just looks like a recommendation that, at that scale, you would look at WD Reds as your drive of choice.
Yes, some of the OEM-branded cards lock certain features, like RAID 6, behind a cost.
Check how it was advertised; some listings aren't clear about which version of the card it is and may only mention somewhere that it's actually an IBM version.
Personally I have 6 assorted LSI 9260/9261 cards, and 2 of them were IBM cross...
You answered your own question there, but yeah, the DCS AMD 3-node versions only fit 3 nodes. God knows who wanted that custom setup. Maybe a limitation of the PSU?
I do have the BBUs on mine. I should run some tests with SSDs, but I don't have any in use currently. Not to mention my boards are all older and run PCIe 2 in any case. Maybe I'll grab a 9271 and compare them.
Fair enough.
I was considering some 10GBase-T to run between my HTPCs (2 of them) and my storage server, to stop them cluttering up the normal household network. Mainly I want to get away from constantly sorting out HDDs on the HTPCs, hence moving everything storage-wise to my storage...
Out of interest, is there a reason you're going for these solutions over simple 10GBase-T over standard Cat6/6a cable?
I saw dual-port 10Gbps Ethernet cards for around the $300 mark on eBay. You seem to be getting close to that anyway with the expensive cables and SFP+ modules plus the PCIe cards.
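To put rough numbers on that comparison, here's a quick back-of-the-envelope sketch. Every price in it is an illustrative assumption (not from any actual listing), just to show how the SFP+ parts add up against a single copper card:

```python
# Rough cost comparison for linking two hosts at 10Gbps.
# All prices below are assumed/illustrative, not real quotes.

# Option A: 10GBase-T dual-port copper NIC plus Cat6a leads
base_t_nic = 300        # dual-port 10GBase-T card (assumed eBay price)
cat6a_cable = 10        # Cat6a patch lead (assumed), one per port
cost_base_t = base_t_nic + 2 * cat6a_cable

# Option B: SFP+ cards plus transceivers and a fibre lead
sfp_nic = 100           # used single-port SFP+ card (assumed), one per host
sfp_module = 40         # SFP+ transceiver (assumed), one per card
fibre_cable = 30        # fibre patch lead (assumed)
cost_sfp = 2 * sfp_nic + 2 * sfp_module + fibre_cable

print(f"10GBase-T: ${cost_base_t}, SFP+: ${cost_sfp}")
```

With numbers in that ballpark the two options land within a few tens of dollars of each other, which is the point: the "cheap" SFP+ route creeps up on the copper card's price once you count every module and cable.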
WD Reds have a history of being reliable NAS drives.
Drives shucked from external cases have a history of being unreliable cheap green drives.
Personally I like my data too much. Removing the drive from the case generally voids the warranty too. So you're asking to lose your data and the...
If that's really 96GB per node it's a damned fine deal. Sell off half the sticks on each node to get some of the $$$ back if it's more than you need, and maybe even sell the X5667s and buy L5520s instead or something.
The Intel version of the C6100 should support 4 nodes unless it was custom-built not to. I have two of them personally, with 4 nodes each plus a spare node.
The one you linked looks like a custom job built to only support 2 nodes for some reason. It may or may not have other customisations...
I'm actually looking at something like this with 2.5" bays for future expansion at home, as I retire 3.5" drives and move to a lower-power setup. It would need to connect to my LSI 9280-8e. Anyone know of an affordable option?
Sorry to hear of your trouble. Have you determined exactly what was faulty?
I'm again reminded of the importance of good backups, and of the time I had been working at a site for a few months when the HDD died. When I enquired about the backup system, I was told daily to the same disk and monthly...
DCS is Data Centre Solutions, if I remember rightly. Basically it's Dell servers for customers big enough to dictate specs if they choose to. They may dictate that they want a certain configuration node-wise, and that includes differences like no mezzanine card, a different BIOS, etc. The only real way to tell is...
To be honest I wouldn't touch one of the DCS 6005s. So much drama, and the processors are so outdated now.
The C6100s, as long as you don't get a (highly different) DCS model, are fairly well-explored and solid machines. They are getting a bit pricey though.
I'm running a pair of C6100s. One...
I'm sure some nodes are different, DCS models in particular. Sounds like he got a server with nodes from different servers in it. They should be able to coexist, but they likely came from two different C6100s at some point.