I try to update as soon as the updates are available, for my servers and for PCs & notebooks.
For the company notebooks I have to install BIOS updates within 6 weeks after release.
Usually there are no problems (with the BIOS update process itself or with the updated stuff in the update), but around...
Pure Storage uses open-channel SSDs: they have controllers but no flash translation layer (basically the host CPU & RAM must take care of wear leveling, GC, logical-to-physical storage mapping etc.)
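To illustrate what "the host takes care of it" means, here is a minimal, purely illustrative sketch of host-side FTL bookkeeping (hypothetical code, nothing to do with Pure Storage's actual implementation): the host keeps the logical-to-physical map and per-block erase counts that a normal SSD controller would hide from you.

```python
# Minimal host-side FTL sketch (illustrative only): the host, not the drive
# controller, tracks the logical-to-physical map and per-block wear.

class HostFTL:
    def __init__(self, num_blocks: int):
        self.l2p = {}                         # logical block -> physical block
        self.erase_counts = [0] * num_blocks  # per-block wear tracking
        self.free = list(range(num_blocks))   # unallocated physical blocks

    def write(self, logical: int) -> int:
        # crude wear leveling: always pick the least-erased free block
        self.free.sort(key=lambda b: self.erase_counts[b])
        phys = self.free.pop(0)
        old = self.l2p.get(logical)
        if old is not None:
            # the old copy becomes garbage; "erase" it and recycle it
            self.erase_counts[old] += 1
            self.free.append(old)
        self.l2p[logical] = phys
        return phys

ftl = HostFTL(num_blocks=4)
ftl.write(0)
ftl.write(0)            # rewrite remaps to a fresh block, recycles the old one
print(len(ftl.free))    # 3 free blocks remain
```

A real host-managed setup does this per flash page across millions of blocks, which is why it eats host CPU cycles and RAM.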
Did you run the BPA (Best Practice Analyzer) and look at the suggestions?
For ~6 years I had a dual-core Xeon D (2.2 GHz), 16 GB RAM, 10GbE + >150 TB hardware-RAID Windows server as a NAS, and had no problems sharing files to Windows PCs or Android tablets & Nvidia Shields running Kodi.
It's great for a 40GbE switch: low power consumption compared to Arista switches, can be "silenced" with the fae fan commands, known/good brand with a lot of documentation, and it works so far with all DAC cables and optic transceivers that I have.
"High availability" with a single switch? To me this screams single point of failure...
How many ports?
What speed?
Estimated lifetime/you want to keep it around?
New? Or used?
Budget?
Without more information everybody will just guess, and the answers won't help you or anybody else coming via...
rotational speed (7200 rpm/120 rotations per second for most drives)
number of platters (inside the HDD casing)
Higher rotational speed would increase heat and noise levels (SAS HDDs are screamers at 15k rpm and produce a lot of heat).
More platters would add weight and require more powerful...
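The rotational-speed numbers above work out like this (quick back-of-the-envelope math):

```python
# Rotational latency from spindle speed: 7200 rpm = 120 rotations per second,
# so one full rotation takes ~8.33 ms and on average the head waits half a turn.

def rotational_latency_ms(rpm: float) -> float:
    rotations_per_second = rpm / 60
    full_rotation_ms = 1000 / rotations_per_second
    return full_rotation_ms / 2  # average wait is half a rotation

print(rotational_latency_ms(7200))   # ~4.17 ms for a typical desktop drive
print(rotational_latency_ms(15000))  # 2.0 ms for a 15k SAS screamer
```

That halved latency is basically the only thing you buy with 15k rpm, and you pay for it in heat and noise.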
SAS host bus adapter or RAID controller (with HBA mode)
mainboard with a PCIe slot (x1 should be enough, but it's better to have an x8 slot)
cable(s)
Cables will depend on how you want to connect the drives (directly? with a backplane?) and what connectors your controller has.
I would "invert" that statement and say the consumer hardware manufacturers don't test their systems enough...
(My most recent experience was an Asus board that wouldn't boot with a >x4 PCIe device (HBA or 25GbE NIC) in an x8 slot and the GPU in the x16 slot. This was later fixed with a BIOS update.)
There was a front-page post about a 2U 4-node system failure a few years ago, where a power surge fried all nodes at the same time. I think STH was down for a few hours/days.
@Patrick do you have a link to that post? I remember it but can't find it with the forum search or Google (oldest post in main...
1) yes
2) the stock/vanilla versions no, OEM models maybe
3) not sure, as these cards (SAS2) came out around the same time as the (U)EFI stuff
4) "stupid" consumer stuff; check for a BIOS update (I had a problem with a Mellanox NIC in an Asus system that was solved with a newer BIOS version)
5) In...
No, Omni-Path is a proprietary protocol (based on QLogic's InfiniBand IP).
Edit: I should read the complete posts before writing my answer :D
A DAC cable should work with all supported protocols. It's possible that a switch accepts only verified (coded) cables...
Do you have a link? I never saw a switch with >40 GBit/s and gateway functionality, only a 2U appliance from Mellanox/NVIDIA with 8x dual-port CX-7*...
* NVIDIA Mellanox Skyway InfiniBand to Ethernet Gateway
Edit: added link to appliance
There is no hard definition or specification of what makes a server, a workstation, or a PC (excluding the IBM PC from the '80s).
A server can be a small-form-factor system running a small website or a rack-size system handling 150k+ transactions per second (IBM Z mainframes). Same with workstations: some...