Icy Dock MB699VP-B - 4x NVMe U.2 SSD in 5.25" - $185


pimposh

hardware pimp
Nov 19, 2022
377
217
43
The questionable thing with all these IcyDocks is powering the drives from a single 15-pin SATA power plug.

I always thought that, by spec, a single plug can deliver up to 30W (12V rail).

4x NVMe drives under intensive writes (the SAS versions use the same power socket, too) can certainly go over 30W.

Given how tiny these connectors are, and how many we've seen melting, it's an odd choice.

I might be wrong, please correct me if I am.

(But that was the reason I went with Supermicro 5.25" caddies, powered by a regular old-school 4-pin plug, instead of IcyDocks.)
 

Cruzader

Well-Known Member
Jan 1, 2021
799
819
93
The questionable thing with all these IcyDocks is powering the drives from a single 15-pin SATA power plug.

I always thought that, by spec, a single plug can deliver up to 30W.
A single plug would be problematic, yes, but it uses 2 plugs.
Two plugs have a combined 108W max draw on 12V, plus U.2 puts some of its draw on 3.3V as well.

As for melting, you can see that on any connector if you overload it or it's low quality; it's not a specific or increased risk for the SATA connector overall.
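
A quick back-of-the-envelope sketch of that 108W figure. It assumes the commonly cited 1.5A-per-contact rating and 3 contacts per rail on a 15-pin SATA power connector; those numbers are my assumption, not something out of Icy Dock's documentation:

```python
# Rough 12V budget for the cage's two SATA power inputs.
AMPS_PER_CONTACT = 1.5    # commonly cited contact rating (assumption)
CONTACTS_PER_RAIL = 3     # 15-pin SATA power: 3 contacts each for 3.3V, 5V, 12V
V12 = 12.0
CONNECTORS = 2            # this cage uses two SATA power plugs

watts_per_connector = AMPS_PER_CONTACT * CONTACTS_PER_RAIL * V12   # 54.0 W
total_12v_watts = watts_per_connector * CONNECTORS                 # 108.0 W
print(f"{watts_per_connector:.0f} W per plug on 12V, {total_12v_watts:.0f} W across both plugs")
```

So on paper the two plugs cover quite a bit more than a single-plug 30W assumption; whether that per-contact figure is a peak or a continuous rating is a separate question.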
 

Prophes0r

Active Member
Sep 23, 2023
126
160
43
East Coast, USA
Two plugs have a combined 108W max draw on 12V
That's not how that works...

If you are using the maximum rated power for more than 1 second every 10(?), you are out of spec.[1]
Many older U.2 drives will HAPPILY chug down 25W constantly, which would be WAY out of spec for those pins.

Heck, it can even be a problem for a SINGLE drive when using the injection-molded connectors instead of the vampire-tap style ones, because those are rated for WAY less continuous power.

[1] I'd have to go look up the specifications again. I have it around somewhere. If I recall right, it's 1 second max at a 10% duty cycle. Those pins need time to dissipate heat...
 

Cruzader

Well-Known Member
Jan 1, 2021
799
819
93
That's not how that works...

If you are using the maximum rated power for more than 1 second every 10(?), you are out of spec.[1]
Many older U.2 drives will HAPPILY chug down 25W constantly, which would be WAY out of spec for those pins.

Heck, it can even be a problem for a SINGLE drive when using the injection-molded connectors instead of the vampire-tap style ones, because those are rated for WAY less continuous power.

[1] I'd have to go look up the specifications again. I have it around somewhere. If I recall right, it's 1 second max at a 10% duty cycle. Those pins need time to dissipate heat...
Feels like you are talking about loads over the 4.5A rating; there is a peak load rating that is higher than 4.5A.

To verify a PSU holds spec/requirement when troubleshooting, you draw 4.5A over some time; it would be strange for the test of the spec to itself be outside the spec.
 

Prophes0r

Active Member
Sep 23, 2023
126
160
43
East Coast, USA
Feels like you are talking about loads over the 4.5A rating; there is a peak load rating that is higher than 4.5A.
The peak load rating is 1.5A per 12V pin.
There are 3 pins each for 12V, 5V, and 3.3V.

They quoted 108W combined power on the 12V lines for 2 connectors, which is correct if we are talking about peak load.

Continuous load is supposed to be well below that.
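
To put numbers on that peak-versus-continuous gap, here is a toy comparison. The 25W-per-drive figure is the one quoted above for older U.2 drives; the continuous per-contact current is a purely hypothetical placeholder, since the real continuous rating is exactly what's in dispute:

```python
# Peak vs. a hypothetical continuous 12V budget for 4 U.2 drives on two SATA plugs.
PEAK_A_PER_CONTACT = 1.5   # peak rating discussed above
CONT_A_PER_CONTACT = 1.0   # HYPOTHETICAL continuous rating, for illustration only
CONTACTS, CONNECTORS, V12 = 3, 2, 12.0

peak_budget = PEAK_A_PER_CONTACT * CONTACTS * V12 * CONNECTORS   # 108 W
cont_budget = CONT_A_PER_CONTACT * CONTACTS * V12 * CONNECTORS   #  72 W
drive_load = 4 * 25.0                                            # 100 W sustained

print(f"peak budget {peak_budget:.0f} W, hypothetical continuous budget {cont_budget:.0f} W, "
      f"4-drive sustained load {drive_load:.0f} W")
# A sustained 100 W load fits under the 108 W peak budget, but would exceed any
# continuous rating that sits meaningfully below the peak one.
```

The exact numbers aren't the point; the point is that four write-heavy drives leave little margin once the rating drops below peak.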
 

jode

Member
Jul 27, 2021
77
59
18
It's the older v1 version with Mini-SAS HD SFF-8643, but looks like a good price.
I liked IcyDock products when they started out. However, I don't think a 10%-25% cost ratio of the enclosure to the NVMe drives it holds is a good price.
I think these are targeting the homelab/enthusiast market, and as a target consumer I'd happily fix-mount my U.2/U.3 NVMe drives and save the difference...
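
For a rough sense of what that ratio implies, using the $185 price from the thread title and the cage's 4 bays (the per-drive costs below are just what the 10%-25% figure works out to, not prices of any specific drive):

```python
# Drive spend implied by a 10%-25% enclosure-to-drives cost ratio for a $185, 4-bay cage.
ENCLOSURE_PRICE = 185.0   # price from the thread title
BAYS = 4

for ratio in (0.10, 0.25):
    total_drive_cost = ENCLOSURE_PRICE / ratio     # total drive spend implied by the ratio
    per_drive = total_drive_cost / BAYS
    print(f"at {ratio:.0%}: ~${total_drive_cost:,.0f} in drives, ~${per_drive:,.0f} per drive")
# at 10%: ~$1,850 in drives, ~$463 per drive
# at 25%: ~$740 in drives, ~$185 per drive
```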
 

Cruzader

Well-Known Member
Jan 1, 2021
799
819
93
The peak load rating is 1.5A per 12V pin.
There are 3 pins each for 12V, 5V, and 3.3V.

They quoted 108W combined power on the 12V lines for 2 connectors, which is correct if we are talking about peak load.

Continuous load is supposed to be well below that.
You can peak above the 4.5A across the 3 pins.

Loading one connector per row from the PSU with 1.5A on all pins, while the tester also monitors voltage, was the first test we did on PSUs.

Same with rated load on the 24-pin/EPS12V/PCIe connectors.

If it couldn't handle that continuously within the allowed voltage range, it was marked as defective and, unless it was a fairly high-end model, went straight into the e-waste.
 

zachj

Active Member
Apr 17, 2019
252
147
43
I've got one of their 16-bay SATA enclosures, and I don't think these are as easy to replace with zip ties.

I'm reasonably confident IcyDock's entire portfolio can be replaced these days with access to a 3D printer.
 

chlastakov

Active Member
Jan 26, 2025
178
56
28
Czech Republic
I've got one of their 16-bay SATA enclosures, and I don't think these are as easy to replace with zip ties.

I'm reasonably confident IcyDock's entire portfolio can be replaced these days with access to a 3D printer.
Sure, once 3D printers can print backplanes too :)
 

zachj

Active Member
Apr 17, 2019
252
147
43
That's what cables are for… It can't possibly be that common for a homelab user to be hot-swapping disks, so for the few times every decade it becomes necessary, I don't think it's overly burdensome to fiddle with cables and curse under one's breath that a backplane would make things easier.

Since even the cables to interface between an HBA/motherboard and a backplane are so damned expensive, there's probably little financial incentive to prefer a backplane over a fistful of 1:1 drive cables.

So if the financial and ease-of-use arguments are essentially moot, then the only remaining argument in favor of a backplane is airflow. I won't hazard a guess as to the financial or effs-given value of airflow, but I assume for the vast majority of homelab users the 3D-printed drive cage would probably be preferred.
 

wardtj

Member
Jan 23, 2015
99
30
18
48
That's what cables are for… It can't possibly be that common for a homelab user to be hot-swapping disks, so for the few times every decade it becomes necessary, I don't think it's overly burdensome to fiddle with cables and curse under one's breath that a backplane would make things easier.

Since even the cables to interface between an HBA/motherboard and a backplane are so damned expensive, there's probably little financial incentive to prefer a backplane over a fistful of 1:1 drive cables.

So if the financial and ease-of-use arguments are essentially moot, then the only remaining argument in favor of a backplane is airflow. I won't hazard a guess as to the financial or effs-given value of airflow, but I assume for the vast majority of homelab users the 3D-printed drive cage would probably be preferred.
Have you ever had 8 SSDs hanging from cables inside a case before? Sure, one or two is fine. Once you get above 2 drives you start running into all kinds of issues, such as power splitters and all the wonder they are for drive placement. Not all cases have room to mount 8-plus drives, even using the 3M tape method.

The most I ever just dangled inside a case was 4 drives. It worked, but when that MD array lost a drive it was quite a bit harder to replace. That was with regular SATA cables and a couple of power splitters.

An inexpensive Icy Dock makes life quite a bit easier. It's still cheaper than a 3D printer and having to fiddle with that. Cable management in the case is still painful with the Icy Dock, as it can be spaghetti whether from an HBA or onboard controllers. The biggest benefit is power and placement, followed by easily replacing a failed drive.

All I'm saying is it's really impractical to go beyond 4 drives in typical cases using cables and tape. Sure, if I bought a Fractal XL and some 1200W power supply maybe it's not as big of an issue, but for most users the IcyDock is helpful.
 

tunatoksoz

New Member
Sep 24, 2024
12
2
3
Not taking tariffs into account, I have been digging into some zhenloong products


They seem to have this


Maybe if it works for you...
 

zachj

Active Member
Apr 17, 2019
252
147
43
To be clear, I'm not advocating dangling drives from cables, though I've been doing that for years.

I'm advocating 3D printing a 5.25" drive cage and running cables directly to the drives after installing them in the cage.

Example:
 

jode

Member
Jul 27, 2021
77
59
18
Have you ever had 8 SSDs hanging from cables inside a case before?
In the world of consumer hardware you won't find that, as there aren't enough PCIe lanes to drive 8 drives (we're talking about an NVMe enclosure, after all).

Also, the enclosure does not eliminate the cables; it just eliminates the dangling.
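
Rough lane math behind that claim (the lane count is a ballpark figure for current consumer desktop platforms, not a number for any particular board):

```python
# Why 8 NVMe/U.2 drives don't fit on a typical consumer platform.
LANES_PER_DRIVE = 4     # NVMe/U.2 drives normally want a x4 link
DRIVES = 8

# Ballpark CPU lane count for consumer desktops (assumption; varies by platform),
# much of which is already taken by the GPU slot and onboard M.2 slots.
CPU_LANES_TYPICAL = 24

needed = LANES_PER_DRIVE * DRIVES   # 32 lanes just for the drives
print(f"need {needed} lanes for {DRIVES} drives vs roughly {CPU_LANES_TYPICAL} usable CPU lanes")
```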