Does SSD impact unzip speeds?

OBasel

Active Member
Dec 28, 2010
We always see benchmarks on throughput. I have a few web apps that each have 4000+ individual files. Most are small code files. They take minutes to unzip on my Crucial MX100. Would it be better to get a faster SSD?
 

mrkrad

Well-Known Member
Oct 13, 2012
Set up a free ramdisk program, load the files onto the ramdisk, and compare SSD versus ramdisk to get an honest comparison!

I'd say yes, it matters; how much depends on many factors!
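
For anyone who wants to try that comparison, a minimal sketch on Linux using tmpfs as the ramdisk (the mount point, size, archive name, and SSD path below are all placeholders):

Code:
# Create a 2GB ramdisk backed by tmpfs
sudo mkdir -p /mnt/ramdisk
sudo mount -t tmpfs -o size=2g tmpfs /mnt/ramdisk

# Time the same extraction to the SSD and to the ramdisk
time unzip -q webapp.zip -d /ssd/extract-test
time unzip -q webapp.zip -d /mnt/ramdisk/extract-test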
 

sean

Member
Sep 26, 2013
I have the same SSD. Running gzip over 4389 source code files (~13 MB) takes half a second.

Benching gzip under ideal circumstances (1 GB of zeros) yields an average speed of 140 MB/s. Directly dumping to disk gets over 400 MB/s. gzip isn't zip, although the two are vaguely similar. My guess is the bottleneck is something other than the SSD itself.
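
The exact commands aren't shown above, but a rough sketch of that kind of test could look like this (the source directory, SSD path, and file names are placeholders):

Code:
# Compress a tree of small source files
time tar -cf - src/ | gzip > src.tar.gz

# Best-case gzip throughput: 1 GB of zeros, no disk writes
dd if=/dev/zero bs=1M count=1024 | gzip > /dev/null

# Raw sequential write to the SSD for comparison
dd if=/dev/zero of=/ssd/zeros.bin bs=1M count=1024 oflag=direct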
 

TuxDude

Well-Known Member
Sep 17, 2011
It depends on what type of compression algorithm you are using. A faster SSD will almost certainly make the process at least a bit faster, but if you are mostly CPU limited, swapping out the SSD won't have that much of an impact.

Just playing around on my desktop, I took sean's test from the previous post a bit further. I did nothing to flush caches to disk, so all of the tests probably ran completely in RAM (16GB in my desktop, plenty to cache these tests). All starting from an identical 1GB file containing nothing but zeros: regular gzip took 5.9 seconds to compress it down to a 996KB file. Adding the '-1' option to use gzip's fastest setting sped that up to 4.8 seconds, but also increased the size of the resulting file to 4468KB. I also tried lz4, a compression algorithm specifically designed to be fast - it got the source file down to 4020KB in only 0.46 seconds.

Back to the original question - considering that there are plenty of free alternative compression tools available, it doesn't cost anything to try a few of them before spending $ on an SSD that might not improve things as much as you'd like.
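
If anyone wants to reproduce that comparison, a rough sketch (file names here are placeholders; lz4 is usually a separate package):

Code:
# 1GB test file of zeros
dd if=/dev/zero of=zeros.bin bs=1M count=1024

time gzip -c zeros.bin > zeros.gz          # default gzip level (-6)
time gzip -1 -c zeros.bin > zeros.fast.gz  # fastest gzip level
time lz4 zeros.bin zeros.lz4               # lz4, built for speed

ls -lh zeros.gz zeros.fast.gz zeros.lz4    # compare output sizes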
 

Patrick

Administrator
Staff member
Dec 21, 2010
@OBasel - I have a STECH s480, Samsung 845DC EVO, and a 400GB S3700 in the test rig right now. Are you asking, more or less, that I take something like a XenForo upgrade pack (3000-4000 files) and simply time the unzip on each drive? From the sound of it, you have both the source file and the destination on the same SSD.
 

Patrick

Administrator
Staff member
Dec 21, 2010
@OBasel - I just tried this on the above drives and a RAID 0 set of 800GB S3500s. Using 7-zip so the timing was easy, each drive came in between 2.2 and 2.6s on the XenForo full install zip (~400 directories and 2,600 files). Using the Windows -> Extract All option sometimes took 10x as long.

As others have mentioned, save money and just use 7-zip instead.
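
For reference, the same kind of test can be scripted with the 7z command line (p7zip on Linux); the archive name and target paths below are placeholders:

Code:
# Extract the same archive to each drive and time it
time 7z x xenforo-install.zip -o/mnt/drive1/test -y > /dev/null
time 7z x xenforo-install.zip -o/mnt/drive2/test -y > /dev/null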
 

swerff

New Member
Jan 13, 2015
Try pigz.

It's built for multicore setups, unlike gzip. It'll give you an idea of whether your CPU is the bottleneck, or at least how much it's affecting the tests.
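
A quick way to check, assuming a Linux box with pigz installed (the input file name is a placeholder): if the pigz run is much faster on the same input, compression is CPU bound rather than disk bound.

Code:
# Single-threaded gzip vs. multi-threaded pigz on the same input
time gzip -c big.tar > big.tar.gz
time pigz -c big.tar > big-pigz.tar.gz

# Pin pigz to a specific thread count, e.g. 4
time pigz -p 4 -c big.tar > big-pigz4.tar.gz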
 

yanmercal

New Member
Mar 2, 2016
LZMA2 is faster with 4 threads if you compress a big file (more than 256 MB), since 7-Zip is then able to split it into blocks.
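
As a rough example (the archive and input names are placeholders), multithreaded LZMA2 can be requested explicitly with 7-Zip's command-line switches:

Code:
# Compress with LZMA2 on 4 threads; large inputs get split into blocks
7z a -m0=lzma2 -mmt=4 archive.7z bigfile.bin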

Mercal