It depends on what type of compression algorithm you are using. A faster SSD will almost certainly make the process at least a bit faster, but if you are mostly CPU limited, swapping out the SSD won't have much of an impact.

Just playing around on my desktop, I took sean's test from the previous post a bit further. I did nothing to flush caches to disk, so all of the tests probably ran completely in RAM (16GB in my desktop, plenty to cache these tests). Starting from an identical 1GB file containing nothing but zeros: regular gzip took 5.9 seconds to compress it down to a 996KB file. Adding the '-1' option to gzip to use its fastest setting sped that up to 4.8 seconds, but also increased the size of the resulting file to 4468KB. I also tried lz4, a compression algorithm specifically designed for speed - it got the source file down to 4020KB in only 0.46 seconds.
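If you want to try this yourself, here's roughly what the test looks like as shell commands - a sketch, assuming GNU coreutils and gzip are installed (lz4 is optional and skipped if missing), with file names made up for the example. Your timings will differ depending on your CPU.

```shell
# Create a 1GB file of zeros to use as the test input
dd if=/dev/zero of=test.img bs=1M count=1024

# Default gzip level vs. fastest level (-1)
time gzip -c test.img > test.img.gz
time gzip -c -1 test.img > test.img.fast.gz

# lz4 is built for speed; only run it if it's installed
command -v lz4 >/dev/null && time lz4 -f test.img test.img.lz4

# Compare the resulting file sizes
ls -lh test.img*
```

Note the counterintuitive result on this input: gzip's fastest level actually produces a *larger* file than the default level, so "faster" settings aren't free.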
Back to the original question - considering that there are plenty of free compression algorithms available, it doesn't cost anything to try a few of them before spending $ on an SSD that might not improve things as much as you'd like.