bzip2, a standard compression utility on Linux, macOS and friends, achieves a very efficient compression ratio on low-entropy files. I just compressed a big 89-gigabyte "all-white-spaces" file. The final compressed size was about 0.7 megabytes, a compression ratio of roughly 130,000×. This is not very impressive from a mathematical point of view, since all the information could be packed into just 6 bytes (a single byte for the white space plus 5 more bytes for the total size, for a compression ratio of around 16 billion). But the important thing is that bzip2 packs data in a standard format, and just by looking at the .bz2 extension, many programs know what to do with it.
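A scaled-down sketch of the experiment, using Python's standard `bz2` module on 100 MB of spaces instead of the full 89 GB file (the ratio it prints will be smaller than the one above, since the ratio grows with the input size, but the idea is the same):

```python
import bz2

# Build a 100 MB buffer of spaces (a scaled-down stand-in for the
# 89 GB all-whitespace file from the original test).
size = 100 * 1024 * 1024
data = b" " * size

# Compress with the maximum block size, as bzip2 -9 would.
compressed = bz2.compress(data, compresslevel=9)

ratio = size / len(compressed)
print(f"original: {size} bytes, compressed: {len(compressed)} bytes")
print(f"compression ratio: about {ratio:,.0f}x")
```

Decompressing it back (`bz2.decompress(compressed)`) restores the full 100 MB, which is exactly what makes oversized archives like this interesting.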
Now the funny part. 89 gigabytes is much more than the storage capacity of most old or unattended servers and low-end cloud servers, which, curiously, are the main targets of the worms infecting the Internet. I run a personal web server, and it is a bit frustrating to see that around 99% of the visits are (were) attacks trying to find holes in some WordPress or PHP service that might be installed on the machine, or just trying their luck hunting for password files and crypto wallets.