In doing research for my podcast every week, I come across a few interesting things. Many of them affect me in some way, but most I will never use in any appreciable manner (either because I can't, or because I'm not interested). I like the stuff I can actually apply to something I already do, with a measurable effect. Every so often, I find something like that.
Enter Zopfli. This is a general-purpose compression algorithm recently released by Google that compresses about 3-8% better than competing implementations like zlib. But before you go off on some anti-Google diatribe about how they are the new Microsoft, know that Zopfli is an implementation of DEFLATE. DEFLATE is perhaps the most widely used compression algorithm; if a program can talk compressed files, chances are it can use DEFLATE. So files compressed with Zopfli can be decompressed by anything that supports DEFLATE, with no penalty. (Note: Zopfli can only COMpress, not DEcompress.) The only downside is that the compression takes (almost) forever. But that isn't much of a cost, since entire CPUs are idling all the time, and idle CPUs are the devil's workshop.
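That compatibility is easy to see for yourself. Here's a minimal sketch in Python: the standard library's zlib module speaks DEFLATE (a raw stream, with `wbits=-15`), and any raw DEFLATE stream decompresses the same way regardless of which compressor produced it. (Zopfli itself isn't in the standard library, so zlib stands in as the compressor here; the point is that the decompression side is identical.)

```python
import zlib

data = b"PNG pixels or any other payload, repeated " * 100

# Compress as a raw DEFLATE stream (wbits=-15 means no zlib/gzip wrapper).
co = zlib.compressobj(level=9, wbits=-15)
deflated = co.compress(data) + co.flush()

# Any DEFLATE decompressor can read it back. Zopfli's output is also a
# DEFLATE stream, which is why it works everywhere with no penalty.
restored = zlib.decompress(deflated, wbits=-15)

assert restored == data
print(len(data), "->", len(deflated), "bytes")
```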
So what can you do with this? One immediate and big use is web-related stuff. PNGs, for instance, are compressed with DEFLATE and are ripe candidates for Zopfli, since you generally create and compress them once, then serve them statically on websites forever. I eventually found programs that do just that: compress PNGs further with Zopfli. I grabbed a Windows version.
Great! Except now I need to actually run the thing on my PNGs, and it's a command line program. Should be simple enough, but I have so many! Maybe I can automate this a bit? Hey, there's a for loop on the command line! I just need to know its arcane incantations...
First, copy your PNGs and those EXEs into a directory, then open a command prompt (cmd.exe)* and browse there (cd C:\Users\what\ever). Finally, run this:
for /F %x in ('dir /B *.png') do ( timeout /T 1 & start "%x" /BELOWNORMAL PNGZopfli.exe %x 1000 )
Woah! A bit too much there! I'll dissect it a bit. Let's start off with the simple things.
`'dir /B *.png'` means go through the directory (`dir`), find all the PNGs (`*.png`), and list them one per line with nothing else (`/B`, for bare format).
`timeout /T 1` should be obvious: wait for one second.

`for /F %x in ('dir /B *.png') do ( ... )` means for every line of that `dir` output (`for /F`), assign that line to a variable (`%x`), and do what's inside the parentheses.

`&` is there just because I have everything on one line; it chains the two commands together.

`start "%x" /BELOWNORMAL PNGZopfli.exe %x 1000` means open a new window (`start`) with a particular title (`"%x"`, the filename for this loop run), run it at below-normal OS priority so you can still do other things (`/BELOWNORMAL`), and run a command in it (`PNGZopfli.exe %x 1000`). I noticed that when you set the priority of things really fast (even different things), something freaks out, so you have to wait a bit for the kernel to calm down a little. That's what the `timeout` is for; one second was fine for me, but your mileage may vary on slower CPUs.
This will cause several more command-line windows to appear, one every second, each compressing a PNG in that directory. You can tell which is which by the window titles; remember that `start "%x"` part? When complete (it WILL take a long time), there will be a new PNG for each old one, ending in `.zopfli.png`. See for yourself how much smaller it is.
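If cmd's arcane incantations aren't your thing, the same staggered launch can be sketched in Python. This is just an alternative sketch, not what I ran: the `PNGZopfli.exe` name and the iteration count come from the command above, and the below-normal priority flag only exists on Windows, so it falls back to nothing elsewhere.

```python
import subprocess
import time
from pathlib import Path

def zopfli_all(directory, command=("PNGZopfli.exe",), iterations="1000", delay=1.0):
    """Launch one compressor process per PNG, pausing between launches."""
    # BELOW_NORMAL_PRIORITY_CLASS mirrors start /BELOWNORMAL; it only
    # exists on Windows, so fall back to 0 (no special flags) elsewhere.
    flags = getattr(subprocess, "BELOW_NORMAL_PRIORITY_CLASS", 0)
    procs = []
    for png in sorted(Path(directory).glob("*.png")):
        procs.append(subprocess.Popen([*command, str(png), iterations],
                                      creationflags=flags))
        time.sleep(delay)  # the same pause as timeout /T 1 above
    for p in procs:
        p.wait()
```

Call it as `zopfli_all(r"C:\Users\what\ever")`; it waits for every launched process before returning.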
Feel free to adjust that 1000 to your liking. It's the iteration count: how many times Zopfli gets run over the PNG data. Fewer iterations compress faster; more iterations take longer but compress better, with diminishing returns on the time invested. I compressed about 50 megs of decently compressed PNGs (I had already run PNGOUT on them), and it pegged my i7-2600 for about 3 hours before everything finished. Because I set the priority of the compression programs so low, I could still use my machine. On average, I saved about 8% or so of space versus the PNGOUT-optimized files. Some files went down by 20% or more!
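To get a feel for that diminishing-returns curve, zlib's own effort knob (its compression level) behaves the same way. This is only an analogy, since Zopfli's iteration count isn't available through the standard library:

```python
import random
import zlib

# Some compressible but non-trivial data to stand in for image bytes.
random.seed(0)
words = [b"zopfli", b"deflate", b"png", b"google", b"iterate"]
data = b" ".join(random.choice(words) for _ in range(20000))

sizes = {level: len(zlib.compress(data, level)) for level in (1, 6, 9)}
for level, size in sizes.items():
    print(f"level {level}: {size} bytes")
# Higher effort shrinks the output, but each step buys less than the last.
```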
I have already uploaded these improved PNGs to my site. What I mostly do is save game screenshots as PNGs (which is what most of those 50 megs were), then copy, downscale, and save them to JPG. I have most every screenshot saved at 1920x1200 resolution since about the TES 4 article (one of its screenshots is saved like that, the other isn't). When the time comes that internet speeds improve for everyone (because I LOVE a fast site), I can raise the quality and resolution by re-saving the PNGs as JPGs, dropping those into a fresh backup of the site, and restoring it.
*It's not DOS, it's a command line interface to the Windows operating system. Even though it's text only and doesn't have fancy buttons, it is still Windows. It even identifies itself as such on the first line.