Pngout will get rid of every expendable PNG chunk unless told otherwise. Your test is interesting, but I personally believe it is completely broken and thus pointless. Looking at your batch files, you automated optipng and pngcrush with a brute-force compression routine... but with pngout you simply used the default compression method. I ran all the methods myself just in case and managed to shave 50 KB more off 0.png. That makes your comparison worthless. Based on personal experience with the program, I can assure you -I daresay "prove me wrong"- that any PNG found on the internet can be compressed better with pngout than with the two programs in your comparison. Advpng wouldn't be far behind... you could consider it if you find all this interesting.

As for why people still use pngcrush, that's beyond me. It's like web developers still using GIFs, or comic/manga readers still using CDisplay. And why it needs to be forked... well, like you said: no clue. Perhaps the only disadvantage I see is speed. But is speed really important, all things considered?
Here is a suggested correction for your pngout batch file so that it uses a "manual" brute-force compression, in case you want to redo the test properly:
Code:
:: Try all six filter strategies (/f0../f5) on images 0.png..9.png;
:: pngout keeps whichever result is smallest (same commands as spelling
:: them all out by hand, just written as a nested FOR loop).
for %%i in (0 1 2 3 4 5 6 7 8 9) do for %%f in (0 1 2 3 4 5) do pngout %%i.png /f%%f /y %%i_pngout.png
The idea is that pngout will not overwrite the existing output file if it would produce a larger one, so after all six passes the smallest result is what survives: a hand-rolled brute compression.

Gale! This overwrite routine can be used on the wiki dump. A quick script that processes the same image with all six filters guarantees maximum compression.
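A quick script along those lines could look like the sketch below. It assumes the Windows pngout build (slash-style /fN and /y options) is on PATH and that the dump is just a folder of .png files; the `_pngout` output suffix matches the batch file above, but the folder layout is a guess.

```python
# Sketch: run pngout with all six filter strategies over a folder of PNGs.
# Assumes the Windows pngout build (/fN, /y) is on PATH; the Unix build
# uses -f / -y instead.
import subprocess
from pathlib import Path

def pngout_commands(src: Path) -> list[list[str]]:
    """Build the six pngout invocations (filters 0-5) for one image."""
    out = src.with_name(src.stem + "_pngout.png")
    return [["pngout", str(src), f"/f{f}", "/y", str(out)] for f in range(6)]

def brute_compress(folder: str) -> None:
    """Run every filter on every PNG in the folder, in order."""
    for src in sorted(Path(folder).glob("*.png")):
        for cmd in pngout_commands(src):
            # After the first pass, pngout only replaces the output
            # if the new attempt is smaller, so the best result survives.
            subprocess.run(cmd)
```

The same trick as the batch file, just not hard-coded to ten filenames.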
PS: X-Fi6: Bionic Commando. Very cool!
