interesting, not earth-shattering
Given the hardware requirements, the fact that it's apparently CPU-bound, and the fact that it looks at groups of images, the likely conclusion is that they're just setting a larger window on how far back the compressor looks for similar data.
You can test this by compressing a large number of nearly identical files. Put one file per archive and you get a certain level of compression; put multiple files in the same archive and you'll probably get a noticeably better ratio (assuming each file is smaller than the look-behind window), because the compressor can back-reference the earlier copies. See the sketch below.
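A minimal sketch of that experiment in Python, using the standard zlib module (whose look-behind window is only 32 KB, so the test blobs are kept well under that; the 10 blobs and 8 KB size are arbitrary choices):

    import os
    import zlib

    # Build ten ~8 KB blobs that differ only in a short header.
    base = os.urandom(8 * 1024)
    blobs = [b"file-%02d:" % i + base for i in range(10)]

    # One blob per "archive": each compression starts with an empty
    # window, so the redundancy between blobs is invisible.
    separate = sum(len(zlib.compress(b, 9)) for b in blobs)

    # All blobs in one "archive": after the first blob, every later one
    # is mostly back-references into the window, so it compresses far
    # better.
    together = len(zlib.compress(b"".join(blobs), 9))

    print("separate archives:", separate, "bytes")
    print("single archive:   ", together, "bytes")

Since the base blob is random (incompressible on its own), nearly all the savings in the single-archive case come from the window seeing earlier copies, which isolates the effect being claimed.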
You can also see this in action if you use pngout or kzip (see http://advsys.net/ken/utils.htm ), which perform a far more exhaustive search over the compressor's parameter space than normal PNG- or zip-producing apps do. The output you get is (usually) smaller, and is a standard png/zip file to boot.