
Google Launches Zopfli To Compress Data More Densely And Make Web Pages Load Faster


Google just launched Zopfli, a new open source compression algorithm that can compress web content about 3 to 8 percent more densely (PDF) than the standard zlib library. Because Zopfli is compatible with the decompression algorithms that are already part of all modern web browsers, using Google’s new algorithm and library on a server could lead to faster data transmission speeds and lower web page latencies, which would ultimately make the web a little bit faster.

The new algorithm, which Zurich-based Google engineer Lode Vandevenne created as a 20 percent project, is an implementation of the Deflate algorithm – the same algorithm that’s also used in the ZIP and gzip file formats and the PNG image format. Zopfli’s output is compatible with zlib, but it uses a different and more effective algorithm to compress the data.
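Zopfli itself ships as a C library and command-line tool rather than as part of Python's standard library, but the compatibility principle it relies on can be illustrated with the stdlib's own zlib module: any conforming Deflate stream, however it was produced, decompresses with the same inflater. A minimal sketch:

```python
import zlib

data = b"the quick brown fox jumps over the lazy dog " * 100

# Two Deflate streams from the same input, produced with different effort.
fast = zlib.compress(data, level=1)   # quick, looser compression
dense = zlib.compress(data, level=9)  # slower, denser compression

# One decompressor handles both streams -- analogous to how browsers'
# existing inflaters handle Zopfli's denser output without modification.
assert zlib.decompress(fast) == data
assert zlib.decompress(dense) == data
```

Zopfli pushes this trade-off further than zlib's level 9: it spends far more CPU time searching for a smaller valid Deflate stream, while the decompression side stays unchanged.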

As Vandevenne writes in the announcement today, “the exhaustive method is based on iterating entropy modeling and a shortest path search algorithm to find a low bit cost path through the graph of all possible deflate representations.”

This density comes at a price, however: compressing files with Zopfli takes significantly longer (decompression times are virtually the same, though). Indeed, as Vandevenne notes, “due to the amount of CPU time required — two to three orders of magnitude more than zlib at maximum quality — Zopfli is best suited for applications where data is compressed once and sent over a network many times, for example, static content for the web.”
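That compress-once, serve-many-times workflow can be sketched with the stdlib's gzip module standing in for Zopfli's slower but denser encoder: spend the extra CPU once at build time, then serve the precompressed file on every request. The file layout and function name here are hypothetical, for illustration only.

```python
import gzip
import os
import shutil
import tempfile

def precompress(path: str) -> str:
    """Write a gzip-compressed copy of `path` next to it and return its name.

    In a real Zopfli deployment you would invoke the `zopfli` tool here
    instead; the resulting .gz file is served as-is by the web server.
    """
    gz_path = path + ".gz"
    with open(path, "rb") as src, gzip.open(gz_path, "wb", compresslevel=9) as dst:
        shutil.copyfileobj(src, dst)
    return gz_path

# Usage sketch: write a sample static asset, precompress it once,
# and verify the round trip a client's decompressor would perform.
with tempfile.TemporaryDirectory() as d:
    asset = os.path.join(d, "app.js")
    content = b"console.log('hello');\n" * 500
    with open(asset, "wb") as f:
        f.write(content)
    gz = precompress(asset)
    with gzip.open(gz, "rb") as f:
        assert f.read() == content
```

The one-time compression cost is amortized across every download, which is why the extra encoder effort pays off for static content but not for data that is compressed freshly on each request.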

Image credit: Volvo