Lossless and Transparency Encoding in WebP

Jyrki Alakuijala, Ph.D., Google, Inc.
Last updated: 2012-08-16

Abstract -- We present the compression and decompression characteristics of WebP with its newly added alpha support, and compare them with libpng and pngout. We compare the resource usage of the WebP encoder/decoder to that of PNG in both lossless and lossy modes. We use a corpus of 1000 randomly chosen translucent PNG images from the web, along with simpler measurements, to show variation in performance. We recompressed the PNGs in our corpus to compare WebP images against size-optimized PNGs. Our results show that WebP is a good replacement for PNG for use on the web, regarding both size and processing speed.

Introduction

Recently, transparency and lossless encoding support have been added to the WebP image format. This makes it an alternative format for translucent and lossless images, which are encoded today as PNG images. Many of the fundamental techniques used in PNG compression, such as dictionary coding, Huffman coding, and the color indexing transform, are supported in WebP as well, which results in similar speed and compression density in the worst case. At the same time, a number of new features -- such as separate entropy codes for different color channels, 2D locality of backward reference distances, and a color cache of recently used colors -- allow for improved compression density with most images.

In this work, we compare the performance of WebP to PNGs that are highly compressed using pngcrush and pngout. We recompressed our reference corpus of web images using best practices, and compared both lossless and lossy WebP compression against this corpus. In addition to the reference corpus, we chose two larger images, one photographic and the other graphical, for speed and memory use benchmarking.

We demonstrate decoding speeds faster than PNG, as well as 26% denser compression than can be achieved with today's PNG format. We conclude that WebP, with its new features, is an uncomplicated and more efficient replacement for today's PNG image format. In addition, lossy compression with alpha support opens further possibilities for speeding up websites.

Methods

Command line Tools

We use the following command-line tools to measure performance:

  1. cwebp and dwebp. These tools are part of the libwebp library (compiled from head).

  2. convert. This is a command-line tool that is part of the ImageMagick software suite (v6.5.7-8 2012-04-30 Q16).

  3. pngout (May 30 2012)

  4. Memory and user-time profiling are done with /usr/bin/time -v (GNU time 1.7).

We use the command-line tools with their respective control flags. For example, if we refer to cwebp -q 1 -m 0, it means that the cwebp tool has been invoked with the -q 1 and -m 0 flags.

Image Corpora

Three corpora were chosen:

  1. A single photographic image (Figure 1),

  2. A single graphical image with translucency (Figure 2), and

  3. A web corpus: 1000 randomly chosen PNG images with translucency, crawled from the Internet. These PNG images are optimized via convert, pngcrush, and pngout, and the smallest version of each image is used in the study.

Figure 1. Photographic image, 1024 x 752 pixels. Fire breathing by the "Jaipur Maharaja Brass Band", Chassepierre, Belgium. Author: Luc Viatour. Photo licensed under the Creative Commons Attribution-Share Alike 3.0 Unported license.

Figure 2. Graphical image, 1024 x 752 pixels. Collage of images from Google Chart Tools.

To measure the full capability of the existing format, PNG, we have recompressed all these original PNG images using six methods:

  1. Clamp to 8 bits per component: convert input.png -depth 8 output.png

  2. ImageMagick(1) with no predictors: convert input.png -quality 90 output-candidate.png

  3. ImageMagick with adaptive predictors: convert input.png -quality 95 output-candidate.png

  4. Pngcrush(2): pngcrush -brute -rem tEXt -rem tIME -rem iTXt -rem zTXt input.png output-candidate.png

  5. PNGOUT(3): pngout -f0 input.png output-candidate.png

  6. PNGOUT using the mixed predictor: pngout -f5 input.png output-candidate.png
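The selection logic behind this recompression step can be sketched in Python. The command templates mirror the six methods above, but the script as a whole is illustrative: it assumes convert, pngcrush, and pngout are installed, and the file-naming scheme is hypothetical.

```python
import os
import subprocess

# The six recompression commands from the text; {src}/{dst} are placeholders.
METHODS = [
    ["convert", "{src}", "-depth", "8", "{dst}"],
    ["convert", "{src}", "-quality", "90", "{dst}"],
    ["convert", "{src}", "-quality", "95", "{dst}"],
    ["pngcrush", "-brute", "-rem", "tEXt", "-rem", "tIME",
     "-rem", "iTXt", "-rem", "zTXt", "{src}", "{dst}"],
    ["pngout", "-f0", "{src}", "{dst}"],
    ["pngout", "-f5", "{src}", "{dst}"],
]

def pick_smallest(paths):
    """Return the candidate file with the smallest size in bytes."""
    return min(paths, key=os.path.getsize)

def recompress(src):
    """Run every method on src and keep only the smallest result."""
    candidates = []
    for i, template in enumerate(METHODS):
        dst = "%s.cand%d.png" % (src, i)
        cmd = [arg.format(src=src, dst=dst) for arg in template]
        if subprocess.call(cmd) == 0:
            candidates.append(dst)
    return pick_smallest(candidates)
```

Keeping only the smallest candidate per image is what makes the PNG baseline in the tables below a strong one.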

Results

We computed compression density for each of the images in the web corpus, relative to optimized PNG image sizes for two methods:

  1. WebP lossless (default settings) and

  2. best of WebP lossless and WebP lossy with alpha (default settings).

We sorted these compression factors, and plotted them in Figure 3.

Figure 3. PNG compression density is used as the reference, at 1.0. The same images are compressed using both lossless and lossy methods. For each image, the size ratio to the compressed PNG is computed; the size ratios are then sorted and shown for both lossless and lossy compression. For the lossy compression curve, lossless compression is chosen in those cases where it produces a smaller WebP image.
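The per-image computation behind the Figure 3 curves can be sketched as follows; the size lists in the usage comments are made-up examples, not the study's data.

```python
def sorted_ratios(webp_sizes, png_sizes):
    """Per-image WebP/PNG size ratios, sorted ascending (PNG == 1.0)."""
    return sorted(w / p for w, p in zip(webp_sizes, png_sizes))

def best_of(lossless_sizes, lossy_sizes):
    """For the lossy curve, pick the smaller of lossless and lossy per image."""
    return [min(a, b) for a, b in zip(lossless_sizes, lossy_sizes)]

# Example: sorted_ratios([50, 200], [100, 100]) -> [0.5, 2.0],
# i.e. one image halves in size while the other doubles.
```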

WebP achieves higher compression density than both libpng at maximum quality (convert) and pngout (Table 1), with encoding (Table 2) and decoding (Table 3) speeds roughly comparable to those of PNG.

Table 1. Average bits-per-pixel for the three corpora using the different compression methods.

Image Set | convert -quality 95 | pngout | WebP lossless (default settings) | WebP lossless -q 0 -m 1 | WebP lossy with alpha
photo     | 12.0                | 11.9   | 9.62                             | 10.2                    | 0.71
graphic   | 1.36                | 1.12   | 0.74                             | 0.85                    | 0.56
web       | 3.69                | 3.27   | 2.42                             | 2.70                    | 0.60
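Bits-per-pixel in Table 1 is simply the compressed file size over the pixel count; the dimensions below are an arbitrary example, not from the corpus.

```python
def bits_per_pixel(file_size_bytes, width, height):
    """Compression density: total compressed bits divided by pixel count."""
    return file_size_bytes * 8 / (width * height)

# Example: a 98,304-byte file at 1024 x 768 pixels is exactly 1.0 bpp.
```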

Table 2. Average encoding time for the three corpora, using the different compression methods.

Image Set | convert -quality 95 | pngout | WebP lossless (default settings) | WebP lossless -q 0 -m 1 | WebP lossy with alpha
photo     | 0.640 s             | 16.3 s | 3.00 s                           | 0.520 s                 | 3.25 s
graphic   | 0.260 s             | 55.9 s | 5.27 s                           | 0.040 s                 | 6.00 s
web       | 0.041 s             | 2.77 s | 0.89 s                           | 0.019 s                 | 0.96 s

Table 3. Average decoding time for the three corpora for image files that are compressed with different methods and settings.

Image Set | convert -quality 95 | pngout  | WebP lossless (default settings) | WebP lossless -q 0 -m 1 | WebP lossy with alpha
photo     | 0.130 s             | 0.130 s | 0.060 s                          | 0.060 s                 | 0.010 s
graphic   | 0.120 s             | 0.120 s | 0.010 s                          | 0.010 s                 | 0.010 s
web       | 0.038 s             | 0.040 s | 0.006 s                          | 0.006 s                 | 0.005 s

Memory Profiling

For memory profiling, we recorded the maximum resident set size as reported by /usr/bin/time -v.

For the web corpus, the size of the largest image alone defines the maximal memory use. To keep the memory measurement better defined, we use a single photographic image (Figure 1) to give an overview of memory use. The graphical image gives similar results.

We measured 10 to 19 MiB for libpng and pngout, and 25 MiB and 32 MiB for WebP lossless encoding at settings -q 0 -m 1, and -q 95 (with a default value of -m), respectively.

In a decoding experiment, convert -resize 1x1 uses 10 MiB for both the libpng and pngout generated png files. Using cwebp, WebP lossless decoding uses 7 MiB, and lossy decoding 3 MiB.
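The figures above can be extracted from /usr/bin/time -v output with a small parser; the sample line in the test mirrors GNU time's format, and the MiB conversion assumes the kbytes field is KiB.

```python
import re

def max_rss_mib(time_v_output):
    """Extract 'Maximum resident set size (kbytes)' and convert to MiB."""
    m = re.search(r"Maximum resident set size \(kbytes\): (\d+)", time_v_output)
    return int(m.group(1)) / 1024 if m else None
```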

Conclusions

We have shown that both encoding and decoding speeds are in the same range as those of PNG. Memory use increases during the encoding phase, but the decoding phase shows a healthy decrease, at least when comparing the behavior of cwebp to that of ImageMagick's convert.

The compression density is better for about 97% of the web images, suggesting that one can relatively easily change from PNG to WebP.

When WebP is run with default settings, it compresses 34% better than libpng and 26% better than pngout. This suggests that WebP is promising for speeding up image-heavy websites.
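The percentages above follow directly from the web-corpus row of Table 1 (2.42 bpp for default WebP lossless, versus 3.69 for convert and 3.27 for pngout):

```python
def savings_percent(webp_bpp, png_bpp):
    """Size reduction of WebP relative to a PNG baseline, in percent."""
    return (1 - webp_bpp / png_bpp) * 100

# savings_percent(2.42, 3.69) rounds to 34 (vs. libpng/convert),
# savings_percent(2.42, 3.27) rounds to 26 (vs. pngout).
```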

References

  1. ImageMagick

  2. Pngcrush

  3. PNGOUT release from May 30 2012 by Ken Silverman

