Author: Susie Sahim, Web Designer and Google Doodler
Recommended skills: Basic image manipulation
When you optimize every line of code for your website, don't forget about your static content - including images. Simple improvements can drastically decrease your download size, without diminishing the site's quality.
Here are a few tips to help you make your web graphics load faster.
Crop out excess white space
Sometimes you have extra space or padding around graphics so that they don't touch accompanying text or web page elements. Instead, crop out that space and use CSS to create the padding around the graphic.
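For instance, here is a minimal sketch of that workflow using Python with the Pillow imaging library; the filenames, the white-background assumption, and the 12px padding value are all illustrative:

    from PIL import Image, ImageChops

    img = Image.open("logo.png").convert("RGB")

    # Find the bounding box of everything that differs from a solid white
    # canvas, i.e., the smallest rectangle that contains the actual artwork.
    background = Image.new("RGB", img.size, (255, 255, 255))
    bbox = ImageChops.difference(img, background).getbbox()

    if bbox:
        img.crop(bbox).save("logo-cropped.png")

    # Then recreate the breathing room in CSS rather than in the image:
    #   img.logo { padding: 12px; }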
Use the best file format
For images containing flat illustrations or artwork, use the 8-bit PNG or GIF format and reduce the number of colors in the palette. Some image editors, such as Photoshop, let you save an image for the web and fine-tune its settings. By reducing the color palette from 256 colors to something like 32, you greatly reduce the size of the file. The fewer colors the image has, the smaller the file.
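As a rough illustration, the same palette reduction can be done with Python and Pillow (the 32-color target and the filenames are just examples):

    from PIL import Image

    img = Image.open("illustration.png")

    # Convert to an 8-bit palettized image, letting Pillow choose an
    # adaptive 32-color palette instead of the default 256 colors.
    reduced = img.convert("P", palette=Image.ADAPTIVE, colors=32)
    reduced.save("illustration-32.png", optimize=True)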
For very detailed, colorful artwork or for photographs, JPG and 24-bit PNG are typically used because they support a much larger color palette. While 24-bit PNG gives superior image quality, it comes at the price of a larger file. When you can, use JPG instead and adjust the quality setting to compress the image as much as possible within your desired tolerance for image quality.
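One way to find that sweet spot is to save the image at several quality settings and compare the resulting sizes; this Pillow sketch (with illustrative filenames and quality values) prints the size of each candidate:

    import os
    from PIL import Image

    img = Image.open("photo.png").convert("RGB")

    # Write the photo at several JPG quality levels and report each file
    # size; pick the lowest quality that still looks acceptable to you.
    for quality in (90, 75, 60, 40):
        out = "photo-q%d.jpg" % quality
        img.save(out, "JPEG", quality=quality)
        print(out, os.path.getsize(out), "bytes")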
To compare and contrast, here are the file sizes of the same example graphic saved in various formats:
JPG, quality 60 - 32 KB
PNG-8, 256 colors - 37 KB
GIF, 256 colors - 42 KB
PNG-24 - 146 KB
Also note that JPG has an option called "Progressive" mode. This option encodes the image in multiple scans of increasing detail, so a rough version of the image appears on screen quickly and sharpens as the rest downloads. However, it can also increase the overall size of the file.
PNG has a similar feature called "Interlaced". You may want to turn this feature off so that the full image downloads more quickly.
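To judge whether progressive mode pays off for a particular image, you can write both encodings at the same quality and compare sizes, as in this Pillow sketch (filenames and quality are illustrative):

    import os
    from PIL import Image

    img = Image.open("photo.jpg").convert("RGB")

    img.save("baseline.jpg", quality=60)  # single top-to-bottom scan
    img.save("progressive.jpg", quality=60, progressive=True)  # refining scans

    print("baseline:   ", os.path.getsize("baseline.jpg"), "bytes")
    print("progressive:", os.path.getsize("progressive.jpg"), "bytes")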
Because the 8-bit PNG and GIF formats can produce much smaller image files, keep this in mind when creating graphics and illustrations for your site. Keep the number of colors to a minimum and favor flat graphics over photographs. This way you can create images with palettes of 16 colors or fewer, keeping the file size extremely small and fast to download.
Here are some statistics about the size, number of resources, and other such metrics of pages on the world wide web. They are collected from a sample of several billion pages processed as part of Google's crawl and indexing pipeline. In processing these pages, we take into account not only the main HTML of the page, but also all the embedded resources it references, such as images, scripts, and stylesheets.
The average web page takes up 320 KB on the wire.
Only two-thirds of the compressible material on a page is actually compressed.
In 80% of pages, 10 or more resources are loaded from a single host.
The most popular sites could eliminate more than 8 HTTP requests per page if they combined all scripts on the same host into one and all stylesheets on the same host into one.
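To illustrate the idea, combining same-host scripts can be as simple as concatenating the files at build time; this sketch uses made-up filenames and a defensive ';' separator in case a file omits its trailing semicolon:

    from pathlib import Path

    # Serve one combined script instead of three separate HTTP requests.
    scripts = ["menu.js", "tracking.js", "widgets.js"]  # illustrative names
    combined = "\n;\n".join(Path(name).read_text() for name in scripts)
    Path("combined.js").write_text(combined)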
All resources are fetched using Googlebot, which means they are subject to robots.txt restrictions. For example, some sites (such as the BBC) block CSS and JS.
Some sites may present a different view of the resources to Googlebot than to regular users. For example, until recently, Google's own servers used to serve CSS and JS uncompressed to Googlebot, while compressing them for regular user browsers.
Pages are rendered and subresources are discovered through the eye of WebKit. If a page serves resources differently for Internet Explorer or Firefox, those won't be visible here.
Sampling of pages for processing is not uniformly random or unbiased. For example, pages with higher PageRank are more likely to be included in these metrics.
Each metric is defined as follows:

Pages: Number of sample pages analyzed.
Resources: Average number of resources per page.
GETs: Average number of GETs per page. Similar to the number of resources, but also includes redirects.
Hosts: Average number of unique hostnames encountered per page.
Resources Per Host: Average number of resources per host (derived from the 'Resources' and 'Hosts' values).
Transfer Size: Average size transferred over the network per page, including HTTP headers. If resources were compressed, this uses the compressed size.
Page Size: Average uncompressed size of a page and its resources, excluding HTTP headers.
Zippable: Average uncompressed size of the compressible resources on a page, i.e., those with a Content-Type of 'text/*' or equivalent.
Unzipped: Average size of the compressible resources that were not sent compressed, i.e., the Content-Type was 'text/*' but Content-Encoding did not include 'gzip' or 'deflate'.
Zipped Ratio: Average percentage of compressible bytes that were actually compressed (derived from the 'Zippable' and 'Unzipped' values).
Images: Average number of unique images per page.
Image Bytes: Average network size of the images per page.
Scripts: Average number of external scripts per page.
Script Bytes: Average network size of the external scripts per page.
Script Savings: Average number of requests that could be saved per page if external scripts on the same host were combined.
Stylesheets: Average number of external stylesheets per page.
Stylesheet Bytes: Average network size of the external stylesheets per page.
Stylesheet Savings: Average number of requests that could be saved per page if external stylesheets on the same host were combined.
SSL Pages: Number of sample SSL (HTTPS) pages analyzed.
SSL Hosts: Average number of unique hostnames encountered per SSL page.
SSL Zippable: Average size of the compressible resources per SSL page.
SSL Unzipped: Average size of the compressible resources that were not sent compressed, per SSL page.
SSL Zipped Ratio: Average percentage of compressible bytes that were actually compressed, per SSL page (derived from the 'SSL Zippable' and 'SSL Unzipped' values).
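To make the two derived metrics concrete, here is how they follow from the base values (a small sketch; the function names are ours, not part of the published data):

    def resources_per_host(resources, hosts):
        """'Resources Per Host': the 'Resources' value divided by 'Hosts'."""
        return resources / hosts

    def zipped_ratio(zippable_bytes, unzipped_bytes):
        """'Zipped Ratio': percentage of compressible bytes sent compressed."""
        return 100.0 * (zippable_bytes - unzipped_bytes) / zippable_bytes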
Authors: Arvind Jain, Engineering Director, and Jason Glasgow, Staff Software Engineer
Every day, more than 99 human years are wasted because of uncompressed content. Although support for compression is a standard feature of all modern browsers, there are still many cases in which users of these browsers do not receive compressed content. This wastes bandwidth and slows down users' interactions with web pages.
Uncompressed content hurts all users. For bandwidth-constrained users, it takes longer just to transfer the additional bits. For broadband users, even though the bits are transferred quickly, it takes several round trips between client and server before the two can communicate at the highest possible speed, and for these users the number of round trips is the larger factor in determining page load time. Even for well-connected users, these round trips often take tens of milliseconds and sometimes well over one hundred milliseconds.
In Steve Souders' book Even Faster Web Sites, Tony Gentilcore presents data showing how much page load times increase when compression is disabled. We've reproduced, with permission, the results for the three highest-ranked sites from the Alexa top 100:
[Table: for each site, the total download size increase on first load, the page load time increase on 1000/384 Kbps DSL, and the page load time increase on a 56 Kbps modem; download size increases include 348 KB (175%) and 331 KB (126%).]
Data, with permission, from Steve Souders, "Chapter 9: Going Beyond Gzipping," in Even Faster Web Sites (Sebastopol, CA: O'Reilly, 2009), 122.
Data from Google's web search logs show that the average page load time for users receiving uncompressed content is 25% higher than for users receiving compressed content. In a randomized experiment where we forced compression for some users who would otherwise not receive compressed content, we measured a latency improvement of 300 ms. The experiment probably did not capture the full difference because users who require forced compression tend to have older computers and older software.
Why no compression?
We have found four major reasons why users do not get compressed content: anti-virus software, browser bugs, web proxies, and misconfigured web servers. The first three modify the web request so that the web server does not know the browser can decompress content. Specifically, they remove or mangle the Accept-Encoding header that is normally sent with every request.
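You can observe the effect of a stripped Accept-Encoding header directly. This sketch requests the same URL twice, once advertising gzip support and once without (the URL is a placeholder; Python's urllib sends no Accept-Encoding by default, which conveniently simulates a stripped header):

    import urllib.request

    def served_encoding(url, advertise_gzip):
        # Omitting the header mimics a proxy or anti-virus product that
        # has stripped Accept-Encoding from the request.
        headers = {"Accept-Encoding": "gzip"} if advertise_gzip else {}
        request = urllib.request.Request(url, headers=headers)
        with urllib.request.urlopen(request) as response:
            return response.headers.get("Content-Encoding", "(none)")

    url = "http://www.example.com/"
    print("Accept-Encoding: gzip ->", served_encoding(url, True))
    print("header stripped       ->", served_encoding(url, False))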
Anti-virus software may try to minimize CPU work by intercepting and altering requests so that web servers send back uncompressed content. But if the CPU is not the bottleneck, the software is not doing users any favors. Some popular anti-virus programs interfere with compression. Users can check whether their anti-virus software is interfering with compression by visiting the browser compression test page at Browserscope.org.
By default Internet Explorer 6 downgrades to HTTP/1.0 when behind a proxy, and as a result does not send the Accept-Encoding request header. The table below, generated from Google's web search logs, shows that IE 6 represents 36% of all search results that are sent without compression. This number is far higher than the percentage of people using IE 6.
[Table: for Internet Explorer 8, 7, and 6, the percentage of that browser's search results sent uncompressed and its percentage of all uncompressed search results. Data from Google web search logs.]
There are a handful of ISPs where the percentage of uncompressed content is over 95%. One likely hypothesis is that an ISP or corporate proxy removes or mangles the Accept-Encoding header. As with anti-virus software, a user who suspects an ISP is interfering with compression should visit the browser compression test page at Browserscope.org.
Finally, in many cases, users are not getting compressed content because the websites they visit are not compressing their content. The following table shows a few popular websites that do not compress all of their content. If these websites were to compress their content, they could decrease the page load times by hundreds of milliseconds for the average user, and even more for users on modem connections.
[Table: popular websites that do not compress all of their content, with the potential savings in bytes for each. Data generated using Page Speed.]
What should I do?
To reduce uncompressed content, we all need to work together.
Corporate IT departments and individual users can upgrade their browsers, especially if they are using IE 6 with a proxy. Using the latest version of Firefox, Internet Explorer, Opera, Safari, or Google Chrome will increase the chances of getting compressed content. A recent editorial in IEEE Spectrum lists additional reasons - besides compression - for upgrading from IE6.
Anti-virus software vendors can handle compression properly and stop removing or mangling the Accept-Encoding header in upcoming releases of their software.
ISPs that use an HTTP proxy which strips or mangles the Accept-Encoding header can upgrade, reconfigure or install a better proxy which doesn't prevent their users from getting compressed content.
Webmasters can use Page Speed (or other similar tools) to check that the content of their pages is compressed.
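For illustration only, the toy server below shows the server-side half of the contract: compress text content when, and only when, the client's Accept-Encoding header advertises gzip. A real deployment would enable this in the web server's configuration instead:

    import gzip
    from http.server import BaseHTTPRequestHandler, HTTPServer

    BODY = b"<html><body>" + b"compress me " * 500 + b"</body></html>"

    class GzipHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            wants_gzip = "gzip" in self.headers.get("Accept-Encoding", "")
            payload = gzip.compress(BODY) if wants_gzip else BODY
            self.send_response(200)
            self.send_header("Content-Type", "text/html")
            if wants_gzip:
                self.send_header("Content-Encoding", "gzip")
            self.send_header("Content-Length", str(len(payload)))
            self.end_headers()
            self.wfile.write(payload)

    HTTPServer(("", 8000), GzipHandler).serve_forever()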
Recommended experience: Some experience creating web pages. Basic understanding of HTML and CSS.
This article explains how to capture and analyze browser paint events using the Page Speed Activity Panel's paint snapshot feature. Page Speed is a Firebug/Firefox add-on; to use it, you will need to install Firefox, Firebug, and Page Speed. Links to install all three tools are available on the Page Speed download page.
Background: progressive rendering
Fast web pages render progressively. That is, they display their content incrementally, as it is loaded by the browser. A web page that renders progressively gives the user visual feedback that the page is loading, and gives the user the information they requested as soon as it is available. Google and Yahoo both suggest best practices to make web pages render progressively, such as putting stylesheets in the document head.
There are several additional best practices you can apply to optimize progressive rendering for most pages. A fast page should render the content visible to the user first, and render the off-screen content (that is, the content outside the current scroll region) later. It can also load and render lightweight resources such as text before heavyweight resources such as images and video.
On the other hand, some techniques are known to inhibit progressive rendering. The use of large tables, even for layout, disables progressive rendering in some browsers. Applying stylesheets late in the document, even if those stylesheets aren't needed for the initial page load, can also prevent progressive rendering.
Using Page Speed Activity to capture browser paint events
It can be difficult to determine whether a page is optimized for progressive rendering. Most pages render too quickly for the human eye to notice the individual paint events (especially when the page is loaded on a fast network connection), and it is not possible to see whether a page is rendering content in the off-screen regions.
Fortunately, as of version 3.5, Firefox supports capturing paint events. The Page Speed Activity Panel uses this feature to produce a "film strip" of page rendering activity. Each cell in the strip shows which regions of the screen were repainted (in yellow), as well as which regions were off screen (in gray) and thus not visible to the user.
Because capturing paint snapshots adds some overhead and can slow down the browser, Page Speed Activity screen snapshots are disabled by default. To enable paint snapshots, make sure Paint Snapshots (slow) is checked in the Activity Panel's drop-down options menu.
Use the Page Speed Activity Panel options menu to enable paint snapshots.
Once paint snapshots are enabled, and you begin recording events in the Activity Panel, the film strip of paint snapshots appears on the right side of the Activity Panel. Paint snapshots are drawn in the order they were captured, with the earliest snapshot at the top. You can use the scrollbar on the right of the paint pane to view all of the snapshots.
The Activity Panel, with the paint pane enabled.
Example snapshot playback
In this example, we look at the progressive rendering of a Google search results page, by playing back the paint snapshots in slow motion. These snapshots were captured using a modem connection.
To play back the example paint snapshots captured using Page Speed Activity, click the button.
In the snapshots, we see the portion of the page visible to the user in white, and the portion of the page that is not visible to the user (outside of the current browser scroll region) shaded in gray. Each snapshot shows the region of the screen that was repainted, shaded in yellow.
Notice that the visible portion of the text content of the page renders first, followed by the off-screen portion of the text content. By rendering the visible portion of the screen first, the user is provided with as much useful information as possible, as soon as possible.
After the text content on the page has finished rendering, the image content renders. Deferring the image content until after the text content has loaded and rendered allows the browser to display the text content as quickly as possible, again giving the user as much useful information as possible, as soon as possible.
Because width and height attributes are specified for all of the images on the page in the document markup, the browser does not have to reflow the page as the images are loaded. Though not strictly related to progressive rendering, specifying a width and height for images leads to a better user experience since the content of the page doesn't shift around as the page is loading.
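One quick way to audit a page for this is to scan its markup for <img> tags that lack explicit dimensions; the sketch below uses Python's standard-library HTML parser on an inline example document:

    from html.parser import HTMLParser

    class ImgDimensionChecker(HTMLParser):
        """Report <img> tags lacking explicit width and height attributes."""

        def handle_starttag(self, tag, attrs):
            if tag == "img":
                present = {name for name, _ in attrs}
                if not {"width", "height"} <= present:
                    print("img missing dimensions:", dict(attrs).get("src"))

    checker = ImgDimensionChecker()
    checker.feed('<img src="a.jpg" width="400" height="300"><img src="b.jpg">')
    # prints: img missing dimensions: b.jpg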
Finally, notice that each image itself renders progressively, so that the user begins to see the image content before the entire image has finished loading. Modern browsers render images in HTML <img> tags progressively. By contrast, many browsers do not progressively render images specified with the CSS background-image property. To enable progressive rendering of images, use an HTML <img> tag instead of a CSS background-image.