
JavaScript and Responsive Web Design

One part of building smartphone-optimized sites that requires careful consideration is the use of JavaScript to alter the rendering and behavior of the site on different devices. Typical uses of JavaScript include deciding which ad or which image resolution variant to show on the page.

This page describes different approaches to using JavaScript and how they relate to Google's recommendation of using responsive web design.

Common configurations

Three popular implementations of JavaScript for smartphone-optimized sites are:

  • JavaScript-adaptive: In this configuration, all devices are served the same HTML, CSS, and JavaScript content. When the JavaScript is executed on the device, the rendering or behavior of the site is altered. If a website requires JavaScript, this is Google's recommended configuration.
  • Combined detection: In this implementation, the website uses both JavaScript and server-side detection of device capabilities to serve different content to different devices.
  • Dynamically-served JavaScript: In this configuration, all devices are served the same HTML, but the JavaScript is served from a URL that dynamically serves different JavaScript code depending on the device's user-agent.

Let's look at each of these configurations in detail.

JavaScript-adaptive

In this configuration, a URL serves the same content (HTML, CSS, JavaScript, and images) to all devices. Only when the JavaScript is executed on the device is the rendering or behavior of the site altered. This is similar to how responsive web design using CSS media queries works.

As an example, a page serves all devices the same HTML, which includes a <script> element requesting an external URL that serves the JavaScript. All devices requesting that URL get the same code. When executed, the JavaScript detects the device and alters something about the page, say, by including a smartphone-optimized image or ad code instead of the desktop-optimized alternatives.
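
For illustration, here is a minimal sketch of that kind of JavaScript. The element id, image file names, and the 640px breakpoint are assumptions made for the example, not part of any particular site:

    // adaptive.js: the same file is served to every device (file name is illustrative).
    // Once it runs on the device, it swaps in a smartphone- or desktop-optimized image.
    var heroImage = document.getElementById('hero'); // hypothetical element id
    if (window.matchMedia('(max-width: 640px)').matches) {
      heroImage.src = '/images/hero-small.jpg'; // smartphone-optimized variant
    } else {
      heroImage.src = '/images/hero-large.jpg'; // desktop-optimized variant
    }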

This configuration is very closely related to responsive web design, and our algorithms can detect this setup automatically. Furthermore, this configuration does not require the Vary HTTP header because the URLs of the page and its assets do not serve different content to different devices. Because of these advantages, if your website requires the use of JavaScript, this is our recommended configuration.

Combined detection

Combined detection is a setup where the server works in tandem with JavaScript on the client to detect the device's capabilities and alter the content being served.

For example, a site may choose to alter the rendering of the content based on whether the device is a desktop or a smartphone. In this case, the website can include JavaScript that detects the screen dimensions and sends them to the server, which alters the code sent to the device. Typically, the JavaScript stores the detected device capabilities in a cookie that the server reads on subsequent visits from the same device.
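
As a rough sketch of the client-side half, the script below records the screen dimensions in a cookie for the server to read on later requests. The cookie name is made up for the example:

    // A minimal sketch of the detection step (the cookie name is hypothetical).
    // On subsequent requests the server can read this cookie and adjust the HTML it returns.
    document.cookie = 'device-dimensions=' + window.screen.width + 'x' +
        window.screen.height + '; path=/';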

Given that the server returns different HTML to different user-agents, combined detection is considered a type of dynamic serving configuration. The details are covered in full in our dynamic serving documentation; briefly, the website should send the "Vary: User-agent" HTTP response header for any URL that serves different HTML content to different user-agents.
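
For example, the response for such a URL could carry headers like the following (an illustrative response, not output from any particular server):

    HTTP/1.1 200 OK
    Content-Type: text/html
    Vary: User-agent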

Dynamically-served JavaScript

In this configuration, all devices are served the same HTML, which includes a <script> element referencing an external JavaScript file whose content can differ depending on the requesting user-agent. That is, the JavaScript code is dynamically served.

In this case, we recommend serving the JavaScript file with the "Vary: User-agent" HTTP header. This signals to internet caches and Googlebot that the JavaScript can differ between user agents, and tells Googlebot to crawl the JavaScript file using its different user-agents.
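
The sketch below, written against Node.js's built-in http module, shows one way this could look. The port and the user-agent check are assumptions for the example, not a recommended detection method:

    // A minimal Node.js sketch: one URL serves different JavaScript per user-agent
    // and labels the response with "Vary: User-agent".
    var http = require('http');

    http.createServer(function (req, res) {
      var ua = req.headers['user-agent'] || '';
      var isSmartphone = /Mobile/i.test(ua); // simplistic check, for illustration only

      res.setHeader('Content-Type', 'application/javascript');
      res.setHeader('Vary', 'User-agent');
      res.end(isSmartphone
          ? "console.log('smartphone-optimized behavior');"
          : "console.log('desktop-optimized behavior');");
    }).listen(8080);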

Crawling requirements

Don't block Googlebot from crawling any page assets (CSS, JavaScript, and images) using robots.txt or otherwise. Being able to access these external files fully will help our algorithms detect your site's configuration and treat it appropriately.
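
As an illustration, robots.txt directives like the following would hide exactly the files our algorithms need (the directory names are hypothetical), so make sure nothing of this kind blocks your page assets:

    # Don't do this: blocking asset directories keeps Googlebot from seeing
    # the CSS, JavaScript, and images the page depends on.
    User-agent: Googlebot
    Disallow: /js/
    Disallow: /css/
    Disallow: /images/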
