Overview of crawling and indexing topics

The topics in this section describe how you can control Google's ability to find and parse your content in order to show it in Search and other Google properties, as well as how to prevent Google from crawling specific content on your site.

Here's a brief description of each page. For an overview of crawling and indexing, read the Advanced guide to how Search works.

Topics
Sitemaps: Tell Google about pages on your site that are new or updated (a sample sitemap follows this list).
robots.txt: A robots.txt file tells search engine crawlers which pages or files the crawler can or can't request from your site (a sample file follows this list).
Meta tags: Google supports several special tags that govern search appearance and behavior (an example follows this list).
Crawler management
Removals
Duplicate content: Tell Google about duplicate pages on your site to avoid excessive crawling. Learn how Google detects duplicate content, how it treats it, and how it assigns a canonical page to each group of duplicates it finds (an example canonical annotation follows this list).
Site moves and changes
International and multilingual sites: If your site contains content in different languages, or different content for different locations, here's how to help Google understand your site (a sample hreflang annotation follows this list).
JavaScript content: There are some differences and limitations that you need to account for when designing your pages and applications to accommodate how crawlers access and render your content.
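
To make the Sitemaps entry concrete, here is a minimal sitemap sketch in the sitemaps.org XML format; the URL and date are placeholders, not values taken from this documentation:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <!-- One <url> entry per page you want Google to know about -->
        <loc>https://www.example.com/page.html</loc>
        <!-- Optional: date the page was last modified (W3C Datetime format) -->
        <lastmod>2024-01-15</lastmod>
      </url>
    </urlset>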
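
For the robots.txt entry, a minimal file placed at the root of the site (for example, https://www.example.com/robots.txt) could look like the following sketch; the /private/ path is hypothetical:

    # Apply these rules to all crawlers
    User-agent: *
    # Don't request anything under /private/
    Disallow: /private/

    # Optional: point crawlers at the sitemap
    Sitemap: https://www.example.com/sitemap.xml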
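
For the Meta tags entry, one commonly used tag is the robots meta tag, placed in the <head> of a page to keep that page out of the index; this is only an illustrative sketch:

    <!-- Ask crawlers not to index this page or follow its links -->
    <meta name="robots" content="noindex, nofollow">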
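
For the Duplicate content entry, a duplicate page can point to its preferred (canonical) version with a link element in its <head>; the URL here is a placeholder:

    <!-- Signal that the page at this URL is the preferred version -->
    <link rel="canonical" href="https://www.example.com/preferred-page">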
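
For the International and multilingual sites entry, alternate language versions of a page can be annotated with hreflang link elements in the <head>; the URLs and language codes below are placeholders:

    <!-- English and German versions of the same page, plus a default fallback -->
    <link rel="alternate" hreflang="en" href="https://www.example.com/en/page">
    <link rel="alternate" hreflang="de" href="https://www.example.com/de/page">
    <link rel="alternate" hreflang="x-default" href="https://www.example.com/">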