This page shows you how to improve the breadth of coverage and freshness of the search results in your custom search engine.
If webpages included in your search engine have not been indexed by Google search, you can make them more discoverable to Google by submitting a Sitemap. A Sitemap is an XML file that lists pages on your website and includes information about them, such as when they were last updated, how frequently they change, and how important they are relative to one another. To learn more about Sitemaps, see Selecting Sites to Search. If you do not want to use Sitemaps, you can add individual URLs of webpages you want Custom Search to index.
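For illustration, a minimal Sitemap carrying the optional last-modified, change-frequency, and priority fields might look like the following (the URL is hypothetical):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2011-06-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```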
Google can index only webpages that can be accessed and crawled. Make sure that your site does not use a robots.txt file or robots meta tags to block Googlebot from crawling pages that you want indexed.
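For example, a robots.txt rule like the following blocks Googlebot from an entire directory; if pages you want indexed live under a path covered by such a rule, remove or narrow the rule (the path here is illustrative):

```
User-agent: Googlebot
Disallow: /private/
```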
On-demand indexing can take up to 24 hours to completely index your Sitemap or individual URLs. On-demand indexing places documents in a special index created for custom search engines; documents in the on-demand index may not appear in Google.com search results until Google.com indexing takes place for those documents.
On the Indexing tab, you can view information about the indexing status of your search engine and Sitemaps. If you have Google Site Search, this tab also includes information about your plan quota and the approximate number of pages indexed. You can request indexing for up to a certain number of webpages for the search engine; if you have upgraded your search engine to Google Site Search, you have higher limits that vary according to your account level. For more information, see the documentation on on-demand quotas. You can also use this process to submit updated PageMap data for your URLs.
As the web pages that you previously submitted for on-demand indexing move into the regular Google.com index, your on-demand indexing quota gets freed up for you to submit additional documents for on-demand indexing.
You can request immediate indexing of the following:
- Individual URLs. Google will index individual URLs you submit. This works well if you have only a small number of new or updated pages you want indexed.
- URLs linked from a specific page or Sitemap. Google will crawl a specific page or Sitemap to discover links to other pages on the same domain; any new pages it finds will be crawled and added to the index. In addition, Google will periodically revisit this page to discover and crawl new links. This method is ideal if your site lists its new or most important content on a single page—for example, the landing page of a blog or a table of contents—or if you frequently update Sitemaps to include links to your new content. You can specify multiple pages or Sitemaps for Google to use.
- URLs in a Sitemap. Sitemaps are a way to tell Google about pages on your site that it might not otherwise discover. This method is recommended if you have a large or complex site. You can also use a Sitemap to specify URLs you want removed from the custom search index. However, you will need to submit a new request every time you update your Sitemap or submit a new one.
Custom Search starts indexing the most important webpages that are not already included in the index, up to your available on-demand indexing quota. Custom Search determines the importance of the webpages from the priority tag in your Sitemap. If the number of webpages with the highest priority exceeds your allotment for on-demand indexing, Custom Search selects the highest-priority webpages with the most recent last-modified dates. As Google crawls and indexes your webpages, your custom search engine results might improve.
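For example, given a hypothetical Sitemap fragment like the following, the page with priority 1.0 would be considered for on-demand indexing before the lower-priority page, and among pages with equal priority the most recently modified ones are chosen first:

```xml
<url>
  <loc>http://www.example.com/important.html</loc>
  <lastmod>2011-06-20</lastmod>
  <priority>1.0</priority>
</url>
<url>
  <loc>http://www.example.com/archive.html</loc>
  <lastmod>2010-01-05</lastmod>
  <priority>0.3</priority>
</url>
```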
Note: Sitemaps are also submitted to Google search, which means the results may appear in Google search too, although it is not guaranteed.
You can only submit Sitemaps or URLs that belong to a website that you have verified through Google Search Console.
If you have just deleted a web page but it still appears in your custom search results and you need it removed as soon as possible, or if one of your web pages contains inappropriate content that you want to block immediately while you take it down, you can prefix the URL with a "-" symbol and submit it in the On-demand indexing using individual URLs section.
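For example, submitting an entry like the following (the URL is illustrative) requests removal rather than indexing:

```
-http://www.example.com/deleted_page.html
```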
You can also add an expires tag in your Sitemap to specify a past date on which the URL expired, and then submit the Sitemap. The following is an example from a hypothetical Sitemap:
    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      ...
      <url>
        <loc>http://www.example.com/expired.html</loc>
        <expires>2011-06-21</expires>
      </url>
      <url>
        <loc>http://www.example.com/also_expired.html</loc>
        <expires>2011-06-21T18:00:00+00:00</expires>
      </url>
    </urlset>
Then, assuming that you have verified that you own www.example.com, you can go to your search engine control panel's Indexing tab and submit your updated Sitemap. http://www.example.com/expired.html will be removed from your custom search engine within 24 hours and will stop showing in your Custom Search results.
Note that this method of removing URLs from custom search has the following limitations:
- It will not affect Google search. If users go to google.com to search for the deleted content, they may still get the deleted URLs.
- You have a limited on-demand removal quota. If you have more URLs to remove from the index than your available on-demand quota allows, only as many URLs as the available quota permits will be considered for removal. If you submit multiple removal requests, requests in later submissions are honored first.
- Your removal request will be valid for at most 60 days from the last time it was submitted.
- If you submit removal requests through a Sitemap, the expired URL must not be removed from the Sitemap for the next 60 days. If you want to revert the removal, remove the expires tag and resubmit the Sitemap.
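If you maintain your Sitemap programmatically, the expires workflow above can be automated. The following is a minimal sketch using only the Python standard library; the helper name and URLs are hypothetical, and the tag name and namespace follow the Sitemap example above:

```python
# Sketch: add an <expires> tag to a matching <url> entry in a sitemap,
# using only Python's standard library (xml.etree.ElementTree).
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
# Register the sitemap namespace as the default so output has no prefixes.
ET.register_namespace("", NS)

def mark_expired(sitemap_xml: str, url: str, date: str) -> str:
    """Return the sitemap with an <expires> tag added to the entry for `url`."""
    root = ET.fromstring(sitemap_xml)
    for entry in root.findall(f"{{{NS}}}url"):
        loc = entry.find(f"{{{NS}}}loc")
        if loc is not None and loc.text == url:
            expires = ET.SubElement(entry, f"{{{NS}}}expires")
            expires.text = date
    return ET.tostring(root, encoding="unicode")

sitemap = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://www.example.com/expired.html</loc></url>
</urlset>"""

print(mark_expired(sitemap, "http://www.example.com/expired.html", "2011-06-21"))
```

After running this, the updated Sitemap (now containing the expires tag) can be submitted from the control panel's Indexing tab as described above.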