Cross-submissions via robots.txt on Sitemaps.org
Thursday, February 28, 2008
Last spring, the Sitemaps protocol was expanded to include the
autodiscovery of Sitemaps using robots.txt
to let us and other search engines supporting the protocol know about your Sitemaps. We
subsequently also announced support for
Sitemap cross-submissions using Google Webmaster Tools,
making it possible to submit Sitemaps for multiple hosts on a single dedicated host. So it was
only a matter of time before we took the next logical step of marrying the two and allowing
Sitemap cross-submissions using robots.txt. And today we're doing just that.
We're making it easier for webmasters to place Sitemaps for multiple hosts on a single host and
then letting us know by including the location of these Sitemaps in the appropriate robots.txt.
How would this work? Say for example you want to submit a Sitemap for each of the two hosts you
own, www.example.com and host2.google.com. For simplicity's sake, you
may want to host the Sitemaps on one of the hosts, www.example.com. For example, if
you have a Content Management System (CMS), it might be easier for you to change your robots.txt
files than to change content in a directory.
You can now exercise the cross-submission support via robots.txt (by letting us know the location
of the Sitemaps):
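1. The robots.txt for www.example.com would include:

   ```
   Sitemap: https://www.example.com/sitemap-www-example.xml
   ```

2. And similarly, the robots.txt for host2.google.com would include:

   ```
   Sitemap: https://www.example.com/sitemap-host2-google.xml
   ```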
By indicating in each individual host's robots.txt file where that host's Sitemap lives, you are
in essence proving that you own the host for which you are specifying the Sitemap. And by
choosing to host all of the Sitemaps on a single host, it becomes simpler to manage your Sitemaps.
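To make the mechanics concrete, here is a minimal Python sketch, purely illustrative and not part
of the protocol or any announced tooling, of how a crawler might read the Sitemap directives from
each host's robots.txt and flag cross-submissions. The hostnames come from the example above; the
helper function name is hypothetical.

```python
# A minimal illustrative sketch (not Google's actual crawler code): fetch each
# host's robots.txt, collect the Sitemap locations it declares, and flag the
# ones that live on a different host, i.e. cross-submissions.
from urllib.parse import urlparse
from urllib.request import urlopen


def sitemaps_from_robots(host):
    """Return the Sitemap URLs listed in https://<host>/robots.txt."""
    with urlopen(f"https://{host}/robots.txt") as response:
        text = response.read().decode("utf-8", errors="replace")
    sitemaps = []
    for line in text.splitlines():
        # The Sitemap directive is case-insensitive and sits outside any
        # User-agent group, so a simple prefix check is enough here.
        if line.strip().lower().startswith("sitemap:"):
            sitemaps.append(line.split(":", 1)[1].strip())
    return sitemaps


for host in ("www.example.com", "host2.google.com"):
    for sitemap_url in sitemaps_from_robots(host):
        # A Sitemap hosted somewhere other than the host whose robots.txt
        # lists it is a cross-submission.
        cross = urlparse(sitemap_url).netloc != host
        print(host, "->", sitemap_url, "(cross-submission)" if cross else "")
```

Because the robots.txt of each host is the one declaring the Sitemap location, the declaration
itself doubles as the proof of ownership described above.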
We are making this announcement today on
Sitemaps.org as a joint
effort. To see what our colleagues have to say, you can also check out the blog posts published
by Yahoo! and
Microsoft.
[[["Easy to understand","easyToUnderstand","thumb-up"],["Solved my problem","solvedMyProblem","thumb-up"],["Other","otherUp","thumb-up"]],[["Missing the information I need","missingTheInformationINeed","thumb-down"],["Too complicated / too many steps","tooComplicatedTooManySteps","thumb-down"],["Out of date","outOfDate","thumb-down"],["Samples / code issue","samplesCodeIssue","thumb-down"],["Other","otherDown","thumb-down"]],[],[[["\u003cp\u003eGoogle, Yahoo!, and Microsoft now support cross-submissions of Sitemaps using robots.txt, simplifying Sitemap management.\u003c/p\u003e\n"],["\u003cp\u003eWebmasters can host Sitemaps for multiple websites on a single server and reference them in each site's robots.txt.\u003c/p\u003e\n"],["\u003cp\u003eThis update allows easier management of Sitemaps, especially for those using Content Management Systems (CMS).\u003c/p\u003e\n"],["\u003cp\u003eUsing robots.txt for cross-submissions helps verify website ownership by specifying the Sitemap location.\u003c/p\u003e\n"],["\u003cp\u003eThis change is a collaborative effort announced on Sitemaps.org, with supporting information available on Yahoo! and Microsoft blogs.\u003c/p\u003e\n"]]],["Webmasters can now manage Sitemaps for multiple hosts from a single host using robots.txt. This allows indicating Sitemap locations for different hosts within each host's respective robots.txt file. For example, `www.example.com`'s robots.txt would point to its Sitemap, and `host2.google.com`'s robots.txt would point to its respective Sitemap, both hosted on `www.example.com`. This cross-submission method via robots.txt simplifies Sitemap management and verifies host ownership. This is a collaborative effort among Sitemaps.org, Yahoo!, and Microsoft.\n"],null,["# Cross-submissions via robots.txt on Sitemaps.org\n\nThursday, February 28, 2008\n\n\nLast spring, the Sitemaps protocol was expanded to include the\n[autodiscovery of Sitemaps using robots.txt](/search/blog/2007/04/whats-new-with-sitemapsorg)\nto let us and other search engines supporting the protocol know about your Sitemaps. We\nsubsequently also announced support for\n[Sitemap cross-submissions using Google Webmaster Tools](/search/blog/2007/10/dealing-with-sitemap-cross-submissions),\nmaking it possible to submit Sitemaps for multiple hosts on a single dedicated host. So it was\nonly time before we took the next logical step of marrying the two and allowing Sitemap\ncross-submissions using robots.txt. And today we're doing just that.\n\n\nWe're making it\n[easier for webmasters to place Sitemaps for multiple hosts on a single host](https://www.sitemaps.org/protocol.php#sitemaps_cross_submits)\nand then letting us know by including the location of these Sitemaps in the appropriate\nrobots.txt.\n\n\nHow would this work? Say for example you want to submit a Sitemap for each of the two hosts you\nown, `www.example.com` and `host2.google.com`. For simplicity's sake, you\nmay want to host the Sitemaps on one of the hosts, `www.example.com`. For example, if\nyou have a Content Management System (CMS), it might be easier for you to change your robots.txt\nfiles than to change content in a directory.\n\n\nYou can now exercise the cross-submission support via robots.txt (by letting us know the location\nof the Sitemaps):\n\n1. The robots.txt for `www.example.com` would include: \n\n ```\n Sitemap: https://www.example.com/sitemap-www-example.xml\n ```\n2. 
And similarly, the robots.txt for `host2.google.com` would include: \n\n ```\n Sitemap: https://www.example.com/sitemap-host2-google.xml\n ```\n\n\nBy indicating in each individual host's robots.txt file where that host's Sitemap lives you are\nin essence proving that you own the host for which you are specifying the Sitemap. And by\nchoosing to host all of the Sitemaps on a single host, it becomes simpler to manage your Sitemaps.\n\n\nWe are making this announcement today on\n[Sitemaps.org](https://www.sitemaps.org/index.html) as a joint\neffort. To see what our colleagues have to say, you can also check out the blog posts published\nby [Yahoo!](https://www.ysearchblog.com/archives/000524) and\n[Microsoft](https://blogs.msdn.com/webmaster/archive/2008/02/27/microsoft-to-support-cross-domain-sitemaps.aspx).\n\nPosted by Prashanth Koppula, Product Manager"]]