Creating and maintaining a correct robots.txt file can sometimes be difficult. While most sites have it easy (tip: they often don't even need a robots.txt file!), finding the directives within a large robots.txt file that are, or were, blocking individual URLs can be like searching for a needle in a haystack. To make that easier, we're now announcing an updated robots.txt testing tool in Webmaster Tools.
Key points:

- Google has updated the robots.txt testing tool in Webmaster Tools to make it easier to identify and fix crawl issues.
- The tool lets you test new and existing URLs, review your robots.txt file's history, and pinpoint the problematic rules blocking Googlebot's access.
- Google recommends reviewing robots.txt files for potential errors, especially older files that might unintentionally block essential resources such as CSS or JavaScript.
- Using the robots.txt testing tool alongside other Webmaster Tools features, such as "Fetch as Google", provides a comprehensive approach to crawl optimization.
- Google's developer site and webmaster help forum offer further guidance on robots.txt files.

The updated tool is available in Webmaster Tools, under the Crawl section. It tests URLs against your robots.txt file and highlights the specific rule affecting crawlability. You can also modify the file and test your changes before uploading the updated version. The tool additionally reviews older versions of the file and identifies access issues, such as `500` server errors.

Last updated (UTC): 2014-07-01.
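For a quick local check before reaching for the online tool, Python's standard-library `urllib.robotparser` can evaluate URLs against a set of robots.txt rules. This is a minimal sketch, not Google's tester: the robots.txt content below is hypothetical, and the parser applies the first matching rule rather than Google's longest-match semantics, so results can differ for files that mix `Allow` and `Disallow`.

```python
# Minimal sketch: checking URL crawlability against robots.txt rules
# locally, using Python's standard-library urllib.robotparser.
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content, for illustration only.
ROBOTS_TXT = """\
User-agent: *
Disallow: /private/
Disallow: /tmp/
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# can_fetch() reports whether the given user agent may crawl a URL path.
for path in ("/index.html", "/private/data.html", "/tmp/cache.html"):
    allowed = parser.can_fetch("Googlebot", path)
    print(f"{path}: {'allowed' if allowed else 'blocked'}")
```

For a live site you would call `parser.set_url("https://example.com/robots.txt")` followed by `parser.read()` instead of `parse()`; either way, `can_fetch()` only tells you allowed or blocked, whereas the Webmaster Tools tester also highlights which rule produced the verdict.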