robots.txt is not valid
The robots.txt file tells search engines which pages of your site they can crawl. An
invalid robots.txt configuration can cause two general types of problems:
Not crawling public pages, causing your relevant content to show up less in search results.
Crawling private pages, exposing private information in search results.
Expand the robots.txt is not valid audit in your report to learn why your robots.txt file is invalid.
Here is an explanation of common errors:
No user-agent specified. Put a User-agent directive before your Allow or Disallow directive.
Pattern should either be empty, start with "/" or "*". Start your Allow or Disallow directive with one of these characters, or leave it empty.
Unknown directive. The directive name listed in the Content column is not part of the robots.txt specification.
Invalid sitemap URL. The sitemap URL should begin with http, https, or ftp.
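As a quick sanity check before deploying, you can parse a robots.txt file with Python's standard-library robotparser and confirm it behaves as intended. The file contents and URLs below are illustrative assumptions, not part of the audit itself:

```python
from urllib.robotparser import RobotFileParser

# A minimal, valid robots.txt: the User-agent directive comes before
# Allow/Disallow, path patterns start with "/", and the sitemap URL
# begins with https. (example.com is a placeholder domain.)
ROBOTS_TXT = """\
User-agent: *
Disallow: /private/
Allow: /

Sitemap: https://example.com/sitemap.xml
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# Public pages remain crawlable; the private section is blocked.
print(parser.can_fetch("*", "https://example.com/about"))         # True
print(parser.can_fetch("*", "https://example.com/private/data"))  # False
```

A check like this catches the first two errors above (a missing User-agent line or a pattern that doesn't start with "/" or "*" simply won't match the URLs you expect), though it won't flag unknown directives or a malformed sitemap URL.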