Importance: Medium
Error «Blocked by Robots.txt» Description
This issue indicates URLs that are disallowed from crawling in the robots.txt file.
The Importance of the Problem
A robots.txt file can disallow search robots from crawling specific pages of a website. As a result, those pages will not be crawled and will not appear in search results.
Robots.txt is mostly used to block pages that are unimportant for search (e.g. internal search results, shopping cart, sign-up pages) in order to save the robot's crawling resources.
If a website contains many links pointing to URLs that are disallowed from crawling, useful pages may receive less link weight, which can lead to lower rankings and, as a result, less traffic.
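Below is a minimal sketch of how such Disallow rules behave, using Python's standard urllib.robotparser module; the rules, user agent, and example.com URLs are hypothetical and not taken from any specific website.

```python
from urllib import robotparser

# Hypothetical robots.txt blocking pages that are unimportant for search.
ROBOTS_TXT = """
User-agent: *
Disallow: /search   # internal site search
Disallow: /cart     # shopping cart
Disallow: /signup   # sign-up page
"""

parser = robotparser.RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# URLs matching a Disallow rule will not be crawled and cannot rank in search.
for url in ("https://example.com/cart", "https://example.com/blog/post-1"):
    print(url, "allowed" if parser.can_fetch("*", url) else "blocked")
```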
How to Fix the Error
Make sure important pages are allowed for crawling by search robots.
It is also recommended to reduce the number of links pointing to pages that are disallowed from crawling.
However, do not delete links that are useful for user navigation through the website (e.g. links pointing to the shopping cart or sign-up pages).
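To verify that important pages remain crawlable after editing robots.txt, a quick check like the sketch below can help; it assumes the site is https://example.com and that the listed URLs are the pages you consider important, so adjust both to your own website.

```python
from urllib import robotparser

parser = robotparser.RobotFileParser()
parser.set_url("https://example.com/robots.txt")
parser.read()  # downloads and parses the live robots.txt

# Hypothetical list of pages that must stay open to search robots.
important_urls = [
    "https://example.com/",
    "https://example.com/products/",
    "https://example.com/blog/",
]

# Any important URL reported here should be re-allowed in robots.txt.
for url in important_urls:
    if not parser.can_fetch("*", url):
        print("Blocked by robots.txt:", url)
```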