Saturday, April 28, 2012

URL Restricted by Robots.txt Error and How to Solve the Problem

URL restricted by robots.txt errors can happen for various reasons: your robots.txt file might be blocking Googlebot from accessing the directory that contains the URL, or from accessing that specific URL. This usually happens when the robots.txt file is set up in a way that prevents Googlebot from crawling the URL. In rarer cases, you might have accidentally made changes to your robots.txt file that resulted in this error.
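For example, a rule like the following would keep Googlebot out of an entire directory (this is purely illustrative, with /archive standing in for whichever directory your blocked URL happens to sit in):

User-agent: Googlebot
Disallow: /archive/

Removing the Disallow line, or narrowing it so that it no longer matches the affected URL, lets Googlebot crawl that URL again.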
 
How to Solve URL Restricted by Robots.txt Error
 
I have had this error several times, and here is the simple method I usually use to solve it. Log in to your Google Webmaster Tools account and select the site that has the URL restricted by robots.txt error. Click on Diagnostics, then choose Fetch as Googlebot. Enter the sitemap of the affected site (yourblogURL.blogspot.com/rss.xml) and click Fetch. Read more on Submitting a Sitemap to Bing and Google Webmaster.

After a few minutes you will receive a message informing you whether the fetch was successful. Then click on Submit for indexing and wait a few minutes for the result. You will receive another message telling you whether the submission was successful, with a checkmark next to the word Success.

If you are still experiencing the same problem, wait a few minutes and then re-submit. Most likely you will only have to try twice before succeeding.

To confirm that your submission was successful and that your robots.txt file is working properly, click on Site Configuration, then Crawler Access. Next to Status you should see the word Success, which signifies that your robots.txt is working properly.
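If you want to double-check from your own computer whether a given robots.txt actually lets Googlebot fetch a URL, a quick sketch like the one below can help. It uses Python's standard urllib.robotparser module; the blog and post URLs shown are placeholders you would swap for your own.

from urllib.robotparser import RobotFileParser

# Placeholder URLs -- replace with your own blog's robots.txt and the blocked page.
robots_url = "http://yourblogURL.blogspot.com/robots.txt"
page_url = "http://yourblogURL.blogspot.com/2012/04/some-post.html"

parser = RobotFileParser()
parser.set_url(robots_url)
parser.read()  # downloads and parses the robots.txt file

# True means the rules in robots.txt allow Googlebot to crawl the page.
print(parser.can_fetch("Googlebot", page_url))

If this prints False, the robots.txt file is still blocking the URL and you will keep seeing the error until the offending rule is changed.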
