Why Google Search Shows Robots.txt-Blocked URLs

Posted 2009 | Categories: Blogging, Google

Are you surprised to see your blocked URLs displayed in Google search results? Why did Google not follow your robots.txt directives? Why does this happen?

Matt Cutts explains in this video why this occurs:

The video explains that Google sometimes lists your blocked URLs but shows no description for them: Google has not "crawled" those URLs, yet still lists them because it found links pointing to them, thereby abiding by robots.txt. Sometimes the description is extracted from the ODP (the Open Directory Project, also known as DMOZ) rather than by actually crawling the URL. So in many cases these Google search results are just an "uncrawled URL reference."
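The distinction above, that a disallowed URL is never fetched but can still be listed, can be sketched with Python's standard `urllib.robotparser`. The rules, paths, and domain below are hypothetical:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt blocking /private/ for every crawler
rules = """User-agent: *
Disallow: /private/""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

# Googlebot may not fetch this page, so it never reads its content or
# meta tags -- but Google can still list the bare URL if other sites
# link to it.
print(parser.can_fetch("Googlebot", "https://example.com/private/page.html"))  # False
print(parser.can_fetch("Googlebot", "https://example.com/public/page.html"))   # True
```

This is why robots.txt controls crawling, not indexing: the blocked URL itself is public knowledge the moment someone links to it.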

What is the best way to stop Google from indexing your pages?
1. Add a robots meta tag to the page:
<META NAME="ROBOTS" CONTENT="NOINDEX, NOFOLLOW">

2. Use the Google remove URL tool, a webmaster tool for requesting removal of URLs from Google's index.
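Note that the noindex meta tag only works if Googlebot can actually crawl the page: a URL that is disallowed in robots.txt never has its meta tags read, which is exactly why blocked URLs can keep appearing in results. A minimal sketch (the page name is hypothetical):

```html
<!-- old-page.html: a hypothetical page you want dropped from Google's
     index. Leave it crawlable (do NOT Disallow it in robots.txt) so
     that Googlebot can fetch the page and see this tag. -->
<head>
  <meta name="robots" content="noindex, nofollow">
</head>
```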

