How to Stop Search Engines From Indexing Your Site

How do you block search engines? You created your own personal home on the web, but the site keeps turning up in search engines and web directories, revealing your personal details. It is simple to stop search engines from indexing your website: add a small META tag to your web pages, or add a small text file called robots.txt to your site. Protect your privacy.

Method 1

Add a META tag to the head section of each web page you want to keep out of search results.
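The standard crawler-blocking META tag, recognized by the major search engines, looks like this; place it between <head> and </head> on every page you want hidden:

```html
<!-- noindex asks engines not to list the page;
     nofollow asks them not to follow its links. -->
<meta name="robots" content="noindex, nofollow">
```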

Method 2

Create a file named robots.txt (all lower-case) using any plain text editor, such as Notepad. Save it in the root directory of your domain to tell search bots which parts of your site they may access. Type the rules exactly as given below into the robots.txt file.

To exclude all robots from the entire server

User-agent: *
Disallow: /

To allow all robots complete access

User-agent: *
Disallow:

Or create an empty “/robots.txt” file.

To exclude all robots from part of the server

User-agent: *
Disallow: /cgi-bin/
Disallow: /tmp/
Disallow: /private/

To exclude a single robot

User-agent: BadBot
Disallow: /

To allow a single robot

User-agent: WebCrawler
Disallow:

User-agent: *
Disallow: /
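Before publishing your robots.txt, you can sanity-check the rules locally with Python's standard urllib.robotparser module. A minimal sketch, using the "allow a single robot" rules above (the URL path is just an illustrative placeholder):

```python
# Check robots.txt rules locally with the Python standard library.
from urllib.robotparser import RobotFileParser

# The "allow a single robot" example: WebCrawler is allowed everywhere,
# every other bot is blocked from the whole site.
rules = """\
User-agent: WebCrawler
Disallow:

User-agent: *
Disallow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# WebCrawler may fetch any page; other bots may not.
print(parser.can_fetch("WebCrawler", "/private/page.html"))  # True
print(parser.can_fetch("BadBot", "/private/page.html"))      # False
```

This catches typos (such as a missing slash or a fused line) before real crawlers ever read the file.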


