How to use robots.txt

When a search engine crawls (visits) your website, the first thing it looks for is your robots.txt file. This file tells search engine crawlers which parts of your site they may and may not crawl, and therefore what they should and should not index (save and make available to the public as search results). It can also indicate the location of your XML sitemap.
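For example, you can point crawlers to your sitemap with a Sitemap directive. The URL below is a placeholder; replace it with the actual location of your sitemap:

#Tell crawlers where to find the XML sitemap (example URL)
Sitemap: https://www.example.com/sitemap.xml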

The robots.txt file belongs in your document root folder (the top-level directory of your website), so crawlers can find it at yourdomain.com/robots.txt.

You can simply create a blank file and name it robots.txt. This prevents "file not found" errors in your site logs when crawlers request it, and allows all search engines to crawl and index anything they want.
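An empty file and an explicit allow-all rule behave the same way. If you prefer to state it explicitly, an empty Disallow line permits everything:

#Allow all search engines to crawl the entire site
User-agent: *
Disallow: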

If you want to stop all search engines from crawling (and therefore ranking) your site, use this code:

#Block all search engines from crawling the entire site
User-agent: *
Disallow: /
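You can also block only part of your site while leaving the rest crawlable. The directory names below are placeholders; substitute the folders you actually want to hide:

#Block crawlers from specific folders only (example paths)
User-agent: *
Disallow: /private/
Disallow: /tmp/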

