The robots.txt file (located at www.domain.com/robots.txt) gives instructions to search engine spiders before they start to crawl your site.

It’s commonly used to block crawlers such as Googlebot from accessing certain sections or file types. You can also link to your XML sitemaps from the file.

Once you learn the basics, Disallow rules can be used to stop search engines from crawling duplicate content.
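To illustrate, here is a minimal sketch of a robots.txt file. The paths and sitemap URL are hypothetical examples, not values from this article:

```txt
# Apply these rules to all crawlers
User-agent: *

# Block a hypothetical admin section and a duplicate print version of pages
Disallow: /admin/
Disallow: /print/

# Block a file type (PDFs in this example) using a wildcard
Disallow: /*.pdf$

# Point crawlers at the XML sitemap (example URL)
Sitemap: https://www.domain.com/sitemap.xml
```

Note that `Disallow` prevents compliant crawlers from fetching those URLs; pages blocked this way can still appear in search results if other sites link to them.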
