Monday, June 25, 2012

Important role of Robots.txt in SEO


Most of you might not be familiar with the file name, but robots.txt is a plain-text file kept on a website's server that lists the links or directories the site owner does not want search engines to crawl and index. There are a few things you should do to create a well-formed robots.txt for your website.
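By convention the file sits at the root of the domain, for example http://www.example.com/robots.txt (the domain here is just a placeholder). As a minimal sketch, the simplest possible file blocks nothing at all; an empty Disallow line tells every crawler it may fetch everything:

User-agent: *
Disallow: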



What you should do in Robots.txt
1.      Disallow all the directories of your site that contain duplicate versions of your content (for example print-friendly or session-based URLs), so that search engines index only one copy of each page; see the sample file after this list.
2.      Make sure that none of the important, main content of your website is disallowed from indexing, as blocking it will damage the site's overall SEO and page ranking.
3.      If you have scripts running on your site that handle sensitive information such as credit card numbers or PINs, disallow them from being crawled. Keep in mind that robots.txt only asks well-behaved crawlers to stay away; it is not a security mechanism, so sensitive data still needs proper access control.
4.      Keep the file clean and well-formed. If you want to annotate it, use the standard "#" prefix for comments, which crawlers ignore; avoid any other stray text that a search bot could misread.
5.      Go through your site and see if there are any other types of content that you need to disallow from indexing, and keep updating your robots.txt from time to time.
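Below is a minimal sketch of how these rules might look in practice. The directory names (/print/, /checkout/, /tmp/) are hypothetical placeholders, not recommendations for your specific site:

User-agent: *
# Hypothetical directory of duplicate, print-friendly pages
Disallow: /print/
# Hypothetical checkout scripts that handle sensitive data
Disallow: /checkout/
# Hypothetical temporary or scratch content
Disallow: /tmp/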

One major mistake many people make is adding an "Allow" line for every page they want indexed. The original robots.txt standard defines no Allow directive (a few crawlers support it only as an extension); everything that is not explicitly disallowed is treated as allowed by default, so listing allowed content is unnecessary.
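If you want to check how a crawler would read your rules, here is a small sketch using Python's standard urllib.robotparser module; the Disallow rule, domain, and URLs below are made-up examples, not taken from any real site:

from urllib import robotparser

# Build a parser and feed it the rules directly, as if they came from robots.txt
rp = robotparser.RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /private/",
])

# A disallowed path: the parser reports it as blocked
print(rp.can_fetch("*", "http://www.example.com/private/page.html"))  # False

# Any path that is not disallowed is allowed by default, with no "Allow" line needed
print(rp.can_fetch("*", "http://www.example.com/blog/post.html"))     # True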

This is a very important file, and mistakes in it can really damage your business, so if you are not comfortable editing it yourself, please consult an SEO specialist who can do the job for you.
