Google indexing is very important for ranking your pages in search engines, but indexing unnecessary links can harm your blog's performance. As a blogger, you should always decide which areas of your blog Google should index. For example, when we publish an article at its original URL and also keep the same post in a category, we should avoid indexing the category URL. Let us say we have the website www.example.com and post an article about “How to increase traffic to blog”. We also keep the same blog post in a category, say “blogging”. Now we have two URLs for the same blog post.

www.example.com/how-to-increase-traffic-to-blog

www.example.com/blogging/how-to-increase-traffic-to-blog

We are actually duplicating our content under different URLs, and Google may penalize your site for such duplicate content. To avoid indexing such URLs, a robots.txt file is used to control what Google indexes.
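For the example above, a minimal robots.txt sketch that blocks the duplicate category copy (assuming the category archive lives under /blogging/) could look like this:

User-agent: *

Disallow: /blogging/

This tells every crawler not to fetch any URL whose path starts with /blogging/, so only the original post URL stays crawlable.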


Googlebot or Google Spiders

Googlebot, also called Google's spiders, is responsible for crawling your blog posts; crawling and indexing are two different steps. Googlebot regularly fetches pages from websites that are updated often, but it only accesses content that the robots.txt file allows. Its main job is to bring all accessible content back to Google for indexing, so the robots.txt file plays an important role in how your blog or website gets indexed. If you want to know what a robots.txt file looks like, here is the robots.txt file from my website below.

User-agent: *

Disallow: /cgi-bin/

Disallow: /wp-admin/

Disallow: /recommended/

Disallow: /comments/feed/

Disallow: /trackback/

Disallow: /index.php

Disallow: /xmlrpc.php

Disallow: /wp-content/plugins/

Disallow: /tag

Disallow: /category

As you can see above, in my robots.txt file I have blocked all the unnecessary links from being indexed by Google's search engine. The only things I have not listed here are posts and pages, which are exactly what I want Google to index. This way I focus Googlebot on the important areas of my website, so only my posts and pages get indexed.

How to read a Robots.txt file?

The robots.txt file above may not be easy to understand at first, so let me explain it in simple terms. In this file, you can block a single file or folder, or allow all files and folders to be accessed by Googlebot. Let us learn through the examples below.

Allow full access

User-agent: *

Allow: /

Disallow full access

User-agent: *

Disallow: /

Disallow one category

User-agent: *

Disallow: /category

Disallow tag

User-agent: *

Disallow: /tag

With the above examples, I hope you have understood how to allow and disallow Googlebot's access to links. This is the easiest way to allow or block any path of your blog through the robots.txt file.
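The two directives can also be combined. As a small sketch (the plugin folder and file name here are only placeholders), the following blocks a whole folder for all crawlers but still allows one file inside it, because Googlebot follows the more specific rule:

User-agent: *

Disallow: /wp-content/plugins/

Allow: /wp-content/plugins/example-plugin/style.css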

How should we create a robots.txt file?

Creating a robots.txt file is not difficult, but we should be very careful while creating it, because an incorrect robots.txt file can harm your website. So which areas of your website or blog should be allowed and which should be blocked from Google to increase your ranking? Let us first check which areas of a website should be blocked. Check the image below.

[Image: links on the blog (admin, login, feed, comments feed) that should be blocked from Google]

Now, as per the above image, these terms link to the pages below.

www.example.com/wp-admin

www.example.com/wp-login

www.example.com/feed

www.example.com/comments/feed

These links actually lead to the website's dashboard, from where we manage our blog, and they can be seen on most blogs or websites. But Google is not aware of what these links are and will try to index them as well, which leads to dead links in the search results. To avoid Google indexing such dead links, we should block them. As you can see in the robots.txt file above, I have disallowed these links on my website, so robots.txt will not allow Google to index them.
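As a sketch, the rules for the four links above would look like this in a robots.txt file (adjust the paths to match your own blog):

User-agent: *

Disallow: /wp-admin/

Disallow: /wp-login

Disallow: /feed

Disallow: /comments/feed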

How to test the robots.txt file?

We can test the robots.txt file in Google Webmaster Tools. To access it, you have to add and verify your website in Google Webmaster Tools. Once you have added your website, just go to

Dashboard -> Crawl -> robots.txt Tester

Here you have to paste your robots.txt file as shown below.

[Image: robots.txt file pasted into the robots.txt Tester in Google Webmaster Tools]

Once you have pasted the robots.txt file, you can see all the disallowed links, as in the image above. Now you can check whether this robots.txt file works correctly. As you can see in the image above, there is an option at the bottom to enter a URL. Just enter a disallowed path from the robots.txt file. For example, I have entered ‘Page’ in the box; the domain stays the same, www.worldinfo4all.com. For a disallowed URL, the test should show ‘Blocked’ for Googlebot, as shown below.

[Image: robots.txt Tester showing the result ‘Blocked’ for Googlebot]

For the links which we have not blocked or disallowed, such as posts, we can enter those as well to check whether Google is allowed to index them. Just enter ‘Post’ in the box and you can see that it is allowed, so Googlebot can access it and Google's spiders can index it.

[Image: robots.txt Tester showing the result ‘Allowed’ for Googlebot]
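If you also want to double-check the same rules outside of Webmaster Tools, here is a small sketch using Python's built-in urllib.robotparser module. The rules and URLs below are only placeholders; paste in your own robots.txt lines and links to test.

from urllib import robotparser

# Placeholder robots.txt rules, similar to the examples in this post
rules = [
    "User-agent: *",
    "Disallow: /category",
    "Disallow: /tag",
    "Disallow: /wp-admin/",
]

rp = robotparser.RobotFileParser()
rp.parse(rules)

# A normal post URL is allowed for Googlebot
print(rp.can_fetch("Googlebot", "http://www.example.com/how-to-increase-traffic-to-blog"))

# A category URL is blocked, matching the 'Blocked' result in the tester
print(rp.can_fetch("Googlebot", "http://www.example.com/category/blogging"))

Running this prints True for the allowed post URL and False for the disallowed category URL, which corresponds to the ‘Allowed’ and ‘Blocked’ results shown in the robots.txt Tester.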

I hope you now have an idea of how the robots.txt file works. It is a very important text file to create and edit, so be careful while using it. Please do comment and suggest what else can be done with the robots.txt file to improve your blog's performance.

Please do read the references below to enhance your knowledge about the robots.txt file.

http://www.webconfs.com/what-is-robots-txt-article-12.php


https://www.freefind.com/library/howto/robots/

http://tools.seobook.com/robots-txt/

https://www.woorank.com/en/blog/robots-txt-a-beginners-guide