How to Enable Custom robots.txt Content in Blogger (Updated)

Add Custom robots.txt File in Blogger
If you want to grow your audience but have no idea how to customize your blog's search preferences on Blogger, this guide is for you. Even if you are a good article contributor and visitors are fully satisfied, your blog's traffic can suffer unless your robots.txt file is set up as it should be. We don't know what kind of traffic you get or prefer, but if you want organic traffic, it's necessary to improve your blog's SEO so that search engines can crawl your blog posts.



Please note: An incorrect robots.txt file can make it difficult for search engine spiders to index your content, so your pages may seldom appear in search results. If you are unsure how to block URLs with a robots.txt file, read this post at least once completely before you put anything into action. We strongly advise using this feature with caution.

What is Robots.txt? How a robots.txt File Blocks Your URLs

In simple terms, a robots.txt file is a plain text file, as its name implies. It tells search engine spiders and other crawlers which parts of your blog or website they are allowed to access. For example, as a blogger you would like certain pages and posts to be crawled and indexed, such as your post pages and static pages. But there is no need to index pages like your blog's demo page, label pages, or other extraneous pages. In short, a robots.txt file helps you prevent irrelevant pages from appearing in search engines. Google itself uses a custom robots.txt file, and it's accessible via http://www.google.com/robots.txt

You may ask how a robots.txt file helps you block irrelevant URLs from getting crawled and indexed by search engine spiders. This is one of the most frequently asked questions on SEO forums. When search engine spiders land on your blog or blog post, the robots.txt file is the first thing they read. Blogger automatically creates its own robots.txt file, which may still leave your label and demo pages open to crawling and indexing. By using the custom robots.txt feature properly, you'll be able to tell search engine spiders which URLs should be indexed and which shouldn't be.

Major Keywords Included in a Robots.txt File

User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /

Sitemap: 

User-agent: Mediapartners-Google — Mediapartners-Google is the AdSense robot. It crawls and indexes your blog or website so that Google AdSense can display advertisements matching your blog's niche. If you don't display AdSense ads or aren't using AdSense, you don't need this option: delete these lines until your AdSense application gets approved.
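
For example, a non-AdSense blog's file might simply look like the sketch below; the sitemap URL is a placeholder for your own:

User-agent: *
Disallow: /search
Allow: /

Sitemap: http://your-blog.com/feeds/posts/default?orderby=UPDATED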

User-agent: * — The asterisk is a wildcard: it means the rules that follow apply to every crawler, not just one specific robot. As a newbie you don't have to worry about this command, because it is never going to hurt your blog in any manner. Simply put, most crawlers identify themselves by name when they visit your blog, and you can see those names in your Google Analytics account.
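
To make the wildcard concrete, here is a minimal sketch of how user-agent groups work; the Googlebot group is purely illustrative, since a named group overrides the * group for that robot:

# These rules apply only to Google's main crawler
User-agent: Googlebot
Disallow: /search

# These rules apply to every other crawler
User-agent: *
Disallow: /search
Allow: /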

Disallow: /search — You'll see the word /search next to the robots.txt keyword Disallow. This tells search engine spiders not to crawl or index any URL on your blog or website that starts with /search, which covers label and search result pages. Likewise, if you add a specific URL path next to the keyword Disallow, Google will no longer index that irrelevant URL. The keyword Allow permits Google to crawl the rest of your URLs; remember that adding "/" (Allow: /) lets Google crawl and index your homepage too.
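
As a quick sketch, here is how those keywords combine; the demo-page path is hypothetical:

User-agent: *
Disallow: /search        # blocks /search?q=... and /search/label/... pages
Disallow: /p/demo.html   # hypothetical static page you don't want indexed
Allow: /                 # everything else, including the homepage, stays crawlable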

Sitemap: — Even if you are a newbie blogger, you probably know how important adding a sitemap to Blogger is. A few months ago I covered this in a post on this blog. Search engines like Google and Yahoo use your sitemap to discover pages on your site that the search bots may have otherwise missed during regular crawling. If your blog contains fewer than 25 posts, the robot will index your posts by default. But if your blog includes more than 25 posts, don't miss our most popular tutorial about adding a Google Sitemap to your blog.
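
For reference, the difference is just the feed URL you put after the Sitemap: keyword; your-blog.com is a placeholder:

# Default post feed: lists roughly the 25 most recent posts
Sitemap: http://your-blog.com/feeds/posts/default?orderby=UPDATED

# Extended feed: covers up to your first 500 posts
Sitemap: http://your-blog.com/atom.xml?redirect=false&start-index=1&max-results=500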

Adding a Custom robots.txt File to Blogger

As I mentioned above, if you're not a Google AdSense publisher or user, I recommend you use the Google XML Sitemap Generator to generate a sitemap for your blog. It is a Digital Inspiration project offered by Amit Agarwal, the legend among Indian tech bloggers. Don't worry about how many posts you've written, because this XML Sitemap Generator accounts for them automatically. But don't forget to regenerate your XML sitemap once your blog contains more than 500 published posts.

Enable Custom robots.txt content via XML Sitemap Generator


Using Google AdSense? Well, if you are using AdSense ads on your blog, use the robots.txt file below instead of the one above. Remember that it keeps the AdSense robot, so it can crawl your site often and display relevant ads. I have also attached screenshots below for better understanding.

User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /

Sitemap: http://your-blog.com/feeds/posts/default?orderby=UPDATED

  • Sign into your Blogger account and choose your blog
  • Settings — Search preferences — Crawlers and indexing
  • Click the Edit link next to Custom robots.txt

[Screenshot: adding a custom robots.txt file in Blogger]


  • Enable custom robots.txt content by checking Yes
  • Copy and paste the above code in that empty box

[Screenshot: enabling custom robots.txt content]

Final Guidelines Before You Save Changes

Block Irrelevant Posts or Pages: You can block irrelevant posts or pages from being crawled and indexed by Google with the keyword "Disallow". To block a particular post, simply copy that post's URL without your blog's homepage URL. The keyword will then be Disallow: /yyyy/mm/post-url.html. If you're still in doubt, see the screenshot below.


Use the same method if you are trying to block a particular static page. Simply copy the page URL (without your blog's homepage URL) and paste it next to the keyword Disallow. Remember that Blogger static page URLs do not contain the year and month (yyyy/mm), so the keyword will finally be Disallow: /p/page-url.html. Both cases appear in the list and the combined example below.

  • Blocking Particular Post eg: Disallow: /2014/11/top-10-adsense-earners.html
  • Blocking Particular Page eg: Disallow: /p/contact.html
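
For context, here is a sketch of how those Disallow lines sit inside the complete file; the blocked URLs are the same illustrative examples from the list above, and the sitemap URL is a placeholder:

User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Disallow: /2014/11/top-10-adsense-earners.html
Disallow: /p/contact.html
Allow: /

Sitemap: http://your-blog.com/feeds/posts/default?orderby=UPDATED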

Adding a Proper Sitemap: This is very important. We don't know how many posts your blog or website contains, so adjust your XML sitemap as needed before you click the Save changes button. Also, the default Blogger feed indexes only 25 posts, so if your blog contains more than 25 posts I recommend you use the following XML sitemap.

  • Sitemap: http://your-blog.com/atom.xml?redirect=false&start-index=1&max-results=500

Have 500+ posts? Well, the above XML sitemap will index only 500 posts from your blog or website. In that case, use the two sitemaps below instead of the default sitemap I added in the robots.txt file (a combined file follows the list). After adding this robots.txt file you do not need to ping your blog for each update.

  • Sitemap: http://your-blog.com/atom.xml?redirect=false&start-index=1&max-results=500
  • Sitemap: http://your-blog.com/atom.xml?redirect=false&start-index=501&max-results=500
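
If it helps, here is a sketch of the complete file with both Sitemap lines in place; the domain is a placeholder for your own:

User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /

Sitemap: http://your-blog.com/atom.xml?redirect=false&start-index=1&max-results=500
Sitemap: http://your-blog.com/atom.xml?redirect=false&start-index=501&max-results=500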

Once you have done all these things, click Save changes and you're absolutely done! Finally, it's time to check whether your changes were applied to the robots.txt file. Open a new tab in your browser, add robots.txt to your blog URL (eg: http://labstrikes.blogspot.com/robots.txt) and press Enter. You should see a screen similar to the screenshot below, showing the code that you are currently using in your custom robots.txt file.


I have simplified this tutorial as much as possible. I hope you read this article at least once before you put it into action, since Blogger itself warns that incorrect use of this feature may affect your search results. So if you have any doubts, don't hesitate to mention them in a comment. It's our pleasure to help our readers :)

See also: Top Ping Service Site List Ever to Index your Content Faster