Best Crawlers and indexing settings for your Blogger website

How to set up the Crawlers and indexing settings on Blogger so that your website gets indexed fast. Just use the settings described below.

Searching for good Crawlers and indexing settings for your Blogspot website?

You've found them!

Today, I will show you the best settings for crawlers and indexing on Blogger, so that your website can rank faster.

There are no SEO plugins available in Blogger, so you have to use every option available to get your website indexed and ranked faster on search engines like Google and Bing.

Crawlers and indexing settings play a major role in getting your Blogger website indexed fast, so you have to set them up properly.

What is a Crawler?

A crawler, also known as a bot or spider, is run by search engines like Google and Bing. It discovers and scans webpages and then indexes them in the search engine's database.

As a result, those websites can be shown in the search results.

The main objective of these crawlers is to learn and categorize the topics and content on a webpage so that it can be retrieved whenever someone searches for it.
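
If it helps to see the idea in code, here is a rough Python sketch of a single crawl step. The URL is just a placeholder for your own blog, and real search-engine crawlers are far more sophisticated; this only shows the basic "fetch a page, collect its links" loop.

# A toy crawl step: fetch one page and collect the links it points to.
# A real crawler repeats this at enormous scale and also indexes the content.
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen

class LinkCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        # Every <a href="..."> on the page is a candidate for the next crawl.
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

page_url = "https://yourdomain.blogspot.com/"  # placeholder, use your own blog
html = urlopen(page_url).read().decode("utf-8", errors="ignore")

collector = LinkCollector()
collector.feed(html)

# The crawler would now index this page and queue the discovered links.
for link in collector.links[:10]:
    print(urljoin(page_url, link))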

What is indexing?

Indexing, or search indexing, is like adding labelled books to a huge library so that search engines can easily retrieve information from it whenever someone searches the internet.

So it is very important for your website to get indexed; otherwise, it won't be shown in the search results.
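
To make the library analogy concrete, here is a tiny Python sketch of a search index; the pages and text are made up, but the idea is the same: every word points to the pages that contain it, so a search becomes a quick lookup.

# A toy search index: map each word to the pages that contain it,
# the way a library catalogue maps labels to shelves.
pages = {
    "https://yourdomain.blogspot.com/p/about.html": "about this blog and its author",
    "https://yourdomain.blogspot.com/2024/01/setup.html": "how to set up a blogger blog",
}

index = {}
for url, text in pages.items():
    for word in text.split():
        index.setdefault(word, set()).add(url)

# A "search" is now just a dictionary lookup.
print(index.get("blogger", set()))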

Crawlers and indexing settings on Blogger

In Blogger, the Crawlers and indexing settings are left untouched by default.

It doesn't mean your blog won't get indexed, but it will take relatively more time for your blog posts to be indexed, because crawlers will crawl everything on your website, including unnecessary pages like archive pages.

So, you have to enable the Crawlers and indexing settings and customize them. This helps the crawler understand which posts and pages to crawl, and your posts will get indexed much faster.

Custom robots.txt

Log in to the Blogger dashboard, go to your blog's Settings, and scroll down to the Crawlers and indexing section. First, you need to enable the custom robots.txt option.

A robots.txt file tells the web crawler which posts or pages it can request and which pages it does not need to crawl. It also helps your website avoid too many bot requests.

Now you have to add the custom robots.txt. For that, you first need a robots.txt file; you can simply copy the example given below and paste it in after changing the domain name.

User-agent: *
Disallow: /search
Allow: /
Sitemap: https://YourDomain.com/sitemap.xml
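
If you want to double-check what these rules block before saving them, here is a small Python sketch that parses the same rules with the standard library; the URLs are placeholders for your own domain.

# Sanity-check the robots.txt rules locally before publishing them.
from urllib.robotparser import RobotFileParser

rules = """
User-agent: *
Disallow: /search
Allow: /
Sitemap: https://YourDomain.com/sitemap.xml
""".strip().splitlines()

parser = RobotFileParser()
parser.parse(rules)

# Label and search result pages live under /search, so they are blocked...
print(parser.can_fetch("*", "https://YourDomain.com/search/label/SEO"))      # False
# ...while normal posts and pages remain crawlable.
print(parser.can_fetch("*", "https://YourDomain.com/2024/01/my-post.html"))  # True

Blocking /search keeps crawlers away from the label and search result pages, which are exactly the kind of unnecessary pages mentioned earlier, while every normal post and page stays crawlable.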

Custom robots header tags

Once you enable the option for custom robots header tags, you will see three groups of options: Home page tags, Archive and search page tags, and Post and page tags. Let me explain what each of those tags means.

  • all: there are no restrictions; crawlers can crawl and index the page freely.
  • noindex: if this option is selected for a page, search engines will not index that page, so it won't appear in search results.
  • nofollow: blog posts contain both internal and external links, and crawlers normally follow every link on a webpage. If you don't want crawlers to follow the links on a page, enable this option.
  • none: it means both "noindex" and "nofollow" at once.
  • noarchive: it stops search engines from showing a cached copy of the page in search results.
  • nosnippet: it stops search engines from showing snippets of your web pages in search results.
  • noodp: it stops search engines from using titles and descriptions taken from open directory projects like DMOZ.
  • notranslate: it stops search engines from offering a translation of the page in search results.
  • noimageindex: the post itself will be crawled and indexed, but the images on it will not be indexed.
  • unavailable_after: it stops the page from being indexed after a specified date and time.
Correct settings for Blogger Robots Tags

Now that you know what each robots tag means, you can decide for yourself which ones to use. That said, here are the settings that work well for a typical blog or website; enable only these tags in each category. You can verify the result with the small sketch shown after the list.

  • Custom robot tags for the home page: all, noodp
  • Custom robot tags for archive and search pages: noindex, noodp
  • Custom robot tags for posts and pages: all, noodp
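
Once your blog is live with these settings, you can spot-check the result with the small Python sketch below. It fetches a page and prints any robots meta tag it finds; the URLs are placeholders, and the exact markup Blogger emits can vary, so treat this as a rough check rather than a definitive test.

# Rough check of the robots meta tags on a live blog.
import re
from urllib.request import urlopen

def robots_meta(url):
    html = urlopen(url).read().decode("utf-8", errors="ignore")
    # Match the tag loosely, since attribute order and quoting vary by template.
    tags = re.findall(r"<meta[^>]*robots[^>]*>", html, re.IGNORECASE)
    return tags or ["no robots meta tag found"]

# A post or page should stay indexable; a search/label page should carry noindex.
print(robots_meta("https://YourDomain.com/2024/01/my-post.html"))
print(robots_meta("https://YourDomain.com/search/label/SEO"))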

Manage custom robot tags for each Blogger post

After you change the Crawlers and indexing settings, custom robot tags will also be available in the Post settings panel on the right-hand side of the post editor, so you can easily control these settings for each individual post.

Conclusion

Crawlers and indexing settings can play a major role in your blog posts' SEO, but if you set them up wrong, they can also remove your posts completely from search results.

So, always follow the correct methods and know what you are actually doing.

I am sure if you follow the steps as described above, there won't be any problems. If you still have any questions, don't forget to comment.
