How to Enable Custom Robots file in Blogger blog with Pictures Tutorial
September 29, 2023 ・Topic: Blogger SEO

Dear friends, my name is ARYA. Today I am going to show you how to enable a custom robots.txt file in a Blogger blog, with a pictures tutorial.
What is the Robots.txt file?
The robots.txt file tells crawlers which pages, posts, products, and other URLs they are allowed to crawl and index for the search engine results page (SERP).
Why do we need to enable a custom robots.txt file and robots header tags on the blog?
When we create a blog, website, or online store and upload content to it, we add it to a search engine console or webmaster tool, which helps search engines like Bing, Qwant, Google, DuckDuckGo, and Startpage crawl and index our pages, posts, products, and other URLs. Without a robots.txt file, all of our content URLs, including unwanted and useless ones, will be indexed in the search engines without any restriction.
Now I will give you some reasons to enable the robots.txt file on your blog or website. Let's see some points:
We don't want to index some of our pages or posts, for example while we are still working on a particular page or post to improve its design.
We don't want to index specific pages in the search engine, like a demo page, testing pages, an admin page, etc.
We don't want to index our blog label or category URLs, like:
- aryatcity.blogspot.com/label/blogger
- aryatcity.blogspot.com/category/blogger
Now you can see how the robots.txt file helps our online assets (blog, website, online store) get crawled and indexed in the search engine. The main thing is that every crawler has only a limited time to spend on each blog or website before it automatically moves on to another one, so within that time frame the crawler crawls as many page and post URLs as it can and then visits the next site.
But if our website or blog has a lot of page, post, and product URLs, the crawler cannot crawl the whole site, so we enable the robots.txt file on our blog, website, or online store to tell the crawler which URLs it should crawl and index. For example:
You can visit the following URL in your web browser to see the settings of a live robots.txt file:
https://aryatcity.blogspot.com/robots.txt
By default, a blog on the Blogger platform lets search engines crawl all content URLs for their search engine results page (SERP), so here we enable a custom robots.txt file and header tags to tell all search engines how they should crawl and index our blog.
Custom robots.txt file explanation:-
#User-agent: Mediapartners-Google:- this line is for Google's AdSense crawler, so it can provide better ads on the Blogger blog.
#User-agent: *:- this means the rules that follow apply to all search engine crawlers, so the blog is visible to all search engines.
#Disallow: /search:- this line tells all search engines not to crawl any URL under /search, which covers label and search-result pages.
#Allow: /:- this line tells all search engines they may crawl the homepage and everything under it.
#Sitemap:- here we add our Blogger blog's sitemap URL. If you don't know what a sitemap is, I explain it in another post you can visit and check.
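A sitemap is simply an XML list of the blog's post URLs. As a minimal sketch, here is how Python's standard library can read one; the sample content below is made up, and a real Blogger sitemap lives at /sitemap.xml on your blog:

```python
import xml.etree.ElementTree as ET

# A made-up fragment in the standard sitemap format (sitemaps.org protocol).
SAMPLE_SITEMAP = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://blog-name.blogspot.com/2023/09/first-post.html</loc></url>
  <url><loc>https://blog-name.blogspot.com/2023/09/second-post.html</loc></url>
</urlset>
"""

# The sitemap format uses an XML namespace, so we must name it when searching.
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
root = ET.fromstring(SAMPLE_SITEMAP)
urls = [loc.text for loc in root.findall("sm:url/sm:loc", NS)]
print(urls)
```

Each `<loc>` entry is one post URL that the crawler is invited to visit, which is exactly why we hand the sitemap to search engines in the robots.txt file.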
Custom robots header tag explanation:-
#All:- crawlers may index and follow everything on the Blogger blog's posts and pages; this is enabled by default.
#noindex:- this option tells all search engines not to index the Blogger blog's pages and posts in search results.
#nofollow:- it tells crawlers not to follow the links on the pages.
#none:- this is equivalent to selecting both noindex and nofollow.
#noarchive:- this option removes the cached-copy link from search results.
#nosnippet:- this option removes the snippet shown for the Blogger blog's posts and pages in search results.
#noodp:- this option stops search engines from using metadata from the DMOZ directory.
#notranslate:- this option tells spiders not to offer translation of the Blogger blog's posts and pages.
#noimageindex:- this option stops search engines from indexing the images on the page.
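These header-tag choices end up as a robots meta tag in each page's HTML, which crawlers read before indexing. As a minimal sketch using Python's standard html.parser, here is how such a tag can be extracted; the sample page below is made up and only illustrates the format:

```python
from html.parser import HTMLParser

# Hypothetical sample of a page where noindex and noodp were enabled
# (the surrounding markup is made up for illustration).
SAMPLE_PAGE = """
<html>
<head>
  <title>Archive page</title>
  <meta name="robots" content="noindex, noodp">
</head>
<body>Archive listing...</body>
</html>
"""

class RobotsMetaParser(HTMLParser):
    """Collects the directives from any <meta name="robots"> tag."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "robots":
            # Split "noindex, noodp" into individual directives.
            self.directives += [d.strip() for d in attrs.get("content", "").split(",")]

parser = RobotsMetaParser()
parser.feed(SAMPLE_PAGE)
print(parser.directives)  # ['noindex', 'noodp']
```

So when you tick noindex and noodp in Blogger's settings, this is the kind of tag the blog serves, and it is what keeps the page out of search results.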
Now let's go to the Blogger blog and apply this configuration.
These are the Following Steps:-
Step 1: First, go to the Blogger blog and click the Settings option, scroll down to the Crawlers and indexing section, then turn on the Enable custom robots.txt option.

Step 2: A pop-up window now appears in the middle of the screen; paste the robots.txt code here and click the Save button.

Custom Robots.txt file Code:-
User-agent: Mediapartners-Google
Disallow:
User-agent: *
Disallow: /search
Allow: /
Sitemap: https://blog-name.blogspot.com/sitemap.xml
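If you want to double-check the rules before pasting them, Python's standard urllib.robotparser can parse the same text. This is a minimal sketch; blog-name.blogspot.com is a placeholder, as in the code above:

```python
from urllib.robotparser import RobotFileParser

# The same rules as the custom robots.txt code above.
rules = """\
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /
Sitemap: https://blog-name.blogspot.com/sitemap.xml
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# /search URLs (labels and search results) are blocked for ordinary crawlers...
print(rp.can_fetch("*", "https://blog-name.blogspot.com/search/label/blogger"))   # False
# ...normal post URLs stay crawlable...
print(rp.can_fetch("*", "https://blog-name.blogspot.com/2023/09/my-post.html"))   # True
# ...and Mediapartners-Google (the AdSense crawler) may fetch everything.
print(rp.can_fetch("Mediapartners-Google",
                   "https://blog-name.blogspot.com/search/label/blogger"))        # True
```

This confirms the behaviour described earlier: label and search pages are kept out, while the homepage and posts are open to all crawlers.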
Step 3: If you have provided all the rules in the robots.txt file, you do not need to worry about the robots header tag option, but if you're not sure, you can turn on the Custom robots header tags option and configure the settings one by one.

Step 4: First, click the Home page tags option, enable the all and noodp options, then click the Save button.

Step 5: Next, click the second option, Archive and search page tags, enable the noindex and noodp options, then click the Save button.

Step 6: Finally, click the Post and page tags option, enable the all and noodp options, then click the Save button.

Step 7: Now open the Blogger blog to check that everything works.
I hope you enjoyed this post. If you run into any problem, leave a comment.