Hey Bloggers! Have you ever heard of robots.txt, or have you added your own custom robots.txt file to your Blogger blog?
In today’s article, we will discuss the custom robots.txt file in depth and learn more about its use and benefits in SEO. I will also show you how to add a custom robots.txt file in Blogger.
In Blogger it is known as Custom Robots.txt, which means you can customize this file according to your own choices and needs.
So let’s start the guide...
What is Robots.txt?
Robots.txt is a text file containing a few lines of simple directives.
It is saved on the website or blog’s server and instructs web crawlers on how to crawl and index your blog’s pages in search results.
That means you can restrict any web page on your blog from web crawlers so that it doesn’t get indexed in search engines, such as your blog’s label pages, a demo page, or any other pages that are not important to index.
Every Blogger blog has a default robots.txt file, which looks something like this:
User-agent: Mediapartners-Google
Disallow:
User-agent: *
Disallow: /search
Allow: /
Sitemap: https://www.pixelnatures.in/sitemap.xml
What Do These Directives Mean?
The robots.txt code above is divided into three sections. Let’s first discuss what each one does; after that we will learn how to add a custom robots.txt file to a Blogspot blog.
User-agent: Mediapartners-Google
This line is for the Google AdSense robot, which helps it serve better ads on your blog. Whether or not you are using Google AdSense to earn from your blog, you can simply leave it as it is.
User-agent: *
This applies to all robots, indicated by the asterisk (*). By default, our blog’s label links are restricted from being indexed by search crawlers, meaning web crawlers will not index our label page links, because of the code below.
Disallow: /search
That means any link with the keyword search right after the domain name will be ignored. See the example below, which is a link to the label page named Telecom.
http://www.pixelnatures.in/search/label/Telecom
And if we remove Disallow: /search from the robots.txt code above, crawlers will access our entire blog and index and crawl all of its content and web pages, including the label pages.
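The effect of these rules can be sketched with Python’s standard-library robots.txt parser. This is just an illustration of how a well-behaved crawler interprets the default Blogger rules; the example.com domain and post URL are placeholders.

```python
from urllib import robotparser

# Blogger's default rules for all crawlers (User-agent: *)
rules = """\
User-agent: *
Disallow: /search
Allow: /
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Label pages live under /search, so a generic crawler must skip them...
print(rp.can_fetch("*", "https://www.example.com/search/label/Telecom"))   # False
# ...while the homepage and individual posts stay crawlable.
print(rp.can_fetch("*", "https://www.example.com/"))                       # True
print(rp.can_fetch("*", "https://www.example.com/2021/01/post-url.html"))  # True
```

As you can see, only the paths starting with /search are blocked; everything else on the blog remains open to crawlers.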
Allow: /
Allow: / refers to the homepage, which means web crawlers can crawl and index our blog’s homepage content.
Disallow Particular Post
Now suppose we want to exclude a particular post or page from indexing. We can add the line below to the code:
Disallow: /yyyy/mm/post-url.html
Here yyyy and mm refer to the publishing year and month of the post, respectively. For example, if we published a post in January 2021, we would use the format below:
Disallow: /2021/01/post-url.html
To make this task easy, you can simply copy the post URL and remove the blog name from the beginning.
Disallow Particular Page
If we want to disallow a particular page from indexing, we can use the same method as above. Simply copy the page URL and remove the blog address from it, so it looks something like this:
Disallow: /p/page-url.html
Sitemap:
Sitemap: https://www.pixelnatures.in/sitemap.xml
This line references the sitemap of our Blogger blog. By adding the sitemap link here, we are simply improving the crawl rate of all of our blog’s pages and post URLs.
That means whenever web crawler robots scan our robots.txt file, they will find a path to our sitemap, which lists the URLs of all of our published posts.
Web crawlers will find it easy to crawl all of our post links.
Hence, there is a better chance that web crawlers will crawl all of our blog posts without missing a single one.
Note: This sitemap only tells the web crawlers about the most recent 150 posts.
If you have more than 150 published posts on your blog, you don’t need to worry, because Blogger’s default sitemap now automatically generates paged sitemaps like these:
https://www.pixelnatures.in/sitemap.xml?page=1
https://www.pixelnatures.in/sitemap.xml?page=2
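Putting all the pieces together, a complete custom robots.txt for a Blogger blog might look like the sketch below. The two extra Disallow lines are just illustrations of blocking one post and one page; replace them with your own URLs or drop them if you don’t need them.

```
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Disallow: /2021/01/post-url.html
Disallow: /p/page-url.html
Allow: /

Sitemap: https://www.pixelnatures.in/sitemap.xml
```

Remember to replace the sitemap URL with your own blog’s address before saving.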
How to Add Custom robots.txt file to Blogger Blog in 2021
Now comes the main point of this article, as mentioned in the title: how to add a custom robots.txt file to a Blogger blog in 2021. It’s quite simple!
Follow the simple steps below to add a robots.txt file to your Blogger blog.
- Go to your Blogger Dashboard.
- Navigate to Settings » Search Preferences » Crawlers and indexing » Custom robots.txt » Edit » Yes
- Now paste your robots.txt file code in the box.
- Click on Save Changes button.
- You are done!
How to Check Whether Your Robots.txt File Is Saved
You can check this file on your blog by adding /robots.txt to the end of your blog URL in a web browser.
For example:
http://www.yourblogurl.in/robots.txt
Once you visit the robots.txt file URL, you will see the entire code you are using in your custom robots.txt file in Blogger.
Don’t forget to share this article on social media if you liked it. Feel free to comment below if you have any doubts about the post.
Talk to you soon again. Have a great day ahead!
Visit daily for the latest updates on Technology, Telecom, Gaming, free Blogger templates, and blogging tips & tricks. Need any help? Contact us on Telegram.