In this post, I am going to show you how to set up your robots.txt file on Blogger for better SEO. Because Blogger lets you customize this file yourself, it is known there as the Custom robots.txt file.
Using this file correctly requires understanding the directives it contains, so in this post I will make you familiar with robots.txt and the different directives and tags that are used in it.
Meaning Of Robots.txt File
A robots.txt file is not complicated to understand. It is simply a text file containing a few lines of directives, saved at the root of a website or blog's server. It tells search engines which parts of the site they may or may not crawl: crawlers read robots.txt before scanning a site and follow its instructions. This is helpful in many ways. For example, if you do not want search engine spiders to index a particular page, or any page that is not useful to you, you can block it here so that only the pages you want appear in search results.
How To Set Up The Robots.txt File In Blogger
First, sign in to your Blogger account.
From the dashboard, click on Settings.
In Settings, click on the "Search preferences" option. A new screen will appear; click the Edit link beside the "Disabled" label of the "Custom robots.txt" section.
Enable the option by clicking the Yes button.
Now enter the following in the box.
User-agent: Mediapartners-Google
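For reference, a complete custom robots.txt for a Blogger blog typically looks like the following (example.blogspot.com is a placeholder; replace it with your own blog's address). Each line is explained in the next section.

```
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /

Sitemap: http://example.blogspot.com/feeds/posts/default?orderby=UPDATED
```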
What Each Line Of The File Does
User-agent: Mediapartners-Google
This directive is mainly useful for bloggers who run Google AdSense ads on their blogs: Mediapartners-Google is the AdSense crawler, and allowing it here lets AdSense scan your pages so it can serve relevant ads.
User-agent: *
In programming terms, the asterisk (*) is a wildcard meaning "all". When we use this sign in robots.txt, it means the rules that follow apply to every kind of robot that visits our blog.
These directives are important for blogs because they act as a guard, telling search engines which pages they may crawl and which they may not.
Disallow: /search
Bloggers add this line to robots.txt to restrict search engine robots from crawling any link that contains the keyword "search" right after the domain name, such as label and search-result pages.
Allow: /
This line refers to the homepage and tells search engines that it, and everything not explicitly disallowed, may be crawled. If you remove the Disallow line above, the search engines will crawl every kind of link on your blog, including the search pages.
Sitemap: http://example.blogspot.com/feeds/posts/default?orderby=UPDATED
This line points search engines at the blog's sitemap feed so they can discover your posts (replace example.blogspot.com with your blog's address). By default this feed tells the crawler about only the most recent posts (25 max).
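To check that these rules behave as described, you can parse them with Python's standard urllib.robotparser module. The blog URL below is just a placeholder; the rules are the ones explained above.

```python
import urllib.robotparser

# The rules discussed above, as entered in Blogger's custom robots.txt box
# (example.blogspot.com is a placeholder for your own blog address).
rules = """\
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /

Sitemap: http://example.blogspot.com/feeds/posts/default?orderby=UPDATED
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Ordinary crawlers are blocked from /search pages (labels, search results)...
print(rp.can_fetch("*", "http://example.blogspot.com/search/label/SEO"))      # False
# ...but may crawl regular post URLs.
print(rp.can_fetch("*", "http://example.blogspot.com/2016/01/my-post.html"))  # True
# The AdSense crawler is allowed everywhere, including /search pages.
print(rp.can_fetch("Mediapartners-Google",
                   "http://example.blogspot.com/search/label/SEO"))           # True
```

This confirms the intent of the file: duplicate-content search pages stay out of the index, normal posts are crawlable, and AdSense can still scan every page.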
The robots.txt file lives at the root of your Blogger blog, so you can easily check its content by visiting yourblog.blogspot.com/robots.txt.
Share your experience and comments below.