Welcome to your blog. Today we will explain how to add a robots.txt file to your Blogger blog correctly, to improve how the blog appears in search engines, continuing our 2022 Blogger blog creation series.
How to add robots.txt file in Blogger blog
Adding a robots.txt file is easy for experienced bloggers, but it can be difficult and confusing for beginners, especially when they are in a hurry to get their articles indexed and run into indexing problems. Let us move on to the explanation.
What is a robots.txt file?
The robots.txt file tells Google's crawlers which pages and articles to crawl and index in search engines. It is also very important for blogging and search engine optimization (SEO), so it must be handled with care. We also provide you with the best up-to-date robots.txt file for a Blogger blog.
Best robots.txt file for a Blogger blog
This is the robots.txt configuration that works best with Google for blogs and websites; you can also inspect your blog's file through Google Search Console.
User-agent: Mediapartners-Google
Disallow:
User-agent: *
Disallow: /search
Allow: /
Sitemap: https://www.#####.com/sitemap.xml
Explanation of the file code
Before we explain how to add the robots.txt file, here is a quick overview of each line in it.
First code: User-agent: Mediapartners-Google
This line addresses Mediapartners-Google, the crawler Google uses to serve ads on the blog, especially AdSense ads.
Second code: Disallow:
An empty Disallow directive blocks nothing, so the ad crawler can reach every page; blocking it would hurt how ads are matched to your pages.
Third code: User-agent: *
This line addresses all search engine crawlers and applies the directives that follow it: Allow or Disallow.
Fourth code: Disallow: /search
It blocks crawling of the blog's internal search result pages (/search), not your articles.
Fifth code: Allow: /
This line allows search engines to crawl and index all other articles and pages on the blog.
Sixth Code: Sitemap: https://www.######.com/sitemap.xml
This line points search engines to the blog's sitemap so they can discover and index the blog's articles.
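To see how crawlers actually interpret these rules, you can feed them to Python's standard urllib.robotparser and ask what each user agent may fetch. This is a sketch using the exact rules from the file above (the Sitemap line is left out because the parser ignores it, and the article path is a made-up example):

```python
from urllib import robotparser

# The robots.txt rules from the article, as a string.
rules = """\
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Ordinary crawlers may fetch articles but not internal search pages.
print(rp.can_fetch("Googlebot", "/2022/05/my-article.html"))  # True
print(rp.can_fetch("Googlebot", "/search?q=seo"))             # False

# The AdSense crawler is allowed everywhere.
print(rp.can_fetch("Mediapartners-Google", "/search?q=seo"))  # True
```

This confirms the behavior described above: only the /search pages are blocked for regular crawlers, while the ad crawler is unrestricted.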
How to add robots.txt file in Blogger blog
Go to your blog's dashboard and open Settings
Scroll down to the Crawlers and indexing section
Enable custom robots.txt content
Open the custom robots.txt field
Copy the robots.txt code shown above and paste it in
Change your blog URL in this line: Sitemap: https://www.######.com/sitemap.xml
Save after pasting the file.
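After saving, you can confirm the file is live by opening your blog's /robots.txt address in a browser. As a small sketch (the example.com domain is a placeholder for your own blog address), Python can build that URL from the blog address and fetch it:

```python
from urllib.parse import urljoin
from urllib.request import urlopen

def robots_url(blog_url: str) -> str:
    """Return the robots.txt URL for a blog's base address."""
    return urljoin(blog_url, "/robots.txt")

# Replace the placeholder domain with your own blog address.
url = robots_url("https://www.example.com/")
print(url)  # https://www.example.com/robots.txt

# Uncomment to fetch and print the live file:
# print(urlopen(url).read().decode("utf-8"))
```

If the printed file matches what you pasted in Blogger, the rules are active.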
Explanation of custom header tags
Custom robots header tags, when set correctly, are enough to get your blog indexed in search engines. Here are the header tags and their role in search engines.
all
The all tag allows search engines to index the page and its articles and to follow the links on it.
noodp
The noodp tag prevents search engines from replacing your description under the title in search results with one taken from the Open Directory Project (DMOZ).
noindex
The noindex tag prevents the page from being indexed, so it does not appear in search results.
nofollow
The nofollow tag prevents search engines from following any links on the page while crawling it.
noarchive
The noarchive tag prevents search engines from storing a cached copy of the page.
nosnippet
The nosnippet tag prevents search engines from displaying a description snippet under the title in search results.
noimageindex
The noimageindex tag prevents search engines from indexing the images in articles and pages.
none
The none tag is equivalent to noindex and nofollow combined: the page is not indexed and its links are not followed.
notranslate
The notranslate tag prevents search engines from offering to translate the page into other languages.
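On Blogger you set these header tags from the settings UI, but crawlers receive them as a robots meta tag in each page's HTML (or as an X-Robots-Tag HTTP header). For example, a page configured with noindex and nofollow would carry a tag like this:

```html
<meta name="robots" content="noindex, nofollow">
```

Viewing your page's source and checking this tag is a quick way to confirm the settings took effect.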
And with that we have finished today's lesson in the Blogger blog creation series. If you have any questions, leave a comment below, and thank you for following.