Crawlers and indexing
Enable custom robots.txt
This section is very important for Search Engine Optimization. When you enable it, you need to create the content of a robots.txt file and paste it here. You can generate a robots.txt file through the following website:
· https://www.labnol.org/blogger/sitemap/
Scroll down a little and you will see the option “Generate XML sitemap for Blogger”. Copy your blog address from your blog, paste it there, and click Generate XML.
You will get a block of text. Copy that text and go to:
Settings > Crawlers and Indexing > Enable Custom robots.txt
Paste the copied text there and it’s done.
This text will now be served to all search engines instead of your default robots.txt content.
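For reference, the generated content for a Blogger blog usually looks something like the sketch below. Here example.blogspot.com is only a placeholder for your own blog address, and the exact Sitemap line produced by the generator may differ:

# Allow the AdSense crawler everywhere
User-agent: Mediapartners-Google
Disallow:

# Keep search and label result pages out of the index, allow everything else
User-agent: *
Disallow: /search
Allow: /

# Placeholder blog address; use your own
Sitemap: https://example.blogspot.com/sitemap.xml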
Enable custom robots header tags
Here you have to set the custom robots header tags exactly as mentioned below:
Home page tags
- all
- noodp
Archive and search page tags
- noindex
- noodp
Post and page tags
- all
- noodp
Set your blog’s custom robots header tags as shown above. These flags determine the robots header tags that are served to search engines.
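To illustrate what these flags mean, the settings above correspond roughly to your blog answering requests with response headers along these lines (this is only a sketch of the effect, not the exact headers Blogger sends):

# Home page, posts and pages: indexable, but ignore Open Directory descriptions
X-Robots-Tag: all, noodp

# Archive and search pages: keep them out of search results
X-Robots-Tag: noindex, noodp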
Google Search Console
This section will be uploaded soon.
Monetisation
Enable custom ads.txt
In this section you have to provide ads.txt content. The ads.txt file lets you declare who is authorized to sell your digital ad inventory. In simple words, it is used to display ads on your blog. If your Google AdSense account is approved, AdSense provides this file for you. If your AdSense account is not approved but your blog has a lot of traffic, you can contact a third-party ad network to run their ads on your blog; they will provide you a custom ads.txt file, your blog will show their ads, and you will be paid for it.
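As a rough sketch, the ads.txt for a blog monetised through AdSense contains a single line in the form below, where pub-0000000000000000 is only a placeholder for your own publisher ID (AdSense gives you the exact line to paste):

# ad system domain, your publisher account ID, relationship, certification authority ID
google.com, pub-0000000000000000, DIRECT, f08c47fec0942fa0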