Setting up robots.txt on Blogger

Setting up robots.txt
Buddy bloggers, a robots.txt file contains rules that tell crawler bots how they should crawl and index a website or blog. Blogger therefore provides a special setting on your blog's dashboard that makes some very basic SEO configuration quick and easy. One of these options is a custom robots.txt, which you can set according to your blog's needs.

When a search engine crawler robot visits a website or blog, the first thing it looks at is the robots.txt file. If you use the Blogger platform, you now have the option to control search engine crawler bots and decide which parts of your website or blog should be crawled and indexed.

Basically, every blog on the Blogger platform has a default robots.txt, but with Blogger's recent changes you can adjust it to your needs. So in this post, you'll learn what Blogger's default robots.txt looks like and how to add or edit a custom robots.txt for your blog.

Below is Blogger's standard default robots.txt:
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /
This default robots.txt is the same for every blog. If you feel it already meets your blog's SEO needs and your blog gets plenty of visitors, you don't need to replace it unless necessary, because a custom robots.txt will strongly affect your blog. Of course, you should still understand what each setting does, because there may be rules you need, and that matters for your blog.
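To see what these default rules mean in practice, you can check them locally with Python's built-in urllib.robotparser module. This is just a sketch; the blog URL is a placeholder, so substitute your own:

```python
from urllib.robotparser import RobotFileParser

# Blogger's default robots.txt (yourblog.blogspot.com is a placeholder).
DEFAULT_ROBOTS = """\
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /
"""

rp = RobotFileParser()
rp.parse(DEFAULT_ROBOTS.splitlines())

# Label/archive listings under /search are blocked for ordinary crawlers...
print(rp.can_fetch("Googlebot", "http://yourblog.blogspot.com/search/label/SEO"))   # False
# ...but regular posts are allowed.
print(rp.can_fetch("Googlebot", "http://yourblog.blogspot.com/2014/01/post.html"))  # True
# The AdSense crawler (Mediapartners-Google) may fetch everything.
print(rp.can_fetch("Mediapartners-Google", "http://yourblog.blogspot.com/search/label/SEO"))  # True
```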

To check a blog's robots.txt, simply append /robots.txt to your blog's URL in a browser and press Enter.
For example: http://yourblog.blogspot.com/robots.txt
Setting or adding a custom robots.txt
  1. Open Dashboard > Settings > Search Preferences > Crawling and Indexing.
  2. Click 'Custom robots.txt' > click 'Yes'.
  3. Paste the custom robots.txt you have prepared. (See the sample image below.)
  4. Click 'Save changes' to save the custom robots.txt.
That's how to enter or edit a custom robots.txt from the dashboard. Now let's look at how to write a good custom robots.txt for Blogger, adjusted to your own needs; from the rules below, include only the ones you actually need.

A custom robots.txt tailored to a blog's needs:
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Disallow: /p/*
Disallow: /view/*
Disallow: /?m=1
Disallow: /?m=0
Disallow: /*?m=1
Disallow: /*?m=0
Allow: /

Sitemap: http://yourblog.blogspot.com/sitemap.xml
The example above is a custom robots.txt tuned to a blog's needs: non-essential pages are blocked, and important pages are left to be crawled and indexed. Now I will explain the meaning of each line, so you understand what each rule in this custom file does.

User-agent: Mediapartners-Google
Disallow:
These lines address the AdSense crawler bot. If you show AdSense ads, they are important: they let the AdSense crawler visit every page of your site, in accordance with the AdSense guidelines. It is best to leave these lines in place.

User-agent: * This line addresses all crawler bots visiting your blog's pages; the rules beneath it then determine which URLs should be indexed and which should be blocked. That is the underlying purpose of the codes that follow.

Disallow: /search This line tells crawler bots not to crawl or index any URL under /search. It is very important for your blog's SEO: it stops label and archive pages from being crawled or indexed, because those are not unique pages or URLs, and indexing them would create duplicate content on your blog.

Disallow: /p/* This line blocks robots from crawling your blog's static pages. If you want those pages indexed by crawlers, delete this line.

Disallow: /view/* This line stops robots from crawling Blogger's dynamic-view pages. If you use dynamic views on your blog, you can delete this line; if you don't use them, just leave it alone.

Disallow: /?m=1
Disallow: /?m=0
Disallow: /*?m=1
Disallow: /*?m=0
These lines stop robots from crawling the redirect pages used for the mobile-phone display. Without them, you may see your blog appear in search results with mobile links ending in *?m=1 or *?m=0; these lines are needed to avoid that duplicate-page problem. If you don't have this problem, or don't need the rules, you can delete them.
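If you want to sanity-check these mobile-redirect rules, note that Python's built-in urllib.robotparser implements the original robots.txt rules and does not expand * wildcards, so it only reflects the literal /?m=1 and /?m=0 lines; Google's own crawler does honor the wildcard lines. A small sketch (the blog URL is a placeholder):

```python
from urllib.robotparser import RobotFileParser

# Only the mobile-redirect rules from the custom file above.
MOBILE_RULES = """\
User-agent: *
Disallow: /?m=1
Disallow: /?m=0
Disallow: /*?m=1
Disallow: /*?m=0
Allow: /
"""

rp = RobotFileParser()
rp.parse(MOBILE_RULES.splitlines())

# The literal homepage redirect URL is blocked:
print(rp.can_fetch("Googlebot", "http://yourblog.blogspot.com/?m=1"))
# urllib.robotparser treats "*" literally, so this still reports True,
# even though Google itself would apply the /*?m=1 wildcard and block it:
print(rp.can_fetch("Googlebot", "http://yourblog.blogspot.com/2014/01/post.html?m=1"))
```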

Sitemap: This line basically shows crawlers a link to your blog's sitemap.
Please replace the URL after it with your own blog's URL.

Check and Analyze your Robots.txt
There are many tools available on the web to check your robots.txt. But to examine how Google's robots (i.e. Adsbot-Google, Mediapartners-Google, Googlebot) will crawl your blog, you should use the Google Webmaster Tools: open the webmaster dashboard > Crawl > Blocked URLs. There you can see how these robots work according to the robots.txt you specified for your blog.
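Google's tester is authoritative, but as a rough local approximation you can batch-check a few URLs against the custom file the same way. This sketch uses placeholder URLs and omits the wildcard lines, since urllib.robotparser does not expand them:

```python
from urllib.robotparser import RobotFileParser

# The non-wildcard rules from this post (URLs are placeholders).
CUSTOM_ROBOTS = """\
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Disallow: /?m=1
Disallow: /?m=0
Allow: /

Sitemap: http://yourblog.blogspot.com/sitemap.xml
"""

rp = RobotFileParser()
rp.parse(CUSTOM_ROBOTS.splitlines())

urls = [
    "http://yourblog.blogspot.com/",
    "http://yourblog.blogspot.com/search/label/SEO",
    "http://yourblog.blogspot.com/2014/01/post.html",
]
# Print an allowed/blocked report per crawler, similar in spirit
# to the Blocked URLs view in the webmaster dashboard.
for url in urls:
    for bot in ("Googlebot", "Mediapartners-Google"):
        status = "allowed" if rp.can_fetch(bot, url) else "blocked"
        print(f"{bot:22} {status:7} {url}")
```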
Note: when using a custom robots.txt, only apply the rules you actually need. Applying robots.txt rules carelessly, without knowing what each one does, can be fatal; in the worst case, crawler bots will not index your blog at all. So learn each rule and its function first.
If you have other ideas that you think are better, or any questions, please discuss them in the comments field. OK bloggers, I think that's all for this tutorial on setting a custom robots.txt correctly :). Apologies for any shortcomings. Good luck!