The WordPress robots.txt file: how popular search robots index your website, and blocking indexing with the robots.txt directives ("User-Agent", "Disallow", "Host")
When doing your own internal SEO optimization, it is important not only to write unique content and to select keywords in the Google Keyword Tool (which you need in order to build a semantic core for your website), but also to pay due attention to how your blog is indexed by the various search engines. Today a handful of major search engines have captured almost the entire Internet search market: Google, Bing (Microsoft) and Yahoo. These three dominate the online world, and how well and how quickly they index your blog’s content will largely determine the success of your long and diligent SEO efforts.
Friends, we have only two main instruments at our disposal for managing the indexing of blogs built on the WordPress CMS, which has many tangible advantages over its counterparts. Here are the tools with which we can very easily manage the indexing of a WordPress blog:
- first, of course, is the robots.txt file, which lets us block the indexing of everything that is not primary content (files of the WordPress engine itself, as well as duplicate content); this useful robots.txt file is exactly what we will be talking about in this article;
- second, besides the robots.txt file, there is another important tool for controlling blog indexing: a sitemap in XML format, which can be generated by the Google XML Sitemaps plugin. I wrote about this WordPress plugin in detail in the article you can reach by clicking the link above.
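To make the first tool concrete, here is a minimal robots.txt sketch of the kind commonly used on WordPress blogs. The blocked paths are typical examples, not a ready-made configuration for your site, and `example.com` is a placeholder domain. Note that "Host" is a non-standard directive that was historically honored mainly by Yandex; "User-Agent", "Disallow" and "Sitemap" are the widely supported ones:

```
# Rules below apply to all search robots
User-Agent: *
# Block engine files and areas that hold no primary content
Disallow: /wp-admin/
Disallow: /wp-includes/
# Block a typical source of duplicate content
Disallow: /trackback/

# Non-standard directive naming the preferred site mirror (mainly Yandex)
Host: example.com

# Point robots to the XML sitemap (for example, one built by Google XML Sitemaps)
Sitemap: https://example.com/sitemap.xml
```

A blank line separates groups of rules; each "User-Agent" line starts a new group, so you can also write stricter rules for one specific robot while leaving the `*` group for everyone else.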