How to Fix "Indexed, though blocked by robots.txt" in Blogger


Robots.txt is a file used by websites to communicate with web robots (also known as crawlers or spiders) and tell them which pages or content they may and may not crawl. The file is placed in the root directory of a website, and search engine crawlers check it before crawling the site's content.

In the context of Blogger, robots.txt is a file that tells search engines what content they should and should not crawl. It is generated automatically by the Blogger platform, and the default configuration lets crawlers reach all of your posts and pages but blocks the internal search and label pages under /search, which is the most common source of "blocked by robots.txt" messages on Blogger sites.
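For reference, a freshly created Blogger blog typically serves a robots.txt along these lines (example.blogspot.com stands in for your blog's own address):

User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /

Sitemap: https://example.blogspot.com/sitemap.xml

The Mediapartners-Google entry lets the AdSense crawler read every page, while the Disallow: /search line under User-agent: * is what keeps label and search-result URLs from being crawled.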

However, if you want to prevent search engines from crawling and indexing specific pages or content on your Blogger site, you can modify the robots.txt file to include directives that block certain URLs or types of content. This can be useful if you want to hide certain pages or content from search results, or if you want to prevent duplicate content issues.

It is important to use caution when modifying the robots.txt file, as incorrect configurations can result in unintended consequences such as blocking all search engines from crawling and indexing your site. It is recommended to consult with a technical expert or follow official documentation to ensure that your robots.txt file is configured correctly.

 

Fix "Indexed, though blocked by robots.txt"

If your Blogger site's pages or posts are not getting indexed by search engines due to a robots.txt block, here are some steps you can take to fix the issue:

Check your robots.txt file:

Make sure your robots.txt file is not blocking search engines from crawling your site's pages or posts. You can check this by opening "yoursite.com/robots.txt" in your web browser and looking for Disallow rules that cover the affected URLs.
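As an illustration, a configuration like the following blocks every crawler from the entire site, which is almost never what you want:

User-agent: *
Disallow: /

By contrast, Blogger's default Disallow: /search blocks only the internal search and label pages.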

Adjust your robots.txt file:

If you find that your robots.txt file is blocking search engines, you will need to adjust it. An empty "Disallow" directive (or an explicit "Allow" directive) tells search engines that everything may be crawled. For example, if you want to allow search engines to crawl and index all pages and posts, your robots.txt file can consist of just the following lines:

 

User-agent: *
Disallow:

 

The empty Disallow line places no restrictions, so all search engines are free to crawl and index every page and post on your site.
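Conversely, if you want to block one specific page while leaving the rest of the site crawlable, list just its path. The page path below is a made-up placeholder; substitute the path of the page you actually want to hide:

User-agent: *
Disallow: /p/private-page.html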

Submit your sitemap to Google:

Once you have adjusted your robots.txt file, submit your sitemap to Google. This will help Google find and index your site's pages and posts more quickly. To do this, open Google Search Console, click "Sitemaps" in the left-hand menu, enter the URL of your sitemap under "Add a new sitemap", and click "Submit".
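Blogger generates the sitemap for you, so for a blog on the default domain the URL to submit typically looks like this, with yourblog replaced by your own subdomain:

https://yourblog.blogspot.com/sitemap.xml

Blogger also exposes a separate sitemap-pages.xml for static pages, which you can submit the same way.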

Wait for search engines to crawl your site:

It may take some time for search engines to crawl and index your site's pages and posts, so be patient. You can check if your pages and posts are indexed by typing "site:yoursite.com" into Google's search bar.

By following these steps, you should be able to fix the issue of your pages or posts being blocked by robots.txt and get them indexed by search engines.
