How To FIX - Blocked by Robots.Txt Search Console - Daily-Riches

Ways to Resolve “Blocked by robots.txt” Warning in Google Search Console

Website owners and SEO experts often run into the “Blocked by robots.txt” warning in Google Search Console. If you’ve faced it, you know it can be puzzling. This guide explains the error, why it happens, and what steps you can take to resolve it. These solutions apply whether your site runs on WordPress, Blogger, or a custom platform.

By the end of this article, you’ll understand why configuring your robots.txt file correctly matters, and how to stop it from causing trouble for your SEO efforts.

📌 What Does “Blocked by robots.txt” Mean?

The "Blocked by robots.txt" message pops up in Google Search Console when your site blocks certain URLs from being accessed by search engine bots. This happens because of the restrictions you set in your robots.txt file.

You can spot these blocked URLs by checking the report in Search Console. Clicking on the error will reveal which pages are affected. It could be a single page or several depending on how your site is set up.

A Quick Example

Imagine your robots.txt file includes this rule:

Disallow: /wp-admin/

This file blocks search engines from crawling anything inside the /wp-admin/ folder. If any essential content lands in this restricted area, it won’t appear in search results. Google might also mark it as blocked.
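If you want to check how a rule like this behaves before touching your live file, Python’s built-in urllib.robotparser can evaluate rules locally. A minimal sketch (the example.com URLs are placeholders, not a real site):

```python
from urllib.robotparser import RobotFileParser

# Parse the same rule shown above -- no network request needed.
rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /wp-admin/",
])

# Anything under /wp-admin/ is blocked; other paths remain crawlable.
print(rp.can_fetch("*", "https://example.com/wp-admin/options.php"))  # False
print(rp.can_fetch("*", "https://example.com/blog/my-post"))          # True
```

The same check works against your real file if you set `rp.set_url("https://yoursite.com/robots.txt")` and call `rp.read()` instead of `parse()`.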

🔍 What Does the robots.txt File Do?

The robots.txt file sits in the root folder of your site, found at something like example.com/robots.txt. It tells search engine crawlers which parts of your website they should or shouldn’t visit.

Here’s an example layout:

User-agent: *
Disallow: /page1
Disallow: /page2
Disallow: /page3
Disallow: /pageA
Disallow: /pageB
Disallow: /pageC
  • Pages 1, 2, and 3: These pages are important and should be indexed, but they might be blocked.

  • Pages A, B, and C: These are pages you don’t want indexed, like low-priority or internal pages.

Google includes all these pages in the report when it detects conflicting signals. For example, you might block a page in robots.txt, but at the same time, it appears in your sitemap or has an index meta tag. This creates confusion.

🧠 Why the Error Matters: Crawl Budget and SEO

Search engines assign a specific crawl budget to check your website. This budget limits the time and resources bots spend scanning your pages. When bots waste time crawling unimportant or restricted pages, the indexing of your key content becomes less effective.

Here’s an example:

  • Pages 1–3 should be treated as important, but they are blocked.

  • Pages A–C don’t matter but still appear in your sitemap or meta tags, giving Google unclear directions.

These mixed signals can delay indexing, misuse crawl resources, and lower your rankings.
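One way to spot these mixed signals yourself is to cross-check your sitemap URLs against your robots.txt rules. A minimal sketch using Python’s built-in urllib.robotparser (the rules and URLs below are placeholders mirroring the example above; in practice you would pull the URL list out of your sitemap.xml):

```python
from urllib.robotparser import RobotFileParser

# Placeholder rules mirroring the example above.
rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /page1",
    "Disallow: /pageA",
])

# Placeholder sitemap URLs.
sitemap_urls = [
    "https://example.com/page1",   # blocked AND listed in the sitemap -> conflict
    "https://example.com/about",   # crawlable, no conflict
]

# URLs that appear in the sitemap but are blocked by robots.txt.
conflicts = [u for u in sitemap_urls if not rp.can_fetch("*", u)]
print(conflicts)  # ['https://example.com/page1']
```

Every URL this turns up is one Google is receiving contradictory instructions about: either unblock it or drop it from the sitemap.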

✅ Fixing the Issue: Making Sure Important Pages Get Indexed

If key pages (like Pages 1–3) are blocked but should be indexed, use these steps to lift the restrictions.

Editing Using WordPress SEO Plugins

With Yoast SEO:

  1. Open SEO > Tools > File Editor in your dashboard.

  2. Check the robots.txt file listed there.

  3. Delete any lines that block valuable pages like Disallow: /page1.

  4. Hit save to confirm the updates.

With Rank Math Plugin:

  1. Navigate to General Settings > Edit robots.txt.

  2. Erase the rules that restrict access to important pages.

  3. Save your updated file.

⚠️ Important: Do not delete necessary rules such as Disallow: /wp-admin/ unless you're certain they're not needed.

Editing Manually Through File Manager or FTP

  1. Log in to your hosting control panel. This could be cPanel or a tool like FileZilla.

  2. Find the root folder of your website.

  3. Look for the file called robots.txt. Right-click on it and choose Edit.

  4. Delete any rules that block important content from being accessible.

  5. Save the file and upload it back to the server.
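For example, if your original file blocked both important and unimportant pages, the edited version might look like this after step 4 (a sketch; /page1 stands for whichever important page you unblocked, and the remaining rules are ones you deliberately keep):

```
User-agent: *
Disallow: /wp-admin/
Disallow: /pageA
```

Only the line blocking the important page is gone; the rest of the file is left untouched.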

To adjust this on Blogger

  1. Open Settings and select Crawlers and indexing.

  2. If “Custom robots.txt” is on, click to access it.

  3. Take out any Disallow lines that restrict the pages you need to be indexed.

  4. Hit Save to record the changes.

💡 Pro tip: If you’re unsure how this works, turning off custom robots.txt is a quick solution on Blogger.

🛑 Keeping Unwanted Pages Off Google the Right Way

If there are pages you don’t want Google to index—like internal search results, tag pages, or cart pages—blocking them with robots.txt helps. But this step alone isn’t enough.

Say you:

  • Add a block for the page in robots.txt

  • AND still list it in your sitemap

  • THEN leave an index meta tag on the page

...it gives Google mixed signals.

Fix: Add “Noindex” and Remove from Sitemap

Here’s what to do:

On WordPress when using Yoast

  1. Open Search Appearance > Taxonomies or Archives.

  2. Turn off indexing to keep these out of search results:

    • Tags

    • Product tags

    • Search pages

  3. Check that these are left out of the sitemap.

Using Rank Math in WordPress

  1. Navigate to Titles & Meta > Misc Pages.

  2. Choose Noindex for:

    • Search results

    • Password-protected pages

  3. Then access Tags or Product Tags and confirm they are also set to Noindex.

This keeps those pages out of Google’s index, so they stop cluttering your Search Console reports.
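Under the hood, these plugin settings add a robots meta tag to the page’s head (or an equivalent HTTP response header). A sketch of what the output looks like:

```
<!-- In the page's <head>: tells search engines not to index this page -->
<meta name="robots" content="noindex">

<!-- Or as an HTTP response header (useful for non-HTML files like PDFs) -->
X-Robots-Tag: noindex
```

Note that crawlers can only see a noindex directive on pages they are allowed to fetch, which is why pairing it with a sitemap cleanup matters.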

On Blogger

Turning off the "Custom robots.txt" option usually fixes this. If needed, also remove any custom robots header tags or sitemap entries that conflict with what you’re trying to achieve.

🧾 Summary

| Page Type | robots.txt Rule | Sitemap Entry | Meta Tag | Outcome |
| --- | --- | --- | --- | --- |
| Page you WANT showing up in search | ❌ Don’t block | ✅ Add | ✅ index | Appears in search ✔️ |
| Page you DON’T want in search | ✅ Block | ❌ Remove | ✅ noindex | Doesn’t appear in search ✔️ |

🚀 Final Words

Fixing “Blocked by robots.txt” errors can feel tricky at first, but once you learn how Google handles these rules, the process becomes clear.

Quick To-Do List:

  • Check Search Console to see which pages are blocked.

  • Get rid of blocking rules on pages that matter the most.

  • Apply noindex and take out sitemap entries on pages that aren’t important.

  • Wait it out—Google might need some time before showing the updates.

By doing this right, you make better use of your crawl budget, enhance your SEO, and help your top content get the attention it should have.
