By Priya Darshini


About Google Search Console:

Google Search Console helps you monitor and fix errors on your website and analyze how well your site is performing. It helps you understand the factors that drive organic results, lets you submit new content for crawling and indexing, and helps improve your website's visibility in the SERPs.

Indexed, though Blocked by robots.txt

Reason for this error:

The robots.txt file instructs Google's bot which pages to crawl and which to skip. This error means Google has found and indexed your page even though robots.txt tells it not to crawl that page: you submitted the page for indexing, but it is blocked by the robots.txt file.
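As a rough illustration of how these rules are applied, Python's built-in urllib.robotparser follows the same robots.txt conventions Googlebot reads. The rules and URLs below are hypothetical:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt rules: block everything under /private/
rules = [
    "User-agent: *",
    "Disallow: /private/",
]

parser = RobotFileParser()
parser.parse(rules)

# A page under /private/ is blocked from crawling -- yet Google can
# still index its URL if other pages link to it, which is exactly
# what triggers the "Indexed, though blocked" warning.
print(parser.can_fetch("Googlebot", "https://example.com/private/page"))  # False
print(parser.can_fetch("Googlebot", "https://example.com/blog/post"))     # True
```

If you want a page out of the index entirely, a noindex directive (with crawling allowed) is the reliable route, since a robots.txt block only stops crawling, not indexing.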

How to Fix it?

You will receive a mail from Google Search Console (GSC) in case of any issues. You can find this error under Google Search Console -> Coverage.

Use the robots.txt Tester to test and validate your robots.txt file.



A small number of crawl errors will not have a major impact on your SEO, and the fewer crawl errors there are, the less likely your users are to run into those error pages.

Check your crawl errors at least once a week by monitoring your crawl errors report.

The 404 crawl error is the most common error and also the easiest to fix. For every 404 error you get, Google tells you the error URL and which pages link to it.

You can also download the entire crawl report so that you don't have to check every crawl error manually.
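As an illustrative sketch, a downloaded report can be grouped by issue type with a few lines of Python, so you can fix errors in batches instead of one by one. The column names and sample rows below are hypothetical, since the exact export format varies:

```python
import csv
import io
from collections import defaultdict

# Hypothetical export: real GSC reports have similar columns,
# but the exact header names may differ.
sample_report = """URL,Last crawled,Issue
https://example.com/a,2024-01-02,Not found (404)
https://example.com/b,2024-01-03,Server error (5xx)
https://example.com/c,2024-01-04,Not found (404)
"""

def group_by_issue(report_text):
    """Bucket the reported URLs by issue type."""
    groups = defaultdict(list)
    for row in csv.DictReader(io.StringIO(report_text)):
        groups[row["Issue"]].append(row["URL"])
    return dict(groups)

groups = group_by_issue(sample_report)
print(len(groups["Not found (404)"]))  # 2
```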


If a page has moved, replace it with a permanent 301 redirect; that is the proper way to redirect your users. If it throws a soft 404 error by mistake, use the URL Inspection tool. Sometimes the error is caused by thin or duplicate content, and sometimes a technical glitch leads to duplicate-content issues.
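As a minimal sketch of how a permanent redirect behaves, the snippet below uses Python's built-in http.server to serve a hypothetical /old-page that 301-redirects to /new-page; the paths and server setup are made up for illustration:

```python
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer

class RedirectHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/old-page":
            # 301 tells browsers and crawlers the move is permanent,
            # so Google transfers the old URL's signals to the new one.
            self.send_response(301)
            self.send_header("Location", "/new-page")
            self.end_headers()
        else:
            self.send_response(200)
            self.send_header("Content-Type", "text/plain")
            self.end_headers()
            self.wfile.write(b"new page content")

    def log_message(self, *args):
        pass  # keep the demo output quiet

server = ThreadingHTTPServer(("127.0.0.1", 0), RedirectHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]

# urllib follows the 301 automatically and lands on /new-page,
# just as a crawler would.
resp = urllib.request.urlopen(f"http://127.0.0.1:{port}/old-page")
print(resp.status, resp.geturl())
server.shutdown()
```

A 302 (temporary) redirect would also move the user, but only a 301 signals to Google that the old URL should be dropped in favor of the new one.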

A soft 404 is not an official HTTP response code: the server returns 200 (OK) for a page that does not really exist or has no real content, and Google flags it as a soft 404.


Server Errors

This error occurs when your server takes too long to respond and the request times out. Googlebot waits only a limited amount of time for your site to load while crawling; if it takes too long, Googlebot will not be able to connect to your site.
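As a rough sketch of what the crawler experiences, the snippet below spins up a deliberately slow local server and fetches it with a short timeout; the handler, server names, and the 0.5-second limit are all illustrative, not Googlebot's actual settings:

```python
import threading
import time
import urllib.request
from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer

class SlowHandler(BaseHTTPRequestHandler):
    """Simulates a backend that takes 2 seconds to answer."""
    def do_GET(self):
        time.sleep(2)
        self.send_response(200)
        self.end_headers()
        self.wfile.write(b"finally!")

    def log_message(self, *args):
        pass  # keep the demo output quiet

class QuietServer(ThreadingHTTPServer):
    def handle_error(self, request, client_address):
        pass  # the impatient client hung up; ignore the broken pipe

server = QuietServer(("127.0.0.1", 0), SlowHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
url = f"http://127.0.0.1:{server.server_address[1]}/"

# An impatient "crawler" that waits only half a second,
# like Googlebot giving up on a slow page.
try:
    urllib.request.urlopen(url, timeout=0.5)
    timed_out = False
except OSError:  # covers URLError and socket timeouts
    timed_out = True
server.shutdown()
print(timed_out)  # True: the request timed out before the page loaded
```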

How to Fix it?

First, identify which type of server error you are seeing, since there are several.

Here are the types of server errors:

  • Timeout
  • Connection reset
  • Truncated response
  • Connect failed
  • Connect timeout
  • No response

Refer to Google Search Console Help to identify the specific error.

If your website is running fine when you get this error, the error may have occurred in the past.

Use Fetch as Google (now the URL Inspection tool) to check whether Googlebot can access your site. If Google returns the homepage without any errors, your site can be accessed properly.


DNS Errors

A DNS (Domain Name System) error is important and should be addressed immediately, since it means Google cannot connect with your domain because of a timeout or a lookup failure.

How to Fix it?

Use Fetch as Google to see how Googlebot crawls your site. If your server displays a 404 or 500 (server error) code for the fetch, the issue might be on your DNS provider's end.
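To tell a page-level error apart from a true lookup failure, a quick check with Python's standard socket module shows whether a name resolves at all; the hostnames below are illustrative:

```python
import socket

def dns_resolves(hostname):
    """Return True if the hostname resolves to at least one IP address."""
    try:
        socket.getaddrinfo(hostname, 80)
        return True
    except socket.gaierror:
        return False

# "localhost" always resolves via the local hosts file;
# names under the reserved .invalid TLD never resolve.
print(dns_resolves("localhost"))             # True
print(dns_resolves("no-such-host.invalid"))  # False
```

If your own domain fails a check like this, the problem is at the DNS level (registrar or DNS provider), not with your web server or your pages.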


We have covered some of the common Google Search Console errors, their causes, and how to fix them in simple steps. With the help of these tools, most GSC errors can be fixed easily; identifying them becomes easier, and fixing them also helps you improve your rankings.

