Google Search Console (GSC) is a favorite of many SEOs, webmasters, and website owners.
GSC enables you to monitor, maintain, and troubleshoot your website in Google’s search results. One of the reasons it’s so dear to us is that it provides first-party organic traffic data - and it doesn’t cost a dime. You can access the GSC toolset from here and start getting a better picture of your website’s presence in search.
So how exactly do you use it, and what should you make of the error reports that start landing in your inbox once it’s set up? Here’s our guide to just that, so don’t let those emails scare you!
Have you verified ownership of your website (or multiple domains) in GSC yet? If not, make sure to verify all versions of your domain, including the HTTP, HTTPS, www, and non-www versions. Claiming a Domain property - via your server/hosting provider - is also recommended.
You have 5 different verification options that you can select from as you complete this process.
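Not sure which versions of your domain actually respond, or where each one ends up? A quick script can show you. This is just a sketch using Python’s requests library and a placeholder domain (example.com) - swap in your own hostname before running it.

```python
# A quick sketch (assuming the requests library and a placeholder domain) to
# confirm that every protocol/host variant resolves to one canonical URL.
import requests

variants = [
    "http://example.com/",
    "http://www.example.com/",
    "https://example.com/",
    "https://www.example.com/",
]

for url in variants:
    try:
        resp = requests.get(url, timeout=10, allow_redirects=True)
        print(f"{url} -> {resp.url} ({resp.status_code})")
    except requests.RequestException as exc:
        print(f"{url} -> request failed: {exc}")
```

Ideally every variant lands on the same final URL; if they don’t, that’s worth cleaning up before you worry about coverage errors.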
Once you’re verified and logged in, you will see the reports and tools you can access on the left-hand side. Today’s topic is all about the Index Coverage Report and how to find and fix the errors you may come across in GSC. There are many other useful reports in GSC (e.g. Performance, mobile usability & more), so we encourage you to explore!
Two things to keep in mind:
1) When new “Errors” and “Valid with warning” issues occur, GSC will automatically send you an email notifying you of the error. This guide breaks down the process of verifying the issue, determining whether you should fix it, and how to do so when you should.
NOTE that these emails can absolutely be "noisy" in your inbox, especially if you've claimed all domain variations and/or manage multiple domains. Here's a guide on how to filter your inbox for these GSC notifications, so you can focus on the issues that matter.
2) Valid and Excluded URLs will NOT generate emails, but they should be monitored on a regular basis to ensure real issues aren’t being ignored. We’ll go over these major issues below, as well.
Finding the issues & resolving them is as easy as:
Receive an error email alert from GSC & click it OR visit GSC directly and navigate to the Coverage report.
Review your Coverage Error reports directly in GSC, which specify what the issue is and what URL(s) that issue applies to.
QA the issue on the URLs in question (e.g. validate that the issue is still occurring)
Determine the priority of the fix (typically a combination of how important the affected pages are and how severe the issue is).
If you detect a real issue, then determine the correct action to resolve it (again, we’ve generally outlined the correct actions below). If you can’t solve it yourself, or still don’t know what to do after reading the guidelines, you can reach out to the Webmasters Help Community or an SEO agency (like ourselves) for help.
After taking the correct action, you should run the validation process in most cases. It can unfortunately be buggy, but whenever you can, keep the list of pages to review & fix short - you’ll be better able to spot & fix real issues when they arise.
NOTES:
Once you click on Coverage, you will see:
The summary page shows the results for all URLs in your property grouped by status, and the specific reason for that status (such as server errors). By clicking a table row on the summary page, you can view the list of URLs with the same status/reason and more details about the issue. Don’t forget to scroll below the graph to see that data set (depending on your screen size, that gets buried easily!)
Let’s go over all the statuses, the reasons behind each status, and how to fix these URL issues in the Index Coverage Report.
Errors mean that these specific pages cannot be indexed and therefore won’t be visible on Google’s search engine results pages (SERPs), meaning no one can reach them via organic search. Below are the reasons that your submitted URL may have a crawl issue, and what those errors actually mean.
A page that you have previously submitted, or are currently submitting to Google (likely via the sitemap XML file) has a 'noindex' directive either in a meta tag or HTTP header.
So the real question you need to answer is: should the page(s) be indexed or not?
If the page(s) should be indexed, the noindex directive should be removed from the meta tag or HTTP header. You may want to use the URL Inspection tool to investigate and re-submit the page for indexing once the issue is resolved. You can also start the Validation process in GSC to clear out the errors - this way, if/when new errors arise, they are easier to spot (and not buried under old and inaccurate issues!)
If the page(s) are correctly noindexed, just make sure the URL in question is not listed in a current sitemap XML file.
Note: when you validate this error, it’s checking to see if the page stopped being noindexed. It will validate correctly in that case. But if the correct solution is to remove the page from the sitemap file, then validation won’t work. In this edge case, you should just ignore the error. It’ll clear itself out eventually.
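Before (or after) starting validation, it helps to confirm whether the noindex directive is really there. Here’s a rough spot-check in Python - it assumes the requests library and uses a placeholder URL, and it only catches the common meta tag and header patterns, so treat it as a first pass rather than a definitive audit.

```python
# A minimal sketch (assuming the requests library) for spot-checking whether a
# URL still carries a noindex directive in its HTTP header or its meta robots tag.
import re
import requests

def check_noindex(url: str) -> None:
    resp = requests.get(url, timeout=10)
    # Check the X-Robots-Tag HTTP response header.
    header_noindex = "noindex" in resp.headers.get("X-Robots-Tag", "").lower()
    # Roughly look for <meta name="robots" ... noindex ...> in the HTML.
    meta_noindex = bool(
        re.search(r'<meta[^>]+name=["\']robots["\'][^>]*noindex', resp.text, re.I)
    )
    print(f"{url}")
    print(f"  X-Robots-Tag noindex: {header_noindex}")
    print(f"  Meta robots noindex:  {meta_noindex}")

check_noindex("https://example.com/some-page/")  # placeholder URL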
Your server returned a 500-level error when the page was requested. Investigate, and contact your hosting provider if there is a problem, since your server’s reliability indirectly affects the indexing process as well.
You may see URLs on this list that are actually working; they get added to this report if/when there is a server blip and your site goes down. Simply start the validation process to clear these out (but if this happens frequently, you should resolve that issue with your host directly!)
If the 500 errors are real and the pages should not load, then set up a 301 redirect to the closest indexable page instead.
Start the validation process when you confirm that the errors are resolved.
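To tell a one-off blip from a persistent problem, you can re-check the reported URLs a few times before validating. This is a rough sketch assuming the requests library and a placeholder URL.

```python
# A rough sketch (assuming the requests library) that re-checks a reported URL
# a few times, to separate a one-off server blip from a persistent 5xx error.
import time
import requests

def recheck(url: str, attempts: int = 3, delay: float = 2.0) -> None:
    codes = []
    for _ in range(attempts):
        try:
            codes.append(requests.get(url, timeout=10).status_code)
        except requests.RequestException:
            codes.append(None)  # connection failure / timeout
        time.sleep(delay)
    if all(c is not None and c < 500 for c in codes):
        print(f"{url}: no server errors seen ({codes}) - start validation")
    else:
        print(f"{url}: still erroring ({codes}) - talk to your host")

recheck("https://example.com/flaky-page/")  # placeholder URL
```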
Google experienced a redirect error, either a redirect chain or a redirect loop. Use a web debugging tool, such as Screaming Frog or httpstatus.io, to learn what is causing the redirect error.
Redirect loops are redirects that cause errors because they (eventually) point to themselves, e.g. Page A -> Page B -> Page A. Resolve these by deciding what the correct page is (or should be), and ensure that the final page of the redirect correctly loads.
Redirect hops are multi-step redirects, e.g. Page A -> Page B -> Page C - and sometimes deeper. (NOTE that Google will only follow 5 redirect chain steps before giving up.) Fix this by changing the redirect from Page A to point directly to Page C, or whatever the final, 200 page should be. This is an issue we often see after website migrations, so make sure you plan your migration well and check this off your list if you experience a traffic drop after migrating.
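If you’d rather script it than crawl, here’s a small sketch (using Python’s requests library and a placeholder URL) that walks a redirect chain one hop at a time, so loops and long chains are easy to spot.

```python
# A small sketch (assuming the requests library) that follows a redirect chain
# hop by hop, flagging loops and chains that run longer than expected.
import requests

def trace_redirects(url: str, max_hops: int = 10) -> None:
    seen = set()
    for hop in range(max_hops):
        resp = requests.get(url, allow_redirects=False, timeout=10)
        print(f"{hop + 1}. {url} -> {resp.status_code}")
        if resp.status_code not in (301, 302, 303, 307, 308):
            return  # reached the final destination
        next_url = requests.compat.urljoin(url, resp.headers["Location"])
        if next_url in seen or next_url == url:
            print("Redirect loop detected!")
            return
        seen.add(url)
        url = next_url
    print("Too many hops - redirect chain is longer than expected.")

trace_redirects("https://example.com/old-page/")  # placeholder URL
```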
It’s not uncommon to accidentally send conflicting messages to search engines (read more here to learn about these common mistakes.) In this case, you are submitting a URL to Google (likely via your sitemap XML file) as a page to crawl & index, while simultaneously blocking that URL in your robots.txt file.
So again, the real question comes down to: should the page be indexed?
If it should be indexed, remove the rule in your robots.txt file that is blocking it. If you aren’t sure which line of the file is causing the conflict, GSC’s robots.txt testing tool can help you track it down.
If the URL should not be indexed, remove it from your sitemap XML file.
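To double-check whether a specific URL is still disallowed, Python’s standard-library robots.txt parser can answer that directly. The domain and URL below are placeholders - swap in your own.

```python
# A standard-library sketch that checks whether a URL is blocked for Googlebot
# by your robots.txt file (placeholder domain/URL - swap in your own).
from urllib.robotparser import RobotFileParser

parser = RobotFileParser("https://example.com/robots.txt")
parser.read()

url = "https://example.com/blocked-page/"
if parser.can_fetch("Googlebot", url):
    print(f"{url} is crawlable - the robots.txt conflict may already be fixed.")
else:
    print(f"{url} is still disallowed for Googlebot in robots.txt.")
```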
A “soft 404” is a page that appears to be blank or empty (as far as Google can tell), and therefore may be a 404 error that’s not correctly sending a 404 response code (this is a common 404 mistake to avoid!)
If the page is, in fact, an error page, work with your web developer to send an actual 404 HTTP header response. If your website is an SPA (single page application), sending a 404 is not “out of the box” functionality, so here’s how to have your SPA correctly send a 404 error. Further, consider redirecting the page and/or issuing a 410 (Gone) status (find more about why we recommend this on our 404 FAQ page.)
If the page is not an error page, it’s either a) a temporary bug (try the validation process and see if it clears itself out), or b) a rendering issue. If Google can’t access and/or render the contents of your page, they won’t “see” the page to know it’s good and valid.
Why wouldn’t they be able to render it? Generally it comes down to resources (like JavaScript or CSS) that are blocked or failing to load, heavy reliance on client-side rendering, or slow server responses.
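If you want a quick first pass across a list of URLs, a rough heuristic is to flag pages that return 200 but contain almost no visible text. The sketch below assumes the requests library and a placeholder URL; note that it only sees the raw server-rendered HTML, so a heavily client-rendered page may look empty here even when it renders fine in a browser.

```python
# A heuristic sketch (assuming the requests library) that flags pages which
# return 200 but contain very little visible text - a common soft-404 pattern.
import re
import requests

def looks_like_soft_404(url: str, min_chars: int = 500) -> bool:
    resp = requests.get(url, timeout=10)
    if resp.status_code != 200:
        return False  # a real error code, not a soft 404
    # Strip scripts, styles, and tags very roughly, then count what's left.
    text = re.sub(r"<script.*?</script>|<style.*?</style>|<[^>]+>", " ",
                  resp.text, flags=re.S | re.I)
    visible_chars = len(" ".join(text.split()))
    return visible_chars < min_chars

print(looks_like_soft_404("https://example.com/maybe-empty/"))  # placeholder URL
```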
Similar to above, if you are submitting a URL (via your sitemap XML file) and it returns a 404 error, you need to resolve it. Generally speaking, you’ll either fix the page (if the 404 is in error and the content should be there), or 301 redirect the URL to the closest functional page on your site. Don't forget to fix any broken links on your website!
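To catch these before Google does, you can periodically check every URL in your sitemap for a non-200 response. The sketch below assumes the requests library, a standard single urlset sitemap at a placeholder address (not a sitemap index), and a server that answers HEAD requests normally.

```python
# A sketch (assuming the requests library and a standard XML sitemap at a
# placeholder URL) that reports non-200 status codes for submitted URLs.
import xml.etree.ElementTree as ET
import requests

SITEMAP_URL = "https://example.com/sitemap.xml"  # swap in your own sitemap
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

root = ET.fromstring(requests.get(SITEMAP_URL, timeout=10).content)
for loc in root.findall(".//sm:loc", NS):
    url = loc.text.strip()
    status = requests.head(url, timeout=10, allow_redirects=False).status_code
    if status != 200:
        print(f"{status}  {url}")
```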
Sometimes Google notifies you about crawl errors that don't fall into any of the other buckets. These can be bugs on Google’s end (they will clear themselves out eventually), or there’s some other error. In this scenario, you should debug your page using the URL Inspection Tool.
If you still see issues, you can refer to our guide to indexing and crawling for a thorough explanation.
Excluded URLs are the ones Google believes have been intentionally left out of indexing. However, that’s not always the case! That’s why it’s important to review them periodically for issues.
From Google:
“These pages are typically not indexed, and we think that is appropriate. These pages are either duplicates of indexed pages, or blocked from indexing by some mechanism on your site, or otherwise not indexed for a reason that we think is not an error.”
Googlebot encountered a 'noindex' directive, or a disallow in the robots.txt file, and didn’t index the page. If this is intentional, perfect. If not, remove the noindex tag (or resolve the robots.txt issue) and simply add the URL to your sitemap.
This states that the URL is currently blocked by a URL removal request. The URL removal tool only suspends a page from indexing for a period of about 90 days. After that, Google may index the page again, so if you need a permanent option, block, noindex, or remove the page.
Google encountered a bug. That may have been a temporary bug on your side, OR on Google’s side, but the result is that the page content couldn’t be retrieved by Google.
Assuming these are valid pages and you want them indexed, use the URL Inspection Tool to identify any crawl errors. If none come up, resubmit the URL for Google’s consideration - and monitor to see if it works. If errors do come up, work to resolve them.
If they are not real or valid pages, you can ignore them (or redirect them, if appropriate.)
Much like the Submitted URL not found (404) above, this is a list of 404s on your site. In this case, you’ve not submitted them to Google, but nevertheless, Google is aware of them. You should take steps to correct these (generally via 301 redirects, but issuing a 410 instead is also a valid step.)
Review this list of URLs to ensure no important pages are listed (work to improve & index them if they are valid.) If many pages are listed, it could be a sign of quality issues on your site as a whole.
There are several cases where Google doesn’t index a specific URL due to canonicals.
The complete list of the reasons that create the excluded list of URLs can be found here.
Valid with warnings are the URLs that have some issues that aren’t preventing indexation. However, these “warnings” can limit ranking potential, and are therefore worthy of review.
This is similar to the index coverage issue resulting from a robots.txt directive; however, these pages are indexed. If the page should not be indexed and is intentionally blocked, use the “Remove URL” tool in GSC to get it done quickly. If you’ve made a mistake and the page should be indexed, then refer to the robots.txt tool once again.
Valid URLs are generally okay - but you may want to check and ensure the URLs under this group are good & valid URLs.
The list of URLs where theoretically, everything went according to plan. You wanted the page indexed, so you submitted it, and it worked.
In this case, the URL was discovered elsewhere by Google and indexed. If any important pages are on this list, consider doing the work to include them in the XML sitemap file. If URLs you don’t want indexed are on this list, you should work to deindex them.
You’ll get automatic emails notifying you about issues in 3 other primary cases:
In each case, the process is the same: verify whether the issue is real. If it is, work with your engineering team to fix it (for these issues, it’s almost always your design or development team that has to help), and start the Validation process to clear them out once you confirm the fix (good SEO QA processes can help here!)
We listed the most common possible actions to be taken to fix the specific GSC errors above on your website. Now let’s take a look at a few more options to fix URL errors.
If you’ve determined that the issue is a false positive, or the issue was real but is now resolved, you can use the default GSC Validate Fix process to prompt Google to recrawl the URLs in question and remove the error once they confirm the fix.
To do so, simply navigate to the correct error report, then click the Validate Fix button:
This is perfect for doing this work in bulk; if you want a specific URL reviewed, use the URL Inspection Tool fix process - as outlined below - instead.
Identifying & fixing URL errors is quite easy via the URL Inspection Tool (which replaces the retired "Fetch & Render" tool). Once you identify a submitted URL with an index issue, you can check the index status of the page and troubleshoot errors. Just enter the URL in question here (the bar is available at the top of any page in GSC) and hit enter:
This will generate a (cached) report with specifics about any issues Googlebot has encountered. If you believe the information is old or incorrect, click the “Test Live URL” button at the top right of the page - this will generate a new, non-cached version.
Once you complete troubleshooting and fix the index error, you should re-submit the page(s) you fixed. You can do this by clicking “Request Indexing” (or reindexing) on the inspection report.
NOTE: This is also how you can quickly submit new content or new pages to Google, to get them indexed more quickly!
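If you have a lot of URLs to check, the same verdicts are available programmatically through the Search Console URL Inspection API. Below is a hedged sketch that assumes the google-api-python-client library, service-account credentials with access to your property, and placeholder file names and URLs. Note that “Request Indexing” itself isn’t exposed through this API, so resubmission still happens in the GSC interface.

```python
# A sketch of the Search Console URL Inspection API via google-api-python-client.
# Assumes a service account with access to the GSC property; file/URLs are placeholders.
from google.oauth2 import service_account
from googleapiclient.discovery import build

creds = service_account.Credentials.from_service_account_file(
    "service-account.json",  # placeholder credentials file
    scopes=["https://www.googleapis.com/auth/webmasters.readonly"],
)
service = build("searchconsole", "v1", credentials=creds)

result = service.urlInspection().index().inspect(
    body={
        "inspectionUrl": "https://example.com/some-page/",  # placeholder URL
        # For a Domain property, use the "sc-domain:example.com" form instead.
        "siteUrl": "https://example.com/",
    }
).execute()

print(result["inspectionResult"]["indexStatusResult"]["verdict"])
```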
What a list, huh! Websites - and search engines - sure can be tricky.
Fortunately, the GSC Index Coverage Report gives you great insights that help you identify & resolve problems - so it gives us a pretty neat starting point.
If you ever have a problem you can’t resolve, don’t stress! Other experts have been through it all. If you need a hand fixing those URL errors, reach out!