Google Search Console: The ultimate guide to understanding it and optimizing your SEO with it!


If I work on my SEO, I want to know whether it produces results, so I might as well understand right away how to measure those results using the tools Google provides! Google Search Console is an essential tool, complementary to Google Analytics, which we will discover without further delay!

Update: Google recently released a new version of Google Search Console. And, great news, our guide is fully up to date! 🙂


How do I create a Google Search Console account?

Go to Google Search Console and sign in with your Google account. Enter the URL of your site in the corresponding field, then validate by clicking “Add property”.

There are several methods to validate ownership of your site:

The 1st method, displayed by default by Google, is the HTML file method (which requires access to your site's files on the server).

To do this:

  1. Download the HTML verification file provided by Google (its content is shown just below).
  2. Upload the file to the root of your site.
  3. Check that the transfer was successful. If so, the uploaded validation file should display like any other page of your site in your browser, at a URL made up of your domain name followed by the name of the validation file. For example: http://www.my-site.com/googleff2a43992774ce75.html
  4. Return to the Google Search Console interface and validate.
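For reference, this validation file is a simple text file. Its entire content normally looks like the following line (shown here with the example file name above; your own token will differ):

    google-site-verification: googleff2a43992774ce75.html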

Warning: Do not delete the validation file at the root of your domain, or you may lose access to the data in Google Search Console.

The 2nd method consists of modifying the DNS settings directly in the account you have with your domain provider (a simpler method).

Select “Domain Name Provider”, shown at the very bottom of the property validation form.

You will be redirected to another page.

Choose your domain provider and then follow the steps given by your provider.
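To give you an idea of what your provider will ask for: domain validation generally relies on a DNS TXT record carrying the token Google gives you. A minimal sketch in zone-file notation (domain and token hypothetical; most provider dashboards simply ask for a host of “@” and the value in quotes):

    my-site.com.  3600  IN  TXT  "google-site-verification=YOUR-VERIFICATION-TOKEN"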

Warning: If you have several sites, you must repeat the property creation and validation steps for each of them.

Don't forget to register both the secure (https) and non-secure (http) versions of your site, as well as your subdomains.

Example:

  • http://my-site.com
  • https://my-site.com
  • https://blog.my-site.com
  • http://blog.my-site.com

Link Google Analytics & Google Search Console

Open your Google Analytics and head to the settings.

Then click “Property Settings”, scroll down to the Search Console section, and click “Adjust Search Console” → Add → Save.

And there you go, your Google Search Console account is now configured; you can start optimizing your site! 🙂

How to monitor your SEO performance?

To do this, go to the “Performance” report, where you can follow all of your traffic in terms of clicks, impressions, CTR (click-through rate) and average position.

  • The Clicks filter shows you how many clicks you got in search results.
  • The Impressions filter shows you the number of times users may have seen a link to your site on Google's results pages. (Google counts these impressions based on various factors, and it may not be useful to know them all.)
  • The CTR (Click Through Rate) is the percentage of your links' impressions that turned into clicks, i.e. clicks divided by impressions: for example, 5 clicks for 100 impressions is a 5% CTR. Just click on “CTR”, and presto, you'll find out!
  • The average position gives you access to all of your positions on the keywords that interest you, as well as your overall average position. This is another metric to watch, since the closer you are to the 1st position of the 1st page (the dream!), the more traffic you will receive.

Are you abnormally badly positioned on certain topics? Do your positions tend to go up or down? Just click on “Position”, and presto, you'll find out!

Below this graph, you can filter the data to refine your analysis by:

  • Queries, which shows you the list of keywords users typed that led them to your site.
  • Pages, which shows you the list of the best pages of your site that appeared in search results.
  • Countries, which shows you the list of countries the searches came from, sorted by volume.
  • Devices, which shows you which devices were used to search.
  • Search appearance, which shows you your rich results.

Key takeaway: when you filter by pages, you get the list of the best pages displayed in the SERP. When you click on a page, the report automatically takes into account only the data coming from that page.

If you then click on the “Queries” option, the report presents all the keywords the page in question is positioned on, sorted by volume.

This report also lets you filter your data by date, with up to 16 months of history in this new version of Google Search Console.

Use the performance report to optimize your SEO

Boost your CTR

In this scenario, we will focus on the pages with a low CTR but which rank, i.e. which are positioned on the first Google page.

These are keywords that can get you traffic!

To do this, we will focus on the keywords that are positioned between 1 and 5 but have a CTR of less than 4.88%, because according to Advanced Web Ranking, position 5 has a CTR of around 4.88%.


To do this, in “Queries”, enable the “Position” metric and filter on “Smaller than 5”:

Make sure “Position” is still selected, then select “CTR” and filter on “Smaller than 4.88” (use a period, not a comma).

You will thus have a list of keywords with potential!

Choose keywords that have a high number of impressions but a low CTR, as in this example:

Click on this keyword to find the corresponding page.

Then take a look at the keywords that rank on that page.

Nothing could be simpler: click “+ New” in the performance report and select “Page” to enter the URL of the page concerned.

You will surely see that the page has a high number of impressions but a low CTR, which confirms our suspicions!

Now that you’ve located the page, edit your title and meta description to increase your CTR.
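Both elements live in the <head> of the page. A minimal sketch with hypothetical content, to show what you would be editing:

    <head>
      <!-- The title displayed as the clickable link in Google's results (roughly 60 characters max) -->
      <title>Red Running Shoes – Free Delivery | My-Site</title>
      <!-- The meta description often used as the snippet (roughly 150-160 characters) -->
      <meta name="description" content="Discover our selection of red running shoes. Free delivery and free returns. Order today!">
    </head>

A more specific promise or a call to action in these two tags is often enough to lift the CTR without touching the ranking.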

After publishing your changes, monitor their impact using the date comparison feature of the report.

Find keywords

Still in the Performance tab, Google Search Console is also a good way to find new keywords to add to your SEO to-do list.

Using the performance report, in a few steps you will be able to find some nuggets!

It could not be easier:

  • Filter the dates to 28 days.
  • Adjust the query filters to have positions greater than 8.
  • Sort the keywords in ascending order of impressions.

These keywords, which generate impressions but have not yet reached the 1st page of Google (or are barely at the bottom of it), are nuggets.

Write them down and work on them to improve your positions, increase your click-through rate and, ultimately, generate sales.

How to deal with indexing errors with the Google Search Console?

In the “Index” section of the menu, the Coverage report shows you which pages have been indexed and the corrections to be made for those that could not be.

To do this, go to the “Index” section of the menu, then to “Coverage”.

The coverage report is as follows:

With 4 sections: Error, Valid with warnings, Valid and Excluded.

The “Valid” pages report

We recommend that you start with the “Valid” section, in green.

This section lists the indexed and valid URLs, that is to say those that present no error or anomaly for Google.


First, check the total number of URLs displayed at the top under “Valid” (here 2.3k). Does this correspond to the number of pages of your site that should be indexed? If not, keep this in mind: the missing URLs are most likely in one or more of the other 3 sections, which we analyze in the next 3 sections.

The graph also lets you see how indexing has progressed over time: does it seem consistent to you, or conversely lower or higher than the reality of your site's content creation? Again, keep that in mind for the future.

Present in the sitemap or not?

Further down, in the “Details” subsection, we see 2 lines: “Submitted and indexed” and “Indexed, not submitted in sitemap”.

If you maintain a sitemap that is up to date and contains all of your indexable content, all of your valid URLs should be in the “Submitted and indexed” line.

=> But there are sometimes inconsistencies to identify and correct.

To do this, we will take a closer look at the list of URLs concerned by clicking on the corresponding line.

List of URLs submitted and indexed

If the list contains URLs that should not be indexed: consult the declared sitemap(s) to identify any errors, then deindex these unwanted pages.

List of URLs indexed, but not submitted via a sitemap

You have URLs in this subsection because:

  1. the sitemap is not up to date, or it contains only your main URLs rather than all of them: don't panic, you just have to update your sitemap(s). This is, for example, the case in the screenshot above.
  2. you don’t have a sitemap because the size of your site does not justify its creation.
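If you do need to create or update one, a sitemap is just an XML file listing your indexable URLs. A minimal sketch (URLs hypothetical), to be declared in the “Sitemaps” section of the menu:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://my-site.com/</loc>
        <lastmod>2021-01-15</lastmod>
      </url>
      <url>
        <loc>https://my-site.com/blog/my-article</loc>
        <lastmod>2021-01-10</lastmod>
      </url>
    </urlset>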

The warning report

We now detail the section that we recommend you consult second: “Valid with warnings”, in orange.

The following report shows you the pages for which Google gives you a warning, in particular pages that are indexed even though they are blocked by the robots.txt file, as you can see below.

Explanation of the warning

robots.txt is a crawl-blocking tool, not a deindexing tool: it is possible for pages blocked by it to remain visible to Google if a third-party site links to them.

Interpretation and correction

Click on the “Warnings” section:

Then click on the corresponding line of the “Details” subsection to display the detailed list:

  • If these are pages that need to be indexed: remove them from robots.txt as soon as possible to allow indexing (see the sketch after this list).
  • Otherwise: remove these pages from robots.txt, deindex them properly (with a “noindex” directive), then put the block back in robots.txt.
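As a reminder of what is involved, here is a minimal robots.txt sketch (paths hypothetical). It blocks crawling of a directory, which, as explained above, does not by itself deindex the pages:

    # https://my-site.com/robots.txt
    User-agent: *
    # Blocked from crawling, but still indexable if another site links to it
    Disallow: /admin/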

The error report

We now detail the section that we recommend you consult third: “Error”, in red.
This is the section listing the URLs that Google has not indexed because they present errors. Unlike the “Excluded” section, these are URLs that you chose to send to Google via a sitemap, which is why it warns you through this error section.

In this example, we can see that there are 131 errors, caused by 2 distinct problems.

Click on the section in red “Error”:

In the table below, you will be able to identify the cause of the error in order to be able to correct it.

Then click on each error to see the list of URLs concerned.

Technical errors

Let us first list the errors of the “technical” type:

  • Server error (5xx): the server did not respond to an apparently valid request.
  • Redirect error: a 301/302 redirect does not work.
  • Submitted URL seems to be a “soft 404”: you submitted this page for indexing, but the server returned what appears to be a soft 404 error.
  • Submitted URL returns an unauthorized request (401): you submitted this page for indexing, but Google received a 401 (unauthorized access) response.
  • Submitted URL not found (404): you submitted a URL for indexing, but the URL does not exist.
  • Submitted URL has a crawl issue: you submitted this page for indexing, and Google detected an unspecified crawl error that does not match any of the other reasons.

For all these errors: correct the error if the page must be indexed; otherwise, remove it from the sitemap and from your internal linking.

Indexing errors

  • Submitted URL blocked by robots.txt: this page is both blocked by the robots.txt file and submitted via the XML sitemap. Remove it from robots.txt or from the XML sitemap, depending on whether you want to index it or not.
  • Submitted URL marked “noindex”: you submitted this page for indexing, but it contains a “noindex” directive in a meta tag or HTTP header. If you want this page to be indexed, remove this directive (see the sketch below); otherwise, remove the page from the sitemap.
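For reference, the “noindex” directive can take either of these two forms (a sketch; the header variant is configured server-side):

    <!-- Option 1: meta tag placed in the <head> of the page -->
    <meta name="robots" content="noindex">

    <!-- Option 2: the equivalent HTTP response header sent by the server -->
    <!-- X-Robots-Tag: noindex -->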

You can now correct your pages so that there are no more errors and that they can be indexed without problems!

The “Excluded” pages report

We now detail the section that we recommend you consult last: “Excluded”, in gray.
This is the section listing the URLs that Google did not index because it considers this to be deliberate on your part. Unlike the “Error” section, these are URLs that you did not choose to send to Google via a sitemap, which is why it does not treat them as errors.

Click on the gray section “Excluded”:

These pages are not indexed, and Google assumes that this is intentional on your part.

So it is worth checking the reason each page is not indexed, because you don't want Google to leave out a page that you would like to see indexed!

Let us first list the pages excluded for technical reasons:

Technical causes

Blocked due to unauthorized request (401): an authorization request (401 response) prevents Googlebot from accessing this page. If you want Googlebot to be able to crawl it, remove the authorization requirement or allow Googlebot to access your page.

Not found (404): this page returned a 404 error when requested. Google detected this URL without an explicit request or sitemap, perhaps via a link from another site, or the page may have been deleted. Googlebot will likely continue trying to access this URL for some time; there is no way to tell it to permanently forget a URL, but it will crawl it less and less often. 404 responses aren't a problem if they're intentional: just don't link to them. If your page has moved, use a 301 redirect to the new location (see the sketch below).
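As an illustration, assuming an Apache server, such a 301 redirect can be declared in the .htaccess file like this (paths hypothetical; other servers have their own equivalent):

    # .htaccess: permanently redirect the old URL to its new location
    Redirect 301 /old-page https://my-site.com/new-page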

Crawl anomaly: an unspecified anomaly occurred while crawling this URL. It can be caused by a 4xx- or 5xx-level response code. Try analyzing the page with the URL Inspection tool to check whether any problems prevent it from being crawled, then loop back with your technical team.

Soft 404: the page request returns what appears to be a “soft 404” response, i.e. the page tells the user it cannot be found without returning the corresponding 404 response code. We recommend that you either return a real 404 code for “not found” pages (to prevent indexing) and remove them from your internal linking, or add enough content to the page to show Google that it is not a soft 404.

Causes linked to a duplicate or a canonical

Alternate page with correct canonical tag: this page is a duplicate of a page that Google recognizes as canonical, and it correctly points to that canonical page. In theory there is no action to take with Google, but we recommend that you check why these 2 pages exist and are visible to Google, in order to make the appropriate corrections.

Duplicate page without user-selected canonical: this page has duplicates, none of which is marked as canonical, and Google thinks this page is not the canonical one. You should designate the canonical version of this page explicitly. Inspecting this URL should show you the canonical URL selected by Google.
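Designating the canonical version explicitly is done with a link tag in the <head> of each duplicate page. A minimal sketch with hypothetical URLs:

    <!-- On https://my-site.com/red-shoes?sort=price (the duplicate) -->
    <link rel="canonical" href="https://my-site.com/red-shoes">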

Duplicate page, Google did not choose the same canonical URL as the user: this page is marked as canonical, but Google thinks another URL would make a better canonical version and has therefore indexed that one instead. We recommend that you investigate the origin of the duplicate (perhaps you should use a 301 redirect rather than keeping the 2 pages), then add the canonical tags needed to be explicit with Google. This page was detected without an explicit crawl request; inspecting this URL should show you the canonical URL selected by Google.
If you get this message on 2 different pages, it means they are too similar and Google does not see the point of keeping both. Say you have a shoe store: if you have a “red shoes” page and a “black shoes” page with little or no content, or content so similar that barely the title changes, you have to ask yourself whether these pages should really exist, and if so, improve their content.

Duplicate page, submitted URL not selected as canonical: the URL is part of a set of duplicate URLs without an explicitly designated canonical page. You requested that this URL be indexed, but since it is a duplicate and Google thinks another URL makes a better canonical, Google indexed that one instead of the one you declared. The difference from the previous state is that here you explicitly requested indexing. Inspecting this URL should show you the canonical URL selected by Google.

Page with redirect: the URL is a redirect and was therefore not added to the index. There is nothing to do here, except check that the list is correct.

Page removed due to legal claim: the page has been removed from the index following a legal claim.

Causes related to indexing management

Blocked by a “noindex” tag: when Google tried to index the page, it encountered a “noindex” directive and therefore did not index it. If you don't want the page indexed, you have done it right. If you do want it indexed, remove the “noindex” directive.

Blocked by the page removal tool: the page is currently blocked by a URL removal request. If you are a verified site owner, you can use the URL removal tool to see who submitted this request. Removal requests are only valid for 90 days, after which Googlebot may crawl and index your page again, even without another indexing request. If you don't want the page to be indexed, use a “noindex” directive, require credentials to access the page, or delete it.

Blocked by robots.txt: the robots.txt file prevents Googlebot from accessing this page. You can check this with the robots.txt testing tool. Note that this does not mean the page will not be indexed by other means: if Google can find other information about this page without loading it, the page could still be indexed (although this is rarer). To make sure a page is not indexed by Google, remove the robots.txt block and use a “noindex” directive.

Crawled, currently not indexed: the page has been crawled by Google but not indexed. It may be indexed in the future; there is no need to resubmit this URL for crawling.
This happens quite often with paginated pages beyond the 1st page, because the engine does not see the point of indexing them in addition to the first.
It can also concern a large number of very similar or low-quality pages that Google sees no point in indexing. You should then ask yourself whether it would not be better to deindex them deliberately, unless you plan to work on them in the near future.

Discovered, currently not indexed: the page was found by Google but not yet crawled. Usually, this means that Google tried to crawl the URL but the site was overloaded, so Google had to postpone the crawl. That is why the last crawl date is missing from the report.
This happens quite often with paginated pages beyond the 1st page, because the engine does not see the point of crawling them in addition to the first.
It is also worth investigating page depth: when you have many deep pages, it is hard for the robot to crawl your site well, so it decides to skip an “uninteresting” part of the site. This problem must be corrected as soon as possible, because it can affect the overall crawlability of the site and therefore other pages that are crucial for your SEO.

You know all about the Search Console Excluded URLs report!
A Smartkeyword consultant can help you audit your index coverage, don’t hesitate to contact us!

The URL Inspection Tool

The inspection tool gives you information about a specific page: Accelerated Mobile Pages (AMP) errors, structured data errors, and indexing issues.

To access this report, insert the URL you want to inspect in the search bar.

With this report you can also:

  • Inspect an indexed URL: to collect information about the version of your page indexed by Google.
  • Inspect a live URL: to determine whether a page of your site can be indexed, by clicking the “Test live URL” button at the top right.
  • Request indexing: to ask Google to crawl an inspected URL, by clicking “Request indexing” in the first section of the report.

Manage links using the Google Search Console

Manage your netlinking: Which sites are linking to your site?

To do this, go to the menu in the “Links” tab.

Information on netlinking can be found in the left column called “external links”.

You will find 3 reports:

  • Top linked pages, which lists the pages of your site that external links point to.
  • Top linking sites, which lists the external sites that link to your site.
  • Top linking text, which presents the anchor texts used in these external links.

Will I see all of my links?

Not all links to your site are necessarily listed. This is normal, don’t worry! Here are the reasons that may explain it:

  • Problem with robots.txt: the data reflects content detected and crawled by Googlebot. If a page of your site is blocked by a robots.txt file, links pointing to that page are not listed. The total number of such pages is available in the “Blocked URLs (robots.txt)” tab of the Crawl section in Search Console.
  • Problem with your 404 pages: if a broken or incorrect link points to your site, it is not listed in this section. We advise you to regularly check the Crawl errors page for 404 errors detected by Googlebot while crawling your site; fixing them ensures that when an external site links to you, you benefit from the popularity it transmits!
  • Google hasn't crawled it yet! Yes, Google may not yet have analyzed the page on the external site that links to you, in which case it cannot see the link! Patience, it will come quickly!
  • Your site may be indexed under https or under another version (with or without www). For example, if you don't see the expected link data for http://www.example.com, make sure you have also added an http://example.com property to your account, then check the data for that property.

💡 Pro tip: to limit this, set a preferred domain.

Manage your internal links

The number of internal links that redirect to a page allows search engines to determine the importance of that page. If an important page is missing from the list or if a less important page has a lot of internal links, it may be useful to review the internal structure of your site.

All this information is all well and good, but how do you use it?

  • Highlight a certain page

If you want to make sure that a page of your site is well linked internally (that it receives multiple internal links from other pages of the site), you can check it through this menu, it's beautiful!

  • Delete or rename pages

If you want to delete or rename pages on your site, check this information first to identify any broken links and avoid this type of problem.

  • If no information is displayed in this report …

This might mean that your site is new and has not been crawled yet. If not, check the Crawl Errors report to see if Google has had any issues crawling your site.

How to manage the mobile usability of my site?

Global web traffic from mobile devices is growing. In addition, recent studies show that mobile users are more likely to return to mobile-friendly sites. Google helps you with this!

To do this, go to the “Enhancements” section of the menu, then “Mobile Usability”.

In the graph, find the mobile usability problems detected over time on your site.

Here are the different problems you may have:

  • Uses incompatible plugins
  • Viewport not set
  • Viewport not set to “device-width”, i.e. the window cannot adapt to different screen sizes (see the fix below)
  • Text too small to read
  • Clickable elements too close together
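For the two viewport problems, the fix is usually a single meta tag in the <head> of your pages (the standard value shown below):

    <!-- Tells mobile browsers to adapt the layout to the screen width -->
    <meta name="viewport" content="width=device-width, initial-scale=1">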

Go further with the Google Search Console

As the old Google Search Console is still available, you can go even further in optimizing your website with the following reports:

HTML Improvements

This report presents difficulties Google may have encountered while crawling and indexing your site. Check it regularly to identify changes that could improve your ranking on Google's results pages while offering a better experience to your visitors. For example, below we can see that there are duplicate meta descriptions.

Manual actions

This report, also available in the new Google Search Console, lists the manual actions applied to your site and provides information to help you resolve them. Here, Google has detected a problem.

International targeting

This report lets you manage one or more websites designed for several countries and in several languages; you should make sure that the version of your pages that appears in the search results is appropriate for both the country and the language.
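In practice, this targeting relies on hreflang annotations. A minimal sketch with hypothetical URLs: each version of the page lists all the versions, including itself, in its <head>:

    <link rel="alternate" hreflang="en" href="https://my-site.com/en/">
    <link rel="alternate" hreflang="fr" href="https://my-site.com/fr/">
    <link rel="alternate" hreflang="x-default" href="https://my-site.com/">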

Structured data

This report, which you can find under “Search Appearance > Structured Data”, gives you the list of URLs on your site that contain a structured data element. It is now possible to click on each URL to find out whether Google has correctly taken your markup into account (fantastic, no? 🙂).
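To make the connection with what Google reads, here is a minimal sketch of a structured data element in JSON-LD (values hypothetical) as it would appear in a page's HTML:

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "Article",
      "headline": "Google Search Console: The ultimate guide",
      "author": { "@type": "Person", "name": "Jane Doe" },
      "datePublished": "2021-01-15"
    }
    </script>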

Data Highlighter

“Search Appearance > Data Highlighter”: a tool that helps Google interpret the structured data on your website by letting you tag it directly in the interface.

URL parameters

“Crawl > URL Parameters”: use this report to tell Google what your site's parameters are used for and how to interpret them.

Conclusion

All these features explain why webmasters agree that Google Search Console is a very valuable complement to Google Analytics, with the dual objective of better understanding where your traffic comes from and working on it in a more targeted way.
