Google Search Console Tutorial Part 2: Checking Your Website’s Health

In this second part of our Google Search Console series, we dive into the platform, showing you the most valuable features of this tool and how you can identify health issues on your website.

How To Use Google Search Console

This is the second part of our Google Search Console series, where we dive into the platform, showing you the most valuable features of this tool and how you can identify health issues on your website. If you haven’t already read part 1 and still need to set up Google Search Console, you can find our comprehensive guide here.

Don’t overlook these important features…

Identify Top Search Queries: The “Search Analytics” section outlines the queries real users make and how they relate to your site. You will see which queries are earning impressions, which queries are getting the most clicks, and what your click-through rate (CTR) is.
You can also segment the data to identify the top-performing:

  • Pages
  • Countries
  • Devices
  • Search Types

By default, the “Search Analytics” report shows combined statistics for all searches. Filtering down into specific segments gives this report the power to provide deeper insights into your SEO progress.
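As a sketch of the kind of analysis this report enables, the snippet below ranks query data by CTR to surface queries that earn impressions but few clicks. The rows and numbers are invented for illustration; in practice you would export them from the “Search Analytics” report.

```python
# Hypothetical rows exported from the "Search Analytics" report:
# (query, clicks, impressions) -- illustrative numbers, not real data.
rows = [
    ("google search console tutorial", 42, 1200),
    ("check website health", 8, 900),
    ("seo crawl errors", 15, 300),
]

def ctr(clicks, impressions):
    """Click-through rate: clicks divided by impressions."""
    return clicks / impressions if impressions else 0.0

# Rank queries by CTR, best first.
ranked = sorted(rows, key=lambda r: ctr(r[1], r[2]), reverse=True)
for query, clicks, impressions in ranked:
    print(f"{query}: {ctr(clicks, impressions):.1%} CTR")
```

A query sitting near the bottom of this ranking despite high impressions is often a candidate for a better title or meta description.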

Check the incoming links to your site: The “Links to Your Site” section lists the links that Google discovered while crawling your website. It also provides an overview of which sites link to you most, your most linked-to content, and the most common anchor text. When reviewing this report, be on the lookout for any spammy links that could negatively impact your SEO.
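A first pass over the linking domains in this report can be automated with a simple keyword filter. The domain names and spam hints below are assumptions for illustration; tune the hints to the patterns you actually see in your own report.

```python
# Hypothetical domains taken from a "Links to Your Site" export.
linking_domains = [
    "example-blog.com",
    "cheap-links.biz",
    "news-site.org",
    "free-casino-links.info",
]

# Keywords that often show up in spammy link-network domains (assumed list).
SPAM_HINTS = ("cheap", "casino", "loan", "-links")

def looks_spammy(domain):
    """Flag a domain if it contains any spam-associated keyword."""
    return any(hint in domain for hint in SPAM_HINTS)

suspicious = sorted(d for d in linking_domains if looks_spammy(d))
print(suspicious)
```

Anything flagged this way still needs a manual look before you decide how to act on it.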

Review Possible Penalties: Google takes manual action if it believes a spammy technique is being used, demoting or even removing your site from search results. The “Manual Actions” section can help you identify any actions against your site. If a manual webspam penalty is in place, you will find links to the steps you can take to address the problem. There are two types of penalty: site-wide and partial (which affect only individual URLs or sections of a site). Reasons Google might hit you with one of these penalties include unnatural links to or from your site, user-generated spam, cloaking or sneaky redirects, hidden text, keyword stuffing, and spammy structured markup. If you are employing any of these unhealthy tactics, stop immediately, clean up your site, and check for penalties.

Analyse Crawl Errors: The “Crawl Errors” section lists URLs that Google could not successfully crawl or that returned an HTTP error code. These errors can occur site-wide or apply only to specific URLs. Their causes range from DNS (Domain Name System) errors to server errors or even a robots.txt issue. Checking that your robots.txt file is accurate and up to date for your specific needs can help minimise these issues. This is something you need to monitor consistently so you can address errors as they appear.
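One quick sanity check on your robots.txt is to parse it and confirm that Googlebot can reach the pages you expect. Python’s standard-library `urllib.robotparser` can do this offline; the robots.txt content and URLs below are assumed for illustration.

```python
from urllib.robotparser import RobotFileParser

# A minimal robots.txt, assumed for illustration: block a private area
# while leaving the rest of the site crawlable.
robots_txt = """\
User-agent: *
Disallow: /private/
Allow: /
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# Public pages should be fetchable; the blocked section should not be.
print(rp.can_fetch("Googlebot", "https://example.com/blog/post"))  # True
print(rp.can_fetch("Googlebot", "https://example.com/private/x"))  # False
```

Running a check like this after every robots.txt change helps catch rules that accidentally block content you want indexed.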

You will also need to check for 404 errors. These error pages commonly occur when a page is deleted or renamed without redirecting the old URL to the new page, or sometimes because of a typo in a link. Error pages happen, but they should be cleaned up properly to ensure not only a good user experience but also a healthier site.
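The usual clean-up is a 301 (permanent) redirect from each dead URL to its replacement. The sketch below models that idea as a simple lookup table; the paths are invented examples, and on a real site the mapping would live in your server or CMS redirect configuration.

```python
# Hypothetical mapping of deleted/renamed URLs to their replacements.
# Serving a 301 from each old path prevents 404s for visitors and crawlers.
redirects = {
    "/old-blog-post": "/blog/new-post",
    "/servces": "/services",  # a typo'd inbound link spotted in crawl errors
}

def resolve(path):
    """Return the (status, location) a server might send for this path."""
    if path in redirects:
        return 301, redirects[path]
    return 200, path

print(resolve("/old-blog-post"))  # (301, '/blog/new-post')
print(resolve("/about"))          # (200, '/about')
```

The design choice worth noting is the 301 status: it tells Google the move is permanent, so the old URL's signals are consolidated onto the new one.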

Fetch As Googlebot: The “Fetch as Googlebot” function is an incredibly valuable tool within Search Console. It allows you to simulate how Google crawls and renders a URL on your site.

You can accomplish two different goals with this tool. The “Fetch” function shows how Google communicates with your site, i.e. what the HTTP response is during the crawling process. Alternatively, “Fetch and Render” displays how your page looks to the Googlebot, which is very useful for simulating different device types. This tool can also be used to identify hidden spam pages on your site that are shown only to Googlebot, which would negatively impact your SEO. Once you have updated a page, you can use this section of Google Search Console to request recrawling of a specific URL or of your entire website. The only caveat is a weekly quota of 500 fetches, so use them wisely during analysis!
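The cloaking check above boils down to comparing what Googlebot sees with what a normal browser sees. The sketch below does that comparison on two hypothetical HTML snapshots of the same URL; the markup and the crude tag-stripping regex are assumptions for illustration, not a robust HTML parser.

```python
import re

# Hypothetical snapshots of the same URL: one fetched with a normal
# browser User-Agent, one as shown by "Fetch and Render" for Googlebot.
browser_html = "<html><body><h1>Our Services</h1></body></html>"
googlebot_html = (
    "<html><body><h1>Our Services</h1>"
    "<p>cheap pills casino</p></body></html>"
)

def words(html):
    """Crude word set from markup-stripped HTML (sketch only)."""
    return set(re.sub(r"<[^>]+>", " ", html).lower().split())

# Text served only to the crawler is a strong hint of cloaked spam.
only_for_googlebot = words(googlebot_html) - words(browser_html)
print(sorted(only_for_googlebot))
```

A non-empty difference does not prove cloaking on its own (legitimate pages vary by device), but it tells you exactly which text to investigate.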

Google Search Console is Incredibly Insightful – If you use it to its fullest potential!

This tool is an effective weapon for identifying and improving aspects of your website’s health. Through its numerous reports you can analyse search queries, keywords, link data, and site rendering. Google Search Console gives you the detailed data you need to maximise your SEO campaign and focus your resources. Its ability to track changes over time provides insight into the effectiveness of your efforts, as well as ways to further improve your ongoing strategies.
