Google Webmaster Tools Part 2: Checking your Website’s Health

Following up on our previous article on how to set up Google Webmaster Tools (GWT), here are insights into some of the tool’s most valuable features. We will look at how to use these features to get a clearer picture of a site’s health and how to troubleshoot common problems.

Don’t overlook these important features…

Identify Top Search Queries: The “Search Queries” page shows the Google searches that pointed to your site. It also shows your top pages (the pages on your site that appeared most often in search results).
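If you’d rather pull this query data programmatically, the same numbers are exposed through the Search Analytics endpoint of Google’s Webmasters API. Here is a minimal Python sketch, assuming you have already created OAuth credentials and verified the site (the dates and row limit are placeholders):

```python
# Minimal sketch: pull top search queries via the Webmasters API (v3).
# Assumes `credentials` is an authorized OAuth 2.0 credentials object.
from googleapiclient.discovery import build

def top_queries(credentials, site_url, start_date, end_date, limit=10):
    service = build('webmasters', 'v3', credentials=credentials)
    response = service.searchanalytics().query(
        siteUrl=site_url,
        body={
            'startDate': start_date,    # e.g. '2014-06-01'
            'endDate': end_date,        # e.g. '2014-06-30'
            'dimensions': ['query'],    # group results by search query
            'rowLimit': limit,
        },
    ).execute()
    for row in response.get('rows', []):
        print(row['keys'][0], row['clicks'], row['impressions'])
```

Switching the dimension to 'page' gives you the top-pages view instead.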

Check Incoming Links to the Site: The “Links to Your Site” page lists links that were discovered during the crawl of your site. This report also lists the most common link sources and the pages on your site with the most links. From a marketing perspective, review this report to find the most common anchor text yielding clicks. Be on the lookout for any spammy links that could have a negative SEO impact.
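GWT lets you download this report as a CSV, which makes it easy to scan for red flags in bulk. Here is a minimal Python sketch assuming a column of linking domains; the column name and keyword list are illustrative, not part of the actual export format:

```python
# Minimal sketch: flag suspicious domains in a "Links to Your Site" CSV export.
# The column name and SPAM_HINTS keywords are assumptions for illustration.
import csv

SPAM_HINTS = ('casino', 'pills', 'loans')  # hypothetical red-flag keywords

def flag_suspicious_links(csv_path, domain_column='Domains'):
    with open(csv_path, newline='') as f:
        for row in csv.DictReader(f):
            domain = row[domain_column].strip().lower()
            if any(hint in domain for hint in SPAM_HINTS):
                print('Review:', domain)
```

A keyword scan like this only surfaces candidates; every flagged domain still needs a human look before you act on it.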

Review Possible Penalties: Google will take manual action if it believes a spammy technique is being used, demoting or even removing your site from search results. Use the “Manual Actions” page to find a list of any actions against your site, along with links to steps you can take to address the problems. There are two types of penalties: site-wide and partial (which impact only individual URLs or sections of a site). Some of the reasons Google might hit you with one of these penalties include unnatural links to or from your site, user-generated spam, cloaking or sneaky redirects, hidden text, keyword stuffing, and spammy structured markup. If you are employing any of these unhealthy tactics, stop immediately, clean up your site, and check for possible penalties.
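If the action concerns unnatural links pointing at your site and you can’t get them taken down, Google’s disavow links tool accepts a plain-text file listing the URLs and domains you want ignored. A minimal illustration, with placeholder domains:

```
# Contacted owner of spamdomain1.example on 6/1/2014 to request removal; no reply
domain:spamdomain1.example
http://spamdomain2.example/spammy-page.html
```

Use the disavow tool cautiously; it is meant as a last resort after removal requests have failed.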

Analyze Crawl Errors: The “Crawl Errors” page lists the URLs that Google could not successfully crawl or that returned an HTTP error code. These can be site-wide errors or errors for specific URLs. Site errors range from DNS (Domain Name System) errors to server errors or even a robots.txt file issue. Keeping your robots.txt file carefully updated is a good start, but checking this page frequently will ensure you don’t sit on errors. Also, don’t forget to check your 404s. These errors are commonly caused when a page is deleted or renamed without redirecting the old URL to the new page, or sometimes by a typo in a link. Error pages happen, but they should be properly cleaned up to ensure not only a good user experience but also a healthier site.
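Once you have fixed or redirected the broken URLs, it is worth re-checking them before marking the errors as resolved. Here is a minimal Python sketch, with placeholder URLs, that reports the current HTTP status of each:

```python
# Minimal sketch: re-check URLs from the Crawl Errors page to see which
# still return 404 and which now redirect or resolve correctly.
import requests

def recheck(urls):
    for url in urls:
        try:
            # HEAD keeps it lightweight; switch to GET if a server
            # mishandles HEAD requests.
            status = requests.head(url, allow_redirects=True, timeout=10).status_code
        except requests.RequestException as exc:
            status = 'error: {}'.format(exc)
        print(status, url)

recheck([
    'http://www.example.com/old-page/',      # placeholder URLs
    'http://www.example.com/renamed-post/',
])
```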

Fetch As Googlebot: Fetch as Googlebot is an incredibly valuable tool within GWT that lets you simulate how Google crawls and renders a URL on your site. You can accomplish two different goals with this tool. On the Fetch Details page you can click “Fetching” to see how Google communicates with your site, i.e. the HTTP response returned during the crawling process. Or you can view “Rendering” to see how your site looks to the Googlebot. This is very useful for checking how your pages render for Google’s different crawler types (desktop and mobile). This tool can also be used to expose hidden spam pages shown only to Googlebot on your site. And once you’ve updated a page, you can ask Google to recrawl it. The only caveat is a weekly quota of 500 fetches, so use them wisely during analysis!
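As a rough first pass before spending fetch quota, you can approximate the cloaking check yourself by requesting a page with a Googlebot user-agent and a browser user-agent and comparing the responses. A minimal Python sketch follows; note that sites can also cloak by IP address, which only the real Fetch as Googlebot can reveal:

```python
# Minimal sketch: a rough cloaking check. Fetch a page twice, once with
# Googlebot's user-agent string and once with a browser's, then compare.
# This cannot catch IP-based cloaking, so it only supplements Fetch as Googlebot.
import requests

GOOGLEBOT_UA = ('Mozilla/5.0 (compatible; Googlebot/2.1; '
                '+http://www.google.com/bot.html)')
BROWSER_UA = 'Mozilla/5.0 (Windows NT 6.1; rv:31.0) Gecko/20100101 Firefox/31.0'

def compare(url):
    as_bot = requests.get(url, headers={'User-Agent': GOOGLEBOT_UA}, timeout=10)
    as_user = requests.get(url, headers={'User-Agent': BROWSER_UA}, timeout=10)
    if as_bot.text != as_user.text:
        print('Responses differ -- possible cloaking:', url)
    else:
        print('Responses match:', url)
```

Dynamic pages often differ between any two requests (timestamps, rotating ad markup), so treat a mismatch as a prompt for manual review, not proof of cloaking.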

GWT is Incredibly Insightful – If you use it to its fullest potential!

Using GWT will give you a clearer view of the health of your site. Through its numerous tools you can analyze search queries, keywords, link data, and how your pages render. Google Webmaster Tools can give you the detailed data you need to maximize your SEO campaign and focus your resources. Its ability to track changes over time provides insight into the effectiveness of your efforts, as well as ways to further improve your ongoing strategies.