In SEO, nothing is set and forget. To achieve the most, you need to ensure ongoing maintenance of your website. Identifying the technical issues on your site and fixing them can help recover lost organic traffic.
There are some relatively simple technical site errors that can be identified and addressed in a small amount of time, with real potential for traffic gains. We are going to take a look at some common technical SEO issues that can impact your organic visibility and how you can fix them.
1. Not being indexed
Limited or no visibility can be broken down into two categories: either your site is not being indexed, or it is not being ranked. The first thing we need to examine is whether your site is being indexed.
This is something an SEO expert would pick up quickly, and it is an important element to review. There are a few ways of doing this; the simplest is a Google site: search.
Type “site:YourWebsite.com” into Google and in less than a second you will see how many pages on your site are being indexed.
There are a few things you will want to check here:
- Are we seeing the number of pages we would expect to see here?
- Are there any indications that the site has been hacked? Look for inclusions of things like pharmaceutical or gambling spam.
- Dig deep into the site and ensure subdomains are being indexed.
- Are there pages being indexed that shouldn’t be?
This will help you gain an understanding of the issue you are facing, and from there you can dig deeper into the reasons why it is happening, such as errors within your robots.txt file.
2. Robots.txt errors
Small errors in this file, seemingly innocent to the untrained eye, are a very common reason sites suffer traffic loss.
When you redevelop or migrate a website without the guidance of an SEO expert, it is common to overlook updating this file. It is not something your developer will always plan for, so it’s important to crosscheck.
One of the first things you will need to investigate is the Disallow line. More often than not, a rogue forward slash is the culprit.
By simply having a “/” in your Disallow line, you are telling web crawlers to avoid crawling and indexing every URL path starting with “/”, which is every page on your site. You can see how this would be a problem if your goal is to be visible. Contact your developer as soon as possible if you see a lone “/” in the Disallow line.
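To illustrate the difference, here is a hypothetical robots.txt showing the rogue slash alongside a safer alternative (the “/private/” path is just an example of a directory you might deliberately block):

```
# BAD: a lone slash blocks crawlers from the entire site
User-agent: *
Disallow: /

# GOOD: blocks only one directory and leaves the rest crawlable
User-agent: *
Disallow: /private/
```

Note that an empty Disallow line (with nothing after the colon) blocks nothing at all, which is very different from the lone slash above.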
Many large or complicated sites such as ecommerce stores may also have a complicated robots.txt file. In this instance you should review this file line by line to ensure that it is functioning correctly. During this process it would be best to have your developer close at hand so that you can efficiently identify and correct any entries needing attention.
3. Broken links
When relaunching or redeveloping a website without the help of an SEO expert, you are likely to end up with many broken backlinks pointing at your site from other websites. A good quality backlink profile is important when it comes to keeping your site ranking, so you need to make sure to address this.
Search Console provides a list of known 404 errors and also highlights the broken links pointing to these pages. From here you can redirect to the most appropriate pages.
If you have not yet set up Google Search Console, you can find our tutorial here.
4. 301 & 302 redirects
301 and 302 redirects are a great tool to address dead pages on your website. Though to be most effective, you need to know which type of redirect to use and when.
To break it down, 301 redirects are permanent redirects while 302 redirects are temporary. It is better to use 301 redirects when permanently redirecting a page so you don’t lose any link equity.
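Assuming an Apache server, the two redirect types can be declared in your .htaccess file like this (the paths are placeholders; nginx and other servers have equivalent directives):

```
# Permanent move: passes link equity on to the new URL
Redirect 301 /old-page /new-page

# Temporary move: search engines keep the original URL indexed
Redirect 302 /summer-sale /
```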
Redirects can be really beneficial for a site when used properly. As you can imagine, though, when they are used improperly they can create a mess that is not only time-consuming to fix but can also be costly. Some key points to keep in mind:
- Don’t implement any new redirects that conflict with those already in place. You will create a redirect loop, which will prevent the user from reaching the site, and that is not what we want!
- Don’t bulk redirect a whole site to your new homepage. Instead, identify which pages are most appropriate for the user to land on.
- Manually review that all of your current redirects are behaving as they should.
- Monitor Search Console and review the 404s for accuracy and authenticity. Be aware that spammers can create false 404 errors, so be vigilant when dissecting information from Search Console.
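The first two points above can be sanity-checked before anything goes live. As a minimal sketch, the following checks a hypothetical redirect map (old path to new path) for loops; the example paths are made up:

```python
def find_redirect_loops(redirects):
    """Return the set of starting paths whose redirect chain never resolves.

    `redirects` maps a source path to its destination path; any path
    absent from the map is treated as a final destination.
    """
    looping = set()
    for start in redirects:
        seen = set()
        current = start
        while current in redirects:
            if current in seen:        # revisited a path: this chain loops
                looping.add(start)
                break
            seen.add(current)
            current = redirects[current]
    return looping

# Hypothetical redirect map containing one conflicting pair
redirects = {
    "/old-about": "/about",
    "/about": "/old-about",   # conflicts with the line above -> loop
    "/old-blog": "/blog",
}
print(sorted(find_redirect_loops(redirects)))  # ['/about', '/old-about']
```

Running a check like this against your full redirect map each time a new rule is added makes conflicting redirects much easier to catch.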
5. XML Sitemap
An XML sitemap file lists the URLs on your site that are available for crawling. This helps search engines like Google better understand your website.
If you have a large or complicated site, you can use this protocol to give guidance to the search engine crawlers. You can also include metadata about the web pages such as when a page was last updated and its importance in comparison to other pages on your site.
Below is an example of an XML sitemap file
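A minimal sitemap listing a single URL might look like this (the example.com address, date, and priority value are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2023-01-01</lastmod>
    <priority>1.0</priority>
  </url>
</urlset>
```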
Search Console help states: “If your site’s pages are properly linked, our web crawlers can usually discover most of your site. Even so, a sitemap can improve the crawling of your site.”
Some of the most common problems associated with XML sitemaps include:
- No sitemap at all
- Not including the sitemap within your robots.txt
- Having an old and inaccurate sitemap file
- Having more than one version of your sitemap
- Omitting a sitemap index for large websites
- Neglecting to add an up to date sitemap to Google Search Console
The first step in identifying whether you have sitemap errors is to use the above checklist to crosscheck whether you are violating any of these guidelines. In Search Console you will be able to check the number of URLs that have been submitted and compare this to how many have been indexed. If you identify inconsistencies, you may need to have your current sitemap reviewed and refreshed.
Although you can enlist the help of your developer to address some of these issues once they are identified, it’s important to remember that unless you have a dedicated SEO professional monitoring your site, you are likely losing visibility, traffic and, worst of all, revenue!
While these items are relatively simple to correct, it’s essential to highlight the value of having experts monitor and maintain your site. They can efficiently identify all of the components that add up to the big picture of where you stand on your journey to the best online visibility possible.