Website Checklist: Technical SEO Audit

Discover how to run a technical review of your SEO and improve your website’s performance

The success of your website’s SEO depends on far more than a few well-placed keywords.

And while on- and off-page SEO is crucial to any effective digital strategy, some behind-the-scenes work is also necessary to achieve great results. 

We call this area technical SEO. It’s all the stuff your visitors never see, but are glad you’ve got in place. It covers everything from controlling how Google indexes your site to ensuring your website is performing at its optimal speed.

This 10-point checklist covers the core concerns when running a technical SEO audit and offers insights into improving your SEO performance.

1. Refine your SEO objectives

As with any process, it’s sensible to begin by checking in with your overall goals.

• What is it that your digital strategy is aiming to do for your business?

• Who are you targeting and by what means?

While the answers will only have a minor bearing on the choices you’ll make during a technical SEO audit, having them in mind will enable some joined-up thinking about your SEO going forward.

2. Review your XML sitemap

Your XML sitemap is used by search engines looking to crawl your site; crawling is the process by which your webpages get discovered and indexed.

Having an incomplete or out-of-date XML sitemap impairs the search engine robots’ ability, or willingness, to pull the information from your website that you want them to. Clear guidance on what your XML sitemap should look like can be found here.

If your website is lacking an XML sitemap entirely, then we’d suggest getting one should be at the top of your priority list.

You can check your own sitemap by heading to your Google Search Console, where it should have been registered when first created. Under Crawl you’ll find Sitemaps, which lists all the sitemaps associated with your account.
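If you want a quick sanity check before opening Search Console, a short script can confirm that your sitemap exists, parses correctly and only lists live URLs. Here’s a minimal sketch in Python, assuming the requests library is installed and that your sitemap sits at the conventional /sitemap.xml path; the example.com domain is a placeholder for your own.

```python
import xml.etree.ElementTree as ET

import requests

SITE = "https://www.example.com"  # placeholder domain: swap in your own


def check_sitemap(site):
    # Fetch the sitemap from the conventional location.
    resp = requests.get(f"{site}/sitemap.xml", timeout=10)
    resp.raise_for_status()

    # Sitemap entries live in the standard sitemaps.org namespace.
    ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
    urls = [loc.text for loc in ET.fromstring(resp.content).findall(".//sm:loc", ns)]
    print(f"{len(urls)} URLs listed in the sitemap")

    # Flag anything that no longer returns a clean 200 response.
    for url in urls:
        status = requests.head(url, allow_redirects=True, timeout=10).status_code
        if status != 200:
            print(f"  {status}  {url}")


if __name__ == "__main__":
    check_sitemap(SITE)
```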

An added bonus of a refined XML sitemap is that it helps you make the most of your ‘crawl budget’. This adds a level of control by letting you direct how search engines ‘spend’ their limited crawling time on your website (more here).

3. Check how you’re being indexed

Beyond the XML sitemap, you can also control how your website is indexed via instructions left in your code for Google’s bots.

The primary instruction list is your robots.txt file. This is the first port of call for indexing robots when they arrive at your website.

By using a disallow function in your robots.txt file, you can close off access to certain pages and, therefore, create some control over which pages do get crawled. 

Be aware that this can count against you if you disallow access to important pages or important internal linking pathways through your website. So, take a close look in your Google Search Console (Crawl – Robots.txt Tester) and make sure the right paths and pages are indexable.
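You can also run the same kind of check from a script, since Python’s standard library ships a robots.txt parser. A minimal sketch; the domain and paths below are placeholders for pages you expect to be crawlable.

```python
from urllib.robotparser import RobotFileParser

SITE = "https://www.example.com"  # placeholder domain
IMPORTANT_PATHS = ["/", "/services/", "/blog/", "/contact/"]  # placeholder key pages

parser = RobotFileParser(f"{SITE}/robots.txt")
parser.read()  # fetches and parses the live robots.txt

for path in IMPORTANT_PATHS:
    # can_fetch() answers: would a crawler with this user agent be allowed in?
    allowed = parser.can_fetch("Googlebot", f"{SITE}{path}")
    print(f"{'OK     ' if allowed else 'BLOCKED'} {path}")
```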

We can also carry out a more refined review of how well your website is being crawled by different search engines by identifying how many pages each one is indexing.

This allows us to identify whether a particular search engine is struggling to find its way around your website and take action to show it the way. Obviously, if a search engine can’t find the information it needs, your customers won’t be able to find you in relevant searches there either.

If you’re interested in getting this detailed information, talk to us directly.

4. Make sure everything you want indexed is indexable

Aside from checking that all your primary pages are crawlable in your robots.txt, there are several other ways to make the robots go where you want them to.

The noindex meta tag, which tells search engines not to index an individual page, and the X-Robots-Tag HTTP header, which can do the same for non-HTML files such as images and PDFs, are two other functions that influence indexing, and it’s worth digging down into such details.
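If you want to spot-check a handful of URLs yourself, a short script can report both signals at once. A minimal sketch in Python, assuming the requests library is installed; the URL is a placeholder and the regex is deliberately crude, so treat it as a first-pass audit aid rather than a definitive test.

```python
import re

import requests


def indexability(url):
    """Report the signals that could keep a URL out of the index."""
    resp = requests.get(url, timeout=10)

    # The X-Robots-Tag arrives as an HTTP response header and also covers
    # non-HTML files such as images and PDFs.
    header = resp.headers.get("X-Robots-Tag", "not set")

    # The robots meta tag only applies to HTML pages; a crude regex is
    # enough for a quick pass (it assumes name appears before content).
    match = re.search(
        r'<meta[^>]+name=["\']robots["\'][^>]+content=["\']([^"\']+)["\']',
        resp.text, re.IGNORECASE)
    meta = match.group(1) if match else "not set"

    blocked = "noindex" in f"{header} {meta}".lower()
    print(f"{url}\n  X-Robots-Tag: {header}\n  meta robots:  {meta}\n  indexable:    {not blocked}")


indexability("https://www.example.com/services/")  # placeholder URL
```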

Next, check for orphan pages.

Orphan pages are webpages that have no internal links directed at them. This means that a visitor can’t access them unless they go straight to the correct URL. Search engines are unlikely to index them (there is a suspicion they can even lead to your website being penalised) and so they are entirely useless for your SEO.

Better to fix them or get rid of them.

5. Make the most of every crawl

However often your website gets crawled, it’s important that you make the most of every opportunity.

Here’s a quick rundown of the ways you can make every robot visit worthwhile:

• Fix all broken links.

• Eliminate any keyword cannibalisation.

• Review your URL parameter handling so Google only crawls each page once.

• Reduce the length of any chains of 301 or 302 redirects (see the sketch after this list).

• Vary your metadata, especially descriptions.

• Stop indexing pages that have no SEO value: privacy policy, T&Cs, outdated promotions, etc.

• Refine your ‘crawl budget’.

• Remove duplicate content.
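On the redirect point in particular, the requests library records every hop it follows, which makes chain length easy to measure. A minimal sketch, with placeholder URLs standing in for pages you suspect sit behind old redirects.

```python
import requests

# Placeholder URLs you suspect sit behind old redirects
URLS = [
    "http://www.example.com/old-page",
    "http://www.example.com/2016-promotion",
]

for url in URLS:
    resp = requests.get(url, allow_redirects=True, timeout=10)
    # resp.history holds one response per hop; more than one hop is a chain
    # worth collapsing into a single 301 straight to the final destination.
    hops = [r.status_code for r in resp.history]
    print(f"{len(hops)} hop(s) {hops} -> {resp.url} (final status {resp.status_code})")
```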

6. Carry out an internal link audit

This is an area your visitors may well have noticed, even if you haven’t yet.

If you’ve got important pages more than three clicks from your homepage, then you’re not signalling their importance, and the robots won’t reliably pay them attention. More to the point, your visitors may get frustrated wandering around your site trying to find them.

So, tighten up those pathways—also known as ‘Click Depth’—for more than just improved SEO performance.

Also, make sure that—if you haven’t already—any broken links are fixed and orphan pages linked to or removed entirely. Robots are not going to go out of their way to figure out your website, so make it easy for them and remove all unnecessary barriers to their work.
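Click depth can be surfaced with a small breadth-first crawl from your homepage. The sketch below is illustrative rather than production-ready: the domain is a placeholder, the href extraction is deliberately crude, and it assumes the requests library is installed.

```python
import re
from collections import deque
from urllib.parse import urljoin, urlparse

import requests

SITE = "https://www.example.com"  # placeholder domain
MAX_PAGES = 200                   # keep the audit crawl small and polite


def crawl_depths(site):
    """Breadth-first crawl from the homepage, recording each page's click depth."""
    depths = {site: 0}
    queue = deque([site])
    host = urlparse(site).netloc

    while queue and len(depths) < MAX_PAGES:
        page = queue.popleft()
        try:
            html = requests.get(page, timeout=10).text
        except requests.RequestException:
            continue
        # Deliberately crude href extraction; fine for a first-pass audit.
        for href in re.findall(r'href=["\']([^"\'#]+)', html):
            url = urljoin(page, href).split("?")[0]  # ignore query-string variants
            if urlparse(url).netloc == host and url not in depths:
                depths[url] = depths[page] + 1
                queue.append(url)
    return depths


for url, depth in sorted(crawl_depths(SITE).items(), key=lambda item: item[1]):
    if depth > 3:
        print(f"depth {depth}: {url}")  # candidates for better internal linking
```

As a bonus, any URL that appears in your XML sitemap but never turns up in a crawl like this is a likely orphan page.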

And while you’re there, check that your site architecture is offering as efficient an SEO experience as possible.

If your website has been designed with user experience at the forefront of decision making—as is the case with every Code23 web design—then this should all be in place already. But with clear SEO value in a good site architecture, it’s worth making this part of your SEO audit.

7. Check HTTPS

Whether you’ve migrated to an HTTPS connection already or will do in the near future—we’re all heading that way—it’s important to check that every element of your webpages is loading via secure channels.

Mixed content issues arise when images, videos or JavaScript files load over insecure HTTP connections. Having these makes your site less secure, meaning your smart decision to migrate to HTTPS can be undone by a single piece of media.
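A rough do-it-yourself check is to scan a page’s HTML for references that still point at plain HTTP. A minimal sketch in Python, assuming the requests library is installed and with a placeholder URL; note that ordinary links to HTTP pages aren’t true mixed content, so treat the output as candidates to review rather than a definitive list.

```python
import re

import requests


def mixed_content_candidates(url):
    """List references on an HTTPS page that still point at plain HTTP."""
    html = requests.get(url, timeout=10).text
    # src attributes (images, scripts, video) are the real mixed-content risks;
    # href matches are included only so stylesheets and the like get reviewed too.
    found = set(re.findall(r'(?:src|href)=["\'](http://[^"\']+)', html, re.IGNORECASE))
    for resource in sorted(found):
        print("insecure reference:", resource)
    return found


mixed_content_candidates("https://www.example.com/")  # placeholder URL
```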

There are online tools you can use to check this, but it is also part of the suite of checks we offer as part of our technical SEO audit. One advantage of working with us is that we put these results in context with all the others, saving time interpreting different data from multiple sources.

And while you’re staring at your address bar, also check that your URLs are optimised for SEO. They should contain relevant keywords and be friendly—short, clear and memorable—while avoiding any over-optimisation tactics such as keyword stuffing.

Google doesn’t like it and neither will you when they hit you with a penalty.

8. Test and improve your page speed

A Google ranking signal since 2010, your page loading speed is becoming ever more important to SEO. It can be improved through a series of steps like compressing media, reducing ad ‘weight’ and choosing a quality web host.
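If you just want a rough baseline before reaching for a dedicated tool, you can time the server’s response from a script. A minimal sketch, assuming the requests library is installed and with a placeholder URL; this measures something closer to time-to-first-byte than full render time, so use a tool like Google’s PageSpeed Insights for the complete picture.

```python
import statistics

import requests


def rough_response_time(url, runs=5):
    """Very rough server response timing, one small part of overall page speed."""
    timings = []
    for _ in range(runs):
        resp = requests.get(url, timeout=30)
        # resp.elapsed covers request-to-response-headers, so this is closer
        # to time-to-first-byte than to full page render time.
        timings.append(resp.elapsed.total_seconds())
    print(f"{url}: median {statistics.median(timings):.2f}s over {runs} runs")


rough_response_time("https://www.example.com/")  # placeholder URL
```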

For more information on page speed, and how to measure and improve yours, check out our article on the benefits of having a fast website.

9. Get mobile-friendly

Mobiles are on the rise in the online world so, unsurprisingly, they are becoming more and more important to SEO too. In fact, mobile matters so much today that Google is planning to launch mobile-first indexing from 2018.

If your website isn’t mobile-responsive, or you haven’t built a mobile-specific site, now would be the time to develop one.

Any good technical SEO audit needs to carry out a full audit of your mobile site. This ensures that your mobile version is pulling its SEO weight as well as—or better than—your desktop version. You can find more technical guidance on how to do this here.
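As a very first pass, you can at least confirm that your pages declare the responsive viewport meta tag, one basic signal of a mobile-friendly build. A minimal sketch with a placeholder URL, assuming the requests library is installed.

```python
import re

import requests


def has_viewport_tag(url):
    """Check for the responsive viewport meta tag, one basic mobile-friendliness signal."""
    html = requests.get(url, timeout=10).text
    found = bool(re.search(r'<meta[^>]+name=["\']viewport["\']', html, re.IGNORECASE))
    print(f"{url}: viewport meta tag {'present' if found else 'MISSING'}")
    return found


has_viewport_tag("https://www.example.com/")  # placeholder URL
```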

10. Ask for a re-crawl, then start monitoring

Once you’ve run your technical SEO audit and taken the actions that will improve your SEO performance, you’ll need to tell the search engines you’re ready to be re-crawled. 

To do this, head back into your Search Console and use the Fetch as Google function under Crawl, then select Submit to index when your site appears. You can choose to do this for up to 500 individual URLs per month or fetch all the linked pages from 10 ‘starter’ URLs instead.
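Fetch as Google itself is a Search Console feature rather than something you script, but you can at least notify Google when your sitemap changes by sending a simple ‘ping’ request. A minimal sketch with a placeholder sitemap URL, assuming the requests library is installed.

```python
import requests

SITEMAP_URL = "https://www.example.com/sitemap.xml"  # placeholder sitemap location

# A 200 response means the ping was received, not that a re-crawl is guaranteed.
resp = requests.get("https://www.google.com/ping", params={"sitemap": SITEMAP_URL}, timeout=10)
print("Ping status:", resp.status_code)
```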

This SEO audit’s over, but your work’s not done yet.

Carrying out regular technical SEO audits is vital to maintaining SEO success. Your on-site content, Google’s algorithm and the other factors that shape your technical SEO are all changing constantly, so keep checking in on it.

We carry out regular monitoring and full technical SEO audits for our clients to keep their SEO performance at the top of the charts. Talk to us about doing the same for you.
