Website Checklist: Technical SEO Audit

The success of your website’s SEO depends on far more than a few well-placed keywords.

And while on- and off-page SEO is crucial to any effective digital strategy, some behind-the-scenes work is also necessary to achieve great results. The points below all feed into your website audit, which is essentially a run-through of your whole website to ensure it is fully optimised for maximum performance, exposure and crawlability on the web.

Below we run through several key factors of a technical SEO audit, and why they’re worth doing.

What is SEO?

For those of you who are unfamiliar, here’s a quick lesson on SEO. SEO stands for Search Engine Optimisation, which is the process by which a website can be better optimised for exposure online. In layman’s terms, getting SEO right can make the difference between being the top search result on Google, and appearing at the bottom of the search rankings. 

One area that’s vital to achieving a high result in search rankings is the speed and performance of your website. This will involve audit reports, crawlability checks, and a general overview of your site as a whole from a completely technical perspective.

We call this area technical SEO. It’s all the stuff your visitors never see, but are glad you’ve got in place. It covers everything from controlling how Google indexes your site to ensuring your website is performing at its optimal speed.

So, look no further to get clued up on how to perform a website audit to ensure you’re ticking all the boxes and doing all the right things when it comes to your own website. 

This 10-point checklist covers the core concerns when running a technical SEO audit and offers insights into improving your SEO performance.

1. Refine your SEO Objectives

As with any process, it’s sensible to begin by checking in with your overall goals.

SEO involves many aspects, and takes time to get right. It’ll involve keyword research, performance reports, content visualisation and an intrinsic understanding of your target audience. Understand what your audience is searching for, and you can tailor your website to meet those needs. 

SEO works alongside all other elements of business – PR, eCommerce, visibility, web design, social media – all the important stuff. It makes sense, then, to ensure SEO is considered from the very start. 

So, before you do anything else, refine your objectives.

This may include: 

  • Your budget (the bigger this is, the more effective your campaign will be – simple) 
  • Your target audience
  • Your goals for the business
  • Your brand identity (how people find you, what they search) 

Essentially, what is it that your digital strategy is aiming to do for your business?

2. Review your XML sitemap

Your XML sitemap is used by search engines looking to crawl your site – crawling being the process by which your webpages get indexed.

What is an XML Sitemap?

Simply put, it’s how search engines like Google get to grips with what your site is offering. Your crawlability refers to how Google will literally crawl through your website, via a series of links that collectively form your – you guessed it – site map.

Having an incomplete or out-of-date XML sitemap impairs the search engine robots’ ability, or willingness, to pull the information from your website that you want them to. 
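
To give you an idea, here’s a minimal sketch of what a basic XML sitemap looks like, following the sitemaps.org protocol (the URLs and dates are hypothetical examples):

  <?xml version="1.0" encoding="UTF-8"?>
  <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
    <url>
      <loc>https://www.example.com/</loc>
      <lastmod>2020-01-15</lastmod>
      <changefreq>weekly</changefreq>
      <priority>1.0</priority>
    </url>
    <url>
      <loc>https://www.example.com/services/</loc>
      <lastmod>2020-01-10</lastmod>
    </url>
  </urlset>

Each <url> entry lists a page you want crawled; <lastmod>, <changefreq> and <priority> are optional hints to search engines rather than commands.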

TOP TIP: Clear guidance on what your XML sitemap should look like can be found here.

If your website is lacking an XML sitemap entirely, then we’d suggest getting one should be top of your priorities list.

How do you do this?

For one thing, turn to us for web support! We can take care of your WordPress website for you, monitoring your web performance and finding efficient solutions to improve usability & performance. 

You can check your own sitemap by heading to your Google Search Console, where it should have been registered when first created. On the left-hand side, under the subheading labelled Index, you’ll find Sitemaps, which lists all the sitemaps associated with your account.

An added bonus of a refined XML sitemap is that you can use it to set up a ‘crawl budget’. This adds a level of control, letting you direct how search engines crawl your website by choosing how you want to ‘spend’ your budget (more here).

3. Optimising your Robots.txt

Beyond the XML sitemap, you can also control how your website is indexed via instructions left in your code for Google’s bots. An essential box to tick on your technical SEO checklist, this determines how robots make their way around your website. 

What is robots.txt?

Your robots.txt file is the first port of call for indexing robots when they arrive at your website: the primary instruction list telling search engines how to get around your site. It comprises various typed directives that search engines follow when crawling.

By using a disallow function in your robots.txt file, for example, you can close off access to certain pages and, therefore, create some control over which pages do get crawled. 
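
As a rough sketch (the paths here are hypothetical), a simple robots.txt might look like this:

  User-agent: *
  Disallow: /wp-admin/
  Allow: /wp-admin/admin-ajax.php
  Disallow: /thank-you/
  Sitemap: https://www.example.com/sitemap.xml

The User-agent line says which robots the rules apply to (* means all of them), Disallow closes off paths you don’t want crawled, and the Sitemap line points crawlers at your XML sitemap.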

If you find you have disallowed access to some key pages on your site, you’re doing yourself a disservice and preventing search engines from doing what they’re designed to do.

How to find robots.txt 

Finding the robots.txt is easy enough. Simply add ‘/robots.txt’ to the end of your domain, e.g. https://www.example.com/robots.txt.

How to edit robots.txt 

Web developers are the way to go if you need to edit your robots.txt for any reason. Reach out to us for more info on this. 

We are also able to do a more refined review of exactly how well your website is being crawled by different search engines, by identifying exactly how many pages each search engine is indexing.

This allows us to identify whether a particular search engine is struggling to find its way around your website, and to take action to show it the way. Obviously, if a search engine can’t find the information it needs, your customers won’t be able to find you under relevant searches there either.

4. Make sure everything you want indexed is indexable

Aside from checking that all your primary pages are crawlable in your robots.txt, there are several other points to cover to make the robots go where you want them to.

The noindex meta tag, which tells search engines not to index a given page, and the X-Robots-Tag HTTP header, which can apply the same instruction to non-HTML files such as images and PDFs, are two other examples of functions that influence indexing, and it’s worth digging down into such details.
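
For illustration, here’s what each looks like in practice. The server snippet is a hedged sketch that assumes an Apache setup with the mod_headers module enabled:

  <!-- In the <head> of a page you don't want indexed -->
  <meta name="robots" content="noindex">

  # As an X-Robots-Tag HTTP header for non-HTML files, set in the Apache config:
  <FilesMatch "\.pdf$">
    Header set X-Robots-Tag "noindex"
  </FilesMatch>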

Next, check for Orphan pages

Orphan pages are bad news. An orphan page is a webpage or URL with no internal links pointing at it, meaning web crawls don’t discover it and it will likely never show up in search results. Because nothing links to it, a visitor can’t reach it unless they go straight to the correct URL. Search engines are unlikely to index orphan pages (there is even a suspicion they can lead to your website being penalised), so they are entirely useless for your SEO. We’d recommend you remove them, or link to them, the sooner the better!
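
If you want a starting point for spotting them, here’s a minimal Python sketch (an illustration, not a definitive tool) that compares the URLs in your sitemap against the URLs actually linked from those pages; anything in the sitemap that nothing links to is a candidate orphan. It assumes the requests and beautifulsoup4 packages are installed, and example.com stands in for your own domain:

  # Sketch: find sitemap URLs that no crawled page links to (candidate orphans).
  import xml.etree.ElementTree as ET
  from urllib.parse import urljoin, urldefrag

  import requests
  from bs4 import BeautifulSoup

  SITE = "https://www.example.com"

  def sitemap_urls(sitemap_url):
      # Pull every <loc> entry out of the XML sitemap.
      ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
      root = ET.fromstring(requests.get(sitemap_url).content)
      return {loc.text.strip() for loc in root.findall(".//sm:loc", ns)}

  def linked_urls(pages):
      # Collect every internal link found on the given pages.
      found = set()
      for page in pages:
          soup = BeautifulSoup(requests.get(page).text, "html.parser")
          for a in soup.find_all("a", href=True):
              url = urldefrag(urljoin(page, a["href"])).url
              if url.startswith(SITE):
                  found.add(url)
      return found

  in_sitemap = sitemap_urls(SITE + "/sitemap.xml")
  orphans = in_sitemap - linked_urls(in_sitemap)
  print("Candidate orphan pages:")
  print("\n".join(sorted(orphans)))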

5. Review your site architecture and click depth

This is an area your visitors may well have noticed, even if you haven’t yet.

If you’ve got important webpages that are hidden three clicks in, they’ll get lost in the fray. You won’t be making them important enough for the robots to reliably pay them attention, and, more to the point, your visitors may get frustrated wandering around your site trying to find them.

So, tighten up those pathways and reduce your ‘click depth’ – the number of clicks it takes to reach a page from your homepage – for more than just improved SEO performance.

Also, make sure that—if you haven’t already—any broken links are fixed, and orphan pages are removed or linked to. 

Robots are not going to go out of their way to figure out your website, so make it easy for them and remove all unnecessary barriers to their work. Also, while you’re there, check that your site architecture is offering as efficient an SEO experience as possible.

If your website has been designed with user experience at the forefront of decision making—as is the case with every Code23 web design—then this should all be in place already. But with clear SEO value in a good site architecture, it’s worth making this part of your SEO audit. 

6. Check HTTPS

HTTP and HTTPS can mean the difference between a secure site and an unsecure one. Secure sites are vital, as visitors and potential customers will want to be working with a secure site: HTTPS secures your communications, shows legitimacy and increases users’ trust.

Whether you’ve migrated to an HTTPS connection already or will do in the near future—we’re all heading that way—it’s important to check that every element of your webpages is loading via secure channels.

Top Tip: If you operate on WordPress and think your site might need a security overhaul, come to us for web maintenance support!  

If your images, videos or JavaScript files load over a less secure HTTP connection, issues can arise. Mixed content makes your site less secure, as a single piece of media has the potential to damage your HTTPS status.
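
As a quick illustration (with a hypothetical image URL), this is the kind of mixed content to hunt down on an HTTPS page:

  <!-- Mixed content: an insecure resource loaded on an HTTPS page -->
  <img src="http://www.example.com/images/logo.png">

  <!-- Fixed: the same resource served over HTTPS -->
  <img src="https://www.example.com/images/logo.png">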

Some online tools can check this, and it can be incorporated as part of a technical SEO audit, such as the one we offer at Code23.

URL Structure

Don’t forget to check that your URLs are optimised for SEO. They should contain relevant SEO keywords, be friendly—short, clear and memorable—while avoiding any over-optimisation tactics such as keyword stuffing. (Google doesn’t like keyword stuffing one bit, and neither will you when they hit you with a penalty.)
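
For example (both URLs hypothetical), compare an unfriendly URL with an optimised one:

  Unfriendly:  https://www.example.com/index.php?id=1842&cat=7
  Friendly:    https://www.example.com/services/web-design/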

7. Test and improve your page speed

Page load speed is something you should be aware of. If you’re not – why not? How many times have you left a website or mobile site because you’re frustrated at its loading speed? Well, exactly the same thing applies to your customers!

Page load speed wasn’t so much of a problem when desktop searches led the way. However, when mobile eCommerce search overtook desktop in 2016, page load speed times suddenly became a hot topic.

Easy ways to improve your page speed include:

  • Compressing and resizing your images
  • Minifying your CSS and JavaScript files
  • Enabling browser caching
  • Serving assets through a content delivery network (CDN)

For more information on page speed, and how to measure and improve yours, check out our article on the benefits of having a fast website.

TOP TIP: Use this website to check your site speed. It’s an essential part of your wider technical SEO audit, so don’t forget! 
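
If you’d rather script the check, here’s a minimal Python sketch against Google’s PageSpeed Insights API (v5). The URL being tested is a hypothetical example, and for regular use you may want to supply an API key:

  # Sketch: fetch a page's Lighthouse performance score from PageSpeed Insights.
  import requests

  API = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"
  params = {"url": "https://www.example.com", "strategy": "mobile"}
  data = requests.get(API, params=params).json()
  score = data["lighthouseResult"]["categories"]["performance"]["score"]
  print(f"Mobile performance score: {score * 100:.0f}/100")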

8. Get mobile-friendly

Mobile-friendliness has been a Google ranking signal since 2015, so this is not a box you can afford to leave unticked.

Google began rolling out mobile-first indexing in 2018, and the popularity of m-commerce and mobile search is only on the rise.

If your website isn’t mobile-responsive, or you haven’t built a mobile-specific site, get on it!
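
As a starting point, a responsive site begins with the viewport meta tag in each page’s <head>, paired with CSS media queries. A minimal sketch (the .sidebar class is a hypothetical example):

  <meta name="viewport" content="width=device-width, initial-scale=1">

  /* Example media query: simplify the layout on narrow screens */
  @media (max-width: 600px) {
    .sidebar { display: none; }
  }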

Any good technical SEO audit needs to include a full review of your mobile site. This ensures that your mobile version is pulling its SEO weight as well as—or better than—your desktop version. You can find more technical guidance on how to do this here.

9. Ask for a re-crawl, then start monitoring

Once you’ve run your technical SEO audit and taken the actions that will improve your SEO performance, you’ll need to tell the search engines you’re ready to be re-crawled. 

To do this, head back into your Search Console and use the Fetch as Google function under Crawl, then select Submit to index when your site appears. You can choose to do this for up to 500 individual URLs per month, or fetch all the linked pages from 10 ‘starter’ URLs instead.

This SEO audit’s over, but your work’s not done yet.

Carrying out regular technical SEO audits is vital to maintaining SEO success. Your on-site content, Google’s algorithm and the other factors that shape the landscape of your technical SEO all change over time, so keeping a regular check on them is essential.

We carry out regular monitoring and full technical SEO audits for our clients to keep their SEO performance at the top of the charts. Talk to us about doing the same for you.

10. Make the most of every crawl

However often your website gets crawled, it’s important that you make the most of every opportunity.

Here’s a quick rundown of the ways you can make every robot visit worthwhile:

• Fix all broken links.

A broken link is any link on the site that does not work, whether because the page no longer exists or because there is an error in the URL. Broken links can be seriously detrimental to your overall website: they devalue your site, and discovering them makes for a bad user experience that will likely put visitors off using your website, so look out for these.
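
A simple way to catch these in bulk is to script a status-code check. Here’s a minimal Python sketch (the URLs are hypothetical):

  # Sketch: flag links that return an error status (4xx/5xx) or are unreachable.
  import requests

  urls = [
      "https://www.example.com/",
      "https://www.example.com/old-page/",
  ]
  for url in urls:
      try:
          status = requests.head(url, allow_redirects=True, timeout=10).status_code
      except requests.RequestException as exc:
          print(f"Unreachable: {url} ({exc})")
          continue
      if status >= 400:
          print(f"Broken link ({status}): {url}")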

• Eliminate any keyword cannibalism.

Keyword cannibalism refers to the process whereby you compromise your own website through multiple pages targeting the same terms or duplicating the same content. This makes it harder for Google to decide which page to rank when several pages offer the same thing.

• Add URL parameters to make sure Google only crawls each page once.

URL parameters tell search engines how to handle variations of your pages based on their URLs, so they can crawl your site more efficiently (see the example after the list below).

Parameters are used for: 

  • Tracking 
  • Reordering 
  • Filtering 
  • Identifying 
  • Paginating  
  • Searching 
  • Translating
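
One common way to stop parameterised URLs being treated as separate pages is a canonical tag pointing at the clean version. For example (hypothetical URLs):

  <!-- On https://www.example.com/shoes?sort=price&page=2 -->
  <link rel="canonical" href="https://www.example.com/shoes/">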

• Reduce the length of any chains of 301 or 302 redirects.

Unnecessary redirects just confuse Google, so ensure your URLs are as simple to crawl as possible – ideally a single hop from the old URL straight to the final one.
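
You can see how long a chain is by following it programmatically. A minimal Python sketch (the URL is a hypothetical example):

  # Sketch: count the redirect hops between a URL and its final destination.
  import requests

  resp = requests.get("http://www.example.com/old-page", allow_redirects=True)
  print(f"{len(resp.history)} redirect(s) before landing on {resp.url}")
  for hop in resp.history:
      print(hop.status_code, hop.url)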

• Vary your metadata, especially descriptions.

Your metadata tells search engines what you have to offer, so vary it and cover all bases. Reusing the same keywords and descriptions throughout will limit the searches you can appear in, so cast your net wide.

• Stop indexing pages that have no SEO value: privacy policy, T&Cs, outdated promotions, etc.

Although these are important to have, they are not going to be what people are searching for. Bearing this in mind, ensure they are good quality, but don’t waste time and energy optimising them.

• Refine your ‘crawl budget’.

Firstly, what is a crawl budget? Simply put, a crawl budget refers to the number of pages on your website that Google crawls through. This exact number, or ‘budget’, is often determined by the:

  • Size of your website
  • Health of your site 
  • Number of links to your site

• Remove duplicate content.

Duplicate content isn’t doing you any favours, and will make your web pages look tired. Go through your content, identify any overlap, and consolidate wherever necessary to create one or two killer pages, as opposed to many pages offering the same thing.

If a technical audit daunts you, you don’t have to go it alone. There are plenty of tips online, or you can contact us at Code23 web design. We can handle everything for you. Sorted.
