The ultimate guide to Technical SEO

You may not have heard the term technical SEO before, but it is quickly becoming a vital part of building a sustainable search ranking. As websites become more complex and bloated, Google is paying closer attention to the speed and code quality of the pages it ranks. Despite this, technical ranking factors are often the last to be considered, since they are mostly invisible and many old-school marketers aren’t aware of them.

To give you an understanding of what Technical SEO involves, we have documented some of the most common factors below.

Want some quick, actionable tips that you can implement now? Download our free Technical SEO checklist.

How important is Technical SEO?

Now you may be wondering “How much does technical SEO affect my rankings?” Well, quite a lot, actually. In fact, renowned marketer Neil Patel says that it is becoming one of the most important SEO tactics. With many websites competing to boost their organic search rankings, Google is finding more ways to differentiate the higher quality sites from the lower quality ones.

As the majority of traffic now comes from mobile devices rather than desktop computers, visitors are less likely to engage with sites which are slow, buggy or unoptimised for their devices. Obviously, Google wants to serve the highest quality sites to users, so in one of its 2015 algorithm changes Google started penalising unoptimised sites.

Speed Optimisation

Screenshot of results from GTMetrix

Speed is an important factor that affects the bounce rate of visitors to your website. Studies have repeatedly found that a slower website decreases the number of sales that e-commerce websites make. The last thing you want is people clicking your site from the search results page, then going back to the results because your site was too slow. Google sees this behaviour and subsequently lowers your rankings in the results.

There are many ways to improve the speed at which your website loads; however, some are more effective than others. I recommend Cloudflare to most of the clients we work with, as it performs many speed optimisations automatically. A few examples of speed optimisations are:

  • Image compression (use Smush for WordPress)
  • Static resource caching (i.e. images, CSS, JavaScript – use something like WP Super Cache)
  • Host scripts locally rather than pulling them from remote servers (avoiding the latency of extra DNS lookups)
  • Defer loading of JavaScript files
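
To see why compressing static resources pays off, here is a minimal sketch using Python's built-in `zlib` module (gzip, which most web servers and Cloudflare apply automatically, uses the same DEFLATE algorithm). The CSS snippet is a made-up example:

```python
import zlib

# A small, repetitive CSS snippet -- real stylesheets compress even better,
# since they repeat property names and selectors constantly.
css = """
body { margin: 0; padding: 0; font-family: sans-serif; }
h1   { margin: 0; padding: 0; color: #333; }
p    { margin: 0; padding: 0; color: #333; }
""" * 20

raw = css.encode("utf-8")
compressed = zlib.compress(raw, level=9)

print(f"raw: {len(raw)} bytes, compressed: {len(compressed)} bytes")
print(f"saving: {100 - 100 * len(compressed) // len(raw)}%")
```

In practice you would not compress assets by hand like this; your web server (or a plugin such as WP Super Cache) does it per request. The point is simply that text assets shrink dramatically, which means fewer bytes over the wire and a faster first paint.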

GTMetrix is a free tool which allows you to check your site’s PageSpeed and YSlow scores and compare them against the averages of other tested sites.

Site Architecture

Considering your site’s architecture may not seem very important when you are just starting to build your site. However, it becomes more important as your business (and your website) grows. These architectural decisions aren’t just what blog categories you should use, but also:

  • How URLs should be formatted (e.g. /books?category=1 versus /book/adventure)
  • The creation of sitemaps (these help Google crawl your site)
  • Internal linking to your own resources (letting Google know which pages hold authority)
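
A sitemap is just an XML file listing the URLs you want crawled. As a rough sketch, here is how one could be generated with Python's standard library; the URLs are hypothetical examples, not a real site:

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Build a minimal XML sitemap (sitemaps.org protocol) from a list of URLs."""
    urlset = ET.Element("urlset",
                        xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for url in urls:
        entry = ET.SubElement(urlset, "url")
        loc = ET.SubElement(entry, "loc")
        loc.text = url
    return ET.tostring(urlset, encoding="unicode")

# Made-up pages -- in practice your CMS or a plugin generates this list.
sitemap = build_sitemap([
    "https://example.com/",
    "https://example.com/book/adventure",
])
print(sitemap)
```

Most platforms (WordPress, Shopify and so on) generate a sitemap for you; the value of seeing the format is knowing what to check when Google Search Console reports a sitemap error.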

Individual pages also need the correct structure to make sure that Google understands them properly. For example, a page should contain only a single ‘h1’ tag, representing the title of the page. It’s also important to include ‘alt’ attributes on the images on your site, as screen readers (and search engines) use these to understand images. Use the SEO SiteCheckup tool to check your images’ alt attributes.
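
Both checks are easy to automate. The sketch below uses Python's built-in `html.parser` to count ‘h1’ tags and flag images without alt text; the HTML fed to it is an invented example:

```python
from html.parser import HTMLParser

class PageAuditor(HTMLParser):
    """Count h1 tags and collect images that have no alt attribute."""
    def __init__(self):
        super().__init__()
        self.h1_count = 0
        self.images_missing_alt = []

    def handle_starttag(self, tag, attrs):
        if tag == "h1":
            self.h1_count += 1
        elif tag == "img":
            attrs = dict(attrs)
            if not attrs.get("alt"):
                self.images_missing_alt.append(attrs.get("src", "(no src)"))

auditor = PageAuditor()
auditor.feed("""
<h1>Adventure Books</h1>
<h1>Second heading -- one too many</h1>
<img src="/covers/treasure.jpg">
<img src="/covers/island.jpg" alt="Treasure Island cover">
""")
print("h1 tags:", auditor.h1_count)
print("images missing alt:", auditor.images_missing_alt)
```

A report of `h1 tags: 2` would tell you this page breaks the single-h1 rule, and the missing-alt list gives you a to-do list for accessibility fixes.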

Page Errors

There’s nothing worse than browsing a site, finding a great resource and then being greeted with ‘404 – Page not found’. This drives visitors away from your site very quickly, as it interrupts their browsing. Google knows that visitors hate this, so when it crawls your site it makes a note of any errors it comes across. Pages that no longer exist drop out of Google’s index, and pages that link to them can be marked down for poor user experience.

Nobody expects you to constantly browse your site looking for errors, but there are some ways to reduce their impact:

  • Use Google Search Console to keep track of any errors
  • When deleting a page or post, set up a 301 redirect so visitors (and Google) are sent to the most relevant remaining page
  • Don’t link to unreliable sites
  • When redeveloping your site, keep a copy of your old sitemap and use it to redirect old URLs to their new equivalents
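
Checking a list of URLs for errors is also straightforward to script. This sketch uses Python's `urllib` for the real network call, but accepts any status-lookup function so the demo below can run offline on made-up paths and stubbed status codes:

```python
from urllib.request import urlopen
from urllib.error import HTTPError

def fetch_status(url):
    """Return the HTTP status code for a URL (makes a real network request)."""
    try:
        with urlopen(url) as response:
            return response.status
    except HTTPError as err:
        return err.code

def find_broken_links(urls, get_status=fetch_status):
    """Return the URLs that respond with a 4xx or 5xx status."""
    return [url for url in urls if get_status(url) >= 400]

# Offline demo with stubbed statuses; swap in fetch_status for real checks.
statuses = {"/books": 200, "/old-page": 404, "/reports": 500}
print(find_broken_links(statuses, get_status=statuses.get))
```

Running something like this over your sitemap's URLs on a schedule catches broken pages before Google (or a visitor) does; Google Search Console remains the authoritative report of what Googlebot itself encountered.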

If you are often facing issues with your site or you would rather not have to fix issues internally, try our Website Maintenance service. We offer tailored packages that keep your site online, secure and working perfectly.

Cross-Browser Compatibility

Your visitors use a range of different web browsers to access your site, so you should consider all of them when building and maintaining it. Most people use modern browsers like Google Chrome and Mozilla Firefox, but a small number still use older browsers like Internet Explorer. This can be difficult for website designers, as these older browsers do not support modern web standards, so developers need to create workarounds to support them.

Browser compatibility isn’t known to be a ranking factor, but it definitely affects the bounce rate of visitors to your page. When Google discovers that visitors are leaving your site quickly, it will move you down in the rankings. This is especially a threat if your customers are more likely to be using older browsers. This could be the case with large organisations with outdated infrastructure.

Page Cannibalisation

Faces eating one another

Page cannibalisation is the issue of certain pages of your site competing with each other for rankings, leading to a lower ranking for your most important pages. Sometimes, the pages that you would like to rank higher are actually ranked below your other content. Issues like this often appear when you publish large amounts of content written by many different authors.

I recommend that you plan the topics you would like your blog to cover and put an internal linking strategy in place so Google can tell which pages are most important. HubSpot has written a brilliant description of these cornerstone or ‘pillar’ pages.

While we are here, it’s also vital to ensure that your site doesn’t contain any duplicate content. This applies both to content repeated within your own site and to content taken from another website. Because Google knows when each piece was published, it will rank the original content higher than the copy. If duplicate content can’t be avoided (like a guest post republished on another blog), use a ‘canonical’ tag pointing to the page where it was originally published.
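
The canonical tag is a single `<link rel="canonical">` element in the page's head. As a quick illustration, this sketch extracts it with Python's built-in `html.parser`; the URL in the demo is invented:

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Extract the href of <link rel="canonical"> from a page's HTML."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "link" and attrs.get("rel") == "canonical":
            self.canonical = attrs.get("href")

finder = CanonicalFinder()
finder.feed('<head><link rel="canonical" '
            'href="https://example.com/original-post"></head>')
print(finder.canonical)
```

Pointing a check like this at your own pages confirms the tag is present and aimed at the version you actually want Google to rank.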


Website Security

Google has started cracking down hard on insecure sites. Along with many other major internet players, Google believes that the web should be secure by default. Since July 2018, Chrome has warned users about insecure (HTTP) websites in the address bar. In addition, Google has started ranking websites that don’t have an SSL certificate lower in results.

SSL isn’t the only security factor to consider: Google will also detect malicious software or links on your site and penalise you for them. These malicious links can be added unintentionally by dodgy WordPress plugins or other vulnerabilities. By using security software on your CMS or web server, you can mitigate this risk somewhat. We recommend Wordfence, which has a free version.

If your website is dynamic, it may be susceptible to a wider range of security vulnerabilities. This is particularly true if your website is custom built. To help combat these vulnerabilities, make sure your CMS or platform is regularly updated. If your website is mission-critical, then I highly recommend carrying out a full security audit.
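
One easily automated SSL check is certificate expiry, since an expired certificate triggers a full-page browser warning. This sketch uses Python's standard `ssl` and `socket` modules; the network fetch is separated from a pure date calculation so the demo below runs offline on a sample value in the format `getpeercert()` returns:

```python
import ssl
import socket
from datetime import datetime, timezone

def fetch_certificate(hostname, port=443):
    """Fetch the TLS certificate a host presents (makes a real network call)."""
    context = ssl.create_default_context()
    with socket.create_connection((hostname, port), timeout=10) as sock:
        with context.wrap_socket(sock, server_hostname=hostname) as tls:
            return tls.getpeercert()

def days_until_expiry(cert, now=None):
    """Days until the certificate's notAfter date (negative means expired)."""
    expires = datetime.strptime(cert["notAfter"], "%b %d %H:%M:%S %Y %Z")
    expires = expires.replace(tzinfo=timezone.utc)
    now = now or datetime.now(timezone.utc)
    return (expires - now).days

# Offline demo with a sample notAfter value in getpeercert()'s date format:
sample = {"notAfter": "Jan  1 12:00:00 2030 GMT"}
print(days_until_expiry(sample, now=datetime(2029, 12, 2, tzinfo=timezone.utc)))
```

Scheduling `days_until_expiry(fetch_certificate("yourdomain.com"))` and alerting below, say, 14 days gives you plenty of time to renew before visitors ever see a warning.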


Website Uptime

This factor is fairly straightforward. If your site isn’t available when Google tries to crawl it, Google will assume that it no longer exists. It is therefore essential that your website is available as much as possible. Not only that, but while your site is down it is missing out on valuable visitors.

When performing maintenance, it’s best to create a local copy of your site and work on that; your live site then stays available if anything goes wrong. It is also worth looking into Cloudflare’s ‘Always Online’ feature, which maintains your site’s presence if it goes offline briefly.

The best way to avoid website downtime is to use a trusted hosting or cloud provider. Ideally, get your provider to put an SLA (service level agreement) in place, which commits them to a guaranteed level of uptime and compensates you if they fail to meet it.
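
SLA percentages sound abstract until you translate them into minutes of allowed downtime. This small calculation (using a 30-day month for illustration) shows why the difference between 99% and 99.9% matters:

```python
def monthly_downtime_allowance(sla_percent, days=30):
    """Minutes of downtime per month an uptime SLA of the given percentage permits."""
    total_minutes = days * 24 * 60  # 43,200 minutes in a 30-day month
    return total_minutes * (100 - sla_percent) / 100

for sla in (99.0, 99.9, 99.99):
    minutes = monthly_downtime_allowance(sla)
    print(f"{sla}% uptime allows about {minutes:.1f} minutes of downtime per month")
```

A 99% SLA permits over seven hours of downtime a month, while 99.9% permits roughly 43 minutes; knowing this helps you judge whether a provider's guarantee actually matches what your business can tolerate.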

Hidden Technical Factors

Some of the technical factors that Google is thought to consider are completely invisible to the visitor of the site. These factors generally relate to the ownership of the site.

Domain Ownership and Age

Google is thought to consider a domain’s owner and registration date among its ranking factors. Since older sites are less likely to be temporary, they are less likely to be spam or scam sites. Google can penalise a domain if it is associated with a known spammer. Google might also consider the domain’s expiration date, as a spammer is unlikely to register a domain many years in advance.

If you’d like to know your domain’s age and owner details, you can perform a WHOIS query.
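
A WHOIS lookup is usually done with a command-line tool or a web service, but the underlying protocol (RFC 3912) is just a plain-text query over TCP port 43. As a rough sketch, here it is in Python's standard library; the response-parsing helper is demonstrated offline on a made-up response in the usual `Key: value` format:

```python
import socket

def whois_query(domain, server="whois.iana.org", port=43):
    """Send a WHOIS query (RFC 3912) and return the raw text (network call)."""
    with socket.create_connection((server, port), timeout=10) as sock:
        sock.sendall(domain.encode("ascii") + b"\r\n")
        chunks = []
        while chunk := sock.recv(4096):
            chunks.append(chunk)
    return b"".join(chunks).decode("utf-8", errors="replace")

def extract_field(whois_text, field):
    """Pull a single 'Field: value' line out of a WHOIS response, if present."""
    for line in whois_text.splitlines():
        key, _, value = line.partition(":")
        if key.strip().lower() == field.lower():
            return value.strip()
    return None

# Offline demo on a sample response (field names vary between registries):
sample = "Domain Name: EXAMPLE.COM\nCreation Date: 1995-08-14T04:00:00Z\n"
print(extract_field(sample, "Creation Date"))
```

Note that registries format their responses differently and may refer you on to a registrar's WHOIS server, so a dedicated `whois` tool or lookup site is the practical choice; this just shows there is no magic behind it.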

Website Server

If Google determines that a single server is hosting many different spam websites, it may rank every site on that server lower. This is a concern for anybody using a cheap web host, since these are often shared with a lot of spammers. The best solution is to move your website to a dedicated host, giving you complete control over how the server is used.

Limited Indexing

Sometimes, the reason your page isn’t appearing in search results is as simple as Google not having crawled it yet. If this is the case, there may be an issue with your sitemap.

Simply type ‘site:yourdomain.com’ into Google and see whether your site is fully indexed. If there are pages missing, then you should go into Google Search Console and submit a sitemap.