A site with good technical SEO is the backbone of any search engine optimisation campaign. No SEO campaign (really, no SEO campaign) will succeed if your site suffers from fundamental technical SEO problems.
You can have the best content in the world. You can have links from outstanding, relevant sites. But without good technical SEO, your site will never rank.
But it’s a complex subject and one that can confuse anybody trying to get their site ranking. To help, we’ve put together a guide on 15 things you should know about technical SEO. We’ve categorised each item into sections:
- Key issues – These are the big problems and need fixing above all else. Key issues include indexation problems, site speed, and site structure. If your site is suffering from one of these, it won't rank.
- Efficiency issues – While these aren't as vital as the above, crawl efficiency problems may mean that your website ranks, but not well. Think crawl budget, image size, and poor meta data.
- Best practices – These are lower-priority problems, but stuff you should get round to fixing. While they aren't going to take your site to position 1 overnight, they ensure you follow Google's best-practice guidelines and protect you against getting hit by an algorithm update in the future.
Okay, let’s go!
In most instances, key technical issues arise as a result of poor site structure. The way your web pages are structured can drastically affect how search engines like Google crawl your site. If you get this right it’ll make all the other tasks relating to tech SEO a whole lot easier.
Your website should use a flat site structure, meaning that all your web pages should be within just a few clicks of one another. This will make it easier for Google to crawl all of your webpages and it’ll help with overall site organisation. To help give you an idea of your current website structure, you can use a free tool like Visual Site Mapper.
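As a rough sketch of that "few clicks" rule, a breadth-first search over your internal link graph tells you each page's click depth from the homepage. The site map below is a made-up example; in practice you'd build the graph from a crawl export.

```python
from collections import deque

def click_depths(links, start="/"):
    """Breadth-first search over an internal link graph, returning the
    minimum number of clicks from the homepage to each reachable page."""
    depths = {start: 0}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depths:
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

# Hypothetical site: on a flat structure, every page sits within ~3 clicks.
site = {
    "/": ["/shoes/", "/bags/"],
    "/shoes/": ["/shoes/trainers"],
    "/bags/": ["/bags/totes"],
}
```

Any page missing from the result, or sitting at a depth well above three, is a candidate for better internal linking.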
Breadcrumbs are an incredibly useful tool for users, allowing them to get back to their previous page and see where they are on a site at a glance. But did you know that they're incredibly useful for SEO, too? They can vastly improve the internal linking of your website. Plus, they allow Google to display URLs as breadcrumb-style navigation in search results.
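To get that breadcrumb-style display in search results, each page needs schema.org BreadcrumbList markup. A minimal sketch of generating it, using a made-up product trail:

```python
import json

def breadcrumb_jsonld(trail):
    """Build schema.org BreadcrumbList JSON-LD from (name, url) pairs,
    ordered from the homepage down. Positions are 1-based per the spec."""
    return json.dumps({
        "@context": "https://schema.org",
        "@type": "BreadcrumbList",
        "itemListElement": [
            {"@type": "ListItem", "position": i, "name": name, "item": url}
            for i, (name, url) in enumerate(trail, start=1)
        ],
    })

# Hypothetical trail for a product page.
markup = breadcrumb_jsonld([
    ("Home", "https://example.com/"),
    ("Shoes", "https://example.com/shoes/"),
    ("Trainers", "https://example.com/shoes/trainers"),
])
```

The resulting JSON goes in a `<script type="application/ld+json">` tag in the page head.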
There's a lot of talk in the SEO industry about URL structure, but it isn't something to overcomplicate. The key is ensuring that your URL does two things well:
- Shows a clear structure, allowing users to understand where they are on a site at a glance (think /category/product)
- Includes your target keyword, allowing search engines to quickly understand the types of terms the page should rank for
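Both points above boil down to clean, keyword-bearing slugs. A simple sketch of building a /category/product URL (the slug rules here are a common convention, not a Google requirement):

```python
import re

def seo_url(category, product):
    """Build a /category/product style URL: lowercase, hyphens for
    spaces, punctuation stripped."""
    def slugify(text):
        text = text.lower().strip()
        text = re.sub(r"[^a-z0-9\s-]", "", text)  # drop punctuation
        return re.sub(r"[\s-]+", "-", text)       # spaces/runs -> single hyphen
    return f"/{slugify(category)}/{slugify(product)}"
```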
XML sitemap files feel a little old hat, but they’re as vital to a good SEO campaign now as they were ten years ago. Google has even said that they’re the second most important way that they find URLs to crawl.
To check that yours isn't suffering from any issues, jump into Search Console and hit 'Sitemaps' on the left-hand side.
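If your CMS doesn't generate a sitemap for you, the format itself is simple. A minimal sketch of writing one to the sitemaps.org protocol, using Python's standard XML library:

```python
from xml.etree import ElementTree as ET

def build_sitemap(urls):
    """Build a minimal XML sitemap (sitemaps.org protocol) for a list
    of absolute URLs. Optional tags like <lastmod> are omitted here."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for url in urls:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = url
    return ET.tostring(urlset, encoding="unicode")
```

Save the output as sitemap.xml at the site root and submit it in Search Console.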
You might think that you don’t need to worry about this. You write creative, unique content for all your landing pages, right? Well don’t get ahead of yourself. There are a lot of instances where CMS setups cause page duplication, so this is something you should always check for.
If there's a duplicate content issue on your site, consider one of four fixes:
- Re-writing the copy: if the pages that contain duplicate content both have their own intent and you’d like both to rank separately, write unique copy for each.
- Adding a canonical: If there are two similar pages but you want users to be able to find both, use a canonical tag to indicate to search engines which page you’d like them to rank.
- Adding a redirect: If you only need one of the pages live, 301 redirect the duplicate page to the page you want to see appear in SERPs.
- Noindex: You can also noindex a page that has duplicate content, meaning that Google won't show it in search.
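Before picking a fix, you need to find the duplicates. One simple approach (a sketch, not a substitute for a proper crawler) is to normalise each page's copy and hash it, so pages with identical body text share a fingerprint:

```python
import hashlib
import re

def content_fingerprint(text):
    """Normalise page copy (lowercase, collapse whitespace) and hash it,
    so pages with identical body copy share a fingerprint."""
    normalised = re.sub(r"\s+", " ", text.lower()).strip()
    return hashlib.sha256(normalised.encode()).hexdigest()

def find_duplicates(pages):
    """Group URLs by fingerprint; any group of 2+ URLs is duplicate copy."""
    groups = {}
    for url, text in pages.items():
        groups.setdefault(content_fingerprint(text), []).append(url)
    return [urls for urls in groups.values() if len(urls) > 1]
```

This only catches exact duplicates; near-duplicates need fuzzier techniques (or a tool like Screaming Frog).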
Page speed is one of the most important parts of technical SEO and one of the few that Google has confirmed directly affects where your site ranks. It's important to note that some issues may be unfixable depending on your CMS, so focus on the issues you know you can resolve.
You can use free tools like Google's PageSpeed Insights or GTmetrix to get a better understanding of the site speed problems your site is facing.
A couple of key things to watch out for are:
- Webpage size: If your overall page weight is large, small tweaks like improving server response time aren't going to do a whole lot on their own. If this is a problem you're suffering from, look to fix it first.
- Image size: Look to losslessly compress any images over 100KB; oversized images can drastically impact site performance. Use a free tool like TinyPNG to do this.
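To find the compression candidates in the first place, you can scan a local copy of your site's assets for image files over the 100KB threshold. A small sketch (the extension list and threshold are the assumptions here):

```python
import os

IMAGE_EXTS = {".png", ".jpg", ".jpeg", ".gif", ".webp"}

def is_oversized(filename, size_bytes, limit_kb=100):
    """Flag image files larger than limit_kb as compression candidates."""
    ext = os.path.splitext(filename)[1].lower()
    return ext in IMAGE_EXTS and size_bytes > limit_kb * 1024

def oversized_images(root_dir, limit_kb=100):
    """Walk a directory tree and list every image worth compressing."""
    flagged = []
    for folder, _dirs, files in os.walk(root_dir):
        for name in files:
            path = os.path.join(folder, name)
            if is_oversized(name, os.path.getsize(path), limit_kb):
                flagged.append(path)
    return flagged
```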
This problem could be caused by a whole host of different things, but it's vital you check whether Google have ANY problems indexing your pages. You'll often find that if there is a problem, it won't affect just one page but many.
Your first port of call should be Google Search Console. It does a great job of breaking down any problems Google are having indexing your site, and why.
You can also use a dedicated SEO tool like Screaming Frog to get a better view of the pages Google aren't indexing.
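One common self-inflicted cause is a stray robots meta tag left over from staging. As a sketch, you can scan a page's HTML for a noindex directive with Python's standard HTML parser:

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collect the directives from any <meta name="robots"> tags."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "robots":
            self.directives += [d.strip().lower()
                                for d in attrs.get("content", "").split(",")]

def is_indexable(html_text):
    """A page carrying a robots meta noindex can't appear in search."""
    parser = RobotsMetaParser()
    parser.feed(html_text)
    return "noindex" not in parser.directives
```

The same check is worth running against the X-Robots-Tag HTTP header, which this HTML-only sketch doesn't cover.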
The number of mobile searches is climbing. Rapidly. So it shouldn't be much of a surprise to learn that the mobile usability of your website is vital to performing well in search engines. Not to mention the fact that Google now operates mobile-first indexing.
Once again, Google Search Console comes to the rescue here. It offers a report that gives you an idea of how many URLs are up to scratch in terms of mobile usability and what needs to be done to improve them.
It's a walk in the park getting your homepage indexed. But those really in-depth blogs you wrote that sit much deeper in your site architecture? With bad internal linking, those pages will struggle to see the light of day.
This issue loops back to our recommendation with site structure. But you should also be keenly aware of your internal linking. If there’s a valuable page on your site that’s much deeper in the structure, look at other areas of the site that can link out to it.
We mentioned duplicate content a little earlier, but you also need to watch out for pages that have 'thin' content. There's an old saying in SEO that content is king. And it's true. If there's a page you'd like to see in SERPs for your target keyword, you need to make sure there's enough content on the page for it to rank. At Embryo we value content highly, so if you're struggling to write, or to find the resource to write, content for your site, our content marketing team can help you.
If content is king, links are the queen of SEO. Bear in mind that search engine crawl bots traverse the web through the links they find, so if your webpage has dead links (or if someone is linking to you but to a dead page) it's going to damage your SEO efforts.
You can use a tool like Ahrefs to crawl your site for dead links. You can then export them and make your way through the list, either updating the link or removing it altogether.
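For internal links specifically, you can catch obvious breakages from a crawl export alone: any link target that isn't a known page on the site is a likely dead link. A sketch, using a made-up link map:

```python
def broken_internal_links(pages):
    """Given {url: [internal link targets on that page]}, return
    (source, target) pairs where the target isn't a known page,
    i.e. a likely dead internal link."""
    known = set(pages)
    return [(src, target)
            for src, links in pages.items()
            for target in links
            if target not in known]
```

In a real audit you'd also follow redirects and check HTTP status codes rather than relying on the page list alone.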
HTTPS pages are much more secure, and Google don't like plain old HTTP much any more. Most importantly, HTTP pages can end up presenting users with security warnings when they try to access your site. You can use Screaming Frog to find any HTTP pages on your site, or run a site: query in Google (for example, site:yourdomain.com -inurl:https) to list any that are indexed.
You also need to ensure that your domain redirects from HTTP to HTTPS. Otherwise, you could end up with duplicate pages. You can use a tool like HTTP Status to check that all HTTP variations of your domain redirect correctly.
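The check itself is simple to express: every variation of the domain should resolve to one canonical HTTPS URL. A sketch with the lookup injected, so a crawler, an HTTP client, or a recorded result set can all be plugged in (the resolver and URLs below are hypothetical):

```python
def verify_https_redirects(variations, canonical, resolve):
    """For each domain variation (http/https, www/non-www), follow it
    via the supplied resolve() callable and confirm it lands on the
    single canonical HTTPS URL. Returns {variation: passed}."""
    return {url: resolve(url) == canonical for url in variations}
```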
While schema isn't likely to have a direct effect on rankings, it can drastically improve the way some of your pages appear in SERPs thanks to rich results. And that can have a big impact on your organic click-through rates.
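As with breadcrumbs, schema goes on the page as JSON-LD. A minimal sketch for a product page (the product details are made up; real markup usually carries more fields, like availability and reviews):

```python
import json

def product_jsonld(name, price, currency):
    """Minimal schema.org Product markup with a nested Offer. With
    valid data, Google may show price details as a rich result."""
    return json.dumps({
        "@context": "https://schema.org",
        "@type": "Product",
        "name": name,
        "offers": {
            "@type": "Offer",
            "price": f"{price:.2f}",
            "priceCurrency": currency,
        },
    })
```

You can paste the output into Google's Rich Results Test to confirm it validates.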
This is a huge subject and warrants a blog of its own. We’ve put together a guide on 8 things to consider before launching an international SEO campaign here.
If your site has pages for different countries and languages, you'll need to implement hreflang. This ensures that Google don't treat the variations of a page created for different countries as duplicates. It also helps Google understand which countries' SERPs the pages should appear in.
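A sketch of generating the hreflang link tags for one page's variants (the locale codes and URLs are examples). Note that for the annotations to be valid, every variant must carry the full set of tags, including a self-reference:

```python
def hreflang_tags(variants, default_url):
    """Build <link rel="alternate" hreflang> tags for a page's
    language/country variants, plus the x-default fallback."""
    tags = [f'<link rel="alternate" hreflang="{code}" href="{url}" />'
            for code, url in variants.items()]
    tags.append(
        f'<link rel="alternate" hreflang="x-default" href="{default_url}" />'
    )
    return "\n".join(tags)
```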
Tech SEO can be difficult to get right, but it's important to remember that if you get the basics right, you're well on your way to a successful SEO campaign.