Search engine optimization (SEO) is the practice of improving a website's ranking on search engines in order to attract organic visitors who are looking for particular resources, services, or products through search engines such as Google.
There are many different disciplines to cover when doing SEO to improve a website's ranking, but one of the major subcategories is technical SEO (also sometimes referred to as on-site SEO or on-page SEO), which focuses on improving the website itself in order to elevate user experience, page speed, and the crawlability of the site.
This article is a deep-dive into what technical SEO is and how working on your website's technical SEO can help improve your site or blog's performance in search engine rankings. We'll go through some of the technical SEO basics for optimizing your website, as well as some technical SEO tips to improve your site without much effort.
As previously mentioned, there are many different techniques and areas of search engine optimization, with some of the most common ones being:
Keeping a good balance between how much work you put into the different areas will have a huge impact on how your website ranks, as neglecting certain parts of your SEO will have a negative effect on your website's performance despite the work you put into the others.
This is also the case with technical SEO: having a lot of backlinks and well-written content on your website will not be as beneficial on search engines if your website's performance and on-page SEO are poor.
Technical and on-page SEO, as the names suggest, cover the SEO related to a website's technical elements. Many aspects of technical SEO have been verified to affect performance on search engines by first-party resources such as Google's article on Maintaining your website's SEO. There are also technical SEO techniques that have not been specifically verified to have a positive effect on rankings, but that are often recommended and developed by SEO experts.
In this section we'll go through some of the most important elements to address when improving your website's technical SEO.
One of the most fundamental characteristics of a website with good UX is that pages load quickly. According to a Google article from 2017, 53% of website visits on mobile devices are abandoned if the site takes longer than three seconds to load. There is thus a very clear correlation between a website's load speed and retaining users, and Google has confirmed that page speed is used as a ranking signal, most notably with the 2018 "Speed Update" for mobile searches. It is recommended to keep your website's load time below 2 seconds, and a good page speed is often considered to be below the 1 second mark.
So how do you optimize your website's page speed? The performance of a website can be tied to many different things, but some of the most important areas to look at when improving load speeds (a few of which are sketched below) are:
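As a minimal sketch of a few common load-speed optimizations, the HTML snippet below defers a render-blocking script, opens an early connection to a third-party origin, and lazy-loads an offscreen image. The file paths and hostnames are hypothetical examples, not taken from a real site:

<!-- Defer script execution until the HTML has been parsed -->
<script src="/js/analytics.js" defer></script>

<!-- Open a connection to a third-party origin early -->
<link rel="preconnect" href="https://fonts.gstatic.com" crossorigin>

<!-- Only load offscreen images when the user scrolls near them;
     explicit width/height prevents layout shifts while loading -->
<img src="/images/chocolate-cake.jpg" alt="Chocolate cake" width="800" height="600" loading="lazy">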
One of the main jobs of search engines is to crawl and index the pages of all websites that wish to rank in search engine results. To do this efficiently, search engines use "crawlers" or "spiders": bots that visit websites and follow hyperlinks in order to get an overview of the different pages on a website that should be indexed and evaluated.
To further help these crawlers, there are a handful of useful files that you can include on your website that crawlers can read to get a better and deeper overview of your pages, which increases the chances of your website being indexed faster.
While crawler bots are capable of following internal links on a website to determine which pages exist, it is advised to also include a sitemap file: a list of absolute URLs to all the pages that exist on a website. The most commonly used format for sitemaps is XML, but a sitemap can also be provided as a plain text file or an RSS feed.
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://wasabee.app/</loc>
  </url>
  <url>
    <loc>https://wasabee.app/pricing</loc>
  </url>
  <url>
    <loc>https://wasabee.app/roadmap</loc>
  </url>
</urlset>
In the above sitemap, three URLs are disclosed, and these will be included in the indexing process of search engines. XML sitemaps can also include properties other than the URL, such as the <lastmod> field, which indicates when a resource was last modified and is often used on pages such as blog posts or other text resources that might change over time.
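For example, a <url> entry carrying a <lastmod> date (in W3C datetime format) could look like this; the blog URL here is just a hypothetical example:

<url>
  <loc>https://wasabee.app/blog/homemade-ramen</loc>
  <lastmod>2024-01-15</lastmod>
</url>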
The sitemap file is often located in the root of a website (for example www.wasabee.app/sitemap.xml), which makes it easier for crawlers and other parties to find your sitemap, though this is only a convention and not a requirement. If your sitemap has another name or location, you can submit it through the Sitemaps report in Google Search Console, or you can specify its location in your robots.txt file, which leads us to our next search engine file.
The robots.txt or "robots file" is another important technical SEO element: a file that tells search engine crawlers whether they are allowed to access a specific page or file on your website. This is convenient if there are certain areas of your website you don't wish to have crawled, such as an admin panel, though disallowing pages in your robots.txt does not guarantee that they stay out of the index, and it is advised to block indexing with a noindex directive instead. A primary use case of the robots.txt file is to steer crawlers toward the pages you would prioritize having indexed: most search engine crawlers have a so-called crawl budget that limits the number of pages they can crawl and index at a time, and the robots.txt file can help the crawler spend that budget on important pages such as blog posts rather than pages that are not intended for search engines, such as admin sections. You can read more on the use cases of the robots.txt file here.
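As a minimal sketch, a robots.txt that keeps crawlers out of a hypothetical admin section and points them to the sitemap could look like this (the /admin/ path is an assumption for illustration):

# Applies to all crawlers
User-agent: *
# Don't spend crawl budget on the admin section
Disallow: /admin/

# Tell crawlers where to find the sitemap
Sitemap: https://wasabee.app/sitemap.xml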
Search engines like Google have made it easier to summarize and display your website content on search engines through something called structured data. Also sometimes referred to as rich snippets or schema markup, structured data is a standardized way of providing search engines with information tied to your content, such as the ingredients and instructions of a recipe or the image for a blog post. Search engines can then use this information to show enhanced search results that include some of this information - for example by showing an image or rating next to a recipe.
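As a minimal sketch, structured data is commonly embedded as a JSON-LD script tag in a page's HTML. The recipe and its values below are hypothetical, but the @type and property names come from schema.org's Recipe type:

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Recipe",
  "name": "Chocolate Cake",
  "image": "https://wasabee.app/images/chocolate-cake.jpg",
  "recipeIngredient": ["200g flour", "100g sugar", "50g cocoa powder"],
  "recipeInstructions": "Mix the dry ingredients, add the wet ingredients and bake at 180°C for 35 minutes."
}
</script>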
If you wish to know more about structured data, we have an extensive article on how it can improve your blog or website's SEO.
Site structure refers to the way the different pages on your website are connected and interlinked. Having a good site structure makes it easier for both crawlers and users to navigate your website and predict where certain resources can be found.
Some of the most important characteristics of having a clean website structure are:
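To make this concrete, a food blog with a clean, hierarchical structure might organize its URLs like this; the paths below are purely a hypothetical example:

https://wasabee.app/                                  (home page)
https://wasabee.app/recipes/                          (category overview)
https://wasabee.app/recipes/desserts/                 (subcategory)
https://wasabee.app/recipes/desserts/chocolate-cake   (individual recipe)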
Auditing and testing your technical SEO is good practice and should ideally be done regularly to ensure that your website keeps performing well on search engines. There are many different tools and techniques, both paid and free, that you can use to test your website's technical SEO.
Some of the most common and free SEO tools for doing technical audits on different technical SEO factors are:
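As an example, Google's open-source Lighthouse tool can be run from the command line to produce a performance and SEO report for a page; the commands below assume you have Node.js installed and use wasabee.app as a stand-in for your own site:

# Install the Lighthouse CLI globally
npm install -g lighthouse

# Audit a page and open the report in your browser
lighthouse https://wasabee.app --view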
The tools mentioned above are incredibly useful for ensuring good technical SEO, but if you wish to have more in-depth and advanced SEO audits done, you may consider paid services such as Semrush or Screaming Frog, which offer tools for more extensive SEO analyses and audits.
While technical SEO audit tools are great for ensuring that your website follows good technical SEO practices, it is also important that you don't follow all of their advice blindly, as what is right can vary from site to site.
I hope this article has given you good insight into the fundamentals of technical SEO and how you can start auditing and improving your website's technical SEO in order to perform better in search engine results!
Wasabee is the easiest way to get up and running with a food blog and provides both hosting and an admin panel for creating recipes.
Search engine optimization (SEO) covers a very broad spectrum of practices, and neglecting one of them will have a negative effect on your website's search engine results, even if you excel in the other areas. Even with good-quality backlinks and well-performing content, your website will not do well unless you also focus on providing a good platform (the website itself) for your content, which is where technical SEO comes into the picture. Focusing on technical SEO can make your website rank more easily and faster while also providing a better experience for your visitors and readers.
While there are many different elements to attend to when optimizing technical SEO, some of the most important issues to take care of are:
There are various SEO audit tools that can help you identify whether these issues are present on your website. Frequent audits are a good way to ensure that your website upholds a good standard in terms of technical SEO.
It can be tempting to add lots of plugins and third-party extensions to your blog or website to make it more appealing and customized, but one of the biggest culprits behind poor SEO is slow-loading web pages. Avoid adding unnecessary plugins to your site, and opt for high-quality, low-overhead themes in order to improve page load speeds.