What is technical SEO and how does it impact my site?

Search Engine Optimisation. No other phrase makes marketers, growth hackers, and developers alike break into a cold sweat so succinctly. The days of stuffing keywords into a meta tag on your website to game Google are well over, but there are still many legitimate SEO optimisations you should be taking into consideration.

The factors that impact your search engine rankings can be divided into two broad categories: technical and non-technical. Technical SEO is how your site is put together, and it's what most people think of when they think SEO (pssst… it's also what we can help with 👉 website development). Non-technical SEO is everything external that can't be solved with code, like backlinks and domain authority. In this article we're going to talk about technical SEO, since that's what you have direct control over from day one.

Metadata

Metadata is data that describes data, and it's the biggest part of nailing technical SEO. This includes things like hints and descriptions for search engines, accessibility, and, yes, keywords. Forget using metadata as a way to "hack" your search engine rankings, because it doesn't work. Instead, think of configuring metadata as making your site as user-friendly to search engines as it (hopefully!) is to your human visitors.

Structured data

Structured data is an evolution of the original meta information included on a web page. It's written in JSON format (JSON-LD to be precise) and allows you to provide extensive information about a page in a machine-readable manner. Search engines can use this data to display rich, deeply linked results for your site.

Provide structured data by placing JSON-LD in a <script> element with an application/ld+json type on your page. You can specify multiple data blocks throughout your site. For example, we at Jellypepper provide information about our Organization, WebSite, and BlogPosting types where appropriate.

<script type="application/ld+json">
  [
    {
      "@context": "https://schema.org",
      "@type": "Organization",
      "name": "Jellypepper",
      "url": "https://jellypepper.com/"
    },
    {
      "@context": "https://schema.org",
      "@type": "WebSite",
      "name": "Jellypepper",
      "url": "https://jellypepper.com/"
    }
  ]
</script>

It's easiest to use a generator to create valid JSON-LD structured data. We recommend TechnicalSEO's Schema Markup Generator. Once you've created your structured data, test it using Google's testing tool.

Descriptive metadata

Beneath structured data, your site should still have valid descriptive metadata. This is mostly for displaying things correctly in search previews, bookmarks, and browser tabs.

Ensure that every page of your site has:

  • A valid <title>, with a maximum of 65 characters. Put your most important keywords here if possible.
  • A valid <meta name="description">, between 70 and 160 characters long. Think of this as an organic advertisement, so use clear messaging with a compelling call to action. Don't stuff it full of keywords; optimise for humans instead.

<title>An example website</title>
<meta name="description" content="A description of this example website" />

Viewport metadata

Viewport metadata is used to optimise your site for mobile devices. It's crucial to include this, since search engines like Google penalise websites that aren't mobile-friendly. Specify viewport configuration in a <meta name="viewport"> tag, where you can set width and an initial zoom scale.

<meta name="viewport" content="width=device-width, initial-scale=1" />

This snippet sets the width of the viewport to the width of the device, and the initial zoom level when the user visits the page.

Language metadata

If your site is available in multiple languages, you need to inform search engines which page they should list for a given locale or region. This is done with <link rel="alternate"> tags. Specify the locale in a hreflang attribute, and the corresponding URL in the href.

<link rel="alternate" hreflang="en" href="https://example.com" />
<link rel="alternate" hreflang="es" href="https://es.example.com" />
<link rel="alternate" hreflang="de" href="https://de.example.com" />

Even if your site isn't internationalised, you should still explicitly tell search engines which page to serve as the default when no locale matches, using the x-default value.

<link rel="alternate" href="https://example.com" hreflang="x-default" />

Canonical links

Canonical links tell search engines what the true source of a given document is, and what URL to display in results. For most pages, the canonical URL is the same as the page URL. But if you were to cross-post an article from your blog to Medium (for example), you can flag your original blog post as the canonical URL, giving it higher priority in search engine rankings as the original source of the article.

In either case, you should provide an explicit canonical URL for every page. Do so by adding a <link rel="canonical"> element to your document with the desired canonical URL as the href.

<link rel="canonical" href="https://example.com" />

Accessibility

Accessible websites are good websites, and there are so many reasons beyond SEO to ensure that your content is accessible by all users. Accessibility is also a huge field that we couldn't possibly do justice to in this article, so we're only going to lightly touch on the bare basics that impact SEO.

Image alt attributes

Every image on your website should have an alt attribute that accurately describes the content of the image. Not only do screen readers rely on alt text to describe images to blind users, but search engines also use the information it provides.

Be as descriptive as possible, and try to avoid generic words like "chart", "image", or "diagram". If the image is effectively contentless (eg: a decorative icon inside a button), you can provide an empty alt attribute instead.

<img
  src="/img/example.jpg"
  alt="An example of an image with an alt description"
/>
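
For a purely decorative image that carries no content, an empty alt attribute tells screen readers to skip it entirely (the icon path below is just a placeholder):

<img src="/img/decorative-icon.svg" alt="" />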

Descriptive link text

Search engines don't just read the URL of a link, they also read its text. Users, of course, do the same, so your link text should describe the content you're linking to. Avoid empty phrases like "click here", "learn more", "start", etc.
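
For example (the URL here is hypothetical), compare a vague link with a descriptive one:

<!-- Vague: says nothing about the destination -->
<a href="/pricing">Click here</a>

<!-- Descriptive: the destination is clear from the text alone -->
<a href="/pricing">View our pricing plans</a>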

Legible font sizes

Ensure that your website is readable by a majority of users. The base font size across most of the web is 16px, and this is a good minimum to try and stick to. Font sizes less than 12px are generally too small to be legible, and require mobile users to "pinch to zoom" in order to read. With this in mind, you should strive to have over 60% of the text on your page larger than 12px.
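
As a rough CSS sketch (the .caption class is just an illustration), you might set the base size explicitly and keep smaller text above the legibility floor:

html {
  font-size: 16px;
}

.caption {
  font-size: 0.875rem; /* 14px, comfortably above the 12px floor */
}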

Pro tip: If you use font sizes smaller than 16px on form elements (input, textarea, etc), many browsers zoom in automatically when they are focussed. This is a poor user experience and should be avoided.
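
A minimal CSS rule that avoids the auto-zoom:

input,
textarea,
select {
  font-size: 16px; /* prevents mobile browsers from zooming in on focus */
}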

Avoid legacy plugins

Search engines, assistive devices, and many mobile devices cannot read the content of Flash and related plugins, embedded in tags like embed, object, and applet. This means that not only is your content inaccessible to a large portion of users, but anything inside these tags won't show up in search results either.

If you are working with an existing plugin-based website, consider converting it to HTML.
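
For example, a video that once relied on a plugin can usually be replaced with the native HTML5 video element (the file path here is a placeholder):

<!-- Instead of a Flash <object> or <embed> player -->
<video src="/media/demo.mp4" controls></video>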

Tappable interactive elements

Ensure the interactive elements on your site (links, buttons, etc) are large enough to be easily tapped by mobile users. They should be 48px or larger on all sides (including padding), and have enough free space around them to avoid accidental taps (at least 8px between elements).
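
As a CSS sketch (the .button class is hypothetical), the sizing rules above look like this:

.button {
  min-width: 48px;
  min-height: 48px; /* large enough to tap comfortably */
  margin: 8px; /* keeps at least 8px of free space around the element */
}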

Technical considerations

Finally, there are some purely technical considerations to ensure your site is functioning well and is easily indexed by search engines.

Robots configuration

Robots configuration does exactly what it says on the tin: it tells robots how to treat your site. In this case the robots we're talking about are search engine crawlers, and there are two places they look for instructions: a <meta name="robots"> tag on the page, and a robots.txt file at the root of your site (ie: /robots.txt). If you want everything on your site to be indexable and searchable, all you need is a simple robots.txt config allowing search engines to go everywhere.

User-agent: *
Allow: /

You can also tell search engines other things about your site in robots.txt, like the location of your sitemap and the canonical host of the site (note that the Host directive is non-standard and only recognised by some search engines).

User-agent: *
Allow: /
Sitemap: https://jellypepper.com/sitemap.xml
Host: https://jellypepper.com
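
The page-level <meta name="robots"> tag mentioned above works the other way around: it's most often used to keep an individual page out of search results, for example:

<meta name="robots" content="noindex, nofollow" />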

HTTP status codes

Just like users, search engines get grumpy if your page has basic HTTP errors or is unreachable. Ensure your server returns either a 200 OK or a 301/302 redirect when a page loads. Returning a 404 Not Found or 500 Internal Server Error will result in the search engine de-indexing the page. Also, make sure you use a consistent trailing slash (or lack thereof): search engines don't particularly like constantly redirecting users between two variants of your page, e.g. jellypepper.com/work vs jellypepper.com/work/.
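
As an illustration only, assuming an nginx server, you could normalise trailing slashes with a single permanent redirect (a sketch, not a drop-in config):

server {
  # Redirect e.g. /work/ to /work with a 301, site-wide
  rewrite ^/(.*)/$ /$1 permanent;
}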

Anyway, that's it from us! Until next time.
