Before you even begin your SaaS SEO strategy, you’ll first want to optimize your website for search engines, also known as technical SEO.  

Technical SEO is often neglected by SaaS companies since it’s difficult to measure its direct impact on ROI and it doesn’t exactly fit into most SaaS budgets.

Even though it’s difficult for SaaS companies to measure the impact of technical SEO, it’s still an important element of SEO that shouldn’t be ignored.

Technical SEO for SaaS websites is the process of optimizing how search engines interact with your website, such as discovery, crawling, rendering, and indexing, but also includes other elements like improving user experience. 

If you neglect technical SEO, your site will have fewer pages indexed, meaning fewer pages ranking and driving traffic, and less overall growth for your SaaS.

In this article, you’ll find a technical SEO checklist you can use for any SaaS website. 

Be sure to check out this checklist if you want a full item list for SEO.

SaaS Technical SEO Tips You Should Know 

Check if Your Pages are Being Indexed

Even before you start creating content for SEO, you will want to review Google Search Console to see if search engines can index your website. 

Indexing is the process of search engines accepting website pages into their database and using them to match relevant queries.

Simply put, Google is pulling website pages from their index and using them to rank for keywords. 

So if search engines can’t index your content, it won’t rank.

That’s why indexing should always be a top priority for SaaS websites when it comes to technical SEO. 

If search engines can’t index your content, then your SEO efforts will all be for nothing. 

In Google Search Console, you can see which pages are indexed and which pages were excluded from Google’s index. 

There are a few reasons why content can be excluded from Google’s index, but the most common indexing issues are:

  • Noindex tags
  • Canonical tags
  • Redirects
  • 404s
  • Soft 404s
  • 5xx issues
  • Crawled - currently not indexed
  • Discovered - currently not indexed
  • URL blocked by robots.txt
  • Redirect error

Noindex tags are directives, so if Google sees one, they have to honor it and exclude that page from their index.

Noindex tags are often applied intentionally, but you should still review them anyway to see if they’re set on the right pages.

Check if Your Pages are Being Crawled

After you’ve reviewed your indexed pages, you’ll also want to check if search engines can crawl your website. 

The process of getting your content indexed on Google looks like this: discovery > crawling > rendering > indexing

So if your website can’t meet any of these requirements, your content won’t rank and you’ll be missing out on any traffic potential that page has.

It’s not uncommon to have non-indexed pages that should be indexed, so you’ll want to prioritize getting these pages indexed, as they could be driving traffic to your website and even contributing to revenue growth.

So review these excluded pages and prioritize them based on their value if they were indexed.  

But with crawling, similar to indexing, Google will list out reasons why they aren’t crawling your content. 

Most of the time, crawling issues come down to directives and crawl efficiency (and maybe even crawl depth for larger sites).

Check Your Robots.txt

Your robots.txt file is a directive for Google that dictates how they should be crawling your website. 

Your robots.txt file can prevent search engines from crawling certain sections of your website and prevent pages from being indexed.

Unfortunately, it’s common for website owners to disallow search engines from crawling their entire site, so this is something you’ll want to check as well. 

Say you set up your robots.txt like this:

  • Disallow: / 

You will be directing search engines to avoid crawling your entire site, since the trailing slash matches every path on your domain.

I’ve seen this happen multiple times and it can lead to some brutal results. 

So always check your robots.txt for any misused directives.
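For reference, a minimal robots.txt that avoids the mistake above might look like this (the disallowed path and domain are placeholders):

```txt
# Allow everything except a hypothetical internal search path,
# and point crawlers at the sitemap.
User-agent: *
Disallow: /search/

Sitemap: https://www.example.com/sitemap.xml
```

The key difference is that `Disallow: /search/` blocks one directory, while `Disallow: /` blocks the whole site.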

Check Your Crawl Efficiency

Crawl depth is the term commonly used when pages aren’t being crawled for reasons outside of your robots.txt, but a better term to use is crawl efficiency.

Crawl efficiency is the process of optimizing your site for crawling, so search engines only crawl pages you want them to while making it easy for them to crawl your website. 

Crawl issues are more common for e-commerce websites with a high amount of pages and pagination, but if your website isn’t optimized for crawling, then Google may struggle with discovering older pages.

As an example, say you have over 50 paginated pages; each paginated page will count as a click, or one level of crawl depth.

So by the time search engines reach page 50, they would have had to crawl through 49 pages before reaching their destination. 

Google recommends keeping pages within around 3-6 clicks of the home page to reduce crawl depth and improve your crawl efficiency.

I’ve even seen search engines give up on crawling around the 10th page before choosing to crawl other pages.

If you want to reduce the crawl depth search engines have to take, you can either reduce the number of paginated pages by showing more items per page or add more internal links.

A great way to visualize this is by using Screaming Frog’s force-directed crawl visualization. 

This visualization tool will allow you to get a bird’s-eye view of how search engines are crawling your URLs through internal links.

From there, you can decide which pages have a high crawl depth and need internal links. 
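If you export your internal links from a crawler, you can also sketch this crawl-depth check yourself. A minimal example, assuming a hypothetical link map where one post is only reachable through pagination:

```python
from collections import deque

def crawl_depths(links, start="/"):
    """Breadth-first search over an internal-link graph.

    links maps each URL to the URLs it links to; the returned dict
    maps each reachable URL to its click depth from the start page.
    """
    depths = {start: 0}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depths:
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

# Hypothetical site: /blog/post-b is only reachable through pagination.
links = {
    "/": ["/blog", "/pricing"],
    "/blog": ["/blog/post-a", "/blog/page-2"],
    "/blog/page-2": ["/blog/post-b"],
}
depths = crawl_depths(links)
deep_pages = [url for url, d in depths.items() if d > 2]
```

Pages that never appear in the result are orphaned; pages with a high depth are candidates for more internal links.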

Check if Pages are Being Discovered

The last part of the indexing process to review is discovery. You’ll want to check if Google can discover your URLs.

Again, discovery and crawling all factor into whether your content is indexable or not, so you’ll want to ensure Google is discovering and crawling your pages. 

Crawl efficiency still plays a part in the discovery stage, especially with paginated pages, but internal links help crawlers discover content on your website and reduce crawl depth.

And the more internal links a page has, the more important search engines see that page as, making it more likely to be crawled. 

Outside of your sitemap, you’ll still want to internally link every page on your website. 

This brings me to my next point, sitemaps.

Setting up a Sitemap.xml to Help Search Engines Discover Your URLs

A sitemap.xml is a file on your website that contains all indexable URLs on your website that you want search engines to find and crawl. 
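A bare-bones sitemap.xml looks something like this (the URLs and dates are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2023-01-15</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/pricing</loc>
  </url>
</urlset>
```

Each `<loc>` entry should be an indexable URL that returns a 200 response.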

This file is a hint and not a directive, meaning search engines can choose whether or not to honor it.

Search engines will typically honor your sitemap, but that doesn’t mean they’ll index every URL. 

And they’ll even crawl URLs outside of your sitemap if your robots.txt isn’t configured to stop them.

This is why crawl efficiency is important: you only want Google to crawl the pages you want indexed.

And having a sitemap.xml is the first step to influencing how search engines interact with your website. 

Check for Duplicate Content

You should also identify any pages on your website that may be considered to be a duplicate of another page. 

Duplicate content will not only cause search engines to show the wrong search result, but it can lead to content being devalued and a loss of trust with search engines. 

To find duplicate pages, you can use Google Search Console to find pages marked as “duplicate without user-selected canonical.” 

This error means that Google found two pages to be near-duplicates and chose to index one page over the other, regardless of whether or not they target different search intents.

To fix this, you’ll need to add self-referencing canonical tags to each page and optimize them both so each page can be considered unique.

 If you’re just copying and pasting content across your website, you’ll likely have your content marked as a duplicate. 

Identify Canonical Tag Issues That Are Preventing Your Pages From Being Indexed

Canonical tags are often the culprit for why content can’t be indexed. 

Again, canonical tags are a hint and not a directive, so search engines can choose whether they want to honor it or not, but they will take it as a hint.

You’ll likely come across a few canonical issues in Google Search Console like:

  • Duplicate without user-selected canonical
  • Google chose different canonical than user
  • Alternate page with proper canonical

An alternate page with proper canonical means your canonical was read and accepted properly; no other action is needed here.

Google will decide which page should be the canonical if you have pages without a canonical tag. 

Google will likely select that page or a similar page, but by adding a canonical tag, you can at least influence Google’s decision on page indexing.

This can either be a self-referencing canonical or a canonical tag referencing another page.
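In HTML, both cases are a single link tag in the page’s head; the URLs below are placeholders:

```html
<!-- Self-referencing canonical on https://www.example.com/pricing/ -->
<link rel="canonical" href="https://www.example.com/pricing/" />

<!-- On a parameterized duplicate like /pricing/?ref=nav,
     pointing at the preferred version: -->
<link rel="canonical" href="https://www.example.com/pricing/" />
```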

If Google ignored your canonical tag, then you’ll want to review the non-canonical page Google indexed to see why that is. 

Sometimes similar content without canonical tags will cause Google to index one over the other, even if they target different keywords. 

Canonicals can be a tricky game, but it’s important to review them to ensure Google is crawling and indexing the right URLs.

Having a Solid Site Structure and Information Architecture

How you structure the architecture of your site will influence how search engines (and users) interact with your website. 

Especially for SaaS websites, it’s important to build a site hierarchy that Google can easily understand.

To make your site architecture easily digestible for both search engines and users, you should create a logical hierarchy of categorized pages with lots of internal links that keep pages within 4-5 clicks of the home page. 

Having a simple architecture will allow search engines to easily discover your content while distributing link equity throughout your site.  

Identifying Keywords That May Be Causing Cannibalization Issues

Similar to duplicate content, you’ll also want to identify pages affected by keyword cannibalization.

While duplicate content concerns the copy itself, keyword cannibalization occurs when two pages target keywords with the same search intent.

While both pages are unique and indexable, they will lose out on visibility with one page being devalued and the other not ranking.

Google will likely devalue both cannibalizing pages if they can’t decide which one to use. 

To fix keyword cannibalization, you will want to optimize both pages for different search intents.

Meaning keywords that return different sets of search results.

As an example, say you have one page optimized for “SaaS marketing” and another optimized for “software as a service marketing,” both keywords will return the same set of search results.

If you want to fix this, you can optimize one page for “SaaS marketing” and another for a keyword like “how to market a software product,” since they both have different search results.

Only reoptimize a page if both keywords result in the same search results.

Optimize Your Website’s JavaScript for Search Engines

If you’re using JavaScript, you may run into some technical issues that can impact your SEO. 

Not only can JavaScript slow down your website, but it can also cause crawling and rendering issues for search engines, which can lead to indexing delays.

To prevent JavaScript from impacting your SEO, you should minimize and delete unused JavaScript wherever possible. 
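One common low-effort fix is deferring non-critical scripts so they don’t block rendering; the file names below are hypothetical:

```html
<!-- Downloads in parallel, executes only after the HTML is parsed -->
<script src="/js/chat-widget.js" defer></script>

<!-- Downloads and executes as soon as it's ready, independent of parsing -->
<script src="/js/analytics.js" async></script>
```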

You can use tools like Google’s PageSpeed Insights to see pages where JavaScript might be causing issues.

Check That Your HTML Elements are Present

One of the more important ranking signals for your website is your HTML elements like title and header tags. 

Not only are HTML elements ranking signals for Google, but they also help readers better understand your content through structure and accessibility (like alt text for those using screen readers).

If you’re missing these elements, you’re missing an easy opportunity to optimize your content for search. 
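As a quick reference, the core HTML elements look like this (all content is placeholder):

```html
<head>
  <title>SaaS Technical SEO Checklist | Example Co.</title>
  <meta name="description" content="A technical SEO checklist for SaaS websites.">
</head>
<body>
  <h1>SaaS Technical SEO Checklist</h1>
  <h2>Check if Your Pages are Being Indexed</h2>
  <img src="/img/gsc-report.png" alt="Google Search Console indexing report">
</body>
```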

You can use tools like Screaming Frog and Ahrefs to find any pages on your website that might be missing HTML elements.

Find and Fix Internal Broken Links

Another SaaS technical SEO tactic is to find and fix all broken internal links on your site. 

Not only do internal links help search engines discover content, but they also distribute link equity, also known as PageRank, which helps your pages perform better.

If you have broken internal links, especially from pages with backlinks, you’re wasting link equity that could otherwise benefit another page. 

Broken internal links are also bad for user experience as well. 

If searchers are expecting to land on a relevant page and land on a 404, they might just leave your site. 

You should always either update your internal links with a new link or use 301 redirects to help consolidate link equity into a relevant page.

If you want to find these broken links, you can use tools like Screaming Frog, Sitebulb, or Ahrefs.

From there, you’ll just need to find a relevant page to redirect to. 

Google has stated that redirecting directly to the home page can cause soft 404 issues, so you’ll ideally want to redirect to a relevant page other than the home page.

Update Internal Redirects with the Proper Destination

Fixing broken internal links ties into my next suggestion: fixing internal redirects. Not that redirects are bad for your site’s SEO, but too many of them can cause SEO-related issues like crawl delays and slow loading speed.

Not to mention that 301 redirects do not preserve 100% of link equity, so you’re better off replacing redirect links and broken links with a direct URL.

301 redirects work great as a temporary solution for internal links, but they shouldn’t be used as a permanent solution. 

Screaming Frog works great as a tool to identify any pages on your website that are linking to 301 redirects. 

Once the redirect has no remaining internal links, you can kill the redirect and leave it as a 404. 

If the page has existing backlinks then you can leave the redirect as is.

Fix Redirect Chains or Loops

Redirect Loops

A redirect loop is a redirect that circles back on itself, creating an infinite loop that makes your page inaccessible to both search engines and users.

Say you redirect Page A to Page B, but Page B redirects back to Page A. That creates an infinite redirect loop.

Redirect loops are very easy to fix: all you need to do is kill the redirect that’s causing the endless loop.

Redirect Chains

A redirect chain is a redirect with multiple hops in its path.

Redirect chains take search engines and users through multiple destinations before landing on a page with a 200 response code.

Say you redirected Page A to Page B, redirected Page B to Page C, and then finally redirected Page C to Page D. This will cause search engines to bounce from each page until they reach Page D.

Not only will your internal links be less impactful, but too many chains can frustrate crawlers and cause them to crawl other pages.

To fix redirect chains, you’ll need to find internal links pointing to them and update them with the current destination URL.

Once you have updated your internal links, you can kill the redirect. 
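Given a redirect map exported from a crawl, chains and loops can be spotted with a short script. A sketch, using a hypothetical redirect map:

```python
def follow_redirects(redirects, url, max_hops=10):
    """Follow a URL through a redirect map.

    redirects maps a source URL to its redirect target. Returns
    ("ok", final_url, hops) when the chain resolves, or
    ("loop", url, hops) when a URL repeats (or max_hops is exceeded).
    """
    seen = [url]
    while url in redirects:
        url = redirects[url]
        if url in seen or len(seen) > max_hops:
            return ("loop", url, len(seen))
        seen.append(url)
    return ("ok", url, len(seen) - 1)

# Hypothetical redirect map pulled from a site crawl.
redirects = {
    "/page-a": "/page-b",
    "/page-b": "/page-c",
    "/page-c": "/page-d",   # chain: 3 hops before a 200
    "/old": "/new",
    "/new": "/old",         # loop
}
```

Any result with more than one hop is a chain worth flattening; any loop needs one of its redirects killed.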

You can find both of these issues by running a site crawl through Ahrefs or Screaming Frog. 

Check that Google Can Render Your Content

Another aspect of SaaS technical SEO to consider is whether Google can render your content. 

If Google can’t render your content, they either won’t include it within their index or will be missing key information that could influence its rank position. 

While content rendering issues can occur with HTML, the majority of rendering issues occur with JavaScript. 

Especially in the case of infinite scroll, search engines may not be able to see all your content resulting in pages not performing as well through search. 

One way you can check page rendering is to use Google Search Console’s live test tool to see if Google can fully render your content. 

Review Which Pages Have a Noindex Tag Applied

As mentioned before, you should review which pages can and cannot be indexed. With noindex tags, you’re instructing search engines not to index a URL.

Noindex tags are a directive so search engines have to follow it. 
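The tag itself is a single line in the page’s head:

```html
<!-- Directive: exclude this page from search engine indexes -->
<meta name="robots" content="noindex">
```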

While it’s fine to add a noindex tag to pages you don’t want indexed, it’s better to just block search engines from crawling it. 

Search engines will still crawl pages marked with a noindex tag, but not as frequently. 

It’s also common for noindex tags to be applied to the wrong page. 

You can view in Google Search Console which pages have been excluded by a noindex tag.

I recommend filtering by your submitted sitemap to see if any indexable URLs have a noindex tag.  

Review Which Pages Have a NoFollow Tag Applied

Similar to the noindex tag, the nofollow tag is also a directive for search engines. 

Rather than excluding your page from the index, nofollow tags instruct search engines not to follow any internal links on that page. 

If your page has a nofollow attribute applied, link equity will not be distributed to other pages. 

This is commonly used for external links, but sometimes nofollow links are applied internally to conserve link equity. 
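Nofollow can be applied per link or page-wide; both forms are shown below with placeholder URLs:

```html
<!-- Link-level: don't pass link equity through this one link -->
<a href="https://example.com/partner" rel="nofollow">Partner site</a>

<!-- Page-level: don't follow any links on this page -->
<meta name="robots" content="nofollow">
```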

While nofollow tags are not as concerning as noindex tags, you’ll still want to review which pages have the attribute, especially pages with backlinks. 

You can use Screaming Frog to review all pages that have nofollow attributes applied to them and decide whether those pages need them or not.

Most likely they don’t. 

Know Your Response Codes and When There’s an Issue

While response codes aren’t necessarily technical SEO issues, you’ll still want to understand them and monitor them for any sharp increases. 

2xx status codes are fine, but you need to keep your eye on everything else.
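One lightweight way to monitor this is to bucket crawl results by status-code class so spikes stand out; the URLs and codes below are hypothetical:

```python
def bucket_status_codes(pages):
    """Group crawled URLs by status-code class (2xx, 3xx, 4xx, 5xx)."""
    buckets = {"2xx": [], "3xx": [], "4xx": [], "5xx": []}
    for url, code in pages:
        key = f"{code // 100}xx"
        buckets.setdefault(key, []).append(url)
    return buckets

# Hypothetical crawl output: (URL, status code) pairs.
crawl = [
    ("/", 200),
    ("/old-pricing", 301),
    ("/ghost-page", 404),
    ("/api/report", 500),
]
buckets = bucket_status_codes(crawl)
```

Comparing these buckets between crawls makes a sudden jump in 4xx or 5xx responses easy to catch.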

3xx

3xx responses won’t cause issues for the most part, but they can impact your SEO.

3xx issues commonly refer to redirects that take users and search engines to another URL outside of the one they landed on. 

While 3xxs work great as a temporary solution, having too many can slow down load speed and reduce the impact of your internal links.

Say you have a broken internal link that you want to fix. Redirecting the 404 to another relevant page is a solid option to conserve and transfer link equity; however, these redirected internal links do not fully transfer link equity.

While you may be conserving some link equity, you’d be much better off replacing the redirected internal link with a link that leads straight to the destination URL. 

2xx

The 2xx status code means that the request was successful and the server returned the page. This is ultimately the response code you want to see for the majority of your indexable pages.

4xx

A 4xx error occurs when a webpage does not exist or has restricted access.

These pages existed at one time but are no longer live (returning a 200) and have not been redirected elsewhere.

5xx

These are server error codes. The client made a valid request, but your server failed to complete the request. This most likely occurs when your page can’t load properly, which results in it not being available to the client-side user agent viewing it. 

Make Your Website Mobile Friendly

Mobile friendliness is an aspect of technical SEO you don’t want to ignore. 

Mobile friendliness is how your site is designed and optimized to render on a mobile device, such as a smartphone or tablet.

Simply put, optimizing your website to be mobile-friendly means your website content can shrink down to fit any screen while still allowing users to view all elements of your content.

You’ll want to make sure your text isn’t too small to read, clickable elements are properly spaced, and the content is fully visible on a mobile screen. 

You can use Google Search Console to identify any pages that have mobile usability issues.  

But in actuality, you don’t want to fix mobile usability on a page-by-page basis. Instead, you’ll want to make your entire website mobile-friendly. 

This can be done by using a responsive web design. 
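Responsive design starts with the viewport meta tag, which tells mobile browsers to render the page at the device’s width instead of a zoomed-out desktop layout:

```html
<meta name="viewport" content="width=device-width, initial-scale=1">
```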

Mobile-friendliness is a ranking signal for search engines, so it’s something you will not want to ignore. 

Improve Your Site’s Speed

To optimize your website to the fullest, you’ll also need to optimize your site’s page speed.

Like mobile usability, page speed is also a direct ranking factor for search engines.

Having a poor page speed won’t make or break your SEO, but it’s still worth optimizing. Not only for search engines but also for users. 

Your users don’t want to navigate a slow site, so it’s important to optimize your website for speed. 

You can use Google’s PageSpeed Insights tool to see which areas of your website can be improved, whether that’s compressing JavaScript, minifying CSS, or compressing images.

Google’s PageSpeed Insights tool will also give you estimates on time savings for each activity.

Check Your Website’s Security

Having a secure website is another ranking signal search engines use. 

If your HTTPS isn’t set up, then Google may see your website as less trustworthy and send fewer users to your site. 

Again, this won’t drastically affect your SEO, but it can make your website less competitive. 

So if you haven’t yet, set up your SSL certificate to secure your website.

Improve Thin Content

Not that thin pages are an issue, but they can help identify low-quality content that needs to be improved. 

You can use Screaming Frog to find which URLs are considered thin. 

However, you’ll want to review each URL individually in GSC to see if they’re ranking and driving traffic. 

The thin content error can be misleading since high-performing pages can still be marked as thin. So always check your data before you delete or change a “thin page.”

Internally Link Every Page on Your Website

Every page on your website should ideally be linked from pages outside of pagination and the sitemap. 

For priority or money pages, they should be easily discoverable by search engines and users starting from your home page. 

For SaaS websites, each post on your blog should be internally linked at least once and sit within 3-5 clicks of the home page.

SEO tools like Ahrefs and Screaming Frog are great for finding orphaned pages.

SaaS Technical SEO Optimization Recommendations

Optimize Your Meta Tags

Always optimize your meta tags. Especially title tags since they are a strong on-page ranking factor and can influence a searcher’s click.

No matter what, you’ll want to review each title tag with traffic potential and optimize them to increase visibility and drive more clicks.

Improve Page Loading Time

Page loading time is a direct ranking factor tied to page speed that should be prioritized. Items that can impact your load speed include:

  • Your site’s server
  • Page size
  • Image compression
  • Web hosting
  • Number of HTTP requests

You can also use Google’s PageSpeed Insights and Google Search Console to find any issues with your site’s loading speed.

Use Schema Markup When Possible

SaaS websites shouldn’t neglect schema markup either. Schema is used to describe your website and website pages to Google, and potentially searchers. 

You can also use schema markup to earn rich result snippets.

These can be useful when using FAQ schema to convey content on your website before users even click on it. 

There are hundreds of different schema types, but for SaaS websites, you can use schema for your reviews and monthly/annual pricing options.

This will not only help search engines understand your website, but it will also entice users to click on your search result.

Schema should always be implemented whenever a website is first set up. Organization schema should be used at the very least and article schema should be created for your blog posts.
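A minimal Organization schema block, with placeholder company details, looks like this:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Example SaaS Co.",
  "url": "https://www.example.com",
  "logo": "https://www.example.com/logo.png"
}
</script>
```

It goes in the head of the page, and you can validate it with Google’s Rich Results Test.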

Focus on Core Web Vitals

Another user experience-related signal you’ll want to optimize for is Core Web Vitals. 

Core Web Vitals, which became a ranking factor in 2021, measure your website’s user experience on both mobile and desktop across loading performance, interactivity, and visual stability.

These metrics are made up of three specific measurements: largest contentful paint, first input delay, and cumulative layout shift.

Core web vitals are a subset of Google’s page experience score.

Largest Contentful Paint (LCP)

Largest contentful paint (LCP) is how long your page takes to load from the point of view of a user.

Simply put, it’s the time it takes from clicking on a link to seeing the majority of the content on-screen. So this includes images, videos, and text.

Ideally, you’ll want every page on your site to have LCP within 2.5 seconds.

If you want to improve your site’s LCP, you can:

  • Remove unnecessary third-party scripts
  • Upgrade your web host
  • Set up lazy loading for images
  • Remove large page elements
  • Minify your CSS and JavaScript

First Input Delay (FID)

First input delay (FID) is the time it takes your page to respond when a user first interacts with it. That interaction can range from clicking on a link to entering contact information in a submission form.

With FID, Google considers 100ms or less to be a good response time for a user’s interaction with your page.

Cumulative Layout Shift (CLS)

Cumulative layout shift is how stable a page is as it loads. So if you have elements on your page moving around as the page loads, then you have a high CLS.

If you want to improve your site’s CLS, you’ll want to:

  • Set explicit widths and heights on images
  • Optimize font delivery
  • Reserve enough space for ads and embeddings
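The first item is usually the easiest win: giving every image explicit dimensions lets the browser reserve space before the file loads, so nothing shifts. The file name and values below are placeholders:

```html
<img src="/img/dashboard.png" alt="Product dashboard screenshot"
     width="1200" height="630">
```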

How to Optimize Your SaaS Content for Search

After the SaaS technical SEO checklist has been finished, you will want to optimize your content so it’s visible to searchers. 

While this article is primarily focused on SaaS technical SEO, there are a few on-page SEO elements that overlap. 

After all, the point of technical SEO is to get your site more organic traffic.

Always Start With Keyword Research

Even before you start creating content, you need to start with keyword research. 

This doesn’t mean picking keywords with the highest search volume from a keyword research tool like SEMrush or Ahrefs but researching keywords that align with your target audience. 

So researching their pain points, knowing what they are, and how they search for solutions. 

Why is the user searching this and what are they looking for? 

Once you have that idea in mind, you’ll want to tie it to a keyword. 

This is the process of finding the keyword that acts as a gateway to that topic. 

Next, you’ll need to review the search intent behind that keyword. How will you match the relevancy of your content to that specific keyword? 

Remember it’s not about keyword stuffing, it’s about creating content relevancy. 

And that can’t be achieved without nailing search intent first. 

So if the search results for your keyword are mostly product pages, you’ll want to avoid creating an article. 

But if the search results for your keyword are articles, then you’re good to go.

After you have the search intent identified, review the top-ranking results to see why that article is ranking.

Review the content, topics, and headers covered, along with word usage, entity usage, and how much supporting content they have.

Once you have all of that pulled together, use it in your content while creating content that’s better than the competition’s. 

You can cover angles of the topic that haven’t been covered yet and use more media, but ultimately, starting with keyword research will help you outrank your competition. 

Entity Optimization

An element of SEO that’s often neglected is entity optimization. 

Instead of relying on keywords, search engines now use natural language processing to understand content relevancy. 

So instead of keyword stuffing content, leverage semantically related terms that help search engines better understand your content while building relevancy. 

One great way to measure the entities is to use Google’s free NLP tool. 

This tool allows you to measure the salience (relevancy) and categorization of the entities found within your content. 

Ideally, your target keywords should be the most salient entities found within your content. 

For example, when I ran my SaaS SEO consultant page through the tool, the top entities were “SEO consultant” and “SaaS.” 

This means these two entities are the most relevant (important) entities found within my content.  

The tool is also almost 100% confident that this content can be categorized as “Internet marketing services.”

Focusing on natural language processing and content relevancy is the modern way of doing on-page SEO.
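
Google’s NLP tool does the salience math for you, but it helps to understand what the number represents. The sketch below is a deliberately rough, local approximation (just the share of mentions each entity accounts for); Google’s real salience scores also weigh position and grammatical role, and the `rough_salience` helper and sample text here are hypothetical.

```python
from collections import Counter
import re

def rough_salience(text: str, entities: list[str]) -> dict[str, float]:
    """Very rough stand-in for NLP salience: the share of entity
    mentions each entity accounts for within the text. Google's real
    scores also consider position and grammatical role."""
    lowered = text.lower()
    counts = Counter()
    for entity in entities:
        counts[entity] = len(re.findall(re.escape(entity.lower()), lowered))
    total = sum(counts.values()) or 1  # avoid division by zero
    return {e: round(c / total, 2) for e, c in counts.items()}

# Hypothetical page copy for illustration
page = ("As a SaaS SEO consultant, I help SaaS companies grow. "
        "An SEO consultant audits your site; SaaS growth follows.")
print(rough_salience(page, ["SEO consultant", "SaaS"]))
# → {'SEO consultant': 0.4, 'SaaS': 0.6}
```

If your target entity scores low on a check like this, it probably isn’t the focus of your page.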

Never Build Content Without a Content Strategy

As mentioned before, avoid picking topics at random from a marketing tool. 

Instead, build out a full content strategy that lists all the topics you’ll cover and how they translate into business results. 

For SaaS, the ultimate goal is revenue and leads coming from product demos and signups. 

With a full content marketing strategy set in place, your software product will be covered from end to end with content that captures searchers at all stages of the funnel.

After you’ve covered your SaaS content marketing strategy, find existing pages that are underperforming and optimize them with new/updated content. 

After optimizing your underperforming pages, you can then move to target any topics that were passed over during the initial content planning stage. 

But before you even start your content, you should always have a SaaS content marketing strategy in place. 

Incorporate Link Building into Your SaaS SEO Strategy

Link building is a pillar of any SaaS SEO strategy. 

If you want your content to rank, you need backlinks.

While some SaaS websites operate in less competitive niches that don’t require links, building backlinks is still a great way to safeguard your SEO efforts. 

Especially for the long term.

Not only will backlinks make your website more competitive, but they will also help your site rank for more keywords. 

Link building for SaaS is difficult, but you can use tactics like digital PR, HARO, guest posting, brand mentions, and broken backlink building to build high-quality links.

Don’t Neglect Technical SEO for Your Website

After reading this article, hopefully you now understand why technical SEO is important for SaaS websites. 

If you want your SaaS content to rank, you can’t ignore technical SEO. 

Six Tools That Can Help With Your SaaS Technical SEO Efforts

SaaS technical SEO is unfortunately something you can’t do without a tool. Unlike on-page SEO, technical SEO requires tools to understand page speed, mobile usability, page experience, and how Google is discovering, crawling, and indexing your pages.

 So here are a few options to consider before you get started.

Screaming Frog

Screaming Frog is likely the best option for SaaS websites when it comes to technical SEO. 

This tool can pretty much do it all. You can find pages with nofollow and noindex attributes, high crawl depth pages, orphan pages, and much more. 
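Crawl depth, one of the metrics Screaming Frog reports, is simply the minimum number of clicks needed to reach a page from the homepage. A minimal sketch of that calculation over a hypothetical internal-link graph (all URLs here are made up for illustration):

```python
from collections import deque

def crawl_depths(links: dict[str, list[str]], home: str) -> dict[str, int]:
    """Breadth-first walk from the homepage: depth = fewest clicks to
    reach a URL. Pages absent from the result are orphan pages."""
    depths = {home: 0}
    queue = deque([home])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depths:
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

# Hypothetical internal-link graph for illustration
site = {
    "/": ["/pricing", "/blog"],
    "/blog": ["/blog/seo-guide"],
    "/blog/seo-guide": ["/pricing"],
    "/orphaned-landing-page": [],
}
print(crawl_depths(site, "/"))
# "/orphaned-landing-page" never appears in the output: an orphan page
```

Pages buried at high crawl depth get crawled less often, which is why crawlers flag them.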

Screaming Frog is almost a no-brainer for any SaaS company looking to perform technical SEO on their website. 

Luckily, Screaming Frog is free for websites under 500 URLs. Larger websites will need a paid subscription, but the tool will very easily pay for itself.

Ahrefs

While Ahrefs is commonly known as a keyword research tool, they also have site-crawling features that match up with Screaming Frog. 

While you can’t see which pages are indexed the way you can in Google Search Console, you can still find issues like:

  • Orphan pages
  • Response code issues (3xx, 4xx, 5xx)
  • Canonical issues
  • Noindex and nofollow pages
  • Broken links
  • Broken redirects
  • Redirect loops and chains
  • File size compression
  • HTTPS pages pointing to HTTP

Plus many more. 
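
One of the checks above, finding noindex and nofollow pages, can be sketched with Python’s standard library. The `RobotsMetaParser` class below is illustrative only; a real crawl should also check the `X-Robots-Tag` HTTP header, which meta-tag parsing won’t catch.

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collects the directives from <meta name="robots"> tags."""
    def __init__(self):
        super().__init__()
        self.directives = set()

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "robots":
            for token in a.get("content", "").lower().split(","):
                self.directives.add(token.strip())

def robots_directives(html: str) -> set[str]:
    parser = RobotsMetaParser()
    parser.feed(html)
    return parser.directives

# Hypothetical page markup for illustration
html = '<html><head><meta name="robots" content="noindex, nofollow"></head></html>'
print(robots_directives(html))  # contains 'noindex' and 'nofollow'
```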

Personally, my favorite SaaS tech SEO stack includes Ahrefs, Screaming Frog, and Google Search Console. 

These three tools are more than enough to conduct extensive site audits for SaaS websites. 

Google Search Console

Google Search Console is the only tech SEO tool in this stack that’s non-negotiable. 

As mentioned before, most of SaaS technical SEO comes down to improving indexing for indexable pages. 

And the only way to know which pages are indexed is to use Google Search Console.

Google Search Console has an entire tab dedicated to your site’s indexing. Any pages that have been excluded from Google’s index will appear here with a reason attached. 

You can also submit your sitemaps in Google Search Console, and Google will tell you whether each sitemap is readable. 
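Sitemaps follow the sitemaps.org protocol, and a minimal one is simple enough to sketch with the standard library. The URLs below are placeholders:

```python
from xml.etree.ElementTree import Element, SubElement, tostring

def build_sitemap(urls: list[str]) -> str:
    """Builds a minimal XML sitemap per the sitemaps.org protocol:
    a <urlset> root with one <url><loc>…</loc></url> per page."""
    urlset = Element("urlset",
                     xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for url in urls:
        loc = SubElement(SubElement(urlset, "url"), "loc")
        loc.text = url
    return tostring(urlset, encoding="unicode")

# Placeholder URLs for illustration
xml = build_sitemap(["https://example.com/", "https://example.com/pricing"])
print(xml)
```

Real sitemaps usually add `<lastmod>` per URL, but `<loc>` is the only required child element.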

Google Search Console is necessary for all SaaS websites. 

Lumar (Formerly Known as DeepCrawl)

Similar to Screaming Frog, Lumar is another technical SEO tool that works great for SaaS websites.

It has much the same feature set as Screaming Frog at a similar price, so having both tools isn’t necessary. Picking one over the other comes down to preference.

SEMrush

Like Ahrefs, SEMrush also has site crawling features that include technical SEO action items. 

Both Ahrefs and SEMrush have similar site crawling features, so you don’t need both tools.

Again, this will mostly come down to personal preference. I personally prefer Ahrefs, but SEMrush is still a solid tool for SaaS technical SEO.

Google’s PageSpeed Insights

Similar to Google Search Console, Google’s PageSpeed Insights tool is another free tool that is a must-use for SaaS websites. 

This tool doesn’t even require an account; just enter your URL and Google will measure your site’s performance. 

It will grade your site based on performance (speed), accessibility, best practices, and even SEO.

In the performance tab, you’ll find metric reports on the items that are slowing your site down the most. 

Further below, Google will list out opportunities and action items to improve your site’s page speed. This includes:

  • Properly sizing images
  • Reducing unused JavaScript
  • Reducing unused CSS
  • Eliminating render-blocking resources

All SaaS websites should take advantage of this tool, especially those that use JavaScript. 
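PageSpeed Insights also has a public v5 API (`GET https://www.googleapis.com/pagespeedonline/v5/runPagespeed?url=<your-url>`), which is handy for checking scores in bulk. A sketch of pulling the performance score out of a response, using a trimmed-down sample payload in place of a live call:

```python
import json

def performance_score(psi_response: dict) -> int:
    """Pulls the Lighthouse performance score (as 0-100) out of a
    PageSpeed Insights v5 API response, where it is stored as 0-1."""
    score = psi_response["lighthouseResult"]["categories"]["performance"]["score"]
    return round(score * 100)

# Trimmed-down sample of a v5 response for illustration
sample = json.loads('''{
  "lighthouseResult": {
    "categories": {"performance": {"score": 0.87}}
  }
}''')
print(performance_score(sample))  # 87
```

A real response carries far more detail (the other category scores, audits, and field data when available); this only grabs the headline number.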

Get Started With Your SaaS Technical SEO Strategy Today

Looking for additional help getting your site’s technical SEO in order? Book a free SEO strategy call with me, and we’ll go over action items and any issues that may be holding your website back.

Recommended Reading