Google Search Console is a handy audit tool that should be used by every SEO, digital marketer, and anyone else who works on a live website.

It’s an all-in-one tool that can provide you with everything from technical SEO insights to keyword research data. I find it works even better as a keyword research tool for SaaS companies.

Even better, it’s also a FREE tool. All you have to do is verify your property, and you’re good to go. In this article, I’ll discuss how you can use Google Search Console as part of your SEO strategy to audit your website and increase organic traffic.

Why You Should Use Google Search Console for Audits

Learning the basics of GSC (adding a property, submitting a sitemap, checking whether a URL was indexed) is one of the fundamentals of being an SEO, but learning how to leverage Google Search Console should be a must for all digital marketers. If you work as an SEO in particular, you should live in GSC daily and be able to eat, sleep, and breathe it. If you know how to use it, Google Search Console is one of the most resourceful and effective SEO tools available.

That’s not to say that third-party tools aren’t great. They have their benefits, especially for more specific tasks like running a technical SEO audit or doing competitor research. But as an all-in-one tool, nothing comes close to being as valuable as Google Search Console. Below, I’ll explain a few ways GSC can be used to increase organic traffic.

Checking the Basics of the Site

Before you dive into the audit checklist, you’ll need to review the basics in Google Search Console to ensure everything is set up properly. This is simply to check for anything that could impede your SEO efforts, whether that’s a manual penalty or a security issue.

But most importantly, you’ll want to at least make sure the website was set up properly.

Google Search Console site tab

Website is Set Up Properly

The first step on our list is to check that the website has been set up properly in GSC. If you’re setting up GSC for the first time, you may have to wait a few days to receive any data, and even longer for experience data like the Core Web Vitals report. Once GSC starts collecting data, you’ll know you’re good to go. You can also check the Settings tab to confirm that the verification process went through and that you’re a verified user. Alternatively, the owner of the account can add you, but being a verified owner allows you to add and remove people from the account.

Google Search Console settings tab
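
If you work with GSC programmatically, you can also confirm which properties your account can see through the Search Console API. Here’s a minimal sketch, assuming you’ve created a service account, downloaded its key to a file named service-account.json (a placeholder name), and added that service account as a user on the property:

```python
from google.oauth2 import service_account
from googleapiclient.discovery import build

# Read-only scope is enough for listing properties.
SCOPES = ["https://www.googleapis.com/auth/webmasters.readonly"]
creds = service_account.Credentials.from_service_account_file(
    "service-account.json", scopes=SCOPES
)
service = build("searchconsole", "v1", credentials=creds)

# List every property this account can access and its permission level.
for entry in service.sites().list().execute().get("siteEntry", []):
    print(entry["siteUrl"], entry["permissionLevel"])
```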

Checking the Overview Report

While we won’t be auditing anything here, this is more to get a bird’s-eye view of your website. Here you’ll see a brief visual summary for each tab in GSC, which helps you understand what needs your attention and what you should prioritize. It’s a good tab to reference if you want a holistic view of your website’s performance.

GSC overview

Recent Manual Penalties

While manual penalties are pretty rare, you’ll still want to check for them anyway.

GSC manual actions

Manual penalties can be issued for a few reasons, including manipulating search results and going against Google’s guidelines. That said, Google hasn’t been hitting websites with manual penalties like it used to. Even if your site was hit with a manual penalty, Google will provide the reason for the action and how you can reverse it. Before you start anything else, check for manual penalties just in case. They will absolutely deflate your SEO efforts, and reversing them should always be the first priority.

Checking Any Security Issues

Like a manual penalty, any security issue that’s displayed should be a top priority. To see if there have been any recent security issues, go to Security & Manual Actions > Security Issues.

GSC security issues

Check if the Website Uses HTTPS Instead of HTTP

Under the page experience report, GSC will tell you if there are any remaining HTTP URLs found on your website. Remember, having a secure site is considered a ranking signal, so you’ll want to ensure that your website is redirecting to the proper, preferred HTTPS version, meaning there are no URLs with issues.

Page experience tab in GSC
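
A quick way to spot-check the redirect outside of GSC is to request the HTTP version of a page and confirm it ends up on HTTPS. A minimal sketch (example.com is a placeholder):

```python
import requests

# Request the insecure version and follow redirects.
resp = requests.get("http://example.com/", allow_redirects=True, timeout=10)

print("Final URL:", resp.url)  # should start with https://
print("Redirect chain:", [r.status_code for r in resp.history])  # ideally a single 301
```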

Reviewing Discoverability and Crawling

One of the first things you’ll want to dive into is whether Google can discover your URLs and if they’re able to crawl them. A few factors like the robots.txt and XML sitemap will influence this, but ultimately we’ll look at everything about discovery and indexing to ensure there are no technical SEO issues. If your content can’t be discovered, it won’t be crawled or indexed.

If your content can’t be crawled, it won’t be indexed. Whether a page is blocked via robots.txt or crawled but not indexed, we’ll check for everything that could be affecting discoverability and crawlability. You’ll want to start with this technical audit so you can ensure your pages are being properly found.

Crawl Stats

Crawl Stats in GSC

The first item in our audit is the crawl stats report, which provides insights into how Google is crawling your website. To access this data, scroll to the very bottom of GSC and select Settings. Under general settings, there’s a tab labeled Crawling; select Open Report. Within this report, you can see data on the total crawl requests for your website, the total download size, and the average response time.

Crawl stats in GSC

The total crawl requests figure is basically a way to see how often Google crawled your site over the last 90 days. The total download size is the sum of all the files and resources downloaded by crawlers over the last 90 days, including things like HTML, JavaScript, and CSS files. Lastly, the average response time is how long it took crawlers to retrieve your page content, excluding things like scripts, images, embedded content, and page rendering time. These three items are only the very beginning of this report; I’ll cover the other crawl stats you can review in the next sections.
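
The crawl stats report isn’t exposed through the API, so if you want to cross-check these numbers, your server access logs are the closest equivalent. A minimal sketch that counts Googlebot requests per day, assuming a standard combined-format log at a hypothetical path:

```python
import re
from collections import Counter

hits = Counter()
with open("/var/log/nginx/access.log") as log:  # path is an assumption
    for line in log:
        if "Googlebot" not in line:
            continue
        # Pull the date out of the timestamp, e.g. [10/Jan/2024:12:34:56 ...]
        match = re.search(r"\[(\d{2}/\w{3}/\d{4})", line)
        if match:
            hits[match.group(1)] += 1

for day, count in hits.items():
    print(day, count)
```

Keep in mind that anyone can spoof the Googlebot user agent, so treat this as a trend check rather than an exact count.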

Is Google Discovering New URLs?

At the very bottom left of the crawl stats report, you can also review how your URLs are being crawled, broken down into existing URLs and new URLs. “By purpose” boils down to two categories: discovery and refresh. Discovery URLs are URLs that Google has just discovered (never been crawled before), and refresh URLs are URLs that have been recrawled.

Crawl Requests in GSC

If anything, the discovery category is a byproduct of the recrawled URLs: your new URLs are almost always going to be found through existing content, usually in the form of internal links. So, if you’ve been publishing a lot of content recently, check this report to see if your new content is being discovered.

Is Google Crawling a High Volume of 404s?

Next, you can check which status codes are being picked up by crawlers and see whether Google is finding a high volume of 404 pages across your website. While 404s aren’t necessarily a reason for concern, you’ll still want to keep an eye on this to see if there’s an increasing trend, which can indicate that something on your website is broken and causing pages to return 404s.

404s in GSC

Another thing to check for is whether you have internal or external links pointing to these 404 pages. You can use either Ahrefs, SEMrush, or Screaming Frog to find them.
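
If you want a rough, code-based check before reaching for those tools, you can scan a handful of pages for anchors that point at known 404 URLs. A minimal sketch with placeholder URLs (the 404 list would come from your GSC export):

```python
import requests
from html.parser import HTMLParser
from urllib.parse import urljoin

dead_urls = {"https://example.com/old-post/"}                           # from the GSC report
pages_to_scan = ["https://example.com/", "https://example.com/blog/"]   # e.g. from your sitemap

class LinkCollector(HTMLParser):
    """Collects href values from every <a> tag on a page."""
    def __init__(self):
        super().__init__()
        self.links = []
    def handle_starttag(self, tag, attrs):
        href = dict(attrs).get("href")
        if tag == "a" and href:
            self.links.append(href)

for page in pages_to_scan:
    collector = LinkCollector()
    collector.feed(requests.get(page, timeout=10).text)
    for href in collector.links:
        if urljoin(page, href) in dead_urls:
            print(f"{page} links to dead URL {urljoin(page, href)}")
```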

Is Google Finding a High Volume of 5xx Errors?

Here, you can see whether Google is returning any server errors across your site. You can find these server errors in two ways:

  • You can go to the indexing report to see if any 5xx errors are being picked up
  • You can go back to the crawl stats report and, under “by response,” see if any 5xx status codes are being returned

By response in GSC

Is Google Crawling a High Volume of 301 Redirects?

A high volume of 3xx codes may highlight URL path changes, and a sharp increase might show that more pages are returning redirects than they should be. As with the 5xx errors, you can check for 301s the same way. You could either:

  • Check the indexing report to find “pages with redirect”
  • Head to “by response” underneath the crawl stats report

Are There Soft 404 Pages?

Within the indexing report, you can check whether any of your URLs are being registered as soft 404s.

While 404s don’t affect the crawling rate, soft 404s do, so you’ll want to fix those issues if they’re being registered.

Can Google Access Your Robots.txt File?

Within the crawl stats report, you can check if crawlers had any issues accessing your robots.txt.

To check this, go to host status and then robots.txt fetch. This will tell you if Google had any issues accessing your robots.txt. From there, you can see when a fetch failed and the number of times it failed.
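
You can also fetch the file yourself to confirm it’s reachable and returning the right status code. As noted later in the host status section, Google treats a 200 (file fetched) or 404 (no file) as a successful fetch, while 5xx responses and timeouts count as errors. A minimal sketch with a placeholder domain:

```python
import requests

resp = requests.get("https://example.com/robots.txt", timeout=10)

print(resp.status_code)   # 200 or 404 = fine; 5xx or a timeout = problem
print(resp.text[:500])    # eyeball the rules being served
```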

How is Google Crawling Your Site by File Type?

The crawl request breakdown also lets you see crawl request data based on file type. Each file type has a percentage next to it showing its share of the crawl requests made to your website.

You can use this report to find crawling issues with your website, such as slow response times or server issues. Look at the average response time to see if there’s been a recent issue, meaning your response time was slower than usual.

Type of Googlebot Crawling Your Site

In the crawl stats report you can also view the types of user agents that are visiting your website.

The different agents you’ll see here are:  

  • Googlebot Smartphone (smartphone)
  • Googlebot Desktop (desktop)
  • Googlebot Image (image)
  • Googlebot Video (video)
  • Page Resource Load
  • Adsbot
  • Other Agent Type

You’ll want to monitor the page resource load specifically to see if there have been any spikes recently. A sharp increase in page resource load might indicate issues with your resources, like images, videos, or the most likely culprit, JavaScript.

Retrieve Data From Google’s Index

When using the inspection tool, you can retrieve URL data from Google’s index (see the sketch after this list). You can check things like:

  • When your URL was crawled
  • How it was discovered
  • Indexing issues
  • Schema issues
  • Canonical issues
  • Rendering issues
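
This data is also available through the URL Inspection API if you need to check URLs in bulk. A minimal sketch, assuming the same service-account setup as earlier; the method path and field names follow my reading of the Search Console API, so verify them against the docs:

```python
from google.oauth2 import service_account
from googleapiclient.discovery import build

SCOPES = ["https://www.googleapis.com/auth/webmasters.readonly"]
creds = service_account.Credentials.from_service_account_file(
    "service-account.json", scopes=SCOPES
)
service = build("searchconsole", "v1", credentials=creds)

result = service.urlInspection().index().inspect(body={
    "siteUrl": "sc-domain:example.com",                 # placeholder property
    "inspectionUrl": "https://example.com/some-page/",  # placeholder URL
}).execute()

status = result["inspectionResult"]["indexStatusResult"]
print(status.get("coverageState"))    # e.g. "Submitted and indexed"
print(status.get("lastCrawlTime"))
print(status.get("userCanonical"), "->", status.get("googleCanonical"))
```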

Are These URLs Found in the XML Sitemap?

If you have a larger site, you’ll want to review your top-performing (or underperforming) URLs to see if they’re accessible through your sitemap. This ensures all important pages on your website are included in your sitemap and helps identify any issues or missing pages within it.
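
For a quick programmatic check, you can parse the sitemap and test whether a given URL is listed. A minimal sketch with placeholder URLs (it assumes a single urlset sitemap rather than a sitemap index):

```python
import requests
import xml.etree.ElementTree as ET

sitemap_url = "https://example.com/sitemap.xml"   # placeholder
target = "https://example.com/important-page/"    # placeholder

root = ET.fromstring(requests.get(sitemap_url, timeout=10).content)
ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
urls = {loc.text.strip() for loc in root.findall(".//sm:loc", ns)}

print(target in urls)
```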

Can Google Render These Pages?

The live test tool within the inspection tool allows you to view your rendered HTML alongside a screenshot of the URL. Just click test live URL to access these.

This will show you whether your page is being properly rendered by crawlers. From here, you’ll want to see if there are any issues with your JavaScript or with your page resources loading.

Was The Proper Canonical URL Chosen?

You’ll be able to check this in the coverage report, but another thing to check for is whether Google accepted your selected canonical.

If Google finds your URL too similar to another page, it might choose another URL as the canonical and exclude that URL from the index. Most canonicals are self-referencing, but sometimes Google does not respect your chosen canonical, as canonical tags are signals, not directives.

When Was Your Page Last Crawled?

Like the refresh crawl stat, you can also check whether your specific URLs are still being crawled.

If you see your content was last crawled over 6 months ago, you’ll want to improve your content’s discoverability. Usually, that will be through internal links. This can happen a lot for pages that are buried in pagination.

Request the Page to be Recrawled

Now, one of the best things about Google Search Console is that you can request pages to be recrawled and indexed. This is one of the tools that everybody in SEO should know about. You can:  

  • Request discovered not indexed pages to be crawled
  • Request crawled, not indexed pages to be recrawled
  • Request new pages to be crawled
  • Request pages that have been recently updated to be crawled

There are plenty of use cases for the “request indexing” button, and it’s one that should always be considered. It will help with any indexing issues you need fixed immediately.

I’ve seen pages get indexed within 15 minutes of requesting a crawl. I’ve also seen pages shoot up to page 1 after being updated and recrawled on request. With that being said, this method does not guarantee your page will be indexed; it’s still up to Google’s crawlers whether they’ll index your content or not. All this method does is add your URL to a priority crawl list. Even so, you should still be using this tool frequently.

View the HTML of a Page

You can also view the HTML of your page to ensure all of your metadata is being properly read. Sometimes you’ll find you’re missing a canonical tag, so I find this check to be occasionally helpful if a URL was excluded from the index.

Host Issues

Another item you can use to audit your website is the host status. This report will tell you if there have been any availability issues with your site over the last 90 days. The items you can check usually boil down to these three types of host issues:

  • Robots.txt fetch
  • DNS resolution
  • Server connectivity

Robots.txt Fetch

Within the host status tab, you can check whether Google has had any issues accessing your robots.txt. A successful fetch means that either the robots.txt file was successfully fetched (200) or that the file wasn’t present (404); all other errors would result in a fetch error.

DNS Resolution

The DNS resolution check shows whether Google had any issues with your DNS while crawling your site.

Google says that they consider DNS error rates to be a problem when they exceed a baseline value for that day.

Server Connectivity

Server connectivity shows whether Google came across any server connection issues while crawling your website. If your server was unresponsive, Google will provide an error message within this graph.

If the Sitemap Has Been Submitted

A very basic item you’ll want to check off with this audit is whether your sitemap URL was submitted in Google Search Console.

You’ll want to check that it was submitted properly without any issues. Under Indexing, you can find the Sitemaps tab, which will show you your website’s sitemap history. This also gives you the option to resubmit your sitemap if it’s changed in any way.

Here you’ll want to check off that your sitemap is still being read.
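
Submitting and checking sitemaps can also be done through the Search Console API, which is handy if you regenerate sitemaps automatically. A minimal sketch, assuming the same service-account setup as earlier (the account needs full, not read-only, permission to submit):

```python
from google.oauth2 import service_account
from googleapiclient.discovery import build

SCOPES = ["https://www.googleapis.com/auth/webmasters"]  # write scope
creds = service_account.Credentials.from_service_account_file(
    "service-account.json", scopes=SCOPES
)
service = build("searchconsole", "v1", credentials=creds)

site = "sc-domain:example.com"  # placeholder property
service.sitemaps().submit(
    siteUrl=site, feedpath="https://example.com/sitemap.xml"
).execute()

# Check what Google knows about each submitted sitemap.
for sm in service.sitemaps().list(siteUrl=site).execute().get("sitemap", []):
    print(sm["path"], sm.get("lastDownloaded"), sm.get("errors"), sm.get("warnings"))
```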

Does Your Sitemap Have Any Technical Issues?

Within each submitted sitemap folder, you can view whether your individual sitemaps are being read.

You’ll also want to check that your sitemaps have a “success” status. If there are any issues, Google will label that specific sitemap as an error.

Any Pages Marked as Discovered, Not Indexed?

Another thing you’ll want to check is whether there are any pages marked as “discovered – currently not indexed.”

This common error message basically means that Google knows a specific URL exists on your website, but it hasn’t initiated a crawl yet. I’ve seen this happen for a few reasons, with the most common being a lack of internal linking. Say you have a page that’s buried within pagination, more than 10 clicks deep from the homepage; Google’s crawler will likely give up trying to reach that content and choose to crawl other URLs instead. It’s very important to note that this isn’t an issue with crawl budget, it’s an issue with a poor internal linking framework. The closer your URL is to the homepage, the better chance it has of being indexed and performing well, so it’s crucial that you audit your internal links frequently.

To find these pages, go to coverage > excluded > discovered, not currently indexed.

Any Pages Marked as Crawled, Not Indexed?

Taking this a step further, crawled – currently not indexed means that your URL was discovered and crawled, but Google decided not to include it in their index. This can be caused by content quality, but there are plenty of other reasons why your content might be marked as crawled – currently not indexed.

If Google Can Access All URLs in Your Sitemap

Within your submitted sitemap, you can check whether there are any issues, whether that’s URLs that can’t be accessed or errors in the sitemap itself. This tab lets you confirm that everything is in working order.

Check if Google is Indexing Pages Disallowed in the Robots.txt

Another common issue is Google indexing pages that are disallowed in your robots.txt.

This happens fairly often. The reason isn’t that Google is ignoring your robots.txt; instead, it can happen when external links point toward that blocked page. While Google still won’t request or crawl the page, it can still index it using the information from those external pages. Snippets for these pages will either be limited or misleading, since Google is pulling information from external sources instead of your own website. If you need to remove these URLs from organic search, you can use GSC’s removal tool. You can also apply a noindex tag to remove the page from Google’s index.

When Auditing Your Google Search Console Account, Make Sure Search Engines Can Index Your Pages

Review How Many Pages Are Included Within Google’s Index

Here in the coverage report, you can review your indexed pages and what that portfolio looks like. You can check to see if there are any unnecessary pages that shouldn’t be indexed.

Check to See if There Are Pages That Shouldn’t Be Included in the Index

As I mentioned above, you’ll want to check whether any unnecessary pages are being indexed. This isn’t much of an issue (unless your site is dealing with crawl budget problems), but you’ll ideally want to do this to speed up the indexing of your important pages. You can either noindex them or block them entirely via the robots.txt. For smaller sites, noindex tags should be enough.

Check if there are any Canonical Issues

Also in the page indexing report, you can see how Google is responding to your canonical tags: whether they’re respecting them, ignoring them, or can’t find them. “Alternate page with proper canonical” is the response you want to see; everything else you’ll want to fix.

Are Your Videos Being Indexed?

In Google Search Console, you can actually check whether your videos are being indexed. In the case of my site, I typically embed videos directly from YouTube, but if you have media hosted on your site, you can likely get it indexed. Similar to the page indexing report, this report will show you whether a video was indexed, and if not, why.

Check if There Were Any Pages That Couldn’t be Indexed

Blocked by Robots.txt

These are pages that have been blocked through your robots.txt file. If your robots.txt was set up properly, this shouldn’t be much of an issue. However, you’ll still want to review it anyway to make sure the proper URLs are being excluded.

URLs Marked with a Noindex Tag

If you’ve applied a noindex tag to any of your URLs, those pages will appear here. The noindex tag is a directive that tells Google to exclude a specific page from its index. While this is something that’s set manually, you’ll want to review the excluded report to see if there are any negative trends in noindexed URLs. At the very least, check that none of your web pages were mistakenly added there.

Check the Coverage Report to Review URLs That Were Excluded From The Index

Like the discovered and crawled – currently not indexed statuses, this part of the page indexing report will show you URLs that have been excluded and why. Other reasons you’ll want to look out for include:

  • Page with redirect
  • Excluded by noindex tag
  • Duplicate without user-selected canonical
  • Duplicate, Google chose different canonical than user
  • Not found (404)

Mobile Usability

Another important factor to audit your website for is whether your pages are usable on mobile devices. Obviously, you’ll still want to prioritize usability on desktop, but mobile friendliness is where things get tricky. Sometimes text and elements on mobile can become stretched or bunched up, making them harder to read. One important thing to note is that Google will almost always prioritize or reward websites that are optimized for a positive user experience. As with everything in SEO, a lot of it comes down to providing a quality experience for the user.

Can Users Read the Text?

Here you can check whether there are any issues with your text being too close together. Google Search Console will provide you with a list of URLs where this error occurs. In any case, if your content is wider than the user’s screen, you’ll want to adjust your text so users can properly read your content.

Are Clickable Elements Too Close Together?

Another mobile usability element that you can check is whether your clickable elements are too close together. Whether that’s an image or a contact button, you’ll want to fix this so it isn’t causing an issue for mobile users.

Page Experience Audit

In the page experience report, you can check whether your URLs have any issues providing a good page experience and how many URLs are categorized as “good URLs.”

Check the Percentage of Good URLs for Mobile and Desktop

Ideally you’ll want to see 100% good URLs, but as we know that can be challenging at times. This page experience report is basically an overview of your website in terms of core web vitals.

Enhancements

Under the page experience tab, you can see the different types of rich results (schema) implemented on your website. Here you can check whether there are any issues with your schema and if they’re valid. If there is an issue with your schema, Google Search Console will show you which URL it is and what issue it’s having.

Does Your Site Have Any Issues with Breadcrumbs?

Having breadcrumbs throughout your site is almost always beneficial. Here you’ll want to check that your breadcrumbs have been set up properly and that they’re valid (can be displayed in search).

Core Web Vitals

Introduced with one of the more recent algorithm updates, Core Web Vitals is a set of metrics that grades your site based on performance. While Core Web Vitals are not a significant ranking signal, they can still move the needle slightly in the form of a minor rankings boost.

While this won’t cause your pages to jump significantly, it may make the difference if you’re ranking near the top of page 1. Core Web Vitals optimizations apply sitewide, so even moving a few ranking positions across many pages can make a difference.

Check for URLs Failing Core Web Vitals Metrics for Mobile

Head over to Experience > Core Web Vitals > Mobile and open the report to see data on URLs that are failing Core Web Vitals, passing (good URLs), or need improvement.

Google will provide a reason why each URL failed the Core Web Vitals check, so you won’t have much guesswork to do.

Check for URLs failing Core Web Vitals for Desktop

As with Core Web Vitals for mobile, underneath that report you can find the same set of metrics for desktop.

You’ll want to check this report too for any issues that are causing your URLs to fail the core web vitals check.

Internal Links

You can also use Google Search Console to audit your internal link profile. While there’s plenty more you can do to audit your internal link profile, Google Search Console makes it significantly easier to audit a few things like click depth or orphan pages.

All you need to do is go to links > top internal links.

Review the Top Linked Pages Report

From here you can review which pages have received the most internal links. These pages will likely have the most link equity so you can use these to link to other URLs.

Fix Orphaned Pages

One thing you can use the internal link section for is to find orphaned pages. Basically, orphan pages are pages on your website that don’t have a single link pointing to them from anywhere on the website.

To find these, click the arrow to sort from lowest to highest. If you have any URLs with between 0 and 5 internal links, you’ll want to work on linking to those.

Audit Your Backlink Profile

Similar to the internal link report, you can use Google Search Console to audit your backlink profile, whether that’s finding more linking opportunities, reviewing incoming links, or auditing your disavowed links.

Review Top Externally Linked Pages

Exactly like the internal link report, you’ll want to click on “top linked pages” under external links and use the URLs with the most backlinks (after auditing their quality) to link to other pages on your website.

Review the Disavow File

While not entirely necessary, you can review your disavow file in Google Search Console to ensure that those spammy links are still being ignored.

Review Your Site Performance Report

Moving away from the technical SEO audit side of your website, reviewing the performance report will allow you to review your website from an on-page SEO perspective. Here you’ll want to look at queries with the most impressions and clicks, traffic changes, and low-hanging fruit opportunities. This will all be under the performance report.

Which Queries Are Driving the Most Organic Traffic?

One thing you’ll want to review on your website is the queries that are driving organic traffic. Whether they’re branded or non-branded, they can give you a better idea of how users are discovering and interacting with your website. This will be under queries in the performance report.

Which Pages Are Driving the Most Traffic?

Similar to the query report, you can review which pages are receiving the most clicks and impressions based on the accumulated queries for each page. This report is invaluable when performing a SaaS content audit for your website and seeing which pages could benefit from being updated, consolidated, or deleted. I’ll talk about some of the different methods you could use to audit your existing content. This report is under pages in the performance report.
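
Both the query and page reports can also be pulled through the Search Analytics API, which is useful once you want more rows than the UI export gives you. A minimal sketch, assuming the same service-account setup as earlier, with placeholder dates and property:

```python
from google.oauth2 import service_account
from googleapiclient.discovery import build

SCOPES = ["https://www.googleapis.com/auth/webmasters.readonly"]
creds = service_account.Credentials.from_service_account_file(
    "service-account.json", scopes=SCOPES
)
service = build("searchconsole", "v1", credentials=creds)

response = service.searchanalytics().query(
    siteUrl="sc-domain:example.com",
    body={
        "startDate": "2024-01-01",
        "endDate": "2024-01-28",
        "dimensions": ["query"],   # switch to ["page"] for the pages report
        "rowLimit": 1000,
    },
).execute()

for row in response.get("rows", [])[:20]:
    print(row["keys"][0], row["clicks"], row["impressions"], round(row["position"], 1))
```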

Compare Traffic Over Different Periods

Another thing we can do in GSC is compare periods. Whether that’s month over month or year over year, we can review data that’s been changing over specific time periods.

To find this, all you need to do is toggle the date filter and then select compare.

Analyze Traffic Coming From Pages with Schema

Another filter you can use is to check the clicks coming in from schema. Whether that’s product schema or FAQ schema, you can filter these schema types to see whether and how they’re driving clicks. It’s also worth noting that this can be a useful way to measure performance if you’ve just recently added schema to a page.

If you start seeing a spike after adding schema, then you know the schema likely influenced it. So from an SEO testing perspective, it’s worth checking out.

Check Traffic Coming From Mobile and Desktop

One filter that’s often underused in Google Search Console is the mobile and desktop filter. You can filter by device to see which is bringing in the most traffic, but even better, you can use this to see queries coming from each device.

So if you know your website is getting significantly more mobile clicks, you’ll want to see which queries those are and if there are any other potential queries you can target. How people search on mobile can sometimes vary from desktop, but as we say in SEO, it all depends.
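
If you’re pulling this data via the API, the device segment is expressed as a dimension filter. A sketch of what the request body might look like (dates and values are placeholders):

```python
# Body for a device-segmented Search Analytics query.
body = {
    "startDate": "2024-01-01",
    "endDate": "2024-01-28",
    "dimensions": ["query"],
    "dimensionFilterGroups": [{
        "filters": [{
            "dimension": "device",
            "operator": "equals",
            "expression": "MOBILE",   # or DESKTOP / TABLET
        }]
    }],
    "rowLimit": 1000,
}
```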

Check Queries Being Clicked on For Google Business Profile

One of the beautiful things about UTM tracking is that you can use it for both Google Analytics and Google Search Console. While UTM tracking is typically used to segment GA data, you can also use it to segment Google Search Console data. You can do this by setting up UTM tracking for the URL that’s connected to your GBP. Once that’s published and starting to get clicks, you can see which queries are driving clicks for your Google Business Profile. This technique can help you identify opportunities to improve your business profile or local SEO strategy.

Find Striking Distance Keywords

Now, for this portion of the Google Search Console audit, you’ll want to find keyword opportunities that can be considered quick wins for your website. Finding striking distance keywords or pages can help increase traffic as well. Striking distance pages or keywords can be anything ranging from the bottom half of the first page to the top half of the third page; basically, a page or keyword that’s close to ranking well but isn’t quite there yet. Even if you’re ranking on the first page, most of the traffic is almost always distributed to the first five results. Everything after that is going to see a fraction of those clicks.

So if you find a page or keyword that’s currently ranking between position 6 and position 20, we would consider that within striking distance, or a low-hanging fruit keyword. You’ll need to set custom filters to find these keywords. This can be done for both queries and pages, but first, select the three little bars on the right-hand side labeled “filter rows.”

Once you have that selected, it will give you five options (top search query/page, clicks, impressions, click-through rate, and average position).

To find pages/keywords with striking distance potential, set the date range to the last 28 days so you’re working with more current rankings. After that, choose the position filter and set it to show keywords ranking below the 20th position (you can play around with the number; 20 is usually where I start). Once the filter is set, you should see all of the pages/keywords on your site that rank below that number. My personal blog doesn’t have a ton of content published, but to use it as an example, I’ll set the filter to show keywords that rank below position 55, which is on the sixth page of results.

You’ll usually receive a solid list of keywords from this filter, but to make sure I’m targeting valuable keywords, I’ll also set a filter for impressions. Again, you can play around with this number, but I’ll usually set the filter to show impressions greater than at least 10. This will allow me to filter out the junk and provide me with a solid list of keywords that are worth optimizing for.

After these two filters are set, you can click on the impressions tab to show the keywords/pages in descending order. This will show you the results for pages/keywords with the highest search volume within your position range. Those will always be the keywords to start with.
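
The same filtering can be scripted against a CSV export of the performance report. A minimal sketch that keeps queries ranked between positions 6 and 20 with at least 10 impressions, sorted by impressions (file name and column headers are assumptions about the export format):

```python
import csv

with open("Queries.csv", newline="", encoding="utf-8") as f:
    rows = list(csv.DictReader(f))

# Keep queries in striking distance with enough impressions to matter.
striking_distance = [
    r for r in rows
    if 6 <= float(r["Position"]) <= 20 and int(r["Impressions"]) >= 10
]
striking_distance.sort(key=lambda r: int(r["Impressions"]), reverse=True)

for r in striking_distance[:25]:
    print(r["Top queries"], r["Position"], r["Impressions"])
```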

After you’ve identified a page or keyword with striking distance potential, you should research how that query/page is performing in the SERPs and optimize or update your content according to the top results. This doesn’t mean stealing your competitors’ work and content; instead, you should reverse engineer the top 3 SERP results to see what search engines consider relevant for that query. Once you have a better idea of your topic, it’s time to update your page. When doing this, always make sure your content is better written than your competitors’ and is the most comprehensive resource on that particular topic. This is necessary to outrank your competitors and overtake their SERP real estate.

Check If Your Pages Are Properly Targeting Search Intent

One of the most important things in SEO is properly targeting search intent. If you’re not starting your keyword research around search intent, you’re almost always destined to fail. Checking for search intent using Google Search Console is very easy and only works for existing content. All you’ll want to do is manually inspect each URL within the query tab.

Filter the query positioning to only show queries on or near the first page. Take a query to the SERPs and review it to see if your page is relevant to the results being shown. If not, review the query report for a query that’s more aligned. Also, review the SERPs to see what’s ranking. If the results are transactional pages while your page is informational, you won’t have much of a chance of ranking. But if the ranking pages have informational intent, your content will have a better chance. To take this a step further, you’ll also want to review each of the top-ranking articles to see how you can beat them. Look at the topics covered, word count, and internal links. Once you have that research nailed down, apply it to your own content. That’s how you align your content with search intent.

Audit Existing Content for Dead Pages

Another SEO audit you can do for your website is using GSC to find dead/zombie pages. This tip will apply more for websites with a larger volume of pages and blog posts, but to perform this SEO audit, we’ll basically want to look for any URLs on our site that have stagnated organically. These are pages that:  

  • aren’t receiving any traffic
  • aren’t ranking for any queries
  • don’t have any search potential

The reason we’ll want to run this audit is to prevent crawlers from wasting resources on these low-value pages and to preserve sitewide authority. Having a large volume of pages with no clicks can hurt your SEO efforts, since it signals to search engines that your site is filled with low-quality content, and search engines only want to show their users high-quality, trustworthy content. To prevent this, you’ll want to identify these dead pages, delete them, and then add a redirect to a relevant page. Even better, you can potentially turn this garbage into gold: if you have a page covering the same topic that’s performing better in search, you can take the content from the deleted page and fold it into that relevant page. As I mentioned previously, Google loves to see fully comprehensive content, so if the dead content can provide additional value, absolutely add it there. It will improve both the depth and the performance of the remaining page. Finding these pages is relatively easy.

All you’ll want to do is set a date range for the last 16 months, a click filter of less than 1, and an impressions filter of less than 100.
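
The same thresholds are easy to script against a pages export covering that 16-month window. A minimal sketch (file name and column headers are assumptions about the export format):

```python
import csv

with open("Pages.csv", newline="", encoding="utf-8") as f:
    rows = list(csv.DictReader(f))

# Pages with effectively no clicks and very few impressions.
dead_pages = [
    r for r in rows
    if int(r["Clicks"]) < 1 and int(r["Impressions"]) < 100
]

for r in dead_pages:
    print(r["Top pages"], r["Clicks"], r["Impressions"])
```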

Traffic Drop Analysis

Also in the performance tab, you can compare performance data for specific periods. Comparing results month over month, year over year, and so on. Just set a date filter and switch to compare. From there, you can choose the time frame you want to compare.

Using this tool, you can see queries or pages that have decreased or increased within a specific time period.
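
If you’d rather crunch the comparison outside the UI, you can export the pages report for each period and diff the clicks. A minimal sketch with hypothetical file names and assumed column headers:

```python
import csv

def clicks_by_page(path):
    with open(path, newline="", encoding="utf-8") as f:
        return {r["Top pages"]: int(r["Clicks"]) for r in csv.DictReader(f)}

current = clicks_by_page("Pages_current.csv")    # e.g. last 3 months
previous = clicks_by_page("Pages_previous.csv")  # same window a year earlier

deltas = {page: current.get(page, 0) - clicks for page, clicks in previous.items()}
for page, delta in sorted(deltas.items(), key=lambda kv: kv[1])[:20]:
    print(page, delta)   # biggest losers first
```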

Once we identify these pages, we’ll want to see the cause of the drop. Traffic drops usually result from:  

  • Competitors overtaking your positioning
  • Content decay
  • Search intent changes
  • Content having outdated or even misleading information
  • Weak CTAs

Whatever the diagnosis may be, you’ll want to review the SERPs to see what Google is showing and improve your page based on that. In most cases, you’ll have to improve the quality of your content by making it more comprehensive, easier to read, and better than your competitors’ content. Adding media (audio, video, pictures, etc.) is also a great and easy way to improve the quality of a page. Search engines love to see media included within a page, since it improves the searcher’s experience and makes your content more engaging.

Cannibalizing Pages

Another useful way to audit your site’s performance is to find cannibalizing content on your website. This technique is basically a way to find content that may be competing with another URL. Not only will one piece of content be hidden from search, but it can also devalue the performance of the ranking article. Consolidating or deleting the cannibalizing article is a good way to prevent this. To find these cannibalizing pages, search for an exact query within GSC. Once you have your query set, look at the pages tab to see if two URLs appear for that query. Once that’s been identified, review the content on both pages to see if they’re too similar.

Most likely they are, and you’ll benefit from merging them. As a cautionary reminder, check the data before you do any of this. If those pages are still receiving solid traffic in different ways, you can leave them. But if one is performing and the other isn’t, you should consolidate or delete it.
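
If you want to check for this at scale rather than one query at a time, pull rows with both the query and page dimensions (via the API or an export) and flag queries that return more than one URL. A minimal sketch with hypothetical sample data standing in for real rows:

```python
from collections import defaultdict

# Rows shaped like a Search Analytics response with dimensions ["query", "page"]
# (sample data only).
rows = [
    {"keys": ["gsc audit", "https://example.com/gsc-audit/"], "clicks": 120},
    {"keys": ["gsc audit", "https://example.com/search-console-guide/"], "clicks": 15},
    {"keys": ["saas seo", "https://example.com/saas-seo/"], "clicks": 80},
]

pages_by_query = defaultdict(list)
for row in rows:
    query, page = row["keys"]
    pages_by_query[query].append((page, row["clicks"]))

for query, pages in pages_by_query.items():
    if len(pages) > 1:   # multiple URLs competing for one query
        print(query, pages)
```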

Outdated Content

Another easy on-page tip using GSC is to find content that may be outdated, whether that’s content targeting a previous year or content containing outdated information. The easiest way to find these pages is to filter your queries by a year (2020, 2021, 2022) to find queries on your site that are driving clicks from those years. You can also use regex to find a range of years instead of going year by year.
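
A pattern like the one below covers a range of years in one pass; the same expression can be pasted into GSC’s custom regex query filter. A minimal sketch (the sample queries are made up):

```python
import re

# Match queries mentioning an older year.
year_pattern = re.compile(r"\b(2019|2020|2021|2022)\b")

queries = ["best seo tools 2020", "saas seo guide", "keyword research 2022"]
print([q for q in queries if year_pattern.search(q)])
# ['best seo tools 2020', 'keyword research 2022']
```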

Content with Potential

Going off of the content audit, you’ll want to identify pages that have ranking potential.

While this may sound similar to striking distance keywords, you’ll instead be looking for keywords that aren’t currently close to ranking, but have potential to be heavy hitters with a rewrite. These pages will be low in clicks and positioning, but high in impressions. While you could delete or consolidate them, they still have potential to drive value to your site. These pages will likely need to be realigned with search intent and given a full on-page makeover with additional content. Don’t neglect these pages as they could also be easy wins for your site.

High Impressions and Positioning, Low CTR

Sometimes your content is sitting in a spot where it should be driving traffic but isn’t for whatever reason. Competitors could be stealing your potential traffic using engaging meta descriptions and titles.

To find these pages, set a position filter for 5 and above, a click filter for clicks below 1, and then sort by impressions. Then review each URL in the SERPs to see why it isn’t driving traffic. Are competitors stealing your clicks? Are your title and meta description boring? Are there opportunities to add schema? These are the types of things you’ll want to look for when you audit your Google Search Console.

Now You’re Set to Audit Your Own Google Search Console Account

Hopefully that was comprehensive enough; this is basically a complete way to audit your website in Google Search Console. Need help with a SaaS SEO strategy? Feel free to schedule a free SEO strategy call with me. I’m a SaaS SEO consultant with over 4 years in the industry.
