How to Perform a Technical SEO Audit: Explained in One Post


One of the hardest parts of conducting an SEO audit is understanding where to start.

Even when audits are performed, the recommended actions often take a long time to complete. In fact, the steps from some SEO audits never get implemented at all.

Audits do not need to be thousands of pages long, nor should they require deep technical knowledge to understand.

An excellent SEO audit efficiently communicates a prioritized list of actions and justifications, clearly setting out the steps that need to be taken to improve organic search performance.

It is as easy as that.

In this article, we will walk you, step by step, through the method of performing an audit.

We will show you everything you need to do to audit your site, find and prioritize the problems that could be holding back your site’s performance, and learn the steps you need to take to improve your organic traffic.

What is an SEO Audit?

Before we dig deep into auditing your site, let us first explore what an SEO audit actually is. 

To put it simply, an SEO audit is the process of identifying issues that could stop your site from ranking on Google and other search engines.

Why is it So Important?

If there are problems that mean your site cannot be correctly crawled and indexed, you have toxic links, or your content does not stand up against your competitors’, you will miss out on organic traffic.

Sales and Competition

When your website is missing out on traffic, you are also missing out on sales. It will be your competitors who enjoy that traffic and those conversions.

If there is any motivator to encourage you to audit your site, it is knowing that your competitors could be capturing traffic and conversions that you could otherwise be reaping the benefits of.

If you are not auditing your site regularly, it is easy to miss opportunities for growth, accidental issues introduced in the latest development roll-out, or simply things you could have done better.

Crawl Before You Walk

Before we can diagnose your site’s problems, we have to know exactly what we are dealing with. Therefore, the first step is to crawl your entire website.

Crawling Tools

I suggest using Screaming Frog’s SEO Spider to complete the site crawl.

Alternatively, if you want a free tool, you can use Xenu’s Link Sleuth; however, be warned that this tool was created to crawl a site to discover broken links. It reports a site’s page titles and meta descriptions, but it was not designed to perform the level of analysis we are going to discuss.

Crawling Configuration

Once you have chosen a crawling tool, you need to configure it to behave like your preferred search engine’s crawler. First, set the crawler’s user agent string to match that crawler (e.g., Googlebot).

Next, you should determine how you want the crawler to manage various Web technologies.

There is an ongoing discussion about the intelligence of search engine crawlers: are they glorified curl scripts, or full-blown headless browsers? It is not entirely clear.

By default, I would recommend disabling cookies, CSS, and JavaScript when crawling a site. If you can diagnose and fix the problems found by this “dumb” crawler, that work will also resolve most of the issues encountered by smarter crawlers.

Then, you can shift to a smarter crawler for circumstances where a dumb crawler just won’t cut it.

Ask the Oracles

The site crawl gives us a lot of information, but to take the audit to the next level, we need to consult the search engines. Unfortunately, search engines do not give access to their servers, so we will have to settle for the next best thing: webmaster tools.

Most of the major search engines offer a set of diagnostic tools for webmasters, but for our purposes, we will focus on Bing Webmaster Tools and Google Webmaster Tools. If you have not yet registered your site with these services, now is a good time.

Now that we have consulted the search engines, we also need input from the site’s visitors. The simplest way to get that input is through the website’s analytics.

The Web is served by an ever-expanding list of analytics packages, but for our purposes, it does not matter which package your site uses. As long as you can examine your site’s traffic patterns, you are good to go for the upcoming analysis.

At this point, we are not finished collecting data, but we have enough to begin the analysis so let us get this party started!

SEO Audit Analysis

The actual analysis is broken down into five broad sections:

  1. Accessibility
  2. Indexability
  3. On-Page Ranking Factors
  4. Off-Page Ranking Factors
  5. Competitive Analysis

Accessibility

If users and search engines cannot access your website, it might as well not exist. With that in mind, let us make sure that your site’s pages are accessible.


Robots.txt

The robots.txt file is used to restrict search engine crawlers from accessing sections of your website. Although the file is handy, it is also an easy way to accidentally block crawlers.

As an example, the following robots.txt entry restricts all crawlers from accessing any part of your site:
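
```
User-agent: *
Disallow: /
```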

Manually check the robots.txt file, and make sure it is not limiting access to critical sections of your site. 

Robots Meta Tags

The robots meta tag tells search engine crawlers if they can index a particular page and follow its links.

When examining your site’s accessibility, you want to identify pages that are accidentally blocking crawlers.
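For example, a page that serves the following tag in its head tells crawlers not to index the page and not to follow any of its links:

```html
<meta name="robots" content="noindex, nofollow">
```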

HTTP Status Codes

Users and search engines cannot access your website’s content if you have URLs that return errors.

You should identify and fix any URLs that return errors. If the page behind a broken URL is no longer available on your website, redirect the URL to an appropriate replacement.

Speaking of redirection, this is also an excellent opportunity to inventory your site’s redirection techniques.

Be sure the site uses 301 HTTP redirects because they pass the most link juice to their destination pages.
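As a quick way to act on a crawl’s status-code data, here is a minimal Python sketch. The `(url, status_code)` pairs are a hypothetical crawl export, not the actual output format of any particular tool:

```python
def classify_status(code: int) -> str:
    """Bucket an HTTP status code for an audit report."""
    if 200 <= code < 300:
        return "ok"
    if code == 301:
        return "permanent redirect"     # passes the most link juice
    if 300 <= code < 400:
        return "temporary redirect"     # e.g. 302/307 -- consider a 301 instead
    if 400 <= code < 500:
        return "broken (client error)"  # e.g. 404 -- fix or redirect
    return "broken (server error)"      # 5xx -- investigate with your developers

def audit_statuses(pages):
    """Group a list of (url, status_code) pairs by audit bucket."""
    report = {}
    for url, code in pages:
        report.setdefault(classify_status(code), []).append(url)
    return report
```

Running this over a full crawl export gives you a ready-made section of the audit report: every broken URL to fix, and every temporary redirect that should probably become a 301.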

Site Architecture

Your site architecture defines your website’s overall structure, including its vertical depth and its horizontal breadth at each level.

When assessing your site architecture, determine how many clicks it takes to get from the homepage to other important pages.

Also, evaluate how well link authority flows through the site, and make sure the most critical pages are prioritized.

Ideally, you want to aim for a flatter site architecture that takes advantage of both horizontal and vertical linking opportunities.
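To make “clicks from the homepage” concrete, here is a small Python sketch. It assumes you can export the crawl as a mapping from each page to the pages it links to (an assumption about your tooling, not a built-in export):

```python
from collections import deque

def click_depths(links, homepage):
    """Breadth-first search over the internal link graph.

    links: dict mapping each page URL to the list of URLs it links to.
    Returns each reachable page's click depth from the homepage;
    the shortest path found first wins.
    """
    depths = {homepage: 0}
    queue = deque([homepage])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depths:
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths
```

Pages with a high depth (or missing from the result entirely) are the ones a flatter architecture should pull closer to the homepage.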

Flash and JavaScript Navigation

Even the world’s best site architecture can be undermined by navigational elements that are unavailable to search engines.

Search engine crawlers have become much smarter over the years, but it is still safer to avoid Flash navigation and to treat JavaScript navigation with caution.

To evaluate your site’s usage of JavaScript navigation, you can perform two separate site crawls: one with JavaScript enabled and another with it disabled. You can then compare the resulting link graphs to identify the sections of the site that are inaccessible without JavaScript.
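Comparing the two crawls can be as simple as a set difference. This sketch assumes each crawl is exported as a collection of discovered URLs:

```python
def js_dependent_urls(crawl_with_js, crawl_without_js):
    """Pages discovered only when JavaScript was enabled.

    These are the sections of the site that are invisible to any
    crawler (or search engine) that does not execute JavaScript.
    """
    return set(crawl_with_js) - set(crawl_without_js)
```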


Indexability

We have identified the pages that search engines are allowed to access. Next, we need to discover how many of those pages are actually being indexed by the search engines.

Site: Command

Most search engines offer a “site:” command that enables you to search for content on a particular website.

You can use this command to get a rough approximation of the number of pages indexed by a given search engine.

Compare that indexed count against the number of pages found in your site crawl. There are three broad scenarios:

  1. The index count and the actual count are roughly equivalent – this is the ideal scenario; the search engines are successfully crawling and indexing your site’s pages.
  2. The index count is larger than the actual count – this scenario usually implies that your site is serving duplicate content.
  3. The index count is smaller than the actual count – this scenario implies that the search engines are not indexing many of your site’s pages; double-check their accessibility.
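The comparison can be sketched in a few lines of Python; note that the 10% tolerance here is an arbitrary assumption, not an official threshold:

```python
def index_health(indexed_count, crawled_count, tolerance=0.1):
    """Compare a 'site:' index count against the crawl's page count."""
    if crawled_count == 0:
        return "no crawlable pages found"
    ratio = indexed_count / crawled_count
    if abs(ratio - 1) <= tolerance:
        return "healthy"                      # counts roughly equivalent
    if ratio > 1:
        return "possible duplicate content"   # more indexed than crawled
    return "pages missing from the index"     # fewer indexed than crawled
```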

Index Sanity Checks

The “site:” command enables us to look at indexability from a very high level. Now, we need to be a little more granular.

Specifically, we need to make sure the search engines are indexing the site’s most significant pages.

Page Searches

Hopefully, you already found your site’s high-priority pages in the index while performing “site:” queries. If not, you can search for a particular page’s URL to check whether it has been indexed.

If you do not find the page, double-check its accessibility.

On-Page Ranking Factors

Up to this point, we have examined the accessibility and indexability of your site.

It is time to turn our attention to the characteristics of your site’s pages that affect its search engine rankings.

For each of the on-page ranking factors, we will examine page-level characteristics for individual pages and domain-level characteristics across the entire website.

In general, the page-level analysis helps identify specific examples of optimization opportunities.

The domain-level analysis helps define the level of effort required to make site-wide corrections.

1.   Information Architecture

Your site’s information architecture defines how information is laid out on the site. It is the blueprint for how your site presents information.

During the audit, you should ensure that each of your site’s pages has a purpose. You should also confirm that each of your targeted keywords is represented by a page on your site.

2.   Keyword Cannibalization

Keyword cannibalization refers to the situation where your site has multiple pages that target the same keyword. Want to learn more about keyword research? Don’t worry, we have got you covered.

When multiple pages target the same keyword, it creates confusion for the search engines and, more importantly, for visitors.

To identify cannibalization, you can create a keyword index that maps each keyword to the pages on your site that target it. When you spot collisions, you can merge the competing pages or repurpose them to target alternate keywords.
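The keyword index described above can be sketched in Python. The page-to-keyword mapping is assumed to come from your own keyword research:

```python
def find_cannibalization(page_keywords):
    """Find keywords targeted by more than one page.

    page_keywords: dict mapping each page URL to the list of
    keywords it targets.
    """
    keyword_index = {}
    for page, keywords in page_keywords.items():
        for kw in keywords:
            # Normalize case so "Red Shoes" and "red shoes" collide.
            keyword_index.setdefault(kw.lower(), []).append(page)
    return {kw: pages for kw, pages in keyword_index.items() if len(pages) > 1}
```

Each entry in the result is a collision to resolve, either by merging the pages or by retargeting one of them.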

3.   Duplicate Content

Your site has duplicate content if multiple pages contain the same content. Unfortunately, these duplicates can be both internal and external.

You can identify duplicate content on internal pages by building equivalence classes from the site crawl.

These classes are essentially clusters of identical or near-duplicate content. For each cluster, you can then choose one of the pages as the original and treat the others as duplicates.
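A simple way to build those equivalence classes is to hash each page’s normalized text. This Python sketch only catches exact duplicates after whitespace and case normalization; true near-duplicate detection would need something like shingling or SimHash:

```python
import hashlib
import re

def duplicate_clusters(pages):
    """Group pages whose normalized text is identical.

    pages: dict mapping each URL to that page's body text.
    Returns a list of clusters, each a list of duplicate URLs.
    """
    clusters = {}
    for url, text in pages.items():
        normalized = re.sub(r"\s+", " ", text).strip().lower()
        digest = hashlib.sha1(normalized.encode("utf-8")).hexdigest()
        clusters.setdefault(digest, []).append(url)
    return [urls for urls in clusters.values() if len(urls) > 1]
```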

Off-Page Ranking Factors

The on-page ranking factors play an essential part in your website’s position in the search engine rankings, but they are only one piece of a much larger puzzle.

Next, we are going to focus on the ranking factors that are driven by external sources.


Authority

A site’s authority is determined by a combination of factors, most of them link-based.

To evaluate your site’s authority, SEOmoz provides two critical metrics: Domain Authority and Page Authority. Page Authority predicts how well a particular page will perform in the search engine rankings, and Domain Authority predicts the performance of an entire domain. You can learn more about Domain Rating and Domain Authority from this article.

Both metrics aggregate various link-based features to give you a simple way to compare the relative strengths of individual pages and entire domains.

Social Engagement

As the Web becomes more social, your website’s success increasingly depends on its ability to spark social conversations and attract social mentions.

Each social network has its own form of social currency. Twitter has retweets. Facebook has Likes. Google+ has +1s. The list goes on.

When analyzing your site’s social engagement, you should quantify how well it is accumulating social currency on each of the major social networks.

Additionally, you should evaluate the authority of the individuals who are sharing your site’s content. Just as you want backlinks from high-quality sites, you want mentions from reputable and highly influential people.

Competitive Analysis

Just when you thought we were done, it is time to start the analysis all over again for your site’s competitors.

I know it sounds painful, but the more you know about your competitors, the easier it is to identify their vulnerabilities.

SEO Audit Report

After you have examined your site and the sites of your competitors, you still need to distill all of your notes into an actionable SEO audit report.

Here are three fundamental tips for efficiently presenting your findings:

  1. Write to multiple audiences. The meat of your report will include very technical observations and recommendations. However, it is important to understand that the report will not always be read by tech-savvy individuals. So, when writing the report, be sure to keep other readers in mind and provide helpful summaries for executives, managers, and anyone else who might not have a working knowledge of SEO.
  2. Prioritize. Regardless of who actually reads your report, respect their time. Put the most critical issues at the start of the report so that everyone knows which items are the most important.
  3. Provide actionable suggestions. Do not give generic advice like, “Write better titles.” Give specific examples that can be acted on immediately to make a positive impact on the site. Even if the recommendations are extensive in scope, try to offer concrete first steps to help get the ball rolling.

Concluding Thoughts

As the old saying goes, “There is more than one way to skin a cat.” That is certainly true when it comes to performing an SEO audit, so I would really love to hear your questions and comments below.

I will respond to everyone, and since I most probably broke this year’s record for the longest post, I urge you to break the record for most comments!