

SEO 7 min read

Fed Fix: How To Perform an SEO Site Audit


Written by Dallin Porter

Marketing Director @ Galactic Fed


Expert reviewed by Dallin Porter

Marketing Director @ Galactic Fed

Published 01 Sep 2020

At Galactic Fed, we use white hat best practices with all of our clients to ensure higher-quality rankings. Part of these best practices is performing an SEO audit. Read on for the best way to conduct an SEO audit, and how you can turn your findings into better ranking results.

What Is an SEO Site Audit?

In simplest terms, a site audit is a process for evaluating the search engine friendliness of a website in multiple areas. Think of it as a comprehensive evaluation of the site’s overall performance.

Site audits are an essential tool to evaluate how easily a search engine can discover, crawl, and index everything from individual elements of a webpage all the way up to an entire domain. To better deliver a comprehensive site audit, you must understand how search engines crawl and index a site.

How Google Crawls and Indexes Web Pages

Crawling and indexing are two distinct tasks. Crawling is when Googlebot (the search bot software Google uses to collect documents from the web and build a searchable index for Google Search) looks at all the content and code on a web page and analyzes it. Indexing, on the other hand, is when that same page becomes eligible to be included and shown in Google’s search results.

Googlebot building a searchable index for the Google Search engine.

Source: SearchEngineWatch

Here’s a brief overview of how Google crawls websites:

  1. Your website is always being crawled, provided that it is correctly set up to be available to crawlers. Google’s “crawl rate” refers to the speed of Googlebot’s requests. The fresher your content and the more authoritative backlinks, social shares, and mentions your site earns, the more frequently Googlebot will crawl it and the more likely your site is to appear in search results.
  2. Google’s routine is to access a site’s robots.txt file first. A robots.txt file tells search engine crawlers which pages or files the crawler can or can’t request from your site. Any pages marked “Disallow” will not be crawled. Keeping your robots.txt file up to date is important: a technical website audit should cover the coverage and syntax of your robots.txt file and explain how to fix any existing issues.
  3. Google reads the sitemap.xml next. A good XML sitemap acts as a roadmap of your website that leads Google to all your important pages, allowing Google to find them quickly. Because of how different websites are constructed and optimized, web crawlers may not automatically crawl every page or section. Some content benefits more from a professional, well-constructed sitemap: dynamic content, lower-ranked pages, expansive content archives, and PDF files with little internal linking. Sitemaps also help Googlebot quickly understand the metadata within categories like news articles, video, images, PDFs, and mobile.
  4. Search engines crawl sites with an established trust factor more frequently. If your web pages have gained significant Domain Authority (DA), we have seen Googlebot award the site additional “crawl budget.” The greater the trust and niche authority your business site has earned, the more crawl budget you can expect to benefit from.
  5. Use of meta robots tags. Some sites use meta robots tags on their pages, which give search engines instructions on how they would like the search engines to crawl or index parts of their website. You can find a list of meta robots tags in The ultimate guide to the meta robots tag.
  6. Use of canonical tags. The canonical tag lets the webmaster show Google the preferred URL of a web page. Self-referencing canonical tags are beneficial because URLs often get linked to with parameters and UTM tags appended. Most importantly, canonical tags should be used when your content also appears on other websites: they help prevent duplicate content issues and tell search engines that your version of the page is the preferred, original one.
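
To see the robots.txt step in practice, here’s a minimal sketch using Python’s standard-library `urllib.robotparser`. The robots.txt content and the example.com URLs below are hypothetical placeholders, not taken from any real site:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content for illustration only.
ROBOTS_TXT = """\
User-agent: *
Disallow: /private/
Allow: /

Sitemap: https://example.com/sitemap.xml
"""

def crawl_allowed(robots_txt: str, user_agent: str, url: str) -> bool:
    """Return True if the given user agent may fetch the URL per robots.txt."""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return parser.can_fetch(user_agent, url)

def list_sitemaps(robots_txt: str):
    """Return any Sitemap URLs declared in robots.txt (empty list if none)."""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return parser.site_maps() or []

print(crawl_allowed(ROBOTS_TXT, "Googlebot", "https://example.com/blog/post"))  # True
print(crawl_allowed(ROBOTS_TXT, "Googlebot", "https://example.com/private/x"))  # False
print(list_sitemaps(ROBOTS_TXT))  # ['https://example.com/sitemap.xml']
```

An audit can run a check like this against every key page to confirm nothing important is accidentally disallowed.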
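
The meta robots and canonical checks (steps 5 and 6) can likewise be automated. Below is a rough sketch using Python’s standard-library `html.parser` to pull both tags out of a page’s HTML; the sample markup and example.com URL are made up for illustration:

```python
from html.parser import HTMLParser

class HeadTagAudit(HTMLParser):
    """Collect the meta robots directive and canonical URL from a page's HTML."""

    def __init__(self):
        super().__init__()
        self.meta_robots = None
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "robots":
            self.meta_robots = attrs.get("content")
        if tag == "link" and attrs.get("rel", "").lower() == "canonical":
            self.canonical = attrs.get("href")

# Hypothetical page markup for illustration only.
HTML = """<html><head>
<meta name="robots" content="noindex, nofollow">
<link rel="canonical" href="https://example.com/page">
</head><body></body></html>"""

audit = HeadTagAudit()
audit.feed(HTML)
print(audit.meta_robots)  # noindex, nofollow
print(audit.canonical)    # https://example.com/page
```

Flagging pages where `meta_robots` contains “noindex”, or where `canonical` points somewhere unexpected, surfaces exactly the indexing issues described above.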

A full site audit requires many components. If any of the terminology is unfamiliar, check out our Essential SEO Guide for Beginners, part of our SEO Series.

Audit Outcomes

The key to a successful SEO site audit is to deliver action items, not just insights. You’ll want clear takeaways and prioritized actions, such as fixing 10 broken links on the homepage or compressing oversized image files to cut load time.

Avoid superfluous detail wherever possible. Just deliver what to fix, and briefly explain in the “Details” column why it needs to be fixed.

Different sites will have different issues, so tailor the audit to highlight the main problems for your own website.

The main tasks to carry out in the site audit are:

  • Finding Broken Links / Status Code Errors - Broken links are links on the site that don’t work. Cleaning them up improves user experience and makes content within your website easier for both visitors and search engines to discover.
  • Page Speed Optimization - Page load speed is important for SEO because it has a strong correlation with bounce rate. The longer a page takes to load, the more likely the user is to leave. Google registers this behavior as a “bounce,” and high bounce rates are associated with poorer rankings. The goal is to recommend site improvements that increase page speed.
  • Mobile Usability - Mobile-friendliness is a direct ranking factor for Google, so it’s important to make sure that every key page type passes Google’s mobile usability test.
  • Using Screaming Frog to Audit Site - Screaming Frog is an SEO tool (free for small crawls) that will crawl your website and spit out pages of information. You can view all of your website’s internal and external links, response codes, URLs, metadata, and directives (and that’s just scratching the surface).
  • User Interface (UI) or User Experience (UX) Visual Audit - the goal of this process is to identify areas that cause friction for site visitors, or ways to help users accomplish the goals you want. In plain terms: you browse the site and look for anything that gets in the way.
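
The first task above, checking status codes, is easy to script. Here’s a minimal sketch using Python’s standard-library `urllib.request`; the `audit-bot/0.1` user-agent string is an arbitrary placeholder:

```python
from typing import Optional
from urllib.error import HTTPError, URLError
from urllib.request import Request, urlopen

def is_broken(status: Optional[int]) -> bool:
    """A link is broken if the host was unreachable or returned a 4xx/5xx code."""
    return status is None or status >= 400

def check_status(url: str, timeout: float = 10.0) -> Optional[int]:
    """Return the HTTP status code for a URL, or None if the host is unreachable."""
    req = Request(url, method="HEAD", headers={"User-Agent": "audit-bot/0.1"})
    try:
        with urlopen(req, timeout=timeout) as resp:
            return resp.status
    except HTTPError as err:
        return err.code   # 4xx/5xx responses still carry a status code
    except URLError:
        return None       # DNS failure, refused connection, etc.

def find_broken(urls):
    """Yield (url, status) pairs for links that appear broken."""
    for url in urls:
        status = check_status(url)
        if is_broken(status):
            yield url, status
```

Feed `find_broken` the list of links Screaming Frog (or any crawler) extracts, and the output becomes the “fix these links” action items the audit should deliver.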

Now that you have all the information you need, you can move forward with your first comprehensive technical SEO site audit. Stay tuned to the Galactic Fed blog for more SEO guides, tips, and how-tos, including The Essential SEO Guide for Beginners and The Definitive Guide to SEO Keyword Research.
