So, you’ve written lots of powerful content, used your keywords, and added high-quality links. You should be on the way to SEO superstardom. Right? 

Not necessarily. 

Before you automatically assume your efforts will land your client at the top of search engine result pages (SERPs), you need to think about the nuts and bolts of your strategy. 

Achieving the kind of optimization that genuinely lifts your search engine rankings takes constant checks and re-checks of your work. Are there errors? If so, they can cause significant damage to your SEO strategy and derail all your hard work. 

Technical SEO audits are necessary tools to catch and correct dangerous flaws and protect the health of your client's website. 

This article is your complete guide to learning what a technical SEO audit is, how to conduct one, what to look for, and how to address the errors and other issues you may find.

SEO planning template

Bring your SEO strategy to the next level. Leverage our SEO planning template to streamline your SEO projects from initiation to delivery. Designed for agencies who need to deliver client work effectively and efficiently.

Engaging, interesting content and slick landing pages are the foundation of a good website. A technical audit is an in-depth inspection of these elements to ensure they’re contributing to (not detracting from) your ability to drive organic traffic through SEO efforts.

A technical SEO audit helps catch issues that would hinder crawlers from being able to index your client’s web pages. It also improves the user experience by identifying issues that frustrate users and make them click away. 

Project management software for agencies

Do you need project management software designed especially for agencies?

Certain tools help facilitate a technical SEO audit, making it go faster and smoother. We recommend these three tools to help run your audit efficiently:

  • Google Search Console (GSC): This tool is free to anyone who owns a website, and it’s invaluable in running a site audit. GSC handles things like confirming that the correct version of the site’s property is verified, monitoring visibility in search results, and checking for server errors. 

  • Google Analytics: This tool provides tons of useful data that impacts the website’s searchability. User reports, referral data, and content performance are all data points Google Analytics offers to help you measure the website’s health and understand how to improve it. 

  • SEO tools: Numerous tools can crawl a site and find broken links, loading issues, missing metadata, duplicate pages, and other troublesome issues. A few of the ones we like best (because of their functionality, ease of use, and robust features) are Screaming Frog, Deepcrawl, Ahrefs Site Audit Tool, and Google PageSpeed Insights.

A technical SEO audit starts by having your client's website crawled, which means a tool of your choosing will go over every part of the site and locate issues affecting its health. 

Various tools can crawl your client's website completely and efficiently, but here are some essential things to look for during a website crawl.

Not having your client's website indexed correctly can cause issues with crawlers, which can hinder your efforts to rank high on SERPs. 

You can catch indexing issues by going to GSC and reviewing the site's indexed pages in the “Coverage” report. This report lists any pages that have triggered warnings as well as the ones that have been excluded.

If there are pages on the site you don’t see on the report, it means they aren’t indexed yet.

The robots.txt file sets the rules for the website. If it’s configured incorrectly, its rules can cause crawlers to ignore web pages. 

Check for robots.txt errors by looking for “Disallow: /” within the file. Removing or correcting this rule allows search engine bots to crawl the site so its pages can then be indexed. 
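To make that concrete, here's a simplified robots.txt comparison (the directory path is purely a placeholder):

    # Blocks crawlers from the entire site, which is usually unintentional
    User-agent: *
    Disallow: /

    # Lets crawlers in while keeping a hypothetical private area off-limits
    User-agent: *
    Disallow: /admin/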

HTML snippets that tell crawlers how to index web pages are called robots meta tags. Crawlers read a “noindex” designation as an instruction not to index that specific page. If a page is marked “noindex” by accident, the tag can drag down the website’s rankings.

To audit robots meta tags, look at the “<head>” section of the web page. If you see a robots meta tag with “noindex” in its content attribute, it’s instructing search engines not to index the page. 
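In practice, the tag looks something like this (a minimal sketch; the exact attributes on your client's pages may vary):

    <!-- Tells search engines not to index this page -->
    <meta name="robots" content="noindex">

    <!-- The default behavior: allow indexing and link following -->
    <meta name="robots" content="index, follow">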

A crawl budget refers to how many pages search engines crawl on a website and how much time they spend doing it. A big crawl budget usually means the site has lots of pages and plenty of resources. 

The site’s most popular pages (like the homepage) are probably crawled regularly and continuously contribute to your SERP ranking. However, new pages and pages without internal links or external backlinks may not be crawled at all. 

See if Googlebot has flagged any issues affecting crawlability and address them promptly to get as much online visibility as possible. 

Your guide to SEO project management

Want to know how to build a successful SEO project management plan for your agency?

An XML sitemap (the one written for search engines, not humans) is a blueprint of the website, its pages, and how they relate to each other. Crawlers heavily depend on sitemaps to help them know how to move through and index your web pages.
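For reference, a bare-bones XML sitemap looks something like this (the URLs and dates are placeholders for illustration only):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://www.example.com/</loc>
        <lastmod>2024-01-15</lastmod>
      </url>
      <url>
        <loc>https://www.example.com/services/</loc>
        <lastmod>2024-01-10</lastmod>
      </url>
    </urlset>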

An error- and glitch-free sitemap is critical to your SEO. Otherwise, crawlers can run up against links that lead nowhere. 

Auditing the sitemap is one of the most important tasks within a technical SEO audit. If you haven't developed one yet, using a sitemap template can get you started quickly. 

Display a sitemap somewhere on the website for the crawlers to find. It also needs to be submitted to GSC so the bots will get a heads-up to crawl and index it. 

During an audit, look for the sitemap on your client's website and ensure it’s not locked behind a login page or landing page. It should be easily and freely available. 

The next step is to visit GSC and look at the “Sitemap Report”, which will tell you:

  • If the sitemap was submitted

  • The date it was last read

  • The status of the submission 

Promptly address any issues you uncover on the Sitemap Report. If you can’t locate the sitemap in GSC, submit it right away.

Just like a house, a website’s structure needs to be strong and sturdy. The structure’s foundational elements are the web page hierarchy and how internal links connect the pages to one another. 

A good site structure creates high usability for humans and crawlers, resulting in better SEO and a more intuitive user experience.

Check that every page has a link to at least one other web page. If orphan pages are hanging out there, work on link building to connect them in a way that makes sense. 

Deep websites are bad for SEO. What do we mean? If it takes a dozen clicks to get to pages on the site, that means it's deep. 

Every click signifies a layer. If you’re clicking 12 times, that’s 12 layers. A widely accepted goal is to require no more than three clicks to reach any page on the site. A flat site makes it easier for bots and humans to find what they’re looking for.

During your audit, check how deep your client's website is. If you’re unhappy with the answer, use your keyword research to group pages together by relevance and start flattening it out.

This objective will probably change the site architecture and the main navigation page, but the effort will improve both SEO and the user experience.

Hopefully, when your client's website was developed, it was set up with a URL structure that’s uniform and easy to understand. Google recommends simple, descriptive words in URLs.
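As a quick (made-up) illustration, compare a URL that tells readers and crawlers exactly what to expect with one that tells them nothing:

    https://www.example.com/services/technical-seo-audit
    https://www.example.com/index.php?p=8372&cat=4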

Changing URLs is a slippery slope, as it can affect your SEO negatively in the short term. However, if they don’t describe the web page at all, it may be worth it to change them to a consistent, descriptive formula or pattern throughout the site. 

When used well, internal links create a more seamless experience for search engine crawlers and human users. Reviewing internal links should be one of your top priorities during a technical SEO audit. 

The more clicks it takes to get to a page, the greater its click depth. Crawlers consider these deeper pages to have low priority and relevance. If you find these pages during an audit, figure out a way to connect them closer to the homepage (if they are, indeed, valuable pages).

Broken links are not helpful to SEO, as they’re essentially dead ends. Reaching a 404 instead of the content you’re expecting is disappointing to machine and human searchers alike.

If you find them during your audit, use redirects to ensure the internal link navigates to a relevant page. 
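If you want to spot-check a page for dead ends yourself before setting up redirects, a small script can help. This is a minimal sketch in Python that checks a single page; the starting URL is a placeholder, and a dedicated crawler like Screaming Frog will do this far more thoroughly:

    # Minimal broken-link spot check: one page only, not a full crawl
    import requests
    from bs4 import BeautifulSoup
    from urllib.parse import urljoin

    start_url = "https://www.example.com/"  # placeholder: swap in your client's page

    # Collect every link on the page
    html = requests.get(start_url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    links = {urljoin(start_url, a["href"]) for a in soup.find_all("a", href=True)}

    # Flag anything that errors out or returns a 4xx/5xx status
    for link in sorted(links):
        try:
            status = requests.head(link, allow_redirects=True, timeout=10).status_code
        except requests.RequestException:
            status = None
        if status is None or status >= 400:
            print(f"Check this link: {link} ({status})")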

The robots.txt file may accidentally be set to instruct crawlers to ignore certain web pages. Remove or correct those rules if you find any, so all your significant pages get indexed. 

When a web page isn’t linked to any other web page, it’s an orphan page. The only way to get to these pages is to know their URL, which is why they seldom get crawled or indexed.

If you find orphan pages during your audit and deem them valuable, add their links to other web pages on the site to get them included in your SEO. 

While backlinks are generally valuable to your SEO, some unsavory websites will saddle other sites with spammy links. These backlinks can do more SEO harm than good. 

Download a list of the links pointing to the site from Google Search Console (formerly Google Webmaster Tools) and eyeball them to catch suspicious ones. But, depending on the number of pages on your client's website, this manual process may take too long. Handy tools like Semrush and Moz Open Site Explorer can automate the process. 

Getting granular with on-page SEO components is detailed, but necessary.

When your client's website is competing for a high ranking, these seemingly small details may be the difference between first and seventh place.

Title tags and meta descriptions are what your “result” looks like on the SERP: a short title and a basic description of what the page covers. If they aren’t clear and explanatory, consider revamping them to gain additional visibility. 
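In the page's HTML, those two elements look like this (the wording here is purely illustrative):

    <title>Technical SEO Audit Services | Example Agency</title>
    <meta name="description" content="Learn how our agency audits crawlability, site speed, and security to protect your search rankings.">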

Google views duplicate content negatively, so having the same information on multiple web pages can hurt your SEO. That’s where canonical tags help.

A canonical tag designates the primary version of a page, the one you want indexed, when there’s duplicate content on your client's website. Using canonical tags can eliminate duplicate content issues and give your SEO a boost. 
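The tag itself sits in the <head> of the duplicate pages and points to the version you want indexed (the URL below is a placeholder):

    <link rel="canonical" href="https://www.example.com/services/technical-seo-audit">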

Structured data, sometimes called schema markup, helps search engines understand your content and the details it contains more thoroughly. Choose a testing tool to see how the site's structured data performs.
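Schema markup is most often added as a JSON-LD snippet in the page's HTML. Here's a minimal, hypothetical example for an article (the names and dates are placeholders):

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "Article",
      "headline": "Technical SEO audit guide",
      "author": { "@type": "Organization", "name": "Example Agency" },
      "datePublished": "2024-01-15"
    }
    </script>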

One of the most frustrating technical issues for website visitors is a site that takes too long to load and web pages that “hang” or freeze. It’s interesting to note that site speed and performance also factor into search engine rankings. 

Here are a few ways to test the site’s performance and load speed during a technical SEO audit: 
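One option is to query Google PageSpeed Insights programmatically so you can test a batch of URLs in one pass. Below is a minimal sketch in Python against the public PageSpeed Insights v5 API; the page being tested is a placeholder, and for heavier use you'd want to add your own API key:

    # Sketch: pull a mobile performance score from the PageSpeed Insights v5 API
    import requests

    PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"
    page_to_test = "https://www.example.com/"  # placeholder URL

    response = requests.get(
        PSI_ENDPOINT,
        params={"url": page_to_test, "strategy": "mobile"},
        timeout=60,
    )
    data = response.json()

    # Lighthouse reports the performance category as a score between 0 and 1
    score = data["lighthouseResult"]["categories"]["performance"]["score"]
    print(f"{page_to_test}: performance score {score * 100:.0f}/100")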

Is the site difficult to navigate on a smartphone, or do the visuals look weird on a tablet?

Your SEO will suffer because of these issues. The prevalence of mobile makes creating a seamless experience critical, as mobile devices account for roughly 60% of global website traffic.

Once you audit your desktop site, you need to audit its mobile version. A mobile crawling tool can make this quick and easy. 

Review your main audit report and see if it uncovered any images that were:

  • Too big to load

  • Missing alt text

  • Broken/not loading at all

Use tools or plugins to optimize all of the website’s images and improve the site’s performance. Good options include WordPress image optimization plugins such as Imagify. 

Videos are dynamic additions to your client's website. However, they can weigh it down and cause slow load times.

Address extremely large videos or those that glitch during playback. A tool like FFmpeg can help you diagnose the issue, compress the files, and get your videos (and the website as a whole) moving again.  
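For example, a single FFmpeg command can often shrink a heavyweight video considerably. This is just one common recipe (the file names are placeholders, and the right settings depend on the footage):

    ffmpeg -i original-video.mp4 -c:v libx264 -crf 28 -preset slow -c:a aac -b:a 128k compressed-video.mp4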

Google takes security issues seriously. The search engine doesn’t want to direct users to unsafe sites. Problems detected with a website’s security will negatively impact the site's SEO. 

HTTPS encrypts communications between the browser and the server so the data doesn’t fall into the hands of hackers or others with nefarious intentions. 

Linking content (like images) served over HTTP on an HTTPS page creates mixed content. This error can trigger a browser warning and cause SEO issues. 
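Concretely, mixed content is a secure page pulling in an insecure resource, like this (the image path is illustrative):

    <!-- Page served over HTTPS, image requested over HTTP: mixed content -->
    <img src="http://www.example.com/images/team-photo.jpg" alt="Our team">

    <!-- Fixed: the image is requested over HTTPS too -->
    <img src="https://www.example.com/images/team-photo.jpg" alt="Our team">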

Hackers can add code to the site if they find a vulnerability. Likewise, malware can be attached to the site and cause serious problems. Look for these issues in your GSC report and resolve them immediately.

A technical SEO audit project is too vital to your client’s overall business to not treat it like every other big project. The best way to keep workflows organized is with project management software.

Teamwork offers numerous helpful features that can keep your audit organized.

Making sure your client's website is performing and ranking as well as possible is essential for your SEO objectives and the user experience. It's arguably one of the foundational elements of a digital marketing strategy. Technical website audits should be non-negotiable in keeping the site functioning well, loading fast, and free from malware and hackers. 

If your team needs assistance and support bringing your technical website audit (or any other project) together cohesively, trust Teamwork as a valuable tool to make it happen. Our robust features, easy-to-use platform, and highly visual tools will keep you and your agency team organized and on track. Contact us today to learn more.