Technical SEO Audit Guide
What a real technical SEO audit checks, what gets found on most audits, and how to prioritize fixes so you're not staring at a 50-item list with no idea where to start. Written by a senior SEO analyst who's run audits since 2010.
What a technical SEO audit is
A technical SEO audit is a structured review of every technical factor that affects how search engines crawl, index, and rank a website. Unlike content audits (which look at what's on the page) or backlink audits (which look at links pointing to the site), a technical audit focuses on the underlying infrastructure: how the site is built, how pages connect to each other, how fast they load, how mobile devices see them, and whether search engines can even read the content properly.
Most ranking problems aren't about content quality or backlink profiles. They're about technical issues that prevent the content and backlinks from doing their work. A page with great content that can't be crawled doesn't rank. A site with strong backlinks but slow page speed underperforms. A site with a faulty canonical tag setup competes against itself for every query.
This guide covers what a proper technical SEO audit actually checks, what gets found on most audits, how to prioritize fixes, and when DIY is enough versus when senior expertise is the right call. The framework here is the same one Whitewater uses on paid client audits, just without the client-specific findings.
When you need one
Not every site needs a technical audit right now. Some situations call for one urgently. Others can wait. Knowing the difference saves money and focus.
The right times to audit
Within 30 days of launching a new site or completing a redesign. Catching technical issues before they get crawled and indexed at scale is much easier than fixing them after.
After any site migration. CMS swaps, URL restructuring, domain changes, and theme migrations all introduce technical issues that aren't visible until search engines start seeing them. Migration audits should happen within the first 30 to 60 days post-launch.
After significant performance drops. If organic traffic or rankings drop noticeably without an obvious algorithm update, the audit is the right diagnostic step before assuming the problem is content or links.
Before investing heavily in content or link building. There's no point pouring resources into content on a site that can't be crawled efficiently or has Core Web Vitals failures dragging every page down.
Annual maintenance on healthy sites. Even sites without obvious problems benefit from a yearly check. Technical issues accumulate quietly. An annual audit catches drift before it becomes a problem.
The wrong times to audit
The site is brand new with no content yet. A launch audit makes sense, but a full technical audit on a 5-page site is overkill. Build out the site first.
You just had one in the past 6 months and nothing major has changed. Technical SEO doesn't degrade fast on stable sites. Save the audit budget for when something actually changes.
The known problem is content or audience targeting, not infrastructure. Don't audit infrastructure to solve a content problem.
Crawlability and indexation
The most basic technical question: can search engines actually access your pages? You'd be surprised how often the answer is no.
Crawlability is whether search engine bots can reach your pages. Indexation is whether those pages get added to the search index after being crawled. Both have to happen for a page to rank. Either step can fail for reasons that aren't obvious from looking at the website normally.
Common crawlability issues
- Pages blocked by robots.txt without intent
- Pages that require JavaScript to load and don't render server-side
- Pages behind authentication that should be public
- Internal links to URLs that 404 or redirect
- Crawl traps (infinite loops created by faceted navigation, calendars, search results)
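The first item above is the easiest to spot-check programmatically. A minimal sketch using Python's standard-library robots.txt parser; the robots.txt content and URLs are illustrative, not from any real site:

```python
# Minimal robots.txt spot-check using only the standard library.
# The robots.txt rules and URLs below are illustrative.
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: *
Disallow: /search/
Disallow: /cart/
Allow: /
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Flag any URL Googlebot would be refused.
urls = [
    "https://example.com/services/technical-seo-audit/",
    "https://example.com/search/?q=audit",
]
for url in urls:
    allowed = parser.can_fetch("Googlebot", url)
    print(url, "->", "crawlable" if allowed else "BLOCKED by robots.txt")
```

Running the live robots.txt through the same check against a list of URLs that should rank surfaces unintentional blocks in seconds.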
Common indexation issues
- Pages with "noindex" meta tags that should be indexed
- Duplicate content causing canonical confusion
- Thin content that Google chooses not to index
- Soft 404s (pages returning 200 status but with error content)
- Pages excluded because of crawl budget limitations on large sites
Google Search Console's coverage report shows which pages are indexed, which aren't, and why. Reading that report carefully is usually the first 30 minutes of any audit. Most sites have at least one category of pages with indexation issues that surprise the owner.
Before debating content strategy or backlink profiles, confirm that Google can actually see your pages. The number of sites failing this baseline check is higher than most owners expect.
Site architecture and URL structure
Site architecture is how pages relate to each other. URL structure is how the addresses are formatted. Both affect how Google understands the site hierarchy and how authority flows internally.
Architecture that works
A clear hierarchy where every page is reachable within 3 to 4 clicks from the homepage. Category pages that group related content. Internal links that flow from authority pages (like the homepage) down to detail pages (like service pages and articles). No orphan pages that no other page links to internally.
URL structure that works
Lowercase. Hyphens between words, not underscores. Descriptive slugs that hint at the content (/services/technical-seo-audit/, not /p=12345). Consistent depth across similar content types. URLs that don't change without good reason because every change creates redirects to manage.
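Those conventions are mechanical enough to check in a script. A sketch that flags the most common violations; the rules and sample paths are illustrative, not a complete validator:

```python
# Spot-check URL paths against the conventions above: lowercase,
# hyphen-separated slugs, trailing slash. A sketch, not a full validator.
import re

# Lowercase alphanumeric segments, hyphen-separated, each ending in "/".
SLUG_OK = re.compile(r"^/(?:[a-z0-9]+(?:-[a-z0-9]+)*/)*$")

def url_problems(path: str) -> list[str]:
    problems = []
    if path != path.lower():
        problems.append("contains uppercase")
    if "_" in path:
        problems.append("uses underscores instead of hyphens")
    if not SLUG_OK.match(path.lower().replace("_", "-")):
        problems.append("non-descriptive or malformed slug")
    return problems

print(url_problems("/services/technical-seo-audit/"))  # []
print(url_problems("/Services/SEO_Audit/"))
```

Feed it a crawl export of paths and the inconsistent patterns fall out immediately.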
Common architecture problems
Orphan pages: pages that have no internal links pointing to them. These exist for various reasons (migration mistakes, removed nav items, abandoned features) and they tend to rank poorly because Google reads internal linking as a signal of importance.
Excessive depth: important pages that take 5+ clicks to reach from the homepage. The deeper a page sits, the less internal authority it receives. Important content should be closer to the surface.
Inconsistent URL patterns: /service/audit/ for one service and /technical-seo/ for another. Inconsistency creates indexing confusion and makes future migrations harder.
If site architecture issues are coming up regularly during the audit, the right next step might be a website design or rebuild rather than trying to patch around structural problems on the existing site.
Page speed and Core Web Vitals
Google uses Core Web Vitals as ranking signals. They're not the largest signals, but they're real ranking signals on top of the user experience benefits faster pages produce.
The three Core Web Vitals
Largest Contentful Paint (LCP): how quickly the main content of a page becomes visible. Target is under 2.5 seconds for the 75th percentile of users. Common causes of slow LCP: slow server response, large hero images, render-blocking JavaScript, slow font loading.
Interaction to Next Paint (INP): how quickly the page responds to user interactions like clicks and taps. Target is under 200 milliseconds. Replaced First Input Delay (FID) in March 2024. Common causes of slow INP: heavy JavaScript execution, third-party scripts, inefficient event handlers.
Cumulative Layout Shift (CLS): how much the page layout shifts unexpectedly as it loads. Target is under 0.1. Common causes of CLS: images without dimensions specified, fonts loading and reflowing text, ads or embeds loading in late, lazy-loaded content pushing other elements around.
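The first CLS cause, images without dimensions, can be found with a simple scan of the page source. A sketch using Python's standard-library HTML parser; the HTML sample is illustrative:

```python
# Scan an HTML fragment for <img> tags missing explicit width/height
# attributes, a common CLS cause. A simplification: it checks attributes
# only, not CSS-supplied dimensions.
from html.parser import HTMLParser

class ImgDimensionChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.missing = []  # srcs of images lacking width or height

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            if "width" not in attrs or "height" not in attrs:
                self.missing.append(attrs.get("src", "(no src)"))

html = """
<img src="/hero.jpg">
<img src="/logo.png" width="120" height="40">
"""

checker = ImgDimensionChecker()
checker.feed(html)
print("Images likely to cause layout shift:", checker.missing)
```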
Where most sites fail
The most common Core Web Vitals failures: LCP from oversized hero images and slow server response, CLS from images without width and height attributes, INP from third-party tracking scripts running on every page load. Most of these are fixable without rebuilding the site, but they require knowing what to look for.
Google Search Console reports Core Web Vitals data from real user measurements (RUM) over the past 28 days. That's the data Google actually uses for ranking, not synthetic tests. The report shows which URLs are passing, which are failing, and which need improvement, broken down by mobile and desktop.
Mobile experience
Google's index has been mobile-first since 2019. The mobile version of a site is what gets indexed and ranked, with desktop as a secondary signal. A site that works well on desktop but has mobile usability issues underperforms.
What gets checked
Tap targets sized appropriately (buttons and links that are easy to tap with a finger, not just a mouse). Content width matching the viewport so users don't have to scroll horizontally. Text legible without zooming. Mobile speed (mobile devices are typically slower than desktops, so the same page speed work matters more on mobile).
Common mobile failures
Themes that look good on desktop but break on mobile in specific spots: tables overflowing horizontally, popup modals covering the entire viewport, sticky elements blocking content. These often go unnoticed because the owner mostly tests on desktop. The audit catches them.
HTTPS and security
HTTPS has been a ranking signal since 2014. Every legitimate site should be on HTTPS by now. What gets missed in audits isn't whether HTTPS is enabled but whether it's implemented cleanly.
Common HTTPS issues
Mixed content warnings: HTTPS pages loading HTTP resources (images, scripts, stylesheets). Browsers flag these and Google sees them as a signal of poor implementation.
Redirect chains from HTTP to HTTPS: HTTP requests should redirect directly to the HTTPS version. Extra hops (HTTP → HTTPS non-www → HTTPS www) waste crawl budget and dilute link equity.
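The hop-counting logic behind that check is simple. In a real audit you'd issue requests and follow Location headers; in this sketch the redirects are simulated with a dict so the logic is clear, and all URLs are illustrative:

```python
# Count redirect hops for a URL. The `redirects` dict stands in for
# actual HTTP responses (URL -> Location header).
def chain_length(url: str, redirects: dict[str, str], max_hops: int = 10) -> int:
    """Count hops until a URL stops redirecting."""
    hops = 0
    seen = {url}
    while url in redirects and hops < max_hops:
        url = redirects[url]
        hops += 1
        if url in seen:  # redirect loop
            return max_hops
        seen.add(url)
    return hops

redirects = {
    "http://example.com/":  "https://example.com/",      # hop 1: HTTP -> HTTPS
    "https://example.com/": "https://www.example.com/",  # hop 2: non-www -> www
}

print(chain_length("http://example.com/", redirects))  # 2 hops; 1 would be ideal
```

The fix for the chain above is a single rule sending every HTTP and non-www request straight to the final HTTPS www URL.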
Expired or improperly configured certificates: these slip through during renewal lapses or when subdomains weren't included in the cert.
Internal links still pointing to old HTTP URLs instead of the HTTPS versions.
Structured data and schema markup
Structured data is JSON-LD code that tells search engines exactly what a page is about. It's not a direct ranking factor in most cases, but it enables rich results in search (review stars, FAQ accordions, product details, recipe cards) that significantly increase click-through rates.
Schema types worth checking
Organization or LocalBusiness on the homepage or contact page. Service schema on service pages. Article schema on blog posts and guides. FAQPage schema where FAQs appear naturally. Product schema for ecommerce. Review schema (only from real third-party review sources, not self-serving). Breadcrumb schema on every non-homepage page.
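For reference, a minimal Organization block looks like this. It's built as a Python dict and serialized so the structure is unambiguous; the field values are placeholders:

```python
# Minimal JSON-LD Organization markup. All values are placeholders.
import json

organization = {
    "@context": "https://schema.org",
    "@type": "Organization",
    "name": "Example Co",
    "url": "https://www.example.com/",
    "logo": "https://www.example.com/logo.png",
}

# This string belongs inside <script type="application/ld+json"> on the homepage.
print(json.dumps(organization, indent=2))
```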
Common schema issues
Schema implemented but with errors that prevent rich result eligibility. Schema present on staging sites but missing on production. Schema that doesn't match the content visible on the page (Google sees this as deceptive). Outdated schema using deprecated properties.
Google Search Console's Enhancements section flags structured data errors. Worth checking. Many sites have schema implementation issues their original developer never circled back to fix.
XML sitemaps and robots.txt
The XML sitemap tells search engines which URLs exist on the site. The robots.txt file tells them which URLs they're allowed (or not allowed) to crawl. Both files are simple but often configured incorrectly.
XML sitemap issues
Sitemap including URLs that should be excluded: tag pages, author archives, paginated archives, duplicate variants. Sitemap missing URLs that should be included: new pages added but never regenerated, content types excluded by plugin configuration. Sitemap including URLs that return 404, redirect, or have noindex tags. Sitemap not submitted to Google Search Console or referenced in robots.txt.
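Some of those checks script easily. A sketch that parses a sitemap and flags parameterized URLs, one stand-in for "URLs that should be excluded"; the sitemap content is illustrative:

```python
# Parse a sitemap and flag entries that commonly don't belong.
# Here the only rule applied is "no parameterized URLs".
import xml.etree.ElementTree as ET

sitemap_xml = """<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://www.example.com/services/technical-seo-audit/</loc></url>
  <url><loc>https://www.example.com/shop/?sort=price</loc></url>
</urlset>"""

ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
root = ET.fromstring(sitemap_xml)
urls = [loc.text for loc in root.findall(".//sm:loc", ns)]

# Parameterized URLs usually shouldn't be in the sitemap.
suspect = [u for u in urls if "?" in u]
print(f"{len(urls)} URLs, {len(suspect)} suspect: {suspect}")
```

The same loop extends naturally: request each URL and flag anything returning 404, redirecting, or carrying a noindex tag.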
Robots.txt issues
Disallow rules blocking CSS, JavaScript, or images that Google needs to render pages correctly. Disallow rules accidentally blocking important content (a wildcard rule that's too broad). Robots.txt blocking the sitemap location. Missing or incorrect sitemap reference.
Canonical tags and duplicate content
Canonical tags tell search engines which version of a URL is the authoritative one when duplicates exist. Most CMSs handle canonicals automatically, but the automation often goes wrong.
Common canonical issues
Self-referencing canonicals missing on pages that should have them. Canonical tags pointing to the wrong URL (homepage instead of the actual page). Canonical tags pointing to redirected or 404 URLs. Different canonical strategies across different page templates within the same site.
Duplicate content sources
URL parameters creating duplicate versions (sort=price, sort=name, filter=red). Print versions of pages with separate URLs. AMP versions if AMP is still in use. Trailing slashes versus no trailing slashes. www versus non-www versions not consolidated. HTTP versus HTTPS not consolidated. International versions without proper hreflang.
Most CMSs handle these automatically once configured. The audit verifies that configuration is correct and that the canonicals are doing their job.
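The consolidation rules above can be expressed as a single normalization function, which is useful for deduplicating a crawl export. A sketch; the specific policy here (HTTPS only, no www, trailing slash, sorted params with tracking junk dropped) is illustrative and should match whatever the CMS actually enforces:

```python
# Normalize URL variants to one canonical form so duplicates collapse.
# The policy encoded here is an assumption; match it to your site's rules.
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

DROP_PARAMS = {"utm_source", "utm_medium", "utm_campaign", "sort", "filter"}

def normalize(url: str) -> str:
    parts = urlsplit(url)
    host = parts.netloc.lower().removeprefix("www.")
    path = parts.path.lower()
    if not path.endswith("/"):
        path += "/"
    query = urlencode(sorted(
        (k, v) for k, v in parse_qsl(parts.query) if k not in DROP_PARAMS
    ))
    return urlunsplit(("https", host, path, query, ""))

# These variants all collapse to https://example.com/shop/
print(normalize("http://www.Example.com/Shop?sort=price"))
print(normalize("https://example.com/shop/?utm_source=newsletter"))
```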
JavaScript rendering
Modern websites built with React, Vue, Angular, or similar frameworks often render content with JavaScript on the client side. The HTML that arrives initially is mostly empty, and JavaScript fills it in after the page loads.
Google can render JavaScript, but with limitations. The rendering happens in a second pass that can take days or weeks after initial crawl. If your most important content only appears after JavaScript runs, you're waiting longer to get indexed and your content quality signals don't compound as quickly.
The safer approach
Server-side rendering (SSR) or pre-rendering critical content. The meaningful HTML arrives in the initial response. JavaScript then enhances the experience but isn't required for indexing. Most modern frameworks support SSR with reasonable effort.
Testing for JS dependence
Disable JavaScript in your browser and load a page. If the content disappears, you have a rendering dependency that affects SEO. The fix isn't always SSR (sometimes it's pre-rendering specific page types), but the dependency needs to be identified before it can be addressed.
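The same check can be scripted by comparing how much visible text the raw server HTML carries versus the fully rendered page. In this sketch the two HTML strings stand in for a real fetch and a headless-browser render:

```python
# Compare visible text in raw server HTML vs rendered HTML.
# A simplification: counts all text nodes, including script/style content.
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.words = 0
    def handle_data(self, data):
        self.words += len(data.split())

def word_count(html: str) -> int:
    extractor = TextExtractor()
    extractor.feed(html)
    return extractor.words

raw_html = '<div id="app"></div>'  # what the server sends
rendered_html = "<div><h1>Technical SEO Audit Guide</h1><p>Full article text...</p></div>"

ratio = word_count(raw_html) / max(word_count(rendered_html), 1)
print(f"Raw HTML carries {ratio:.0%} of the rendered text")  # near 0% = JS-dependent
```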
Common audit findings
After 15 years of running audits, certain findings come up almost every time. Not always all of them, but at least several on most sites:
- Missing or broken canonical tags on at least one page template
- Duplicate content from URL parameters not properly canonicalized
- Mobile usability issues (typically tap targets too small or content wider than screen)
- Core Web Vitals failures (usually LCP or CLS)
- Missing structured data on key page types
- Slow time-to-first-byte from bloated themes or shared hosting
- Excessive redirect chains (3+ hops)
- 404 errors from old URLs that should redirect to current pages
- Mixed content warnings (HTTP resources on HTTPS pages)
- Pages with no title tag or duplicate title tags across multiple pages
- Robots.txt blocking resources Google needs (CSS, JS, image directories)
- XML sitemap including URLs that should be excluded
- Orphan pages with no internal links
Most sites have at least 5 to 7 of these. None of them require advanced expertise to find. They're just rarely looked at without a structured audit.
Want Whitewater to run a technical SEO audit on your site?
A senior SEO analyst runs the audit. You get prioritized findings, an implementation roadmap, and the audit fee credited toward the first month of any SEO engagement if you convert within 30 days.
See the audit service
Prioritizing what to fix first
A thorough audit can produce 50+ recommendations. Trying to fix everything at once is the fastest way to never finish anything. The right approach is to prioritize by impact and effort.
High impact, low effort
Fix immediately. Examples: pages with no title tag, broken canonical tags pointing to wrong URLs, easily fixable 404s, missing meta descriptions on top pages. These are usually a few hours of work each and produce visible movement within weeks.
High impact, high effort
Plan and execute systematically. Examples: site migration cleanup, Core Web Vitals improvements requiring theme changes, schema implementation across templates, internal linking architecture overhauls. These are weeks or months of work but produce the largest sustained gains.
Low impact, low effort
Knock out in batches when time permits. Examples: cleaning up minor schema warnings, removing unused meta tags, updating dated copyright years. Small wins that look good in the aggregate but don't move rankings on their own.
Low impact, high effort
Usually not worth doing unless they become part of a larger initiative. Examples: hreflang setup for sites without significant international traffic, AMP implementation in 2026.
Whitewater audits arrive with this prioritization built in, not as a flat list of 50 items to figure out yourself. After the audit, implementation can happen in-house using the audit as a roadmap, or through ongoing monthly SEO packages for sites that need both technical work and ongoing strategy. The SEO retainer is the third option for businesses that prefer hour blocks they can allocate against specific audit findings month by month.
When to DIY vs hire help
Parts of a technical SEO audit are absolutely DIY-friendly. Other parts require expertise, tools, and judgment that most business owners don't have access to. Knowing which is which saves time on the parts you can handle and avoids wasted effort on the parts you can't.
The DIY parts
- Running Google Search Console's coverage report and reading what's indexed and what isn't
- Checking the mobile usability report in Search Console
- Reviewing Core Web Vitals data in Search Console
- Looking at the XML sitemap to verify it's submitting the right URLs
- Spot-checking robots.txt for obvious issues
- Running individual pages through Google's mobile-friendly test
The parts that usually need help
- Crawling 1,000+ pages and analyzing patterns across the whole site
- Diagnosing why specific pages aren't indexed when they should be
- Reading server log files to understand actual crawler behavior
- JavaScript rendering analysis and remediation planning
- Schema markup implementation and validation across templates
- Migration planning and risk assessment
- Building an actual prioritized roadmap from the raw findings
The free consultation as a starting point
If you're not sure whether your site needs a full audit, the free SEO consultation is a 30-minute call where a senior analyst pulls your site up live and runs through the high-level technical checks. That's usually enough to know whether audit-level issues exist and whether commissioning a full audit is the right move. No pitch, no contract pressure.
For local businesses, the technical audit findings often overlap with local SEO concerns. The local SEO guide covers the on-page and off-page parts of local search that work in parallel with technical work. And for ongoing implementation after the audit, local SEO services bundle technical fixes with the local-specific work like Google Business Profile optimization and citation building.
Common questions about technical audits
How often should I do a technical SEO audit?
How long does a technical SEO audit take?
What's the difference between a technical SEO audit and a content audit?
Can I do a technical SEO audit myself?
How much does a technical SEO audit cost?
Do I need a technical SEO audit before doing content work?
What gets fixed after the audit is done?
Want to know what's actually under the hood?
Book a free SEO consultation. A senior SEO analyst pulls your site up live on the call, runs through the high-level technical checks, and tells you straight whether you have audit-level issues or whether ongoing work like monthly SEO packages is the right next move. No pitch, no contract pressure.