Technical SEO audit: a complete step-by-step guide (2026)
What is a technical SEO audit?
A technical SEO audit is a systematic review of everything on your website that affects how search engines crawl, index, and rank your pages. Unlike on-page SEO, which focuses on content and keywords, technical SEO deals with the infrastructure underneath: server response codes, page speed, URL structures, mobile rendering, HTTPS, and how well Google can actually access and understand your site.
Most sites that struggle to rank are not held back by weak content alone. They have crawl errors that prevent pages from being discovered, page speed problems that trigger poor Core Web Vitals scores, or duplicate content issues caused by URL parameters. These problems are invisible to most site owners because they do not show up in the front end of the site. Only a structured audit surfaces them.
A technical SEO audit should cover eight core areas: crawlability, indexation, page speed and Core Web Vitals, mobile usability, URL structure and redirects, HTTPS and security, structured data, and site architecture. Each one is covered step by step in this guide.
Tools like Semrush and Ahrefs have dedicated site audit features that automate much of this process. They crawl your site, flag errors, and group issues by severity. That said, knowing what the tools are looking for and why each issue matters is what separates a useful audit from a list of alerts you do not know how to action.
Step 1: Crawl your website
A crawl is the starting point for any technical audit. Crawl tools simulate how Googlebot moves through your site and surface the same types of issues Google encounters when it visits your pages.
Before running a crawl, check your robots.txt file. This file sits at the root of your domain (yoursite.com/robots.txt) and tells crawlers which parts of your site to skip. Mistakes here are common and serious. A single incorrect disallow directive can block Google from crawling your entire site or prevent it from accessing key directories. Open your robots.txt directly in a browser and look for any rules that block important pages, directories, or file types you need indexed.
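If you prefer to script this check, Python's standard library includes a robots.txt parser. The sketch below uses made-up rules and URLs; in practice, fetch your live robots.txt and test the pages you care about:

```python
from urllib import robotparser

# Illustrative robots.txt body — in practice, fetch
# https://yoursite.com/robots.txt and pass its lines to parse().
ROBOTS_TXT = """\
User-agent: *
Disallow: /admin/
Disallow: /search
"""

rp = robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# Pages you need indexed should come back True here.
for url in ["https://yoursite.com/blog/post", "https://yoursite.com/admin/login"]:
    print(url, rp.can_fetch("Googlebot", url))
```

Any important URL that comes back False against your own rules needs investigating.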
Next, check your XML sitemap. Your sitemap should be submitted in Google Search Console and should only list pages you want indexed. A common mistake is including pages with noindex tags in the sitemap or listing pages that return a non-200 status code. Both create crawl budget waste and send mixed signals to Google.
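To spot-check a sitemap, you can pull its URL list out with a few lines of XML parsing. The sitemap body below is illustrative; in practice you would fetch your live sitemap and then verify each URL's status code and meta robots tag:

```python
import xml.etree.ElementTree as ET

# Illustrative sitemap body.
SITEMAP = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://yoursite.com/</loc></url>
  <url><loc>https://yoursite.com/blog/post</loc></url>
</urlset>"""

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
urls = [loc.text for loc in ET.fromstring(SITEMAP).findall("sm:url/sm:loc", NS)]

# Every URL listed here should return a 200 status and carry no noindex tag.
print(urls)
```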
Run your crawl using a tool like Semrush's Site Audit or Ahrefs' Site Audit. Set the crawler's user agent to Googlebot so you see what Google sees. Configure it to crawl all pages, including noindexed ones, since you need a complete picture before making decisions. Once the crawl completes, review the error report in order of severity: 5xx server errors first, then 4xx client errors, then 3xx redirects.
- 5xx errors indicate server-side problems that need immediate attention from your hosting provider or developer
- 4xx errors, particularly 404s, show pages that no longer exist or have broken links pointing to them
- 3xx redirects are generally fine, but chains of three or more redirects slow down crawl efficiency and should be collapsed to a single hop
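Given a crawl export of (URL, status code) pairs, this triage order is easy to automate. The rows below are hypothetical:

```python
# Hypothetical crawl export rows: (url, status_code).
crawl = [
    ("/", 200), ("/old", 301), ("/missing", 404), ("/api", 500),
]

def triage(rows):
    """Bucket crawled URLs by status class, worst first."""
    buckets = {"5xx": [], "4xx": [], "3xx": []}
    for url, code in rows:
        key = f"{code // 100}xx"  # e.g. 404 -> "4xx"
        if key in buckets:
            buckets[key].append(url)
    return buckets

print(triage(crawl))  # {'5xx': ['/api'], '4xx': ['/missing'], '3xx': ['/old']}
```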
For WordPress sites, crawl issues often stem from plugin conflicts or theme code that generates duplicate pages. WordPress technical SEO has its own specific set of problems that go beyond what platform-agnostic audits typically flag.
Step 2: Check indexation
A page that cannot be crawled cannot be indexed. But a page that can be crawled can still fail to get indexed for several reasons. Indexation is the process by which Google stores your page in its search index and makes it eligible to appear in search results.
Open Google Search Console and navigate to the Indexing > Pages report (formerly called Coverage). This shows you how many of your pages are indexed, how many are excluded, and the reason for each exclusion. The most important exclusion categories to investigate are:
- Excluded by noindex tag: These pages carry a meta robots noindex directive. If any of these are pages you want ranked, remove the tag immediately.
- Crawled but not indexed: Google has visited the page but chosen not to index it. This often signals thin content, duplicate content, or low perceived quality.
- Discovered but not crawled: Google knows the page exists but has not visited it yet. On large sites, this usually indicates crawl budget problems.
- Duplicate content: Google has found multiple versions of the same content and chosen not to index the variant. This is often caused by URL parameters, trailing slashes, or HTTP vs HTTPS inconsistencies.
The fastest way to check whether a specific page is indexed is the site: search operator in Google. Type site:yoursite.com/your-page-slug and see whether it appears. If it does not, go back to Search Console to understand why. You can also use the URL Inspection tool in Search Console to request indexing for individual pages and see the last crawl date and rendered HTML.
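For spot-checking pages outside Search Console, a rough script can flag noindex directives in either the HTML or the X-Robots-Tag response header. This is a simplified check that assumes the common attribute order and will miss unusual markup:

```python
import re

def has_noindex(html, headers=None):
    """Rough check for a noindex directive in page HTML or response headers.
    Assumes the common attribute order (name before content)."""
    if headers and "noindex" in headers.get("X-Robots-Tag", "").lower():
        return True
    pattern = r'<meta[^>]+name=["\'](?:robots|googlebot)["\'][^>]+content=["\'][^"\']*noindex'
    return re.search(pattern, html, re.IGNORECASE) is not None

print(has_noindex('<meta name="robots" content="noindex, follow">'))  # True
print(has_noindex('<meta name="robots" content="index, follow">'))   # False
```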
For a detailed guide on forcing Google to discover and index new content, see the full walkthrough on how to get your website indexed by Google.
Step 3: Audit page speed and Core Web Vitals
Page speed has been a ranking factor since 2010, but the introduction of Core Web Vitals as a formal ranking signal raised the stakes considerably. Google now measures three specific metrics as part of the Page Experience update: Largest Contentful Paint (LCP), Cumulative Layout Shift (CLS), and Interaction to Next Paint (INP). All three are measured from real user data collected via Chrome, not just lab conditions.
LCP measures how long it takes for the largest visible element on the page to load. For most pages, this is a hero image or a large text block. A good LCP score is under 2.5 seconds. Slow LCP is most often caused by unoptimised images, render-blocking JavaScript, and slow server response times. The first fix is almost always image compression. Convert images to WebP format, add explicit width and height attributes to prevent layout shifts during loading, and use lazy loading for images below the fold.
CLS measures visual instability, specifically how much page elements shift around as the page loads. A CLS score below 0.1 is considered good. The most common cause of poor CLS is images or embeds without defined dimensions, and web fonts that swap in after render. Reserve space for all images and iframes, and use font-display: swap in your CSS to reduce layout instability from late-loading fonts.
INP, which replaced First Input Delay as a Core Web Vital in 2024, measures how long the browser takes to respond to user interactions. A score under 200 milliseconds is good. The main culprits are heavy JavaScript execution, third-party scripts, and long tasks that block the main thread. Audit your JavaScript using Chrome DevTools, and defer or remove any scripts that are not essential to the page experience.
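To summarise, the "good" thresholds for all three metrics, measured at the 75th percentile of real-user data, can be expressed as a simple triage function. The field data shown is hypothetical:

```python
# "Good" thresholds at the 75th percentile of field data:
# LCP in seconds, CLS as a unitless score, INP in seconds (200 ms).
THRESHOLDS = {"LCP": 2.5, "CLS": 0.1, "INP": 0.2}

def failing_metrics(metrics):
    """Return the metrics that miss Google's 'good' threshold."""
    return {m: v for m, v in metrics.items() if v > THRESHOLDS[m]}

# Hypothetical field data for one page:
print(failing_metrics({"LCP": 3.1, "CLS": 0.05, "INP": 0.25}))
```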
Your choice of hosting has a direct impact on all three metrics. Shared hosting plans on slow servers produce high Time to First Byte (TTFB), which drags down LCP regardless of how well your images are optimised. If your TTFB is consistently above 600 milliseconds, switching to a faster hosting provider or adding a CDN is often the highest-impact change you can make. Providers like Hostinger, Bluehost, and IONOS all offer hosting plans with performance tiers suited to sites with SEO requirements.
For a full breakdown of each metric, what causes poor scores, and how to fix them, the dedicated guide to Core Web Vitals covers every issue in detail.
Step 4: Check mobile usability
Google uses mobile-first indexing for all sites. This means Google predominantly uses the mobile version of your site to determine rankings, even for searches made on desktop. If your mobile experience is broken, your rankings suffer across the board.
Google retired the dedicated Mobile Usability report from Search Console in late 2023, but the problems it flagged still matter: text too small to read, clickable elements too close together, and content wider than the screen. Check for them by running Lighthouse in Chrome DevTools against your key templates. Each of these is fixable with CSS adjustments and does not require a site rebuild.
Beyond the obvious errors, check that your mobile pages display the same content as your desktop pages. If your site uses a dynamic serving approach or a separate mobile subdomain (m.yoursite.com), there is a risk that content is hidden on mobile and therefore not indexed by Google. Responsive design, where one version of the HTML serves all devices, is the simplest way to avoid this problem.
Platform choice affects mobile performance significantly. Webflow, Wix, and Squarespace all generate mobile-responsive layouts by default. Custom-built sites need manual testing across multiple device sizes using Chrome DevTools device emulation.
Step 5: Audit URL structure and redirects
URL structure matters for both crawl efficiency and user experience. Clean URLs are easier for Google to parse and easier for users to remember and share. An audit of your URL structure should check for the following issues:
- Inconsistent URL formats: HTTP and HTTPS versions of the same page both resolving, or www and non-www variants returning content instead of redirecting to a canonical version
- Trailing slash inconsistencies: /page and /page/ should resolve to the same URL via a 301 redirect, not both serve content
- Dynamic parameters creating duplicate pages: URLs like /product?colour=red and /product?colour=blue that display near-identical content
- Uppercase characters in URLs: Google treats /Page and /page as different URLs, which can create duplicate content issues
- Redirect chains and loops: Any redirect that passes through more than one hop wastes crawl budget and reduces the PageRank passed through the chain
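Most of these issues come down to enforcing one canonical form per URL. Here is a sketch of the normalisation rules, assuming a non-www HTTPS canonical (flip the rule if your site uses www):

```python
from urllib.parse import urlsplit, urlunsplit

def canonicalise(url):
    """Normalise a URL to one canonical form: https, non-www,
    lowercase path, no trailing slash (except the root).
    The non-www rule is an assumption for this example."""
    parts = urlsplit(url)
    netloc = parts.netloc.lower().removeprefix("www.")  # Python 3.9+
    path = parts.path.lower().rstrip("/") or "/"
    # Query strings are kept here; handle parameter duplicates with canonical tags.
    return urlunsplit(("https", netloc, path, parts.query, ""))

print(canonicalise("http://www.YourSite.com/Page/"))  # https://yoursite.com/page
```

Any URL that differs from its canonicalised form should 301-redirect to it.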
For a detailed explanation of how to structure URLs for maximum SEO benefit, the guide to SEO-friendly URL structure covers every rule and common mistake.
Redirect auditing is most efficiently done through your crawl tool. Filter for all 3xx responses and identify any chains (A redirects to B which redirects to C). Collapse these to a single redirect from the original URL to the final destination. Also check for any redirects pointing to 404 pages, which are a common legacy issue on sites that have been through multiple redesigns.
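The collapse step can be scripted from your crawl export. Given a map of each redirecting URL to its destination, the helper below follows the chain to its final destination and flags loops; the chain shown is hypothetical:

```python
def collapse_chain(redirects, start, max_hops=10):
    """Follow a {source: destination} redirect map to its final URL.
    Returns (final_url, hop_count); raises on loops or runaway chains."""
    seen = set()
    url = start
    while url in redirects:
        if url in seen or len(seen) >= max_hops:
            raise ValueError(f"Redirect loop or too many hops at {url}")
        seen.add(url)
        url = redirects[url]
    return url, len(seen)

# Hypothetical chain: A -> B -> C should be collapsed to A -> C in one hop.
chain = {"/old-page": "/new-page", "/new-page": "/final-page"}
print(collapse_chain(chain, "/old-page"))  # ('/final-page', 2)
```

Any source URL with a hop count above one should be repointed directly at the final destination.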
Step 6: Check HTTPS and security
HTTPS has been a confirmed Google ranking factor since 2014. If your site still serves pages over HTTP, fixing this is non-negotiable. But HTTPS adoption is not just about having an SSL certificate installed. A full HTTPS audit checks several things:
- All HTTP URLs redirect to their HTTPS equivalents via 301 redirects
- The SSL certificate is valid, not expired, and covers all subdomains you use
- No mixed content warnings exist (HTTPS pages loading HTTP resources like images or scripts)
- The canonical tag on each page points to the HTTPS version
- The XML sitemap references HTTPS URLs only
Mixed content is the most common issue on sites that have recently migrated from HTTP to HTTPS. It occurs when a page is served over HTTPS but loads a resource (typically an image or a CSS file) over HTTP. Browsers flag this as a security warning, and it can cause certain browsers to block the HTTP resource entirely. Check for mixed content using the Security tab in Chrome DevTools or a tool like Semrush's site audit.
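You can also scan rendered HTML for mixed content with a rough pattern match on src and href attributes. This simple check will miss resources referenced from CSS or inline scripts:

```python
import re

def find_mixed_content(html):
    """List http:// resources referenced from src/href attributes —
    these trigger mixed-content warnings on an HTTPS page."""
    return re.findall(r'(?:src|href)=["\'](http://[^"\']+)', html, re.IGNORECASE)

# Hypothetical page fragment with one insecure resource:
page = '<img src="http://yoursite.com/hero.jpg"><link href="https://cdn.example/a.css">'
print(find_mixed_content(page))  # ['http://yoursite.com/hero.jpg']
```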
Tools like CookieYes and Termly address the trust and compliance layer of your site: cookie consent and privacy policies. These do not affect rankings directly, but a compliant, trustworthy experience shapes how willing users are to stay on and engage with your pages.
Step 7: Audit structured data
Structured data is code added to your HTML that tells Google what type of content is on the page. It does not directly affect rankings, but it makes pages eligible for rich results in search, such as star ratings, breadcrumbs, and product details. Rich results can increase click-through rate, which drives more organic traffic from the same ranking positions.
The most useful schema types for content sites are Article, FAQ, and BreadcrumbList, though note that since 2023 Google has limited FAQ rich results to authoritative government and health sites and retired HowTo rich results entirely. For e-commerce sites, add Product and Review schema. For local businesses, LocalBusiness and OpeningHoursSpecification are essential.
Audit your structured data using Google's Rich Results Test (search.google.com/test/rich-results) and the Schema Markup Validator (validator.schema.org). Look for the following issues:
- Missing required fields for a schema type (which prevents rich results from triggering)
- Mismatched schema (schema that describes content not on the page)
- Duplicate schema blocks for the same entity
- Deprecated schema types from older implementations
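For reference, a minimal Article JSON-LD block looks like the following. The headline, date, and author values are placeholders, and the field check covers only a few of the properties Google documents for Article:

```python
import json

# Minimal Article JSON-LD (headline, date, and author are placeholders).
article_schema = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "Technical SEO audit: a complete step-by-step guide",
    "datePublished": "2026-01-15",
    "author": {"@type": "Person", "name": "Example Author"},
}

# Quick structural check against a few of the documented fields.
missing = [f for f in ("headline", "datePublished", "author") if f not in article_schema]
print("Missing fields:", missing)
print(json.dumps(article_schema, indent=2))
```

The serialised output goes in a `<script type="application/ld+json">` tag; always confirm the result in the Rich Results Test rather than relying on a structural check alone.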
Rank Math is the most efficient way to manage schema on WordPress sites. It generates and validates structured data automatically for every page type, and lets you add custom schema without editing code.
Step 8: Review site architecture
Site architecture is the way your pages connect to each other and how they are organised into a hierarchy. Good architecture makes it easy for both users and crawlers to navigate your site, ensures link equity flows to your most important pages, and reduces the number of clicks required to reach any page from your homepage.
The benchmark for most sites is the three-click rule: no important page should be more than three clicks from the homepage. Sites that bury content five or six levels deep tend to see those pages ranked poorly because Google assigns less authority to deeply nested pages, and internal linking equity dilutes with each additional hop.
A flat architecture keeps most pages within two to three levels: homepage at level one, category or hub pages at level two, individual articles or product pages at level three. This structure concentrates crawl budget on your important pages and makes internal link authority flow more efficiently.
Review your existing architecture by mapping out your URL hierarchy from the crawl data. Identify any orphan pages (pages with no internal links pointing to them), pages that are only linked from the footer or sidebar, and category pages with fewer than five internal links from the rest of the site. Orphan pages are particularly damaging because Google rarely crawls pages that have no links pointing to them, even if they are in your sitemap.
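Click depth and orphan detection can both be computed from crawl data with a breadth-first search over the internal-link graph. The site map below is a toy example:

```python
from collections import deque

def click_depths(links, home="/"):
    """BFS over an internal-link graph ({page: [linked pages]}) to find
    each page's click depth from the homepage. Pages absent from the
    result are orphans, or only reachable via pages not in the graph."""
    depths = {home: 0}
    queue = deque([home])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depths:
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

# Hypothetical crawl export: page -> pages it links to.
site = {"/": ["/blog", "/about"], "/blog": ["/blog/post-1"], "/orphan": []}
d = click_depths(site)
print(d)                                # depths from the homepage
print([p for p in site if p not in d])  # orphan pages
```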
For a full guide to planning and auditing your site structure, including how to use sitemaps and internal linking to reinforce your architecture, see the dedicated article on site architecture for SEO.
Technical SEO audit checklist
Use this checklist to track your audit progress across all eight areas. Each item below represents a discrete check that can be completed independently.
Crawlability
- Robots.txt reviewed and no important pages blocked
- XML sitemap submitted to Google Search Console and returning 200 status
- Sitemap includes only indexable pages with 200 status codes
- Full site crawl completed and 5xx, 4xx, and 3xx errors triaged
- Redirect chains collapsed to single-hop redirects
- No redirect loops present
Indexation
- Coverage report in Google Search Console reviewed
- No important pages carrying unintentional noindex tags
- Canonical tags present and pointing to the correct URL on all pages
- No duplicate content from URL parameter variations
- All important pages confirmed indexed via site: operator or URL Inspection
Page speed and Core Web Vitals
- LCP under 2.5 seconds across key landing pages
- CLS score below 0.1 on all key pages
- INP under 200 milliseconds
- Images converted to WebP and lazy-loaded below the fold
- TTFB under 600 milliseconds on server response
- Render-blocking scripts deferred or removed
Mobile usability
- No mobile usability errors flagged in Lighthouse or Chrome DevTools checks
- All page content identical on mobile and desktop
- Touch targets minimum 48px with adequate spacing
- Text readable at default zoom without horizontal scrolling
URL structure and redirects
- Canonical domain (www vs non-www, HTTP vs HTTPS) set and consistent
- No uppercase characters in URLs
- URL parameters handled via canonical tags (Search Console's URL Parameters tool was retired in 2022)
- All redirect chains resolved
HTTPS and security
- Valid SSL certificate installed and covering all subdomains
- All HTTP pages 301-redirecting to HTTPS equivalents
- No mixed content warnings on any page
- Canonical tags reference HTTPS URLs
- Sitemap references HTTPS URLs only
Structured data
- Article schema on all blog posts and news articles
- FAQ schema on pages with FAQ sections
- BreadcrumbList schema on all pages within a hierarchy
- All schema validated in Rich Results Test with no errors
- No deprecated schema types in use
Site architecture
- No important page more than three clicks from the homepage
- No orphan pages with zero internal links
- XML sitemap accurately reflects current site structure
- Internal linking distributes equity to key category and hub pages
What this means for your technical foundation
A technical SEO audit is not a one-off task. Sites change constantly: new pages are published, plugins are updated, redirects are added, and content is migrated. Each of these events can introduce new technical issues. Running an audit quarterly, or after any significant site change, keeps your technical foundation clean and prevents small issues from compounding into significant ranking problems.
The order of priority matters when you have a backlog of issues to fix. Server errors and crawl blocks come first because they prevent your content from being discovered at all. Indexation problems come second. Core Web Vitals and page speed come third, since they directly affect ranking scores and user behaviour signals. Structured data is the last priority because it optimises visibility for pages that are already ranking, rather than fixing barriers to ranking.
If you are starting from scratch with your technical understanding, the broader guide to how to improve your SEO puts technical work in context alongside content and link building. For sites that have already passed the technical baseline, the website SEO health check guide covers the ongoing monitoring and reporting layer that sits on top of a clean technical foundation. And if you want the tools that professional SEO teams use to run these audits at scale, the best SEO audit tools guide reviews every major option. Tools like Semrush and Ahrefs remain the two most capable platforms for running site audits, tracking issues over time, and prioritising fixes by impact. Pair them with Google Analytics to correlate technical changes with measurable traffic and engagement improvements.
Technical SEO is not glamorous work. It does not produce the immediate visible results that a new piece of content or a link-building campaign might. But it is the work that determines whether everything else you do can function. Content that cannot be crawled cannot rank. Pages that fail Core Web Vitals lose positions to competitors that meet the threshold. A site with broken redirects haemorrhages the authority built up through years of link acquisition. Fix the foundation, and every other SEO investment you make performs better.