A technical SEO audit evaluates everything about a website that affects how search engines crawl, index, and rank its pages—independent of the content itself. Technical issues are especially damaging because they can silently prevent even excellent content from ranking, and most of them are invisible to non-technical site owners.
This is the 47-point framework we use on every new client engagement. We've organized it into seven categories: Crawlability, Indexation, Site Architecture, On-Page Technical, Page Speed & Core Web Vitals, Mobile & UX, and Structured Data. Work through them systematically.
Category 1: Crawlability (8 Checks)
1. Robots.txt exists and is valid. Verify at yoursite.com/robots.txt. Check that Googlebot is not blocked, that the sitemap is referenced, and that no critical content directories are disallowed by mistake.
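If you want to script this check, Python's built-in robotparser can confirm that Googlebot is allowed to reach critical paths. A minimal sketch — the robots.txt content and URLs below are placeholders, not a real site:

```python
from urllib.robotparser import RobotFileParser

# Example robots.txt content; in practice, fetch it from yoursite.com/robots.txt
robots_txt = """\
User-agent: *
Disallow: /cart/
Sitemap: https://www.example.com/sitemap.xml
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Critical content should be crawlable; low-value paths should not
print(parser.can_fetch("Googlebot", "https://www.example.com/blog/post"))  # True
print(parser.can_fetch("Googlebot", "https://www.example.com/cart/view"))  # False
```

Run this against every URL pattern you care about; a single misplaced Disallow line surfaces immediately.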
2. Crawl budget is not wasted on low-value URLs. Identify and disallow (or noindex) faceted navigation pages, session ID parameters, internal search result pages, and print-friendly versions that generate duplicate content at scale.
3. Crawl errors are reviewed and resolved in Google Search Console. Navigate to Indexing → Pages and review the "Why pages aren't indexed" reasons, including "Crawled - currently not indexed," "Discovered - currently not indexed," server errors (5xx), and redirect errors. (The old "Crawl anomaly" label no longer exists in the current Pages report.)
4. Internal links use correct canonical URLs. All internal links should point to the canonical version of each page (HTTPS, non-www or www consistent, with or without trailing slash—consistently).
5. No orphaned pages exist. Every page should have at least one internal link pointing to it. Orphaned pages get crawled infrequently and earn no internal link equity.
6. Redirect chains are resolved. No link should require more than one redirect hop to reach its destination. Use Screaming Frog to map all redirect chains and flatten them to direct 301s.
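Once your crawler exports a redirect map (source URL → target URL), flattening chains is a few lines of Python. This is an illustrative sketch of the logic, not Screaming Frog's own implementation; the URLs are hypothetical:

```python
def flatten_redirects(redirects: dict[str, str]) -> dict[str, str]:
    """Resolve each source URL to its final destination so every
    redirect becomes a single 301 hop. Raises on redirect loops."""
    flattened = {}
    for source in redirects:
        seen = set()
        target = source
        while target in redirects:
            if target in seen:
                raise ValueError(f"Redirect loop at {target}")
            seen.add(target)
            target = redirects[target]
        flattened[source] = target
    return flattened

chain = {"/old-a": "/old-b", "/old-b": "/final", "/legacy": "/final"}
print(flatten_redirects(chain))
# {'/old-a': '/final', '/old-b': '/final', '/legacy': '/final'}
```

The flattened map doubles as the rewrite rules for your redirect configuration, and the loop detection catches circular redirects that crawlers otherwise report as errors.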
7. Site is accessible to crawlers without JavaScript dependency. Check that critical content is present in the raw HTML source, not rendered exclusively via JavaScript. Use Google Search Console's URL Inspection tool to compare "View Tested Page" vs. the raw HTML.
8. XML sitemaps are valid, submitted, and free of noindexed or 4xx URLs. Sitemaps should contain only indexable URLs. Use Google Search Console's Sitemaps report to verify submission status and identify any errors.
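Sitemap hygiene can be spot-checked with the standard library: parse the file, extract every URL, then cross-reference the list against crawl status codes and index directives. A sketch, using an inline stand-in for a real sitemap file:

```python
import xml.etree.ElementTree as ET

sitemap_xml = """\
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://www.example.com/</loc></url>
  <url><loc>https://www.example.com/blog/how-to-fix-lcp</loc></url>
</urlset>"""

ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
root = ET.fromstring(sitemap_xml)
urls = [loc.text for loc in root.findall("sm:url/sm:loc", ns)]
print(urls)
# Every URL extracted here should return 200 and carry no noindex directive
```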
Category 2: Indexation (7 Checks)
9. Canonical tags are correctly implemented on all pages. Every page should have a self-referencing canonical tag, including paginated pages. Canonicalizing page 2+ to the first page hides their content from the index, and Google retired rel=prev/next support in 2019. Duplicate pages must canonicalize to the primary version.
10. No unintended noindex directives exist. Check x-robots-tag HTTP headers and HTML meta robots tags on all URLs you want indexed. A misplaced noindex on a category page can silently remove hundreds of products from Google's index.
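Checking pages at scale for stray noindex directives and canonical tags is easy to script. This sketch uses the standard-library HTML parser on an illustrative page (remember that x-robots-tag HTTP headers also need checking separately):

```python
from html.parser import HTMLParser

class RobotsCanonicalAudit(HTMLParser):
    """Collect the meta robots directive and canonical URL from a page."""
    def __init__(self):
        super().__init__()
        self.robots = None
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "robots":
            self.robots = a.get("content", "")
        if tag == "link" and a.get("rel", "").lower() == "canonical":
            self.canonical = a.get("href")

html = ('<head><meta name="robots" content="noindex,follow">'
        '<link rel="canonical" href="https://www.example.com/widgets/"></head>')
audit = RobotsCanonicalAudit()
audit.feed(html)
print(audit.robots)     # noindex,follow -> this page will drop out of the index
print(audit.canonical)  # https://www.example.com/widgets/
```

Feed every URL you expect to rank through a check like this and flag any page whose robots value contains "noindex".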
11. The correct version of the site is indexed. Go to Google and search site:yourdomain.com. Click through to verify pages are rendering correctly. Search for both www and non-www versions to ensure one redirects to the other.
12. HTTPS is fully implemented with no mixed content. All resources (images, scripts, stylesheets) load over HTTPS. Mixed content (HTTP resources on an HTTPS page) triggers browser warnings and can suppress rankings.
13. Hreflang is correctly implemented for multilingual/multi-regional sites. Hreflang tags must reference all language variants reciprocally. Missing return references are a common implementation error that invalidates the entire hreflang setup.
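Missing return references can be caught by modelling the hreflang annotations as a graph and checking symmetry. A minimal sketch with hypothetical URLs:

```python
def missing_return_refs(hreflang_map: dict[str, dict[str, str]]) -> list[tuple[str, str]]:
    """hreflang_map: page URL -> {language code: alternate URL}.
    Returns (page, alternate) pairs where the alternate fails to link back."""
    errors = []
    for page, alternates in hreflang_map.items():
        for alt_url in alternates.values():
            if alt_url == page:
                continue  # self-reference is expected, not an error
            back_refs = hreflang_map.get(alt_url, {}).values()
            if page not in back_refs:
                errors.append((page, alt_url))
    return errors

pages = {
    "https://example.com/en/": {"en": "https://example.com/en/",
                                "de": "https://example.com/de/"},
    "https://example.com/de/": {"de": "https://example.com/de/"},  # missing en ref
}
print(missing_return_refs(pages))
# [('https://example.com/en/', 'https://example.com/de/')]
```

Every pair this returns is a broken reciprocal link that invalidates the hreflang cluster for those pages.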
14. Pagination is handled correctly. Google no longer uses rel=next/prev markup, so for paginated content series, ensure Googlebot can discover every page through crawlable anchor links and that each page in the series self-canonicalizes.
15. Duplicate content from parameters and trailing slashes is consolidated. Trailing slash vs. no trailing slash, ?ref= tracking parameters, and similar URL variations should resolve to a single canonical URL via 301 redirect.
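Before comparing crawl exports, it helps to normalise URL variants programmatically. A sketch using urllib — the tracking-parameter list is an assumption; adjust it to your analytics setup:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

TRACKING_PARAMS = {"ref", "utm_source", "utm_medium", "utm_campaign"}  # assumed list

def canonicalize(url: str) -> str:
    """Lowercase the host, strip tracking parameters, enforce a trailing slash."""
    parts = urlsplit(url)
    query = urlencode([(k, v) for k, v in parse_qsl(parts.query)
                       if k not in TRACKING_PARAMS])
    path = parts.path if parts.path.endswith("/") else parts.path + "/"
    return urlunsplit((parts.scheme, parts.netloc.lower(), path, query, ""))

print(canonicalize("https://Example.com/widgets?ref=newsletter&color=blue"))
# https://example.com/widgets/?color=blue
```

Group crawled URLs by their canonicalized form; any group with more than one member is a duplication cluster that needs a 301 redirect or canonical tag.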
Category 3: Site Architecture (6 Checks)
16. URL structure is clean, logical, and uses hyphens as word separators. URLs should be readable, contain the target keyword, and avoid unnecessary depth. /blog/how-to-fix-lcp is good; /blog/post?id=4721&cat=12 is not.
17. All important pages are within 3 clicks of the homepage. Pages deeper than 4–5 clicks from the homepage receive exponentially less crawl budget and internal link equity.
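Click depth is straightforward to compute from a crawl export with a breadth-first search over the internal link graph. A sketch with an illustrative three-page site:

```python
from collections import deque

def click_depths(links: dict[str, list[str]], home: str = "/") -> dict[str, int]:
    """BFS from the homepage. Pages absent from the result are orphaned."""
    depths = {home: 0}
    queue = deque([home])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depths:
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

site = {"/": ["/blog/", "/products/"],
        "/blog/": ["/blog/how-to-fix-lcp"],
        "/products/": []}
print(click_depths(site))
# {'/': 0, '/blog/': 1, '/products/': 1, '/blog/how-to-fix-lcp': 2}
```

Filter the result for depths above 3 to get your fix list, and diff it against the full crawl to find orphans (check #5) in the same pass.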
18. Category and subcategory taxonomy is logical and consistent. For e-commerce and content-heavy sites, the taxonomy directly maps to URL structure and internal linking hierarchy. Inconsistent taxonomies create orphaned pages and dilute topic authority.
19. Internal linking distributes authority to priority pages. Your most commercially important pages should receive the most internal links from high-authority pages on your site. Audit using a crawl tool: compare the number of internal links pointing to each priority page.
20. Breadcrumb navigation is implemented and consistent with URL structure. Breadcrumbs provide contextual internal links and enable breadcrumb rich results in SERPs.
21. Faceted navigation and filtering systems are managed to prevent index bloat. Sites with large product catalogs or content libraries must prevent crawlers from indexing millions of parameter-based filter combinations. Use robots.txt disallow, noindex meta tags, or canonical tags as appropriate.
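A common pattern is disallowing filter parameters in robots.txt. A hypothetical example for a catalog whose facets use ?color= and ?size= parameters:

```text
User-agent: *
# Block crawling of faceted filter combinations
Disallow: /*?*color=
Disallow: /*?*size=
# Keep internal search results out of the crawl
Disallow: /search
```

Note that a disallowed URL can still be indexed if it is linked externally; for URLs that must never appear in search results, use a noindex directive (which requires the page to remain crawlable) rather than robots.txt alone.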
Category 4: On-Page Technical (8 Checks)
22. Every indexable page has a unique, optimized title tag (50–60 characters). Title tags should contain the primary target keyword near the beginning and match the search intent of the target query.
23. Every indexable page has a unique meta description (150–160 characters). Meta descriptions don't directly affect rankings but significantly influence click-through rate in SERPs.
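Length checks on titles and descriptions are trivial to automate over a crawl export. Character counts are rough proxies — Google truncates by pixel width — but they catch the worst offenders; the sample page data here is hypothetical:

```python
def audit_snippet_lengths(pages: list[dict]) -> list[str]:
    """Flag titles and meta descriptions outside the usual safe ranges."""
    issues = []
    for p in pages:
        if not 50 <= len(p["title"]) <= 60:
            issues.append(f"{p['url']}: title is {len(p['title'])} chars")
        if not 150 <= len(p["description"]) <= 160:
            issues.append(f"{p['url']}: description is {len(p['description'])} chars")
    return issues

sample = [{"url": "/blog/how-to-fix-lcp",
           "title": "Fix LCP",       # far too short
           "description": "Short."}]
print(audit_snippet_lengths(sample))
# ['/blog/how-to-fix-lcp: title is 7 chars',
#  '/blog/how-to-fix-lcp: description is 6 chars']
```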
24. Each page has exactly one H1 tag containing the primary keyword. Multiple H1 tags per page dilute heading hierarchy signals and create internal confusion about the primary topic.
25. Heading hierarchy is logical (H1 → H2 → H3, no skips). Screen readers and search engines both parse heading structure to understand content organization. Heading levels should nest logically without gaps.
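Given the heading levels in document order (which any crawl tool can export), level skips can be flagged mechanically. A sketch:

```python
def heading_skips(headings: list[int]) -> list[str]:
    """Given heading levels in document order (e.g. [1, 2, 3, 2]),
    report places where the hierarchy jumps more than one level down."""
    problems = []
    for prev, cur in zip(headings, headings[1:]):
        if cur > prev + 1:
            problems.append(f"h{prev} followed by h{cur} (skipped h{prev + 1})")
    return problems

print(heading_skips([1, 2, 3, 2, 3]))  # [] -> clean hierarchy
print(heading_skips([1, 3]))           # ['h1 followed by h3 (skipped h2)']
```

Moving back up the hierarchy (h3 followed by h2) is fine; only downward jumps that skip a level break the structure.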
26. All images have descriptive alt text. Alt text provides context for crawlers that cannot interpret images and improves accessibility. Decorative images should use alt="" to indicate they are presentational.
27. Open Graph tags are present on all public pages. OG:title, OG:description, and OG:image should be explicitly defined to control how pages appear when shared on social platforms.
28. No keyword stuffing or hidden text is present. Both are violations of Google's spam policies. Check for CSS-hidden text (color: white on white background, font-size: 0) using a crawl tool's CSS rendering mode.
29. Pages are free of broken internal and external links. Use Screaming Frog or a similar tool in crawl mode to identify all 4xx responses from internal links. Broken external links are a minor quality signal but should be cleaned up during regular maintenance.
Category 5: Page Speed & Core Web Vitals (7 Checks)
30. LCP is under 2.5 seconds on mobile (field data in CrUX). Check Google Search Console Core Web Vitals report. Field data—not lab data—is what influences rankings.
31. INP is under 200ms on mobile (field data in CrUX). Use Chrome DevTools Performance panel to identify long interaction tasks causing INP failures.
32. CLS is under 0.1 on mobile (field data in CrUX). Check for images without dimensions, late-loading web fonts, and dynamically injected above-the-fold content.
33. TTFB (Time to First Byte) is under 600ms. Slow TTFB indicates server, hosting, or CDN issues that cannot be compensated for by front-end optimization.
34. Render-blocking scripts and stylesheets are minimized. Use defer and async attributes for non-critical scripts. Inline critical above-the-fold CSS.
35. Images are properly sized, compressed, and served in next-generation formats (WebP/AVIF). Oversized images remain the most common cause of poor LCP on content-heavy sites.
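Next-gen formats can be served with a graceful fallback via the picture element; the filenames and dimensions below are placeholders:

```html
<picture>
  <source srcset="hero.avif" type="image/avif">
  <source srcset="hero.webp" type="image/webp">
  <!-- explicit width/height reserve layout space and prevent CLS -->
  <img src="hero.jpg" alt="Product hero shot" width="1200" height="630" loading="eager">
</picture>
```

The browser picks the first format it supports and falls back to the JPEG otherwise. Do not lazy-load the LCP image itself — lazy loading above-the-fold hero images is a common self-inflicted LCP regression.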
36. A CDN is configured and serving assets globally. CDNs reduce latency for visitors far from your origin server. For international sites, proper CDN configuration is non-negotiable for competitive page speed.
Category 6: Mobile & UX (5 Checks)
37. Site passes a mobile usability check. Google retired the standalone Mobile-Friendly Test and Search Console's Mobile Usability report in late 2023; use Lighthouse in Chrome DevTools (with device emulation) to verify viewport configuration, tap target sizing, and legible font sizes.
38. Tap targets (buttons, links) are appropriately sized and spaced for mobile. Minimum 48x48px touch target size, minimum 8px spacing between adjacent targets.
39. Text is readable without zooming on mobile screens. Default viewport settings prevent horizontal scrolling, and font sizes are at least 16px for body text.
40. Interstitials and pop-ups do not cover the main content on mobile immediately after arrival from search. Google penalizes intrusive interstitials that appear on the initial page load from search results.
41. Site renders correctly across major browsers and devices. Test on iOS Safari, Android Chrome, and the latest desktop Chrome/Firefox/Edge. CSS rendering discrepancies can cause CLS and mobile usability issues.
Category 7: Structured Data (6 Checks)
42. Organization or LocalBusiness schema is implemented on the homepage and contact page. At minimum, this establishes your business name, URL, logo, contact details, and social profiles in the Knowledge Graph.
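A minimal Organization markup example in JSON-LD — every value here is a placeholder to be replaced with your own details:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Example Agency",
  "url": "https://www.example.com/",
  "logo": "https://www.example.com/logo.png",
  "contactPoint": {
    "@type": "ContactPoint",
    "telephone": "+1-555-0100",
    "contactType": "customer service"
  },
  "sameAs": [
    "https://www.linkedin.com/company/example",
    "https://x.com/example"
  ]
}
</script>
```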
43. Article or BlogPosting schema is implemented on all blog/editorial content. Enables rich results including article date, author, and thumbnail in some SERP displays.
44. BreadcrumbList schema matches the visible breadcrumb navigation. Enables breadcrumb rich results in SERPs and reinforces site hierarchy signals.
45. FAQ schema is implemented on FAQ pages and applicable content. Historically enabled SERP features; Google has reduced FAQ rich result display, but structured data still improves crawler understanding of content.
46. Product schema is implemented on all e-commerce product pages (where applicable). Price, availability, and review data in schema enables rich product results in SERPs.
47. All structured data is validated via Google's Rich Results Test with no errors. Invalid structured data is ignored by Google and wastes implementation effort. Validate after every deployment that touches schema markup.
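Required-field checks can also run in CI between deployments. This sketch extracts JSON-LD blocks with a deliberately simple regex (a production audit should use a real HTML parser) and checks a Product against a simplified required-field list — consult Google's Product documentation for the authoritative set:

```python
import json
import re

REQUIRED_PRODUCT_FIELDS = {"name", "offers"}  # simplified; see Google's docs

def jsonld_blocks(html: str) -> list[dict]:
    """Naive extraction of JSON-LD script blocks (fragile; for illustration)."""
    pattern = r'<script type="application/ld\+json">(.*?)</script>'
    return [json.loads(m) for m in re.findall(pattern, html, re.DOTALL)]

def product_errors(data: dict) -> set[str]:
    """Return required Product fields missing from a JSON-LD object."""
    if data.get("@type") != "Product":
        return set()
    return REQUIRED_PRODUCT_FIELDS - data.keys()

page = ('<script type="application/ld+json">'
        '{"@context": "https://schema.org", "@type": "Product", "name": "Widget"}'
        '</script>')
for block in jsonld_blocks(page):
    print(product_errors(block))  # {'offers'} -> missing required field
```

A non-empty result means the markup will fail to produce rich results even though the JSON itself is syntactically valid — exactly the class of error the Rich Results Test catches.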
Working through this checklist systematically on a new site typically surfaces 12–20 issues, of which 3–5 are genuinely significant ranking blockers. The most impactful issues are almost always in the Crawlability and Indexation categories—problems that prevent Google from properly accessing and understanding your content regardless of how well-written it is. Fix those first. Then work through Site Architecture and Page Speed. Structured data and on-page technical elements are refinements on top of a sound foundation. Run this audit quarterly and you'll catch regressions before they become ranking problems.
Want this done for you?
Our team handles everything in this article—and more—for your site every month.
Book a Free Strategy Call
