A technical SEO audit is the foundation of any serious search optimization effort. You can have the best content in your industry, but if search engines cannot crawl, render, and index it properly, that content will never reach its potential audience. Technical SEO is the plumbing that makes everything else work.
This checklist covers 30 critical points organized into seven categories. It is designed to be actionable: each point includes what to check, why it matters, and how to fix common issues. Whether you are auditing your own site or a client’s, work through these points systematically and you will catch the vast majority of technical issues that hold sites back from ranking.
Crawlability: Can Search Engines Find Your Pages?
Crawlability is the first gate. If Googlebot cannot discover and access your pages, nothing else in this checklist matters.
1. Robots.txt review. Verify that your robots.txt file is accessible at yourdomain.com/robots.txt and that it is not accidentally blocking important pages or directories. Common mistakes include blocking /wp-admin/admin-ajax.php (which breaks some WordPress AJAX functionality for Googlebot), blocking CSS or JS files that Googlebot needs to render pages, and using overly broad Disallow rules. Use the robots.txt report in Google Search Console (which replaced the standalone robots.txt Tester) to validate.
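The admin-ajax.php precedence issue can be checked locally with Python's built-in robotparser. This is a sketch with a hypothetical robots.txt; note that the Allow line is listed first because Python's parser applies rules in file order, whereas Googlebot uses longest-match precedence, and with Allow first both interpreters agree.

```python
from urllib import robotparser

# Hypothetical robots.txt contents: block the WordPress admin area
# while leaving the AJAX endpoint crawlable.
RULES = """\
User-agent: *
Allow: /wp-admin/admin-ajax.php
Disallow: /wp-admin/
""".splitlines()

rp = robotparser.RobotFileParser()
rp.parse(RULES)

# The AJAX endpoint stays fetchable; the rest of /wp-admin/ does not.
print(rp.can_fetch("Googlebot", "https://example.com/wp-admin/admin-ajax.php"))
print(rp.can_fetch("Googlebot", "https://example.com/wp-admin/options.php"))
```

In production you would point `RobotFileParser.set_url()` at the live file and call `read()` instead of parsing a literal string.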
2. XML sitemap. Confirm that you have a valid XML sitemap submitted in Search Console. The sitemap should include only canonical, indexable URLs—no redirected pages, no noindexed pages, no 404s. For large sites, use sitemap index files with individual sitemaps capped at 50,000 URLs or 50 MB. Check that lastmod dates are accurate, not auto-generated timestamps.
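The 50,000-URL cap and the lastmod rule can both be enforced at generation time. A minimal sketch using only the standard library, with hypothetical URL data:

```python
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(entries):
    """Build one <urlset> sitemap from (loc, lastmod) pairs.

    The sitemap protocol caps a single file at 50,000 URLs / 50 MB;
    larger sets must be split under a sitemap index file.
    """
    if len(entries) > 50_000:
        raise ValueError("Too many URLs for one sitemap; use a sitemap index")
    urlset = ET.Element(f"{{{NS}}}urlset")
    for loc, lastmod in entries:
        url = ET.SubElement(urlset, f"{{{NS}}}url")
        ET.SubElement(url, f"{{{NS}}}loc").text = loc
        # Use the page's real modification date here, not the
        # timestamp at which the sitemap was generated.
        ET.SubElement(url, f"{{{NS}}}lastmod").text = lastmod
    return ET.tostring(urlset, encoding="unicode")

xml = build_sitemap([("https://example.com/", "2024-05-01")])
```

The same pipeline should filter out redirected, noindexed, and 404 URLs before they ever reach `build_sitemap`.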
3. Internal link structure. Every important page should be reachable within three clicks from the homepage. Use a crawler like Screaming Frog or Sitebulb to map your site’s architecture and identify orphan pages (pages with no internal links pointing to them). Orphan pages are effectively invisible to search engines unless they are in the sitemap.
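Both the three-click rule and orphan detection fall out of a breadth-first search over the crawl's link graph. A sketch with a toy adjacency map (all URLs hypothetical):

```python
from collections import deque

def click_depths(links, home="/"):
    """Breadth-first search from the homepage over the internal-link graph.

    Returns {page: clicks-from-home} for every page reachable via links.
    """
    depth = {home: 0}
    queue = deque([home])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depth:
                depth[target] = depth[page] + 1
                queue.append(target)
    return depth

links = {
    "/": ["/blog", "/products"],
    "/blog": ["/blog/post-1"],
}
# Full URL inventory, e.g. the union of crawl results and the sitemap.
known_pages = {"/", "/blog", "/products", "/blog/post-1", "/old-landing-page"}

depths = click_depths(links)
too_deep = [page for page, d in depths.items() if d > 3]
orphans = known_pages - depths.keys()  # in the sitemap but never linked to
```

Here `/old-landing-page` surfaces as an orphan: it exists in the inventory but no internal link reaches it.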
4. Crawl budget management. For sites with more than 10,000 pages, crawl budget becomes a factor. Remove or noindex low-value pages (thin tag pages, empty category pages, paginated archives). Check the Crawl Stats report in Search Console to see how Googlebot is spending its crawl budget on your site. If most requests go to low-value URLs, restructure your internal linking to prioritize high-value pages.
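Server access logs give the same picture as the Crawl Stats report, URL by URL. A rough sketch that tallies Googlebot requests per path from combined-format log lines; the sample lines are made up, and a real check should also verify the client IP via reverse DNS, since anyone can spoof the user agent:

```python
import re
from collections import Counter

REQUEST = re.compile(r'"GET (\S+) HTTP[^"]*"')

def googlebot_hits(log_lines):
    """Count requests per URL path for lines whose user agent mentions Googlebot."""
    hits = Counter()
    for line in log_lines:
        if "Googlebot" not in line:
            continue
        match = REQUEST.search(line)
        if match:
            hits[match.group(1)] += 1
    return hits

sample = [
    '66.249.66.1 - - [01/May/2024] "GET /tag/misc?page=9 HTTP/1.1" 200 512 "-" "Googlebot/2.1"',
    '66.249.66.1 - - [01/May/2024] "GET /tag/misc?page=9 HTTP/1.1" 200 512 "-" "Googlebot/2.1"',
    '66.249.66.1 - - [01/May/2024] "GET /pricing HTTP/1.1" 200 2048 "-" "Googlebot/2.1"',
]
top = googlebot_hits(sample).most_common(1)
```

If thin tag pages dominate the top of this list, as in the sample, crawl budget is being spent in the wrong place.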
Indexation: Are Your Pages in the Index?
A crawled page is not necessarily an indexed page. Google may choose not to index pages it considers low-quality, duplicate, or not useful enough to warrant inclusion.
5. Index coverage report. In Google Search Console, review the Pages report (formerly Index Coverage). Focus on pages in the “Not indexed” categories: “Discovered – currently not indexed” often means Google found the page but does not think it is worth indexing. “Crawled – currently not indexed” means Google visited it and decided not to include it. Both warrant investigation.
6. Canonical tags. Every page should have a self-referencing canonical tag or a canonical pointing to the preferred version if duplicates exist. Common canonical mistakes: pointing canonicals to 404 pages, having chains (A canonicals to B, B canonicals to C), and conflicting canonicals between the <link> tag and HTTP header. Use the URL Inspection tool in Search Console to spot pages where Google has chosen a different canonical than the one you specified.
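Once a crawler has exported each page's declared canonical, chains are easy to detect. A sketch over a hypothetical page-to-canonical map:

```python
def canonical_chains(canonicals):
    """Return (page, target, next) triples where a page's canonical target
    itself declares a different canonical, i.e. A -> B -> C chains."""
    chains = []
    for page, target in canonicals.items():
        nxt = canonicals.get(target)
        if nxt is not None and nxt not in (target, page):
            chains.append((page, target, nxt))
    return chains

canonicals = {
    "https://example.com/a": "https://example.com/b",  # chain: a -> b -> c
    "https://example.com/b": "https://example.com/c",
    "https://example.com/c": "https://example.com/c",  # correct: self-referencing
}
chains = canonical_chains(canonicals)
```

The fix for each flagged triple is to point the first page's canonical directly at the final URL.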
7. Duplicate content. Run a crawl and check for pages with identical or near-identical title tags, meta descriptions, or body content. Common culprits: URL parameter variations (e.g., ?sort=price), HTTP vs. HTTPS versions, www vs. non-www versions, and trailing-slash inconsistencies. Consolidate with 301 redirects or canonical tags.
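All four culprits above are variants of the same URL, so they can be grouped by normalization. A sketch using the standard library; note that dropping the entire query string, as done here, is aggressive, and a real site must keep parameters that change content (pagination, for instance):

```python
from urllib.parse import urlsplit, urlunsplit

def normalize(url):
    """Collapse protocol, www, trailing-slash, and parameter variants
    onto one form so duplicates can be grouped for consolidation."""
    parts = urlsplit(url)
    host = parts.netloc.lower().removeprefix("www.")
    path = parts.path.rstrip("/") or "/"
    # Force HTTPS and drop the query string entirely (aggressive; see above).
    return urlunsplit(("https", host, path, "", ""))

variants = [
    "http://www.example.com/shoes/",
    "https://example.com/shoes?sort=price",
    "https://example.com/shoes",
]
groups = {normalize(u) for u in variants}
```

All three variants collapse to `https://example.com/shoes`; each group with more than one member is a candidate for a 301 or canonical consolidation.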
8. Meta robots and X-Robots-Tag. Check that no important pages carry a noindex directive, either in the HTML <meta> tag or the HTTP X-Robots-Tag header. This is one of the most common audit findings—staging or development noindex tags that were never removed after launch.
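Both delivery mechanisms can be checked in one pass per URL. A rough sketch; the regex is deliberately simple and assumes the content attribute follows name, which real-world HTML does not guarantee, so a crawler or proper parser is the production tool:

```python
import re

META_ROBOTS = re.compile(
    r'<meta[^>]+name=["\']robots["\'][^>]*content=["\']([^"\']*)["\']',
    re.IGNORECASE,
)

def is_noindexed(html, headers):
    """True if the page carries noindex in either the <meta> robots tag
    or the X-Robots-Tag response header (header names lowercased by caller)."""
    match = META_ROBOTS.search(html)
    if match and "noindex" in match.group(1).lower():
        return True
    return "noindex" in headers.get("x-robots-tag", "").lower()

# The classic audit finding: a staging noindex tag that shipped to production.
leftover = '<meta name="robots" content="noindex,follow">'
```

Run this against every URL in the sitemap; any hit on an important page is a critical finding.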
Site Speed and Core Web Vitals
Page speed is both a ranking factor and a user-experience factor. Google’s Core Web Vitals have formalized the metrics that matter most.
9. Largest Contentful Paint (LCP). LCP measures when the largest visible element in the viewport finishes loading. The target is under 2.5 seconds. Common fixes: optimize and properly size hero images (use WebP or AVIF), preload the LCP resource, eliminate render-blocking CSS and JS, and use a CDN for static assets. Check field data in the CrUX report via PageSpeed Insights.
10. Interaction to Next Paint (INP). INP replaced First Input Delay in March 2024 as the responsiveness metric. It measures the delay between a user interaction (click, tap, keypress) and the next visual update. Target: under 200ms. Common fixes: break up long JavaScript tasks, defer non-critical scripts, reduce DOM size, and minimize main-thread blocking.
11. Cumulative Layout Shift (CLS). CLS measures unexpected layout shifts during the page lifecycle. Target: under 0.1. Common fixes: set explicit width and height on images and videos, avoid dynamically injected content above the fold, use font-display: swap for web fonts, and reserve space for ads and embeds.
12. Server response time (TTFB). Time to First Byte should be under 800ms for the origin server. If TTFB is slow, investigate server-side rendering bottlenecks, database query performance, and whether you need a faster hosting provider or a CDN edge-caching layer.
13. Resource optimization. Compress images (target 80–90% quality in WebP/AVIF), minify CSS and JavaScript, enable Brotli or gzip compression, and defer or async non-critical scripts. Use the Coverage tab in Chrome DevTools to identify unused CSS and JS.
Structured Data and Rich Results
Structured data does not directly boost rankings, but it enables rich results that increase click-through rates and provides machine-readable context that AI search engines use.
14. Schema markup validation. Run every page template through Google’s Rich Results Test and the Schema.org Validator. Fix all errors and warnings. Common errors: missing required fields (e.g., image on Article), invalid date formats, and broken @id references.
15. Appropriate schema types. Use the most specific schema type for each page: Article for blog posts, Product for product pages, LocalBusiness for location pages, FAQPage for FAQ sections, HowTo for tutorials. Do not over-markup—only add schema that accurately represents the page content.
16. Breadcrumb markup. Implement BreadcrumbList schema on every page. This helps search engines understand your site hierarchy and can generate breadcrumb rich results in SERPs. Ensure the breadcrumb trail matches your actual site navigation.
17. Organization and sitelinks. Add Organization schema to your homepage with name, url, logo, and sameAs links to your social profiles. This helps Google build a Knowledge Panel for your brand and improves entity recognition across all AI search products.
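A sketch of that Organization block emitted as JSON-LD; the brand name, URLs, and profiles are placeholders:

```python
import json

org = {
    "@context": "https://schema.org",
    "@type": "Organization",
    "name": "Example Co",                       # placeholder brand
    "url": "https://example.com",
    "logo": "https://example.com/logo.png",
    "sameAs": [                                  # placeholder social profiles
        "https://www.linkedin.com/company/example",
        "https://x.com/example",
    ],
}

# Embed in the homepage <head> as a JSON-LD script block.
snippet = f'<script type="application/ld+json">{json.dumps(org, indent=2)}</script>'
```

Validate the output with the Rich Results Test before deploying, as with any other schema.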
Mobile-First Indexing and Responsiveness
Google has used mobile-first indexing for all sites since October 2023. If your mobile experience is lacking, your desktop rankings suffer too.
18. Mobile rendering check. Use Google’s URL Inspection tool to see how Googlebot renders your pages on mobile. Check for content that is hidden on mobile (e.g., collapsed accordions that Googlebot may not expand), images that are lazy-loaded with a mechanism Googlebot does not trigger, and JavaScript-dependent content that fails to render.
19. Viewport configuration. Ensure your pages include <meta name="viewport" content="width=device-width, initial-scale=1">. Without this tag, mobile browsers render pages at desktop width and scale down, which hurts usability metrics and can trigger mobile-usability errors in Search Console.
20. Touch target sizing. Buttons and links should have a minimum tap target of 48x48 CSS pixels with at least 8 pixels of spacing between adjacent targets. Small touch targets frustrate users, cause accidental taps on the wrong element, and inflate rage-click and mis-navigation rates.
21. Font legibility. Body text should be at least 16px on mobile. Avoid requiring horizontal scrolling. Ensure line height is at least 1.5 for body text. These are basic usability requirements, but they are frequently violated, especially on sites that were designed desktop-first and adapted for mobile as an afterthought.
Security, HTTPS, and Trust Signals
HTTPS has been a confirmed ranking signal since 2014, and in practice it is now a baseline requirement. But there are several security-related checks that go beyond simply installing an SSL certificate.
22. HTTPS everywhere. Every page, image, script, and stylesheet should be served over HTTPS. Check for mixed-content warnings using Chrome DevTools or a crawler. Even a single HTTP-loaded image can trigger a “Not Secure” warning in some browsers and undermines trust signals for search engines.
23. SSL certificate validity. Verify that your SSL certificate is valid, not expired, and covers all subdomains you use (including www). Set up automated renewal through Let’s Encrypt or your hosting provider. An expired certificate can cause your entire site to become inaccessible to users and crawlers.
24. HTTP to HTTPS redirects. All HTTP URLs should 301 redirect to their HTTPS equivalents. Check that this redirect is a direct 301, not a chain (HTTP → HTTPS with www → HTTPS without www, for example). Redirect chains waste crawl budget and dilute link equity.
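Given a crawler's export of source and destination URLs, chains can be flagged mechanically. A sketch over a hypothetical redirect map; the hop limit guards against redirect loops:

```python
def follow_redirects(redirects, url, limit=10):
    """Return the full hop sequence for a URL through a redirect map."""
    chain = [url]
    while url in redirects and len(chain) <= limit:
        url = redirects[url]
        chain.append(url)
    return chain

redirects = {
    "http://example.com/": "https://www.example.com/",   # HTTP -> HTTPS
    "https://www.example.com/": "https://example.com/",  # www -> non-www
}

chain = follow_redirects(redirects, "http://example.com/")
# Three entries means two redirects in sequence: a chain.
# The fix is a single direct 301 from the HTTP URL to the final HTTPS URL.
```

Any chain longer than two entries (source plus one destination) should be collapsed into a direct 301.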
25. Security headers. Implement Strict-Transport-Security (HSTS), X-Content-Type-Options: nosniff, X-Frame-Options, and a Content-Security-Policy. While these are not direct ranking factors, they protect your site from attacks that could result in malware warnings, which absolutely do affect rankings and trust.
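A sketch that diffs a response's headers against a baseline set. The header values shown are common starting points, not recommendations; in particular, the Content-Security-Policy value is a placeholder that must be tailored to each site's actual scripts and assets:

```python
BASELINE = {
    "strict-transport-security": "max-age=31536000; includeSubDomains",
    "x-content-type-options": "nosniff",
    "x-frame-options": "SAMEORIGIN",
    "content-security-policy": "default-src 'self'",  # placeholder policy
}

def missing_security_headers(response_headers):
    """Return baseline header names absent from a response.

    Header-name comparison is case-insensitive, per HTTP semantics.
    """
    present = {name.lower() for name in response_headers}
    return sorted(h for h in BASELINE if h not in present)

# Hypothetical response carrying only one of the four headers.
missing = missing_security_headers({"X-Frame-Options": "DENY"})
```

In practice you would feed this the headers from a live response and alert on any non-empty result.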
International SEO and Hreflang
If your site serves content in multiple languages or targets multiple countries, international SEO signals are critical to ensure the right version appears for each audience.
26. Hreflang implementation. Add hreflang tags to every page that has language or regional variants. Each page must reference all its variants, including itself. Common mistakes: missing the self-referencing tag, using incorrect language codes (e.g., en-UK instead of en-GB), and having non-reciprocal hreflang tags (page A points to page B, but page B does not point back to page A).
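Self-reference and reciprocity can both be verified from crawl data. A sketch over a hypothetical map of each URL to its declared hreflang targets:

```python
def hreflang_errors(pages):
    """pages maps each URL to its declared hreflang targets: {lang: url}.

    Flags missing self-references and non-reciprocal annotations.
    """
    errors = []
    for url, variants in pages.items():
        if url not in variants.values():
            errors.append(f"{url}: missing self-referencing hreflang")
        for lang, target in variants.items():
            if target == url:
                continue
            # Reciprocity: the target must declare this URL as a variant too.
            if url not in pages.get(target, {}).values():
                errors.append(f"{url} -> {target} ({lang}): not reciprocal")
    return errors

pages = {
    "https://example.com/en/": {
        "en-GB": "https://example.com/en/",
        "fr-FR": "https://example.com/fr/",
    },
    "https://example.com/fr/": {
        "fr-FR": "https://example.com/fr/",
        # missing en-GB back-reference: non-reciprocal
    },
}
errors = hreflang_errors(pages)
```

The sketch flags the English page's annotation as non-reciprocal because the French page never points back, exactly the failure mode described above.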
27. URL structure for international content. Choose a consistent URL strategy: subdirectories (/en/, /fr/), subdomains (en.example.com), or ccTLDs (example.fr). Subdirectories are generally the safest choice because they consolidate domain authority. Whichever you choose, be consistent and ensure your hreflang tags match your URL structure.
28. Content localization vs. translation. Machine-translated content with no human review can trigger Google’s thin-content or auto-generated-content policies. If you use machine translation, have native speakers review and edit the output. Better still, localize content for each market—adapt examples, currency, regulations, and cultural references, not just language.
29. Geotargeting signals. ccTLDs carry inherent country targeting. For subdirectories and subdomains on a gTLD, note that Search Console’s International Targeting report, which once let you set a geographic target per section, was retired in 2022. For those sites, hreflang now does the heavy lifting, supported by on-page signals such as local language, currency, addresses, and phone numbers.
30. CDN and server location. Serve content from servers or CDN edge nodes geographically close to your target audience. While Google has stated that server location is a minor signal, it affects TTFB, which affects Core Web Vitals, which does affect rankings. More importantly, faster load times improve the user experience for every visitor.
Conclusion
A technical SEO audit is not a one-time project. It is a recurring discipline that keeps your site’s foundation solid as you add content, make design changes, and respond to search engine updates. The 30 points in this checklist cover the issues that account for the vast majority of technical SEO problems in the wild.
Work through them systematically, prioritize by impact, and build technical SEO checks into your regular workflow. A site with strong technical foundations will outperform a competitor with better content but a broken technical setup every time.
Frequently Asked Questions
How often should you run a technical SEO audit?
Run a comprehensive audit at least twice a year, and perform lighter checks monthly. After any major site change—CMS migration, redesign, URL restructure—do a full audit immediately. Automated monitoring tools can catch issues between manual audits.
What tools do you need for a technical audit?
At minimum: Google Search Console (free), a site crawler like Screaming Frog (free up to 500 URLs), PageSpeed Insights (free), and Chrome DevTools (free). For larger sites, consider paid crawlers like Sitebulb, Lumar, or Ahrefs Site Audit for more comprehensive analysis.
Which issues should you fix first?
Prioritize by impact: crawlability blockers first (robots.txt blocking important pages, noindex on key pages), then indexation issues (duplicate content, canonical errors), then performance (Core Web Vitals failures), then structured data. Fix the issues that prevent pages from appearing in the index before optimizing how they appear.
Do Core Web Vitals really affect rankings?
Yes. Core Web Vitals are a confirmed ranking signal within Google’s page experience system. While content relevance and backlinks are stronger signals, Core Web Vitals serve as a tiebreaker when content quality is similar. More importantly, slow sites have higher bounce rates, which indirectly hurts engagement metrics.
How do you know when an audit is complete?
A complete audit addresses all seven categories in this checklist: crawlability, indexation, site speed, structured data, mobile, security, and international SEO. After making changes, re-crawl the site and verify that the issue count has decreased. Zero issues is unrealistic for most sites—aim for zero critical issues and a manageable number of minor ones.