Technical SEO Checklist for 2026: The Complete Audit Guide
Technical SEO problems are invisible to most business owners but can have significant consequences for search rankings. Crawl errors, missing schema, slow Core Web Vitals, and broken mobile rendering silently cost you traffic every day.

Technical SEO failures are quiet. There is no error message in your analytics, no notification from Google, no obvious sign that something is wrong. Traffic just gradually declines, or never builds in the first place, and the cause is almost never obvious from the surface.
The most common reason a well-written, well-designed website does not rank is technical: pages that cannot be crawled, content that is not indexed, schema markup that is invalid, mobile rendering that breaks on the devices most users are on. These problems exist under the surface and require a systematic audit to find.
This checklist covers the complete technical SEO audit process as it stands in 2026. Use it as a framework for assessing any site — whether you are auditing an existing property or verifying a new build before launch.
1. Crawlability: Can Google Find Your Pages?
Crawlability is the foundation of technical SEO. If Google cannot discover and navigate your pages, nothing else matters (Google Search Central, 2026).
Site Architecture
Every important page should be reachable within three clicks from the homepage. This is the practical limit for efficient crawler discovery — pages buried deeper tend to receive less crawl attention and often underperform as a result. Run a crawl using Screaming Frog or a similar tool and map click depth across your entire site.
Orphan pages — pages with no internal links pointing to them — are a common audit finding. Search engines cannot discover these pages through crawling; they rely entirely on sitemap inclusion or external backlinks. Identify orphans and either add internal links or assess whether the content should exist at all.
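In practice, finding orphans is a set comparison: take the full list of URLs the site claims to have (sitemap or CMS export) and subtract the URLs a crawler actually reached by following internal links. A minimal sketch — the example URLs are hypothetical:

```python
def find_orphans(known_urls, crawled_urls):
    """Return pages that exist (e.g. in the sitemap) but were never
    reached by following internal links during the crawl."""
    return sorted(set(known_urls) - set(crawled_urls))

# Hypothetical data: the sitemap lists four pages, the crawl found three
sitemap = ["/", "/services", "/blog/post-a", "/blog/post-b"]
crawl = ["/", "/services", "/blog/post-a"]
print(find_orphans(sitemap, crawl))  # ['/blog/post-b']
```

Most crawl tools export both lists as CSV, so the same comparison works at any scale.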
Robots.txt Audit
Your robots.txt file controls which sections of your site crawlers can access. Overly restrictive configurations accidentally block important content — a misconfigured rule that disallows /blog/ or /services/ can silently remove entire content sections from Google’s index (Google Search Central, 2026).
Audit robots.txt against your site architecture. Verify that:
- No important content sections are disallowed
- The sitemap URL is declared in robots.txt
- Low-value sections (admin panels, staging environments, duplicate parameter URLs) are appropriately restricted
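A healthy robots.txt for a typical small business site is short. A sketch of the pattern — the domain and disallowed paths are illustrative, not prescriptive:

```txt
User-agent: *
Disallow: /wp-admin/
Disallow: /cart/
Allow: /wp-admin/admin-ajax.php

Sitemap: https://www.example.com/sitemap.xml
```

If your file is much longer than this, audit each rule individually — long disallow lists are where important sections get blocked by accident.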
XML Sitemap Health
Your sitemap provides Google with a direct roadmap of available content. A sitemap that includes redirected URLs, 4xx error pages, or non-canonical URLs sends conflicting signals and wastes crawl budget (Google Search Central Documentation, 2026).
Audit checklist for sitemaps:
| Check | Pass Criteria |
|---|---|
| Submitted to Google Search Console | Yes |
| Submitted to Bing Webmaster Tools | Yes |
| No 3xx redirects in sitemap | All URLs return 200 |
| No 4xx errors in sitemap | All URLs return 200 |
| No noindex pages in sitemap | Excluded |
| No non-canonical URLs in sitemap | Excluded |
| Last modified dates accurate | Reflects actual content updates |
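The first step in automating these checks is extracting every URL from the sitemap so each one can be fetched and verified against the pass criteria above. A minimal sketch using Python's standard library — the sample XML and domain are illustrative; in practice you would then request each URL and flag anything that does not return 200:

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_urls(xml_text):
    """Extract every <loc> entry from a sitemap document."""
    root = ET.fromstring(xml_text)
    return [loc.text.strip() for loc in root.iter(SITEMAP_NS + "loc")]

sample = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://www.example.com/</loc></url>
  <url><loc>https://www.example.com/services</loc></url>
</urlset>"""

print(sitemap_urls(sample))
```

Feeding the extracted URLs through an HTTP client and comparing each response status and final URL against the table above covers the redirect, error, and canonical checks in one pass.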
2. Indexation: Are Your Pages in Google’s Index?
A page can be crawlable but still not indexed. These are distinct problems with distinct fixes.
Index Coverage Report
Google Search Console’s Indexing report is the primary diagnostic tool. It categorises every page Google has encountered into indexed, excluded, or error states.
Key statuses to investigate:
“Discovered — currently not indexed” means Google found the page but has not yet crawled or indexed it. Common causes: low page authority, crawl budget constraints, or content quality signals insufficient for indexation priority.
“Crawled — currently not indexed” is more concerning. Google crawled the page and decided not to index it. This typically signals thin content, duplicate content, or a page that Google does not consider sufficiently unique or valuable.
“Excluded by noindex tag” should match your deliberate decisions. Audit this list against your site architecture — any important page appearing here has a misconfigured noindex directive.
Canonical Tag Audit
Canonical tags tell search engines which version of a URL is the preferred one when duplicates exist. Missing or incorrect canonicals allow ranking signals to split across multiple URL variations (Google Search Central, 2026).
Common canonical errors:
- Pages with no canonical tag (self-referencing canonicals should be on every page)
- Canonical pointing to a redirected URL rather than the final destination
- Paginated pages without proper canonical handling
- HTTP pages canonicalising to the HTTP version of the URL instead of the HTTPS version
Meta Robots Directives
The <meta name="robots"> tag in the page <head> controls indexation at the page level. A noindex directive on a page you want ranked is the single most direct cause of a page not appearing in search results.
Crawl your site and flag every instance of noindex in meta robots tags. Verify each one is intentional. This error appears with surprising frequency after CMS updates, template changes, or staging-to-production migrations where development environment settings carry over.
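Dedicated crawlers report this automatically, but the check itself is simple enough to script. A sketch using Python's standard-library HTML parser — the sample page is hypothetical:

```python
from html.parser import HTMLParser

class RobotsMetaScanner(HTMLParser):
    """Flags any <meta name="robots"> tag whose content includes noindex."""
    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "robots":
            if "noindex" in (a.get("content") or "").lower():
                self.noindex = True

def has_noindex(html):
    scanner = RobotsMetaScanner()
    scanner.feed(html)
    return scanner.noindex

page = '<html><head><meta name="robots" content="noindex, nofollow"></head></html>'
print(has_noindex(page))  # True
```

Run this over every template in your CMS after an update or migration — that is exactly when stray noindex directives appear.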
3. Core Web Vitals: The Speed and Stability Audit
Chrome User Experience Report (CrUX) data from January 2026 shows the current industry-wide position: 68.3% of origins achieve good LCP, 87.1% achieve good INP, and 80.9% achieve good CLS (Google Chrome UX Report, January 2026). Nearly one in three sites still fails LCP — the most challenging metric and the one with the most direct impact on perceived load speed.
Core Web Vitals Thresholds
| Metric | Good | Needs Improvement | Poor |
|---|---|---|---|
| LCP (Largest Contentful Paint) | Under 2.5s | 2.5s — 4.0s | Over 4.0s |
| INP (Interaction to Next Paint) | Under 200ms | 200ms — 500ms | Over 500ms |
| CLS (Cumulative Layout Shift) | Under 0.1 | 0.1 — 0.25 | Over 0.25 |
Under mobile-first indexing, Google primarily crawls the mobile version of pages. Core Web Vitals are measured from real-user CrUX data for each URL. Ensure both mobile and desktop scores are within the Good threshold (web.dev, 2024). Run PageSpeed Insights on your most commercially important pages — homepage, service pages, contact — for both device types.
LCP Common Failures and Fixes
LCP is most commonly degraded by slow server response times, render-blocking resources, and unoptimised hero images (web.dev, 2024). Audit actions:
- Check Time to First Byte (TTFB). If TTFB exceeds 600ms, server-side improvements are needed before any other LCP optimisation will have full effect
- Identify the LCP element using Chrome DevTools Performance panel
- Verify the LCP image is preloaded with
<link rel="preload">in the<head> - Confirm the LCP image is served in WebP or AVIF format, not JPEG/PNG
- Check for render-blocking scripts in the document head — add
deferorasyncwhere appropriate
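Combined, the preload and defer fixes look like this in the document head — the filenames are placeholders for your own assets:

```html
<head>
  <!-- Preload the hero image that is the LCP element -->
  <link rel="preload" as="image" href="/images/hero.avif" fetchpriority="high">
  <!-- Non-critical scripts should not block rendering -->
  <script src="/js/analytics.js" defer></script>
</head>
```

The fetchpriority="high" hint tells the browser to fetch the LCP image ahead of other resources rather than leaving prioritisation to its defaults.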
INP Common Failures and Fixes
INP fails when the main thread is blocked during user interactions. Key audit actions:
- Use Chrome DevTools Performance tab to identify Long Tasks (tasks exceeding 50ms)
- Review third-party script load strategy — analytics, advertising, and chat widgets are the leading cause of INP failures on otherwise well-optimised sites
- Check for synchronous event handlers that trigger expensive DOM operations
CLS Common Failures and Fixes
CLS failures are almost always preventable. The most common causes (web.dev, 2024):
- Images and iframes without explicit width and height attributes
- Web fonts causing text reflow — apply font-display: swap or font-display: optional
- Dynamically injected content (banner ads, cookie notices, newsletter popups) that shifts existing content instead of loading into space reserved in advance
4. Mobile-First Indexing Requirements
Mobile-first indexing reached full maturity in 2026. Google now uses the mobile version of your site as the primary basis for indexing and ranking across all devices (Google Search Central, 2026). This is not a future consideration — it is the current operational reality.
Content Parity Audit
Verify that every piece of content available on desktop is equally accessible on mobile. This includes text, images, structured data, and navigation. A common error involves “mobile-friendly” designs that hide or truncate content on smaller screens — this content is invisible to Google’s primary crawl.
Checklist:
| Element | Verify |
|---|---|
| All body text accessible on mobile | No hidden-on-mobile content classes |
| All images load on mobile | Not display:none on mobile |
| Schema markup on mobile pages | Not desktop-only implementation |
| Navigation links accessible | Not requiring hover states |
| Interactive elements reachable | Touch-target minimum 44x44px (WCAG 2.1 SC 2.5.5) |
| Body font size readable | Minimum 16px without zooming |
| Viewport meta tag configured | <meta name="viewport" content="width=device-width, initial-scale=1"> |
Mobile Performance Targets
The mobile performance gap between good sites and average ones exceeds 20 percentage points for LCP (web.dev, 2024). Mobile LCP targets should be stricter than desktop targets given the higher failure rate:
- LCP: under 2.5 seconds (same threshold, harder to achieve on mobile)
- Performance budget: keep total page weight lean on mobile — industry guidance generally targets under 1MB for optimal performance on constrained connections
- JavaScript: minimise execution time — mobile CPU is significantly slower than desktop
5. Schema Markup Audit
Structured data is not a direct ranking factor. Its value is in enabling rich results — the enhanced search listings with star ratings, FAQ dropdowns, and other visual features that increase click-through rate — and in helping search engines understand content context with precision.
Schema Types for Common Business Contexts
| Business Type | Recommended Schema Types |
|---|---|
| Local business | LocalBusiness (with appropriate subtype), OpeningHoursSpecification |
| Professional services | ProfessionalService, Person (for practitioners) |
| Medical practice | MedicalOrganization, Physician, MedicalSpecialty |
| E-commerce | Product, Offer, AggregateRating |
| Blog / content | Article, BlogPosting, Author |
| FAQ content | FAQPage, Question, Answer |
| Service pages | Service, ServiceChannel |
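A local business implementation in JSON-LD looks like this — every business detail below is invented for illustration and must be replaced with your real, publicly visible information:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Plumbing Co",
  "telephone": "+44 20 7946 0000",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "1 Example Street",
    "addressLocality": "London",
    "postalCode": "EC1A 1AA",
    "addressCountry": "GB"
  },
  "openingHoursSpecification": {
    "@type": "OpeningHoursSpecification",
    "dayOfWeek": ["Monday", "Tuesday", "Wednesday", "Thursday", "Friday"],
    "opens": "09:00",
    "closes": "17:30"
  }
}
</script>
```

Place the script in the page head and validate it with the Rich Results Test before deploying.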
Schema Validation Checklist
All structured data should be implemented in JSON-LD format — Google’s recommended approach (Google Search Central Documentation, 2026).
| Check | Tool |
|---|---|
| Valid JSON-LD syntax | Google Rich Results Test |
| All required properties present | Schema.org documentation |
| Schema matches visible page content | Manual verification |
| No deprecated schema types | Schema.org changelog |
| LocalBusiness includes address, phone, hours | GBP cross-reference |
A critical rule: schema markup must accurately represent visible page content. Marking up a page as having 200 customer reviews when none are visible, or claiming a price that is not shown, violates Google’s guidelines and risks a manual action (Google Search Central — Structured Data Policies, 2026).
6. JavaScript SEO Considerations
Googlebot renders JavaScript using a modern Chromium engine, but the rendering process introduces latency — pages are crawled first, then rendered in a second wave (Google Search Central, 2026). For critical content, this delay matters.
Rendering Audit
The primary test: use Google Search Console’s URL Inspection tool and click “Test Live URL,” then view the rendered screenshot. Does the rendered version match what a user sees? If content that is visible to users is absent from the rendered screenshot, Google may not be indexing it.
Common JavaScript SEO failures:
- Navigation links rendered client-side only (Google may not follow them for crawling)
- Content loaded via API calls after page load (may be missed in initial crawl)
- Infinite scroll implementations without pagination fallback (content beyond the first viewport may not be discovered)
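The navigation failure above usually comes down to links that exist only as JavaScript click handlers. Crawlers follow href attributes, not onclick handlers — the router call here is a hypothetical client-side routing function:

```html
<!-- Not reliably crawlable: no href for the crawler to follow -->
<span onclick="router.navigate('/services')">Services</span>

<!-- Crawlable: a real anchor, which a client-side router can still intercept -->
<a href="/services">Services</a>
```

Modern frameworks intercept anchor clicks for client-side navigation anyway, so the crawlable version costs nothing in user experience.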
Framework Recommendations
Server-side rendering (SSR) or static site generation (SSG) eliminates JavaScript rendering uncertainty by delivering fully-rendered HTML in the initial server response. Frameworks like Astro, Next.js, and Nuxt all support SSR and SSG out of the box. For SEO-critical sites, SSG produces the most reliably crawlable output — every page is a static HTML file that requires no rendering.
Single-page applications that rely entirely on client-side rendering face the highest JavaScript SEO risk. If your site is built as a client-side SPA without SSR, this should be flagged as a structural limitation.
7. Common Technical Issues: Quick Reference
The issues that appear most consistently across technical SEO audits, with their fixes (Google Search Central, 2026):
| Issue | Symptom | Fix |
|---|---|---|
| Missing canonical tags | Duplicate content splitting signals | Add self-referencing canonical on every page |
| Broken internal links | 404 errors in crawl | Redirect or update link destinations |
| Missing title tags | Generic or empty browser tabs | Unique title per page, 50-60 characters |
| Duplicate title tags | Multiple pages competing for same query | Differentiate or consolidate pages |
| Missing meta descriptions | Google auto-generates (often poorly) | Write unique descriptions, 150-160 characters |
| Mixed content on HTTPS | Browser security warnings | Migrate all resources to HTTPS |
| Slow TTFB | Everything else is slow | Upgrade hosting, implement caching, use CDN |
| Render-blocking resources | High LCP | Defer non-critical scripts, inline critical CSS |
| Images without alt text | Accessibility and SEO gap | Descriptive alt text on all images |
| Pagination misconfiguration | Thin pages indexed individually | Self-referencing canonicals on pagination |
8. Building a Continuous Audit Workflow
A one-time technical SEO audit has a short shelf life. New content, plugin updates, third-party script additions, and CMS changes all create regression risk. Technical SEO in 2026 is a continuous monitoring discipline, not a periodic project (Google Search Central, 2026).
The minimum viable monitoring stack:
- Google Search Console: Weekly review of Index Coverage and Core Web Vitals reports, with email alerts configured for significant indexation drops
- Automated crawls: Schedule monthly crawls using Screaming Frog or similar, with issue tracking across runs to identify regressions
- Real-user monitoring: Implement the web-vitals JavaScript library to capture actual Core Web Vitals data from real users, not just lab simulations (web.dev, 2024)
- Performance budgets in CI/CD: If your site has a deployment pipeline, integrate Lighthouse checks so builds fail when performance drops below threshold
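The real-user monitoring step is a few lines of browser code. A sketch using the web-vitals library's onLCP, onINP, and onCLS callbacks — the /rum collection endpoint is a placeholder you would implement yourself:

```html
<script type="module">
  import { onCLS, onINP, onLCP } from "https://unpkg.com/web-vitals@4?module";

  // Send each metric to your own collection endpoint (path is a placeholder)
  function report(metric) {
    navigator.sendBeacon("/rum", JSON.stringify({
      name: metric.name,
      value: metric.value,
      id: metric.id,
    }));
  }

  onCLS(report);
  onINP(report);
  onLCP(report);
</script>
```

sendBeacon is used so metric delivery survives the user navigating away, which is exactly when CLS and INP values are finalised.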
The sites that maintain strong technical SEO health are the ones where these checks run automatically, not the ones that audit once a year.
Where to Start
If this checklist has identified potential issues but the priority order is unclear, work from the foundation upward:
- Crawlability and architecture first — if Google cannot discover your pages, nothing else matters
- Indexation second — confirm discovered pages are making it into the index
- Core Web Vitals third — mobile LCP is the highest-impact performance metric to fix
- Schema markup fourth — low effort, meaningful click-through rate improvement
- JavaScript SEO and advanced issues — relevant if you are on a client-side-heavy stack
Technical SEO is not glamorous work. It does not generate content or build links. But it is the foundation that determines whether the content and links you do produce have any search impact at all.
Want a technical audit of your site? Let’s talk ->