Technical SEO Audit Checklist: 25 Issues to Check
A technical SEO audit is the process of systematically reviewing your website for issues that prevent search engines from crawling, indexing, and ranking your pages effectively. Unlike content or link-building audits, a technical audit focuses on the infrastructure: the HTML, server responses, site architecture, and metadata that search engines rely on to understand your site.
This checklist covers 25 specific items organized into five categories. For each item, you will find a brief explanation of why it matters and what to look for. Whether you run these checks manually or use an SEO crawler to automate the process, working through this list will give you a clear picture of your site’s technical health.
Crawlability and Indexing
These items determine whether search engines can access and index your pages in the first place. Issues here can make entire sections of your site invisible to search results.
1. Robots.txt Configuration
Your robots.txt file tells search engine bots which parts of your site they are allowed to crawl. Check that it exists at yourdomain.com/robots.txt, that it does not accidentally block important directories (like /blog/ or your main content folders), and that it references your XML sitemap.
What to look for: Overly broad Disallow rules, missing Sitemap directive, and any discrepancies between what you intend to block and what is actually blocked.
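A check like this is easy to automate with Python's standard library. The sketch below uses a hypothetical robots.txt (the example.com URLs and rules are illustrative, not from any real site) to flag important URLs that the rules accidentally block and to confirm a Sitemap directive is present:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content; in practice, fetch it from
# https://yourdomain.com/robots.txt before parsing.
robots_txt = """\
User-agent: *
Disallow: /admin/
Disallow: /blog/
Sitemap: https://example.com/sitemap.xml
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Flag important URLs that the rules accidentally block.
important = ["https://example.com/blog/post-1", "https://example.com/products/"]
blocked = [u for u in important if not parser.can_fetch("*", u)]
print(blocked)             # the /blog/ URL is disallowed here
print(parser.site_maps())  # confirm the Sitemap directive is present
```

Note that `site_maps()` requires Python 3.8 or later.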
2. XML Sitemap
Your sitemap is a machine-readable list of pages you want search engines to index. Verify that it exists, that it is referenced in robots.txt, that every URL in it returns a 200 status code, and that it does not include pages you have noindexed or redirected.
What to look for: Sitemaps with 404 or 301 URLs, sitemaps that exceed the 50,000 URL or 50MB limit, and missing sitemaps entirely.
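Parsing the sitemap itself is straightforward with the standard library. The snippet below works on a hypothetical inline sitemap; in a real audit you would fetch /sitemap.xml over HTTP and then request each extracted URL to verify its status code:

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

# Hypothetical sitemap snippet; fetch the real file in practice.
sitemap_xml = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/blog/</loc></url>
</urlset>"""

root = ET.fromstring(sitemap_xml)
urls = [loc.text for loc in root.iter(f"{SITEMAP_NS}loc")]
assert len(urls) <= 50_000, "sitemap exceeds the 50,000 URL limit"
# Next step (not shown): request each URL and flag anything that is not
# a 200, plus any page carrying a noindex directive.
print(urls)
```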
3. Noindex Directives
A noindex meta tag or X-Robots-Tag header tells search engines not to include a page in their index. This is useful for thank-you pages, internal search results, or staging environments. The problem occurs when important pages are noindexed by mistake.
What to look for: Crawl your entire site and filter for pages with noindex. Cross-check against pages that should be indexed.
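Detecting the meta tag half of this check takes only a small parser. This is a minimal sketch using `html.parser` from the standard library; it covers the `<meta name="robots">` tag only, so the X-Robots-Tag response header still needs a separate check:

```python
from html.parser import HTMLParser

class NoindexDetector(HTMLParser):
    """Flags a page whose <meta name="robots"> contains 'noindex'."""
    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "robots":
            if "noindex" in a.get("content", "").lower():
                self.noindex = True

def has_noindex(html):
    detector = NoindexDetector()
    detector.feed(html)
    return detector.noindex

# Remember to also inspect the X-Robots-Tag HTTP response header.
print(has_noindex('<meta name="robots" content="noindex, follow">'))  # True
```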
4. Canonical Tags
The rel="canonical" tag tells search engines which version of a page is authoritative when duplicate or near-duplicate versions exist. Every indexable page should have a canonical tag, either self-referencing or pointing to the preferred URL.
What to look for: Missing canonical tags, canonicals pointing to 404 pages, canonicals pointing to redirected URLs, and conflicting canonical signals (the page says one thing, the sitemap says another).
5. Redirect Chains and Loops
A redirect chain occurs when URL A redirects to URL B, which redirects to URL C, and possibly further. Each hop adds server round-trip time and can dilute link equity. A redirect loop (A redirects to B, B redirects back to A) is even worse because it makes the page completely inaccessible.
What to look for: Any redirect path with more than one hop. Consolidate chains so that every redirect points directly to the final destination.
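Given a redirect map collected by a crawler (source URL mapped to its redirect target), chains and loops can be traced offline. A minimal sketch with hypothetical paths:

```python
def trace(url, redirects, max_hops=10):
    """Follow a redirect map; report chains (>1 hop) and loops."""
    path, seen = [url], {url}
    while path[-1] in redirects:
        nxt = redirects[path[-1]]
        if nxt in seen:
            return path, "loop"
        path.append(nxt)
        seen.add(nxt)
        if len(path) > max_hops:
            return path, "too long"
    return path, "chain" if len(path) > 2 else "ok"

# Hypothetical crawl data: URL -> where it redirects.
redirects = {"/a": "/b", "/b": "/c", "/loop1": "/loop2", "/loop2": "/loop1"}
print(trace("/a", redirects))      # (['/a', '/b', '/c'], 'chain')
print(trace("/loop1", redirects))  # (['/loop1', '/loop2'], 'loop')
```

Every URL classified as a chain should have its first redirect repointed straight at the final destination in `path[-1]`.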
On-Page Elements
On-page elements are the HTML tags that directly influence how search engines understand and display your content. Issues here are among the most common and most impactful.
6. Title Tags
The title tag is one of the most influential on-page signals and a primary element in how your result is displayed. Every page should have a unique, descriptive title between 30 and 60 characters. Titles that are too short waste ranking potential. Titles that are too long get truncated in search results.
What to look for: Missing titles, duplicate titles across different pages, titles shorter than 30 characters, and titles longer than 60 characters.
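All four title checks reduce to a pass over crawl output. The sketch below uses hypothetical URL-to-title data and the 30/60 character bounds from above; the same pattern works for meta descriptions with 120/160 bounds:

```python
from collections import Counter

# Hypothetical crawl output: URL -> <title> text.
titles = {
    "/": "Acme Widgets | Affordable Widgets for Every Workshop",
    "/about": "About Us",
    "/blog/a": "Acme Widgets | Affordable Widgets for Every Workshop",
    "/contact": "",
}

counts = Counter(titles.values())
issues = {}
for url, title in titles.items():
    if not title:
        issues[url] = "missing"
    elif counts[title] > 1:
        issues[url] = "duplicate"
    elif len(title) < 30:
        issues[url] = "too short"
    elif len(title) > 60:
        issues[url] = "too long"
print(issues)
```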
7. Meta Descriptions
While meta descriptions are not a direct ranking factor, they heavily influence click-through rates. Each page should have a unique meta description between 120 and 160 characters that accurately summarizes the page content and includes a clear value proposition.
What to look for: Missing descriptions, duplicate descriptions, descriptions that are too short to be useful, and descriptions that exceed 160 characters.
8. Heading Hierarchy
HTML headings (<h1> through <h6>) communicate the structure and topic hierarchy of your content. Every page should have exactly one <h1> that describes the primary topic. Subheadings should follow a logical order without skipping levels (no jumping from <h1> to <h3> without an <h2> in between).
What to look for: Pages with no <h1>, multiple <h1> tags, skipped heading levels, and headings that are stuffed with keywords rather than being descriptive.
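The first three of those checks can be sketched as a pure function over a page's heading sequence. This rough version extracts heading levels with a regex, which is adequate for well-formed HTML but not a full parser:

```python
import re

def heading_issues(html):
    """Report missing/multiple <h1> tags and skipped heading levels."""
    levels = [int(m.group(1)) for m in re.finditer(r"<h([1-6])\b", html, re.I)]
    issues = []
    if levels.count(1) == 0:
        issues.append("no h1")
    elif levels.count(1) > 1:
        issues.append("multiple h1")
    for prev, cur in zip(levels, levels[1:]):
        if cur > prev + 1:  # e.g. jumping from h1 straight to h3
            issues.append(f"skipped h{prev + 1} (h{prev} -> h{cur})")
    return issues

print(heading_issues("<h1>Title</h1><h3>Jump</h3>"))  # ['skipped h2 (h1 -> h3)']
```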
9. Image Alt Text
Alt text serves two purposes: it makes images accessible to screen readers, and it helps search engines understand image content. Every meaningful image (not decorative spacers) should have descriptive alt text.
What to look for: Images with empty or missing alt attributes, alt text that is just a filename (IMG_2847.jpg), and alt text that is excessively long or keyword-stuffed.
10. Duplicate Content
When multiple URLs on your site serve the same or very similar content, search engines may struggle to decide which version to rank. This dilutes your ranking signals across the duplicates. Common causes include URL parameters, print-friendly pages, HTTP/HTTPS duplicates, and www/non-www variations.
What to look for: Pages with identical or near-identical content at different URLs. Use canonical tags, 301 redirects, or parameter handling rules to resolve duplicates.
11. Thin Content
Pages with very little substantive content (fewer than 200 to 300 words with no media or interactive elements) may be seen as low quality by search engines. Category pages with no descriptions, tag pages with only a list of links, and automatically generated pages are common culprits.
What to look for: Pages with low word counts that are not serving a specific functional purpose (like a contact form or login page).
12. Open Graph and Social Meta Tags
Open Graph tags (og:title, og:description, og:image) control how your pages appear when shared on social media. While not a search ranking factor, they significantly affect social click-through rates and brand perception.
What to look for: Missing OG tags, OG images that are too small or the wrong aspect ratio, and OG descriptions that do not match the page content.
URL Structure and Links
Your URL structure and internal linking patterns shape how search engines discover and prioritize your content. Issues in this area can waste crawl budget and orphan important pages.
13. Broken Internal Links
A broken internal link is any link on your site that points to a page returning a 4xx status code. These create dead ends for both users and search engine bots. Every broken link is a missed opportunity to pass link equity and guide users to relevant content.
What to look for: Run a crawl and filter for internal links with 4xx responses. Fix by updating the link target or removing the link.
14. Broken External Links
Links to external pages that have been removed, moved, or restructured often end up returning 404 errors. Broken outbound links do not hurt your rankings as directly as broken internal links, but they degrade user experience and can signal neglected content.
What to look for: External links returning 4xx or 5xx status codes. Either update the URL, remove the link, or replace it with a working alternative.
15. Redirect Chains on Internal Links
Beyond checking that redirects resolve correctly (item 5), verify that your internal links point to final destination URLs rather than to URLs that redirect. If your navigation links to /old-page/ which 301s to /new-page/, update the navigation to link directly to /new-page/.
What to look for: Internal links whose targets return 3xx status codes. Update the link href to the final URL.
16. Orphan Pages
An orphan page is a page that exists on your site but is not linked from any other page. Search engines may struggle to discover orphan pages, and even if they find them via the sitemap, the lack of internal links signals low importance.
What to look for: Pages that appear in your sitemap or CMS but receive zero internal links. Either add internal links to them or consider whether they should exist at all.
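Once you have both URL sets, finding orphans is a set difference. A minimal sketch with hypothetical URLs:

```python
# URLs declared in the sitemap vs. every internal link target a crawl found.
sitemap_urls = {"/", "/blog/", "/blog/old-post", "/contact"}
link_targets = {"/", "/blog/", "/contact"}

orphans = sitemap_urls - link_targets
print(sorted(orphans))  # ['/blog/old-post'] has no internal links pointing at it
```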
17. Internal Link Depth
Click depth refers to how many clicks it takes to reach a page from the homepage. Pages buried four, five, or more clicks deep receive less crawl attention and less link equity. Important content should be reachable within three clicks.
What to look for: Pages with a high click depth, especially those that you consider high priority for rankings.
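Click depth is a breadth-first search over the internal link graph, starting from the homepage. The link graph below is hypothetical:

```python
from collections import deque

def click_depths(start, links):
    """BFS over the link graph; depth = minimum clicks from the start page."""
    depths = {start: 0}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depths:
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

# Hypothetical link graph: page -> pages it links to.
links = {
    "/": ["/blog/", "/products/"],
    "/blog/": ["/blog/post-1"],
    "/blog/post-1": ["/blog/deep-dive"],
    "/blog/deep-dive": ["/blog/deeper-still"],
}
depths = click_depths("/", links)
deep = [url for url, d in depths.items() if d > 3]
print(deep)  # pages beyond three clicks from the homepage
```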
18. URL Readability
Clean, descriptive URLs are easier for users to understand and for search engines to parse. Avoid excessive URL parameters, session IDs in URLs, and URLs with no semantic meaning (like /page?id=48271).
What to look for: URLs with more than two query parameters, URLs containing session IDs, URLs with unnecessary depth (/category/subcategory/sub-subcategory/page/), and non-lowercase URLs that could create duplicates.
Performance and User Experience
Page speed and user experience are confirmed ranking factors. Google’s Core Web Vitals directly measure the real-world experience of visitors on your site.
19. Page Load Speed (Server Response Time)
Time to First Byte (TTFB) measures how quickly your server begins delivering content. A TTFB above 600 milliseconds indicates server-side performance issues. While a full page speed audit requires tools like Lighthouse, checking TTFB at scale during a crawl gives you a useful overview.
What to look for: Pages with TTFB consistently above 600ms. Investigate server configuration, database queries, or caching for those pages.
20. Mobile Friendliness
Google uses mobile-first indexing, meaning it primarily evaluates the mobile version of your pages for ranking purposes. Your site must be fully functional and readable on mobile devices, with no horizontal scrolling, unplayable media, or tiny touch targets.
What to look for: Pages that are not responsive, content that is wider than the viewport on mobile, text that is too small to read without zooming, and interactive elements that are too close together.
21. Core Web Vitals
Core Web Vitals consist of three metrics: Largest Contentful Paint (LCP, measures loading speed), Interaction to Next Paint (INP, measures responsiveness), and Cumulative Layout Shift (CLS, measures visual stability). These are collected from real Chrome users via the Chrome UX Report and surfaced in Google Search Console.
What to look for: LCP above 2.5 seconds, INP above 200 milliseconds, and CLS above 0.1. Address the worst-performing pages first since these metrics affect ranking.
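Classifying pages against those thresholds is a one-liner once you have the field data. A minimal sketch (metric values are hypothetical; LCP is in seconds, INP in milliseconds, CLS unitless):

```python
# "Good" thresholds from the text above: LCP 2.5s, INP 200ms, CLS 0.1.
THRESHOLDS = {"lcp": 2.5, "inp": 200, "cls": 0.1}

def cwv_failures(metrics):
    """Return the names of metrics that exceed their threshold."""
    return [name for name, limit in THRESHOLDS.items()
            if metrics.get(name, 0) > limit]

print(cwv_failures({"lcp": 3.1, "inp": 180, "cls": 0.24}))  # ['lcp', 'cls']
```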
22. HTTPS Implementation
HTTPS is a confirmed ranking signal and a baseline expectation for modern websites. Every page should be served over HTTPS, with HTTP versions redirecting to HTTPS. Mixed content (HTTPS pages loading resources over HTTP) should be eliminated.
What to look for: Pages served over HTTP, mixed content warnings, expired or misconfigured SSL certificates, and HTTP-to-HTTPS redirects that create chains.
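Mixed content is detectable by scanning each HTTPS page's HTML for resources loaded over plain HTTP. A rough sketch using the standard library parser; it flags resource tags only, since a plain `<a href="http://...">` link is not mixed content:

```python
from html.parser import HTMLParser

RESOURCE_TAGS = ("img", "script", "link", "iframe", "source")

class MixedContentFinder(HTMLParser):
    """Collects http:// resource URLs referenced from an HTTPS page."""
    def __init__(self):
        super().__init__()
        self.insecure = []

    def handle_starttag(self, tag, attrs):
        for name, value in attrs:
            if (name in ("src", "href") and value
                    and value.startswith("http://")
                    and tag in RESOURCE_TAGS):
                self.insecure.append(value)

finder = MixedContentFinder()
finder.feed('<img src="http://cdn.example.com/hero.jpg">'
            '<a href="http://other.example">link</a>')
print(finder.insecure)  # only the image is flagged, not the <a> link
```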
23. HTML Size and Resource Count
Excessively large HTML documents and pages that load dozens of render-blocking resources slow down page rendering. While modern browsers handle complexity well, there are practical limits. Pages with HTML over 100KB or more than 100 resource requests warrant investigation.
What to look for: Unusually large HTML file sizes (often caused by inline SVGs, excessive DOM elements, or duplicated code), high resource counts, and render-blocking CSS or JavaScript files.
Structured Data and Rich Results
Structured data helps search engines understand your content semantically and can unlock enhanced search appearances like star ratings, FAQ accordions, product prices, and breadcrumb trails.
24. Schema Markup Presence and Validity
Schema markup (usually implemented as JSON-LD) provides explicit, machine-readable information about your pages. Common types include Article, Product, LocalBusiness, FAQ, HowTo, and BreadcrumbList. The markup must be syntactically valid JSON-LD and conform to Google’s specific requirements for each type.
What to look for: Pages that should have structured data but do not (product pages without Product schema, articles without Article schema), JSON-LD with syntax errors, required properties that are missing, and structured data that does not match the visible page content.
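The syntax-validity half of this check can be automated by extracting JSON-LD blocks and running them through a JSON parser. A minimal sketch (the Article snippet is hypothetical; conformance to Google's per-type required properties still needs a separate check):

```python
import json
from html.parser import HTMLParser

class JsonLdExtractor(HTMLParser):
    """Pulls out <script type="application/ld+json"> block contents."""
    def __init__(self):
        super().__init__()
        self.in_jsonld = False
        self.blocks = []

    def handle_starttag(self, tag, attrs):
        if tag == "script" and dict(attrs).get("type") == "application/ld+json":
            self.in_jsonld = True

    def handle_endtag(self, tag):
        if tag == "script":
            self.in_jsonld = False

    def handle_data(self, data):
        if self.in_jsonld:
            self.blocks.append(data)

page = '<script type="application/ld+json">{"@type": "Article", "headline": "Hi"}</script>'
extractor = JsonLdExtractor()
extractor.feed(page)
for block in extractor.blocks:
    try:
        data = json.loads(block)   # syntax check
        print(data.get("@type"))   # then verify required properties per type
    except json.JSONDecodeError as err:
        print("invalid JSON-LD:", err)
```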
25. Breadcrumb Markup
Breadcrumb structured data (BreadcrumbList schema) helps search engines understand your site hierarchy and can appear as a breadcrumb trail in search results instead of the raw URL. This improves click-through rates and gives users context about where a page sits within your site.
What to look for: Missing breadcrumb markup on content pages, breadcrumb trails that do not match the actual site navigation, and breadcrumbs with broken or incorrect URLs.
Running Your Audit
Working through all 25 items manually on a site with more than a few dozen pages is impractical. An SEO crawler automates most of these checks by visiting every page on your site and extracting the relevant data points in a single pass. Tools like Seodisias can run these checks locally on your machine, giving you a complete audit without uploading your site data to a third-party server.
Regardless of the tool you use, here is a practical approach to running your audit:
Prioritize by impact. Not every issue deserves immediate attention. A missing title tag on your highest-traffic landing page is more urgent than a missing alt attribute on an image buried in an archive page. Cross-reference crawl findings with your analytics data to focus on the pages that matter most.
Fix at the template level. If the same issue appears across hundreds of pages, the root cause is almost certainly in a template or CMS configuration, not in individual content. Fixing the template resolves the issue everywhere at once.
Re-crawl after fixing. Always verify your fixes by running another crawl. It is common for one fix to introduce a new issue, especially with redirects and canonical tags.
Establish a schedule. Technical issues accumulate over time. Plugins get updated, content editors make changes, and external sites restructure. Run a full audit at least once per quarter, and run targeted crawls after any major site change such as a migration, redesign, or large content publish.
Document your findings. Keep a record of what you found, what you fixed, and what you deferred. This creates accountability, helps new team members get up to speed, and makes it easy to track progress over time.
Summary
Technical SEO is the foundation that everything else builds on. Great content and strong backlinks cannot compensate for a site that search engines cannot properly crawl and index. This 25-point checklist gives you a structured framework for identifying and resolving the most common and impactful technical issues.
Start with crawlability and indexing, because nothing else matters if search engines cannot reach your pages. Then work through on-page elements, link structure, performance, and structured data. Each issue you resolve removes friction between your content and the people searching for it.