The Technical SEO Audit examines factors that affect how search engines and AI crawlers discover, understand, and index your website. Unlike Lighthouse, which measures performance, this audit covers crawlability and indexability.
Overall ability for search engines to find and index your page:

| Status | Meaning |
|---|---|
| Indexable | Page can be indexed normally |
| Indexable with warnings | Can be indexed but has minor issues |
| Blocked | Search engines cannot index this page |
Your robots.txt file controls crawler access:

- Exists: Whether robots.txt is present
- Googlebot Access: Whether Googlebot is allowed
- Rules Count: Number of rules defined
- Sitemap References: Sitemaps listed in robots.txt
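A minimal robots.txt showing the elements these checks look for; the disallowed path and sitemap URL are placeholders:

```txt
# Allow all crawlers, but keep non-public areas out
User-agent: *
Disallow: /admin/

# Tell crawlers where the sitemap lives
Sitemap: https://www.example.com/sitemap.xml
```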
Your XML sitemap helps search engines discover pages:

- Exists: Whether sitemap is accessible
- Valid XML: If the format is correct
- URL Count: Number of URLs in sitemap
- Type: Standard or sitemap index
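A minimal valid sitemap of the kind this check expects; URLs and dates are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2025-01-15</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/about/</loc>
  </url>
</urlset>
```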
HTML meta tags controlling indexing:

- Index/Noindex: Should search engines index the page
- Follow/Nofollow: Should links be followed
- Raw Tag: The actual meta robots content
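These directives come from a meta tag in the page's `<head>`, for example:

```html
<!-- Default: allow indexing and follow links -->
<meta name="robots" content="index, follow">

<!-- Keep a page out of search results entirely -->
<meta name="robots" content="noindex, nofollow">
```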
The preferred version of your page:

- Exists: Whether canonical is specified
- URL: The canonical URL value
- Self-referential: If it points to itself (recommended)
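A self-referential canonical is a single link tag in the `<head>` pointing at the page's own preferred URL (the URL here is a placeholder):

```html
<link rel="canonical" href="https://www.example.com/this-page/">
```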
Social sharing metadata for Facebook/LinkedIn:

- Has Title: og:title present
- Has Description: og:description present
- Has Image: og:image present
- All Tags: List of Open Graph tags found
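The three checked tags look like this; all values are placeholders:

```html
<meta property="og:title" content="Your Page Title">
<meta property="og:description" content="A short summary shown in link previews.">
<meta property="og:image" content="https://www.example.com/share-image.png">
```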
Social sharing metadata for Twitter/X:

- Card Type: summary, summary_large_image, etc.
- All Tags: List of Twitter tags found
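A typical Twitter/X card declaration, with placeholder values:

```html
<meta name="twitter:card" content="summary_large_image">
<meta name="twitter:title" content="Your Page Title">
<meta name="twitter:description" content="A short summary shown in link previews.">
```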
JSON-LD markup for rich search results:

- JSON-LD Count: Number of schema scripts
- Has Microdata: If microdata is used
- Schema Types: Types found (Organization, Article, etc.)
- Validation: If schemas are valid
Security and encryption:

- Valid: Certificate is properly installed
- Issuer: Who issued the certificate
- Days Until Expiry: Time until renewal needed
- Protocol: TLS version used
URL redirects affecting performance:

- Redirect Count: Number of hops
- Total Latency: Time spent redirecting
- HTTP→HTTPS: If upgrade redirect exists
- Final URL: Where redirects end
Broken links on your page:

- Pages Crawled: How many pages checked
- Links Found: Total internal links
- Broken Links: Links returning 404
- Redirecting Links: Links that redirect
Which AI crawlers can access your content:

- Lists major AI bots (GPTBot, ClaudeBot, etc.)
- Shows if each is allowed or blocked
- Indicates explicit vs. inherited rules
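You can reproduce this kind of check locally with Python's standard-library robots.txt parser; the rules below are an illustrative example, not your site's actual file:

```python
from urllib.robotparser import RobotFileParser

# Illustrative robots.txt: GPTBot is explicitly blocked,
# every other crawler inherits the wildcard Allow rule.
rules = """
User-agent: GPTBot
Disallow: /

User-agent: *
Allow: /
""".splitlines()

rp = RobotFileParser()
rp.parse(rules)

print(rp.can_fetch("GPTBot", "https://example.com/page"))     # explicit block -> False
print(rp.can_fetch("ClaudeBot", "https://example.com/page"))  # inherited from * -> True
```

A bot with its own rule group follows that group; any bot without one falls back to the `*` rules, which is the "explicit vs. inherited" distinction the audit reports.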
1. From the sidebar, click SEO Audits → Technical Audit.
2. Choose a tracked page from the dropdown.
3. Click Run Technical Check:
   - Scan takes 15-30 seconds
   - Results appear automatically
   - No credits required (free!)
A technical SEO score based on all checks:

- 80-100: Excellent technical health
- 60-79: Good with minor issues
- Below 60: Needs attention
Each check type is an expandable section:

- Click to expand/collapse
- Shows status badge (OK, Warning, Error)
- Detailed findings inside
For issues found, you'll see:

- What's wrong
- How to fix it
- Links to documentation
This unique feature shows AI crawler access:
AI assistants like ChatGPT, Claude, and Perplexity crawl websites to include brands in their responses. Blocking these bots can hurt your AI visibility.
| Bot | Owner | Status |
|---|---|---|
| GPTBot | OpenAI | |
| ClaudeBot | Anthropic | |
| PerplexityBot | Perplexity | |
| Google-Extended | Google AI | |
- Allow AI bots unless you have specific reasons to block them
- Use explicit rules rather than blocking all unknown bots
- Consider AI visibility when updating robots.txt
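Explicit per-bot rules might look like this; the disallowed path is a placeholder:

```txt
# Explicitly allow major AI crawlers
User-agent: GPTBot
Allow: /

User-agent: ClaudeBot
Allow: /

User-agent: PerplexityBot
Allow: /

# Default rules for everyone else
User-agent: *
Disallow: /admin/
```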
The Technical Audit includes a Schema Generator section:
Pre-generated JSON-LD for your brand:

| Schema | Purpose |
|---|---|
| Organization | Brand info for knowledge panels |
| WebSite | Site info with search action |
| Speakable | Voice assistant optimization |
| FAQPage | FAQ structured data template |
| Article | Blog post template |
1. Find the schema you need
2. Click Copy to copy the HTML snippet
3. Paste into your page's `<head>` section
4. Customize the placeholder values
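A copied Organization snippet looks roughly like this once pasted; replace the placeholder values with your brand's details (the exact generator output may differ):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Your Brand",
  "url": "https://www.example.com",
  "logo": "https://www.example.com/logo.png"
}
</script>
```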
**Robots.txt**

| Issue | Fix |
|---|---|
| Missing robots.txt | Create the file at your domain root |
| Googlebot blocked | Allow Googlebot in rules |
| No sitemap reference | Add a `Sitemap:` line |

**Sitemap**

| Issue | Fix |
|---|---|
| Missing sitemap | Create an XML sitemap |
| Invalid XML | Fix syntax errors |
| Not in robots.txt | Add a sitemap reference |

**Meta robots**

| Issue | Fix |
|---|---|
| Noindex on important page | Remove the noindex directive |
| Nofollow everywhere | Use nofollow selectively |

**Canonical**

| Issue | Fix |
|---|---|
| Missing canonical | Add a self-referential canonical |
| Wrong URL | Point to the correct canonical |

**Social tags**

| Issue | Fix |
|---|---|
| Missing og:image | Add an Open Graph image |
| Missing Twitter card | Add Twitter meta tags |

**Structured data**

| Issue | Fix |
|---|---|
| No structured data | Add JSON-LD schemas |
| Validation errors | Fix required properties |

**SSL**

| Issue | Fix |
|---|---|
| Certificate expiring soon | Renew the certificate |
| Self-signed certificate | Use a trusted certificate |
Re-run the audit after:

- New site launches
- Robots.txt modifications
- CMS or platform changes
- SSL certificate updates
If your page is blocked from indexing, other issues don't matter. Fix indexability first.
Blocking too much in robots.txt can hurt discovery. Only block what's truly not for indexing.
Technical SEO Audit is FREE, no credits required!
- Lighthouse Audits - Check performance scores
- Competitor SEO Comparison - Compare against competitors
- Improve Your Results - Optimize AI visibility
If you have questions about technical SEO audits, contact us at [email protected].