April 14, 2026

Your Website Might Be Invisible to ChatGPT. Here's How to Check in 30 Seconds.

Open a new browser tab and go to your robots.txt file — it lives at yourdomain.com/robots.txt. Hit Ctrl+F (or Cmd+F on Mac) and search for GPTBot.

If you see something like User-agent: GPTBot followed by Disallow: /, your website is telling OpenAI's crawler to stay out. ChatGPT can't read your pages, can't learn about your business, and will never mention you in a response.

The same goes for other AI search engines. Search the same robots.txt file for ClaudeBot, PerplexityBot, and anthropic-ai. If any of them are blocked, those AI platforms can't see you either.
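If you'd rather check from a script, Python's standard-library robots.txt parser can evaluate the rules for each crawler. A minimal sketch: paste in your own robots.txt content (the sample below shows a site that blocks GPTBot), and example.com is just a placeholder for your domain:

```python
from urllib.robotparser import RobotFileParser

# Paste the contents of yourdomain.com/robots.txt here.
# This sample shows a site that blocks GPTBot but nothing else.
robots_txt = """\
User-agent: GPTBot
Disallow: /
"""

AI_CRAWLERS = ["GPTBot", "ClaudeBot", "PerplexityBot", "anthropic-ai"]

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

for bot in AI_CRAWLERS:
    # can_fetch() asks: may this user agent read this URL?
    verdict = "allowed" if parser.can_fetch(bot, "https://example.com/") else "BLOCKED"
    print(f"{bot}: {verdict}")
```

Run against the sample rules above, this reports GPTBot as BLOCKED and the other three as allowed — exactly the situation the manual Ctrl+F check would reveal.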

Why This Matters Right Now

The way people find local businesses is changing fast. When someone asks ChatGPT or Perplexity "who's the best plumber in Austin," those AI tools pull their answers from websites they've been able to read and index. If your site blocks their crawlers, you're not in the running.

This isn't a future problem. It's happening right now. AI-assisted search is growing every month, and the businesses that show up in those results are getting calls that used to come from Google.

Why Most Sites Block AI By Default

Here's the frustrating part — many website platforms and hosting providers block AI crawlers by default. They do this to "protect" your content from being used to train AI models. Sounds reasonable until you realize it also means AI search engines can't recommend your business.

Content-heavy sites like news publications have a reason to block AI training. Your plumbing business does not. You want ChatGPT to know everything about your services, your service area, and your pricing.

How to Fix It

You need to edit your robots.txt file — it lives at yourdomain.com/robots.txt. Make sure it includes:

User-agent: GPTBot
Allow: /

User-agent: ClaudeBot
Allow: /

User-agent: PerplexityBot
Allow: /

If you don't control your robots.txt (common on Wix, Squarespace, and some WordPress setups), contact your hosting provider or developer. This is a 5-minute fix that could determine whether AI search engines ever mention your business.
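Before you publish the change, you can sanity-check the new rules with Python's standard-library robots.txt parser. A short sketch, with example.com again standing in for your domain:

```python
from urllib.robotparser import RobotFileParser

# The corrected rules from the section above.
fixed_robots = """\
User-agent: GPTBot
Allow: /

User-agent: ClaudeBot
Allow: /

User-agent: PerplexityBot
Allow: /
"""

parser = RobotFileParser()
parser.parse(fixed_robots.splitlines())

for bot in ("GPTBot", "ClaudeBot", "PerplexityBot"):
    assert parser.can_fetch(bot, "https://example.com/"), f"{bot} is still blocked"
print("All three AI crawlers can read the site.")
```

If any assertion fails, the live file doesn't match what you intended — re-check it at yourdomain.com/robots.txt.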

Beyond robots.txt

Allowing AI crawlers is step one. Actually getting cited in AI responses takes more than an open door.

This is what we call GEO — Generative Engine Optimization. It's the new layer on top of traditional SEO, and most agencies haven't caught up to it yet.

Need help with this for your business? We build it, set it up, and keep it running.

Talk to us →