Is Your WordPress Host Secretly Blocking ChatGPT and Claude?

Your site ranks fine on Google but never gets cited by ChatGPT or Claude? The block might not be in your code — it might be your hosting plan. Here is how to find out in 60 seconds.

Your Chattanooga business website might be invisible to ChatGPT, Claude, and Perplexity right now — and the cause might not be anything you did wrong. A growing number of managed WordPress hosts are silently blocking AI crawlers at the platform level, below the layer where you or your developer can see it. Standard SEO audits miss it. Your dashboard does not show it. But your AI search visibility is being throttled before a single bot reaches your content.

If you read our AI-agent readiness checklist and ticked every box — schema markup, llms.txt, robots.txt allowlist, the works — this is the bug that quietly undoes all of it. Let's find it and fix it.

The Invisible Problem: Your Host Is Sending AI Bots a 429

In April 2026, an SEO investigator analyzed Cloudflare logs for a WP Engine-hosted site and discovered that ClaudeBot and GPTBot were getting rate-limited 29% of the time. Amazonbot was rate-limited 51% of the time. Bytespider was hit with a 520 error 61% of the time. Browser traffic and Googlebot? Clean 200s. The block specifically targeted AI bots by user-agent string, and it was happening at the hosting platform layer: below Cloudflare, below WordPress plugins, below anything visible in the customer dashboard.

WP Engine support confirmed it on the record: "WP Engine does enforce platform-wide rate limiting on certain high-impact bots to protect overall server performance, and that part can't be selectively disabled per bot." Translation: it is the policy, and you cannot turn it off in your account settings.

This is the kind of issue you cannot diagnose by looking at your own site. The block fires before the request ever reaches your WordPress install, your security plugin, or your robots.txt. As far as your analytics are concerned, the bot just never showed up.

Why This Matters for a Chattanooga Business

Access correlates directly with citations. In the same investigation, the site with 0% Claude crawler access received 0% Claude citations. Perplexity, which had 100% access, contributed 7.8% of the site's AI traffic. Google's AI Mode hit 37.8% citation coverage; Claude was a flat zero. That is not a coincidence — if the bot cannot read your page, no AI tool can quote you.

For a Chattanooga restaurant, landscaper, or law firm, this means a real loss. When a customer asks ChatGPT "who's the best web designer in Chattanooga?" or "where can I get tacos al pastor near downtown?", the AI pulls from sites it can actually read. If your host blocks that read, your competitor — who happens to be on a different host — gets the recommendation. You never see the lost lead because there is no log entry for a search that never reached you.

This is the new layer of answer engine optimization: it is not enough to structure the page for AI. The infrastructure under the page has to actually let AI in.

The 60-Second Test: Is Your Host Blocking AI Bots?

You can test this yourself in under a minute. Open Terminal on a Mac (or PowerShell on Windows) and run two requests against your homepage — one as a normal browser, one as an AI bot. Replace yourdomain.com with your actual site:

# Test as a normal browser
curl -s -o /dev/null -w "%{http_code}\n" -A "Mozilla/5.0" https://yourdomain.com

# Test as ClaudeBot
curl -s -o /dev/null -w "%{http_code}\n" -A "ClaudeBot/1.0" https://yourdomain.com

# Test as GPTBot
curl -s -o /dev/null -w "%{http_code}\n" -A "GPTBot/1.0" https://yourdomain.com

You want to see 200 for all three. If the browser test returns 200 but ClaudeBot or GPTBot returns 429, 403, or 520, something between you and the open internet is blocking AI bots by user-agent. Run the bot test 5–10 times in a row to also catch rate-limiting that only kicks in after a few requests — the WP Engine pattern only triggered after sustained traffic.
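If you want the repeated-request version scripted, here is a minimal sketch in POSIX sh. Swap in your own domain; the "ClaudeBot/1.0" string is just a user agent we set on the request, not a real crawler session:

```shell
#!/bin/sh
# Fire 10 back-to-back requests as ClaudeBot and print each status code.
# A healthy host answers 200 every time; WP Engine-style rate limiting
# shows up as 200s that flip to 429s partway through the run.
# curl prints 000 if a request fails outright, so you always get 10 lines.
out=""
for i in 1 2 3 4 5 6 7 8 9 10; do
  code=$(curl -s -o /dev/null -m 10 -w '%{http_code}' \
    -A 'ClaudeBot/1.0' 'https://yourdomain.com')
  out="${out}request ${i}: ${code}
"
done
printf '%s' "$out"
```

A clean result is ten lines of 200. A run that starts at 200 and degrades to 429 is the signature of sustained-traffic rate limiting.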

Not comfortable with the command line? Free no-code checkers like MRS Digital's AI Crawler Access Checker and Crawl Gate give you a one-click answer. They are not as thorough as testing under load, but they catch the obvious cases.

How the Major Managed WordPress Hosts Handle AI Bots

Not every host treats AI crawlers the same way. Here is where each of the big managed WordPress providers stood as of May 2026:

Host | Default AI Bot Behavior | Customer Control
WP Engine | Platform-level rate limiting (429s) | Not self-service; requires support escalation
Kinsta | No platform-level block | Opt-in Bot Protection with four levels
Pressable | Does not disallow AI bots by default | Customer-controlled
Pantheon | Does not block identified bot traffic | Customer-controlled

Shared hosts like Bluehost, SiteGround, and GoDaddy fall in a third category: their AI-bot policy is often a moving target depending on plan tier and the security plugins that ship with the install. Test your specific plan rather than assuming.

Why Hosts Block AI Bots in the First Place

Hosting providers are not blocking AI bots out of malice. They are blocking them because the math is brutal. According to Cloudflare's Q1 2026 crawl-to-referral analysis, ClaudeBot makes 20,583 crawl requests for every single referral it sends back. GPTBot's ratio is 1,255 to 1. Compare that to Googlebot, which roughly breaks even — you get about as much traffic back as you pay in crawl bandwidth.

For a host running thousands of WordPress sites on shared infrastructure, AI training crawlers are pure cost with almost no payoff for the customer. So they cap them. The honest answer is that platform-wide rate-limiting protects performance for the average WordPress customer who does not care about AI search visibility. The dishonest part is not telling you that is the trade-off.

That trade-off only makes sense for businesses that do not depend on AI discovery. If you are a content site, an agency, a service business in a competitive local market — basically anyone whose growth depends on showing up in answers people get from ChatGPT and Claude — you are paying for a default that costs you customers.

How to Fix It: Three Paths Forward

If your test came back with 429s or 403s on AI bots, you have three options — in order of effort:

1. Escalate to your host's product engineering team.
On WP Engine, the support team has confirmed an "exceptional use case" path that engineering can apply to specific accounts. Open a support ticket, list the bots you need allowed (ClaudeBot, GPTBot, PerplexityBot, OAI-SearchBot, Google-Extended), and explain that AI search visibility is part of your business model. This works on a per-account basis but is not a self-service toggle. Before you escalate, confirm in your dashboard that Utilities → Redirect Bots is off and that no Web Rules entries are blocking AI user agents; that handles the layers you can see before you fight about the layer you cannot.
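Once the host applies the exception, a loop like this (a sketch; substitute your domain) confirms all the crawling bots in one pass. Google-Extended is left out because it is a robots.txt token, not a user agent that crawls on its own:

```shell
#!/bin/sh
# Request the homepage once per AI user agent and report the status codes.
# After the platform exception is in place, every line should read 200.
# curl prints 000 if a request fails outright, so each agent yields a line.
report=""
for agent in 'ClaudeBot/1.0' 'GPTBot/1.0' 'PerplexityBot/1.0' \
             'OAI-SearchBot/1.0' 'Mozilla/5.0'; do
  code=$(curl -s -o /dev/null -m 10 -w '%{http_code}' \
    -A "$agent" 'https://yourdomain.com')
  report="${report}${agent}: ${code}
"
done
printf '%s' "$report"
```

The Mozilla/5.0 line is your control: if it ever stops returning 200, the problem is no longer bot-specific.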

2. Switch to a host that does not platform-block.
Kinsta, Pressable, and Pantheon have all publicly stated they do not block AI crawlers at the platform level. Migrating WordPress is a project — usually 3 to 8 hours of work for a small business site — but if AI visibility is core to your strategy, the math gets simple fast. One missed lead in a competitive Chattanooga market often pays for the migration.

3. Migrate off WordPress entirely.
This is what we did with our own site, and it is what we recommend for most local businesses. A static site — plain HTML, CSS, and JavaScript — hosted on Firebase, Netlify, or Cloudflare Pages does not have a "platform-level bot block" because there is no shared platform doing the blocking. Bots see exactly what humans see. Sites load in under a second. Hosting costs drop to a few dollars a month. The full page-structure playbook for AI search works as designed because nothing is silently filtering the bots that read it.

What We Use (and Why It Is Not WP Engine)

Our own site — the one you are reading right now — is static HTML hosted on Firebase. ClaudeBot, GPTBot, PerplexityBot, OAI-SearchBot, Google-Extended, Applebot-Extended, and friends are explicitly welcomed in our robots.txt, indexed in our llms.txt, and rate-limited by nobody. The same architecture is what we ship to clients who care about AI search visibility.
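For reference, the allowlist we describe is plain robots.txt syntax. A minimal version that welcomes the major AI crawlers looks something like this (the bot names are the published user-agent tokens; adjust the list to your needs):

```text
# robots.txt: welcome AI crawlers explicitly
User-agent: GPTBot
Allow: /

User-agent: OAI-SearchBot
Allow: /

User-agent: ClaudeBot
Allow: /

User-agent: PerplexityBot
Allow: /

User-agent: Google-Extended
Allow: /

User-agent: *
Allow: /
```

Keep in mind the catch this whole article is about: this file only matters once the hosting layer actually lets the bot reach it.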

This is not a knock on WordPress — WordPress runs about 43% of the web for good reason. It is a knock on managed hosting plans that make decisions about your bot traffic without telling you. If you are committed to WordPress, pick a host that gives you the controls. If you are open to a different stack, static is faster, cheaper, and gets you out of the bot-blocking conversation entirely.

The Bigger Picture: AI-Agent Readiness Is Now a Hosting Question

For the last two years, AI search visibility was a content problem: structure your pages for answer engines, ship schema markup, write a good llms.txt, allow the bots in robots.txt. That is still true — our 10-point readiness checklist still holds.

But there is now a layer underneath: the network and hosting layer. You can do everything right at the page level and still be invisible to AI if your host throttles the crawlers. That makes the question "who hosts your site?" a real AEO question, not just an IT one. When you talk to whoever runs your website, ask them to run the curl test above. If they cannot, or will not, that is its own kind of answer.

Frequently Asked Questions

Does blocking AI bots affect my Google rankings?
No. Googlebot is a separate crawler from GPTBot, ClaudeBot, and PerplexityBot. Blocking AI training and answer-engine crawlers does not change how your site ranks in classic Google Search. It does change whether you appear in ChatGPT answers, Claude responses, and Perplexity citations. Google's AI features are the nuance: AI Overviews draw on the regular index Googlebot builds, while the separate Google-Extended robots.txt token controls whether your content can be used for Gemini.

Is updating robots.txt enough?
Not always. robots.txt is a polite request that bots can choose to honor. Hosting platforms that rate-limit AI crawlers at the network edge do not consult your robots.txt — the block fires before the request reaches your file. You have to fix the block at the layer that is enforcing it: hosting platform, CDN, security plugin, or firewall.
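You can see that layer mismatch directly with two requests (a sketch; substitute your domain). The first shows your robots.txt is reachable at all; the second shows what the edge does when a request arrives wearing an AI user agent:

```shell
#!/bin/sh
# Compare what robots.txt promises with what the edge actually serves.
# If robots.txt fetches fine but the bot request gets a 429 or 403, the
# block lives above WordPress, and editing robots.txt will not fix it.
robots_status=$(curl -s -o /dev/null -m 10 -w '%{http_code}' \
  'https://yourdomain.com/robots.txt')
bot_status=$(curl -s -o /dev/null -m 10 -w '%{http_code}' \
  -A 'GPTBot/1.0' 'https://yourdomain.com/')
printf 'robots.txt fetch:      %s\n' "$robots_status"
printf 'GPTBot homepage fetch: %s\n' "$bot_status"
```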

What about AI Overviews in Google search?
Google-Extended is not a standalone crawler. It is a robots.txt control token that regular Googlebot honors, governing whether your content can be used for Gemini. AI Overviews themselves are built from the standard search index, so ordinary Googlebot access is the main requirement there. The real risk with blanket "AI bot" rules is overreach: a host or firewall rule written broadly enough can degrade legitimate Google fetches too, so you rank fine in classic search today while your visibility in Google's AI features quietly erodes.

Should I just switch hosts?
Switch if AI search visibility matters to your business and your current host will not let you opt out of the block. For a lot of Chattanooga small businesses, the simpler answer is to leave WordPress entirely — a static site avoids the problem, loads faster, and costs less. We can help with either path.

Want Us to Run the Test for You?

If you are not sure whether your site is reachable by ChatGPT and Claude, we will run a free 5-minute audit and tell you exactly what we find. No commitment, no upsell — just a straight answer about whether your AI visibility is being silently throttled.

Send us your URL and we will reply with the result, the diagnosis, and (if needed) the fix. Same offer in Spanish.

Get a Free AI-Visibility Check

Make Sure AI Can Actually Find You

From hosting diagnostics to a full AI-readiness audit, we make sure your Chattanooga business gets cited — not skipped.

Start a Project