Google Ads·23 September 2025·6 min read

The AdsBot Problem: Why Your Site Might Be Killing Your Campaigns

What AdsBot does, how blocking it triggers ad disapprovals, how to check your robots.txt, and the fix that restored a client account in under 24 hours.

By Jay

Your Google Ads are disapproved. The landing page loads fine in your browser. The URL in the ad is correct. Nothing obvious is wrong. And yet the campaign is not serving.

The culprit is often AdsBot, Google's automated crawler that validates ad destinations. If your site is blocking AdsBot, you get destination disapprovals regardless of how functional your site is for real visitors. This problem is common, underdiagnosed, and completely fixable once you know what to look for.

What AdsBot Is and What It Does

AdsBot is a dedicated Google crawler that exists specifically to validate Google Ads destinations. It is separate from Googlebot, which crawls your site for search indexing. AdsBot's job is narrower: it checks that your ad destination URLs are accessible, that they load correctly, and that the page content is consistent with what your ad promises.

Google runs AdsBot crawls on a regular schedule, and the result of those crawls directly affects your ad status. If AdsBot successfully reaches and validates your landing page, the destination is confirmed. If AdsBot gets an error response, a timeout, or is blocked, the ad destination is flagged as not working and the ad is disapproved.

There are 2 AdsBot user agents to know. AdsBot-Google is the desktop crawler. AdsBot-Google-Mobile is the mobile crawler. Both need to be able to access your landing pages. A site that blocks one and allows the other still triggers disapprovals.
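One quick way to see what your server returns to AdsBot is to request the landing page with the same user agent header. This is a minimal sketch using Python's standard library; the URL is a placeholder, and it only reproduces user-agent based blocks, not IP-based ones. (Google's full user agent strings include extra tokens, but servers typically match on these substrings.)

```python
import urllib.error
import urllib.request

# The two AdsBot user agent substrings to test with.
ADSBOT_AGENTS = ["AdsBot-Google", "AdsBot-Google-Mobile"]

def fetch_as(url, user_agent):
    """Request a URL with a given User-Agent header and return the HTTP status code."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        return e.code  # the 403, 429, or 500 that AdsBot would see

# Usage (placeholder URL):
# for agent in ADSBOT_AGENTS:
#     print(agent, fetch_as("https://example.com.au/landing-page", agent))
```

A 200 for both agents means user-agent based rules are not the problem; anything else tells you which agent is being turned away.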

How It Triggered a Client's Disapproval

This happened with Greek Street Unley during a website migration. The agency handling the site rebuild added a server-side rate limiting rule to protect against automated scraping. The rule was reasonable: limit requests from non-browser user agents to 10 requests per minute.

AdsBot does not look like a browser. It identifies itself as AdsBot-Google in its user agent header. The rate limiter treated it as a scraper and returned a 429 Too Many Requests response.

The campaign had been running well. Within 24 hours of the migration going live, the ads for their private dining room stopped serving. The disapproval reason in the account was "Destination not working." The page loaded instantly for every human visitor. The server logs told a different story: AdsBot was hitting the site, getting 429 responses, and reporting the destination as inaccessible.

The campaign went dark for 3 days before anyone connected the timing of the disapproval to the website update. That meant 3 days of zero impression share on a campaign targeting high-value event booking queries, during a period where bookings mattered.

How to Check If Your Site Is Blocking AdsBot

There are 3 places to look.

1. Your robots.txt file.

Visit yourdomain.com.au/robots.txt in your browser. Look for any rules that name AdsBot explicitly. A robots.txt that contains:

User-agent: AdsBot-Google
Disallow: /

is blocking AdsBot from accessing your site. This is the most common cause and the easiest to fix.

Two details make AdsBot unusual here. A Disallow directive for Googlebot does not apply to AdsBot; Google documents explicitly that AdsBot does not inherit Googlebot's rules. And Google also documents that AdsBot ignores the global wildcard (User-agent: *), so a blanket Disallow: / aimed at all bots does not block it either. Only a group that names AdsBot-Google or AdsBot-Google-Mobile directly keeps it out via robots.txt.
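You can sanity-check a robots.txt against the AdsBot user agents with Python's standard-library parser. Note that urllib.robotparser applies generic group matching and does not model Google's special-case handling of AdsBot, so treat the result as a first pass rather than the final word. The URL here is a placeholder.

```python
from urllib import robotparser

# A robots.txt that names AdsBot explicitly, the pattern to hunt for.
RULES = """\
User-agent: AdsBot-Google
Disallow: /
"""

rp = robotparser.RobotFileParser()
rp.parse(RULES.splitlines())

for agent in ("AdsBot-Google", "AdsBot-Google-Mobile", "Googlebot"):
    print(agent, rp.can_fetch(agent, "https://example.com.au/landing-page"))
```

Against a live site you can call rp.set_url("https://example.com.au/robots.txt") and rp.read() instead of parsing a string.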

2. Your server access logs.

If you have access to server logs (through cPanel, SSH, or a logging service), search for recent AdsBot requests. Filter by user agent string containing "AdsBot-Google." Look at the HTTP status code for those requests.

200 means AdsBot reached the page successfully. 403 means access forbidden. 404 means page not found. 429 means rate limited. 500 means a server error. Any non-200 status for AdsBot requests will trigger disapprovals.
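Tallying those status codes by hand is tedious on a busy site. This is a minimal sketch assuming a combined-format access log; the log path and the status-code regex are assumptions you may need to adjust for your server.

```python
import re
from collections import Counter

# Status code sits after the quoted request line in combined log format:
# ... "GET /page HTTP/1.1" 429 512 ...
STATUS_RE = re.compile(r'"\s(\d{3})\s')

def adsbot_status_counts(lines):
    """Count HTTP status codes for requests whose user agent mentions AdsBot."""
    counts = Counter()
    for line in lines:
        if "AdsBot-Google" not in line:
            continue
        m = STATUS_RE.search(line)
        if m:
            counts[m.group(1)] += 1
    return counts

# Usage (hypothetical path):
# with open("/var/log/nginx/access.log") as f:
#     print(adsbot_status_counts(f))
```

A result dominated by anything other than 200 points straight at the blocking layer.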

If you do not have direct log access, your hosting provider can pull these for you, or check your web application firewall dashboard if you are using Cloudflare or a similar service.

3. Your WAF and Cloudflare configuration.

Web application firewalls block automated traffic as a security measure. This is correct behaviour for most bots. AdsBot is the exception you need to configure.

In Cloudflare, check your firewall rules and bot management settings. If you have bot fight mode enabled or a custom firewall rule that blocks non-browser user agents, AdsBot may be caught in those rules. You can verify this by checking the Cloudflare Firewall Events log and filtering by the AdsBot user agent.

The robots.txt Fix

The fix for a robots.txt blocking issue is simple. Remove the Disallow rule that names AdsBot, or override it with explicit allow groups for both AdsBot user agents.

User-agent: AdsBot-Google
Allow: /

User-agent: AdsBot-Google-Mobile
Allow: /

User-agent: *
Disallow: /private/

Order does not determine precedence in robots.txt; a crawler follows the single most specific user-agent group that matches it. The explicit AdsBot groups therefore win for AdsBot, while the wildcard rules still apply to every other bot.
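You can confirm the fixed file behaves as intended with urllib.robotparser. It applies generic group matching rather than Google's AdsBot-specific handling, but that is enough to check the precedence here; the URLs are placeholders.

```python
from urllib import robotparser

FIXED = """\
User-agent: AdsBot-Google
Allow: /

User-agent: AdsBot-Google-Mobile
Allow: /

User-agent: *
Disallow: /private/
"""

rp = robotparser.RobotFileParser()
rp.parse(FIXED.splitlines())

# AdsBot matches its own group and is allowed everywhere;
# an unnamed bot falls through to the wildcard group and is blocked.
print(rp.can_fetch("AdsBot-Google", "https://example.com.au/private/menu"))
print(rp.can_fetch("SomeScraper", "https://example.com.au/private/menu"))
```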

If you are using a plugin or CMS tool to manage your robots.txt (common in WordPress), make sure the manual AdsBot rules are added in the raw robots.txt file and are not being overwritten by the plugin on each save.

For WAF and rate limiting fixes, the approach is to allowlist AdsBot's user agent string or IP ranges in your firewall rules. Google publishes the IP ranges for its special-case crawlers, including AdsBot, in its crawler documentation. Adding those ranges to your allowlist stops rate limiters and WAF rules from blocking AdsBot.
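As a sketch of how that allowlist check could work: assuming you have downloaded the JSON file of crawler IP ranges that Google publishes (the local filename, and the ipv4Prefix/ipv6Prefix entry names, are assumptions to verify against the actual file), the standard ipaddress module can test a request IP against those ranges.

```python
import ipaddress
import json

def load_networks(path):
    """Parse a JSON file of CIDR prefixes into ip_network objects."""
    with open(path) as f:
        data = json.load(f)
    nets = []
    for prefix in data.get("prefixes", []):
        cidr = prefix.get("ipv4Prefix") or prefix.get("ipv6Prefix")
        if cidr:
            nets.append(ipaddress.ip_network(cidr))
    return nets

def is_google_crawler_ip(ip, networks):
    """True if the address falls inside any published crawler range."""
    addr = ipaddress.ip_address(ip)
    return any(addr in net for net in networks if net.version == addr.version)

# Usage (hypothetical filename):
# nets = load_networks("special-crawlers.json")
# print(is_google_crawler_ip("66.249.64.5", nets))
```

The same check works as the condition in a rate limiter: skip the limit when the source IP is in the list.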

How to Verify the Fix Is Working

After making the robots.txt or WAF changes, do not just assume the fix worked. Verify it.

Step 1: Check the robots.txt report in Google Search Console, under Settings. It shows the version of your robots.txt that Google last fetched, along with any fetch or parsing errors, which gives you an immediate read on what AdsBot will see without waiting for the next crawl.

Step 2: Check server logs 24 hours after the fix. Confirm that AdsBot user agent requests are now returning 200 status codes. One clean log entry from AdsBot with a 200 response confirms the access issue is resolved.

Step 3: Request a manual re-review of the disapproved ads. Go into your Google Ads account, find the disapproved ads, and click "Request review" on each one. Once the re-review crawl can reach your page, the ads should be approved, typically within 1 to 24 hours.

Step 4: Monitor impression share for 48 hours after re-approval. Confirm the campaigns are serving at their previous level. If impression share stays low, check whether there are additional disapprovals in other campaigns that the investigation missed.

The Preventive Measure

The time to check your AdsBot configuration is before you launch campaigns, not after they start disapproving. Every time a website is rebuilt or a server configuration is changed, AdsBot access should be on the verification checklist.

If you manage multiple Google Ads accounts, add a monthly robots.txt check to your account maintenance process. It takes 2 minutes. A disapproval that runs undetected for a week costs significantly more than 2 minutes of checking.

For help auditing your Google Ads setup or diagnosing campaign problems that are not yielding to obvious fixes, see our Google Ads services or reach out directly.

Tags: AdsBot, Google Ads, robots.txt, disapproval