
Google Tests Web Bot Auth for AI Agent Verification

Summary

– Google is testing Web Bot Auth, an experimental IETF protocol that lets websites cryptographically verify automated bot and AI agent requests using signed HTTP messages.
– The protocol uses HTTP Message Signatures: bots sign requests with a private key, and websites verify the signature against a published public key to confirm the sender’s identity.
– Google is testing the protocol with some AI agents on its infrastructure, but not all user agents or requests are signed, and sites should still rely on IP and reverse DNS checks.
– Bot impersonation is a persistent problem, and Web Bot Auth adds a cryptographic layer that cannot be forged without the agent’s private key.
– The protocol is still in the standards process, and Google’s implementation is experimental, with no urgency for sites to act beyond treating it as supplementary verification.

Google is currently testing Web Bot Auth, an experimental IETF protocol designed to give websites a cryptographic method for verifying certain automated requests from bots and AI agents. The company has published documentation detailing how this system works.

This protocol adds a new layer of trust. It allows automated agents to sign their HTTP requests using cryptographic keys. When a website receives a signed request, it can check the signature against a published public key, confirming the request’s origin. This is a step beyond relying solely on IP addresses or user-agent strings.

How the Verification Works

The system relies on HTTP Message Signatures (RFC 9421). An automated client holds a private key, makes its public key available at a known URL, and signs each outgoing request. The receiving site then verifies the signature against that public key to confirm the agent’s identity.
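To make the flow concrete, here is a simplified sketch of how a verifier rebuilds the RFC 9421 "signature base", the exact byte string the bot signed. The component names and parameter values below are illustrative, not Google's actual configuration.

```python
# Rebuild the RFC 9421 signature base: one line per covered component,
# followed by the "@signature-params" line copied from Signature-Input.

def build_signature_base(components: dict[str, str], params: str) -> str:
    """Serialize covered components plus the @signature-params line.

    `components` maps lowercased component names (e.g. "@authority")
    to their values; `params` is the inner list taken verbatim from
    the request's Signature-Input header.
    """
    lines = [f'"{name}": {value}' for name, value in components.items()]
    lines.append(f'"@signature-params": {params}')
    return "\n".join(lines)

# Hypothetical example values for a signed agent request:
base = build_signature_base(
    {"@authority": "example.com",
     "signature-agent": '"https://agent.bot.goog"'},
    '("@authority" "signature-agent");created=1700000000;keyid="test-key";alg="ed25519"',
)
```

The verifier then checks the bot's signature over this string using the public key fetched from the agent's published key directory; if even one covered byte differs, verification fails.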

Google notes that a subset of its Google-Agent requests will be authenticated under the domain `https://agent.bot.goog`. These signed requests include a `Signature-Agent` HTTP header set to `"https://agent.bot.goog"`. The corresponding signatures can be verified using public keys hosted in the domain’s `.well-known` directory.

According to Google, bot-detection services, CDNs, and WAFs already support the protocol. The IETF draft is co-authored by Thibault Meunier of Cloudflare and Sandor Major of Google. Cloudflare has also released a reference implementation on GitHub. The IETF Web Bot Auth Working Group was formally chartered in early 2026, with milestones for standards-track specifications and a best current practice document.

What Google Is Not Doing Yet

Participation is not universal. Google says it is testing the protocol with “some AI agents hosted on Google infrastructure,” but it has not named specific agents beyond the Google-Agent user-triggered fetcher. Even for participating agents, not every request is signed.

The documentation advises websites to continue using IP addresses, reverse DNS, and user-agent strings as their primary verification method while signed traffic is rolled out gradually. The Internet-Draft itself could change as the working group refines the standard.
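That primary method, forward-confirmed reverse DNS, can be sketched as follows. The domain suffixes match Google's published guidance for verifying Googlebot; other crawlers publish their own suffixes, so adapt the list accordingly.

```python
# Sketch of the existing reverse-DNS check the documentation says should
# remain primary: reverse-resolve the IP, check the domain, then resolve
# the hostname forward and confirm it maps back to the same IP.

import socket

# Suffixes from Google's verification guidance; other crawlers differ.
GOOGLE_SUFFIXES = (".googlebot.com", ".google.com", ".googleusercontent.com")

def hostname_is_google(hostname: str) -> bool:
    """True if the PTR hostname falls under a Google-owned domain."""
    return hostname.rstrip(".").endswith(GOOGLE_SUFFIXES)

def verify_googlebot(ip: str) -> bool:
    """Forward-confirmed reverse DNS check for a claimed Googlebot IP."""
    try:
        hostname, _, _ = socket.gethostbyaddr(ip)          # reverse (PTR) lookup
        if not hostname_is_google(hostname):
            return False
        return ip in socket.gethostbyname_ex(hostname)[2]  # forward confirmation
    except (socket.herror, socket.gaierror):
        return False                                       # unresolvable -> reject
```

The forward-confirmation step is what stops an attacker who controls their own PTR record from simply naming a host `fake.googlebot.com`.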

Why This Matters

Bot impersonation has long been a headache for site owners. Bad actors and scrapers can easily spoof user-agent strings to mimic Googlebot or other legitimate crawlers, making it difficult to distinguish real traffic from fake.

We have covered this problem before. Google’s Martin Splitt previously warned that “not everyone who claims to be Googlebot actually is Googlebot.” The existing verification methods (reverse DNS lookups and IP range checks) are useful but can be bypassed. Web Bot Auth adds a layer that cannot be forged without access to the agent’s private key.

For sites already using a CDN or WAF that supports the protocol, verification may happen automatically. For everyone else, the experimental status means there is no immediate urgency. The documentation recommends treating existing verification as the default and Web Bot Auth as a supplementary tool.

Looking Ahead

The protocol is still moving through the standards process, and Google’s implementation remains experimental. For now, the practical change is visibility. Websites may start seeing signed requests from some Google-Agent traffic, but existing verification methods remain the default.

The next big question is whether more AI agents will adopt signed requests, and whether hosting providers will make verification automatic for sites that do not want to manage cryptographic keys themselves.

(Source: Search Engine Journal)
