Cloudflare unveils registry format for bot and agent authentication
Cloudflare introduces a registry format enabling website operators to discover and validate cryptographic keys from bots and agents at scale, as automated traffic begins to sign its requests.
As bots and agents start cryptographically signing their requests, website operators face a growing challenge in discovering public keys to verify these digital identities. Cloudflare announced on October 30, 2025, a new registry format designed to solve this discovery problem, working in partnership with Amazon Bedrock AgentCore to enable scalable authentication of automated traffic across the internet.
The announcement builds on Cloudflare's Web Bot Auth protocol proposal shared in May 2025, which introduced cryptographic authentication for bot traffic using HTTP Message Signatures with public key cryptography. Multiple organizations have implemented the protocol, including Vercel, Shopify, and Visa. The new registry format extends this foundation by creating a lightweight mechanism for discovering and validating agent identities at scale.
According to Thibault Meunier and Maxime Guerreiro from Cloudflare, the discovery problem emerges as more operators sign their requests. Website administrators might locate public key material for well-known fetchers and crawlers, but finding keys for the next 1,000 or 1,000,000 agents presents significant operational challenges. The registry format addresses this by providing lists of URLs pointing to Signature Agent keys that anyone can maintain and host.
Amazon Bedrock AgentCore, a platform for building and deploying AI agents at scale, adopted Web Bot Auth for its AgentCore Browser service. The platform currently uses a service signing key available in public preview. AgentCore intends to transition to customer-specific keys once the protocol matures, enabling Cloudflare and other infrastructure operators to validate signatures from individual AgentCore customers rather than AgentCore as a monolithic entity.
The registry format operates similarly to existing IP address lists and robots.txt configurations that website operators already use. A registry contains URLs pointing to HTTP Message Signatures directories, creating a curated list of known signature agents. Examples include AI crawlers, academic research agents, or search agents. Anyone can maintain and host these registries on public file systems, GitHub repositories, Cloudflare R2 storage, or distribute them as email attachments.
The implementation uses a simple text format. A registry file might contain entries like "https://chatgpt.com/.well-known/http-message-signatures-directory" or "https://autorag.ai.cloudflare.com/.well-known/http-message-signatures-directory" along with URLs for other signature agents. This structure enables website operators to import keys from multiple registries simultaneously, with each tag checked independently.
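Based on the entries quoted above, such a registry file could be as simple as one directory URL per line. The first two entries below are the ones cited in the announcement; the third is hypothetical, added only to show how multiple agents would be listed:

```
https://chatgpt.com/.well-known/http-message-signatures-directory
https://autorag.ai.cloudflare.com/.well-known/http-message-signatures-directory
https://crawler.example.com/.well-known/http-message-signatures-directory
```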
Cloudflare provides a Go demonstration for the Caddy server showing how to import keys from multiple registries. The configuration lets operators specify registry URLs, and the system verifies signatures against all registered agents. The code is available in the cloudflare/web-bot-auth GitHub repository.
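The standalone Go sketch below illustrates the same two-step flow under stated assumptions: it is not Cloudflare's Caddy demo, the registry URL is hypothetical, and it treats each registry as a plain list of directory URLs that return RFC 7517 key sets.

```go
// Sketch: import public keys from one or more registries of
// HTTP Message Signatures directories. This is not Cloudflare's
// Caddy module (see cloudflare/web-bot-auth on GitHub); it only
// illustrates the two-step fetch described in the article, and the
// JWKS handling is simplified.
package main

import (
	"bufio"
	"encoding/json"
	"fmt"
	"net/http"
	"strings"
)

// jwks mirrors the basic RFC 7517 key-set shape; real directories
// may carry additional metadata.
type jwks struct {
	Keys []map[string]any `json:"keys"`
}

// fetchRegistry downloads a registry and returns its directory URLs,
// assuming one URL per non-empty line.
func fetchRegistry(url string) ([]string, error) {
	resp, err := http.Get(url)
	if err != nil {
		return nil, err
	}
	defer resp.Body.Close()

	var directories []string
	scanner := bufio.NewScanner(resp.Body)
	for scanner.Scan() {
		line := strings.TrimSpace(scanner.Text())
		if line != "" {
			directories = append(directories, line)
		}
	}
	return directories, scanner.Err()
}

// fetchDirectory downloads a signature-agent key directory as a JWKS.
func fetchDirectory(url string) (*jwks, error) {
	resp, err := http.Get(url)
	if err != nil {
		return nil, err
	}
	defer resp.Body.Close()

	var set jwks
	if err := json.NewDecoder(resp.Body).Decode(&set); err != nil {
		return nil, err
	}
	return &set, nil
}

func main() {
	// Hypothetical registry URL; substitute the registries you trust.
	registries := []string{"https://example.com/bot-registry.txt"}
	for _, reg := range registries {
		dirs, err := fetchRegistry(reg)
		if err != nil {
			fmt.Println("registry fetch failed:", err)
			continue
		}
		for _, dir := range dirs {
			set, err := fetchDirectory(dir)
			if err != nil {
				fmt.Println("directory fetch failed:", err)
				continue
			}
			fmt.Printf("%s: %d key(s)\n", dir, len(set.Keys))
		}
	}
}
```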
Beyond simple key discovery, website operators need additional information about agents accessing their infrastructure. The announcement introduces a signature-agent card format that extends the JWKS-based key directory (RFC 7517 JSON Web Key Sets) with additional metadata. This format functions as a digital contact card, containing operator names, contact methods, logos, expected crawl rates, and other relevant information.
The signature-agent card includes fields specifying client names, client URIs, logo URIs, contact addresses, expected user agents, RFC 9309 compliance indicators, trigger types, purpose declarations, targeted content descriptions, rate control mechanisms, rate expectations, known URLs, and cryptographic keys with validity periods. These self-certified metadata fields help website operators make informed decisions about which agents to allow.
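As a rough illustration, a card could be modeled with structures like the following. The JSON field names are guesses derived from the article's description, not the published schema; the specification and the cloudflare/web-bot-auth repository should be treated as authoritative.

```go
// Illustrative Go model of a signature-agent card. Field names are
// assumptions based on the metadata listed in the article, not the
// official specification.
package card

// SignatureAgentCard carries self-certified metadata about an agent
// alongside its public keys.
type SignatureAgentCard struct {
	ClientName         string       `json:"client_name"`          // operator-facing name
	ClientURI          string       `json:"client_uri"`           // operator homepage
	LogoURI            string       `json:"logo_uri"`
	Contacts           []string     `json:"contacts"`             // e.g. mailto: addresses
	ExpectedUserAgents []string     `json:"expected_user_agents"`
	RFC9309Compliant   bool         `json:"rfc9309_compliant"`    // respects robots.txt directives
	Trigger            string       `json:"trigger"`              // e.g. user-initiated vs. scheduled
	Purpose            string       `json:"purpose"`
	TargetedContent    string       `json:"targeted_content"`
	RateControl        string       `json:"rate_control"`
	RateExpectation    string       `json:"rate_expectation"`
	KnownURLs          []string     `json:"known_urls"`
	Keys               []JSONWebKey `json:"keys"` // RFC 7517 keys with validity periods
}

// JSONWebKey is a minimal Ed25519 JWK; the not-before/expiry fields
// are assumed here to express the validity periods the article mentions.
type JSONWebKey struct {
	Kty string `json:"kty"`
	Crv string `json:"crv"`
	X   string `json:"x"`
	Nbf int64  `json:"nbf,omitempty"` // not-before, Unix seconds (assumed)
	Exp int64  `json:"exp,omitempty"` // expiry, Unix seconds (assumed)
}
```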
The first protocol proposal suggested bot operators would provide a newly defined HTTP header called Signature-Agent, pointing to an HTTP endpoint hosting their keys. An example from Shopify's online store shows "Signature-Agent: https://shopify.com" in the request header. This approach defaults to allowing all traffic, but operators can adjust rate limits or contact specific agents that make excessive requests.
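For illustration, a signed request carrying this header might look roughly like the following. The path, host, and the Signature-Input and Signature values are placeholders; the exact covered components and parameters are defined by the Web Bot Auth drafts rather than shown in the announcement.

```
GET /collections/all HTTP/1.1
Host: shop.example
Signature-Agent: https://shopify.com
Signature-Input: sig1=("@authority" "signature-agent");created=1761782400;keyid="example-key-id";alg="ed25519"
Signature: sig1=:BASE64_ED25519_SIGNATURE_PLACEHOLDER:
```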
Cloudflare offers a registry through Radar containing bots and agents the company trusts. This registry uses the standard format, enabling consumption of Cloudflare-trusted bots on any server infrastructure. Other organizations can create specialized registries, categorizing agents based on specific criteria or use cases. Website operators can select registries aligned with their security policies and business requirements.
Operating a registry involves several approaches. Operators can monitor incoming Signature-Agents to collect signature-agent cards of agents accessing their domains. They can import agents from existing registries and categorize them according to internal policies. Some organizations establish direct relationships with agents, as Cloudflare does for its bot registry. Others learn from users by allowing customers to specify which registries or signature-agents they want to permit, generating valuable insights about trusted traffic patterns.
The technical architecture separates the registry format from the signature-agent card format. Registries maintain lists of URLs where signature-agent cards can be found. The cards themselves contain the cryptographic keys and metadata. This separation allows flexible curation ecosystems where different organizations can maintain registries serving different needs while using a common format for agent information.
For website operators without the scale and reach of large CDN providers, discovering public keys of known crawlers has been challenging. The registry format creates an open curation ecosystem that doesn't lock customers or small origins into specific vendor relationships. This mirrors existing ecosystems for IP address lists and robots.txt configurations, where canonical lists are published on the internet for easy import into websites.
The system addresses limitations in traditional bot identification methods. Historically, agent traffic has been classified using user agent strings and IP addresses. These fields can be spoofed easily, leading to inaccurate classifications. Cryptographic authentication through Web Bot Auth provides stable identifiers that are much harder to falsify, shifting from brittle identification to trustworthy authentication.
Cloudflare processes over one billion 402 response codes daily, demonstrating existing demand for payment-required responses among content creators seeking compensation for AI training data usage. This scale illustrates the magnitude of automated traffic across the internet and the importance of distinguishing legitimate agents from malicious bots. The registry format helps operators make these distinctions based on cryptographic verification rather than easily spoofed identifiers.
The company announced it would integrate these capabilities into its bot management and rule engines in the future. Cloudflare predicts that clients and origins will choose signature-agents they trust, use a common format to migrate their configuration between CDN providers, and rely on third-party registries for curation. This prediction reflects a vision of an open ecosystem where multiple participants can contribute to agent authentication without creating vendor lock-in.
Anonymous Credentials technology was also mentioned as a complementary approach for rate-limiting bots and agents without compromising privacy. This research direction explores how to manage agent traffic and block abuse without tracking individual users, addressing privacy concerns while maintaining security.
The timing of this announcement reflects broader industry concerns about bot traffic and authentication. Research analyzing over a petabyte of web traffic data across more than two million websites over seven years found that at least 40% of web traffic consists of fake users or computerized bots, costing advertisers billions in wasted spending. AI-powered crawlers emerged as a growing source of invalid traffic in 2024, with General Invalid Traffic increasing 86% year-over-year in the second half of 2024, according to DoubleVerify's analysis.
For the marketing community, these developments matter because agentic AI threatens traditional advertising models while creating new authentication challenges. Cloudflare previously partnered with Visa and Mastercard in October 2025 to develop security protocols for automated commerce, leveraging Web Bot Auth as the foundation. Both payment networks developed protocols—Visa's Trusted Agent Protocol and Mastercard's Agent Pay—enabling merchants to distinguish legitimate AI shopping agents from malicious bots through cryptographic verification.
The company has been expanding its AI-related infrastructure throughout 2025. Cloudflare launched pay-per-crawl services on July 1, 2025, allowing content creators to charge AI crawlers for access using HTTP response code 402. The company introduced Robotcop on December 10, 2024, providing network-level enforcement for robots.txt policies. Earlier reporting from July 2024 revealed that AI bots accessed approximately 39% of the top one million internet properties using Cloudflare's services.
The registry format enables multiple use cases. Website operators can monitor incoming Signature-Agents to collect cards automatically. They can import agents from existing registries and categorize them according to business requirements. Some organizations may establish direct relationships with specific agents, creating verified channels for automated traffic. Others can learn from their user base by enabling customers to specify which registries they trust, generating insights about traffic patterns and agent behavior.
Security considerations remain paramount in the design. The signature-agent card format includes fields for RFC 9309 compliance, indicating which robots.txt directives the agent respects. Rate control mechanisms and rate expectations help operators set appropriate limits. Contact information enables direct communication when issues arise. The cryptographic keys include validity periods, allowing operators to reject expired credentials automatically.
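A minimal sketch of that last check, assuming the card exposes not-before and expiry timestamps as Unix seconds (field semantics assumed, not taken from the specification):

```go
// Sketch: reject keys outside their validity window before using them
// for signature verification. Timestamp semantics are assumptions.
package main

import (
	"fmt"
	"time"
)

// keyCurrentlyValid returns false for keys that are not yet valid or
// have expired. notBefore and expiry are Unix seconds; zero means unset.
func keyCurrentlyValid(notBefore, expiry int64, now time.Time) bool {
	t := now.Unix()
	if notBefore != 0 && t < notBefore {
		return false // key not yet valid
	}
	if expiry != 0 && t >= expiry {
		return false // key expired: reject automatically
	}
	return true
}

func main() {
	// Example: a key valid for calendar year 2025 (illustrative values).
	nbf := time.Date(2025, 1, 1, 0, 0, 0, 0, time.UTC).Unix()
	exp := time.Date(2026, 1, 1, 0, 0, 0, 0, time.UTC).Unix()
	fmt.Println(keyCurrentlyValid(nbf, exp, time.Now()))
}
```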
The technical implementation uses Ed25519 key pairs for generating HTTP Message Signatures, a cryptographic algorithm providing strong security with relatively small key sizes. The signature process involves loading API endpoints and private signing credentials, generating required signature headers, attaching these headers to requests, and forwarding fully signed requests to protected APIs. Verification requires accessing the signature-agent card to retrieve the public key, then validating the signature against the request contents.
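The sketch below shows the underlying Ed25519 mechanics using Go's standard library. The signature base here is a simplified stand-in for RFC 9421's canonicalization, and the component values are illustrative; it demonstrates only the sign-then-verify round trip the paragraph describes.

```go
// Sketch: sign a request-derived byte string with an Ed25519 private
// key, then verify it with the public key a relying party would fetch
// from the agent's signature-agent card.
package main

import (
	"crypto/ed25519"
	"crypto/rand"
	"encoding/base64"
	"fmt"
)

func main() {
	// Agent side: generate (or load) an Ed25519 key pair.
	pub, priv, err := ed25519.GenerateKey(rand.Reader)
	if err != nil {
		panic(err)
	}

	// Simplified stand-in for the RFC 9421 signature base covering
	// selected request components (illustrative values only).
	signatureBase := []byte(`"@authority": api.example.com` + "\n" +
		`"signature-agent": https://agent.example.com`)

	// Sign and place the result in a header such as Signature.
	sig := ed25519.Sign(priv, signatureBase)
	fmt.Println("Signature (base64):", base64.StdEncoding.EncodeToString(sig))

	// Origin side: rebuild the same signature base from the incoming
	// request and verify it with the public key from the agent's card.
	fmt.Println("Verified:", ed25519.Verify(pub, signatureBase, sig))
}
```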
Cloudflare is hiring 1,111 interns over the next year and has open positions for engineers interested in working on these systems. The company encourages experimentation with the demo available on GitHub for developers wanting to implement the registry format in their own infrastructure.
The long-term vision anticipates an ecosystem where signature-agent cards become the standard format for bot and agent identification. Website operators would consume registries from trusted curators, similar to how they currently consume IP address lists or security threat feeds. The common format would enable migration between CDN providers without requiring reconfiguration, reducing vendor lock-in and increasing competition.
For smaller website operators, the registry format democratizes access to sophisticated bot authentication. Organizations without dedicated security teams can import registries maintained by trusted third parties rather than managing individual agent relationships. This approach mirrors how smaller websites currently rely on security vendors for threat intelligence rather than conducting independent research.
The announcement represents a step toward standardizing how bots and agents identify themselves on the internet. As automated traffic continues growing—driven by AI crawlers, shopping agents, and other automated systems—the need for robust authentication mechanisms increases. The registry format provides a foundation for this authentication while maintaining an open, decentralized ecosystem that doesn't favor specific vendors or large players.
Timeline
- May 2025: Cloudflare introduces Web Bot Auth protocol proposal for cryptographic bot authentication
- July 1, 2025: Cloudflare launches pay-per-crawl service for content monetization
- July 22, 2025: DoubleVerify reports 101% increase in bot fraud alongside 86% rise in General Invalid Traffic
- August 2025: Amazon blocks AI crawlers from major tech companies including Anthropic, OpenAI, Meta, and Google
- October 24, 2025: Cloudflare announces partnerships with Visa and Mastercard for Trusted Agent Protocol and Agent Pay
- October 30, 2025: Cloudflare announces registry format for bot and agent discovery with Amazon Bedrock AgentCore partnership
Summary
Who: Cloudflare, in collaboration with Amazon Bedrock AgentCore, announced the registry format. Authors Thibault Meunier and Maxime Guerreiro detailed the technical implementation. Amazon Bedrock AgentCore serves as the initial partner implementing the system at scale.
What: Cloudflare introduced a registry format for discovering and validating bots and agents that cryptographically sign their requests. The system includes a lightweight registry format listing URLs to signature-agent cards, which contain public keys and metadata about automated traffic sources. The format extends the JWKS directory specification with additional fields including operator information, contact methods, rate expectations, and compliance indicators.
When: Cloudflare announced the registry format on October 30, 2025. The announcement builds on the Web Bot Auth protocol proposal shared in May 2025. Amazon Bedrock AgentCore adopted Web Bot Auth for its AgentCore Browser service in public preview, with plans to transition to customer-specific keys as the protocol matures.
Where: The system operates across Cloudflare's global network infrastructure, which processes over one billion 402 response codes daily. The registry format can be implemented by any website operator or infrastructure provider. Demonstration code is available on GitHub at cloudflare/web-bot-auth. Registries can be hosted on any public file system, including GitHub repositories or cloud storage services.
Why: As bots and agents begin cryptographically signing their requests using Web Bot Auth, website operators need methods to discover public keys for verification at scale. Traditional identification methods using IP addresses and user agent strings can be easily spoofed. The registry format addresses this discovery problem by creating an open ecosystem where curators can maintain lists of trusted agents. For the marketing community, this matters because automated traffic continues increasing—with research showing 40% of web traffic consists of fake users or bots—costing advertisers billions. The authentication framework helps distinguish legitimate agents from malicious bots while enabling new business models like agentic commerce.