Cloudflare will now block AI crawlers from accessing its customers' web content by default, unless site owners grant permission.

The content delivery and cybersecurity network claims to handle around 20% of global internet traffic, and the firm is introducing more granular options for site owners to control which AI bots can take information from their pages.

Site owners will have to opt in before crawlers, which analyse web pages, can extract data and content from their sites. AI companies will also need to clearly identify whether their bots are collecting data for training, inference, or search purposes.

In addition, Cloudflare has launched a new initiative, called Pay Per Crawl, under which AI services pay for access to these pages.

“For the Internet to survive the age of AI, we need to give publishers the control they deserve,” said Cloudflare CEO Matthew Prince. “Our goal is to put the power back in the hands of creators while still helping AI companies innovate.”

The traditional web model, where search engines drive traffic to sites in exchange for indexing their content, is being disrupted by AI systems that extract text, images, and code without directing users to the source. Cloudflare says this undermines the advertising and subscription revenue that funds journalism and online creativity.

The change has drawn strong support from publishers and content platforms. “Cloudflare’s approach sets a new standard for how content is respected online,” said Roger Lynch, CEO of Condé Nast. “It opens the door to sustainable innovation built on permission and partnership.”

Other supporters include Dotdash Meredith, Gannett Media, Pinterest, Reddit, and TIME, all of whom have expressed concern over the unauthorised use of their content by AI firms.

Cloudflare had previously offered a one-click tool to block AI crawlers, which has been used by over one million customers. The latest update goes a step further by making control the default rather than the exception.

The company is also contributing to a new protocol that will allow AI bots to reliably identify themselves and make it easier for sites to determine which crawlers to allow.

“This is about safeguarding the future of a free and vibrant Internet,” said Prince. “Creators deserve transparency, choice, and fair value for their work.”
