Cloudflare Expands Project Galileo with Free Tools to Block Unwanted AI Scraping
Cloudflare expands Project Galileo to let nonprofits and independent media control AI crawlers and bots. PR teams can set rules, block abuse, and protect traffic and credit.

Cloudflare Expands Project Galileo: PR Teams Get New Controls Over AI Access to Content
Cloudflare is expanding Project Galileo to help nonprofit organizations and independent media monitor and control how AI services access content on their websites. The move gives participants practical levers to manage bots and AI crawlers, reducing unauthorized use of their work.
The program now covers 750 journalists, independent newsrooms, and nonprofits that support global newsgathering. Eligible participants will gain access to Bot Management and AI Crawl Control features to set clear rules for AI traffic and keep abusive or mislabeled crawlers out.
As more people rely on AI summaries, fewer visit the original source. That hurts attribution, trust, donations, and ad revenue, all core concerns for PR and communications leaders.
"I believe in journalism, and I believe that local and independent news accuracy is very important to the internet and a healthy society," said Matthew Prince, Cloudflare co-founder and CEO. "Now, this vision is widespread and we want to ensure the evolution of AI goes according to their wishes, not the other way around."
What's included in the expansion
- AI Crawl Control and Bot Management to allow, rate limit, or block specific AI crawlers (a robots.txt sketch follows this list).
- Visibility into AI and bot traffic patterns to spot misuse and enforce your policy.
- No-cost access for qualifying nonprofits and independent media through Project Galileo.
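To make that kind of policy concrete, here is a minimal robots.txt sketch that disallows common AI training crawlers while leaving ordinary search indexing alone. This is an illustration under assumptions, not Cloudflare's configuration: robots.txt is an advisory signal that well-behaved crawlers honor, while AI Crawl Control and Bot Management enforce the same intent at the network edge. The user-agent tokens shown are ones the respective vendors publish; confirm current names in their documentation before relying on them.

```
# Illustrative policy: disallow common AI training crawlers,
# leave general search indexing untouched.
# Token names are published by vendors and may change; verify before use.

User-agent: GPTBot
Disallow: /

User-agent: ClaudeBot
Disallow: /

User-agent: CCBot
Disallow: /

User-agent: Google-Extended
Disallow: /

# All other crawlers keep default access.
User-agent: *
Allow: /
```

Note that Google-Extended is a control token rather than a separate crawler: it governs whether content Googlebot already fetches can be used for AI training, so it will not appear as a user agent in your logs.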
Why this matters for PR and Communications
- Attribution risk: AI answers often omit links, diluting brand credit and source authority.
- Traffic impact: Fewer clicks to owned channels reduce campaign impact and conversions.
- Message integrity: Models may learn from outdated or out-of-context pages, increasing misinformation risk.
- Negotiation leverage: Clear access controls strengthen your position for licensing or partnership talks.
Actions to take now
- Check eligibility and apply: Project Galileo provides protection and tools at no cost to qualifying organizations; apply through the Project Galileo page.
- Set an AI access policy: Decide which AI crawlers you will allow, rate limit, or block. Document the rationale and align with legal and leadership.
- Update site rules: Use AI Crawl Control and Bot Management to enforce your policy. Keep robots.txt and meta directives consistent with your stance (see the robots.txt sketch above).
- Publish a public "AI use of content" statement: Clarify acceptable use, licensing terms, and contact details for requests.
- Monitor and report: Track AI/bot traffic, unusual spikes, and suspected abuse, and share a monthly summary with leadership and editorial (a log-review sketch follows this list).
- Prepare an escalation playbook: Include detection thresholds, response steps, legal review, and comms templates for outreach or public statements.
- Measure what matters: Watch referral traffic from search/aggregators, branded search demand, time on page, and source citations in coverage.
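For teams that want a quick, tool-agnostic starting point for that monitoring habit, the sketch below tallies requests from known AI crawlers in a standard combined-format access log. The log path and the user-agent token list are assumptions for illustration; Cloudflare's own analytics and log exports will give richer, verified-bot data.

```python
# Minimal sketch: count AI-crawler hits in a combined-format access log.
# The log path and token list are assumptions; substitute your own sources.
from collections import Counter
import re

LOG_PATH = "access.log"  # hypothetical path to a combined-format log

# Commonly published AI crawler user-agent tokens; verify current names
# against each vendor's documentation before relying on them.
AI_TOKENS = ["GPTBot", "ClaudeBot", "CCBot", "PerplexityBot"]

# In the combined log format, the user agent is the final quoted field.
UA_PATTERN = re.compile(r'"([^"]*)"\s*$')

counts = Counter()
with open(LOG_PATH, encoding="utf-8", errors="replace") as log:
    for line in log:
        match = UA_PATTERN.search(line)
        if not match:
            continue
        user_agent = match.group(1)
        for token in AI_TOKENS:
            if token in user_agent:
                counts[token] += 1
                break

for token, hits in counts.most_common():
    print(f"{token}: {hits} requests")
```

Run against a day's log and compare the totals with what AI Crawl Control reports; large gaps can point to mislabeled or spoofed crawlers worth feeding into your escalation playbook.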
How to frame this externally
- Lead with mission: Protect journalists and community reporting while staying open to responsible AI collaboration.
- Be precise: You welcome partnerships; you do not accept unapproved scraping or unlabeled crawlers.
- Reinforce trust: Direct readers to the original source for context, updates, and corrections.
Background
Launched in 2014, Project Galileo was built to shield journalists, human rights defenders, and vulnerable groups from online attacks. Earlier this year, Cloudflare introduced tools to help publishers and creators control and monitor AI access. With this expansion, those controls become available to more nonprofits and independent media at no charge.
Next steps
- Confirm eligibility and apply to Project Galileo: Cloudflare Project Galileo.
- Level up team capabilities: Build internal literacy on AI policy, compliance, and tooling. Explore role-based options for PR pros via Complete AI Training.