AI To Produce 90% Of News By 2026? Separating Viral Predictions From Reality
In 2022, a widely shared prediction claimed that 90% of online content would be AI-generated by 2026, sparking alarm across media companies. Now, halfway through 2025, media executives have a clearer view of what proved accurate and what was hype.
The original claim was traced back to a report by Europol, but that report never made such a forecast; it focused on how deepfake technology affects law enforcement and crime. The misattribution shows how quickly misinformation can spread. Yet the false alarm had a silver lining: it prompted many newsrooms to develop thoughtful AI strategies, improving their operations through measured AI integration rather than a rush toward full automation.
What The Real Research Shows
Though the “90% AI content” prediction was inaccurate, legitimate studies confirm the rise of synthetic content online. Research from Amazon Web Services indicates that about 57% of web-based text is already AI-generated or AI-translated. This trend spans social media, e-commerce, and gaming platforms. The flood of low-quality synthetic content makes identifying original human work more challenging and validates the emphasis on authenticity and verification processes adopted by many media outlets since 2022.
What The Predictions Missed: Newsroom Implementation Reality
The viral prediction also wrongly implied that AI would largely replace journalists, fueling unnecessary panic about job losses. In practice, newsrooms have found AI works best as a tool to assist journalists, not replace them. The focus remains on improving operational efficiency, with human oversight preserving editorial quality.
The Newsroom AI Adoption Pattern That Actually Emerged
Since the debut of ChatGPT, successful AI integration in newsrooms has followed clear patterns. Here are some leading examples:
- Content Versioning at Scale: Morgan Murphy Media developed “AskMorgan,” an AI platform that automates converting broadcast scripts into web articles, social posts, push alerts, and newsletters. This system frees journalists from repetitive content repackaging, allowing more focus on original reporting.
- Data Analysis: Brazilian journalists discussed using human-supervised AI tools at the Data Day conference in São Paulo to analyze datasets and generate insights, enhancing investigative reporting.
- Democratized AI Access: Graham Media Group’s “Spark” platform provides AI tools to all employees, supporting tasks from sales proposals to SEO headline optimization.
- Production Efficiency: Hubbard Broadcasting uses AI-powered teleprompters, closed captioning, and robotic cameras, achieving significant cost savings after upfront investments.
- Newsroom Communication: Tegna’s “Project Spotlight” uses AI to filter incoming station communications, helping journalists focus on relevant stories by eliminating up to 90% of irrelevant emails.
- Personalization: BuzzFeed employs AI to personalize content and boost efficiency, with CEO Jonah Peretti emphasizing AI’s role in enhancing human creativity rather than replacing it.
In these cases, AI acts as a productivity enhancer under human supervision, ensuring accuracy and ethics remain central.
The Consumer Behavior That Predictions Didn't Anticipate
Another overlooked factor was how audiences would respond to AI-generated content. Research from Futuri’s AI in Media Study shows 45% of local TV news viewers believe AI could help present better stories. Meanwhile, 79% would trust AI-generated content if it’s clearly sourced from a trusted local news outlet and delivered by a human.
Despite financial pressures faced by local media, there remains a strong need for human storytellers to maintain community trust and accountability. This consumer skepticism has naturally limited the push toward fully automated journalism and supports the augmentation-over-automation approach many successful newsrooms have adopted.
Strategic Lessons For Media Executives
Three years of AI experimentation reveal that successful media companies avoid full automation and instead deploy AI selectively. For example, Morgan Murphy prohibits its journalists from using AI tools to create original content; instead, AI handles content versioning so journalists are free for deeper reporting. This approach balances productivity gains with editorial integrity and cost control.
Organizations thriving in 2025 adopt AI to improve operational efficiency while relying on human expertise for editorial decisions, source relationships, and investigations. With two-thirds of Americans concerned about AI-driven misinformation, according to Pew Research, the market still demands human-verified news.
In hindsight, the viral but false “90% AI content” prediction served a useful purpose. It acted as an early warning that motivated newsrooms to prepare strategically. The key takeaway for media executives is to treat AI as a tool for enhancing human work—not as a replacement—and always verify the accuracy of your sources.