Poynter breaks down how its AI plagiarism investigation came together and what it means for journalism

AI startup Nota shut down its network of local news sites after a Poynter investigation found it had copied local journalists' work without attribution. The scandal has deepened distrust of AI tools among newsrooms already wary of the technology.

Published on: Apr 17, 2026

AI plagiarism scandal at Nota raises questions about journalism's trust in artificial intelligence

An AI company called Nota, which served clients including The Boston Globe and the Institute for Nonprofit News, shut down its network of local news sites after dozens of instances of plagiarism were discovered. The revelation has rattled the journalism and AI communities and exposed the risks of deploying AI systems without adequate oversight.

Nota's collapse came after a Poynter investigation revealed the company had scraped local journalists' work and republished it without attribution. The story prompted broader questions about how newsrooms should evaluate and implement AI tools.

How the story unfolded

Poynter's reporting on Nota examined how the plagiarism was discovered, what prompted the company's shutdown, and how the industry has responded. The investigation included interviews with people familiar with Nota's operations and analysis of the copied content.

Alex Mahadevan, MediaWise director at Poynter, said the scandal complicates his work training newsrooms on responsible AI use. "It rightfully makes journalists even more suspicious of this technology and distrustful of the companies that are employing it within news organizations," Mahadevan said. "The biggest takeaway for me is that this could have worked."

Damage to AI-journalism relationships

The Nota case illustrates a critical gap: AI companies entering journalism without sufficient safeguards or transparency. When systems designed to fill news deserts instead plagiarize local reporting, it deepens skepticism about AI's role in newsrooms.

The incident comes as newsrooms increasingly adopt AI for reporting, editing, and content distribution. Writers and editors evaluating these tools should scrutinize how companies handle attribution, source material, and ethical boundaries; part of understanding AI as a writer is learning to identify these red flags before implementation.

Newsrooms weighing AI adoption should demand transparency about how systems process existing journalism and what safeguards prevent plagiarism or unauthorized republication.

What comes next

Nota's shutdown doesn't end questions about how AI companies should operate in journalism. Industry discussions are ongoing about standards for AI transparency, source attribution, and ethical deployment in newsrooms.

For writers and editors, the lesson is clear: AI tools require the same editorial scrutiny as any other reporting method. Verify how systems work, who controls the output, and whether they respect intellectual property and journalistic ethics.

