SMI trains Lagos journalists on responsible AI use in newsrooms

Lagos journalists received hands-on AI training from the Safer Media Initiative, targeting a growing skills gap in the profession. The central warning: AI won't replace journalists, but journalists who ignore it will lose work to those who embrace it.

Published on: Mar 31, 2026

The Safer Media Initiative trained journalists in Lagos on using AI tools effectively and safely, addressing what recent data shows is a significant knowledge gap across the profession.

Peter Iorter, executive director of SMI, said technology is reshaping journalism faster than many practitioners realize. "Even traditional media has no option but to adapt, otherwise it risks being left out of the media business," he said.

The core message was direct: AI will not eliminate journalism jobs, but it will eliminate journalists who refuse to learn it. "AI will not take your job, but it will take the job of journalists who refuse to embrace it and give it to those who have adopted it," Iorter said.

Verification remains non-negotiable

Titilope Fadare Oparinde, founder of Generative AI Journalism with Titi, outlined specific practices for responsible use. The principle is simple: maintain human editorial control over everything AI produces.

Verification is where responsibility begins. AI research tools frequently cite outdated or fabricated sources. Oparinde said journalists must double-check all data before publishing, label AI-generated images, and disclose where AI assisted in the work.

She warned against uploading sensitive materials into public AI tools. Confidential transcripts, unpublished investigations, source communications, and leaked documents should never enter these systems. "You use AI to work faster without cutting corners on verification," Oparinde said. "You protect your sources even from tools that promise to be helpful."

The accountability question

Transparency matters because your byline carries the responsibility, not the tool. Journalists remain accountable for accuracy, sourcing, and ethics regardless of which parts of the process involved AI.

This training reflects a broader shift: the question is no longer whether journalists will use AI, but how they'll use it without compromising the standards that define the profession.

Understanding prompt engineering and how generative AI and LLM systems work helps journalists make informed decisions about when these tools add value and when human judgment must override them.
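In practice, prompt engineering for newsroom work often means keeping editorial guardrails in a fixed system instruction, separate from each reporter's request. The sketch below illustrates that pattern only; the names `build_prompt` and `NEWSROOM_RULES` are hypothetical and not tied to any vendor's API, though the message format follows the common chat-style convention of role/content pairs.

```python
# A minimal sketch of a newsroom prompt template. NEWSROOM_RULES and
# build_prompt are illustrative names, not part of any real library.

NEWSROOM_RULES = (
    "You are a research assistant for a newsroom. "
    "Cite a source for every factual claim, say 'unverified' when you "
    "cannot, and never invent quotes, names, or statistics."
)

def build_prompt(task: str, context: str = "") -> list[dict]:
    """Assemble a chat-style prompt that keeps editorial guardrails
    in the system message, separate from the journalist's request."""
    messages = [{"role": "system", "content": NEWSROOM_RULES}]
    if context:
        # Only verified, non-confidential material belongs here --
        # never source communications or unpublished investigations.
        messages.append(
            {"role": "user", "content": f"Background material:\n{context}"}
        )
    messages.append({"role": "user", "content": task})
    return messages

prompt = build_prompt(
    "Summarize the key figures in the council budget report.",
    context="(paste verified, non-confidential text here)",
)
for message in prompt:
    print(message["role"])
```

Keeping the rules in the system message rather than retyping them per request makes the guardrails consistent across the newsroom, but the output still requires the same human verification the trainers described.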

