Editors warn: AI boosts speed, but only reporters bring human judgment
Senior journalists gathered in Delhi on Friday to discuss how newsrooms can use AI without losing editorial integrity. The consensus: AI handles routine work efficiently, but reporting still requires human judgment.
At a conference organized by the Editors Guild of India, speakers outlined where AI adds value and where it falls short. Sanjay Kapoor, the guild's president, said plainly: "Only reporters can bring human voices and emotions."
Speed versus nuance
Pradeep Gairola, Chief Digital Business Officer at The Hindu, described AI as useful for verifying data and removing routine tasks. Automating these jobs frees journalists to focus on ground-level reporting instead of administrative work.
But cultural context matters. Bilal Bhat, editor at ETV Bharat, noted that most AI systems are trained on Western datasets and miss the nuance required for Indian reporting. He said human-led fact-checking remains essential for original content.
Tools, not replacements
Sanket Upadhyay, founder of Double Check, called AI a "brilliant tool" for independent journalists. He cautioned against blind reliance: "One must constantly evaluate the merits and demerits of AI."
Gairola added a direct warning: "Machines should not control us." The message across the conference was consistent: AI works best when journalists remain in control of editorial decisions.