Credibility Over Hype: Legal Video Producer Rejects AI Injury "Enhancement"
NEW YORK, NY - February 20, 2026. A leading U.S. producer of personal injury "day in the life" videos and legal settlement documentaries refused a law firm's request to use AI to make a client's injuries look more severe.
"There's nothing more important than credibility," he said. "Even if it leads to an unhappy law firm."
Why this matters for litigators
Day-in-the-life videos exist to document, not dramatize. Any AI-driven alteration risks authentication challenges, impeachment, and a credibility hit that bleeds into the rest of your case.
Expect scrutiny on authenticity and process. Opposing counsel can attack altered footage under FRE 901 (Authentication), and judges may weigh prejudice under FRE 403. Ethics rules add another layer: misrepresentative edits can implicate duties of candor and honesty under ABA Model Rule 3.3 and related provisions.
"AI has no place in legal video production"
"Artificial Intelligence has absolutely no place in legal video production," said Andrew Colton, an award-winning former network news correspondent who works with more than 200 attorneys and law firms nationwide. He says his legal video work aims to credibly document injuries to reach appropriate settlements or judgments, and that using AI to exaggerate harm is out of bounds.
He reports refusing a firm's request to "enhance" injuries with AI, even if it meant negative chatter on attorney listservs. The stance aligns with a message he says he's delivered for more than a decade: hire a communication professional to produce day-in-the-life and settlement documentaries; treating this like routine deposition recording invites problems.
"There are law firms out there that utilize so-called CLVS legal videographers for one of the most important elements of a personal injury case," he said. "A legal videographer is someone who records depositions. That's not the person you want documenting someone's personal moments like catheter maintenance, bowel program, or amputation aftermath."
Practical guidance for your firm
- Adopt a written policy: no AI-generated or AI-altered depictions of injury, period. Limit edits to clarity (color balance, exposure, stabilization) and disclose them.
- Define "permissible edits" vs. "substantive alterations." Prohibit reenactments unless clearly labeled, documented, and consented to.
- Demand chain-of-custody documentation: time-stamped logs from capture to export, stored originals, and cryptographic hashes for both source and final files.
- Require vendor attestations that no AI tools were used to alter appearance, movement, sounds, or scenes. Obtain project files upon request.
- Authenticate proactively: include capture device details, metadata reports, and an editing memo for potential evidentiary hearings.
- Train your team on deepfake risks and how to spot synthetic artifacts in audio and video.
- Vet specialists: day-in-the-life documentaries are communication and storytelling; choose producers with documentary credentials and medical sensitivity, not just deposition experience.
- Prepare for disclosure: if any non-substantive enhancement was applied, be ready to explain it simply, consistently, and with documentation.
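The hashing step in the chain-of-custody item above can be automated. Below is a minimal Python sketch, assuming a JSON-lines log file (the name `custody_log.jsonl` and the helper names are illustrative, not part of any vendor's workflow): it computes a SHA-256 digest for each media file and appends a time-stamped record, so that source and final exports can later be matched against the log.

```python
import hashlib
import json
from datetime import datetime, timezone
from pathlib import Path


def sha256_of_file(path: str, chunk_size: int = 1 << 20) -> str:
    """Compute the SHA-256 digest of a file, reading in chunks
    so large video files never need to fit in memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()


def log_custody_entry(path: str, log_path: str = "custody_log.jsonl") -> dict:
    """Append a time-stamped hash record for one media file
    to a JSON-lines custody log, and return the record."""
    entry = {
        "file": str(Path(path).resolve()),
        "sha256": sha256_of_file(path),
        "recorded_utc": datetime.now(timezone.utc).isoformat(),
    }
    with open(log_path, "a") as log:
        log.write(json.dumps(entry) + "\n")
    return entry
```

Running `log_custody_entry` once at capture and again at final export yields two records whose digests should differ only if the file actually changed, which is exactly the comparison an authentication fight under FRE 901 turns on.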
Building or updating your internal AI policy? See AI for Legal for guidance on responsible adoption and risk controls.
Who he is and what he does
Andrew Colton operates nationwide as a one-man band. His video documentaries have helped secure major, record, and landmark settlements and judgments across personal injury, traumatic brain injury, wrongful death, truck accidents, medical malpractice, and other serious cases.
Through Colton Legal Media, he produces Personal Injury Day In The Life Video, Legal Video, and Legal Settlement Documentary productions, without domestic travel fees. He works in person across the United States and internationally.
His bottom line stays the same: keep the camera honest. "Artificial Intelligence has absolutely no place in legal video production."