New Brunswick shelves AI-made NB Liquor ad after backlash: practical takeaways for public agencies
New Brunswick has pulled an AI-generated NB Liquor ad after complaints from the arts community and the public. The video showed guests arriving for a holiday party, but viewers flagged telltale signs: flawless lighting and bottle labels that read like gibberish.
Luke Randall, the minister responsible for NB Liquor, asked that the ad be paused. "I wanted to respect those voices," he said, noting a clear preference to hire New Brunswick actors in future campaigns.
The flashpoint: authenticity and local talent
The creative looked synthetic at a glance, and unmistakably so on a second look. That was enough to spark criticism, not just of the visuals but of why local performers weren't part of the work.
Randall called generative AI an "ongoing national issue" and said he'll keep talking with NB Liquor about how it's used. He also acknowledged the public's response: "There's no lack of creativity in New Brunswick."
Cost, compliance, and the AI choice
NB Liquor said it used AI to keep costs down on a modest $38,000 campaign. That budget also covered voice talent, writing, animation, editing, custom graphics, and supporting display assets.
"AI didn't replace our team or our creative direction," spokeswoman Florence Gouton said, describing it as a technique the corporation tested, comparable to animation. The ad has been removed from paid placements, though copies continue to circulate online.
The corporation followed procurement rules, according to the minister. The issue wasn't contract compliance; it was public trust, creative standards, and local economic impact.
What government communicators can learn
This incident is a useful stress test for public-sector marketing. The core lesson: cost savings don't offset stakeholder backlash if authenticity or local participation is questioned.
- Policy first: Define when AI is allowed in communications, when it isn't, and how it must be disclosed to audiences.
- Human review: Require human-in-the-loop checks for accuracy, brand standards, and community impact before publishing.
- Vendor terms: Mandate clear declarations of AI use, IP/copyright assurances, dataset and model transparency, and indemnities.
- Local participation: Set expectations for hiring local performers and creators where feasible, especially for Crown work.
- Quality controls: Add pre-flight checks for visual artefacts (e.g., hands, text, labels), factual errors, and legal risks (trademarks, likeness rights).
- Full cost view: Weigh reputational, cultural, and policy risks alongside dollar savings. Run a quick focus test if uncertain.
- Governance and escalation: Give ministers and senior comms leads authority to pause content quickly. Do a post-mortem after any issue.
- Recordkeeping: Log tools, prompts, models, and edits used to produce content to support transparency and audits.
- Team readiness: Train staff on ethical use, quality standards, and disclosure practices for AI-assisted creative work.
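The recordkeeping point above can be sketched as a minimal audit-log entry. This is an illustrative schema only; the field names, tool and model identifiers are assumptions, not any official standard:

```python
import json
from dataclasses import dataclass, asdict
from datetime import date

# Hypothetical provenance record for one AI-assisted asset.
# Field names are illustrative, not a mandated government schema.
@dataclass
class AIUsageRecord:
    asset_id: str        # internal identifier for the creative asset
    tool: str            # generative tool used (placeholder name below)
    model: str           # model name/version (placeholder below)
    prompt_summary: str  # short description of prompts used
    human_reviewer: str  # who signed off before publication
    disclosed: bool      # whether AI use was disclosed to the audience
    created: str         # ISO date the asset was produced

record = AIUsageRecord(
    asset_id="holiday-2025-video-01",
    tool="ExampleGenVideo",        # hypothetical tool name
    model="examplegen-v2",         # hypothetical model identifier
    prompt_summary="Holiday party arrival scene with product shots",
    human_reviewer="comms-lead@agency.example",
    disclosed=True,
    created=date(2025, 11, 1).isoformat(),
)

# Serialize one line per asset for an append-only audit log.
print(json.dumps(asdict(record)))
```

A one-line-per-asset JSON log like this is easy to append to, grep through, and hand to an auditor without exposing full prompt text.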
Helpful guidance
- Government of Canada: Guidance on the use of generative AI
- Directive on the Management of Communications (TBS)
If your department needs practical upskilling on AI for communications and creative workflows, see this curated list by role: AI courses by job.
What to watch next
Expect continued discussions between the minister and NB Liquor about guardrails for future campaigns. Watch for clearer provincial guidance on AI use in advertising, stronger disclosure norms, and a renewed push to involve local talent in publicly funded creative work.
For public agencies, this is the moment to update playbooks: set the policy, tighten QA, and make room for local creators. Doing so will save both time and trust later.