NB Liquor pulls AI-generated holiday ad after backlash: key takeaways for public agencies
New Brunswick's government paused an AI-produced ad for NB Liquor after artists and residents complained. The spot showed guests arriving at a home with booze in hand, but viewers quickly flagged obvious AI tells: unnaturally flawless lighting and gibberish text on the bottle labels.
Minister Luke Randall asked the Crown corporation to shelve the ad, saying he wanted to respect calls to use New Brunswick talent. He also pointed to the broader policy conversation around generative AI and confirmed that NB Liquor followed procurement rules.
NB Liquor said the campaign cost $38,000 and that AI was used to keep expenses down for a small run. The budget covered voice talent, writing, animation, editing, custom graphics, and supporting assets. The video has been pulled from paid channels, but recordings are still circulating online.
Why this hit a nerve
- Authenticity: Viewers spotted AI artifacts fast. Trust dropped just as quickly.
- Local jobs: The arts community asked why local actors weren't hired.
- Public expectation: "Compliant with procurement" isn't the same as "meets public standards."
What this means for government teams
- Policy and perception move at different speeds. You can be compliant and still lose public support.
- AI in creative work needs guardrails: talent use, disclosure, and quality checks.
- Local economic impact matters. Hiring residents can be a policy goal, not just a production choice.
- Reputation costs can erase short-term savings.
Practical steps before your next AI-enabled campaign
- Decide where AI is acceptable (for example, background imagery versus people and products), and document the decision.
- Set "use of local talent" targets in briefs and RFPs when appropriate.
- Require vendors to disclose AI tools and outputs. Include audit rights.
- Build a preflight QA checklist for AI visuals: hands, labels, text, reflections, shadows, continuity.
- Mandate copy reviews to catch nonsensical text on packaging or signage.
- Plan disclosure. If AI is used, say how and why. Keep it simple and upfront.
- Run small audience tests to spot trust issues before paid distribution.
- Add a reputational risk gate in approvals. Ask: if this goes viral, are we comfortable defending it?
- Track costs beyond production: staff time, public response, corrections, and potential pull-backs.
- Train comms, procurement, and legal teams on AI-specific clauses, standards, and review practices.
The $38,000 question: cost vs. trust
NB Liquor's team framed AI as a way to manage a modest budget. That's a fair goal, especially for small, seasonal campaigns.
But trust carries its own price tag. If an ad gets pulled, the spend doesn't vanish: distribution stops, messaging stalls, and the public conversation moves on without you.
Policy and guidance you can use
- Government of Canada: Guidance on the responsible use of generative AI
- Ad Standards: Canadian Code of Advertising Standards
Upskilling for public-sector teams
If your organization is updating comms and procurement practices for AI, make sure staff can spot quality issues and apply clear standards. A small amount of training saves headaches later.
Bottom line
AI can trim budgets, but public trust is the real constraint. Set clear rules, include local talent where it makes sense, and run sharp QA before anything goes live.
If you use AI, own the choice. If you don't, explain why. Either way, keep citizens, and their expectations, front and center.