Alberta Tests AI to Draft the Alberta Whisky Act - With Guardrails and Human Oversight
Alberta's provincial government plans to use artificial intelligence to draft an upcoming bill: the Alberta Whisky Act. The legislation will define the standards and requirements for what can be legally labeled "Alberta whisky," as outlined in an autumn mandate letter.
Minister of Service Alberta and Red Tape Reduction Dale Nally calls it a low-stakes pilot to speed up first drafts. "It doesn't involve any hearts or lungs," he said, noting that human legislative writers will review and correct the text before it moves forward. The justice minister is expected to perform the final proofread.
Officials say they won't rely on a public chatbot. Instead, they'll use a controlled system to compile relevant details like ingredients, characteristics, and taste profiles - then run the draft through multiple checks before it ever reaches the floor.
Industry will have a seat at the table. "I'm actually really excited about the utilization of artificial intelligence in the drafting of this act," said Burwood Distillery's Jordan Ramey. "Right now, we're producing some of the world's highest-quality whisky, but we're not telling that story on an international level."
On the tech side, Edmonton's Sam Jenkins points to both promise and risk. He sees potential to analyze larger datasets for more effective legislation, but warns of concrete failure modes: inconsistent wording, definitions misaligned with existing statutes, and unintended loopholes. The fix, he says, is keeping a strong human touch from start to finish.
The government expects to table the Alberta Whisky Act in 2026.
Why this matters for public servants
- Practical pilot: A contained, lower-risk file lets teams test AI-assisted drafting without disrupting core services.
- Precedent-setting workflows: Expect pressure to define how AI is sourced, governed, audited, and reviewed inside the legislative process.
- Better alignment: Standards for "Alberta whisky" should account for existing federal definitions and food labelling rules to avoid conflicts. See Canada's whisky standard in the Food and Drug Regulations, section B.02.022 (Justice Laws).
Suggested guardrails and workflow
- Model choice and security: Use a secure, private model with documented data handling. Log prompts, outputs, and edits for auditability.
- Corpus curation: Restrict training and reference material to vetted sources (existing statutes, authoritative standards, prior bills, style guides). Flag and quarantine non-authoritative inputs.
- Drafting protocol: Require the model to cite its sources for each clause. Enforce style and definitions via a machine-readable schema (terms, cross-references, plain-language rules).
- Legal QA: Mandate reviews by legislative counsel and policy leads. Run automated checks for internal consistency, cross-reference integrity, and conflicts with existing law.
- Version control and traceability: Maintain a full change log from AI draft to final text. Store prompts, model versions, reviewers, and decisions.
- Red-teaming: Stress-test the draft for loopholes, ambiguous terms, and unintended effects. Include external reviewers with domain expertise (distillers, standards bodies).
- Public transparency: Disclose where AI assisted, what data sources were used, and how human oversight was applied.
- Standards alignment: Map provisions against federal and provincial rules (e.g., food and beverage identity standards, labelling, and enforcement frameworks). Reference drafting best practices (Guide to Making Federal Acts and Regulations).
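Several of the checks above (a locked glossary, cross-reference integrity, automated consistency review) can be automated with very little machinery. As a minimal sketch, assuming clauses are stored as simple records and the shared glossary is a plain set of defined terms (all names and the clause format here are hypothetical, not a mandated schema):

```python
import re

# Hypothetical locked glossary: the only defined terms a draft may use in quotes.
GLOSSARY = {"Alberta whisky", "distiller", "labelling standard"}

def check_clauses(clauses):
    """Flag dangling cross-references and quoted terms missing from the glossary."""
    issues = []
    ids = {c["id"] for c in clauses}
    for c in clauses:
        # Cross-reference integrity: every "section N" cited must exist in the draft.
        for ref in re.findall(r"section (\d+)", c["text"]):
            if ref not in ids:
                issues.append(f'{c["id"]}: dangling reference to section {ref}')
        # Glossary lock: quoted terms must come from the shared glossary.
        for term in re.findall(r'"([^"]+)"', c["text"]):
            if term not in GLOSSARY:
                issues.append(f'{c["id"]}: undefined term "{term}"')
    return issues

clauses = [
    {"id": "1", "text": 'In this Act, "Alberta whisky" means a whisky distilled in Alberta.'},
    {"id": "2", "text": 'A "producer" must comply with section 3.'},
]
print(check_clauses(clauses))
```

Checks like these would run as one gate in the legal QA step, with human counsel resolving anything flagged rather than the tool rewriting text on its own.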
What success looks like
- Clear, enforceable definitions that fit cleanly with existing statutes and policy intent.
- Fewer drafting cycles on first-pass text, with equal or better legal quality after human review.
- Documented, repeatable workflow that other ministries can adopt for low-risk files.
What can go wrong (and how to avoid it)
- Inconsistent wording or misaligned definitions: Lock a shared glossary and force cross-references in the model's prompts.
- Unintended loopholes: Run structured adversarial reviews and scenario tests before committee.
- Source contamination: Whitelist authoritative sources; block unvetted web content from influencing drafts.
- Overreliance on AI phrasing: Require human edits for tone, clarity, and legislative fit; use plain-language checks.
Immediate next steps for ministries
- Select two to three low-risk candidates for AI-assisted drafting pilots with clear success metrics.
- Set up a cross-functional team (policy, legal, data, security, records management, comms).
- Publish an AI drafting playbook: model governance, data sources, style schema, review gates, and audit procedures.
- Train reviewers on prompt critique, legal QA with AI outputs, and bias/ambiguity spotting.
- Commit to a public postmortem after the Whisky Act pilot: what worked, what didn't, and what will change.
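The audit procedures in such a playbook imply a structured record for every AI-assisted clause: prompt, model version, reviewers, and decision, as suggested under "Version control and traceability" above. A minimal sketch of one log entry follows; the field names and the model identifier are assumptions for illustration, not a prescribed format:

```python
import json
from datetime import datetime, timezone

# Hypothetical audit record for one AI-drafted clause, capturing the
# traceability fields a ministry would store for later review.
entry = {
    "bill": "Alberta Whisky Act (draft)",
    "clause_id": "3(1)",
    "model_version": "internal-llm-2025-10",  # assumed identifier
    "prompt": "Draft a definition of Alberta whisky.",
    "reviewers": ["legislative counsel", "policy lead"],
    "decision": "revised",  # e.g., accepted / revised / rejected
    "timestamp": datetime.now(timezone.utc).isoformat(),
}
print(json.dumps(entry, indent=2))
```

Storing these records append-only, alongside the change log from AI draft to final text, gives auditors a complete chain from prompt to enacted wording.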