Detroit's AI Emily now answers residents' calls for city services

Detroit is piloting "Emily," an AI that answers calls in districts 3 and 4, captures key details, and routes issues fast. Early tests point to quicker intake and smoother handoffs.

Categorized in: AI News, Government
Published on: Dec 20, 2025

Detroit pilots an AI voice assistant to handle resident calls in two council districts

Detroit is testing a new way to pick up the phone. "Emily," a voice assistant built by Detroit-based Believe in AI, now answers calls to the district managers for City Council Districts 3 and 4.

Emily sounds human. Callers hear light office background noise when the line connects. The goal is simple: collect the right details quickly and help residents reach city services without waiting on hold.

Developers Gabe Wilson and Mario Kelly built Emily to carry a natural conversation while staying on task. Wilson said he trained Emily for 10 months to get there.

What Emily does on a call

  • Greets the caller and explains it's assisting the office.
  • Captures key information (issue, location, contact details) without long menus.
  • Routes or documents the request for the district team to act on.
  • Hands off to a person when needed.
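The steps above can be sketched as a simple intake flow. Everything here (the `CallRecord` fields, the escalation keywords, the logic) is an illustrative assumption, not a detail of Believe in AI's actual system:

```python
# Illustrative sketch of an AI call-intake flow. Names and rules are
# assumptions for illustration, not details of the real Emily system.
from dataclasses import dataclass

SENSITIVE_KEYWORDS = {"emergency", "complaint", "legal"}  # assumed triggers

@dataclass
class CallRecord:
    issue: str = ""
    location: str = ""
    callback_number: str = ""
    escalate: bool = False

def intake(issue: str, location: str, callback_number: str) -> CallRecord:
    """Capture key details and flag calls that need a human."""
    record = CallRecord(issue=issue, location=location,
                        callback_number=callback_number)
    # Hand off to a person when the topic looks sensitive or details are missing.
    if any(k in issue.lower() for k in SENSITIVE_KEYWORDS) or not record.location:
        record.escalate = True
    return record

# A routine pothole report is logged; a complaint is flagged for a person.
routine = intake("Pothole on my street", "123 Main St", "313-555-0100")
sensitive = intake("Noise complaint about a neighbor", "45 Oak Ave", "313-555-0101")
print(routine.escalate, sensitive.escalate)  # False True
```

The point of the sketch is the shape of the flow: structured capture first, then a cheap rule for when a human takes over.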

Why this matters for city operations

  • Faster response: fewer rings and shorter hold times during peak hours.
  • Consistency: every call starts with the same essential questions for clean, usable data.
  • Triage: routine requests get logged and routed; complex cases go straight to staff.
  • Better tracking: structured call data supports service-level goals and staffing plans.

Policy and guardrails to put in place

  • Clear disclosure: callers should know they're speaking with an AI assistant and how to reach a person.
  • Privacy: define what's recorded, how it's stored, retention periods, and who can access it.
  • Security: protect transcripts and metadata; restrict API keys and admin access.
  • Bias and fairness: review prompts and decision paths to avoid unequal treatment across neighborhoods.
  • Accessibility: ensure speech clarity, TTY alternatives, and multilingual paths where applicable.
  • Fallbacks: quick transfer to a human when the assistant is unsure or on sensitive topics.
  • Auditability: keep logs for oversight, quality checks, and public records requests.

For a structured approach to risk, many agencies reference the NIST AI Risk Management Framework.

Implementation checklist for city teams

  • Define scope: start with specific lines (e.g., district offices) and clear call types.
  • Set success criteria: speed of answer, abandonment rate, first-contact resolution, escalation rate, and caller satisfaction.
  • Prepare scripts and prompts: short, plain language; include verification and consent lines.
  • Train on real calls: redacted examples improve accuracy; update regularly.
  • Stand up escalation playbooks: who gets a transfer, within what time, and how it's documented.
  • Run a soft launch: limited hours or segments before full rollout.
  • Communicate: let residents know what's changing, why, and how to reach a person anytime.
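An escalation playbook works best when it is written down as data that both staff and the assistant's routing logic can read. A minimal sketch, with hypothetical call types, teams, and response windows:

```python
# Hypothetical escalation playbook expressed as data. Call types, teams,
# and time windows are illustrative assumptions, not Detroit's policy.
ESCALATION_PLAYBOOK = {
    "water_emergency":     {"transfer_to": "on-call staff",    "within_minutes": 1},
    "missed_trash_pickup": {"transfer_to": "district team",    "within_minutes": 60},
    "default":             {"transfer_to": "district office",  "within_minutes": 240},
}

def escalation_target(call_type: str) -> dict:
    """Return who gets the transfer and how fast, falling back to a default."""
    return ESCALATION_PLAYBOOK.get(call_type, ESCALATION_PLAYBOOK["default"])

print(escalation_target("water_emergency"))  # on-call staff, within 1 minute
print(escalation_target("unknown_issue"))    # falls back to the default route
```

Keeping the playbook in one place makes it auditable and easy to update during the soft launch.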

Metrics to track weekly

  • Average speed of answer and average handle time.
  • Abandonment and transfer-to-human rates.
  • First-contact resolution and time-to-resolution for routed tickets.
  • Callback success rate and data completeness (address, callback number, service category).
  • Quality audits: accuracy of summaries, correct categorization, and escalation appropriateness.
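Most of the weekly metrics above fall out of structured call logs directly. A minimal sketch over hypothetical records (the field names are assumptions about the logging schema):

```python
# Compute weekly intake metrics from hypothetical call records.
# Field names are illustrative assumptions about the logging schema.
calls = [
    {"answer_seconds": 4,  "transferred": False, "abandoned": False},
    {"answer_seconds": 9,  "transferred": True,  "abandoned": False},
    {"answer_seconds": 30, "transferred": False, "abandoned": True},
]

answered = [c for c in calls if not c["abandoned"]]
avg_speed_of_answer = sum(c["answer_seconds"] for c in answered) / len(answered)
abandonment_rate = sum(c["abandoned"] for c in calls) / len(calls)
transfer_rate = sum(c["transferred"] for c in answered) / len(answered)

print(f"ASA: {avg_speed_of_answer:.1f}s")         # ASA: 6.5s
print(f"Abandonment: {abandonment_rate:.0%}")     # Abandonment: 33%
print(f"Transfer-to-human: {transfer_rate:.0%}")  # Transfer-to-human: 50%
```

Quality-audit metrics (summary accuracy, categorization) still need human review samples; only the volume and timing figures come for free from the logs.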

Resident experience standards

  • Plain-language greeting that states it's an AI assistant supporting the district office.
  • Option to speak to a person at any point.
  • Short questions, no long menus, and confirmation of the request before ending the call.
  • Reference number or follow-up expectations provided to the caller.

Common risks and how to reduce them

  • Misunderstanding callers: keep questions short; confirm key details; enable quick human transfer.
  • Over-automation: cap call length; route sensitive topics to staff immediately.
  • Data sprawl: centralize storage; apply retention and deletion schedules; review access logs.
  • Drift over time: schedule monthly prompt updates and quarterly model evaluations.
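One concrete way to limit data sprawl is to redact obvious identifiers from transcripts before they are stored. A minimal sketch using a regex for US-style phone numbers; the pattern is an assumption, and a real deployment would need much broader PII coverage (names, addresses, emails):

```python
import re

# Redact US-style phone numbers before a transcript is stored.
# This simple pattern is an assumption for illustration; production
# PII redaction needs broader coverage and testing.
PHONE_RE = re.compile(r"\b\d{3}[-.\s]?\d{3}[-.\s]?\d{4}\b")

def redact_phone_numbers(transcript: str) -> str:
    """Replace phone-number-shaped substrings with a placeholder."""
    return PHONE_RE.sub("[REDACTED PHONE]", transcript)

print(redact_phone_numbers("Call me back at 313-555-0100 after 5pm."))
# Call me back at [REDACTED PHONE] after 5pm.
```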

What Detroit is testing right now

The pilot focuses on City Council Districts 3 and 4 when residents call their district managers. Early priorities appear to be faster intake, better information capture, and smoother handoffs to the district teams.

After performance reviews, the city can decide whether to tune the assistant, expand its use, or narrow the scope. The key is staying honest about what works, what doesn't, and what residents actually prefer.

Staff enablement

AI phone intake changes workflows. Train staff on reading AI summaries, correcting errors, and closing the loop with residents quickly. Keep a simple feedback channel so frontline employees can flag problems and request prompt updates.


Bottom line: Detroit's pilot shows a focused use of AI where it can help most: answering more calls, capturing better data, and getting issues to the right people faster. Build with clear policies, measure relentlessly, and keep residents' trust at the center.

