Why Cities Are Dropping Flock Safety - And What Officials Should Do Next
Since the start of 2025, at least 30 cities have canceled contracts with Flock Safety, the AI surveillance firm best known for its networked license plate readers. Organizers say community pressure is rising fast. "We are seeing a lot more momentum," said Will Freeman of DeFlock.org. "I expect there to be more cities dropping Flock."
The backlash has already ended deployments in Flagstaff, Arizona; Eugene, Oregon; and Santa Cruz, California. "In the end, it was just clear that this wasn't going to be a technology that was going to be well received or that we could continue to use," Flagstaff mayor Becky Daggett told NPR.
What's actually being captured
License plate readers (LPRs) log time-stamped scans that build a detailed location history of drivers and vehicles. That dataset can be shared across agencies, re-queried later, and, if misused, can put ordinary residents at risk.
DeFlock, an open-source tracker, has mapped more than 77,000 cameras nationwide across multiple vendors. While Flock is the largest, it's part of a wider surveillance ecosystem that cities often adopt without clear rules for accuracy, retention, or access.
The risks cities are weighing
- False positives and weak evidence: A Denver woman was accused of a $25 package theft after police tied her car to the area via Flock data; independent GPS later showed she never stopped there.
- Civil rights exposure: Location trails can enable racial profiling, targeted stops, and officer misuse.
- Data governance gaps: Retention, vendor sharing, and cross-agency access are often under-specified or buried in contracts.
- Public trust and consent: Residents rarely get a say in siting, scope, or safeguards, which fuels backlash and legal risk.
- Policy spillover: Advocates warn DHS and ICE, under the current administration, are expanding use of AI-driven surveillance in immigration enforcement, intensifying local concern over data sharing.
For background on how ALPR systems work and common pitfalls, see the EFF primer on ALPRs and the ACLU overview.
Before you renew or sign: a short checklist
- Define success in plain numbers: Require monthly metrics that distinguish leads from arrests, arrests from prosecutions, and prosecutions from convictions. Publish false-positive rates.
- Independent accuracy testing: Mandate third-party audits of read accuracy, hotlist matching, and demographic impact. No black-box claims.
- Data minimization by default: Keep non-hit data for days, not months. Prohibit bulk exports. Log and disclose every search.
- Tight hotlist controls: Verify sources, expiration dates, and approval chains. Remove stale or low-confidence entries automatically.
- Ban sensitive-location collection: No scans around clinics, schools, religious institutions, shelters, or protest sites without explicit, time-bound authorization.
- Clear sharing rules: Prohibit use for immigration enforcement and require written approval for any inter-agency data access.
- Public governance: Adopt a council-approved surveillance ordinance, hold annual hearings, and publish siting maps and policies.
- FOIA-ready logs: Maintain immutable audit trails, redaction protocols, and public reporting dashboards.
- Procurement guardrails: Avoid auto-renewals, secure price ceilings, and include performance-based exit clauses.
- Officer training and discipline: Require training on lawful use and enforce penalties for misuse.
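Several of the items above (short non-hit retention, logged searches, case numbers, supervisor sign-off) can be expressed directly in code. Below is a minimal Python sketch of those rules; all class and field names are hypothetical, and a real system would enforce them in the vendor platform and records-management system, not a toy in-memory store.

```python
import datetime as dt
from dataclasses import dataclass


@dataclass
class PlateScan:
    plate: str
    seen_at: dt.datetime
    hotlist_hit: bool = False  # whether the scan matched a verified hotlist entry


class ScanStore:
    """Toy store enforcing short non-hit retention and per-query auditing."""

    def __init__(self, retention: dt.timedelta = dt.timedelta(hours=72)):
        self.retention = retention          # non-hit retention window (days, not months)
        self.scans: list[PlateScan] = []
        self.audit_log: list[tuple] = []    # append-only trail of every search

    def purge(self, now: dt.datetime) -> None:
        # Hotlist hits stay with their investigation; non-hit scans age out.
        self.scans = [s for s in self.scans
                      if s.hotlist_hit or now - s.seen_at <= self.retention]

    def query(self, plate: str, case_number: str, approved_by: str) -> list[PlateScan]:
        # Every search requires a case number and supervisor sign-off,
        # and is written to the audit trail before results are returned.
        if not case_number or not approved_by:
            raise PermissionError("case number and supervisor approval required")
        self.audit_log.append((dt.datetime.now(dt.timezone.utc), plate,
                               case_number, approved_by))
        return [s for s in self.scans if s.plate == plate]
```

The point of the sketch is that these controls are cheap to specify precisely: a retention window is one number, and an auditable query gate is a few lines, so contracts can demand them concretely rather than as vague commitments.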
If you keep or pilot ALPRs, reduce harm
- Use geofencing to exclude sensitive areas and limit cameras to specific, time-bound investigations.
- Adopt a 24-72 hour retention period for non-hits and encrypt everything at rest and in transit.
- Require case numbers and supervisor sign-off for every query; spot-audit regularly and report results.
- Pair tech with basics that actually deter theft: target-hardening, lighting, and community-led prevention.
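The geofencing item above can also be sketched concretely. This is a minimal Python illustration with hypothetical names; it uses simple rectangular zones for brevity, where a real deployment would configure polygon geofences in the vendor's platform so excluded scans are never captured or stored at all.

```python
from dataclasses import dataclass


@dataclass
class Zone:
    """Rectangular exclusion zone (a real system would use polygon geometry)."""
    name: str
    min_lat: float
    max_lat: float
    min_lon: float
    max_lon: float

    def contains(self, lat: float, lon: float) -> bool:
        return (self.min_lat <= lat <= self.max_lat
                and self.min_lon <= lon <= self.max_lon)


def filter_scans(scans: list[dict], excluded_zones: list[Zone]) -> list[dict]:
    """Keep only scans captured outside every excluded (sensitive-location) zone."""
    return [s for s in scans
            if not any(z.contains(s["lat"], s["lon"]) for z in excluded_zones)]
```

Dropping excluded scans before storage, rather than redacting them afterward, is the safer design: data that was never retained cannot be shared, subpoenaed, or misused.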
Why cities are walking away
Officials are deciding the marginal gains don't outweigh the legal exposure, community blowback, and maintenance costs. As cancellations add up, vendors will offer concessions: shorter terms, better dashboards, new claims. Treat those as negotiation starts, not outcomes.
If you've lost public trust, the fix isn't another feature. It's measurable policy, transparent reporting, and a clear line between safety and surveillance creep.
Resources for public leaders
- AI for Government - practical guides on oversight, procurement, and risk controls for public-sector AI.
- AI Learning Path for Policy Makers - frameworks for evaluating AI systems, drafting policy, and setting accountability metrics.