How Law Enforcement Is Skirting Facial Recognition Bans With New AI Surveillance Tools

A new AI tool called Track identifies people using clothing and accessories instead of faces, bypassing facial recognition bans. This raises privacy and misidentification concerns.

Categorized in: AI News, Legal
Published on: May 18, 2025

The New AI Surveillance Tool Challenging Facial Recognition Bans

By the end of 2024, fifteen US states had enacted laws banning certain forms of facial recognition technology. These laws stemmed from concerns over privacy violations and the technology's frequent inaccuracies. However, a new AI tool called Track is emerging as a workaround, operating in ways that sidestep these legal restrictions.

Track is not designed to improve facial recognition itself or to address privacy concerns. Instead, it exploits a legal loophole by using nonbiometric data to identify individuals. Developed by Veritone, a company specializing in video analytics, Track analyzes attributes such as shoes, clothing, body shape, gender, hair, and accessories—everything except the face.

How Track Works

The system scans various video sources, including closed-circuit security cameras, body-worn cameras, drones, Ring cameras, and publicly shared social media footage. Users select specific attributes from dropdown menus to filter and identify subjects in the footage. Filter options include:

  • Accessory: Options include backpacks, briefcases, glasses, hats, scarves, and more.
  • Upper Clothing: Filters by color, sleeve length, and garment type.
  • Other categories: Footwear, body shape, gender, and hair attributes.

Track then presents images matching these criteria, allowing users to trace a person's movements across multiple video sources. This triangulation serves much the same purpose as facial recognition but uses no biometric data, which lets Track operate in jurisdictions where facial recognition is banned.
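
To make the idea of attribute-based matching concrete, here is a minimal, purely illustrative sketch in Python. It is not Veritone's implementation; the Detection record, the attribute names, and the trace function are assumptions chosen only to show how filtering on nonbiometric attributes across multiple video sources might look in principle.

```python
from dataclasses import dataclass

# Hypothetical detection record: what a video-analytics pipeline might emit
# for each person seen in a frame (no face or other biometric data involved).
@dataclass
class Detection:
    source: str        # e.g. "cctv_lobby", "bodycam_12", "ring_cam_3"
    timestamp: float   # seconds since the start of the footage
    attributes: dict   # e.g. {"upper_color": "red", "accessory": "backpack"}

def matches(detection: Detection, query: dict) -> bool:
    """Return True if every queried attribute matches the detection."""
    return all(detection.attributes.get(k) == v for k, v in query.items())

def trace(detections: list[Detection], query: dict) -> list[Detection]:
    """Filter detections by an attribute query and order them in time,
    approximating how a person might be traced across video sources."""
    hits = [d for d in detections if matches(d, query)]
    return sorted(hits, key=lambda d: d.timestamp)

# Example: search for someone wearing a red top and carrying a backpack.
footage = [
    Detection("cctv_lobby", 10.0, {"upper_color": "red", "accessory": "backpack"}),
    Detection("drone_1", 95.5, {"upper_color": "blue", "accessory": "hat"}),
    Detection("ring_cam_3", 240.2, {"upper_color": "red", "accessory": "backpack"}),
]
for hit in trace(footage, {"upper_color": "red", "accessory": "backpack"}):
    print(hit.source, hit.timestamp)
```

Even this toy version illustrates the core concern: the query keys are ordinary descriptors like clothing color and accessories, yet chaining matches across sources and timestamps reconstructs a person's movements much as facial recognition would.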

Legal and Ethical Implications

This approach raises serious privacy concerns. By sidestepping facial recognition bans, Track enables surveillance that lawmakers intended to restrict. The system's reliance on clothing and other physical traits can lead to misidentifications, potentially ensnaring innocent individuals.

Track’s developers promote it as a tool capable of both identifying suspects and exonerating individuals. Their CEO nicknamed it the "Jason Bourne tool," emphasizing its investigative utility. Yet, critics highlight the risks of expanded surveillance powers.

According to civil liberties advocates, this technology introduces a "categorically new scale and nature of privacy invasion and potential for abuse that was literally not possible any time before in human history." The possibility of wrongful accusations and arrests remains a pressing concern.

What Legal Professionals Need to Know

Law enforcement agencies are adopting Track, expanding their surveillance capabilities despite existing facial recognition restrictions. Legal professionals should stay informed about how such technologies operate and the implications for civil liberties and due process.

As surveillance tools evolve, so does the challenge of balancing public safety with individual rights. Laws may need to adapt to cover emerging technologies that exploit current legal gaps.

For those interested in the intersection of AI and law enforcement technologies, keeping up with tools like Track is essential. Understanding their capabilities and limitations can inform legal strategies and policy development.

To learn more about AI applications and legal considerations, explore relevant resources such as Complete AI Training's courses for legal professionals.