Illinois Lawsuit Over AI Meeting Assistant Exposes Biometric Privacy Risks
A class action lawsuit filed in December 2025 against Fireflies.AI, a popular meeting transcription tool, highlights a gap in how organizations vet and deploy AI technologies. The case centers on whether the software collected voice data without consent, a violation that could cost companies thousands of dollars per participant under Illinois law.
The plaintiff claims she participated in a virtual meeting where Fireflies.AI was enabled without her knowledge or agreement. She never created an account with the service, never signed its terms, and never authorized collection of her biometric data. Fireflies.AI's "Speaker Recognition" feature identifies different speakers in meetings by generating voiceprints: digital patterns of individual voices that Illinois law classifies as biometric identifiers.
What the Law Requires
Illinois' Biometric Information Privacy Act (BIPA) requires companies to do three things before collecting voiceprints or other biometric data: publish a retention and destruction schedule, inform people in writing what data is being collected and why, and obtain written consent.
The lawsuit alleges Fireflies.AI violated all three requirements. The plaintiff seeks $1,000 per negligent violation and $5,000 per reckless or intentional violation for each person whose voiceprints were captured, plus attorney fees and injunctive relief.
Where the Risk Extends
The exposure reaches far beyond Fireflies.AI or Illinois. Organizations using AI transcription tools in several contexts may unknowingly collect biometric data:
- Employee trainings. When multiple workers use the same conference room, each person's voiceprint gets captured unless they've signed written consent.
- HR interviews. Investigators and HR professionals documenting witness interviews or candidate conversations may be recording biometric identifiers without authorization.
- Healthcare settings. Providers using AI transcription face layered exposure under HIPAA, state privacy laws, and biometric regulations.
- Other devices. Performance management platforms and AI glasses that record audio create similar risks.
The problem isn't limited to meeting assistants. As AI embeds itself into more tools and devices, biometric collection happens quietly, often without anyone realizing it's occurring.
How Organizations Should Respond
Build a cross-functional team for technology reviews. IT departments cannot evaluate legal and compliance risks alone. Legal counsel, HR, risk management, and business leaders should jointly assess new tools before deployment. The questions that matter (what data does this collect, who has access, how long is it kept) require multiple perspectives to answer.
Map legal obligations to each use case. A tool approved for one purpose may violate regulations in another. An AI transcription service cleared for internal meetings might trigger HIPAA violations in a clinical setting or create BIPA exposure in applicant interviews. Compliance teams should understand which laws apply to each planned use.
Reopen due diligence when functionality expands. Vendors frequently add features. Teams often find new ways to deploy existing tools. When either happens, the original compliance analysis may no longer apply. Organizations should treat significant changes as triggers for fresh review.
Document the process. Written records of what was evaluated, who approved it, and on what basis create accountability and demonstrate good faith effort. They also provide a defense if something goes wrong.
The Fireflies.AI case shows that rolling out technology without proper AI governance can expose organizations to substantial litigation and compliance costs. Data privacy and security mistakes made during implementation are difficult and expensive to fix later.