Uber's AI Pay Under Fire: Legal Challenge Cites Data Risks and Falling Driver Earnings

Uber faces a legal challenge over AI-driven pay, as the Worker Info Exchange (WIE) alleges GDPR breaches and unlawful data transfers to the U.S. An Amsterdam test case could reshape gig pay and driver rights.

Categorized in: AI News, Management
Published on: Nov 21, 2025

Uber faces legal action over AI-driven dynamic pay systems

Uber has received a letter before action from the Worker Info Exchange (WIE) alleging that its AI-driven dynamic pay systems lack transparency and breach data protection law. The letter targets Uber BV in Amsterdam and Uber Technologies Inc. in the United States, and is sent on behalf of drivers in the UK and Europe.

WIE says the issue extends beyond pay and touches the legality and safety of algorithmic management. The group is also reviewing Uber's pay systems across Europe and may expand the claim to other countries.

Core allegations

  • Use of artificial intelligence and machine learning to set pay in ways WIE calls intrusive and exploitative, harming drivers' earnings.
  • Insufficient transparency about how dynamic pay and profiling affect compensation and work opportunities.
  • Unlawful transfer of European drivers' personal data to the United States between August 2021 and November 2023, exposing data to risk of unauthorized access and government surveillance.
  • Use of drivers' personal data to train the same algorithms without valid consent, according to WIE.

Procedural posture and forum

If Uber does not cease the challenged practices and compensate affected drivers, WIE says it will file collective proceedings before the Amsterdam District Court under the Netherlands' collective redress law. A filing there would make Amsterdam a central forum for EU-wide questions around algorithmic pay and data transfers.

Why this matters for legal teams

  • Fairness, transparency, and purpose limitation (GDPR Art. 5): whether drivers receive meaningful disclosures about automated pay-setting and its impact on them.
  • Information duties (Arts. 12-14): clarity, accessibility, and completeness of notices given to drivers, including the logic involved in dynamic pay.
  • Automated decision-making and profiling (Art. 22): whether decisions producing legal or similarly significant effects are made solely by automated means, and whether appropriate safeguards and human review exist.
  • Lawful basis and consent: whether Uber relies on legitimate interests, contract necessity, or consent, and whether any consent obtained was valid for training or profiling uses.
  • DPIA and governance: whether a data protection impact assessment was performed, risks mitigated, and controls monitored for algorithmic bias and disparate impact.
  • International transfers (Arts. 44-49): whether transfers to the U.S. were covered by valid safeguards (e.g., SCCs, supplementary measures) and, after July 2023, any reliance on the EU-U.S. Data Privacy Framework.

Evidence cited on pay impacts

WIE cites research conducted by the University of Oxford in partnership with the group, which found that 82% of Uber's UK drivers earn less per hour since dynamic pay was introduced. According to the study, drivers lost between 8% and 16% of their pay over the past year.

Possible arguments and points of contention

  • Uber may argue that dynamic pricing improves marketplace efficiency, relies on legitimate interests, and includes human oversight that takes it outside "solely automated" decisions.
  • On transfers, Uber may point to contractual safeguards and, post-July 2023, any adherence to the EU-U.S. Data Privacy Framework. WIE's timeline (Aug 2021-Nov 2023) will put those safeguards under close scrutiny.
  • A key factual fight: what data was used, for what purposes, with what disclosures, and how driver outcomes were affected in practice.

What in-house and external counsel should do now

  • Track the Amsterdam District Court docket and any collective redress filing under Dutch law.
  • Inventory dynamic pay, dispatch, and driver ranking algorithms; document logic, inputs, outputs, and human-in-the-loop controls (a minimal sketch of one register entry follows this list).
  • Refresh privacy notices and driver-facing explanations to meet Arts. 12-14 and provide meaningful information about automated processing.
  • Assess Article 22 exposure and ensure accessible appeal channels, human review, and contestation rights.
  • Revisit lawful bases, consent flows, and retention rules for training data and profiling.
  • Review DPIAs, bias testing, monitoring, and audit logs; close gaps with corrective action plans.
  • Validate transfer mechanisms to the U.S. (SCCs, supplementary measures, and any DPF certification) and document transfer impact assessments.
  • Prepare a litigation hold, assign a cross-functional response team, and map evidence sources across EU and U.S. entities.
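
For the inventory step above, the sketch below shows what one entry in an internal register of automated decision systems might capture. The AlgorithmRecord class, its fields, and the example values are illustrative assumptions for a generic register, not a description of Uber's actual systems or of anything WIE has demanded.

```python
# Minimal sketch of one entry in an internal register of automated decision
# systems. All field names and example values are hypothetical.
from dataclasses import dataclass


@dataclass
class AlgorithmRecord:
    name: str                 # internal system name (hypothetical)
    purpose: str              # what the system decides or recommends
    inputs: list[str]         # categories of personal data consumed
    outputs: list[str]        # decisions or scores produced
    lawful_basis: str         # e.g. legitimate interests, contract necessity
    human_in_the_loop: bool   # is there meaningful human review of outcomes?
    dpia_completed: bool      # has a data protection impact assessment been done?
    transfer_mechanism: str   # e.g. SCCs plus supplementary measures, DPF


# Hypothetical entry for a dynamic pay model.
record = AlgorithmRecord(
    name="dynamic_pay_model",
    purpose="Set per-trip pay offers from supply, demand, and trip features",
    inputs=["trip history", "location data", "acceptance rate"],
    outputs=["per-trip pay offer"],
    lawful_basis="legitimate interests (to be verified)",
    human_in_the_loop=False,
    dpia_completed=False,
    transfer_mechanism="SCCs with supplementary measures",
)

# Simple checks that flag entries needing legal review.
if not record.human_in_the_loop:
    print(f"{record.name}: assess Article 22 safeguards and contestation rights")
if not record.dpia_completed:
    print(f"{record.name}: complete or update the DPIA before further processing")
```

Populating such a register for each system makes the later steps, such as notice updates, Article 22 review, DPIA gap closure, and transfer assessments, easier to scope and evidence.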

For background on international data transfers under the GDPR, see the European Commission's guidance on the subject. For an overview of the Netherlands' collective redress framework, see the Dutch judiciary's English-language portal.

What comes next

Unless the parties reach an early resolution, expect a filing in Amsterdam testing transparency, automated decision-making, and cross-border transfers in gig-economy pay systems. The outcome could influence algorithmic pay governance across platforms operating in Europe.
