Uber hit with legal demands to halt AI-driven pay systems
Uber is facing legal demands to stop using its AI-driven pay systems in Europe after the non-profit Worker Info Exchange (WIE) issued a letter before action alleging breaches of data protection law. The case is expected to be filed in Amsterdam and centers on Uber's "dynamic pricing" algorithm, which WIE says has cut driver earnings while increasing Uber's share of fares. WIE argues the harm began in 2020 with "upfront pricing," accelerating after dynamic pricing rolled out in 2023.
James Farrar, WIE's director, said Uber has used "deeply intrusive and exploitative" pay-setting systems trained on drivers' own data. WIE is seeking to force Uber to cease the practice, revert to transparent pay-setting with a human in the loop, and compensate affected drivers. The foundation says it will bring collective proceedings under the Netherlands' collective redress law if Uber does not comply.
What WIE is alleging
WIE claims Uber's algorithm varies driver pay using historic personal data and automated decision-making, without sufficient transparency or control for drivers. Under the GDPR, WIE argues, drivers can demand that Uber stop the processing, reintroduce human review, and pay damages for losses. The letter before action asks Uber to:
- Cease use of algorithmic pay-setting that relies on automated decision-making about individuals.
- Revert to a clear, explainable pay method with human oversight for impactful decisions.
- Provide transparency about the logic, inputs, and effects of the system, and compensate drivers.
The evidence WIE cites - and Uber's response
WIE partnered with Oxford University on research indicating many drivers earned "substantially less" per hour after dynamic pricing launched, while Uber captured a larger share of fares. The paper reported that hourly pay stagnated in nominal terms and fell in real terms after 2023.
Uber disputes the findings, saying drivers have flexibility and visibility into each trip's fare, destination, and earnings before accepting. The company says the study uses incomplete, selective data and that the authors admit they cannot isolate dynamic pricing's causal impact on pay. Uber adds that passenger demand and trips continue to grow and that many drivers choose the platform for those reasons.
Key legal issues for counsel
- Automated decision-making: If pay-setting is "solely" automated and significantly affects drivers, GDPR Article 22 may apply, triggering rights to human intervention, to express a view, and to contest decisions.
- Transparency and information duties: Articles 12-15 require clear, accessible information about the processing and, for Article 22 decisions, meaningful information about the logic involved and the significance and envisaged consequences for drivers.
- Lawful basis and fairness: Articles 5 and 6 require fair processing and a valid legal basis. Using drivers' historic behavioral data for pay decisions will draw scrutiny on necessity and proportionality.
- DPIA and governance: A Data Protection Impact Assessment (Article 35) is likely required given systematic, large-scale, high-impact processing. Expect questions on risk mitigation, bias testing, and human-in-the-loop controls.
- Territorial scope and enforcement: With Uber's EU base in Amsterdam, the Dutch DPA could be lead authority, with cross-border coordination. Civil exposure includes injunctions and damages via collective redress (WAMCA).
Potential exposure and remedies
- Injunctive relief to halt or limit algorithmic pay-setting until safeguards are in place.
- Damages for alleged earnings losses tied to automated decisions.
- Orders to implement transparency, access, and contestation mechanisms for drivers.
- Regulatory investigations, fines, and mandated remediation plans.
Practical steps for platforms using AI for pricing or pay
- Map all automated decisions that affect pay, access to work, or pricing. Identify where decisions are "solely" automated.
- Run a DPIA, document risks, and harden controls: human review, appeal paths, thresholds for intervention, and audit logs (a minimal gating sketch follows this list).
- Refresh privacy notices and in-app explanations. Provide meaningful insight into inputs, logic, and foreseeable effects.
- Offer a workable route to human intervention and contestation. Track outcomes to show effectiveness.
- Validate the legal basis for each use of historic personal data. Minimize, retain narrowly, and justify profiling with necessity tests.
- Test for disparate impact on specific driver cohorts (see the cohort check after this list). Monitor, retrain, and version models with change controls.
- Prepare evidence packs: DPIAs, model cards, testing results, and decision review statistics for regulators and courts.
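To make the human-review and audit-log point concrete, here is a minimal sketch of one way a platform could gate high-impact automated pay decisions behind human review and keep an auditable record. Everything in it is an assumption for illustration: the `REVIEW_THRESHOLD` policy, the `pay_decisions.log` file, and all field names are hypothetical, not Uber's system or any specific compliance product.

```python
# Illustrative sketch only: hypothetical names and thresholds, not any
# real platform's system. Routes a model's pay proposal through a
# deviation check (an Article 22-style human-in-the-loop control) and
# writes an append-only audit record for each decision.
import json
import uuid
from dataclasses import dataclass, asdict
from datetime import datetime, timezone

# Assumed policy: a deviation of more than 15% from the transparent
# baseline formula triggers human review before the pay is applied.
REVIEW_THRESHOLD = 0.15

@dataclass
class PayDecision:
    decision_id: str
    driver_id: str
    baseline_pay: float      # pay under the transparent reference formula
    model_pay: float         # pay proposed by the pricing model
    model_version: str
    needs_human_review: bool
    timestamp: str

def decide_pay(driver_id: str, baseline_pay: float, model_pay: float,
               model_version: str) -> PayDecision:
    """Check the model's proposal against the baseline and log the outcome."""
    deviation = abs(model_pay - baseline_pay) / baseline_pay
    decision = PayDecision(
        decision_id=str(uuid.uuid4()),
        driver_id=driver_id,
        baseline_pay=baseline_pay,
        model_pay=model_pay,
        model_version=model_version,
        needs_human_review=deviation > REVIEW_THRESHOLD,
        timestamp=datetime.now(timezone.utc).isoformat(),
    )
    # Append-only audit log: one JSON line per decision, reviewable later
    # by regulators, courts, or internal review teams.
    with open("pay_decisions.log", "a") as log:
        log.write(json.dumps(asdict(decision)) + "\n")
    return decision

if __name__ == "__main__":
    d = decide_pay("driver-001", baseline_pay=18.40, model_pay=14.90,
                   model_version="pricing-2025.06")
    print("queued for human review" if d.needs_human_review else "auto-approved")
```

The design choice worth noting is that the gate compares against an explainable baseline rather than trusting the model's output alone, which is one way to evidence "meaningful" human oversight rather than a rubber stamp.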
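The disparate-impact item can likewise be reduced to a recurring check. The sketch below, again purely hypothetical in its cohort names, sample figures, and `DISPARITY_TOLERANCE` policy, compares median hourly pay across driver cohorts and flags any cohort falling materially below the overall median. It is the kind of periodic output an evidence pack could include.

```python
# Illustrative sketch only: hypothetical data and tolerance. Flags driver
# cohorts whose median hourly pay falls more than a set margin below the
# overall median, as a simple recurring disparate-impact check.
from statistics import median

# Assumed policy: flag cohorts more than 10% below the overall median.
DISPARITY_TOLERANCE = 0.10

def cohort_pay_gaps(hourly_pay_by_cohort: dict[str, list[float]]) -> dict[str, float]:
    """Return each cohort's relative gap to the overall median hourly pay."""
    all_pay = [p for pays in hourly_pay_by_cohort.values() for p in pays]
    overall = median(all_pay)
    return {
        cohort: (median(pays) - overall) / overall
        for cohort, pays in hourly_pay_by_cohort.items()
    }

if __name__ == "__main__":
    sample = {  # hypothetical per-hour earnings by cohort
        "night_shift": [12.1, 13.0, 11.8, 12.5],
        "day_shift": [15.2, 14.8, 15.9, 16.1],
    }
    for cohort, gap in cohort_pay_gaps(sample).items():
        if gap < -DISPARITY_TOLERANCE:
            print(f"FLAG {cohort}: {gap:.1%} below overall median")
```

A real deployment would segment on protected or proxy characteristics with appropriate legal advice and feed flagged gaps into model retraining and change-control records; this sketch shows only the measurement step.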
What to watch next
The immediate question is whether Uber agrees to changes or faces collective action in Amsterdam under the WAMCA. If the case proceeds, it could set a benchmark for algorithmic wage-setting across platform work, especially where decisions materially affect income.
Regulators may also sharpen guidance on automated decision-making in employment-like contexts, and courts will probe how "meaningful information" and "human in the loop" should work in practice.
Primary sources and guidance
General Data Protection Regulation (EUR-Lex)
Collective claims in the Netherlands (WAMCA) - Dutch Judiciary