Kenya's AI Healthcare System Overcharges the Poor, Investigation Finds
A machine learning algorithm used to determine healthcare costs in Kenya systematically charges the poorest citizens more than they can afford while undercharging the wealthy, an investigation by Africa Uncensored and Lighthouse Reports has found.
President William Ruto launched the Social Health Authority (SHA) in October 2024 as his flagship healthcare reform. The system was supposed to extend affordable coverage to the 83% of Kenya's workforce in the informal economy: day labourers, hawkers, farmers, and self-employed workers who fall outside traditional insurance schemes.
Instead, the algorithm has sparked widespread protests. Of more than 20 million people registered, only 5 million regularly pay their premiums.
How the system works, and where it fails
Government volunteers visit households and ask detailed questions: What is your roof made of? Do you own a radio? How many children do you have? The answers feed into an algorithm that predicts household income and calculates healthcare contributions.
The system is based on proxy means testing (PMT), a decades-old World Bank method that estimates income based on possessions and living conditions rather than actual earnings.
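A PMT is essentially a regression: income (usually its logarithm) is fitted against observable indicators in survey data, and the fitted weights are then used to score households whose earnings were never measured. The sketch below shows the mechanism with entirely hypothetical weights and indicators; Kenya's actual model and coefficients are not public.

```python
import math

# Hypothetical PMT coefficients, assumed purely for illustration.
# A real model is fitted by regressing log income on survey indicators.
INTERCEPT = 8.0  # baseline log monthly income (KSh)
WEIGHTS = {
    "iron_roof": 0.30,
    "electricity": 0.45,
    "owns_radio": 0.15,
    "num_children": -0.10,  # larger households score as poorer per this toy model
}

def predict_income(household: dict) -> float:
    """Predicted monthly income (KSh): exp(intercept + weighted sum of indicators)."""
    score = INTERCEPT
    for indicator, weight in WEIGHTS.items():
        score += weight * household.get(indicator, 0)
    return math.exp(score)

def premium(household: dict, rate: float = 0.0275) -> float:
    """Contribution as a flat share of *predicted* income (2.75% is SHA's stated rate)."""
    return rate * predict_income(household)
```

The household never reports its actual earnings, so any bias in the weights, or in the survey sample they were fitted on, flows straight through to the bill.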
The investigation tested the algorithm against thousands of real households. For farmers with electricity and home ownership, the system predicted their income was twice what they actually earned. Families in Nairobi's poorest neighbourhoods were charged premiums between 10% and 20% of their meagre incomes.
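The arithmetic shows why overprediction is regressive. The 2.75% figure is SHA's stated contribution rate on household income; the household figures below are hypothetical, chosen to match the two-to-one overprediction the investigation found for farmers.

```python
RATE = 0.0275  # SHA's stated contribution rate, applied to *predicted* income

actual_income = 8_000      # KSh/month, hypothetical farming household
predicted_income = 16_000  # algorithm overpredicts by 2x, as found for farmers

premium = RATE * predicted_income
effective_rate = premium / actual_income  # burden as a share of real earnings

print(f"premium: {premium:.0f} KSh")            # 440 KSh
print(f"effective rate: {effective_rate:.1%}")  # 5.5% of actual income
```

Doubling the predicted income doubles the effective rate on what the household actually earns, so prediction error falls hardest on those the model misreads upward.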
Grace Amani, a government volunteer who registers households, has watched critically ill people unable to access treatment because they cannot pay what the algorithm demands. "People are dying at home," she said. "Many people have been unable to go to hospital. Will they pay SHA, or pay for food, or pay for the small house they live in?"
The algorithm chose to protect the rich
A health economist who advised Kenya's health ministry explained the trade-off. The system's constraints meant it could either correctly assess poor households or correctly assess wealthy ones, but not both.
The government chose to prioritise accurately evaluating the wealthy. The logic: a misclassified rich person would never voluntarily report higher income, while a misclassified poor person might complain but lack recourse.
An IDinsight report, obtained by the investigation and shared with the government before launch, warned the system was "inequitable, particularly for low-income households." The data was out-of-date and "over-represents middle-income households." Kenya deployed the system anyway.
A wider pattern
PMT algorithms have spread across Africa, Asia, and Latin America in recent years, often pushed by international donors as conditions for loans. They determine eligibility for cash transfers, food subsidies, and other benefits.
Research shows they consistently fail. One poverty-targeted scheme in Indonesia excluded 82% of the population it aimed to serve. Another in Rwanda had a 90% error rate. Kenya's SHA system appears to overcharge more than half of poor households.
The core problem is that poverty is fluid. Using an iron roof or pit toilet to estimate wealth is inherently imprecise. The system's opacity compounds the damage: households cannot see or contest how their premium was set. "It feels like a lottery," said Stephen Kidd, a development economist. "The lottery is not a great way of building trust."
Real costs
Kenyans without private insurance who don't pay SHA premiums risk being turned away from hospitals or facing steep bills. On social media, people have posted accounts of charges they cannot afford. One person wrote: "From struggling to pay 500 Kenyan shillings [£2.90] previously to being billed 1,030 Kenyan shillings." A single mother said her monthly contribution was set at 3,500 shillings.
Some hospitals are reporting large deficits as SHA reimbursements remain unpaid. A former deputy president predicted in March that "SHA will collapse in another six months."
Dr Brian Lishenga, chair of Kenya's Rural and Urban Private Hospitals Association, called the system an experiment that has failed. "It's a really poor tool for identifying poor households," he said. "It's a great tool for helping the government run away from responsibility."
The SHA system shows how algorithmic decision-making in healthcare can entrench inequality when deployed without adequate testing or transparency. For anyone implementing or overseeing such systems, understanding how these models make decisions about vulnerable populations is not optional.