May 4, 2026
How Kenya Used a Predictive Algorithm to Transfer Health Costs onto the Poorest
As cost-of-living protests roiled the streets of Nairobi in 2024, the Kenyan government embarked on a population-level technological experiment. It would use AI to calculate how much every Kenyan should pay for access to healthcare.
The incoming president, William Ruto, promised that the new Social Health Authority (SHA) would bring affordable healthcare to all. In reality, Kenyans took to social media in shock and anger over the insurance costs the AI model was setting for them.
The AI system asks dozens of questions about how people live and what they own, then uses machine learning to predict their income from the answers. A percentage of this predicted income is set as their annual health payment.
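The logic described above, reduced to its essentials, can be sketched as follows. This is an illustrative reconstruction, not SHA's actual code: the indicator names, the toy income predictor, and the contribution rate are all assumptions made for the example.

```python
# Illustrative sketch of the premium logic: a model predicts household
# income from survey indicators, and the premium is a fixed percentage
# of that prediction. The 2.75% rate and all indicators are assumed
# values for this example, not SHA's published specification.

def toy_income_predictor(indicators: dict) -> float:
    """Stand-in for the machine-learning model: maps indicator
    answers to a predicted annual income (KES). Entirely invented."""
    income = 120_000.0  # assumed baseline, illustrative only
    if indicators.get("roof") == "iron_sheets":
        income *= 0.8   # proxy suggesting lower income
    if indicators.get("owns_radio"):
        income *= 1.1   # proxy suggesting higher income
    return income

def annual_premium(indicators: dict, rate: float = 0.0275) -> float:
    """Annual premium = rate x predicted annual income."""
    return rate * toy_income_predictor(indicators)

premium = annual_premium({"roof": "iron_sheets", "owns_radio": True})
```

The key point the example makes concrete: the household never reports its income. The premium depends entirely on what the model infers from proxies, so any bias in those inferences flows directly into the bill.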
We used a combination of access-to-information requests and government data to reconstruct how the government designed the prediction system, spoke to inside sources about its implementation, and obtained a previously unreported document describing where it went wrong. Our findings reveal in unprecedented detail how, from the start, it was designed to systematically overcharge the poorest Kenyans, while undercharging the wealthiest.
Increasingly, governments around the world are deploying AI in efforts to increase revenue and efficiency. Officials implementing these systems understand that they come with inbuilt biases. In the context of access to healthcare, these biases can mean the difference between life and death.
METHODS
SHA’s AI system uses a machine learning model to analyse indicators (the answers to questions about how people live) and predict a household’s income. The model was trained on a household survey carried out in 2020. The government published the formula it would use for making the prediction, but not the details of the indicators or the training set, so we made access-to-information requests for them. Under pressure from the Ombudsman, SHA released these details, which we used to build our own machine learning model to replicate and test theirs.
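The replication approach described above can be sketched in outline: fit a model on a household survey mapping indicators to reported income, then use it to generate predictions for new households. Everything below is a synthetic stand-in — the survey data, the four binary indicators, and the ordinary-least-squares model are assumptions for illustration, not SHA's actual specification.

```python
import numpy as np

# Sketch of the replication method: train an income model on a household
# survey, then predict income for new households. Survey data here is
# synthetic; SHA's real indicators, survey, and model may differ.

rng = np.random.default_rng(0)

# Synthetic "2020 household survey": 4 binary indicators per household
# (e.g. roof type, toilet type, radio ownership, electricity access)
# and the log of reported annual income.
n = 1_000
X = rng.integers(0, 2, size=(n, 4)).astype(float)
true_w = np.array([0.4, 0.3, 0.2, 0.5])       # invented effect sizes
log_income = 11.0 + X @ true_w + rng.normal(0, 0.3, n)

# Fit ordinary least squares on log-income (a common choice for
# income models, since incomes are right-skewed).
A = np.column_stack([np.ones(n), X])
coef, *_ = np.linalg.lstsq(A, log_income, rcond=None)

def predict_income(indicators: np.ndarray) -> float:
    """Predicted annual income for one household's indicator vector."""
    return float(np.exp(coef[0] + indicators @ coef[1:]))

# A replication can then compare rate x predict_income(...) against
# the premiums households were actually charged.
household = np.array([1.0, 0.0, 1.0, 0.0])
predicted = predict_income(household)
```

With the training data in hand, this kind of replica lets a newsroom probe the system from the outside: vary one indicator at a time and observe how the predicted income, and therefore the premium, moves.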
We also uncovered a previously unseen report by a team of consultants who proposed a set of adjustments to the system. We refined our model to implement these changes and tested their effects.
Our replication of SHA’s model was peer reviewed by leading academics at the University of Kent, Georgetown University and Delft University of Technology. We have written a full methodology here.
STORYLINES
Every day, Grace* sits in people’s homes and asks them questions. What type of toilet do you use? What is your roof made of? Do you own a radio? She helps them fill the answers into an online form. When the process is complete a number comes back: the sum the AI system dictates the household must pay that year for public health insurance.
Grace is a Community Health Promoter, working with marginalised people to help them access healthcare. The people she registers are some of the poorest in Nairobi, yet the majority are charged premiums they cannot afford. For some, this has meant they can no longer access life-saving medical treatment. “People are dying at home, many people have been unable to go to hospital,” Grace said. “Will they pay SHA, or pay for food, or pay for the small house they live in?”
Peris Nduta faces the same choice. Until last year she scratched a living as a mama fua, washing other people’s clothes. But when a fall left her with a broken leg needing metal pins, she found she could no longer work. The AI system set a premium for her that she couldn’t understand. Friends helped her make an initial payment but, now jobless, she could not maintain it to cover ongoing treatment and medication. “The priority is for my children to eat,” she said, “not to pay for SHA.”
In Huruma, a poor neighborhood on the northeastern outskirts of Nairobi, Africa Uncensored spoke to Florence Atieno about how she had struggled to pay the SHA premium while pregnant. She hoped that the automated appeal line could help her reduce her burden. But her appeal was denied, without a reason given.
CO-PUBLICATIONS
- Africa Uncensored: Hiding Behind AI: How SHA Was Used to Load Health System Costs Onto Poorest
- Africa Uncensored: ERROR BY DESIGN: How SHA's AI system is failing Kenyans
- The Guardian: Flaws in Kenya’s AI-driven health reforms driving up costs for the poorest