A Penn study on racial bias in health care algorithms shows “race is a bad variable”

Researchers say health systems should be careful not to assume that an algorithm is unbiased just because it's not human.

Katherine Anderson's kidney transplant was delayed because an organ transplant algorithm made it less likely that Black patients would be approved for a transplant. A new study from the University of Pennsylvania recently examined dozens of studies on similar health care algorithms and found that race is an imprecise metric to use when determining care. In this 2023 photo, Anderson of Norristown shows some blood test results. (Tom Gralish / Staff Photographer)

University of Pennsylvania researchers undertook the most comprehensive review yet of the algorithms that doctors and health systems use to help make decisions on patient care, examining whether those tools contribute to long-standing racial biases in health care.

Their wide-ranging review of more than five dozen studies on algorithms and race found that some algorithms can result in more equitable care, while others can exacerbate health disparities.

Their conclusion: A patient’s race isn’t necessarily a useful data point in determining what kind of care they should receive.

Algorithms that included the race of a patient as part of an effort to reduce inequities generally resulted in better care for everyone. But those that included the race of a patient for no real reason tended to worsen racial disparities in health care.

It’s a sign that intention matters when designing and using such programs in health care, the researchers say — and that health systems should be careful not to assume that an algorithm is unbiased simply because it’s not human.

“Race is just a bad variable. It’s always a proxy for something else,” said Shazia Siddique, a Penn gastroenterologist who spent four years researching race and algorithms with a contract from the federal Agency for Healthcare Research and Quality. That’s because race is a social concept, she said, not a scientific one.

Genetics can vary widely among socially defined racial groups: “Among Asians, for example, there’s huge differences in genetic ancestry and social and cultural practices,” Siddique said. And Black Americans descended from enslaved people might differ genetically from Black people from Africa or the Caribbean.

All of that variation makes race an unreliable data point. “The goal is really to figure out what are those variables to switch to instead of race,” Siddique said.

Applying the findings in Philly

Some industry players expect to use the Penn researchers’ findings as they work to correct algorithms that can deepen biases. Last year, 12 Philadelphia-area health systems and the region’s largest insurer, Independence Blue Cross, formed a coalition that pledged to eliminate race as a data point from several health care algorithms.

The coalition advocates for race to be removed from 15 common clinical decision-making algorithms, all but one of which were identified in a 2020 New England Journal of Medicine study on race and algorithmic bias.

The goal is to help health systems find new approaches to clinical decision-making without the use of imprecise diagnostic metrics, like race.

“We discuss how to remove race from these tools,” said Seun Ross, IBX’s executive director of health equity. “We’re trying to understand how race is being used in these tools and if hospital systems are still using race in these tools.”

Siddique’s study, she said, underscores how important that work is.

“The use of race in these tools has to be very, very explicit,” Ross said. “You have to know what you are accounting for. You can’t just throw it in there and try to capture something objectively when race is subjective.”

How algorithms can exacerbate inequity

The researchers reviewed 63 studies on algorithms in health care to assess how their treatment of race affected patient outcomes. Results varied. One study, for example, found that including race in an algorithm for prostate cancer screenings helped reduce unnecessary biopsies for Black men.

Another study found that more Black patients were eligible for kidney transplants after race was removed from an algorithm that helped diagnose kidney disease.

But several other studies found that not taking race into account in the same algorithm could also make Black patients less likely to access chemotherapy, be eligible for clinical trials, or receive accurate medication doses.
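Neither study is named here, but the most widely cited example of a race term in kidney medicine is the 2009 CKD-EPI equation for estimated glomerular filtration rate (eGFR), which multiplied a Black patient's score by a fixed coefficient; the coefficient was dropped in the equation's 2021 revision. Below is a minimal Python sketch of that published 2009 formula, offered for illustration only, not as code from the Penn study or any health system.

```python
# Illustrative sketch: the published 2009 CKD-EPI eGFR equation, the
# best-known kidney algorithm with a race coefficient. Constants are
# from the published formula; this is not any study's actual code.

def egfr_ckd_epi_2009(scr_mg_dl: float, age: int, female: bool, black: bool) -> float:
    """Estimated GFR (mL/min/1.73 m^2) from serum creatinine (mg/dL)."""
    kappa = 0.7 if female else 0.9
    alpha = -0.329 if female else -0.411
    egfr = (141
            * min(scr_mg_dl / kappa, 1.0) ** alpha
            * max(scr_mg_dl / kappa, 1.0) ** -1.209
            * 0.993 ** age)
    if female:
        egfr *= 1.018
    if black:
        # The race coefficient: identical lab values yield a roughly
        # 16% higher estimate of kidney function for a Black patient.
        egfr *= 1.159
    return egfr

# Same labs, different race flag:
# egfr_ckd_epi_2009(1.2, 55, female=False, black=False)  # ~68
# egfr_ckd_epi_2009(1.2, 55, female=False, black=True)   # ~78
```

Because the coefficient inflates the estimate, a Black patient with the same lab values appears to have healthier kidneys, which can delay crossing the eGFR threshold used for transplant referral, the mechanism behind delays like Anderson's.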

Sometimes, the algorithms did not perpetuate bias, but patients still were treated differently based on their identities. One study found that an algorithm designed to triage chest pain had no effect on racial disparities, but that doctors were less likely to use the program when they treated women.

It can be demoralizing to realize that algorithms designed to make health care more impartial may still be contributing to disparities, Penn’s Siddique said.

Those developing algorithms for care decisions should think carefully about how race works in the program, and ensure that they’re considering diverse populations and experiences as they work, Siddique said: “An algorithm is only as good as the data that underlie it.”

And it’s crucial that researchers, health systems, and insurance companies be open about how they’re using such programs to direct a patient’s care, she said.

“What we know is just the tip of the iceberg. There are many algorithms that are embedded in patients’ health records that are not transparent, and they’ve never been studied,” Siddique said. “More transparency and regulation is needed.”