'Automating Inequality': Algorithms In Public Services Often Fail The Most Vulnerable

In the fall of 2008, Omega Young got a letter prompting her to recertify for Medicaid.

But she was unable to make the appointment because she was suffering from ovarian cancer. She called her local Indiana office to say she was in the hospital.

Her benefits were cut off anyway. The reason: "failure to cooperate."

"She lost her benefits, she couldn't afford her medication, she lost her food stamps, she couldn't pay her rent, she lost access to free transportation to her medical appointments," Virginia Eubanks tells NPR's Ari Shapiro. Eubanks is the author of a new book, Automating Inequality: How High-Tech Tools Profile, Police and Punish the Poor.

"Young died on March 1, 2009," Eubanks says. "The next day, she won an appeal for wrongful termination and all of her benefits were restored the day after her death."

Young's story is one of three detailed case studies from across the country that Eubanks draws on to illustrate that automated systems used by the government to deliver public services often fall short for the very people who need them most: an effort to automate welfare eligibility in Indiana, a project to create an electronic registry of the homeless in Los Angeles, and an attempt to develop a risk model to predict child abuse in Allegheny County, Penn.

Welfare Eligibility In Indiana

Eubanks says that with automation, Indiana lawmakers wanted to save money and streamline the state's welfare system.

"But the way the system rolled out, it seems like one of the intentions was actually to break the relationship between caseworkers and the families they served," the author says.

In promoting the contract, she says, the governor kept pointing to one case to suggest that a system that lets caseworkers and families develop personal relationships invites fraud.

"There was one case where two caseworkers had colluded with some recipients to defraud the government for about $8,000," she says. "So what happened is the state replaced about 1,500 local caseworkers with online forms and regional call centers. And that resulted in a million benefits denials in the first three years of the experiment, which was a 54 percent increase from the three years before."

But, Eubanks says, automated public service systems that serve those living in poverty or with poor health are not inherently less effective than mainstream automated services like Uber or Lyft. Rather, she worries that these systems are used "as a kind of empathy override."

"One of my greatest fears in this work is that we're actually using these systems to avoid some of the most pressing moral and political challenges of our time — specifically poverty and racism," she says.

Resource Allocation For The Homeless In Los Angeles

Eubanks says these tools are being used to outsource hard decisions to machines — including the allocation of housing in Southern California.

"So there are 58,000 unhoused folks in Los Angeles," she says. "It's the second highest population in the United States and 75 percent of [those unhoused] are completely unsheltered, which means they're just living in the street."

"I do not want to be the caseworker who is making that decision, who is saying there are 50,000 people with no resources, I have a handful of resources available, now I have to pick," she says.

Still, automation is not the solution here, Eubanks says. To underline the point, she cites public interest lawyer Gary Blasi in her book: "Homelessness is not a systems engineering problem, it's a carpentry problem."

In other words, if you've got 10 houses for 20 people it doesn't matter how good the system for housing those people is — it won't work.

That's not to say automation doesn't have an important role in helping limit failures caused by caseworkers "who are racist, who discriminate, who favor some clients over others for inappropriate reasons," Eubanks says.

"Human bias in public assistance systems has created deep inequalities for decades," she says. "Specifically around the treatment of black and brown folks who have often been either overrepresented in the more punitive systems or diverted from the more helpful systems."

These inequalities can manifest in a number of ways. People of color are more likely to go to prison, have their children taken away from them or not receive public housing.

"But the thing that's really important to understand," the author notes, "these systems don't actually remove that bias, they simply move it."

A Child Welfare Risk Model In Allegheny County, Penn.

One case of this bias displacement is found in Pennsylvania, Eubanks says. In Allegheny County, the Department of Human Services employs a predictive algorithm aimed at projecting which children are likely to become victims of abuse.

"In that case, one of the hidden biases is that it uses proxies instead of actual measures of maltreatment," she says. "And one of the proxies it uses is called call re-referral. And the problem with this is that anonymous reporters and mandated reporters report black and biracial families for abuse and neglect three and a half more often than they report white families."

Eubanks knows she could have turned out a prettier portrait by writing about three different automated systems elsewhere in the country that were providing services effectively. But she says she wanted to give a voice to vulnerable families, to whom she said these systems looked "really different than they look from the point of view of the data scientists or administrators who were developing them."

"I wasn't hearing these voices at all in the debates that we've been having about what's sort of coming to be known as algorithmic accountability or algorithmic fairness," she says.

Eubanks says policymakers can look to successful models when implementing an automated system. "In Chicago there's a great system called mRelief," she says. "mRelief basically allows you to sort of ping government programs to see if you might be eligible for them. And then the folks who work for mRelief actually help step you through — either in person or through text — the process of getting all the entitlements that you are eligible for and deserve."

Copyright 2023 NPR. To see more, visit https://www.npr.org.

Ari Shapiro has been one of the hosts of All Things Considered, NPR's award-winning afternoon newsmagazine, since 2015. During his first two years on the program, listenership to All Things Considered grew at an unprecedented rate, with more people tuning in during a typical quarter-hour than to any other program on the radio.