The High-Tech Poorhouse

Virginia Eubanks

When algorithms are introduced into public assistance programs, the effects are rarely good for poor and working-class beneficiaries.

A man on the street in Newark, NJ. Tony Fischer / Flickr

Interview by Sam Adler-Bell

We live in what legal scholar Frank Pasquale has called a “scored society.” Corporations and governments collect unprecedented amounts of data about us — our habits, our histories, our beliefs, our desires, our social networks. Machine learning algorithms parse that data to assess our worthiness for public benefits, for jobs, for loans, for insurance, and for suspicion in the criminal justice system.

The rich are not exempt from this reality, but it’s the poor and working class who are most endangered by it. Predictive policing algorithms launder racial bias and reproduce inequality. Reputational scores based on historical data reinforce the lopsided structure of American society, further advantaging the already advantaged and marginalizing the marginalized.

Virginia Eubanks, associate professor of political science at the University at Albany, SUNY, has spent the past several years exploring how automation has played out in the American welfare system. Her new book, Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor, investigates three experiments in which algorithms are replacing or augmenting human decision-making in public assistance: Indiana’s automated Medicaid eligibility process; Los Angeles’s coordinated entry system for the homeless; and Allegheny County, Pennsylvania’s predictive algorithm for assessing children’s risk of abuse and neglect.

These tools have two principal effects. One, they undermine poor people’s right to self-determination — their ability to make the most important and intimate decisions in their lives without government interference. And two, automated systems abstract questions of political morality (is it acceptable that people in the world’s richest country cannot meet their most basic needs?) into questions of efficiency (how can we distribute a limited pool of resources to maximize their impact?). As Eubanks writes, “We manage the poor so that we do not have to eradicate poverty.” 

I recently sat down with Professor Eubanks at a café near the Social Science Research Council in downtown Brooklyn to discuss her work, her new book, and how to build a welfare system that respects human dignity.


Sam Adler-Bell

How did you get started working on poverty and surveillance?

Virginia Eubanks

I came up working in the Bay Area’s community technology center movement of the 1990s [which sought to expand access to computers and the early internet]. I was really excited about the possibilities of that work. But I also struggled to reconcile the utopianism of that moment with the reality of San Francisco in the 1990s. Public housing was being knocked down, the city was visibly whitening, and people were being displaced.

So I left San Francisco to escape a sort of personal, political crisis. I moved to a small town in upstate New York called Troy. And, coincidentally, I moved there just as the city decided to put all of its eggs in the basket of high-tech development.

Sam Adler-Bell

You couldn’t escape it.

Virginia Eubanks

Right. And that was the question that I started asking myself. There’s something here I can’t escape. How am I going to make sense of it?

I started doing community technology work in Troy, working primarily with poor and working-class adults out of a YWCA. A lot of people were coming out of the prison system or out of recovery. And they challenged me deeply in the ways I was thinking about technology.

There was this idea at the time that the major inequality issue in tech was the “digital divide” — you know, low-income people aren’t interacting with technology, so they’re being left behind. But the people I was working with told me, “No. Listen. Technology is totally ubiquitous in our lives. I don’t know what you’re talking about.” They were seeing tons of technological innovation, just not where I was looking for it. They were seeing it in the criminal justice system, in the welfare office, and in low-wage employment.

I had this conversation with a woman who I called “Dorothy Allen” in my first book. We were talking about her EBT [Electronic Benefit Transfer] card. She told me, “Yeah. It’s more convenient. It’s great to not have to carry around paper [food] stamps anymore. But also, my caseworker uses it to track all of my purchases.” And I had this look on my face, a look of total shock. And Dorothy said, “Oh. You didn’t know that, did you?” I did not. She said, “You all” — meaning middle-class people like me — “You all should be paying attention to what happens to us, because they’re coming for you next.”

That conversation, and others like it, is where my interest in technology in the welfare system came from.

Sam Adler-Bell

Part of what I enjoyed about your book is how it lays out a continuous history from the poorhouses of the eighteenth and nineteenth centuries — where the indigent were incarcerated and forced to work — to the high-tech containment of today. Can you trace that a bit?

Virginia Eubanks

I think we tend to talk about these kinds of technological tools as if they arose outside of history. Like they just fell from the sky. One of the major arguments in the book is that these tools were more evolution than revolution. They’re very much in line with the kind of punitive policies, processes, and tools that came before them.

There are two moments I’ll draw your attention to. First is the scientific charity movement of the 1870s, which was deeply informed by eugenics and the desire to “breed” out the moral inadequacies that produce indigence. It kicks off after the fall of Reconstruction and the violent reassertion of white supremacy that took hold then. You see the rise of Jim Crow, the exclusion of African Americans from public life, immigration restrictions based on scientific racism, and the involuntary sterilization of poor whites.

This all comes out of a moment when social service technologies are changing really fast; indiscriminate giving is being replaced by what was called “scientific giving.” The first “big data” set in the United States was collected by the Eugenics Record Office at Cold Spring Harbor, the public arm of the eugenics movement. They sent scientists out into the world to collect these very detailed family trees that tried to track how poverty, “imbecility,” “depraved living” — all these words that they used at the time — were genetically carried.

The second moment is the rise of the “digital poorhouse,” which began in the late sixties, early seventies, really, in response to the successes of social movements in opening up public programs.

Sam Adler-Bell

The welfare rights movement.

Virginia Eubanks

That’s right. In the mid to late sixties, into the early seventies, the national welfare rights movement was having extraordinary successes. Particularly legal successes. Supreme Court victories in 1968, ’69, and ’70 made it impossible to discriminate against folks receiving AFDC [Aid to Families with Dependent Children]; they enshrined due process rights for welfare recipients; and they prohibited “man-in-the-house” rules that allowed caseworkers to raid the homes of poor women to look for evidence of male cohabitants. These victories came toward the end of the sixties, and by 1971 you see these new technological tools implemented. It’s a backlash.

Sam Adler-Bell

And it’s entwined with the backlash against civil rights in general. Suddenly all these black women are receiving benefits, too. And welfare becomes firmly linked with being black.

Virginia Eubanks

There’s this deep, deep, deep connection in the American mind between poverty and being a person of color, specifically being African American. You don’t get a welfare system like the one we have without using race. In each of these systems, race has been deployed as a narrative, as an organizing method.

Meanwhile, technology became a way of smuggling politics into the system without having an actual political conversation. In the 1970s, with the promise of increasing efficiency, you start seeing computerization used to detect fraud and tighten eligibility rules. We think the real anti-welfare backlash started with Reagan. But the drop-off really began in the early seventies. That’s when they started culling the rolls — using these technical means. Around 1973, almost 50 percent of people living under the poverty line were receiving some kind of cash assistance. A decade later, that figure had dropped to 30 percent. Now it’s less than 10 percent.

At the same time, the government was scrutinizing caseworkers more closely, cutting down on their discretion. As surveillance of recipients increased, so did surveillance of frontline workers. They introduced much more punitive processes, with much less human connection. And that has continued in the systems we see today.

Sam Adler-Bell

That’s something that immediately struck me reading the book: how many intimate stories you tell about individuals on both sides of welfare provision. You render them as fully human characters, not just numbers.

Virginia Eubanks

That was a really important part of telling the story. My intention in the work that I do is always to start with the folks who are getting the pointiest end of the stick. It’s harder, in many ways, than talking to the administrators or the policymakers. It requires me to spend a lot of time with people, developing trust and an understanding of their situations. But if we’re missing their voices, we’re missing a huge part of what this new regime of data analytics is about.

Another reason for including those stories is that I really see recipients of public services and frontline caseworkers as possible allies. And in fact, there have been a lot of historical moments where that collaboration has been really threatening to the status quo.

For example, one of the things that happens in the 1960s is the New York City welfare caseworkers strike: eight thousand caseworkers go on strike on behalf of recipients and over their own working conditions. They say, “We’re not going back to work until you treat them better.” That’s a terrifying moment for the system.

Sam Adler-Bell

Do you see prospects for that kind of solidarity today?

Virginia Eubanks

I did welfare rights organizing for fifteen-plus years. One of the great challenges of the work is realizing that caseworkers are, indeed, the ones who deliver the attitude of the system. And recipients often see them that way.

But many are also just one sickness, one period of bad luck away from being on public assistance themselves. And so there are a lot of horizontal lines there. It’s a difficult relationship, but they’re also natural allies in some important ways.

Sam Adler-Bell

Let’s talk about the specific cases. How did you choose the algorithms to study in the book?

Virginia Eubanks

The three cases sort of ramp up in difficulty. People are very comfortable rejecting the first case [Indiana’s automated Medicaid eligibility system]. That’s clearly a black-hat case.

Sam Adler-Bell

Right. It’s fairly transparent that Mitch Daniels was using technology and privatization to punish the poor.

Virginia Eubanks

I can’t say what’s in his soul.

Sam Adler-Bell

I’m willing to say it; you don’t have to.

Virginia Eubanks

Feel free. But people struggle more with Los Angeles and Pittsburgh. And that was intentional. I was trying to step people through this process where it’s like, “Okay, bad intentions, bad outcome. We got it.” But I really didn’t want to leave people there. I thought that was too easy.

The LA chapter is, frankly, about well-intentioned people working hard in a situation involving an almost unbearable set of decisions. There are fifty-eight thousand homeless people, and there is just a handful of resources. And so the question that I’m trying to ask there is about the idea of triage. The algorithm is supposed to identify the most vulnerable, the most needy. But are we comfortable saying that we should triage people’s access to their basic human rights?

One of the questions I’m trying to get at there is, are we using these technologies as empathy overrides? These decisions are too hard for humans to make. So, we’re letting computers make them for us.

Sam Adler-Bell

Right. And it seems like a lot of what these systems do is replace those tough political choices with questions about efficiency.

Virginia Eubanks

We are creating these tools that just split up this pie that’s made of shit. We can create the most sophisticated tools to share out that pie, but it’s not going to change the fact that the pie is shit. I want a different pie. I don’t want a shit pie.

Sam Adler-Bell

What about the Pittsburgh case? That algorithm assesses whether a child is at risk of being abused. That sounds useful.

Virginia Eubanks

These are kids. We don’t want horrible things to happen to them. Caregivers do awful things to children in their care sometimes. And I believe — not everybody agrees with me — that the state has a responsibility to step in to protect children. I think the system we have now is deeply flawed, but I think it’s a reasonable thing for a state to do.

The thing that’s so challenging about that case is, in Allegheny County, they have done every single thing that progressive folks who talk about these algorithms have asked people to do. There is participatory design in the process. They are completely transparent. They publish everything. They’ve been working on racial disproportionality for a long time.

And so this is the case that I kind of think of as the worst best-case scenario. Because at the same time, I still believe the system is one of the most dangerous things I’ve ever seen.

Sam Adler-Bell

Why is that?

Virginia Eubanks

Erin Dalton, the head of data analytics for Allegheny County, told me, “We definitely oversample the poor. All of the data systems we have are biased. We still think this data can be helpful in protecting kids.”

There’s something really deep there: it suggests that we’re okay doing unethical things to poor people. You couldn’t substitute any other group into that sentence and still say, “But we think it’s okay.”

Sam Adler-Bell

Can you describe the stakes of being targeted by child protective services?

Virginia Eubanks

CPS is a really complicated system, ethically, because it mixes dual roles: it’s the provider of last resort for poor families, and it’s the agency that can investigate your parenting and remove your children. The great majority of investigations, 75 percent, are for neglect, not emotional, sexual, or physical abuse. And neglect is very hard to distinguish from just the conditions of poverty.

I think most families experience the services they get with, at the very least, ambivalence. They feel like they are put in situations where they have to give up their children in order to get the resources they need to care for those children. And once you’re in the system, there’s a ratcheting effect. The standard for your parenting is raised really high, and the resources aren’t sufficient to keep you at a place where you can comply with the expectations. It results in absolute tragedy too much of the time.

And even if CPS is functioning the way it should, only removing children who are really in great danger of maltreatment from their families, we’re assuming that we have someplace better for them to go. I think many kids who have been through the foster care system would disagree.

Sam Adler-Bell

So even though the Allegheny algorithm is well intentioned and designed carefully, it still has these pernicious effects.

Virginia Eubanks

That chapter is really saying, “If you do everything right, and you’re still living in a society that is deeply impacted by its hatred of the poor and its fear of precarity, you’re still going to create systems that punish, police, and profile.”

And that’s the most frustrating place that people get to. Because then they’re like, “Well then, what do we do?” And they want me to give them a ten-point plan for creating better technologies. And I’m like, “Well, what we do is, join in social movements, and end poverty everywhere, and forever.” And I’m sorry that that’s a frustrating answer. But it’s the real answer.

Sam Adler-Bell

So how do we start to build, at the very least, a system that takes into account the privacy of poor people?

Virginia Eubanks

Part of my work is shifting the conversation from “privacy” to “self-determination.” Because privacy, frankly, just doesn’t work for the folks whose stories I tell.

Folks want some degree of privacy and resent intrusion, but it’s not the first thing they talk about. It’s not even the fifth thing they talk about. They want to be able to make the most consequential decisions in their life, for themselves: how they raise their kids, how they spend their money, where they live. I find that shifting the language in that way does make some of the pathways to solutions more clear, although the work is much bigger.

I think that the major political change that has to happen is shifting the public assistance system away from one where the majority of time, money, and effort is spent trying to decide whether or not it’s your fault that you’re poor.

In the meantime, while we work all this out, these technological systems aren’t going to stop. And because we live in the system we live in, they’re going to continue to have these assumptions and analyses built into them: that poor folks are fraudulent, lazy, addicted.

It’s a little bit silly, but I have this little Hippocratic oath for data designers in the book. There are two basic questions I encourage them to ask themselves. One: does it increase the self-determination of poor people? And two: if it were aimed at anyone but poor and working people, would you be able to do it? If you answer “no” to either of those, don’t do it. You’re on the wrong side of history.