THE NEW YORK TIMES SUNDAY REVIEW OF BOOKS
“How Big Data is ‘Automating Inequality’”
May 4, 2018, Liza Featherstone
“‘Automating Inequality’ is riveting (an accomplishment for a book on technology and policy). Its argument should be widely circulated, to poor people, social service workers and policymakers, but also throughout the professional classes. Everyone needs to understand that technology is no substitute for justice.”
THE NEW YORK REVIEW OF BOOKS
https://www.nybooks.com/articles/2018/06/07/algorithms-digital-poorhouse/
June 7, 2018, Jacob Weisberg
“Public policy that hinges on understanding the distinctions among outcome variables, prediction variables, training data, and validation data seems certain to become the domain of technocrats. An explanation is not what’s wanted. What’s wanted is for the harm not to have occurred in the first place, and not to continue in the future.”
HARVARD LAW REVIEW, “Digitizing the Carceral State”
https://harvardlawreview.org/2019/04/digitizing-the-carceral-state/
April 10, 2019, Dorothy E. Roberts
“Automating Inequality shines a needed spotlight on government assistance programs the public is more likely to view as benevolent than as punitive. The key aspects Eubanks highlights — big data collection, automated decisionmaking, and predictive analytics — also characterize expanding high-tech approaches to criminal justice.”
NY TID, “Automatiserad profilering ett hot för USA:s fattiga” (“Automated profiling a threat to America’s poor,” in Swedish)
https://www.nytid.fi/2018/12/automatiserad-profilering-ett-hot-for-usas-fattiga/
December 14, 2018, Marcus Floman
“The American researcher and author Virginia Eubanks has made an odyssey through the United States in her book Automating Inequality: How High-Tech Tools Profile, Police and Punish the Poor, in which she describes how the sick, the poor, and the homeless have been shoved aside by a bureaucratic apparatus in which automation has gone too far.” (Translated from the Swedish.)
***
El PAÍS SEMANAL, February 10, 2022, by Manuel G. Pascual
Virginia Eubanks: “Technology offers us an excuse not to confront increasingly critical social problems” (in Spanish)
MARKETPLACE TECH, August 2, 2021
An app to track home health care aides has unintended effects
THE OBJECTIVE, July 17, 2021, by Ariana Basciani
Virginia Eubanks: “The United States is in deep denial about the nature of its poverty” (in Spanish)
EL SALTO, July 11, 2021, by Gessamí Forner
Virginia Eubanks: “In the automation of data, poor and working families are like the canaries in the mine” (in Spanish)
PUBLIC INTEGRITY
Automating Inequality (Review), Issue 21, Number 4 (2019), Hannah Lebovitz
WRITER’S VOICE WITH FRANCESCA RHEANNON
Virginia Eubanks, Automating Inequality
October 2019
JOURNAL OF WORKING CLASS STUDIES
https://workingclassstudiesjournal.files.wordpress.com/2018/12/jwcs-vol-3-issue-2-dec-2018-purser-1-1.pdf
December 2018, Review by Gretchen Purser
PROBONO AUSTRALIA, “Automated Welfare Worsens Inequality”
https://probonoaustralia.com.au/news/2018/10/automated-welfare-worsens-inequality-experts-say/
October 30, 2018, Maggie Coggan
“Rapid automation of welfare services is creating extreme barriers for disadvantaged Australians, in particular people with disability, with an expert panel calling for a rethink of the current processes.”
MASTER OF DATA PODCAST, “Fighting Data-Driven Inequality”
https://www.podbean.com/media/share/pb-5zcp7-9d0ea6
October 29, 2018, Ben Newton
FORBES, “A Rising Crescendo Demands Data Ethics and Responsibility”
https://www.forbes.com/sites/ciocentral/2018/10/29/a-rising-crescendo-demands-data-ethics-and-data-responsibility/#5ae8684b5d5d
October 29, 2018, Randy Bean
PURSUIT, “Why does Artificial Intelligence Discriminate?”
https://pursuit.unimelb.edu.au/articles/why-does-artificial-intelligence-discriminate
October 24, 2018, Jeannie Marie Paterson and Yvette Maker
“Combatting bias and creating more inclusive AI is unlikely to succeed unless developers include those people who have been historically excluded or ignored”
SUPERPOSITION, “Algorithms Can’t Fix Us”
https://sci-techmaven.io/superposition/society/algorithms-can-t-fix-us-KGX7s78ST0uAMMgDZFCYMQ/
October 23, 2018, Niki Diefenbach
“Algorithms are not a societal cure-all; we must first address the underlying social ills.”
PHILOSOPHICAL DISQUISITIONS/ALGOCRACY, “Episode 47”
https://philosophicaldisquisitions.blogspot.com/2018/10/episode-47-eubanks-on-automating.html
October 20, 2018, John Danaher
ON CONTACT WITH CHRIS HEDGES, “Digital Monitoring of the Poor”
https://www.rt.com/shows/on-contact/440570-eubanks-inequality-poor-control/
October 7, 2018, on RT
ISSUES IN SCIENCE AND TECHNOLOGY, “Let Them Eat Efficiency”
https://issues.org/let-them-eat-efficiency/
Fall, 2018, Stephanie Wykstra
SOCIALIST WORKER, “Let Them Eat Data”
https://socialistworker.org/2018/08/02/let-them-eat-data
August 2, 2018, Don Lash
“The impersonal nature of decision-making using high-tech tools presents new challenges in organizing and publicizing. Not only can you not stab an algorithm, you can’t personify it, and you can’t stage a sit-in against a cloud storage facility. Eubanks acknowledges these challenges and the effect of isolation of those subjected to decision-making using high-tech tools, but she ultimately strikes an optimistic note. She emphasizes — and demonstrates — the power of individual stories to overcome indifference.”
EdSURGE, “Why One Professor Says We Are ‘Automating Inequality’”
https://www.edsurge.com/news/2018-07-24-why-one-professor-says-we-are-automating-inequality
July 24, 2018, Jeffrey R. Young
“If we don’t tackle the deep social issues at root of these problems, then we reproduce them through our tools. And we don’t just reproduce them. Now that we’re looking at these densely networked, very fast tools that scale so quickly, we also run the risk of vastly amplifying those problems.”
REAL CHANGE
https://www.realchangenews.org/2018/07/18/book-review-automating-inequality-how-high-tech-tools-profile-police-and-punish-poor
July 18, 2018, Mike Wold
“When politicians and business leaders talk about using technology to streamline service provision to the poor, they conjure up an efficient, values-free process. … But, as Virginia Eubanks describes in “Automating Inequality,” the computerized tools applied to social service provision are designed with the institutional biases endemic in our society, starting with the idea that poverty is the fault of poor people and that a goal of our welfare systems is to make sure that nobody gets aid who doesn’t deserve it, even if that means denying aid to people who do.”
JOTWELL
https://cyber.jotwell.com/the-difference-engine-perpetuating-poverty-through-algorithms/
July 18, 2018, Rebecca Tushnet
“Ultimately, Eubanks argues, the problem is that we’re in denial about poverty. … We don’t keep our tormented child in an isolated place, as they do in Omelas. Instead of walking away, we walk by—but we don’t meet each other’s eyes as we do so. This denial is expensive in so many ways—morally, monetarily, and even physically, as we build entire highways, suburbs, private schools, and prisons so that richer people don’t have to share in the lives of poorer people. It rots politics: “people who cannot meet each others’ eyes will find it very difficult to collectively govern.” Eubanks asks us to admit that, as Dan Kahan and his colleagues have repeatedly demonstrated in work on cultural cognition, our ideological problems won’t be solved with data, no matter how well formed the algorithm.”
TRIANGULATION
https://twit.tv/shows/triangulation/episodes/354
July 6, 2018, TWiT TV with Megan Morrone
LSE REVIEW OF BOOKS
Book Review: Automating Inequality
July 2, 2018, Louise Russell-Prywata
SOCIETY FOR INDUSTRIAL AND APPLIED MATHEMATICS NEWS
“The Impact of Computerized Social Services on the Underprivileged”
May 1, 2018, Ernest Davis
“How does the computerization of governmental social services impact the poor? In Automating Inequality, Virginia Eubanks delivers a harsh verdict: throughout American history, government policy towards the poor has often amounted to criminalizing poverty; computer technology makes these policies more inescapable, more implacable, and more brutal. Eubanks’ book is deeply researched, well-written, passionate, and extremely troubling.”
FUTUR.E.S. (France)
“Les algorithmes, ces ennemis?” (“Algorithms, These Enemies?”; in French and English)
“People who are very clever when it comes to data are often very naïve about politics.”
WNUR CHICAGO, THIS IS HELL
“How Tech Targets the Poor”
May 12, 2018, 11:05 AM (Central time), WNUR 89.3 FM
WNYU, WEEKLY REFRESH
“Automating Inequality”
April 25, 2018
ÖSTERREICH 1 (AUSTRIA 1), MATRIX
“Datenmissbrauch gefährdet die Demokratie” (“Data Misuse Endangers Democracy,” in German)
April 13, 2018
Data and algorithms are used to monitor, control and punish the poor. A Digital Poorhouse is emerging. This Austrian radio program, produced by Lukas Plank, explores why.
BBC, WHY FACTOR
“Machines and Morals”
April 1, 2018
Machines are merging into our lives in ever more intimate ways. They interact with our children and assist with medical decisions. Cars are learning to drive themselves, data on our likes and dislikes roam through the internet. … In this episode of Why Factor, Sandra Kanthal asks if now is the moment we need to think about machines and morals?
SLATE, “How Algorithms Can Punish the Poor”
Amanda Lenhart, March 29, 2018
“These tools are not disrupters so much as they’re amplifiers. It shouldn’t surprise us when a tool grows out of our existing public assistance system to be primarily punitive. Diversion, moral diagnosis, and punishment are often key goals of our public-service programs. But if you start with a different values orientation—if you start from an orientation that says everyone should get all of the resources they’re eligible for, with a minimum of disruption, and without losing their rights—then you can get a different tool.”
BOOKS, BEATS, AND BEYOND w/ Taj Salaam
“Automating Inequality”
March 25, 2018
FAST COMPANY
“Algorithms are Creating A ‘Digital Poorhouse’ That Makes Inequality Worse”
March 1, 2018, Adele Peters
NPR, ALL THINGS CONSIDERED
“‘Automating Inequality’: Algorithms in Public Services Often Fail the Most Vulnerable”
Feb 19, 2018
“We’re actually using these systems to avoid some of the most pressing moral and political challenges of our time — specifically poverty and racism.”
THE NEW INQUIRY, “Privacy for Whom?”
Sam Adler-Bell, February 21, 2018
“While many liberal privacy advocates warn that a dystopian society is around the corner…new scholars argue that a huge portion of the American public already lives in a privacy-free rights environment. … We would do well to abandon the popular understanding of the privacy-rights bearer as an affluent white man—unburdened by history, by power, by coercion. He bears almost no resemblance to those who endure the worst consequences of surveillance.”
THE ATLANTIC, “When Welfare Decisions are Left to Algorithms”
Tanvi Misra, February 15, 2018
“One of the things I most fear about these systems is they allow us the emotional distance that’s necessary to make what are inhuman decisions… [they] act as empathy overrides—we are allowing machines to make decisions that are too difficult for us to make as human beings.”
FINANCIAL TIMES, “When Algorithms Reinforce Inequality”
Gillian Tett, February 9, 2018
“Computing has long been perceived to be a culture-free zone — this needs to change. But change will only occur when policymakers and voters understand the true scale of the problem. This is hard when we live in an era that likes to celebrate digitisation — and where the elites are usually shielded from the consequences of those algorithms. Except, of course, when random accidents occur. In that sense, Eubanks’ tale is a chilling lesson to us all.”
PACIFIC STANDARD, “How America Uses Digital Tools to Punish its Poor”
Peter C. Baker, February 7, 2018
“In her new book, Automating Inequality, the scholar and activist Virginia Eubanks insists that the poorhouse is very much still with us, and very much still working to isolate, surveil, judge, and punish the poor. But the poorhouse of 2018 isn’t an actual building; instead, it’s an “invisible spider web” of computer networks used by the American state to mediate our relationship to poverty. Eubanks calls this web the ‘digital poorhouse,’ and she wants us to dismantle it.”
CITYLAB, “The Rise of ‘Digital Poorhouses’”
Tanvi Misra, February 6, 2018
“The decision that we don’t have enough resources to help everyone and we have to triage? That we have to ration care? That is a political decision.”
VOX, “How Big Data is Helping States Kick Poor People Off Welfare”
Sean Illing, February 6, 2018
“Ultimately, these systems make our values visible in a way that calls us to a moral reckoning…the solution for us as a nation is to get our souls right about poverty. Until we do that, we will continue to produce systems that profile and punish poor and working families.”
KONKRET MEDIA, “Policing Poverty Through Automation”
Aleta Sprague, January 31, 2018
“With big data and automated decision-making playing an ever growing role in the administration of public assistance, the potential for surveillance has never been greater. As [UN Special Rapporteur on extreme poverty and human rights Philip] Alston observed, ‘despite the good intentions of officials in Los Angeles, there is an Orwellian side to CES [the Coordinated Entry System].’ Its requirements that unhoused individuals divulge ‘the most intimate details of their lives’ in exchange for a slim shot at permanent housing leaves many feeling like they are sacrificing one human right for another.”
BOING BOING, “Automating Inequality: Using algorithms to create a modern ‘digital poor-house’”
Cory Doctorow, January 31, 2018
“A recurring theme in Eubanks’s work is the power of algorithms to diffuse responsibility for human suffering: using math to decide who the ‘deserving’ poor are makes it easier to turn away from everyone else whom the system has deemed undeserving. … By using algorithms to ‘triage’ the neediness of poor people, system designers can ensure that the people harmed by the system are the least sympathetic and least likely to provoke outrage among those with political clout.”
ABC AUSTRALIA, “The Digital Poorhouse: Coders need a Hippocratic Oath to protect disadvantaged people”
Ariel Bogle, January 29, 2018
“Dr. Eubanks offered a so-called ‘Hippocratic oath’ (for doctors, the rule to ‘do no harm’) for data scientists. ‘Fundamentally, it boils down to two questions that designers should ask themselves about their systems,’ she explained. ‘One is, does it increase the self-determination of poor and working people? And the second is, if the system was aimed at anyone but poor and working people, would it be tolerated?'”
JACOBIN MAGAZINE, “The High-Tech Poorhouse”
Sam Adler-Bell, January 29, 2018
“Part of my work is shifting the conversation from ‘privacy’ to ‘self-determination.’ Because privacy, frankly, just doesn’t work for the folks whose stories I tell. Folks want some degree of privacy and resent intrusion, but it’s not the first thing they talk about. It’s not even the fifth thing they talk about. They want to be able to make the most consequential decisions in their life, for themselves: how they raise their kids, how they spend their money, where they live. I find that shifting the language in that way does make some of the pathways to solutions more clear, although the work is much bigger.”
THE BRIAN LEHRER SHOW, “How Data in Public Policy Can Foster Bias and Inequality”
January 29, 2018
“Often, we talk about these systems as entirely neutral. But these systems are actually moving discretion from the frontline [caseworkers] to the economists, data analysts, and computer programmers who are building these systems.”
MIT TECHNOLOGY REVIEW, “Algorithms are Making American Inequality Worse”
Jackie Snow, January 26, 2018
“The punitive and moralistic view of poverty that built the poorhouses never left us, and has been wrapped into today’s automated and predictive decision-making tools. These algorithms can make it harder for people to get services while forcing them to deal with an invasive process of personal data collection.”
NY DAILY NEWS, “Automating Inequality warns of a dystopian future punishing the poor — in the present.”
Michael Nam, January 23, 2018
“Eubanks ably demonstrates why everyone should be very, very worried about the present and future of poverty management. Along with the personalized stories, her data exposes the political will, the ease of establishment and the ripe soil for letting cold math take our deepest biases and, in effect, render them into invisible cages for the most vulnerable.”
GIZMODO, “How Algorithmic Experiments Harm People Living in Poverty”
Interview with Sidney Fussell, January 23, 2018
“These tools have come [as] a response to the politics of scarcity. They offer us a moment—because they make inequalities so visible—to really attack the roots of the problem.”
THE NEW REPUBLIC, “The Injustice of Algorithms”
Ethan Chiel, Jan 23, 2018
“To call the stories and data Eubanks has collected infuriating feels like an understatement.”
Virginia Eubanks appears on THE OPEN MIND with Alexander Heffner
January 16, 2018
“The designers of these systems assume…that discrimination enters with caseworker decision-making. But data scientists, engineers & administrators build their own bias into these systems.”
INTERNETACTU.net, “De l’Automatisation des Inégalités” (“On the Automation of Inequality,” in French)
Hubert Guillaud, January 15, 2018
“The engineers who build these tools draw our attention to the biases that affect their systems. Indirectly, they shift their responsibility onto society, without seeing that the racism and class behavior of elites are ‘mathwashed,’ that is, neutralized by technological mystification and the magic of databases. The new high-tech tools are seen as simple administrative updates, without political consequences. They are presented merely as harmless solutions for improving the efficiency of systems, when the transformations they propose are far more radical.” (Translated from the French.)
POINTS, “Beyond the Rhetoric of Algorithmic Solutionism”
danah boyd, Jan 11, 2018
“Automating Inequality is on par with Barbara Ehrenreich’s “Nickel and Dimed” or Matthew Desmond’s “Evicted.” It’s rigorously researched, phenomenally accessible, and utterly humbling. While there are a lot of important books that touch on the costs and consequences of technology through case studies and well-reasoned logic, this book is the first one that I’ve read that really pulls you into the world of algorithmic decision-making and inequality, like a good ethnography should.”
THE PROGRESSIVE, “Keeping the Poor Poor: How Government Automates Inequality”
Jake Whitney, Jan 10, 2018
“Automating Inequality…is loaded with horror stories…alarming evidence that Americans continue to treat poor people as second-class citizens.”
BOOKLIST (Starred Review)
Emily Dziuban, Dec 15, 2017
“Recognizing the direct link between the poorhouses of U.S. history and contemporary automated systems that re-create poorhouse conditions…Eubanks argues that automated systems separate people from resources, classify and criminalize people, and invade privacy—and that these problems will affect everyone eventually, not just the poor.”
KIRKUS REVIEWS, October 17, 2017
“Algorithms, predictive models, regression analyses: all are tools for criminalizing the poor and immiserating the middle class. Equal parts advocacy and analysis—a welcome addition to the growing literature around the politics of welfare.”
SAN FRANCISCO REVIEW OF BOOKS
David Wineberg, December 7, 2017
“Target, track, punish. Repeat. … In Automating Inequality, Virginia Eubanks says we manage the poor so we don’t have to eradicate poverty. Instead, we have developed a Digital Poorhouse – high tech containment of the poor and recording of their every action, association and activity.”