virginia eubanks
adding page while awaiting her book (automating ineq) at library
________
first recommended via Ethan (can’t find that tweet – requested library purchase)..
then danah, ..
danah boyd (@zephoria) tweeted at 9:03 AM – 11 Jan 2018 :
Introducing algo systems into govt decision-making isn’t inherently evil; in practice these tools often magnify inequities while seeking to redress them. Yesterday, I asked you to buy “Automating Inequality” (https://t.co/rhil9Flxa7) Here’s more on why:
https://t.co/PvELrgF19V https://t.co/W0au6l4CDv (http://twitter.com/zephoria/status/951484845670182913?s=17)
I don’t know how Eubanks chose her title, but one of the subtle things about her choice is that she’s (unintentionally?) offering a fantastic backronym for AI. Rather than thinking of AI as “artificial intelligence,” Eubanks effectively builds the case for how we should think that AI often means “automating inequality” in practice.
then others..
Clive Thompson (@pomeranian99) tweeted at 7:55 AM – 15 Jan 2018 :
Terrific excerpt from “Automating Inequality” by @PopTechWorks: Here, she investigates how models to predict child abuse can be led astray by slender training data: https://t.co/lwZQ1iiTo9 (http://twitter.com/pomeranian99/status/952917217280131073?s=17)
_______
At 4PM ET, @alondra & @JuliaAngwin will be leading a discussion with @PopTechWorks at @datasociety on her new book: “Automating Inequality.” The livestream is here: https://t.co/hIRv3oZY65 https://t.co/ciCJ72FMk0
Original Tweet: https://twitter.com/zephoria/status/953664223577563136
Virginia Eubanks is an Associate Professor of Political Science at the University at Albany, SUNY. In addition to her latest book, Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor, she is the author of Digital Dead End: Fighting for Social Justice in the Information Age and co-editor, with Alethia Jones, of Ain’t Gonna Let Nobody Turn Me Around: Forty Years of Movement Building with Barbara Smith. For two decades, Eubanks has worked in community technology and economic justice movements. Today, she is a founding member of the Our Data Bodies Project and a Fellow at New America.
notes/quotes from livestream: @PopTechWorks @alondra & @JuliaAngwin
‘Virginia has been doing work at intersection of inequality and tech’ @zephoria intro
i’m calling it the new regime of data analytics.. thinking deeply about issues of data bias.. whatever
1\ history of tools 2\ starting from pov of those targeted by injustices
quick history lesson.. 1819 to today.. romp thru poverty policy in the us.. one moment in this history – rise of the digital poorhouse.. tools arrive in late 60s/early 70s.. just as the national welfare rights movement had pushed for equal access to public resources by overturning discriminatory laws.. that expanded access gets ‘solved’ w a new set of digi techs.. see an almost immediate drop in people’s ability to access resources they should have access to.. at the moment the tools are integrated.. (before talks.. politics)
i tell 3 stories.. 1\ indiana.. 2\ electronic registry in la.. 3\ predictive model in pittsburgh..
1\ indiana.. $1.16 bn contract in 2006.. goal to automate all eligibility processes.. online forms.. replaced case-based work (a local case worker) w task-based work (people online responding to whatever task came up next).. pretty effective at breaking the relationship between caseworkers and people seeking assistance.. about 1 mn apps denied in the first 3 years.. went so badly that indiana broke the contract w ibm.. then ibm sued and won.. story of omega young, who had cancer.. her case shows how people mostly lost benefits: ‘failure to cooperate’.. usually a missed appt or a form not filled out.. she died.. the day after her death she won her appeal and all her benefits were restored..
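(a toy sketch of that denial logic.. the task names and rules below are invented for illustration.. this is not the actual ibm/acs system.. just the shape of ‘any missed task = failure to cooperate’):

```python
# toy sketch of task-based denial: no assigned caseworker, so any missed
# task -- a missed phone appointment, one unfiled form -- collapses into
# the same denial code: "failure to cooperate"
# (task names/rules invented for illustration; not the real ibm/acs system)
from dataclasses import dataclass, field

@dataclass
class Application:
    applicant: str
    required: tuple = ("online form", "income verification", "phone appt")
    completed: set = field(default_factory=set)

def process(app: Application) -> str:
    missing = [t for t in app.required if t not in app.completed]
    if missing:
        # the system records no reason (illness, no internet, lost mail) --
        # every gap becomes the same code
        return f"DENIED: failure to cooperate (missing: {', '.join(missing)})"
    return "APPROVED"

app = Application("applicant 1", completed={"online form", "income verification"})
print(process(app))  # one missed phone appt denies the whole application
```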
3\ pittsburgh.. allegheny family screening tool.. in 1999 human services commissioned a large data warehouse.. in 2012 the office released an rfp asking folks to propose projects to mine the data.. contract went to an international team based in auckland.. basically a statistical regression.. not an ai.. based on 131 factors.. basically how it works: when a call comes in.. (intake workers) interview the caller.. then make a decision on 1\ risk 2\ how safe they feel the child is.. then get a risk score from the model.. then make a decision whether or not to screen the case in.. if the score is high enough.. automatic trigger.. story of angel shepherd and patrick.. a long list of calls about bad parenting.. each time the case was closed.. angel: you feel like a prisoner.. trapped.. that they could take my daughter at any time.. creates challenges for solutions.. ie: uses inappropriate proxies.. a limited data set.. you only end up in the data if you ask for help.. most challenging.. what i call the best/worst case scenario: 1\ they’ve done everything we’ve asked them to do.. caring people 2\ the design was participatory 3\ transparent 4\ publicly owned.. so..
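(and a toy sketch of that threshold-triggered scoring.. a regression mapped onto a 1–20 scale w a mandatory screen-in above a cutoff.. the features, weights, scale and cutoff below are all invented.. not the actual 131 factors of the allegheny tool):

```python
# toy sketch of threshold-triggered risk scoring: a statistical regression,
# not an ai. features/weights/cutoff invented for illustration -- not the
# actual allegheny family screening tool or its 131 factors
import math

WEIGHTS = {
    "prior_calls": 0.8,          # prior referrals to the hotline
    "public_benefits_use": 0.5,  # only in the data if you asked for help
}
BIAS = -2.0
AUTO_SCREEN_IN = 18  # hypothetical mandatory-trigger cutoff on a 1-20 scale

def risk_score(case: dict) -> int:
    """regression output squashed onto a 1-20 scale"""
    z = BIAS + sum(w * case.get(k, 0) for k, w in WEIGHTS.items())
    p = 1 / (1 + math.exp(-z))
    return max(1, min(20, round(p * 20)))

def screen(case: dict, worker_says_safe: bool) -> str:
    score = risk_score(case)
    if score >= AUTO_SCREEN_IN:
        return f"screened in (score {score}: automatic trigger)"
    # below the cutoff the intake worker's judgment still decides
    return (f"screened out (score {score})" if worker_says_safe
            else f"screened in (score {score})")

# a family that asked for public help carries more data, hence more "risk"
print(screen({"prior_calls": 4, "public_benefits_use": 3}, worker_says_safe=True))
```

note how the family that used public services scores past the trigger purely because they show up in the data.. the ‘only data if you ask for help’ problem in miniature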
how do we respond when designers do everything we’ve asked them to do and we still get ineq.. why are we still building systems that produce ineq..
convo of three ladies
v: if we could lower barriers by integrating systems for people to access services.. that would be huge.. that said.. it is sometimes necessary to stay less visible in these systems to survive/thrive.. so the problem w integration.. really easy to track you.. that hypervisibility creates great harm.. a cycle that can criminalize people very quickly.. and very hard to get out of.. the same evidence that enables specific interventions can also be used to persecute..
j: the deserving war.. so much of this tech is about.. who is deserving and who is not.. the collective harm is actually the real harm
v: @Yascha_Mounk.. we spend all our time/money/energy.. trying to decide if you’re poor.. rather than unleashing.. you
a: on people who know their behaviors are being accounted for in a numerical system
v: on the amazing stories of our movements ie: the panthers.. that take on survival problems.. by.. talking to people closest to the problems..
v: moving people from paranoia to power.. how do you do that political work.. the goal is not (to tell people).. don’t try to ie: access food stamps.. so we talk to people a lot about self-defense strategies.. people have brilliant strategies already being deployed.. they don’t need experts.. they have a good instinct for what’s happening..
q&a
q: trying to imagine an alt.. to ie: pittsburgh story.. why is algo worse.. than people
v: child protective services.. 75% of cases involve neglect, not abuse.. they also tend to combine 2 roles that would probably be better separated.. ie: (support services) get parents calling services on themselves.. another ie: services say to the landlord.. this is neglect.. needs fixing.. and then the parents are in the system.. so i think the primary solution is to separate those two roles.. is the algo worse?.. (the pro-algo view) sees human decision making as a black box we can’t understand.. and sees the computer as open and transparent.. when it’s really just moved decision making to the engineers.. so.. who has bias.. i do think the algo is worse because it hides bias and promotes bias we don’t believe is true
a: hurts labor force too
j: the diff is.. a judge could be biased.. but systematizing bias makes it concrete
today we have the means to have no one play judge.. let’s do that
(tech) as it could be: 2 convos .. as the day
ie: hlb via 2 convos that io dance.. as the day..[aka: not part\ial.. for (blank)’s sake…].. a nother way
q: can any of us live under that surveillance.. i think i could live in a world where we all knew everything about everyone.. but it’s not symmetric like that..
v: if we build a world w contempt/hatred for poor people .. that’s going to affect all of us
j: i went to germany and looked at the stasi archives.. basically you see.. there’s something on everyone.. they use that to flip you.. so incredibly corrosive to society.. that was the way they used power.. to turn everyone on everyone else..
q: who’s lined up to do the work to get us to the next point.. who is showing up.. how do we work around that..
v: my next project.. a dual bio.. of the american poor people’s movement.. folks are thinking about it.. but haven’t done the coalition work to see that we have a shared struggle.. that work is starting to happen.. i hope my book will push that.. ie: blm, police brutality.. think about policing in a slightly diff way.. not just by law enforcement.. but in govt agencies.. ‘the state doesn’t need a cop to kill you’.. we have to start seeing state violence in the places that attempt to serve you..
a: allyship gets very complicated.. how to redefine that
v: (the women i interviewed told me).. guess what v.. everything you’re asking us has nothing to do w our lives.. so i got better questions.. ie: everybody tells us we have no access to tech.. but access doesn’t mean we have power.. so the question is not.. do you have access.. but do you have power.. easy for this to be abstract.. but impossible when you’re looking into ie: a prison
_________
Clint Smith (@ClintSmithIII) tweeted at 6:08 PM – 17 Jan 2018 :
I don’t think people fully appreciate how the public policy in this country actively makes the lives of poor people more difficult. (http://twitter.com/ClintSmithIII/status/953796298112688128?s=17)
___________
CityLab (@CityLab) tweeted at 2:49 PM – 6 Feb 2018 :
When criminalizing the poor goes high-tech https://t.co/fTKYlF1bAE https://t.co/z1EtmuBvCM (http://twitter.com/CityLab/status/960993872112439298?s=17)
That distinction between the impotent and the able poor, which today we would talk about as “deserving” and “undeserving” poor, created a public assistance system that was more of a moral thermometer than a floor that was under everybody protecting their basic human rights.
I think of that as the deep social programming of all of the administrative public assistance systems that serve poor working-class communities. That social programming often shows up in invisible assumptions that drive the kind of automation of inequality that I talk about in the book.
it seems to me that you are stretching the boundaries of informed consent if access to a basic human need like housing is in any way contingent on you filling out this form.
It’s really important to understand that in the United States, 54 percent of us will be in poverty at some point in our lives between the ages of 25 and 60. Two-thirds of us will access a means-tested public program, which is just straight-up welfare. So, the reality is these systems are already a majority concern. [A study from 2015 shows that four out of five Americans will likely face “economic insecurity” at some point in their lives, which includes using a means-tested welfare program, experiencing poverty, or being unemployed.]
You get a very different system if you design from a principle that says, “public service systems should be a floor underneath us all rather than a moral thermometer.”
we often talk about these systems as disruptors or as equalizers, but at least in the cases that I research they really act more like intensifiers or amplifiers of the system we already have.
What I want to point out is that the decision that we don’t have enough resources to help everyone and we have to triage? That we have to ration care? That is a political decision..t
One of the things I most fear about these systems is they allow us the emotional distance that’s necessary to make what are inhuman decisions.
we are allowing these machines to make decisions that are too difficult for us to make as human beings. That’s something that we really need to pay attention to because in the long run that means that we’re giving up on the shared goal of caring for each other. I don’t think that’s who we are as a society
_______
on laura flanders (nathan schneider too)
[https://www.youtube.com/watch?v=sL0m68BL32Y]
19 min – from their pov these systems often look very diff..t
23 min – a form of math washing.. that we don’t understand.. 1st step: we don’t buy that story
24 min – i really think it’s about listening..t
tech as it could be – we could be listening to 7bn voices.. every day
________
“The rise of the care bots risks creating a system where we only value the parts of care that can be turned into data.” Thrilled to join @cariatidaa as part of this series about #AutomatingCare for @guardian. More to come! https://t.co/uNbPBYnyUt
Original Tweet: https://twitter.com/PopTechWorks/status/1400422772556304385
The most essential aspects of caring for one another – presence, compassion, connection – are not always easy, or even possible, to measure.
so let’s stop measuring things and use tech to undo our hierarchical listening
imagine if we tried a nother way
Here’s a description of the whole series – more pieces coming in June and July. https://t.co/NGymOSwynH
Original Tweet: https://twitter.com/PopTechWorks/status/1400436050342957062
You need to register to keep reading. It’s still free to read – this is not a paywall
still a paywall
We’re committed to keeping our quality reporting open. By registering and providing us with insight into your preferences, you’re helping us to engage with you more deeply, and that allows us to keep our journalism free for all. You’ll always be able to control your own privacy settings.. (options): register for free.. i’ll do it later
if you hit ‘i’ll do it later’.. it goes to the article.. with this:
you’ve read ‘number of’ articles this year.. we have a small favor to ask.. just give a dollar..
1\ i get that today people (think they) need money.. but that’s a huge part of the problem.. we need to try/code money (any form of measuring/accounting) as the planned obsolescence w/ubi as temp placebo.. (legit) needs met w/o money.. till people forget about measuring 2\ many people can’t pay a dollar.. can’t read english.. whatever.. what we need is a means to undo this hierarchical listening.. so that 8b people grok their legit needs.. so that 8b people grok enough ness
________
find/follow Virginia:
Writing about technology and social justice.
troy, ny
________