the social dilemma

Tech experts sound the alarm on the dangerous human impact of social networking.

We tweet, we like, and we share— but what are the consequences of our growing dependence on social media? As digital platforms increasingly become a lifeline to stay connected, Silicon Valley insiders reveal how social media is reprogramming civilization by exposing what’s hiding on the other side of your screen.

The world has long recognized the positive applications of social media, from its role in empowering protesters to speak out against oppression during the Arab Spring uprisings almost a decade ago, to serving an instrumental role in fighting for equity and justice today. And in 2020, during an astonishing global pandemic, social media has become our lifeline to stay in touch with loved ones, as well as proving to be an asset for mobilizing civil rights protests. However, the system that connects us also invisibly controls us. The collective lack of understanding about how these platforms actually operate has led to hidden and often harmful consequences to society—consequences that are becoming more and more evident over time, and consequences that, the subjects in The Social Dilemma suggest, are an existential threat to humanity.

The Social Dilemma is a powerful exploration of the disproportionate impact that a relatively small number of engineers in Silicon Valley have over the way we think, act, and live our lives. The film deftly tackles an underlying cause of our viral conspiracy theories, teenage mental health issues, rampant misinformation and political polarization, and makes these issues visceral, understandable, and urgent. Through a unique combination of documentary investigation and entertaining narrative drama, award-winning filmmakers Jeff Orlowski (Chasing Ice, Chasing Coral) and Larissa Rhodes (Chasing Coral) have once again exposed the invisible in a manner that is both enlightening and harrowing as they disrupt the disrupters by unveiling the hidden machinations behind everyone’s favorite social media and search platforms.

The film features compelling interviews with high-profile tech whistleblowers and innovation leaders including Tristan Harris of the Center for Humane Technology; the co-inventor of the Facebook “Like” button, Justin Rosenstein; Tim Kendall, former President of Pinterest and former Director of Monetization at Facebook; Cathy O’Neil, author of Weapons of Math Destruction; Rashida Richardson, Director of Policy at the AI Now Institute; and many others. Demonstrating how social media affects consumers on a personal level, these fascinating insider insights are seamlessly woven into a captivating narrative drama featuring Vincent Kartheiser (Mad Men) that illuminates the very real consequences these seemingly innocent technologies can have on our everyday lives.

tristan harris.. justin rosenstein.. cathy o’neil.. tristan and aza on ai..



6 min – tristan: is there something beneath all these problems.. there’s a problem happening in the tech industry.. and it doesn’t have a name.. and it has to do w one source..

7 min – tristan: looks like world is going crazy.. is this normal..? this shouldn’t be this thing that only tech industry knows.. should be something everybody knows

8 min – (tristan called ethical conscience of tech) – tristan: struggled w/ how to change it.. decided to make a presentation.. as a call to arms ie: we have a moral obligation as google to solve this problem.. 50 (20-35 yr old white) guys at google.. 2b people will have thoughts because of designers at google..

center for humane tech

10 min – created a cultural moment (response from workers was positive).. that google needed to take seriously.. and then.. nothing..

tim kendall (fb, pinterest, moment): i was the director of monetization at fb..

12 min – jaron lanier on the view – about book – 10 arguments for deleting sm accounts right now..

jaron: what are they (workers at google, fb, et al) getting paid for.. that’s a really good question

jaron lanier

roger mcnamee (early fb investor): last 10 yrs – selling their users

roger mcnamee

center for humane tech

aza raskin (firefox, mozilla, co founder humane tech): advertisers (who pay for products we use) are the customers.. we’re the thing being sold

center for humane tech

13 min – tristan: the classic saying is.. if you’re not paying for the product.. then you are the product.. people don’t realize all these sm co’s are competing for your attention

tim: how much time will you spend.. how much of your life are you willing to give to us

justin rosenstein (fb, google, asana): when think about how these co’s work.. it starts to make sense.. all these services on internet that we think of as free.. but they’re not free.. paid for by advertisers.. our attention is the product being sold to advertisers

justin rosenstein

14 min – jaron: that’s a little too simplistic.. it’s the gradual, slight, imperceptible change in your own behavior and perception that is the product.. that’s the only thing there is for them to make money from.. changing what you do.. how you think.. who you are.. it’s a gradual/slight change.. but it’s the world.. and so even 10% changing is a lot of money

15 min – shoshana zuboff (harvard) @shoshanazuboff: this is what every business has always dreamt of .. to have a guarantee that if it places an ad.. it will be successful.. that’s their business.. they sell certainty.. have to have great predictions.. great predictions begin w one imperative.. you need a lot of data

shoshana zuboff


no matter if it’s non legit data (ie: from whales in sea world)

tristan: many people call this surveillance capitalism.. capitalism profiting off infinite tracking of where everyone goes by large tech co’s whose business model is to make sure that advertisers are as successful as possible

age of surveillance capitalism

16 min – shoshana: this is a new kind of marketplace now.. one that never existed before.. it trades exclusively in human futures.. at scale.. and those markets have made the trillions of dollars that have made the internet co’s the richest co’s in the history of humanity

jeff seibert (former twitter exec): what i want people to know is that every single action they take online is being monitored.. for how long you look at it et al.. know when people are lonely.. depressed..

17 min – shoshana: they have more info about us than has ever been imagined in human history..t

yeah.. but not info about us .. about whales in sea world.. so can only harm us if we remain in sea world.. and the ‘they’ wants out too..

sandy parakilas (fb and uber): so all this data we’re just pouring out all the time.. fed into systems that have almost no human supervision.. making all these predictions about what we’re going to do and who we are

that’s been going on forever.. ie: black science of people/whales law

if we could figure out a humane job for tech.. one that would be good that it had no human supervision (aka: judge\ment).. that could help us get back to us (back out of sea world)

perhaps we can have tech w/o judgment ie: tech as it could be

18 min – aza: a lot of people have this misconception that it’s our data being sold.. it’s not in fb’s interest to give up data.. what do they do w the data.. they build models.. that predict our actions.. and whoever has the best model.. wins

like.. supposed to’s.. of school/work.. building robots.. and whoever has the most obedient robots.. wins..

platforms/tech/sm.. are all just distractions (or clearer picture of symptoms) of the deeper problem that has been w us for ever.. that’s why in the very beginning of doc.. none of the people could answer your question.. what’s the problem.. none of us are seeing the center of problem law

ie: missing pieces

18 min – tristan: all the things we’ve ever done.. all the likes.. all that data gets brought back to building a *more and more accurate model.. once we have it you can **predict the kinds of things that person does..

*of whales.. but further and further away from us/fittingness.. just like what work/school (any form of people telling people what to do) does.. has been doing.. forever – since beginning of human history

**red flag data isn’t legit.. live humans are not predict\able.. just robot ones.. just observed ones..

19 min – tristan: at a lot of these co’s 3 main goals: 1\ engagement.. to keep you scrolling 2\ growth.. to keep you/friends coming back 3\ advertising.. making as much money as possible..

tristan: each of these goals are powered by algo’s whose job is to figure out what keeps those numbers going up

tim: we often talked about this at fb.. being able to dial that as needed.. we talked about mark having those dials.. ie: want more users in korea today.. turn the dial.. dial ads/monetization.. so in all of these co’s there is that level of precision

21 min – ie of scenario

i don’t think tech (anyone) is that focused on all of us .. most of us don’t matter enough for that precision.. would waste too much energy.. just like work/school.. as long as we stay w/in boundaries (ie: at school/work for most of your day – and even deeper – not yourself).. there’s no need for that much precision

jaron: we created a world in which online connection has become primary.. esp for younger generations.. and yet.. in that world.. anytime two people connect.. the only way it’s financed is thru a sneaky 3rd person.. who’s paying to manipulate those two people.. so we’ve created an entire global generation of people who were raised w/in a context where the very meaning of communication/culture is manipulation.. we’ve put deceit/sneakiness at the absolute center of everything we do

since the beginning of time man.. it’s all of us.. maybe just more visible/ridiculous/blatant now.. but it’s all of us .. both.. 1\ not being us.. and 2\ telling people what to do (aka: to not be themselves)

what we need is a means to undo our hierarchical listening to self/others/nature

22 min – (quote on good tech being like magic) tristan: as a kid i could fool adults w phds w magic

we’ve fooled all of us into thinking phd’s make us more us

23 min – tristan: magicians were almost like the first neuroscientists/psychologists.. they were the ones who first understood how people’s minds work.. they just in real time are testing lots and lots of stuff on people

not necessarily how people’s minds work.. they are just testing responses.. not what our minds are capable of ..

tristan: the magician understands some part of your mind.. that we’re not aware of.. that’s what makes the illusion work.. dr’s, lawyers, people who know how to build 747s or nuclear missiles.. they don’t know more about how their own mind is vulnerable.. that’s a separate discipline.. and it’s a discipline that applies to all human beings.. from that perspective.. you can have a very diff understanding of what tech is doing

what anything is doing that is telling you what to do

tristan: when i was at the stanford persuasive tech lab.. this is what you learned.. how could you use *everything we know about the psychology of what persuades people and build that into tech

actually.. everything we know about the psych of what persuades whales in sea world..

24 min – sandy: there are many sv figures that went thru that class and learned how to make tech be more persuasive.. tristan being one

tristan: we want to modify someone’s behavior

joe toscano (google.. author: automating humanity): pull down and refresh.. gonna be a new thing at the top.. which in psychology we call a positive intermittent reinforcement

tristan: so you don’t know when/if you’re going to get something.. which operates just like the slot machines in vegas.. it’s not enough that you use the tech.. i want to dig deeper down into the brain stem and implant.. inside of you.. an unconscious habit.. so that you are being programmed at a deeper level.. you don’t even realize it

so bizarre.. robots (whales) explaining how the world has worked since beginning of time as if it started with the tech we have today

tristan: every time you look over (at phone) and reach for it because it may have something for you.. so you play that slot machine to see what you got.. that’s not by accident.. that’s a design technique

and going on for years.. in diff forms of supposed to ness
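a tiny sketch of that variable-ratio (‘slot machine’) schedule in code.. everything here (the function name, the 30% probability) is just made up for illustration:

```python
import random

def pull_to_refresh(p_reward=0.3):
    """positive intermittent reinforcement: sometimes the pull
    delivers new content, sometimes nothing.. the unpredictability
    is the hook, same as a slot machine (p_reward is an invented number)"""
    return random.random() < p_reward

# over many pulls, rewards land at unpredictable intervals..
# that irregularity is what trains the habit
random.seed(0)
pulls = [pull_to_refresh() for _ in range(20)]
```

if the reward came every single time (or never), the habit would be weaker.. it’s the maybe that keeps the hand reaching for the phone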

jeff: another ie is photo tagging.. not something you can decide to ignore (wanting to see photo you’re tagged in).. this is deep seated human personality that they’re tapping into.. what you should be asking yourself.. is.. why doesn’t that email contain the photo in it.. it would be a lot easier to see the photo

again.. yeah.. much more blatant.. out there.. but same thing that’s been going on forever

26 min – tristan: when fb found that feature.. they just dialed the hell out of it.. great way to get activity.. let’s just get people tagging each other in photos .. all day long

again.. deeper problem is missing pieces.. not that there is some tech that’s getting better at distracting us

and the scenarios are so ridiculous.. making ie: youth seem so shallow.. they are seeking connection..

tristan: there’s a whole discipline called growth hacking.. people whose job is to hack psychology so they can get more growth.. more user sign ups.. more engagement.. get you to invite more people

not new.. flavor of it new.. but we’ve been doing that for years.. making whales more formable/compliant

27 min – chamath palihapitiya (former fb vp of growth): after all the testing/iterating.. single biggest thing we realized.. get any individual to 7 friends in 10 days.. that was it

chamath palihapitiya

sandy: chamath was the head of growth at fb early on.. and he’s very well known in the tech industry for pioneering a lot of the growth tactics that were used to grow fb at incredible speed.. and those growth tactics have then become standard playbook for sv.. one of the things he pioneered was scientific use of a/b testing.. of small feature changes.. co’s like google/fb would roll out lots of little experiments.. they were constantly doing on users.. and over time by running these constant experiments.. you develop the most optimal way to get *users to do what you want them to do.. it’s manipulation.. **we’re all lab rats
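a rough sketch of the a/b loop sandy describes.. the metric, the numbers, and the variant names are all invented here.. just to show the shape of ‘whichever change keeps the number going up, wins’:

```python
def ab_winner(engagement_a, engagement_b):
    """compare a small feature change (variant b) against the control
    (variant a) on one engagement metric, e.g. minutes on site per user"""
    mean_a = sum(engagement_a) / len(engagement_a)
    mean_b = sum(engagement_b) / len(engagement_b)
    return "b" if mean_b > mean_a else "a"

# control vs. (say) a louder notification badge.. b 'wins' and ships,
# regardless of whether it was good for the users
print(ab_winner([10, 12, 11], [14, 15, 13]))  # → b
```

real systems add statistical significance tests and many metrics at once.. but the objective is the same: the number going up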

*whales being told what to do

not new.. same song..manufacturing consent.. voluntary compliance.. et al

deeper issue **we need to be in rat park.. rather than sea world.. being a lab rat is no big deal if you’re legit/alive et al

sandy: and it’s not like we’re lab rats developing a cure for cancer.. it’s not like they’re trying to benefit us.. right..

cure for cancer is same song as well.. not the deeper problem.. just a band aid.. not legit healing (roots of) anyway

sandy: we’re just zombies and they want us to look at more ads so they can make more money

framing it as us/them is same song ness as well.. part of the disillusionment.. distraction.. cancer

28 min – shoshana: fb conducted what they called massive scale contagion experiments.. how do we use subliminal cues on the fb pages.. to get more people to go vote in the midterm elections.. and they discovered they were able to do that

basically.. we’re just trying to perfect the same old manipulation.. w seemingly less (visible) violence

not to mention.. *voting et al.. part of the manip/cancer/distraction

shoshana: one thing they concluded is that we now know.. we can affect real world behavior/emotions w/o ever triggering the users’ awareness.. they are completely clueless

not new.. (and ironic.. ie: these people don’t appear to notice how manipulated they already are.. just like the rest of us)

and not real world.. (sea world)

tristan: we’re pointing these engines of ai back at ourselves to reverse engineer what elicits responses.. like stimulating nerve cells on a spider to see what causes legs to respond.. so it really is this kind of prison experiment.. where we’re just roping people into the matrix.. and we’re just harvesting all this money and data from all their activity.. to profit from .. and we’re not even aware that it’s happening

not new.. black science of people/whales law.. and the look when saying ‘not even aware that it’s happening’.. is killer.. because not aware that it’s been happening from beginning of human history..


29 min – chamath: we have to psychologically figure out how to manip you as fast as possible and then give you back that dopamine hit

sean parker (former fb pres): it’s exactly the kind of thing that a hacker like myself would come up with.. because you’re *exploiting a vulnerability in human psychology

rather *in sea world.. not legit human psych

sean parker

sean: the inventors/creators.. it’s me/mark.. it’s all these people.. understood this.. consciously.. and we did it anyway

done to you.. it’s how you got brainwashed enough to do it.. inspectors of inspectors ness.. regenerating itself..

30 min – tristan: no one got upset when bicycles showed up .. if everyone started to go around on bicycles.. no one said.. we’ve just ruined society.. like bikes are affecting people.. pulling people away from their kids.. they’re ruining the fabric of democracy.. people can’t tell what’s true.. we never said any of that about a bicycle

but we do say that.. enough.. that bike able ness is rare.. doesn’t fit w our econ

and according to wikipedia.. early days helped to perpetuate it: Early bicycles were an example of conspicuous consumption, being adopted by the fashionable elites. In addition, by serving as a platform for accessories, which could ultimately cost more than the bicycle itself, it paved the way for the likes of the Barbie doll.

i love bikes.. just don’t think that ie is helpful.. ie: i also love what sm has allowed for in terms of connection/communication (ie: freeing us from things like pluralistic ignorance)

tristan: if something is a tool.. it genuinely is just sitting there.. waiting patiently.. if something is not a tool .. it’s demanding things from you.. seducing/manipulating things from you.. it wants things from you.. and we’ve moved away from having a tools-based tech environ to an addiction/manip-based tech environ.. that’s what’s changed.. sm isn’t a tool that’s waiting to be used.. it has its own goals.. and its own means of pursuing them by using your psych against you

good distinction there (on tool waiting vs pursuing).. and why we need to give tech w/o judgment a go.. ie: tech that just listens.. and then uses that data to connect us..

but that’s (pursuing) not new/changed.. we’ve been using tech like that since beginning of human history.. as long as we’ve had any form of telling people what to do ness

33 min – anna lembke (stanford director of addiction med): sm is a drug.. we have a basic biological imperative to connect w other people that directly affects the release of dopamine.. and the reward pathway.. millions of years of evolution are behind that system to get us to come together and live in communities

? yeah i don’t think it’s about evolution.. and ‘getting us to’ live in communities.. i think it’s part of how we were made from the get go.. part of the essence of being human.. if we’re legit ourselves (missing piece #1) we crave each other (missing piece #2)

anna: so there’s no doubt that a vehicle like sm that optimizes this connection between people is going to have a potential for addiction

but it doesn’t optimize our natural interconnectedness.. tech could.. that’s the ai humanity needs: augmenting interconnectedness.. but nothing has ever done that yet.. because it also would need a detox embed.. so that again.. we are legit ourselves first..

otherwise.. yeah.. addiction .. from the trauma of non-fittingness .. et al

35 min – anna: i’m worried about my kids.. armed w all the knowledge/experience i have.. i am fighting my kids about the time they spend on phones/computer

but i bet you’re cool with them spending their days being told what to do by other people.. (ie: supposed to’s.. of school/work).. which is much more cancerous than connecting with their friends on their phone

we’re so messed up.. i’m so worried about us.. thinking we have all this knowledge/experience.. which translates.. we’re just better whales

37 min – anna: there’s not a day that goes by that i don’t remind my kids about the pleasure/pain balance.. about the risk of addiction..

oi.. telling people what to do ness

maté parenting law et al

tristan: these tech products were not designed by child psychologists who are trying to protect and nurture children..

oh my..

but who perpetuate the sending of them to school everyday.. who perpetuate the idea that they need other people telling them what to do everyday..

safety addiction et al

38 min – tristan: it’s not just that it’s controlling where they’re spending their attention.. esp sm starts to dig down deeper and deeper into the brain stem and take over kids’ sense of self worth and id

oh my..

we blew that out of the water long ago..

otherwise we wouldn’t be in this situation now..

if we really cared about kids’ (anybody’s) self worth/id.. we’d be living a nother way

again.. maté parenting law et al

39 min – tristan: we evolved to care about whether other people in our tribe think well of us or not.. because it matters.. but were we evolved to be aware of what 10 000 people think of us..

no.. on both small and large scales.. that’s from addiction to attention from others.. that we got way back when.. when we lost our essence

tristan: we were not evolved to have social approval being dosed to us every 5 min.. that is not at all what we were built (?) to experience

we weren’t made for approval ness.. that’s part of the cancer.. we were made for unconditional ness.. with nothing to prove

maté trump law et al

chamath: we curate our lives around this perceived sense of perfection.. *because we get rewarded from these short term signals.. and we conflate that with value/truth.. and instead what it really is is fake/brittle popularity.. that’s short term.. and leaves you even more vacant and empty.. forces you into vicious cycle.. what’s the next thing i need to do.. it’s really bad

rather *because we’re trying to fill holes .. trying to cope.. from the trauma of missing pieces.. thinking we need (or realizing we’re living off of) rewards.. is a red flag we’re doing it/life wrong..

40 min – jonathan haidt (nyu social psychologist.. school of business): there has been a gigantic increase in depression/anxiety for american teenagers which began right around 2011 and 2013.. nearly triple.. and that pattern points to sm.. those kids first gen in history that got on sm in middle school

same song man.. not new.. suffocating – from the day.. the death of us.. et al

if we really want to fix it.. we’d listen deeper.. get to the healing (roots of)

jonathan: those kids first gen in history that got on sm in middle school.. how do they spend their time.. they come home from school and they’re on their devices

‘they come home from school’.. where they’ve spent the day with other people telling them what to do

listen deeper guys.. you’re missing it

jonathan: a whole generation is more anxious.. more fragile.. more depressed.. they’re much less comfortable taking risks

not new.. whales.. since way back.. supposed to’s.. of school/work much worse than sm.. because it’s got us perpetuating it (school et al)

(one of his ie’s of the despair – rates at which they get driver’s licenses are dropping.. so cars are good?)

42 min – tim: it’s plain as day to me.. these services are killing people.. and causing people to kill themselves

socrates supposed to law

tristan: i don’t know any parent who says.. yeah i really want my kid growing up feeling manipulated by tech designers .. manipulating their attention.. making it impossible to do their hw

oh my gosh.. did you really say that

wake up people

tristan: or making them compare themselves to unrealistic standards of beauty..

we raise them to compare themselves to unrealistic standards to their own fittingness

tristan: like no one wants that.. no one does

totally agree.. but we’ve been doing it for years.. we have to listen deeper.. or we’re just spinning our wheels..

tristan: we used to have these protections.. when children watched sat morning cartoons.. we cared about protecting children.. we would say.. you can’t advertise to these aged children in these ways

oh my..

i need a break

tristan: but then you take youtube for kids and it gobbles up that entire portion of the attention econ.. and now all kids are youtube for kids and all those protections/regulations are gone

43 min – we’re training/conditioning a whole new generation that when we are uncomfortable/lonely/uncertain/afraid.. we have a digital pacifier for ourselves.. that is kind of atrophying our own ability to deal with that

ok.. so .. the uncomfort/loneliness/afraidness.. most/all are not natural.. all came from us .. as we fed it to them.. and to ourselves.. by perpetuating a world where people tell other people what to do..

and to me.. from what i’ve seen.. the use of ie: phones et al.. is the most healthy sign i’ve seen of kids crying out for connection.. and those connections have emboldened many (by addressing that pluralistic ignorance ness).. it’s the shame/control we keep dishing about it all .. that is turning us more and more sour..

44 min – tristan: a totally new species of power (tech) – (then guy at conference countering that.. that it’s just another version)

45 min – tristan: there’s this narrative that we’ll just learn how to adapt to it.. that we’ll learn how to live with it.. just like everything else.. and what this misses is that there’s something distinctively new here..

randima (randy) fernando (nvidia, mindful schools, center for humane tech co founder): perhaps the most dangerous piece of all this is the fact that it’s driven by tech that’s advancing exponentially ie: 60s to today.. processing power has gone up about a trillion times.. nothing else we have has improved at anything near that rate

and yet.. mufleh humanity law: ‘we have seen advances in every aspect of our lives except our humanity’ – Luma Mufleh

center for humane tech

randy: and perhaps more importantly.. our brains.. have evolved.. not at all..

tristan: human beings at a mind/body level are not going to fundamentally change.. i mean we can do genetic engineering et al.. but realistically speaking.. you’re living inside of a hardware.. a brain.. that was like millions of years old.. and then there’s this screen.. and on the opposite side of this screen there’s this.. 1000s of engineers and supercomputers that have goals that are diff than your goals.. and so .. who’s going to win in that game..

46 min – tristan: when you think of ai.. and ai is going to ruin the world.. what people miss.. is that ai already runs today’s world right now

not new.. supposed to’s have run the world for forever

47 min – justin: even talking about an ai.. is just a metaphor.. at these co’s like google.. there’s just massive massive rooms.. some underground.. some underwater.. of just computers.. tons and tons of them deeply connected w each other and running extremely complicated programs.. sending info back and forth between each other all the time.. some algo’s just simple.. some so complicated you would call them intelligence

yeah.. we do call them intelligence.. but.. they’re not.. at least nowhere near human thinking.. just like the adult is nowhere near the not yet scrambled ness of the child.. not to mention.. intellect ness isn’t the point of human being

cathy o’neil (data scientist.. weapons): i like to say that algos are opinions embedded in code.. and that algos are not objective.. algos are optimized to some defn of success.. usually commercial interest.. usually profit

cathy o’neil.. weapons of math destruction

algos and redefining success ness

48 min – jeff: you’re giving the computer the goal state.. i want this outcome.. and then the computer itself is learning how to do it.. that’s where the term machine learning comes from.. and so every day it becomes better and better at picking the right posts.. so that you spend longer and longer on them/that product.. and no one really understands what they’re doing to achieve that goal

what computers can’t do et al ..machine learning.. myth of machine.. et al

bailey richardson (instagram early team): the algo has a mind of its own.. so even though a person writes it.. it’s written in a way that you kind of build the machine and then the machine changes itself

sandy: there’s only a handful of people at these co’s who understand how those systems work.. and even they don’t fully understand what’s going to happen w a particular piece of content.. so .. as humans .. we’ve almost lost control.. over these systems.. because they’re controlling the info we see.. they’re controlling us .. more than we’re controlling them

again.. not new.. supposed to’s took over long ago

53 min – tristan (speaking at center for humane tech): we’re all looking out for the moment when tech would overwhelm human strengths and intelligence.. when is it going to cross the singularity.. replace our jobs .. be smarter than humans.. but there’s this much earlier moment.. when tech exceeds and overwhelms human weaknesses.. this point being crossed.. is at the root of addiction, polarization, radicalization, outrage/vanity.. the entire thing.. this is over powering human nature.. and this is check mate on humanity

not new.. and tech.. not deep enough to get to healing (roots of)

check mate was when we became whales.. when we emptied our holes

what we need most is for tech to help/augment our weakness.. the root of our weakness.. ie: to undo our hierarchical listening .. to augment our interconnectedness

55 min – jaron: one of the ways i try to get people to understand just how wrong feeds from places like fb are .. is to think about wikipedia.. when you go to a page you’re seeing the same thing as other people.. so it’s one of the few things online that we at least hold in common.. just imagine if wikipedia said.. we’re going to give each person a customized defn.. that’s exactly what’s happening on fb/youtube..

justin: when you go to google and type in ‘climate change is’.. you’re going to see diff results depending on where you live.. some cities will see ‘is a hoax’.. in others ‘is disrupting the planet’ and that’s a function.. not of what the truth is about climate change.. but of where you happen to be googling from and the particular things google knows about your interests

56 min – tristan: even 2 friends who are close.. who have almost the exact set of friends.. see completely diff worlds/updates on ie: fb

jonathan: each person has own reality w own facts.. over time you have the false sense that everyone agrees with you because everyone in your news feed sounds just like you.. and once you’re in that state.. turns out you’re easily manipulated.. the same way you would be manipulated by a magician.. what you don’t realize is they’ve done a set-up .. and so you pick the card they want you to pick.. that’s how fb works.. saying .. you pick your friends.. pick the links you want to follow.. but that’s all nonsense.. fb is in charge of your newsfeed

same with ie: supposed to’s.. of school/work.. et al .. this is all spinach or rock ness.. living under the guise of some finite set of choices.. because we believe that people telling other people what to do is a given.. and even a good thing.. when it’s actually part of the cancer..

same with parents.. who want best for kids but going about it from their own intoxications.. ie: maté parenting law et al

57 min – rashida richardson (nyu school of law prof, ai now institute director of policy research): we all simply are operating on a diff set of facts.. when that happens at scale .. you’re no longer able to reckon with or even consume info that contradicts the worldview that you’ve created.. that means.. we aren’t actually being objective/constructive individuals

not new.. black science of people/whales law.. pluralistic ignorance.. et al

justin: then you start to look over at the other side.. and you think.. how can those people be so stupid.. look at all this info that i’m constantly seeing.. how are they not seeing that same info.. and the answer is.. they’re not seeing that same info

deeper issue.. all info today is non legit.. what we need is a means to get back to us.. to legit ness.. otherwise just spinning our wheels anyway

58 min – justin: so many of the problems we’re discussing.. like around political polarization.. exist.. in spades .. on cable tv.. the media has this same problem.. where their business model is.. they’re selling our attention to advertisers and the internet is just a new.. even more efficient way to do that

that’s a distraction.. much deeper.. because politics (political parties et al) itself is a distraction.. irrelevant to human being.. should be irrelevant to a center for humane tech

we’re missing the center of problem law

guillaume chaslot (former youtube engineer, ceo intuitive ai, founder algo transparency): at youtube i was working on youtube recommendations.. it worries me that an algo i worked on is actually increasing polarization in society.. but from the pov of watch time.. this polarization is extremely efficient at keeping people online

thurman interconnectedness law: ‘when you understand interconnectedness it makes you more afraid of hating than of dying’ – Robert Thurman

59 min – guillaume: people think the algo is designed to give them what they really want.. but it’s not.. the algo is actually trying to find a few rabbit holes that are very powerful and try to find which rabbit hole is closest to your interest and then if you start watching one of those videos.. then it will recommend it over and over again

1:00 – tristan (back speaking at center for humane tech): it’s not like anyone wants this to happen.. it’s just that this is what the recommendation system is doing.. ie: of earth is flat/round

guillaume: the algo is getting smarter and smarter everyday.. so today it’s convincing some people the earth is flat.. but tomorrow it will be convincing you of something else that’s false
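a minimal sketch of the rabbit-hole loop guillaume describes (the clusters and video ids are invented.. this is not youtube’s actual algo, just the greedy watch-time logic in miniature): once one cluster dominates the watch history, the recommender only ever feeds that cluster back:

```python
from collections import Counter

# invented topic clusters; real systems learn these from embeddings/watch data
CLUSTERS = {
    "flat_earth": ["fe1", "fe2", "fe3", "fe4"],
    "cooking": ["c1", "c2", "c3", "c4"],
}

def recommend(watch_history):
    """greedy: push more of whichever cluster dominates the history so far"""
    counts = Counter()
    for video in watch_history:
        for cluster, videos in CLUSTERS.items():
            if video in videos:
                counts[cluster] += 1
    top = counts.most_common(1)[0][0]
    return next(v for v in CLUSTERS[top] if v not in watch_history)

history = ["fe1"]          # one click on a conspiracy video..
for _ in range(3):
    history.append(recommend(history))
print(history)             # ['fe1', 'fe2', 'fe3', 'fe4'] – locked in
```

the feedback is the whole trick: every recommendation it makes strengthens the signal that produced it.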

again.. distraction.. this isn’t the essence of human being.. we need to let go of info/intellect ness if we ever want to get back to us.. back to knowing what enough is.. to knowing we are enough.. w/o all the info et al

1:01 – renée diresta (stanford internet observatory research manager, former head of policy at data for democracy): pizzagate.. the idea that ordering a pizza meant ordering a trafficked person.. as the groups got bigger on fb.. fb’s recommendation engine started suggesting to regular users that they join pizzagate groups.. so if a user was for ie.. anti vaccine or believed in chemtrails.. indicated to fb’s algos in some way that they were prone to believe in conspiracy theories.. fb’s recommendation engine would serve them pizzagate groups.. eventually this culminated in a man showing up w a gun deciding he was gonna go liberate the children from the basement of the pizza place.. that did not have a basement.. (trying to free kids of pedophile ring).. this is an ie of conspiracy theory that was propagated across all social networks.. the social networks own recommendation engine is voluntarily serving this up to people who had never searched for the term pizzagate in their life

(lovely girl i’m sure.. but her titles are ie’s of how distracted we’ve become.. w/o this tech focus they are all talking about.. ie: research, policy, democracy, all killing us way before this time period)

1:02 – tristan: there’s an mit study that fake news on twitter spreads 6x faster than true news.. what is that world going to look like.. when one has a 6x advantage to the other one
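a back-of-envelope sketch of what a 6x speed advantage compounds to (this is a toy branching model with an invented branching factor of 2.. not the mit study’s methodology):

```python
def reach(branching, hops):
    """people reached after `hops` rounds, each reader sharing with `branching` others"""
    return sum(branching ** h for h in range(hops + 1))

SPEEDUP = 6   # per the mit study cited above: fake news travels ~6x faster
# in the time true news completes 1 hop, fake news completes 6
print(reach(2, 1 * SPEEDUP), "vs", reach(2, 1))   # 127 vs 3
print(reach(2, 2 * SPEEDUP), "vs", reach(2, 2))   # 8191 vs 7
```

because spread is multiplicative, a 6x speed edge isn’t a 6x reach edge.. it’s orders of magnitude within a couple of time steps.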

and what is true news.. ie: if today we’re taking all our data from whales in sea world..

we need a diff focus.. if we legit want change

aza: you can imagine these things.. tilt the floor of human behavior.. they make some behavior harder.. some easier.. and you’re always free to walk up the hill .. but fewer people do.. so at society scale.. you really are just tilting floor and changing what billions of people think and do

not new.. supposed to’s of school/work.. et al.. tilted us long ago..

begs we go deeper if we want to solve/cure.. rather than spin/bandaid

sandy: we’ve created a system that biases toward false info.. not because we want to .. but because false info makes the co’s more money than the truth.. the truth is boring

not new

tristan: it’s a dis info for profit business model..

1:04 – tristan: fb has trillions of these posts.. they can’t know what’s true.. which is why this convo is so critical right now

convo is critical right now.. but not this one.. because none of them are legit right now..

wish we could talk about humane tech tristan.. but.. you can’t hear ..which is the deeper problem – our hierarchical listening.. which what legit humane tech could help us with

people have no idea what’s true.. and now it’s a matter of life/death

it’s always been a matter of life/death.. and we keep choosing/being death.. the death of us.. et al

1:05 – what we’re seeing w covid is an extreme version of what’s happening across our info eco system.. sm amplifies to the point we don’t know what’s true.. no matter what issue we care about

just as internet et al.. an extreme (or more blatant) version of what’s been happening across communication system.. (ie: our hierarchical listening)

and perhaps.. too much info ness.. can be finally.. a good opp for us to let go enough.. (too much to keep holding on) to try something legit diff..

1:07 – jonathan: one of problems w fb is.. that as a tool of persuasion it may be the greatest thing ever created.. now imagine what that means in the hands of a dictator or authoritarian.. if you want to control the population of the country.. there has never been a tool as effective as fb

ironic.. setting up your own conspiracy .. as you talk about conspiracies.. extremes..

there has been a tool for years.. much more effective than fb.. the structural violence keeping us enslaved to the supposed to’s of school/work et al.. at least w fb more are seeing it.. with structural violence.. it’s invisible.. and everyone – even your parents – are enacting it.. every day..

cynthia m wong (former researcher human rights watch): some of most troubling implications of govts and other bad actors.. weaponizing sm.. is that it has led to real offline harm.. most prominent ie.. what’s happened in myanmar.. in myanmar when people think about the internet.. what people are thinking about is fb.. what happens is .. cellphone seller pre loads fb on phone and opens an account for buyer.. so when people get their phone.. the first thing they open and only thing they know how to open.. is fb


1:08 – cynthia: fb really gave the military and other bad actors a new way to manip public opinion and to help incite violence against the rohingyan muslims.. that included mass killings.. burning of entire villages.. mass rape.. and other serious crimes against humanity.. that have now led to 700 000 rohingyan muslims having to flee the country

rohingya people

renée: it’s not that highly motivated propagandists haven’t existed before.. it’s that the platforms make it possible to spread manipulative narratives with phenomenal ease and w/o very much money

tristan: if i want to manip an election.. i can now go into a conspiracy theory group (et al)

these are all symptoms of a deeper problem.. and election ness.. a huge distraction

1:09 – justin: algos and manip politicians are becoming so expert at learning how to trigger us.. getting so good at creating fake news.. that we absorb it as if it were reality.. luring us into believing those lies.. it’s as though we have less control over who we are and what we really believe

not new

and this doc is ironically confirming.. that what they’re saying is the worst/newest ever.. isn’t.. because they are all seemingly clueless about how much we already were .. not ourselves..

black science of people/whales law et al

1:10 – tristan: imagine a world where no one believes anything that’s true.. everyone believes govt is lying to them.. everything is conspiracy theory.. shouldn’t trust anyone.. i hate the other side.. that’s where all this is headed

not new.. already there..

deeper problem/issue.. we have to wake 8b people up

1:10 – ie’s of protests across the world

1:11 – renée: what we’re seeing is a global assault on democracy.. most of the countries that are targeted are countries that run democratic elections

either we’ve never had a legit democracy.. or.. (i believe) democracy ness is part of the problem/cancer.. ie: it’s been doing all the things you’re talking about as the (new) social dilemma.. telling us what to do with our days.. by telling us we have a finite set of choices .. et al

and again.. talking ‘elections’ is a huge distraction.. much like talking ‘test scores’ is to ed.. much like talking ed is to human being.. et al

tristan: this is happening at scale .. by state actors.. by people w millions of dollars.. saying.. i want to de stabilize kenya/cameroon.. oh angola.. that only costs this much

tristan: we in the tech industry have created the tools.. to de stabilize and erode the fabric of society.. in every country .. all at once.. everywhere

again.. good thing.. because the ‘fabric of society’ of which you speak .. is our very cancer.. we need a nother way to live.. humane tech could help us leap there..

wish you could hear.. because you’re/we’re spinning your/our wheels..

1:12 – joe: you have this in germany, spain, france, brazil, australia,.. some of the most ‘developed’ nations in the world.. are now imploding on each other.. and what do they have in common

jonathan: the manip by 3rd parties .. is not a hack (on the hearings of fb, twitter, google.. about russian interference w 2016 elections).. the russians didn’t hack fb.. what they did was they used the tools that fb created for legit advertisers/users and they applied it to a nefarious purpose

it’s like remote control warfare.. one country can invade another w/o getting to physical borders

tristan: but it wasn’t about who you wanted to vote for.. it was about sowing total chaos and division..

we need chaos.. and division

tristan: it’s about making two sides.. who couldn’t hear each other anymore.. who didn’t want to hear each other anymore.. who didn’t trust each other anymore

not new.. we’ve had hierarchical listening and lack of legit trust going on (and killing us) for e v e r

1:14 – tristan: do we want the system for sale to the highest bidder.. for democracy to be completely for sale.. we can reach any mind you want.. target a lie to that particular population and create culture wars.. do we want that

we don’t want the system tristan

let go of that

1:15 – jonathan: if everyone is entitled to their own facts.. no reason to come together.. we need to have some kind of common understanding of reality.. otherwise.. we’re not a country

we don’t want to be a country .. let go of that

we could have (what i think you all are trying to say.. what i know 8b people souls are craving) a common understanding of what the essence of human being is.. as our means to org us..

ie: 2 convers as infra

cathy: we’re allowing technologists to frame this as a problem that they are equipped to solve.. that is a lie.. people talk about ai as if it will know truth.. ai is not going to solve these problems.. ai cannot solve the problem of fake news

well.. it can make fake news irrelevant .. if we see/try ai as augmenting interconnectedness

cathy: google doesn’t have the option of saying.. oh is this conspiracy/truth.. because they don’t know what truth is.. they don’t have a proxy for truth that’s better than a click

and.. truth isn’t the deeper problem (well in a way it is because there is none.. until we are all legit free to be)

tristan: if we don’t agree on what is true.. or that there is such a thing as truth.. we’re toast

indeed.. to that.. we are toast.. have been for e v e r

1:16 – tristan: this is the problem beneath other problems.. because if we can’t agree on what’s true.. then we can’t navigate out of any of our problems

well.. once we realize (aka: listen deep enough).. we’ll realize that navigating out of problems is a waste of time.. (aka: irrelevant to alive people) (aka: part of the problem)

1:17 – jaron: a lot of people in sv subscribe to some kind of theory that we’re building some kind of global super brain.. and all of us little neurons/not-important.. and it subjugates people into this weird role where you’re just like this little computing element that we’re programming thru our behavior manip for the service of this giant brain.. and you don’t matter.. *you’re not going to get paid/acknowledged.. you don’t have self determination.. we’ll sneakily manip you because you’re computing node so we need to program you .. because that’s what you do w computing nodes


ok.. so .. *thinking we need to get paid/acknowledged in order to have self determination.. in order to be.. is the deeper (or just earlier) issue.. supposed to’s of school/work has been sneakily (or rather violently) been doing this for e v e r

1:18 – tristan: when you think about tech and it being an existential threat.. it’s a big claim.. it’s easy to then in your mind think.. so there i am with the phone.. scrolling/using it.. like .. where’s the existential threat.. et al.. it’s not about the tech being the existential threat (as he’s heading into senate hearing on persuasive tech).. it’s that tech’s ability to bring out the worst in society.. and the worst in society.. being the existential threat..t

begs gershenfeld something else law


tristan: if tech creates mass chaos/outrage/instability.. lack of trust in each other.. loneliness/alienation/polarization.. more election hacking.. more populism.. more distraction and inability to focus on the real issues.. that’s just society.. and how society is incapable of healing itself.. and just devolving into a kind of chaos

i think i agree? your words here sound good..

but not new

tristan (talking at hearing): this affects everyone.. even if you don’t use these products.. these things have become digital frankensteins.. terraforming the world.. whether it’s mental health in children.. or our politics and our political discourse.. w/o taking responsibility for taking over the public square.. so again it comes back to.. (then interrupted w question)

not new.. ie: we’ve been scrambling children for e v e r

our political discourse.. huge distraction. that we’re capable of because we all got scrambled as children.. et al

1:19 – tristan (asked.. who do you think is responsible): i think we have to have the platforms being responsible for.. when they take over election advertising.. they’re responsible for protecting elections.. when they take over mental health in kids sat morning .. they’re responsible for protecting sat morning

wow.. just more distractions tristan

tristan: the race to keep people’s attention isn’t going away.. our tech is going to become more integrated into our lives.. not less.. the ai’s are going to get better at predicting what keeps us on the screen .. not worse

1:20 – tim (asked.. what do you most worry about): i think in the short time horizon.. civil war

jaron: if we go down the current status quo for let’s say another 20 yrs.. we’d probably destroy our civilization thru willful ignorance.. we’d probably fail to meet the challenge of climate change.. we’d probably degrade the world’s democracies so that they fall into some sort of bizarre autocratic dysfunction.. we’d probably ruin the global econ.. we’d probably.. don’t survive.. i really do view it as existential

but.. ie: civilization, democracy, global econ, .. all already killing us.. existentially

1:21 – tristan: is this the last gen of people that are going to know what it was like before this illusion took place

umm.. none of us know what it’s like.. already

tristan: like how do you wake up from the matrix when you don’t know you’re in the matrix..t

yeah that.. but that’s not new..

but i do have some insight for you on that.. how 8b people wake up from the matrix when they don’t know they’re in the matrix (aka: sea world)


excellent question.. wish we could talk

tristan: it’s confusing because it’s simultaneous utopia and dystopia

none of it is utopia.. can’t be until it’s all of us

tristan: i can hit a button on my phone and a car shows up in 30 seconds.. and i can go exactly where i need to go.. that is magic.. that’s amazing

yeah.. not utopia man.. ie: we don’t have any idea what we need.. and we don’t need cars.. like that

1:22 – justin: when we were making the like button.. our entire motivation was.. can we spread positivity and love in the world.. the idea that .. ff to today.. that teens would be getting depressed if they don’t have enough likes.. or getting to political polarization.. was no where on our radar

because getting to love.. is much deeper than that..

it is something we can do.. if we go deep enough

joe: i don’t think these guys set out to be evil.. it’s just the business model that has the problem

alex roetter (former vp of engineering at twitter): you could just shut down the service and destroy whatever it is.. 20 billion dollars of shareholder value and get sued.. but you can’t in practice put the genie back in the bottle.. you can make some tweaks .. but.. at end of day.. you’ve got to grow revenue and usage quarter over quarter.. it’s .. the bigger it gets the harder it is for anyone to change

yeah.. i don’t know.. the myth of us.. is about as huge as possible.. before all this.. i think the new in tech (if we use it as fitting for human being) is that it can help jump start the change.. help us leap to a nother way.. but no one is using tech/net in that way as of yet..

tristan: what i see are a bunch of people who are trapped by a business model and econ incentive and shareholder pressure.. that makes it almost impossible to do something else.. t

unless we create something that 8b souls already crave/itch-for

humane tech – as it could be

impossible is irrelevant

sandy: i think we need to accept that it’s ok for co’s to be focused on making money.. what’s not ok is when there’s no reg/rules/competition.. and the co’s are acting as de facto govts.. and then they’re saying.. we can reg ourselves.. that’s just a lie.. that’s just ridiculous


we need to let go of all the red flags.. we need to let go of money (any form of measuring/accounting)

jaron: financial incentives kind of run the world.. so any solution to this problem has to realign the financial incentives

rather.. let them go

ie: let’s try/code money (any form of measuring/accounting) as the planned obsolescence

w/ubi as temp placebo.. needs met w/o money.. till people forget about measuring

joe: there’s no fiscal reason for these co’s to change.. and that is why i think we need regulation

sandy: the phone co has tons of sensitive data (?) value.. and we have a lot of laws to make sure they don’t do the wrong things

how’s that working?

sandy: we have almost no laws around digital privacy for ie

not the problem man

joe: we could tax data collection and processing.. the same way that you for ie pay your water bill.. by monitoring the amount of water you use.. you tax these co’s on the data assets that they have.. gives them a fiscal reason to not acquire every piece of data on the planet
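joe’s ‘water bill’ idea reduces to simple metering.. a hypothetical sketch (the rate is invented and no such tax exists.. this only illustrates the incentive he’s describing):

```python
# invented rate: hypothetical dollars per gigabyte-month of stored user data
RATE_PER_GB = 0.05

def data_tax(stored_gb, rate=RATE_PER_GB):
    """meter a company's data assets like water usage and bill accordingly"""
    return stored_gb * rate

print(data_tax(1_000_000))   # a petabyte-scale hoard -> 50000.0 per month
```

the design point: once every stored byte has a marginal cost, ‘acquire every piece of data on the planet’ stops being the default strategy.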

these are the solutions? wow

1:23 – jonathan: the law runs way behind on this thing.. what i know is the current situation exists.. not for the protection of users.. but for the protection of the rights and privileges of these gigantic .. incredibly wealthy co’s.. are we always going to defer to the richest most powerful people or are we ever going to say.. you know .. there are times when there is a national interest.. there are times when the interest of the people.. of users.. is actually more important than the profits.. of somebody who’s already a billionaire

maybe you guys are missing it .. because you still are craving ie: more money..?

1:24 – shoshana: these markets undermine democracy and they undermine freedom.. and they should be outlawed.. this is not a radical proposal.. there are other markets that we outlaw.. we outlaw markets in human organs.. in human slaves.. because they have inevitable destructive consequences

let go of democracy.. because it’s helped us to not know what legit freedom is.. ie: freedom has nothing to do w laws (laws aren’t helping with enslavement .. et al)

1:25 – justin: we live in a world in which a tree/whale is worth more financially.. dead.. than alive.. for so long as our econ works in that way.. and corps go unregulated.. *they’re going to continue to destroy trees/whales/earth.. this is short term thinking based on this religion of profit at all costs..

funny you’d say whale.. we are all dead/whales.. already

reg ness isn’t the answer.. we need to let go of money (any form of measuring/accounting)

*so much of this going on thru out the doc.. we have to realize.. it has to be all of us.. or it won’t work

justin: as if somehow magically.. each corp working in its self interests is going to produce the best result

that is actually how we dance.. but only works if we’re legit free

‘in undisturbed ecosystems ..the average individual, species, or population, left to its own devices, behaves in ways that serve and stabilize the whole..’ –Dana Meadows

justin: this has been affecting the environ for a long time.. what’s frightening.. and hopefully is the last straw that will make us wake up as a civilization to how flawed this theory has been in the first place.. is to see that now.. we’re the tree/whale.. our attention can be mined.. we are more profitable to a corp if we’re spending time staring at a screen/ad than if we’re spending that time living our life in a rich way.. so we’re seeing results of that.. large corps using powerful ai to outsmart us and figure out how to pull our attention to the thing they want us to look at.. rather than the things that are *most consistent w our goals/values/lives..t @rosenstein

yeah.. none of us are listening deep enough to even know *that.. that fittingness..

and that’s the deeper issue.. money/corps/platforms et al.. irrelevant.. what we need is to undo our hierarchical listening (humane tech) .. so we can get back/to that fittingness

not about a battle w ‘them’.. because it has to be all of us

humanity needs a leap.. to get back/to simultaneous spontaneity ..  simultaneous fittingness..  everyone in sync..

1:26 – steve jobs: computer is equiv of a bicycle for our minds

aza: the idea of humane tech.. that’s where sv got its start.. and we’ve lost sight of it because.. it became the cool thing to do as opposed to the right thing to do

mufleh humanity law: ‘we have seen advances in every aspect of our lives except our humanity’ – Luma Mufleh

including this doc.. wish you could hear

bailey: the internet was like a weird, wacky place.. it was experimental.. creative things happened on the internet.. and certainly they do still.. but it just feels like this giant mall.. it’s like.. there’s got to be more to it than that.. am optimistic.. i think we can change what sm looks like and means

yeah.. www ness

1:27 – justin: way sm works.. not set in stone.. these are choices that human beings.. like myself.. have been making.. and human beings can change those techs

tristan: and the question now is whether or not we’re willing to admit that those bad outcomes are coming directly as a product of our work.. it’s that we built these things and we have a responsibility to change it..t

tech as it could be man..

tristan: the attention extraction model is not how we want to treat human beings.. the fabric of a healthy society depends on us getting off this corrosive business model..

any business model.. any form of people telling people what to do.. any form of measuring/accounting

1:28 – tristan: we can demand that these products be designed humanely

yeah.. but .. since we’re all whales.. none of us really knows what this would entail.. ie: tech as it could be.. we think it has to be more complicated than simply listening deeper.. to all of us

tristan: we can demand to not be treated as an extractable resource

demands aren’t going to get us to healing (roots of)

tristan: the intention could be.. how do we make the world better

yeah.. let’s try that.. let’s focus on that..

imagine if we just focused on listening to the itch-in-8b-souls.. first thing.. everyday.. and used that data to augment our interconnectedness.. we might just get to a more antifragile, healthy, thriving world.. the ecosystem we keep longing for..

what the world needs most is the energy of 8b alive people

jaron: thru out history.. every single time something’s gotten better it’s because somebody has come along to say.. this is stupid.. we can do better.. like it’s the critics that drive improvement.. it’s the critics who are the true optimists

this is ridiculous ness

tristan: um.. it seems kind of crazy.. the fundamental way this stuff is designed.. isn’t going in a good direction.. the entire thing.. so .. it sounds crazy to say.. we need to change all that.. but.. that’s what we need to do..t

indeed.. something legit diff..

no more part\ial.. for (blank)’s sake.. there’s a nother way

tristan (asked: do you think we’re going to get there): we have to


end of official doc.. rest is advice and questions about optimism

1:30 – justin: i feel like we’re on a fast track to dystopia and it’s going to take a miracle to get us out of it.. and that miracle is collective will

so.. let’s focus on something already in each one of us. no?

anna: i am optimistic that we’re going to figure it out.. but i think it’s going to take a long time.. because not everybody recognizes that .. this is a problem

rather.. no one is offering a legit alt.. for all of us.. today..

costello screen\service law et al

one that 8b people can leap to (aka: not a long time.. overnight even).. for (blank)’s sake

bailey: i think one of the big failures in tech today is a real failure of leadership.. of people coming out and having these open convos about .. not just what went well.. but what isn’t perfect.. so that someone can come in and build something new

can’t be about leader\ness.. unless by that you mean 8b leaders everyday.. and can’t be about convos rehashing things.. we need to go much deeper than both of those.. and today.. we can.. ie: itch-in-the-soul ness.. 8b.. everyday.. something new.. everyday

tristan: at the end of the day.. this machine isn’t going to turn around.. until there’s massive public pressure

justin: having these convos and voicing your opinion.. in some cases thru these very techs.. we can start to change the tide/convo

has to be a diff convo from the get go.. ie: self-talk as data et al.. curiosity over decision making

we have to let go of responding to things.. if we want to wake up

jaron: it might sounds strange .. but it’s my community/world.. i don’t hate them.. i don’t want to do any harm to google/fb.. i just want to reform them so they don’t destroy the world.. you know

justin: i uninstalled a ton of apps from my phone that i felt were just wasting my time.. all the sm/news apps.. and i’ve turned off notifications on anything that was vibrating my leg w info that wasn’t timely/important to me.. right now.. and it’s for the same reason i don’t keep cookies in my pocket

1:31 – sandy/tristan/aza: reduce the number of .. turn off.. all notifications

guillaume: not using google anymore i’m using qwant.. which doesn’t store your search history

jaron: never accept the video recommended to you on youtube.. always choose.. that’s another way to fight

guillaume: there are tons of ways to remove recommendations (i love that you’re recommending something to undo what you made)

deeper.. let’s undo our hierarchical listening ie: 2 convers as infra

renée: before you share.. fact check.. consider the source.. do that extra google.. if it seems like it’s something designed to really push your emotional buttons.. like.. it probably is

justin: essentially.. you vote w your clicks

cathy: make sure that you get all kinds of diff info in your own life.. i follow people on twitter that i disagree with.. because i want to be exposed to diff points of view

tristan: notice that many people in tech industry don’t give these devices to their own children

1:32 – alex: my kids don’t use sm at all.. that’s a rule

tim: we are zealots about.. we’re crazy.. and we don’t let our kids have really any screen time

jonathan: i’ve worked out what i think are 3 simple rules that make life a lot easier for families and that are justified by the research.. 1\ all devices out of bedroom at a fixed time every night 2\ no sm till highschool.. personally i think the age should be 16 3\ work out a time budget w your kid.. ask them what they think is a good amount per day.. they’ll often say something pretty reasonable.

so loaded.. not even going to respond

1:33 – jaron: i know i’m not going to get everybody to delete sm accounts.. but i think i can get a few.. and just by getting a few matters a lot.. that creates the space for convo.. because i want there to be enough people out in society who are free of the manip and to have a societal convo that isn’t bounded by the manip of sm.. so do it.. get out of the system.. delete.. get off the stupid stuff.. the world’s beautiful

let’s have a convo about fixing it (sm)





what we need is a means to undo our hierarchical listening.. to augment our interconnectedness

not a doc.. but these are our findings:

1\ undisturbed ecosystem (common\ing) can happen

2\ if we create a way to ground the chaos of 8b free people


facebook, twitter, instagram ness, google, et al..


Glad to see Mozilla calling this out, but even this is far too gentle. The Social Dilemma is the worst type of advocacy: it’s close enough to feel relevant, yet lacks the teeth to make real and meaningful change.

Original Tweet:

how i feel about anyone doing close to the edge but partial.. i.e.: in ed

there’s a nother way.. for all of it.. for (blank)’s sake


11 books to read after watching Netflix’s The Social Dilemma

Original Tweet:








(sent message via contact form on tristan’s site)

have some insight for humane tech
ie: a means to undo our hierarchical listening (to ourselves/others/nature)
to get at root of the deeper problem we keep missing

just spent day watching and taking notes on the social dilemma
asked tristan to follow me on twitter for bit so i could dm him
trying this as well