tristan harris

intro’d to Tristan here:

Distracted? Let’s make technology that helps us spend our time well | TEDxBrussels|2014

on time we spend

as the day

we need to restore choice.

we want to have a relationship with technology that gives us back choice.

we need help from designers.. because knowing this doesn’t help..

we’re bulldozing each other’s attention left and right – 23 min to refocus – conditions and trains us to interrupt ourselves, every 3.5 min

5 min – nancy puts a message out.. john sends message..

like the necklaces in the be you house

goal of chat: easy to send message..  let’s change it to – let’s create highest quality communication


the social dilemma (doc)

tristan and aza on ai


find/follow Tristan:

link twitter

I work on Design Ethics @ Google. Also entrepreneur, design thinker, philosopher.


pay attention to world clive

– – –

app\chip ness

watch as verb

revolution of everyday life

a nother way


how tech hacks mind.. aka: spinach or rock ness

Western Culture is built around ideals of individual choice and freedom. Millions of us fiercely defend our right to make “free” choices, while..

we ignore how we’re manipulated upstream by limited menus we didn’t choose.

This is exactly what magicians do. They give people the illusion of free choice while architecting the menu so that they win, no matter what you choose.

I can’t emphasize enough how deep this insight is.

The “most empowering” menu is different than the menu that has the most choices. But when we blindly surrender to the menus we’re given, it’s easy to lose track of the difference

If you’re an app, how do you keep people hooked? Turn yourself into a slot machine.

So when Marc tags me, he’s actually responding to Facebook’s suggestion, not making an independent choice.


somebody’s fb share (don’t remember who) – article on Justin and Tristan and Roger.. and others.. 2017

Google, Twitter and Facebook workers who helped make technology so addictive are disconnecting themselves from the internet. Paul Lewis reports on the Silicon Valley refuseniks alarmed by a race for human attention

like Rosenstein, several years ago put in place the building blocks of a digital world from which they are now trying to disentangle themselves. “It is very common,” Rosenstein says, “for humans to develop things with the best of intentions and for them to have unintended, negative consequences.”

Harris is the student who went rogue; a whistleblower of sorts, he is lifting the curtain on the vast powers accumulated by technology companies and the ways they are using that influence.

“I don’t know a more urgent problem than this,” Harris says. “It’s changing our democracy, and it’s changing our ability to have the conversations and relationships that we want with each other.”

let’s try this..

ie: hlb via 2 convos that io dance.. as the day..[aka: not part\ial.. for (blank)’s sake…]..  a nother way


Tristan Harris (@tristanharris) tweeted at 10:18 PM – 7 Jan 2018 :

ICYMI: I HIGHLY recommend reading Roger McNamee’s huge piece in Washington Monthly detailing my work with him, @noUpside (and so many others behind the scenes) to defend democracy from the “maximal manipulation” model of social media.

I recommend that Facebook, Google, Twitter, and others be required to contact each person touched by Russian content with a personal message that says, “You, and we, were manipulated by the Russians. This really happened, and here is the evidence.” The message would include every Russian message the user received.

This idea, which originated with my colleague Tristan Harris, is based on experience with cults. When you want to deprogram a cult member, it is really important that the call to action come from another member of the cult, ideally the leader.

Eighth, and finally, we should consider that the time has come to revive the country’s traditional approach to monopoly. Since the Reagan era, antitrust law has operated under the principle that monopoly is not a problem so long as it doesn’t result in higher prices for consumers. Under that framework, Facebook and Google have been allowed to dominate several industries—not just search and social media but also email, video, photos, and digital ad sales, among others—increasing their monopolies by buying potential rivals like YouTube and Instagram. While superficially appealing, this approach ignores costs that don’t show up in a price tag. Addiction to Facebook, YouTube, and other platforms has a cost. Election manipulation has a cost. Reduced innovation and shrinkage of the entrepreneurial economy has a cost. All of these costs are evident today. We can quantify them well enough to appreciate that the costs to consumers of concentration on the internet are unacceptably high.

how about we make monopoly irrelevant by modeling a nother way to live.. sans money


part of time well spent team

[new name.. center for humane tech.. so added new page.. and will be adding updates there..]


via Howard fb share:

Early Facebook and Google employees form coalition to fight what they built

“The largest supercomputers in the world are inside of two companies — Google and Facebook — and where are we pointing them?” Mr. Harris said. “We’re pointing them at people’s brains, at children.”

The new Center for Humane Technology includes an unprecedented alliance of former employees of some of today’s biggest tech companies. Apart from Mr. Harris, the center includes Sandy Parakilas, a former Facebook operations manager; Lynn Fox, a former Apple and Google communications executive; Dave Morin, a former Facebook executive; Justin Rosenstein, who created Facebook’s Like button and is a co-founder of Asana; Roger McNamee, an early investor in Facebook; and Renée DiResta, a technologist who studies bots.

He said the people who made these products could stop them before they did more harm… “This is an opportunity for me to correct a wrong,” Mr. McNamee said.


Saul Kaplan (@skap5) tweeted at 6:02 AM – 7 Feb 2018 :

Tech titans are ‘good people guided by a very bad business model’. @tristanharris #HumaneDesign

thinking.. good people guided by a business model

10 day cares .. killing us.. et al


My opening keynote from #Dreamforce18 last week in SF is now live: “How to Stop Technology from De-stabilizing the World”- explains how technology tilts the global social fabric in dangerous directions and how we can steer away:

5 min – tech is its own force pushing culture in a particular direction.. and we can predict what that direction is and steer it differently

tech is persuasive.. magic shows that you can manipulate

6 min – instagram founders and i studied persuasion at stanford.. and how to manipulate human behavior

one thing that’s not going away.. and that’s the race to capture human attention.. there’s only so much attention out there and we have to capture it

7 min – it becomes this race to the bottom of the brain stem.. who’s going to gain attention by going lower on psych persuasion stack than someone else

9 min – persuasion today – in your phone – is a supercomputer.. figuring.. what can i play next to keep you here.. ie: 70% of youtube traffic is based on what videos their algos recommend

10 min – youtube followers the size of islam .. fb followers the size of christianity.. except both are run by algo’s w bias toward sensational/radical

11 min – so at scales of 1.9 bn youtube is tilting the scales toward what is most radicalizing..

15 min – what to do about it..? go grayscale.. but that didn’t work.. so much deeper than that.. much like saying ban straws.. we need to think more systemically than that.. we need to change the system.. think more deeply about what’s really going on here..t

lacking maté basic needs

tech could facil that restoration..

16 min – so excited to go into space.. took us 10 yrs before we turned the camera back and took photo of earth.. just like tech.. want to build the latest thing out there.. don’t want to look at selves.. t

imagining tech as it could be.. listening to all the voices (self-talk as data) .. everyday

17 min – humane tech – how we fix this problem.. looking at an honest appraisal of human nature

18 min – way to fix this problem is to have a more honest view of human nature.. that instead of drilling attention out of people’s brain.. race to bottom.. extracting.. that there’s actually something to protect about how it works

2 needs

we want to re align tech w the 21st century understanding of human nature.. ie: customer always right; voter knows best.. if those things are up for negotiation it’s as if you’ve debased the source of authority in a democratic society.. t.. so we’re going to have to protect it..

actually.. if still talking customer/voter.. et al.. debased human nature..  so going to have to think way differently..

mufleh humanity law: we have seen advances in every aspect of our lives except our humanity – Luma Mufleh

18 min – ie’s: we’re going to have to protect the ways that kids develop their id’s..t

human nature begs we render id irrelevant..  ie: marsh label law

19 min – also protect agains the mass manipulation of truth..and realize that ww3 is happening in info space.. changing entire belief system of whole culture.. t

that happened long ago.. was already non legit
ie: data on people who aren’t themselves.. much like looking at data of whales in sea world

20 min – we can also ask.. what does it mean to empower ourselves.. diff between tech we like and tech we don’t.. in 80s steve jobs said mac was a bicycle for our mind

humane tech is possible if you’re looking at .. how do we leverage human strengths..t

ai/tech humanity needs..

augmenting interconnectedness

but first have to get to individual’s daily curiosity

have to get quiet enough to hear that

21 min – facetime is a great ie of humane tech.. we’re built for empathy.. thru voices/eye-contact

we also have to get better at understanding what people are really after..t

yeah that.. imagine tech that listens to 7bn voices.. ie: 2 convers as infra

cure ios city

instead of youtube trying to max what keeps you on screen.. what if we asked you.. what are you actually here for when you’re watching this ukelele video..t

what if we didn’t ask.. we just listened.. first thing each day.. ie: 2 convers as infra

it’s understanding that we can easily misunderstand each other in digital/text communication

that’s why going to basic.. to cure ios city.. and via idio-jargon/self-talk as data matters.. and getting a go at that everyday (equity)

22 min – where are we really good at deep convos.. in person.. you could have new options on the menu.. for where to talk in person..t

3 min self talk.. enough data to match you with a local(s) w same curiosity

we’re vulnerable to learned helplessness.. if you give us a stimulus.. you’re creating exponential learned helplessness.. people experiencing big global problems they can’t do anything about.. what if instead you said.. here’s great things being done on ie: plastics in ocean

better.. but not deep enough to last.. begs we have our baseline being daily curiosity

23 min – so how to transition to this.. this goes against the business model..

24 min – no one actually wants this – show them and they don’t want it.. 4 levers: 1\ activate public 2\ activate employees of tech com’s 3\ activate pressure from govts 4\ inspire alternative: humane tech

25 min – 72% of teens think they’re being manipulated by tech

time well spent..

time well spent

27 min – baby steps in right direction..  now a race to top about who can care more about society.. change is happening thru awareness alone

this will take all of us


june 2019 w nikola

Ex-Google Design Ethicist Tristan Harris on Technology and Human Downgrading

Tymek Majewski (@astroherbatnik) tweeted at 9:05 PM – 30 Jul 2019 :
A refreshingly uncomfortable discussion between @singularityblog & @tristanharris about technology and making the world a worse place.
I wonder what @lexfridman, @SamHarrisOrg, @MIRIBerkeley, @FLIxrisk and other giants of online AI discourse think of it.

t: now we have this nonprofit.. center for humane tech.. work on catalyzing system change: incentive, design, practices, policy.. human downgrading

3 min – t: i care a lot about where the world is going and i’m very concerned about all the trend lines and see it as my duty/karma to be as helpful as i can in a situation of pretty dire circumstances

4 min – t: i was a magician as a kid.. leads you to see humans as different than most.. ie: see *humans as this incredibly easy to deceive species.. that’s the core thing that gives me my world view.. seeing the way beliefs take hold and stretch across the experience of our reality and distort what’s coming in thru the looking glass into false beliefs..

i think you’re talking about *whales in sea world.. not free humans in an undisturbed ecosystem

6 min – t: even something as simple as ‘oh i’m late’..  we started interview late, nikola hates me.. your mind conjures these beliefs and when it believes them.. your experience as a human animal w this canvas.. is totalizing.. this way that we experience our reality

7 min – t: the core of magic.. if there’s anything.. is the power of where attention goes.. (bali retreat on pickpocketing hypnosis and magic) pickpocketing is another exploration of attention.. people think you’re quickly rushing in to get it while they’re not looking .. it’s not like that at all.. you’re actually with them.. you’re present with them.. talking.. making eye contact.. and you’re even with them when you’re exploring what’s in your left pocket.. let’s look at it together.. and it’s all happening.. you feel like you’re right there with it and it’s not like that..

8 min – t: but magic is the study of the misdirection of attention

9 min – t: at stanford & wikipedia, .. prof taught persuasive tech.. once we know what is persuasive to mind/nervous-system/habit-formation-system.. how do you design tech that utilizes all those insights

10 min – t: this is an uncomfortable thing for a lot of people to think about.. esp people w tech utopian ideas.. we think we’re this great species.. we’ve sent people to the moon.. done amazing things.. the idea that we’re not in control or that we can be persuaded is something that people resist.. all you have to do is look at magic to realize how universal those human weaknesses and blind spots are.. this is where it shows up later in things like fake news and addiction and slot machine techniques and casinos.. they’re all related to one continuum.. which is the universal hackability of human nature

i think if we had our shells back.. and made sure we all started w curiosity .. everyday.. we’d have a lot less of that ‘ability to persuade’.. i think we’re so easily persuaded because we’re like shell less whales in sea world

we’re currently not us (ie: so very hackable) .. from all the voluntary compliance .. manufacturing consent.. further embedded/perpetuated by the supposed to’s.. of school/work

t: at stanford worked w group from instagram.. built tech to build each other up.. called.. send the sunshine.. literally sending weather pics to those in depressive climates

11 min – t: so that was tech for good.. but what i saw when i landed at google.. 2012.. most of what people were doing in tech.. was basically.. how do i manipulate your attention.. more notifications/infinite-scroll.. whatever will keep you.. and it became this arms race.. for who can go lower in the human brain stem.. who can crawl down to your deeper more vulnerable soft underbelly of your brain/nervous-system to get results from you

12 min – t: in the same way that in the industrial revolution we wanted to get more productivity/work out of people.. in the consumerist late stage capitalism version of an engagement econ .. we want to get more desirable behavior out of you.. which means want to make you more and more predictable

predict\able ness – only when we’re whales in sea world.. not when we’re truly free

13 min – t: controlling/shaping 2 b people’s attention.. diagnosis of now

22 min – t: the fundamental thing that’s dangerous right now is that we’re decentralizing increasingly exponential and uncontrollable tech.. problem w tech today is that as it’s scaling exponentially up.. we’re decentralizing exponential power into more and more *especially less and less wise hands with really bad moral/ethical reasoning

*not so much about wisdom.. as.. not awake ness .. shell less.. et al .. into the hands of ie: whales in sea world

what we need most – the energy of 7bn alive people.. then dangerous side of tech becomes irrelevant as it is and augmenting interconnectedness.. as it could be..

23 min – t: the combinatorics of all that are way too many to capture.. and that’s why we really need a new infra to manage the exponential tech we’re creating..t

we need a new infra with a tech/mech to listen to and facil the exponentiation of our interconnectedness

what we need: gershenfeld something else law.. embedded in 2 convers as infra

n: i think problem is diff.. you’re investing 100s of millions of dollars creating most powerful tech.. and yet you’re not investing in wisdom as to how to apply that tech.. that’s an afterthought.. so all the current problems.. global warming et al.. is basically our tech power far outpacing our wisdom as to how best to control and apply.. ie: 2-3000 yrs ago.. if didn’t have training for emperor of rome.. chances are that power is going to undermine your character

sounds like a distraction.. so you’re saying .. training on power is key..

i’d say.. focus on daily curiosity.. sans training

24 min – t: yep.. for center for humane tech.. we use as sort of our mission statement.. that the fundamental problem w humanity is that we have paleolithic emotions, medieval institutions, and god-like tech.. that is we have power of gods w/o the wisdom/love/prudence of gods (barbara marx hubbard paraphrase)


27 min – t: our moral ideas are vastly inadequate to a new wiring diagram

well.. moral ideas from whales in sea world.. if we were truly free.. and the infra supported that ongoingly – ie: listen to self first everyday.. it’s more that our wiring (even way before internet) is vastly inadequate/cancerous to our moral ideas

t: freedom of speech is not the same as freedom of reach.. we need a whole new set of moral concepts.. we’re in this totally diff landscape.. we don’t have moral ideas that even match the problems..

the whales don’t.. we would/could .. if we could rewire tech to listen to us/our-hearts.. first.. everyday

28 min – t: my biggest fear – where 2b people’s attention goes is a substrate for every other consequence that happens in the world

exactly why we need to focus on curiosity.. over ie: decision making

t: it’s the basis for culture/politics/elections

see.. those aren’t even real issues.. that’s whales talking

t: it’s the basis for people’s own life choices and well being.. if you manipulate 100 million human teenage social animals into being addicted into getting attention from other people.. then you cause mass mental-health/self-esteem/insecurity issues..teen suicide’s gone up by like 2.5x in the last .. 6 yrs..

see that already happened forever ago..

ie: supposed to’s.. of school/work

we’re not talking about human teenage social animals.. we’re talking about whales in sea world

so yeah to life choices and well being.. but i don’t know that we’ve ever seen that

t: i’m worried that our ability to focus attention and construct shared agendas around our existential issues is being destroyed

we have the existential issues .. because we’ve used earlier techs.. ie: supposed to’s.. of school/work.. to destroy our ability to focus attention..

our best human attention is daily curiosity.. we killed that long ago

t: that our info/sense-making ecology is being debased by tech that was innocuously maximizing whatever kept our nervous system hooked

again .. not new.. why we’re here

29 min – t: as world get more and more complex.. ie: climate change.. it’s never been more important to have accurate info about where/what the problem is and what to do about it than now..

exactly.. but what we need is the energy of 7bn alive people.. not info.. unless you want to call info ie: self-talk as data

t: and if people can’t agree on what is true.. because tech shreds their truth apart into a billion diff truman shows..

i don’t think agreeing on what is true is the point.. i think it’s a distraction..keeping us from us.. that we have to agree on something that is arbitrary (ie: truth)

t: that is catastrophic to our ability to coordinate and take action

rather.. lack of trust (aka: unconditionality) is catastrophic to our ability to get back/to an undisturbed ecosystem.. rather than.. like we keep doing.. playing like whales in sea world.. distracted in defense/rote mode..

undisturbed ecosystem: ‘in undisturbed ecosystems ..the average individual, species, or population, left to its own devices, behaves in ways that serve and stabilize the whole..’ –Dana Meadows

t: more than ever.. we need the ability to see the reality in similar and shared ways.. or semi reliable communal sense making (eric weinstein)

we need to focus on something deep enough to resonate with all of us today.. ie: maté basic needs

t: and we need the ability to construct shared agendas about what we want to do about it

the only agenda we could share (w/o perpetuating the cancer we live in now).. has to be deep enough to resonate with all of us today.. ie: maté basic needs

begs we try 3 and 30 as our shared agenda.. ie: 2 convers as infra

30 min – t: right now.. even people in tech who are trying to reform it.. who are trying to solve these big problems are yelling at each other (tech’s amplification of outrage econ) instead of agreeing w each other and trying to construct a shared agenda

again.. unless shared agenda looks something like this: 2 convers as infra.. we are spinning our wheels..

t: left to our own devices.. we’re screwed.. the interesting thing is that the human mind is the only intelligence we know of that’s alive right now that has the capacity to see its own limits/weaknesses/vulnerability and that this is happening to it.. and upon seeing that.. to do something different..

again.. not new.. we’ve been working off data/info from whales in sea world for forever.. we’re missing that deeper catastrophe.. most of all

31 min – t: we’re exponentiating the weakest parts of ourselves.. and debasing the social fabric.. we’re the only species/ones that can see this is happening to us.. be in situation and see what’s happening .. and change it.. anybody who doesn’t see what we’re doing is insane.. and we need to change

32 min – t: (asked how they plan to fix this) .. we got criticized for this.. saying.. thinking they can sit in rooms and draw on whiteboards and that will solve our problems.. and of course we don’t think that we’ll solve problems by ourselves.. but if you ask people.. what’s the problem you’re trying to solve in tech industry.. what you will get back in responses are a cacophony of grievances, scandals and turf wars.. that say.. privacy is most important.. no anti-trust is.. no addiction.. no teen mental health.. (called them all hurricanes)

much like when we asked how to fix school

33 min – t: no one is asking the question.. why are we getting all these hurricanes.. what we need to realize is that we’re getting all these problems from one common generator function.. which is.. tech is misaligned with human weaknesses.. and it’s designed to extract profit out of those weaknesses.. thru extracting attention/behavior.. and that causes mass human downgrade.. as tech exponentiates.. that attention grabber is hollowing out your brain stem

go deeper.. as to why getting hurricanes.. not the tech ability to grab attention.. the people not being themselves.. and so seeking some attention grabber..

34 min – t: two things going on.. two ways to predict human behavior: 1\ take a human being.. we’re complex/intelligent/creative/expressive.. but one way to make us more predictable is to simplify us.. to make us act out of impulses/anxiety/fear.. then far more predictable.. then i can manipulate you

yeah.. like whales in sea world.. fashioned from the supposed to’s.. of school/work

t: second way 2\ build a bigger supercomputer that can predict the space of possible things you might do.. and tech is doing both of those things

yeah.. not new.. ie: whales in sea world.. fashioned from the supposed to’s.. of school/work

begs we get to a space no one can predict: ie: daily cure ios city in an undisturbed ecosystem

in the city.. as the day..

35 min – t: tech is both hijacking the lower level nervous system and simplifying your behavior/values..  ie: how much attention you get.. upgrades machine to build even better and better predicted models.. then you get this vanishing point.. where the space of human expression .. what we might do next.. is increasingly collapsed.. we become increasingly predictable.. not in solving problems.. but in our downgrade

dang.. so spot on.. except for a major point – not new.. ie: whales in sea world.. fashioned from the supposed to’s.. of school/work

36 min – t: that’s why we called human downgrading an existential threat.. and we needed a name for this interconnected set of problems because otherwise if we’re trying to do solutions and we build a lever.. if you’re not also covering interrelated issues.. you’re in.. we don’t have time to build enough levers.. we’re going to be playing infinite wack a mole..t

dang.. so wish we could talk.. because as brilliant as you are being here.. it’s not deep enough.. so you too.. will be playing infinite wack a mole.. seeing the downgrade as starting w tech.. even just exponentiating with tech.. isn’t deep enough.. unless you go way back and call any supposed to’s.. tech

starfish/wack-a-mole ness begs gershenfeld something else law

we do need a name for this interconnected set of problems.. – i’d call it almaas holes law.. which is a downgrade.. but w almaas (aka: maté basic needs) there is a specific deep/simple/open solution

not part\ial.. for (blank)’s sake

t: so we have to see that all these issues are connected.. that all of them happen as a continuum of hacking different human weaknesses

yeah.. at this point i think it’s more happening because of holes than hacking

t: let me pause and really map this out for you.. because it’s really critical to understand.. when i say human weaknesses.. the first like marshall islands of tech hacking human weaknesses.. everyone’s looking out for when tech is going to overtake human intelligence.. and there’s this much earlier point where tech doesn’t have to cross human strengths.. it just has to cross human weaknesses.. and that’s the magician’s insight

yeah again.. magic works on whales in sea world.. so you’re spot on to a point.. but not deep enough to fix this.. to get us back to us

37 min – t: once i hack your weaknesses.. that’s all i need to take control and out maneuver you

true .. for whales in sea world.. but not for truly free humans in an undisturbed ecosystem

t: and this is so hard for people to get/understand.. the first marshall islands of tech hacking human weaknesses and crossing that threshold was our experience of distraction or info overload.. ‘oh .. i can’t think.. too many tabs open..’ .. our minds getting overloaded.. that was the first hacking

no.. god no.. go back further.. those were us seeking something to fill our empty holes..

 t: then you get hacking human weakness for brevity and short attention spans.. because short things work better in the attention econ than long complex things.. so say shorter and shorter things about complex things which automatically creates polarization.. all these.. continuum of hacking human weakness.. until you get full check mate of human nature.. and that’s it..

true.. but already check mated.. which begs we go deeper than blaming it on ie: tech hacking human weakness

39 min – t: so this is an existential issue.. we have to actually protect the limits of human weakness

so.. now.. you’re suggesting we focus on protecting the limits of weakness of whales in sea world..  major wack a mole man

t: and this should not be controversial

wow.. way to maintain.. way to hack weakness

t: it’s not like you would walk into a bank that doesn’t have encryption and say.. oh no we shouldn’t protect the bank.. we should just let it be unencrypted

dang.. what an ie.. how ironic

40 min – t: they (sv) are trapped in a business model of maximizing engagement.. so long as they are in this business model.. because they are not treating you/your-sovereignty/agency w dignity.. they want to treat you as a domesticated human animal

not new.. ie: supposed to’s.. of school/work

41 min – t: a domesticated human in attention econ.. is one that wants attention.. so that is an existential threat.. that’s what we have to fix

42 min – t: should anyone be given power to steer 2/5 b people’s thoughts.. is the question you’re asking

n: steering and ethical.. aren’t those like contradictory terms..t

indeed.. and why we need an infra.. that starts from w/in 7b people.. everyday a new

43 min – t: what i hear in your question is about whether or not it’s ethical to build tech.. defn of tech: i can exponentiate consequences

a means to exponentiate good consequences for every single person .. everyday.. ie: augmenting interconnectedness

44 min – n: you did say something that hurt me.. you totally misdefined tech.. that’s a very common occurrence in sv..

t: how would you define tech

n: it’s not how i define tech.. it’s how tech was defined.. tech consists of 2 greek words techne and logos.. techne means an art/craft/skill.. or a means by which a thing is accomplished.. logos means words/utterance/sayings.. an outward expression of an inner thought.. so literally.. the term technology means.. a discourse/conversation/words about the way that things are gained.. so from its original meaning.. it is merely a means to an end.. never an end in itself.. and it’s always only concerned with the how.. not the why and the what.. and unfortunately.. and this is the part that hurt me.. i didn’t think you would do that too.. in sv we are obsessed with the how.. and then we bring in the why and the what as an afterthought.. so we put the cart in front of the horse and we worship the cart.. and we start looking for the best usages of the cart.. not because it’s good in its own right.. but because we have a cart.. and we have to find a way to apply/monetize it.. and you have the resulting problems with that switching of the proper process

47 min – n: let’s get on topic.. because we kind of got off.. but it really is on topic.. i’m trying to see.. i love your work and what you stand for.. and yet i’m struggling to see if that’s progress.. what we’re seeing right now.. in the formation of your not-for-profit org..

48 min – t: think of it as the world is on fire.. we are trying to salvage the basic sense making infra

sounds like bleeding out.. which is (to me) just another (though indeed better) bandage

the ‘basic sense making infra’ you seem to be trying to salvage.. is from whales in sea world.. not alive/free humans

that (to me) does beg a conversation (n’s defn?).. everyday.. w self and w others ie: 2 convers as infra via tech as it could be.. listening to every voice.. every day

48 min – n: but you burned it.. your classmates burned it..

yeah.. i’m thinking it was burned way way before sv

t: dangerous the ‘you’ you’re using

n: you .. sv.. former google/fb..  that’s great.. maybe now you can save it.. i really hope so.. that’s why i am rooting for you

t: let’s cover this.. this actually comes up a lot.. those who destroyed everything are now trying to save it.. i think anybody who was directly involved in creating these systems.. need to put money/resources/time/energy where mouth is and actually work to reverse this problem..

49 min – t: there are not that many people who are doing that.. roger mcnamee.. who partnered up with me back in 2017.. getting first senate hearings happening.. et al.. we’re putting.. i’ve dedicated my life to this.. and i say that not as ego.. this is such a massive problem.. there should be an army of people devoted to this..  i mean .. i didn’t profit from this.. i didn’t actually build these systems..


n: well .. roger mcnamee did profit.. and maybe he’s supporting you with the million bucks.. but he made a billion from burning it.. figuratively speaking.. and that’s the ie of many people there who are supporting you.. and i’m supporting you too.. and you didn’t profit from it.. but they profited exponentially and now they’re trying to change it linearly.. that’s my concern.. can’t have it both ways

50 min – t: well this is the ineq in general in philanthropy.. if you look at the winners take all book.. i like the way you are framing it.. you can’t have linear solutions to exponential problems

winners take all

51 min – t: that’s the only way we’re going to solve this.. to balance out the balance sheet.. i love what you’re saying

so not true.. we’ve been trying that for years.. we need to let go of balancing acts.. and just focus on listening.. we need to leap to another way to live.. sans money (any form of measuring/accounting/balancing)

this is why we haven’t yet gotten to global equity .

t: we are trying to call those other technologists to step forward and to do that.. i share your concern.. i’ve been working on this for 7 yrs.. and it’s only in the last 2 yrs that anything has ever happened.. the bad news is that there’s far fewer people who are in this position as ex technologists.. calling these problems out and being honest/direct.. not trying to sugar coat it.. the good news.. there’s far more today than a year and a half ago

not good enough.. we don’t need ex techs.. we need everyone..  today

(which we are totally capable of now)

53 min – t: all hands on deck.. a very existential moment.. it’s so clear..

54 min – t: one of the things i am most interested in is how could sv bring its own resources with weight to bear on being the mass coordination/infra for adaptation/mitigation/restoration/reversal.. on climate.. not that they can invent the solutions.. but they’re the sense/choice making structure for 2b people.. and as people see the accelerating harms..when you really take this in you say.. this isn’t the new normal (on climate).. this is the best it will ever be.. right now

distraction man.. start/stay with first part of this comment.. how to be mass coord/infra..?

ie: 2 convers as infra via tech as it could be..

n: so what do we do

t: we see where the trend lines go.. we need to be on a ww3 war funding.. and anybody can participate in that.. there are lots of people who see these structural problems and some people have plans of how to fix it.. but it’s a big complicated process.. a lot of people feel powerless because you have to start understanding so many diff systems

no.. that’s not the way.. the way is already in each one of us.. we just need (perhaps a ww3 funding level/focus) on listening to every single one of those voices.. everyday.. using that as data (/self-talk as data via tech as it could be).. to augment our interconnectedness

55 min – t: what is enough to act in service right now.. in the case of tech.. we need to make sure that we’re protecting/repairing our info ecology

oy.. we need to augment our interconnectedness.. because 1\ info is at best secondary  2\ all info now is illegit as it has been taken from ie: whales in sea world

t: and the quality of our sense making.. this involves a full stack effort.. we have to put pressure.. and ultimately abandon this business model of treating humans as extractable resources.. which leads to downgrading.. which means getting off the advertising business model.. which is equivalent to getting off fossil fuels


57 min – n: how do you reverse the reward system.. of ie: apps

58 min – t: our problems aren’t going anywhere..  we’ve been saying the same optimistic things.. and it got us to this point.. whatever moral/philosophical view that we’ve had has clearly been insufficient.. and we need a totally diff system that incentivizes diff things..t

thinking we need incentives is a major element to that insufficient/cancerous system.. red flag that we’re doing it wrong..  we need totally diff.. not diff of the same

59 min – t: which includes diff economics.. i think.. here’s a radical idea.. big tech should be incentivized to serve the public interest

not radical

1:00 – n: i don’t see how that’s helping.. singularity u did that.. then big scandal.. they are a benefit corp.. and in fact they’re worse than ever

incentives/money (any form of measuring/accounting).. red flag we’re doing it wrong

n: if you screw up the be part (of be exponential) it doesn’t matter how good you are at the exponential part.. and that’s my concern here .. is that sv is the best at being exponential.. but they’re the worst at being

1:01 – t: exactly.. which is why we need a philosophical/wisdom revolution more than a tech rev..

t: we need something that marries daily action/interest with an honest appraisal of the current state of affairs..t.. which is catastrophic.. and with reversing it.. if that was the daily incentive.. if that’s what we all did.. that is what happens during wars.. everybody orients around .. not let’s max the econ.. but everybody is going to work to make sure that we’re serving our survival

war and survival.. not good ie’s here..

we can do better – ie: 2 convers as infra

t: this is just a diff kind of war.. we say the way for humanity to win the final war is to make peace w ourselves.. w our own weaknesses/denial.. we have to see that that’s (not looking at real problems) is what we’re doing

1:02 – t: one of my most profound influences was when i read this 600 pg book on denial and how powerful denial is.. we are horrible at being in touch w reality.. if we make peace w the fact that that is our native functioning..we can change that

that’s native to whales in sea world perhaps.. but not to us.. what we need to do is realize..we’re in sea world.. and get out

1:03 – t: and/or follow me on twitter.. we’re trying to build a movement..


1:05 – n: let’s do basic ethics that you can apply to your day.. everyday

best way for that nikola is via people free enough to listen to the ethics deep w/in them.. everyday..

t: i come to work w humility everyday.. any kind of belief that you know what the right thing is is always dangerous.. esp if you can exponentiate that.. the challenge we have is recognizing that.. we don’t want people to have exponentiating power and also reckon w the fact that that’s where we are..

exponentiating power is fine.. if it’s natural.. ie: via gershenfeld something else law

the energy (power) of 7bn alive people toward/within an undisturbed ecosystem

t: so now that that’s where we are.. what are the wise actions to take.. it’s kind of like chemo.. killing the tumor while saving the patient.. what are the actions that make sense given all that..

a means for 7b to leap to a nother way

1:07 – t: (most vital thing) i want people to understand what is going wrong with tech as an interconnected system of harms.. that we don’t have addiction/isolation happening separately from people believing in more conspiracy theories.. so an interconnected system of harms that’s a kind of social climate change.. have to approach it systematically.. root cause of that is going to be fixing the business model of advertising.. so long as everybody’s on that page we can actually work together to make that transition from extractive based on ads.. to something that’s regenerative and per your words .. ethical

1:09 – t: 42 min podcast – your undivided attention – is the primary vehicle by which we hope to get people both inside/outside sv.. and explore solutions together.. what would it mean topic by topic

dang.. topic by topic won’t cut it.. not part\ial.. for (blank)’s sake


the social dilemma (doc)


via tristan tweet []:

“loneliness becomes largest national security threat” @aza

our obsession with ‘knowing ness’ over ‘connected ness’ is a cancerous distraction..

notes from rest of 70 min video here: tristan and aza on ai