tristan harris

intro’d to Tristan here:

Distracted? Let’s make technology that helps us spend our time well | TEDxBrussels|2014

on time we spend

as the day

we need to restore choice.

we want to have a relationship with technology that gives us back choice.

we need help from designers.. because knowing this doesn’t help..

we’re bulldozing each other’s attention left and right – 23 min to refocus – conditions and trains us to interrupt ourselves, every 3.5 min

5 min – nancy puts a message out.. john sends message..

like the necklaces in the be you house

goal of chat: easy to send message.. let’s change it to – let’s create highest quality communication


find/follow Tristan:

link twitter

I work on Design Ethics @ Google. Also entrepreneur, design thinker, philosopher.


pay attention to world clive

– – –

app\chip ness

watch as verb

revolution of everyday life

a nother way


how tech hacks mind.. aka: spinach or rock ness

Western Culture is built around ideals of individual choice and freedom. Millions of us fiercely defend our right to make “free” choices, while..

we ignore how we’re manipulated upstream by limited menus we didn’t choose.

This is exactly what magicians do. They give people the illusion of free choice while architecting the menu so that they win, no matter what you choose.

I can’t emphasize enough how deep this insight is.

The “most empowering” menu is different than the menu that has the most choices. But when we blindly surrender to the menus we’re given, it’s easy to lose track of the difference.

If you’re an app, how do you keep people hooked? Turn yourself into a slot machine.

So when Marc tags me, he’s actually responding to Facebook’s suggestion, not making an independent choice.


somebody’s fb share (don’t remember who) – article on Justin and Tristan and Roger.. and others.. 2017

Google, Twitter and Facebook workers who helped make technology so addictive are disconnecting themselves from the internet. Paul Lewis reports on the Silicon Valley refuseniks alarmed by a race for human attention

like Rosenstein, several years ago put in place the building blocks of a digital world from which they are now trying to disentangle themselves. “It is very common,” Rosenstein says, “for humans to develop things with the best of intentions and for them to have unintended, negative consequences.”

Harris is the student who went rogue; a whistleblower of sorts, he is lifting the curtain on the vast powers accumulated by technology companies and the ways they are using that influence.

“I don’t know a more urgent problem than this,” Harris says. “It’s changing our democracy, and it’s changing our ability to have the conversations and relationships that we want with each other.”

let’s try this..

ie: hlb via 2 convos that io dance.. as the day..[aka: not part\ial.. for (blank)’s sake…]..  a nother way


Tristan Harris (@tristanharris) tweeted at 10:18 PM – 7 Jan 2018 :

ICYMI: I HIGHLY recommend reading Roger McNamee’s huge piece in Washington Monthly detailing my work with him, @noUpside (and so many others behind the scenes) to defend democracy from the “maximal manipulation” model of social media.

I recommend that Facebook, Google, Twitter, and others be required to contact each person touched by Russian content with a personal message that says, “You, and we, were manipulated by the Russians. This really happened, and here is the evidence.” The message would include every Russian message the user received.

This idea, which originated with my colleague Tristan Harris, is based on experience with cults. When you want to deprogram a cult member, it is really important that the call to action come from another member of the cult, ideally the leader.

Eighth, and finally, we should consider that the time has come to revive the country’s traditional approach to monopoly. Since the Reagan era, antitrust law has operated under the principle that monopoly is not a problem so long as it doesn’t result in higher prices for consumers. Under that framework, Facebook and Google have been allowed to dominate several industries—not just search and social media but also email, video, photos, and digital ad sales, among others—increasing their monopolies by buying potential rivals like YouTube and Instagram. While superficially appealing, this approach ignores costs that don’t show up in a price tag. Addiction to Facebook, YouTube, and other platforms has a cost. Election manipulation has a cost. Reduced innovation and shrinkage of the entrepreneurial economy has a cost. All of these costs are evident today. We can quantify them well enough to appreciate that the costs to consumers of concentration on the internet are unacceptably high.

how about we make monopoly irrelevant by modeling a nother way to live.. sans money


part of time well spent team

[new name.. center for humane tech.. so added new page.. and will be adding updates there..]


via Howard fb share:

Early Facebook and Google employees form coalition to fight what they built

“The largest supercomputers in the world are inside of two companies — Google and Facebook — and where are we pointing them?” Mr. Harris said. “We’re pointing them at people’s brains, at children.”

The new Center for Humane Technology includes an unprecedented alliance of former employees of some of today’s biggest tech companies. Apart from Mr. Harris, the center includes Sandy Parakilas, a former Facebook operations manager; Lynn Fox, a former Apple and Google communications executive; Dave Morin, a former Facebook executive; Justin Rosenstein, who created Facebook’s Like button and is a co-founder of Asana; Roger McNamee, an early investor in Facebook; and Renée DiResta, a technologist who studies bots.

He said the people who made these products could stop them before they did more harm… “This is an opportunity for me to correct a wrong,” Mr. McNamee said.


Saul Kaplan (@skap5) tweeted at 6:02 AM – 7 Feb 2018 :

Tech titans are ‘good people guided by a very bad business model’. @tristanharris #HumaneDesign

thinking.. good people guided by a business model

10 day cares .. killing us.. et al


My opening keynote from #Dreamforce18 last week in SF is now live: “How to Stop Technology from De-stabilizing the World”- explains how technology tilts the global social fabric in dangerous directions and how we can steer away:

5 min – tech is its own force pushing culture in a particular direction.. and we can predict what that direction is and steer it differently

tech is persuasive.. magic shows that you can manipulate

6 min – instagram founders and i studied persuasion at stanford.. and how to manipulate human behavior

one thing that’s not going away.. and that’s the race to capture human attention.. there’s only so much attention out there and we have to capture it

7 min – it becomes this race to the bottom of the brain stem.. who’s going to gain attention by going lower on psych persuasion stack than someone else

9 min – persuasion today – in your phone – is a supercomputer.. figuring.. what can i play next to keep you here.. ie: 70% of youtube traffic is based on what videos their algos recommend

10 min – youtube followers the size of islam .. fb followers the size of christianity.. except both are run by algos w bias toward sensational/radical

11 min – so at scales of 1.9 bn youtube is tilting the scales toward what is most radicalizing..

15 min – what to do about it..? go grayscale.. but that didn’t work.. so much deeper than that.. much like saying ban straws.. we need to think more systemically than that.. we need to change the system.. think more deeply about what’s really going on here..t

lacking maté basic needs

tech could facil that restoration..

16 min – so excited to go into space.. took us 10 yrs before we turned the camera back and took photo of earth.. just like tech.. want to build the latest thing out there.. don’t want to look at selves.. t

imagining tech as it could be.. listening to all the voices (self-talk as data) .. everyday

17 min – humane tech – how we fix this problem.. looking at an honest appraisal of human nature

18 min – way to fix this problem is to have a more honest view of human nature.. that instead of drilling attention out of people’s brain.. race to bottom.. extracting.. that there’s actually something to protect about how it works

2 needs

we want to realign tech w a 21st century understanding of human nature.. ie: customer always right; voter knows best.. if those things are up for negotiation it’s as if you’ve debased the source of authority in a democratic society.. t.. so we’re going to have to protect it..

actually.. if still talking customer/voter.. et al.. debased human nature..  so going to have to think way differently..

mufleh humanity law: we have seen advances in every aspect of our lives except our humanity – Luma Mufleh

18 min – ie’s: we’re going to have to protect the ways that kids develop their id’s..t

human nature begs we render id irrelevant..  ie: marsh label law

19 min – also protect agains the mass manipulation of truth..and realize that ww3 is happening in info space.. changing entire belief system of whole culture.. t

that happened long ago.. was already non legit
ie: data on people who aren’t themselves.. much like looking at data of whales in sea world

20 min – we can also ask.. what does it mean to empower ourselves.. diff between tech we like and tech we don’t.. in 80s steve jobs said mac was a bicycle for our mind

humane tech is possible if you’re looking at .. how do we leverage human strengths..t

ai/tech humanity needs..

augmenting interconnectedness

but first have to get to individual’s daily curiosity

have to get quiet enough to hear that

21 min – facetime is a great ie of humane tech.. we’re built for empathy.. thru voices/eye-contact

we also have to get better at understanding what people are really after..t

yeah that.. imagine tech that listens to 7bn voices.. ie: 2 convers as infra

cure ios city

instead of youtube trying to max what keeps you on screen.. what if we asked you.. what are you actually here for when you’re watching this ukulele video..t

what if we didn’t ask.. we just listened.. first thing each day.. ie: 2 convers as infra

it’s understanding that we can easily misunderstand each other in digital/text communication

that’s why going to basic.. to cure ios city.. and via idio-jargon/self-talk as data matters.. and getting a go at that everyday (equity)

22 min – where are we really good at deep convos.. in person.. you could have new options on the menu.. for where to talk in person..t

3 min self talk.. enough data to match you with a local(s) w same curiosity

we’re vulnerable to learned helplessness.. if you give us a stimulus.. you’re creating exponential learned helplessness.. people experiencing big global problems they can’t do anything about.. what if instead you said.. here’s great things being done on ie: plastics in ocean

better.. but not deep enough to last.. begs we have our baseline being daily curiosity

23 min – so how to transition to this.. this goes against the business model..

24 min – no one actually wants this – show them and they don’t want it.. 4 levers.. 1\ activate public  2\ activate employees of tech co’s  3\ activate pressure from govts  4\ inspire alternative: humane tech

25 min – 72% of teens think they’re being manipulated by tech

time well spent..

time well spent

27 min – baby steps in right direction..  now a race to top about who can care more about society.. change is happening thru awareness alone

this will take all of us