kate darling

[boston, ma]
intro’d to Kate here:
Ethan asking – how to make drones with ethical qualities… asks about open.. (otherwise we’re saying… keep it private or else the Chinese will have more ethical drones than us)
what do you mean by ethics….
interesting that the topic of ethics isn’t taken further than how we feel about each other.. Kate said – hadn’t much thought about the definition beyond that (which is a great simple definition that i love.. in a good world) but as is now – we force beliefs… esp with ed system… so for that definition we need authenticity.. no?
surveillance, intellectual ownership, liability
adapting behavior for patients – sounds like personal fabrication
sherry turkle’s research – children do distinguish between stuffed animals and robots – perhaps movement
lack of empathy goes many levels..
it’s not about the thing.. it’s about the behavior
from me – i don’t get why there is so much talk about policy.. is that not the problem..
toward end Kate says – depends on what you are trying to serve with a justice system..
http://civic.mit.edu/blog/mstem/thinking-ethically-about-our-relationships-with-social-robots
_________________
august post/talk on robot ethics:
re:publica 2013 – Robot Ethics
Published on May 9, 2013
ethics… happening now and in the next 10 yrs: safety, privacy, social issues
anthropomorphisation – our tendency to project life-like qualities onto robotic objects
_________________
find/follow Kate:
like this:
@grok_
_________
Managing Disruption (@DisruptionConf) tweeted at 9:09 AM – 7 Apr 2017:
Think humans treat robots just like any other device? Think again. @grok_ explains how humans interact with robots. #MTD17 https://t.co/y6tPvO7A42 (http://twitter.com/DisruptionConf/status/850364997901406208?s=17)
30 sec video – that people react differently to robots than to other devices.. something you can harness
______________
At @CFAinstitute, Lab researcher @grok_ set #AI in a human frame. “I’m not worried so much about robots developing their own agenda and taking over the world. I’m a little bit more worried about people and how we decide to use the robots as a society.” https://t.co/qoONPZpalM
Original Tweet: https://twitter.com/medialab/status/998660980488966144
Instead of buying into the hysteria of an impending robot revolution, Darling believes we should be thinking about how we as a society use robots, and what implications this has for data collection.
let’s use them/it to connect us.. so .. data collection: daily self talk
She says artificial intelligence (AI) needs “massive sets of data” to learn, and so there are a lot of incentives for companies to collect as much data as possible..t
massive sets need to be legit.. currently.. very little is… so like: (massive) x (0)
“Over the long term, we are going to see privacy massively eroded because of this.. ” Darling said..t
not if self-talk combo’d w/ gershenfeld sel
She acknowledged that there are some areas where robots are much better than humans. “They can do math,” she said. “They can remember everything. They can work tirelessly on an assembly line. They can recognize patterns in data. They can beat us at Go and at Jeopardy.”
But there are areas where they lag… ie: when anything unexpected happens, the robots are woefully and hopelessly lost
Darling’s work explores the emotional connection between people and lifelike machines.. One question she is especially interested in exploring is whether we can change people’s empathy using robots and how interacting with very lifelike machines influences people’s behavior in both good and bad ways
if we use mech/robot to listen to daily self talk.. good ways
____________
Quinn “I probably should be writing now” Norton (@quinnnorton) tweeted at 6:48 AM on Thu, Sep 05, 2019:
The thing i am most frustrated by right now is the assumption the rest of the money is perfectly fine. https://t.co/Mo4fkWIGig
(https://twitter.com/quinnnorton/status/1169593176224497670?s=03)
Quinn “I probably should be writing now” Norton (@quinnnorton) tweeted at 6:50 AM on Thu, Sep 05, 2019:
It’s not like MIT ever stopped taking Saudi money or DOD money. I guess they mostly murder kids without touching their private parts, so that’s fine.
(https://twitter.com/quinnnorton/status/1169593584984547328?s=03)
@chengela: Two updates to this story. First, I want to emphasize that @grok was the one who stood up to Nicholas Negroponte. She deserves credit for speaking out.
@grok: Next time I’m thinking of putting my career on the line by speaking truth to power and ugly-crying in front of 100 people, I’ll try to remember that a man will get credit for it in the press.
@grok: First of all, I’d be a terrible director, but second of all, now would be a good time to read my Guardian piece on why I want @Joi to stay and step up.
https://www.theguardian.com/commentisfree/2019/aug/27/jeffrey-epstein-science-mit-brockman
Jeffrey Epstein’s influence in the science world is a symptom of larger problems
I can count on one hand the real male allies in my world: people I have repeatedly seen stop and listen to the voices of the marginalized, without getting defensive. People I have witnessed throwing their weight around behind the scenes, at personal expense to themselves, for no reason other than to do the right thing. One of those people is Joi Ito, the director of the MIT Media Lab.
While the role he played was far from John Brockman’s, it was hard not to feel that my whole professional environment had been complicit.
Because the complicity goes all the way up, these problems require people with great power to fix them. Ultimately, I no longer believe that I can enact true change without the help of powerful allies. In my experience, one of the few people who is even capable of enacting change at MIT is Joi Ito. I hate what he did and I do not defend his actions. But I also know that he may actually act to fix his mistakes. Over the past eight years, I’ve observed him listen, introspect, and take action, even where it would have been easier for him to stay the course.
Men like Joi need to step up, and step up hard.
This is why I am leaving Brockman as soon as I’ve fulfilled my contractual obligations, but staying at the Media Lab. The Brockmans of the world are uninterested in change; Joi Ito has the humility to understand that change is imperative..t Staying is a hard decision. I’m worried that change won’t come easily. And I’m worried that I am again missing the line between working from within and being complicit.
perhaps we take this opp to go even deeper than mit/money.. and focus on a means to listen to every voice everyday.. ie: 2 convers as infra via tech as it could be
mboya/rogers can you hear me law .. because mufleh humanity law: we have seen advances in every aspect of our lives except our humanity – Luma Mufleh
_____________
via 2023 tweet [https://x.com/grok_/status/1706677362740367868?s=20]:
It’s official! I’ve joined the Boston Dynamics AI Institute to lead their ethics & society research initiative. I’ll keep MIT as an academic home, but I’m thrilled to be building my own team at a cutting-edge robotics research institute.
..The Institute’s purpose is research. Our goal is to solve fundamental challenges in robotics and publish work so that everyone can benefit. Ethics & society is one of the four core areas.
____________