kate darling

kate darling bw

[boston, ma]

intro’d to Kate here:

kate darling talk

Ethan asking – how to make drones with ethical qualities… asks about open.. (otherwise we’re saying… keep it private or else chinese will have more ethical drones than us)

what do you mean by ethics….

interesting that the topic of ethics isn’t taken further than how we feel about each other.. Kate said – hadn’t much thought about the definition beyond that (which is a great simple definition that i love.. in a good world) but as is now – we force beliefs… esp with ed system… so for that definition we need authenticity.. no?

surveillance, intellectual ownership, liability

adapting behavior for patients – sounds like personal fabrication

sherry turkle’s research – children do distinguish between stuffed animals and robots – perhaps because of movement

lack of empathy goes many levels deep..

it’s not about the thing.. it’s about the behavior

from me – i don’t get why there is so much talk about policy.. is that not the problem..

toward end Kate says – depends on what you are trying to serve with a justice system..

http://civic.mit.edu/blog/mstem/thinking-ethically-about-our-relationships-with-social-robots

_________________

august post/talk on robot ethics:

kate on robot ethics

re:publica 2013 – Robot Ethics

Published on May 9, 2013

ethics… happening now and in the next 10 yrs: safety, privacy, social issues

anthropomorphisation – our tendency to project life-like qualities onto robotic objects

_________________

find/follow Kate:

link twitter

like this:

@grok_

link facebook

kate darling about me

_________

Managing Disruption (@DisruptionConf) tweeted at 9:09 AM – 7 Apr 2017:

Think humans treat robots just like any other device? Think again. @grok_ explains how humans interact with robots. #MTD17 https://t.co/y6tPvO7A42 (http://twitter.com/DisruptionConf/status/850364997901406208?s=17)

30 sec video – people react differently to robots than to other devices.. something you can harness

______________

At @CFAinstitute, Lab researcher @grok_ set #AI in a human frame. “I’m not worried so much about robots developing their own agenda and taking over the world. I’m a little bit more worried about people and how we decide to use the robots as a society.” https://t.co/qoONPZpalM

Original Tweet: https://twitter.com/medialab/status/998660980488966144

Instead of buying into the hysteria of an impending robot revolution, Darling believes we should be thinking about how we as a society use robots, and what implications this has for data collection.

let’s use them/it to connect us.. so .. data collection: daily self talk

She says artificial intelligence (AI) needs “massive sets of data” to learn, and so there are a lot of incentives for companies to collect as much data as possible..t

massive sets need to be legit.. currently.. very little is… so like: (massive) x (0)

“Over the long term, we are going to see privacy massively eroded because of this.. ” Darling said..t

not if self-talk combo’d w/ gershenfeld sel

She acknowledged that there are some areas where robots are much better than humans. “They can do math,” she said. “They can remember everything. They can work tirelessly on an assembly line. They can recognize patterns in data. They can beat us at Go and at Jeopardy.”

But there are areas where they lag… ie: when dealing with anything unexpected that happens, robots are woefully and hopelessly lost

Darling’s work explores the emotional connection between people and lifelike machines.. One question she is especially interested in exploring is whether we can change people’s empathy using robots and how interacting with very lifelike machines influences people’s behavior in both good and bad ways

if we use mech/robot to listen to daily self talk.. good ways

______________
