Press "Enter" to skip to content

Pro-Social Machines | Ana Paiva | TEDxIST


I'm going to talk to you about pro-social machines, but before I start, let me tell you a little bit about me. When I was a kid I wanted to be an astronaut. You know, I loved science fiction; I wanted to go to the Moon, to Mars, to really explore space. Of course I couldn't, and in the end I ended up doing electrotechnical engineering. I have to say I didn't like the degree very much, but in the end I found something that really inspired me: I found AI, artificial intelligence. This idea that we can create machines that are intelligent, that take inspiration from what we are as humans, from our intelligence, and put that into a machine, was something out of science fiction. So I started working in AI, and over the years AI has gone up and down. Recently there has been a lot of discussion about what AI can do, what it has achieved, and the dangers of artificial intelligence. Yet I'm very positive, an optimist, and in fact I think AI can have a big role in creating a better society, by creating what I call pro-social machines. So what are pro-social machines?

We live in this connected world where our actions influence the environment, the people we work with, our friends, our families and our colleagues, and society in general. And it is this influence that we have on each other, this kind of relationship, that makes us human. However, every day when we watch the news, when we read, in fact when we walk in the streets, we see signs of a lack of compassion, of not caring. Obama even talked about the lack of empathy in the world. In fact, humans face dilemmas at an unprecedented scale: antibiotic resistance, climate change and increasing inequality. So how do we solve these problems? Well, we have to address them by looking at the population, at society and the dynamics of society, in order to really see how to improve it. But what is a society? You know, with social media, society is not only us: it is all the connections that we have and the influence we have on each other through all kinds of technological means. In fact, our society is changing as machines like robots and AI are at our doorstep, influencing us, interacting with us, being our partners, our friends, our drivers with autonomous cars, and even our companions. So let me show you a little bit of what these machines, these robots, are doing together with humans. This is research from my lab.

This cat plays chess and is a friend of one of the players; it understands them and is like a companion. This one helps people, and it has emotions. This one has emotions too, and it plays cards; older people really like to play with it. This was a self-moving suitcase that we used to test how much people trusted it. This one we have in the hospital to interact with children with autism. And this one is a tutor: it perceives the situation, it helps, and it manages the learning.

So in reality what we have now is a society where both humans and machines take part, and these machines are becoming very intelligent; we have humans living together with machines. So, pro-social machines: they are machines that support and promote actions that benefit society, and our challenge in AI is how to nurture these pro-social machines to make a better society, to make humans help each other and be together. We need to design these robots, these agents, these bots immersed in our world so that they promote collective action, so that they are pro-social in situations where collaboration and pro-sociality wouldn't naturally arise.

So let me give you an example: the bystander effect. Does anyone know about it? The bystander effect? OK, I'll tell you. Imagine a situation where you see an aggression, someone on the street being hit by a car, and you're there alone. The likelihood is that you would act, that you would do something about it. But if there are two or three or many people there, the likelihood that you would act decreases. And why? There are several processes at work in the bystander effect. The first one is called audience inhibition: you're there, nobody is acting, so you think maybe you don't know enough about it, maybe it's just something for TV, and you shouldn't act because the others are not acting. The second one is social influence: they're not doing anything, so it's not your role to do anything either. And finally there is diffusion of responsibility: well, I'm not to blame, there are so many people here, so why should I be the one responsible? So these three processes are there and make us not act. We don't report cases, we see cyberbullying and we don't act, we see things that really make us angry and we don't act.
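To make the effect concrete, here is a small numerical sketch. It is a toy model of my own, not one presented in the talk: it simply assumes that each extra bystander halves an individual's probability of intervening, bundling audience inhibition, social influence and diffusion of responsibility into a single made-up decay factor, and all parameter values are illustrative only.

```python
# Toy bystander-effect model (illustrative assumptions only, not from the talk).

def bystander_model(n_bystanders, p_alone=0.7, decay=0.5):
    """Return (P(you act), P(anyone acts)) for a group of n_bystanders.

    Assumption: each additional bystander multiplies an individual's
    probability of intervening by `decay`, a single stand-in for audience
    inhibition, social influence and diffusion of responsibility.
    """
    p_individual = p_alone * decay ** (n_bystanders - 1)
    p_anyone = 1 - (1 - p_individual) ** n_bystanders
    return p_individual, p_anyone

for n in (1, 2, 3, 5, 10):
    p_i, p_any = bystander_model(n)
    print(f"{n:2d} bystanders: P(you act) = {p_i:.2f}, P(anyone acts) = {p_any:.2f}")
```

With these made-up numbers, both the individual's and the whole group's chance of intervening fall as the crowd grows, which is the pattern the talk describes.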

So how can we change this? OK, that's the idea: we're going to use AI, we're going to put machines into this society. Now, what happens if we have machines that are intelligent, that are autonomous, that sometimes even look like us? Well, there's a camera, there's a robot, a drone going around, or another robot, so what do you do? You may think: well, there are cameras, there are robots, I should not stay here. Or you may say: it's not my problem, the cameras belong to the government, it's someone else's problem, I'm not going to act. So machines may even be making it worse. But if we make them intelligent and pro-social, then we can act, then they can make it better. So what can these machines do? Well, they can intervene, they can act themselves, and they can make people act. They intervene, and at some point, through the way they intervene, people start acting. That's the goal of pro-social machines.

So how are we going to engineer this? It has to do with the idea of equilibrium. There is an equilibrium in society that keeps people from acting, and we need to challenge that equilibrium. We need to force something to change, to unbalance this established way of acting or not acting.

So how do we do that? Well, first of all, we need to look at transparency. Our machines, especially AI machines, need to be transparent, need to justify their actions, need to make clear what they are doing and why, so that people can understand why they should act as well. Second, we can give these machines some pro-social norms, norms that make them really good agents and good robots and that, through social influence, may lead people to also act in a certain way. In fact, there is a whole area of AI, of people doing ethics in AI, working exactly on this idea of embedding ethical norms or ethical constraints in the way we program our machines, and that is extremely important. Third, we can give some of these machines what we might call pathological behaviors: a machine that always helps may be a good thing in a population, because it propagates this kind of feeling of helping. By creating pathological, pro-social behaviors, we may unbalance the equilibrium that exists in the society. And finally, and we've been working a lot on this, we can give these machines empathic capabilities, which adds to the social influence.

We've had several projects. Some of them address bullying: we had a project called FearNot! to help kids who are victims of bullying. Others train people to be culturally sensitive, to look at cultural sensitivity. But we can also simulate societies using evolutionary game theory, creating artificial agents that allow us to study what happens at the level of a society. And we can build robots that are empathic and that are going to really help and interact with people in that way.
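As a rough illustration of what such an evolutionary game theory simulation can look like, here is a minimal sketch. Everything in it is my own illustrative assumption rather than the lab's actual model: a stag-hunt-style "helping" game, humans that update by noisy best response, and a few machines hard-wired to always help, which can tip the population out of the non-helping equilibrium described earlier.

```python
import random

# Stag-hunt-style "helping" game with a few always-helping machines.
# All modelling choices and numbers are illustrative assumptions.

HELP_BOTH = 4.0    # payoff when both partners help
HELP_ALONE = 0.0   # payoff for helping when the partner does not
NOT_HELP = 1.0     # safe payoff for not helping
POP, ROUNDS, NOISE = 200, 300, 0.02

def run(n_machines, seed=1):
    rng = random.Random(seed)
    # True = helps, False = does not; humans start out not helping.
    humans = [False] * (POP - n_machines)
    for _ in range(ROUNDS):
        frac_helping = (sum(humans) + n_machines) / POP
        payoff_help = HELP_BOTH * frac_helping + HELP_ALONE * (1 - frac_helping)
        payoff_not = NOT_HELP
        best = payoff_help > payoff_not
        # Noisy best response: most humans adopt the better strategy,
        # a few explore at random.
        humans = [best if rng.random() > NOISE else rng.random() < 0.5
                  for _ in humans]
    return sum(humans) / len(humans)

for m in (0, 60):
    print(f"{m:3d} always-helping machines -> "
          f"fraction of humans helping ≈ {run(m):.2f}")
```

With no machines this toy population stays stuck in the non-helping equilibrium, while a minority of unconditional helpers is enough to tip everyone into helping, which is the kind of equilibrium-unbalancing effect the talk argues pro-social machines could have.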

In fact, here is a project that my students in the social robotics course did last semester, where they studied the impact of a robotic rubbish bin on the behavior of picking up rubbish. We had two conditions: one where the robotic rubbish bin didn't do anything, and another where it acted and nudged people to pick up rubbish. And there is a really big difference in the way people act: in the first condition, not even 10% of the people picked up the rubbish; in the second condition, they actually started picking it up. It's such a simple thing that we can do, and yet it can make such a big difference.

So how can we engineer pro-sociality and caring in a society? I think AI, artificial intelligence, has a role there, by creating these pro-social machines. That's my goal, and I hope that after this it will be your goal as well. Thank you.

[Applause]
