"There is a lot of emotion when interacting with a robot"

Oliver Thansan
29 May 2023

“It's puzzling how people respond socially to things that are actually machines; there is so much emotionality and sociability when interacting with a robot. How can these machines be treated as if they were living people?” This is the question that led Stanford psychology professor Herbert Clark and his collaborator Kerstin Fischer, an expert in the interaction between language and technology, to investigate how humans interact socially with machines.

Their conclusion is that people interpret robots designed to interact with humans as depictions of characters, much as they do with stage actors, ventriloquists' dummies, or puppets. This is a controversial view that other colleagues have disputed, but Clark and Fischer argue that people are well aware that social robots are made of wires and sensors, and know they are machines shaped like characters; yet when interacting with them, most are willing to treat them as the characters they represent.

Byron Reeves, also a Stanford researcher, in the field of communication, agrees that people sometimes treat robots as representations, but he believes the response is natural, not feigned: people often react as if the robots were real before having had time for the reflective thinking that such an interpretation requires.

According to Clark, one consequence of interpreting robots as characters is that people tend to overestimate their capabilities, even when these are very limited. "The fact that a robot can entertain a child or teach him mathematics does not mean that it can watch over him and keep him from going out on the balcony or from suffocating," he says by way of example.

But interpreting robots as characters also implies, according to the psychologist, a more positive view of people who interact with machines as if they were real. "Our model normalizes this behavior, and we don't need to assume that those who act this way do so because they are lonely, confused, or have some kind of deficit," the researchers point out.