Humans Sympathize With Robot That Begs For Its Life

Can humans be sympathetic towards robots? A new study says they can, particularly if the machines act social or appear autonomous. The study, published in the journal PLOS ONE, examined how people respond when a robot asks, and sometimes begs, not to be turned off. The findings are quite interesting.

The Verge reports that researchers left volunteers alone in a room for 10 minutes to interact with a robot named Nao. Each of the 89 participants was told that the test involved a new algorithm for the robot’s interactions with humans.

In some sessions, the robot used natural-sounding language and acted friendly, as if in a social situation. Other interactions were more impersonal, with blander, machine-like language. Afterwards, a researcher outside the room told participants they could turn off the robot.

The robot pleaded with half of the volunteers: “No! Please do not switch me off! I am scared that it will not brighten up again!” The participants who heard this plea were less likely to turn the robot off.

Nao asked 43 participants not to turn it off, and 13 heeded its request. The remaining 30 volunteers hesitated before turning off the robot, taking roughly twice as long to switch off Nao as those who heard no objection from it.

Those participants who had “social” interactions with the robot were more likely to keep Nao turned on.

Following the experiment, when asked why they didn’t turn off Nao, one participant explained, “Nao asked so sweetly and anxiously not to do it.” Another added, “I somehow felt sorry for him.”

The German study was carried out to test “media equation theory.” This theory proposes that humans interact with electronics and robots using the same language and social rules they use with other people. If you’re wondering why people say “please” and “thank you” to their electronic devices, this is why.

This behaviour isn’t uncommon. “Triggered by the objection, people tend to treat the robot rather as a real person than just a machine by following or at least considering to follow its request to stay switched on, which builds on the core statement of the media equation theory,” the researchers noted. “Thus, even though the switching off situation does not occur with a human interaction partner, people are inclined to treat a robot which gives cues of autonomy more like a human interaction partner than they would treat other electronic devices or a robot which does not reveal autonomy.”
