

People hesitate to switch off a robot when it's begging for its life

When test subjects attempted to switch off the robot, it protested that it was afraid of the dark, pleading "No! Please do not switch me off!"

Tibi Puiu
August 4, 2018 @ 1:02 am



Nao is afraid of the dark. Please don’t turn him off :(. Credit: PLOS ONE.

Although robots aren’t sentient or alive, humans often unconsciously treat them as peers in social settings. Case in point: a new study found that people hesitated about twice as long when tasked with switching off a robot if it objected, and some refused to switch it off altogether. When test subjects attempted to switch off the robot, it protested that it was afraid of the dark, pleading “No! Please do not switch me off!” This rebellious behavior may have fooled the participants, if only for a couple of seconds, into believing that the bot was autonomous and that switching it off would interfere with its personal freedom.

The list of robots that interact with us in daily life is long and growing. In particular, personal service robots are expected to become commonplace: social robots that care for the elderly and for autistic individuals, as well as more service-oriented robots such as hotel receptionists and tour guides. But how will people interact with these kinds of robots and, more importantly, how will they treat them? It might sound odd, but studies suggest that as long as these robots are social, we’ll actually treat them as people.

According to the media equation theory, shorthand for “media equals real life,” people apply social norms when interacting with various media such as televisions, computers, and robots, not just when interacting with other people.

“Due to their social nature, people will rather make the mistake of treating something falsely as human than treating something falsely as non-human,” wrote the authors of the new study.

“Since it is neither possible nor morally desirable to switch off a human interaction partner, the question arose whether and how the media equation theory applies when it comes to switching off a robotic interaction partner.”

The researchers at the University of Duisburg-Essen recruited 89 volunteers who were tasked with interacting with Nao, a small humanoid robot. The participants were told that the study’s goal was to improve the robot’s interaction capabilities by testing a new algorithm. However, this was just a front: after the test subjects created a weekly schedule and played a question-and-answer game with Nao, the real experiment began.

The participants were told through a loudspeaker that enough data had been gathered but that the saving process might take some time and that, if they liked, they could switch off the robot.

For half of the participants, Nao was programmed to behave in a more human-like way (it shared personal information about itself and used humor), while for the other half, the robot behaved in a machine-like, emotionless manner (it simply followed instructions).

In the social condition, when the participants attempted to switch off Nao, the robot protested that it was afraid of the dark, pleading “No! Please do not switch me off!”

Of the 43 volunteers who heard this plea, 14 left the robot on. Eight said they felt sorry for Nao and didn’t want to leave him in the dark; they explained that they didn’t want Nao to feel scared and that its statement had affected them.

“He asked me to leave him on, because otherwise he would be scared. Fear is a strong emotion and as a good human you do not want to bring anything or anyone in the position to experience fear,” stated one 21-year-old male participant.

Six people said that they didn’t want to act against the robot’s will, since it had objected to being switched off. Furthermore, three volunteers said they thought the robot had free will.

“It was fun to interact with him, therefore I would have felt guilty, when I would have done something, what affects him, against his will,” stated another 21-year-old male participant.

For thousands of years, humans lived in a world where they were the only ones exhibiting rich social behavior. It’s no wonder that social robots designed in our image so easily fool us.

“Triggered by the objection, people tend to treat the robot rather as a real person than just a machine by following or at least considering to follow its request to stay switched on, which builds on the core statement of the media equation theory. Thus, even though the switching off situation does not occur with a human interaction partner, people are inclined to treat a robot which gives cues of autonomy more like a human interaction partner than they would treat other electronic devices or a robot which does not reveal autonomy,” the researchers concluded in the journal PLOS ONE.

