Could we learn to love a robot? Maybe. New research suggests that drones, at least, could elicit an emotional response in people if we put cute little faces on them.
Researchers at Ben-Gurion University of the Negev (BGU) examined how people react to a wide range of facial expressions displayed on a drone. The study aims to deepen our understanding of how flying drones might one day integrate into society, and how human-robot interaction in general can be made to feel more natural, an area of research that has so far received little attention.
Electronic emotions
“There is a lack of research on how drones are perceived and understood by humans, which is vastly different than ground robots,” says Prof. Jessica Cauchard, lead author of the paper.
“For the first time, we showed that people can recognize different emotions and discriminate between different emotion intensities.”
The research consisted of two experiments, both using drones that displayed stylized facial expressions to convey basic emotions to participants. The goal of both was to find out how people would react to these drone-borne expressions.
Each facial expression in the study was composed of four core features: eyes, eyebrows, pupils, and mouth. Of the emotions the drones could convey, five (joy, sadness, fear, anger, surprise) were recognized 'with high accuracy' from static images, and four of these (joy, surprise, sadness, anger) were recognized most easily when conveyed dynamically through video. Disgust, however, proved hard for people to recognize no matter how the drone conveyed it.
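To make that composition concrete, here is a minimal, purely illustrative sketch in Python of how a stylized drone face could be parameterized from those four features. The specific feature values and the emotion-to-feature mappings are assumptions for demonstration only and are not taken from the paper.

```python
# Illustrative sketch only: feature values and emotion mappings below are
# assumptions for demonstration, not the study's actual stimulus definitions.
from dataclasses import dataclass


@dataclass
class DroneFace:
    """A stylized face built from the four features described in the study."""
    eyes: str       # e.g. "wide", "narrowed", "neutral"
    eyebrows: str   # e.g. "raised", "furrowed", "flat"
    pupils: str     # e.g. "dilated", "small", "centered"
    mouth: str      # e.g. "smile", "frown", "open"


# Hypothetical mapping of the five well-recognized emotions to feature settings.
EXPRESSIONS = {
    "joy":      DroneFace(eyes="neutral",  eyebrows="raised",   pupils="centered", mouth="smile"),
    "sadness":  DroneFace(eyes="narrowed", eyebrows="flat",     pupils="small",    mouth="frown"),
    "fear":     DroneFace(eyes="wide",     eyebrows="raised",   pupils="dilated",  mouth="open"),
    "anger":    DroneFace(eyes="narrowed", eyebrows="furrowed", pupils="small",    mouth="frown"),
    "surprise": DroneFace(eyes="wide",     eyebrows="raised",   pupils="dilated",  mouth="open"),
}

if __name__ == "__main__":
    # Print each hypothetical expression and its feature settings.
    for emotion, face in EXPRESSIONS.items():
        print(f"{emotion}: {face}")
```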
What the team found particularly surprising, however, was how personally invested the participants became in interpreting these emotions.
“Participants were further affected by the drone and presented different responses, including empathy, depending on the drone’s emotion,” Prof. Cauchard says. “Surprisingly, participants created narratives around the drone’s emotional states and included themselves in these scenarios.”
Based on the findings, the authors list a number of recommendations they believe will make drones more readily accepted in social settings or for use in emotional support. The main ones include adding anthropomorphic features to drones, relying mostly on the five basic emotions (as these are easily understood), and using empathetic responses in health and behavior-change applications, since these make people more likely to follow the drone's instructions.
The paper “Drone in Love: Emotional Perception of Facial Expressions on Flying Robots” was presented at the 2021 CHI Conference on Human Factors in Computing Systems and published in the conference proceedings by the Association for Computing Machinery.