Technology and feelings
Technology has gotten pretty good at understanding how we feel, and can now read at least the seven universal emotions on a person's face: anger, contempt, disgust, fear, joy, sadness, and surprise. This has become useful in medicine and psychology, marketing, police investigations, and more recently… driving safety.
EPFL researchers, in collaboration with PSA Peugeot Citroën, have developed an on-board emotion detector based on the analysis of facial expressions. They’ve tested the prototype and reported extremely promising results.
It’s not easy to measure drivers’ emotions in a car, especially in a non-invasive way. To work around this problem, the scientists let the driver’s face do the work for them: researchers in EPFL’s Signal Processing 5 Laboratory (LTS5) teamed up with Peugeot and adapted a facial detection system for use in a car, using a completely non-invasive infrared camera placed behind the steering wheel.
Detecting irritation
The main emotion you want to detect in drivers is irritation; it’s the one that makes drivers more reckless and prone to hasty, unsafe decisions. The problem is that everyone expresses it differently: small gestures, a nervous tic, a slight clenching of the jaw, even an apparently impassive face. To simplify the task at this stage of the project, Hua Gao and Anil Yüce, who spearheaded the research, chose to track only two expressions indicative of the driver’s state: anger and disgust.
To do this, they developed a two-stage test. First, the system was taught to identify the two emotions using a series of photos of subjects expressing them; then the same exercise was carried out using videos. This kind of technology has already been used in several other areas, so there is nothing revolutionary here. The images were captured both in an office setting and in real-life situations, in a car made available for the project.
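The article doesn’t say which detector or classifier the LTS5 team actually used, but the general two-stage idea (fit an expression classifier on labelled still photos, then apply it frame by frame to video) can be sketched roughly as follows. Everything here is an illustrative assumption: the SVM, the 48x48 face crops, and the fake_face_crops helper are stand-ins, not the researchers’ pipeline.

```python
# Illustrative sketch only -- not the LTS5 code.  Stage 1 fits a binary
# "anger vs. disgust" classifier on still photos; stage 2 applies it to
# successive video frames.  Random arrays stand in for real face crops so
# the script runs without a dataset.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

def fake_face_crops(n, offset):
    """Placeholder for flattened 48x48 grayscale face crops; real inputs
    would come from a face detector run on the photo or video material."""
    return rng.normal(loc=offset, scale=1.0, size=(n, 48 * 48))

# Stage 1: learn the two expressions from labelled photos.
X_photos = np.vstack([fake_face_crops(100, 0.0), fake_face_crops(100, 0.5)])
y_photos = np.array([0] * 100 + [1] * 100)    # 0 = anger, 1 = disgust

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", probability=True))
clf.fit(X_photos, y_photos)

# Stage 2: run the same classifier frame by frame over a video clip
# (office recordings first, then in-car footage in the real project).
frames = np.vstack([fake_face_crops(30, 0.0), fake_face_crops(30, 0.5)])
p_disgust = clf.predict_proba(frames)[:, 1]   # per-frame probability

# One simple per-clip decision rule: average the frame-level scores.
print(f"mean P(disgust) over the clip: {p_disgust.mean():.2f}")
```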
Overall, the system performed well, detecting irritation in the majority of cases. When it failed, it was because of the sheer variety of human expressions, something that can be improved over time. Further research aims to have the system update its estimates in real time.
This test also ran in conjunction with another project, which aims to gauge how tired drivers are by measuring the percentage of eyelid closure. The LTS5 is also working on detecting other states from drivers’ faces, such as distraction, and on lip reading for use in speech recognition. They haven’t yet announced what they actually plan to do once they get better at detecting these expressions, but it’s pretty clear that estimating how annoyed and tired a driver is could work wonders for driving safety.
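The article doesn’t give details of the eyelid-closure measure, but a quantity of this kind (often called PERCLOS in the drowsiness-detection literature) is essentially the fraction of time, within a sliding window, during which the eyes are mostly closed. The threshold, window length, and per-frame openness values below are assumptions made purely for illustration.

```python
# Rough sketch of a percentage-of-eyelid-closure measure over a sliding
# window of per-frame eye-openness estimates (0 = fully closed, 1 = open).
# The sample values are invented; real ones would come from a facial
# landmark or eye-tracking model.
from collections import deque

def eyelid_closure_fraction(openness_samples, closed_threshold=0.2):
    """Fraction of samples in which eye openness is below the threshold."""
    closed = sum(1 for o in openness_samples if o < closed_threshold)
    return closed / len(openness_samples)

# Keep roughly the last 60 seconds of estimates at 30 frames per second.
window = deque(maxlen=60 * 30)
for openness in [0.9, 0.85, 0.1, 0.05, 0.8, 0.15, 0.9, 0.92]:  # fake stream
    window.append(openness)

print(f"eyelid closure over window: {eyelid_closure_fraction(window):.2f}")
```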