Primates are among the few mammals with a dedicated neural system for processing faces, one that demands considerable processing power and energy. While some people resemble each other, no two humans have exactly the same face (not even identical twins). Depending on how much time they spend around others, some people see, and thus analyze, thousands of faces each day. Recognizing emotions is an even more complicated task, one that challenges even the brain.
Researchers at Caltech, Cedars-Sinai Medical Center, and Huntington Memorial Hospital in Pasadena examined activity in the amygdala, the brain region responsible for encoding information related to emotional reactions. Their findings suggest that some brain cells recognize emotion in faces subjectively, reflecting the viewer's preconception, rather than through an entirely objective process that would reveal the emotion actually displayed. This is the first time neurons in the amygdala have been shown to encode the subjective judgment of emotions shown in face stimuli, rather than simply the stimulus features themselves.
For the study, the researchers recorded from over 200 single neurons in the amygdalae of seven epilepsy patients who had depth electrodes surgically implanted for clinical monitoring. Neural activity was recorded while the participants viewed images of partially obscured faces expressing either happiness or fear, and each participant was asked to judge which of the two emotions was shown. Here's what the authors report:
“During trials where subjects responded correctly, we found neurons that distinguished fear vs. happy emotions as expressed by the displayed faces. During incorrect trials, these neurons indicated the patients’ subjective judgment. Additional analysis revealed that, on average, all neuronal responses were modulated most by increases or decreases in response to happy faces, and driven predominantly by judgments about the eye region of the face stimuli,” reads the abstract of the paper, published in the Proceedings of the National Academy of Sciences.
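The key move in that analysis is to use error trials to pull the stimulus and the judgment apart: on correct trials the two coincide, so only the incorrect trials reveal which one a neuron actually tracks. Here is a minimal Python sketch of that logic, with simulated spike counts and made-up numbers; it illustrates the comparison only and is not the authors' analysis code:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical, simulated data: each trial has a shown emotion, the
# patient's judgment, and a spike count for one amygdala neuron.
n_trials = 200
shown = rng.choice(["fear", "happy"], size=n_trials)

# Simulate a neuron whose firing follows the *judgment*: on ~20% of
# trials the patient misjudges the face, and the spike count tracks
# what the patient reported, not what was displayed.
correct = rng.random(n_trials) > 0.2
judged = np.where(correct, shown, np.where(shown == "fear", "happy", "fear"))
spikes = np.where(judged == "happy",
                  rng.poisson(12, n_trials),   # higher rate for "happy"
                  rng.poisson(6, n_trials))    # lower rate for "fear"

def mean_rate(mask):
    """Mean spike count over the trials selected by the boolean mask."""
    return spikes[mask].mean()

# On correct trials, stimulus and judgment coincide, so the neuron
# appears to distinguish fear vs. happy faces either way.
print("correct, by shown emotion:   happy=%.1f  fear=%.1f" %
      (mean_rate(correct & (shown == "happy")),
       mean_rate(correct & (shown == "fear"))))

# On incorrect trials the two sorting schemes disagree: sorting by the
# shown face scrambles the effect, while sorting by the judgment
# preserves it -- the signature of subjective coding.
print("incorrect, by shown emotion: happy=%.1f  fear=%.1f" %
      (mean_rate(~correct & (shown == "happy")),
       mean_rate(~correct & (shown == "fear"))))
print("incorrect, by judgment:      happy=%.1f  fear=%.1f" %
      (mean_rate(~correct & (judged == "happy")),
       mean_rate(~correct & (judged == "fear"))))
```

With the simulated neuron above, the happy/fear split survives on error trials only when trials are sorted by the patient's judgment, which is exactly the kind of dissociation the study reports.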
What this means is that the amygdala doesn't necessarily respond to what's actually there in the world, but to what SEEMS to be there after it passes through an internal filter. Things become even more interesting when you consider that the amygdala is linked to a number of psychiatric conditions, such as depression and autism, many of which might involve a skewed perception of the patient's surroundings. That doesn't mean the amygdala alone is responsible for all of this.
“Of course, the amygdala doesn’t accomplish anything by itself. What we need to know next is what happens elsewhere in the brain, so we need to record not only from the amygdala, but also from other brain regions with which the amygdala is connected,” says Shuo Wang, a postdoctoral fellow at Caltech and first author of the paper.