Brains of individuals with autism successfully encode facial emotions, study reveals

A study that tested neural activity in the brains of individuals with Autism Spectrum Disorder (ASD) reveals that they successfully encode facial emotions in their neural signals, and they do so about as well as those without ASD. The research, led by scientists at Stony Brook University, suggests that the difficulties individuals with ASD have in reading facial emotions arise from problems translating facial emotion information that the brain has successfully encoded, not from a failure to encode it in the first place. The findings are published early online in Biological Psychiatry: Cognitive Neuroscience and Neuroimaging.

According to Matthew D. Lerner, PhD, Senior Author and Associate Professor of Psychology, Psychiatry & Pediatrics in the Department of Psychology at Stony Brook University, this electroencephalogram (EEG) study allowed the researchers to test a fundamental question about autism that has not yet been clearly addressed: Are the challenges in emotion recognition due to the emotional information not being encoded in the brain in the first place, or is it accurately encoded and just not deployed?

"Our findings indicate the latter part of that question appears to be the more likely explanation for why many autistic individuals struggle to read facial emotions. Particularly now, when mask-wearing is pervasive and everyone has less facial emotion information available to them in daily life, it is especially important to understand how, when, and for whom struggles in reading these emotions emerge – and also when we may be misunderstanding the nature of these struggles."

Matthew D. Lerner, PhD, Senior Author and Associate Professor of Psychology, Psychiatry & Pediatrics in the Department of Psychology at Stony Brook University

The study involved a total of 192 individuals of different ages nationwide whose neural signals were recorded while they viewed many facial emotions. The team used a contemporary, discriminative machine learning approach, deep convolutional neural networks, to classify facial emotions from these recordings. The algorithm examined the EEG activity of individuals with and without ASD while they watched faces and decoded what emotions they saw; for each individual face, it could indicate what emotion the person was viewing, essentially mapping the neural patterns that participants' brains used to decode emotions.
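To make the decoding setup concrete, the sketch below shows the general technique in PyTorch: a small convolutional network that takes a stimulus-locked EEG epoch and outputs which emotion the participant was viewing. This is a minimal illustration only; the electrode count, epoch length, layer sizes, and six-way emotion labels are assumptions, and the paper's actual architecture and preprocessing may differ.

```python
# Minimal sketch (assumed architecture, not the authors' actual model) of
# decoding the viewed facial emotion from a single EEG epoch with a deep
# convolutional neural network. Shapes and class counts are illustrative.
import torch
import torch.nn as nn

N_CHANNELS = 32   # assumed number of EEG electrodes
N_SAMPLES = 256   # assumed time points per stimulus-locked epoch
N_EMOTIONS = 6    # assumed number of emotion categories

class EEGEmotionCNN(nn.Module):
    """Temporal filtering, spatial mixing across electrodes, then a classifier."""

    def __init__(self) -> None:
        super().__init__()
        self.features = nn.Sequential(
            # Temporal convolution: learn frequency-like filters within each channel.
            nn.Conv2d(1, 16, kernel_size=(1, 25), padding=(0, 12)),
            nn.BatchNorm2d(16),
            nn.ELU(),
            # Spatial convolution: combine information across all electrodes.
            nn.Conv2d(16, 32, kernel_size=(N_CHANNELS, 1)),
            nn.BatchNorm2d(32),
            nn.ELU(),
            nn.AvgPool2d(kernel_size=(1, 4)),
            nn.Dropout(p=0.5),
        )
        self.classifier = nn.Linear(32 * (N_SAMPLES // 4), N_EMOTIONS)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x has shape (batch, 1, electrodes, time); returns emotion logits.
        return self.classifier(self.features(x).flatten(start_dim=1))

# Eight synthetic epochs stand in for real recordings here.
epochs = torch.randn(8, 1, N_CHANNELS, N_SAMPLES)
logits = EEGEmotionCNN()(epochs)
print(logits.argmax(dim=1))  # predicted emotion index for each epoch
```

In a setup like this, per-participant classification accuracy is what lets researchers ask whether emotion information is present in the neural signal at all, which is the comparison the study draws between participants with and without ASD.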

According to the authors, the findings have major implications for understanding how individuals with ASD process emotions and for developing new types of interventions to help them better read the facial emotions of other people.

"Specifically, many interventions try to help people with ASD to compensate for not understanding emotions – essentially, they are emotion recognition prosthetics. However, our findings suggest that these approaches may not be helpful, and rather we should focus on capitalizing on and reinforcing their intact encoding of emotions," adds Dr. Lerner.

The research, conducted in collaboration with the University of Trento, involved imaging and data collection made possible by the Institute for Advanced Computational Science at Stony Brook University and its SeaWulf computing system.

Source:

Stony Brook University

Journal reference:

Torres, J. M. M., et al. (2021). Facial emotions are accurately encoded in the neural signal of those with Autism Spectrum Disorder: A deep learning approach. Biological Psychiatry: Cognitive Neuroscience and Neuroimaging. https://doi.org/10.1016/j.bpsc.2021.03.015
