MarylandToday

Produced by the Office of Strategic Communications

Research

Noisy? Read My Lips!

Facial Cues Help Toddlers With Autism Better Comprehend Speech in Noise, UMD Research Shows

By Sara Gavin

Child with autism works with therapist

Photo by iStock

Young children with autism spectrum disorder who watch a speaker's lips have an easier time understanding speech in a noisy environment, new UMD research finds. The results could be of use in the early intervention programs that many such children participate in.

Whether it’s blaring TVs, honking horns, or chattering classmates, a noisy environment can make understanding speech a struggle for anyone. Now, new research from the Department of Hearing and Speech Sciences finds that paying close attention to faces may help toddlers with autism spectrum disorder (ASD) cut through the aural clutter.

Researchers compared the abilities of children ages 2 to 5, with and without ASD, to understand speech in environments both quiet and distractingly noisy. In findings published this month in the Journal of Neurodevelopmental Disorders, they report that all of the children had more difficulty in the noisy scenario, but that toddlers with ASD were less able to use visual cues such as facial expressions or lip movements to help them understand. However, children with autism who spent more time looking at speakers’ faces appeared to struggle less with background racket.

“Modern households and classrooms include many sources of noise, and the ability to understand speech amidst noise is critical for success and learning,” said lead author Rochelle Newman, professor and chair of the Department of Hearing and Speech Sciences. “Our results point to potential future interventions for children with autism who are having difficulty with this skill.”

The research could prove immediately useful in therapy programs for children with autism, who now number 1 in 54 children, according to the Centers for Disease Control and Prevention. Some youngsters already work specifically on watching faces—for instance, to facilitate social interaction—and such practice could also be incorporated to help them understand speech, Newman said. However, more investigation is needed to gauge the technique's effectiveness compared with other approaches, so she doesn’t yet recommend that therapists prioritize it.

In the study, children sat in front of a screen displaying images of two objects and were verbally prompted to look at one of them. Researchers introduced a person talking at a low volume to complicate the sonic landscape, and then showed a video of a person’s face delivering the instructions to examine whether this affected comprehension.

“While previous studies have found that adults and adolescents with autism have particular difficulty with noise, ours is the first to test this in young children,” Newman said. “This is a critical age for language development, and the more we can discover about specific challenges children with autism are facing, the more effective we will be at developing strategies to overcome them earlier on.”

Newman’s co-authors include Elizabeth Redcay, associate professor of psychology; Laura Anderson Kirby, a former clinical psychology Ph.D. student who is now a clinical associate at the Duke Center for Autism and Brain Development; and Katie Von Holzen, a former postdoctoral associate in hearing and speech sciences now at the Technical University of Dortmund in Germany.

Topics:

Research

Maryland Today is produced by the Office of Strategic Communications for the University of Maryland community weekdays during the academic year, except for university holidays.