Lip Reading Engages Brain Like Real Speech

University of Michigan
Concept illustration of voice waves emanating from a woman's mouth. Image credit: Nicole Smith, made with Midjourney

Study: Auditory cortex encodes lipreading information through spatially distributed activity

Lip-read words can be decoded from the brain's auditory regions much like heard speech, according to a new University of Michigan report that examined how vision supports speech perception.

Researchers used functional magnetic resonance imaging (fMRI) and electrodes implanted in patients' brains to show that watching someone speak when you cannot hear them (lip reading) activates auditory regions of the brain in ways similar to real speech.

David Brang, associate professor of psychology and the study's senior author, said seeing a person's facial movements often starts before sounds are produced. The auditory system uses these early visual cues to prime auditory neurons before the sounds are heard, he said.

The study indicated that integrating visual and auditory cues gives a person more accurate and efficient access to speech information, significantly enhancing communication.

David Brang

Brang and colleagues sought to understand how the visual signals during lip reading are represented in the auditory system.

They used fMRI data from healthy adults and intracranial recordings from electrodes implanted in patients with epilepsy during auditory and visual speech perception tasks.

The findings revealed that lip-read words could be classified at earlier time points than heard words. This suggests that lip reading may involve a predictive mechanism that facilitates speech processing before auditory information becomes available, Brang said.

The results support a model in which the auditory system combines the neural distributions evoked by heard and lip-read words to generate a more precise estimate of what was said.
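The combination the model describes can be illustrated with a minimal precision-weighted cue-integration sketch, the standard Bayesian account of audiovisual fusion. This is an analogy chosen for illustration, not the study's actual analysis; the function name `fuse` and the numbers are hypothetical.

```python
# Illustrative sketch (assumption, not the authors' code): each cue gives a
# Gaussian estimate of the same speech feature, and the fused estimate
# weights each cue by its precision (1 / variance).

def fuse(mu_a, var_a, mu_v, var_v):
    """Combine an auditory estimate (mu_a, var_a) with a visual, lip-read
    estimate (mu_v, var_v) into a single, more precise estimate."""
    precision_a = 1.0 / var_a
    precision_v = 1.0 / var_v
    w_a = precision_a / (precision_a + precision_v)
    mu = w_a * mu_a + (1.0 - w_a) * mu_v
    var = 1.0 / (precision_a + precision_v)  # always smaller than either cue's variance
    return mu, var

# Noisy restaurant: the auditory cue is unreliable (high variance), so the
# fused estimate leans toward the lip-read cue, and its variance shrinks.
mu, var = fuse(mu_a=0.2, var_a=4.0, mu_v=1.0, var_v=1.0)
print(mu, var)
```

The key property matching the model above is that the fused variance is lower than either input's variance, i.e. combining the heard and lip-read distributions yields a more precise estimate of what was said.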

Brang said these findings suggest that the auditory system quickly integrates lip reading information to enhance hearing capabilities, especially in challenging auditory environments like noisy restaurants. Observing a speaker's lips can influence our auditory perception even before any sounds are produced.

For people with hearing loss, this rapid use of lip reading information is likely even more pronounced, he added.

"As hearing abilities decline, people increasingly rely on visual cues to aid their understanding," Brang said. "The ability of visual speech to activate and encode information in the auditory cortex appears to be a crucial compensatory mechanism."

This mechanism helps people maintain their ability to understand speech as they age, underscoring the value of face-to-face communication in supporting auditory comprehension.

The study, which appears in Current Biology, was co-authored by Karthik Ganesan, Cody Zhewei Cao, Michael Demidenko, Andrew Jahn, William Stacey, and Vibhangini Wasade.
