In face-to-face conversations, speakers use hand movements to signal meaning. But do listeners actually use these gestures to predict what someone might say next? In a study using virtual avatars, scientists from the Max Planck Institute for Psycholinguistics and Radboud University in Nijmegen show that listeners used the avatar's gestures to predict upcoming speech. Both behavioural and EEG data indicated that hand gestures facilitate language processing, illustrating the multimodal nature of human communication.
People might wiggle their fingers when they talk about typing, depicting a 'typing' movement. Seeing meaningful hand movements, also called iconic gestures, helps listeners to process spoken language. "We already know that questions produced with iconic gestures get faster responses in conversation", says first author Marlijn ter Bekke.
Hand movements might speed up language processing because they help listeners predict what is coming next. "Gestures typically start before their corresponding speech (such as the word 'typing'), so they already reveal some information about what the speaker might say next", explains Ter Bekke.
To investigate whether listeners use hand gestures to predict upcoming speech, the researchers ran two experiments with virtual avatars. "We used virtual avatars because we can control precisely what they say and how they move, which is good for drawing conclusions from experiments. At the same time, they look natural."
Predicting the target word
In the first experiment, participants listened to questions asked by the avatar, such as "How old were you when you learned to … type?", with a pause before the target word ("type"). The avatar made a typing gesture, a meaningless control movement (such as an arm scratch), or no movement at all. Participants heard the question up to the target word and were asked to guess how it would continue.
As expected, participants predicted the target word (for instance, "type") more often when they had seen the corresponding gesture.
Brain waves
In the second experiment, a different set of participants simply listened to the questions played in full. Their brain activity was recorded with electroencephalography (EEG).
During the silent pause before the target word, gestures affected brain waves typically associated with anticipation. After the target word, gestures reduced the N400, a brain response that indicates how difficult a word is to understand: after seeing a gesture, listeners found it easier to process the meaning of the upcoming word.
"These results show that even when participants are just listening, they use gestures to predict what someone might say next", concludes Ter Bekke.
Robots and virtual avatars
"Our study shows that even gestures produced by a virtual avatar facilitate language processing. If we want artificial agents (like robots or virtual avatars) to be readily understood, and in a human-like way, they should not only communicate with speech, but also with meaningful hand gestures."
Publication
Marlijn ter Bekke, Linda Drijvers & Judith Holler (2025). Co-speech hand gestures are used to predict upcoming meaning. Psychological Science. DOI: 10.1177/09567976251331041