

Listeners use gestures to predict upcoming words, virtual avatar study finds

Schematic stimulus overview. Credit: Psychological Science (2025). DOI: 10.1177/09567976251331041

In face-to-face conversations, speakers use hand movements to signal meaning. But do listeners actually use these gestures to predict what someone might say next? In a study using virtual avatars, scientists from the Max Planck Institute for Psycholinguistics and Radboud University in Nijmegen showed that listeners used the avatar's gestures to predict upcoming speech. Both behavioral and EEG data indicated that hand gestures facilitate language processing, illustrating the multimodal nature of human communication.

The research is published in the journal Psychological Science.

People might wiggle their fingers when they talk about typing, depicting a "typing" movement. Seeing meaningful hand movements, also called iconic gestures, helps listeners process spoken language. "We already know that questions produced with iconic gestures get faster responses in conversation," says first author Marlijn ter Bekke.

Hand movements might speed up language processing because they help to predict what is coming up. "Gestures typically start before their corresponding speech (such as the word 'typing'), so they already show some information about what the speaker might say next," explains Ter Bekke.

To investigate whether listeners use hand gestures to predict upcoming speech, the researchers decided to run two experiments using avatars. "We used virtual avatars because we can control precisely what they say and how they move, which is good for drawing conclusions from experiments. At the same time, they look natural."

Predicting the target word

In the first experiment, participants listened to questions asked by the avatar, such as "How old were you when you learned to … type?", with a pause before the target word ("type"). The avatar either made a typing gesture, a meaningless control movement (such as an arm scratch) or no movement. Participants heard the question up to the target word and were asked to guess how it would continue.

As expected, participants predicted the target word (for instance, "type") more often when they had seen the corresponding gesture.

Brain waves

In the second experiment, a different set of participants simply listened to the questions played in full. Their brain activity was recorded with electroencephalography (EEG).

During the silent pause before the target word, gestures affected brain waves that are typically associated with anticipation. After the target word, gestures affected brain responses that indicate how difficult it is to understand a word (a reduced N400 effect). After seeing gestures, people found it easier to process the meaning of the upcoming word.

"These results show that even when participants are just listening, they use gestures to predict what someone might say next," concludes Ter Bekke.

"Our study shows that even gestures produced by a virtual facilitate . If we want artificial agents (like robots or virtual avatars) to be readily understood, and in a human-like way, they should not only communicate with speech, but also with meaningful ."

More information: Marlijn ter Bekke et al, Co-Speech Hand Gestures Are Used to Predict Upcoming Meaning, Psychological Science (2025). DOI: 10.1177/09567976251331041

Journal information: Psychological Science

Provided by Max Planck Society

Citation: Listeners use gestures to predict upcoming words, virtual avatar study finds (2025, April 22) retrieved 27 August 2025 from /news/2025-04-gestures-upcoming-words-virtual-avatar.html
