Henning Holle
Journal Articles
Publisher: Journals Gateway
Journal of Cognitive Neuroscience (2011) 23 (7): 1648–1663.
Published: 01 July 2011
Abstract
The present series of experiments explores several issues related to gesture–speech integration and synchrony during sentence processing. To manipulate gesture–speech synchrony more precisely, we used gesture fragments instead of complete gestures, thereby avoiding the usual long temporal overlap of gestures with their coexpressive speech. In a pretest, we therefore identified the minimal duration of an iconic gesture fragment needed to disambiguate a homonym (i.e., the disambiguation point). In three subsequent ERP experiments, we then investigated whether the gesture information available at the disambiguation point has immediate as well as delayed consequences for the processing of a temporarily ambiguous spoken sentence, and whether these gesture–speech integration processes are susceptible to temporal synchrony. Experiment 1, which used asynchronous stimuli as well as an explicit task, showed clear N400 effects both at the homonym and at the target word presented further downstream, suggesting that asynchrony does not prevent integration under explicit task conditions. No such effects were found when asynchronous stimuli were presented with a shallower task (Experiment 2). Finally, when gesture fragment and homonym were synchronous, results similar to those of Experiment 1 were found, even under shallow task conditions (Experiment 3). We conclude that when iconic gesture fragments and speech are in synchrony, their interaction is more or less automatic. When they are not, more controlled, active memory processes are necessary to combine the gesture fragment and the speech context in such a way that the homonym is disambiguated correctly.
Journal of Cognitive Neuroscience (2007) 19 (7): 1175–1192.
Published: 01 July 2007
Abstract
The present series of experiments explored the extent to which iconic gestures convey information not found in speech. The electroencephalogram (EEG) was recorded as participants watched videos of a person gesturing and speaking simultaneously. The experimental sentences contained an unbalanced homonym in the initial part of the sentence (e.g., She controlled the ball …) and were disambiguated at a target word in the subsequent clause (which during the game … vs. which during the dance …). Coincident with the initial part of the sentence, the speaker produced an iconic gesture that supported either the dominant or the subordinate meaning. Event-related potentials were time-locked to the onset of the target word. In Experiment 1, participants were explicitly asked to judge the congruency between the initial homonym–gesture combination and the subsequent target word. The N400 at target words was smaller after a congruent gesture and larger after an incongruent one, suggesting that listeners can use gestural information to disambiguate speech. Experiment 2 replicated the results with a less explicit task, indicating that the disambiguating effect of gesture is somewhat task-independent. Unrelated grooming movements were added to the paradigm in Experiment 3. The N400 at subordinate targets was smaller after subordinate gestures and larger after dominant gestures as well as after grooming, indicating that an iconic gesture can facilitate the processing of a less frequent word meaning. The N400 at dominant targets no longer varied as a function of the preceding gesture in Experiment 3, suggesting that the addition of meaningless movements weakened the impact of gesture. Thus, the integration of gesture and speech in comprehension does not appear to be an obligatory process but is modulated by situational factors such as the amount of observed meaningful hand movements.