Search results 1–2 of 2 for Nicole Malfait
Journal Articles
Publisher: Journals Gateway
Journal of Cognitive Neuroscience (2014) 26 (7): 1572–1586.
Published: 01 July 2014
Different Neural Networks Are Involved in Audiovisual Speech Perception Depending on the Context

Abstract
How are we able to easily and accurately recognize speech sounds despite the lack of acoustic invariance? One proposed solution is the existence of a neural representation of speech syllable perception that transcends its sensory properties. In the present fMRI study, we used two different audiovisual speech contexts, both intended to identify brain areas whose levels of activation would be conditioned by the speech percept independently of its sensory source information. We exploited McGurk audiovisual fusion to obtain short oddball sequences of syllables that were either (a) acoustically different but perceived as similar or (b) acoustically identical but perceived as different. We reasoned that, if there is a single network of brain areas representing abstract speech perception, this network would show reduced activity when presented with syllables that are acoustically different but perceived as similar, and increased activity when presented with syllables that are acoustically identical but perceived as distinct. Consistent with the long-standing idea that speech production areas may be involved in speech perception, we found that frontal areas were part of the neural network that showed reduced activity for sequences of perceptually similar syllables. A different network emerged, however, when focusing on areas that exhibited increased activity for perceptually different but acoustically identical syllables: this alternative network included auditory areas but no left frontal activations. In addition, our findings point to the importance of subcortical structures, which are much less often considered when addressing issues pertaining to perceptual representations.
Journal of Cognitive Neuroscience (2010) 22 (7): 1493–1503.
Published: 01 July 2010
fMRI Activation during Observation of Others' Reach Errors

Abstract
When exposed to novel dynamical conditions (e.g., externally imposed forces), neurologically intact subjects easily adjust motor commands on the basis of their own reaching errors. Subjects can also benefit from visually observing others' kinematic errors. Here, using fMRI, we scanned subjects watching movies depicting another person learning to reach in a novel dynamic environment created by a robotic device. Passive observation of reaching movements (whether or not they were perturbed by the robot) was associated with increased activation in fronto-parietal regions that are normally recruited in active reaching. We found significant clusters in parieto-occipital cortex and the intraparietal sulcus, as well as in dorsal premotor cortex. Moreover, part of the network that has been shown to be engaged in processing self-generated reach errors also appears to be involved in observing reach errors committed by others. Specifically, activity in left intraparietal sulcus and left dorsal premotor cortex, as well as in right cerebellar cortex, was modulated by the amplitude of observed kinematic errors.