Yu He (2 results)
Journal Articles
Journal of Cognitive Neuroscience (2008) 20 (2): 285–295.
Published: 01 February 2008
Abstract
There is strong evidence for dissociable “what” and “where” pathways in the auditory system, but considerable debate remains regarding the functional role of these pathways. The sensory-motor account of spatial processing posits that dorsal brain regions (e.g., inferior parietal lobule, IPL) mediate the sensory-motor integration required during “where” responding. An alternative account suggests that the IPL plays an important role in monitoring sound location. To test these two models, we used a mixed block and event-related functional magnetic resonance imaging (fMRI) design in which participants responded to occasional repetitions in either sound location (“where” task) or semantic category (“what” task). The fMRI data were analyzed with the general linear model using separate regressors to represent sustained and transient activity in both listening conditions. This analysis revealed more sustained activity in right dorsal brain regions, including the IPL and superior frontal sulcus, during the location task than during the category task, after accounting for transient activity related to target detection and the motor response. Conversely, we found greater sustained activity in the left superior temporal gyrus and left inferior frontal gyrus during the category task than during the location task. Transient target-related activity in both tasks was associated with enhanced signal in the left pre- and postcentral gyrus, prefrontal cortex, and bilateral IPL. These results suggest dual roles for the right IPL in auditory working memory—one involved in monitoring and updating sound location independent of motor responding, and another that underlies the integration of sensory and motor functions.
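The analytic idea here is that a mixed block/event-related design lets sustained, task-set-related activity and transient, target-related activity each receive their own regressor in the general linear model. A minimal sketch of that idea follows, assuming a canonical double-gamma HRF and made-up block and target timings; it is an illustration of the general approach, not the authors' actual analysis pipeline.

# Sketch: design matrix with separate sustained (block) and transient (target)
# regressors for a mixed block/event-related fMRI analysis.
# All timings, durations, and HRF parameters below are illustrative assumptions.
import numpy as np
from scipy.stats import gamma

TR = 2.0                        # repetition time in seconds (assumed)
n_scans = 300
t = np.arange(n_scans) * TR     # scan acquisition times

def hrf(x):
    # Canonical double-gamma haemodynamic response function
    return gamma.pdf(x, 6) - 0.35 * gamma.pdf(x, 16)

# Sustained regressor: boxcar spanning each task block (assumed onsets/duration)
block_onsets, block_dur = [30, 120, 210], 30
sustained = np.zeros(n_scans)
for on in block_onsets:
    sustained[(t >= on) & (t < on + block_dur)] = 1.0

# Transient regressor: unit impulses at target (repetition) onsets (assumed)
target_onsets = [42, 58, 134, 150, 222, 238]
transient = np.zeros(n_scans)
for on in target_onsets:
    transient[int(round(on / TR))] = 1.0

# Convolve both with the HRF and assemble the design matrix (plus an intercept)
h = hrf(np.arange(0, 32, TR))
X = np.column_stack([
    np.convolve(sustained, h)[:n_scans],
    np.convolve(transient, h)[:n_scans],
    np.ones(n_scans),
])
# For a voxel time series y, betas would come from ordinary least squares:
# beta = np.linalg.lstsq(X, y, rcond=None)[0]

Contrasting the sustained betas across the location and category conditions isolates task-set activity, while the transient regressor absorbs signal tied to target detection and the motor response.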
Journal Articles
Publisher: Journals Gateway
Journal of Cognitive Neuroscience (2005) 17 (5): 811–818.
Published: 01 May 2005
Abstract
The discrimination of concurrent sounds is paramount to speech perception. During social gatherings, listeners must extract information from a composite acoustic wave, which sums multiple individual voices that are simultaneously active. The observers' ability to identify two simultaneously presented vowels improves with increasing separation between the fundamental frequencies (f0) of the two vowels. Event-related potentials to stimuli presented during attend and ignore conditions revealed activity between 130 and 170 msec after sound onset that reflected the f0 differences between the two vowels. Another, more posterior and right-lateralized, negative wave maximal at 250 msec, and a central-parietal slow negativity were observed only during vowel identification and may index stimulus categorization. This sequence of neural events supports a multistage model of auditory scene analysis in which the spectral pattern of each vowel constituent is automatically extracted and then matched against representations of those vowels in working memory.
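The effect reported between 130 and 170 msec is the kind of component typically quantified as a mean amplitude within a fixed latency window, compared across f0-separation conditions. A minimal sketch follows, with an assumed sampling rate, epoch layout, and condition names standing in for the study's data.

# Sketch: mean ERP amplitude in a 130-170 ms post-onset window, per condition.
# Array shapes, sampling rate, and condition labels are assumptions for
# illustration only; random data stand in for real single-trial epochs.
import numpy as np

sfreq = 500.0                           # sampling rate in Hz (assumed)
tmin = -0.1                             # epoch start relative to sound onset (s)
times = tmin + np.arange(400) / sfreq   # 400 samples: -100 ms to about +698 ms

# trials x samples for one electrode, per condition
rng = np.random.default_rng(0)
epochs = {
    "small_f0_separation": rng.standard_normal((80, times.size)),
    "large_f0_separation": rng.standard_normal((80, times.size)),
}

# Mean amplitude in the 130-170 ms window, averaged over trials and samples
window = (times >= 0.130) & (times <= 0.170)
mean_amp = {cond: trials[:, window].mean() for cond, trials in epochs.items()}

# The f0-related effect is then the between-condition difference
effect = mean_amp["large_f0_separation"] - mean_amp["small_f0_separation"]
print(mean_amp, effect)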