1-3 of 3
Michael J. Brammer
Journal Articles
Publisher: Journals Gateway
Journal of Cognitive Neuroscience (2008) 20 (7): 1220–1234.
Published: 01 July 2008
Abstract
Spoken languages use one set of articulators (the vocal tract), whereas signed languages use multiple articulators, including both manual and facial actions. How sensitive are the cortical circuits for language processing to the particular articulators that are observed? This question can only be addressed with participants who use both speech and a signed language. In this study, we used functional magnetic resonance imaging to compare speechreading with sign processing in deaf native signers of British Sign Language (BSL) who were also proficient speechreaders. The following questions were addressed: To what extent do these different language types rely on a common brain network? To what extent do the patterns of activation differ? How are these networks affected by the articulators that languages use? Common peri-sylvian regions were activated both for speechreading English words and for BSL signs. Distinctive activation was also observed reflecting the language form. Speechreading elicited greater activation in the left mid-superior temporal cortex than BSL, whereas BSL processing generated greater activation at the temporo-parieto-occipital junction in both hemispheres. We probed this distinction further within BSL, where manual signs can be accompanied by different types of mouth action. BSL signs with speech-like mouth actions showed greater superior temporal activation, whereas signs made with non-speech-like mouth actions showed more activation in posterior and inferior temporal regions. Distinct regions within the temporal cortex are not only differentially sensitive to perception of the distinctive articulators for speech and for sign but also show sensitivity to the different articulators within the (signed) language.
Journal of Cognitive Neuroscience (2008) 20 (6): 1003–1020.
Published: 01 June 2008
Abstract
There is strong evidence to suggest that the complex cognitive process underlying mental rotation does not have a discrete neural correlate, but is represented as a distributed neural system. Although the neuroanatomical nodes of this so-called rotation network are well established, there is as yet little empirical evidence to indicate how these nodes interact during task performance. Using an optimized, event-related paradigm, this study aimed to test a previously proposed hypothetical neurocognitive network for mental rotation in female subjects with path analysis, and to examine changes in effective connections across different levels of task difficulty. Path analysis was carried out in combination with a time-resolved functional magnetic resonance imaging (fMRI) analysis in order to relate the observed changes on the network level to changes in specific temporal characteristics of the hemodynamic response function on the level of individual neuroanatomical nodes. Overall, it was found that the investigated sequential model did not provide an adequate fit to the data and that a model with parallel information processing was superior to the serial model. This finding challenges traditional cognitive models that describe the complex cognitive process underlying mental rotation as a set of sequentially organized, functionally distinct processing stages. It was further demonstrated that the observed interregional effective connectivity changes with the level of task demand. These changes were directly related to the time course of the experimental paradigm. The results of path analysis in fMRI should therefore only be interpreted in the light of a specific experimental design and should not be considered as general indicators of effective connections.
Journal of Cognitive Neuroscience (2002) 14 (7): 1064–1075.
Published: 01 October 2002
Abstract
In all signed languages used by deaf people, signs are executed in "sign space" in front of the body. Some signed sentences use this space to map detailed "real-world" spatial relationships directly. Such sentences can be considered to exploit sign space "topographically." Using functional magnetic resonance imaging, we explored the extent to which increasing the topographic processing demands of signed sentences was reflected in the differential recruitment of brain regions in deaf and hearing native signers of British Sign Language (BSL). When BSL signers performed a sentence anomaly judgement task, the occipito-temporal junction was activated bilaterally to a greater extent for topographic than nontopographic processing. The differential role of movement in the processing of the two sentence types may account for this finding. In addition, enhanced activation was observed in the left inferior and superior parietal lobules during processing of topographic BSL sentences. We argue that the left parietal lobe is specifically involved in processing the precise configuration and location of hands in space to represent objects, agents, and actions. Importantly, no differences in these regions were observed when hearing people heard and saw English translations of these sentences. Despite the high degree of similarity in the neural systems underlying signed and spoken languages, exploring the linguistic features that are unique to each of them broadens our understanding of the systems involved in language comprehension.