Philip K. McGuire
1–3 of 3 results
Journal Articles
Journal of Cognitive Neuroscience (2008) 20 (9): 1656–1669.
Published: 01 September 2008
Abstract
The Hayling Sentence Completion Task (HSCT) is known to activate left hemisphere frontal and temporal language regions. However, the effective connectivity between frontal and temporal language regions associated with the task has yet to be examined. The aims of the study were to examine activation and effective connectivity during the HSCT using a functional magnetic resonance imaging (fMRI) paradigm in which participants made overt verbal responses. We predicted that producing an incongruent response (response suppression), compared to a congruent one (response initiation), would be associated with greater activation in the left prefrontal cortex and an increase in the effective connectivity between temporal and frontal regions. Fifteen participants were scanned while completing 80 sentence stems. The congruency and constraint of the sentences varied across trials. Dynamic Causal Modeling (DCM) and Bayesian Model Selection (BMS) were used to compare a set of alternative DCMs of fronto-temporal connectivity. The HSCT activated regions in the left temporal and prefrontal cortices and the cuneus. Response suppression was associated with greater activation in the left middle and orbital frontal gyri and the bilateral precuneus than response initiation. Left middle temporal and frontal regions identified by the conventional fMRI analyses were entered into the DCM analysis. Using a systematic BMS procedure, the optimal DCM showed that the connection from the left middle temporal gyrus to the frontal region, which was driven by verbal stimuli per se, was significantly increased in strength during response suppression compared to initiation. Greater effective connectivity between left temporal and prefrontal regions during response suppression may reflect the transfer of information from posterior temporal regions, where semantic and lexical information is stored, to prefrontal regions, where it is manipulated in preparation for an appropriate response.
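The BMS procedure mentioned in this abstract ranks competing DCMs by their log model evidence (in practice, a free-energy approximation), and the Bayes factor between two models is the exponential of the difference in their log evidences. A minimal sketch in Python of that comparison step, with hypothetical model names and evidence values (not taken from the study):

import numpy as np

# Hypothetical log model evidences (free-energy approximations) for three
# alternative DCMs of fronto-temporal connectivity. Names and values are
# illustrative only, not results from the paper.
log_evidence = {
    "temporal_to_frontal_modulated": -1152.3,
    "frontal_to_temporal_modulated": -1159.8,
    "no_modulation": -1166.4,
}

best = max(log_evidence, key=log_evidence.get)
for name, f in log_evidence.items():
    if name != best:
        log_bf = log_evidence[best] - f
        # A log Bayes factor above ~3 (Bayes factor > ~20) is conventionally
        # taken as strong evidence for the winning model.
        print(f"{best} over {name}: log BF = {log_bf:.1f} (BF ~ {np.exp(log_bf):.3g})")

Under these made-up numbers, the model in which the temporal-to-frontal connection is modulated would win decisively; the study itself compared free energies estimated from its fMRI data.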
Journal Articles
Journal of Cognitive Neuroscience (2008) 20 (7): 1220–1234.
Published: 01 July 2008
Abstract
Spoken languages use a single set of articulators (the vocal tract), whereas signed languages use multiple articulators, including both manual and facial actions. How sensitive are the cortical circuits for language processing to the particular articulators that are observed? This question can only be addressed with participants who use both speech and a signed language. In this study, we used functional magnetic resonance imaging to compare speechreading and sign processing in deaf native signers of British Sign Language (BSL) who were also proficient speechreaders. The following questions were addressed: To what extent do these different language types rely on a common brain network? To what extent do the patterns of activation differ? How are these networks affected by the articulators that languages use? Common perisylvian regions were activated both for speechreading English words and for BSL signs. Distinctive activation was also observed, reflecting the language form: speechreading elicited greater activation in the left mid-superior temporal cortex than BSL, whereas BSL processing generated greater activation at the temporo-parieto-occipital junction in both hemispheres. We probed this distinction further within BSL, where manual signs can be accompanied by different types of mouth action. BSL signs with speech-like mouth actions showed greater superior temporal activation, whereas signs made with non-speech-like mouth actions showed more activation in posterior and inferior temporal regions. Distinct regions within the temporal cortex are thus not only differentially sensitive to the perception of the distinctive articulators for speech and for sign but also show sensitivity to the different articulators used within the (signed) language.
Journal Articles
Journal of Cognitive Neuroscience (2002) 14 (7): 1064–1075.
Published: 01 October 2002
Abstract
In all signed languages used by deaf people, signs are executed in “sign space” in front of the body. Some signed sentences use this space to map detailed “real-world” spatial relationships directly; such sentences can be considered to exploit sign space “topographically.” Using functional magnetic resonance imaging, we explored the extent to which increasing the topographic processing demands of signed sentences was reflected in the differential recruitment of brain regions in deaf and hearing native signers of British Sign Language (BSL). When BSL signers performed a sentence anomaly judgement task, the occipito-temporal junction was activated bilaterally to a greater extent for topographic than for nontopographic processing. The differential role of movement in the processing of the two sentence types may account for this finding. In addition, enhanced activation was observed in the left inferior and superior parietal lobules during processing of topographic BSL sentences. We argue that the left parietal lobe is specifically involved in processing the precise configuration and location of the hands in space to represent objects, agents, and actions. Importantly, no differences in these regions were observed when hearing people heard and saw English translations of these sentences. Despite the high degree of similarity in the neural systems underlying signed and spoken languages, exploring the linguistic features unique to each broadens our understanding of the systems involved in language comprehension.