Thomas J. Grabowski
Journal Articles
Publisher: Journals Gateway
Journal of Cognitive Neuroscience (2013) 25 (4): 517–533.
Published: 01 April 2013
Abstract
Biological differences between signed and spoken languages may be most evident in the expression of spatial information. PET was used to investigate the neural substrates supporting the production of spatial language in American Sign Language as expressed by classifier constructions, in which handshape indicates object type and the location/motion of the hand iconically depicts the location/motion of a referent object. Deaf native signers performed a picture description task in which they overtly named objects or produced classifier constructions that varied in location, motion, or object type. In contrast to the expression of location and motion, the production of both lexical signs and object type classifier morphemes engaged left inferior frontal cortex and left inferior temporal cortex, supporting the hypothesis that unlike the location and motion components of a classifier construction, classifier handshapes are categorical morphemes that are retrieved via left hemisphere language regions. In addition, lexical signs engaged the anterior temporal lobes to a greater extent than classifier constructions, which we suggest reflects increased semantic processing required to name individual objects compared with simply indicating the type of object. Both location and motion classifier constructions engaged bilateral superior parietal cortex, with some evidence that the expression of static locations differentially engaged the left intraparietal sulcus. We argue that bilateral parietal activation reflects the biological underpinnings of sign language. To express spatial information, signers must transform visual–spatial representations into a body-centered reference frame and reach toward target locations within signing space.
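The final sentence describes transforming a visual–spatial representation into a body-centered reference frame. As a purely illustrative aside (not from the paper), that kind of transform can be sketched as a translation to the body origin followed by a rotation into body axes; all variable names here are assumptions:

```python
import numpy as np

def to_body_frame(point_world, body_origin, body_rotation):
    """Express a world-frame (visual) location in a body-centered frame:
    translate to the body origin, then rotate into the body's axes."""
    return body_rotation.T @ (point_world - body_origin)

# Toy example: body located at (1, 0, 0), rotated 90 degrees about z
theta = np.pi / 2
Rz = np.array([[np.cos(theta), -np.sin(theta), 0.0],
               [np.sin(theta),  np.cos(theta), 0.0],
               [0.0,            0.0,           1.0]])
target_world = np.array([1.0, 1.0, 0.0])
print(to_body_frame(target_world, np.array([1.0, 0.0, 0.0]), Rz))
```

This is only a geometric sketch of the general idea of a reference-frame change, not a claim about how signers' nervous systems implement it.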
Journal of Cognitive Neuroscience (2008) 20 (9): 1698–1710.
Published: 01 September 2008
Abstract
Impairments in phonological processing have been associated with damage to the region of the left posterior superior temporal gyrus (pSTG), but the extent to which this area supports phonological processing, independent of semantic processing, is less clear. We used repetition priming and neural repetition suppression during functional magnetic resonance imaging (fMRI) in an auditory pseudoword repetition task as a semantics-free model of lexical (whole-word) phonological access. Across six repetitions, we observed repetition priming in terms of decreased reaction time and repetition suppression in terms of reduced neural activity. An additional analysis aimed at sublexical phonology did not show significant effects in the areas where repetition suppression was observed. To test if these areas were relevant to real word production, we performed a conjunction analysis with data from a separate fMRI experiment which manipulated word frequency (a putative index of lexical phonological access) in picture naming. The left pSTG demonstrated significant effects independently in both experiments, suggesting that this area participates specifically in accessing lexical phonology.
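The conjunction analysis described above asks which voxels show significant effects independently in both experiments. A common way to implement this is a minimum-statistic conjunction over thresholded maps; the sketch below is a simplified illustration (function and variable names are mine, not from the paper):

```python
import numpy as np

def conjunction_mask(t_map_a, t_map_b, threshold):
    """Minimum-statistic conjunction: a voxel survives only if its
    statistic exceeds the threshold in BOTH maps independently."""
    return np.minimum(t_map_a, t_map_b) > threshold

# Toy t-maps over the same voxel grid for two experiments
t_repetition_suppression = np.array([2.4, 3.1, 0.8, 2.2])
t_word_frequency         = np.array([2.9, 1.1, 2.5, 2.0])

mask = conjunction_mask(t_repetition_suppression, t_word_frequency,
                        threshold=1.96)
print(mask)  # True only where both experiments exceed 1.96
```

Taking the voxel-wise minimum before thresholding enforces that each effect is significant on its own, rather than testing their average.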
Journal of Cognitive Neuroscience (2007) 19 (4): 617–631.
Published: 01 April 2007
Abstract
Cognitive models of word production attribute the word frequency effect (i.e., the fact that less frequent words take longer to produce) to an increased processing cost to activate the whole-word (lexical) phonological representation. We performed functional magnetic resonance imaging (fMRI) while subjects produced overt naming responses to photographs of animals and manipulable objects that had high name agreement but varied in frequency, with the purpose of identifying neural structures participating specifically in activating whole-word phonological representations, as opposed to activating lexical semantic representations or articulatory-motor routines. Blood oxygen level-dependent responses were analyzed using a parametric approach based on the frequency with which each produced word appears in the language. Parallel analyses were performed for concept familiarity and word length, which provided indices of semantic and articulatory loads. These analyses permitted us to identify regions related to word frequency alone, and therefore likely to be related specifically to activation of phonological word forms. We hypothesized that the increased processing cost of producing lower-frequency words would correlate with activation of the left posterior inferotemporal (IT) cortex, the left posterior superior temporal gyrus (pSTG), and the left inferior frontal gyrus (IFG). Scan-time response latencies demonstrated the expected word frequency effect. Analysis of the fMRI data revealed that activity in the pSTG was modulated by frequency but not word length or concept familiarity. In contrast, parts of IT and IFG demonstrated conjoint frequency and familiarity effects, and parts of both primary motor regions demonstrated conjoint effects of frequency and word length. The results are consistent with a model of word production in which lexical-semantic and lexical-phonological information are accessed by overlapping neural systems within posterior and anterior language-related cortices, with pSTG specifically involved in accessing lexical phonology.
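The parametric approach described above models the BOLD response as scaling with each produced word's frequency. A common way to set this up is a mean-centered parametric modulator in the design matrix; the sketch below is a simplified illustration (real analyses would also convolve with a hemodynamic response function, and all names here are assumptions):

```python
import numpy as np

def parametric_regressor(onsets, values, n_scans):
    """Stick regressor whose amplitude at each naming event is the
    mean-centered covariate value (e.g., log word frequency), so the
    regressor captures frequency variation orthogonal to the mean
    task response."""
    centered = np.asarray(values, dtype=float)
    centered = centered - centered.mean()
    reg = np.zeros(n_scans)
    for onset, amp in zip(onsets, centered):
        reg[onset] = amp
    return reg

# Toy example: three naming trials at scans 2, 5, and 8, with
# illustrative log word frequencies
onsets = [2, 5, 8]
log_freq = [1.0, 2.0, 3.0]
reg = parametric_regressor(onsets, log_freq, n_scans=10)
print(reg)  # amplitudes -1, 0, +1 at the three onsets, zeros elsewhere
```

Mean-centering the modulator is what separates "how strongly does activity track frequency" from "how strongly does the region respond to naming at all."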
Journal of Cognitive Neuroscience (2005) 17 (8): 1293–1305.
Published: 01 August 2005
Abstract
We have proposed that the left inferotemporal (IT) region contains structures that mediate between conceptual knowledge retrieval and word-form retrieval, and we have hypothesized that these structures are utilized for word retrieval irrespective of the sensory modality through which an entity is apprehended, thus being “modality neutral.” We tested this idea in two sensory modalities, visual and auditory, and for two categories of concrete entities, tools and animals. In a PET experiment, 10 normal participants named tools and animals either from pictures or from characteristic sounds (e.g., “scissors” from a picture of a scissors or from the sound of a scissors cutting; “rooster” from a picture of a rooster or from the sound of a rooster crowing). Visual and auditory naming of tools activated the left posterior/lateral IT; visual and auditory naming of animals activated the left anterior/ventral IT. For both tools and animals, the left IT activations were similar in location and magnitude regardless of whether participants were naming entities from pictures or from sounds. The results provide novel evidence to support the notion that left IT structures contain “modality-neutral” systems for mediating between conceptual knowledge and word retrieval.