Fang Jiang
Journal of Cognitive Neuroscience (2019) 31 (8): 1126–1140.
Published: 01 August 2019
Abstract
Individuals who have been deaf since early life may show enhanced performance at some visual tasks, including discrimination of directional motion. The neural substrates of such behavioral enhancements remain difficult to identify in humans, although neural plasticity has been shown for early deaf people in the auditory and association cortices, including the primary auditory cortex (PAC) and STS region, respectively. Here, we investigated whether neural responses in auditory and association cortices of early deaf individuals are reorganized to be sensitive to directional visual motion. To capture direction-selective responses, we recorded fMRI responses frequency-tagged to the 0.1-Hz presentation of central directional (100% coherent random dot) motion persisting for 2 sec contrasted with nondirectional (0% coherent) motion for 8 sec. We found direction-selective responses in the STS region in both deaf and hearing participants, but the extent of activation in the right STS region was 5.5 times larger for deaf participants. Minimal but significant direction-selective responses were also found in the PAC of deaf participants, both at the group level and in five of six individuals. In response to stimuli presented separately in the right and left visual fields, the relative activation across the right and left hemispheres was similar in both the PAC and STS region of deaf participants. Notably, the enhanced right-hemisphere activation could support the right visual field advantage reported previously in behavioral studies. Taken together, these results show that the reorganized auditory cortices of early deaf individuals are sensitive to directional motion. Speculatively, these results suggest that auditory and association regions can be remapped to support enhanced visual performance.
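As a rough illustration of the frequency-tagging logic in this abstract, the sketch below estimates a single voxel's response at the 0.1-Hz tag frequency and compares it with neighboring frequency bins. The TR, run length, simulated data, and the SNR-style measure are assumptions for illustration, not analysis details taken from the study.

```python
# Illustrative only: TR, run length, noise level, and the SNR-style measure
# are assumptions, not parameters reported in the study.
import numpy as np

def tagged_response_snr(timeseries, tr=2.0, tag_hz=0.1, n_neighbors=10):
    """Amplitude at the tagged frequency divided by the mean amplitude of
    neighboring frequency bins, a common frequency-tagging response measure."""
    ts = np.asarray(timeseries, dtype=float)
    ts = ts - ts.mean()                       # remove the DC component
    amps = np.abs(np.fft.rfft(ts))            # one-sided amplitude spectrum
    freqs = np.fft.rfftfreq(ts.size, d=tr)    # frequency (Hz) of each bin
    tag_bin = int(np.argmin(np.abs(freqs - tag_hz)))
    lo, hi = max(1, tag_bin - n_neighbors), min(tag_bin + n_neighbors + 1, amps.size)
    neighbors = [i for i in range(lo, hi) if i != tag_bin]
    return amps[tag_bin] / amps[neighbors].mean()

# Example: 300 volumes at TR = 2 s cover 60 stimulation cycles of 10 s (0.1 Hz).
rng = np.random.default_rng(0)
t = np.arange(300) * 2.0
fake_voxel = np.sin(2 * np.pi * 0.1 * t) + rng.normal(scale=2.0, size=t.size)
print(tagged_response_snr(fake_voxel))        # well above 1 for a responsive voxel
```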
Journal of Cognitive Neuroscience (2010) 22 (7): 1570–1582.
Published: 01 July 2010
Abstract
We examined the neural response patterns for facial identity independent of viewpoint and for viewpoint independent of identity. Neural activation patterns for identity and viewpoint were collected in an fMRI experiment. Faces appeared in identity-constant blocks, with variable viewpoint, and in viewpoint-constant blocks, with variable identity. Pattern-based classifiers were used to discriminate neural response patterns for all possible pairs of identities and viewpoints. To increase the likelihood of detecting distinct neural activation patterns for identity, we tested maximally dissimilar “face”–“antiface” pairs and normal face pairs. Neural response patterns for four of six identity pairs, including the “face”–“antiface” pairs, were discriminated at levels above chance. A behavioral experiment showed accord between perceptual and neural discrimination, indicating that the classifier tapped a high-level visual identity code. Neural activity patterns across a broad span of ventral temporal (VT) cortex, including fusiform gyrus and lateral occipital areas (LOC), were required for identity discrimination. For viewpoint, five of six viewpoint pairs were discriminated neurally. Viewpoint discrimination was most accurate with a broad span of VT cortex, but the neural and perceptual discrimination patterns differed. Less accurate discrimination of viewpoint, more consistent with human perception, was found in right posterior superior temporal sulcus, suggesting redundant viewpoint codes optimized for different functions. This study provides the first evidence that neural activation patterns for identity and viewpoint can be dissociated.
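The pairwise pattern discrimination described above can be sketched with a simple split-half correlation classifier: a test pattern is assigned to whichever training pattern it correlates with most strongly, and accuracy is computed for every pair of conditions. The classifier choice, array shapes, and simulated data below are illustrative assumptions and do not reproduce the study's actual pipeline.

```python
# Illustrative only: a split-half correlation classifier stands in for the
# pattern-based classifiers used in the study; the data are simulated.
import numpy as np
from itertools import combinations

def correlation_classifier_accuracy(train_half, test_half):
    """train_half, test_half: dicts mapping condition -> 1-D voxel pattern.
    Each test pattern is assigned to the training condition whose pattern it
    correlates with most strongly; returns the fraction assigned correctly."""
    correct = 0
    for cond, test_pattern in test_half.items():
        r = {c: np.corrcoef(test_pattern, p)[0, 1] for c, p in train_half.items()}
        correct += (max(r, key=r.get) == cond)
    return correct / len(test_half)

def pairwise_accuracies(train_half, test_half):
    """Discrimination accuracy for every pair of conditions, echoing the
    'all possible pairs of identities and viewpoints' analysis."""
    return {(a, b): correlation_classifier_accuracy(
                {a: train_half[a], b: train_half[b]},
                {a: test_half[a], b: test_half[b]})
            for a, b in combinations(train_half, 2)}

# Simulated patterns for four identities, 500 voxels each, plus noisy copies.
rng = np.random.default_rng(1)
train = {f"id{i}": rng.normal(size=500) for i in range(4)}
test = {k: v + rng.normal(scale=0.5, size=500) for k, v in train.items()}
print(pairwise_accuracies(train, test))
```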
Journal of Cognitive Neuroscience (2007) 19 (11): 1735–1752.
Published: 01 November 2007
Abstract
The goal of pattern-based classification of functional neuroimaging data is to link individual brain activation patterns to the experimental conditions experienced during the scans. These “brain-reading” analyses advance functional neuroimaging on three fronts. From a technical standpoint, pattern-based classifiers overcome fatal flaws in the status quo inferential and exploratory multivariate approaches by combining pattern-based analyses with a direct link to experimental variables. In theoretical terms, the results that emerge from pattern-based classifiers can offer insight into the nature of neural representations. This shifts the emphasis in functional neuroimaging studies away from localizing brain activity toward understanding how patterns of brain activity encode information. From a practical point of view, pattern-based classifiers are already well established and understood in many areas of cognitive science. These tools are familiar to many researchers and provide a quantitatively sound and qualitatively satisfying answer to most questions addressed in functional neuroimaging studies. Here, we examine the theoretical, statistical, and practical underpinnings of pattern-based classification approaches to functional neuroimaging analyses. Pattern-based classification analyses are well positioned to become the standard approach to analyzing functional neuroimaging data.
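A minimal sketch of the cross-validated "brain-reading" workflow this abstract advocates, assuming a linear SVM, leave-one-run-out cross-validation, and simulated trial patterns; none of these specific choices is prescribed by the paper, which surveys the approach in general.

```python
# Illustrative only: a linear SVM with leave-one-run-out cross-validation on
# simulated multivoxel patterns.
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score

rng = np.random.default_rng(0)
n_runs, trials_per_run, n_voxels = 8, 20, 300

# One pattern per trial; labels alternate between two conditions within each run,
# and a weak condition-dependent signal is added to the first 50 voxels.
X = rng.normal(size=(n_runs * trials_per_run, n_voxels))
y = np.tile([0, 1], n_runs * trials_per_run // 2)
runs = np.repeat(np.arange(n_runs), trials_per_run)
X[y == 1, :50] += 0.3

# Leave-one-run-out cross-validation keeps training and test runs independent,
# so accuracy above chance indicates condition information in the patterns.
scores = cross_val_score(LinearSVC(max_iter=10000), X, y,
                         groups=runs, cv=LeaveOneGroupOut())
print(f"mean decoding accuracy: {scores.mean():.2f} (chance = 0.50)")
```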
Journal of Cognitive Neuroscience (2005) 17 (4): 580–590.
Published: 01 April 2005
Abstract
Object and face representations in ventral temporal (VT) cortex were investigated by combining object confusability data from a computational model of object classification with neural response confusability data from a functional neuroimaging experiment. A pattern-based classification algorithm learned to categorize individual brain maps according to the object category being viewed by the subject. An identical algorithm learned to classify an image-based, view-dependent representation of the stimuli. High correlations were found between the confusability of object categories and the confusability of brain activity maps. This occurred even with the inclusion of multiple views of objects, and when the object classification model was tested with high spatial frequency “line drawings” of the stimuli. Consistent with a distributed representation of objects in VT cortex, the data indicate that object categories with shared image-based attributes have shared neural structure.
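The central comparison here, correlating category confusability in a computational model with confusability of the corresponding brain activity maps, can be sketched as a rank correlation between the off-diagonal entries of two confusion matrices. The matrices below are random placeholders and the Spearman correlation is an assumed choice; only the comparison logic is illustrated, not the study's model or imaging analysis.

```python
# Illustrative only: the confusion matrices are random placeholders and the
# rank correlation is an assumed choice.
import numpy as np
from scipy.stats import spearmanr

def confusability_correlation(model_conf, neural_conf):
    """Both inputs: square category-by-category confusion matrices.
    Symmetrizes each matrix, then rank-correlates the off-diagonal entries so
    that only between-category confusability (not correct trials) is compared."""
    m = np.asarray(model_conf, dtype=float)
    n = np.asarray(neural_conf, dtype=float)
    m, n = (m + m.T) / 2, (n + n.T) / 2
    iu = np.triu_indices(m.shape[0], k=1)     # upper triangle, diagonal excluded
    return spearmanr(m[iu], n[iu])

# Placeholder matrices for seven object categories sharing a common structure.
rng = np.random.default_rng(0)
shared = rng.random((7, 7))
model = shared + 0.2 * rng.random((7, 7))
neural = shared + 0.2 * rng.random((7, 7))
print(confusability_correlation(model, neural))
```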