Gaspare Galati
Journal of Cognitive Neuroscience (2011) 23 (3): 503–513.
Published: 01 March 2011
Abstract
The perception of tactile stimuli on the face is modulated when subjects concurrently observe a face being touched; this effect, termed visual remapping of touch (VRT), is strongest when observing one's own face. In the present fMRI study, we investigated the neural basis of the VRT effect. Participants in the scanner received tactile stimuli, near the perceptual threshold, on their right, left, or both cheeks. Concurrently, they watched movies depicting their own face, another person's face, or a ball being touched or merely approached by human fingers. Participants were asked to distinguish between unilateral and bilateral tactile stimulation. Behaviorally, perception of tactile stimuli was modulated by viewing tactile stimulation, with a stronger effect when viewing one's own face being touched. In terms of brain activity, viewing touch was associated with enhanced activity in the ventral intraparietal area. The specific effect of viewing touch on oneself was instead associated with reduced activity in both the ventral premotor cortex and the somatosensory cortex. The present findings suggest that VRT is supported by a network of fronto-parietal areas. The ventral intraparietal area might remap visual information about touch onto tactile processing. The ventral premotor cortex might specifically modulate multisensory interaction when sensory information is related to one's own body. This activity might then back-project to the somatosensory cortices, thus affecting tactile perception.
Journal of Cognitive Neuroscience (2007) 19 (5): 799–816.
Published: 01 May 2007
Abstract
We used functional magnetic resonance imaging (fMRI) in conjunction with a voxel-based approach to lesion symptom mapping to quantitatively evaluate the similarities and differences between brain areas involved in language and environmental sound comprehension. In general, we found that language and environmental sounds recruit highly overlapping cortical regions, with cross-domain differences being graded rather than absolute. Within language-based regions of interest, we found that in the left hemisphere, language and environmental sound stimuli evoked very similar volumes of activation, whereas in the right hemisphere, there was greater activation for environmental sound stimuli. Finally, lesion symptom maps of aphasic patients based on environmental sounds or linguistic deficits [Saygin, A. P., Dick, F., Wilson, S. W., Dronkers, N. F., & Bates, E. Shared neural resources for processing language and environmental sounds: Evidence from aphasia. Brain, 126, 928–945, 2003] were generally predictive of the extent of blood oxygenation level-dependent fMRI activation across these regions for sounds and linguistic stimuli in young healthy subjects.
Journal of Cognitive Neuroscience (2004) 16 (9): 1517–1535.
Published: 01 November 2004
Abstract
Functional magnetic resonance imaging was used to compare the neural correlates of three different types of spatial coding, which are implicated in crucial everyday cognitive functions such as visuomotor coordination and orientation in topographical space. By manipulating the requested spatial reference during a task of relative distance estimation, we directly compared viewer-centered, object-centered, and landmark-centered spatial coding of the same realistic 3-D information. Common activation was found in bilateral parietal, occipital, and right frontal premotor regions. The retrosplenial and ventromedial occipital–temporal cortex (and parts of the parietal and occipital cortex) were significantly more activated during the landmark-centered condition. The ventrolateral occipital–temporal cortex was particularly involved in object-centered coding. The results strongly demonstrate that viewer-centered (egocentric) coding is restricted to the dorsal stream and connected frontal regions, whereas coding centered on external references requires both dorsal and ventral regions, depending on whether the reference is a movable object or a landmark.