Brigitte Röder
Journal Articles
Publisher: Journals Gateway
Journal of Cognitive Neuroscience (2013) 25 (5): 790–801.
Published: 01 May 2013
Abstract
Previous studies have suggested that the putative human homologue of the ventral intraparietal area (hVIP) is crucially involved in the remapping of tactile information into external spatial coordinates and in the realignment of tactile and visual maps. It is unclear, however, whether hVIP is critical for the remapping process during audio-tactile cross-modal spatial interactions. The audio-tactile ventriloquism effect, where the perceived location of a sound is shifted toward the location of a synchronous but spatially disparate tactile stimulus, was used to probe spatial interactions in audio-tactile processing. Eighteen healthy volunteers were asked to report the perceived location of brief auditory stimuli presented from three different locations (left, center, and right). Auditory stimuli were presented either alone (unimodal stimuli) or concurrently with a spatially discrepant tactile stimulus applied to the left or right index finger (bimodal stimuli), with the hands adopting either an uncrossed or a crossed posture. Single pulses of TMS were delivered over the hVIP or a control site (primary somatosensory cortex, SI) 80 msec after trial onset. TMS to the hVIP, compared with the control SI-TMS, interfered with the remapping of touch into external space, suggesting that hVIP is crucially involved in transforming spatial reference frames across audition and touch.
Journal of Cognitive Neuroscience (2010) 22 (1): 184–202.
Published: 01 January 2010
Abstract
Recent studies have suggested that the location of tactile stimuli is automatically recoded from anatomical into external coordinates, independent of the task requirements. However, research has mainly involved the two hands, which may not be representative of the whole body because they are excessively used for the visually guided manipulation of objects and tools. We recorded event-related potentials (ERPs) while participants received tactile stimuli to the hands and feet, but attended only one limb. The hands were placed near the feet either in an uncrossed or a crossed posture, thus varying the spatial distance of each hand from each foot. Centro-parietal ERPs 100–140 msec poststimulus were more positive when stimulating the anatomically same-side hand while attending a foot. They were also more positive when the Euclidean distance between the stimulated hand and the attended foot was small rather than large. When a foot was stimulated and a hand attended, a similar modulation of foot ERPs was observed for the right foot. To assess the spatial distance between two limbs in space, the external location of both must be known. The present ERP results therefore suggest that not only the hands but also other body parts are remapped into external coordinates. The use of both anatomical and external coordinates may facilitate the control of actions toward tactile events and the choice of the most suitable effector.
Journal of Cognitive Neuroscience (2009) 21 (12): 2445–2461.
Published: 01 December 2009
Abstract
When a single tactile stimulus is presented together with two tones, participants often report perceiving two touches. It is a matter of debate whether this cross-modal effect of audition on touch reflects the interplay between modalities at early perceptual or at later processing stages, and which brain processes determine what is ultimately consciously perceived. Event-related brain potentials (ERPs) were recorded while rare single tactile stimuli accompanied by two tones (1T2A) were presented among frequent tactile double stimuli accompanied by two tones (2T2A). Although participants were instructed to ignore the tones and to respond to single tactile stimuli only, they often failed to respond to 1T2A stimuli (“illusory double touches,” 1T2A(i)). ERPs to “illusory double touches” versus “real double touches” (2T2A) differed 50 msec after the (missing) second touch. This suggests that at an early sensory stage, illusory and real touches are processed differently. On the other hand, although similar stimuli elicited a tactile mismatch negativity (MMN) between 100 and 200 msec in a unisensory tactile experiment, no MMN was observed for the 1T2A(i) stimuli in the multisensory experiment. “Tactile awareness” was associated with a negativity at 250 msec, which was enhanced in response to correctly identified deviants as compared to physically identical deviants that elicited an illusion. Thus, auditory stimuli seem to alter neural mechanisms associated with automatic tactile deviant detection. The present findings contribute to the debate about which processing step in the brain determines what is consciously perceived.
Journal of Cognitive Neuroscience (2009) 21 (1): 58–82.
Published: 01 January 2009
Abstract
The present study used functional magnetic resonance imaging to delineate cortical networks that are activated when objects or spatial locations encoded either visually (visual encoding group, n = 10) or haptically (haptic encoding group, n = 10) had to be retrieved from long-term memory. Participants learned associations between auditorily presented words and either meaningless objects or locations in a 3-D space. During the retrieval phase one day later, participants had to decide whether two auditorily presented words shared an association with a common object or location. Thus, perceptual stimulation during retrieval was always equivalent, whereas either visually or haptically encoded object or location associations had to be reactivated. Moreover, the number of associations fanning out from each word varied systematically, enabling a parametric increase of the number of reactivated representations. Recall of visual objects predominantly activated the left superior frontal gyrus and the intraparietal cortex, whereas visually learned locations activated the superior parietal cortex of both hemispheres. Retrieval of haptically encoded material activated the left medial frontal gyrus and the intraparietal cortex in the object condition, and the bilateral superior parietal cortex in the location condition. A direct test for modality-specific effects showed that visually encoded material activated more vision-related areas (BA 18/19) and haptically encoded material more motor and somatosensory-related areas. A conjunction analysis identified supramodal and material-unspecific activations within the medial and superior frontal gyrus and the superior parietal lobe including the intraparietal sulcus. These activation patterns strongly support the idea that code-specific representations are consolidated and reactivated within anatomically distributed cell assemblies that comprise sensory and motor processing systems.
Orienting Attention to Points in Time Improves Stimulus Processing Both within and across Modalities
Journal of Cognitive Neuroscience (2006) 18 (5): 715–729.
Published: 01 May 2006
Abstract
Spatial attention affects the processing of stimuli of both a task-relevant and a task-irrelevant modality. The present study investigated whether similar cross-modal effects exist when attention is oriented to a point in time. Short (600 msec) and long (1200 msec) empty intervals, marked by a tactile onset and an auditory or a tactile offset marker, were presented. In each block, the participants had to attend to one interval and one modality. Event-related potentials (ERPs) to auditory and tactile offset markers of attended as compared to unattended intervals were characterized by an enhancement of early negative deflections of the auditory and somatosensory ERPs (audition, 100–140 msec; touch, 130–180 msec) when audition or touch was task relevant, respectively. Similar effects were found for auditory stimuli when touch was task relevant. An additional reaction time experiment revealed faster responses to both auditory and tactile stimuli at the attended as compared to the unattended point in time, irrespective of which modality was primary. Both behavioral and ERP data show that attention can be focused on a point in time, which results in more efficient processing of auditory and tactile stimuli. The ERP data further suggest that a relative enhancement at perceptual processing stages contributes to the processing advantage for temporally attended stimuli. The existence of cross-modal effects of temporal attention underlines the importance of time as a feature for binding input across different modalities.
Journal of Cognitive Neuroscience (2006) 18 (2): 149–157.
Published: 01 February 2006
Abstract
Blind individuals who lost their sight as older children or adults were compared with normally sighted controls in their ability to focus auditory spatial attention and to localize sounds in a noisy acoustic environment. Event-related potentials (ERPs) were recorded while participants attended to sounds presented in free field from either central or peripheral arrays of speakers with the task of detecting infrequent targets at the attended location. When attending to the central array of speakers, the two groups detected targets equally well, and their spatial tuning curves for both ERPs and target detections were highly similar. By contrast, late blind participants were significantly more accurate than sighted participants at localizing sounds in the periphery. For both groups, the early N1 amplitude to peripheral standard stimuli displayed no significant spatial tuning. In contrast, the amplitude of the later P3 elicited by targets/deviants displayed a more sharply tuned spatial gradient during peripheral attention in the late blind than in the sighted group. These findings were compared with those of a previous study of congenitally blind individuals in the same task [Röder, B., Teder-Sälejärvi, W., Sterr, A., Rösler, F., Hillyard, S. A., & Neville, H. J. Improved auditory spatial tuning in blind humans. Nature, 400, 162–166, 1999]. It was concluded that both late blind and congenitally blind individuals demonstrate an enhanced capability for focusing auditory attention in the periphery, but they do so via different mechanisms: whereas congenitally blind persons demonstrate a more sharply tuned early attentional filtering, manifested in the N1, late blind individuals show superiority in a later stage of target discrimination and recognition, indexed by the P3.