Journal of Cognitive Neuroscience (2019) 31 (2): 262–277.
Published: 01 February 2019
Abstract
The neural dynamics underpinning binary perceptual decisions and their transformation into actions are well studied, but real-world decisions typically offer more than two response alternatives. How does decision-related evidence accumulation dynamically influence multiple action representations in humans? The heightened conservatism required in multiple compared with binary choice scenarios suggests a mechanism that compensates for increased uncertainty when multiple choices are present by suppressing baseline activity. Here, we tracked action representations using corticospinal excitability during four- and two-choice perceptual decisions and modeled them using a sequential sampling framework. We found that the predictions made by leaky competing accumulator models to accommodate multiple choices (i.e., reduced baseline activity to compensate for increased uncertainty) were borne out by dynamic changes in human action representations. This suggests a direct and continuous influence of interacting evidence accumulators, each favoring a different decision alternative, on downstream corticospinal excitability during complex choice.
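The leaky competing accumulator (LCA) dynamics invoked in this abstract can be illustrated with a minimal simulation sketch. This is not the authors' fitted model: the parameter values, the specific baselines for the two- and four-choice conditions, and the function name are all hypothetical, chosen only to show the key prediction that more alternatives start from a lower (suppressed) baseline.

```python
import numpy as np

def simulate_lca(n_alternatives, inputs, baseline, leak=0.2, inhibition=0.2,
                 noise_sd=0.1, threshold=1.0, dt=0.01, max_steps=10000, rng=None):
    """Simulate one trial of a leaky competing accumulator race.

    Each accumulator integrates its input, decays (leak), and is
    inhibited by the summed activity of the other accumulators;
    activations are rectified at zero, as in standard LCA models.
    Returns (winning_index, decision_time), or (None, None) on timeout.
    """
    rng = np.random.default_rng(rng)
    x = np.full(n_alternatives, baseline, dtype=float)
    for step in range(max_steps):
        total = x.sum()
        dx = (inputs - leak * x - inhibition * (total - x)) * dt \
             + noise_sd * np.sqrt(dt) * rng.standard_normal(n_alternatives)
        x = np.maximum(x + dx, 0.0)  # rectification: no negative activation
        if x.max() >= threshold:
            return int(x.argmax()), (step + 1) * dt
    return None, None

# Hypothetical illustration: the first alternative receives the strongest
# evidence; the four-choice trial starts from a lower baseline than the
# two-choice trial, compensating for the greater number of competitors.
winner2, rt2 = simulate_lca(2, np.array([0.8, 0.4]), baseline=0.4, rng=0)
winner4, rt4 = simulate_lca(4, np.array([0.8, 0.4, 0.4, 0.4]), baseline=0.2, rng=0)
```

Lowering `baseline` for the four-choice run is the modeling move the abstract describes: with more live alternatives, accumulators begin farther from threshold so that the extra uncertainty does not inflate premature responses.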
Object-guided Spatial Attention in Touch: Holding the Same Object with Both Hands Delays Attentional Selection
Journal of Cognitive Neuroscience (2010) 22 (5): 931–942.
Published: 01 May 2010
Abstract
Previous research has shown that attention to a specific location on a uniform visual object spreads throughout the entire object. Here we demonstrate that, similar to the visual system, spatial attention in touch can be object guided. We measured event-related brain potentials to tactile stimuli arising from objects held by observers' hands, when the hands were placed either near each other or far apart, holding two separate objects, or when they were far apart but holding a common object. Observers covertly oriented their attention to the left, to the right, or to both hands, following bilaterally presented tactile cues indicating likely tactile target location(s). Attentional modulations for tactile stimuli at attended compared to unattended locations were present in the time range of early somatosensory components only when the hands were far apart, but not when they were near. This was found to reflect enhanced somatosensory processing at attended locations rather than suppressed processing at unattended locations. Crucially, holding a common object with both hands delayed attentional selection, similar to when the hands were near. This shows that the proprioceptive distance effect on tactile attentional selection arises when distant event locations can be treated as separate and unconnected sources of tactile stimulation, but not when they form part of the same object. These findings suggest that, similar to visual attention, both space- and object-based attentional mechanisms can operate when we select between tactile events on our body surface.
An ERP Investigation on Visuotactile Interactions in Peripersonal and Extrapersonal Space: Evidence for the Spatial Rule
Journal of Cognitive Neuroscience (2009) 21 (8): 1550–1559.
Published: 01 August 2009
Abstract
The spatial rule of multisensory integration holds that cross-modal stimuli presented from the same spatial location result in enhanced multisensory integration. The present study investigated whether processing within the somatosensory cortex reflects the strength of cross-modal visuotactile interactions depending on the spatial relationship between visual and tactile stimuli. Visual stimuli were task-irrelevant and were presented simultaneously with touch in peripersonal and extrapersonal space, in the same or opposite hemispace with respect to the tactile stimuli. Participants directed their attention to one of their hands to detect infrequent tactile target stimuli at that hand while ignoring tactile targets at the unattended hand, all tactile nontarget stimuli, and any visual stimuli. Enhancement of ERPs recorded over and close to the somatosensory cortex was present as early as 100 msec after stimulus onset (i.e., overlapping with the P100 component) when visual stimuli were presented next to the site of tactile stimulation (i.e., perihand space) compared to when they were presented at different locations in peripersonal or extrapersonal space. Therefore, this study provides electrophysiological support for the spatial rule of visual–tactile interaction in human participants. Importantly, these early cross-modal spatial effects occurred regardless of the locus of attention. In addition, and in line with previous research, we found attentional modulations of somatosensory processing to be present only in the time range of the N140 component and at longer latencies, with an enhanced negativity for tactile stimuli at attended compared to unattended locations. Taken together, the pattern of results from this study suggests that visuotactile spatial effects on somatosensory processing occur prior to, and independently of, tactile–spatial attention.