Romeo Salemme (1–2 of 2 results)
Journal Articles
Journal of Cognitive Neuroscience (2022) 34 (4): 675–686.
Published: 05 March 2022
Abstract
The sense of touch is not restricted to the body but can also extend to external objects. When we use a handheld tool to contact an object, we feel the touch on the tool, not in the hand holding it. The ability to perceive touch on a tool extends along its entire surface, allowing the user to localize where it is touched about as accurately as they would on their own body. Although the neural mechanisms underlying the ability to localize touch on the body have been extensively investigated, those underlying the localization of touch on a tool remain unknown. We aimed to fill this gap by recording the electroencephalography (EEG) signal of participants while they localized tactile stimuli on a handheld rod. We focused on oscillatory activity in the alpha (7–14 Hz) and beta (15–30 Hz) ranges, as both have previously been linked to distinct spatial codes used to localize touch on the body: beta activity reflects the mapping of touch in skin-based coordinates, whereas alpha activity reflects the mapping of touch in external space. We found that only alpha activity was modulated by the location of tactile stimuli applied to the handheld rod. Source reconstruction suggested that this alpha power modulation was localized to a network of fronto-parietal regions previously implicated in higher-order tactile and spatial processing. These findings are the first to implicate alpha oscillations in tool-extended sensing and suggest an important role for the processing of touch in external space when localizing touch on a tool.
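The abstract names the bands but not the analysis pipeline. Purely as an illustration of the kind of band-limited power measure it describes (alpha 7–14 Hz, beta 15–30 Hz), here is a minimal Python sketch using SciPy; the sampling rate, the stand-in data, and the band_power helper are assumptions for this example, not the authors' method.

```python
# Minimal sketch: band-limited power from a single EEG channel, using the
# alpha (7-14 Hz) and beta (15-30 Hz) bands named in the abstract.
# Sampling rate, data, and function names are illustrative only.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def band_power(signal, fs, low, high, order=4):
    """Instantaneous power in [low, high] Hz via a Butterworth
    band-pass filter and the Hilbert envelope."""
    nyq = fs / 2.0
    b, a = butter(order, [low / nyq, high / nyq], btype="band")
    filtered = filtfilt(b, a, signal)      # zero-phase band-pass filtering
    envelope = np.abs(hilbert(filtered))   # analytic-signal amplitude
    return envelope ** 2                   # power over time

fs = 1000.0                                # assumed sampling rate (Hz)
t = np.arange(0, 2.0, 1.0 / fs)
eeg = np.random.randn(t.size)              # stand-in for one EEG channel

alpha_power = band_power(eeg, fs, 7.0, 14.0)   # alpha band, per the abstract
beta_power = band_power(eeg, fs, 15.0, 30.0)   # beta band, per the abstract
print(alpha_power.mean(), beta_power.mean())
```

In a real analysis the per-trial power time courses would then be compared across stimulus locations, which is the contrast the abstract reports for the alpha band.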
Journal Articles
Journal of Cognitive Neuroscience (2019) 31 (8): 1141–1154.
Published: 01 August 2019
Abstract
Peripersonal space is a multisensory representation relying on the processing of tactile and visual stimuli presented on, and close to, different body parts. The most studied peripersonal space representation is perihand space (PHS), a highly plastic representation modulated following tool use and by the rapid approach of visual objects. Given these properties, PHS may serve different sensorimotor functions, including the guidance of voluntary actions such as object grasping. Strong support for this hypothesis would come from evidence that plastic changes in PHS occur before an upcoming movement rather than after its initiation, yet to date such evidence is scant. Here, we tested whether action-dependent modulation of PHS, behaviorally assessed via visuotactile perception, occurs before an overt movement, as early as the action-planning phase. To do so, we probed tactile and visuotactile perception at different time points before and during a grasping action. Results showed that visuotactile perception was more strongly affected during the planning phase (250 msec after vision of the target) than during a similarly static but earlier phase (50 msec after vision of the target). The visuotactile interaction was also enhanced at the onset of hand movement and increased further during subsequent phases of the movement. This interaction featured interference effects during all phases from action planning onward, as well as a facilitation effect at movement onset. These findings reveal that planning to grasp an object strengthens the multisensory interaction between visual information from the target and somatosensory information from the hand. Such early updating of the visuotactile interaction reflects multisensory processes that support the motor planning of actions.
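The abstract reports interference and facilitation without giving the analysis details. As a purely hypothetical sketch of how a per-phase visuotactile interaction index could be computed from reaction times, here is a short Python example; the phase labels, RT values, and the sign convention (positive = interference, negative = facilitation) are all invented for illustration and are not the authors' dataset or definition.

```python
# Illustrative only: a visuotactile interaction index per experimental phase,
# defined here as mean RT(visuotactile probe) minus mean RT(tactile-only probe).
# Positive values would indicate interference, negative values facilitation.
import numpy as np

phases = ["early (50 msec)", "planning (250 msec)", "movement onset", "execution"]
rng = np.random.default_rng(0)

for phase in phases:
    rt_tactile = rng.normal(450, 40, size=30)       # fake tactile-only RTs (msec)
    rt_visuotactile = rng.normal(470, 40, size=30)  # fake visuotactile RTs (msec)
    index = rt_visuotactile.mean() - rt_tactile.mean()
    kind = "interference" if index > 0 else "facilitation"
    print(f"{phase}: {index:+.1f} msec ({kind})")
```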