Alessandro Farné
1–7 of 7
Journal Articles
Publisher: Journals Gateway
Journal of Cognitive Neuroscience (2022) 34 (4): 675–686.
Published: 05 March 2022
Abstract
The sense of touch is not restricted to the body but can also extend to external objects. When we use a handheld tool to contact an object, we feel the touch on the tool, not in the hand holding it. The ability to perceive touch on a tool extends along its entire surface, allowing users to localize where it is touched much as they would on their own body. Although the neural mechanisms underlying the localization of touch on the body have been extensively investigated, those underlying the localization of touch on a tool remain unknown. We aimed to fill this gap by recording the electroencephalography signal of participants while they localized tactile stimuli on a handheld rod. We focused on oscillatory activity in the alpha (7–14 Hz) and beta (15–30 Hz) ranges, as these have previously been linked to distinct spatial codes used to localize touch on the body: beta activity reflects the mapping of touch in skin-based coordinates, whereas alpha activity reflects the mapping of touch in external space. We found that only alpha activity was modulated by the location of tactile stimuli applied to a handheld rod. Source reconstruction suggested that this alpha power modulation was localized in a network of fronto-parietal regions previously implicated in higher-order tactile and spatial processing. These findings are the first to implicate alpha oscillations in tool-extended sensing and suggest an important role for the processing of touch in external space when localizing touch on a tool.
Journal of Cognitive Neuroscience (2019) 31 (8): 1141–1154.
Published: 01 August 2019
Abstract
Peripersonal space is a multisensory representation relying on the processing of tactile and visual stimuli presented on and close to different body parts. The most studied peripersonal space representation is perihand space (PHS), a highly plastic representation modulated following tool use and by the rapid approach of visual objects. Given these properties, PHS may serve different sensorimotor functions, including guidance of voluntary actions such as object grasping. Strong support for this hypothesis would derive from evidence that PHS plastic changes occur before the upcoming movement rather than after its initiation, yet to date, such evidence is scant. Here, we tested whether action-dependent modulation of PHS, behaviorally assessed via visuotactile perception, may occur before an overt movement as early as the action planning phase. To do so, we probed tactile and visuotactile perception at different time points before and during the grasping action. Results showed that visuotactile perception was more strongly affected during the planning phase (250 msec after vision of the target) than during a similarly static but earlier phase (50 msec after vision of the target). Visuotactile interaction was also enhanced at the onset of hand movement, and it further increased during subsequent phases of hand movement. Such a visuotactile interaction featured interference effects during all phases from action planning onward as well as a facilitation effect at the movement onset. These findings reveal that planning to grab an object strengthens the multisensory interaction of visual information from the target and somatosensory information from the hand. Such early updating of the visuotactile interaction reflects multisensory processes supporting motor planning of actions.
Journal of Cognitive Neuroscience (2012) 24 (12): 2306–2320.
Published: 01 December 2012
Abstract
Although the somatosensory homunculus is a classically used description of the way somatosensory inputs are processed in the brain, the actual contributions of primary (SI) and secondary (SII) somatosensory cortices to the spatial coding of touch remain poorly understood. We studied adaptation of the fMRI BOLD response in the somatosensory cortex by delivering pairs of vibrotactile stimuli to the fingertips of the index and middle fingers. The first stimulus (adaptor) was delivered either to the index or to the middle finger of the right or left hand, and the second stimulus (test) was always administered to the left index finger. The overall BOLD response evoked by the stimulation was primarily contralateral in SI and was more bilateral in SII. However, our fMRI adaptation approach also revealed that both somatosensory cortices were sensitive to ipsilateral as well as to contralateral inputs. SI and SII adapted more after subsequent stimulation of homologous as compared with nonhomologous fingers, showing a distinction between different fingers. Most importantly, for both somatosensory cortices, this finger-specific adaptation occurred irrespective of whether the tactile stimulus was delivered to the same or to different hands. This result implies integration of contralateral and ipsilateral somatosensory inputs in SI as well as in SII. Our findings suggest that SI is more than a simple relay for sensory information and that both SI and SII contribute to the spatial coding of touch by discriminating between body parts (fingers) and by integrating the somatosensory input from the two sides of the body (hands).
Journal of Cognitive Neuroscience (2011) 23 (7): 1741–1751.
Published: 01 July 2011
Abstract
Autoscopic phenomena refer to complex experiences involving the illusory reduplication of one's own body. Here we report the third long-lasting case of autoscopy in a patient with a right occipital lesion. Instead of the commonly reported frontal mirror view (fantôme spéculaire), the patient saw her head and upper trunk laterally, in side view (fantôme de profil). We found that the visual appearance and completeness of the autoscopic image could be selectively modulated by active and passive movements, without being influenced by imagining the same movements or by tactile and auditory stimulation. Eye closure did not disrupt either the perception of the autoscopic body or the effects of the motor stimulation. Moreover, the visual body reduplication was coded neither in purely eye-centered nor in head-centered frames of reference, suggesting the involvement of egocentric coordinate systems (eyes and head centered). A follow-up examination highlighted the stability of the visual characteristics of the body reduplication and its shift induced by displacement of both head and eyes. These findings support the view that autoscopic phenomena have a multisensory motor origin and that proprioceptive signals may play an important role in modulating the illusory visual reduplication of the patient's own body, most likely via cross-modal modulation of extrastriate areas involved in body and face perception.
Journal of Cognitive Neuroscience (2004) 16 (1): 24–30.
Published: 01 January 2004
Abstract
The visual modality typically dominates over our other senses. Here we show that after inducing an extreme conflict in the left hand between vision of touch (present) and the feeling of touch (absent), sensitivity to touch increases for several minutes after the conflict. Transcranial magnetic stimulation of the posterior parietal cortex after this conflict not only eliminated the enduring visual enhancement of touch, but also impaired normal tactile perception. This latter finding demonstrates a direct role of the parietal lobe in modulating tactile perception as a result of the conflict between these senses. These results provide evidence for visual-to-tactile perceptual modulation and demonstrate effects of illusory vision of touch on touch perception through a long-lasting modulatory process in the posterior parietal cortex.
Journal of Cognitive Neuroscience (2002) 14 (7): 1030–1043.
Published: 01 October 2002
Abstract
In the present study, we report neuropsychological evidence for the existence of an auditory peripersonal space representation around the head in humans and describe its characteristics. In a group of right brain-damaged patients with tactile extinction, we found that a sound delivered near the ipsilesional side of the head (20 cm) strongly extinguished a tactile stimulus delivered to the contralesional side of the head (cross-modal auditory-tactile extinction). By contrast, when an auditory stimulus was presented far from the head (70 cm), cross-modal extinction was dramatically reduced. This spatially specific cross-modal extinction was most consistently found (i.e., both in the front and back spaces) when a complex sound was presented, such as a white noise burst. Pure tones produced spatially specific cross-modal extinction when presented in the back space, but not in the front space. In addition, the most severe cross-modal extinction emerged when sounds came from behind the head, thus showing that the back space is more sensitive than the front space to the sensory interaction of auditory-tactile inputs. Finally, when cross-modal effects were investigated by reversing the spatial arrangement of cross-modal stimuli (i.e., touch on the right and sound on the left), we found that an ipsilesional tactile stimulus, although inducing a small amount of cross-modal tactile-auditory extinction, did not produce any spatially specific effect. Therefore, the selective aspects of cross-modal interaction found near the head cannot be explained by a competition between a damaged left spatial representation and an intact right spatial representation. Thus, consistent with neurophysiological evidence from monkeys, our findings strongly support the existence, in humans, of an integrated cross-modal system coding auditory and tactile stimuli near the body, that is, in the peripersonal space.
Journal of Cognitive Neuroscience (1998) 10 (5): 581–589.
Published: 01 September 1998
Abstract
Current interpretations of extinction suggest that the disorder is due to an unbalanced competition between ipsilesional and contralesional representations of space. The question addressed in this study is whether the competition between left and right representations of space in one sensory modality (i.e., touch) can be reduced or exacerbated by the activation of an intact spatial representation in a different modality that is functionally linked to the damaged representation (i.e., vision). This hypothesis was tested in 10 right-hemisphere lesioned patients who suffered from reliable tactile extinction. We found that a visual stimulus presented near the patient's ipsilesional hand (i.e., visual peripersonal space) inhibited the processing of a tactile stimulus delivered on the contralesional hand (cross-modal visuotactile extinction) to the same extent as did an ipsilesional tactile stimulation (unimodal tactile extinction). It was also found that a visual stimulus presented near the contralesional hand improved the detection of a tactile stimulus applied to the same hand. In striking contrast, less modulatory effects of vision on touch perception were observed when a visual stimulus was presented far from the space immediately around the patient's hand (i.e., extrapersonal space). This study clearly demonstrates the existence of a visual peripersonal space centered on the hand in humans and its modulatory effects on tactile perception. These findings are explained by referring to the activity of bimodal neurons in premotor and parietal cortex of macaque, which have tactile receptive fields on the hand and corresponding visual receptive fields in the space immediately adjacent to the tactile fields.