Abstract
The rubber hand illusion (RHI) paradigm—in which illusory bodily ownership is induced by synchronous tactile stimulation of a participant's (hidden) hand and a (visible) surrogate—allows one to investigate how the brain resolves conflicting multisensory evidence during perceptual inference. To identify the functional anatomy of the RHI, we used multichannel EEG acquired under three conditions of tactile stimulation. Evoked potentials were computed by averaging EEG signals time-locked to brushstrokes delivered to the participant's hand. The participant's hand was stroked either in the absence of an artificial hand (REAL) or synchronously with an artificial hand that lay in either an anatomically plausible (CONGRUENT) or an anatomically impossible (INCONGRUENT) position. The illusion was reliably elicited in the CONGRUENT condition. For right-hand stimulation, significant differences between conditions emerged at the sensor level around 55 msec after the brushstroke, at left frontal and right parietal electrodes. Response amplitudes were smaller in the illusory (CONGRUENT) condition than in the nonillusory (INCONGRUENT and REAL) conditions over the contralateral perirolandic region (pre- and postcentral gyri) and the superior and inferior parietal lobules, whereas veridical perception of the artificial hand (INCONGRUENT) amplified responses at a scalp region overlying the contralateral postcentral gyrus and inferior parietal lobule relative to the other two conditions. Left-hand stimulation produced similar contralateral patterns. These results are consistent with predictive coding models of multisensory integration and may reflect the attenuation of somatosensory precision that is required to resolve perceptual hypotheses about conflicting multisensory input.
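To make the evoked-potential procedure concrete, the sketch below shows a generic condition-wise ERP averaging pipeline time-locked to brushstroke events using MNE-Python. It is an illustration only, not the authors' analysis pipeline: the file name, stimulus channel, trigger codes, and epoch window are assumptions introduced here for the example.

```python
import mne

# Assumed input: a preprocessed continuous EEG recording with a stimulus
# channel marking brushstroke onsets (file name and channel are hypothetical).
raw = mne.io.read_raw_fif("subject01_preprocessed-raw.fif", preload=True)
events = mne.find_events(raw, stim_channel="STI 014")

# Hypothetical trigger codes for the three stimulation conditions.
event_id = {"REAL": 1, "CONGRUENT": 2, "INCONGRUENT": 3}

# Epoch the EEG time-locked to each brushstroke (assumed window: -100 to
# 300 msec), baseline-correct on the prestimulus interval, and average
# within each condition to obtain evoked potentials.
epochs = mne.Epochs(raw, events, event_id=event_id,
                    tmin=-0.1, tmax=0.3, baseline=(None, 0), preload=True)
evokeds = {cond: epochs[cond].average() for cond in event_id}

# Sensor-level contrast between illusory and nonillusory conditions,
# which could then be inspected around 55 msec post-brushstroke.
congruent_vs_incongruent = mne.combine_evoked(
    [evokeds["CONGRUENT"], evokeds["INCONGRUENT"]], weights=[1, -1])
```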