Clare Press
Journal of Cognitive Neuroscience (2023) 35 (7): 1133–1143.
Published: 01 July 2023
Abstract
Perceivers can use past experiences to make sense of ambiguous sensory signals. However, this may be inappropriate when the world changes and past experiences no longer predict what the future holds. Optimal learning models propose that observers decide whether to stick with or update their predictions by tracking the uncertainty or “precision” of their expectations. However, contrasting theories of prediction have argued that we are prone to misestimate uncertainty—leading to stubborn predictions that are difficult to dislodge. To compare these possibilities, we had participants learn novel perceptual predictions before using fMRI to record visual brain activity when predictive contingencies were disrupted—meaning that previously “expected” events became objectively improbable. Multivariate pattern analyses revealed that expected events continued to be decoded with greater fidelity from primary visual cortex, despite marked changes in the statistical structure of the environment, which rendered these expectations no longer valid. These results suggest that our perceptual systems do indeed form stubborn predictions even from short periods of learning—and more generally suggest that top–down expectations have the potential to help or hinder perceptual inference in bounded minds like ours.
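As a rough illustration of the precision-weighted updating that this abstract contrasts with "stubborn" predictions, the sketch below uses made-up numbers (not values from the study) to show how a Gaussian belief is revised more or less depending on the precision assigned to the prior: an observer who overestimates prior precision barely updates even after the environment has changed.

import numpy as np

def update_belief(prior_mean, prior_precision, obs, obs_precision):
    # One precision-weighted (Bayes-optimal for Gaussians) belief update.
    post_precision = prior_precision + obs_precision
    post_mean = (prior_precision * prior_mean + obs_precision * obs) / post_precision
    return post_mean, post_precision

rng = np.random.default_rng(0)
new_true_value = 5.0                         # the contingency has changed
observations = new_true_value + rng.normal(0, 1, size=20)

for label, prior_prec in [("well-calibrated prior", 1.0),
                          ("overly precise (stubborn) prior", 50.0)]:
    mean, prec = 0.0, prior_prec             # both observers start out expecting 0
    for x in observations:
        mean, prec = update_belief(mean, prec, x, obs_precision=1.0)
    print(f"{label}: belief after 20 observations = {mean:.2f}")

With these hypothetical numbers, the well-calibrated observer ends up close to the new value, while the over-precise observer remains anchored near its original expectation despite the same evidence.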
Journal of Cognitive Neuroscience (2010) 22 (10): 2198–2211.
Published: 01 October 2010
Abstract
Several theories of the mechanisms linking perception and action require that the links are bidirectional, but there is a lack of consensus on the effects that action has on perception. We investigated this by measuring visual event-related brain potentials to observed hand actions while participants prepared responses that were spatially compatible (e.g., both were on the left side of the body) or incompatible and action type compatible (e.g., both were finger taps) or incompatible, with observed actions. An early enhanced processing of spatially compatible stimuli was observed, which is likely due to spatial attention. This was followed by an attenuation of processing for both spatially and action type compatible stimuli, likely to be driven by efference copy signals that attenuate processing of predicted sensory consequences of actions. Attenuation was not response-modality specific; it was found for manual stimuli when participants prepared manual and vocal responses, in line with the hypothesis that action control is hierarchically organized. These results indicate that spatial attention and forward model prediction mechanisms have opposite, but temporally distinct, effects on perception. This hypothesis can explain the inconsistency of recent findings on action–perception links and thereby supports the view that sensorimotor links are bidirectional. Such effects of action on perception are likely to be crucial, not only for the control of our own actions but also in sociocultural interaction, allowing us to predict the reactions of others to our own actions.
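For readers unfamiliar with the ERP measure used in studies like this one, the minimal sketch below (synthetic data and hypothetical condition labels, not the authors' pipeline) shows the basic logic: single-trial epochs time-locked to stimulus onset are averaged within each condition, and effects such as the attenuation described above appear as amplitude differences between the averaged waveforms in a given time window.

import numpy as np

rng = np.random.default_rng(1)
n_trials, n_times = 100, 300                  # 300 samples per epoch
times = np.linspace(-0.1, 0.5, n_times)       # seconds relative to stimulus onset

def simulate_epochs(evoked_amplitude):
    # Noisy single-trial epochs containing an evoked deflection around 170 ms.
    evoked = evoked_amplitude * np.exp(-((times - 0.17) ** 2) / (2 * 0.02 ** 2))
    return evoked + rng.normal(0, 2.0, size=(n_trials, n_times))

# Hypothetical amplitudes: compatible (predicted) stimuli evoke an attenuated response.
erp_compatible = simulate_epochs(3.0).mean(axis=0)
erp_incompatible = simulate_epochs(5.0).mean(axis=0)

window = (times > 0.14) & (times < 0.20)
print("Mean amplitude 140-200 ms, compatible:  ", erp_compatible[window].mean())
print("Mean amplitude 140-200 ms, incompatible:", erp_incompatible[window].mean())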
Journal of Cognitive Neuroscience (2008) 20 (2): 312–323.
Published: 01 February 2008
Abstract
We studied how the integration of seen and felt tactile stimulation modulates somatosensory processing, and investigated whether visuotactile integration depends on temporal contiguity of stimulation, and its coherence with a preexisting body representation. During training, participants viewed a rubber hand or a rubber object that was tapped either synchronously with stimulation of their own hand, or in an uncorrelated fashion. In a subsequent test phase, somatosensory event-related potentials (ERPs) were recorded to tactile stimulation of the left or right hand, to assess how tactile processing was affected by previous visuotactile experience during training. An enhanced somatosensory N140 component was elicited after synchronous, compared with uncorrelated, visuotactile training, irrespective of whether participants viewed a rubber hand or rubber object. This early effect of visuotactile integration on somatosensory processing is interpreted as a candidate electrophysiological correlate of the rubber hand illusion that is determined by temporal contiguity, but not by preexisting body representations. ERP modulations were observed beyond 200 msec poststimulus, suggesting an attentional bias induced by visuotactile training. These late modulations were absent when the stimulation of a rubber hand and the participant's own hand was uncorrelated during training, suggesting that preexisting body representations may affect later stages of tactile processing.