Sophie Egan
Journal Articles
Publisher: Journals Gateway
Journal of Cognitive Neuroscience (2024) 36 (10): 2067–2083.
Published: 01 October 2024
Abstract
The N1/P2 amplitude reduction for self-generated tones compared with external tones in EEG, which has recently also been described for action observation, is an example of so-called sensory attenuation. Whether this effect depends on motor-based or general predictive mechanisms is unclear. Using a paradigm in which actions (button presses) elicited tones in only half the trials, this study examined how the processing of the tones is modulated by the trial-wise prediction error for a self-performed action compared with action observation. In addition, we considered the effect of temporal predictability by adding a third condition, in which visual cues were followed by external tones in half the trials. The attenuation result patterns differed for N1 and P2 amplitudes, but neither showed an attenuation effect beyond temporal predictability. Interestingly, we found that both N1 and P2 amplitudes reflected prediction errors derived from a reinforcement learning model, in that larger errors coincided with larger amplitudes. This effect was stronger for tones following button presses than for cued external tones, but only for self-performed and not for observed actions. Taken together, our results suggest that attenuation effects are partially driven by general predictive mechanisms irrespective of self-performed actions. However, the stronger prediction-error effects for self-generated tones suggest that distinct motor-related factors beyond temporal predictability, potentially linked to reinforcement learning, play a role in the underlying mechanisms. Further research is needed to validate these initial findings, as the calculation of the prediction errors was limited by the design of the experiment.
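The abstract does not specify the form of the reinforcement learning model, but trial-wise prediction errors in such designs are commonly computed with a delta rule (Rescorla-Wagner update). The sketch below is an illustrative assumption of that scheme, not the authors' actual model; the learning rate `alpha` and initial expectation `v0` are placeholder values.

```python
def prediction_errors(outcomes, alpha=0.3, v0=0.5):
    """Return the unsigned prediction error on each trial.

    outcomes: sequence of 1 (tone occurred) / 0 (no tone), matching a
    design where an action elicits a tone in only half the trials.
    alpha:    assumed learning rate (illustrative, not from the paper).
    v0:       assumed initial expectation of a tone (illustrative).
    """
    v = v0                            # current expectation that a tone follows
    errors = []
    for outcome in outcomes:
        delta = outcome - v           # signed prediction error
        errors.append(abs(delta))     # larger |delta| ~ larger N1/P2 amplitude
        v += alpha * delta            # update expectation toward the outcome
    return errors
```

With a 50% tone contingency, expectations hover near 0.5, so every trial produces a sizeable error; the sign and magnitude of `delta` track whether the tone was more or less expected than what occurred.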
Journal of Cognitive Neuroscience (2021) 33 (4): 683–694.
Published: 01 April 2021
Abstract
In our social environment, we easily distinguish stimuli caused by our own actions (e.g., water splashing when I fill my glass) from stimuli that have an external source (e.g., water splashing in a fountain). Accumulating evidence suggests that processing the auditory consequences of self-performed actions elicits N1 and P2 ERPs of reduced amplitude compared with physically identical but externally generated sounds, with such reductions being ascribed to neural predictive mechanisms. It is unexplored, however, whether the sensory processing of action outcomes is similarly modulated by action observation (e.g., water splashing when I observe you filling my glass). We tested 40 healthy participants using a methodological approach for the simultaneous EEG recording of two people: an observer watched button presses executed by a performer in real time. For the performers, we replicated previous findings of a reduced N1 amplitude for self- versus externally generated sounds. This pattern differed significantly from that in observers, whose N1 for sounds generated by observed button presses was not attenuated. In turn, the P2 amplitude was reduced for action- versus externally generated sounds for both performers and observers. These findings show that both action performance and action observation affect the processing of action-generated sounds. There are, however, important differences between the two in the timing of the effects, probably related to differences in the predictability of the actions and thus of the associated stimuli. We discuss how these differences might contribute to recognizing a stimulus as caused by self versus others.