Judith M. Shedden
Journal Articles
Publisher: Journals Gateway
Journal of Cognitive Neuroscience (2023) 35 (7): 1092–1107.
Published: 01 July 2023
Stimulus Onset Asynchrony Affects Weighting-related Event-related Spectral Power in Self-motion Perception

Abstract
Self-motion perception relies primarily on the integration of the visual, vestibular, proprioceptive, and somatosensory systems. There is a gap in understanding how a temporal lag between visual and vestibular motion cues affects visual–vestibular weighting during self-motion perception. The beta band is an index of visual–vestibular weighting, in that robust beta event-related synchronization (ERS) is associated with visual weighting bias, and robust beta event-related desynchronization (ERD) is associated with vestibular weighting bias. The present study examined modulation of event-related spectral power during a heading judgment task in which participants attended to either visual (optic flow) or physical (inertial cues stimulating the vestibular, proprioceptive, and somatosensory systems) motion cues from a motion simulator mounted on a MOOG Stewart Platform. The temporal lag between the onset of visual and physical motion cues was manipulated to produce three lag conditions: simultaneous onset, visual before physical motion onset, and physical before visual motion onset. There were two main findings. First, we demonstrated that when the attended motion cue was presented before an ignored cue, the power of beta associated with the attended modality was greater than when visual–vestibular cues were presented simultaneously or when the ignored cue was presented first. This was the case for beta ERS when the visual-motion cue was attended to, and beta ERD when the physical-motion cue was attended to. Second, we tested whether the power of feature-binding gamma ERS (demonstrated in audiovisual and visual–tactile integration studies) increased when the visual–vestibular cues were presented simultaneously versus with temporal asynchrony. We did not observe an increase in gamma ERS when cues were presented simultaneously, suggesting that electrophysiological markers of visual–vestibular binding differ from markers of audiovisual and visual–tactile integration. All event-related spectral power reported in this study was generated from dipoles projecting from the left and right motor areas, based on the results of Measure Projection Analysis.
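The beta ERS/ERD measures above are commonly expressed as band-limited power in a post-stimulus epoch relative to a pre-stimulus baseline, in decibels. A minimal sketch of that convention (not the authors' pipeline; the 13–30 Hz band edges and the synthetic signal are illustrative assumptions):

```python
import numpy as np
from scipy.signal import welch

def beta_band_power(signal, fs, band=(13.0, 30.0)):
    """Mean spectral power of `signal` within `band` (Hz), via Welch's method."""
    freqs, psd = welch(signal, fs=fs, nperseg=min(len(signal), 256))
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[mask].mean()

def ers_db(baseline, epoch, fs):
    """Event-related spectral change in dB relative to baseline.

    Positive values indicate synchronization (ERS); negative values
    indicate desynchronization (ERD).
    """
    return 10.0 * np.log10(beta_band_power(epoch, fs) / beta_band_power(baseline, fs))

# Synthetic demo: a 20 Hz (beta) oscillation that doubles in amplitude
# after "stimulus onset" yields a positive, ERS-like value (~6 dB).
fs = 500
t = np.arange(0, 1.0, 1.0 / fs)
rng = np.random.default_rng(0)
baseline = np.sin(2 * np.pi * 20 * t) + 0.1 * rng.standard_normal(t.size)
epoch = 2.0 * np.sin(2 * np.pi * 20 * t) + 0.1 * rng.standard_normal(t.size)
print(round(ers_db(baseline, epoch, fs), 1))
```

Swapping `epoch` and `baseline` flips the sign, giving the ERD case.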
Journal of Cognitive Neuroscience (2009) 21 (6): 1127–1134.
Published: 01 June 2009
Semantic Learning Modifies Perceptual Face Processing

Abstract
Face processing changes when a face is learned with personally relevant information. In a five-day learning paradigm, faces were presented with rich semantic stories that conveyed personal information about the faces. Event-related potentials were recorded before and after learning during a passive viewing task. When faces were novel, we observed the expected N170 repetition effect—a reduction in amplitude following face repetition. However, when faces were learned with personal information, the N170 repetition effect was eliminated, suggesting that semantic information modulates the N170 repetition effect. To control for the possibility that a simple perceptual effect contributed to the change in the N170 repetition effect, another experiment was conducted using stories that were not related to the person (i.e., stories about rocks and volcanoes). Although viewers were exposed to the faces for an equal amount of time, the typical N170 repetition effect was observed, indicating that personal semantic information associated with a face, and not simply perceptual exposure, produced the observed reduction in the N170 repetition effect. These results are the first to reveal a critical perceptual change in face processing as a result of learning person-related information. The results have important implications for researchers studying face processing, as well as learning and memory in general, as they demonstrate that perceptual information alone is not enough to establish familiarity akin to real-world person learning.
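The repetition effect described above is conventionally quantified as the change in mean ERP amplitude within a post-stimulus window spanning the N170 peak between first and repeated presentations. A hypothetical illustration (not the authors' analysis code; the 130–200 ms window and the synthetic Gaussian deflection are assumptions for the demo):

```python
import numpy as np

def mean_window_amplitude(erp, times, window=(0.130, 0.200)):
    """Mean amplitude (microvolts) of `erp` within `window` (seconds)."""
    mask = (times >= window[0]) & (times <= window[1])
    return erp[mask].mean()

def repetition_effect(erp_first, erp_repeat, times):
    """Amplitude change on repetition.

    For a negative-going component like the N170, a positive value means
    the repeated presentation was smaller (less negative) than the first.
    """
    return mean_window_amplitude(erp_repeat, times) - mean_window_amplitude(erp_first, times)

# Synthetic demo: a negative deflection peaking at ~170 ms that shrinks
# on repetition produces a positive repetition effect.
fs = 1000
times = np.arange(-0.1, 0.4, 1.0 / fs)

def n170(amp):
    # Gaussian deflection peaking at 170 ms (SD 20 ms)
    return amp * np.exp(-((times - 0.170) ** 2) / (2 * 0.02 ** 2))

erp_first = n170(-6.0)   # first presentation, in microvolts
erp_repeat = n170(-4.0)  # reduced (less negative) after repetition
print(round(repetition_effect(erp_first, erp_repeat, times), 2))
```

Eliminating the effect, as reported after semantic learning, corresponds to this difference shrinking toward zero.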