1–2 of 2 results for Orestis Papaioannou
Journal Articles
Journal of Cognitive Neuroscience (2022) 34 (2): 313–331.
Published: 05 January 2022
Abstract
Working memory is thought to serve as a buffer for ongoing cognitive operations, even in tasks that have no obvious memory requirements. This conceptualization has been supported by dual-task experiments, in which interference is observed between a primary task involving short-term memory storage and a secondary task that presumably requires the same buffer as the primary task. Little or no interference is typically observed when the secondary task is very simple. Here, we test the hypothesis that even very simple tasks require the working memory buffer, but interference can be minimized by using activity-silent representations to store the information from the primary task. We tested this hypothesis using a dual-task paradigm in which a simple discrimination task was interposed in the retention interval of a change detection task. We used contralateral delay activity (CDA) to track the active maintenance of information for the change detection task. We found that the CDA was massively disrupted after the interposed task. Despite this disruption of active maintenance, we found that performance in the change detection task was only slightly impaired, suggesting that activity-silent representations were used to retain the information for the change detection task. A second experiment replicated this result and also showed that automated discriminations could be performed without producing a large CDA disruption. Together, these results suggest that simple but non-automated discrimination tasks require the same processes that underlie active maintenance of information in working memory.
Journal Articles
Journal of Cognitive Neuroscience (2018) 30 (4): 498–513.
Published: 01 April 2018
Abstract
In auditory–visual sensory substitution, visual information (e.g., shape) can be extracted through strictly auditory input (e.g., soundscapes). Previous studies have shown that image-to-sound conversions that follow simple rules [such as the Meijer algorithm; Meijer, P. B. L. An experimental system for auditory image representation. Transactions on Biomedical Engineering, 39, 111–121, 1992] are highly intuitive and rapidly learned by both blind and sighted individuals. A number of recent fMRI studies have begun to explore the neuroplastic changes that result from sensory substitution training. However, the time course of cross-sensory information transfer in sensory substitution is largely unexplored and may offer insights into the underlying neural mechanisms. In this study, we recorded ERPs to soundscapes before and after sighted participants were trained with the Meijer algorithm. We compared these posttraining versus pretraining ERP differences with those of a control group who received the same set of 80 auditory/visual stimuli but with arbitrary pairings during training. Our behavioral results confirmed the rapid acquisition of cross-sensory mappings, and the group trained with the Meijer algorithm was able to generalize their learning to novel soundscapes at impressive levels of accuracy. The ERP results revealed an early cross-sensory learning effect (150–210 msec) that was significantly enhanced in the algorithm-trained group compared with the control group as well as a later difference (420–480 msec) that was unique to the algorithm-trained group. These ERP modulations are consistent with previous fMRI results and provide additional insight into the time course of cross-sensory information transfer in sensory substitution.
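The Meijer algorithm cited above is commonly described as scanning a grayscale image column by column from left to right, mapping each pixel's vertical position to pitch (higher rows sound higher) and its brightness to loudness. The sketch below illustrates that general mapping in Python; the function name image_to_soundscape, the frequency range, and the one-second scan duration are illustrative assumptions rather than the published vOICe implementation.

```python
import numpy as np

def image_to_soundscape(image, duration=1.0, sample_rate=22050,
                        f_min=500.0, f_max=5000.0):
    """Convert a grayscale image (rows x cols, values in 0-1) into a
    soundscape: columns are scanned left to right, each row is assigned
    a sinusoid (top row = highest pitch), and pixel brightness scales
    that sinusoid's amplitude. A Meijer-style sketch, not the exact
    published algorithm."""
    n_rows, n_cols = image.shape
    # One frequency per image row, spaced logarithmically, top row highest.
    freqs = np.logspace(np.log10(f_max), np.log10(f_min), n_rows)
    samples_per_col = int(duration * sample_rate / n_cols)
    t = np.arange(samples_per_col) / sample_rate
    soundscape = []
    for col in range(n_cols):
        # Each column becomes a superposition of sinusoids weighted by brightness.
        weights = image[:, col][:, None]                  # (rows, 1)
        tones = np.sin(2 * np.pi * freqs[:, None] * t)    # (rows, samples)
        soundscape.append((weights * tones).sum(axis=0))
    return np.concatenate(soundscape)

# Example: a bright diagonal line produces a clearly audible frequency sweep.
img = np.eye(64)
audio = image_to_soundscape(img)
```

Under this kind of fixed, rule-based mapping, simple shapes yield distinctive and predictable soundscapes, which is consistent with the rapid learning and generalization to novel soundscapes reported in the abstract.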