Search results for Kenneth A. Norman (1-3 of 3)
Journal Articles
Publisher: Journals Gateway
Journal of Cognitive Neuroscience (2022) 34 (4): 699–714.
Published: 05 March 2022
Figures: 7
High-Order Areas and Auditory Cortex Both Represent the High-Level Event Structure of Music
Abstract
Recent fMRI studies of event segmentation have found that default mode regions represent high-level event structure during movie watching. In these regions, neural patterns are relatively stable during events and shift at event boundaries. Music, like narratives, contains hierarchical event structure (e.g., sections are composed of phrases). Here, we tested the hypothesis that brain activity patterns in default mode regions reflect the high-level event structure of music. We used fMRI to record brain activity from 25 participants (male and female) as they listened to a continuous playlist of 16 musical excerpts and additionally collected annotations for these excerpts by asking a separate group of participants to mark when meaningful changes occurred in each one. We then identified temporal boundaries between stable patterns of brain activity using a hidden Markov model and compared the location of the model boundaries to the location of the human annotations. We identified multiple brain regions with significant matches to the observer-identified boundaries, including auditory cortex, medial prefrontal cortex, parietal cortex, and angular gyrus. From these results, we conclude that both higher-order and sensory areas contain information relating to the high-level event structure of music. Moreover, the higher-order areas in this study overlap with areas found in previous studies of event perception in movies and audio narratives, including regions in the default mode network.
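The boundary comparison described in this abstract (HMM-derived neural boundaries scored against human annotations) can be illustrated with a minimal sketch: count how many model boundaries fall near an annotated boundary, then assess significance against randomly placed boundaries. The tolerance, timings, and permutation null below are hypothetical stand-ins for illustration, not the authors' actual pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)

def match_score(model_bounds, human_bounds, tol=3):
    """Fraction of model boundaries within `tol` seconds of any
    human-annotated boundary."""
    hits = [any(abs(m - h) <= tol for h in human_bounds) for m in model_bounds]
    return np.mean(hits)

def permutation_p(model_bounds, human_bounds, duration, tol=3, n_perm=1000):
    """Null model: the same number of boundaries placed uniformly at
    random within [0, duration). Returns (observed score, p value)."""
    observed = match_score(model_bounds, human_bounds, tol)
    k = len(model_bounds)
    null = np.array([
        match_score(rng.uniform(0, duration, size=k), human_bounds, tol)
        for _ in range(n_perm)
    ])
    return observed, (np.sum(null >= observed) + 1) / (n_perm + 1)

# Hypothetical boundary times (seconds): HMM fit vs. human annotations
model = [10.2, 34.5, 61.0, 88.7]
human = [11.0, 35.0, 60.0, 90.0]
obs, p = permutation_p(model, human, duration=120)
```

A null that preserves the number of boundaries but randomizes their placement keeps the comparison honest: a model that posits many boundaries would otherwise match the annotations by chance.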
Journal of Cognitive Neuroscience (2021) 33 (6): 1106–1128.
Published: 01 May 2021
Figures: 9
Relating the Past with the Present: Information Integration and Segregation during Ongoing Narrative Processing
Abstract
This study examined how the brain dynamically updates event representations by integrating new information over multiple minutes while segregating irrelevant input. A professional writer custom-designed a narrative with two independent storylines, interleaved across minute-long segments (ABAB). In the last (C) part, characters from the two storylines meet and their shared history is revealed. Part C was designed to induce spontaneous recall of past events upon the recurrence of narrative motifs from A/B, and to shed new light on them. Our fMRI results showed storyline-specific neural patterns, which were reinstated (i.e., became more active) during storyline transitions. This effect increased along the processing timescale hierarchy, peaking in the default mode network. Neural reinstatement of motifs was likewise found during Part C. Furthermore, participants showing stronger motif reinstatement performed better at integrating A/B and C events, demonstrating the role of memory reactivation in information integration over intervening irrelevant events.
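The storyline-specific reinstatement effect described here is typically quantified by correlating a post-transition activity pattern with template patterns for each storyline. Below is a toy sketch of that logic on synthetic "voxel" data; the templates, noise levels, and mixing weight are all hypothetical, not the study's fMRI analysis.

```python
import numpy as np

rng = np.random.default_rng(2)
n_vox = 80

# Hypothetical storyline templates: mean voxel pattern for storylines A and B,
# as if estimated from independent narrative segments.
template_A = rng.normal(size=n_vox)
template_B = rng.normal(size=n_vox)

def corr(a, b):
    """Pearson correlation between two voxel patterns."""
    a, b = a - a.mean(), b - b.mean()
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# A timepoint just after a B -> A transition: mostly A-like, plus noise.
post_transition = 0.8 * template_A + rng.normal(scale=0.5, size=n_vox)

# Reinstatement index: does the pattern match the now-active storyline (A)
# more than the interrupted one (B)?
reinstatement = corr(post_transition, template_A) - corr(post_transition, template_B)
```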
Journal of Cognitive Neuroscience (2017) 29 (8): 1339–1354.
Published: 01 August 2017
Figures: 4
Multiple-object Tracking as a Tool for Parametrically Modulating Memory Reactivation
Abstract
Converging evidence supports the “nonmonotonic plasticity” hypothesis, which states that although complete retrieval may strengthen memories, partial retrieval weakens them. Yet, the classic experimental paradigms used to study effects of partial retrieval are not ideally suited to doing so, because they lack the parametric control needed to ensure that the memory is activated to the appropriate degree (i.e., that there is some retrieval but not enough to cause memory strengthening). Here, we present a novel procedure designed to accommodate this need. After participants learned a list of word–scene associates, they completed a cued mental visualization task that was combined with a multiple-object tracking (MOT) procedure, which we selected for its ability to interfere with mental visualization in a parametrically adjustable way (by varying the number of MOT targets). We also used fMRI data to successfully train an “associative recall” classifier for use in this task: This classifier revealed greater memory reactivation during trials in which associative memories were cued while participants tracked one, rather than five, MOT targets. However, the classifier was insensitive to task difficulty when recall was not taking place, suggesting that it had indeed tracked memory reactivation rather than task difficulty per se. Consistent with the classifier findings, participants' introspective ratings of visualization vividness were modulated by MOT task difficulty. In addition, we observed reduced classifier output and slowing of responses in a postreactivation memory test, consistent with the hypothesis that partial reactivation, induced by MOT, weakened memory. These results serve as a “proof of concept” that MOT can be used to parametrically modulate memory retrieval—a property that may prove useful in future investigation of partial retrieval effects, for example, in closed-loop experiments.
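The classifier logic in this abstract (train a recall-vs-no-recall pattern classifier, then read out its evidence under different MOT loads) can be sketched with a minimal class-mean linear classifier on synthetic "voxel" data. The signal model, the mapping from tracking load to signal strength, and all numbers are assumptions for illustration, not the study's MVPA pipeline.

```python
import numpy as np

rng = np.random.default_rng(1)
n_vox = 50
signal = rng.normal(size=n_vox)  # hypothetical "recall" pattern direction

def make_trials(n, strength):
    """Synthetic voxel patterns: noise plus a recall signal of given strength."""
    return rng.normal(size=(n, n_vox)) + strength * signal

# Fit a minimal linear classifier from class means (a toy stand-in for the
# study's trained "associative recall" classifier).
recall, no_recall = make_trials(100, 1.0), make_trials(100, 0.0)
w = recall.mean(axis=0) - no_recall.mean(axis=0)
b = -0.5 * (recall.mean(axis=0) + no_recall.mean(axis=0)) @ w

def evidence(x):
    """Sigmoid of the linear decision value: a P(recall)-like score."""
    return 1 / (1 + np.exp(-(x @ w + b)))

# MOT load is modeled here as attenuating the recall signal (1 target ->
# strong reactivation, 5 targets -> weak); this mapping is an assumption.
one_target = evidence(make_trials(50, 1.0)).mean()
five_targets = evidence(make_trials(50, 0.3)).mean()
```

The point of the sketch is the readout, not the classifier: mean classifier evidence is higher when the cued memory can be visualized with little interference (one target) than under heavy tracking load (five targets), mirroring the parametric modulation the paradigm is designed to produce.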