Andrea R. Halpern
1-4 of 4
Journal Articles
Journal of Cognitive Neuroscience (2022) 34 (8): 1326–1339.
Published: 01 July 2022
Abstract
Notes in a musical scale convey different levels of stability or incompleteness, forming what is known as a tonal hierarchy. Levels of stability conveyed by these scale degrees are partly responsible for generating expectations as a melody proceeds, for emotions deriving from fulfillment (or not) of those expectations, and for judgments of overall melodic well-formedness. These functions can be extracted even during imagined music. We investigated whether patterns of neural activity in fMRI could be used to identify heard and imagined notes, and if patterns associated with heard notes could identify notes that were merely imagined. We presented trained musicians with the beginning of a scale (key and timbre were varied). The next note in the scale was either heard or imagined. A probe tone task assessed sensitivity to the tonal hierarchy, and state and trait measures of imagery were included as predictors. Multivoxel classification yielded above-chance results in primary auditory cortex (Heschl's gyrus) for heard scale-degree decoding. Imagined scale-degree decoding was successful in multiple cortical regions spanning bilateral superior temporal, inferior parietal, precentral, and inferior frontal areas. The right superior temporal gyrus yielded successful cross-decoding of heard-to-imagined scale-degree, indicating a shared pathway between tonal-hierarchy perception and imagery. Decoding in right and left superior temporal gyrus and right inferior frontal gyrus was more successful in people with more differentiated tonal hierarchies and in left inferior frontal gyrus among people with higher self-reported auditory imagery vividness, providing a link between behavioral traits and success of neural decoding. These results point to the neural specificity of imagined auditory experiences—even of such functional knowledge—but also document informative individual differences in the precision of that neural response.
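For readers who want a concrete picture of the cross-decoding analysis this abstract describes, the following Python sketch trains a classifier on multivoxel patterns from heard notes and tests it on patterns from imagined notes. The array shapes, the ROI, the number of scale degrees, and the choice of a linear SVM are illustrative assumptions, not the authors' actual pipeline.

    # Hypothetical heard-to-imagined cross-decoding with scikit-learn.
    # Simulated data stand in for real ROI patterns.
    import numpy as np
    from sklearn.svm import SVC
    from sklearn.preprocessing import StandardScaler
    from sklearn.pipeline import make_pipeline

    rng = np.random.default_rng(0)

    # Assumed data: one pattern per trial over the voxels of an ROI
    # (e.g., right superior temporal gyrus), labeled by scale degree.
    n_trials, n_voxels = 80, 200
    X_heard = rng.standard_normal((n_trials, n_voxels))
    y_heard = rng.integers(0, 4, n_trials)            # 4 scale degrees
    X_imagined = rng.standard_normal((n_trials, n_voxels))
    y_imagined = rng.integers(0, 4, n_trials)

    # Fit on heard trials, evaluate on imagined trials ("cross-decoding").
    clf = make_pipeline(StandardScaler(), SVC(kernel="linear"))
    clf.fit(X_heard, y_heard)
    cross_acc = clf.score(X_imagined, y_imagined)

    # Above-chance accuracy (chance = 1/4 here) would indicate a shared
    # neural code for heard and imagined scale degrees.
    print(f"heard-to-imagined decoding accuracy: {cross_acc:.2f}")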
Journal Articles
Journal of Cognitive Neuroscience (2012) 24 (6): 1382–1397.
Published: 01 June 2012
Abstract
We used fMRI to investigate the neuronal correlates of encoding and recognizing heard and imagined melodies. Ten participants were shown lyrics of familiar verbal tunes; they either heard the tune along with the lyrics, or they had to imagine it. In a subsequent surprise recognition test, they had to identify the titles of tunes that they had heard or imagined earlier. The functional data showed substantial overlap during melody perception and imagery, including secondary auditory areas. During imagery compared with perception, an extended network including pFC, SMA, intraparietal sulcus, and cerebellum showed increased activity, in line with the increased processing demands of imagery. Functional connectivity of anterior right temporal cortex with frontal areas was increased during imagery compared with perception, indicating that these areas form an imagery-related network. Activity in right superior temporal gyrus and pFC was correlated with the subjective rating of imagery vividness. Similar to the encoding phase, the recognition task recruited overlapping areas, including inferior frontal cortex associated with memory retrieval, as well as left middle temporal gyrus. The results present new evidence for the cortical network underlying goal-directed auditory imagery, with a prominent role of the right pFC both for the subjective impression of imagery vividness and for on-line mental monitoring of imagery-related activity in auditory areas.
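The functional connectivity contrast reported here can be sketched schematically: correlate a temporal-lobe seed time course with a frontal ROI separately for imagery and perception blocks and compare the correlations. All signals below are simulated stand-ins, and the simple block-wise correlation is an assumption about the general approach, not a reconstruction of the authors' method.

    # Minimal, hypothetical seed-based connectivity contrast.
    import numpy as np

    rng = np.random.default_rng(1)
    n_vols = 200
    seed = rng.standard_normal(n_vols)                   # right anterior temporal seed
    frontal = 0.5 * seed + rng.standard_normal(n_vols)   # a frontal ROI time course

    # Assumed block labels: True where the volume belongs to an imagery block.
    imagery_blocks = np.tile(np.repeat([True, False], 25), 4)

    r_imagery = np.corrcoef(seed[imagery_blocks], frontal[imagery_blocks])[0, 1]
    r_perception = np.corrcoef(seed[~imagery_blocks], frontal[~imagery_blocks])[0, 1]

    # The study's finding corresponds to r_imagery > r_perception.
    print(f"imagery r = {r_imagery:.2f}, perception r = {r_perception:.2f}")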
Journal Articles
Journal of Cognitive Neuroscience (2010) 22 (4): 775–789.
Published: 01 April 2010
Abstract
Two fMRI experiments explored the neural substrates of a musical imagery task that required manipulation of the imagined sounds: temporal reversal of a melody. Musicians were presented with the first few notes of a familiar tune (Experiment 1) or its title (Experiment 2), followed by a string of notes that was either an exact or an inexact reversal. The task was to judge whether the second string was correct or not by mentally reversing all its notes, thus requiring both maintenance and manipulation of the represented string. Both experiments showed considerable activation of the superior parietal lobe (intraparietal sulcus) during the reversal process. Ventrolateral and dorsolateral frontal cortices were also activated, consistent with the memory load required during the task. We also found weaker evidence for some activation of right auditory cortex in both studies, congruent with results from previous simpler music imagery tasks. We interpret these results in the context of other mental transformation tasks, such as mental rotation in the visual domain, which are known to recruit the intraparietal sulcus region, and we propose that this region subserves general computations that require transformations of a sensory input. Mental imagery tasks may thus have both task- or modality-specific components and components that supersede any specific codes and instead represent amodal mental manipulation.
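The judgment the participants had to make (is the probe string an exact temporal reversal of the opening?) is simple to state computationally. The toy Python below renders that task logic; the note names and the example tune are illustrative assumptions.

    # Toy rendering of the reversal-judgment task.
    def is_exact_reversal(melody, probe):
        """Return True if `probe` is `melody` played backwards, note for note."""
        return list(probe) == list(reversed(melody))

    opening = ["C", "C", "G", "G", "A", "A", "G"]   # "Twinkle, Twinkle" opening
    exact   = ["G", "A", "A", "G", "G", "C", "C"]
    inexact = ["G", "A", "G", "A", "G", "C", "C"]   # one note out of place

    print(is_exact_reversal(opening, exact))    # True  -> "correct" probe
    print(is_exact_reversal(opening, inexact))  # False -> "incorrect" probe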
Journal Articles
Journal of Cognitive Neuroscience (1996) 8 (1): 29–46.
Published: 01 January 1996
Abstract
Neuropsychological studies have suggested that imagery processes may be mediated by neuronal mechanisms similar to those used in perception. To test this hypothesis, and to explore the neural basis for song imagery, 12 normal subjects were scanned using the water bolus method to measure cerebral blood flow (CBF) during the performance of three tasks. In the control condition subjects saw pairs of words on each trial and judged which word was longer. In the perceptual condition subjects also viewed pairs of words, this time drawn from a familiar song; simultaneously they heard the corresponding song, and their task was to judge the change in pitch of the two cued words within the song. In the imagery condition, subjects performed precisely the same judgment as in the perceptual condition, but with no auditory input. Thus, to perform the imagery task correctly an internal auditory representation must be accessed. Paired-image subtraction of the resulting pattern of CBF, together with matched MRI for anatomical localization, revealed that both perceptual and imagery tasks produced similar patterns of CBF changes, as compared to the control condition, in keeping with the hypothesis. More specifically, both perceiving and imagining songs are associated with bilateral neuronal activity in the secondary auditory cortices, suggesting that processes within these regions underlie the phenomenological impression of imagined sounds. Other CBF foci elicited in both tasks include areas in the left and right frontal lobes and in the left parietal lobe, as well as the supplementary motor area. This latter region implicates covert vocalization as one component of musical imagery. Direct comparison of imagery and perceptual tasks revealed CBF increases in the inferior frontal polar cortex and right thalamus. We speculate that this network of regions may be specifically associated with retrieval and/or generation of auditory information from memory.
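The paired-image subtraction logic in this study can be illustrated voxelwise: subtract each subject's control-condition CBF image from the task image, then test the mean difference against zero at every voxel. The sketch below uses simulated data, an assumed image layout, and an uncorrected threshold chosen for illustration only; it is a schematic of the general method, not the study's actual PET analysis.

    # Schematic voxelwise paired-image subtraction with a paired t-test.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(2)
    n_subjects, n_voxels = 12, 1000                 # 12 subjects, as in the study

    cbf_task = rng.normal(50, 5, (n_subjects, n_voxels))     # e.g., imagery scans
    cbf_control = rng.normal(50, 5, (n_subjects, n_voxels))  # word-length control

    diff = cbf_task - cbf_control                   # within-subject subtraction
    t, p = stats.ttest_1samp(diff, popmean=0, axis=0)

    # Voxels showing reliable CBF increases over control.
    active = np.where((p < 0.001) & (t > 0))[0]
    print(f"{active.size} voxels exceed threshold")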