1–11 of 11 results for Stefan Koelsch
Journal Articles
Publisher: Journals Gateway
Journal of Cognitive Neuroscience (2011) 23 (9): 2252–2267.
Published: 01 September 2011
Abstract
The present study investigated the effects of auditory selective attention on the processing of syntactic information in music and speech using event-related potentials. Spoken sentences or musical chord sequences were presented either in isolation or simultaneously. When presented simultaneously, participants had to focus their attention either on speech or on music. Final words of sentences and final harmonies of chord sequences were syntactically either correct or incorrect. Irregular chords elicited an early right anterior negativity (ERAN), whose amplitude was decreased when music was presented simultaneously with speech, compared to when only music was presented. However, the amplitude of the ERAN-like waveform elicited when music was ignored did not differ from the conditions in which participants attended the chord sequences. Irregular sentences elicited an early left anterior negativity (ELAN), regardless of whether speech was presented in isolation, was attended, or was to be ignored. These findings suggest that the neural mechanisms underlying the processing of syntactic structure in music and speech operate partially automatically and, in the case of music, are influenced by different attentional conditions. Moreover, the ERAN was slightly reduced when irregular sentences were presented, but only when music was ignored. Therefore, these findings provide no clear support for an interaction of neural resources for syntactic processing at these early stages.
Journal of Cognitive Neuroscience (2011) 23 (3): 604–621.
Published: 01 March 2011
Abstract
Recent studies have shown that music is capable of conveying semantically meaningful concepts. Several questions have subsequently arisen, particularly with regard to the precise mechanisms underlying the communication of musical meaning as well as the role of specific musical features. The present article reports three studies investigating the role of affect expressed by various musical features in priming subsequent word processing at the semantic level. By means of an affective priming paradigm, it was shown that both musically trained and untrained participants evaluated emotional words congruous to the affect expressed by a preceding chord faster than words incongruous to the preceding chord. This behavioral effect was accompanied by an N400, an ERP typically linked with semantic processing, which was specifically modulated by the (mis)match between the prime and the target. This finding was shown for the musical parameter of consonance/dissonance (Experiment 1) and then extended to mode (major/minor) (Experiment 2) and timbre (Experiment 3). Given that the N400 is taken to reflect the processing of meaning, the present findings suggest that the emotional expression of single musical features is understood by listeners as such and is probably processed on a level akin to other affective communications (i.e., prosody or vocalizations) because it interferes with subsequent semantic processing. There were no group differences, suggesting that musical expertise does not have an influence on the processing of emotional expression in music and its semantic connotations.
Journal of Cognitive Neuroscience (2010) 22 (10): 2401–2413.
Published: 01 October 2010
Abstract
Musicians are highly trained motor experts with pronounced associations between musical actions and the corresponding auditory effects. However, the importance of auditory feedback for music performance is controversial, and it is unknown how feedback during music performance is processed. The present study investigated the neural mechanisms underlying the processing of auditory feedback manipulations in pianists. To disentangle effects of action-based and perception-based expectations, we compared feedback manipulations during performance to the mere perception of the same stimulus material. In two experiments, pianists performed sequences bimanually on a piano while, at random positions, the auditory feedback of single notes was manipulated, thereby creating a mismatch between an expected and an actually perceived action effect (action condition). In addition, pianists listened to tone sequences containing the same manipulations (perception condition). The manipulations in the perception condition were either task-relevant (Experiment 1) or task-irrelevant (Experiment 2). In action and perception conditions, event-related potentials elicited by manipulated tones showed an early fronto-central negativity around 200 msec, presumably reflecting a feedback ERN/N200, followed by a positive deflection (P3a). The early negativity was more pronounced during the action compared to the perception condition. This shows that during performance, the intention to produce specific auditory effects leads to stronger expectancies than the expectancies built up during music perception.
Journal of Cognitive Neuroscience (2010) 22 (10): 2251–2262.
Published: 01 October 2010
Abstract
The music we usually listen to in everyday life consists of either single melodies or harmonized melodies (i.e., of melodies “accompanied” by chords). However, differences in the neural mechanisms underlying melodic and harmonic processing have remained largely unknown. Using EEG, this study compared effects of music-syntactic processing between chords and melodies. In melody blocks, sequences consisted of five tones, the final tone being either regular or irregular ( p = .5). Analogously, in chord blocks, sequences consisted of five chords, the final chord function being either regular or irregular. Melodies were derived from the top voice of chord sequences, allowing a proper comparison between melodic and harmonic processing. Music-syntactic incongruities elicited an early anterior negativity with a latency of approximately 125 msec in both the melody and the chord conditions. This effect was followed in the chord condition, but not in the melody condition, by an additional negative effect that was maximal at approximately 180 msec. Both effects were maximal at frontal electrodes, but the later effect was more broadly distributed over the scalp than the earlier effect. These findings indicate that melodic information (which is also contained in the top voice of chords) is processed earlier and with partly different neural mechanisms than harmonic information of chords.
Journal of Cognitive Neuroscience (2008) 20 (11): 1940–1951.
Published: 01 November 2008
Abstract
Both language and music consist of sequences that are structured according to syntactic regularities. We used two specific event-related brain potential (ERP) components to investigate music-syntactic processing in children: the ERAN (early right anterior negativity) and the N5. The neural resources underlying these processes have been posited to overlap with those involved in the processing of linguistic syntax. Thus, we expected children with specific language impairment (SLI, which is characterized by deficient processing of linguistic syntax) to demonstrate difficulties with music-syntactic processing. Such difficulties were indeed observed in the neural correlates of music-syntactic processing: neither an ERAN nor an N5 was elicited in children with SLI, whereas both components were evoked in age-matched control children with typical language development. Moreover, the amplitudes of ERAN and N5 were correlated with subtests of a language development test. These data provide evidence for a strong interrelation between the language and the music processing system, thereby setting the ground for possible effects of musical training in SLI therapy.
Journal of Cognitive Neuroscience (2006) 18 (9): 1545–1554.
Published: 01 September 2006
Abstract
The present study investigates the effect of a change in syntactic-like musical function on event-related brain potentials (ERPs). Eight-chord piano sequences were presented to musically expert and novice listeners. Instructed to watch a movie and to ignore the musical sequences, the participants had to react when a chord was played with a different instrument than the piano. Participants were not informed that the relevant manipulation was the musical function of the last chord (target) of the sequences. The target chord acted either as a syntactically stable tonic chord (i.e., a C major chord in the key of C major) or as a less syntactically stable subdominant chord (i.e., a C major chord in the key of G major). The critical aspect of the results related to the impact such a manipulation had on the ERPs. An N5-like frontal negative component was found to be larger for subdominant than for tonic chords and attained significance only in musically expert listeners. These findings suggest that the subdominant chord is more difficult to integrate with the previous context than the tonic chord (as indexed by the observed N5) and that the processing of a small change in musical function occurs in an automatic way in musically expert listeners. The present results are discussed in relation to previous studies investigating harmonic violations with ERPs.
Journal of Cognitive Neuroscience (2006) 18 (8): 1380–1393.
Published: 01 August 2006
Abstract
The purpose of the present study was to investigate the effect of harmonic expectancy violations on emotions. Subjective response measures for tension and emotionality, as well as electrodermal activity (EDA) and heart rate (HR), were recorded from 24 subjects (12 musicians and 12 nonmusicians) to observe the effect of expectancy violations on subjective and physiological measures of emotions. In addition, an electroencephalogram was recorded to observe the neural correlates for detecting these violations. Stimuli consisted of three matched versions of six Bach chorales, which differed only in terms of one chord (harmonically either expected, unexpected, or very unexpected). Musicians' and nonmusicians' responses were also compared. Tension, overall subjective emotionality, and EDA increased with an increase in harmonic unexpectedness. Analysis of the event-related potentials revealed an early negativity (EN) for both the unexpected and the very unexpected harmonies, taken to reflect the detection of the unexpected event. The EN in response to very unexpected chords was significantly larger in amplitude than the EN in response to merely unexpected harmonic events. The ENs did not differ in amplitude between the two groups but peaked earlier for musicians than for nonmusicians. Both groups also showed a P3 component in response to the very unexpected harmonies, which was considerably larger for musicians and may reflect the processing of stylistic violations of Western classical music.
Journal of Cognitive Neuroscience (2005) 17 (10): 1565–1577.
Published: 01 October 2005
Abstract
The present study investigated simultaneous processing of language and music using visually presented sentences and auditorily presented chord sequences. Music-syntactically regular and irregular chord functions were presented synchronously with syntactically correct or incorrect words, or with words that had either a high or a low semantic cloze probability. Music-syntactically irregular chords elicited an early right anterior negativity (ERAN). Syntactically incorrect words elicited a left anterior negativity (LAN). The LAN was clearly reduced when words were presented simultaneously with music-syntactically irregular chord functions. Processing of high and low cloze-probability words as indexed by the N400 was not affected by the presentation of irregular chord functions. In a control experiment, the LAN was not affected by physically deviant tones that elicited a mismatch negativity (MMN). Results demonstrate that processing of musical syntax (as reflected in the ERAN) interacts with the processing of linguistic syntax (as reflected in the LAN), and that this interaction is not due to a general effect of deviance-related negativities that precede an LAN. Findings thus indicate a strong overlap of neural resources involved in the processing of syntax in language and music.
Journal of Cognitive Neuroscience (2003) 15 (8): 1149–1159.
Published: 15 November 2003
Abstract
A common stylistic element of Western tonal music is the change of key within a musical sequence (known as modulation in musical terms). The aim of the present study was to investigate neural correlates of the cognitive processing of modulations with event-related brain potentials. Participants listened to sequences of chords that were infrequently modulating. Modulating chords elicited distinct effects in the event-related brain potentials: an early right anterior negativity reflecting the processing of a violation of musical regularities and a late frontal negativity taken to reflect processes of harmonic integration. Additionally, modulations elicited a tonic negative potential suggested to reflect cognitive processes characteristic for the processing of tonal modulations, namely, the restructuring of the “hierarchy of harmonic stability” (which specifies musical expectations), presumably entailing working memory operations. Participants were “nonmusicians”; results thus support the hypothesis that nonmusicians have a sophisticated (implicit) knowledge about musical regularities.
Children Processing Music: Electric Brain Responses Reveal Musical Competence and Gender Differences
Journal of Cognitive Neuroscience (2003) 15 (5): 683–693.
Published: 01 May 2003
Abstract
Numerous studies investigated physiological correlates of the processing of musical information in adults. How these correlates develop during childhood is poorly understood. In the present study, we measured event-related electric brain potentials elicited in 5- and 9-year-old children while they listened to (major–minor tonal) music. Stimuli were chord sequences, infrequently containing harmonically inappropriate chords. Our results demonstrate that the degree of (in)appropriateness of the chords modified the brain responses in both groups according to music-theoretical principles. This suggests that already 5-year-old children process music according to a well-established cognitive representation of the major–minor tonal system and according to music-syntactic regularities. Moreover, we show that, in contrast to adults, an early negative brain response was left predominant in boys, whereas it was bilateral in girls, indicating a gender difference in children processing music, and revealing that children process music with a hemispheric weighting different from that of adults. Because children process, in contrast to adults, music in the same hemispheres as they process language, results indicate that children process music and language more similarly than adults. This finding might support the notion of a common origin of music and language in the human brain, and concurs with findings that demonstrate the importance of musical features of speech for the acquisition of language.
Journal of Cognitive Neuroscience (2000) 12 (3): 520–541.
Published: 01 May 2000
Abstract
Little systematic research has examined event-related brain potentials (ERPs) elicited by the cognitive processing of music. The present study investigated how music processing is influenced by a preceding musical context, affected by the task relevance of unexpected chords, and influenced by the degree and the probability of violation. Four experiments were conducted in which “nonmusicians” listened to chord sequences, which infrequently contained a chord violating the sound expectancy of listeners. Integration of in-key chords into the musical context was reflected as a late negative-frontal deflection in the ERPs. This negative deflection declined towards the end of a chord sequence, reflecting normal buildup of musical context. Brain waves elicited by chords with unexpected notes revealed two ERP effects: an early anterior negativity with right-hemispheric preponderance, taken to reflect the violation of sound expectancy; and a late bilateral-frontal negativity. The late negativity was larger than that elicited by in-key chords and was taken to reflect the higher degree of integration needed for unexpected chords. The early right-anterior negativity (ERAN) was unaffected by the task relevance of unexpected chords. The amplitudes of both early and late negativities were found to be sensitive to the degree of musical expectancy induced by the preceding harmonic context, and to the probability of deviant acoustic events. The employed experimental design opens a new field for the investigation of music processing. Results strengthen the hypothesis of an implicit musical ability of the human brain.