Daniele Schön (1–8 of 8 results)
Journal Articles
Publisher: Journals Gateway
Journal of Cognitive Neuroscience (2017) 29 (8): 1378–1389.
Published: 01 August 2017
Abstract
Musical rhythm positively impacts subsequent speech processing. However, the neural mechanisms underlying this phenomenon are so far unclear. We investigated whether carryover effects from a preceding musical cue to a speech stimulus result from a continuation of neural phase entrainment to periodicities that are present in both music and speech. Participants listened to and memorized French metrical sentences that contained (quasi-)periodic recurrences of accents and syllables. Speech stimuli were preceded by a rhythmically regular or irregular musical cue. Our results show that the presence of a regular cue, compared with an irregular one, modulates the neural response during speech processing at critical frequencies, as estimated by EEG power spectral density, intertrial coherence, and source analyses. Importantly, intertrial coherences for regular cues were indicative of the participants' success in memorizing the subsequent speech stimuli. These findings underscore the highly adaptive nature of neural phase entrainment across fundamentally different auditory stimuli. They also support current models of neural phase entrainment as a tool of predictive timing and attentional selection across cognitive domains.
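Intertrial coherence, one of the EEG measures named in this abstract, quantifies how consistently the phase of a neural oscillation lines up across trials at a given frequency. A minimal single-channel sketch (not the authors' actual pipeline; the sampling rate, frequency, and array shapes are illustrative assumptions):

```python
import numpy as np

def intertrial_coherence(trials, fs, freq):
    """Intertrial coherence (phase locking across trials) at one frequency.

    trials: array of shape (n_trials, n_samples), epoched single-channel EEG.
    fs: sampling rate in Hz; freq: frequency of interest in Hz.
    Returns a value in [0, 1]; 1 means identical phase on every trial.
    """
    n_trials, n_samples = trials.shape
    t = np.arange(n_samples) / fs
    # Project each trial onto a complex sinusoid at the target frequency
    # (a single Fourier coefficient), then keep only its phase.
    coeffs = trials @ np.exp(-2j * np.pi * freq * t)
    unit_phasors = coeffs / np.abs(coeffs)
    # The modulus of the mean unit phasor is the intertrial coherence.
    return float(np.abs(unit_phasors.mean()))
```

Trials whose phase is locked to a regular cue yield values near 1, while trials with random phase average out toward 0, which is how the measure distinguishes the regular from the irregular condition.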
Journal of Cognitive Neuroscience (2014) 26 (3): 593–605.
Published: 01 March 2014
Abstract
When we direct attentional resources to a certain point in time, expectation and preparedness are heightened and behavior is, as a result, more efficient. This future-oriented attending can be guided either voluntarily, by externally defined cues, or implicitly, by perceived temporal regularities. Inspired by dynamic attending theory, our aim was to study the extent to which metrical structure, with its beats of greater or lesser relative strength, modulates attention implicitly over time and to uncover the neural circuits underlying this process of dynamic attending. We used fMRI to investigate whether auditory meter generated temporal expectancies and, consequently, how it affected processing of auditory and visual targets. Participants listened to a continuous auditory metrical sequence and pressed a button whenever an auditory or visual target was presented. The independent variable was the time of target presentation with respect to the metrical structure of the sequence. Participants' RTs to targets occurring on strong metrical positions were significantly faster than responses to events falling on weak metrical positions. Events falling on strong beats were accompanied by increased activation of the left inferior parietal cortex, a region crucial for orienting attention in time, and by greater functional connectivity between the left inferior parietal cortex and the visual and auditory cortices, the SMA, and the cerebellum. These results support the predictions of dynamic attending theory that metrical structure, with its relatively strong and weak beats, modulates attentional resources over time and, in turn, affects the functioning of both perceptual and motor preparatory systems.
Journal of Cognitive Neuroscience (2011) 23 (12): 3874–3887.
Published: 01 December 2011
Abstract
The aim of this study was to examine the influence of musical expertise in 9-year-old children on passive (as reflected by MMN) and active (as reflected by discrimination accuracy) processing of speech sounds. Musician and nonmusician children were presented with a sequence of syllables that included standards and deviants in vowel frequency, vowel duration, and voice onset time (VOT). Both the passive and the active processing of duration and VOT deviants were enhanced in musician compared with nonmusician children. Moreover, although no effect was found on the passive processing of frequency, active frequency discrimination was enhanced in musician children. These findings are discussed in terms of common processing of acoustic features in music and speech and of positive transfer of training from music to the more abstract phonological representations of speech units (syllables).
Journal of Cognitive Neuroscience (2010) 22 (8): 1754–1769.
Published: 01 August 2010
Abstract
We tested whether the emergence of familiarity with a melody may trigger, or co-occur with, the processing of the concept(s) conveyed by the emotions evoked by, or the semantic associations with, the melody. With this objective, we recorded ERPs while participants were presented with highly familiar and less familiar melodies in a gating paradigm. The ERPs time locked to a tone of the melody called the "familiarity emergence point" showed a larger fronto-central negativity for highly familiar compared with less familiar melodies between 200 and 500 msec, with a peak latency around 400 msec. This latency and the sensitivity to the degree of familiarity/conceptual information suggest that this component was an N400, a marker of conceptual processing. Our data suggest that the feeling of familiarity evoked by a musical excerpt could be accompanied by other processing mechanisms at the conceptual level. Coupling the gating paradigm with ERP analyses might become a new avenue for investigating the neurocognitive basis of implicit musical knowledge.
Journal of Cognitive Neuroscience (2010) 22 (5): 1026–1035.
Published: 01 May 2010
Abstract
Two experiments were conducted to examine the conceptual relation between words and nonmeaningful sounds. In order to reduce the role of linguistic mediation, sounds were recorded in such a way that identifying the source that produced them was highly unlikely. Related and unrelated sound–word pairs were presented in Experiment 1, and the order of presentation was reversed in Experiment 2 (word–sound). Results showed that, in both experiments, participants were sensitive to the conceptual relation between the two items. They were able to correctly categorize items as related or unrelated with good accuracy. Moreover, a relatedness effect developed in the event-related brain potentials between 250 and 600 msec, although with a slightly different scalp topography for word and sound targets. Results are discussed in terms of similar conceptual processing networks, and we propose a tentative model of the semiotics of sounds.
Journal of Cognitive Neuroscience (2009) 21 (10): 1882–1892.
Published: 01 October 2009
Abstract
The cognitive processing of concepts, that is, abstract general ideas, has been mostly studied with language. However, other domains, such as music, can also convey concepts. Koelsch et al. [Koelsch, S., Kasper, E., Sammler, D., Schulze, K., Gunter, T., & Friederici, A. D. Music, language and meaning: Brain signatures of semantic processing. Nature Neuroscience, 7, 302–307, 2004] showed that 10 sec of music can influence the semantic processing of words. However, the length of the musical excerpts did not allow the authors to study the effect of words on musical targets. In this study, we decided to replicate Koelsch et al.'s findings using 1-sec musical excerpts (Experiment 1). This allowed us to study the reverse influence, namely, of a linguistic context on conceptual processing of musical excerpts (Experiment 2). In both experiments, we recorded behavioral and electrophysiological responses while participants were presented with 50 related and 50 unrelated pairs (context/target). Experiments 1 and 2 showed a larger N400 component of the event-related brain potentials to targets following a conceptually unrelated compared to a related context. The presence of an N400 effect with musical targets suggests that music may convey concepts. The relevance of these results for the comprehension of music as a structured set of conceptual units and for the domain specificity of the mechanisms underlying N400 effects is discussed.
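The N400 effect described here (a larger negativity to unrelated than to related targets) is conventionally quantified as a mean-amplitude difference between condition-averaged ERPs in a post-stimulus window. A minimal single-channel sketch, with sampling rate, epoch timing, and window chosen for illustration rather than taken from the paper:

```python
import numpy as np

def n400_effect(unrelated, related, fs=250.0, t0=-0.1, win=(0.3, 0.5)):
    """Mean-amplitude estimate of an N400 effect on one channel.

    unrelated, related: arrays of shape (n_trials, n_samples), epoched EEG
    starting t0 seconds before target onset, sampled at fs Hz.
    Returns the unrelated-minus-related ERP difference (in the data's
    amplitude units) averaged over the analysis window, here 300-500 msec.
    A reliable negative value is the classic N400 signature.
    """
    # Average across trials in each condition, then take the difference wave.
    diff = unrelated.mean(axis=0) - related.mean(axis=0)
    # Time axis of the epoch, relative to target onset.
    times = t0 + np.arange(diff.size) / fs
    mask = (times >= win[0]) & (times <= win[1])
    return float(diff[mask].mean())
```

Running this per participant and submitting the resulting values to a group-level test is one standard way such effects are assessed, though window and channel choices vary across studies.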
Journal of Cognitive Neuroscience (2006) 18 (2): 199–211.
Published: 01 February 2006
Abstract
The idea that extensive musical training can influence processing in cognitive domains other than music has received considerable attention from the educational system and the media. Here we analyzed behavioral data and recorded event-related brain potentials (ERPs) from 8-year-old children to test the hypothesis that musical training facilitates pitch processing not only in music but also in language. We used a parametric manipulation of pitch so that the final notes or words of musical phrases or sentences were congruous, weakly incongruous, or strongly incongruous. Musician children outperformed nonmusician children in the detection of the weak incongruity in both music and language. Moreover, the greatest differences in the ERPs of musician and nonmusician children were also found for the weak incongruity: whereas for musician children, early negative components developed in music and late positive components in language, no such components were found for nonmusician children. Finally, comparison of these results with previous ones from adults suggests that some aspects of pitch processing emerge earlier in music than in language. Thus, the present results reveal positive transfer effects between cognitive domains and shed light on the time course and neural basis of the development of prosodic and melodic processing.
Journal of Cognitive Neuroscience (2005) 17 (4): 694–705.
Published: 01 April 2005
Abstract
The general aim of this experiment was to investigate the processes involved in reading musical notation and to study the relationship between written music and its auditory representation. Of main interest was whether musicians are able to develop expectancies for specific tonal or atonal auditory events based on the visual score alone. Can musicians expect an "atonal" event, or will it always sound odd? Moreover, it was of interest to determine whether the modulations in amplitude of a late positive component (P600) described in previous studies are linked to a general mismatch detection process or to specific musical expectancies. Results showed clearly that musicians are able to expect tonal auditory endings based on visual information and are also able to do so for atonal endings, although to a smaller extent. Strong interactions seem to exist between visual and auditory musical codes, and visual information seems to influence auditory processing as early as 100 msec. These results are directly relevant for the question of whether music reading is actually music perception.