Anne Kösem
1-4 of 4
Journal of Cognitive Neuroscience (2024) 36 (2): 225–238.
Published: 01 February 2024
Abstract
Words are not processed in isolation; instead, they are commonly embedded in phrases and sentences. The sentential context influences the perception and processing of a word. However, how this is achieved by brain processes, and whether predictive mechanisms underlie it, remains debated. Here, we employed an experimental paradigm in which we orthogonalized sentence context constraints and predictive validity, which was defined as the ratio of congruent to incongruent sentence endings within the experiment. While recording electroencephalography, participants read sentences with three levels of sentential context constraints (high, medium, and low). Participants were also separated into two groups that differed in their ratio of valid congruent to incongruent target words that could be predicted from the sentential context. For both groups, we investigated modulations of alpha power before, and N400 amplitude modulations after, target word onset. The results reveal that the N400 amplitude gradually decreased with higher context constraints and cloze probability. In contrast, alpha power was not significantly affected by context constraint. Neither the N400 nor alpha power was significantly affected by changes in predictive validity.
Journal of Cognitive Neuroscience (2020) 32 (8): 1428–1437.
Published: 01 August 2020
Abstract
Recent neuroimaging evidence suggests that the frequency of entrained oscillations in auditory cortices influences the perceived duration of speech segments, impacting word perception [Kösem, A., Bosker, H. R., Takashima, A., Meyer, A., Jensen, O., & Hagoort, P. Neural entrainment determines the words we hear. Current Biology, 28, 2867–2875, 2018]. We further tested the causal influence of neural entrainment frequency during speech processing by manipulating entrainment with continuous transcranial alternating current stimulation (tACS) at distinct oscillatory frequencies (3 and 5.5 Hz) above the auditory cortices. Dutch participants listened to speech and were asked to report their percept of a target Dutch word, which contained a vowel with an ambiguous duration. Target words were presented either in isolation (first experiment) or at the end of spoken sentences (second experiment). We predicted that the tACS frequency would influence neural entrainment and therewith how speech is perceptually sampled, leading to a perceptual overestimation or underestimation of the vowel's duration. Whereas results from Experiment 1 did not confirm this prediction, results from Experiment 2 suggested a small effect of tACS frequency on target word perception: Faster tACS led to more long-vowel word percepts, in line with the previous neuroimaging findings. Importantly, the difference in word perception induced by the different tACS frequencies was significantly larger in Experiment 2 than in Experiment 1, suggesting that the impact of tACS is dependent on the sensory context. tACS may have a stronger effect on spoken word perception when the words are presented in continuous speech as compared to when they are isolated, potentially because prior (stimulus-induced) entrainment of brain oscillations might be a prerequisite for tACS to be effective.
Journal of Cognitive Neuroscience (2020) 32 (7): 1242–1250.
Published: 01 July 2020
Abstract
Perceiving speech requires the integration of different speech cues, that is, formants. When the speech signal is split so that different cues are presented to the right and left ear (dichotic listening), comprehension requires the integration of binaural information. Based on prior electrophysiological evidence, we hypothesized that the integration of dichotically presented speech cues is enabled by interhemispheric phase synchronization between primary and secondary auditory cortex in the gamma frequency band. We tested this hypothesis by applying transcranial alternating current stimulation (TACS) bilaterally above the superior temporal lobe to induce or disrupt interhemispheric gamma-phase coupling. In contrast to initial predictions, we found that gamma TACS applied in-phase above the two hemispheres (interhemispheric lag 0°) perturbs interhemispheric integration of speech cues, possibly because the applied stimulation perturbs an inherent phase lag between the left and right auditory cortex. We also observed this disruptive effect when applying antiphasic delta TACS (interhemispheric lag 180°). We conclude that interhemispheric phase coupling plays a functional role in interhemispheric speech integration. The direction of this effect may depend on the stimulation frequency.
Journal of Cognitive Neuroscience (2017) 29 (9): 1566–1582.
Published: 01 September 2017
Abstract
Perceiving the temporal order of sensory events typically depends on participants' attentional state, thus likely on the endogenous fluctuations of brain activity. Using magnetoencephalography, we sought to determine whether spontaneous brain oscillations could disambiguate the perceived order of auditory and visual events presented in close temporal proximity, that is, at the individual's perceptual order threshold (Point of Subjective Simultaneity [PSS]). Two neural responses were found to index an individual's temporal order perception when contrasting brain activity as a function of perceived order (i.e., perceiving the sound first vs. perceiving the visual event first) given the same physical audiovisual sequence. First, average differences in prestimulus auditory alpha power indicated perceiving the correct ordering of audiovisual events irrespective of which sensory modality came first: a relatively low alpha power indicated perceiving auditory or visual first as a function of the actual sequence order. Additionally, the relative changes in the amplitude of the auditory (but not visual) evoked responses were correlated with participants' correct performance. Crucially, the sign of the magnitude difference in prestimulus alpha power and evoked responses between perceived audiovisual orders correlated with an individual's PSS. Taken together, our results suggest that spontaneous oscillatory activity cannot disambiguate subjective temporal order without prior knowledge of the individual's bias toward perceiving one or the other sensory modality first. Altogether, our results suggest that, under high perceptual uncertainty, the magnitude of prestimulus alpha (de)synchronization indicates the amount of compensation needed to overcome an individual's prior in the serial ordering and temporal sequencing of information.