Pia Rämä
Journal Articles
Publisher: Journals Gateway
Journal of Cognitive Neuroscience (2024) 36 (9): 1963–1976.
Published: 01 September 2024
Abstract
Developmental language studies have shown that lexical-semantic organization develops between 18 and 24 months of age in monolingual infants. In the present study, we aimed to examine whether voice familiarity facilitates lexical-semantic activation in the infant brain. We recorded the brain activity of 18-month-old, French-learning infants using EEG while they listened to taxonomically related and unrelated spoken word pairs produced by two voices: one with which they had been familiarized before the experiment and one with which they had not. The ERPs were measured in response to related and unrelated target words. Our results showed an N400 effect (greater amplitudes for unrelated than for related target words) over the left hemisphere, but only for the familiar voice, suggesting that voice familiarity facilitated lexical-semantic activation. For the unfamiliar voice, we observed an earlier congruence effect (greater amplitudes for related than for unrelated target words). This suggests that although 18-month-olds process lexical-semantic information from unfamiliar speakers, their neural signatures of lexical-semantic processing are less mature. Our results show that even in the absence of a personal relationship with a speaker, familiarity with a voice augments infant lexical-semantic processing. This supports the idea that extralinguistic information plays a role in infant lexical-semantic activation.
Journal of Cognitive Neuroscience (2009) 21 (8): 1511–1522.
Published: 01 August 2009
Abstract
We examined the attentional modulation of semantic priming and the N400 effect for spoken words. The aim was to find out how the semantics of spoken language is processed when attention is directed to another modality (passive task), to the phonetics of spoken words (phonological task), or to the semantics of spoken words (word task). Equally strong behavioral priming effects were obtained in the phonological and the word tasks. A significant N400 effect was found in all tasks. The effect was stronger in the word and the phonological tasks than in the passive task, but there was no difference in the magnitude of the effect between the phonological and the word tasks. The latency of the N400 effect did not differ between the tasks. Whereas the N400 effect had a centroparietal maximum in the phonological and the word tasks, it was largest at the parietal recording sites in the passive task. The effect was more pronounced at the left than at the right recording sites in the phonological task, but there was no laterality effect in the other tasks. The N400 effect in the passive task indicates that semantic priming occurs even when spoken words are not actively attended. However, the stronger N400 effect in the phonological and the word tasks than in the passive task suggests that controlled processes modulate the N400 effect. The finding that there were no differences in the N400 effect between the phonological and the word tasks indicates that the semantics of attended spoken words is processed regardless of whether semantic processing is relevant for task performance.