Janet F. Werker
1-3 of 3
Journal of Cognitive Neuroscience (2012) 24 (3): 564–574.
Published: 01 March 2012
Abstract
Breaking the linguistic code requires the extraction of at least two types of information from the speech signal: the relations between linguistic units and their sequential position. Furthermore, these different types of information need to be integrated into a coherent representation of language structure. The brain networks responsible for these abilities are well known in adults, but not in young infants. Our results show that the neural architecture underlying these abilities is operational at birth. In three optical imaging studies, we found that the newborn brain detects identity relations, as evidenced by enhanced activation in the bilateral superior temporal and left inferior frontal regions. More importantly, the newborn brain can also determine whether such identity relations hold for the initial or final positions of speech sequences, as indicated by increased activity in the inferior frontal regions, possibly Broca's area. This implies that the neural foundations of language acquisition are in place from birth.
Journal of Cognitive Neuroscience (2004) 16 (8): 1452–1464.
Published: 01 October 2004
Abstract
The ability to discriminate phonetically similar speech sounds is evident quite early in development. However, inexperienced word learners do not always use this information in processing word meanings [Stager & Werker (1997). Nature, 388, 381–382]. The present study used event-related potentials (ERPs) to examine developmental changes from 14 to 20 months in brain activity important in processing phonetic detail in the context of meaningful words. ERPs were compared to three types of words: words whose meanings were known by the child (e.g., “bear”), nonsense words that differed by an initial phoneme (e.g., “gare”), and nonsense words that differed from the known words by more than one phoneme (e.g., “kobe”). The results supported the behavioral findings suggesting that inexperienced word learners do not use information about phonetic detail when processing word meanings. For the 14-month-olds, ERPs to known words (e.g., “bear”) differed from ERPs to phonetically dissimilar nonsense words (e.g., “kobe”), but did not differ from ERPs to phonetically similar nonsense words (e.g., “gare”), suggesting that known words and similar mispronunciations were processed as the same word. In contrast, for experienced word learners (i.e., 20-month-olds), ERPs to known words (e.g., “bear”) differed from those to both types of nonsense words (“gare” and “kobe”). Changes in the lateral distribution of ERP differences to known and unknown (nonce) words between 14 and 20 months replicated previous findings. The findings suggest that vocabulary development is an important factor in the organization of neural systems linked to processing phonetic detail within the context of word comprehension.
Journal of Cognitive Neuroscience (2001) 13 (7): 994–1005.
Published: 01 October 2001
Abstract
The detection of speech in an auditory stream is a requisite first step in processing spoken language. In this study, we used event-related fMRI to investigate the neural substrates mediating detection of speech compared with that of nonspeech auditory stimuli. Unlike previous studies addressing this issue, we contrasted speech with nonspeech analogues that were matched along key temporal and spectral dimensions. In an oddball detection task, listeners heard nonsense speech sounds, matched sine wave analogues (complex nonspeech), or single tones (simple nonspeech). Speech stimuli elicited significantly greater activation than both complex and simple nonspeech stimuli in classic receptive language areas, namely the middle temporal gyri bilaterally and in a locus lateralized to the left posterior superior temporal gyrus. In addition, speech activated a small cluster of the right inferior frontal gyrus. The activation of these areas in a simple detection task, which requires neither identification nor linguistic analysis, suggests they play a fundamental role in speech processing.