1–4 of 4 results for author: Karen Emmorey
Journal Articles
Bilateral word selectivity gradients in the visual word form system in skilled deaf readers
Open Access. Publisher: Journals Gateway
Neurobiology of Language 1–37.
Published: 06 June 2025
Abstract
In hearing people, visual word recognition relies on a hierarchical organization in left ventral occipitotemporal (vOT) cortex. While right hemisphere recruitment has been implicated in poor reading, this may not be the case for deaf readers: there is evidence that for skilled deaf readers the right vOT is also engaged during word recognition. However, the nature of representations along the vOT hierarchy and the degree of laterality in skilled deaf readers remain largely unknown. This study examined the hierarchical organization for written words in the vOT bilaterally for skill-matched deaf and hearing readers to determine whether deafness and phonological ability modulate the laterality of word selectivity gradients. Using fMRI, we employed the same design as previous studies, presenting stimuli that represent a scale of orthographic regularity: consonant strings, pseudowords, and real words. For hearing readers, our results replicate previous findings showing a hierarchical structure solely in the left visual word form system (VWFS). For deaf readers, we find this same hierarchical structure in the left VWFS, but we also observe a similar hierarchical structure in the right VWFS. Unlike studies that show maladaptive right hemisphere activation in people with dyslexia, the bilateral tuning to written words seen in our study is not maladaptive, since all participants were skilled readers. The bilateral hierarchical organization of the VWFS represents a unique neural signature for successful reading in deaf adults and suggests that the typical developmental shift from bilateral to predominantly left-lateralized processing is not necessary for successful reading.
Asymmetric Event-Related Potential Priming Effects Between English Letters and American Sign Language Fingerspelling Fonts
Neurobiology of Language (2023) 4 (2): 361–381.
Published: 13 June 2023
Figures: 6
Abstract
Letter recognition plays an important role in reading and follows different phases of processing, from early visual feature detection to the access of abstract letter representations. Deaf ASL–English bilinguals experience orthography in two forms: English letters and fingerspelling. However, the neurobiological nature of fingerspelling representations, and the relationship between the two orthographies, remains unexplored. We examined the temporal dynamics of single English letter and ASL fingerspelling font processing in an unmasked priming paradigm with centrally presented targets for 200 ms preceded by 100 ms primes. Event-related brain potentials were recorded while participants performed a probe detection task. Experiment 1 examined English letter-to-letter priming in deaf signers and hearing non-signers. We found that English letter recognition is similar for deaf and hearing readers, extending previous findings with hearing readers to unmasked presentations. Experiment 2 examined priming effects between English letters and ASL fingerspelling fonts in deaf signers only. We found that fingerspelling fonts primed both fingerspelling fonts and English letters, but English letters did not prime fingerspelling fonts, indicating a priming asymmetry between letters and fingerspelling fonts. We also found an N400-like priming effect when the primes were fingerspelling fonts, which might reflect strategic access to the lexical names of letters. These studies suggest that deaf ASL–English bilinguals process English letters and ASL fingerspelling differently and that the two systems may have distinct neural representations. However, the fact that fingerspelling fonts can prime English letters suggests that the two orthographies may share abstract representations to some extent.
On the Connection Between Language Control and Executive Control—An ERP Study
Neurobiology of Language (2021) 2 (4): 628–646.
Published: 23 December 2021
Figures: 9
Abstract
Models vary in the extent to which language control processes are domain general. Those that posit that language control is at least partially domain general insist on an overlap between language control and executive control at the goal level. To further probe whether or not language control is domain general, we conducted the first event-related potential (ERP) study that directly compares language-switch costs, as an index of language control, and task-switch costs, as an index of executive control. The language switching and task switching methodologies were identical, except that the former required switching between languages (English or Spanish) whereas the latter required switching between tasks (color naming or category naming). This design allowed us to directly compare control processes at the goal level (cue-locked ERPs) and at the task performance level (picture-locked ERPs). We found no significant differences in the switch-related cue-locked and picture-locked ERP patterns across the language and task switching paradigms. These results support models of domain-general language control.
Includes: Supplementary data
Neurophysiological Correlates of Frequency, Concreteness, and Iconicity in American Sign Language
Open Access
Neurobiology of Language (2020) 1 (2): 249–267.
Published: 01 June 2020
Figures: 5
Abstract
To investigate possible universal and modality-specific factors that influence the neurophysiological response during lexical processing, we recorded event-related potentials while a large group of deaf adults (n = 40) viewed 404 signs in American Sign Language (ASL) that varied in ASL frequency, concreteness, and iconicity. Participants performed a go/no-go semantic categorization task (does the sign refer to people?) on video clips of ASL signs (clips began with the signer's hands at rest). Linear mixed-effects regression models were fit with per-participant, per-trial, and per-electrode data, allowing us to identify unique effects of each lexical variable. We observed an early effect of frequency (greater negativity for less frequent signs) beginning at 400 ms post-video onset at anterior sites, which we interpreted as reflecting form-based lexical processing. This effect was followed by a more widely distributed posterior response that we interpreted as reflecting lexical-semantic processing. Paralleling spoken language, more concrete signs elicited greater negativities, beginning 600 ms post-video onset with a wide scalp distribution. Finally, there were no effects of iconicity (except for a weak effect in the latest epochs, 1,000–1,200 ms), suggesting that iconicity does not modulate the neural response during sign recognition. Despite the perceptual and sensorimotor differences between signed and spoken languages, the overall results indicate that very similar neurophysiological processes underlie lexical access for both signs and words.
Includes: Supplementary data
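The linear mixed-effects analysis described in the abstract above can be sketched as follows. This is a minimal illustration, not the authors' analysis code: the variable names (amplitude, frequency, concreteness, iconicity, participant), the simulated data, and the effect sizes are all hypothetical, and the model here uses only a per-participant random intercept rather than the full per-trial, per-electrode structure the study reports.

```python
# Hedged sketch of a mixed-effects regression of single-trial ERP amplitude
# on lexical predictors, in the spirit of the analysis the abstract describes.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_subj, n_trials = 40, 50  # 40 deaf adults (per the abstract); trial count is illustrative

# Simulate per-participant, per-trial mean amplitudes (microvolts).
rows = []
for subj in range(n_subj):
    subj_offset = rng.normal(0, 1)  # random intercept: per-signer baseline
    for _ in range(n_trials):
        freq = rng.normal(0, 1)     # standardized ASL frequency
        conc = rng.normal(0, 1)     # standardized concreteness
        icon = rng.normal(0, 1)     # standardized iconicity (simulated null effect)
        # Less frequent signs -> greater negativity (positive frequency slope);
        # more concrete signs -> greater negativity (negative concreteness slope).
        amp = subj_offset + 0.5 * freq - 0.4 * conc + rng.normal(0, 1)
        rows.append((subj, freq, conc, icon, amp))

df = pd.DataFrame(rows, columns=["participant", "frequency",
                                 "concreteness", "iconicity", "amplitude"])

# Random intercept for participant; fixed effects for each lexical variable,
# letting us estimate the unique contribution of each predictor.
model = smf.mixedlm("amplitude ~ frequency + concreteness + iconicity",
                    data=df, groups=df["participant"])
result = model.fit()
print(result.params[["frequency", "concreteness", "iconicity"]])
```

On this synthetic data the fitted fixed effects recover the simulated slopes (positive for frequency, negative for concreteness, near zero for iconicity), mirroring the pattern of results the abstract reports.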