Karen Emmorey
Journal Articles
Publisher: Journals Gateway
Neurobiology of Language (2023) 4 (2): 361–381.
Published: 13 June 2023
Abstract
Letter recognition plays an important role in reading and follows different phases of processing, from early visual feature detection to the access of abstract letter representations. Deaf ASL–English bilinguals experience orthography in two forms: English letters and fingerspelling. However, the neurobiological nature of fingerspelling representations, and the relationship between the two orthographies, remains unexplored. We examined the temporal dynamics of single English letter and ASL fingerspelling font processing in an unmasked priming paradigm with centrally presented targets for 200 ms preceded by 100 ms primes. Event-related brain potentials were recorded while participants performed a probe detection task. Experiment 1 examined English letter-to-letter priming in deaf signers and hearing non-signers. We found that English letter recognition is similar for deaf and hearing readers, extending previous findings with hearing readers to unmasked presentations. Experiment 2 examined priming effects between English letters and ASL fingerspelling fonts in deaf signers only. We found that fingerspelling fonts primed both fingerspelling fonts and English letters, but English letters did not prime fingerspelling fonts, indicating a priming asymmetry between letters and fingerspelling fonts. We also found an N400-like priming effect when the primes were fingerspelling fonts, which might reflect strategic access to the lexical names of letters. The studies suggest that deaf ASL–English bilinguals process English letters and ASL fingerspelling differently and that the two systems may have distinct neural representations. However, the fact that fingerspelling fonts can prime English letters suggests that the two orthographies may share abstract representations to some extent.
Neurobiology of Language (2021) 2 (4): 628–646.
Published: 23 December 2021
Abstract
Models vary in the extent to which language control processes are domain general. Those that posit that language control is at least partially domain general insist on an overlap between language control and executive control at the goal level. To further probe whether or not language control is domain general, we conducted the first event-related potential (ERP) study that directly compares language-switch costs, as an index of language control, and task-switch costs, as an index of executive control. The language switching and task switching methodologies were identical, except that the former required switching between languages (English or Spanish) whereas the latter required switching between tasks (color naming or category naming). This design allowed us to directly compare control processes at the goal level (cue-locked ERPs) and at the task performance level (picture-locked ERPs). We found no significant differences in the switch-related cue-locked and picture-locked ERP patterns across the language and task switching paradigms. These results support models of domain-general language control.
Includes: Supplementary data
Neurobiology of Language (2020) 1 (2): 249–267.
Published: 01 June 2020
Abstract
To investigate possible universal and modality-specific factors that influence the neurophysiological response during lexical processing, we recorded event-related potentials while a large group of deaf adults (n = 40) viewed 404 signs in American Sign Language (ASL) that varied in ASL frequency, concreteness, and iconicity. Participants performed a go/no-go semantic categorization task (does the sign refer to people?) to videoclips of ASL signs (clips began with the signer’s hands at rest). Linear mixed-effects regression models were fit with per-participant, per-trial, and per-electrode data, allowing us to identify unique effects of each lexical variable. We observed an early effect of frequency (greater negativity for less frequent signs) beginning at 400 ms postvideo onset at anterior sites, which we interpreted as reflecting form-based lexical processing. This effect was followed by a more widely distributed posterior response that we interpreted as reflecting lexical-semantic processing. Paralleling spoken language, more concrete signs elicited greater negativities, beginning 600 ms postvideo onset with a wide scalp distribution. Finally, there were no effects of iconicity (except for a weak effect in the latest epochs; 1,000–1,200 ms), suggesting that iconicity does not modulate the neural response during sign recognition. Despite the perceptual and sensorimotoric differences between signed and spoken languages, the overall results indicate very similar neurophysiological processes underlie lexical access for both signs and words.
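The analysis described above, fitting linear mixed-effects regression to per-participant, per-trial, per-electrode ERP amplitudes with lexical predictors, can be sketched as follows. This is a minimal illustration on simulated data, not the authors' actual pipeline: the column names, effect sizes, random-effects structure (random intercepts by participant only), and data are all assumptions made for the example.

```python
# Hedged sketch of a per-trial, per-electrode linear mixed-effects analysis.
# All variable names, effect sizes, and the random-effects structure are
# illustrative assumptions, not the specification used in the article.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_subj, n_items, n_elec = 8, 20, 4

rows = []
for s in range(n_subj):
    subj_offset = rng.normal(0, 1.0)  # random intercept per participant
    for i in range(n_items):
        freq = rng.normal(0, 1)  # standardized frequency (simulated)
        conc = rng.normal(0, 1)  # standardized concreteness (simulated)
        for e in range(n_elec):
            # Simulated amplitude: more negative for less frequent signs
            # (positive frequency coefficient) and for more concrete signs
            # (negative concreteness coefficient), mirroring the abstract.
            amp = 0.5 * freq - 0.4 * conc + subj_offset + rng.normal(0, 1)
            rows.append(dict(subject=s, electrode=e, frequency=freq,
                             concreteness=conc, amplitude=amp))

df = pd.DataFrame(rows)

# Fixed effects of the lexical variables; random intercepts by participant.
model = smf.mixedlm("amplitude ~ frequency + concreteness",
                    df, groups=df["subject"])
fit = model.fit()
print(fit.params[["frequency", "concreteness"]])
```

With this structure, the fixed-effect estimates recover the simulated directions (positive for frequency, negative for concreteness) while the participant-level variance is absorbed by the random intercepts; the real analysis would additionally include iconicity and a richer random-effects structure.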
Includes: Supplementary data