Anne-Sophie Dubarry
Journal Articles
Imaging Neuroscience (2025) 3: imag_a_00524.
Published: 31 March 2025
Revealing the co-existence of written and spoken language coding neural populations in the visual word form area

Abstract
Reading relies on the ability to map written symbols onto speech sounds. A specific part of the left ventral occipitotemporal cortex, known as the Visual Word Form Area (VWFA), plays a crucial role in this process. Through the automatization of this mapping, the area progressively becomes specialized in written word recognition. Yet, despite its key role in reading, the area also responds to speech. This observation raises questions about the actual nature of the neural representations encoded in the VWFA and, consequently, about the mechanism underlying its cross-modal responses. Here, we addressed this issue by applying fine-grained analyses of within- and cross-modal repetition suppression effects (RSEs) and Multi-Voxel Pattern Analyses (MVPA) in fMRI and sEEG experiments. Convergent evidence across analysis methods and protocols showed significant RSEs and successful decoding in both within-modal visual and auditory conditions, suggesting that distinct populations of neurons within the VWFA encode written and spoken language. This functional organization of neural populations enables the area to respond to both written and spoken inputs. These findings open further discussion of how the human brain may be prepared and adapted for the acquisition of a complex ability such as reading.
Includes: Supplementary data
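
The abstract names MVPA decoding only at a high level. The sketch below is a generic, minimal illustration of cross-validated decoding on region-of-interest voxel patterns, not the authors' pipeline: the synthetic data, ROI size, label scheme (written vs. spoken trials), classifier, and cross-validation settings are all assumptions made for illustration.

# Illustrative MVPA sketch only -- synthetic data, not the study's analysis.
import numpy as np
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import StratifiedKFold, cross_val_score

rng = np.random.default_rng(0)

# Synthetic stand-in for single-trial response patterns from a VWFA ROI:
# 80 trials x 200 voxels; label 0 = written word, 1 = spoken word (assumed design).
n_trials, n_voxels = 80, 200
labels = np.repeat([0, 1], n_trials // 2)
patterns = rng.normal(size=(n_trials, n_voxels))
patterns[labels == 1, :20] += 0.5  # weak multivariate signal in a subset of voxels

# Linear classifier with within-pipeline standardization and stratified 5-fold CV,
# so scaling is fit on training folds only (avoids leakage into test folds).
decoder = make_pipeline(StandardScaler(), SVC(kernel="linear"))
cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
scores = cross_val_score(decoder, patterns, labels, cv=cv, scoring="accuracy")

print(f"Mean decoding accuracy: {scores.mean():.2f} (chance = 0.50)")

Above-chance accuracy under such a scheme is the kind of evidence summarized as "successful decoding" in the abstract, although the study's actual features, labels, and statistical testing differ.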