Nathan Weisz (1–3 of 3 results)
Journal Articles
Publisher: Journals Gateway
Journal of Cognitive Neuroscience (2024) 36 (1): 128–142.
Published: 01 January 2024
Abstract
Visual speech plays a powerful role in facilitating auditory speech processing, a topic that gained public attention with the widespread use of face masks during the COVID-19 pandemic. In a previous magnetoencephalography study, we showed that occluding the mouth area significantly impairs neural speech tracking. To rule out the possibility that this deterioration is due to degraded sound quality, in the present follow-up study we presented participants with audiovisual (AV) and audio-only (A) speech. We further independently manipulated the trials by adding a face mask and a distractor speaker. Our results clearly show that face masks affect speech tracking only in AV conditions, not in A conditions, indicating that face masks primarily impair speech processing by blocking visual speech rather than by acoustic degradation. We further characterize how the spectrogram, lip movements, and lexical units are tracked at the sensor level. Tracking of the spectrogram shows a visual benefit, especially in the multi-speaker condition. Lip movements show an additional visual benefit over spectrogram tracking only in clear speech conditions, whereas lexical units (phonemes and word onsets) show no visual enhancement at all. We hypothesize that in young, normal-hearing individuals, visual input is used less for specific feature extraction and more as a general resource for guiding attention.
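Neural speech tracking of the kind described is commonly quantified by correlating the recorded brain signal with the speech envelope across a range of time lags. The sketch below is a minimal toy illustration of that idea with simulated data, not the authors' MEG pipeline; the sampling rate, lag window, and noise level are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(42)
fs = 100                      # Hz; tracking analyses often run at low sampling rates
t = np.arange(fs * 30) / fs   # 30 s of simulated data

# Toy speech envelope: slowly fluctuating smoothed noise
envelope = np.convolve(rng.standard_normal(t.size), np.ones(20) / 20, mode="same")

# Toy "neural" signal: the envelope delayed by 100 ms, plus sensor noise
true_lag = int(0.1 * fs)
neural = np.roll(envelope, true_lag) + 0.1 * rng.standard_normal(t.size)

def tracking(lag):
    """Pearson correlation between the envelope and the neural signal at `lag` samples."""
    if lag == 0:
        a, b = envelope, neural
    else:
        a, b = envelope[:-lag], neural[lag:]
    return np.corrcoef(a, b)[0, 1]

lags = range(0, int(0.3 * fs))           # scan lags from 0 to 300 ms
scores = [tracking(lag) for lag in lags]
best = max(lags, key=lambda lag: scores[lag])
print(f"peak tracking at {best / fs * 1000:.0f} ms lag, r = {max(scores):.2f}")
```

The lag at which the correlation peaks recovers the simulated 100 ms delay; in real analyses, forward/backward encoding models (temporal response functions) generalize this simple correlation approach.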
Journal of Cognitive Neuroscience (2023) 35 (4): 588–602.
Published: 01 April 2023
Abstract
It is widely established that sensory perception is a rhythmic process rather than a continuous one. In the context of auditory perception, this effect has only been established at the cortical and behavioral levels. Yet the unique architecture of the auditory sensory system allows its primary sensory cortex to modulate the processes of its sensory receptors at the cochlear level. Previously, we demonstrated the existence of a genuine cochlear theta (∼6-Hz) rhythm that is modulated in amplitude by intermodal selective attention. As that study's paradigm was not suited to assessing attentional effects on the oscillatory phase of cochlear activity, the question of whether attention can also affect the temporal organization of the cochlea's ongoing activity remained open. The present study uses an interaural attention paradigm to investigate ongoing otoacoustic activity during a stimulus-free cue–target interval and an omission period of the auditory target in humans. We replicated the existence of the cochlear theta rhythm. Importantly, we found significant phase opposition between the two ears and attention conditions, both for anticipatory activity and for cochlear oscillatory activity during target presentation. The amplitude, however, was unaffected by interaural attention. These results are the first to demonstrate that intermodal and interaural attention deploy different aspects of excitation and inhibition at the first level of auditory processing: whereas intermodal attention modulates the level of cochlear activity, interaural attention modulates its timing.
Journal of Cognitive Neuroscience (2022) 34 (6): 1001–1014.
Published: 02 May 2022
Abstract
Ongoing fluctuations in neural excitability and connectivity influence whether or not a stimulus is seen. Do they also influence which stimulus is seen? We recorded magnetoencephalography data while 21 human participants viewed face or house stimuli, either one at a time or under bistable conditions induced through binocular rivalry. Multivariate pattern analysis revealed common neural substrates for rivalrous versus nonrivalrous stimuli, with an additional delay of ∼36 msec for the bistable stimulus, and poststimulus signals were source-localized to the fusiform face area. Before stimulus onsets that were followed by a face (vs. house) report, the fusiform face area showed stronger connectivity to primary visual cortex and to the rest of the cortex in the alpha frequency range (8–13 Hz), but there were no differences in local oscillatory alpha power. The prestimulus connectivity metrics predicted the accuracy of poststimulus decoding and the delay associated with rivalry disambiguation, suggesting that perceptual content is shaped by ongoing neural network states.
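Time-resolved multivariate pattern analysis of the kind invoked here trains a classifier on the sensor pattern at each time point and asks when stimulus category becomes decodable. The following is a minimal NumPy sketch on simulated data, not the authors' analysis; the trial counts, the nearest-class-mean classifier, and the 0.75 accuracy threshold are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n_trials, n_sensors, n_times = 80, 30, 50
labels = np.repeat([0, 1], n_trials // 2)      # e.g., face vs. house trials

# Simulated sensor data: a class-specific pattern appears from time index 20 onward
X = rng.standard_normal((n_trials, n_sensors, n_times))
pattern = rng.standard_normal(n_sensors)
X[labels == 1, :, 20:] += pattern[:, None]

def decode(X, y, t, n_folds=4):
    """Cross-validated nearest-class-mean decoding accuracy at time point t."""
    order = rng.permutation(len(y))
    correct = 0
    for fold in np.array_split(order, n_folds):
        train = np.setdiff1d(order, fold)
        means = [X[train][y[train] == c, :, t].mean(axis=0) for c in (0, 1)]
        for i in fold:
            dists = [np.linalg.norm(X[i, :, t] - m) for m in means]
            correct += (np.argmin(dists) == y[i])
    return correct / len(y)

acc = np.array([decode(X, labels, t) for t in range(n_times)])
onset = int(np.argmax(acc > 0.75))   # first clearly above-chance time point
print(f"decoding rises above chance at time index {onset}")
```

Comparing such decoding-onset time courses between conditions is one way a delay like the reported ∼36 msec for rivalrous stimuli can be estimated.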