1-2 of 2
Marco Loh
Journal Articles
Journal of Cognitive Neuroscience (2010) 22 (2): 240–247.
Published: 01 February 2010
Audiovisual Matching in Speech and Nonspeech Sounds: A Neurodynamical Model
Abstract
Audiovisual speech perception provides an opportunity to investigate the mechanisms underlying multimodal processing. By using nonspeech stimuli, it is possible to investigate the degree to which audiovisual processing is specific to the speech domain. It has been shown in a match-to-sample design that matching across modalities is more difficult in the nonspeech domain than in the speech domain. We constructed a biophysically realistic neural network model simulating this experimental evidence. We propose that a stronger connection between modalities in speech underlies the behavioral difference between the speech and the nonspeech domain, which could be the result of more extensive experience with speech stimuli. Because the match-to-sample paradigm does not allow us to draw conclusions concerning the integration of auditory and visual information, we also simulated two further conditions based on the same paradigm, which tested the integration of auditory and visual information within a single stimulus. New experimental data for these two conditions support the simulation results and suggest that audiovisual integration of discordant stimuli is stronger for speech than for nonspeech stimuli. According to the simulations, the connection strength between auditory and visual information determines, on the one hand, how well auditory information can be assigned to visual information and, on the other hand, the magnitude of multimodal integration.
Includes: Supplementary data
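
To illustrate the mechanism the abstract argues for, the sketch below (Python) uses a minimal two-pool rate model rather than the authors' biophysically realistic network: one pool stands in for the auditory representation, one for the visual representation, and a single cross-modal weight (here called w_cross, an assumed name) is the parameter proposed to be stronger for speech than for nonspeech stimuli. All parameter values are illustrative assumptions.

# Minimal rate-model sketch, not the published biophysical network.
# Two mutually coupled pools ("auditory" and "visual") integrated with
# Euler steps; w_cross is the cross-modal coupling the abstract discusses.
import numpy as np

def simulate(w_cross, I_aud=0.6, I_vis=0.4, w_self=0.8,
             tau=0.02, dt=0.001, steps=1000):
    """Return the final firing rates of the auditory and visual pools."""
    r = np.zeros(2)                       # r[0] = auditory, r[1] = visual
    W = np.array([[w_self, w_cross],      # recurrent + cross-modal coupling
                  [w_cross, w_self]])
    I = np.array([I_aud, I_vis])          # external (stimulus) drive
    for _ in range(steps):
        drive = W @ r + I
        rate = np.tanh(np.clip(drive, 0.0, None))   # saturating transfer function
        r += dt / tau * (-r + rate)       # leaky integration toward the target rate
    return r

# Weak coupling (nonspeech-like) vs. stronger coupling (speech-like):
for w in (0.1, 0.6):
    r_aud, r_vis = simulate(w_cross=w)
    print(f"w_cross={w}: auditory rate={r_aud:.3f}, visual rate={r_vis:.3f}")

With the weak coupling, the visual input barely changes the auditory pool's rate; with the stronger, speech-like coupling, the visual drive pulls the auditory pool along with it. That is the qualitative signature described in the abstract: the same coupling parameter that makes cross-modal matching easier also increases the influence of a discordant visual stimulus on the auditory response.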
Journal Articles
Journal of Cognitive Neuroscience (2008) 20 (3): 421–431.
Published: 01 March 2008
Neurodynamics of the Prefrontal Cortex during Conditional Visuomotor Associations
Abstract
The prefrontal cortex is believed to be important for cognitive control, working memory, and learning. It is known to play an important role in the learning and execution of conditional visuomotor associations, a cognitive task in which stimuli have to be associated with actions through trial-and-error learning. In our modeling study, we sought to integrate several hypotheses about the function of the prefrontal cortex in a computational model and to compare the results with experimental data. We constructed a module of prefrontal cortex neurons exposed to two different inputs, which we envision to originate from the inferotemporal cortex and the basal ganglia. We found that working memory properties do not describe the dominant dynamics in the prefrontal cortex; rather, the activation appears to be transient, probably progressing along a pathway from sensory to motor areas. During the presentation of the cue, the dynamics of the prefrontal cortex are bistable, yielding distinct activation for correct and error trials. We find that a linear change in network parameters relates to the changes in neural activity across consecutive correct trials during learning, which provides important evidence about the underlying learning mechanisms.
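
As a rough illustration of the bistable cue-period dynamics and the learning-related parameter change described above, the sketch below (Python) uses two competing rate populations with self-excitation and mutual inhibition, so that during the cue period the network settles into one of two attractor states ("correct" vs. "error" choice). A single cue weight (w_cue, an assumed name and value range) is stepped up linearly across blocks of trials to mimic learning. This is an editorial sketch under assumed parameters, not the published model.

# Two-population winner-take-all sketch with noise; all values are assumptions.
import numpy as np

rng = np.random.default_rng(0)

def run_trial(w_cue, w_exc=1.6, w_inh=1.0, noise=0.15,
              tau=0.02, dt=0.001, steps=1500):
    """Return True if the population selective for the rewarded response wins."""
    r = np.array([0.1, 0.1])               # r[0] = rewarded response, r[1] = alternative
    for _ in range(steps):
        cue = np.array([w_cue, 0.5])        # assumed cue drive; 0.5 is a fixed baseline
        drive = w_exc * r - w_inh * r[::-1] + cue   # self-excitation minus cross-inhibition
        rate = 1.0 / (1.0 + np.exp(-4.0 * (drive - 1.0)))   # sigmoid transfer
        r += dt / tau * (-r + rate) + noise * np.sqrt(dt) * rng.standard_normal(2)
        r = np.clip(r, 0.0, None)
    return r[0] > r[1]

# Stepping w_cue up linearly across blocks mimics the learning-related parameter
# change; the fraction of trials ending in the "correct" attractor rises with it.
for block, w in enumerate(np.linspace(0.4, 0.9, 6)):
    p_correct = np.mean([run_trial(w) for _ in range(50)])
    print(f"block {block}: w_cue={w:.2f}, P(correct)={p_correct:.2f}")

Because the two attractor states are separated by noise-driven competition, trials with the same parameters can end in either state, which is how such a model yields distinct activation for correct and error trials while a gradual, linear change in one connection parameter shifts the outcome statistics over learning.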