Search results for Chun-Yu Tse (1–2 of 2)
Journal Articles
Journal of Cognitive Neuroscience (2015) 27 (9): 1723–1737.
Published: 01 August 2015
Abstract
Information from different modalities is initially processed in different brain areas, yet real-world perception often requires the integration of multisensory signals into a single percept. An example is the McGurk effect, in which people viewing a speaker whose lip movements do not match the utterance perceive the spoken sounds incorrectly, hearing them as more similar to those signaled by the visual than by the auditory input. This indicates that audiovisual integration is important for generating the phoneme percept. Here we asked when and where the audiovisual integration process occurs, providing spatial and temporal boundaries for the processes generating phoneme perception. Specifically, we wanted to separate audiovisual integration from other processes, such as simple deviance detection. Building on previous work employing ERPs, we used an oddball paradigm in which task-irrelevant audiovisually deviant stimuli were embedded in strings of non-deviant stimuli. We also recorded the event-related optical signal, an imaging method combining spatial and temporal resolution, to investigate the time course and neuroanatomical substrate of audiovisual integration. We found that audiovisual deviants elicit a short-duration response in the middle/superior temporal gyrus, whereas audiovisual integration elicits a more extended response that also involves inferior frontal and occipital regions. Interactions between audiovisual integration and deviance detection processes were observed in the posterior/superior temporal gyrus. These data suggest that dynamic interactions between inferior frontal cortex and sensory regions play a significant role in multimodal integration.
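To make the oddball design described in this abstract concrete, here is a minimal Python sketch that generates a trial sequence in which rare audiovisually deviant stimuli are embedded among standards. The function name `oddball_sequence` and all parameter values (deviant probability, minimum gap between deviants, trial count) are illustrative assumptions, not taken from the study.

```python
import random

def oddball_sequence(n_trials=400, p_deviant=0.15, min_gap=2, seed=0):
    """Generate an oddball trial sequence: 'S' marks a standard
    (congruent audiovisual) stimulus, 'D' a deviant (incongruent)
    one, with at least `min_gap` standards between deviants.
    All parameters are illustrative, not the study's values."""
    rng = random.Random(seed)
    seq = []
    since_deviant = min_gap  # allow a deviant from the first trial
    for _ in range(n_trials):
        if since_deviant >= min_gap and rng.random() < p_deviant:
            seq.append("D")
            since_deviant = 0
        else:
            seq.append("S")
            since_deviant += 1
    return seq

trials = oddball_sequence()
print(trials.count("D"), "deviants in", len(trials), "trials")
```

Enforcing a minimum gap between deviants is a common design choice in oddball studies, since back-to-back deviants would blur the deviance-related response being measured.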
Journal Articles
Journal of Cognitive Neuroscience (2012) 24 (9): 1941–1959.
Published: 01 September 2012
Abstract
The significance of stimuli is linked not only to their nature but also to the sequential structure in which they are embedded, which gives rise to contingency rules. Humans have an extraordinary ability to extract and exploit these rules, as exemplified by the role of grammar and syntax in language. To study the brain representations of contingency rules, we recorded ERPs and the event-related optical signal (EROS), which uses near-infrared light to measure the optical changes associated with neuronal responses. We used sequences of high- and low-frequency tones varying according to three contingency rules, which were orthogonally manipulated and differed in processing requirements: a Single Repetition rule required only template matching, a Local Probability rule required relating a stimulus to its context, and a Global Probability rule could be derived through template matching or with reference to the global sequence context. ERP activity at 200–300 msec was related to the Single Repetition and Global Probability rules (reflecting access to representations based on template matching), whereas longer-latency activity (300–450 msec) was related to the Local Probability and Global Probability rules (reflecting access to representations incorporating contextual information). EROS responses with corresponding latencies indicated that the earlier activity involved the superior temporal gyrus, whereas later responses involved a fronto-parietal network. This suggests that the brain can simultaneously hold different models of stimulus contingencies at different levels of the information processing system, according to their processing requirements, as indicated by the latency and location of the corresponding brain activity.
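As a concrete illustration of the three contingency rules, the sketch below labels each tone in a binary high/low sequence as standard or deviant under each rule. The operationalizations (repetition of the immediately preceding tone, rarity within a sliding local window, rarity across the whole sequence) are plausible readings of the rule names rather than the paper's exact definitions, and `classify_rules` and its `window` parameter are hypothetical.

```python
def classify_rules(tones, window=10):
    """Label each tone ('H' or 'L') as standard/deviant under three
    contingency rules. These operationalizations are illustrative,
    not the paper's exact definitions."""
    labels = []
    for i, tone in enumerate(tones):
        # Single Repetition: template match against the previous tone only.
        single_rep = "deviant" if i > 0 and tone != tones[i - 1] else "standard"
        # Local Probability: tone is rare within the preceding `window` tones.
        context = tones[max(0, i - window):i]
        local = ("deviant"
                 if context and context.count(tone) < len(context) / 2
                 else "standard")
        # Global Probability: tone is rare across the whole sequence.
        glob = "deviant" if tones.count(tone) < len(tones) / 2 else "standard"
        labels.append((tone, single_rep, local, glob))
    return labels

seq = list("HHHHLHHHHHHLHHHHLHHH")
for tone, sr, lp, gp in classify_rules(seq)[:6]:
    print(tone, sr, lp, gp)
```

Because the three classifications can disagree for the same tone (e.g., a locally rare tone that is globally common), sequences built this way let the rules be manipulated orthogonally, as the abstract describes.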