Sylvain Baillet
1-4 of 4
Journal Articles
Journal of Cognitive Neuroscience (2017) 29 (6): 1033–1043.
Published: 01 June 2017
Abstract
In a wide variety of cognitive domains, performance is determined by the selection and execution of cognitive strategies to solve problems. We used magnetoencephalography to identify the brain regions involved in, and to specify the time course of, dynamic modulations of executive control processes during strategy execution. Participants performed a computational estimation task in which they were instructed to execute a poorer or a better strategy to estimate the results of two-digit multiplication problems. When participants were asked to execute the poorer strategy, two distinct sets of brain activations were identified, depending on whether the poorer strategy (engaging the left inferior frontal junction) or the better strategy (engaging the ACC) had been executed on the immediately preceding items. Our findings also revealed the time course of activations in the regions involved in sequential modulations of cognitive control processes during arithmetic strategy execution. These findings point to proactive preparation on items that follow poorer-strategy items and to reactive adjustments on items that follow better-strategy items.
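To make the strategy manipulation concrete, here is a minimal Python sketch of a computational estimation problem of the kind described above. The specific strategies shown (rounding both operands down versus rounding both up to the nearest decade) and the example problem are assumptions drawn from the computational estimation literature, not details stated in this abstract.

```python
# Hypothetical illustration of a computational estimation task in which a
# "poorer" or "better" rounding strategy is applied to estimate the product
# of two two-digit operands. The strategy definitions (round down vs. round
# up to the nearest decade) are assumptions, not the study's exact procedure.

def round_down_estimate(a: int, b: int) -> int:
    """Round both operands down to the nearest decade, then multiply."""
    return (a // 10 * 10) * (b // 10 * 10)

def round_up_estimate(a: int, b: int) -> int:
    """Round both operands up to the nearest decade, then multiply."""
    return (-(-a // 10) * 10) * (-(-b // 10) * 10)

def better_strategy(a: int, b: int) -> str:
    """Label the strategy whose estimate is closer to the exact product."""
    exact = a * b
    down_err = abs(round_down_estimate(a, b) - exact)
    up_err = abs(round_up_estimate(a, b) - exact)
    return "round-down" if down_err <= up_err else "round-up"

if __name__ == "__main__":
    a, b = 38, 74                      # example problem (hypothetical)
    print(round_down_estimate(a, b))   # 30 * 70 = 2100
    print(round_up_estimate(a, b))     # 40 * 80 = 3200
    print(a * b)                       # exact product: 2812
    print(better_strategy(a, b))       # round-up (error 388 vs. 712)
```

On this example item, rounding up gives the smaller estimation error, so it would count as the better strategy for that problem.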
Journal Articles
Journal of Cognitive Neuroscience (2017) 29 (1): 79–94.
Published: 01 January 2017
Abstract
The distinction between letter strings that form words and those that look and sound plausible but are not meaningful is a basic one. Decades of functional neuroimaging experiments have used this distinction to isolate the neural basis of lexical (word-level) semantics, associated with areas such as the middle temporal, angular, and posterior cingulate gyri that overlap the default mode network. In two fMRI experiments, a different set of findings emerged when the word stimuli were less familiar (as measured by word frequency) than those typically used. Instead of activating default mode network areas often associated with semantic processing, words activated task-positive areas such as the inferior pFC and SMA, along with multifunctional ventral occipitotemporal cortices related to reading, whereas nonwords activated default mode areas previously associated with semantics. Effective connectivity analyses of fMRI data on less familiar words showed activation driven by task-positive and multifunctional reading-related areas, whereas highly familiar words showed bottom–up activation flow from occipitotemporal cortex. These findings suggest that functional neuroimaging correlates of semantic processing are less stable than previously assumed, with factors such as word frequency influencing the balance between task-positive, reading-related, and default mode networks. More generally, this suggests that results of contrasts typically interpreted in terms of semantic content may be more influenced by factors related to task difficulty than is widely appreciated.
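The word-frequency manipulation can be illustrated with a short, hypothetical Python sketch that splits a stimulus list into less familiar and highly familiar bins. The wordfreq package, the example words, and the median-split criterion are illustrative assumptions; the abstract does not specify how the stimuli were selected or binned.

```python
# Illustrative sketch of splitting word stimuli into "less familiar" and
# "highly familiar" bins by word frequency. The wordfreq package and the
# median-split criterion are assumptions for illustration only.
from statistics import median

from wordfreq import zipf_frequency  # pip install wordfreq

words = ["table", "chair", "music", "gosling", "ewer", "awl"]  # hypothetical stimuli

# Zipf-scaled frequency: roughly 1 (very rare) to 7 (very common).
freqs = {w: zipf_frequency(w, "en") for w in words}
cutoff = median(freqs.values())

highly_familiar = [w for w, f in freqs.items() if f >= cutoff]
less_familiar = [w for w, f in freqs.items() if f < cutoff]

print("highly familiar:", highly_familiar)
print("less familiar:", less_familiar)
```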
Journal Articles
Journal of Cognitive Neuroscience (2008) 20 (10): 1827–1838.
Published: 01 October 2008
Abstract
Humans demonstrate an amazing ability to intercept and catch moving targets, most noticeably in fast ball games. However, the few studies exploring the neural bases of interception in humans, as well as the classical studies on visual motion processing and visuomotor interactions, have reported rather long latencies of cortical activations that cannot explain the performance observed in most natural interceptive actions. The aim of our experiment was twofold: (1) to describe the spatio-temporal unfolding of cortical activations involved in catching a moving target and (2) to provide evidence that fast cortical responses can be elicited by a visuomotor task with high temporal constraints and to determine whether these responses are task or stimulus dependent. Neuromagnetic brain activity was recorded with whole-head coverage while subjects were asked to catch a free-falling ball or simply to pay attention to the ball's trajectory. A fast, likely stimulus-dependent, propagation of neural activity was observed along the dorsal visual pathway in both tasks. Evaluation of activation latencies in the main cortical regions involved in the tasks revealed that this entire network of regions was activated within 40 msec. Moreover, comparison of the experimental conditions revealed similar patterns of activation except in contralateral sensorimotor regions, where common and catch-specific activations were differentiated.
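To put the reported 40 msec network activation in context, here is a rough Python sketch comparing it with the flight time of a free-falling ball. The 0.8 m drop height is an assumed value for illustration; the apparatus used in the study is not described in this abstract.

```python
# Back-of-the-envelope timing for catching a free-falling ball, to put the
# ~40 msec cortical propagation reported above in context. The 0.8 m drop
# height is an assumption for illustration, not a parameter from the study.
import math

G = 9.81            # gravitational acceleration, m/s^2
DROP_HEIGHT = 0.8   # assumed drop height, m

# Time for the ball to fall from release to the hand: t = sqrt(2h / g).
fall_time_ms = math.sqrt(2 * DROP_HEIGHT / G) * 1000

CORTICAL_SWEEP_MS = 40  # reported activation of the whole cortical network

print(f"fall time: {fall_time_ms:.0f} ms")                      # ~404 ms
print(f"cortical sweep: {CORTICAL_SWEEP_MS} ms "
      f"({CORTICAL_SWEEP_MS / fall_time_ms:.0%} of the fall)")  # ~10%
```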
Journal Articles
Journal of Cognitive Neuroscience (2009) 21 (5): 905–921.
Published: 01 May 2009
Abstract
Speech is not a purely auditory signal. From around 2 months of age, infants are able to correctly match the vowel they hear with the appropriate articulating face. However, there is no behavioral evidence of integrated audiovisual perception until 4 months of age at the earliest, when an illusory percept can be created by the fusion of the auditory stimulus and the facial cues (the McGurk effect). To understand how infants initially match the articulatory movements they see with the sounds they hear, we recorded high-density ERPs in response to auditory vowels that followed a congruent or incongruent silently articulating face in 10-week-old infants. In a first experiment, we determined that auditory–visual integration occurs during the early stages of perception, as in adults. The mismatch response was similar in timing and topography whether the preceding vowels were presented visually or aurally. In the second experiment, we studied audiovisual integration in the linguistic (vowel perception) and nonlinguistic (gender perception) domains. We observed a mismatch response for both types of change at similar latencies, but their topographies were significantly different, demonstrating that cross-modal integration of these features is computed in parallel by two different networks. Indeed, brain source modeling revealed that phoneme and gender computations were lateralized toward the left and the right hemisphere, respectively, suggesting that each hemisphere possesses an early processing bias. We also observed repetition suppression in temporal regions and repetition enhancement in frontal regions. These results underscore the complexity and structure of the human cortical organization that sustains communication from the first weeks of life onward.
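As a rough illustration of how a mismatch response can be quantified, the following Python sketch averages congruent and incongruent epochs per channel and takes their difference wave. The array shapes, condition labels, and the NumPy-only pipeline are assumptions for illustration and are not the authors' analysis.

```python
# Minimal sketch of computing a mismatch (difference-wave) response from
# high-density ERP epochs: average congruent and incongruent trials per
# channel, then subtract. Shapes and labels are assumptions for illustration.
import numpy as np

rng = np.random.default_rng(0)

n_trials, n_channels, n_times = 120, 128, 300   # hypothetical epoch dimensions
epochs = rng.normal(size=(n_trials, n_channels, n_times))  # trials x channels x time
congruent = rng.random(n_trials) < 0.5          # hypothetical condition labels

# Evoked responses: average across trials within each condition.
evoked_congruent = epochs[congruent].mean(axis=0)
evoked_incongruent = epochs[~congruent].mean(axis=0)

# Mismatch (difference) wave: incongruent minus congruent, per channel.
mismatch = evoked_incongruent - evoked_congruent

# Peak latency of the mismatch response on the channel with the largest effect.
peak_channel = np.argmax(np.abs(mismatch).max(axis=1))
peak_sample = np.argmax(np.abs(mismatch[peak_channel]))
print(f"peak mismatch at channel {peak_channel}, sample {peak_sample}")
```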