Olivier Collignon
1-5 of 5
Journal Articles
Publisher: Journals Gateway
Journal of Cognitive Neuroscience (2025) 37 (2): 498–514.
Published: 01 February 2025
Abstract
The animal brain is endowed with an innate sense of number, allowing it to intuitively perceive the approximate quantity of items in a scene, or “numerosity.” This ability is not limited to items distributed in space, but extends to events unfolding in time and to the average numerosity of dynamic scenes. How the brain computes and represents average numerosity over time, however, remains unclear. Here, we investigate the mechanisms and EEG signature of the perception of average numerosity over time. To do so, we used stimuli composed of a variable number (3–12) of briefly presented dot arrays (50 msec each) and asked participants to judge the average numerosity of the sequence. We first show that the weight of different portions of the stimulus in determining the judgment depends on how many arrays the sequence contains: the longer the sequence, the lower the weight of the latest arrays. Second, we show systematic adaptation effects across stimuli in consecutive trials. Importantly, the EEG results highlight two processing stages whereby the amplitude of occipital ERPs reflects the adaptation effect (∼300 msec after stimulus onset) and the accuracy and precision of average numerosity judgments (∼450–700 msec). These two stages are consistent with processes involved in the representation of perceived average numerosity and with perceptual decision-making, respectively. Overall, our findings provide new evidence of how the visual system computes the average numerosity of dynamic visual stimuli and support the existence of a dedicated, relatively low-level perceptual mechanism mediating this process.
Journal of Cognitive Neuroscience (2020) 32 (6): 1009–1025.
Published: 01 June 2020
Abstract
If conceptual retrieval is partially based on the simulation of sensorimotor experience, people with a different sensorimotor experience, such as congenitally blind people, should retrieve concepts in a different way. However, studies investigating the neural basis of several conceptual domains (e.g., actions, objects, places) have shown a very limited impact of early visual deprivation. We approached this problem by investigating brain regions that encode the perceptual similarity of action and color concepts evoked by spoken words in sighted and congenitally blind people. At first, and in line with previous findings, a contrast between action and color concepts (independently of their perceptual similarity) revealed similar activations in sighted and blind people for action concepts and partially different activations for color concepts, but outside visual areas. On the other hand, adaptation analyses based on subjective ratings of perceptual similarity showed compelling differences across groups. Perceptually similar colors and actions induced adaptation in the posterior occipital cortex of sighted people only, overlapping with regions known to represent low-level visual features of those perceptual domains. Early-blind people instead showed a stronger adaptation for perceptually similar concepts in temporal regions, arguably indexing a higher reliance on a lexical-semantic code to represent perceptual knowledge. Overall, our results show that visual deprivation does change the neural bases of conceptual retrieval, but mostly at specific levels of representation supporting perceptual similarity discrimination, reconciling apparently contrasting findings in the field.
Journal of Cognitive Neuroscience (2018) 30 (1): 86–106.
Published: 01 January 2018
Abstract
Sounds activate occipital regions in early blind individuals. However, how different sound categories map onto specific regions of the occipital cortex remains a matter of debate. We used fMRI to characterize brain responses of early blind and sighted individuals to familiar object sounds, human voices, and their respective low-level control sounds. In addition, sighted participants were tested while viewing pictures of faces, objects, and phase-scrambled control pictures. In both early blind and sighted groups, a double dissociation was evidenced in bilateral auditory cortices between responses to voices and object sounds: Voices elicited categorical responses in bilateral superior temporal sulci, whereas object sounds elicited categorical responses along the lateral fissure bilaterally, including the primary auditory cortex and planum temporale. Outside the auditory regions, object sounds also elicited categorical responses in the left lateral and in the ventral occipitotemporal regions in both groups. These regions also showed response preference for images of objects in the sighted group, thus suggesting a functional specialization that is independent of sensory input and visual experience. Between-group comparisons revealed that, only in the blind group, categorical responses to object sounds extended more posteriorly into the occipital cortex. Functional connectivity analyses evidenced a selective increase in the functional coupling between these reorganized regions and regions of the ventral occipitotemporal cortex in the blind group. In contrast, vocal sounds did not elicit preferential responses in the occipital cortex in either group. Nevertheless, enhanced voice-selective connectivity between the left temporal voice area and the right fusiform gyrus was found in the blind group. Altogether, these findings suggest that, in the absence of developmental vision, separate auditory categories are not equipotent in driving selective auditory recruitment of occipitotemporal regions and highlight the presence of domain-selective constraints on the expression of cross-modal plasticity.
Journal of Cognitive Neuroscience (2013) 25 (12): 2072–2085.
Published: 01 December 2013
Abstract
Light regulates multiple non-image-forming (or nonvisual) circadian, neuroendocrine, and neurobehavioral functions via outputs from intrinsically photosensitive retinal ganglion cells (ipRGCs). Exposure to light directly enhances alertness and performance, so light is an important regulator of wakefulness and cognition. The roles of rods, cones, and ipRGCs in the impact of light on cognitive brain functions remain unclear, however. A small percentage of blind individuals retain non-image-forming photoreception and offer a unique opportunity to investigate the impact of light in the absence of conscious vision, presumably through ipRGCs. Here, we show that three such patients were able to make nonrandom choices about the presence of light despite their complete lack of sight. Furthermore, 2 sec of blue light modified EEG activity when administered simultaneously with auditory stimulation. fMRI further showed that, during an auditory working memory task, less than a minute of blue light triggered the recruitment of supplemental prefrontal and thalamic brain regions involved in alertness and cognition regulation, as well as key areas of the default mode network. These results, which should be considered a proof of concept, show that non-image-forming photoreception triggers some awareness of light and can have a more rapid impact on human cognition than previously understood, if brain processing is actively engaged. Furthermore, light stimulates higher cognitive brain activity, independently of vision, and engages supplemental brain areas to perform an ongoing cognitive process. To our knowledge, our results constitute the first indication that ipRGC signaling may rapidly affect fundamental cerebral organization, so that it could potentially participate in the regulation of numerous aspects of human brain function.
Journal of Cognitive Neuroscience (2008) 20 (8): 1454–1463.
Published: 01 August 2008
Abstract
It has been suggested that both the posterior parietal cortex (PPC) and the extrastriate occipital cortex (OC) participate in the spatial processing of sounds. However, the precise time course of their contribution remains unknown, which is of particular interest considering that it could give new insights into the mechanisms underlying auditory space perception. To address this issue, we used event-related transcranial magnetic stimulation (TMS) to induce virtual lesions of either the right PPC or right OC at different delays in subjects performing a sound lateralization task. Our results confirmed that these two areas participate in the spatial processing of sounds. More precisely, we found that TMS applied over the right OC 50 msec after stimulus onset significantly impaired the localization of sounds presented either to the right or to the left side. Moreover, right PPC virtual lesions induced 100 and 150 msec after sound presentation led to a rightward bias for stimuli delivered at the center and on the left side, transiently reproducing the deficits commonly observed in hemineglect patients. The finding that the right OC is involved in sound processing before the right PPC suggests that the OC exerts a feedforward influence on the PPC during auditory spatial processing.