1-11 of 11
Floris P. de Lange
Journal Articles
Publisher: Journals Gateway
Journal of Cognitive Neuroscience 1–15.
Published: 26 September 2024
Abstract
The human visual system is equipped to rapidly and implicitly learn and exploit the statistical regularities in our environment. Within visual search, contextual cueing demonstrates how implicit knowledge of scenes can improve search performance. This is commonly interpreted as spatial context in the scenes becoming predictive of the target location, which leads to a more efficient guidance of attention during search. However, what drives this enhanced guidance is unknown. First, it is under debate whether the entire scene (global context) or more local context drives this phenomenon. Second, it is unclear how exactly improved attentional guidance is enabled by target enhancement and distractor suppression. In the present magnetoencephalography experiment, we leveraged rapid invisible frequency tagging to answer these two outstanding questions. We found that the improved performance when searching implicitly familiar scenes was accompanied by a stronger neural representation of the target stimulus, at the cost specifically of those distractors directly surrounding the target. Crucially, this biasing of local attentional competition was behaviorally relevant when searching familiar scenes. Taken together, we conclude that implicitly learned spatial predictive context improves how we search our environment by sharpening the attentional field.
Journal of Cognitive Neuroscience (2023) 35 (7): 1133–1143.
Published: 01 July 2023
Abstract
Perceivers can use past experiences to make sense of ambiguous sensory signals. However, this may be inappropriate when the world changes and past experiences no longer predict what the future holds. Optimal learning models propose that observers decide whether to stick with or update their predictions by tracking the uncertainty or “precision” of their expectations. However, contrasting theories of prediction have argued that we are prone to misestimate uncertainty—leading to stubborn predictions that are difficult to dislodge. To compare these possibilities, we had participants learn novel perceptual predictions before using fMRI to record visual brain activity when predictive contingencies were disrupted—meaning that previously “expected” events became objectively improbable. Multivariate pattern analyses revealed that expected events continued to be decoded with greater fidelity from primary visual cortex, despite marked changes in the statistical structure of the environment, which rendered these expectations no longer valid. These results suggest that our perceptual systems do indeed form stubborn predictions even from short periods of learning—and more generally suggest that top–down expectations have the potential to help or hinder perceptual inference in bounded minds like ours.
Journal of Cognitive Neuroscience (2022) 34 (2): 332–347.
Published: 05 January 2022
Abstract
Both spatial and temporal context play an important role in visual perception and behavior. Humans can extract statistical regularities from both forms of context to help process the present and to construct expectations about the future. Numerous studies have found reduced neural responses to expected stimuli compared with unexpected stimuli, for both spatial and temporal regularities. However, it is largely unclear whether and how these forms of context interact. In the current fMRI study, 33 human volunteers were exposed to pairs of object stimuli that could be expected or surprising in terms of their spatial and temporal context. We found reliable independent contributions of both spatial and temporal context in modulating the neural response. Specifically, neural responses to stimuli in expected compared with unexpected contexts were suppressed throughout the ventral visual stream. These results suggest that both spatial and temporal context may aid sensory processing in a similar fashion, providing evidence on how different types of context jointly modulate perceptual processing.
Journal of Cognitive Neuroscience (2020) 32 (4): 722–733.
Published: 01 April 2020
Abstract
Familiarity with a stimulus leads to an attenuated neural response to the stimulus. Alongside this attenuation, recent studies have also observed a truncation of stimulus-evoked activity for familiar visual input. One proposed function of this truncation is to rapidly put neurons in a state of readiness to respond to new input. Here, we examined this hypothesis by presenting human participants with target stimuli that were embedded in rapid streams of familiar or novel distractor stimuli at different speeds of presentation, while recording brain activity using magnetoencephalography and measuring behavioral performance. We investigated the temporal and spatial dynamics of signal truncation and whether this phenomenon bears a relationship to participants' ability to categorize target items within a visual stream. Behaviorally, target categorization performance was markedly better when the target was embedded within familiar distractors, and this benefit became more pronounced with increasing speed of presentation. Familiar distractors showed a truncation of neural activity in the visual system. This truncation was strongest for the fastest presentation speeds and peaked in progressively more anterior cortical regions as presentation speeds became slower. Moreover, the neural response evoked by the target was stronger when this target was preceded by familiar distractors. Taken together, these findings demonstrate that item familiarity results in a truncated neural response, is associated with stronger processing of relevant target information, and leads to superior perceptual performance.
Journal of Cognitive Neuroscience (2020) 32 (4): 691–702.
Published: 01 April 2020
Abstract
Perceptual expectations can change how a visual stimulus is perceived. Recent studies have shown mixed results in terms of whether expectations modulate sensory representations. Here, we used a statistical learning paradigm to study the temporal characteristics of perceptual expectations. We presented participants with pairs of object images organized in a predictive manner and then recorded their brain activity with magnetoencephalography while they viewed expected and unexpected image pairs on the subsequent day. We observed stronger alpha-band (7–14 Hz) activity in response to unexpected compared with expected object images. Specifically, the alpha-band modulation occurred as early as the onset of the stimuli and was most pronounced in left occipito-temporal cortex. Given that the differential response to expected versus unexpected stimuli occurred in sensory regions early in time, our results suggest that expectations modulate perceptual decision-making by changing the sensory response elicited by the stimuli.
Journal of Cognitive Neuroscience (2018) 30 (9): 1366–1377.
Published: 01 September 2018
Abstract
Prior knowledge about the visual world can change how a visual stimulus is processed. Two forms of prior knowledge are often distinguished: stimulus familiarity (i.e., whether a stimulus has been seen before) and stimulus expectation (i.e., whether a stimulus is expected to occur, based on the context). Neurophysiological studies in monkeys have shown suppression of spiking activity both for expected and for familiar items in object-selective inferotemporal cortex. It is an open question, however, whether and how these types of knowledge interact in their modulatory effects on the sensory response. To address this issue and to examine whether previous findings generalize to noninvasively measured neural activity in humans, we separately manipulated stimulus familiarity and expectation while noninvasively recording human brain activity using magnetoencephalography. We observed independent suppression of neural activity by familiarity and expectation, specifically in the lateral occipital complex, the putative human homologue of monkey inferotemporal cortex. Familiarity also led to sharpened response dynamics, which was predominantly observed in early visual cortex. Together, these results show that distinct types of sensory knowledge jointly determine the amount of neural resources dedicated to object processing in the visual ventral stream.
Journal of Cognitive Neuroscience (2016) 28 (1): 1–7.
Published: 01 January 2016
Abstract
Auditory speech perception can be altered by concurrent visual information. The superior temporal cortex is an important combining site for this integration process. This area was previously found to be sensitive to audiovisual congruency. However, the direction of this congruency effect (i.e., stronger or weaker activity for congruent compared to incongruent stimulation) has been more equivocal. Here, we used fMRI to look at the neural responses of human participants during the McGurk illusion—in which auditory /aba/ and visual /aga/ inputs are fused into a perceived /ada/—in a large homogeneous sample of participants who consistently experienced this illusion. This enabled us to compare the neuronal responses during congruent audiovisual stimulation with incongruent audiovisual stimulation leading to the McGurk illusion while avoiding the possible confounding factor of sensory surprise that can occur when McGurk stimuli are only occasionally perceived. We found larger activity for congruent audiovisual stimuli than for incongruent (McGurk) stimuli in bilateral superior temporal cortex, extending into the primary auditory cortex. This finding suggests that superior temporal cortex responds preferentially when auditory and visual inputs support the same representation.
Journal of Cognitive Neuroscience (2015) 27 (1): 175–184.
Published: 01 January 2015
Abstract
Perception does not function as an isolated module but is tightly linked with other cognitive functions. Several studies have demonstrated an influence of language on motion perception, but it remains debated at which level of processing this modulation takes place. Some studies argue for an interaction in perceptual areas, but it is also possible that the interaction is mediated by “language areas” that integrate linguistic and visual information. Here, we investigated whether language–perception interactions were specific to the language-dominant left hemisphere by comparing the effects of language on visual material presented in the right (RVF) and left visual fields (LVF). Furthermore, we determined the neural locus of the interaction using fMRI. Participants performed a visual motion detection task. On each trial, the visual motion stimulus was presented in either the LVF or in the RVF, preceded by a centrally presented word (e.g., “rise”). The word could be congruent, incongruent, or neutral with regard to the direction of the visual motion stimulus that was presented subsequently. Participants were faster and more accurate when the direction implied by the motion word was congruent with the direction of the visual motion stimulus. Interestingly, the speed benefit was present only for motion stimuli that were presented in the RVF. We observed a neural counterpart of the behavioral facilitation effects in the left middle temporal gyrus, an area involved in semantic processing of verbal material. Together, our results suggest that semantic information about motion retrieved in language regions may automatically modulate perceptual decisions about motion.
Journal of Cognitive Neuroscience (2014) 26 (7): 1546–1554.
Published: 01 July 2014
Abstract
Sensory processing is strongly influenced by prior expectations. Valid expectations have been shown to lead to improvements in perception as well as in the quality of sensory representations in primary visual cortex. However, very little is known about the neural correlates of the expectations themselves. Previous studies have demonstrated increased activity in sensory cortex following the omission of an expected stimulus, yet it is unclear whether this increased activity constitutes a general surprise signal or rather has representational content. One intriguing possibility is that top–down expectation leads to the formation of a template of the expected stimulus in visual cortex, which can then be compared with subsequent bottom–up input. To test this hypothesis, we used fMRI to noninvasively measure neural activity patterns in early visual cortex of human participants during expected but omitted visual stimuli. Our results show that prior expectation of a specific visual stimulus evokes a feature-specific pattern of activity in the primary visual cortex (V1) similar to that evoked by the corresponding actual stimulus. These results are in line with the notion that prior expectation triggers the formation of specific stimulus templates to efficiently process expected sensory inputs.
Journal of Cognitive Neuroscience (2011) 23 (6): 1395–1404.
Published: 01 June 2011
Abstract
A growing number of studies show that visual mental imagery recruits the same brain areas as visual perception. Although the necessity of hV5/MT+ for motion perception has been revealed by means of TMS, its relevance for motion imagery remains unclear. We induced a direction-selective adaptation in hV5/MT+ by means of a motion aftereffect (MAE) while subjects performed a mental rotation task that elicits imagined motion. We concurrently measured behavioral performance and neural activity with fMRI, enabling us to directly assess the effect of a perturbation of hV5/MT+ on other cortical areas involved in the mental rotation task. The activity in hV5/MT+ increased as more mental rotation was required, and the perturbation of hV5/MT+ affected behavioral performance as well as the neural activity in this area. Moreover, several regions in the posterior parietal cortex were also affected by this perturbation. Our results show that hV5/MT+ is required for imagined visual motion and engages in an interaction with parietal cortex during this cognitive process.
Journal of Cognitive Neuroscience (2005) 17 (1): 97–112.
Published: 01 January 2005
Abstract
We have used implicit motor imagery to investigate the neural correlates of motor planning independently from actual movements. Subjects were presented with drawings of left or right hands and asked to judge the hand laterality, regardless of the stimulus rotation from its upright orientation. We paired this task with a visual imagery control task, in which subjects were presented with typographical characters and asked to report whether they saw a canonical letter or its mirror image, regardless of its rotation. We measured neurovascular activity with fast event-related fMRI, distinguishing responses parametrically related to motor imagery from responses evoked by visual imagery and other task-related phenomena. By quantifying behavioral and neurovascular correlates of imagery on a trial-by-trial basis, we could discriminate between stimulus-related, mental rotation-related, and response-related neural activity. We found that specific portions of the posterior parietal and precentral cortex increased their activity as a function of mental rotation only during the motor imagery task. Within these regions, the parietal cortex was visually responsive, whereas the dorsal precentral cortex was not. Response- but not rotation-related activity was found around the left central sulcus (putative primary motor cortex) during both imagery tasks. Our study provides novel evidence on the topography and content of movement representations in the human brain. During intended action, the posterior parietal cortex combines somatosensory and visuomotor information, whereas the dorsal premotor cortex generates the actual motor plan, and the primary motor cortex deals with movement execution. We discuss the relevance of these results in the context of current models of action planning.