1–2 of 2 results for author Peter König
Journal Articles
Publisher: Journals Gateway
Journal of Cognitive Neuroscience (2017) 29 (4): 637–651.
Published: 01 April 2017
Abstract
Faces provide a wealth of information, including the identity of the seen person and social cues, such as the direction of gaze. Crucially, different aspects of face processing require distinct forms of information encoding. Another person's attentional focus can be derived based on a view-dependent code. In contrast, identification benefits from invariance across all viewpoints. Different cortical areas have been suggested to subserve these distinct functions. However, little is known about the temporal aspects of differential viewpoint encoding in the human brain. Here, we combine EEG with multivariate data analyses to resolve the dynamics of face processing with high temporal resolution. This revealed a distinct sequence of viewpoint encoding. Head orientations were encoded first, starting after around 60 msec of processing. Shortly afterward, peaking around 115 msec after stimulus onset, a different encoding scheme emerged. At this latency, mirror-symmetric viewing angles elicited highly similar cortical responses. Finally, about 280 msec after visual onset, EEG response patterns demonstrated a considerable degree of viewpoint invariance across all viewpoints tested, with the noteworthy exception of the front-facing view. Taken together, our results indicate that the processing of facial viewpoints follows a temporal sequence of encoding schemes, potentially mirroring different levels of computational complexity.
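The abstract reports that mirror-symmetric viewing angles elicit highly similar cortical response patterns at around 115 msec. A minimal sketch of how such an effect can be probed with representational similarity of multichannel response patterns is shown below; the viewpoints, channel count, and simulated patterns are illustrative assumptions, not the study's data or pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical head orientations in degrees (negative = left, positive = right).
viewpoints = [-90, -45, 0, 45, 90]
n_channels = 64  # illustrative EEG channel count

# Simulate a mirror-symmetric encoding stage: the response pattern depends
# only on the absolute viewing angle, plus a small amount of channel noise.
base = {abs(v): rng.normal(size=n_channels) for v in viewpoints}
patterns = {v: base[abs(v)] + 0.1 * rng.normal(size=n_channels)
            for v in viewpoints}

def similarity(a, b):
    """Pairwise Pearson correlation of two viewpoint response patterns."""
    return float(np.corrcoef(patterns[a], patterns[b])[0, 1])

print(similarity(-45, 45))  # mirror-symmetric pair: high similarity
print(similarity(-45, 90))  # unrelated angles: near-chance similarity
```

In this scheme the signature of mirror-symmetric encoding is that the ±45° pair correlates almost as strongly as a view with itself, while non-mirror pairs do not.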
Journal of Cognitive Neuroscience (1996) 8 (6): 603–625.
Published: 01 November 1996
Abstract
Recent experimental results in the visual cortex of cats and monkeys have suggested an important role for synchronization of neuronal activity on a millisecond time scale. Synchronization has been found to occur selectively between neuronal responses to related image components. This suggests that not only the firing rates of neurons but also the relative timing of their action potentials is used as a coding dimension. Thus, a powerful relational code would be available, in addition to the rate code, for the representation of perceptual objects. This could alleviate difficulties in the simultaneous representation of multiple objects. In this article we present a set of theoretical arguments and predictions concerning the mechanisms that could group neurons responding to related image components into coherently active aggregates. Synchrony is likely to be mediated by synchronizing connections; we introduce the concept of an interaction skeleton to refer to the subset of synchronizing connections that are rendered effective by a particular stimulus configuration. If the image is segmented into objects, these objects can typically be segmented further into their constituent parts. The synchronization behavior of neurons that represent the various image components may accurately reflect this hierarchical clustering. We propose that the range of synchronizing interactions is a dynamic parameter of the cortical network, so that the grain of the resultant grouping process may be adapted to the actual behavioral requirements. It can be argued that different aspects of purposeful behavior rely on separable processes by which sensory input is transformed into adjustments of motor activity. Indeed, neurophysiological evidence has suggested separate processing streams originating in the primary visual cortex for object identification and sensorimotor coordination. However, such a separation calls for a mechanism that avoids interference effects in the presence of multiple objects, or when multiple motor programs are simultaneously prepared. In this article we suggest that synchronization between responses of neurons in both the visual cortex and areas that are involved in response selection and execution might allow for a selective routing of sensory information to the appropriate motor program.
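The abstract's central idea is that near-coincident spiking, beyond what firing rates alone predict, can bind neurons into one assembly. A minimal sketch of how zero-lag synchrony can be quantified against a rate-based chance level is given below; the bin size, rates, and common-input model are illustrative assumptions, not the article's formal framework.

```python
import numpy as np

rng = np.random.default_rng(1)

n_bins = 2000  # 1 ms bins, i.e. 2 s of activity (illustrative)
rate = 0.05    # baseline firing probability per bin

# A shared "common input" drives two neurons of the same assembly, producing
# near-coincident spikes; a third neuron fires at a similar rate independently.
common = rng.random(n_bins) < rate
a = common | (rng.random(n_bins) < rate)
b = common | (rng.random(n_bins) < rate)
c = rng.random(n_bins) < 2 * rate  # comparable rate, no shared input

def coincidence_index(x, y):
    """Zero-lag coincidence count, normalized by the count expected from
    the two firing rates alone (index ~1 means chance-level synchrony)."""
    expected = x.mean() * y.mean() * len(x)
    return float((x & y).sum() / expected)

print(coincidence_index(a, b))  # well above 1: synchronized assembly pair
print(coincidence_index(a, c))  # near 1: rate code alone, no extra binding
```

The point of the normalization is exactly the distinction the abstract draws: neurons a and c convey similar rate information, yet only the a–b pair carries the additional relational signal that could tag them as parts of the same perceptual object.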