1-3 of 3
Eun-Jin Sim
Journal Articles
Publisher: Journals Gateway
Journal of Cognitive Neuroscience (2011) 23 (8): 1864–1874.
Published: 01 August 2011
Abstract
Perception and action are classically thought to be supported by functionally and neuroanatomically distinct mechanisms. However, recent behavioral studies using an action priming paradigm challenged this view and showed that action representations can facilitate object recognition. This study determined whether action representations influence object recognition during early visual processing stages, that is, within the first 150 msec. To this end, the time course of brain activation underlying such action priming effects was examined by recording ERPs. Subjects were sequentially presented with two manipulable objects (e.g., tools), which had to be named. In the congruent condition, both objects afforded similar actions, whereas in the incongruent condition they afforded dissimilar actions. In order to test the influence of the prime modality on action priming, the first object (prime) was presented either as a picture or as a word. We found an ERP effect of action priming over the central scalp as early as 100 msec after target onset for pictorial, but not for verbal, primes. A later action priming effect on the N400 ERP component, known to index semantic integration processes, was obtained for both picture and word primes. The early effect was generated in a fronto-parietal motor network, whereas the late effect reflected activity in anterior temporal areas. The present results indicate that action priming influences object recognition through both fast and slow pathways: Action priming affects rapid visuomotor processes only when elicited by pictorial prime stimuli. However, it also modulates comparably slow conceptual integration processes independent of the prime modality.
Journal of Cognitive Neuroscience (2008) 20 (10): 1799–1814.
Published: 01 October 2008
Abstract
Traditionally, concepts are assumed to be situationally invariant mental knowledge entities (conceptual stability), which are represented in a unitary brain system distinct from sensory and motor areas (amodality). However, accumulating evidence suggests that concepts are embodied in perception and action in that their conceptual features are stored within modality-specific semantic maps in the sensory and motor cortex. Nonetheless, the first traditional assumption of conceptual stability largely remains unquestioned. Here, we tested the notion of flexible concepts using functional magnetic resonance imaging and event-related potentials (ERPs) during the verification of two attribute types (visual, action-related) for words denoting artifactual and natural objects. Functional imaging predominantly revealed crossover interactions between category and attribute type in visual, motor, and motion-related brain areas, indicating that access to conceptual knowledge is strongly modulated by attribute type: Activity in these areas was highest when nondominant conceptual attributes had to be verified. ERPs indicated that these category-attribute interactions emerged as early as 116 msec after stimulus onset, suggesting that they reflect rapid access to conceptual features rather than postconceptual processing. Our results suggest that concepts are situation-dependent mental entities. They are composed of semantic features which are flexibly recruited from distributed, yet localized, semantic maps in modality-specific brain regions depending on contextual constraints.
Journal of Cognitive Neuroscience (2007) 19 (3): 525–542.
Published: 01 March 2007
Abstract
Concepts are composed of features related to different sensory and motor modalities such as vision, sound, and action. It is a matter of controversy whether conceptual features are represented in sensory-motor areas reflecting the specific learning experience during acquisition. In order to address this issue, we assessed the plasticity of conceptual representations by training human participants with novel objects under different training conditions. These objects were assigned to categories such that for one class of categories, the overall shape was diagnostic for category membership, whereas for the other class, a detail feature affording a particular action was diagnostic. During training, participants were asked to either make an action pantomime toward the detail feature of the novel object or point to it. In a categorization task at test, we assessed the neural correlates of the acquired conceptual representations by measuring electrical brain activity. Here, we show that the same object is differentially processed depending on the sensory-motor interactions during knowledge acquisition. Only in the pantomime group did we find early activation in frontal motor regions and later activation in occipito-parietal visual-motor regions. In the pointing training group, these effects were absent. These results show that action information contributes to conceptual processing depending on the specific learning experience. In line with modality-specific theories of conceptual memory, our study suggests that conceptual representations are established by the learning-based formation of cell assemblies in sensory-motor areas.