Ralph Adolphs
Journal of Cognitive Neuroscience (2019) 31 (4): 482–496.
Published: 01 April 2019
Abstract
Anthropomorphism, the attribution of distinctively human mental characteristics to nonhuman animals and objects, illustrates the human propensity for extending social cognition beyond typical social targets. Yet, its processing components remain challenging to study because they are typically all engaged simultaneously. Across one pilot study and one focal study, we tested three rare people with basolateral amygdala lesions to dissociate two specific processing components: those triggered by attention to social cues (e.g., seeing a face) and those triggered by endogenous semantic knowledge (e.g., imbuing a machine with animacy). A pilot study demonstrated that, like neurologically intact control group participants, the three amygdala-damaged participants produced anthropomorphic descriptions for highly socially salient stimuli but not for stimuli lacking clear social cues. A focal study found that the three amygdala participants could anthropomorphize animate and living entities normally, but anthropomorphized inanimate stimuli less than control participants. Our findings suggest that the amygdala contributes to how we anthropomorphize stimuli that are not explicitly social.
Journal of Cognitive Neuroscience (2012) 24 (6): 1358–1370.
Published: 01 June 2012
Abstract
Electrophysiological and fMRI-based investigations of the ventral temporal cortex of primates provide strong support for regional specialization for the processing of faces. These responses are most frequently found in or near the fusiform gyrus, but there is substantial variability in their anatomical location and response properties. An outstanding question is the extent to which ventral temporal cortex participates in processing dynamic, expressive aspects of faces, a function usually attributed to regions near the superior temporal cortex. Here, we investigated these issues through intracranial recordings from eight human surgical patients. We compared several different aspects of face processing (static and dynamic faces; happy, neutral, and fearful expressions) using power in the high-gamma band (70–150 Hz) derived from a spectral analysis. Detailed mapping of the response characteristics as a function of anatomical location was conducted in relation to the gyral and sulcal pattern on each patient's brain. The results document sites highly responsive to static or dynamic faces, often with abrupt changes in response properties between spatially close recording sites and idiosyncratic patterns across subjects. Notably, strong responses to dynamic facial expressions can be found in the fusiform gyrus, just as can responses to static faces. The findings suggest a more complex, fragmented architecture of ventral temporal cortex around the fusiform gyrus, one that includes focal regions of cortex that appear relatively specialized for either static or dynamic aspects of faces.
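The high-gamma measure described above is, in essence, band-limited spectral power. As a rough illustration only (the recording pipeline, window length, and use of Welch's method here are assumptions, not the authors' analysis), estimating power in the 70–150 Hz band from a single intracranial voltage trace can be sketched as follows:

import numpy as np
from scipy.signal import welch

def high_gamma_power(trace, fs, band=(70.0, 150.0)):
    # Mean spectral power of `trace` within `band` (Hz), via Welch's method.
    freqs, psd = welch(trace, fs=fs, nperseg=int(fs))  # 1-second windows (assumed)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[mask].mean()

# Toy usage: 2 seconds of noise sampled at 1 kHz.
fs = 1000.0
trace = np.random.randn(int(2 * fs))
print(high_gamma_power(trace, fs))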
Journal of Cognitive Neuroscience (2005) 17 (10): 1509–1518.
Published: 01 October 2005
Abstract
Lesion and functional imaging studies in humans have shown that the ventral and medial prefrontal cortex is critically involved in the processing of emotional stimuli, but both of these methods have limited spatiotemporal resolution. Conversely, neurophysiological studies of emotion in nonhuman primates typically rely on stimuli that do not require elaborate cognitive processing. To begin bridging this gap, we recorded from a total of 267 neurons in the left and right orbital and anterior cingulate cortices of four patients who had chronically implanted depth electrodes for monitoring epilepsy. Peristimulus activity was recorded in response to standardized, complex visual scenes depicting neutral, pleasant, or aversive content. Recording locations were verified with postoperative magnetic resonance imaging. Using a conservative, multistep statistical evaluation, we found significant responses in 56 neurons; 16 of these were selective for only one emotion class, most often aversive. The findings suggest sparse and widely distributed processing of emotional value in the prefrontal cortex, with a predominance of responses to aversive stimuli.
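The "conservative, multistep statistical evaluation" is not spelled out in the abstract, but the basic logic of testing a single neuron's responsiveness can be sketched as a comparison of per-trial spike counts in a post-stimulus window against a pre-stimulus baseline. The window lengths, test, and toy data below are assumptions for illustration, not the authors' procedure:

import numpy as np
from scipy.stats import mannwhitneyu

def spike_count(spike_times, t_start, t_stop):
    # Number of spikes in [t_start, t_stop), in seconds relative to stimulus onset.
    return int(np.sum((spike_times >= t_start) & (spike_times < t_stop)))

def responds(trials, baseline=(-0.5, 0.0), window=(0.0, 0.5), alpha=0.05):
    # trials: list of arrays of spike times aligned to stimulus onset.
    base = [spike_count(t, *baseline) for t in trials]
    post = [spike_count(t, *window) for t in trials]
    stat, p = mannwhitneyu(post, base, alternative='two-sided')
    return p < alpha, p

# Toy example: 20 trials with a higher post-stimulus firing rate.
rng = np.random.default_rng(0)
trials = [np.concatenate([rng.uniform(-0.5, 0.0, rng.poisson(3)),
                          rng.uniform(0.0, 0.5, rng.poisson(8))]) for _ in range(20)]
print(responds(trials))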
Journal of Cognitive Neuroscience (2004) 16 (10): 1796–1804.
Published: 01 December 2004
Abstract
Damage to the human ventromedial prefrontal cortex (VM) can result in dramatic and maladaptive changes in social behavior despite preservation of most other cognitive abilities. One important aspect of social cognition is the ability to detect social dominance, a process of inferring, from particular social signals, another person's relative standing in the social world. To test the role of the VM in making attributions of social dominance, we designed two experiments: one requiring dominance judgments from static pictures of faces, the second requiring dominance judgments from film clips. We tested three demographically matched groups of subjects: subjects with focal lesions in the VM (n = 15), brain-damaged comparison subjects with lesions excluding the VM (n = 11), and a reference group of normal individuals with no history of neurological disease (n = 32). Contrary to our expectation, we found that subjects with VM lesions gave dominance judgments on both tasks that did not differ significantly from those given by the other groups. Despite their grossly normal performance, however, subjects with VM lesions showed more subtle impairments specifically when judging static faces: They were less discriminative in their dominance judgments, and did not appear to make normal use of gender and age of the faces in forming their judgments. The findings suggest that, in the laboratory tasks we used, damage to the VM does not necessarily impair judgments of social dominance, although it appears to result in alterations in strategy that might translate into behavioral impairments in real life.
Journal of Cognitive Neuroscience (2004) 16 (7): 1143–1158.
Published: 01 September 2004
Abstract
Humans are able to use nonverbal behavior to make fast, reliable judgments of both emotional states and personality traits. Whereas a sizeable body of research has identified neural structures critical for emotion recognition, the neural substrates of personality trait attribution have not been explored in detail. In the present study, we investigated the neural systems involved in emotion and personality trait judgments. We used a type of visual stimulus that is known to convey both emotion and personality information, namely, point-light walkers. We compared the emotion and personality trait judgments made by subjects with brain damage to those made by neurologically normal subjects and then conducted a lesion overlap analysis to identify neural regions critical for these two tasks. Impairments on the two tasks dissociated: Some subjects were impaired at emotion recognition, but judged personality normally; other subjects were impaired on the personality task, but normal at emotion recognition. Moreover, these dissociations in performance were associated with damage to specific neural regions: Right somatosensory cortices were a primary focus of lesion overlap in subjects impaired on the emotion task, whereas left frontal opercular cortices were a primary focus of lesion overlap in subjects impaired on the personality task. These findings suggest that attributions of emotional states and personality traits are accomplished by partially dissociable neural systems.
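A lesion overlap analysis of the kind described amounts to summing binary lesion masks (registered to a common space) across the impaired subjects and asking which voxels are damaged in most of them. A minimal sketch, with toy mask shapes and an arbitrary threshold as assumptions:

import numpy as np

def lesion_overlap(masks, min_subjects=3):
    # masks: list of binary 3-D arrays in a common anatomical space.
    overlap = np.sum(masks, axis=0)   # number of subjects lesioned at each voxel
    focus = overlap >= min_subjects   # voxels shared by at least min_subjects lesions
    return overlap, focus

# Toy example: five subjects with tiny 4x4x4 "brains".
rng = np.random.default_rng(1)
masks = [(rng.random((4, 4, 4)) > 0.6).astype(int) for _ in range(5)]
overlap, focus = lesion_overlap(masks, min_subjects=3)
print(int(overlap.max()), int(focus.sum()))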
Journal of Cognitive Neuroscience (2004) 16 (3): 453–462.
Published: 01 April 2004
Abstract
Although the amygdala's role in processing facial expressions of fear has been well established, its role in the processing of other emotions is unclear. In particular, evidence for the amygdala's involvement in processing expressions of happiness and sadness remains controversial. To clarify this issue, we constructed a series of morphed stimuli whose emotional expression varied gradually from very faint to more pronounced. Five morphs each of sadness and happiness, as well as neutral faces, were shown to 27 subjects with unilateral amygdala damage and 5 with complete bilateral amygdala damage, whose data were compared to those from 12 brain-damaged and 26 normal controls. Subjects were asked to rate the intensity and to label the stimuli. Subjects with unilateral amygdala damage performed very comparably to controls. By contrast, subjects with bilateral amygdala damage showed a specific impairment in rating sad faces, but performed normally in rating happy faces. Furthermore, subjects with right unilateral amygdala damage performed somewhat worse than subjects with left unilateral amygdala damage. The findings suggest that the amygdala's role in the processing of emotional facial expressions encompasses multiple negatively valenced emotions, including fear and sadness.
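The morph continuum described above can be thought of as a graded blend between a neutral face and a fully expressive one. Real facial morphs use geometric warping, so the pixel-wise blend below is only a crude stand-in; the image sizes and number of steps are assumptions:

import numpy as np

def expression_continuum(neutral, expressive, n_steps=5):
    # Blend a neutral face toward an expressive face in n_steps levels (faint to pronounced).
    neutral = neutral.astype(float)
    expressive = expressive.astype(float)
    levels = np.linspace(1.0 / n_steps, 1.0, n_steps)
    return [(1 - a) * neutral + a * expressive for a in levels]

# Toy example with 64x64 grayscale "images".
neutral = np.zeros((64, 64))
happy = np.ones((64, 64))
morphs = expression_continuum(neutral, happy, n_steps=5)
print(len(morphs), morphs[0].mean(), morphs[-1].mean())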
Journal of Cognitive Neuroscience (2002) 14 (8): 1158–1173.
Published: 15 November 2002
Abstract
There are two competing theories of facial expression recognition. Some researchers have suggested that it is an example of “categorical perception.” In this view, expression categories are considered to be discrete entities with sharp boundaries, and discrimination of nearby pairs of expressive faces is enhanced near those boundaries. Other researchers, however, suggest that facial expression perception is more graded and that facial expressions are best thought of as points in a continuous, low-dimensional space, where, for instance, “surprise” expressions lie between “happiness” and “fear” expressions due to their perceptual similarity. In this article, we show that a simple yet biologically plausible neural network model, trained to classify facial expressions into six basic emotions, predicts data used to support both of these theories. Without any parameter tuning, the model matches a variety of psychological data on categorization, similarity, reaction times, discrimination, and recognition difficulty, both qualitatively and quantitatively. We thus explain many of the seemingly complex psychological phenomena related to facial expression perception as natural consequences of the tasks' implementations in the brain.
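The model described is a simple feedforward classifier whose graded output probabilities over the six basic emotions can look categorical (sharp boundaries) or continuous (a low-dimensional similarity space) depending on how they are read out. The sketch below shows only the general shape of such a model; the layer sizes, random weights, and input features are assumptions, not the authors' trained network:

import numpy as np

rng = np.random.default_rng(0)
n_features, n_hidden, n_classes = 200, 20, 6  # assumed dimensions

W1 = rng.normal(scale=0.1, size=(n_features, n_hidden))
W2 = rng.normal(scale=0.1, size=(n_hidden, n_classes))

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def classify(face_features):
    # Map a face representation to a probability distribution over six emotion categories.
    hidden = np.tanh(face_features @ W1)
    return softmax(hidden @ W2)

# Graded category membership for a toy input; these probabilities, rather than hard labels,
# are what let the same model speak to both categorical and continuous accounts.
print(classify(rng.normal(size=n_features)).round(3))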
Journal of Cognitive Neuroscience (2002) 14 (8): 1264–1274.
Published: 15 November 2002
Abstract
Lesion, functional imaging, and single-unit studies in human and nonhuman animals have demonstrated a role for the amygdala in processing stimuli with emotional and social significance. We investigated the recognition of a wide variety of facial expressions, including basic emotions (e.g., happiness, anger) and social emotions (e.g., guilt, admiration, flirtatiousness). Prior findings with a standardized set of stimuli indicated that recognition of social emotions can be signaled by the eye region of the face and is disproportionately impaired in autism (Baron-Cohen, Wheelwright, & Jolliffe, 1997). To test the hypothesis that the recognition of social emotions depends on the amygdala, we administered the same stimuli to 30 subjects with unilateral amygdala damage (16 left, 14 right), 2 with bilateral amygdala damage, 47 brain-damaged controls, and 19 normal controls. Compared with controls, subjects with unilateral or bilateral amygdala damage were impaired when recognizing social emotions; moreover, they were more impaired in recognition of social emotions than in recognition of basic emotions, and, like previously described patients with autism, they were impaired also when asked to recognize social emotions from the eye region of the face alone. The findings suggest that the human amygdala is relatively specialized to process stimuli with complex social significance. The results also provide further support for the idea that some of the impairments in social cognition seen in patients with autism may result from dysfunction of the amygdala.
Journal of Cognitive Neuroscience (2001) 13 (2): 232–240.
Published: 15 February 2001
Abstract
Autism has been thought to be characterized, in part, by dysfunction in emotional and social cognition, but the pathology of the underlying processes and their neural substrates remain poorly understood. Several studies have hypothesized that abnormal amygdala function may account for some of the impairments seen in autism, specifically, impaired recognition of socially relevant information from faces. We explored this issue in eight high-functioning subjects with autism in four experiments that assessed recognition of emotional and social information, primarily from faces. All tasks used were identical to those previously used in studies of subjects with bilateral amygdala damage, permitting direct comparisons. All subjects with autism made abnormal social judgments regarding the trustworthiness of faces; however, all were able to make normal social judgments from lexical stimuli, and all had a normal ability to perceptually discriminate the stimuli. Overall, these data from subjects with autism show some parallels to those from neurological subjects with focal amygdala damage. We suggest that amygdala dysfunction in autism might contribute to an impaired ability to link visual perception of socially relevant stimuli with retrieval of social knowledge and with elicitation of social behavior.
Journal of Cognitive Neuroscience (2000) 12 (Supplement 1): 30–46.
Published: 01 March 2000
Abstract
Studies of abnormal populations provide a rare opportunity for examining relationships between cognition, genotype, and brain neurobiology, permitting comparisons across these different levels of analysis. In our studies, we investigate individuals with a rare, genetically based disorder called Williams syndrome (WMS) to draw links among these levels. A critical component of such a cross-domain undertaking is the clear delineation of the phenotype of the disorder in question. Of special interest in this paper is a relatively unexplored, unusual social phenotype in WMS that includes an overfriendly and engaging personality. Four studies measuring distinct aspects of hypersocial behavior in WMS are presented, each probing specific aspects in WMS infants, toddlers, school-age children, and adults. The abnormal profile of excessively social behavior represents an important component of the phenotype that may distinguish WMS from other developmental disorders. Furthermore, the studies show that the profile is observed across a wide range of ages, and emerges consistently across multiple experimental paradigms. These studies of hypersocial behavior in WMS promise to provide the groundwork for cross-disciplinary analyses of gene-brain-behavior relationships.
Journal of Cognitive Neuroscience (1999) 11 (6): 610–616.
Published: 01 November 1999
Abstract
Bilateral damage to the human amygdala impairs retrieval of emotional and social information from faces. An important unanswered question concerns the specificity of the impairment for faces. To address this question, we examined preferences for a broad class of visual stimuli in two subjects with complete bilateral amygdala damage, both of whom were impaired in judgments of faces. Relative to controls, the subjects showed a positive bias for simple nonsense figures, color patterns, three-dimensional-looking objects and landscapes. The impairment was most pronounced in regard to those stimuli that are normally liked the least. The human amygdala thus appears to play a general role in guiding preferences for visual stimuli that are normally judged to be aversive.