Lan Yang
Journal Articles
Journal of Cognitive Neuroscience 1–18.
Published: 28 November 2024
Abstract
Vocal emotions are crucial in guiding visual attention toward emotionally significant environmental events, such as recognizing emotional faces. This study employed continuous electroencephalography (EEG) recordings to examine the impact of linguistic and nonlinguistic vocalizations on facial emotion processing. Participants completed a facial emotion discrimination task while viewing fearful, happy, and neutral faces. The behavioral and ERP results indicated that fearful nonlinguistic vocalizations accelerated the recognition of fearful faces and elicited a larger P1 amplitude, whereas happy linguistic vocalizations accelerated the recognition of happy faces and likewise elicited a larger P1 amplitude. When fearful faces were recognized, a greater N170 component was observed in the right hemisphere when the emotional category of the priming vocalization matched that of the face stimulus; for happy faces, this congruency effect appeared in the left hemisphere. Representational similarity analysis revealed that temporoparietal regions automatically differentiate between linguistic and nonlinguistic vocalizations early in face processing. In conclusion, these findings enhance our understanding of the interplay between vocalization type and facial emotion recognition, highlighting the importance of cross-modal processing in emotional perception.
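To make the representational similarity analysis (RSA) step concrete, below is a minimal sketch in the spirit of the abstract: at each time point it builds a neural representational dissimilarity matrix (RDM) from EEG channel patterns and correlates it with a model RDM coding linguistic versus nonlinguistic vocalizations. All data, array shapes, and condition labels here are synthetic illustrative assumptions, not the authors' pipeline.

```python
# A minimal time-resolved RSA sketch. Everything below is synthetic and
# hypothetical; it illustrates the general technique, not the study's code.
import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import spearmanr

rng = np.random.default_rng(0)

# Synthetic trial-averaged EEG patterns: (conditions, channels, time points).
# Conditions 0-1 stand in for nonlinguistic vocalizations, 2-3 for linguistic.
n_cond, n_chan, n_time = 4, 32, 200
eeg = rng.standard_normal((n_cond, n_chan, n_time))

# Model RDM (condensed form): 0 within a vocalization type, 1 across types.
labels = np.array([0, 0, 1, 1])  # 0 = nonlinguistic, 1 = linguistic
model_rdm = pdist(labels[:, None], metric="hamming")

# At each time point, compute a neural RDM across conditions (correlation
# distance over channels) and compare it with the model RDM (Spearman rho).
rsa_timecourse = np.empty(n_time)
for t in range(n_time):
    neural_rdm = pdist(eeg[:, :, t], metric="correlation")
    rho, _ = spearmanr(neural_rdm, model_rdm)
    rsa_timecourse[t] = rho

print("Peak model-neural correlation:", rsa_timecourse.max())
```

Correlation distance over channels and Spearman rank correlation are common RSA choices: any monotone relation between the model and neural dissimilarities will register, without assuming linearity.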
Journal Articles
Journal of Cognitive Neuroscience (2024) 36 (8): 1695–1714.
Published: 01 July 2024
Abstract
The brain has a hierarchical modular organization that varies across functional states. Network configuration can reveal organizational patterns, but how configuration unfolds across multiple hierarchical levels remains unknown. Here, we propose an eigenmodal decomposition approach that detects modules at multiple hierarchical levels, identifies potential submodules within higher-layer modules, and is consistent with the brain's hierarchical structure. We define three metrics: the node configuration matrix, combinability, and separability. The node configuration matrix captures configuration changes between layers; separability characterizes configuration from the global to the local scale, whereas combinability characterizes it from the local to the global scale. First, we created random networks to verify the feasibility of the method. Results show that the separability of real networks is larger than that of random networks, whereas their combinability is smaller. Then, we analyzed a large data set comprising fMRI data from the resting state and seven distinct task conditions. The results demonstrate high similarity among the node configuration matrices of the different task conditions, whereas task states show less separability and greater combinability between modules than the resting state. Furthermore, brain network configuration can predict brain states and cognitive performance. Crucially, configuration metrics derived from tasks have greater predictive power than those derived from rest, showing that task-induced attributes better reveal individual differences. Together, our study provides a novel perspective for analyzing the organizational structure of complex brain networks across hierarchical levels, gives new insight into the working mechanisms of the brain, and adds new evidence that task states better characterize and predict behavioral traits.
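As a rough illustration of eigenmode-based hierarchical module detection, the sketch below applies recursive spectral bisection using the graph Laplacian's Fiedler vector, with recursion depth playing the role of the hierarchy layer. This is a standard stand-in for the general idea, not the authors' exact eigenmodal decomposition; the stopping rule, thresholds, and toy network are all assumptions.

```python
# Recursive spectral bisection as a stand-in for eigenmode-based
# multi-level module detection. Toy data; not the paper's method.
import numpy as np

def fiedler_split(adj):
    """Split a network in two by the sign of the Fiedler vector
    (eigenvector of the second-smallest Laplacian eigenvalue)."""
    lap = np.diag(adj.sum(axis=1)) - adj
    vals, vecs = np.linalg.eigh(lap)       # eigenvalues sorted ascending
    return vecs[:, 1] >= 0, vals[1]        # membership mask, algebraic connectivity

def detect_modules(adj, nodes, min_size=4, conn_thresh=0.5, depth=0, out=None):
    """Recursively bisect; each recursion depth is one hierarchy layer."""
    if out is None:
        out = []
    if len(nodes) <= min_size:              # assumed stopping rule: too small
        out.append((depth, nodes))
        return out
    mask, conn = fiedler_split(adj)
    # Stop if the module is already well connected or the split is degenerate.
    if conn > conn_thresh or mask.all() or (~mask).all():
        out.append((depth, nodes))
        return out
    for side in (mask, ~mask):
        sub = np.where(side)[0]
        detect_modules(adj[np.ix_(sub, sub)], nodes[sub],
                       min_size, conn_thresh, depth + 1, out)
    return out

# Toy network: two 8-node cliques joined by a single bridging edge.
A = np.zeros((16, 16))
A[:8, :8] = 1
A[8:, 8:] = 1
np.fill_diagonal(A, 0)
A[0, 8] = A[8, 0] = 1
for layer, members in detect_modules(A, np.arange(16)):
    print(f"layer {layer}: nodes {members.tolist()}")
```

On this toy network the first eigenmode separates the two cliques at layer 1, after which each clique's high algebraic connectivity halts further splitting, mirroring how higher-layer modules decompose into submodules until no finer structure remains.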