Abstract

According to embodied theories, the symbols used by language are meaningful because they are grounded in perception, action, and emotion. In contrast, according to abstract symbol theories, meaning arises from the syntactic combination of abstract, amodal symbols. If language is grounded in internal bodily states, then one would predict that emotion affects language. Consistent with this, advocates of embodied theories propose a strong link between emotion and language [Havas, D., Glenberg, A. M., & Rinck, M. Emotion simulation during language comprehension. Psychonomic Bulletin & Review, 14, 436–441, 2007; Niedenthal, P. M. Embodying emotion. Science, 316, 1002–1005, 2007]. The goal of this study was to test abstract symbol vs. embodied views of language by investigating whether mood affects semantic processing. To this aim, we induced different emotional states (happy vs. sad) by presenting film clips that displayed fragments from a happy movie or a sad movie. The clips were presented before and during blocks of sentences in which the cloze probability of mid-sentence critical words varied (high vs. low). Participants read sentences while ERPs were recorded. The mood induction procedure was successful: Participants watching the happy film clips scored higher on a mood scale than those watching the sad clips. For N400, mood by cloze probability interactions were obtained. The N400 cloze effect was strongly reduced in the sad mood compared with the happy mood condition. Furthermore, a difference in late positivity was only present for the sad mood condition. The mood by semantic processing interaction observed for N400 supports embodied theories of meaning and challenges abstract symbol theories that assume that processing of word meaning reflects a modular process.

INTRODUCTION

Meaning is a fundamental aspect of language. The sentence “He killed her” means something radically different from the sentence “He kissed her.” The difference lies in the meanings of the individual action verbs “kill” versus “kiss.” The verb “kill” captures an act of violence leading to irreversible death, whereas the verb “kiss” captures an action of tenderness typically reflecting affection. In this example, the words also differ in emotional valence, with “kill” being a negative (unpleasant) word and “kiss” a positive (pleasant) word. The aim of this article was to shed light on whether, and if so how, emotional state affects semantic processing in language comprehension.

In cognitive psychology, two theoretical perspectives regarding the representation of word meaning have been presented. According to the cognitive science view—which has dominated the fields of linguistics and psycholinguistics for the past 40 years—the brain is conceived of as an organ for building internal representations of the external world. A key assumption of this approach is that knowledge resides in a semantic memory system separate from the brain's modal systems for perception (audition, vision), action (movement, proprioception), and emotion (Fodor, 1975). In line with this, classical theories of meaning are abstract symbol theories (e.g., Anderson, Matessa, & Lebiere, 1997; Landauer & Dumais, 1997; Masson, 1995; Collins & Loftus, 1975). According to these theories, meaning arises from the syntactic combination of abstract, amodal (nonperceptual) symbols that are arbitrarily related to entities in the real world. For example, according to one of the most frequently cited theories, that of Collins and Loftus (1975), meaning arises from the pattern of relations among nodes in a network. Every node in the network corresponds to an undefined word, and the set of nodes to which a particular node is connected corresponds to the words in the dictionary definition. The conceptual network is organized according to semantic similarity and represents facts about concepts. For instance, for the concept lion, it represents facts such as lions are large, lions live in Africa, lions are feline, lions have a mane, and so forth. This conception gave rise to different types of representation, such as feature lists, schemata, semantic nets, and connectionism. Mathematical high-dimensional models of meaning (e.g., latent semantic analysis [LSA] of Landauer & Dumais, 1997) have been presented as a new variant of the classical abstract symbol theories. Here the meaning of a word is its vector representation in a high-dimensional space, and these vectors are very similar to the abstract symbols used in the classical theories of meaning. A major strength of the classical approach to meaning is that abstract amodal symbols have been very successful in representing several kinds of knowledge (i.e., types and tokens, propositions, and abstract concepts) and in combining symbols productively. A fundamental critique concerns how meaning can arise from abstract symbols that are not grounded in perception or action (i.e., the grounding problem). As forcefully argued by Searle (1980), manipulation of abstract amodal symbols produces more abstract symbols, not meaning.
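As a concrete illustration of the vector-space idea behind models such as LSA, the following minimal sketch shows how semantic similarity between two words is typically quantified as the cosine of the angle between their vectors. The toy three-dimensional vectors are purely illustrative assumptions, not vectors derived from an actual corpus.

```python
import numpy as np

def cosine_similarity(u: np.ndarray, v: np.ndarray) -> float:
    """Cosine of the angle between two word vectors; high-dimensional models
    such as LSA take this as a proxy for semantic similarity."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Toy 3-dimensional "semantic space"; a real LSA space has hundreds of dimensions
# obtained from a singular value decomposition of a word-by-document matrix.
lion  = np.array([0.9, 0.1, 0.3])
tiger = np.array([0.8, 0.2, 0.4])
tulip = np.array([0.1, 0.9, 0.2])

print(cosine_similarity(lion, tiger))  # high: semantically close words
print(cosine_similarity(lion, tulip))  # low: semantically distant words
```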

A different theoretical perspective on the representation of word meaning is offered by perceptual theories of cognition. Recently, these theories have gained popularity in the form of embodied approaches to cognition (e.g., Zwaan & Taylor, 2006; Barsalou, 1999; MacWhinney, 1999; Glenberg, 1997; Lakoff, 1987). Here, the brain's primary purpose is not to represent the external world but to regulate behavior. On this view, the brain is an equal player, next to the organism's body and the world. These three factors together form part of the physical substrate that causes behavior and cognition. A central question for embodied theories is how the brain, the body, and the world (individually and jointly) contribute to the creation and maintenance of real world behavior. A key assumption of the embodied approach is that mental processes such as thinking or language understanding are based on the physical or imagined interactions (Barsalou, 2008; Stanfield & Zwaan, 2001) that people have with their environment. The embodied framework has been applied to the field of language (e.g., Yeh & Barsalou, 2006; Zwaan & Madden, 2005; Glenberg & Robertson, 2000; MacWhinney, 1999; Glenberg, 1997). The starting point is that the structure of the body is very important in that it determines the range of effective actions. Here the term affordances plays a crucial role. Following Gibson (1979), embodied researchers define affordances as the actions suggested by a particular object (e.g., Borghi, 2005). For instance, the affordances of a knife include cutting several kinds of objects, or defending oneself or somebody else, or attacking somebody, but they do not include watering flowers. The range of possible actions suggested by an object also includes new actions that have not been previously performed.

According to embodied approaches, meaning is based on our current interactions or previous experiences of interactions with objects in different kinds of environments. Current or past Body × Environment interactions guide us in how to think about (that is, simulate) the perceptual and action details required by a situation. The most fully developed embodied view of language comprehension is the Indexical Hypothesis of Glenberg (1997; see also Glenberg & Robertson, 1999, 2000). According to this hypothesis, as a sentence is comprehended, its individual words are indexed to perceptual symbols, which are combined to produce a simulation of what the sentence describes. Meaningfulness resides in the knowledge of the possibilities versus limitations of the human body and individual experiences. One main advantage of this approach is that it brings us closer to a solution of the grounding problem (Searle, 1980). On the negative side, embodied theories have been criticized as being too vague, and it is clear that further specification is needed before their merits for language comprehension can be fully assessed.

Findings from cognitive neuroscience have challenged abstract symbol theories. A striking example is the work from Pulvermüller (2005). He investigated the processing of verbs involving actions with different parts of the body (leg, hand, and face). Pulvermüller showed that when participants read words for an action, the motor system becomes active to represent its meaning. More specifically, verbs for head, arm, and leg actions produce head, arm, and leg activations in the respective areas of the motor cortex. These activations, as revealed by magnetoencephalography, took place very fast—that is, within just a few hundred milliseconds. In addition, Tettamanti et al. (2005) have shown that the motor regions of the brain also become active when participants listen to action-related sentences. These results seem inconsistent with one of the main tenets of abstract symbol theories that knowledge in the form of abstract symbols is stored in a semantic network, separate from the brain's modal systems for action, perception, and emotion.

Chwilla, Kolk, and Vissers (2007) used a different approach to test abstract symbol theories against embodied theories of meaning. They recorded ERPs to explore how the brain establishes novel meanings not stored in semantic memory. This was accomplished by presenting one or more context-setting sentences followed by a critical sentence to which ERPs were recorded that described a novel sensible or novel senseless situation. For example, the (context-setting) sentence “The scouts wanted to make music at the campfire” was followed by the critical sentence “The boys searched for branches/bushes [sensible/senseless] with which they went drumming and had a lot of fun.” Novel sensible contexts that were neither associatively nor semantically related were matched to novel senseless contexts in terms of familiarity and semantic similarity by LSA.1 Abstract symbol theories like LSA can only discover meaningfulness by consulting stored symbolic knowledge and therefore cannot explain facilitation for novel sensible situations. This constraint does not hold for embodied theories. Here, what is meaningful or not depends on our knowledge of the possibilities and limitations of our body. Chwilla et al. found facilitation for novel meanings on a language-relevant ERP component that is highly sensitive to semantic processing, the N400 (i.e., a reduction in N400 amplitude for novel sensible compared with novel senseless meanings), and this facilitation was independent of task. Importantly, novel meanings were created on the spot—that is, within the same time window as familiar associative and semantic relationships (300–500 msec). The demonstration of immediate facilitation for novel meanings supports embodied views and calls into question disembodied abstract symbol theories.

The goal of this article was to further test abstract symbol versus embodied theories of meaning against each other by investigating the effects of emotional context on semantic processing. According to abstract symbol theories, processing of word meaning is performed by a central cognitive module—that is, separable from the systems for perception, motor action, and emotion. Therefore, activation of word meaning should be resistant to fluctuations in (emotional) context. In contrast, according to embodied theories, words are meaningful because they are grounded in perception, action, and emotion. In the present article, ERPs were recorded, which provide a continuous record of brain activity and therefore permit insight into the time course with which a participant's emotional state affects semantic processing. This contrasts with previous studies on the emotion-by-language interplay, which recorded RTs in judgment tasks (see below; Havas, Glenberg, & Rinck, 2007). To approximate processes of normal language understanding, in the present study participants did nothing other than what they do in normal life—that is, read for comprehension.

In this ERP study, we attempt to separate the two classes of models by exploring the effects of emotional context, in particular a participant's emotional state, on semantic processing. As proposed by Niedenthal (2007), according to embodied theories perceiving and thinking about emotion involve perceptual, somatovisceral, and motoric reexperiencing (simulation) of the relevant emotion in one's self. If language is grounded in bodily states, then one would predict that emotion affects language. Consistent with this, advocates of embodied theories predict an interaction between emotion and language (Havas et al., 2007; Niedenthal, 2007; Glenberg, Havas, Becker, & Rinck, 2005; Damasio, 1994).

Supporting evidence for this claim comes from Strack, Martin, and Stepper (1988). They investigated the effects of facial expression on the comprehension of cartoons. Facial expression was varied by having participants either hold a pen between their lips, thereby inhibiting the facial muscles used in smiling (zygomaticus major or the risorius muscles), or hold a pen between their teeth in a way that facilitates the contraction of these facial muscles used in smiling. They found a reciprocal relationship between the facial emotional expression and the way in which emotional information was interpreted: participants who were led to smile judged the cartoons as funnier than participants whose smile was blocked. The effectiveness of this procedure developed by Strack et al. to reliably shift the body into a happy versus unhappy state has been well documented (e.g., Soussignan, 2002; Berkowitz & Troccoli, 1990). Hence, Havas et al. (2007) used this procedure to explore the effects of emotional state on the processing of sentences describing pleasant versus unpleasant events. Participants read happy, sad, and angry sentences. In Experiment 1, participants judged whether the sentences were pleasant or unpleasant, whereas in Experiment 2 they judged whether the sentences were easy or hard to comprehend. Regardless of the task demands, participants were faster to read and judge sentences describing a pleasant event (e.g., “Your lover chases you playfully around your bedroom”) when holding the pen between the teeth (smiling) than when holding the pen between the lips (frowning). The opposite pattern was found for sentences describing an unpleasant situation. These findings of Havas et al. fit well with the embodiment claim according to which a full understanding of language about emotional states requires that those emotional states are literally embodied (simulated) during comprehension, thereby recruiting the same neural and bodily mechanisms that are involved during emotional experiences.

The general goal of this article was to contrast embodied versus disembodied theories by investigating whether a participant's emotional state affects on-line semantic processing in language comprehension, as tapped by N400 (Kutas & Hillyard, 1980; for a review, see Kutas & Federmeier, 2000). Relevant for this study, N400 amplitude systematically varies with the degree of expectedness of a word in a given context; the higher the expectancy, the smaller the N400.2 The graded function of N400 amplitude on the basis of expectancy is referred to as the N400 cloze effect. We test the models against each other by investigating whether the N400 cloze effect is modulated by mood. This is examined by inducing different emotional states (a happy mood vs. a sad mood) while participants read sentences, part of which were high-cloze sentences (e.g., “In that library the pupils borrow books…”) and part low-cloze sentences (e.g., “The pillows are stuffed with books…”). In the emotion literature, it has been shown that films are a highly effective means to induce both positive and negative mood states (Westermann, Spies, Stahl, & Hesse, 1996; see Methods). Therefore, in the present study, we manipulated participants' mood by presenting film clips that displayed fragments from either a happy movie or a sad movie.

The rationale is as follows: If we succeed in inducing differences in emotional state, then we can test embodied theories against abstract symbol theories: In particular, if, as proposed by advocates of embodied theories, language is grounded in bodily states, then emotional state should affect on-line language comprehension. The proposed mechanism that gives rise to effects of emotion is simulation of an emotional state. Whether “emotional simulation” impacts on-line semantic processing can be assessed from the N400. If so, this should be reflected by an interaction between mood and the N400 cloze effect. In contrast, as sketched above, according to abstract symbol theories emotional state should not affect semantic processing.

Little is yet known about the mood-by-semantics interface. For the N400, few studies have probed the effects of emotion. With respect to the effects of a word's emotional content, an increase in N400 amplitude was reported to words that were emotionally inconsistent with the prosody (happy or sad voice) of a preceding sentence (Schirmer, Kotz, & Friederici, 2002, 2005) or with the prosody of how a word was pronounced (Schirmer & Kotz, 2003). Some studies have observed a facilitation in semantic processing for pleasant as opposed to unpleasant words (Herbert, Junghofer, & Kissler, 2008; Kiefer, Schuch, Schenck, & Fiedler, 2007).

With regard to the effect of a participant's emotional state, Chung et al. (1996) investigated the effects of a positive versus negative mood (induced by imagination, i.e., by asking subjects to think of happy or sad life events) on the comprehension of short life stories whose final word depicted a positive or negative outcome or was semantically incongruent. N400 amplitude was largest to semantically incongruent words but was also larger to mood-incongruent than mood-congruent words. These results were taken to suggest that mood imposes an emotional constraint on the access of semantic word meaning. In a second study, Federmeier, Kirson, Moreno, and Kutas (2001) studied the effects of a mildly positive mood and a neutral mood on semantic memory organization. To this aim, they compared the N400 to a highly expected exemplar with the N400 to a within-category violation or a between-category violation (e.g., “They wanted to make the hotel look more like a tropical resort. So, along the driveway, they planted rows of palms [expected exemplar]/pines [within-category violation]/tulips [between-category violation].”). A mildly positive mood was induced by presenting pictures of positive items (e.g., cute animals), and a neutral mood was induced by presenting pictures of neutral items (e.g., household objects). The main result was that a mildly positive mood relative to a neutral mood facilitated semantic processing (as reflected by a decrease in N400 amplitude) of distantly related unexpected items (i.e., between-category violations). In a third study, Pratt and Kelly (2008) recorded N400s to positively and negatively valenced words (e.g., love vs. death) while they induced differences in mood by providing (false) positive or negative feedback. An increase in a frontally distributed negativity within the N400 window was reported for positive words relative to negative words, but only when subjects were in a positive mood. However, because of methodological concerns (in particular, the use of a small set of items, frequent stimulus repetition, and the use of one-sided statistical tests), the reliability of the results remains unclear. In sum, previous results suggest that a positive mood may facilitate semantic processing.

In the present article, we investigate the effects of a participant's emotional state on semantic processing by comparing the standard N400 cloze effect in a happy emotional state with that in a sad emotional state. Thus, in contrast to the Federmeier et al. (2001) study, which tested for differences in ERPs between a (mildly) positive mood and a neutral mood, the present study investigated a broader spectrum of the mood continuum by directly comparing the effects of a positive mood with those of a negative mood on the same language materials. This approach provides a more complete picture of the potential effects of emotion/mood on processes of language comprehension and may provide a first step towards exploring the relation between language and emotion with ERPs in patients.

The crucial question of the present article is whether an interaction between emotional state and the N400 cloze probability effect will be observed. Abstract symbol theories do not predict an interaction between language and emotion, as processing of word meaning is performed by a central cognitive module—that is, separable from the systems for perception, motor action, and, most importantly for the present study, emotion. The presence of an interaction between language and emotion, therefore, would challenge abstract symbol theories and support embodied theories of meaning. Importantly, in contrast with previous RT studies on the relation between emotion and language, the use of ERPs allows us to track the time course at which emotion affects semantic processing—as tapped by N400—in real time. The predictions for the ERPs were as follows: With respect to the effects of a positive mood, on the basis of the study of Federmeier et al. (2001) we predict an increase in the N400 effect with a positive mood. To our knowledge, the effects of a negative mood on the comprehension of emotionally neutral sentences by means of N400 have not yet been investigated. Therefore, it is an open question how a negative mood affects on-line semantic processing as tapped by N400.

To provide further evidence that possible changes in N400 amplitude between the happy and the sad mood condition reflect emotional factors, correlation analyses will be computed between the size of the N400 effect and the mean mood ratings. The presence of significant correlations between the size of the N400 effect and the mood ratings would be consistent with the claim that potential changes in N400 amplitude are related to emotional factors.

METHODS

Participants

There were 32 participants (mean age = 22 years, age range = 18–31 years). Recent evidence makes clear that the widespread assumption that subject sex matters little, if at all, in studies on the neurobiology of emotional memory is no longer tenable and, therefore, should be abandoned (Cahill, 2006). In line with this, preliminary evidence suggests that female and male participants process meaning differently in a positive versus neutral mood (Federmeier et al., 2001). In the present study, therefore, we explored the relation between emotion and semantics in women. Other criteria were as follows: Only those participants were selected who reported no drug abuse, no neurological, mental, or chronic bodily diseases, and no medication for any of these. All were native speakers of Dutch, had no reading disabilities, were right-handed, and had normal or corrected-to-normal vision. Hand dominance was assessed with an abridged Dutch version of the Edinburgh Inventory (Oldfield, 1971). Sixteen participants reported the presence of left-handedness in their immediate family. One participant had to be excluded from the analyses because of excessive eye movement artifacts.

Materials

We first constructed 127 simple declarative sentence fragments and used these in a cloze test with 25 subjects to obtain highly expected (“high-cloze”) critical words. Of these 127 sentences, 116 sentences were completed with the same word by 91% of the participants. These were used as the high-cloze context sentence fragments in this study.
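As an illustration, cloze probability is typically computed as the proportion of respondents in the norming study who complete a fragment with a given word. The following minimal sketch uses hypothetical completion data and a hypothetical helper function, not materials or code from the present study.

```python
from collections import Counter

def cloze_probability(completions: list[str]) -> tuple[str, float]:
    """Return the modal completion and its cloze probability, i.e., the
    proportion of respondents who produced that word."""
    counts = Counter(word.strip().lower() for word in completions)
    modal_word, n = counts.most_common(1)[0]
    return modal_word, n / len(completions)

# Hypothetical responses of 25 respondents to one sentence fragment.
responses = ["books"] * 23 + ["novels", "magazines"]
word, prob = cloze_probability(responses)
print(word, prob)  # -> books 0.92, i.e., a high-cloze item
```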

We then created 116 low-cloze context sentences by exchanging the critical word from a high-cloze context fragment with the critical word from another high-cloze context fragment. For example, we exchanged the critical word from “In that library the pupils borrow books to take home” with the critical word from “The pillows are stuffed with feathers which makes them feel soft,” resulting in the following low-cloze fragment “The pillows are stuffed with books which makes them feel hard.” The critical word was always in mid-sentence position. In an earlier study, we showed that this type of sentence reliably elicits a standard N400 cloze effect (Vissers, Chwilla, & Kolk, 2006).

To counteract possible processing strategies, we drew attention away from the critical high- and low-cloze items by constructing an equal proportion of filler items containing nonwords. In these filler items, we substituted the critical high- and low-cloze words by pseudohomophones derived from the correct words. The pseudohomophones were created by changing the spelling of the vowel in the second syllable while keeping the phonology the same. Every noun contained two syllables, and the changed vowels were always in the second part of the word. This yielded a total set of 464 sentences. The four versions of each sentence were counterbalanced across lists. Note that none of the items—neither critical items nor filler items—were repeated. The same holds for the context: Each sentence context was presented only once to a participant, meaning that the high-cloze versus low-cloze version of a sentence was presented to different participants. An additional set of 60 filler sentences was added: 30 correct sentences, 10 sentences with a pseudohomophone at the beginning of the sentence, 10 sentences with a pseudohomophone in the middle of the sentence, and 10 sentences with a pseudohomophone at the end of the sentence.

Procedure

Participants were seated in a closed chamber. Sentences were presented in serial visual presentation mode at the center of a PC monitor. Word duration was 345 msec, and the SOA was 645 msec. Sentence final words were followed by a full stop. The intertrial interval was 2 sec. Words were presented in black capitals on a white background in a 9 × 2-cm window at a viewing distance of approximately 1 m. Each sentence was preceded by a fixation cross (duration 510 msec) followed by a 500-msec blank screen. Participants were instructed to attentively read the sentences.
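For illustration, the word-by-word presentation timing can be sketched as follows. The display call is a hypothetical stand-in (the actual experiment ran on dedicated presentation software with frame-locked timing); only the durations are taken from the text above.

```python
import time

WORD_DURATION = 0.345   # seconds each word remains on screen
SOA           = 0.645   # onset-to-onset interval between successive words
FIXATION      = 0.510   # duration of the fixation cross
BLANK         = 0.500   # blank screen between fixation cross and first word
ITI           = 2.000   # intertrial interval

def show(text: str) -> None:
    """Hypothetical stand-in for drawing `text` at the center of the monitor."""
    print(text)

def present_sentence(words: list[str]) -> None:
    show("+"); time.sleep(FIXATION)
    show("");  time.sleep(BLANK)
    for i, word in enumerate(words):
        # The sentence-final word is followed by a full stop.
        show(word + "." if i == len(words) - 1 else word)
        time.sleep(WORD_DURATION)
        show("")
        time.sleep(SOA - WORD_DURATION)   # blank until the next word's onset
    time.sleep(ITI)
```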

The experimental list was split into four blocks; there was a pause between blocks, during which a film fragment was presented, and each block started with two filler items. Because eye movements distort the EEG recording, participants were trained to make eye movements (e.g., blinks) only during the period in which a prompt was present on the screen (prompt duration was 2295 msec). Prompt offset was followed after 705 msec by a fixation cross, indicating the start of the next trial.

Mood Induction Procedure

Westermann et al. (1996; who compared 11 mood induction procedures [MIPs] by meta-analytic procedures) have shown that films or stories are very effective in inducing positive and negative mood states. The effects are especially large when subjects are explicitly instructed to enter the specified mood state, as was done in the present ERP study. For elated mood, all other MIPs yielded considerably lower effectiveness scores. Because, according to Westermann et al., the film/story MIP is the first choice to elicit both positive and negative mood states, we chose film fragments to induce a positive versus negative mood.

Participants were randomly assigned to the happy mood condition or to the sad mood condition. Immediately before the EEG recording, the MIP was started by presenting the first film clip to induce the intended mood. To this aim, film clips were presented from either a happy movie or a sad movie. The happy movie fragments were cut from the Warner Brothers' movie “Happy Feet”; the sad movie fragments were cut from the movie “Sophie's Choice” (a drama depicting the plight of a Polish woman during the Second World War). The film clips and the mood ratings were presented on the same PC monitor used for the presentation of the language materials. The participants' mood was manipulated by presenting film clips that displayed sequential fragments from either the happy movie or the sad movie. A pilot study indicated that these two movies are effective in inducing positive and negative mood, respectively, in the first-year psychology student population of the Radboud University Nijmegen, which forms the subject pool for the EEG study. The length of the film clips varied between 2.48 and 7.27 min, with a mean length of 5.41 min for the happy mood condition and 5.47 min for the sad mood condition. In addition to the first film clip presented before the experiment, three film clips were presented between the blocks of experimental sentences. The aim of these film clips was to maintain the intended mood throughout the experiment. To check the effectiveness of the MIP, participants were asked before and after each of the film fragments to rate their mood on a rating scale (ranging from −10 = extremely sad, to 0 = neutral, to +10 = extremely happy) that appeared on the screen. Participants indicated their mood rating by moving a cursor with the keyboard to the level that corresponded to their current emotional state.

EEG Data Acquisition and Analysis

The EEG was recorded with 27 tin electrodes mounted in an elastic electrode cap (Electrocap International, Eaton, OH) from 5 midline sites and 22 lateral sites (see Figure 5 for the montage). The left mastoid served as reference. Electrode impedance was less than 3 kΩ. The EOG was recorded bipolarly; the vertical EOG was recorded from electrodes placed above and below the right eye, and the horizontal EOG was recorded via a right-to-left canthal montage. The signals were amplified (time constant = 8 sec, band-pass filter = 0.02–30 Hz) and digitized on-line at 200 Hz. Presentation of stimuli and recording of performance data were accomplished by a Macintosh computer.

Before analyzing EEG and EOG, records were examined for artifacts and for excessive EOG amplitude (>100 μV) in the epoch from 100 msec before the onset of the critical noun (the high-cloze or low-cloze word) to 1000 msec after its onset. Averages were aligned to a 100-msec baseline period preceding the critical letter string. On the basis of the ERP literature, mean amplitudes in the 300- to 500-msec epoch after critical word onset were taken as the N400 window (e.g., Federmeier, Mai, & Kutas, 2005).
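A minimal sketch of the single-trial processing described above (epoching, EOG-based artifact rejection, baseline correction, and extraction of the mean N400 amplitude) is given below. The array shapes and helper names are illustrative assumptions, not the original analysis code.

```python
import numpy as np

FS = 200                 # sampling rate in Hz (see recording parameters above)
PRE, POST = 0.1, 1.0     # epoch from 100 msec before to 1000 msec after word onset
N400_WINDOW = (0.3, 0.5) # N400 measurement window in seconds after word onset
EOG_CUTOFF = 100.0       # rejection threshold in microvolts

def epoch(signal: np.ndarray, onset_sample: int) -> np.ndarray:
    """Cut a single-channel epoch around the onset of the critical word."""
    return signal[onset_sample - int(PRE * FS): onset_sample + int(POST * FS)]

def baseline_correct(ep: np.ndarray) -> np.ndarray:
    """Subtract the mean of the 100-msec prestimulus baseline."""
    return ep - ep[: int(PRE * FS)].mean()

def n400_mean_amplitude(ep: np.ndarray, eog: np.ndarray) -> float | None:
    """Return the mean 300-500 msec amplitude, or None if the trial is rejected."""
    if np.abs(eog).max() > EOG_CUTOFF:
        return None                      # excessive EOG: reject the trial
    ep = baseline_correct(ep)
    start = int((PRE + N400_WINDOW[0]) * FS)
    stop = int((PRE + N400_WINDOW[1]) * FS)
    return float(ep[start:stop].mean())
```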

Repeated measures ANOVAs were conducted separately for the midline sites and for the lateral sites, with cloze probability (high vs. low) as a within-subjects factor and mood (positive vs. negative) as a between-subjects factor. Mood was manipulated between subjects for the following reasons: First, on the basis of our pilot study, we were skeptical about switching a positive versus negative mood on and off within a single recording session. Second, although we could have invited the same subjects to a second recording session, this would have resulted in repetition of the critical stimulus materials. As it can take a long time before stimulus repetition effects go away (Cave, 1997) and given that ERPs in the time range of N400 are sensitive to stimulus repetition (e.g., Rugg, 1985), we preferred not to present the language materials twice.

The midline analyses included the additional factor site (Fza, Fz, Cz, Pz, Oz). The lateral analysis included the additional factors of hemisphere and site (left hemisphere sites: F7a, F3a, F7, F3, LAT, LT, LTP, P3, P3p, T5, OL; right hemisphere sites: F8a, F4a, F8, F4, RAT, RT, RTP, P4, P4p, T6, OR). The estimated Greenhouse–Geisser coefficient epsilon was used to correct for violations of the assumption of sphericity. All reported p values are based on corrected degrees of freedom.
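For illustration, a simplified version of the mixed design (cloze as within-subjects factor, mood as between-subjects factor, amplitudes averaged over sites) could be analyzed as sketched below, assuming the pingouin statistics package is available. The data here are random placeholders; the published analyses additionally include the factors site and hemisphere and apply the Greenhouse–Geisser correction.

```python
import numpy as np
import pandas as pd
import pingouin as pg   # assumption: pingouin is available for the mixed ANOVA

rng = np.random.default_rng(0)
subjects = [f"s{i:02d}" for i in range(31)]       # 31 analyzed participants
moods = ["happy"] * 15 + ["sad"] * 16             # between-subjects assignment

rows = []
for subj, mood in zip(subjects, moods):
    for cloze in ("high", "low"):
        # Placeholder values; in the real analysis these are each subject's
        # mean 300-500 msec amplitudes per condition, averaged over sites.
        rows.append({"subject": subj, "mood": mood, "cloze": cloze,
                     "amplitude": rng.normal(0.0, 1.0)})
df = pd.DataFrame(rows)

aov = pg.mixed_anova(data=df, dv="amplitude", within="cloze",
                     subject="subject", between="mood")
print(aov)   # the "Interaction" row tests the Cloze x Mood effect
```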

RESULTS

Effectiveness of the MIP

No differences in mood scores between the two groups of participants were present before the film clips were presented (baseline measurement). As Figure 1 shows, and as supported by the statistical analyses, the MIP effectively induced the intended mood: After the film clips, participants who watched the happy clips reported significantly happier mood than participants who watched the sad clips (p < .05), and, conversely, participants who watched the sad clips reported significantly sadder mood than those who watched the happy clips (p < .05). Note that a significant difference in mood scores between the happy and the sad mood condition was present after each of the film clips.

Figure 1. 

Mean mood rating scores ranging from −10 (extremely sad) to +10 (extremely happy) for the three film fragments comprising the MIP separately for the participants assigned to the two mood conditions (happy mood condition vs. sad mood condition).

ERPs

The waveforms are presented separately for each mood because the analyses (below) revealed an interaction between cloze and mood. The grand mean ERPs for the happy mood condition for the high-cloze and the low-cloze items are presented in Figure 2, and those for the sad mood condition are presented in Figure 3. The critical words elicited the early ERP response characteristic of visual stimuli—that is, an N1 followed by a P2, which at occipital sites was preceded by a P1. These early components were followed by a broad negative-going wave peaking at about 400 msec, the N400. Inspection of the waveforms for the happy mood condition suggests the presence of a large and widely distributed N400 cloze effect. The N400 effect seems to show an early onset (between 200 and 250 msec). Inspection of the waveforms for the sad mood condition suggests the presence of a small N400 cloze effect for a limited set of electrodes only (over the midline and right hemisphere). In addition, in the sad mood condition, the N400 seemed to be followed by a slow positive-going wave with larger (more positive) amplitudes for low-cloze than high-cloze items.

Figure 2. 

Grand ERP averages and topographical maps for the happy mood condition, time locked to the onset of the critical noun superimposed for the two levels of cloze probability (high, low) for all midline sites and a representative subset of lateral sites. The dashed rectangles indicate the time window (300–500 msec) in which N400 was measured. Negativity is plotted upward. The topographical maps obtained by interpolation from 27 sites are displayed for the N400 window and the time window capturing the late positivity (600–800 msec). Maps were computed from values resulting from the subtraction of the waveforms of the high-cloze condition from those of the low-cloze condition.

Figure 3. 

Grand ERP averages and topographical maps for the sad mood condition, time locked to the onset of the critical noun superimposed for the two levels of cloze probability for all midline sites and a representative subset of lateral sites. The dashed rectangles indicate the time window (300–500 msec) in which N400 was measured. Negativity is plotted upward. The topographical maps obtained by interpolation from 27 sites are displayed for the N400 window and the time window capturing the late positivity (600–800 msec). Maps were computed from values resulting from the subtraction of the waveforms of the high-cloze condition from those of the low-cloze condition.

N400 Window (300–500 msec)

The analyses for the midline and for the lateral sites yielded main effects of cloze, F(1, 29) = 51.22, p < .001, and F(1, 29) = 40.66, p < .001, respectively, indicating that mean N400 amplitude was more positive for high-cloze items than for low-cloze items.

No effects of mood were found, both Fs < 1. Most importantly, two-way Cloze × Mood interactions were obtained (midline sites: F(1, 29) = 10.79, p < .001, lateral sites: F(1, 29) = 5.25, p < .03). On the basis of these interactions, separate analyses were performed for the different mood conditions.

Happy Mood Condition

For the happy mood condition the ANOVA for the midline sites revealed an effect of cloze, F(1, 14) = 18.48, p < .001. No Cloze × Site interaction was obtained, p > .14, indicating that the N400 cloze effect for the happy mood condition was broadly distributed across the midline. The ANOVA for the lateral sites yielded a main effect of cloze, F(1, 14) = 33.98, p < .001, reflecting that N400 amplitude was smaller for high-cloze relative to low-cloze items (1.25 vs. −0.79 μV). Apart from an interaction between hemisphere and site, F(10, 140) = 4.30, p < .01, which reflected overall differences in amplitude across sites and/or hemisphere, no interactions with cloze were present. The absence of interactions of cloze with site and/or hemisphere supports a broad bilateral scalp distribution of the N400 effect in the happy mood condition.

Sad Mood Condition

For the sad mood condition, the ANOVA for the midline sites did not yield an effect of cloze, F(1, 15) = 1.92, p = .185. Although a Cloze × Site interaction was found, F(4, 60) = 4.95, p < .05, follow-up analyses for the single sites did not yield reliable N400 effects. The ANOVA for the lateral sites revealed a main effect of cloze, F(1, 15) = 9.23, p < .01. In addition, an interaction between cloze and hemisphere, F(1, 15) = 9.37, p < .01, and an interaction of these two factors with site, F(10, 150) = 5.90, p < .002, were found. Supplementary analyses for the two hemispheres yielded the following pattern: The ANOVA for the right hemisphere revealed a main effect of cloze, F(1, 15) = 13.28, p < .01. No other effects or interactions were observed, indicating that the N400 effect was widespread across the right hemisphere. The analysis for the left hemisphere did not yield an effect of cloze, F < 2.6, but did yield a Cloze × Site interaction, F(10, 150) = 3.34, p < .05. Follow-up single-site analyses revealed an N400 effect at the left occipital and temporal sites (OL and T5: ps < .03).

Early Differences (200–300 msec)

To check the reliability of possible early ERP differences, analyses were carried out on the mean amplitude in the 200- to 300-msec window. These analyses disclosed main effects of cloze (midline sites: F(1, 29) = 24.96, p < .001; lateral sites: F(1, 29) = 16.38, p < .001) and Mood × Cloze interactions (midline sites: F(1, 29) = 8.13, p < .01; lateral sites: F(1, 29) = 7.71, p < .02). Separate analyses for the happy mood condition revealed effects of cloze probability (midline sites: F(1, 14) = 30.25, p < .001; lateral sites: F(1, 14) = 22.95, p < .001) in the absence of other effects or interactions. Separate analyses for the sad mood condition did not yield a reliable effect of cloze (midline sites: F < 2.35; lateral sites: F < 1.1). However, an interaction of cloze with hemisphere, F(1, 15) = 6.76, p < .05, reflected a cloze effect for the right hemisphere, F(1, 15) = 5.58, p < .04, but not for the left hemisphere, F < 1. The cloze effects in the early window indicate effects in the same direction as the standard N400 cloze effect, which often starts around 250 msec. Hence, these effects are taken to reflect an early onset of the N400 effect for the happy and for the sad mood conditions.

Late Positivity (600–800 msec)

To test for differences in late positivity as a function of mood, analyses were performed for the 600- to 800-msec window. The analyses disclosed a cloze effect for the midline sites, F(1, 29) = 9.97, p < .01, indicating that mean amplitudes were more positive to low-cloze than high-cloze items (3.44 vs. 2.30 μV). The direction of the effect is thus opposite to that of the N400 effect. For the lateral sites, there was no effect of cloze, F < 1.5; only a trend towards a Cloze × Mood interaction was found, F(1, 29) = 3.26, p < .09. Supplementary analyses were performed for smaller time windows (600–700 and 700–800 msec) to check for the potential presence of a reliable interaction between mood and cloze. For the 600- to 700-msec window, no main effects or interactions were found, all Fs < 2. For the 700- to 800-msec window, the midline sites showed a main effect of cloze, F(1, 29) = 8.02, p < .01, and an interaction of cloze with mood, F(1, 29) = 4.40, p < .05. Follow-up analyses revealed a cloze effect for the sad mood condition, F(1, 15) = 16.35, p < .002, but not for the happy mood condition, F < 1.3. The lateral analysis for the 700- to 800-msec window revealed a Mood × Cloze × Hemisphere interaction, F(1, 29) = 4.52, p < .05. Separate analyses for the two levels of mood and the left versus right hemisphere indicated (a) that no differences were present for the happy mood condition and (b) that for the sad mood condition a cloze effect was present for the left hemisphere, F(1, 15) = 7.86, p < .02, but not for the right hemisphere, F < 0.5. For the sad mood condition, Cloze × Site interactions were found for the left hemisphere, F(10, 150) = 3.13, p < .04, and for the right hemisphere, F(10, 150) = 10.34, p < .01. Single-site analyses revealed differences at temporal and parietal electrodes of the left hemisphere (LTP, P3, P3p, and T5: ps < .05). For the right hemisphere, a difference in the same direction was present for two posterior sites (P4 and P4p: ps < .05).3

Correlation Analyses

Throughout this article, we assume that the Mood × N400 Cloze interactions reflect emotional factors. To test whether the modulations in N400 amplitude between the two mood conditions are accompanied by changes in emotional state, supplementary Pearson correlations were computed with the size of the N400 effect and mean mood rating (computed over three mood ratings per subject) as variables. The size of the N400 effect was computed by subtracting N400 amplitude in the high-cloze condition from that in the low-cloze condition; this difference score was computed for every single electrode and for every subject. The Bonferroni correction was used to adjust the p values for multiple (27 electrode sites) comparisons (Sankoh, Huque, & Dubey, 1997). The new alpha level was p = .0077. As Figure 4 shows, the correlation analyses for the happy mood condition revealed significant correlations (rs between −.65 and −.79) between the size of the N400 effect and the mood ratings for all central and posterior electrodes except for a single site (T5). These analyses show that the happier the mood, the larger the N400 effect. With correlations at central/posterior sites ranging from −.65 to −.79, between 42% and 62% of the variance in the size of the N400 effect in the happy mood condition is shared with variation in emotional state. The correlation analyses for the sad mood condition did not disclose significant correlations between the size of the N400 effect and the mood ratings.
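The correlation procedure can be sketched as follows. The arrays below are random placeholders standing in for the per-subject N400 difference scores and mean mood ratings; the adjusted alpha of .0077 is taken from the text, as the Sankoh et al. (1997) procedure corrects for the correlation among electrode sites rather than applying a plain Bonferroni division.

```python
import numpy as np
from scipy.stats import pearsonr

N_SUBJECTS, N_SITES = 15, 27
ALPHA_ADJUSTED = 0.0077   # adjusted alpha reported above (Sankoh et al., 1997)

rng = np.random.default_rng(1)
# Placeholder data: N400 effect (low-cloze minus high-cloze mean amplitude)
# per subject and electrode, and each subject's mean of the three mood ratings.
n400_effect = rng.normal(-2.0, 1.0, size=(N_SUBJECTS, N_SITES))
mood_rating = rng.uniform(0.0, 10.0, size=N_SUBJECTS)

for site in range(N_SITES):
    r, p = pearsonr(mood_rating, n400_effect[:, site])
    flag = "significant" if p < ALPHA_ADJUSTED else "n.s."
    print(f"site {site:2d}: r = {r:+.2f}, p = {p:.4f} ({flag})")
```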

Figure 4. 

Pearson correlations between the size of the N400 effect and the mood ratings for the happy mood condition, displayed on an idealized head (viewed from above, nose at the top). Electrodes that showed significant r values are displayed in gray.

Figure 5. 

The size of the N400 effect (computed by subtracting N400 amplitude in the high-cloze condition from that in the low-cloze condition) was correlated with the mean mood ratings. The scatter diagram shows a strong relation between the N400 effect of cloze probability and emotional state at the vertex (r = −.785, p = .001). The best-fitting regression line is also plotted.

Correlation analyses were also performed for the late positivity to check whether changes in amplitude were accompanied by changes in mood. The Bonferroni-corrected p value for multiple comparisons was p = .0081. For the happy mood condition, no significant correlations were found. The analyses for the sad mood condition, however, revealed significant correlations (rs between .63 and .67) between the amplitude of the late positivity and the mood ratings at the frontal midline site (Fz: r = .65, p < .007) and one posterior site of the right hemisphere (RTP: r = .67, p < .004). These analyses reveal that the less sad the participant, the larger the late positive effect.

DISCUSSION

The goal of this article was to test abstract symbol theories against embodied theories of meaning by investigating the effects of emotional context on semantic processing. To this aim, we induced a happy mood versus a sad mood and investigated whether a participant's emotional state affects processes of language comprehension, in particular on-line semantic processing as reflected by N400. The predictions for the two classes of theories are as follows: On the classical, abstract symbol view, processing of word meaning is a modular process par excellence (Pinker, 1997; Fodor, 1975). Hence, context should not affect the representation of word meaning. This implies that a participant's emotional state should not have any impact on processing of word meaning. In contrast, according to embodied views, the process of language understanding is inherently perceptual (e.g., Barsalou, 1999, 2008; Glenberg et al., 2005). Words are meaningful because they are grounded in perception, action, and emotion. In line with this, embodied researchers propose that language and emotion interact (e.g., Havas et al., 2007). If indeed language is grounded in internal bodily states, then a participant's emotional state should affect semantic processing in language comprehension.

In this article, we tested embodied views against abstract symbol views of meaning by exploring whether the well-known N400 effect of cloze probability is modulated by emotional context. The results on this point are clear-cut: The ERP results reveal that a participant's emotional state modulates semantic processing as tapped by N400. This was indicated by interactions between mood and the N400 cloze effect, both for the midline sites and for the lateral sites. These interactions reflect a strong reduction of the N400 cloze effect for the sad mood condition compared with the happy mood condition. In particular, the analyses for the midline sites revealed the presence of a clear N400 effect for the happy mood condition but the absence of an N400 effect for the sad mood condition. The absence of an N400 effect across the midline for the sad mood condition is remarkable given that the centroparietal midline sites (Cz and Pz) typically show profound N400 cloze effects. The interaction between mood and cloze probability for the lateral sites indicated that the N400 cloze effect for the happy mood condition showed a widespread bilateral distribution, including anterior, central, temporal, posterior, and occipital sites. In contrast, for the sad mood condition, an N400 cloze effect was limited to the right hemisphere and two sites of the left hemisphere. In other words, a sad mood led to a disappearance of the N400 effect at the midline as well as a strong reduction in the number of electrodes of the left hemisphere that showed N400 effects. Whereas in the happy mood condition all sites except F7 showed N400 effects, in the sad mood condition an N400 effect was only present at the left occipital (OL) and temporal (T5) sites.

The present ERP results have implications for current theories on the representation of word meaning. In particular, the modulation of on-line semantic processes by a participant's emotional state—as demonstrated by the mood by N400 cloze interactions—supports embodied views of meaning and challenges abstract symbol views of meaning. With regard to the effects of a positive mood, the present N400 results accord well with those of Federmeier et al. (2001), who reported a facilitation of semantic processing for a (mildly) positive as opposed to a neutral mood. Importantly, the results of the correlation analyses demonstrate that the modulation in N400 amplitude reported in the present study was accompanied by significant changes in mood ratings. In particular, for the happy mood condition, there was a strong correlation between the size of the N400 cloze effect and the mood scores, with an increase in the N400 cloze effect with increasing happiness. Of interest here is that all central and posterior sites that typically show large N400 semantic context effects showed significant correlations (see Figure 4). For the sad mood condition, in which a general decrease in N400 effect was found, no significant correlations between the size of the N400 cloze effect and the mood scores were observed at central-posterior sites. Inspection of the ERP scatter plots for the different electrodes of individual participants in the sad mood condition revealed that several subjects showed reversed N400 effects. These reversed effects led to a restriction of the range on which the correlation analyses were based compared with the happy mood condition (where no reversed effects were present). It is very likely that the restricted range in N400 effects for the sad mood condition resulted in lower correlation coefficients.

Possible Contribution of More General Processes Like Attention and/or Motivation

The proposal that processing of word meaning is affected by emotional context is new and at odds with the traditional view that processing of word meaning is performed by a central cognitive module. A relevant question that has to be addressed is whether the present N400 Mood × Cloze Probability interaction could be accounted for by other, more general processes like attention and/or motivation. For instance, it could be argued that the N400 modulation reflects that participants in a sad mood are preoccupied with the drama presented in the film and, hence, do not process the sentences for meaning. Although we cannot rule out with certainty that more general processes, in addition to the factor mood, may have contributed to the N400 pattern reported in this article, we consider this possibility rather unlikely for the following reasons:

First, if participants had not attended to the meaning of the sentences after watching the sad film clips, then no N400 effect should have occurred in this condition. However, the data prove otherwise: Although the N400 effect for the sad participants was reliably reduced compared with the happy mood condition, a significant N400 effect was still obtained for the right hemisphere and two electrodes of the left hemisphere. The very presence of an N400 effect in the sad mood condition therefore speaks against the argument that sad participants simply did not pay attention to the critical materials. The question remains whether the reduction in N400 effect for the sad as opposed to the happy mood may reflect that sad participants paid less attention to the sentences than the happy participants. The correlation analyses show that the N400 modulation as a function of mood was accompanied by significant changes in mood ratings. Whether the N400 modulation was also accompanied by changes in attention and/or motivation is an empirical question. Future studies are needed to tease apart the effects of attention, motivation, and mood. One way to explore the relation between attention and mood, for example, would be to manipulate the factor attention in addition to the factor mood (happy vs. sad, induced by film fragments), for instance by comparing a selective attention task with a divided attention task. In this way, the effect of mood and the effect of attention can be explored separately.

A second reason why we consider it unlikely that the present N400 Mood × Cloze Probability interaction reflects more general factors like attention and/or motivation comes from the emotion literature. In the emotion literature, it is generally agreed that differences in mood do not lead to quantitative differences in cognitive processing or to a general attenuation of cognitive processes. Rather, there is firm evidence that differences in mood lead to qualitatively different strategies and that mood-dependent processing styles exist; for example, happy people are inclined toward global, category-level processing of information, whereas sad people are inclined toward local, item-specific processing (e.g., Schwarz, 2002). Hence, sad moods do not lead to less cognitive processing (e.g., decrease in attention and/or motivation) and happy moods do not lead to more cognitive processing (e.g., increase in attention and/or motivation). Instead, mood induction leads to qualitative differences in cognitive processing (e.g., from a top–down processing style in happy mood to a bottom–up processing style in sad mood). On the basis of the emotion literature, therefore, we propose that the reduction of the N400 effect in sad people does not indicate a general reduction/attenuation of cognitive processes (e.g., a decrease in attention and/or motivation) but rather reflects a qualitative change in processing strategy (from the use of heuristics in a happy mood to more detailed processing in a sad mood).

Nature of the Simulation Process

According to embodied theories, language understanding is grounded in a participant's bodily state. As proposed by Niedenthal (2007), perceiving and thinking about emotion involve perceptual, somatovisceral, and motoric reexperiencing (simulation) of the relevant emotion in one's self. For the domain of language, it has been put forward that there is a close relation between emotion on the one hand and the way in which we understand emotional language on the other hand (Havas et al., 2007). The results of the present study reveal that emotional context has an immediate impact on semantic processes involved in language comprehension. The fact that we demonstrate an on-line modulation of semantic processing by mood empirically supports the existence of an interaction between a person's emotional state and semantics. The mechanism proposed to mediate effects of emotion on language is simulation of an emotional state.

The mental simulation process lies at the heart of embodied approaches to cognition in general and language comprehension in particular. It could be argued that this simulation process represents some kind of strategy that is under attentional control and—in contrast to symbolic processing, which is automatic—is of a more controlled nature. The nature of the simulation process has been investigated by Chwilla et al. (2007). They tested the automaticity of the N400 effect to novel meanings by varying task. In Experiment 1, participants judged the sensibility of sentences, whereas in Experiment 2, participants did nothing other than read for comprehension. The basic idea was that automatic simulation should occur regardless of task. Novel sensible contexts compared with novel senseless contexts (e.g., to paddle with Frisbees vs. to paddle with pullovers) elicited an N400 effect. This N400 effect generalized from the judgment task, in which the sensibility of the action was at the center of attention, to normal reading. Crucially, the N400 effect to novel meanings occurred in the same time frame (300–500 msec) as the classical N400 effect to semantic relations. This shows that novel meanings are immediately established. Regarding the nature of the simulation process, both the immediacy and the task independence of the N400 effect to novel meanings support an automatic nature of the mental simulation process.

Mood-dependent Processing Styles

As sketched above, it is generally agreed upon in the emotion literature that mood-dependent processing styles exist. A positive mood leads to greater cognitive flexibility and a broader focus, relying less on the details of a situation and more on top–down, schematic processing (Fredrickson, 1998). It validates accessible cognitions and leads to a more global, category-level processing of information, that is, relating incoming information to what is already known on the basis of our world knowledge, including schemas and stereotypes (e.g., Gasper & Clore, 2002; Kimchi & Palmer, 1982). This processing strategy fits well with the facilitation of semantic processing in happy people, as reflected by strong N400 context effects to high-cloze sentences that represent highly familiar scenarios based on our world knowledge (e.g., "borrowing books in a library" or "picking flowers in a meadow"; for empirical support of N400 sensitivity to familiar forms of world knowledge, see Chwilla & Kolk, 2005).

Negative mood, in contrast, seems to focus our attention more narrowly on the specific details of a situation: It invalidates accessible cognitions and fosters local, item-specific processing (e.g., Schwarz, 2002). In a recent review on the effects of emotion on cognition, Clore and Huntsinger (2007) propose that many famous phenomena in cognitive science, such as semantic priming, the global superiority effect, and false memories, occur when people are in a positive mood but do not arise, or occur in reduced form, when people are in a negative mood. An important conclusion from the present article is that the reduction of the standard N400 cloze effect in a sad mood supports the proposal that standard cognitive phenomena mainly occur in a happy mood. Moreover, this N400 reduction fits well with the proposal described above that people in a sad mood are less open to accessible cognitions about what usually happens in the world around us.

The ERP data reveal a second difference in ERP pattern as a function of a participant's emotional state. A modulation of a late positivity (quantified in the 600- to 800-msec epoch), with larger, more positive-going amplitudes for low-cloze than for high-cloze items, was observed only for the sad mood condition and not for the happy mood condition. In the domain of language, positivities in this latency range are typically taken to reflect processes of reanalysis (e.g., Vissers, Kolk, & Chwilla, 2008; Kuperberg, 2007). It is not yet clear what the difference in late positivity reflects. One may argue that this late positive effect reflects a delay in processing the cloze manipulation for the sad relative to the happy group. However, it has to be pointed out that although the N400 effect for the sad group was significantly reduced, as reflected by the absence of an N400 cloze effect at the midline sites and most electrodes of the left hemisphere, a reliable N400 effect of cloze was still obtained for the right hemisphere and two left posterior electrodes. This residual N400 cloze effect reveals that participants in the sad mood condition did not simply miss the meanings of the words comprising the low- versus high-cloze sentences but processed the words for meaning. Therefore, one could speculate that the presence versus absence of the late positive effect reflects differences in reanalysis between groups. Specifically, the presence of a late positive effect only in the sad mood condition could indicate that sad people are more inclined than happy people to reanalyze low-cloze sentences, which represent implausible scenarios (e.g., that pillows are stuffed with books). This speculation would fit with a basic finding from the emotion literature that sad people are biased toward local, detail-oriented processing, whereas happy people are more inclined to top–down processing.
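To illustrate the logic of such a group difference in an ERP effect, the sketch below (entirely hypothetical numbers; the study itself used repeated-measures analyses) computes a per-subject cloze effect in an assumed 600–800 msec window and compares it between a sad and a happy group; a reliable between-group difference in this within-subject effect corresponds to a Mood × Cloze interaction:

```python
# Illustrative sketch with invented numbers (not the reported analysis): compare the
# per-subject late-positivity cloze effect (low minus high cloze, 600-800 msec window)
# between a sad and a happy group; a group difference in this effect is the interaction.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Assumed per-subject mean amplitudes (microvolts) in the 600-800 msec window.
sad_low = rng.normal(3.0, 1.0, size=16)
sad_high = rng.normal(1.5, 1.0, size=16)
happy_low = rng.normal(2.0, 1.0, size=16)
happy_high = rng.normal(1.9, 1.0, size=16)

sad_effect = sad_low - sad_high        # late positive cloze effect per sad participant
happy_effect = happy_low - happy_high  # late positive cloze effect per happy participant

t_value, p_value = stats.ttest_ind(sad_effect, happy_effect)
print(f"Mood x Cloze interaction (between-group t test): t = {t_value:.2f}, p = {p_value:.3f}")
```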

Several studies in the literature have reported modulations of late positivities for emotional language materials (e.g., Herbert et al., 2008). Kanske and Kotz (2007) observed an increase in a late positivity to negative stimuli compared with positive stimuli, even when the attentional and arousing properties of the stimuli were controlled for. In a recent study, Holt, Lynn, and Kuperberg (2009) reported an increase in a late positivity for both negative and positive words relative to neutral ones, with larger amplitudes to negative than to positive words. They interpreted these results within a more general hypothesis about affective processing, namely, the "negativity bias," which evolved from social psychology. According to this hypothesis, negatively valenced information yields more complex representations and requires more cognitive processing than positively valenced information (e.g., Dahl, 2001). In the Holt et al. study, the emotional significance of the language materials led to continued processing, accompanied by modulations in late positivity. In the present article, by contrast, it is the background emotional state one is in that matters. Future studies are needed to specify whether, and if so how, these late positivities are related and what the functional significance of these effects is.

Hemispheric Specialization

On the basis of visual half-field studies, Kutas and Federmeier (2000) have proposed that the two hemispheres play different roles in on-line semantic processing. Empirical support for this claim comes from a study in which they compared the N400 to a highly expected exemplar with the N400 to a within-category violation or a between-category violation (same materials as used in the study of Federmeier et al., 2001; e.g., "They wanted to make the hotel look more like a tropical resort. So, along the driveway they planted rows of palms" [expected exemplar], "pines" [within-category violation], "tulips" [between-category violation]). The critical word was flashed to the right visual field (input to the left hemisphere) or to the left visual field (input to the right hemisphere). For the left hemisphere, a significant N400 difference was found for the within-category violation compared with the between-category violation. No such difference was present for the right hemisphere. On the basis of these results, Kutas and Federmeier (2000) argued that the left hemisphere capitalizes on the organization of semantic memory to preactivate the meaning of forthcoming words, even if this strategy fails at times. The left hemisphere is thus biased toward prediction, thereby enabling comprehenders to process language at the speed and with the high efficiency characteristic of normal language use. On the other hand, based on the absence of an N400 amplitude difference between violations for the right hemisphere, they proposed that this hemisphere exploits a plausibility-based integration strategy. In the present article, differences in scalp distribution were observed with mood. Whereas the happy mood condition yielded a widely, bilaterally distributed N400 effect, the N400 effect in the sad mood condition was mainly restricted to the right hemisphere. At face value, the bilateral, widely distributed N400 effect for the happy mood condition seems to indicate that happy people use both a predictive and an integration strategy. The right hemisphere dominance of the N400 effect for the sad mood condition, on the other hand, suggests that sad people mainly exploit an integration strategy and refrain from making predictions about what comes next.

In this article, we explored the effects of a participant's emotional state on language comprehension. In line with embodied views, mood is a psychological state that involves multiple systems, including the cognitive system, the perceptual system, the action system, and the somatovisceral system. The information streams from all of these systems converge on a single multimodal representation; when one is happy or sad, the happiness or sadness is evident from multiple interacting systems. As the present N400 data show, mood enters language understanding through the information embodied in affective feelings via these multiple systems.

Conclusion

The interaction between a person's emotional state and semantics reported in this article provides further support for embodied views of meaning and challenges abstract symbol theories of meaning. The N400 data are consistent with the proposal of Havas et al. (2007) that language can be grounded in the bodily states that comprise emotions. At a more general level, the ERP results accord well with a recent shift in cognitive science and the field of language comprehension from modular to (more) interactive processing models. Several investigators have provided evidence for interactions between language and action and between language and perception. The interplay between emotion and language, on the other hand, has received little attention. The challenge for future studies will be to further our understanding of the workings and nature of the mechanism(s) that mediate the effects of emotional factors on language.

Acknowledgments

The authors thank the reviewers for constructive comments on this article. They are grateful to Judith Kroll and Ardi Roelofs for positive feedback on a previous draft of this article and thank Uli Chwilla for preparing all the figures. They also thank the technical group of the Donders Centre for Cognition for their assistance and Rinske de Graaf-Stoffers for statistical advice. This article is dedicated to all those people who invest their life in trying to make other people happy. A wonderful example is Ilse (Bobo) Chwilla.

Reprint requests should be sent to Dorothee J. Chwilla, Radboud University Nijmegen, Donders Institute for Brain, Cognition and Behaviour, P.O. Box 9104, 6500 HE Nijmegen, The Netherlands, or via e-mail: d.chwilla@donders.ru.nl.

Notes

1. For a comparison of measures of semantic memory and how they can be used to separate different automatic priming models, the reader is referred to Chwilla and Kolk (2002) for a reaction time study and to Chwilla, Kolk, and Mulder (2000) for an ERP study.

2. The amount of contextual constraint imposed by a sentence fragment can be determined by a cloze probability test. The cloze probability refers to the proportion of participants who fill in a particular word as the best completion of a sentence fragment.
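As an illustration of this definition, the short sketch below (with invented completion responses, not data from this study) computes cloze probabilities as response proportions for a single sentence fragment:

```python
# Illustrative computation of cloze probability: the proportion of participants who
# complete a sentence fragment with a given word. The responses below are invented.
from collections import Counter

responses = ["books", "books", "novels", "books", "magazines",
             "books", "books", "novels", "books", "books"]

counts = Counter(response.lower() for response in responses)
cloze_probability = {word: n / len(responses) for word, n in counts.items()}

print(cloze_probability["books"])  # 0.7 -> "books" would count as a high-cloze completion
```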

3. Note that an effect in the opposite direction (more positive amplitudes for high-cloze than for low-cloze items) was found at three anterior sites (F8a, F8, and RAT: ps < .05).

REFERENCES

Anderson, J. R., Matessa, M., & Lebiere, C. (1997). ACT-R: A theory of higher level cognition and its relation to visual attention. Human–Computer Interaction, 12, 439–462.

Barsalou, L. W. (1999). Perceptual symbol systems. Behavioral and Brain Sciences, 22, 577–660.

Barsalou, L. W. (2008). Grounded cognition. Annual Review of Psychology, 59, 617–645.

Berkowitz, L., & Troccoli, B. T. (1990). Feelings, direction of attention, and expressed evaluations of others. Cognition and Emotion, 4, 305–325.

Borghi, A. M. (2005). Object concepts and action. In D. Pecher & R. A. Zwaan (Eds.), Grounding cognition: The role of perception and action in memory, language, and thinking (pp. 8–34). Cambridge, UK: Cambridge University Press.

Cahill, L. (2006). Why sex matters for neuroscience. Nature Reviews Neuroscience, 7, 477–484.

Cave, C. B. (1997). Very long-lasting priming in picture naming. Psychological Science, 8, 322–325.

Chung, G., Tucker, D. M., West, P., Potts, G. F., Liotti, M., Luu, P., et al. (1996). Emotional expectancy: Brain electrical activity associated with an emotional bias in interpreting life events. Psychophysiology, 33, 218–233.

Chwilla, D. J., & Kolk, H. H. J. (2002). Three step priming in lexical decision. Memory & Cognition, 30, 217–225.

Chwilla, D. J., & Kolk, H. H. J. (2005). Accessing world knowledge: Evidence from N400 and reaction time priming. Cognitive Brain Research, 25, 589–606.

Chwilla, D. J., Kolk, H. H. J., & Mulder, G. (2000). Mediated priming in the lexical decision task: Evidence from event-related potentials and reaction time. Journal of Memory and Language, 42, 314–341.

Chwilla, D. J., Kolk, H. H. J., & Vissers, C. Th. W. M. (2007). Immediate integration of novel meanings: N400 support for an embodied view of language comprehension. Brain Research, 1183, 109–123.

Clore, G. L., & Huntsinger, J. R. (2007). How emotions inform judgment and regulate thought. Trends in Cognitive Sciences, 11, 393–399.

Collins, A. M., & Loftus, E. F. (1975). A spreading-activation theory of semantic processing. Psychological Review, 82, 407–428.

Dahl, M. (2001). Asymmetries in the processing of emotionally valenced words. Scandinavian Journal of Psychology, 42, 97–104.

Damasio, A. R. (1994). Descartes' error: Emotion, reason and the human brain. New York: G.P. Putnam.

Federmeier, K. D., Kirson, D. A., Moreno, E. M., & Kutas, M. (2001). Effects of transient, mild mood states on semantic memory organization and use: An event-related potential investigation in humans. Neuroscience Letters, 305, 149–152.

Federmeier, K. D., Mai, H., & Kutas, M. (2005). Both sides get the point: Hemispheric sensitivities to sentential constraint. Memory & Cognition, 33, 871–886.

Fodor, J. A. (1975). The language of thought. New York: Crowell Press.

Fredrickson, B. L. (1998). What good are positive emotions? Review of General Psychology, 2, 300–319.

Gasper, K., & Clore, G. L. (2002). Attending to the big picture: Mood and global versus local processing of visual information. Psychological Science, 13, 34–40.

Gibson, J. J. (1979). The ecological approach to visual perception. Boston: Houghton Mifflin.

Glenberg, A. M. (1997). What memory is for. Behavioral and Brain Sciences, 20, 1–55.

Glenberg, A. M., Havas, D., Becker, R., & Rinck, M. (2005). Grounding language in bodily states: The case for emotion. In D. Pecher & R. A. Zwaan (Eds.), Grounding cognition: The role of perception and action in memory, language, and thinking (pp. 115–128). Cambridge: Cambridge University Press.

Glenberg, A. M., & Robertson, D. A. (1999). Indexical understanding of instructions. Discourse Processes, 28, 1–26.

Glenberg, A. M., & Robertson, D. A. (2000). Symbol grounding and meaning: A comparison of high-dimensional and embodied theories of meaning. Journal of Memory and Language, 43, 379–401.

Havas, D., Glenberg, A. M., & Rinck, M. (2007). Emotion simulation during language comprehension. Psychonomic Bulletin & Review, 14, 436–441.

Herbert, C., Junghofer, M., & Kissler, J. (2008). Event related potentials to emotional adjectives during reading. Psychophysiology, 45, 487–498.

Holt, D. J., Lynn, S. K., & Kuperberg, G. R. (2009). Neurophysiological correlates of comprehending emotional meaning in context. Journal of Cognitive Neuroscience, 21, 2245–2262.

Kanske, P., & Kotz, S. A. (2007). Concreteness in emotional words: ERP evidence from a hemifield study. Brain Research, 1148, 138–148.

Kiefer, M., Schuch, S., Schenck, W., & Fiedler, K. (2007). Mood states modulate activity in semantic brain areas during emotional word encoding. Cerebral Cortex, 17, 1516–1530.

Kimchi, R., & Palmer, S. E. (1982). Form and texture in hierarchically constructed patterns. Journal of Experimental Psychology: Human Perception and Performance, 8, 521–535.

Kuperberg, G. R. (2007). Neural mechanisms of language comprehension: Challenges to syntax. Brain Research, 1146, 23–49.

Kutas, M., & Federmeier, K. D. (2000). Electrophysiology reveals semantic memory use in language comprehension. Trends in Cognitive Sciences, 4, 463–470.

Kutas, M., & Hillyard, S. A. (1980). Reading senseless sentences: Brain potentials reflect semantic incongruity. Science, 207, 203–205.

Lakoff, G. (1987). Women, fire, and dangerous things. Chicago: University of Chicago Press.

Landauer, T. K., & Dumais, S. T. (1997). A solution to Plato's problem: The latent semantic analysis theory of acquisition, induction, and representations of knowledge. Psychological Review, 104, 211–240.

MacWhinney, B. (1999). The emergence of language from embodiment. In B. MacWhinney (Ed.), The emergence of language (pp. 213–256). Mahwah, NJ: Erlbaum.

Masson, M. E. J. (1995). A distributed memory model of semantic priming. Journal of Experimental Psychology: Learning, Memory, and Cognition, 21, 3–23.

Niedenthal, P. M. (2007). Embodying emotion. Science, 316, 1002–1005.

Oldfield, R. C. (1971). The assessment and analysis of handedness: The Edinburgh Inventory. Neuropsychologia, 9, 97–113.

Pinker, S. (1997). How the mind works. New York: Norton.

Pratt, N. L., & Kelly, S. D. (2008). Emotional states influence the neural processing of affective language. Social Neuroscience, 3, 434–442.

Pulvermüller, F. (2005). Brain mechanisms linking language and action. Nature Reviews Neuroscience, 6, 576–582.

Rugg, M. D. (1985). The effects of semantic priming and word repetition on event-related potentials. Psychophysiology, 22, 642–647.

Sankoh, A. J., Huque, M. F., & Dubey, S. D. (1997). Some comments on frequently used multiple endpoint adjustments methods in clinical trials. Statistics in Medicine, 16, 2529–2542.

Schirmer, A., & Kotz, S. A. (2003). ERP evidence for a sex-specific Stroop effect in emotional speech. Journal of Cognitive Neuroscience, 15, 1135–1148.

Schirmer, A., Kotz, S. A., & Friederici, A. D. (2002). Sex differentiates the role of emotional prosody during word processing. Cognitive Brain Research, 14, 228–233.

Schirmer, A., Kotz, S. A., & Friederici, A. D. (2005). On the role of attention for the processing of emotions in speech: Sex differences revisited. Cognitive Brain Research, 24, 442–452.

Schwarz, N. (2002). Situated cognition and the wisdom of feelings: Cognitive tuning. In L. Feldman Barrett & P. Salovey (Eds.), The wisdom in feelings (pp. 144–166). New York: Guilford Press.

Searle, J. R. (1980). Minds, brains, and programs. Behavioral and Brain Sciences, 3, 417–457.

Soussignan, R. (2002). Duchenne smile, emotional experience, and autonomic reactivity: A test of the facial feedback hypothesis. Emotion, 2, 52–74.

Stanfield, R. A., & Zwaan, R. A. (2001). The effect of implied orientation derived from verbal context on picture recognition. Psychological Science, 12, 153–156.

Strack, F., Martin, L. L., & Stepper, S. (1988). Inhibiting and facilitating conditions of facial expressions: A non-obtrusive test of the facial feedback hypothesis. Journal of Personality and Social Psychology, 54, 768–777.

Tettamanti, M., Buccino, G., Saccuman, M. C., Gallese, V., Danna, M., Scifo, P., et al. (2005). Listening to action-related sentences activates fronto-parietal motor circuits. Journal of Cognitive Neuroscience, 17, 273–281.

Vissers, C. Th. W. M., Chwilla, D. J., & Kolk, H. H. J. (2006). Monitoring in language perception: The effect of misspellings of words in highly constrained sentences. Brain Research, 1106, 150–163.

Vissers, C. Th. W. M., Kolk, H. H. J., & Chwilla, D. J. (2008). Monitoring in language perception: Evidence from ERPs in a picture-sentence matching task. Neuropsychologia, 46, 967–982.

Westermann, R., Spies, K., Stahl, G., & Hesse, F. W. (1996). Relative effectiveness and validity of mood induction procedures: A meta-analysis. European Journal of Social Psychology, 26, 557–580.

Yeh, W., & Barsalou, L. W. (2006). The situated nature of concepts. American Journal of Psychology, 119, 349–384.

Zwaan, R. A., & Madden, C. J. (2005). Embodied sentence comprehension. In D. Pecher & R. A. Zwaan (Eds.), Grounding cognition: The role of perception and action in memory, language, and thinking. Cambridge: Cambridge University Press.

Zwaan, R. A., & Taylor, L. J. (2006). Seeing, acting, understanding: Motor resonance in language comprehension. Journal of Experimental Psychology: General, 135, 1–11.