In mediation and negotiation, we sometimes encounter people who make decisions that seem inconsistent with what they say they care about and want, and with their alternatives to settlement. Some seemingly irrational decisions may result from automatic, intuitive moral judgments, which are best approached at a corresponding intuitive level. Social scientists have identified two “systems” of thinking: an automatic, quick process that operates outside of awareness (System 1), and an effortful, deliberate, logical, conscious one (System 2). In their social intuitionist model and moral foundations theory, Jonathan Haidt and colleagues propose that individuals make automatic, quick, intuitive moral judgments along five universal moral domains: care/harm, fairness/cheating, loyalty/betrayal, authority/subversion, and sanctity/degradation. These moral judgments are difficult to engage with reasoned arguments. A seemingly irrational party may be in the midst of an intuitive moral judgment that is not logic‐based. The literature on these and related lines of research explains when and why a mediator should explore whether a party’s assessments may result from intuitive moral judgments, and if so, ways a mediator could communicate with the party on that level. This may be done most effectively by shifting the focus within the moral domain in which the party is thinking or to another relevant domain, using moral reframing, and making liberal use of stories to communicate alternative intuitive moral perspectives. These methods engage on an intuitive level.

Mediators know him well. He is the party who, when he tells his story of the dispute, appears affable, articulate, fair. He may even allow for his mistakes in the matter. He listens respectfully to his counterpart’s viewpoint and concerns. She makes a reasonable offer. This seems promising. The mediation might even end early! But then it happens—maybe in the second round of offers, or after an entire day of negotiating: two plus two suddenly equals five. Her offer is insulting! If, as the mediator, I fall into the trap of reciting facts that belie his assessment, he points out my naiveté. He brushes references to his interests aside. Instead of reaching together the Summit of Agreement, he has been on a different trip altogether—to Irrational Land, never to return.

Why was he headed there, and how did I miss his direction? Why, in some cases, do best mediation and negotiation practices fail to make an impact? Why, despite careful listening, do we sometimes fail to gain an understanding of barriers to agreement? Thanks to the explosion of social and neurological science research over recent decades, dispute resolvers are finding answers to these questions (e.g., Malhotra and Bazerman 2008). In particular, social science research on moral reasoning has offered a unique understanding of “gut‐level,” seemingly irrational decision‐making.

Over a decade ago, while an active mediator, I returned to school to study how moral and religious values might influence negotiation decisions. As I searched for research on the psychology and neuroscience of religious decision‐making, I discovered the work of Jonathan Haidt, Craig Joseph, and others who had updated existing theories of moral reasoning. Using multidisciplinary research, they theorized that we make intuitive, quick judgments along moral domains, unaware that these judgments are not logic‐ or reason‐based. These intuitions come into play in a much broader set of situations (including, I argue, negotiations) than previously recognized.

I start with a quick review of the groundbreaking work of Daniel Kahneman and Amos Tversky in teasing out the difference between quick, automatic, instinctual judgments and deliberative, effortful thinking. Then I delve into the research of social scientists and neuroscientists showing that moral reasoning is similarly composed of automatic, instinctual judgments. Moreover, “moral” is completely redefined through a synthesis of multidisciplinary research across international cultures. Haidt and Joseph (2004, 2007) identified five universal moral domains along which all humans react in this automatic way. These theories and related research suggest several strategies for mediators working with parties immersed in intuitive moral judgments. Following a discussion of the research, I explain these strategies and how they can be useful in mediation settings. Though I write from the perspective of a mediator, the research discussed below—and my experience—suggest that negotiators can also benefit from the insights and tools set forth in this article.

It is now widely understood that we think and make decisions in two distinct ways. Building on the work of Stanovich and West (2000), Kahneman has popularized the terms “System 1” and “System 2” to describe the physiology and patterns of thinking that originate from ancient versus newer parts of our brain (see Kahneman 2011). System 1 is automatic, intuitive, and operates outside of our awareness. System 2 is effortful, deliberate, conscious mental activity that requires attention and is at times laborious (Kahneman 2011). Of the two patterns, System 1 thinking is used the majority of the time, although we tend to be completely unaware of it (Bargh and Morsella 2008; see Kahneman 2011). System 1 thinking allows us to make the quick, frequent assessments necessary for maneuvering through daily life. In his popular book Thinking, Fast and Slow, Kahneman noted that “System 1 is generally very good at what it does: its models of familiar situations are accurate, its short‐term predictions are usually accurate as well, and its initial reactions to challenges are swift and generally appropriate” (Kahneman 2011: 25). However, he detailed a variety of cognitive “errors” and biases that are caused by System 1 thinking. For example, most individuals (including about half of college students at top universities) get this question wrong:

A bat and ball cost $1.10.

The bat costs one dollar more than the ball.

How much does the ball cost?

The distinctive mark of this easy puzzle is that it evokes an answer that is intuitive, appealing, and wrong. Do the math, and you will see. If the ball costs 10¢, then the total cost will be $1.20 (10¢ for the ball and $1.10 for the bat), not $1.10. The correct answer is 5¢ (Kahneman 2011: 44).
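For readers who want the arithmetic spelled out, the algebra runs as follows (the notation is mine, not Kahneman's), with b standing for the price of the ball in dollars:

```latex
% Let b = price of the ball; the bat then costs b + 1.00.
\begin{align*}
  b + (b + 1.00) &= 1.10 \\
  2b             &= 0.10 \\
  b              &= 0.05
\end{align*}
```

The intuitive answer, 10¢, comes from splitting $1.10 into $1.00 and 10¢, which satisfies the total but makes the bat only 90¢ more than the ball.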

When the math logic is explained, individuals accept the correct answer and may even be charmed by the cleverness of the puzzle. Following the work of Kahneman and others on the ubiquitous nature of potentially undesirable System 1 errors in judgment and decision‐making, negotiation scholars began to look for ways to recognize and counter these errors (e.g., Birke and Fox 1999). Embedded in this line of research is the assumption that once individuals understand that they have made thinking errors that could interfere with demonstrably beneficial decisions, they will seek to correct them.

The assumption that we will want to identify and correct our System 1 thinking errors is not always borne out when moral judgments are at issue. Without the tangible data that a math problem provides, parties must make important character assessments—which involve moral judgments—on less than perfect information. Accurately making such assessments of a negotiation counterpart can be critically important when making deals or resolving legal disputes. Through System 1 thinking, we make our best guess as to whether someone is trustworthy or has honorable intentions; we rely on our intuition.

A business owner whose immigrant parents taught her that she should never buy anything she cannot pay for is reluctant to compromise with a purchaser who used multiple shell organizations to shield himself from liability. A party meets at his counterpart’s firm and is repulsed by the messiness and smell of the office and common space. Company executives struggle to compromise during mediation with a whistleblower who exposed internal company matters. As I explore below, these matters involve deeply felt instincts about moral issues such as cheating, fairness, sanctity and purity, hierarchy and respect for legitimate leaders, and loyalty to one’s in‐groups.

Several decades ago, Haidt and his colleagues began to reexamine moral thinking and reasoning. As a PhD student in psychology, Haidt, along with others, questioned the dominant “stage theory” of moral development, which asserted that as individuals grow, they move through increasingly sophisticated stages of moral development and moral reasoning (see Kohlberg 1971). In emerging cross‐cultural research, Haidt saw evidence that moral decision‐making may be an intuitive process rather than one that is reasoned deliberately, although we assume we are thinking logically and rationally (Haidt 2006).

To test this hypothesis, Haidt fabricated quirky vignettes in which the protagonist engaged in cartoonish, taboo behaviors that were benign and, he asserted, harmed no one.1 When research subjects were asked whether they would drink sterilized roach juice or make a revocable deal to sell their soul to the devil (though “not a legal or binding contract”), or whether consensual sibling sexual relations should be allowed if foolproof birth control was used, individuals quickly and confidently said, “No.” The certainty and immediacy with which most subjects responded, coupled with the difficulty they had articulating reasons other than, “It’s just wrong,” led Haidt to suspect that an automatic and unconscious level of moral intuition was at play (Haidt, Bjorklund, and Murphy 2000).

Haidt later posited the “Social Intuitionist Model” (SIM) of moral judgment: When faced with a moral stimulus or decision, we experience quick flashes of “like/dislike” or “right/wrong” that drive our judgments in a millisecond, similar to perceptions (Haidt, Bjorklund, and Murphy 2000; Haidt 2001, 2012). According to SIM, our minds “[have] been built to respond to certain moral goods [that] appear to us as self‐evident truths” (Haidt, Bjorklund, and Murphy 2000). These flashes of intuition include moral emotions such as sympathy, remorse, shame, love, and grief (Haidt 2001, 2003). Neuroscience research suggests that emotion is an integral part of moral judgments, even if we do not feel any emotion in the moment.2

SIM is grounded in evolutionary theory. Haidt believes that our moral reasoning evolved from intuitions initially developed to meet survival needs within increasingly complex social group structures (Haidt 2001, 2012). To survive, our ancestors needed to evaluate quickly and automatically what they encountered, categorizing whether the situation was safe or not (Haidt and Bjorklund 2008).

What role, then, does deliberate, effortful reasoning about moral questions—System 2 thinking—play in our everyday decisions? Little, according to SIM. Most of the time, we do not make moral judgments based on a conscious cost/benefit analysis (Haidt 2001). Rather, we make moral assessments quickly and automatically, afterward engaging in a slower conscious process through which we attempt to understand and explain the already‐made judgment or decision. Since our judgment is not fundamentally based on System 2 or logical thinking, an attempt to influence us with logical arguments is largely ineffective. We are influenced by others’ arguments, but not because of their logical merit. Rather, someone else’s point of view triggers a new flash of intuition in the listener. According to SIM, the most common way moral judgments shift and evolve is via interpersonal conversation, gossip, and argument (Carruthers 2007; Haidt and Graham 2007).

While Haidt was exploring how we make moral judgments, he also was exploring—with fellow doctoral student Craig Joseph—what we identify as moral issues. According to Haidt, the then‐dominant theory of moral development, developed by Lawrence Kohlberg, treated morality as centrally a matter of justice. Kohlberg measured moral development by analyzing individuals’ responses when asked whether it was morally permissible for a man to steal a drug to save the life of his wife (Kohlberg 1971). Conducting field research across Brazil, India, Egypt, and the United States, Haidt and Joseph became convinced that morality across cultures involved far more than issues of fairness, harm, and care. From studying their field data and reviewing work in anthropology and evolutionary science, they saw other themes. Ultimately, they found five groups of virtues that appeared universal, which they called moral domains. “For each one, a plausible evolutionary story had long been told, and for four of them (all but Purity), there was some evidence of continuity with the social psychology of other primates” (Haidt, Graham, and Joseph 2009: 111). We are born with the neural imprinting of a “first draft” of these moral intuitions. They evolved in response to repeated challenges faced by our ancestors and prepared us to survive and thrive in groups (Carruthers 2007; Haidt and Joseph 2008; Graham et al. 2013). Haidt and Joseph called their theory about moral intuitions “moral foundations theory” (MFT). The five moral foundations are as follows:

  1. The care/harm foundation evolved from the adaptive challenge of protecting and caring for children, the vulnerable, or injured kin. All mammals must care for offspring during the long period before they become self‐sufficient. Humans, therefore, react intuitively to signs of their children’s suffering, distress, or neediness. We also react to other children, baby animals, the vulnerable, and the victimized. From this foundation, we are able to feel the pain of others. Virtues that have evolved along this domain are kindness, caring, gentleness, and nurturance. The original triggers for these virtues were the suffering and distress of one’s kin. We react with intuitive dislike when we witness cruelty or neglect and feel compassion and a desire to care for those who have been victimized (Haidt 2012; Graham et al. 2013).

  2. The fairness/cheating foundation evolved in response to the adaptive challenge of engaging in beneficial cooperative relationships without being exploited. We are intuitively sensitive to evidence of both cheating and cooperation among others. This foundation “makes us sensitive to indications that another person is likely to be a good (or bad) partner for collaboration and reciprocal altruism. It makes us want to shun or punish cheaters” (Haidt 2012: 153). The instinct for fairness in modern society extends to any expected reciprocal relationship. We get angry when our computer does not perform as expected, a vending machine does not deliver the snack we paid for, or someone cuts in front of a queue.

  3. The loyalty/betrayal foundation evolved in response to the adaptive challenge of surviving through coalitions. “[W]hen humans developed language, weapons, and tribal markers, such intergroup competition became far more decisive for survival. Individuals whose minds were organized in advance of experience to make it easy for them to form cohesive coalitions were more likely to be part of winning teams in such competitions” (Graham et al. 2013: 70). We are sensitive to signs that someone is loyal to the team and thus trustworthy, and to signs that they are not. We tend to identify “ingroups” and “outgroups.” Today, groups that compete within the workplace, sports teams and their fans, and devotees of a TV show or film series, a product, or a politician all build on this foundation (Haidt 2012).

  4. The authority/subversion foundation evolved in response to the adaptive challenge of navigating group hierarchies. Those who are able to “forge beneficial relationships upwards and downwards have an advantage over those who fail to perceive or react appropriately in these complex social interactions” (Graham et al. 2013: 70). From this foundation arose our sensitivity to signs of rank or status. We grant legitimacy and authority to leaders and bosses and respect those who properly lead and care for their groups. Today, our courts, governmental institutions, schools, and police forces are hierarchies from which we expect the proper use of authority and power. We dislike and undermine those who are granted such legitimacy but do not carry out their duties appropriately. Virtues that developed from this foundation include respect and deference (Graham et al. 2013: 70).

  5. The sanctity/degradation foundation—sometimes referred to as the sanctity/purity foundation—evolved in response to the challenges that arose from our ancestors’ need to avoid pathogens and parasites when they lived off the land, scavenged for meat, and gathered together in denser groups. One reaction that developed from these challenges is disgust, which enabled us to respond to pathogens and form appropriate emotional and cognitive responses to a wide variety of threats, both symbolic and real (Haidt 2012). We avoid waste products, diseased people, and food thrown on the ground. Triggers from moral transgressions such as taboo ideas, sexual deviance, and spiritually unclean objects also developed from this foundation. Virtues associated with this foundation are cleanliness, temperance, chastity, and purity (Haidt 2012; Graham et al. 2013).

According to MFT, each culture builds upon these five “imprinted” moral intuitions, highlighting or de‐emphasizing each as it sees fit. Through teachings and behavioral examples, children are taught what is right and wrong within their culture. “Each of these examples contains information about a number of aspects of the situation, including the motivations of the protagonists, their state of being (suffering, disabled, hostile, rich, etc.), the categorization of the situation, and the evaluation of the outcome offered by more experienced others” (Haidt and Joseph 2008: 386). In this way, cultures elaborate upon the intuitive, evolved moral foundations to foster moral competence in their members. We learn moral virtues—right and wrong—through narrative, not logical or scientific thinking (Haidt and Joseph 2008).

Haidt and his colleagues suggested that two of the moral foundations—care/harm and fairness/cheating—are well‐developed in all cultures. The other three domains—loyalty/betrayal, authority/subversion, and sanctity/degradation—are valued in cultures that focus on the collectivist good of the group or the whole; in other words, they are valued in most countries in the world (Graham, Haidt, and Nosek 2009; Haidt 2012). They are de‐emphasized in Western, educated, industrialized, rich, and democratic (so‐called WEIRD) countries. Graham, Haidt, and Nosek (2009) also have explored how political leanings correspond to the relative value that one gives each of the moral domains. Understanding the nature of each domain and how it may play out in everyday conflicts can be powerful for mediators and negotiators.

To illustrate how intuitive moral judgments may factor into parties’ judgments during a mediation, I will examine the origin and research behind three of the moral domains. I start with fairness/cheating, which cuts across all cultures. I then explore two of the domains that influence decisions but are less obvious to some, particularly in western societies: sanctity/degradation and loyalty/betrayal.

Fairness/Cheating

Humans instinctually demand fairness; it is a universal value across cultures. In exploring this moral foundation, Haidt reviewed evolutionary research on fairness. He concluded that humans have learned that cooperation increases the likelihood of survival. For example, when hunters band together, they can capture large prey that none could capture alone. From these evolutionary pressures, humans developed a sense of fairness that supports cooperation and protects us from exploitation by others. According to Haidt, we “feel pleasure, liking, and friendship when people show signs that they can be trusted to reciprocate. We feel anger, contempt, and even sometimes disgust when people try to cheat us or take advantage of us” (Haidt 2012: 136). Fairness includes at least two subtypes: fairness as equality, and fairness as proportionality—the sense that one should receive in proportion to what one contributes (Haidt 2012).

Fairness emerges as a consideration in experimental economic and negotiation games. In a comprehensive review of research on fairness in negotiations, Welsh (2003) noted that individuals will reject offers that are economically beneficial if they perceive such offers as unfair; researchers are studying why and under what conditions this is so (e.g., Henrich et al. 2010; Debove, Baumard, and Andre 2016). Consider also this complexity of the fairness/cheating moral domain: research has shown that fairness tends to be interpreted in self‐serving ways (Bazerman, Curhan, Moore, and Valley 2000), and our judgment of what is fair is subject to System 1 cognitive biases (e.g., Babcock and Loewenstein 1997; Kahneman 2011). Thus, a mediator should listen carefully and explore what a particular party considers “fair.”

Sanctity/Degradation

Our human ancestors learned to survive and grow first by eating plants, then animals. They learned which foods were safe and which carried the pathogens and parasites that led to disease or death. Today, humans intuitively react in disgust to a variety of substances and concepts that we immediately judge as contaminated, unsafe, or unclean (Haidt and Joseph 2008). Disgust has evolved from an instinct that helps us avoid dangerous physical substances to one that also guides our social and moral assessments of others. It can be triggered by a wide variety of stimuli, from toxicity and disease to perceived violations of social and moral norms (Chapman et al. 2009; Chapman and Anderson 2012).

Disgust appears to be a strong instinctual reaction with both emotional and physical components. The physiological response to disgust is to withdraw from and reject the contaminated substance—as well as areas near the contaminated matter—to eliminate the risk of being sickened. Once disgust is invoked, it can easily generalize to surrounding areas and experiences. Han and colleagues discovered a “disgust‐promotes‐disposal effect” in a study in which they gave participants a sealed box of office supplies. Participants who were then shown a disgust‐inducing video of a man using a filthy toilet wanted to trade their box for a new one more often than those who had watched a nature scene (Han, Lerner, and Zeckhauser 2012).

Studies show that we link physical cleanliness with moral judgments (e.g., Schnall, Benton, and Harvey 2008). Many religious services incorporate a cleansing step or ritual prior to interactions with the divine or sacred. Chapman and his colleagues found that those who received an unfair offer in an economic game exhibited the same physical markers of disgust as those exposed to unpleasant tastes or photographs of unclean or contaminated items (Chapman et al. 2009; Chapman and Anderson 2013). Russell and Giner‐Sorolla noted that once disgust is triggered, jurors have difficulty considering relevant mitigating factors or discussing their reaction productively with other jury members (Russell and Giner‐Sorolla 2011).3 As noted above, the virtues associated with the sanctity/degradation foundation are cleanliness, temperance, chastity, and purity (Haidt 2012; Graham et al. 2013).

From these studies, we see that disgust and the accompanying desire to rid oneself of a contaminant might explain why parties do “not want anything to do with” another’s offer and withdraw, or conversely, give up value to rid themselves of the “contaminated.” Shoddy business dealings, cheating, perceived sexual impurity, or desecration of sacred symbols can all trigger a disgust reaction, one that is outside of a person’s awareness. Such disgust influences subsequent decisions even if it is triggered by something unrelated to the event or mediation at hand, as with the dirty toilet in the study above. If unaddressed, a party’s sense of discomfort along this moral domain can interfere with the mediation or negotiation process, particularly the brainstorming, discussion of interests, and collaboration necessary for the most effective bargaining.

Loyalty/Betrayal

The loyalty/betrayal moral domain involves groups, from families to businesses to countries. It derived from our innate predisposition to form, and function in, groups that ensured our survival. Graham et al. (2013) explained:

Chimpanzee troops compete with other troops for territory (Goodall 1986); coalitions of chimps compete with other coalitions within troops for rank and power (de Waal 1982). But when humans developed language, weapons, and tribal markers, such intergroup competition became far more decisive for survival. Individuals whose minds were organized in advance of experience to make it easy for them to form cohesive coalitions were more likely to be part of winning teams in such competitions…. Sports fandom and brand loyalty are examples of how easily modern consumer culture has built upon the foundation and created a broad set of current triggers.

(Graham et al. 2013: 70)

Today we look for commonalities among larger groups of people and form groups in which we feel we can thrive. Our families and extended kin are our first group membership, followed later by our school groups, villages or neighborhoods, sports teams, political groups, countries, and to some extent, socially imposed categories such as race, gender, and class (Graham et al. 2013). Virtues associated with this foundation are patriotism, devotion to a group’s mission, and self‐sacrifice for group members. A threat or challenge to one’s group can register as a significant violation of one’s sense of right in this moral domain (Graham et al. 2013).

Our innate sensitivities detect and defend against threats to our group’s success and survival, and once we identify with a group, we treat those within it differently from those outside of it. We tend to trust group members more than nonmembers, and to see them as less culpable in transgressions than outgroup members (Abrams, Randsley de Moura, and Travaglino 2013). Group members will at times sacrifice their own economic gain for the good of their group. A decision by group members to be self‐sacrificing is made quickly and intuitively, whereas when members choose a selfish response, they do so more slowly and with greater deliberation (De Dreu, Dussel, and Ten Velden 2015). An additional finding of the research on group loyalty/betrayal is that we tend to care less for those living far away than for those in close proximity, even though the former may need our help much more (Greene 2003; Bruneau, Dufour, and Saxe 2012). This has played out in the COVID‐19 pandemic and in natural disasters: in‐group survival intuitions are triggered, and individuals are less interested in helping those in other, distant states or countries.

I believe that this research helps explain one reason for the natural tendency for “small talk” at the start of mediations and negotiations. Parties seek some common group identification such as hometown, sports team affiliation, academic or business background, and shared adoration of musicians or other celebrities to increase trust and comfort with others. Parties whose word is accepted and who have created sufficient rapport to engage in interest‐sharing and creative problem‐solving increase their chances of achieving maximum negotiation results. Conversely, a party who intuitively senses that another mediation participant is violating group norms will feel a flash of negative judgment.

Combined, SIM and MFT propose that judgments within the five moral domains are made in an automatic, intuitive, emotion‐influenced fashion. The quick flashes of judgment that lead to impressionistic moral reasoning employ System 1 thinking, in Kahneman’s parlance.

In a typical day‐long mediation, these intuitive moral judgments may occur multiple times while parties talk and interact. If left unaddressed, negative intuitive moral judgments lead to a lack of trust and dampen the ability of parties to engage in some of the most beneficial aspects of integrative bargaining—brainstorming and creative problem‐solving. As in our opening vignette about the party who seemed in tune but then veered off into Irrational Land, out‐of‐awareness judgments within the moral domains can lead to a complete halt in negotiations.

Identify Possible Intuitive Moral Judgments

As noted, moral foundations theory asserts that moral judgments occur along one or more moral domains in the manner explained by SIM. It is a theory built upon anthropology, evolutionary biology, and psychology. MFT offers a more expansive view of attitudes and assertions that a party may—outside of awareness—react to as “moral” matters. While there is an instrument that measures one’s level of each moral foundation (e.g., Doğruyol, Alper, and Yilmaz 2019), mediators must make on‐the‐spot educated guesses as to whether they are working within any of these domains.

To spot possible intuitive moral thinking, look for a party who seems stuck, or is making decisions that seem inconsistent with their earlier articulated interests. Are there possible indicators of intuitive moral judgments? Do we see a sign of disgust or repulsion (sanctity/degradation)? Do we hear words that designate another as an “outgroup” member? Does one party paint a picture of the other as violating norms of hierarchy, respect, harm, or fairness? For example, Kaur and Sasahara (2016) explored keywords in Twitter conversations associated with moral topics along the five domains and found words such as attack, hurt, cruel, help, care (care/harm); dishonest, unfair, discriminate, unjust (fairness/cheating); individual, enemy, betray, foreign (loyalty/betrayal); illegal, protest, rebel (authority/subversion); and sin, disgust, sick, dirt, disease, indecent (sanctity/degradation). It is also helpful to observe any flashes of intuition that come to you as the listener. Do you have a reaction to, or idea about, what you are hearing? Do any stories of people handling similar situations come to mind? Consider asking the party about their experience and how they came to their judgment. Reflect the moral values you hear and continue listening. How can you share a different moral perspective with the party, either within the domain in which they seem to be making their judgment, or in another? Next, I offer suggestions for reaching someone’s intuitive level of thinking, based on SIM and MFT, research on moral domains, and my practice.
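To make the keyword‐to‐domain mapping concrete, here is a minimal sketch of how such indicator words could be organized and checked against a party’s statement. The keyword sets are only the examples quoted above, not Kaur and Sasahara’s full lexicon, and the function is a hypothetical illustration, not a validated diagnostic instrument.

```python
# Hypothetical sketch: map the example indicator words quoted above to
# the five moral domains and flag which domains a statement may touch.

MORAL_DOMAIN_KEYWORDS = {
    "care/harm": {"attack", "hurt", "cruel", "help", "care"},
    "fairness/cheating": {"dishonest", "unfair", "discriminate", "unjust"},
    "loyalty/betrayal": {"individual", "enemy", "betray", "foreign"},
    "authority/subversion": {"illegal", "protest", "rebel"},
    "sanctity/degradation": {"sin", "disgust", "sick", "dirt", "disease", "indecent"},
}

def flag_possible_domains(statement: str) -> list[str]:
    """Return the moral domains whose indicator words appear in the statement."""
    words = set(statement.lower().split())
    return [domain for domain, keywords in MORAL_DOMAIN_KEYWORDS.items()
            if words & keywords]

# Example: a remark a mediator might hear in caucus.
print(flag_possible_domains("their offer is dishonest and frankly indecent"))
# -> ['fairness/cheating', 'sanctity/degradation']
```

A mediator, of course, does this pattern‐matching intuitively and in real time; the sketch simply makes explicit the kind of domain‐spotting described in the paragraph above.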

Appeal Intuitively

Intuitive moral judgments are inspired and influenced by social persuasion. Others voicing their opinions, likes, and dislikes can trigger a flash of intuition in the listener of right or wrong, like or dislike (Haidt 2001). This is especially true when the speaker is a person of influence. How can we use this knowledge to communicate with someone about their view or judgment when it is intuitive and derived from a moral domain? The strategies that I have found most effective are: (1) appeal intuitively by discussing alternative perspectives within the moral domain in which the party is thinking, or within another moral domain, (2) use moral reframing, and (3) communicate intuitive moral messages through stories, particularly those that contain a classic story arc. I next describe one of the first times I noticed the dynamics of SIM and MFT in a mediation, and what I did that helped.4

The Story of Jack

Jack was a tall, lanky, middle‐aged southerner who talked easily. He, his father, and his sister, Susan, all ran a family clothing business. They had several stores in Ohio that sold high‐end brand‐name clothing from designers who often required exclusive sales contracts. Jack and Susan had taken over day‐to‐day operations of the business, which was still owned by Larson, their proud father. Jack managed one of the stores, which had unwittingly sold clothing from one of the company’s most valued designers to a customer who, it turned out, had illegal plans for the goods. Jack explained in a private meeting with me that in his community, deals in principle were still made on a handshake; a person’s word was sacred. As it turned out, the customer was part of a ring that illegally sold American textiles in other countries. U.S. Customs and Border Protection caught the shipment and notified the designer’s manufacturer. By contract, foreign sales were forbidden. The designer demanded that Jack’s family pay it $80,000 as a penalty, provided for in their sales contract, to compensate it for its loss in domestic sales. Jack had sold clothing to the fraudulent purchaser in prior years with no known problem. As the parties talked, the family saw how a few precautions that usually aren’t necessary in Jack’s part of the country could have caught the problem early on. Yet the family fervently believed that the designer, with national distribution, had far more opportunities and resources to identify and prevent rings of fraudulent purchasers than did a small business. Jack felt he had acted in good faith and that it was unfair for the designer to charge his family’s business a penalty.

In fact, both the family and the designer were swindled by a long‐gone purchaser. Who should be left holding the bag? The first step was to unearth the parties’ interests. The family’s interests were to minimize financial loss, maintain their reputation, and keep their professional relationships with designers. The designer’s interests were to deter fraud, force retail sellers to prevent fraud at the point of sale, minimize financial loss, and maintain good working relationships with retail companies. Early in the mediation, the parties were able to agree on a few smaller issues. They then spent a lot of time negotiating over the $80,000 “lost sales” penalty and inching toward splitting the loss. But in a private session, the family resisted moving all the way to the midpoint. By then, I sensed that Susan and Larson were less concerned than Jack about pushing further but wanted to support Jack’s view that a split would be unfair. Jack’s head hung; it was his store that had been swindled. Now his family had to bear the consequences. But several meetings later, Susan and Larson told Jack to stop worrying; they were OK. They were willing to pay the penalty and move on, but they did not want to force Jack to do so. They essentially left the decision up to him.

Jack slumped in the corner of the caucus room. He was angry that the designer expected vigilant fraud detection in a fast‐moving environment. Whatever his family had to pay, it was both unfair and his fault. I looked at Jack, suspecting that this was intensely personal. Viewed through the MFT lens, this level of monetary concession to the designer seemed to raise the following concerns for him: breach of loyalty to his family (loyalty/betrayal), a challenge to his integrity and honesty (sanctity/degradation), and unfair “punishment” (fairness/cheating). Jack was stuck in indecision, so how could I help him decide what to do?

Then a memory flashed before me: a time when I believe I was cheated in business. I decided to share with Jack my own story and how I got past it: “Jack, I was once cheated in business. I had a business partner with whom I provided services to a local nonprofit organization. We did well for several years. Then the organization added a new fee to the contract. My partner and I disagreed on who should bear the extra cost. As we were negotiating, he abruptly canceled the agreement with me and refused to pay thousands of dollars due me. I hired attorneys but, in the end, they advised me that it would cost more to obtain the money he owed me than to simply settle for a much smaller sum. Bigger picture, I was looking to get out of this line of work and focus more on my own dispute resolution practice. So, I took the settlement. I realized that I had been in business for many years, and no one had ever treated me unfairly before this. I had one bad business partner. Until then I had terrific people as my business partners, customers, and vendors. My personal ethics require me to treat others fairly and well. I believed this person did the opposite and was able to cheat me out of money to which I was entitled. But in the end, who would I rather be? Him or me? Who would you rather be—the person who deals honorably in business, or the person who makes a living by swindling people he gets to trust him? He could never live in your community, where businesses still make handshake deals and treat each other honorably.” I could see Jack’s face relax. Jack, Susan, and Larson spoke a bit more. Soon the designer made a counteroffer; this time Jack and his family decided that the move was an indicator of good faith. They took the deal.

Shift the Focus within the Moral Domain in which the Party is Thinking, or to Another Moral Domain

Once a mediator has a sense of the moral judgments a party is making, the mediator can think about other moral values or perspectives that could also be relevant, either within the same or another moral foundation. Given that—according to SIM—we change our intuitive moral judgments through social interactions that trigger flashes of intuition, we will be more affected by those within our sphere of influence, such as a family member, a friend, or a trusted community leader, than by an adversary. A mediator who has developed trust and rapport with the disputants can become a person within the party’s sphere of influence. Most often, parties see their counterparts as having interests that are contrary to their own. When mediators offer alternative moral perspectives by emphasizing aspects of the conflict that highlight virtues the party has overlooked in themselves or others, or moral transgressions the party has overcome—as in my example with Jack—the message will likely land better than the same message delivered by an opposing party.

I could see by Jack’s body language, his interactions with others, and the confidences he afforded me as the day went on that he was angry, as well as ashamed of being swindled. These reactions may have included a sense that he had let his family and its business down (betraying his in‐group) and a sense of shame and disgust at being associated with a fraudulent transaction (sanctity/degradation). Not only had the purchaser cheated the family business, but Jack felt the designer, as the party with far more resources, was in a better position to prevent this type of fraud and bear the cost if it occurred (fairness/cheating). If he felt he had let down his family, then he would resist a deal that would confirm (to him) that he had done so.

So, I sought to evoke virtues along several moral domains. Shifting the focus from being cheated, I talked about virtues that were apparent in him along the sanctity/degradation and fairness/cheating moral domains. Jack seemed to experience a flash of intuition that allowed him to shift away from experiencing shame toward feeling respect for his business and personal values. Moving forward, Jack and his family could learn new procedures to prevent fraud. But this time, Jack could simply realize that being an honest person in business, even when cheated by a clever wrongdoer, need not be a violation of his obligation to his family. New, positive, intuitive moral judgments were triggered.

I have used my own or other individuals’ experiences to talk with parties who are struggling with what appear to be intuitive moral judgments. Just recently, a mediation participant who had post‐traumatic stress disorder emailed me his thanks for sharing my story about dealing with unfairness, which he said had been helpful. In another case, an employee had sued over his supervisor’s alleged improper behavior, but the law did not provide for damages. This frustrated him. I told him that my business dispute was frustrating because the legal system, with its high litigation costs, was not designed to compensate me adequately. I understood how angry he was that he could not obtain what he felt he deserved. We talked about the ways in which he was valued by his organization. The mediated settlement allowed him to move to another unit in which he wanted to work and to receive unique opportunities for professional development. This showed him that despite his negative experience with a former supervisor, the organization accepted him (his “in‐group”) and was offering him a favorable new job.

When a mediator suspects that an intuitive moral judgment could be at play, open‐ended questions can help uncover moral domains that may have been activated. As noted above, mediators can be alert to their own perspectives along moral domains that could be relevant to the party’s thinking. They can experiment with sharing alternative moral perspectives. The party’s response will be instructive and can lead to an iterative process. Often, this back and forth will allow the mediator to understand better the moral domains underlying the party’s perspective.

Use Moral Reframing

Social scientists recently have explored the power of moral reframing in the political and public health spheres. Moral reframing is a specific technique that can shift the moral perspective of the listener. In their review of the literature, Feinberg and Willer have described moral reframing as “arguing in favor of a political position that members of a political group would not normally support [by framing it] in terms of moral concerns that the members strongly ascribe to” (Feinberg and Willer 2019: 3). They argued that moral reframing works because as “central and immutable parts of one’s identity,” moral convictions drive what individuals will accept or reject (Feinberg and Willer 2019: 4). In describing their research, they wrote:

[C]onservatives demonstrated greater support for pro‐environmental legislation when advocacy statements were framed in terms of the more conservative value of purity than those presented with arguments framed in terms of the more liberal value of harm or a control. Similarly, Kidwell, Farmer, and Hardesty (2013) found that presenting pro‐environment arguments couched in terms of loyalty, authority, and purity increased the likelihood that conservatives would recycle, and found that these effects persisted over a 14‐week period.

(Feinberg and Willer 2015: 1,667)

Kaplan et al. (2021) found that moral reframing of messages about wearing masks—highlighting values important to those who chose not to wear masks, such as ingroup loyalty, national identity, and personal liberty—increased mask‐wearing.

Studies suggest that with a reasonably accurate sense of how another tends to think along one or more moral domains, one can reframe negotiation offers and other decision questions to emphasize features that the other party sees as morally positive. Mediators should first engage the party in dialogue, seeking to hear their experiences and understand their thinking related to moral judgments. Once the mediator understands where the party falls within one or more relevant moral domains, the mediator can restate the question or decision point and help explore how it aligns, if at all, with the party’s moral values (Kalla, Levine, and Broockman 2022). Mediators can use moral reframing when they have flashes of intuition about how to frame an issue so a party views it positively in relation to its values and moral judgments. A well‐crafted reframing of a moral issue can take time and thought. The mediator can work on reframing at a break or between sessions. Through moral reframing, mediators can help a party work through their decision.

My dialogue with Jack contained moral reframing. Rather than framing the situation as one in which Jack was responsible for his family’s loss, I framed it to highlight his virtues of fairness, integrity, and honesty—positive qualities for a family business owner. Just as with cognitive reframing, moral reframing can shift a party’s intuitive perspective.

Employ Storytelling

One of the most effective non‐logic‐based ways to communicate complex moral ideas is through stories. As noted, MFT explains that children learn moral virtues through hearing stories and watching how others behave in moral situations. We now know that stories (often referred to in the research literature as “narrative”) activate different parts of the brain than logical arguments do, and that the brain regions activated when one reads fiction differ from those activated when one reads passages in non‐narrative form (Paul 2012). Wojcieszak and Kim found that attitudes toward out‐groups were more favorable when subjects read counter‐attitudinal commentaries expressed through evidence that was narrative (a representation of events with characters and a plot, bound in space and time, told in first or third person) rather than numerical (arguments that use empirically quantified numbers or descriptions of phenomena) (Wojcieszak and Kim 2016).

The Defense Advanced Research Projects Agency (DARPA) funded research on the influence of narrative on human behavior, hoping to explore how narrative builds support for terrorism or can heal post‐traumatic stress disorder (Defense Advanced Research Projects Agency n.d.). In one DARPA‐funded study, neuroeconomist Paul Zak found that when research subjects watched a video containing a simple but engaging narrative with the elements of a classic dramatic arc, they experienced an increase in cortisol and oxytocin and an empathetic response powerful enough to lead them to donate generously to charity. The dramatic story arc to which Zak referred was defined by Gustav Freytag. It contains the stages of exposition, rising action, climax, and falling action leading to denouement or closure (Zak n.d.). “The rising and falling tension of dramatic performances facilitate the audience’s emotional connection to the characters” (Zak 2015: 4). Stories without a dramatic arc elicited little if any chemical response, empathy for the characters, or donation to charity (Zak n.d., 2015). Marshall Ganz of Harvard, who served as a consultant on Barack Obama’s presidential campaign, has taught for years that narrative is key to evoking the emotion needed to call others to action (e.g., Ganz 2008, 2009). In explaining the development of the moral modules, Haidt and Joseph have noted that “narrative is a major cultural tool for the modification and socialization of fundamental intuitions” that comprise the moral mind (Haidt and Joseph 2008: 390). Putting an idea into a story—especially one that contains the classic story arc—can inspire a shift in an intuitive moral judgment.

Let us take the brief vignette from the beginning of this article: the party who seems to have entered “Irrational Land.” This party has decided that he cannot trust his counterpart; in some way, the counterpart is untrustworthy. A judgment that the other is “dirty” or morally unclean may underlie the development of the dispute. As the sanctity/degradation research shows, once an intuitive judgment of “contamination” is triggered unconsciously, individuals want to distance themselves from the object. While in a simple negotiation this judgment could lead to a higher payment and quick settlement, many disputes cannot be disposed of that easily. Perhaps the party made an intuitive moral judgment that the company with which they are negotiating is unsafe. Once a negative intuitive moral judgment is triggered, a party can get “stuck.” Narrative, using different neurological processes than reasoned arguments, can help trigger a flash of intuition that provides “irrational” parties the mental space to consider other perspectives, if they so choose.

There are instances when an opposing party or their attorney, rather than a mediator, can use stories to help resolve a dispute. I know of a mediation in which one of the attorneys suggested to a counterpart that they follow a particular mediation process, quoting advice that he had received once from a “wise judge.” By attributing wisdom to a third party, the attorney reduced his counterpart’s natural tendency to devalue his suggestions. Parties as well as mediators can tell stories or vignettes that convey moral principles espoused by admired historical, religious, or civic figures.5 As Haidt has noted, ancient philosophical and biblical texts did not rely on proof and logical argument to teach morality, but rather axioms and role models (Haidt 2006). According to theologian Harvey Cox, “What motivates people are stories, narratives, accounts of situations in which choices must be made and stands taken…. Narratives speak to the inner spirit. They link the moral reasoning we do in our heads to the courage and empathy that must come from the heart” (Cox 2004: 25).

It is also possible to use parties’ own stories of moral virtue when they appear to be struggling or internally conflicted. If, during previous discussions, a party has referenced a moral principle, the mediator can refer back to what they said and ask them about the story behind the remark. This, too, can lead to discussion and new flashes of intuition along moral domains.

The research shows that no one is free of System 1 thinking errors or intuitive moral judgments. However, mediators—who ethically are obligated to be impartial—should try to identify and challenge their own intuitive moral judgments. (See American Arbitration Association, American Bar Association, and Association for Conflict Resolution 2005: Standard II.) While a proper analysis of these considerations is beyond the scope of this article, there is a rich interdisciplinary literature on mediator impartiality or lack thereof, mediator influence, the ethics of various mediation styles,6 implicit and cognitive biases in judges and mediators, and methods of debiasing (e.g., Bowling and Hoffman 2003; Izumi 2010; Shapira 2021; American Bar Association 2022).7

When we think a disputant is behaving irrationally, in a sense that may be true—they may be responding and making assessments intuitively, not rationally. Most of our everyday thinking engages parts of our brain that make decisions and judgments quickly, automatically, and intuitively; scientists have dubbed this “System 1.” Even when we employ mediation and negotiation best practices, in which we uncover underlying interests and priorities, facilitate dialogue, brainstorm, and bargain, System 1 intuitive moral judgments may impede progress.

Mediators and negotiators would do well to study the broad concept of “moral” upon which the social intuitionist model and moral foundations theory are based. When parties seem stuck, or reluctant to give credence to what appear to be credible options that meet their stated interests, mediators and negotiators can look for indications of intuitive moral reasoning. They may find, embedded in an individual’s assessment, one or more intuitive judgments derived from the moral domains of care/harm, fairness/cheating, loyalty/betrayal (including in‐group/out‐group effects), authority/subversion, or sanctity/degradation.

The most promising strategies for engaging with others’ moral judgments are to: (1) appeal intuitively by shifting the focus within the moral domain in which the party is thinking or to another relevant moral domain, (2) use moral reframing, and (3) use storytelling. As they listen, mediators and negotiators can be alert to their own intuition about alternative moral perspectives that may shed a different light on the party’s moral judgment. In this way, they can attempt to engage intuitively and potentially inspire a new flash of intuitive moral judgment in the listener. Moral reframing studies suggest that with a reasonably accurate sense of a party’s moral values along one or more moral domains, a mediator or negotiator can reframe negotiation offers and other decision questions to emphasize the features that the other party would see as morally positive. Finally, the literature on narrative supports the use of stories, particularly those with a classic story arc, to convey ideas that bear on intuitive moral thinking.

In short, when it appears, despite best negotiation or mediation practices, that a party is stuck, mediators and negotiators can try something different—an intuitive approach based on the science of moral reasoning.

Notes

1. Some have argued that on a deeper level, harm is embedded in the scenarios. For example, Royzman, Kim, and Leeman argue that “subjects' reactions are wholly in line with the rationalist model of moral judgment” (Royzman, Kim, and Leeman 2015: 296; see also Jacobson 2012).

2. People with prefrontal cortex damage can function fairly well cognitively but suffer emotional deficits that drastically affect their ability to make decisions (Greene and Haidt 2002; Damasio 2003; Haidt 2007; see also Haidt 2012).

3. For articles that further review and synthesize this line of research, see Chapman et al. (2009); Chapman and Anderson (2012); and Critcher, Inbar, and Pizarro (2013).

4. Pursuant to mediation confidentiality agreements and applicable law, names and any details that would identify individuals, organizations, or the dispute context have been changed.

5. For example, those who view the civil rights movement positively might be inspired by the philosophy of Martin Luther King, Jr., which recommends love rather than hatred and retaliation (e.g., King 1981). Admirers of John F. Kennedy might be inspired by his words on civility: “So let us begin anew—remembering on both sides that civility is not a sign of weakness, and sincerity is always subject to proof” (Kennedy 1961). Those who struggle to adhere to their values can be inspired by Hillel: “If I am not for myself, who will be for me, and if I am only for myself, what am I? And if not now, when?” (Telushkin 2008: 117).

6. For MFT and mediator ethics, see Hyman (2015).

7. The American Bar Association's Litigation Section Implicit Bias Initiative has gathered extensive study and teaching material on implicit bias in judicial and legal settings. Its PowerPoint teaching presentation lays out connections between in‐group preferences and implicit biases. (See notes to American Bar Association 2022.)

References

Abrams, D., G. Randsley de Moura, and G. A. Travaglino. 2013. A double standard when group members behave badly: Transgression credit to ingroup leaders. Journal of Personality and Social Psychology 105(5): 799–815.

American Arbitration Association, American Bar Association, and Association for Conflict Resolution. 2005. Model standards of conduct for mediators. Available from https://www.americanbar.org/content/dam/aba/administrative/dispute_resolution/dispute_resolution/model_standards_conduct_april2007.pdf.

American Bar Association, Section of Litigation. 2022. Implicit bias initiative (PowerPoint). Available from https://www.americanbar.org/groups/litigation/initiatives/task-force-implicit-bias/ (accessed February 21, 2022).

Babcock, L., and G. Loewenstein. 1997. Explaining bargaining impasse: The role of self‐serving biases. Journal of Economic Perspectives 11(1): 109–126.

Bargh, J. A., and E. Morsella. 2008. The unconscious mind. Perspectives on Psychological Science 3(1): 73–79.

Bazerman, M. H., J. R. Curhan, D. A. Moore, and K. L. Valley. 2000. Negotiation. Annual Review of Psychology 51(1): 279–314.

Birke, R., and C. R. Fox. 1999. Psychological principles in negotiating civil settlements. Harvard Negotiation Law Review 4: 1–57.

Bowling, D., and D. Hoffman. 2003. Bringing peace into the room: How the personal qualities of the mediator impact the process of conflict resolution. San Francisco: John Wiley & Sons.

Bruneau, E., N. Dufour, and R. Saxe. 2012. Social cognition in members of conflict groups: Behavioural and neural responses in Arabs, Israelis and South Americans to each other's misfortunes. Philosophical Transactions: Biological Sciences 367(1589): 717–730. Available from https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3260847/.

Carruthers, P. 2007. The innate mind: Foundations and the future. New York: Oxford University Press.

Chapman, H. A., and A. K. Anderson. 2012. Understanding disgust. Annals of the New York Academy of Sciences 1251(1): 62–76.

Chapman, H. A., and A. K. Anderson. 2013. Things rank and gross in nature: A review and synthesis of moral disgust. Psychological Bulletin 139(2): 300.

Chapman, H. A., D. A. Kim, J. M. Susskind, and A. K. Anderson. 2009. In bad taste: Evidence for the oral origins of moral disgust. Science 323(5918): 1222–1226.

Cox, H. G. 2004. When Jesus came to Harvard: Making moral choices today. Boston, MA: Houghton Mifflin.

Critcher, C. R., Y. Inbar, and D. A. Pizarro. 2013. How quick decisions illuminate moral character. Social Psychological and Personality Science 4: 308–315.

Damasio, A. R. 2003. Looking for Spinoza: Joy, sorrow, and the feeling brain. New York: Houghton Mifflin Harcourt.

Debove, S., N. Baumard, and J. André. 2016. Models of the evolution of fairness in the ultimatum game: A review and classification. Evolution and Human Behavior 37(3): 245–254.

De Dreu, C. K. W., D. B. Dussel, and F. S. Ten Velden. 2015. In intergroup conflict, self‐sacrifice is stronger among pro‐social individuals, and parochial altruism emerges especially among cognitively taxed individuals. Frontiers in Psychology 6: Article 572.

Defense Advanced Research Projects Agency. n.d. Narrative networks (archived). Available from https://www.darpa.mil/program/narrative-networks (accessed April 22, 2021).

Doğruyol, B., S. Alper, and O. Yilmaz. 2019. The five‐factor model of the moral foundations theory is stable across WEIRD and non‐WEIRD cultures. Personality and Individual Differences 151: Article 109547. Available from https://www.sciencedirect.com/science/article/abs/pii/S0191886919304799?via%3Dihub.

Feinberg, M., and R. Willer. 2015. From gulf to bridge: When do moral arguments facilitate political influence? Personality and Social Psychology Bulletin 41(12): 1665–1681.

Feinberg, M., and R. Willer. 2019. Moral reframing: A technique for effective and persuasive communication across political divides. Social and Personality Psychology Compass 13(12): e12501.

Ganz, M. 2009. Organizing Obama: Campaign, organizing, movement. Paper presented at the American Sociological Association Annual Meeting, San Francisco, August. Available from http://nrs.harvard.edu/urn-3:HUL.InstRepos:27306258.

Graham, J., J. Haidt, S. Koleva, M. Motyl, R. Iyer, S. P. Wojcik, and P. H. Ditto. 2013. Moral foundations theory: The pragmatic validity of moral pluralism. In Advances in experimental social psychology (vol. 47), edited by P. Devine and A. Plant, 55–130. Boston, MA: Elsevier.

Graham, J., J. Haidt, and B. A. Nosek. 2009. Liberals and conservatives rely on different sets of moral foundations. Journal of Personality and Social Psychology 96(5): 1029–1046.

Greene, J. 2003. From neural ‘is’ to moral ‘ought’: What are the moral implications of neuroscientific moral psychology? Nature Reviews Neuroscience 4: 846–850.

Greene, J., and J. Haidt. 2002. How (and where) does moral judgment work? Trends in Cognitive Sciences 6(12): 517–523.

Haidt, J. 2001. The emotional dog and its rational tail: A social intuitionist approach to moral judgment. Psychological Review 108(4): 814–834.

Haidt, J. 2003. The moral emotions. In Handbook of affective sciences, edited by R. J. Davidson, K. Scherer, and H. Goldsmith, 852–870. Oxford: Oxford University Press.

Haidt, J. 2006. The happiness hypothesis: Finding modern truth in ancient wisdom. New York: Basic Books.

Haidt, J. 2007. The new synthesis in moral psychology. Science 316(5827): 998–1002.

Haidt, J. 2012. The righteous mind: Why good people are divided by politics and religion. New York: Pantheon Books.
Haidt
,
J.
, and
F.
Bjorklund
.
2008
.
Social intuitionists answer six questions about moral psychology
. In
Moral psychology, Vol. 2. The cognitive science of morality: Intuition and diversity
, edited by
W.
Sinnott‐Armstrong
,
181
217
.
Cambridge, MA
:
MIT Press
.
Haidt
,
J.
,
F.
Bjorklund
, and
S.
Murphy
.
2000
.
Moral dumbfounding: When intuition finds no reason
. In
Lund psychological reports
.
Lund, Sweden
:
Department of Psychology, Lund University
. Available from http://theskepticalzone.com/wp/wp‐content/uploads/2018/03/haidt.bjorklund.working‐paper.when‐intuition‐finds‐no‐reason.pub603.pdf
Haidt
,
J.
, and
J.
Graham
.
2007
.
When morality opposes justice: Conservatives have moral intuitions that liberals may not recognize
.
Social Justice Research
20
:
98
116
.
Haidt
,
J.
,
J.
Graham
, and
C.
Joseph
.
2009
.
Above and below left–right: Ideological narratives and moral foundations
.
Psychological Inquiry
20
(
2–3
):
110
119
.
Haidt
,
J.
, and
C.
Joseph
.
2004
.
Intuitive ethics: How innately prepared intuitions generate culturally variable virtues
.
Daedalus
133
(
4
):
55
66
.
Haidt
,
J.
C.
Joseph
.
2008
.
The moral mind: How five sets of innate intuitions guide the development of many culture‐specific virtues, and perhaps even modules
. In
The innate mind
(vol. 3), edited by
P.
Carruthers
,
S.
Laurence
, and
S.
Stich
,
367
391
.
New York
:
Oxford University Press
.
Han
,
S.
,
J. S.
Lerner
, and
R.
Zeckhauser
.
2012
.
The disgust‐promotes‐disposal effect
.
Journal of Risk and Uncertainty
44
:
101
113
.
Henrich
,
J.
,
J.
Ensminger
,
R.
McElreath
,
A.
Barr
,
C.
Barrett
,
A.
Bolyanatz
,
J. C.
Cardenas
,
M.
Gurven
,
E.
Gwako
,
N.
Henrich
, and
C.
Lesorogol
.
2010
.
Markets, religion, community size, and the evolution of fairness and punishment
.
Science
327
(
592
):
1480
1484
.
Hyman
,
J. M.
2015
.
Beyond fairness: The place of moral foundations theory in mediation and negotiation
.
Nevada Law Journal
15
:
959
991
.
Izumi
,
C.
2010
.
Implicit bias and the illusion of mediator neutrality
.
Washington University Journal of Law & Policy
34
:
71
155
.
Jacobson
,
D.
2012
.
Moral dumbfounding and moral stupefaction
. In
Oxford studies in normative ethics
(vol. 2), edited by
M.
Timmons
,
289
316
.
New York
:
Oxford University Press
.
Kahneman
,
D.
2011
.
Thinking, fast and slow
.
New York
:
Farrar, Straus and Giroux
.
Kalla
,
J.
,
A.
Levine
, and
D.
Broockman
.
2022
.
Personalizing moral reframing in interpersonal conversation: A field experiment
.
The Journal of Politics
84
(
2
). https://doi.org/10.1086/716944
Kaplan
,
J.
,
A.
Vaccaro
,
M.
Henning
, and
L.
Christov‐Moore
.
2021
.
Moral reframing of messages about mask‐wearing during the COVID‐19 pandemic
. Available from https://osf.io/gmqfw/
Kaur
,
R.
and
K.
Sasahara
.
2016
.
Quantifying moral foundations from various topics on twitter conversations
. Presented at 2016 IEEE International Conference on Big Data. Available from https://arxiv.org/pdf/1610.02991.pdf
Kennedy
,
J. F.
1961
.
Inaugural address
. In
American rhetoric [database online]
. Available from http://www.americanrhetoric.com/speeches/jfkinaugural.htm
King
,
M. L.
1981
.
Strength to love
.
Minneapolis, MN
:
Fortress Press
.
Kohlberg
,
L.
1971
.
Stages of moral development
. In
Moral education: Interdisciplinary approaches
, edited by
C. M.
Beck
,
B. S.
Crittenden
, and
E. V.
Sullivan
,
23
92
.
Toronto
:
University of Toronto Press
.
Malhotra
,
D.
, and
M. H.
Bazerman
.
2008
.
Psychological influence in negotiation: An introduction long overdue
.
Journal of Management
34
(
3
):
509
531
.
Paul
,
A. M.
2012
.
Your brain on fiction
. New York Times, March 17. Available from https://www.nytimes.com/2012/03/18/opinion/sunday/the‐neuroscience‐of‐your‐brain‐on‐fiction.html
Royzman
,
E. B.
,
K.
Kim
, and
R. F.
Leeman
.
2015
.
The curious tale of Julie and Mark: Unraveling the moral dumbfounding effect
.
Judgment and Decision Making
10
(
4
):
296
313
.
Russell
,
P.
, and
R.
Giner‐Sorolla
.
2011
.
The dangers of disgust in the courtroom
.
The Jury Expert
23
(
4
):
10
16
.
Schnall
,
S.
,
J.
Benton
, and
S.
Harvey
.
2008
.
With a clean conscience: Cleanliness reduces the severity of moral judgments
.
Psychological Science
19
(
12
):
1219
1222
.
Shapira
,
O.
(ed).
2021
.
Mediation ethics: A practitioner's guide
.
Chicago
:
American Bar Association
.
Stanovich
,
K. E.
, and
R. F.
West
.
2000
.
Individual differences in reasoning: Implications for the rationality debate?
Behavioral and Brain Sciences
23
(
5
):
645
665
.
Telushkin
,
J.
2008
.
Jewish literacy
.
New York
:
HarperCollins
.
Welsh
,
N. A.
2003
.
Perceptions of fairness in negotiation
.
Marquette Law Review
87
:
753
767
.
Wojcieszak
,
M.
, and
N.
Kim
.
2016
.
How to improve attitudes toward disliked groups: The effects of narrative versus numerical evidence on political persuasion
.
Communication Research
43
(
6
):
785
809
.
Zak
,
P. J.
undated.
Empathy, neurochemistry and the dramatic arc [video]
. Available from https://futureofstorytelling.org/video/paul‐zak‐empathy‐neurochemistry‐and‐the‐dramatic‐arc (accessed April 11, 2022).
Zak
,
P. J.
2015
.
Why inspiring stories make us react: The neuroscience of narrative
. Cerebrum. Feb. Available from https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4445577/?report=classic
This is an open-access article distributed under the terms of the Creative Commons Attribution 4.0 International (CC BY 4.0) license, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited. For a full description of the license, please visit https://creativecommons.org/licenses/by/4.0/legalcode.