It Seemed Like a Good Idea at the Time: How to Make Effective Decisions Under Conflict

Effective, critical decision-making is essential in our work but can be very difficult for attorneys and clients involved in conflict. We tend to believe our own reactions are controllable and that others will act “rationally.” However, this is not always the case. Research has shown that we are not hardwired for effective decision-making. Often, after making a critical decision that seemed like a good idea at the time, we look back and realize we could have decided better. What is it that prevents us from making good decisions when we’re in conflict?

Thinking Roadblocks When in Conflict

Decision-making is the process of selecting the best option from the available choices, beliefs, or courses of action. Effective decision-making can be difficult and involves many factors, including accurate perception of existing and new information; thorough examination of alternatives, objectives, and values; creative and critical thinking; sound reasoning; emotional control; reliable intuition; previous experience; and the ability to answer the question “what if I’m wrong?”

In a simplistic form, our thinking operates at two levels or systems:

  • “System 1 thinking” is a fast, automatic, unconscious, and emotional process that responds to situations and stimuli. These thinking shortcuts are habitual and rely on lower-energy brain processes.[1] For example, have you ever driven to work without thinking about it? That is System 1 thinking at work. System 1 shortcuts often rely upon heuristics and cognitive biases.
  • In contrast, “System 2 thinking” is slow, conscious, and effortful; the brain engages it to solve more complicated problems, and it requires more energy.

We tend to “frame” information, which focuses attention and influences our decision-making. Often, the information we pay attention to is the information we prefer to rely upon, and it shapes the decisions we make. To make decisions, we regularly fall back on System 1 thinking, which uses unconscious “heuristics” and “cognitive biases.” Heuristics are neurological decision-making shortcuts created to simplify thinking. Cognitive biases are unconscious errors in thinking that result from the brain’s attempt to simplify information processing.[2] While heuristics and cognitive biases can be useful for making quick decisions, these System 1 shortcuts can produce detrimental cognitive errors in how we see ourselves and how we evaluate and respond to conflict.

To some degree, we all fall back on System 1 thinking when we are involved in conflict. Like it or not, people regularly employ heuristics in their own conflicts. Some common heuristics and cognitive biases you may see people experience in conflict include:[3]

  • Confirmation bias. Confirmation bias occurs when an individual interprets the existing evidence and information in a partial way that confirms their own existing beliefs, expectations, or decisions.[4]
  • Lake Wobegon effect/overconfidence bias. Garrison Keillor, the host of the long-running American radio program A Prairie Home Companion, often referred to the fictional small town of Lake Wobegon, Minnesota, “where all the women are strong, all the men are good-looking, and all the children are above average.” The “Lake Wobegon effect” is a tendency to overestimate our own abilities, achievements, and performance, creating unsupported overconfidence.[5]
  • Illusion of control bias. The illusion of control bias occurs when people think they are more likely to win than is objectively the case and believe they can influence outcomes that are, in fact, beyond their control.[6] As part of this bias, they may also believe they exert control over the situation by setting events in motion and then letting a deity provide the favorable outcome.
  • Availability heuristic. The availability heuristic leads an individual to believe that he or she understands the intricacies of the problem better than anyone else. Relying on System 1 shortcuts, a person draws on the immediate examples that come to mind when making a decision and operates on the premise that if something can be recalled, it must be important, or more important than the alternative options.
  • Regret aversion. Regret aversion bias occurs when someone must choose between two or more sub-optimal options; he or she tends to delay and ultimately picks the one believed to cause the least regret in the future.[7]
  • Status quo bias. An emotional bias that produces a preference for maintaining the current situation and a resistance to change.
  • Endowment effect. An emotional bias that causes individuals to value what they “own”, often irrationally, above its true value.[8] For example, a pet or a family heirloom.
  • Fundamental attribution error. An individual’s tendency to attribute another’s actions to their character or personality, while attributing their own behavior to external situational factors outside of their control.[9] For example, consider what you think of someone who cuts in front of you while driving versus what you think of yourself when you cut in front of someone when you are driving.

How Do You Avoid and Help Others Overcome Thinking Roadblocks?

When parties in conflict are not achieving their goals, they become frustrated and emotional and regularly revert to System 1 thinking. While relying on System 1 thinking, they are more likely to depend unknowingly on heuristics and cognitive biases and are unlikely to make effective decisions. The goal is to mitigate emotion and use communication skills to de-bias decision-making, helping individuals move to higher-level, critical System 2 thinking. These methods include:

  • Affect labeling;
  • Recognizing the cognitive bias;
  • Framing and reframing; and
  • Asking questions to elicit change.

These skills will be more fully discussed and practiced during the presentation. They aim to help decision-makers better consider their circumstances, options, and outcomes by: considering all alternatives; surveying the full range of objectives and values implied by each choice; weighing the costs and risks of both good and bad consequences; examining all available information; assessing new data; re-examining the consequences of all known alternatives; planning thoroughly; assessing potential cognitive biases; and answering the question “what if I’m wrong?” The focus turns to analysis rather than decisions based on heuristics, intuition, or random choice.


[1] Do We Make Mental Shortcuts: Heuristics Explained.

[2] What Is Cognitive Bias?; see also Cognitive Bias; What Is a Cognitive Bias?

[3] For examples of cognitive biases, see the sources cited in these notes.

[4] Raymond S. Nickerson, Confirmation Bias: A Ubiquitous Phenomenon in Many Guises (1998).

[5] Maylett, Do You Suffer from the Lake Wobegon Effect?

[6] W. Lendermon, The Illusion of Control Bias.

[7] Why Do We Anticipate Regret Before We Make a Decision?: Regret Aversion Explained.

[8] Why Do We Value Items More if They Belong to Us?: The Endowment Effect, Explained.

[9] Patrick Healy, The Fundamental Attribution Error: What It Is & How to Avoid It (June 8, 2017).