The Five Ordered Steps of Problem-Solving:
Step 3: Evaluating Options

Step 3: Evaluate options

"Solutions" should usually not be accepted without being evaluated. As Paul Ylvisaker pointed out, "For every problem, there is a solution that is simple, quick, and wrong." (If you need reminding of simple, quick, and wrong "solutions", think about Trump suggesting  that  COVID could be cured with chloroquine or reportedly suggesting that hurricanes should be nuked.)

So, you usually should not make a decision by just going with your gut. However, going with your first impulse can be a good strategy when:
  1. you have made many decisions in this area and have gotten rapid feedback on the accuracy of your decisions, so you could be considered an expert in making these decisions.
  2. what matters is how you feel about your options but you are unable to verbalize the reason for your liking or disliking of those options (e.g., you may not know why you love someone or something).
  3. there is no objectively correct choice (e.g., should you get the red or the blue watch band?).
  4. you must make a decision immediately.

Outside of those four situations, how should you choose between options?

One view of how people should choose between options is that people should optimize: choose the best (optimal) option. Unfortunately, optimizing is not simple. Instead, optimizing requires doing 5 things:

  1. Consider all the options
  2. Consider all the pros and cons of all the options
  3. Determine the probabilities of each of those pros and cons
  4. Correctly weight the importance of each of those pros and cons
  5. Combine all the information about the pros and cons of all the options to arrive at the best (optimal) choice

To better understand everything that is involved in optimizing, look at the table below. Note that, as complex as this seems, it is an extremely oversimplified example of choosing among apartments. We are looking at only 3 options and only 3 characteristics of each option. In reality, there are probably more than 3 places you could consider, and you probably care about more than price, proximity to campus, and landlord. For example, you probably also care about how quiet the apartment is, how safe it is, how big it is, and how nice it is. Still, even this drastically oversimplified example shows you how complicated optimizing is.

 

Each option is scored on each criterion, each criterion is given an importance weight, and an option's total score is the sum, across criteria, of (score * importance).

Option 1: Price $500/month (score 3, importance 4); Location 2 miles from campus (score 2, importance 2); Landlord's reputation excellent (score 5, importance 4). Total score = (3 * 4) + (2 * 2) + (5 * 4) = 36.
Option 2: Price $400/month (score 4, importance 4); Location 5 miles from campus (score 1, importance 2); Landlord's reputation average (score 3, importance 4). Total score = (4 * 4) + (1 * 2) + (3 * 4) = 30.
Option 3: Price $700/month (score 1, importance 4); Location next to campus (score 5, importance 2); Landlord's reputation poor (score 1, importance 4). Total score = (1 * 4) + (5 * 2) + (1 * 4) = 18.
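If keeping track of all these scores and weights by hand gets tedious, a few lines of code can do the arithmetic for you. Below is a minimal Python sketch of the weighted-sum ("optimizing") calculation; the apartment names, scores, and importance weights are just the made-up sample values from the table above, so the printed totals should match the table.

```python
# Weighted-sum ("optimizing") calculation for the apartment example above.
# The criteria, scores (1 = worst, 5 = best), and importance weights are the
# sample values from the table; substitute your own for a real decision.

weights = {"price": 4, "location": 2, "landlord": 4}  # how much each criterion matters

options = {
    "Option 1 ($500/month, 2 miles from campus, excellent landlord)": {"price": 3, "location": 2, "landlord": 5},
    "Option 2 ($400/month, 5 miles from campus, average landlord)":   {"price": 4, "location": 1, "landlord": 3},
    "Option 3 ($700/month, next to campus, poor landlord)":           {"price": 1, "location": 5, "landlord": 1},
}

# An option's total score = sum of (its score on a criterion * that criterion's importance)
totals = {name: sum(scores[c] * weights[c] for c in weights) for name, scores in options.items()}

# Print the options from best to worst total score: 36, 30, then 18
for name, total in sorted(totals.items(), key=lambda pair: pair[1], reverse=True):
    print(total, name)
```

Even this tiny example shows why full-blown optimizing gets unwieldy fast: every new option or criterion adds more scores and weights to gather, estimate, and combine.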
 

As you have probably figured out, people usually do not optimize. Instead, they "satisfice" (choose the first satisfactory option, the "good enough" option). 
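To see the contrast in concrete terms, here is a minimal Python sketch of satisficing (the apartments and the "good enough" thresholds below are hypothetical, chosen only for illustration): rather than scoring every option, you accept the first option you encounter that meets all of your minimum standards.

```python
# Satisficing: accept the first option that meets your minimum standards,
# rather than scoring and comparing every option. Higher numbers = "better"
# on each criterion; all values below are made up for illustration.

def satisfice(options, good_enough):
    """Return the first option meeting every minimum requirement, or None."""
    for name, attributes in options:
        if all(attributes[criterion] >= minimum for criterion, minimum in good_enough.items()):
            return name
    return None

# Options in the order you happen to encounter them
apartments = [
    ("Apartment A", {"cheapness": 2, "closeness": 4, "landlord": 3}),
    ("Apartment B", {"cheapness": 4, "closeness": 3, "landlord": 4}),  # first "good enough" option
    ("Apartment C", {"cheapness": 5, "closeness": 5, "landlord": 5}),  # better, but never examined
]

print(satisfice(apartments, {"cheapness": 3, "closeness": 3, "landlord": 3}))  # prints "Apartment B"
```

Note that the search stops at Apartment B and never even looks at Apartment C, which is better on every criterion--that is the trade-off satisficing makes to save time and effort.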

Why don't we optimize?

Sometimes, we don't try to optimize because optimizing is stressful and because it is not worth our time to investigate the costs and benefits of every possible option (imagine spending hours finding the best pencil for the price).

Even if we try to optimize, we may fail for 7 reasons:

  1. Partly because of the limits of short-term memory (STM), we do poorly at:
    1. Considering all the options (thinking of more than 7 is tough)
    2. Considering all the pros and cons of each option (even with just 2 options, considering just two pros and two cons of each option would exceed STM capacity by giving us 8 things to keep in mind)
    3. Combining all the information about the pros and cons of all the options to arrive at the best (optimal) choice
    To get around the first two of these problems caused by short-term memory's limitations, you might write down all your options as well as their pros and cons (Example). To get around all three, you could use this decision-making program.

  2. We underestimate and overestimate risks. That is, we are bad at estimating the frequency of events (and thus how likely something is to happen) for a variety of reasons, including:
    1. the availability heuristic:  assuming that the easier it is to remember examples of something happening, the more often that thing occurs--and that the harder it is to remember examples of something happening, the less often that thing occurs. One way the availability heuristic can lead us astray is that some events, even though they don't occur very often, are easy to recall. So, recent and vivid events are seen as more likely than they really are (e.g., despite many people having a fear of flying, commercial airplane crashes are rare).

    2. Politicians and some in the media have used the availability heuristic against us. For example:
      • In 2016, Trump ran on a vision of America being unsafe due to violent crime, but, in fact, America's violent crime rate was almost half of what it had been in 1990.
      • Trump acted as if cities near the Mexican border were extremely dangerous places, largely due to undocumented immigrants from Mexico. In fact, it seems that immigrants are less likely than native-born Americans to commit crimes and that some southern border towns (e.g., El Paso) are among the safest cities in the country, whereas cities far from the Mexican border (e.g., Baltimore and Detroit) are among the most dangerous U.S. cities.
      • Trump has convinced some people that ANTIFA are a bunch of murderers. In fact, as of this writing, ANTIFA is responsible for only one death (and that may have been in self-defense). In general, right-wing extremists are responsible for much more violence than left-wing extremists (link to more recent data).
      • Some have argued that police are being gunned down at high rates and that COVID-19 is a hoax. However, recent figures show that 101 police officers died from COVID-19 while 82 died from all other causes combined (e.g., car accidents, being shot, etc.). In fact, some reports have 5X as many police officers dying from COVID as from gunfire.
      • Police are more likely to die from a car crash than from a shooting; yet many officers do not wear seat belts.
      • Being a police officer is a dangerous job. However, there are at least 18 jobs that are more dangerous. Jobs that are more than 2X as dangerous as being a police officer include commercial fishermen and fisherwomen (more than 7X as dangerous), loggers (more than 6X as dangerous), pilots (more than 3X as dangerous), roofers, steel workers, truck drivers, and garbage collectors.
      • Being a food delivery person is an extremely dangerous job, as you can see from this graph.
      • As Kristof (March 23, 2021) writes, "In a typical year in the U.S., more preschoolers are shot dead in America (about 75) than police officers are."

      • Note that the availability heuristic also fools us about whether we have a problem--or about what the problem is. For example, many people think the U.S. has an immigration problem due to having too many immigrants. In fact, the U.S. does have an immigration problem--but due to a lack of immigrants. The number of Americans who were born in a foreign country has shrunk by more than half since the 1990s--and, if not fixed, this immigrant shortage will have dramatic negative effects on Social Security (and, probably, on the entire economy--Japan has learned how a lack of immigrants sinks an economy).
    3. The confirmation bias: Once you get the idea that something is risky--or not risky--your tendency will be to find evidence that supports your view. To fight this tendency, seek out information that opposes your view. Thanks to Google, this is easy to do.
    4. We have trouble using base-rate information (information about what typically happens).
      • We pay more attention to stories (even though a story may be about just one person's experience) than to base-rate statistics, even when those statistics are based on millions of people.
      • Sometimes, averages do not apply to our situation (e.g., Although, in general, wearing a mask to prevent the spread of COVID-19 was a good idea in 2020, it was probably not necessary for a young person to wear a mask while biking in a rural area where COVID rates were low).
      • We often incorrectly assume that averages don't apply to us because we are unique. Usually, we exhibit an optimism bias: the belief that we are uniquely unlikely to have bad things happen to us. This bias is especially strong when we have some control over the outcome; thus, many people prefer to drive rather than to fly, even though driving is riskier. You may be able to stop yourself from falling for the optimism bias by asking what the risk would be to other people and then applying that risk to yourself.
    5. As just mentioned, we have an optimism bias, which may cause us to overestimate the chances that our solution will work.
      Examples of optimism bias:
      • Businesses think that mergers will be successful, even though 84% of merger deals did not boost shareholder return.
      • President Trump said that COVID-19 would go away by April 2020.
      • People die because they take unproven cancer "cures" when they could have been saved by traditional medicine.
      • The planning fallacy: Things take much longer than we think they will.
  3. We give some information too little or too much weight.
  4. We don't look at all our options, or we don't consider all the relevant criteria.
  6. "All or nothing" thinking: We see a solution as all good or all bad. The result may be that we see all the options as terrible. For example, since both Democrats and Republicans have some dishonest members, we may say, "They are all terrible." As the saying goes, "the perfect is the enemy of the good." Similarly, we may hold out for the perfect solution--which will never come. Examples of all-or-nothing thinking hurting decision making
  6. We are loss averse: the pain of losing something is much greater than the pleasure of gaining something (insurance companies and banks depend on us being loss averse). Because we are loss averse, we are vulnerable to framing effects: the way the problem is worded affects the decision we will make. That is, we can be manipulated by how the problem is framed (stated). For example, patients (and doctors!) are more likely to endorse a surgery when they are told that 90% of patients will survive the surgery than when they are told that 10% of patients will die from the surgery (obviously, if 90% survive, 10% die).
  7. In addition to all of the above problems, if we are part of a group that is making a decision, we may also mess up due to problems that are specific to group decision making.