Why is defining the problem the most important step?
Because defining the problem defines the solution. That is, the diagnosis determines the treatment. For example, if you are diagnosed with the flu, you get a different treatment than if you are diagnosed with a cold.
Quotes illustrating the importance of defining the problem:
"If I had an hour to solve a problem, I'd spend 55 minutes thinking about the problem and 5 minutes thinking about solutions." --unknown, but often attributed to Albert Einstein
"If you define the problem correctly, you almost have it solved." --Steve Jobs
"We are all faced with a series of great opportunities brilliantly disguised as impossible situations." --Charles Swindoll
Examples of insights due to re-defining the problem:
Animal shelters asking "How can we help owners keep their pets?" rather than asking "How can we get all these abandoned pets adopted?"
Behaviorists--and clever parents--asking "How can we get children to do good behaviors instead of bad behaviors?" rather than asking "How can we stop bad behaviors?"
Defining the problem of saving gasoline (and the environment) by trying to decrease gallons per mile rather than trying to increase miles per gallon. To understand why this re-definition is so useful, see the sketch after this list.
People often say, "If I like x, I will buy it" when they should be asking "Would buying x make me happier than spending that money on something else?"
Riddles can be challenging because they are usually problems that are deliberately stated in a way that either defines the problem poorly or that causes most people to define the problem poorly (e.g., "How many months of the year have 28 days?"--all 12, because every month has at least 28 days).
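To see why the gallons-per-mile framing is so useful, here is a minimal arithmetic sketch in Python (the vehicles, annual mileage, and MPG figures are made-up illustrations, not numbers from the original):

    # Compare two upgrades that each double a vehicle's MPG.
    # Assumed scenario: each vehicle is driven the same 10,000 miles per year.
    def gallons_used(miles, mpg):
        """Invert miles-per-gallon into fuel actually consumed (gallons)."""
        return miles / mpg

    MILES = 10_000

    # Upgrade A: a 10-MPG truck improves to 20 MPG.
    saved_a = gallons_used(MILES, 10) - gallons_used(MILES, 20)  # 1000 - 500 = 500 gallons

    # Upgrade B: a 25-MPG car improves to 50 MPG.
    saved_b = gallons_used(MILES, 25) - gallons_used(MILES, 50)  # 400 - 200 = 200 gallons

    print(f"Upgrade A saves {saved_a:.0f} gallons; Upgrade B saves {saved_b:.0f} gallons")

Both upgrades double miles per gallon, yet the low-MPG upgrade saves 2.5 times as much fuel--a fact that is obvious in gallons per mile but hidden in miles per gallon.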
Why don't we know what the problem is? 8 pitfalls in defining the problem.
Not recognizing that there is a problem because:
We ignore "near misses": We don't worry about cell phones distracting us from driving because we haven't had an accident yet; in industrial settings, the serious accident often occurs only after about 50 (ignored) near accidents have occurred.
We may not notice the problem because things are getting worse slowly or inconsistently (e.g., global warming).
Not accepting there is a problem ("Denial is not just a river in Egypt"). Examples:
Normalcy bias: "negative panic"--acting like things are normal when they clearly are not. Ex: People leaving slowly--or not at all--from dangerous situations such as the World Trade Center during the 9/11 attack.
Denying that smoking causes cancer.
Denying the threat caused by COVID-19.
Denying that there are problems with U.S. policing and justice systems.
The addict denying having a drug problem.
Narrowly defining the problem in a way that eliminates options or not realizing that a problem can be defined in several ways.
When confronted with COVID-19, Trump apparently framed the problem as "Can we have a good economy OR fight COVID?"--when he could have framed the problem as "How can we have a good economy AND fight COVID?"
Trump has also been accused of asking "How can we have a good economy by wrecking the environment?"--when he could have asked "How can we have a good economy AND improve the environment?"
Similarly, many people define problems in "all or none" terms, such as "Should we stay married or should we get a divorce?"
Experience a simple example of defining the problem too narrowly: the nine-dot problem.
Defining the problem in a way that is too vague. Example: A student says "I am having trouble with the course" or "I did not do well on the last exam." This would be like a physician telling a patient that "there is something wrong" or a psychologist saying that a patient "has issues."
Biases may cause us to misidentify the cause of the problem. As Maslow wrote, "it is tempting, if the only tool you have is a hammer, to treat everything as if it were a nail." Examples:
Republicans seem to think, despite the evidence, that all problems are caused by taxes and all problems can be solved by cutting taxes.
Republicans seem to think that budget deficits are bad--when there is a Democratic president.
Four particularly common and powerful biases that hurt our ability to find the real cause of a problem:
The "Not me" bias: We often don't take responsibility for our contribution to the problem. For example, you have heard people say things like:
"It's not my fault."
"Look at what you made me do!"
"You are making me mad."
"That's a nasty question."
"Fake news!"
One way to own your problem is suggested by Timothy Ferriss: "...tell my story to myself from the perspective of a victim, then I tell the exact same story from a place of 100 percent responsibility."
The fundamental attribution error. Personalizing problems: blaming people rather than situations. As anyone who has been stuck in traffic or in a bad job knows, bad environments can make even mature, rational people do immature, irrational things.
Preconceptions bias perception and memory, as shown by the confirmation bias: We look for and remember evidence that is consistent with our beliefs, and we interpret neutral, ambiguous, or conflicting evidence as supporting our beliefs. So, if we believe that Joe is a bad employee, we will be more likely to notice and remember the times when Joe makes mistakes than we are to notice and remember times when Joe does an average or good job. Furthermore, we tend to "see" what we expect to see. So, if you expect Joe to be a troublemaker, you may interpret his behavior more negatively than if you expect Joe to be a team player.
Preconceptions create reality (behavioral confirmation): If a teacher expects a student to do poorly, that student is more likely to do poorly than if the teacher expects that student to do well.
We are misled because:
We think that we--and other people--know our own minds. So, if you or someone else earnestly says that x is the cause, we believe that x is the cause. However, as Nisbett and Wilson showed way back in 1977, as well as in other studies, what people believe is causing their behavior is often not what is really causing their behavior. Put another way, people are better at knowing that there is a problem than knowing the cause of that problem.
We want to see a relationship problem as being due to one of the people in the relationship, but the problem may be in the relationship--in the way that the people interact.
We fall for coincidences. The person who says "I don't believe in coincidence," like the person who says "I don't believe in con artists," is easily fooled. The world is a noisy, messy, coincidence-filled place, and so figuring out what is related to the problem and what is a coincidence is difficult. As a result, we may need to rely on scientists to determine what is and what is not a coincidence. Many people refuse to do so; consequently, we have people arguing that cigarettes don't cause cancer and that humans are not contributing to global warming.
We create illusory correlations: Even if we aren't victimized by coincidences, we may see an unrelated set of events as related. Prejudices and superstitions are sometimes the result of relationships that exist only in our heads.
We mistake symptoms or effects for causes: Even if we correctly figure out what factors are related to our problem, these factors may only be side effects of the real cause. As scientists say, "Correlation is not causation."
We assume we know a person's motives (the "cause" of their behavior), but we are really only guessing--and our guess could be wrong.
Even if we correctly identify a cause of the problem, what we think is an important source of the problem may be unimportant, whereas a factor that we think is unimportant may turn out to be very important. For example, people think the key to losing weight is to exercise more, but cutting calories is a much more effective way to lose weight than increasing exercise, as the rough arithmetic below suggests. Although figuring out which causes are the most important ones is difficult, if we take advantage of science, expertise, and experience, we may be able to learn what factors to focus on, thus allowing us to take advantage of the 80/20 rule (also known as the Pareto Principle).
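Here is that rough arithmetic, sketched in Python with commonly cited rule-of-thumb figures (about 100 kcal burned per mile walked or run, about 3,500 kcal per pound of body fat; both are approximations, not numbers from the original):

    # Rule-of-thumb estimates (approximate):
    KCAL_PER_MILE = 100    # walking or running one mile
    KCAL_PER_POUND = 3500  # energy stored in one pound of body fat

    # Losing one pound through exercise alone:
    miles = KCAL_PER_POUND / KCAL_PER_MILE
    print(f"~{miles:.0f} miles of walking per pound lost")        # ~35 miles

    # Losing one pound by skipping a 500-kcal dessert each day:
    days = KCAL_PER_POUND / 500
    print(f"~{days:.0f} days of skipping a 500-kcal dessert")     # ~7 days

Under these assumptions, skipping one dessert a day achieves in about a week what would otherwise take roughly 35 miles of walking, which is why the calorie factor usually matters more than the exercise factor.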
Incorrectly identifying what kind of problem we have, so we try to solve one type of problem when we should be trying to solve a different type of problem.
Example 1: What rule is determining the sequence of these numbers? 8, 5, 4, 9, 1, 7, 6, 10, 3, 2
The digits (eight, five, four, etc.) are in alphabetical order. (A quick check of this rule is sketched after the examples below.)
Two other examples: Think of the last time you applied the wrong formula to a word problem, or heard of a friend who was misdiagnosed by a doctor.
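As a quick verification of the Example 1 rule, here is a short Python sketch (the number names are just the standard English spellings):

    # Check that 8, 5, 4, 9, 1, 7, 6, 10, 3, 2 is the digits 1-10
    # sorted by the alphabetical order of their English names.
    names = {1: "one", 2: "two", 3: "three", 4: "four", 5: "five",
             6: "six", 7: "seven", 8: "eight", 9: "nine", 10: "ten"}

    puzzle = [8, 5, 4, 9, 1, 7, 6, 10, 3, 2]
    alphabetical = sorted(names, key=lambda n: names[n])

    print(alphabetical)            # [8, 5, 4, 9, 1, 7, 6, 10, 3, 2]
    print(alphabetical == puzzle)  # True

Note that "seven" precedes "six" alphabetically, which is easy to miss when scanning the sequence.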
Sometimes, we misidentify the kind of problem we have because of the representativeness heuristic: a general rule used when people decide whether something is an example of a category. If what we are looking at matches our memory of a typical instance of a category, we will classify that thing as being a member of that category. For example, you determine whether someone is a child or an adult based on their appearance matching your memorized examples of children and adults. The advantage of the representativeness heuristic is that people can take advantage of their experiences and their expertise.
For example, a doctor can quickly diagnose a patient who has a disease that the doctor has seen hundreds of times before. Unfortunately, because problems that look similar may be very different, the representativeness heuristic may lead not only to stereotyping, but to misdiagnosing the problem by overlooking key differences between this new problem and old problems.
The representativeness heuristic may also cause us to ignore important information. For example, a doctor might, seeing that the patient's symptoms matched malaria, use the representativeness heuristic to diagnose the patient as having malaria, even though the patient probably didn't have malaria given one important fact: there hadn't been a malaria case where the patient lived in over 50 years.
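To see how badly a near-zero base rate undercuts even a strong symptom match, here is a back-of-the-envelope Bayes' rule calculation in Python. All of the probabilities are made-up illustrations; the original gives no figures:

    # Hypothetical numbers for illustration only.
    p_symptoms_given_malaria = 0.90     # symptom pattern appears in most malaria cases
    p_symptoms_given_other   = 0.05     # ...but also in some other illnesses
    p_malaria                = 0.00001  # base rate: ~1 in 100,000 locally

    # Bayes' rule: P(malaria | symptoms)
    numerator   = p_symptoms_given_malaria * p_malaria
    denominator = numerator + p_symptoms_given_other * (1 - p_malaria)
    print(f"P(malaria | matching symptoms) = {numerator / denominator:.5f}")
    # ~0.00018 -- even a strong symptom match barely moves a tiny base rate

Under these assumptions, the matching symptoms raise the probability of malaria from about 1 in 100,000 to only about 1 in 5,500--still overwhelmingly unlikely.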
Not testing your assumptions about the cause of the problem because of the confirmation bias.