10 Problems With How We Think



February 12, 2014, 12:00 AM



By nature, human beings are illogical and irrational. For most
of our existence, survival meant thinking quickly, not methodically.
Making a life-saving decision was more important than making a 100%
accurate one, so the human brain developed an array of mental shortcuts.




Though not as necessary as they once were, these shortcuts -- called
cognitive biases or heuristics -- are numerous and innate. Pervasive,
they affect almost everything we do, from the choice of what to wear, to
judgments of moral character, to how we vote in presidential elections.
We can never totally escape them, but we can be more aware of them
and, just maybe, take steps to minimize their influence.


Ross Pomeroy summarizes ten widespread faults with human thought at Real Clear Science. You can read the original here.




1. Sunk Cost Fallacy




Thousands of graduate students know this fallacy all too well. When
we invest time, money, or effort into something, we don't like to see
that investment go to waste, even if the task, object, or goal is no
longer worth the cost. As Nobel Prize-winning psychologist Daniel Kahneman explains, "We refuse to cut losses when doing so would admit failure, we are biased against actions that could lead to regret."


That's why people finish their overpriced restaurant
meal even when they're stuffed to the brim, or continue to watch a
horrible television show they don't even like anymore, or remain in a
dysfunctional relationship, or soldier through grad school even after
deciding they hate their chosen field.




2. Conjunction Fallacy




Sit back, relax, and read about Linda:


Linda is thirty-one years old, single, outspoken, and very bright.
She majored in philosophy. As a student, she was deeply concerned with
issues of discrimination and social justice, and also participated in
antinuclear demonstrations.




Now, which alternative is more probable?


1. Linda is a bank teller, or


2. Linda is a bank teller and is active in the feminist movement.









If you selected the latter, you've just blatantly defied logic. But
it's okay; about 85 to 90 percent of people make the same mistake. The
mental sin you've committed is known as the conjunction fallacy. Think
about it: it can't possibly be more likely for Linda to be both a bank teller
and a feminist than to be just a bank teller. If she's a bank teller, she
could still be a feminist -- or fit any number of other descriptions.




A great way to see the error is to draw a Venn diagram. Label one
circle "bank teller" and the other "feminist." Notice that the area
where the circles overlap can never be larger than either circle on
its own!
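If you prefer to see the arithmetic, the rule at work is that a conjunction can never be more probable than either of its parts: P(teller and feminist) ≤ P(teller). Here is a minimal sketch in Python, with base rates invented purely for illustration:

```python
import random

random.seed(0)

# Hypothetical population: each person is (is_teller, is_feminist).
# The probabilities below are made up for illustration only.
people = [(random.random() < 0.05, random.random() < 0.30)
          for _ in range(100_000)]

p_teller = sum(1 for t, f in people if t) / len(people)
p_both = sum(1 for t, f in people if t and f) / len(people)

# "Teller and feminist" is a subset of "teller", so p_both can
# never exceed p_teller, no matter what numbers we plug in.
print(p_teller, p_both, p_both <= p_teller)
```

However vivid the description of Linda, adding a second condition can only shrink the set of people who satisfy both.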




3. Anchoring




Renowned psychologists Amos Tversky and Daniel Kahneman once rigged a
wheel of fortune, just like the one you'd see on the game show. Though
labeled with values from 0 to 100, it would stop only at 10 or 65. As an
experiment, they had unwitting participants spin the wheel, write down
the number, and then answer a two-part question:




Is the percentage of African nations among UN members larger or
smaller than the number you just wrote? What is your best guess of the
percentage of African nations in the UN?




Kahneman described what happened next in his book Thinking, Fast and Slow:


The spin of a wheel of fortune... cannot possibly yield any useful
information about anything, and the participants... should have simply
ignored it. But they did not ignore it.




Participants who saw the number 10 on the wheel estimated the
percentage of African nations in the UN at 25% on average, while those
who saw 65 gave a much higher average estimate, 45%. Participants'
answers were "anchored" by the numbers they saw, and they didn't even
realize it! Any piece of information, however inconsequential, can
affect subsequent assessments or decisions. That's why it's in a car
dealer's best interest to keep list prices high: when you negotiate
down from a lofty anchor, you'll still think you're getting a good
deal, and the dealer will still earn more money.




4. Availability Heuristic




When confronted with a decision, humans regularly make judgments
based on recent events or information that can be easily recalled. This
is known as the availability heuristic.




Says Kahneman,
"The availability heuristic... substitutes one question for another:
you wish to estimate... the frequency of an event, but you report the
impression of ease with which instances come to mind."




Cable news provides plenty of fodder for this mental shortcut. For example, viewers of Entertainment Tonight probably think that celebrities divorce each other once every minute. The actual numbers are more complicated, and far less sensational.




It's important to be cognizant of the availability heuristic because
it can lead to poor decisions. In the wake of the tragic events of 9/11,
with horrific images of burning buildings and rubble fresh in
their minds, politicians quickly voted to implement invasive policies
intended to make us safer, such as domestic surveillance and more
rigorous airport security. We've been dealing with, and griping about,
the results of those actions ever since. Were they truly justified? Or
did we fall victim to the availability heuristic?




5. Optimism Bias




"It won't happen to me" isn't merely a cultural trope. Individuals
are naturally biased to thinking that they are less at risk of something
bad happening to them compared to others. The effect, termed optimism bias,
has been demonstrated in studies across a wide range of groups. Smokers
believe they are less likely to develop lung cancer than other smokers,
traders believe they are less likely to lose money than their peers,
and everyday people believe they are less at risk of being victimized in
a crime.




Optimism bias particularly factors into matters of health, prompting individuals to neglect salubrious behaviors like exercise, regular visits to the doctor, and condom use.




6. Gambler's Fallacy




On August 18, 1913, during a game of roulette at the Monte Carlo
Casino, the ball fell on black 26 times in a row. In the wake of the
streak, gamblers lost millions of francs betting against black. They
assumed, quite fallaciously, that the long run of black had created an
imbalance in the wheel's randomness, and that Nature would now correct
the mistake.




No mistake was made, of course. Past outcomes of independent random events in no way affect future ones, yet people regularly intuit that they do.
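A quick simulation makes that independence concrete. Below is a minimal sketch assuming a fair European wheel, where black comes up 18 times out of 37; the streak length and spin count are arbitrary choices for illustration:

```python
import random

random.seed(42)

P_BLACK = 18 / 37  # fair European wheel: 18 black, 18 red, 1 green
spins = [random.random() < P_BLACK for _ in range(1_000_000)]

# Collect the outcome of every spin that follows five blacks in a row.
STREAK = 5
after_streak = [spins[i] for i in range(STREAK, len(spins))
                if all(spins[i - STREAK:i])]

# The frequency of black after a streak stays at the base rate (~0.486);
# the wheel owes nobody a correction.
print(sum(after_streak) / len(after_streak))
```

No matter how long a streak you condition on, the next-spin frequency hovers around 18/37, exactly as if the streak had never happened.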




7. Herd Mentality




We humans are social creatures by nature. The innate desire to be
"part of the group" often outweighs any consideration of well-being and
leads to flawed decision-making. For a great example, look no further
than the stock market. When indexes start to tip, panicked investors
frantically begin selling, sending stocks even lower, which, in turn,
further exacerbates the selling. Herd mentality also spawns cultural
fads. In the back of their minds, pretty much everybody knew that pet
rocks were a waste of money, but lots of people bought them anyway.





8. Halo Effect




The halo effect is a cognitive bias in which we judge a person's character based upon our rapid, and often oversimplified, impressions of him or her. The workplace is a haven -- more accurately, an asylum -- for this sort of faulty thinking.




"The halo effect is probably the most common bias in performance appraisal," researchers wrote in the journal Applied Social Psychology in 2012. The article goes on:


Think about what happens when a supervisor evaluates the performance
of a subordinate. The supervisor may give prominence to a single
characteristic of the employee, such as enthusiasm, and allow the entire
evaluation to be colored by how he or she judges the employee on that
one characteristic. Even though the employee may lack the requisite
knowledge or ability to perform the job successfully, if the employee's
work shows enthusiasm, the supervisor may very well give him or her a
higher performance rating than is justified by knowledge or ability.


9. Confirmation Bias




Confirmation bias is
the tendency of people to favor information that confirms their
beliefs. Even those who avow complete and total open-mindedness are not
immune. This bias manifests in many ways. When sifting through evidence,
individuals tend to value anything that agrees with them -- no matter
how inconsequential -- and instantly discount that which doesn't. They
also interpret ambiguous information as supporting their beliefs.




Hearing or reading information that backs our beliefs feels good, and
so we often seek it out. A great many liberal-minded individuals treat
Rachel Maddow's or Bill Maher's words as gospel. At the same time, tons
of conservatives flock to Fox News and absorb almost everything said
there without a hint of skepticism.




One place where it's absolutely vital to be aware of confirmation
bias is in criminal investigation. All too often, when investigators
have a suspect, they selectively search for, or erroneously interpret,
information that "proves" the person's guilt.




Though you may not realize it, confirmation bias also pervades your
everyday life. Ever searched Google for the answer to a controversial
question? When the results come in, don't you click first on the
result whose title or summary backs your hypothesis?




10. Discounting Delayed Rewards




If offered $50 today or $100 in a year, most people take the money and run,
even though it's technically against their best interests. However, if
offered $50 in five years or $100 in six years, almost everybody chooses
the $100! When confronted with low-hanging fruit in the Tree of Life,
most humans cannot resist plucking it.




This is best summed up by the Ainslie-Rachlin Law,
which states, "Our decisions... are guided by the perceived values at
the moment of the decision -- not by the potential final value."
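One way to see how such a preference can flip is with a hyperbolic discounting model of the kind Ainslie and Rachlin studied, in which a reward's perceived value is V = A / (1 + kD) for amount A, delay D, and an impatience parameter k. The sketch below uses k = 1.5 per year, an invented value chosen purely to illustrate the reversal:

```python
def perceived_value(amount, delay_years, k=1.5):
    """Hyperbolic discounting: perceived value shrinks with delay.
    k is an illustrative impatience parameter, not an empirical estimate."""
    return amount / (1 + k * delay_years)

# $50 today vs. $100 in a year: the immediate $50 feels bigger.
print(perceived_value(50, 0), perceived_value(100, 1))   # 50.0 vs 40.0

# $50 in five years vs. $100 in six: now the $100 feels bigger.
print(perceived_value(50, 5), perceived_value(100, 6))   # ~5.9 vs 10.0
```

Both choices offer the same trade -- wait one extra year, double the money -- but the perceived values at the moment of decision reverse once the smaller option loses its immediacy.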






