Why is it important to test what we think of as common sense if we already believe it is true?

Before reading further, click here to test your psychology knowledge. Don't worry; it isn't graded! Now, click here to see the answers. How did you do? Most people miss at least a few.

Why do we think we know so much about psychology?

Popular psychology lore permeates our culture. Every day we hear soundbites from poorly vetted sources across various forms of media. Pop psychology is often based on flimsy studies with poorly designed methods. We then integrate this unsupported knowledge into our fund of information, and the psycho-myths spread and become difficult to challenge.

Beyond our exposure to pop psychology, internal forces shape our notions about human behavior. Many people think answers to psychological questions are obvious. Every day on social media, I see commentary asking why we need to do scientific research to understand human behavior that is just "common sense."

In fact, many people believe their intuition—an immediate, automatic feeling or thought—is always (or almost always) accurate. And sometimes it is. Often, however, we are wrong.

Flawed thinking often leads us astray

We use heuristics (mental shortcuts) to understand the world around us. Sometimes these shortcuts are helpful, allowing us to make quick decisions in a fast-moving world. Just as frequently, though, they lead us to poor decisions.

If I asked you which city is farther west in the United States, Reno, NV, or San Diego, CA, what would you say? Most people use a heuristic: California is west of Nevada, so San Diego must be west of Reno. Guess what? That is wrong. Reno lies west of San Diego. Are you surprised?
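
For the skeptical reader, here is a minimal sketch that checks the claim against approximate longitudes. The coordinates are rounded values supplied here for illustration; they are not figures from this article.

```python
# Quick sanity check of the Reno vs. San Diego claim using approximate,
# rounded city-center longitudes (illustrative values, not from the article).

reno_lon = -119.81       # Reno, NV lies at roughly 119.81 degrees west
san_diego_lon = -117.16  # San Diego, CA lies at roughly 117.16 degrees west

# In the western hemisphere, a more negative longitude means farther west.
westernmost = "Reno" if reno_lon < san_diego_lon else "San Diego"
print(f"Farther west: {westernmost}")  # -> Farther west: Reno
```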

Another way to ponder our intuition about the world is to consider the adages that guide our everyday thinking. Here are several "common-sense" sayings that many of us have integrated into our thinking.

  1. Birds of a feather flock together.
  2. Opposites attract.
  3. Absence makes the heart grow fonder.
  4. Out of sight, out of mind.
  5. Better safe than sorry.
  6. Nothing ventured, nothing gained.
  7. Two heads are better than one.
  8. Too many cooks spoil the broth.
  9. Actions speak louder than words.
  10. The pen is mightier than the sword.

Do you notice anything peculiar about this list? You have likely heard all or most of these proverbs, and you may apply them to situations in your life. For example, when a couple you know lives apart and ends up breaking up, you may say to yourself, "Out of sight, out of mind." Other times, when a couple maintains a strong bond despite the distance, you may think, "Absence makes the heart grow fonder."

Look carefully; each of these sayings is contradicted by another statement on the list. How can we treat one assertion as truth while also holding the contradictory proverb as fact?

Why we need science and logical thinking

Most of us navigate our daily lives believing we see the world as it is. We are often unaware of the forces that compel us to formulate notions about the world. Our naive realism—the belief that we see the world precisely as it is—creates the foundation for these errors in thinking. We rarely stop to consider that our beliefs might be inaccurate.

The need for psychological science and the use of logical—rather than intuitive—reasoning to understand human behavior becomes apparent when we recognize our mistaken assumptions about the basics of human behavior.

Most people hear the word science and envision a person in a white lab coat, perhaps looking through a microscope—chemistry, physics, and biology come to mind. In reality, science is simply a systematic approach to evidence: a toolbox we use to keep from fooling ourselves about what we observe.

Science is a safeguard against bias. We use empirical methods and stringent statistical testing to determine which ideas we should keep and which we should abandon. The scientific method is applied across many disciplines, including the study of human behavior.

Three reasons we trust intuition

While there are various cognitive biases and logical fallacies that contribute to the need for science when studying human behavior, three key ideas can help us understand why we often trust our common sense when we shouldn't.

1. Hindsight Bias

"Anything seems commonplace, once explained." —Sherlock Holmes

The "I-knew-it-all-along" phenomenon plagues many of us. It is easy to see outcomes after the fact and be certain we could have predicted them. This Monday-morning quarterbacking is pervasive. More than 800 research papers have shown that hindsight bias—the tendency, after the fact, to believe we would have predicted an outcome—is a worldwide phenomenon that affects people of all demographic groups.

2. Overconfidence

"We don't like their sound. Groups of guitars are on their way out." —Decca Records turning down a contract with the Beatles in 1962.

Most of us overestimate our own knowledge. Again and again, research has shown that people's confidence in what they know far exceeds the accuracy of what they know. From the Titanic sinking to the Challenger explosion, overconfidence has proven to be one of the most problematic biases interfering with accurate understanding and decision-making.

3. Perceiving Order in Random Events

"Chaos was the law of nature; Order was the dream of man." —Henry B. Adams.

Humans are relentlessly eager to perceive order in chaos. Seeing a face on the moon, a religious image on a dirty towel, or a hot hand in basketball are all examples of our drive to impose meaning on what we observe in the world.

Have you ever heard of a lucky person winning a huge lottery payout twice? It seems so surprising that we struggle to accept it could be random. As statisticians will tell you, however, with large enough samples, just about anything can happen.
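
To see why, here is a rough back-of-the-envelope sketch of the idea. Every number in it (how many people play, how often, and the odds of a jackpot-level prize) is a hypothetical assumption chosen purely for illustration, not data from this article.

```python
# Back-of-the-envelope sketch of the "law of truly large numbers": given enough
# players and enough draws, a double jackpot winner somewhere is unremarkable.
# All parameters are hypothetical assumptions chosen purely for illustration.

import numpy as np

n_players = 10_000_000   # assumed number of regular players
tickets_per_draw = 2     # assumed tickets each player buys per draw
draws = 1_040            # assumed weekly draws over roughly 20 years
p_win = 1 / 1_000_000    # assumed odds of a jackpot-level prize per ticket

# Each player's lifetime win count is approximately Poisson distributed.
lam = tickets_per_draw * draws * p_win        # expected wins per player
p_two_or_more = 1 - np.exp(-lam) * (1 + lam)  # chance a given player wins twice

print(f"Expected double winners: {n_players * p_two_or_more:.1f}")

# Quick Monte Carlo check of the same approximation.
rng = np.random.default_rng(seed=0)
wins = rng.poisson(lam, size=n_players)
print(f"Double winners in one simulated population: {int((wins >= 2).sum())}")
```

With these made-up numbers, the expected count works out to roughly twenty double winners, which is the point: an event that is vanishingly unlikely for any one of us becomes almost inevitable somewhere in a large enough population.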

So, when should we trust our intuition?

Despite the problematic nature of intuitive thinking, there are times when we should trust our common sense. When split-second decisions about safety are required, research suggests intuition is our best bet: snap judgments about the trustworthiness of someone we watched on video, for example, tend to be more accurate than chance.

Final Thoughts

Question what you know. Are you making assumptions based on potentially flawed information? When you notice yourself reaching for aphorisms or mental shortcuts to explain human behavior, consider investigating the evidence rather than accepting your initial hunches. According to research, considering the opposite of what you may be thinking is an effective way to bypass some of our faulty cognitive mechanisms.

While intuition can be a fantastic means of keeping us safe, applied unthinkingly to every situation it can lead to mistaken beliefs and misinformation, which in turn can trigger problematic decision-making.

"Science must begin with myths, and with the criticism of myths." —Karl Popper

The words of Popper, a famed philosopher of science, transcend time. We must critically examine our beliefs to understand human behavior—including our own.

Separating fact from fiction is a skill we learn as kids. But it turns out judging facts isn't nearly as black-and-white as your third-grade teacher might have had you believe.

In reality, we rely on a biased set of cognitive processes to arrive at a given conclusion or belief. This natural tendency to cherry pick and twist the facts to fit with our existing beliefs is known as motivated reasoning—and we all do it.

"Motivated reasoning is a pervasive tendency of human cognition," says Peter Ditto, PhD, a social psychologist at the University of California, Irvine, who studies how motivation, emotion and intuition influence judgment. "People are capable of being thoughtful and rational, but our wishes, hopes, fears and motivations often tip the scales to make us more likely to accept something as true if it supports what we want to believe."

In today's era of polarized politics—and when facts themselves are under attack—understanding this inclination (and finding ways to sidestep it) has taken on new urgency, psychologists say.

Red facts and blue facts

Much of the early research on motivated reasoning showed that people weigh facts differently when those facts are personally threatening. More than two decades ago, Ditto and David F. Lopez, PhD, compared study participants who received either favorable or unfavorable medical test results. People who were told they'd tested positive for a (fictitious) enzyme linked to pancreatic disorders were more likely to rate the test as less accurate, cite more explanations to discount the results and request a second opinion (Journal of Personality and Social Psychology, 1992).

"It takes more information to make you believe something you don't want to believe than something you do," Ditto says.

We don't just delude ourselves when it comes to our health and well-being. Research shows we also interpret facts differently if they challenge our personal beliefs, group identity or moral values. "In modern media terms, that might mean a person is quick to share a political article on social media if it supports their beliefs, but is more likely to fact-check the story if it doesn't," Ditto says.

For instance, Ditto and his former student Brittany Liu, PhD, have shown the link between people's moral convictions and their assessment of facts. They found people who were morally opposed to condom education, for example, were less likely to believe that condoms were effective at preventing pregnancy and sexually transmitted diseases. Similarly, people who had moral qualms about capital punishment were less likely to believe it was an effective way to deter crime (Social Psychology and Personality Science, 2012). "People blur the line between moral and factual judgments," Ditto explains.

For people who identify strongly with one side of the political spectrum or the other, it can feel like their opponents are willfully ignoring the facts. But right or left, both sides believe their positions are grounded in evidence, Ditto says. "We now live in a world where there are red facts and blue facts, and I believe these biased motivated-reasoning processes fuel political conflict. If someone firmly believes some fact to be true that you just as firmly believe to be false, it is hard for either of you not to see that other person as stupid, disingenuous or both."

In an analysis presented at the 2015 annual meeting of the Association for Psychological Science, he and colleagues examined 41 experimental studies of partisan bias involving more than 12,000 participants. They found that self-identified conservatives and liberals both showed a robust partisan bias when evaluating empirical evidence, to an almost identical degree. "It's an equal-opportunity bias," he says.

That bias is unsurprising given the powerful social incentives for groupthink, says Daniel Kahan, JD, a professor of law and psychology at Yale Law School who studies risk perception, science communication and the application of decision science to law and policymaking. Consider climate change. Discounting the evidence of human-caused global warming has become a central feature of the conservative platform—and taking an opposing viewpoint could damage your reputation within that group.

"If you take an ordinary member of the public, his or her carbon footprint is too small to make an effect on climate change. If they make a mistake on the science in that part of their life, nothing bad happens to them," Kahan explains. "But they can be adversely affected if they're holding a deviant view on an identity-defining issue inside their social group."

So, consciously or not, people may twist the facts. They can even trick themselves into believing that the facts aren't relevant, as social psychologist Troy Campbell, PhD, an assistant professor of marketing at the University of Oregon, and colleagues have shown.

His team presented volunteers who either supported or opposed same-sex marriage with alleged "facts" suggesting children raised by same-sex parents did or did not experience negative outcomes. When the evidence was on their side, participants stated their opinions on the matter were based in fact. But when the evidence opposed their view, they argued the question wasn't about facts, but morals (Journal of Personality and Social Psychology, 2015). "People take flight from facts," Campbell says.

The more you know

People often dismiss those who hold opposing views as idiots (or worse). Yet highly educated people are just as likely to make biased judgments—and they might actually do it more often.

In one example of this "expertise paradox," Kahan and colleagues asked volunteers to analyze a small data set. First, they showed data that purportedly demonstrated the effectiveness of a cream for treating skin rash. Unsurprisingly, people who had a greater ability to use quantitative information did better at analyzing the data.

But there was a twist. When participants saw the very same numbers, but were told they came from a study of a gun-control ban, their political views affected how accurately they interpreted the results. And those who were more quantitatively skilled actually showed the most polarized responses. In other words, expertise magnified the tendency to engage in politically motivated reasoning (Behavioural Public Policy, in press). "As people become more proficient in critical reasoning, they become more vehement about the alignment of the facts with their group's position," Kahan says.

The pattern holds up outside the lab as well. In a national survey, Kahan and colleagues found that overall, people who were more scientifically literate were slightly less likely to see climate change as a serious threat. And the more they knew, the more polarized they were: Conservatives became more dismissive of climate change evidence, and liberals became more concerned about the evidence, as science literacy and quantitative skills increased (Nature Climate Change, 2012).

"It's almost as though the sophisticated approach to science gives people more tools to curate their own sense of reality," says Matthew Hornsey, PhD, a professor of psychology at the University of Queensland who studies the processes that influence people to accept or reject scientific messages.

Unfortunately, our modern media landscape seems to be amplifying the retreat from facts. "These are wonderful times for motivated reasoners. The internet provides an almost infinite number of sources of information from which to choose your preferred reality," says Hornsey. "There's an echo chamber out there for everyone."

Compounding the problem, fake-news websites that publish hoaxes, conspiracy theories and disinformation disguised as news have proliferated in recent years. But the recent focus on fake news might be doing more harm than good, some experts say. "Now that we have this idea that there is fake news, we can credibly attribute anything we dislike to fake news," says Campbell.

In the past, climate-change skeptics might have tried to pick apart the details of a study or demonstrate a researcher's conflict of interest to cast doubt on the evidence. Now, they can simply allege that the media can't be trusted to report the truth, and wipe away inconvenient facts with a single stroke. "Mistrust of the media is a powerful tool for motivated reasoning," says Ditto.

License to ignore reality is a dangerous path to travel, regardless of your political leanings, Kahan adds. "It's a good thing in our political culture that facts have been the currency of our discourse on disputed issues. If facts are somehow devalued as a currency, it'll be a lot harder to achieve our common goals."

The root of the problem

What can be done to restore our faith in facts?

Media literacy is one place to start. A report by researchers from Stanford University's Graduate School of Education found students in middle school, high school and college were terrible at evaluating the quality of online information (Stanford History Education Group, 2016). Though the authors described their findings as "bleak" and "dismaying," the silver lining is that kids can be taught to be better consumers of information—by, for instance, learning to pay closer attention to the source, consider possible biases or motives and think about what details a news source might have left out.

But given our cognitive biases, teaching can only get us so far. "Motivated reasoning is not something that's open to view through introspection or conscious effort," says Kahan. "I'd put more hope on a strategy of improving the science communication environment."

That's where Hornsey is focusing his efforts. In a new paper, he describes what he calls attitude roots—the fears, ideologies, worldviews, vested interests and identity needs—that motivate us to accept or reject scientific evidence. He argues that communicators must do a better job at identifying those roots and adjust their persuasion attempts accordingly (American Psychologist, in press). "This is what we call jiu jitsu persuasion: working with people's motivations rather than trying to fight against them," he says.

So, for example, want to convince a vaccine skeptic that immunizations are safe? First it helps to figure out if they believe in Big-Pharma conspiracy theories, if they're fearful of medical intervention or whether they want to prove to their social circle that they're a concerned parent.

"The key question is not ‘Why do they disagree with the science?' but rather, ‘Why do they want to disagree with the science?'" Hornsey says.

Answering that will probably require doing something people in our increasingly polarized political climate are loath to do: less talking, more listening.

People communicating the facts often do so with the implication that the target is a bad person at worst, or uneducated at best, Campbell says. But an adversarial approach isn't likely to change minds.

That's a lesson cosmetics companies learned long ago: They figured out they'll sell more lipstick if they promise to enhance a woman's natural beauty rather than tell her she's ugly, Campbell points out. People who communicate information would do well to heed that example. That goes for scientists and science communicators, but also for anyone who can share an article with hundreds of people with the click of a button—which is to say, almost everyone in today's digital landscape.

"One of the most important ways to inoculate people from false information is to befriend them," Campbell says. "There's a time for the middle finger, and a time to put it away."