Checking the Bias
In a world where everyone is encouraged to have (and express) an opinion, wisdom is often conspicuously absent. Why do we believe the things we do?
What do you think?
Since 2013, more than 160 gun-related incidents have occurred at schools and colleges in the United States. In 2014, at least 65 percent of the nation’s more than 14,000 murders were committed with a firearm of some sort. Do these sobering statistics make the case for tighter gun control in the country? Or should more citizens be armed to defend themselves, their families and their property? Is it a constitutional right for all American citizens to keep and bear arms? Or has society changed so much that the “well regulated Militia” referred to in the Second Amendment is no longer relevant?
Perhaps the world’s food supply is of greater concern than guns. Do you watch what you eat? Genetically modified (GM) foods were introduced to the dinner table in the mid-1990s. GM plants are engineered in the laboratory to resist pests or pesticides, to tolerate drought or frost, to extend shelf life, or to enhance nutritional value. According to estimates, close to 75 percent of processed foods available in the US market contain at least one GM ingredient. Currently 10 GM crops are produced in the United States, with others awaiting approval. Does the ability to genetically engineer food hold the promise of securing a safe food supply for the billions of impoverished, undernourished people around the world? Or are the health and environmental risks of the modified organisms too great to trust to chance?
With a world of information at our fingertips, learning the facts and coming to a sound judgment on these and other issues shouldn’t be a problem. Yet just about every topic imaginable invites diametrically opposed perspectives from people who passionately defend their respective viewpoints, often in the face of what seems to others to be overwhelming evidence to the contrary.
“When men wish to construct or support a theory, how they torture facts into their service!”
How is it possible to be so fervent about one side of an issue when others are just as worked up about their side? How can they think they’re correct when it’s so obvious we are? Are their conclusions as valid as ours? Can we even know what’s right?
Thinking About How We Think
Every day we make choices and judgments about our world. In our minds, we are well informed; we’re convinced that our conclusions are right because they’re based on a thorough analysis of the facts at hand. The truth, though, is that we humans are subject to biases, errors in thinking, peer pressure, even variations in our own emotional state. We are most comfortable with our own opinions, our own perception of what is true. But where do those opinions come from? And why do we think we’re right?
The human brain is remarkable. Weighing in at less than 3.5 pounds for adults, the brain is composed of the same fats, carbohydrates, salts, proteins and water as the rest of the body, but it does what no other organ can do. Without our help, or even our awareness, our brain monitors and controls our body functions, stores and recalls memories of faces, places, events and abilities, and makes it possible for us to communicate with the world around us.
An infant is born with no judgments, no opinions, save for preferring to be full, clean and held rather than hungry, dirty and left alone. As it grows, the child learns through experience that some things are good, tasty, comforting and enjoyable, and that other things are not; that some people are safe and others are to be avoided; and that there are rights and wrongs. We reach adulthood and unconsciously trust that our beliefs, values and judgments are the result of years of accumulated experience, which we have analyzed and found to be sound. When presented with new information, or when something challenges a long-standing belief, we process the information objectively and pronounce it right or wrong. Or do we?
For thousands of years people have studied how we come to our opinions and make judgments. Thinking about thinking has been a focus of philosophers from Socrates, Plato and Aristotle to Hume, Kant and Russell; in more modern times, the process of thought has become the subject of the social sciences.
In the 1960s, cognitive psychologist Peter Wason began a study of how people test hypotheses and make judgments. He disagreed with the prevailing theory of the day, that humans reason logically. In a series of tests, he found that subjects overwhelmingly tended to reject or ignore evidence that did not agree with their hypotheses, choosing instead to concentrate their efforts on finding answers that verified their premises. In one well-known experiment, for instance, subjects asked to discover the rule behind the number sequence 2-4-6 proposed almost exclusively sequences that confirmed their own guess about the rule, rarely offering sequences that might disprove it. Wason named this thinking pattern “confirmation bias.”
Two forms of confirmation bias exist: motivated and unmotivated. Raymond S. Nickerson of Tufts University notes that “people may treat evidence in a biased way when they are motivated by the desire to defend beliefs that they wish to maintain. . . . But people may also proceed in a biased fashion even in the testing of hypotheses or claims in which they have no material stake or obvious personal interest.”
As a rule, then, we search out information that agrees with what we already feel and think, rely most heavily on data that supports our convictions, and sometimes even repackage information so that it will do so. Conversely, we often avoid, downplay or simply ignore evidence that challenges our beliefs. But we don’t always know we’re doing it.
“If one were to attempt to identify a single problematic aspect of human reasoning that deserves attention above all others, the confirmation bias would have to be among the candidates for consideration.”
The human brain deals with a tremendous amount of information every day. One way it handles that constant input is by employing heuristics: mental shortcuts that apply rules of thumb to decision making. These shortcuts are understood to save time and mental energy in everyday situations and are also thought to be important when facing risk, allowing us to make quick decisions that help us avoid danger. The brain automatically confirms and acts on what it already knows. Because of these shortcuts, we don’t need to analyze every facet of each bit of information that comes our way in order to make a decision: we cross the street to avoid a broken sprinkler spraying water across the sidewalk; we trust the GPS in our car to route us around accidents and road-repair work. Life would be tedious if every decision and happening of every day had to be considered from all sides.
However, when dealing with beliefs and convictions, is it wise to simply take a shortcut, or to consciously reject what we disagree with without examination?
Whether we are actively defending a position or simply taking a side, when we don’t evaluate our choices, we risk preserving and supporting ideas that just are not true, and making poor decisions as a result. Our firmly held beliefs and convictions affect our lives, finances and relationships. We should therefore want to follow an age-old principle: “Test [or prove] all things” (1 Thessalonians 5:21) to make sure that what we believe and act on is right and rests on a solid foundation.
Thinking Critically
Thoughtfully considering our conclusions, and how we come to them, is not a task easily undertaken. Becoming aware that we have biases is only a beginning; it takes time and concentrated effort to identify errors and make necessary changes. Admitting our weaknesses and faults is difficult, but it is possible to change.
Identifying our particular biases is a vital step in that direction. Once we see the need to change our patterns of thinking, we have to learn to recognize and regulate what we think about.
In this regard, it’s important to consider what we are putting into our minds. The noisy, shallow, media-driven coverage of every side of any issue makes such input difficult to avoid without determined effort. “Cutting the cord” is neither realistic nor necessary; however, beginning the process of real change in our thinking does require that we stop taking in more of the same information. A political conservative who listens only to conservative radio or television is not likely to change his mind about the opposition. A Young Earth creationist who turns a blind eye to contrary scientific evidence will remain convinced that the world is only a few thousand years old. Likewise, an atheist who associates only with other atheists is unlikely to come to consider the Bible a source of worthwhile wisdom and comfort.
The Bible not only advises us to prove all things; it is filled with timeless wisdom on how and what we think. The book of Proverbs especially urges the seeking of knowledge, wisdom and discernment, and likewise warns against being “simple,” that is, unthinking and unwise.
“Those who trust their own insight are foolish, but anyone who walks in wisdom is safe.”
The book also declares the value of seeking wise counsel (Proverbs 12:15; 28:26, English Standard Version). Because we can’t always be sure of our own thinking and the effect it has on our behavior and attitudes, it can be useful to seek advice from someone who knows us well and likely can see our flaws better than we can.
It is helpful to have a standard against which to judge our beliefs. There must be a foundation for our thinking. This foundation isn’t built on our own experiences or on human reasoning, which is known to be subject to error, but on something far more reliable (Proverbs 9:10). When we choose to evaluate and alter our thinking using this standard as a guide, we can be confident of being on the right side.