Magicians and con-men have known for centuries how to deceive, seduce, and exploit audiences and individuals to their benefit. Francis Bacon, 16th century philosopher, scientist, and author, said, “Man prefers to believe what he prefers to be true.” We are willing victims, even active accomplices, in the regular misinterpretation of the world around us, often to our dismay and sometimes to our harm.
In fact, neuroscientists are just beginning to unravel the secrets of the brain – how we see the world, and how we remember details of events and environments. This can help us understand the hidden feelings that color our decisions and drive our actions, which in turn can help us make better decisions.
Decision Systems in Our Brains
The human brain is a magnificent organ, developed over hundreds of millions of years of evolution. It accounts for about 2% of your body weight but consumes more than 20% of your oxygen and blood flow. Research suggests that the brain functions through its hundreds of trillions of synapses – connections between brain cells (neurons) that form and are pruned throughout life.
As explained in The New York Times, Dr. Daniel Kahneman, a Nobel Prize winner and author of “Thinking, Fast and Slow,” theorizes that our brains operate on two different levels, which he calls System 1 and System 2. System 1 operates primarily on a subconscious level: It is fast, automatic, emotional, frequently in play, and relies mostly on stereotypes. System 2 is deliberate, logical, slow, infrequent, and lazy – coming into play only with effort. System 1 jumps to conclusions, while System 2 forms judgments. Kahneman draws a separate distinction between the “experiencing self” and the “remembering self”; it is the remembering self that prizes novelty, significance, and endings (the last moments of an experience).
Kahneman theorizes that we rely on System 1 – what writer Malcolm Gladwell in his book “Blink” calls “intuition” – for most decisions, exercising System 2 only with conscious effort and when we are aware that System 1 might be faulty. These basic cognitive processes are necessary to accurately perceive and understand the world around us. However, the tendency to over-rely on intuition – stereotypes, impressions, and distorted, even false memories – frequently leads to bad conclusions, inappropriate acts, and later regrets.
Limitations of the Senses & Memory
We are flooded with thousands of sensory impressions every minute of the day – sights, sounds, odors, tastes, touches – far too many to interpret and process in full detail. For example, the human eye can make out fine detail only in a keyhole-sized circle at the very center of your gaze (the fovea), which covers only a tiny fraction of your retina; the vast majority of your visual field is blurry and indistinct. As a consequence, you constantly move your eyes and shift your visual focus to capture bits and pieces of information.
Your brain assembles the fragments into a whole visual scene based upon your expectation of what should be there, an expectation shaped by experience. Your brain is really a very efficient prediction machine; even though your eye is roughly equivalent to a one-megapixel camera (less resolution than you probably have on your cell phone), you enjoy a rich, detailed perception of the world. What you actually “see” is an illusion created by the fill-in processes of your brain.
According to the American Psychological Association, the tendency to overlook or fail to notice visual elements is called “inattentional blindness.” It is not a limitation of the eye’s ability to capture data, but a limitation of the mind. Generally, the ability to ignore distractions around us is a positive attribute, enabling us to focus. However, it is also the reason that drivers fail to “see” a motorcyclist on the highway, or that witnesses to crimes present different versions of the event.
How Memory Really Works
Memories work much the way the brain builds a visual scene. Contrary to popular opinion, the brain doesn’t function like a tape recorder or a movie camera, collecting every tiny detail of an event to be replayed in the future. It is physically impossible to store all of the sensory information that bombards us every moment of the day. So the brain stores the small bits of information it considers most important, reconstructing the rest of the details around those bits when you recall the memory. If new information relates to something you already know, it transfers into long-term memory even more easily, using the same and related neural pathways, even as short-term memories fade.
Researchers have long known that it is possible to create a false memory through suggestion (a skill that unscrupulous police detectives use on witnesses to shape testimony or obtain confessions, leading many to question the value of eyewitness testimony). For example, the high school prom you attended that wasn’t much fun can, over time, become the highlight of your teenage years. Bad elements are forgotten, and new positive endings are added.
One cause of false memories is change blindness, the failure to compare the present with the past or to perceive how something has changed. Most of us operate under the presumption that we notice changes of consequence, and if we didn’t recognize a change, one didn’t occur – ergo, if we don’t see it, it is not there.
Unsurprisingly, people are blind to their own change blindness. While false memories may be based on factual events, they are invariably distorted, even merging two or more disparate memories into a single event, transposing who did what. We can even adopt events we read about or see in the movies into our own lives as if they had actually occurred. Over time, the false memory becomes embedded in the mind, becoming stronger and more vivid, sometimes changing to incorporate new information or experiences.
Commonly Held Illusions
In their book “The Invisible Gorilla,” psychologists and researchers Christopher Chabris and Daniel Simons have identified a number of mental illusions as a result of their research into how we think and make decisions. Those illusions lead to pseudo-truths and misperceptions.
1. Illusion of Memory
What we think we remember and what we actually remember are not the same. Memory doesn’t store everything we perceive, but takes bits and pieces of what we see and hear and associates them with what we already know. These cues help us retrieve the information and reassemble it, making recall feel fluent.
Some memories can be so strong that even documentary evidence that the event never happened doesn’t change what we remember. In 1997, a basketball player at Indiana University accused Coach Bob Knight of choking him during a practice, claiming that Knight had to be restrained by two other coaches – an incident that was widely reported in the sports pages, as Knight was considered one of the best college basketball coaches in the game. When questioned, the participants and the witnesses – other players at the practice – all had different memories of the event, some directly contradicting others.
A videotape of the practice later surfaced. Surprisingly, none of the memories were 100% correct, and a few completely distorted the actual event. Yet there is no evidence that anyone lied or deliberately embroidered their story; they all suffered from false memories. As Dr. Daniel Kahneman says, we tell stories to ourselves.
2. Illusion of Attention
We believe that we process all of the detailed information that surrounds us all of the time, when the reality is that we vividly perceive some aspects of our world and are completely unaware of others that fall outside our center of attention. This phenomenon, another example of inattentional blindness, occurs when your attention is focused on one area and you fail to notice unexpected objects.
Chabris and Simons ran a now-famous experiment in 1999 in which viewers, intently counting passes between basketball players wearing white and black jerseys, failed to notice a female student in a full gorilla suit who walked into the middle of the action, stopped, faced the camera, thumped her chest, and walked off. She was on camera for nine seconds of the less-than-one-minute video. Roughly half of the viewers failed to notice the gorilla, a result that has held as the experiment has been repeated many times, under different conditions, with diverse audiences, and in multiple countries.
3. Illusion of Confidence
We consistently overestimate our own qualities, especially our abilities relative to those of other people. At the same time, we interpret the confidence that others express as a valid indication of their knowledge, expertise, and the veracity of their memories. This tendency to overestimate our own abilities extends to our sense of humor and other talents. For this reason, according to Chabris and Simons, genuinely bad singers appear on the television show “American Idol”: they have no clue as to their lack of talent.
The truth is that experience doesn’t guarantee expertise. Part of the illusion is that groups, where each member contributes his or her unique knowledge, skills, and deliberation, will make better decisions than individuals. Unfortunately, the decision is more likely to reflect group dynamics, personality conflicts, and other social factors that have little to do with who knows what and why they know it. Not surprisingly, group leaders are no more competent than anyone else; they become leaders by force of personality, rather than by ability.
We tend to trust people who appear confident, sometimes inappropriately. This is why con-men and scam artists are so effective.
4. Illusion of Knowledge
We easily deceive ourselves into thinking that we understand and can explain things we really know very little about. This illusion differs from the illusion of confidence – an expression of one’s certainty – in that it results from the implicit belief that you understand things better than you actually do. For example, the 2008 debacle in the mortgage securities market and the 2001 failure of Enron were in part due to a lack of understanding of the complicated financial derivatives in common use in the industry. Warren Buffett, no financial slouch, called such derivatives “financial weapons of mass destruction.” Despite the confidence Wall Streeters showed in using them, practice demonstrated an illusion of knowledge where knowledge was not present.
We often mislead ourselves by focusing on snippets of information that we do possess while ignoring what we don’t know. We equate familiarity with knowledge, sometimes with disastrous consequences. The phenomenon is present in all of us, particularly those who rank in the lower quartile of knowledge about a subject; they most often overestimate their abilities. There is some evidence that the gap between actual knowledge and the over-estimation begins to close as we gather more knowledge, but it never disappears.
5. Illusion of Cause
Our ability to recognize patterns has long been critical to our survival as a species. The ability to see intent in an expression, a gait, or a gesture enables us to distinguish between friends and enemies, and we often draw conclusions in seconds that would take hours if we rationally considered alternatives and consequences.
At the same time, we tend to see patterns where none exist, to correlate cause and effect inappropriately, and to assume that the past is a totally accurate predictor of the future. Scientists call the tendency to perceive meaningful patterns in randomness “pareidolia,” which leads to seeing the Virgin Mary in a grilled cheese sandwich, the face of Jesus in a potato chip, and the word “Allah” written in Arabic in the veins of a sliced tomato.
The consequences of this illusion can run from the comical, to the bizarre, to the dangerous. It is a scientific principle that correlation does not imply causation. The fact that both the consumption of ice cream and the number of drownings increase during the summer is not evidence that eating ice cream will result in drowning; warm weather drives both.
6. Illusion of Narrative
We can encourage others to reach certain conclusions by arranging factual statements in a particular order and/or omitting or inserting relevant information that might lead them to a different opinion from our intent. Our brains developed not as instruments to make the optimum decisions, but to find food to eat and protect us from being eaten. As a consequence, many people – unless they have training in probability, statistics, regression, and Bayesian analysis – place undue importance on anecdotal information as opposed to hard numbers or proven facts.
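The gap between anecdote and hard numbers is easiest to see in a base-rate problem, the kind of question Bayesian analysis is built to answer. The sketch below uses purely illustrative numbers for a screening test; none are drawn from any real test:

```python
# Base-rate neglect: a "99% accurate" test sounds conclusive, but Bayes'
# theorem shows the rarity of the condition dominates the answer.
# All three numbers below are illustrative assumptions.
prevalence = 0.001       # 1 in 1,000 people actually have the condition
sensitivity = 0.99       # P(positive test | condition)
false_positive = 0.05    # P(positive test | no condition)

# Total probability of testing positive, then Bayes' theorem.
p_positive = sensitivity * prevalence + false_positive * (1 - prevalence)
p_condition_given_positive = sensitivity * prevalence / p_positive

print(f"P(condition | positive test) = {p_condition_given_positive:.1%}")
# Roughly 2%, despite the impressive-sounding test accuracy.
```

Intuition (System 1) anchors on the vivid “99% accurate” anecdote; the arithmetic shows why the dull base rate matters far more.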
Consider the following examples of exaggerations:
- Likelihood of Becoming a Victim of Violent Crime. People overestimate the likelihood of being the victim of violent crime because they see story after story in the media of such events. As a consequence, people rush to buy guns for self-protection, install expensive security alarms, and enroll in self-defense classes. Yet according to the FBI, violent crime has been cut in half in the United States since 1992. In fact, the odds of being a victim are less than one-half of 1%. You are 73 times more likely to die in the U.S. from heart disease or malignant tumors than from homicide.
- Likelihood of Illegal Immigrants Taking Over the Country. Immigration is a controversial subject in the United States. Headlines regularly appear about deportations and the Hispanic “takeover” of America. Yet according to the Department of Homeland Security, the total number of illegal immigrants in the U.S. is around 11.5 million, representing about 3.7% of the total population, and only about 14% of that total entered the U.S. in 2005 or later. While a problem, the issue seems to receive undue importance when compared to other issues facing the U.S.
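The 73-to-1 figure in the first bullet above is straightforward arithmetic on annual death counts. The sketch below uses rounded, approximate U.S. totals from around 2011, assumed here purely for illustration:

```python
# Approximate annual U.S. deaths (rounded, circa-2011 figures, illustrative).
deaths_heart_disease = 597_000
deaths_malignant_tumors = 576_000
deaths_homicide = 16_000

# Relative likelihood of dying from heart disease or cancer vs. homicide.
ratio = (deaths_heart_disease + deaths_malignant_tumors) / deaths_homicide
print(f"about {ratio:.0f} times more likely")
```

The point is not the exact counts but the order of magnitude: media coverage tracks vividness, while relative risk tracks the denominators that rarely make headlines.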
The illusion of narrative can be particularly harmful to your self-esteem and self-confidence if you give too much weight to personal criticism that incorporates all-encompassing words, including “always” (such as, “you always…”) and “never” (such as, “you never…”).
7. Illusion of Potential
The belief that we can acquire skills or abilities with minimal effort is the basis for the popularity of fantasy stories and comic books. Children often dream of waking up one day with mystical superpowers or discovering secret gifts and talents they never knew they possessed. Many adults retain such illusions, rationalized to better fit adult situations: failing to achieve a goal reflects not a lack of effort, but a missing key to one’s “real potential,” or a lack of opportunity.
The myth (according to Scientific American) that we use only 10% of our brain capacity has been popular for years, and it expresses the idea that we have “hidden potential” just waiting to be tapped. Unfortunately, the downside of this illusion is that some people fail to take advantage of opportunities to learn and improve themselves, and instead hope that someone will recognize their “true” ability. People passed over for raises or job promotions rarely look at themselves to identify possible weaknesses or shortcomings, and instead assume that the person promoted was lucky, had an upper-management sponsor, or enjoyed some other external advantage. Rather than expend the effort to improve their capabilities, they console themselves with the belief that they have potential someone will someday appreciate.
Dr. Anders Ericsson, a professor of psychology at Florida State University, has published numerous books and papers on the acquisition of expertise through practice; his work was later popularized in Malcolm Gladwell’s book “Outliers.” While Ericsson’s findings have been misstated and misinterpreted regarding the number of hours of practice required to master a subject, many researchers agree that deliberate practice – not mere experience – is essential to developing any kind of skill.
There is no innate intelligence or hidden talent that can provide expertise alone. In fact, to become an “expert,” you need practice, constant feedback so that you can correct your errors, and positive reinforcement so that you don’t give up.
By understanding how our mind works and the possibility that “facts” or information we believe to be facts are not always valid, we can make better decisions with better outcomes. Occasionally, all of us are victims of our misperceptions, commonly held pseudo-facts, and reliance upon our instincts rather than our judgments. Before committing to a position that might be harmful, costly, or embarrassing, reconsider your decision and your “facts” to ensure that you are not tricking yourself.
What do you think? Have you experienced any of the illusions in your own life?