When You’re (Sometimes) Wrong

The human brain is certainly an impressive organ. It serves as the epicenter of our critical thinking, senses, muscle movements, and many other bodily functions. Unfortunately, the brain also has its flaws when it comes to how we process information. We face a series of ingrained cognitive biases and misconceptions that can lead us astray when making decisions or going about our daily lives, even though these biases and misconceptions exist to help simplify the world around us. While there is quite the list of identified or hypothesized cognitive biases (some with strange names, such as the “IKEA effect”), I picked only a few to cover in this article. Understanding some of these biases will not prevent you from making poor decisions in your life… that’s still bound to happen… BUT maybe you’ll be able to catch yourself once in a while.

Dunning-Kruger Effect

In a nutshell, the Dunning-Kruger effect suggests that our ability to assess our own skills or knowledge can be deficient, and that those who are less knowledgeable about a subject tend to overestimate their knowledge. That seems to make sense: if you’re not very familiar with a subject, you don’t know what you don’t know.

The original paper proposing the effect was published in the Journal of Personality and Social Psychology in 1999. Four studies were conducted in which participants took tests on humor, grammar, and logic while also estimating their final scores. Those with the lowest scores (around the 12th percentile, well below average) estimated that they had scored much higher than they actually did (around the 62nd percentile, above average). After participants’ relevant skills were improved through training, they became better at recognizing their earlier shortcomings and at predicting their actual scores.

Graph generated in the original 1999 study.

Cool... so that arrogant kid in class who thinks they know more than the teacher is actually not that bright? Well, not exactly. Some follow-up studies have shown that randomly generated test scores and self-estimation ratings, simulated on a computer, produce patterns similar to the original Dunning-Kruger results. While the data from the original paper seem to support the idea that less knowledgeable people tend to overestimate their knowledge, the pattern could simply have been the result of randomness. So in the end, we don’t know whether the “Dunning-Kruger effect” is actually a real cognitive bias or phenomenon.
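The randomness critique can be sketched with a quick simulation (a hypothetical illustration, not the actual code from those follow-up studies): if we assume test scores and self-estimates are completely independent random numbers, so that nobody has any self-insight at all, grouping people by their actual score still produces the classic Dunning-Kruger pattern, because every group’s average self-estimate hovers around the 50th percentile.

```python
import random

random.seed(0)

N = 10_000
# Assumption: a person's actual test score and their self-estimate are
# completely independent random draws -- i.e., no self-insight whatsoever.
actual = [random.random() for _ in range(N)]
estimate = [random.random() for _ in range(N)]

# Group people into quartiles by actual score, then average the
# self-estimated percentile within each quartile.
quartiles = [[] for _ in range(4)]
for a, e in zip(actual, estimate):
    quartiles[min(int(a * 4), 3)].append(e * 100)

for i, group in enumerate(quartiles):
    mean_estimate = sum(group) / len(group)
    actual_midpoint = (i + 0.5) * 25  # midpoint percentile of this quartile
    print(f"Quartile {i + 1}: actual ~{actual_midpoint:.0f}th pct, "
          f"mean self-estimate ~{mean_estimate:.0f}th pct")
```

Every quartile’s average self-estimate lands near the 50th percentile, so the bottom quartile appears to “overestimate” and the top quartile to “underestimate”, even though the simulated people have no self-knowledge at all.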

It’s often difficult to measure or definitively interpret data from human psychology studies, which is important to keep in mind as we explore the other cognitive biases in this article. I think one thing we can all learn from the proposed Dunning-Kruger effect is to stay humble.

Truth Default Theory

Though it is only one of several theories dealing with honesty, the Truth Default Theory states that “when humans communicate with other humans, we tend to operate on a default presumption that what the other person says is basically honest” (Levine, 2014). The theory goes on to note that our assumption of honesty is adaptive, and that we stop presuming honesty under certain circumstances (so I guess we’re not THAT naive). Essentially, humans passively assume honesty and do not consciously suspect dishonesty or deception unless something triggers our suspicion. This is convenient, because the theory also holds that the majority of human communication is indeed honest, which gives me hope for humanity. One study published in 2010 even concluded that a small portion of society (“a few prolific liars”) is responsible for most of the lies told.

The truth default theory encompasses a number of tenets, which includes the idea that humans are generally biased towards thinking things are true. Some reasons we might suspect deception, based on truth default theory, include:

  • a projected motive for deception (does this person have a reason to lie to me?)

  • behavioral displays associated with dishonest demeanor (e.g., they don’t make eye contact)

  • a lack of coherence in message content (they contradict themselves)

  • a lack of correspondence between communication content and some knowledge of reality (what you know is at odds with what they’re telling you)

  • information from a third party warning of potential deception

This theory is actually supported by a number of empirical studies, so it’s not just a random guess or ethereal idea. Being too naive is likely a bad thing, but the opposite, being overly cynical and suspicious, is likely detrimental as well. Honestly, society would probably break down if we always presumed others were lying to us as a first assumption. 

Sunk Cost Fallacy

The “sunk cost fallacy” is a somewhat well-known cognitive bias: we tend to continue a behavior or endeavor because of past investments of time, money, or resources. The sticking point is that we tend to continue even when stopping would, rationally, be more beneficial to us. There are numerous examples of this, and I’m sure each of us has fallen into the trap of the sunk cost fallacy at some point.

A common example of the sunk cost fallacy is as follows: 

You’re at an all-you-can-eat buffet, and it costs you $50 for entry. You eat massive quantities of food and feel uncomfortable. You still return to the buffet to get one last plate (and shortly after fall into a food coma) simply because you wanted to “get your money’s worth” by eating as much as possible. 

The $50 you spent for the buffet is a “sunk cost”: it is irretrievable and shouldn’t affect your decision to go up for a fifth plate of food. You’re not technically “getting your money’s worth” with that extra food... because you paid the $50 either way. Maybe you could argue that the extra calories will be stored as fat reserves to get you through the winter, but there are plenty of examples of the sunk cost fallacy that have nothing to do with food.

Sometimes, we can attempt to use the sunk cost fallacy to our own advantage. You could pay upfront for a $500 annual gym membership, for example, which might motivate you to actually go to the gym to make the membership worth the cost. In reality, though, the membership fee is a sunk cost, and it technically doesn’t matter whether you go to the gym or not… you already spent the money.

Stakes can get much higher with the sunk cost fallacy. As another example, consider the Concorde, the supersonic passenger jet developed as a joint project between France and Britain. The jet required significant monetary investment, yet there were many signs during development suggesting it was a bad investment. Despite these signs, the two governments continued developing the jet because they had already invested so much money up to that point (although some accounts suggest political motivations also played a role).

Outside of money, time is another big driver of the sunk cost fallacy. Suppose you’re fixing up a house that you plan to resell. When you get to the end, there’s likely a hesitation, however fleeting, to keep the house simply because of all the effort you put into its repair. Unless you have a time machine, time is inherently irretrievable and thus a sunk cost. Each moment is a new moment, and you shouldn’t continue doing something simply because “you already invested so much time into it”. Cue the person who stays in a dead-end relationship.

Spotlight Effect

Understanding the spotlight effect might ease your social anxiety: it states that we tend to think people notice things about us more than they actually do. That stain on your shirt that you’re so concerned about? Yeah, people don’t notice it as much as you think. Unfortunately, we also have to deal with the flipside… people don’t notice the positive things, like your dope new haircut, as much as you think they do, either. The effect extends beyond physical appearance; for example, we may vividly remember something embarrassing we did years ago even though other people have probably forgotten it.

This cognitive bias is attributed to egocentrism, the idea that we rely heavily on our own perspective to evaluate the world around us, because our own perspective is essentially all we can directly access. (Egocentrism itself does not imply that we’re selfish.) Egocentrism also contributes to a similar cognitive bias, the illusion of transparency, in which we think others can discern what we’re thinking or feeling more easily than they actually can. Part of the spotlight effect could also be attributed to our familiarity with ourselves. You know your own behavior patterns well, since you live through them every day, so it’s easy for you to notice when you switch something up. Close friends and family would also be more prone to noticing changes, since they are more intimately familiar with you.

One of the primary studies supporting the spotlight effect was published in the Journal of Personality and Social Psychology in 2000. It involved college students wearing a brightly colored t-shirt with Barry Manilow’s face on it (considered extremely embarrassing at the time). The students wearing the shirt estimated that about 50% of the other students would notice it… but only about 25% actually remembered the shirt. Similar studies supporting the phenomenon used “cooler” shirts featuring the faces of Bob Marley, Jerry Seinfeld, Martin Luther King Jr., or Vanilla Ice (the rapper).

Imagine wearing a shirt with this guy’s face on it (Barry Manilow)… and only 25% of people noticed!

So, the moral of the story is not to worry too much about what others think of your looks or actions, because it’s likely that other people don’t even notice or care as much as you think.

Functional Fixedness

Functional fixedness is a common cognitive bias in which we identify a particular object only by its intended use (i.e., the function of the object is “fixed” in our minds). A candle offers a clear example. Beyond its normal use, candle wax can be melted down and reformed into shapes, used as a sort of adhesive to stick things together, or used to seal envelopes, and the wick can be removed and used as simple string for a variety of purposes. Someone with functional fixedness, though, would see a candle and think only of its usual purpose: being lit to brighten a room. Functional fixedness makes it hard to think “outside the box”, and it is often seen as a barrier to innovation and creativity.

One interesting case where functional fixedness may have played a major role is the sinking of the Titanic. Some suggest that alternative ideas for life rafts or flotation devices could have saved more passengers. For example, wooden tables or tires could have been assembled into makeshift rafts. Passengers in life rafts might even have climbed atop a nearby iceberg (essentially treating the iceberg as its own flotation device) to free up space for other passengers. It’s difficult to know whether these alternatives would have worked, and it’s certainly easier to ponder such things when you’re not on a massive sinking ship.

Functional fixedness appears tied to age and experience. One study published in Psychological Science examined functional fixedness among the Shuar people of the Ecuadorian Amazon, a “low-technology” culture, and found that they, too, experienced this cognitive bias at older ages. With age, our perception of objects and their intended uses becomes stronger, and studies seem to put the age at around 5 or 6 for when we establish more entrenched functional fixedness for common objects (such as a box). With experience in problem-solving, we tend to stick with the same old solutions and processes rather than coming up with new, creative ways to solve the same problems. It’s somewhat of a conundrum, since those who are inexperienced are potentially more apt to come up with creative solutions.

The idea that a box is functioning as a container seems to significantly affect how older kids use it in finding a solution to a predetermined problem.

Here are two of the strategies that have been proposed to help fight functional fixedness:

  • Search other fields of study for analogous solutions or situations. An example comes from PepsiCo, which was trying to reduce the amount of sodium in its chips without sacrificing flavor. The company found its answer in orthopedic research, which had used salt nanoparticles to study osteoporosis. As it turns out, changing the form or size of salt can enhance or alter flavor without increasing the total amount going into your body. 

  • Break the problem down into simple, abstract ideas. This could mean simply reframing the problem in different words, such as replacing “fasten” in a problem statement with similar words like “adhere”. Removing details or particular language helps prevent you from falling into functional fixedness. 

While I’m not suggesting you spend the mental energy required to consciously keep tabs on your cognitive biases at all times, it certainly helps to understand and recognize how they might affect your decisions. It’s also good to keep in mind that, as with the Dunning-Kruger effect, it’s difficult to prove with high certainty that any of these cognitive biases truly exist. 


To Think About…

  1. In your own life, have you caught yourself falling into one of the cognitive biases mentioned in this article recently?

  2. What are some positive aspects of these cognitive biases, and what purpose do you think they could serve?

  3. How might you avoid some of these cognitive biases if you find them to be detrimental to your own daily life?