The four problems

Cognitive biases are just tools, useful in the right contexts and harmful in others. They're even pretty good at what they're meant to do. There are four problems that biases help us address: what we should remember for later, too much information, not enough meaning, and the need to act fast. Keeping these four problems, and the four consequences of our brain's strategies for solving them, in mind will help ensure that we notice our own biases more often.

Problem 1
What we should remember
In order to manage the enormous amount of data being absorbed through our five senses as efficiently as possible, our brains need to remember the most important and useful bits of new information and inform the other systems so they can adapt and improve over time. However, our memory reinforces errors. Some of the stuff we remember for later just makes all of our systems more biased, and more damaging to our thought processes.
Problem 2
Too much information
In order to avoid drowning in information overload, our brains need to skim and filter insane amounts of information and quickly, almost effortlessly, decide which few things in that constant stream are actually important and call those out. However, we don’t see everything. Some of the information we filter out is actually useful and important.
Problem 3
Not enough meaning
In order to construct meaning out of the bits and pieces of information that come to our attention, we need to fill in the gaps, and map it all to our existing mental models. However, our search for meaning can conjure illusions. We sometimes imagine details that were filled in by our assumptions, and construct meaning and stories that aren’t really there.
Problem 4
Need to act fast
In order to act fast, our brains need to make split-second decisions that could impact our chances for survival, security, or success, and feel confident that we can make things happen. However, quick decisions can be seriously flawed. Some of the quick reactions and decisions we jump to are unfair, self-serving, risk averse and counter-productive.
Problem 1: What we should remember

We simply can’t remember everything. We can only afford to keep around the bits that are most likely to prove useful in the future. We need to make trade-offs around what we try to remember and what we forget. For example, we prefer generalizations over specifics because they take up less space. When there are lots of irreducible details, we pick out a few standout items to save and discard the rest.

We discard specifics to form generalities. We do this out of necessity, but the impact of implicit associations, stereotypes, and prejudice results in some of the most glaringly bad consequences from our full set of cognitive biases. 

We reduce events and lists to their key elements. It's difficult to reduce events and lists to generalities, so instead we pick out a few items to represent the whole, putting extra emphasis on the story rather than its source, or on where an item sits in the list (the primacy and recency effects).

We store memories differently based on how they were experienced. Our brain will only encode information that it deems important at the time, but this decision can be affected by other circumstances (what else is happening, how the information is presented, whether we can easily find it again if we need to, etc.) that have little to do with the information's value.

We put greater emphasis on negative events than positive events. Our capacity to weigh negative input so heavily most likely evolved for a good reason: to keep us out of harm's way. From the dawn of human history, our very survival depended on our skill at dodging danger. The brain developed systems that make it nearly impossible for us to overlook danger and thus, hopefully, to respond to it.

We notice when something has changed, and we generally weigh the significance of the new value by the direction of the change (positive or negative). We also tend to over-value the effect of small quantitative differences when comparing options. Again, this is a survival trait, but it comes to the fore in decision making: while we are comparing options we notice small differences, yet once we have made the decision and no longer have the comparison in front of us, we lose that sensitivity.

We edit and reinforce some memories after the fact. During that process memories can become stronger, but various details can also get swapped around, and we sometimes inject a detail into the memory that wasn't there before.

Problem 2: Too much information

There is just too much information in the world; we have no choice but to filter almost all of it out. Our brain uses a few simple tricks to pick out the bits of information that are most likely to be useful in some way.

We notice things that are already primed in memory or repeated often. This is the simple rule that our brains are more likely to notice things related to whatever has recently been loaded into memory.

We are drawn to details that confirm our own existing beliefs. This is a big one. As is the corollary: we tend to ignore details that contradict our own beliefs.

We notice when something has changed. And we'll generally tend to weigh the significance of the new value by the direction of the change (positive or negative) more than re-evaluating the new value as if it had been presented alone. The same applies when we compare two similar things.

Bizarre/funny/visually-striking/anthropomorphic things stick out more than non-bizarre/unfunny things. Our brains tend to boost the importance of things that are unusual or surprising. Alternatively, we tend to skip over information that we think is ordinary or expected. 

We notice flaws in others more easily than flaws in ourselves. Yes, before you see this entire article as a list of quirks that compromise how other people think, realize that you are also subject to these biases.

Problem 3: Not enough meaning

The world is very confusing, and we end up only seeing a tiny sliver of it, but we need to make some sense of it in order to survive. Once the reduced stream of information comes in, we connect the dots, fill in the gaps with stuff we already think we know, and update our mental models of the world.

We find stories and patterns even in sparse data. Since we only get a tiny sliver of the world’s information, and also filter out almost everything else, we never have the luxury of having the full story. This is how our brain reconstructs the world to feel complete inside our heads.  

We fill in characteristics from stereotypes, generalities, and prior histories whenever there are new specific instances or gaps in information. When we have partial information about a specific thing that belongs to a group of things we are pretty familiar with, our brain has no problem filling in the gaps with best guesses or what other trusted sources provide. Conveniently, we then forget which parts were real and which were filled in.

We imagine things and people we're familiar with or fond of as better than things and people we aren't familiar with or fond of. Similar to the above, but the filled-in bits generally also include built-in assumptions about the quality and value of the thing we're looking at.

We simplify probabilities and numbers to make them easier to think about. Our subconscious mind is terrible at maths and generally gets all kinds of things wrong about the likelihood of something happening if any data is missing.  
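To see how easily this goes wrong, here is a standard base-rate illustration (the numbers are illustrative, not taken from the article's list of biases): suppose a condition affects 1% of people, and a test detects it 90% of the time while giving a false positive 9% of the time. Bayes' rule gives the chance that a positive result really means the condition is present:

P(condition | positive) = (0.90 × 0.01) / (0.90 × 0.01 + 0.09 × 0.99) ≈ 0.09

That's roughly a 9% chance, not the 90% our intuition jumps to, because the intuition quietly drops the 1% base rate.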

We think we know what others are thinking. In some cases this means that we assume they know what we know; in other cases we assume they're thinking about us as much as we are thinking about ourselves. It's basically a case of us modelling their minds on our own (or, in some cases, on a much less complicated mind than our own).

We project our current mindset and assumptions onto the past and future. This is magnified by the fact that we're not very good at imagining how quickly or slowly things will happen or change over time.

Problem 4: Need to act fast

We’re constrained by time and information, and yet we can’t let that paralyze us. Without the ability to act fast in the face of uncertainty, we surely would have perished as a species long ago. With every piece of new information, we need to do our best to assess our ability to affect the situation, apply it to decisions, simulate the future to predict what might happen next, and otherwise act on our new insight.

In order to act, we need to be confident in our ability to make an impact and to feel like what we do is important. In reality, most of this confidence can be classified as overconfidence, but without it we might not act at all.

In order to stay focused, we favour the immediate, relatable thing in front of us over the delayed and distant. We value stuff more in the present than in the future, and relate more to stories of specific individuals than to anonymous individuals or groups. I'm surprised there aren't more biases found under this one, considering how much it impacts how we think about the world.

In order to get anything done, we’re motivated to complete things that we’ve already invested time and energy in. The behavioral economist’s version of Newton’s first law of motion: an object in motion stays in motion. This helps us finish things, even if we come across more and more reasons to give up.

In order to avoid mistakes, we're motivated to preserve our autonomy and status in a group, and to avoid irreversible decisions. If we must choose, we tend to pick the option that is perceived as the least risky or that preserves the status quo. Better the devil you know than the devil you don't.

We favour options that appear simple, or that have more complete information, over more complex options.
