Cognitive biases are systematic errors in reasoning based on context, past experiences, and knowledge. The world is complex, and so is our brain. We are constantly making decisions, both good and bad. To improve this process, we need to constantly question reality and ourselves.

Cognitive Biases and Heuristics

Yes, No, and "It depends"

The human mind is complex and fascinating, yet it can also be remarkably predictable. Cognitive biases are systematic errors in reasoning that happen when people judge the probability of events on the basis of what they already know.

Cognitive heuristics are mental shortcuts used to judge how likely an event or outcome is; they speed up decisions but often lead to mistakes.

In this blog post, we'll examine cognitive biases and heuristics from a psychological perspective - how they work and why they happen.

Action Bias

Action bias is a cognitive tendency that favors decision-making based on action rather than inaction.

Cognitive biases such as the gambler's fallacy and the sunk cost fallacy can feed action bias, leading people to favor doing something even when it might be better not to do anything at all.

This type of bias is often seen in business and investment contexts, where people are more likely to take risks in order to avoid potential losses.

An example of how action bias can lead to bad decision-making can be found in the sunk cost fallacy.

The sunk cost fallacy occurs when people continue investing time or money into something they know they shouldn't.

The bias can lead people to continue spending money on bad investments, or to keep pouring time into a project that isn't working, because they don't want their earlier investment to be for nothing.

Affect Heuristic

The affect heuristic is a mental shortcut that relies on emotions rather than reason to make judgments about the likelihood of events.

When people are asked to estimate how likely something is, they often rely on their feelings or intuition instead of doing the actual math or reasoning required.

This can lead to inaccurate judgments, especially if people are feeling emotional at the time.

For example, if you're feeling angry or afraid, your judgment about how likely it is that something will happen may be skewed by those emotions. The affect heuristic can also lead people to make decisions based on their feelings for other people rather than what would be objectively best for them.

For example, you might decide to vote for a candidate because you like them, even if they're not the best candidate for the job.

Ambiguity Effect

The ambiguity effect is the tendency to avoid options whose odds are unknown, preferring choices where the probabilities are clear.

If you like someone who clearly likes you back, that's great! But what happens when you can't tell whether the person likes you or not? When the odds are unknown, the ambiguity itself can lead to indecision or anxiety.

This effect is caused by the uncertainty associated with ambiguous situations - people don't like not knowing what's going to happen.

The ambiguity effect is also relevant in business decisions. For example, imagine you're considering investing in a new company. You know that there are risks involved, but you're not sure what they are.

You might be more likely to invest if you have a good idea of the risks involved because that will reduce the ambiguity and make it easier for you to make a decision.

Conversely, if you don't know much about the company or the industry, you'll be less likely to invest, because of the increased ambiguity.

Network Effect

The network effect is a phenomenon where a product or service becomes more valuable as more people use it.

This makes it difficult for competitors to enter the market because they have to be able to attract customers away from the original network in order to compete effectively.

It's a common theme in many different aspects of business, from social media and online services to cell phones and banking. The concept of the network effect can also apply to companies' stocks and assets, as well as intellectual property like patents.

The more people use a network, the more useful it becomes. A phone gains utility as more people use phones because more people can be called with it. It's why Twitter & Facebook are so dominant; we're stuck on these platforms because everyone else is.
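
One common back-of-the-envelope way to see why value snowballs is Metcalfe's law, which values a network by the number of possible connections between its users. The post doesn't cite this formula, and the quadratic model below is a rough illustrative assumption, not an exact rule:

```python
# A minimal sketch of the network effect via Metcalfe's law: value is
# proportional to the number of possible user-to-user connections.
# The quadratic model is a rough, illustrative assumption.

def potential_connections(users: int) -> int:
    """Distinct pairs of users who could interact: n * (n - 1) / 2."""
    return users * (users - 1) // 2

for n in (10, 100, 1_000, 10_000):
    print(f"{n:>6} users -> {potential_connections(n):>10,} possible connections")

# Users grow 1,000x (10 -> 10,000) but connections grow ~1,000,000x,
# which is why a rival network starting from zero struggles to compete.
```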

Anchoring Bias

Anchoring bias is a cognitive bias that occurs when people rely too heavily on the first piece of information they receive when making decisions.

For example, if a car is first listed at $18,000 and then offered to you for $15,000, the second price feels like a bargain: the initial figure anchors your sense of what the car is worth, even if $15,000 is still not a good deal.

Attentional Bias

Attentional bias is the tendency for people to focus on certain pieces of information while ignoring others.

It can be due to cognitive factors, such as memory load or how interesting something is, or emotional factors, such as anxiety or fear.

Attentional bias can lead to inaccurate judgments and decisions.

For example, someone with a strong attentional bias for words printed in red might tend to overestimate the number of times this color appears on a website.

Availability Heuristic

The availability heuristic is a cognitive bias that occurs when people estimate the probability of an event based on how easily they can think of examples. This heuristic is often used to make decisions quickly and under time pressure, but it can lead to inaccurate judgments.

For example, people might overestimate the likelihood of a terrorist attack happening in their city because there have been several high-profile attacks in recent months, even though the overall risk of being harmed by terrorism is actually very low.

Another common application of the availability heuristic is judging the attractiveness of potential romantic partners - people are more likely to be drawn to someone who seems familiar or easily available without taking other factors into account.

Bandwagon Effect

The bandwagon effect is a cognitive bias that occurs when people do something or believe something because other people do or believe it.

It's often used to describe the way people jump on the latest trend without really considering whether it's right for them, and it can lead to some pretty bad decisions.

For example, think about all of the people who bought cryptocurrencies at the peak of the late-2017 bubble without understanding what they were investing in - they were just going with the flow instead of thinking critically about their decision.

Cognitive biases like this one can be incredibly powerful forces, so it's important to be aware of them and learn how to counteract them.

Paradox of Abundance

Easy availability of food led to obesity for the masses but good health for the few who used the increased choice to avoid the mass-produced junk. Equally, you can avoid intellectual diabetes by ignoring junk info like gossip & clickbait.

Barnum Effect

The Barnum Effect is a cognitive bias that occurs when people read descriptions of their personality traits and think they are accurate.

This happens because people tend to be quite bad at accurately judging their own personalities, so they're more likely to believe any description that seems to fit them well.

The Barnum Effect is also known as the Forer Effect, after psychologist Bertram Forer, who first described it in 1948.

Imagine you take a personality test online, and it tells you that you're "creative, intuitive, and insightful."

You might start to believe that this is true about yourself, even though there's no evidence for it. Almost everyone would probably say they are creative, intuitive, and insightful - these are positive traits that most people like to claim for themselves.

There's nothing wrong with you if the test told you this - it's just a tendency we all have, and one of the ways cognitive biases work.

Base Rate Fallacy

The base rate fallacy is a logical error in which people ignore general statistical information (the base rate) in favor of specific or vivid case information.

Cognitive heuristics can lead us to make assumptions about probabilities based on our personal experience rather than the bigger picture.

Bikeshedding

Bikeshedding is a cognitive bias that occurs when people focus more on the details of a problem than on its overall purpose. This can lead to unnecessary debate and discussion over minor issues while the bigger picture is ignored.

Suppose you're planning a party and need to decide what food to serve. If you spend too much time debating whether to serve hot dogs or hamburgers, you'll never get around to deciding on anything else.

But if you focus on the overall goal - providing food for your guests - then making a decision about which type of meat to serve becomes much easier.

The Bottom-Dollar Effect

The bottom-dollar effect is a cognitive bias that causes people to dislike purchases that exhaust their remaining budget.

It reflects how painful a payment feels: the same purchase hurts more when it drains the last of the budget than when plenty remains.

It has been observed in multiple studies, with similar results across different consumer markets.

For example, consumers report less satisfaction with an item bought with the last of their monthly budget than with an identical item bought when plenty of budget remained.

As a result, consumers may avoid purchases that would exhaust their spending limit for the day or month.

Bounded Rationality

Bounded rationality refers to the fact that, as human beings, we cannot take every factor into account and instead settle for the simplest workable solution.

The bounded nature of human rationality means using only a subset of the available information, thus reducing complexity and saving cognitive resources.

It is the idea that when people make decisions, they do not gather and analyze all the available information. Instead, they decide quickly and simply, based on a subset of the information.

Although this notion seems to go against our idea of rationality, it is, in fact, the main source of human rationality.

For example, when thinking about how much money to spend on a product, we might take into account how much we earn and our total expenses in order to decide how much money we can afford to spend.

This is a rational way of making a decision, but it isn't likely to be our only consideration.

If there are many products available at varying prices, then we may also consider how nice they look, what other people are saying about them, or how they feel before choosing one over another.

These preferences will affect our choice even though they do not necessarily have any impact on its rationality.

Bundling Bias

Bundling bias is a cognitive bias that occurs when people evaluate items grouped together as a single unit instead of judging each item on its own merits.

For example, a shopper might judge a travel bundle of flight, hotel, and insurance to be a good deal overall without checking whether each component is fairly priced.

This bias can also lead to decision-making mistakes; for example, people might be more likely to buy a product if it's bundled with other products they want, rather than buying the product alone.

Bundling bias can occur in all sorts of situations, both online and offline. Grouping items together has become a common tactic for marketers on e-commerce sites; many companies will offer free shipping if you buy two or more products at once, so the buyer is likely to assume that one product isn't worth buying without the other.

Parkinson's Law

Work expands to fill the time allotted for it. No matter the size of the task, it will often take precisely the amount of time you set aside to do it because more time means more deliberation & procrastination.

Flow States

You're in flow when you're so engrossed in a task that the world vanishes, and the work seems to do itself. Flow is automatic, and it makes work much easier than you imagined. All you have to do is overcome the initial hurdle of beginning a task; flow does the rest.

The Curse of Knowledge

The more familiar you become with an idea, the worse you become at explaining it to others because you forget what it's like to not know it and therefore what needs to be explained to understand it. Makes it hard to write threads like this!

Status Quo Bias

Those who were unfazed by Covid because it had a roughly 1% fatality rate were suddenly concerned about vaccines when those carried a roughly one-in-a-million fatality rate. People see the risks of doing something but not the risks of doing nothing.

Semmelweis Reflex

People tend to reject evidence that doesn't fit the established worldview. Named for Ignaz Semmelweis, a physician who, before the discovery of germs, claimed that washing hands could help prevent patient infections. He was ridiculed and locked away in a mental asylum.

Bias Against Null Results

Studies that find something surprising are more interesting than studies that don't, so they're more likely to be published. This creates the impression that the world is more surprising than it actually is. The same applies to news.

P-hacking

"If you torture the data for long enough, it'll confess to anything." Academics get around the bias against null results by performing many statistical tests on data until a significant result is found, then recording only this. P-hacking is largely why we have a replication crisis.
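
To make the mechanism concrete, here is a minimal simulation (illustrative, not from any real study): run enough tests on pure noise and some will come out "significant" by chance alone.

```python
import random

# A minimal sketch of p-hacking: test pure noise many times and report
# only the "significant" results. All numbers are illustrative.

random.seed(42)

# Under the null hypothesis (no real effect), p-values are uniform on [0, 1].
p_values = [random.random() for _ in range(20)]  # 20 looks at the same noise
false_positives = [p for p in p_values if p < 0.05]

print(f"{len(false_positives)} of {len(p_values)} tests on noise were 'significant'")

# With 20 tests at alpha = 0.05, the chance of at least one false positive
# is 1 - 0.95**20, roughly 64%. Report only that test and the noise
# looks like a publishable finding.
print(f"Chance of at least one false positive: {1 - 0.95 ** 20:.0%}")
```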

Replication Crisis

A large proportion of scientific findings have been found to be impossible to replicate, with successive tests often yielding wildly different results. Too many studies are bunk to take any of them at face value.

Luxury Beliefs

Cultural elites often adopt views that signal status for them but hurt the less fortunate. E.g., Those who claim that concern about Islamism is Islamophobic appear open-minded but, in fact, dismiss the (usually Muslim) victims of such extremism.

Bulverism

Instead of assessing what a debate opponent has said on its own merits, we assume they're wrong and then try to retroactively justify our assumption, usually by appealing to the person's character or motives. It explains 99% of Twitter debates.

Scout Mindset

We tend to approach discourse with a "soldier mindset": an intention to defend our own beliefs and defeat opponents'. A more useful approach is to adopt a "scout mindset": an intention to explore and gather information. h/t: Julia Galef.

Operation Mindfuck

A conspiracy theory that can protect you from conspiracy theories. The Operation is being conducted by persons unknown and is a plot to make you believe lies. Whenever you receive information, ask yourself, is this part of Operation Mindfuck?

Hitchens' Razor

What can be asserted without evidence can be dismissed without evidence. If you make a claim, it's up to you to prove it, not to me to disprove it.

Decision Fatigue

The more decisions you make in a day, the worse your decisions get, so rid your life of trivial choices. Steve Jobs, Barack Obama & Mark Zuckerberg have been known to wear only one or two outfits to work so they don't have to choose each day.

Cumulative Culture

Humanity's success is due not to our individual IQs but to our culture, which stockpiles our best ideas for posterity so they compound across generations. The ideas we adopt from society are often far older than us, and far wiser.

Chesterton's Fence

If an old law or tradition seems so irrational that you want to scrap it, you shouldn't scrap it yet. The fact that it has survived the ages despite seeming irrational suggests it serves some purpose. Before acting, understand that purpose. An argument for conservatism.

The Veil of Ignorance

Create a constitution for a country as though you could wake up tomorrow in the body of any citizen, of any race, religion, or gender, and be forced to live as them in the society you've created. A central idea behind liberalism.

The Tragedy of the Commons

The Rapa Nui people of Easter Island felled trees for wood until there were not enough trees to provide food, causing mass starvation. Everyone acting in their own interests can create outcomes against everyone's interests. A common argument for regulation.

Purposeful Stupidity

A common argument against regulation. In 1944, the OSS (the CIA's predecessor) published a field manual laying out strategies to subtly sabotage a society from within. The tactics described are eerily similar to what passes for normality today.

Mediocracy

Democracy works not because it picks the best leaders, but because it picks the most average leaders. The purpose of democracy is not so much progress as preservation.

The Messiah Effect

Most people don't believe in ideals, but in people who believe in ideals. Hence why successful religions tend to have human prophets or messiahs, and why, when a demagogue changes his beliefs, the beliefs of his followers often change accordingly.

Mental Models

Mental models help us simplify complex issues and make better decisions. The more mental models you have, the better your judgment will be. Here are some of the most important ones to know:

Availability Heuristic

A mental model that causes us to judge probabilities on the basis of what's readily available in our thoughts or memories. For example, if something comes to mind quickly and easily (e.g., dinosaurs), it seems more plausible than something that is difficult to think about. Or when you are trying to predict the outcome of a situation in which you have little information, it's easy for your mind to substitute an easier question in its place, usually by focusing on the most obvious or dramatic outcome.

Base rate fallacy

A mental model that causes us to ignore general statistical data in favor of specific case information. For example, doctors often overestimate the chance that a patient with a positive screening result - such as a positive mammogram - actually has the disease, because they neglect how rare the disease is in the population being tested. We often refuse to believe something is true unless we can see the evidence with our own eyes and think it through for ourselves, even if this conflicts with the statistics. For example, it's easier to believe that a man would take more days off work than a woman, even though research shows this is not the case.
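
A quick worked example makes the arithmetic vivid. The numbers below are made up for illustration (a rare condition and a decent-but-imperfect test), not taken from any real screening program:

```python
# A worked base-rate example with made-up numbers. Intuition says a
# positive result from a 90%-accurate test means ~90% odds of disease;
# Bayes' rule and the low base rate say otherwise.

base_rate = 0.01             # P(disease): 1% of the tested population
sensitivity = 0.90           # P(positive | disease)
false_positive_rate = 0.09   # P(positive | healthy)

# Total probability of testing positive, sick or not.
p_positive = sensitivity * base_rate + false_positive_rate * (1 - base_rate)

# Bayes' rule: P(disease | positive).
p_disease_given_positive = sensitivity * base_rate / p_positive

print(f"P(disease | positive test) = {p_disease_given_positive:.1%}")
# Prints ~9.2%: most positives come from the much larger healthy group.
```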

Hindsight bias

A mental model that causes us to see past events as having been more predictable than they actually were. For example, most people feel they would have predicted the events of 9/11 had they been able to see them coming, when in fact they wouldn't have had a clue. This is because our minds tend to remember only the information we learned after an event occurred, and fail to account for everything that was unknown beforehand.

Illusion of control

A mental model that causes us to overestimate our degree of influence over external events. For example, gamblers may throw dice harder when they want high numbers, and lottery players often prefer tickets with numbers they chose themselves, as though these choices could influence purely random outcomes.

Negativity bias

A mental model that causes us to pay more attention and give more weight to negative things than positive ones. For example, you are likely to be more worried about a potential dismissal than you would be excited by the prospect of an equally likely promotion. This is because our brains evolved to prioritize bad news over good news in order to avoid overlooking threats in the environment.

Normalcy bias

A mental model that causes us to underestimate both the possibility of a disaster occurring and its possible effects. For example, this is what happens when people refuse to leave their homes after an earthquake because they assume "it can't be that bad." Or why residents ignore hurricane evacuation warnings: because disaster has never struck them before, they assume it never will.

Ostrich effect

A mental model that causes us to avoid or ignore information because we would rather not deal with it, even though this makes the problem worse in the long run. For example, many business leaders know they should focus on cyber security, yet push it to the back of their minds and pretend it isn't a problem for them. Or when something goes wrong with our computer or smart device, we immediately assume there is no easy solution, rather than taking the time to look up how to fix it online.

Outcome bias

A mental model that causes us to judge a decision based on its eventual outcome rather than based on the quality of the decision at the time it was made. For example, we might think someone is wise because they've been successful in life—when in reality their wisdom may have contributed nothing toward their achievements. Or we might think someone is stupid because they failed at something—when in reality, for all we know, their failure could have been due to bad luck.

Planning fallacy

A mental model that causes us to underestimate the time, costs, or risks involved in future actions and at the same time overestimate the benefits. For example, when making plans for taking on a new business venture, we might be overly optimistic about how well it will do. Or when planning a holiday, we might fail to take into account all the costs involved before leaving home.

Post-purchase rationalization

A mental model that causes us to convince ourselves that expensive products and services are actually worth their price after having made the purchase. For example, we might convince ourselves that a $200 pair of shoes is well-made and far better than any $20 pair, even if we don't really believe it. Or we might tell ourselves the contractor we hired to fix our roof did an excellent job even though they didn't.

Reactance

A mental model that causes us to want things more once our freedom to choose them is restricted. For example, telling people they may not have a particular chocolate from a box makes that chocolate the most desirable one. Or a job offer that is about to expire suddenly seems more attractive than it did before, simply because the option is being taken away.

Restraint bias

A mental model that causes us to overestimate our ability to show restraint in the face of temptation. For example, when on a diet we might be sure we can resist the doughnuts in the office, only to cave by the afternoon. Or we might plan to drink sensibly at a wedding and then find our willpower evaporates after the first toast.

Sunk cost fallacy

A mental model that causes us to irrationally justify bad decisions by focusing on resources already lost rather than on the likelihood of future success. It's why we stay in terrible relationships: all the time and effort that went into building them feels like too much to give up. It's why we keep funding terrible business investments: all the money already spent feels like too much to walk away from. And it's why we don't quit bad habits, even when they're hurting us.

Zero-risk bias

A mental model that causes us to place a disproportionately high value on reducing a small risk all the way to zero, while undervaluing larger reductions in bigger risks. For example, we might pay to rewire the basement to eliminate a tiny fire risk entirely, yet skip a cheaper repair that would halve a much larger risk of storm damage, because "zero risk" feels uniquely reassuring even when the bigger reduction prevents more harm.
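
A small expected-value comparison (with made-up dollar figures and probabilities) shows why the "zero" option can be the worse buy:

```python
# A minimal sketch of zero-risk bias with made-up numbers. Option A
# eliminates a small risk entirely; option B halves a larger risk.
# B prevents more expected harm, yet A's "zero" often feels better.

loss_if_event = 100_000  # illustrative cost of the bad event, in dollars

option_a = (0.01 - 0.00) * loss_if_event  # 1% risk reduced to zero
option_b = (0.10 - 0.05) * loss_if_event  # 10% risk cut to 5%

print(f"Option A (1% -> 0%) prevents ${option_a:,.0f} in expected loss")
print(f"Option B (10% -> 5%) prevents ${option_b:,.0f} in expected loss")
# B prevents five times as much expected loss; A only offers certainty.
```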

Zero-sum heuristic

A mental model that causes us to assume that one person's success and wealth must come at another's loss. In a true zero-sum game, someone wins and someone else loses - and many people mistakenly think of life as if it were a zero-sum game. They believe that the more money one person has, the less money everyone else can have, which is simply not true where new wealth is created.

Just-world belief

A mental model that causes us to blame victims for their misfortune in order to maintain our own sense that the world is just. For example, when someone gets fired from his job, we're quick to assume he must have deserved it.

Part 2 Coming Soon
