There are MANY behavioral biases, and the books by Kahneman, Ariely and Thaler (to name a few) explain what lies behind these biases. Fantastic books! However, sometimes a list of common ones is welcome; in this article, 61 are presented.
57 Cognitive Biases That Screw Up How We Think
People aren’t as rational as we would like to think. From attentional bias – where someone focuses on only one or two of several possible outcomes – to zero-risk bias – where we place too much value on reducing a small risk to zero – the sheer number of cognitive biases that affect us every day is staggering.
(At the end of this post, I have copied the content from the page, in case it goes offline.)
This graphic is also a useful reminder. 🙂
 http://uk.businessinsider.com/cognitive-biases-that-affect-decisions-2015-8?r=US&IR=T
61 Behavioral Biases That Screw Up How You Think
- May 27, 2012, 1:38 PM
Every decision we make is influenced by subconscious behavioral biases.
They cause us to make snap judgments based on bad information, to be unfair and to waste time. This is clearly problematic for investors, managers and people in general. Once we become aware of these biases, we can disrupt our thinking and come to terms with reality.
We’ve collected a long list of cognitive biases from Tim Richards’ Psy-Fi Blog, Wikipedia and more.
Anchoring
The tendency to rely too heavily on one piece of information. For example, when buying a used car, someone might focus too much on the odometer reading rather than considering engine condition and other factors.
Read more about anchoring.
Attentional bias
When someone focuses on only one or two choices despite there being several possible outcomes.
Read more about attentional bias.
Availability heuristic
Where people overestimate the importance of information that is available to them.
One example would be a person who argues that smoking is not unhealthy on the basis that his grandfather lived to 100 and smoked three packs a day, an argument that ignores the possibility that his grandfather was an outlier.
Read more about the availability heuristic.
Availability cascade
A simple idea that gains popularity because of how simple it is, and then seems even more plausible because of how popular it has become. For example, claims that the earth is flat.
Read more about the availability cascade.
Backfire effect
When evidence that contradicts your point of view leads you to reject it and believe your original position even more strongly.
Read more about the backfire effect.
Bandwagon effect
The probability of one person adopting a belief increases based on the number of people who hold that belief. This is a powerful form of groupthink.
Read more about the bandwagon effect.
Belief bias
A bias where people draw faulty conclusions based on what they already believe or know. For instance, one might reason that because all tiger sharks are sharks, and all sharks are animals, all animals must be tiger sharks.
Read more about belief bias.
Bias blind spots
If you fail to recognize your own cognitive biases, you have a bias blind spot. Everyone thinks they are less biased than other people, which is itself a cognitive bias.
Read more about bias blind spot.
Choice-supportive bias
A bias in which you think positive things about a choice once you’ve made it, even if that choice has flaws. You may say positive things about the dog you just bought and ignore the fact that it bites people.
Read more about choice-supportive bias.
Clustering illusion
To conclude that data contains a “streak” or “cluster” when that set is actually random.
For instance in basketball, the hot hand effect is the belief that a player who has hit several shots in a row is more likely to hit the next shot.
A related error is the gambler’s fallacy, where one thinks a winning number is “due” after a run of losses.
Read more about the clustering illusion.
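To see how easily pure randomness produces “streaks,” here is a small illustrative sketch of my own (not from the article); the number of trials and the streak length of five are arbitrary assumptions. It simulates sequences of fair coin flips and counts how often a long run of identical results appears anyway.

```python
import random

def longest_run(flips):
    """Length of the longest run of identical outcomes in a sequence."""
    best = current = 1
    for prev, nxt in zip(flips, flips[1:]):
        current = current + 1 if nxt == prev else 1
        best = max(best, current)
    return best

random.seed(42)
trials = 10_000
flips_per_game = 20  # e.g. 20 shot attempts in a game
with_streak = sum(
    longest_run([random.choice("HT") for _ in range(flips_per_game)]) >= 5
    for _ in range(trials)
)
print(f"Share of random 20-flip sequences containing a streak of 5+: "
      f"{with_streak / trials:.0%}")
```

Even with a fair coin, long streaks turn up in a sizeable share of sequences, which is exactly the kind of pattern the clustering illusion misreads as a “hot hand.”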
Confirmation bias
The tendency to favor information that confirms what we already think or believe.
Read more about confirmation bias.
Conservatism bias
Where people favor prior evidence over new evidence or information that has emerged. People were slow to accept that the earth was round because they clung to the earlier belief that it was flat.
Read more about conservatism.
Curse of knowledge
When people who are better informed find it extremely difficult to think about a problem from the perspective of someone less informed. For instance, in the TV show “The Big Bang Theory” it’s difficult for scientist Sheldon Cooper to understand his waitress neighbor Penny.
Read more about the curse of knowledge.
Decoy effect
A phenomenon in marketing where consumers have a specific change in preference between two choices after being presented with a third choice.
Read more about the decoy effect.
Denomination effect
People are less likely to spend large bills than their equivalent value in small bills or coins.
Read more about the denomination effect.
Distinction bias
When you value two options differently when looking at them together rather than separately.
Read more about distinction bias.
Duration neglect
When the duration of an event doesn’t factor enough into how we value it. For instance, we may remember momentary displeasure as strongly as protracted displeasure.
Empathy gap
Where people in one emotional or physical state fail to understand people in a different state. If you are happy, you can’t imagine why people would be unhappy. When you are not sexually aroused, you can’t understand how you act when you are.
Read more about the empathy gap.
Frequency illusion
Where a word, name or thing you just learned about suddenly appears everywhere. Now that you know what that SAT word means, you see it in so many places!
Read more about the frequency illusion.
Galatea Effect
Where people perform better – or worse – simply because they expect themselves to.
Halo effect
Where we take one positive attribute of someone and associate it with everything else about that person or thing.
Read more about the halo effect.
Hard-Easy bias
Where we tend to be overconfident on hard problems and not confident enough on easy ones.
Read more about the hard-easy bias.
Herding
People tend to flock together, especially in difficult or uncertain times.
Read more about herd behavior.
Hindsight bias
The tendency to see past events as predictable. “I knew all along Philip Phillips would win American Idol.” Sure you did…
Read more about hindsight bias.
Hyperbolic discounting
The tendency for people to want an immediate payoff rather than a larger gain later on. Most people would rather take $5 now than $7 in a week.
Read more about hyperbolic discounting.
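As a rough illustration (my own sketch, not part of the article), the “$5 now versus $7 in a week” choice can be framed with a simple hyperbolic discount function V = A / (1 + k·t); the discount rate k used here is an assumed value purely for demonstration.

```python
def hyperbolic_value(amount, delay_days, k=0.1):
    """Present value under hyperbolic discounting: V = A / (1 + k * t).
    k is an assumed, illustrative per-day discount parameter."""
    return amount / (1 + k * delay_days)

now = hyperbolic_value(5, delay_days=0)    # $5 today keeps its full value
later = hyperbolic_value(7, delay_days=7)  # $7 in a week is discounted

print(f"Perceived value of $5 now:       ${now:.2f}")
print(f"Perceived value of $7 in a week: ${later:.2f}")
# With k = 0.1 the delayed $7 "feels" worth about $4.12, so $5 now wins,
# even though waiting a week is the objectively larger payoff.
```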
Ideomotor effect
Where an idea causes you to have an unconscious physical reaction, like a sad thought that makes your eyes tear up. This is also how Ouija boards seem to have minds of their own.
Read more about the ideomotor effect.
Illusion of control
The tendency for people to overestimate their ability to control events, like when a sports fan thinks his thoughts or actions had an effect on the game.
Read more about illusion of control.
Illusion of validity
When weak but consistent data leads to confident predictions. Like one commenter noted on the MIT admissions blog:
Why is MIT’s admissions process better than random? Say you weeded out the un-qualified (the fewer-than-half of applicants insufficiently prepared to do the work at MIT) and then threw dice to stochastically select among the remaining candidates. Would this produce a lesser class?
Read more about illusion of validity.
Information bias
The tendency to seek information when it does not affect action. More information is not always better.
Read more about information bias.
Inter-group bias
We view people in our group differently than we would someone in another group.
Read more about inter-group bias.
Irrational escalation
Investing more money or resources into something based on prior investment, even if you know it’s a bad one. “I already have 500 shares of Lehman Brothers, let’s buy more even though the stock is tanking.”
Read more about irrational escalation.
Just-world hypothesis
People wanting to believe that situations are guided by forces of justice, stability or order. These days people like to call this karma – and call karma some not-so-nice names when it doesn’t deliver.
Read more about just-world hypothesis.
Less-is-more effect
With less knowledge, people can often make more accurate predictions.
Read more about less-is-more effect.
Negativity bias
The tendency to put more emphasis on negative experiences rather than positive ones. People with this bias feel that “bad is stronger than good” and will perceive threats more than opportunities in a given situation.
This leads toward loss aversion.
Read more about negativity bias.
Observer-expectancy effect
Our expectations unconsciously influence how we perceive an outcome. Researchers, for example, looking for a certain result in an experiment, may inadvertently manipulate or interpret the results to reveal their expectations. That’s why the “double-blind” experimental design was created for the field of scientific research.
Read more about observer-expectancy effect.
Omission bias
The tendency to judge harmful actions as worse than equally harmful inactions. For example, we consider it worse to crash a car while drunk than to let one’s friend crash his car while drunk.
Read more about omission bias.
Ostrich effect
The decision to ignore dangerous or negative information by “burying” one’s head in the sand, like an ostrich.
Read more about the ostrich effect.
Outcome bias
Judging a decision based on its outcome rather than on the quality of the decision at the time it was made. This fails to account for the role luck plays in outcomes.
Read more about outcome bias.
Overconfidence
We are too confident about our abilities, and this causes us to take greater risks in our daily lives.
Read more about overconfidence.
Overoptimism
When we believe the world is a better place than it is, we aren’t prepared for the danger and violence we may encounter. The inability to accept the full breadth of human nature leaves us vulnerable.
Read more about optimism bias.
Pessimism bias
This is the opposite of the overoptimism bias. Pessimists over-weigh the negative consequences of their own and others’ actions.
Read more about pessimism bias.
Placebo effect
A self-fulfilling prophecy, where belief in something causes it to be effective. This is a basic principle of stock market cycles.
Read more about placebo effect.
Planning fallacy
The tendency to underestimate how much time it will take to complete a task.
Read more about planning fallacy.
Post-purchase rationalization
Making ourselves believe that a purchase was worth the value after the fact.
Read more about post-purchase rationalization.
Pro-innovation bias
When a proponent of an innovation tends to overvalue its usefulness and undervalue its limitations.
Read more about pro-innovation bias.
Procrastination
Deciding to act in favor of the present moment over investing in the future.
Read more about procrastination.
Reactance
The desire to do the opposite of what someone wants you to do, in order to prove your freedom of choice.
Read more about reactance.
Recency
The tendency to weight the latest information more heavily than older data.
Read more about recency.
Reciprocity
The belief that fairness should trump other values, even when it’s not in our economic and/or other interests.
Read more about reciprocity.
Regression bias
People take action in response to extreme situations. Then when the situations become less extreme, they take credit for causing the change, when a more likely explanation is that the situation was reverting to the mean.
Read more about regression bias.
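A small simulation (my own sketch, with assumed numbers) shows how an extreme measurement tends to drift back toward the average on its own, which is easy to misread as the effect of whatever action was taken in between.

```python
import random

random.seed(0)

# Assume each observation = stable underlying skill (100) + independent noise.
def observe(skill, noise_sd=15):
    return skill + random.gauss(0, noise_sd)

skill = 100
first = [observe(skill) for _ in range(100_000)]
second = [observe(skill) for _ in first]

# Look only at cases where the first observation was extreme (worst ~5%).
cutoff = sorted(first)[len(first) // 20]
extreme = [(f, s) for f, s in zip(first, second) if f <= cutoff]
avg_first = sum(f for f, _ in extreme) / len(extreme)
avg_second = sum(s for _, s in extreme) / len(extreme)

print(f"Average of extreme first observations: {avg_first:.1f}")
print(f"Average of their second observations:  {avg_second:.1f} (back near 100)")
# No intervention happened, yet the follow-up looks "improved": regression to the mean.
```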
Restraint bias
Overestimating one’s ability to show restraint in the face of temptation.
Read more about restraint bias.
Salience
Our tendency to focus on the most easily-recognizable features of a person or concept.
Read more about salience.
Seersucker Illusion
Over-reliance on expert advice. This has to do with the avoidance of responsibility: we call in “experts” to forecast, when in fact they have no greater chance of predicting an outcome than the rest of the population. In other words, “for every seer there’s a sucker.”
Read more about seersucker illusion.
Selective perception
Allowing our expectations to influence how we perceive the world.
Read more about selective perception.
Self-enhancing transmission bias
Everyone shares their successes more than their failures. This leads to a false perception of reality and an inability to accurately assess situations.
Read more about self-enhancing transmission bias.
Status quo bias
The tendency to prefer things to stay the same. This is similar to loss-aversion bias, where people prefer to avoid losses instead of acquiring gains.
Read more about status quo bias.
Stereotyping
Expecting a group or person to have certain qualities without having real information about the individual. This explains the snap judgments Malcolm Gladwell refers to in “Blink.”
Read more about stereotyping.
Survivorship bias
An error that comes from focusing only on surviving examples, causing us to misjudge a situation. For instance, we might think that being an entrepreneur is easy because we haven’t heard of all the entrepreneurs who have failed.
It can also cause us to assume that survivors are inordinately better than failures, without regard for the importance of luck.
Read more about survivorship bias.
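The effect is easy to reproduce numerically. The sketch below (my own illustration, with made-up return figures) simulates a pool of funds in which money-losing funds shut down and disappear, then compares the average return of the visible survivors with the average of the full pool.

```python
import random

random.seed(1)

# Made-up setup: each fund's annual return is drawn from a zero-mean distribution.
funds = [random.gauss(0.0, 0.10) for _ in range(10_000)]

survivors = [r for r in funds if r > 0]  # losing funds close and drop out of view
all_avg = sum(funds) / len(funds)
survivor_avg = sum(survivors) / len(survivors)

print(f"Average return of ALL funds:       {all_avg:+.1%}")
print(f"Average return of surviving funds: {survivor_avg:+.1%}")
# Judging the industry only by the funds still visible overstates typical returns.
```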
Tragedy of the commons
We overuse common resources because it’s not in any individual’s interest to conserve them. This explains the overuse of natural resources, opportunism, and any acts of self-interest over collective interest.
Read more about the tragedy of the commons.
Unit bias
We believe that there is an optimal unit size, or a universally-acknowledged amount of a given item that is perceived as appropriate. This explains why when served larger portions, we eat more.
Read more about unit bias.
Zero-risk bias
The preference to reduce a small risk to zero versus achieving a greater reduction in a greater risk.
This plays to our desire to have complete control over a single, more minor outcome, over the desire for more — but not complete — control over a greater, more unpredictable outcome.
Read more about zero-risk bias.
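A back-of-the-envelope comparison (my own numbers, purely illustrative) makes the trade-off concrete: eliminating a 1% risk entirely prevents less expected harm than cutting a 20% risk in half, yet the “risk-free” option often feels more attractive.

```python
# Illustrative expected-harm arithmetic with assumed probabilities
# and a common potential loss of 1,000.
loss = 1_000

# Option A: reduce a small risk from 1% to 0% ("zero risk").
reduction_a = (0.01 - 0.00) * loss
# Option B: reduce a larger risk from 20% to 10%.
reduction_b = (0.20 - 0.10) * loss

print(f"Expected harm avoided by Option A (1% -> 0%):   {reduction_a:.0f}")
print(f"Expected harm avoided by Option B (20% -> 10%): {reduction_b:.0f}")
# Option B avoids ten times more expected harm, but Option A offers the appeal of certainty.
```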