Cognitive biases are systematic patterns of deviation from norm or rationality in judgment: tendencies of thought that consistently and predictably depart from objective standards such as facts or rational choice. They shape perceptions, interpretations, and decisions, and have been extensively studied by researchers in psychology, behavioral economics, and related fields. The concept gained prominence through the work of psychologists Amos Tversky and Daniel Kahneman, whose research, particularly in prospect theory, highlighted systematic errors in human judgment and decision-making.
Beginning in the 1970s, Tversky and Kahneman conducted studies that challenged traditional economic models by revealing patterns of irrationality in how individuals assess risks, make choices, and form judgments.
Prospect theory, introduced by Tversky and Kahneman in 1979, revolutionized the understanding of decision-making under uncertainty. It demonstrated that people do not always make decisions based on rational assessments of expected value but are influenced by cognitive biases that deviate from classical economic assumptions. The theory highlighted phenomena such as loss aversion, framing effects, and the endowment effect, shedding light on how individuals deviate from rational decision-making in predictable ways.
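The loss aversion that prospect theory describes can be made concrete with a short sketch of its value function. The parameter values below (curvature 0.88, loss-aversion coefficient 2.25) are not given in this text; they are the median estimates from Tversky and Kahneman's later cumulative prospect theory work, assumed here purely for illustration.

```python
# Sketch of the prospect-theory value function. Parameter values are the
# median estimates from Tversky & Kahneman's 1992 cumulative prospect
# theory paper — an assumption for illustration, not values from this text.
ALPHA = 0.88   # curvature for gains (diminishing sensitivity)
BETA = 0.88    # curvature for losses
LAMBDA = 2.25  # loss-aversion coefficient (losses loom larger)

def value(x: float) -> float:
    """Subjective value of a gain or loss x relative to a reference point."""
    if x >= 0:
        return x ** ALPHA
    return -LAMBDA * ((-x) ** BETA)

# Loss aversion: losing $100 feels worse than gaining $100 feels good.
gain = value(100)
loss = value(-100)
print(abs(loss) > gain)  # the loss outweighs the equal-sized gain
```

Because gains and losses share the same curvature here, the ratio of a loss's felt magnitude to an equal gain's is exactly the loss-aversion coefficient, which is one simple way the "losses loom larger than gains" asymmetry is formalized.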
Their research laid the foundation for the field of behavioral economics, which integrates insights from psychology into economic theories. Tversky and Kahneman’s work earned them the Nobel Prize in Economic Sciences in 2002, recognizing the transformative impact of their contributions on our understanding of human decision-making and the pervasive influence of cognitive biases in various aspects of life.
Research on cognitive biases is carried out through empirical studies, experiments, and observations. Psychologists and behavioral economists design experiments to identify and understand how cognitive biases operate in different contexts. These studies often involve presenting participants with scenarios, decision-making tasks, or information to observe how biases influence their judgments and choices. Cognitive biases are not limited to academic research; they have practical implications in fields like marketing, finance, law, and various aspects of everyday life. Understanding these biases can help individuals make more informed decisions and help professionals design better systems, policies, and interventions. Researchers continue to explore new biases and refine their understanding of existing ones, contributing to the broader field of behavioral science.
Inventive (Cognitive) Biases
106 Overconfidence effect
107 Social desirability bias
108 Third–person effect
109 False consensus effect
110 Hard–easy effect
111 Lake Wobegon effect
112 Dunning–Kruger effect
113 Egocentric bias
114 Optimism bias
115 Forer effect
116 Barnum effect
117 Self–serving bias
118 Actor–observer bias
119 Illusion of control
120 Illusory superiority
121 Fundamental attribution error
122 Defensive attribution hypothesis
123 Trait ascription bias
124 Effort justification
125 Risk compensation
126 Peltzman effect
Overconfidence Effect: The tendency to overestimate one’s own abilities or the accuracy of one’s beliefs and predictions.
Social Desirability Bias: The inclination to respond in a way that is socially acceptable or perceived favorably by others, rather than providing honest or accurate information.
Third–Person Effect: The belief that others are more influenced by media messages than oneself, underestimating one’s susceptibility to media influence.
False Consensus Effect: The tendency to overestimate the extent to which others share one’s beliefs, attitudes, or behaviors.
Hard–Easy Effect: The phenomenon where people tend to overestimate their performance in easy tasks and underestimate their performance in difficult tasks.
Lake Wobegon Effect: The tendency to overestimate one’s abilities or characteristics in comparison to others – a belief that one is above average.
Dunning–Kruger Effect: The cognitive bias where individuals with low ability at a task overestimate their ability, while those with high ability underestimate their own competence.
Egocentric Bias: The inclination to rely too heavily on one’s own perspective and underestimate the impact of other people’s viewpoints.
Optimism Bias: The tendency to underestimate the likelihood of negative events happening to oneself and overestimate the likelihood of positive events.
Forer Effect: The tendency to accept vague and general personality descriptions as personally accurate, such as those often found in horoscopes or personality assessments.
Barnum Effect: The tendency to accept vague statements and generalizations about oneself as accurate, also known as the “personal validation fallacy.”
Self–Serving Bias: The tendency to attribute positive events to one’s own character and abilities, but attribute negative events to external factors.
Actor–Observer Bias: The tendency to attribute one’s own behavior to external factors while attributing others’ behavior to internal factors.
Illusion of Control: The belief that one has more control over events than is actually the case.
Illusory Superiority: The tendency for individuals to overestimate their own qualities and abilities in relation to others, often referred to as the “above-average effect.”
Fundamental Attribution Error: The inclination to attribute others’ actions to their character while attributing one’s own actions to external factors.
Defensive Attribution Hypothesis: The tendency to blame victims for their misfortune as a way to feel safer or more secure in one’s own world.
Trait Ascription Bias: The tendency to view one’s own personality, behavior, and mood as variable, while viewing others as more predictable in their traits.
Effort Justification: The tendency to attribute a greater value to an outcome that required significant effort or sacrifice.
Risk Compensation: The phenomenon where individuals adjust their behavior in response to perceived changes in risk, potentially leading to a nullification of safety measures.
Peltzman Effect: The idea that people may adjust their behavior in response to perceived safety measures, potentially leading to an increase in risky behavior.
Availability Bias
Anchoring Bias
Egocentric or Egocentricity Bias, Overconfidence Effect
Halo Effect, Halo Error, Association Fallacy
Recency Effect Bias
Framing Effect Bias
Sunk Cost Fallacy
Hindsight (“I-Knew-It-All-Along”) Bias
Loss Aversion Bias
Gambler’s Fallacy
Self-serving or Attribution Bias, Fundamental Attribution Error
Dunning-Kruger Effect
Social Desirability Bias
Illusory Correlation or Apophenia Bias
Mere-Exposure Effect, Familiarity Principle
Conformity Bias, Groupthink or Bandwagon
Negativity Bias
Algorithmic Bias
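Some of the biases listed above make probabilistic claims that can be checked directly. The gambler's fallacy, for instance, is the belief that a streak of one outcome makes the opposite outcome "due." A small simulation (hypothetical, not from the text) shows that after a run of heads, the chance of heads is unchanged:

```python
import random

# Gambler's fallacy check: after five heads in a row, is heads any
# less likely? Independent fair coin flips say no.
random.seed(0)

flips = [random.random() < 0.5 for _ in range(200_000)]  # True = heads
streaks = 0            # count of five-heads-in-a-row runs observed
streak_then_heads = 0  # how often the NEXT flip was also heads

for i in range(5, len(flips)):
    if all(flips[i - 5:i]):  # the previous five flips were all heads
        streaks += 1
        streak_then_heads += flips[i]

p = streak_then_heads / streaks
print(f"P(heads | 5 heads in a row) ≈ {p:.3f}")  # stays close to 0.5
```

The conditional frequency hovers around one half, which is exactly why betting against a streak confers no advantage: the coin has no memory.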
References
“Thinking, Fast and Slow” by Daniel Kahneman: This book by Nobel laureate Daniel Kahneman explores the two systems that drive the way we think—System 1, which is fast and intuitive, and System 2, which is slow and deliberate.
“Predictably Irrational” by Dan Ariely: Dan Ariely, a behavioral economist, discusses various irrational behaviors and cognitive biases that influence decision-making in this engaging book.
“Nudge: Improving Decisions About Health, Wealth, and Happiness” by Richard H. Thaler and Cass R. Sunstein: Thaler and Sunstein explore the concept of “nudging” and how small changes in the way choices are presented can significantly impact decision-making.
“Influence: The Psychology of Persuasion” by Robert B. Cialdini: This classic book explores the psychology behind why people say “yes” and how to apply these understandings in various aspects of life.
“Judgment under Uncertainty: Heuristics and Biases” by Amos Tversky and Daniel Kahneman (Science, 1974): This seminal paper introduces many cognitive biases and heuristics, laying the foundation for behavioral economics.
“Cognitive Bias in Forensic Pathology Decisions” by Itiel E. Dror and Greg Hampikian (Science & Justice, 2011): This article discusses cognitive biases in forensic pathology and their impact on decision-making.
“The Framing of Decisions and the Psychology of Choice” by Amos Tversky and Daniel Kahneman (Science, 1981): Another important work by Tversky and Kahneman, this paper explores how the framing of decisions influences choices.
“Beyond Reason: Using Emotions as You Negotiate” by Roger Fisher and Daniel Shapiro (2005): This book explores the role of emotions and cognitive biases in negotiation.
“The Neural Basis of Economic Decision-Making in the Ultimatum Game” by Alan G. Sanfey et al. (Science, 2003): This paper delves into the neuroscience behind economic decision-making and fairness.