Key Takeaways
1. Your mind operates through two systems: System 1 is fast, intuitive, and automatic; System 2 is slow, deliberate, and analytical. Most of our daily decisions are driven by System 1, which is efficient but prone to systematic errors. Understanding this dual-process model is key to recognizing when your intuition is leading you astray.
2. Cognitive biases are not random mistakes — they are predictable patterns of flawed thinking hardwired into our mental machinery. From anchoring to availability bias, these shortcuts served us well in ancestral environments but regularly misfire in modern complex decisions. Awareness alone doesn't eliminate them, but it can help you design better decision-making processes.
3. What You See Is All There Is (WYSIATI) explains our tendency to make judgments based only on available information, without considering what we don't know. We construct coherent stories from limited data and feel confident in them. The less information we have, the easier it is to build a convincing narrative — which is dangerously misleading.
4. We are far more loss-averse than gain-seeking. Losing $100 feels roughly twice as painful as gaining $100 feels pleasurable. This asymmetry shapes everything from investment decisions to negotiations. Loss aversion explains why people hold losing stocks too long and sell winners too early.
5. The experiencing self and the remembering self evaluate happiness very differently. The experiencing self lives in the moment; the remembering self constructs the story afterward. We make future decisions based on memories, not experiences — and memories are dominated by peak moments and endings, not duration.
6. Overconfidence is perhaps the most significant cognitive bias. Experts consistently overestimate what they know and underestimate uncertainty. The planning fallacy — the tendency to underestimate time, cost, and risk — is a direct consequence. The best corrective is to use base rates and outside views rather than relying on insider optimism.
7. Anchoring profoundly shapes our numerical estimates. When exposed to any number — even a random one — before making a judgment, our estimates are pulled toward that anchor. This affects everything from salary negotiations to courtroom sentencing. Being aware of anchors helps, but even experts are susceptible.
8. Regression to the mean is a statistical reality that we consistently misinterpret as causal. An exceptional performance is likely followed by a more average one, not because of some cause, but because extreme outcomes are statistically rare. Coaches who punish after poor performance and see improvement are witnessing regression, not the effect of punishment.
9. Framing effects demonstrate that the way a question or choice is presented dramatically changes decisions. People respond differently to '90% survival rate' versus '10% mortality rate,' even though the two describe the identical outcome. Rational agents should not be affected by framing, but humans almost always are.
10. Substitution is a core mechanism of intuitive judgment: when faced with a hard question, System 1 substitutes an easier one without you noticing. Asked 'How happy are you with your life?' your mind might answer 'What is my mood right now?' instead. Recognizing substitution is one of the most powerful tools for better thinking.
Concepts
System 1 and System 2
Two modes of thinking: System 1 is fast, automatic, and intuitive (recognizing faces, reading emotions). System 2 is slow, effortful, and logical (doing complex math, parallel parking in a tight spot).
Example
When you see '2 + 2,' the answer 4 comes instantly (System 1). When you see '17 × 24,' you need to consciously calculate (System 2). Reading a billboard while driving uses System 1; navigating an unfamiliar city requires System 2.
Anchoring Effect
The tendency for an initial piece of information (the 'anchor') to disproportionately influence subsequent judgments, even when the anchor is arbitrary or irrelevant.
Example
In one study, spinning a wheel of fortune before asking people to estimate the percentage of African countries in the UN significantly shifted their answers toward the random number. In real estate, the listing price anchors buyers' perception of a home's value regardless of actual market conditions.
Availability Heuristic
Judging the frequency or probability of an event based on how easily examples come to mind, rather than on actual statistical frequency.
Example
After seeing news coverage of plane crashes, people overestimate the danger of flying relative to driving, even though driving is statistically far more dangerous. People who know someone with cancer overestimate cancer prevalence compared to those who don't.
Loss Aversion
The psychological principle that losses loom larger than equivalent gains — typically about twice as powerful. People feel the pain of losing $100 more intensely than the pleasure of gaining $100.
Example
Investors hold on to losing stocks hoping to break even rather than cutting losses. A person offered a coin flip where they win $150 or lose $100 will often refuse, even though the expected value is positive, because the potential loss feels too painful.
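The arithmetic behind the refused coin flip can be made explicit. This is a minimal sketch, assuming a loss-aversion coefficient of about 2 (the rough factor cited above); the function names and the 50/50 framing are illustrative choices, not anything from a specific study.

```python
# Sketch of loss aversion in the win-$150 / lose-$100 coin flip.
# The loss-aversion coefficient of ~2 is the rough estimate cited
# in the text, not a universal constant.

def expected_value(win, lose, p_win=0.5):
    """Plain expected monetary value of the gamble, in dollars."""
    return p_win * win - (1 - p_win) * lose

def felt_value(win, lose, p_win=0.5, loss_aversion=2.0):
    """Psychological value: the potential loss weighted ~2x the gain."""
    return p_win * win - (1 - p_win) * loss_aversion * lose

print(expected_value(150, 100))  # +25.0: a rational agent accepts
print(felt_value(150, 100))      # -25.0: a loss-averse agent refuses
```

The gamble is objectively favorable (+$25 on average), yet once the loss is doubled in psychological weight it feels like a losing bet — which is why many people turn it down.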
WYSIATI (What You See Is All There Is)
The mind's tendency to construct the most coherent story possible from whatever information is currently available, without accounting for missing information.
Example
A hiring manager reads a glowing one-page resume and feels highly confident about a candidate, without considering the many relevant qualities that a resume cannot capture. Jurors form strong opinions based on the evidence presented, rarely considering what evidence might exist but wasn't shown.
Prospect Theory
A model of decision-making under risk that describes how people evaluate potential gains and losses asymmetrically, using a reference point rather than absolute outcomes.
Example
A person who gains $1,000 and then loses $500 feels worse than a person who simply gains $500, even though the net outcome is the same. Gamblers at a horse track take bigger risks in the last race of the day, trying to break even rather than accept a loss.
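The asymmetry can be sketched with the prospect-theory value function. The parameter values below (alpha ≈ 0.88, lambda ≈ 2.25) are the illustrative estimates Tversky and Kahneman published in 1992, not fixed constants, and the comparison of the two paths to $500 is a simplification that treats the gain and loss as separately evaluated events.

```python
# A sketch of the prospect-theory value function. Parameters are the
# Tversky & Kahneman (1992) estimates: alpha ~ 0.88, lambda ~ 2.25.

def value(x, alpha=0.88, lam=2.25):
    """Subjective value of a change x relative to the reference point."""
    if x >= 0:
        return x ** alpha            # gains: diminishing sensitivity
    return -lam * ((-x) ** alpha)    # losses: steeper, weighted by lambda

# Gaining $1,000 then losing $500, felt as two events, vs. gaining $500:
two_step = value(1000) + value(-500)
one_step = value(500)
print(two_step < one_step)  # True: same net outcome, worse feeling
```

Because losses are weighted more than twice as heavily, the $500 loss more than wipes out the good feeling of the $1,000 gain, even though both people end up $500 richer.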
The Peak-End Rule
People judge an experience largely based on how they felt at its most intense point (the peak) and at its end, rather than on the sum or average of every moment.
Example
A colonoscopy patient who experienced a less painful ending rated the overall procedure as less unpleasant than one who had a shorter but abruptly painful procedure. A vacation with one incredible day and a pleasant last day is remembered more fondly than a uniformly good but unremarkable trip.
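The rule can be approximated as the average of the worst moment and the last moment. The pain profiles below are made-up illustrations of the colonoscopy pattern, not data from the actual study, and the peak-end average is only one simple way to model the remembering self.

```python
# Sketch of the peak-end rule: remembered intensity approximated as the
# mean of the peak moment and the final moment. Profiles are invented.

def remembered(pain):
    """Peak-end approximation of the remembered experience."""
    return (max(pain) + pain[-1]) / 2

def total(pain):
    """Total experienced pain -- what a duration-weighted sum would track."""
    return sum(pain)

short_abrupt = [4, 6, 8]          # shorter, but ends at its worst
long_tapered = [4, 6, 8, 5, 2]    # more total pain, gentle ending

print(total(long_tapered) > total(short_abrupt))          # True
print(remembered(long_tapered) < remembered(short_abrupt))  # True
```

The longer procedure contains strictly more pain, yet its gentle ending makes the remembering self rate it as milder — duration barely registers.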
Planning Fallacy
The systematic tendency to underestimate the time, costs, and risks of future actions while overestimating their benefits, even when past experience suggests otherwise.
Example
The Sydney Opera House was expected to be completed in 1963 for $7 million; it was finished in 1973 for $102 million. Home renovations almost universally exceed their budgets and timelines. Students consistently predict they'll finish papers earlier than they actually do.
Regression to the Mean
The statistical phenomenon where extreme measurements tend to be followed by measurements closer to the average, purely due to random variation — not any causal intervention.
Example
A Sports Illustrated 'cover jinx' — athletes who appear on the cover after an exceptional season tend to perform worse the next year, not because of the cover, but because exceptional performance is statistically unlikely to repeat. A student who scores unusually high on one exam will likely score closer to their average on the next.
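The effect is easy to reproduce in a toy simulation, a sketch under the assumption that each score is a stable "talent" plus independent luck; the distributions and the top-5% cutoff are arbitrary illustrative choices.

```python
# Simulating regression to the mean: score = stable talent + random luck.
# Top first-round performers score closer to average in round two with
# no intervention at all.
import random

random.seed(0)
talent = [random.gauss(0, 1) for _ in range(10_000)]
round1 = [t + random.gauss(0, 1) for t in talent]
round2 = [t + random.gauss(0, 1) for t in talent]

# Select the top 5% of round-1 performers and compare their two rounds.
cutoff = sorted(round1, reverse=True)[len(round1) // 20]
top = [i for i in range(len(talent)) if round1[i] >= cutoff]

avg1 = sum(round1[i] for i in top) / len(top)
avg2 = sum(round2[i] for i in top) / len(top)
print(avg1 > avg2)  # True: the "exceptional" group falls back toward the mean
```

Selecting on an extreme round-1 score also selects for good luck, and luck does not repeat; the decline in round 2 needs no jinx, coach, or cover story to explain it.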
Framing Effect
The way information is presented (framed) significantly influences decisions and judgments, even when the underlying facts are identical.
Example
Patients told a surgery has a '90% survival rate' are more likely to agree to it than patients told it has a '10% mortality rate.' Ground beef labeled '80% lean' sells better than the same beef labeled '20% fat.'
Substitution Heuristic
When confronted with a difficult question, the mind unconsciously replaces it with a simpler, related question and answers that one instead.
Example
Asked 'Should I invest in this company's stock?' you might actually answer 'Do I like this company's products?' Asked 'How happy are you with your life these days?' your brain may substitute 'What is my mood right now?'
Endowment Effect
People value things more highly simply because they own them. Ownership itself creates a sense of attachment that inflates perceived worth.
Example
In experiments, people given a coffee mug demanded about twice as much to sell it as others were willing to pay to buy it. Homeowners often overvalue their property compared to market prices, partly because of emotional attachment to 'their' home.