
Superforecasting

The Art and Science of Prediction

Philip Tetlock and Dan Gardner, 2015
Social Science


Key Takeaways

  1.

    Accurate forecasting is not primarily a function of raw intelligence or credentials but of mindset and method. The best forecasters—whom the authors call “superforecasters”—combine open-mindedness, intellectual humility, and disciplined reasoning. They treat beliefs as hypotheses to be tested and updated rather than as identities to defend.

  2.

    Superforecasting is a skill that can be learned and improved with practice. Through structured training, feedback, and probabilistic thinking, ordinary individuals can dramatically outperform experts and intelligence analysts. Forecasting is less about brilliance and more about habits of thought.

  3.

    Thinking in probabilities rather than certainties is central to accurate prediction. Superforecasters avoid binary thinking and instead assign numerical probabilities to outcomes. This allows them to refine judgments incrementally as new evidence emerges.

  4.

    Updating beliefs in light of new information—Bayesian reasoning—is a core discipline of superforecasting. Rather than clinging to prior views, top forecasters adjust their estimates frequently and proportionally. Small, regular updates outperform dramatic reversals or stubborn consistency.

  5.

    Breaking complex problems into smaller, manageable components improves accuracy. Superforecasters decompose big, vague questions into specific sub-questions, estimate each part, and then synthesize the results. This structured analysis reduces cognitive overload and error.

  6.

    Superforecasters balance humility and decisiveness. They are cautious about overconfidence yet willing to make clear predictions. This blend—dubbed “confident humility”—allows them to act while remaining ready to revise their views.

  7.

    Team collaboration can enhance forecasting when structured properly. Diverse perspectives, constructive disagreement, and accountability improve collective accuracy. However, unstructured group dynamics can lead to groupthink and overconfidence.

  8.

    Tracking performance and receiving feedback is essential for improvement. Superforecasters keep score, measure calibration, and analyze their errors. Without feedback loops, it is difficult to refine predictive skill.

  9.

    The distinction between ‘foxes’ and ‘hedgehogs’ explains differences in forecasting accuracy. Foxes draw from many small ideas and adapt flexibly, while hedgehogs rely on one big theory. Fox-like thinking consistently outperforms rigid, single-framework approaches.

  10.

    While the future is inherently uncertain, disciplined probabilistic reasoning significantly improves our ability to anticipate events. Superforecasting does not eliminate uncertainty but manages it more effectively. Better forecasts lead to better decisions in government, business, and personal life.


Concepts

Superforecasters

Individuals who consistently produce highly accurate probabilistic predictions by applying disciplined reasoning and updating beliefs with new evidence.

Example

- Participants in the Good Judgment Project who outperformed intelligence analysts
- Forecasters who regularly recalibrate probabilities as new data appears

Probabilistic Thinking

Expressing judgments in terms of numerical probabilities rather than categorical yes-or-no statements to reflect uncertainty more accurately.

Example

- Assigning a 65% chance to a policy passing instead of saying it ‘will pass’
- Adjusting a forecast from 40% to 55% as new data emerges

Bayesian Updating

A method of revising beliefs incrementally as new evidence becomes available, adjusting prior probabilities to form more accurate estimates.

Example

- Lowering the estimated chance of a recession after strong economic data
- Increasing conflict risk predictions after new military movements
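
The recession example can be written out as a one-line application of Bayes' rule. The numbers below are invented for illustration, not figures from the book:

```python
def bayes_update(prior, p_evidence_given_h, p_evidence_given_not_h):
    """Return P(H | evidence) via Bayes' rule."""
    numerator = prior * p_evidence_given_h
    denominator = numerator + (1 - prior) * p_evidence_given_not_h
    return numerator / denominator

# Hypothetical numbers: start at 40% recession risk; strong jobs data
# is assumed twice as likely in a no-recession world as in a recession world.
posterior = bayes_update(prior=0.40,
                         p_evidence_given_h=0.30,
                         p_evidence_given_not_h=0.60)
print(round(posterior, 2))  # 0.25 — the estimate drops, but not to zero
```

Note the proportional adjustment: the evidence moves the estimate from 40% to 25% rather than flipping it to "no recession," which is exactly the small-update discipline described in takeaway 4.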

Calibration

The alignment between predicted probabilities and actual outcomes, indicating how well a forecaster’s confidence matches reality.

Example

- Events predicted with 70% confidence occurring roughly 70% of the time
- Tracking forecasting scores to measure accuracy over time
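
A calibration check is easy to run over a track record of (probability, outcome) pairs: group forecasts into bins and compare each bin's stated probability to its observed hit rate. The record below is hypothetical:

```python
from collections import defaultdict

def calibration_table(forecasts):
    """forecasts: list of (predicted_probability, outcome) pairs,
    where outcome is 1 if the event happened, else 0.
    Groups forecasts into 10%-wide bins and reports the observed
    frequency in each bin."""
    bins = defaultdict(list)
    for p, outcome in forecasts:
        bins[round(p, 1)].append(outcome)
    return {p: sum(o) / len(o) for p, o in sorted(bins.items())}

# Hypothetical track record: events called at 70% should land near 70%.
record = [(0.7, 1), (0.7, 1), (0.7, 0), (0.7, 1), (0.7, 1),
          (0.7, 1), (0.7, 0), (0.7, 1), (0.7, 0), (0.7, 1)]
print(calibration_table(record))  # {0.7: 0.7} — well calibrated
```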

Fox vs. Hedgehog Thinking

A framework distinguishing flexible, eclectic thinkers (foxes) from those who rely on a single overarching theory (hedgehogs).

Example

- A fox integrating economic, political, and cultural data
- A hedgehog explaining all global events through one ideological lens

Decomposition

Breaking complex forecasting questions into smaller, more manageable components to improve clarity and accuracy.

Example

- Estimating election outcomes by analyzing turnout, demographics, and economic conditions separately
- Assessing war risk by evaluating alliances, resources, and leadership incentives
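
One simple way to synthesize decomposed estimates, assuming the sub-questions are sequential and roughly independent, is to multiply them. The treaty scenario and every number below are invented for illustration:

```python
# Hypothetical sub-questions for "will the treaty be ratified this year?"
sub_estimates = {
    "signed by both governments": 0.8,
    "clears legislature A":       0.7,
    "clears legislature B":       0.6,
}

# Synthesize: if each step must happen in sequence and the steps are
# roughly independent, the joint probability is the product of the parts.
joint = 1.0
for step, p in sub_estimates.items():
    joint *= p

print(f"overall estimate: {joint:.2f}")  # 0.34
```

Each sub-estimate is easier to defend than the vague headline question, and the product makes the compounding of uncertainty explicit — three fairly likely steps still yield a coin-flip-or-worse overall chance.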

Fermi Estimation

A technique of making rough, structured estimates by starting with known quantities and refining step by step.

Example

- Estimating the number of hospitals in a country using population data
- Calculating market size by multiplying customer segments and average spending
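
The market-size example can be laid out as a Fermi chain. Every input is a hypothetical round number, refined step by step rather than guessed as a whole:

```python
# Fermi estimate of an annual market size (all inputs are assumptions).
population   = 50_000_000          # country population
households   = population / 2.5    # assume ~2.5 people per household
target_share = 0.10                # fraction of households in the segment
annual_spend = 200                 # average spend per household, in dollars

market_size = households * target_share * annual_spend
print(f"${market_size:,.0f}")  # $400,000,000
```

The point is not the final figure but the structure: each factor can be checked or revised independently, and an error in one step is visible rather than buried in a single gut guess.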

Confident Humility

The mindset of being decisive enough to make forecasts while remaining aware of one’s fallibility and open to revision.

Example

- Making a clear probability estimate but welcoming contradictory evidence
- Admitting past errors and adjusting models accordingly

Feedback Loops

Systematic tracking of prediction outcomes to learn from mistakes and refine forecasting skill.

Example

- Reviewing past quarterly forecasts to assess accuracy
- Using scoring systems like Brier scores to evaluate performance

Brier Score

A mathematical measure used to assess the accuracy of probabilistic forecasts by comparing predicted probabilities to actual outcomes.

Example

- Receiving a lower Brier score for well-calibrated forecasts
- Comparing forecasters based on cumulative Brier scores
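
For binary events, the Brier score is the mean squared difference between predicted probabilities and outcomes: 0 is perfect, 1 is perfectly wrong, and always answering 50% scores 0.25. The two forecasters below are invented:

```python
def brier_score(forecasts):
    """forecasts: list of (predicted_probability, outcome) pairs,
    where outcome is 1 if the event happened, else 0."""
    return sum((p - o) ** 2 for p, o in forecasts) / len(forecasts)

# Two hypothetical forecasters scored on the same four events:
confident_and_right = [(0.9, 1), (0.8, 1), (0.1, 0), (0.2, 0)]
hedging_at_fifty    = [(0.5, 1), (0.5, 1), (0.5, 0), (0.5, 0)]

print(round(brier_score(confident_and_right), 3))  # 0.025 — lower is better
print(round(brier_score(hedging_at_fifty), 3))     # 0.25
```

Because the penalty is squared, the score rewards forecasts that are both confident and correct while punishing confident misses heavily — which is what pushes forecasters toward honest calibration rather than bold guessing.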

Scope Sensitivity

Adjusting forecasts appropriately based on the scale and specifics of the question rather than making overly broad generalizations.

Example

- Differentiating between short-term and long-term economic predictions
- Recognizing that regional conflicts differ from global wars

Constructive Teaming

Structured collaboration where diverse viewpoints and respectful disagreement improve forecasting accuracy.

Example

- Teams debating assumptions before finalizing probability estimates
- Online forecasting platforms encouraging evidence-based discussion