Embracing the Skill of Superforecasting
“Blink” versus “think” - An Open Mind Is Key to Making Better Decisions
How to make better decisions
How do we build a civilization that not only persists but truly thrives? It is a question as old as philosophy and yet newly urgent in our age of rapid technological upheaval, where information flows faster than the speed of thought and the stakes of our decisions have never been higher.
Tyler Cowen, in his book Stubborn Attachments, delivers a sober, intellectually unyielding inquiry into the ethical and economic imperatives of our age. His argument calls to mind John Maynard Keynes’s famous observation:
“The difficulty lies not so much in developing new ideas as in escaping from old ones.”
The old ideas are those that keep us stuck in complacency. The new ideas amount to a mindset that fosters the skill of ‘shaping’ the future, not through wishful thinking or mysticism, but through a discipline grounded in rigorous analysis.
This is the kind of thinking championed by Philip Tetlock, whose seminal works, Expert Political Judgment and Superforecasting: The Art and Science of Prediction, dismantle the comforting illusions of conventional expertise. As Tetlock wryly observed, many experts perform “little better than dart-throwing chimpanzees.”
When I first heard of Tetlock’s work via Daniel Kahneman, the analogy of the dart-throwing chimps jolted me awake, and I embraced probabilistic thinking.
Examples of bad expert predictions abound, such as the one Steve Ballmer, then CEO of Microsoft, made in 2007: “There’s no chance that the iPhone is going to get any significant market share. No chance.”
If the renowned pundits of our time are so fallible, what hope do we have in navigating the complexities of our world?
The Good Judgment Project
To answer this question Tetlock co-led the Good Judgment Project (GJP), a large-scale, multiyear prediction tournament sponsored by the U.S. intelligence community. Its findings were nothing short of revolutionary. The best forecasters, the so-called superforecasters, consistently outperformed both intelligence analysts with access to classified information and top political pundits. They did so by honing a set of cognitive and methodological habits that allowed them to improve their forecasting ability over time.
The implications were staggering. The traditional model of expertise, reliant on credentials, institutional prestige, and the gravitas of confident proclamations, was fundamentally flawed. Instead, forecasting accuracy correlated with intellectual curiosity, probabilistic thinking, and a relentless commitment to updating one’s beliefs when new information arose. A superforecaster knows what they don’t know, and is curious to redress that.
Thinking in Probabilities
At the core of superforecasting is the ability to think in probabilities, employing techniques that may sound dauntingly complex but are really quite simple, such as Bayesian reasoning and scenario planning, to systematically break complex problems down into more manageable components.
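To make the mechanics concrete, here is a minimal sketch in Python of the Bayesian update at the heart of this habit (the numbers are illustrative assumptions of mine, not drawn from any real forecast): a prior belief is revised according to how much more likely the new evidence is if the hypothesis is true than if it is false.

```python
def bayes_update(prior, p_evidence_if_true, p_evidence_if_false):
    """Posterior probability of a hypothesis after seeing one piece of evidence.

    prior: probability assigned to the hypothesis before the evidence arrived
    p_evidence_if_true: how likely this evidence is if the hypothesis is true
    p_evidence_if_false: how likely this evidence is if the hypothesis is false
    """
    numerator = p_evidence_if_true * prior
    denominator = numerator + p_evidence_if_false * (1 - prior)
    return numerator / denominator

# Illustrative numbers: a forecaster starts at 30% and then sees evidence that is
# three times as likely if the hypothesis is true as if it is false.
posterior = bayes_update(prior=0.30, p_evidence_if_true=0.60, p_evidence_if_false=0.20)
print(f"Updated probability: {posterior:.0%}")  # ~56%
```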
It is a mental habit that is as unnatural as it is essential. Our minds crave certainty, narrative coherence, and the seductive simplicity of binary thinking, right or wrong, success or failure, hero or villain. But the real world, particularly in domains as volatile as geopolitics or economic forecasting, defies such categorization.
A good forecaster, as Tetlock and his colleagues found, does not say, “This will happen” or “That won’t happen.” Instead, they assign probabilities, break problems into smaller pieces, and remain constantly aware of their own cognitive biases.
One striking example from the GJP was a prediction question about whether Greece would exit the Eurozone within a given time frame. Superforecasters didn’t just consult economic reports; they studied historical precedents, political dynamics, and the incentives of key actors. By breaking the question down into components, such as the financial pressures on Greece, the political will of the European Union, historical patterns of currency-union dissolutions, and the economic incentives for key stakeholders, they produced far more accurate estimates than conventional analysts. Experts often put the probability that Greece would stay in the Eurozone at only 50-60%, fuelling volatility and uncertainty, while GJP superforecasters converged on sharper estimates, frequently placing the probability of no exit within the given time frame at 80-90% and adjusting as new information arrived.
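To show what that kind of decomposition can look like in practice, here is a small Python sketch. The sub-questions and the numbers are my own illustrative assumptions, not the actual model any GJP forecaster used; the point is only that multiplying a chain of conditional probabilities for the components yields an overall estimate, which can then be revised as each component changes.

```python
# Hypothetical decomposition of "Will Greece exit the Eurozone in the time frame?"
# All sub-questions and probabilities below are illustrative assumptions.
p_breaking_point   = 0.50  # financial pressure forces a decisive crisis moment
p_eu_withholds_aid = 0.30  # given a crisis, the EU refuses any further deal
p_greece_exits     = 0.60  # given no deal, Greece actually chooses to leave

# In this simple chain of conditionals, an exit requires all three steps to occur.
p_exit = p_breaking_point * p_eu_withholds_aid * p_greece_exits
print(f"P(exit) ~ {p_exit:.0%}, so P(no exit) ~ {1 - p_exit:.0%}")  # ~9% and ~91%
```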
Tetlock’s approach reminds us that the past is not merely a repository of data but a dynamic “base rate” that informs our expectations. To paraphrase the great Douglas Hofstadter,
“Understanding is not a one-time event; it is a continuous process of refinement.”
The iterative updating of our forecasts becomes a form of perpetual learning, a careful comparison of expectation against reality that hones our judgment over time.
Tetlock’s research identifies several key attributes that distinguish superforecasters:
Active Open-Mindedness – They embrace complexity, seek out diverse viewpoints, and are willing to revise their judgments in light of new evidence.
Bayesian Thinking – Instead of making binary predictions, they assign probabilities and update them incrementally based on incoming data.
Falsifiability and Self-Correction – They track the accuracy of their forecasts and adjust their methods accordingly, avoiding overconfidence.
Granular Thinking and Decomposition – They break down broad questions into smaller, more measurable components, allowing for more precise analysis.
A Growth Mindset – They view mistakes as opportunities for improvement rather than failures to be ignored or rationalized.
Image from Superforecasting: How to Upgrade Your Company’s Judgment
Instead of staking our future on categorical certainties, superforecasters assign probabilities, 60%, 70%, even 10%, to competing outcomes. This is not an exercise in indecision but rather a disciplined acknowledgment of the inherent uncertainty that governs our world. True wisdom lies not in our certainties but in our willingness to question them.
I strongly believe in the methodology of thinking in probabilities: it reduces stress, improves decision accuracy, and leads to more stability in life.
The Accountability Deficit
In Expert Political Judgment, Tetlock laid bare an uncomfortable truth: most political and economic predictions go untested. The pundits who dominate television and editorial pages are rarely held accountable for their past forecasts. Indeed, the incentives in media favor confident, dramatic predictions over careful, probabilistic assessments, possibly because of the halo effect. The more outrageous the claim, the more attention it garners, accuracy be damned.
Contrast this with the GJP’s rigorous methodology. Every forecast was time-stamped, recorded, and later scored against actual outcomes. This allowed for empirical assessment: who was consistently good at predicting events, and what traits did they share? The results upended conventional wisdom. Credentialed experts with ideological commitments performed poorly, while open-minded generalists with a habit of constantly revising their views did exceptionally well.
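For a concrete sense of what “scored against actual outcomes” means, here is a short Python sketch of the Brier score, the accuracy metric Tetlock’s tournaments relied on: the mean squared difference between the stated probability and what actually happened. The forecast data below are made up for illustration.

```python
# Each pair is (probability assigned to the event, outcome: 1 = happened, 0 = did not).
# The data are invented for illustration; lower Brier scores are better.
forecasts = [(0.90, 1), (0.20, 0), (0.70, 1), (0.60, 0)]

brier = sum((p - outcome) ** 2 for p, outcome in forecasts) / len(forecasts)
print(f"Brier score: {brier:.3f}")  # 0.0 is perfect; always saying 50% scores 0.25
```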
Overcoming Systematic Errors
The stakes of bad forecasting are not merely academic. When policymakers and business leaders rely on flawed predictions, the consequences can be catastrophic. The 2003 Iraq War, for example, was justified in part by intelligence assessments that Saddam Hussein possessed weapons of mass destruction. The failure to accurately assess the probability of this claim led to a costly and destabilizing war that continues to reverberate decades later.
The tools exist to help everyone. Studies show that even a brief cognitive-debiasing training module, such as the CHAMPS KNOW training tested in geopolitical forecasting tournaments, can improve judgmental accuracy by 6 to 11%.
Yet, despite its promise, superforecasting remains largely a niche practice. Some governments and corporations have begun experimenting with structured analytic techniques, such as red-teaming and probabilistic forecasting workshops, to integrate these skills into decision-making.
Additionally, prediction markets like Polymarket have demonstrated how crowd-sourced forecasting can improve political and economic predictions. However, much work remains to be done in institutionalizing these practices at scale. Governments and corporations still cling to outdated models of expertise, often placing unwarranted trust in confident, untested assertions. Breaking this cycle requires more than intellectual curiosity; it requires institutional reform. Although, given Polymarket’s performance in the recent US presidential election, that may be about to change!
Rational Forecasting
If we are serious about building a civilization that thrives, we must rewire our institutions (and our own minds) to reward accuracy over rhetoric, probability over certainty. We must demand that those who claim to know the future provide evidence of their past performance and explain how they arrive at their latest forecasts. As Tetlock says:
“What experts think matters far less than how they think.”
Tetlock’s work offers a blueprint for this transformation. It is not enough to admire superforecasters from a distance; we must cultivate their habits in ourselves and our organizations. This means embracing probabilistic thinking, rigorously testing our beliefs, and above all, being willing to say, “I might be wrong.”
Superforecasting is not just a skill; it is an ethos, one that individuals and organizations can cultivate by adopting structured analytic techniques, tracking prediction accuracy, and fostering a culture of intellectual curiosity.
To build a world that truly thrives, we must prioritize evidence-based decision-making and incentivize probabilistic reasoning in governance, business, and media. The question is not whether we can afford to embrace superforecasting, but whether we can afford not to. And in a world where the cost of bad predictions grows ever higher, it may well be the most important ethos of all.
Stay curious
Colin
Further reading:
Expert Political Judgment by Philip E. Tetlock
Superforecasting: The Art and Science of Prediction by Philip E. Tetlock and Dan Gardner
“Superforecasting: How to Upgrade Your Company’s Judgment,” Harvard Business Review, by Paul J. H. Schoemaker and Philip E. Tetlock
Main image from this Knowledge at Wharton video