Image of me (blue jacket) at Margherita Hut, the highest ‘hotel’ in Europe, at 4,554 metres. One of my better decisions :-)
Part 2 of a series on making better forecasts (Part 1 is here)
What distinguishes individuals who make consistently accurate forecasts is not education or expertise but how they think.
Decision-making inevitably involves forecasting. Whether you work in HR, business development, IT, logistics, or policy making, accurately foreseeing possible outcomes is a critical component of intelligent decision-making. It relies on the disciplined estimation of what’s likely, what’s possible, and what’s implausible.
Most things begin with a question: “What do you think?” “What could happen?” “What do you really know?”
The best decision maker does not answer quickly. She doesn’t reach for a narrative or lean on intuition masquerading as insight. She resists outward performance and praise before it’s earned. Instead, she steps into the necessary work of thinking, measuring, estimating, doubting. Because she understands something most do not: our thoughts, decisions and choices about the future shape our world. And most of us are guessing blindly.
This is why, many years ago, I joined the Good Judgment Project (GJP), and why I still participate in Good Judgment Open. I also encourage many of my students to learn the methods of superforecasting, which are really quite simple. Nobel Prize winner Danny Kahneman recommended the BIN method, built on GJP data and set out in “Bias, Information, Noise: The BIN Model of Forecasting.”
A Brief History of Being Wrong
As I wrote a short while ago, in the mid-1980s, Philip Tetlock undertook what might be the most quietly radical project in the history of political science: he asked experts to put numbers on their predictions, tracked them over years, and discovered they were barely better than random chance. Many were worse. The more famous the expert, the more confidently wrong they were.
This wasn’t just about forecasting; it was about intellectual hygiene, a rigorous practice of holding beliefs accountable to reality. Experts, Tetlock found, rarely revisited their predictions. They explained away their failures and clung to explanations. What emerged from this landscape of motivated reasoning was a counterpoint: a small number of people who got it right more often than chance and, more impressively, who knew how likely they were to be wrong. Superforecasters, as Tetlock called them, rely less on subjective judgment and are less susceptible to common cognitive biases, because they synthesise a wide range of information and viewpoints.
Learning to Predict Like Fermi
To understand how these individuals think, you have to go back to Enrico Fermi, standing in the New Mexico desert, estimating the yield of the first nuclear test by observing how far a sheet of paper drifted in the explosion's shockwave. Fermi wasn’t seeking perfection. He was trying to be less wrong. He broke the unknowable into a sequence of smaller, more honest admissions: how many atoms, how much pressure, how fast the blast might travel.
This habit, known as Fermi estimation, is not about solving the problem in front of you, nor about mathematical equations. It’s about asking a better question. It means separating what can be approximated from what cannot. It means examining the data beneath our assumptions and dragging vague intuition into a format that can be tested, challenged, improved.
Superforecasters operate with this same ethic. When asked if China and Vietnam will clash militarily in the next year, they begin not with theory, but with precedent. How often has it happened before? What would need to change? What signals are emerging? What counterforces exist? They test these pieces against history, data, probability. And when they deliver an answer, it comes with refined probabilities expressed as percentages, not because precision is fashionable, but because it is accountable.
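To make that outside-view habit concrete, here is a minimal sketch in Python. Everything in it, the historical counts and the signal adjustments, is invented for illustration; the point is the structure: anchor on the base rate first, then adjust for current signals.

```python
# Outside view first: how often have comparable clashes occurred?
# All counts and adjustments below are invented for illustration.
clash_years = 3        # hypothetical: years with a military clash
years_observed = 45    # hypothetical: comparable years in the record
base_rate = clash_years / years_observed  # roughly 0.07

# Inside view second: nudge the base rate for current signals.
# Stating each adjustment explicitly makes it challengeable.
adjustments = {
    "naval incidents rising": +0.03,
    "active diplomatic talks": -0.02,
}
forecast = base_rate + sum(adjustments.values())
forecast = min(max(forecast, 0.0), 1.0)  # clamp to a valid probability

print(f"Base rate: {base_rate:.0%} -> adjusted forecast: {forecast:.0%}")
```

The discipline is in the ordering: the precedent sets the anchor, and every departure from it has to be named and justified.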
It Can Be Learned
This ability to dissect complexity is not innate. It can be trained. And the intellectual habit it fosters, separating the knowable from the unknowable, is foundational to becoming a better forecaster. This learnability is rooted in a specific style of thinking, as Tetlock notes:
“Broadly speaking, superforecasting demands thinking that is open-minded, careful, curious, and — above all — self-critical. It also demands focus.”
The principles behind Fermi estimation offer a particularly clear window into this learnability. This method emphasizes breaking large, amorphous questions into smaller, tractable ones. As Tetlock and Gardner illustrate, estimating how many piano tuners work in Chicago is not about knowing the answer, it's about knowing what would allow you to arrive at a plausible estimate: how many pianos there are, how often they’re tuned, how long a tuning takes, and how many hours a year a piano tuner typically works. Each of these sub-questions introduces structure, boundaries, and visibility. The fog doesn’t disappear, but you learn to see farther into it.
The uncomfortable truth is that most of us are not trained to think this way. We are trained to have views, to express them fluently, to defend them with cleverness and conviction. But the methods that distinguish superforecasters are not natural gifts. They are cultivated habits: wide reading, frequent updating, respect for base rates, attention to feedback. None require genius. All require effort.
IQ, Tetlock and Gardner found, helps. But only up to a point. Beyond that, what matters is style of thought. Open-mindedness. Numerical comfort. A willingness to quantify beliefs and revise them without ego. The real marker of potential? A growth mindset, the belief that you can get better at reasoning if you're willing to put in the work.
The forecasting tournaments made this clear. Ordinary people, with no access to classified information, outperformed professional intelligence analysts. Why? Because they were willing to do the slow, recursive work of thinking.
The Architecture of Learning
Start with calibration. Assign a number to your belief. Not just a shrug or a “probably,” but 70% chance the legislation passes. 30% the ceasefire holds. 85% the merger goes through. 20% the US pulls out of NATO. These numbers aren’t pretentious artifacts. They’re commitments. Then check back regularly as new evidence arrives. Were you right? If not, by how much? Do this for a month. Then a year. Patterns emerge. You find where you are overconfident, where you underreact. Feedback sharpens the mind.
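One simple way to run this feedback loop is to log every forecast with its probability and score it once the outcome is known. The standard yardstick is the Brier score, the mean squared gap between stated probability and what actually happened. A small sketch in Python, with the forecasts invented for illustration:

```python
# Score probabilistic forecasts with the Brier score:
# the mean of (probability - outcome)^2, where outcome is 1 if the event
# happened and 0 if it did not. Lower is better; always saying 50% scores 0.25.

forecasts = [
    # (question, stated probability, outcome), all invented for illustration
    ("Legislation passes",   0.70, 1),
    ("Ceasefire holds",      0.30, 0),
    ("Merger goes through",  0.85, 1),
    ("US pulls out of NATO", 0.20, 0),
]

brier = sum((p - outcome) ** 2 for _, p, outcome in forecasts) / len(forecasts)
print(f"Brier score: {brier:.3f}")  # 0.061 on these four forecasts
```

Kept over months, a log like this shows exactly where your 70%s behave like 50%s.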
Decomposition is a second essential habit, one too few of us use. We need to learn to break the question down. Not because smaller is easier, but because smaller is more honest. Instead of asking, “Will the global economy recover next year?” ask: What is the historical recovery rate after comparable downturns? What fiscal interventions are on the table? What political risks might intervene? The point is not to predict with certainty, but to reduce the noise.
Tetlock and Gardner write in their book,
“The kind of thinking that produces superior judgment does not come effortlessly. Only the determined can deliver it reasonably consistently, which is why our analyses have consistently found commitment to self-improvement to be the strongest predictor of performance.”
To make the decomposition point concrete, take the piano tuner example: First, estimate the number of households in Chicago, then the proportion that might own a piano. Assume how often pianos are tuned and how long each tuning takes. Then determine how many hours a tuner works per year. Divide the total tuning hours by annual working hours, and you’ve got a working estimate. The process doesn’t make the answer correct. It makes it inspectable.
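Written out as arithmetic, the same steps look like this. Every input below is my own rough assumption, not a figure from the book; the value of the exercise is that each number sits in plain view, ready to be challenged.

```python
# Fermi estimate of piano tuners in Chicago.
# Every input is an assumption; the output is only as honest as they are.

households = 2_500_000         # rough guess at Chicago-area households
piano_share = 1 / 20           # assume 1 in 20 households owns a piano
tunings_per_year = 1           # assume each piano is tuned about once a year
hours_per_tuning = 2           # including travel between jobs
tuner_hours_per_year = 1_600   # roughly 40 hours a week, 40 working weeks

total_tuning_hours = (households * piano_share
                      * tunings_per_year * hours_per_tuning)
tuners = total_tuning_hours / tuner_hours_per_year

print(f"Estimated piano tuners: {tuners:.0f}")  # about 156 with these inputs
```

Disagree with the answer? Good. Now you know exactly which assumption to attack.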
Third, you learn to update. This, perhaps, is the hardest. It means letting go. Letting go of cherished views. Letting go of predictions you wanted to believe. Letting go of being right the first time. Superforecasters do not cling. They shift. Not out of indecision, but because the world is shifting, too.
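Updating has a mechanical core: Bayes’ rule. Start with your prior, ask how likely the new evidence would be if your view were right versus wrong, and let that ratio move you. A minimal sketch, with the numbers invented for illustration:

```python
def bayes_update(prior, p_evidence_if_true, p_evidence_if_false):
    """Posterior probability of a claim after observing a piece of evidence."""
    numerator = prior * p_evidence_if_true
    return numerator / (numerator + (1 - prior) * p_evidence_if_false)

# Invented example: you give a ceasefire a 30% chance of holding.
# A report of renewed shelling arrives: unlikely if the ceasefire is
# holding (10%), likely if it is collapsing (80%).
prior = 0.30
posterior = bayes_update(prior, p_evidence_if_true=0.10, p_evidence_if_false=0.80)
print(f"Ceasefire holds: {prior:.0%} -> {posterior:.0%}")  # 30% -> 5%
```

The formula does not decide for you; it simply forces the question every superforecaster asks of new evidence: how surprised would I be by this if I were right?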
Why You Should Learn It
Because guessing blindly has consequences. Because decisions in business, in policy, in everyday life are bets on the future, and bad bets cost relationships, dollars, trust. A marketing director trying to predict consumer behavior, a fund manager weighing the risk of a default, a parent planning for a child's education, each is, in essence, forecasting. Learning how to do it better isn’t a luxury. It’s a responsibility.
Because humility without structure becomes paralysis, and confidence without calibration becomes recklessness.
Because in a world drowning in opinion, the ability to think probabilistically, incrementally, and with respect for evidence is not just rare, it is indispensable.
And yes, critics will argue forecasting is inherently unreliable, that we’re better off trusting instincts honed by experience. But instincts, when untested, are simply stories. Superforecasting doesn’t reject intuition; it disciplines it. It treats hunches as inputs, not conclusions.
Superforecasting is not a matter of personality. It is a matter of practice. And if it can be practiced, it can be learned. Patiently. With mistakes. With determination. Because learning to become less wrong over time is the closest thing we have to everyday stability.
Stay curious,
Colin
Useful resources:
The Art and Science of Superforecasting Video
'Superforecasting': The people that predict the future – BBC REEL
A Short Course in Superforecasting - Philip Tetlock: An EDGE Master Class
“We are trained to have views, to express them fluently, to defend them with cleverness and conviction.” I think this sums up most of the world. We operate on the god of opinion or the god of society and when we come into power, in small or large ways, we are then in danger of lording it over others.
Interesting that forecasting can be much improved by a logical technique, especially in an unpredictable and volatile world of 'discrete' events. Having had deeply-held world-views shattered twice over the years - requiring a complete rethink from scratch - I remain open to new ideas, with caution (healthy doubt) as my byword, since risk is an inevitable part of life. Total surety (or security) is when you're dead.