Random the Book

Random the Book: Matt Ballantine and Nick Drage's experiment in serendipity and chance.


Where do you see yourself in five years’ time?

Questions for you:

  • How confident do you feel about the predictions you make for your own life? Does that confidence feel justified when you look back at the predictions you made five years ago?
  • The page makes the point that gentle change creeps up on us – which is easy to appreciate abstractly, but has much more impact when you experience it. Will you actually follow the page’s advice and make at least one specific prediction? Will you do that right now?
  • Does it feel different making predictions about yourself, as opposed to making predictions about others? Why?
  • How would it change your decision-making process if you genuinely accepted that the future is substantially unpredictable, even at a personal level?

Organisational applications:

The gap between strategic confidence and actual predictive accuracy: The story’s mechanism, writing a prediction and returning to it later, is discomfiting precisely because it makes the gap between felt certainty and actual accuracy visible. Organisations do the equivalent of this every time they produce a three or five-year strategy, but rarely return to it honestly enough to learn from the divergence.

The question worth asking is not whether the strategy turned out to be right, but whether the confidence with which it was articulated was ever warranted. Systematically reviewing past predictions, not to assign blame but to calibrate future confidence, is one of the more useful things a leadership team can do. It tends to produce more honest probability ranges, more explicit assumptions, and a greater willingness to update as conditions change.

Hindsight bias as an organisational learning problem: The story introduces “temporal parallax”: the way outcomes that were genuinely uncertain at the time feel obvious and inevitable in retrospect. This is hindsight bias, and it is one of the more corrosive forces in organisational learning. Post-project reviews conducted after an outcome is known will reliably underestimate the uncertainty at the point of decision.

The decisions that led to success will be reconstructed as wise; those that led to failure will be reconstructed as avoidable. Neither reconstruction is accurate. One practical response is to require decisions to be logged with their reasoning and uncertainty levels at the time they are made, so that reviews work from the contemporaneous record rather than from memory shaped by what subsequently happened.

Using pre-mortems to counteract over-confident planning: If the story’s exercise is uncomfortable because it reveals how wrong confident predictions turn out to be, the organisational corollary is to build that discomfort into the planning process rather than waiting for it to arrive retrospectively.

A pre-mortem, in which a team imagines that a plan has failed and works backwards to explain why, is one method. It temporarily suspends the optimism that attaches to a plan once it has been agreed upon, revealing the random and contingent factors that the plan assumes away. Teams that run pre-mortems consistently produce more realistic risk registers and more robust contingency thinking than those that rely on forward-looking confidence alone.

Further reading

On forecasting and the limits of prediction:

Superforecasting: The Art and Science of Prediction by Philip Tetlock and Dan Gardner. Tetlock’s research on expert prediction accuracy is the most rigorous available account of how poorly even well-informed people predict future events, and what the small minority of accurate forecasters do differently, including maintaining explicit probability estimates and updating them as evidence arrives.

The Black Swan: The Impact of the Highly Improbable by Nassim Nicholas Taleb. Taleb’s argument that the most consequential events are by definition those that were not anticipated is directly relevant to the five-year prediction exercise: the things that will most change your life in the next five years are probably not on your list.

On hindsight bias and organisational learning:

Thinking, Fast and Slow by Daniel Kahneman. Kahneman’s treatment of hindsight bias and the illusion of understanding is the clearest account of why temporal parallax occurs, and why it makes honest learning from experience so difficult without deliberate structural effort.

Noise: A Flaw in Human Judgement by Daniel Kahneman, Olivier Sibony and Cass R. Sunstein. Extends the hindsight bias argument into organisational decision-making contexts, with practical suggestions for reducing the distortion it introduces into performance reviews and post-project analysis.

On planning under uncertainty:

Thinking in Bets: Making Smarter Decisions When You Don’t Have All the Facts by Annie Duke. Duke’s framework for treating decisions as probability estimates rather than binary right/wrong choices is a practical complement to the story’s exercise, providing a vocabulary for the kind of honest uncertainty-logging that makes retrospective review meaningful.

The Future and Its Enemies by Virginia Postrel. A broader argument about the relationship between prediction, control, and uncertainty, useful for contextualising why the discomfort the story’s exercise produces is not simply a personal failing but a structural feature of operating in complex systems.

About the image

The Elizabeth Tower, commonly but incorrectly called Big Ben (that’s the bell inside), at the Palace of Westminster, London, during its recent restoration.

Photo montage and photo by Matt Ballantine, 2026