Random the Book

Random the Book: Matt Ballantine and Nick Drage's experiment in serendipity and chance.


When does noise become the news?

Questions for you:

  • Think of the last time your organisation changed something in response to a data movement. How confident are you that the movement was a genuine signal rather than noise?
  • Where in your organisation is there the most pressure to produce explanations for outcomes, regardless of whether those outcomes are actually explainable?
  • What would it take for your organisation to respond to a piece of data by saying “this is probably just noise” rather than launching an investigation or intervention?

Organisational applications:

The explanation imperative and its costs: The story identifies a structural problem: the news machine must tell stories, so every observable movement gets a narrative attached to it. Organisations have an equivalent pressure. A sales dip, a customer satisfaction score that moved two points, a quarter where attrition was slightly higher than usual — each of these tends to trigger analysis, meetings, and interventions, regardless of whether the movement exceeds the normal variation in the underlying system.

These interventions are not without cost: those based on noise consume resources, introduce new variables that make genuine signals harder to detect, and foster a culture in which every metric movement is treated as a direct consequence of someone’s actions. Building explicit tolerance bands around key metrics, and treating movements within those bands as requiring no explanation, is a structural response to this imperative.

Zooming out as a deliberate analytical practice: The story contrasts daily news, which deals in blips, with historians, who look for long arcs. Organisations have the same choice about what time horizon they use to evaluate their data. Weekly or monthly reporting cycles create exactly the conditions the story describes: enough granularity to see the noise clearly, but not enough context to distinguish it from the signal.

Supplementing short-cycle reporting with explicit longer-horizon views — rolling twelve-month trends, year-on-year comparisons, multi-year trajectories — does not eliminate noise but changes the ratio of signal to noise in what gets discussed. The practical question is not which frequency is correct but whether the organisation has a mechanism for zooming out when the short-cycle view is generating more heat than light.
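One way to make the zooming-out mechanism concrete is a rolling average over the short-cycle figures. A minimal sketch in Python, using made-up monthly numbers purely for illustration:

```python
# A longer-horizon view built from short-cycle data: a rolling
# twelve-period mean. All figures below are invented for illustration.

def rolling_mean(values, window=12):
    """Return one mean per complete window of `window` consecutive values."""
    return [sum(values[i - window:i]) / window
            for i in range(window, len(values) + 1)]

monthly = [100, 104, 97, 101, 99, 103, 98, 102, 100, 105, 96, 101,
           102, 99, 104, 98]

trend = rolling_mean(monthly)
# The monthly series swings by several points from one period to the
# next; the rolling series smooths those blips and leaves the longer arc.
print(trend)
```

The same data feeds both views; only the horizon changes, which is the point of the practice described above.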

Statistical process control as a practical framework: The underlying question the story raises — when is a change real? — has a formal answer in statistical process control, developed in manufacturing contexts but applicable to any repetitive process that produces measurable outputs. Control charts distinguish common cause variation (the normal random fluctuation in a stable process) from special cause variation (movements that indicate something genuinely different is happening).

The principle is straightforward: only act on special cause variation. Organisations that apply this logic to their operational metrics, even informally, make substantially fewer spurious interventions than those that treat every data point as requiring a response. The discipline required is not technical sophistication but the willingness to define what normal variation looks like and commit to ignoring movements within it.
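The core mechanic of a control chart — derive limits from a stable baseline, then act only on points that fall outside them — can be sketched in a few lines. A minimal illustration using the common three-sigma rule, with invented figures:

```python
# A minimal control-chart sketch: three-sigma limits from a stable
# baseline period, then classify new points. All data is illustrative.
import statistics

def control_limits(baseline, sigmas=3):
    """Return (lower, upper) control limits from a baseline sample."""
    mean = statistics.mean(baseline)
    sd = statistics.stdev(baseline)
    return mean - sigmas * sd, mean + sigmas * sd

def classify(values, limits):
    """Label each value: common cause (inside limits) or special cause."""
    lower, upper = limits
    return ["special" if v < lower or v > upper else "common"
            for v in values]

baseline = [102, 98, 101, 99, 100, 103, 97, 100, 101, 99]  # stable period
limits = control_limits(baseline)

new_points = [101, 98, 118, 100]  # only 118 falls outside the band
print(classify(new_points, limits))
# → ['common', 'common', 'special', 'common']
```

Only the third point warrants investigation; the others are exactly the normal variation the discipline commits to ignoring.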

Further reading

On signal, noise, and the limits of pattern-finding:

The Signal and the Noise: The Art and Science of Prediction by Nate Silver. Silver’s account of how experts in multiple domains consistently mistake noise for signal is the most comprehensive available treatment of the problem the story describes, with practical guidance on how to improve the ratio.

Fooled by Randomness: The Hidden Role of Chance in Life and in the Markets by Nassim Nicholas Taleb. Taleb’s account of how traders and analysts confuse random fluctuation with meaningful information is the sharpest available critique of the explanation imperative the story identifies, particularly in financial contexts.

On measurement, variation, and statistical thinking in organisations:

Thinking, Fast and Slow by Daniel Kahneman. Kahneman’s treatment of regression to the mean is directly relevant: many apparent causes of change are simply regression from an unusual observation toward the average, and mistaking regression for a real effect is one of the most common organisational analytical errors.

Noise: A Flaw in Human Judgement by Daniel Kahneman, Olivier Sibony and Cass R. Sunstein. The chapters on measurement and variability in organisational data cover the structural conditions that make noise hard to distinguish from signal, with particular attention to performance management and quality control contexts.

On the history of narrative explanations for market movements:

A Random Walk Down Wall Street by Burton G. Malkiel. Malkiel’s long-running argument that stock price movements are substantially random, and that the financial commentary industry exists to explain movements that require no explanation, is the academic background to the story’s financial framing.

How to Lie with Statistics by Darrell Huff. Huff’s account of how statistical presentation manufactures apparent meaning from random variation is older than the other texts here but remains the most accessible introduction to the specific analytical traps the story describes.

About the image

A photo of the New York Stock Exchange on Wall Street, taken on a trip there in 2007.

Photo montage and photo by Matt Ballantine, 2026, 2007