Patrick Blackett was a physicist who became one of the founders of operational research during the Second World War. He disagreed with Churchill’s strategy of bombing German cities. The received view was that destroying civilian morale would shorten the war; Blackett’s analysis pointed to U-boat infrastructure as a far more damaging target. Churchill was not moved, but Blackett’s case was strong.

The disagreement was not about data. Records of bombing runs, casualties, and industrial output were available to both sides. What Blackett had was a different question: what actually damages an enemy’s capacity to wage war? That is not the same question as “are we hitting targets?”, and it does not produce the same answer. The discipline that emerged from this kind of work was built on the idea that rigorous analysis applied to the right question is more useful than rigorous analysis applied to the assumed one. The scientists involved, many with no background in military strategy, were useful precisely because they were outside the institutional assumptions. They could see what the organisation could not.

John Boyd is a later example of the same disposition. Boyd was a US Air Force pilot who became convinced that the official metrics for evaluating aircraft performance were measuring the wrong things. Top speed and ceiling told you about a plane in isolation. What mattered in combat was the ability to transition between energy states faster than an opponent. To prove it, he teamed up with mathematician Thomas Christie at Eglin Air Force Base in the early 1960s. They used the base’s high-speed IBM 704 computer to run the millions of calculations required, entering data by punch card at a time when a single computer filled an entire room. Boyd later admitted to stealing computer time to finish the work. The two-volume report, setting out what became known as energy–maneuverability theory, was completed in 1964. It became the world standard for fighter aircraft design and overturned procurement decisions the Air Force had already made. Boyd faced years of institutional hostility. He kept going.

The pattern in both cases is the same. The data existed. The analytical capability existed. What was missing was someone willing to ask whether the problem was set up correctly. That is a different capability from being able to answer questions once they are posed, and it is rarer. It does not require a doctorate. Blackett had one; Boyd did not have the equivalent. What both had was a willingness to treat the problem definition as provisional, to look at what the institution was actually trying to achieve rather than what it said it was trying to achieve, and to follow the analysis where it led even when the conclusion was unwelcome.

Both men faced serious resistance from people with considerably more authority than them. The institutions they were advising preferred to go on making decisions within their older models of thinking. The resistance is part of the pattern. An organisation that welcomes the news that it has been asking the wrong question is unusual. Most would rather dispute the answer.

This is what Systemus finds worth admiring in the operational research tradition. Not the techniques, which have evolved and spread across many fields. The disposition: willingness to stand outside an assumed problem definition, to treat the framing as something that can be examined rather than taken as given, and to say plainly when the wrong question is being asked. That is a harder conversation to have inside an organisation than any technical disagreement, because it implies that the people who set the question were working with a frame failure, and institutions rarely welcome that implication.

Frame failure is most damaging when it is invisible. The value of someone standing outside the problem is not that they are cleverer than the people inside it. It is that they are not subject to the same constraints on what can be questioned. That is the role Blackett was playing. It connects directly to what Observation and reasoning tries to make practical, and to the argument in The diagnosis is yours about who is positioned to see the problem clearly.

The operational research tradition is a lineage of people who understood this. They were effective not because they brought more rigour to agreed problems, but because they were willing to challenge what the problem was. That remains the rarest and most useful thing a person can do in an organisation facing a decision it cannot see clearly.


Further reading:

References & Influences — annotated notes on related thinkers and sources in this territory.


Garden notes

  • Frame failure — operational research is a historical case study in what frame failure looks like from the outside, and what it costs to name it
  • Observation and reasoning — the practical disposition that connects directly to the OR tradition
  • The diagnosis is yours — on who is positioned to see the problem clearly, and why that position matters
  • Separated knowledge — Blackett and Boyd both crossed domain boundaries; working outside their home field was part of what made them effective