There is a particular literary form, the list of great historical failures, that is, I think, a much more useful educational genre than the list of great successes. Success stories tell you what worked in a specific set of circumstances, most of which will not be your circumstances. Failure stories tell you about the structural mistakes that can capsize any project, and those transfer. You will not be Napoleon. You may very well be the person about to make Napoleon's mistakes.
What follows is a working list of what I consider the ten greatest failures in recorded history, ranked by magnitude of ambition multiplied by completeness of disaster. My criteria are, essentially: did they aim high? and did they fail spectacularly? I have tried to select for failures that failed not because the project was stupid, but because the project was genuinely ambitious and ran into problems that, in retrospect, might have been foreseen. These are the instructive ones.
10. The Xerxes Bridge
In 480 BCE, the Persian king Xerxes attempted to cross the Hellespont, the narrow strait between Asia and Europe, by building two pontoon bridges of lashed ships. The first bridges were destroyed by a storm. Xerxes, according to Herodotus, responded by ordering that the sea itself be punished with three hundred lashes and that a set of chains be thrown into the water to "enslave" it.
The second set of bridges was, eventually, completed. The invasion of Greece that they enabled was, eventually, a failure. What makes this a beautiful failure is the specific, imperial absurdity of whipping the ocean. The gesture is both futile and revealing. Xerxes had, at this point, resources that no earlier human had commanded. He had reached the point where his power met an indifferent natural fact, and he responded by trying to command the natural fact. This is a very specific kind of failure and it recurs. It is one of the oldest failure modes of great power.
9. The South Sea Bubble
In 1720, the South Sea Company, a British joint-stock company granted a monopoly on trade with South America, became the center of one of the most famous financial bubbles in history. The stock price went from £128 in January to over £1,000 in August, then collapsed to £124 by December. Thousands of investors were ruined. Isaac Newton, who had sold at a profit and then bought back in at the top, lost £20,000 (roughly £4 million in modern money). He reportedly said afterwards that he could "calculate the motions of the heavenly bodies, but not the madness of the people."
The South Sea Bubble is instructive because it failed not for a technical reason, the company's actual trading business was not the point, but for a structural reason. Once the stock became a speculative vehicle, the underlying business ceased to matter. The same failure mode has repeated, with slight variations, about every twenty years since. We are not good at learning this one.
For anyone who wants to dig into any of these case by case, most of the primary documents are more accessible than you'd expect. The British Library's collection guides hold the main record for several of the earlier disasters, and the Internet Archive's book collection has free scans of most of the older histories I've drawn on for this piece.
8. The Darien Scheme
In 1698, Scotland attempted to establish a colony on the isthmus of Panama, in a location called Darien. The plan was to build a Scottish port that would control overland trade between the Atlantic and the Pacific, producing vast wealth for a small country that had been struggling economically. An estimated quarter of all the liquid capital in Scotland was invested in the venture.
The colony failed within two years. The site was malarial. The Scots had inadequate supplies. The English, who opposed the venture for geopolitical reasons, refused to trade with or assist the colonists. Fewer than 300 of the roughly 2,500 settlers survived to return home. The financial devastation in Scotland was a significant factor in the 1707 Act of Union with England, which effectively ended Scottish political independence.
What makes this so instructive is the scale: the failure of a single venture was consequential enough to cost a nation its sovereignty. Most failures do not have this kind of concentrated impact. When a single project represents a quarter of a country's capital, the project's failure becomes, effectively, a national catastrophe.
7. The Tacoma Narrows Bridge
Opened July 1, 1940. Collapsed November 7, 1940. Duration of operation: four months, six days.
The Tacoma Narrows bridge is a beautiful engineering failure because the engineers who designed it were not incompetent. They were working at the absolute edge of the structural engineering knowledge of the time, trying to push suspension bridge design to new proportions of lightness and elegance. The specific mode of failure, aeroelastic flutter, in which wind caused the deck to oscillate with increasing amplitude, was not well understood in 1940. The engineers had pushed past the limits of their own field. The bridge taught everyone, violently, that the limits were there.
The film of the bridge's collapse is, if you have not seen it, worth watching. It is one of the most haunting documentary records we have of a technological failure in real time.
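The mechanism can be sketched, very loosely, as a damped oscillator whose damping has gone negative: instead of bleeding energy out of each swing, the wind feeds energy in, and the amplitude grows exponentially. A toy model, in Python, and emphatically not a structural model of the actual bridge:

```python
# Toy model of flutter as negative damping: an oscillator
# x'' + 2*zeta*omega*x' + omega^2*x = 0. With zeta > 0 the motion
# decays; with zeta < 0 each cycle gains energy and the peaks grow.
# All numbers here are illustrative, not properties of the bridge.

def peak_displacement(zeta, omega=1.0, dt=0.001, steps=60000):
    x, v = 0.01, 0.0              # small initial displacement, at rest
    peak = abs(x)
    for _ in range(steps):
        a = -2 * zeta * omega * v - omega**2 * x
        v += a * dt               # semi-implicit Euler: stable for oscillators
        x += v * dt
        peak = max(peak, abs(x))
    return peak

print(peak_displacement(zeta=+0.05))   # decays: peak stays at the start value
print(peak_displacement(zeta=-0.05))   # grows exponentially over the run
```

The point of the sketch is only the sign flip: the same equation, with one coefficient pushed past zero, turns a self-correcting system into a self-amplifying one. That is roughly what the deck of the bridge did in the wind.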
6. The Ford Edsel
Introduced September 1957. Discontinued November 1959. Total losses, in 1950s dollars, approximately $350 million, roughly $3.7 billion today.
The Edsel failed for reasons that are almost comic in their overdetermination. The marketing was overhyped. The design was ugly, in ways that are hard to articulate but which consumers responded to immediately. The price point missed its intended market segment. The build quality was poor because the assembly plants were shared with other Ford lines and the workers didn't know the Edsel's specifications. It launched during a recession. Its biggest target demographic was, simultaneously, being wooed by smaller and more efficient European imports.
This is, I think, a very useful failure to study, because it shows how a project can fail not from a single catastrophic mistake but from the accumulation of a dozen small ones, no one of which would have killed it alone. Large organizational failures usually look like this. The story always simplifies them to a single cause. The reality is almost always compound.
5. The Maginot Line
France, between 1929 and 1938, built the most elaborate permanent fortification system in human history along its border with Germany. Hundreds of miles of concrete emplacements, artillery positions, underground railways, and living quarters for a permanent garrison of tens of thousands of soldiers. The line was, technically, a masterpiece of engineering. It did almost exactly what it was designed to do, very well.
What it was designed to do was prevent a German invasion along the 1914 axis, through northeastern France. In 1940, Germany invaded through the Ardennes forest, north of the Maginot Line, and around it. The line was captured intact from the rear. The entire project was, essentially, irrelevant.
This is the canonical example of fighting the last war. France prepared, with enormous skill and resources, for a problem that the next conflict would not pose. The failure was not of execution. It was of imagination. Every large organization has a Maginot Line somewhere in its planning, which is to say, a thing it has built beautifully for a future that is not coming.
4. Challenger
January 28, 1986. Seventy-three seconds after launch. Seven crew members killed on live television, in front of millions of schoolchildren who were watching because Christa McAuliffe was going to be the first teacher in space.
The Challenger failure is, from a failure-analysis standpoint, one of the most documented disasters ever. The physical cause, O-ring failure in the right solid rocket booster due to cold launch temperatures, was well understood within days. The organizational cause, which is the more important one, was studied for years. Engineers at Morton Thiokol had warned, explicitly and in writing, that the O-rings might fail at the temperatures forecast for launch morning. NASA management overrode them. The warnings were there. The system did not listen.
This is the pattern that keeps showing up in catastrophic failures: the warning was available, the information was there, the system failed to act on it. The specific organizational reasons why this keeps happening are the subject of a whole field of study (normal accident theory, high-reliability organizations), and nobody has yet figured out how to reliably prevent it.
3. Long-Term Capital Management
A hedge fund, founded in 1994, whose partners included two Nobel laureates in economics. In its first three years, it produced returns of over 40% a year. In 1998, during the Russian financial crisis, it lost $4.6 billion in less than four months and had to be rescued by a Federal Reserve-coordinated consortium of major banks to prevent a systemic collapse of the global financial system.
LTCM is on this list because the failure mode was not incompetence. The partners were genuinely the smartest people in finance. The models were sophisticated. The strategies were mathematically sound under a range of conditions that the partners had carefully specified. What they had not, it turned out, properly modeled was the correlation of extreme events, the fact that when markets panic, everything moves in the same direction at once, and the diversification that protects you in normal conditions evaporates. This was an intellectual failure of a very specific kind. It is also, nearly verbatim, the failure that produced the 2008 financial crisis ten years later. We did not learn.
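The arithmetic behind "diversification evaporates" is simple enough to show directly. For an equal-weight portfolio of n positions with identical volatility sigma and a common pairwise correlation rho, portfolio variance works out to sigma² × (1/n + (1 − 1/n) × rho); as rho approaches 1, the 1/n benefit vanishes entirely. A toy calculation, my illustration and in no way LTCM's actual model:

```python
import math

# Volatility of an equal-weight portfolio of n positions, each with
# volatility sigma and a common pairwise correlation rho.
# Portfolio variance = sigma^2 * (1/n + (1 - 1/n) * rho).
def portfolio_vol(n, sigma, rho):
    return sigma * math.sqrt(1 / n + (1 - 1 / n) * rho)

sigma = 0.20                    # each position: 20% volatility (illustrative)
for rho in (0.0, 0.3, 0.9, 1.0):
    print(rho, round(portfolio_vol(100, sigma, rho), 4))

# At rho = 0, a hundred positions cut risk tenfold (0.20 -> 0.02).
# At rho = 1, the "diversified" portfolio is exactly as risky as a
# single position. Panics push rho toward 1.
```

The sting is in the last line: the protection is not a property of holding many positions, it is a property of those positions moving independently, and independence is precisely what a crisis removes.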
2. Operation Barbarossa
June 22, 1941. Nazi Germany launched the largest invasion in human history, attacking the Soviet Union with approximately four million men along an eighteen-hundred-mile front. The invasion was predicated on the assumption that the Soviet Union would collapse within months, that Moscow would fall before winter, and that the war would be over before Germany had to worry about any of the strategic problems it was, in fact, about to face.
The invasion did not collapse the Soviet Union. It did not take Moscow before winter. It bogged down in the mud and cold. The war it opened killed, before it was over, some twenty-five million Soviet citizens and four million Germans. It was, from Germany's strategic perspective, the single largest self-inflicted wound in military history. The decision to invade the Soviet Union, more than any other single decision, lost Germany the war.
The instructive aspect of Barbarossa is how many people saw the problems in advance. German military planners raised objections. Intelligence officers warned about Soviet industrial capacity. Weather experts noted what winter would do to the supply lines. The warnings were all there. The decision was made anyway, for reasons that had more to do with Hitler's ideological commitments than with military calculation. The failure was, at bottom, a failure to weight evidence against belief. A very common pattern. Almost the defining pattern of large-scale historical catastrophe.
1. The Four Horsemen Nobody Listens To
At the top of my list, a failure that is ongoing: the repeated failure, across many human societies and many centuries, to act on warnings about predictable large-scale risks. The warnings have always been available. The failure is structural. Every large civilization has lost to this failure mode, eventually, and our current civilization is, I suspect, in the process of losing to it as well.
This is not a specific historical event. It is the meta-failure that sits behind many of the items on this list. LTCM was an instance. Challenger was an instance. The Darien Scheme had warnings. Barbarossa had warnings. The pattern is almost identical each time: information is available, the system does not act on it, the disaster that was predicted unfolds exactly as predicted, a commission is convened afterward to produce a report, the report is filed, and the pattern repeats with slightly different specifics next time.
I do not know how to break this pattern. Nobody does. It is, I think, the central failure of large organizations, and by extension of civilizations. The best I can say is that studying the failures of the past makes the pattern more visible, and that increased visibility may, at the margin, help. It is not a cure. It is, at best, a slightly better diagnosis.
Ten failures. Ranked by ambition times disaster. I would, if anyone asked, argue for different orderings on different days. The top three are more or less fixed; after that, the case for swapping any two entries is nearly always defensible. The point is not the specific order. The point is that these are, all ten of them, worth studying, and that each one has something to teach anyone who is in the middle of a project large enough that they might, someday, appear on a list like this.