The Concorde Fallacy: Why We Finish Projects That Kill Us

A strategic analysis of the Sunk Cost Fallacy, Plan Continuation Bias, and the Escalation of Commitment. A forensic examination of why organizations—from the nuclear control rooms of 1986 to the drilling floors of the 21st century—double down on dangerous courses of action simply because they have "come this far."

A Visual Anatomy of the Concorde Fallacy. The left panel illustrates the "Sunk Cost trap," showing the grueling, expensive effort to push the boulder of "Project Continuation" uphill simply because of past investment. The right panel shows the inevitable catastrophic consequence—symbolized by the exploding oil rig (Deepwater Horizon) and the sinking ship (Vasa)—when the decision to stop finally comes "too late."

Executive Summary: The Lethal Trap of "Too Late to Stop"

On the evening of April 20, 2010, the crew of the Deepwater Horizon was wrestling with a "nightmare well" in the Gulf of Mexico. The Macondo project was plagued by difficulties: it was 43 days behind schedule and $50 million over budget. The financial pressure was suffocating; every day the massive rig sat idle cost BP roughly $1 million in leasing fees and operational overhead.

The engineers and company men on board faced a critical, binary decision that would determine the fate of the rig, the crew, and the company's future: Should they perform a comprehensive cement bond log (CBL) to verify the integrity of the well seal?

  • The Cost of Safety: The test would take approximately 12 hours to rig up and perform. It would cost roughly $128,000. Crucially, it would delay the project by another half-day, angering stakeholders who were demanding completion.

  • The Benefit of Safety: It would provide definitive proof of whether the cement barrier was sealed or if it was leaking explosive gas.

They chose to skip the test. They chose to finish the job.

Hours later, the well blew out. The resulting explosion killed 11 men, sank the massive vessel, unleashed the largest accidental marine oil spill in history, and eventually cost BP over $65 billion in fines, settlements, and cleanup costs.

Why did experienced professionals make such a reckless gamble? They were victims of the Concorde Fallacy (also known as the Sunk Cost Fallacy). The logic that drove them was not based on the future probability of success, but on the past investment of time and money. The unspoken rationale was: "We are already 43 days late. We have already spent $100 million. We cannot afford to stop now."

This analysis argues that past investment is the single greatest enemy of future safety. The more time, money, or effort a team invests in a risky plan, the harder it becomes to abandon it, even when the signs of impending disaster are screaming "Stop." We do not maximize safety; we maximize the justification of our past decisions.


Part 1: The Origin (The Supersonic Mistake)

The term derives from the Concorde supersonic jet project, a joint venture between the British and French governments in the 1960s.

Long before the first plane took off, it became clear to economists and engineers that the project was commercially unviable. The operational reality was bleak: rising fuel costs, strict noise regulations that limited flight paths, and limited passenger capacity meant the Concorde would never make a profit. It was a financial black hole.

However, neither the British nor the French government wanted to back out. They had already poured millions into development. They argued that stopping would mean "wasting" that money and suffering a national embarrassment. So, they continued pouring billions more into a project that was doomed from the start. They built the plane not because it was a good idea, but because they couldn't bear to admit it was a bad one.

The Economic Truth: Rational economics states that Sunk Costs are irrelevant. The money you spent yesterday is gone forever. You cannot get it back, no matter what you do today. Decisions should be based only on future costs and future benefits.

The Psychological Reality: Humans are not rational economic actors. We view "stopping" as "wasting." We view "continuing" as "justifying the investment." In safety, this logic is lethal.

  • "We've already set up the crane. It will take 4 hours to tear it down and move it to a safer spot. Let's just make the lift from here."

  • "I've already climbed 28,000 feet. The summit is right there. I can't turn back now."


Part 2: The Neuroscience of Loss Aversion (Prospect Theory)

Why is this trap so inescapable? Nobel laureates Daniel Kahneman and Amos Tversky identified the mechanism in their groundbreaking work on Prospect Theory.

They discovered that the human brain feels the pain of a loss twice as intensely as the pleasure of an equivalent gain. This is known as Loss Aversion. When a project is failing, the brain faces two choices, and the neurology dictates the outcome.

The Binary Choice:

  1. Choice A (The Sure Loss): Stop the project. Admit failure. The $10 million invested is lost instantly. The brain registers this as emotional agony, triggering the same neural pathways as physical pain.

  2. Choice B (The Gamble): Continue the project (risking a catastrophe). There is a chance you might succeed and "save" the investment, avoiding the pain of loss.

The brain chooses the risk of catastrophe (Choice B) to avoid the certainty of the immediate loss (Choice A). The most dangerous insight from Prospect Theory is this: When we are "winning," we are risk-averse. But when we are "losing" (behind schedule, over budget), we become risk-seeking. We gamble with lives to break even.
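This asymmetry can be made concrete with a minimal sketch of the Kahneman–Tversky value function, using their commonly cited 1992 parameter estimates (α ≈ 0.88 for diminishing sensitivity, λ ≈ 2.25 for loss aversion) and ignoring their probability-weighting function for simplicity. The dollar figures are hypothetical, chosen only to show why a failing project "feels" worth gambling on:

```python
# Sketch of the Kahneman-Tversky value function (Prospect Theory).
# alpha and lam use the commonly cited 1992 estimates; the dollar
# figures below are hypothetical, chosen only to illustrate
# risk-seeking in the domain of losses.

def subjective_value(x, alpha=0.88, lam=2.25):
    """Perceived value of a gain/loss of x (in $ millions)."""
    if x >= 0:
        return x ** alpha            # gains: concave -> risk-averse
    return -lam * (-x) ** alpha      # losses: convex, amplified -> risk-seeking

# Choice A: stop the project and book a certain $10M loss.
sure_loss = subjective_value(-10)

# Choice B: gamble on continuing -- 50% chance the failure deepens
# to a $21M loss, 50% chance the investment is "saved" (no further loss).
gamble = 0.5 * subjective_value(-21) + 0.5 * subjective_value(0)

print(f"Expected dollars:  stop = -10.0, gamble = {0.5 * -21:.1f}")
print(f"Subjective value:  stop = {sure_loss:.2f}, gamble = {gamble:.2f}")

# The gamble is worse in expected dollars (-10.5 vs -10.0), yet it
# *feels* better (less negative) -- so the losing brain keeps going.
assert gamble > sure_loss
```

Under these (assumed) parameters, the certain loss is subjectively more painful than a gamble with a worse expected dollar outcome, which is exactly the risk-seeking-in-losses pattern described above.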


Part 3: The Endowment Effect (Why We Love Our Bad Ideas)

Closely related to Sunk Cost is the Endowment Effect. This psychological bias states that we value something more simply because we own it or created it.

In safety engineering and procedure development, this manifests when a team spends six months designing a safety system or writing a complex procedure. Even if new data arrives showing the design is flawed or the procedure is unworkable, the team fights to keep it.

  • The Objective Reality: The design is flawed, confusing, and introduces new risks.

  • The Subjective Reality: "This is my design. I spent 1,000 hours on it. It is valuable because I made it."

This leads to the "Not Invented Here" syndrome, where safety teams reject safer alternatives from the outside because accepting them would mean admitting that their own "endowed" work was wasted. The emotional attachment to the work overrides the logical assessment of the risk.


Part 4: Historical Case Study: The Vasa (1628)

The Concorde Fallacy is not a modern invention. The sinking of the Swedish warship Vasa in 1628 is a perfect pre-industrial example of Sunk Cost driving engineering failure.

King Gustavus Adolphus ordered the construction of the most powerful warship ever built to dominate the Baltic Sea.

  • The Sunk Cost: During construction, the King demanded the addition of an extra deck of heavy cannons. The shipwrights knew this would raise the center of gravity and make the ship unstable and top-heavy.

  • The Test: A stability test was performed at the dock. Thirty men ran back and forth on the deck to simulate wave action. The ship rocked so violently the test was cancelled to prevent it from capsizing right there at the dock.

  • The Decision: The cost of delay and the King's anger were too high. The investment in timber, bronze, and national reputation was immense. No one dared to say "Stop."

The ship launched with great fanfare. It sailed less than 1,300 meters, encountered a light breeze, capsized, and sank within 20 minutes, killing 30 people. They launched a ship they knew was unstable because the Sunk Cost of the King's ego was too high to write off.


Part 5: Plan Continuation Bias ("Get-There-Itis")

In aviation, the Concorde Fallacy manifests as Plan Continuation Bias, colloquially known as "Get-There-Itis."

NASA and NTSB studies show that pilots are far more likely to fly into dangerous weather (thunderstorms, fog) as they get closer to their destination. The bias creates a "Tunnel Vision" where the destination becomes the only reality, and the brain filters out threats.

The Tenerife Disaster (1977): The deadliest accident in aviation history (583 dead) was driven by this bias. Captain Jacob Veldhuyzen van Zanten of KLM was one of the most experienced pilots in the world. However, he was under immense stress. He was nearing his legal duty time limits. If he didn't take off immediately, the flight would be cancelled, passengers would be stranded, and the airline would lose money (the Sunk Cost of the schedule).

Despite heavy fog, and without takeoff clearance from the tower, he throttled up. He felt he had "invested" too much time waiting to wait any longer. He collided with a Pan Am jet still taxiing on the runway. The investment in saving the schedule cost 583 lives.


Part 6: Case Study: Chernobyl (The Test That Couldn't Wait)

The 1986 Chernobyl nuclear disaster is often taught as an operator error. Strategically, it was a Sunk Cost disaster driven by schedule pressure.

The Context: The engineers were scheduled to perform a safety test on the steam turbine during its rundown (coast-down), verifying that the spinning-down turbine could power the cooling pumps in the gap before the backup diesel generators came online. The test had been delayed multiple times before. The plant was scheduled for a maintenance shutdown the next day.

  • The Sunk Cost: If they didn't finish the test that night, they would have to wait months for the next shutdown cycle. The pressure to "tick the box" was immense.

The Escalation: As they prepared for the test, the reactor power dropped to dangerously low levels due to xenon poisoning (a neutron absorber). The safety protocols were clear: shut down the reactor immediately.

  • The Choice: Shut down and lose the test opportunity (accepting the Sunk Cost of time/planning) OR pull the control rods out to force the power up and finish the test.

Anatoly Dyatlov, the deputy chief engineer, refused to accept the "waste" of the delay. He ordered the control rods pulled. He forced the reactor into an unstable state to save the schedule. The result was not just a failed test; it was the total destruction of Reactor 4 and the radioactive contamination of Europe.


Part 7: Escalation of Commitment (The "Entrapment" Model)

When a project starts failing, leaders rarely withdraw. Instead, they engage in Escalation of Commitment. They invest more resources to prove that the initial decision was correct. This is the "Digging a Hole" phenomenon.

The Entrapment Model: Commitment acts like a trap. The more you struggle (invest), the tighter it grips.

  1. Phase 1 (Small Deviation): The drill bit gets stuck. Cost to fix: $50k. Manager says: "Keep trying to free it."

  2. Phase 2 (Escalation): Still stuck. Cost is now $200k. Manager says: "We can't stop now, we've spent $200k. Send heavier equipment."

  3. Phase 3 (Rationalization): Still stuck. The well is unstable. Cost is $1M. Manager says: "If we walk away now, I have to explain a $1M loss to the VP. We must finish this."

The manager is no longer managing the well; he is managing his reputation. He is "doubling down" to recover his losses. This often leads to cutting safety corners to speed up the recovery, leading to a blowout.


Part 8: The Gambler's Fallacy (Probability Blindness)

Often, the Sunk Cost Fallacy operates in tandem with the Gambler's Fallacy. This is the mistaken belief that if something happens more frequently than normal during a given period, it will happen less frequently in the future (or vice versa).

In Safety:

  • "We have had 10 near-misses with this valve. Our luck is bound to turn. We are due for a success."

  • "We have run this red light 50 times and never crashed. The odds are in our favor."

When a project is failing (e.g., a drilling operation encountering constant kicks), the team believes that "the bad luck must end soon." They view the investment of time as "paying their dues" to the laws of probability. In reality, the well does not care about your luck. Independent events are independent. The 11th near-miss is just as likely to be a catastrophe as the 1st.
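The arithmetic behind this is worth making concrete. Assuming a constant per-attempt incident probability (the 2% figure below is invented purely for illustration), the risk on the next attempt never changes no matter how many "lucky" runs precede it, while the cumulative odds of at least one catastrophe climb relentlessly:

```python
# Independent events: a hypothetical constant incident probability per
# attempt (2% here -- purely illustrative, not a real-world statistic).
P_INCIDENT = 0.02

# The conditional risk on the next attempt is ALWAYS the base rate,
# regardless of how many safe runs came before ("we're due" is false).
risk_on_attempt_1 = P_INCIDENT
risk_on_attempt_11 = P_INCIDENT
assert risk_on_attempt_11 == risk_on_attempt_1

def cumulative_risk(n, p=P_INCIDENT):
    """P(at least one incident in n independent attempts)."""
    return 1 - (1 - p) ** n

for n in (1, 10, 50, 100):
    print(f"{n:>3} attempts: {cumulative_risk(n):.1%} chance of an incident")

# Running the red light 50 times does not mean "the odds are in our
# favor": at 2% per run, the chance of having crashed at least once
# is already about 64%, and the 51st run is as risky as the 1st.
```

The two halves of the fallacy sit side by side here: per-attempt risk is flat (independence), while accumulated exposure grows, which is the opposite of "paying your dues."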


Part 9: Case Study: Mount Everest (1996)

The 1996 Everest disaster, documented in Jon Krakauer’s Into Thin Air, is a chilling study of Sunk Cost in its purest human form. Expedition leaders Rob Hall and Scott Fischer had a strict safety rule: "Turn around at 2:00 PM. If you haven't reached the summit, go back."

On summit day, 2:00 PM came and went. Climbers were still pushing up. Hall and Fischer, despite being safety experts, did not turn them around. Why?

  • Monetary Sunk Cost: The clients had paid $65,000 each.

  • Time Sunk Cost: They had spent 6 weeks acclimatizing in base camp.

  • Physical Sunk Cost: They had suffered immense physical pain to get there.

To turn around at 2:00 PM meant admitting that all that money and suffering was for "nothing." The Sunk Cost was too high to bear. So they pushed on. Stragglers were still reaching the summit after 4:00 PM. Because they ignored the stop rule to "save" the investment, they were caught in a blizzard on the descent. Eight people died.


Part 10: Institutional Sunk Cost: The Boeing 737 MAX

The tragedy of the Boeing 737 MAX is a masterclass in Institutional Sunk Cost. Boeing faced a massive commercial threat from the Airbus A320neo. They needed a new, fuel-efficient plane immediately to compete.

  • The Rational Choice: Design a new airframe from scratch. (Cost: $10B+, Time: 7 years).

  • The Sunk Cost Choice: Update the 1960s-era 737 airframe. (Cost: Lower, Time: Faster).

They chose to update the 737. But the new engines were too big for the old frame, causing the plane to pitch up under thrust. To fix this, they added software (MCAS) to push the nose down automatically. Crucially, Boeing avoided anything that would require simulator retraining, because the no-retraining promise protected airlines' own sunk cost in existing 737 pilot certification. So the existence of MCAS was left out of the pilot manuals.

They built a deadly layer of complexity on top of an old system because they refused to abandon the "Sunk Cost" of the 737 legacy design. They tried to save billions in development and lost billions in reputation and human life.


Part 11: The Ostrich Effect (Information Avoidance)

Why didn't the managers at BP or Boeing look at the data? This is the Ostrich Effect. When we have heavily invested in a Sunk Cost, we actively avoid information that might suggest we are wrong.

  • Financial Markets: Investors check their portfolios less often when the market is down.

  • Safety Management: Managers stop auditing a project that is behind schedule.

They fear the "Bad News" will force them to confront the Sunk Cost. So they bury their heads in the sand. They prefer Strategic Ignorance to the pain of admitting the investment is lost. This explains why warning signs (like the cement logs on Deepwater Horizon) were "misinterpreted" or ignored. The brain filters out the data that threatens the investment.


Part 12: Agency Theory & The Conflict of Interest

Why do individual managers take these risks? Agency Theory explains the conflict between the "Principal" (the Shareholder) and the "Agent" (the Manager).

  • The Shareholder: Wants the company to exist in 10 years. Prefers to lose $100k today to avoid a $65B explosion.

  • The Manager: Wants their annual bonus this year. If they stop the project, they miss their target and lose their bonus. If they gamble and win, they get the bonus. If they gamble and lose (blowout), they might be fired, but they don't pay the $65B.

The incentives are misaligned. The Sunk Cost Fallacy is often driven by the manager's personal "Sunk Cost" in their career trajectory, not the company's best interest.


Part 13: The Ritual of Wasted Effort

Organizations often confuse "Effort" with "Value." This is the Ritual of Wasted Effort. If a safety team spends 100 hours writing a "Safety Procedure" that is 50 pages long and impossible to read, they will defend it to the death. Why? Because they worked hard on it.

In their minds, Sunk Effort = Quality. In reality, the procedure is dangerous because no one reads it. But the organization refuses to scrap it because "we spent so much time on it." This clutters the safety management system with useless debris, hiding the real risks.


Part 14: Macro-Politics and "Face-Saving" (The Vietnam War)

In hierarchical organizations and nations, the Concorde Fallacy is amplified by Cognitive Dissonance and the need to Save Face. Admitting a mistake causes mental pain. It threatens our self-image as "competent leaders."

The Vietnam War: The US government knew by the mid-1960s that the war was unwinnable. Yet, they continued for another decade. As one advisor famously put it: "We cannot let American boys die in vain." So, they sent more boys to die, to justify the deaths of the previous ones. This is the ultimate Sunk Cost tragedy: Sacrificing the future to validate the past.

In industry, a Project Manager will push a dangerous commissioning schedule not because the company needs the product today, but because he promised it would be done today. He is risking the plant to save his face.


Part 15: Strategic Solutions (Breaking the Fallacy)

How do we defeat a bias that is hardwired into our neurology? We must remove the decision from the person who made the investment.

1. The "Tripwire" or Kill Criteria. Establish non-negotiable "Stop Rules" before the project starts.

  • Example (Everest): "If we are not at the summit by 2:00 PM, we turn around. No discussion."

  • Example (Drilling): "If we are over budget by 10% or behind schedule by 5 days, a mandatory 'Safety Stand-down' is triggered by an external party."
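A tripwire of this kind can even be expressed as a mechanical check, so that no judgment call (and no sunk-cost reasoning) is involved at trigger time. A sketch using the thresholds from the drilling example; the function and field names, and the dollar figures, are hypothetical:

```python
from dataclasses import dataclass

# Kill criteria agreed BEFORE the project starts, using the thresholds
# from the drilling example above. Names are hypothetical; the point is
# that the trigger is mechanical, not a judgment call by the invested team.
@dataclass(frozen=True)
class KillCriteria:
    max_budget_overrun: float = 0.10   # 10% over budget
    max_schedule_slip_days: int = 5    # 5 days behind schedule

def standdown_triggered(budget, spent, days_behind,
                        criteria=KillCriteria()):
    """Return True if an external party must call a Safety Stand-down."""
    overrun = (spent - budget) / budget
    return (overrun > criteria.max_budget_overrun
            or days_behind > criteria.max_schedule_slip_days)

# Macondo-style numbers (illustrative): 43 days late trips the rule
# long before anyone has to argue about the money already spent.
print(standdown_triggered(budget=96_000_000, spent=146_000_000,
                          days_behind=43))  # -> True
```

The value of writing the rule down in advance is precisely that the numbers are fixed while everyone is still objective, and the check can be run by someone with nothing invested.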

2. Separate the Decision Maker (Red Teaming). The person who approved the budget should never be the person deciding whether to cancel the project. They are too emotionally invested.

  • The Solution: Use an independent "Red Team" or a specific "Exit Manager." They have no "Sunk Cost" in the game. They only see future risk. If the Red Team says stop, you stop.

3. Reframe the Narrative (Opportunity Cost). Leaders must change the language. Instead of asking "How much have we spent?", ask "What else could we do with the money/time required to finish this?"

  • Teach teams that Stopping is a Success. Stopping a dangerous job is not "wasting time"; it is "buying survival."

4. The Pre-Mortem Analysis. Before a critical phase, ask: "If we continue this plan and it fails catastrophically, what would we wish we had done today?" The answer is almost always: "Stopped and reassessed."


Conclusion: The Courage to Burn the Plan

In a corporate culture that glorifies "Grit," "Perseverance," and "Never Give Up," we have forgotten the survival value of Quitting.

Safety requires the moral courage to look at a $100 million project, a 43-day delay, or a summit 100 meters away, and say: "It is not worth it."

The Deepwater Horizon tragedy teaches us that the laws of physics do not care about your budget, your schedule, or your quarterly earnings. The ocean does not care about your Sunk Cost. When we prioritize the Past (Investment) over the Future (Risk), we are not managing a business; we are digging a grave.

True leadership is not forcing the plan to work. True leadership is knowing when to burn the plan.
