The "Bad Apple" Myth: Why Firing People for Safety Violations Is an Act of Corporate Self-Sabotage
When an accident happens, our primitive instinct is to find the culprit, punish them, and close the file. It feels like justice. But in the complex world of modern industry, it is an act of intellectual laziness. By firing the person who made the mistake, you are smashing the only thermometer that could tell you how sick your system truly is.
Act 1: The Event
A veteran crane operator—let’s call him Andreas—is performing a routine lift at a busy construction site. It is late in the shift, the wind is picking up, and the schedule is three days behind. The Load Moment Indicator (LMI) inside the cab starts beeping, warning him he is near the limit. Andreas knows the crane. He knows the load. He thinks the sensor is being overly sensitive (as it often is). He turns the LMI override key to finish the lift. Ten seconds later, gravity wins. The crane tips over. Miraculously, nobody is killed. But the crane is destroyed. The damage is €300,000. The project is halted.
Act 2: The Inquisition
The investigation begins. The pressure from the Board is immense. They want a "Root Cause." They want a name. The investigation report is short, brutal, and technically "correct":
Rule Violated: Operator intentionally overrode the safety device.
Root Cause: Human Error / Gross Misconduct / Violation.
Corrective Action: Andreas is fired immediately for cause.
Act 3: The Illusion of Safety
A memo is sent to all staff: "Safety violations will not be tolerated. We have removed the 'Bad Apple' from our basket. We are now safe again." Management sleeps well that night. They feel decisive. They feel they have upheld high standards.
They are dangerously wrong.
By firing Andreas, they have not fixed the problem. They have simply ensured that the next crane tip-over will happen sooner, and likely with fatal results. They have committed the "Bad Apple Fallacy." This is the simplistic belief that systems are fundamentally safe and only fail because of erratic, unreliable, or "bad" people. It is a comforting lie that protects the system's reputation at the expense of the truth.
Part 1: The Psychology of Blame (Retributive Justice)
Why is our first instinct to blame and punish? Why do we feel satisfied when the "Bad Apple" is removed?
Because blame is chemically satisfying. When a bad event happens (an accident), our brain enters a state of cognitive dissonance. We feel vulnerable. We need to restore order. Finding a "Villain" (Andreas) allows us to draw a psychological boundary:
"Us" (The Managers): We are smart, compliant, and safe.
"Him" (The Worker): He is reckless, stupid, and unsafe.
"I wouldn't have overridden the LMI. He did. Therefore, he is bad, and I am safe."
This is Retributive Justice. It asks three primitive questions:
What rule was broken?
Who did it?
How much should they suffer?
It looks like accountability, but it is actually intellectual laziness. It is a way to stop the investigation before it reaches the uncomfortable truths about the organization. By blaming Andreas, we stop asking the hard questions that might implicate management:
Why did the crane design allow the override so easily (a simple button press vs. a complex lockout)?
Why was Andreas under so much schedule pressure that he felt he had to override it to keep his job?
How many other operators override the LMI every single day but just get lucky?
When you fire the "Bad Apple," you leave the "Bad Barrel" (the system) completely untouched. You put a new operator (a fresh apple) into the same rotten barrel, subject them to the same pressures, and then act surprised when they rot too.
Part 2: The "Hindsight Bias" (The Curse of Knowledge)
The biggest enemy of a fair investigation is Hindsight Bias (also known as the "I-knew-it-all-along" effect).
Looking Back (The Investigator's View): We know the outcome (the crane tipped). The probability of failure is 100%. Therefore, the decision to override the LMI looks insane, reckless, and suicidal. We say: "How could he be so stupid?"
Looking Forward (The Operator's View): Inside the cab, 10 seconds before the crash, Andreas didn't see a "crane tip." He saw a job that needed to be done. He saw a sensor that false-alarms 10 times a day. He saw a supervisor looking at his watch.
To Andreas, in that specific moment, his action was rational. It was a local adaptation to get the work done. Professor Sidney Dekker, a world leader in Safety Science, teaches us the Golden Rule of Restorative Just Culture:
"People do not come to work to do a bad job. Accidents are not produced by bad people. They are produced by good people trying to do their best in a complex, flawed, and pressured system."
If you want to prevent the next accident, you don't need to know who broke the rule. You need to know why it made sense to them to break the rule at that time. And you can only find that out if they aren't afraid of being fired.
Part 3: The "Black Box" Problem (Destroying Data)
Let’s use an aviation analogy. Imagine if, every time a commercial plane crashed, the airline located the Black Box (Flight Data Recorder), saw that the pilot made a mistake, and then threw the Black Box into the ocean without analyzing the data because "the pilot was reckless."
You would call them criminals. You would say they are endangering the public. You would demand they analyze the data to prevent recurrence.
Yet, this is exactly what companies do when they fire an employee immediately after an incident. The Employee IS the Black Box. Their brain holds the critical data: the context, the distractions, the confusing labels, the unwritten pressures, the "norm" of the site.
When you fire Andreas, you are throwing away the data. You are destroying the only map that can lead you to the systemic flaw. Andreas is now the most expensive expert you have. You just paid €300,000 for his "training" (the accident). Why would you fire him and hire a novice who hasn't learned that lesson yet?
Part 4: The "Second Victim" Syndrome
There is a profound human cost to the blame culture that HR departments rarely discuss. When a worker causes a serious accident—especially one that hurts a colleague—they are usually devastated. They are traumatized. They are guilt-ridden.
They are the "Second Victim" (the first being the person who was physically hurt). Instead of offering support, psychological first aid, or a debriefing, we isolate them. We suspend them. We interrogate them under bright lights like criminals. We strip them of their dignity.
This destroys Psychological Safety for the entire workforce. When other workers see Andreas being marched off-site by security guards, they absorb a clear, terrifying lesson:
"The company doesn't care about us. They only care about their liability. If I make a mistake, I must hide it. If I report a near-miss, I am putting a target on my back."
You drive the errors underground. You create a "Secret Factory" where problems are hidden, worked around, and covered up until they become unmanageable. You create a culture of silence, which is the breeding ground for catastrophes.
Part 5: The Solution – Just Culture (The Aviation Model)
Commercial aviation is one of the safest complex industries in history. Why? Because roughly 50 years ago, it accepted that you cannot punish error out of existence. Humans will always make mistakes. The human brain is fallible.
Aviation operates on a "Just Culture". This means:
Pilots are encouraged to report their own errors (e.g., "I flew at the wrong altitude").
If the report is honest, they are immune from punishment (unless it was sabotage, gross negligence, or substance abuse).
The data is shared globally so every other pilot can learn.
We need to bring this model to our construction sites and factories. We need to move from Retributive Justice (Punishment) to Restorative Justice (Learning).
Part 6: The "Culpability Algorithm" (How to Judge Fairly)
So, do we never punish anyone? Is it a "Free-for-All"? No. That is the "Laissez-Faire" trap. Just Culture is about drawing a line between Honest Error and Reckless Violation.
To do this, use the "Substitution Test" (developed by Neil Johnston) before you discipline anyone.
The Test: Gather a group of peers (other crane operators) and ask:
"If you put a different worker with the same qualifications in the exact same situation (same time pressure, same fatigue, same confusing controls), is it possible they would have made the same mistake?"
If the answer is YES: It is a System Problem. Do not punish. Fix the system. (Hint: This is the answer in 90% of cases).
If the answer is NO: It might be an individual problem. (But check for training gaps first).
The Decision Tree:
Was the act malicious? (Sabotage) -> Punish.
Was the worker impaired? (Drugs/Alcohol) -> Punish/Help.
Did they knowingly take a shortcut? -> Apply Substitution Test.
If others would do it: Systemic Issue. Don't Punish.
If nobody else would do it: Individual Issue. Coach/Train.
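The decision tree above is essentially a short algorithm, so it can be sketched in a few lines of code. This is a minimal illustration only: the field names (`malicious`, `impaired`, and so on) and the returned phrases are assumptions of this sketch, not part of any formal Just Culture standard.

```python
from dataclasses import dataclass

@dataclass
class Incident:
    malicious: bool            # sabotage, intent to harm
    impaired: bool             # drugs or alcohol involved
    knowing_shortcut: bool     # deliberate deviation from a rule
    peers_would_do_same: bool  # outcome of the Substitution Test

def assess(incident: Incident) -> str:
    """Walk the culpability decision tree, top to bottom."""
    if incident.malicious:
        return "Punish (sabotage)"
    if incident.impaired:
        return "Punish/Help (impairment)"
    if incident.knowing_shortcut:
        # Substitution Test: would qualified peers, under the same
        # pressures, plausibly have done the same thing?
        if incident.peers_would_do_same:
            return "Systemic issue: fix the system, don't punish"
        return "Individual issue: coach and retrain"
    return "Honest error: debrief, support, and learn"

# Andreas's case: a knowing override that most peers admit they also make.
print(assess(Incident(False, False, True, True)))
# → Systemic issue: fix the system, don't punish
```

Note that the punishment branches come first and are deliberately narrow; everything else falls through to the learning branches, which is the point of the whole model.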
Part 7: The "Restorative" Protocol
So, what do you do with Andreas? You don't let him off the hook. You hold him accountable, not culpable.
Culpability: "You broke the rule, so you must suffer." (Past-focused).
Accountability: "You were part of the event, so you must be part of the solution." (Future-focused).
The Action Plan:
Keep Him: Do not fire him.
Debrief Him: Use a non-judgmental facilitator to extract every detail of the "Why."
Empower Him: Make him part of the solution. Ask him: "Andreas, you know this machine better than the engineers. Help us redesign the LMI interface so the next guy doesn't get confused."
Share the Story: Have Andreas tell his story to the new hires. "I made this mistake. Here is why. Here is how the company helped me fix it. Don't do what I did."
A worker who has been treated fairly after a mistake becomes the most loyal, safety-conscious advocate your company will ever have.
The Bottom Line
Firing people is easy. It requires zero leadership skills. It is an administrative transaction that clears the conscience of the powerful. Learning from people is hard. It requires empathy, humility, and the courage to admit that the management system failed.
You can have a culture of Blame, or you can have a culture of Learning. You cannot have both.
Next time you want to fire the "Bad Apple," pause and look at the barrel. You are likely smashing the thermometer because you don't like the temperature.