The Myth of "Common Sense": Why It Is the Most Dangerous Phrase in Safety
We tell workers to "use their common sense" and assume they will stay safe. But common sense is a lie. It is a retrospective bias that managers use to blame victims for systemic failures. If your safety strategy relies on "sense," your system is nonsense.
A young worker, let’s call him Jason, is clearing a jam on a conveyor belt. He puts his hand inside the guard while the machine is running. He doesn't isolate the power. The machine cycles. Jason loses three fingers.
The investigation meeting begins. The Plant Manager turns purple with rage. He slams his fist on the table.
"I don't understand! We gave him the induction. We gave him the PPE. Why did he put his hand in there? Where was his common sense?"
Heads nod around the table. The Safety Manager sighs. "Yes," they agree. "You can't fix stupid. It was just common sense not to do that."
This is the moment the investigation dies. And this is the moment the next accident is born.
"Common Sense" is the lazy manager’s alibi. It is the ultimate "Get Out of Jail Free" card for leadership. By attributing the accident to a lack of "sense" in the worker, we absolve ourselves of the responsibility to design a system that doesn't require "sense" to be safe.
Here is the uncomfortable truth that we need to scream from the rooftops: Common Sense does not exist. In a high-risk industrial environment, it is a myth. Relying on it is not management; it is negligence disguised as wisdom.
Part 1: The Definition Problem (The "Cached Experience" Fallacy)
Voltaire famously wrote: "Common sense is not so common." But in safety, the problem is deeper: Common Sense is not innate; it is learned.
What a Manager calls "Common Sense" is actually "Cached Experience."
The Manager knows not to touch that specific pipe because 15 years ago, he touched it and burned his hand.
The Manager knows that a specific hum in the motor means the pressure is too high because he has heard it 1,000 times.
The Manager knows that "Red" means danger because he grew up in a culture where red means stop.
He calls this "Common Sense." He thinks it is universal logic. But to the 22-year-old apprentice who started last Tuesday, that pipe looks like any other pipe. That hum is just noise. And maybe he comes from a background where symbols mean different things.
When you tell a worker: "Use your common sense," what you are actually saying is:
"I expect you to have the same 20 years of painful experience that I have, without actually living through it."
It is an impossible request. You are judging a novice by the standards of an expert. If "Common Sense" were truly common, we wouldn't need training schools, manuals, or procedures. We would just hire people off the street and let them work.
Part 2: The "Curse of Knowledge" (Why Experts Fail Novices)
Psychologists call this the "Curse of Knowledge." Once you know something (e.g., how to operate a lathe safely), it becomes almost impossible to remember what it was like not to know it.
The danger becomes "obvious" to you. The solution becomes "intuitive." So, when you design a workspace or write a procedure, you design it for yourself (the expert). You leave out the "obvious" steps.
You don't put a label on the hot pipe because "everyone knows it's hot."
You don't put a guard on the back of the machine because "nobody is stupid enough to reach in there."
Then, a novice arrives. He doesn't have your "Curse of Knowledge." He reaches in. He gets hurt. And you blame him for lacking common sense, when in reality, you failed to design for the user.
Part 3: The "Hindsight Bias" (It Was Obvious... Afterwards)
Why does every accident look like a failure of common sense? Because of Hindsight Bias (also known as Creeping Determinism).
Before the accident: Jason sees a jammed machine. He sees a production target he has to hit. He sees a supervisor watching him impatiently. He thinks: "I can grab that piece quickly. I’ve done it before. If I stop to Lockout, I’ll get yelled at." To him, in that specific context, his action makes sense. It is rational behavior driven by incentives.
After the accident: We know the outcome (he lost his fingers). The probability of injury is now 100%. Looking back, the risk seems huge and the reward seems tiny. We say: "It was obvious he would get hurt."
It wasn't obvious to him. If it were obvious, he wouldn't have done it. (Unless he was suicidal, which is statistically rare.) Labeling an error as a "lack of common sense" is a psychological trick we play on ourselves to feel superior to the victim. It comforts us: "I wouldn't have done that. I have common sense. Therefore, I am safe." This is the "Just-World Hypothesis"—we blame victims to convince ourselves that the world is fair and predictable. It isn't.
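To see how the same shortcut looks rational before the event and absurd after it, run some back-of-envelope numbers. A minimal sketch (the figures are invented purely for illustration):

```python
# Invented numbers for illustration only: suppose a quick, un-isolated
# jam-clear injures roughly 1 worker in 500 attempts, and Jason has
# already done it 200 times without a scratch.
p_injury = 1 / 500
attempts = 200

# Probability of 200 consecutive "successful" shortcuts:
p_clean_record = (1 - p_injury) ** attempts
print(f"Odds of a clean record so far: {p_clean_record:.0%}")  # ~67%

# Prospectively, each shortcut almost always pays off: it saves minutes
# and avoids the supervisor's anger. Retrospectively, the injury has
# happened, its probability is now 100%, and hindsight erases the 499
# times in 500 that nothing went wrong.
```

Roughly two out of three Jasons reach the accident with a spotless record, which is exactly why the shortcut feels like common sense right up until it isn't.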
Part 4: System 1 vs. System 2 (The Biology of Stupidity)
Nobel Prize winner Daniel Kahneman taught us that the brain has two modes:
System 1: Fast, instinctive, emotional, automatic. (e.g., jumping when you hear a bang).
System 2: Slow, logical, calculating, deliberate. (e.g., solving a math problem).
"Common Sense" (making the right safety decision) requires System 2. It requires stopping, analyzing, and predicting. But industrial accidents happen when workers are tired, rushed, stressed, or scared. Stress kills System 2.
When a worker is under pressure to finish a job, their brain defaults to System 1. They go on autopilot. They don't see the hazard; they only see the goal. Expecting a stressed human to use "Common Sense" is like expecting a drowning man to solve a Sudoku puzzle. It is biologically impossible. Safety systems must be designed to protect people when they are in System 1 (autopilot), not just when they are in System 2 (classroom mode).
Part 5: The Legal Reality (Duty of Care)
Let’s take this argument to its logical conclusion: The Courtroom. Imagine you are the Employer. You are standing before a Judge after a fatality on your site.
Judge: "Mr. Safety Manager, you instructed the deceased to clean the chemical tank. Did you provide specific training on the toxic vapors? Did you have a gas detector? Did you have forced ventilation?"
You: "No, Your Honor. We didn't do those things. It is common sense not to breathe toxic fumes. He should have known better."
Do you think you will win that case? You will go to jail.
The Law does not recognize "Common Sense" as a control measure. The Law recognizes Training, Guarding, Supervision, and Instruction. If your safety strategy relies on the worker "figuring it out" because it’s "obvious," you have failed in your legal Duty of Care. You have sent a soldier into battle without a map and told him: "Just use your intuition to avoid the landmines."
Part 6: The Solution – The "Explicit" Protocol
How do we eradicate this dangerous phrase from our vocabulary and our culture? We move from Implicit Safety (guessing) to Explicit Safety (design).
1. Ban the Phrase
Make it a hard rule in your department. If an accident investigator uses the phrase "Common Sense" or "Carelessness" in a draft report, reject the report immediately. Force them to explain WHY the action made sense to the worker at the time.
Did they lack training?
Was the label missing?
Was the control panel confusing?
Were they tired?
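If you want the rule to bite without relying on anyone's memory, you can even mechanize the check. Here is a hypothetical sketch (the banned list and function name are illustrative, assuming draft reports arrive as plain text):

```python
# Hypothetical sketch: an automated gate that bounces any draft
# investigation report containing blame-shifting language.
BANNED_PHRASES = ("common sense", "carelessness", "should have known")

def review_draft(report_text: str) -> list[str]:
    """Return every banned phrase found in the draft, if any."""
    lowered = report_text.lower()
    return [phrase for phrase in BANNED_PHRASES if phrase in lowered]

draft = "Root cause: operator carelessness; he lacked common sense."
violations = review_draft(draft)
if violations:
    print(f"REJECTED ({', '.join(violations)}): "
          "explain WHY the action made sense to the worker at the time.")
```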
2. Design for the "Fool" (Poka-Yoke)
Japanese manufacturing uses the term Poka-Yoke (Mistake-Proofing). The assumption is: "The worker will eventually try to do the wrong thing. How do we make it physically impossible?"
Common Sense Approach: "Don't put diesel in the petrol car." (Relies on the driver paying attention).
Poka-Yoke Approach: Make the diesel nozzle too big to fit into the petrol tank hole. (Relies on physics).
Stop relying on the brain. Rely on the design. If safety depends on a human being smart, alert, and logical 100% of the time, the system is broken.
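The same principle carries over to software and control-system design. A minimal sketch (the class names are invented for illustration): model the machine so the unsafe action is not forbidden by a warning, it simply does not exist as an option, just as the oversized diesel nozzle cannot enter the petrol filler neck.

```python
# Poka-yoke by design: jam-clearing is only offered on a machine whose
# energy has been isolated. The unsafe shortcut is not a rule violation;
# it is an operation that does not exist.

class RunningMachine:
    """Power applied. No jam-clearing method is offered in this state."""

    def lock_out(self) -> "LockedOutMachine":
        print("Power isolated, padlock applied.")
        return LockedOutMachine()

class LockedOutMachine:
    """Energy isolated. Only now can hands go near the belt."""

    def clear_jam(self) -> None:
        print("Jam cleared with zero stored energy.")

    def restore_power(self) -> RunningMachine:
        print("Guards closed, padlock removed, power restored.")
        return RunningMachine()

machine = RunningMachine()
# machine.clear_jam()         # AttributeError: the shortcut does not exist.
machine = machine.lock_out()  # The only path to the jam runs through lockout.
machine.clear_jam()
machine = machine.restore_power()
```

Whether the interlock is a trapped key, a guard switch, or a type system, the design choice is the same: the safe sequence is the only sequence on offer.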
3. The "New Hire" Test (The Martian Test)
When you write a procedure or design a station, grab a totally new person from the office (someone who has never done the job, or an intern). Ask them: "How would you operate this?" Don't give them hints. Just watch.
If they make a mistake, it’s not because they lack common sense. It’s because your design is not intuitive. Fix the design, don't blame the user.
4. Assume "Zero Sense"
When doing a Risk Assessment, start with the assumption that the worker has Zero Common Sense. Assume they will try to stick their hand in the blade. Assume they will try to bypass the guard. Now, design the machine so they can't. If you build a system that is safe for an idiot, it will be safe for a genius on a bad day.
The Bottom Line
"Common Sense" is a myth we tell ourselves to sleep better at night after someone gets hurt. It creates a divide between "Us" (the smart, safe managers) and "Them" (the stupid, careless workers).
But there are no stupid workers. There are only untrained workers, distracted workers, and workers placed in confusing, poorly designed environments.
Stop wishing for common sense. Start building Common Knowledge and Common Systems. If you have to ask them to "use their head," it means you didn't use yours when designing the job.
