Don't Turn the Lights Out by Playing the Blame Game

The all-too-human drive to blame can turn out the lights on the real game at play.  And once the lights are out, the true causes of incidents cannot be illuminated.


Toward the end of the Super Bowl my son remarked, “This game is taking more than 5 hours!”  Indeed, extra commercial breaks and the halftime concert didn’t help.  But the real reason for the exceptional delay was a power failure at the Superdome that left the stadium lights dark.

They stopped play for 34 minutes for safety reasons.  My son asked, “Don’t they play football in all sorts of weather, snow and rain?  Why can’t they play in the lower light?”

The answer undoubtedly had to do with safety.  In such a dangerous, fast-paced game, limited visibility could also limit players’ ability to be aware of oncoming hazards and the risks of certain actions.

The “Super Bowl Blackout” reminded me of a recent discussion I had with corporate HSE directors overseas.  They told me about a supervisor who had a close call with a high-pressure hose that nearly took off his head.  Two years later, this same individual was teaching a new worker to do the very same procedure.  This time physics was not so forgiving, and the new worker lost an eye.

The question to me was: from a behavioral safety perspective, how do you account for people like this supervisor, who intentionally took short-cuts during a procedure that put him at risk?  And, worse, who then taught others those same short-cuts?

The HSE directors who told me this story were clearly, personally upset with this supervisor.  They labeled him with words like “stupid,” “reckless,” and “word-that-shouldn’t-be-uttered-in-mixed-company.”  I’d be upset too.  This supervisor’s intentional actions hurt others.

I could see that blaming the culprit (the supervisor) was like turning out the lights.  Once the lights are out, the real hazards and risks cannot be illuminated.

Consider the label “stupid.”  If that’s the conclusion of your incident investigation, what are you left with?  No solutions.  Worse, if we get in the habit of assigning blame, we tend to use it as an easy excuse not to do the harder work of a behavioral assessment of the incident.

We go no further… we solve nothing … the incident will likely happen again, and this time, take out an eye.


Instead, approach the incident with a clear understanding of the cause-and-effect relationships between the behaviors related to the risk and the reasons why that person, knowingly or unknowingly, was put in the position to take that risk.

Understand the cause of behavior and you’ll begin to see a pathway for solutions.

A) Ask critical questions about training, tool availability, equipment flaws, perceived time pressure, missed reporting, supervision, and work flow.

B) Ask yourself about the person’s experience doing the task in the past.  Was the safe way to do the task cumbersome or ineffective?  Did they adopt the risk in an effort to save time or trouble?  What can be changed in the task to alleviate the “costs” of the safe alternatives?

C) Then, take a close look at your Safety Culture.  Was this person told how to do the risky short-cut by others?  Alternatively, would a peer be likely to stop and coach the person if they saw the risky behavior occurring? 


As I started drilling down into the supervisor’s behavior, the HSE leaders became more interested in the problem. The lights had been turned on.

We discussed equipment and tool changes that could help make the risk avoidable.  We discussed how the older generation had grown up in a safety culture where supervisors taught short cuts and encouraged them.  The directors now saw a need for supervisor refresher training and for safety-focused methods of selecting and promoting new supervisors.

Then we got to what I considered a root cause.  Evidently, when the supervisor had done the shortcut in the past and barely avoided personal injury himself, a Close Call report was completed on the incident.  In fact, this close call was flagged as “High Potential” for serious injury or fatality.


The Close Call report listed “Human Error” as the cause of the high-potential incident.

Then, of course, after this report, nothing changed.  The “Close Call” was filed, the supervisor was talked to, and the work went on, only to have the risk repeated ... how many times, by how many people?

When the lights were on, they shined brightly on the “Close Call” reporting system.  It needed to be fixed.  It was too easy to blame the worker and fail to uncover truly actionable changes to the work and the work system that could have stopped the risk, not only for this one individual but throughout the workforce.

If you have “Human Error,” “Human Factor,” or “Stupid” as an option on your incident investigation forms, it is all too easy to stop there, turn out the lights, and keep promoting the environment that caused the risk in the first place.


“If you accept a life of labels, you can go through it moving not among things but among words.”

William Golding


Timothy Ludwig’s website is where you can read more safety culture stories and contribute your own.  Dr. Ludwig is a senior consultant with Safety Performance Solutions (SPS), serves as a commissioner for Behavioral Safety Accreditation at the non-profit Cambridge Center for Behavioral Studies (CCBS), and teaches behavioral psychology at Appalachian State University in Boone, NC.  If you want Tim to share his stories at your next safety event you can contact him at

Walker Andrew • Agreed totally that root cause analysis needs to be undertaken with an open mind and should incorporate all elements of the management of safety. However, this must also be balanced with a fair and just approach which deals with human errors. If an individual takes a course of action which he knows will place himself or others at risk, then that cannot be ignored. After all, they can always stop the task, just as the organisers of your Super Bowl did.

Richard Rhimes • I have found that using a "Just Culture" system, including rational analysis and evaluation of incidents/accidents, works really well for the over 98% of errors people make that are non-culpable. It is neither a blame culture nor a blame-free culture.

It requires a lot of buy-in by management and employees to implement, and takes years to embed, but it is well worth the effort.

In a previous role, after a Just Culture was in place, I had a number of people ask me, "Would you like to really know what happened some years ago in incident XYZ?" What an eye opener to factual information.

Tim Ludwig • Indeed... 
I have been very impressed with the "Just Culture" system that seeks to understand the systemic factors influencing incidents first but also has a structured, respectful approach to addressing the individual and their need for training, feedback, and, if necessary, consequences.

Contact Dr. Ludwig                                 © Timothy Ludwig 2017