Thursday, June 17, 2010

The Never Ending Heinrich Controversy

Recently, I received an email asking me this question. Below is the email and my response:

 
Bill wrote:
I wondered if you had any comments or thoughts on the attached study by ORC Worldwide, which appears to negate the correlation between injury incident rates and fatalities. I would really like to implement a near-miss reporting program, but I don't think that I will have much credibility if someone were to question me on the attached study. The study, although not exhaustive, seems to make sense, but I wanted to get your input.

 
I replied:
I have heard of a number of research reports that come to similar conclusions: the kinds of risk that lead to a fatality are different from those that lead to a medical injury. I have seen and heard presentations arguing that the causal factors for high-severity incidents are not the same as those for low-severity injuries. This makes some sense to me as well, considering things like random work that is infrequent and tends to be higher risk due to unfamiliarity, lack of knowledge, etc.

 
When Dan Petersen was alive, we had this discussion in some of our one-on-one talks. Certainly the early, less scientific research that Heinrich did has come under attack from those who do in-depth, statistically validated research like ORC's. Frank Bird redid the Heinrich work with better technique in the 80's and got similar results. It seems there is a predictable injury-ratio reality for many organizations. My belief is that we should stop squabbling over the probability numbers and get down to what it takes to eliminate the injuries, no matter whether they are first aids, potential fatalities, or anything in between.

 
On occasion I have had discussions along these lines with company safety pros that went something like this:
“We reviewed our injuries, and first aid injuries are about 12-18 inches away from a lost time and lost times are about 6 inches off a fatality. If the event had moved just a little in one direction we would have lost the person.”

 
“We keep track of all medical injuries and it is our experience that, on average over the last 15 years or so, for every 100 medical injuries we have a fatality somewhere in the world at one of our many operations. Sometimes it occurs at 93 medicals other times at 112, but on average it is a fatality for every 100 medical treatment cases even though our recordable rate is currently running at about 0.32.”

 
The ORC report seemed to say to me that more research was important in three areas:
"The results of the ORC Fatality and Serious Injury Task Force Survey have convinced us that as professionals we must change how we view:
  • The relationship between incident investigations and corrective action;
  • Employee behavior and risk management;
  • Employee behavior and engineering controls.”
I would agree that:

 
  • Incident investigation and associated corrective actions are a part of a safety culture that, from my practical experience, seems often to be more of a “check in the box” than a strong process
  • Employee behaviors and risk management are important considerations that are also not typically considered or addressed in the average workface level safety culture
  • Employee behavior goes beyond the hourly ranks. Management behaviors (or lack thereof) greatly affect the workface safety culture and performance, even though the managers don’t usually realize this
  • Engineering controls by classical safety theory are often the first thing to be considered
  • And now my personal conclusion to this "chicken or the egg" type of controversy:
  • It is a good idea to keep a probability type of chart for a LARGE organization which has a significant volume of incidents. Personally I'd use more of a control chart and focus on what we are doing to ever lower all injury rates and track this change over time
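The control-chart idea above can be sketched in a few lines of Python. This is a minimal illustration, not a prescribed method; the monthly injury counts are invented, and it uses the standard c-chart limits for event counts (center line at the mean, control limits at plus or minus three standard deviations under a Poisson assumption):

```python
import math

# Hypothetical monthly recordable-injury counts for a large organization
monthly_counts = [14, 9, 12, 17, 11, 8, 15, 10, 13, 12, 16, 7]

# c-chart limits: center line is the mean count, limits are +/- 3 sigma
c_bar = sum(monthly_counts) / len(monthly_counts)
sigma = math.sqrt(c_bar)            # Poisson assumption: variance equals mean
ucl = c_bar + 3 * sigma             # upper control limit
lcl = max(0.0, c_bar - 3 * sigma)   # lower control limit (cannot go below zero)

for month, count in enumerate(monthly_counts, start=1):
    flag = "INVESTIGATE" if count > ucl or count < lcl else "common-cause"
    print(f"month {month:2d}: {count:3d} injuries -> {flag}")

print(f"center line {c_bar:.1f}, UCL {ucl:.1f}, LCL {lcl:.1f}")
```

Points outside the limits signal special-cause variation worth investigating; a downward drift in the center line over time is the change-over-time signal mentioned above.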
Behaviors are significantly affected by a good safety accountability system, and this is lacking in most companies. If we develop good safety accountabilities at all levels of the company, and live them at all levels, medical and more serious injuries will go down. This is classic Dr. Dan Petersen; in my experience, his "Six Criteria of Safety Excellence," which focus on accountabilities, work very well. This kind of safety accountability system also addresses engineering controls, risk assessments, near misses, etc.

 
When Dr. Deming started his assault on quality gaffes (errors), I believe he ran into the same statistical navel-gazing syndrome, which in turn led to never-ending academic hypotheses. His solutions to this dilemma included:
  • Chart the errors (control charts and the like)
  • Engage people from all levels of the organization in a kaizen culture of the endless pursuit of perfection
  • Do a risk assessment of all designs with both engineers and people from the floor who have practical experience
  • Analyze every error (incident) which has occurred and do meaningful corrective actions which get to root causes and deliver both engineering and accountability improvements

 
The later DMAIC mantra of the quality movement (Define, Measure, Analyze, Improve, Control) captures these same elements.

 
We need a safety culture which does something similar with our errors (incidents, including close calls). This works in organizations when leadership engages, just as it did in the quality revolution. Some leading-edge, safety-focused organizations have joined this kind of safety revolution and are doing very well at eliminating both minimal and more serious incidents.

 
Regardless of the kinds of risk involved, a good near miss system/process is an excellent tool to do this. Our field experience with companies that have done this is astounding! However, you must do a good job of error proofing the near miss process so it fits your culture and your objectives. Just dropping in a "program of the month" does not work well at all. You need a system which:
  • Generates hundreds of near miss reports per month
  • Grades each near miss by risk as Red, Yellow, or Green
  • Solves 90+% of all near misses, no matter the risk, within 3-5 days by the people who turn them in
When organizations have their employees live this kind of error proofing, the resultant near miss system reduces both lesser and more serious injury rates significantly. Indeed, employees develop a culture in which they don't take any kind of noticeable risk on their own, without needing a 24/7 supervisor. It is not a matter of the kind of risk; rather, it is about a safety culture that eliminates all risks. And a good, practical near miss system does this very well. If you would like to discuss how to implement this kind of near miss success, please give me a call.
