In our rush to judgment we rarely intend to do harm. Often, we react to incomplete or even scant information, fit it into our own mental model of how things should be and then jump to conclusions that could inflict harm.
Last week, CBS Morning News showed a film clip of a man snagging a baseball from a kid who was sitting directly in front of him. The clip, less than ten seconds long, resulted in the vilification of the man as a bully who stole the ball from a little kid. One day later, the same news show apologized to the man for jumping to judgment. Why the change? Simply put, the news media learned the context around his actions and found that the man had caught and given away several balls to those around him, including the boy in front of him.
Context is what helps us walk in the shoes of others, but it takes time and effort to learn context. Often it is much easier to live in a land of blame and shame. Our organizational responses to incidents and accidents have followed the same path, producing investigation reports that name individuals as the cause of accidents without any mechanism to help the investigator learn or discover context.
Our tendency is to oversimplify. Enter the concept of requisite variety, which implies that our assessments of systems must be as complex as the systems we are scrutinizing. Yet so many of our processes are not designed to embrace complex systems.
This is what we learned in the US Forest Service as we attempted to “investigate” fatal accidents. The processes we had drove us toward judgment. People were simply admonished and told to follow the rules. When we found that someone did not follow a rule, we didn’t ask questions about the rule; we simply wrote, “The worker failed to follow rules, regulations, policy or procedures.” The conditions surrounding the action were treated as irrelevant. Our Serious Accident Investigation Guide actually said,
“The causes of most accidents are the result of failures to observe established policies, procedures and controls.”
The stated purpose of investigations was prevention, yet we were not learning how to prevent accidents. We were, however, perfecting our skills at blaming others.
We had to learn how to learn from our systems following an accident and, more importantly, we had to learn how to learn when the system was delivering the unexpected. These unexpected situations exist outside our ability to fully predict and, therefore, cannot be fully regulated or controlled.
The result of our investigations was often admonishment, sometimes criminal prosecution, and often a demand for simple compliance. For those of us working the fire-line or flying planes in firefighting operations, this approach felt like our leaders were telling us not to have accidents, because we knew that we could not simply follow all the rules. Asking pilots not to crash may have felt good and given a sense that we had met our responsibility, but it had no effect on our accident rates. Our definition of safety had to change, and along with the new definition we had to reconsider our metrics of success. We needed to marry compliance with innovation: comply when it made sense and the rules fit the situation, while giving room for innovation when the system delivered the unexpected.
Our journey took many years. It was led by field personnel who wanted a different approach and by leaders who recognized that the system was complex. A milestone was the development of learning-focused approaches to the organizational response to incidents and accidents. One big breakthrough came when we realized that our old processes robbed us of the context that held blame in check. The result of our routine response to accidents was distrust in both the system and leadership, and that distrust led directly to silence. We finally concluded that the currency of safety is information, and that we had to protect employees’ trust in order to understand the difference between work as imagined and work as performed.
We also learned that the system is dynamic and that rules need to be reviewed against current information. Adding rules made the system more cumbersome and vulnerable. In dynamic (complex) systems there is variability that defies prediction. Workers have to recognize a situation as novel, make sense of new (often conflicting) information, learn in the moment, and devise innovations to fit the new conditions.
Investigations also had to understand actions rather than judge them, and to capture the conditions (factors) that influenced people to do what they did. A principle emerged for us: people act in ways that make sense to them based on their training, heuristics, biases, and the conditions they perceive at the time, not because they are bad actors. From this perspective, an accident is not seen as a choice; after all, who would choose to have an accident? Rather, it is seen as a natural outgrowth of normal system and human variability.
As I mentioned in the first paragraph, “In our rush to judgment we rarely intend to do harm. Often, we react to incomplete or even scant information, fit it into our own mental model of how things should be and then jump to conclusions that could inflict harm.” Workers are just like leaders. We are all people who are influenced by what we see, hear and feel. These influences form the context: the conditions that shape our decisions and actions. It is our duty to understand those conditions. To paraphrase Professor James Reason: you cannot change the human condition, but you can change the conditions under which people work. We learned that, to meet the challenge posed by Reason, we had to shape our processes to look for those conditions.
I have to thank Professors Reuben McDaniel, Karl Weick, Sidney Dekker, Eric Hollnagel and David Woods, along with the over 400 wildland firefighters who have died in the US since 1996, for their contributions to the concepts explored in this op-ed.
Editor’s Note: For more from Ivan, check out his TEDx talk on this subject.