For the last century, the evolution of accident investigation can be tied to research and to scientific advancements in how we view our work systems. Three major lenses of scientific research emerge as we begin to examine key influences on accident investigation processes: an early faith in engineering termed Scientific Management, which was followed by Systems Thinking and, ultimately, an emerging understanding of Complex Adaptive Systems.
In the early 1900s, Scientific Management was the dominant system used to explain and improve work. Much of this approach emerged from the Industrial Revolution and was codified by Frederick Taylor. Emphasis was placed on the efficiency of machines and labor, the latter through the analytical assessment of workflows. The overriding goal of this system was to increase production by making all components of the system more efficient. Accidents were generally viewed as failures of individuals, and people were removed and replaced as if they were failed mechanical components.
It took time for organizations to realize that firing people was not achieving the goal of increased efficiency. Employers had to rebuild the skills of the people they dismissed just to recover the performance lost when good people had accidents. Major thought leaders like James Reason began to propose a different view.
Systems Thinking entered mainstream accident investigation in the 1980s. Investigators began to look at work systems, searching for the active failures and absent defenses that allowed accidents to happen. Prevention strategies centered on developing defenses in depth. Error management and error traps became popular: people were seen as triggering agents whose actions could be mitigated by finding the latent conditions lurking in the system, waiting to align when an event was triggered by active failures. Investigations were designed to identify absent defenses and active failures, and recommendations often pointed to the need for additional defenses that would block the holes in the “Swiss cheese.” Stronger regulations and tighter procedures were the natural result, as organizations tried to plug all the holes discovered during their thorough investigations. The Systems Thinking model produced significant improvements in areas like engineering, ergonomics, manufacturing, and quality control. It remains an effective tool for improving the predictable and stable aspects of our work environment, but questions lingered regarding the human contribution to accidents and incidents.
To this point, our research and experience had taught us to solve specific problems, creating both heroes and villains based almost solely on the outcome of the solution. These solutions had limited application, as we came to recognize that not all things are predictable. Greater understanding was demanded as researchers like Sidney Dekker began to point out that people were being named as intentional agents in their own injury, or even demise. Some researchers and practitioners recognized that the causes stated in many investigations (e.g., human error, pilot error) were, as Professor David Woods states, “simply labels that masquerade as explanations.” The agentive language used in most reports was further impeding our ability to learn from events.
As a researcher and practitioner, I began to knit several theoretical concepts together with the help of Professor Sidney Dekker. Leading this lineup were complex-systems research, organizational development, cognitive psychology, social psychology, sensemaking, high reliability organizing, and social construction. This happened as I was charged with conducting wildland fire accident investigations and found that the traditional tools did not fit my work environment. Wildland firefighting is complex: it relies heavily on human interaction and is largely devoid of technology. Ground firefighting operations have changed little technologically over the last 40-plus years, which made them an excellent proving ground for exploring accidents through the lens of Complex Adaptive Systems (CAS).
Complex systems are, by definition, not fully predictable, and as a result, uncertainty is recognized as a natural part of a CAS. Investigations that embrace the complex nature of work look beyond traditional cause-and-effect relationships by recognizing that every work evolution has unique attributes. This makes it difficult, if not impossible, to generalize, and often impossible for those involved to predict outcomes.
Investigators in a CAS are asked to consider why it made sense for people to do what they did. The focus of investigation shifts from judging actions or decisions as right or wrong, and people as good or bad, to developing an understanding of how worker actions can be tied to a network of influences, or performance shaping factors. When the focus shifts to learning and understanding all we can about what influenced actions, the way questions are asked, and the very language used to describe incidents, changes. The desired outcome is that organizations purposefully increase their capacity to learn, and workers recognize how important it is to learn their way through work. We add to our prevention toolbox by increasing our ability to recognize novel situations and by learning before, during, and after work operations.
The use of the word “systems” here refers to the set of principles or procedures, or the prevailing cultural or social orders, that guide how we see the world and how things are done.