There is something absolute about accidents. The loss, the destruction, and the irreversibility of it all – it’s clearly not what was intended. This unwanted character of accidents allows, affords, or even encourages a binary understanding of safety. Either things are safe, or they are not.
So when bad things happen, it makes sense to pull back, to constrain, to establish tighter control over what has failed. As if, so long as we stay away from danger and use more caution, we can rest assured that the accident will not happen again. Most efforts to establish safety remain driven by this simple principle: stay away from dangerous stuff!
This is problematic for at least two reasons:
First, dangers are everywhere. Sociologist Ruth Simpson argues that while we may sometimes conclude that dangers are present simply by observation, dangers can also develop suddenly (like bombs or earthquakes), invisibly (like gas or radiation), incrementally (like toxins in food), or lie hidden and dormant (like aneurysms). And in a world in which things are increasingly interconnected, and in which technological and social change is fast-paced, dangers may develop into harm in unprecedented ways. To conclude that we are safe, to think that we have got things right, therefore necessarily involves some (probably often unconscious) disregard for how things may go wrong. The point is that doing things the ‘right’ way is no guarantee of safety.
Second, as people react to accidents, and as they adopt a binary view of things as either safe or unsafe, they trade away the complexities and grey zones. As we step farther away from where accidents could occur, we simultaneously give up a space that may be rife with innovative and better ways of doing things. Safety through (excessive) caution stifles creativity: people are turned into procedural pawns, following what someone in an air-conditioned office far away has decided is the correct way to do a job. This way, fantastic resources go to waste – experience, ideas, up-to-date expert understanding. This way, organisations become inflexible, they start living in the past, and they give up the opportunity to be the best they can be.
In words often attributed to Charles Darwin: “In the struggle for survival, the fittest win out at the expense of their rivals because they succeed in adapting themselves best to their environment.” But instead of applying more intelligence and more collaboration to adapt wisely when it comes to safety, the traditional response is to use brute force, or to impose old solutions more frantically.
We need to open up the way we engage with safety. We need to allow for more variation, for innovation and for creativity. But not by discarding what we have learnt so far. We need something which can be described as ‘informed variability’, or ‘disciplined plurality’.
Variability or plurality is essential for successful adaptation – we need fresh perspectives, to try new things, and to seize opportunities. But discipline and keeping others informed matter too. We need to make sure that such initiatives are thoroughly shared and communicated, and we need to invite multiple viewpoints on such developments. By embracing plurality, complexity, and adaptive capacity within our organisations, we stand a better chance of meeting the challenges in our environment.