Six Thinking Hats for Safety

Almost a year ago on this blog, Sidney Dekker asked “Can safety renew itself?”. He asked whether the profession was even capable of doing this, given its goal of eliminating what goes wrong: “For a profession that is organized around the elimination, reduction and control of risk, innovation can be a tall order”. Since then, there has been progress. Significantly, a new way of looking at safety has further emerged – so-called Safety-II – and the resilience engineering movement has gathered pace. Rocking the boat can annoy people, but more importantly it gets people thinking and asking some fundamental questions; questions about paradigm, purpose and processes. It is unlikely that safety is unique, that there can be only one paradigm, one way. But that is sometimes how it feels in practice. For all professions, there is so much invested in paradigms that new thinking is resisted.

So I wonder how we might switch our thinking to avoid a dogmatic approach. Perhaps there is a way to try before you buy, without necessarily buying a one-way ticket to new safety. I rediscovered some of Edward de Bono’s work, and in particular his little book on Six Thinking Hats®. The approach employs parallel thinking, and encourages everyone to think from the same perspective at a particular moment in a workshop. It struck me that the hats provide a way to think differently about safety – not just in a meeting, but more generally, and not necessarily lock, stock and barrel.

Considering the hats with regard to our thinking about safety, I had a few thoughts. Here they are:

White Hat: Facts and Figures

With the White Hat, we might ask ourselves a few questions about our safety data – particularly the neutral(ish) ‘hard facts’. An obvious question is, what facts and figures about work and safety do we actually collect? Considering whatever data we have and use, we need to ask ourselves, what are we really measuring? If we collect only negative outcome data, then this may tell us little about how safe the system actually is. To understand this, we need to understand how ordinary work works, not just exceptional events. But a more fundamental question is this: why do we measure what we measure (and in the way that we measure it)? We rarely ask ourselves these sorts of questions, but purpose is a fundamental aspect of any system. There may well be several purposes, some of which may be incompatible. But the answer might reveal something about our ideas about safety. With purpose in mind, we might now ask, what do we need to measure, how often, and over what time? What we actually measure and what we need to measure according to purpose may be different things. If our purpose is to improve the system, then we need data on how the system works – how the work works. The problem is, this leads to measuring things that don’t obviously (to us) relate to safety. Organisations often collect a vast array of data. Some of the data we need to understand safety is not the data collected by safety departments, but may well already exist somewhere. Perhaps a habit of gathering ‘safety data’, rather than work data, may be part of the problem. If we want to understand safety, we need to get out from behind our desks.

Red Hat: Gut Feelings

The Red Hat concerns emotion and instinct – our gut feelings. We all have them, even if they are accessible in their untwisted form only for moments. Gut feelings can be a problem for safety professionals. Safety professionals are drawn to ‘facts’, logic, analysis, process, method, technique – but not gut feeling. Other safety actors, however – pilots and air traffic controllers, doctors and nurses – clearly do value gut feeling, and act on it. And according to Gerd Gigerenzer, it usually works very well.

With the Red Hat, we might ask, what are our immediate and initial gut feelings, intuitions or emotions with regard to work and safety? Unless we are close to the action, we might not have the exposure needed to cultivate these gut feelings, or sense those of others. We might have reactions to secondary data, but not first-hand experience of operations. Even if gut feelings don’t show up on our internal radars, others closer to the action will have them, and we can access these. If we are aware of some gut feelings – our own or others – do we take notice of them, or disregard them? Our organisational processes and systems, or our own preferred way of thinking, may encourage us to reason-away our inner voices. That is a terrible waste of data. Even if we attend to gut feelings, do we talk about them? If we don’t value gut feelings, we are less likely to talk about them because they are not ‘evidence’ or ‘facts’. Perhaps we are happier with partial or distorted statistics. And even if we talk about our gut feelings, do we act on them as an organisation? Front-line workers act on their gut feelings. Under intense time pressure, they have to. So when we safety professionals have strong gut feelings about something, will we act?

Black Hat: What Goes Wrong

The Black Hat is the most comfortable hat for safety professionals. It concerns things that go wrong, or might go wrong, and indeed is a hat that we must wear. Along with the White Hat, it might seem like the others are superfluous (even ridiculous). The key Black Hat question seems to be, how do we think about failure? Whether or not we realise it, we all have an accident model. It’s the way we think about causality and failure – in the head or written down. It may be simple, linear and direct, or a bit more complicated like a set of dominos or slices of Swiss cheese. Or it may be more of a network of influences, characterised by non-linear relationships, causal loops and emergence. Formally, we may use methods that are more reductionist or more holistic. We may or may not really buy into the methods we use. A particular problem arises when the way that we understand systems no longer matches the methods we (have to) use. Another issue is how we consider emergence, including the unwanted consequences of interventions and projects. Emergent properties are surprising – some would say unpredictable by nature – but we can learn from previous emergent phenomena, and use systemic and creative approaches to try to understand emergence.

Yellow Hat: What Goes Right

The Yellow Hat looks at what goes right: positives, benefits, success. It is not a hat that safety professionals often wear. It’s not that we lack methods – there are hundreds of methods from operations research, systems thinking and human factors. It’s just that we don’t think in this way. Perhaps there is a difference in thinking between front-line workers, who think of safe operations as ensuring things go right, and safety professionals, who think of safety as ensuring things don’t go wrong. There is good reason for this – someone has to wear the black hat, and that is to some extent our cross to bear and our hat to wear. But wearing only this hat is counterproductive. It separates safety (or unsafety) from, and sets us against, other organisational goals. To understand what goes right, we need to look at ordinary work, as well as exceptional performance. How often do safety investigators use their skills to investigate success or performance variability? Not very often. How often do safety assessors consider safety benefits? Some are starting to do this, but it is not the norm. Wearing this hat, we can look at how the work works, and how and why it normally works so well. This can be extended and enhanced, perhaps using appreciative inquiry’s cycle of discover, dream, design and deploy. However we do it, we need to direct some of our attention to understanding the adjustments, trade-offs and conditions from which safe operations emerge.

Green Hat: Creativity and Innovation

Creativity is not something that one naturally aligns with the safety profession. And neither is innovation. Perhaps, as Dekker says, the profession is conservative and risk averse by nature. I tend to think systemic factors are at play – particularly regulation. But perhaps, even with such constraints, we can use creativity to generate new ideas and innovate in order to improve work, and therefore safety. This means going beyond analysis, structure, process and order, but it does not mean abandoning them. Can we use creativity to overcome obstacles, or to achieve possibilities? As a start, it might mean stopping doing things the way we routinely do them, for a while at least. On a personal level, I try to look to completely different fields, such as psychotherapy, film, design and photography. For instance, in designing the safety culture discussion cards, I worked with mental imagery and photos (abstract and concrete). In safety culture focus groups, creativity might involve free-wheeling discussions where anything goes – at least for a while – where there is no structure. Where they end up, without being forced through process, is often surprising – and useful.

Blue Hat: Thinking Process

Blue Hat Thinking is about meta-cognition, and invites us to think about how we think about safety. What is our safety paradigm? Do we have default or habitual ways of thinking, perhaps coded in tools and methods? Is it possible to switch to new ways of thinking, at least for periods, to think differently about safety issues? The Blue Hat manages the thinking process, so we might consciously switch between different hats and different sorts of thinking. Some tools and methods are more appropriate for some situations, problems or opportunities than others. Similarly, a way of thinking may be appropriate some times and not others, or we might need to blend aspects of different modes of thinking. But personal and system factors keep us attached to one paradigm. Can we shift? Might we move towards Safety-II thinking – even just to try? If we find our existing paradigm unsatisfactory, how might we change paradigms? Might we even transcend paradigms – open-minded, willing to adapt and try the best of what is? If we can, we have our hands on the most powerful of Donella Meadows’ twelve leverage points to intervene in a system.


I have since used the hats with safety specialists, investigators and operational staff in a workshop to re-look at an old but important occurrence. Each hat was ‘worn’ for a short amount of time, starting with the White Hat (worn for the longest time), then the Red Hat (worn for the shortest time, 10–30 seconds for each person), then the Black Hat, Yellow Hat and Green Hat, with the Blue Hat directing the use of the hats. The results gave a new perspective, especially highlighting how gut feelings might influence the direction of an analysis, how creative thinking can lead to new insights, and how looking at what went right can rebalance the nature of an investigation.

Note: This post was originally published at
Photo: Jeremy Brooks CC BY-NC 2.0

4 thoughts on “Six Thinking Hats for Safety”

  1. Hi Steven, That brings back memories. Twenty years ago I did the six hats train-the-trainer with Edward de Bono in Canada and later wrote my PhD thesis about the role of creativity in safety science using the six hats tools as an experimental intervention. John

  2. Hi John. That brought a smile to my face. Safety and creativity do not good bedfellows make, in some minds! When I told people about using the 6 hats in this way in a workshop, I got a couple of baffled responses, yet those actually present (including no-nonsense operational types!) found it a very valuable approach. In fact, the informal verbal nudge “Red hat” or “White hat” was spontaneously used and understood well after this particular workshop to prompt particular discussions or keep them focussed. It was also fun, and we all need joy-in-work – even (especially) in safety. I had no idea about your work here, and would love to see your PhD and any publications arising.

  3. Steven,

    Thanks for the insightful post. In my experience going back twenty years with the US Department of Energy’s nuclear energy facilities and large national laboratories, there have been a couple of movements that I believe align with your multiple-perspectives take on thinking about how success is achieved.

    In developing an activity-based planning system for conduct of a very complex mixture of multiple hazard decontamination and decommissioning tasks, I had the advantage of a good-sized, multidisciplinary team with considerable overlap in relevant experience. Some people held formal planning and hazard control positions going in, others had held those positions in other organizations or at different times.

    It soon became clear that with all of the one-off work activities we would be planning, that we really needed to “focus on the work” and “design the work to be doable – safely.” If it became clear that we faced significant uncertainty about the as is hazard conditions, then we first designed what we termed an “uncertainty reduction activity.” Focus on Working Safely to get the information needed to then plan the Hazard Reduction Activity itself.

    In the planning process we thought a lot about how to convince responsible managers and persons doing independent oversight that we had anticipated their concerns – and so our hazard assessment tables included write-ups for each step in the action plan from three different perspectives – worker, foreman, and manager. Part of the latter perspective was to illuminate beforehand what kinds of emergent circumstances might be resolved without stopping the work and which would lead back to the planning table.

    That experience led us to recognize the difference between the “Do Safety, Work when Safe Enough (maybe)” perspective (i.e. the tension-filled one), and that of Do the Work Safely. So far as I can tell those options conform with the essence of Safety I and Safety II. For the past 25 years, the importance of cultivating multiple perspectives on the path to success has been an essential aspect of all my thinking.

    Sadly, still today, many are hesitant to believe that a focus on Doing Work Safely, and especially reducing the amount of avoidable rework as a performance measure, is worth the investment. On that point I have no doubt.

    I look forward to reading Eric’s thoughts when the new book is released to confirm that conclusion.

  4. Great post – as there is increasing reliance on ‘gut feelings’, I think this shows the value of taking a balanced approach. Emotions and instincts play a role, but replacing another hat or only wearing the red hat is short-sighted and won’t produce a comprehensive picture. Heather
