A few weeks ago, I led a workshop about risk in the wonderful city of Bergen on the west coast of Norway. On the toilet mirror, I noticed stickers issued by the city council. They pictured a hand with text written on it in ‘permanent’ marker, saying “100% Clean Hand. 37 degrees”, and below that the encouragement “Wash your hands!”.
Well-intended as this sticker surely is, there is a lot of nonsense going on here.
Firstly, my mother wouldn’t agree at all. If I had shown up at supper with permanent marker all over my hands she would definitely NOT have judged them clean, not even 50%. I would have been sent to the bathroom to give them another good scrub, and there would have been no questioning that command.
Secondly, if I recall correctly, there are a lot of bacteria that thrive very well at 37 degrees Celsius. If Wikipedia is to be believed, this is even the optimal growth temperature for E. coli, a bacterium commonly found in the lower intestine of warm-blooded organisms. Which is, well, the area that you have probably been in contact with on the toilet.
What we see here are phenomena at work that we regularly encounter in safety (and health, quality, etc.). In this case, they are based on at least four Safety Myths:
- Believing that words don’t matter.
- Believing that firm language, based on absolutes, is good.
- Believing that perfection is good and can be achieved.
- Making information as simple as possible.
Let’s look at them one by one.
Myth 1: In this example, words get a different meaning than the one we attach to them in everyday life. Clean doesn’t really mean clean, but okay, we might forgive that as a white lie because the permanent marker is just a gimmick to attract attention. But we most certainly don’t mean 100%; rather “clean enough to safely eat your lunch” or something similar. More about that below.
On a practical note, how would you know that the water is 37 degrees Celsius? This is the average body temperature, but your outer extremities are probably colder than that. So the water should feel warm, but how much warmer than your hands? Do they actually mean 37 degrees, or something else?
Myth 2: Firm statements based on absolutes ooze a sense of certainty. Just check these:
100%. No Doubt. Definitely. Read My Lips.
Wonderful, aren’t they? Many perceive this as a better form of communication than more nuanced and somewhat ambiguous statements like “as clean as possible” or “good enough”. Often, however, this firm communication with absolutes lacks the basis to back it up. Cursory critical poking makes the arguments crumble. Washing your hands at 37 degrees Celsius will neither remove all dirt nor kill off all bacteria.
Interestingly, the stated temperature is also packed into a precise and firm statement, but the operational part (the temperature of the water you actually wash your hands with) will be approximate, at best. Unless we are required to bring thermometers to the bathroom.
Myth 3: Striving for perfection is unrealistic and actually counterproductive. Just imagine going for 100% clean hands. This would require either temperatures that would in fact hurt you, or a level of scrubbing that damages your skin (and leaves everything bloody, which isn’t really clean either). Additionally, you would also remove the ‘good germs’ on your skin that you need.
Besides, you should not forget that some dirt may actually be good for you - it helps you build resistance (a process called hormesis).
Myth 4: Of course, information should be easy to understand, but simplification is often taken so far that it ends up dumbing down a complex subject into a soundbite. The result of the cleaning process is a combination of temperature, the amount of water, time, chemicals (soap, alcohol) and mechanical technique (e.g. what parts to pay special attention to and the amount of scrubbing). Feel free to experiment with those factors the next time you do the dishes; it may make the process much more fun too.
Stripping it down to just a temperature is severely misguided, as one may be led to believe that this is all that matters while it is not.
As my dad used to say: you shouldn’t believe everything you read. This sticker falls squarely into that category. Luckily these stickers were only intended for ‘common’ people and not for occupations with a higher risk of contamination and infection through contact, like health care (find better alternatives here and here). Still, it’s a questionable practice.
Safety Myth 101
You won’t find this exact example, but a great variety of these and other Safety Myths, in the book Safety Myth 101 that has just been published through Mind The Risk. Available from, among others, Amazon - find links at:
Also published on Linkedin
Within the Safety community there is some discussion going on about the ‘new’ and ‘old’ view on certain things. Some people seem to think that things are black and white and that it’s either one or the other. It must be stressed over and over again, however, that both views can be complementary and both have value - depending upon what element you want to focus on. A nice illustration of this occurred during the Learning Lab on Critical Thinking In Safety at Lund University in January 2016.
The Learning Lab took place at the Pufendorf Institute in Lund. The Institute is housed in a wonderful historical building. During a past upgrade, some genius designer apparently thought it fit to put power outlets in the floor right behind the entrances of the big room where the Learning Lab was held (and liked the idea so much that he/she did this for all three entrances!). Of course people needed power to feed their smart phones and iPads, so most of the time one or more charging cables were lying on the floor right behind the door.
As said, this example illustrates very nicely both the old view and the new view of safety - and that they go well together.
This situation provided a perfect empirical case for the (by some) much loathed safety pyramid. People continuously had to get in and out of the room and stepped over the cables (a clear tripping hazard that, interestingly, none of the safety folks in the room acted upon - including yours truly, because I was observing this experiment…). It was only the course leader, JB, who tripped once over the half-opened power outlet (but did not fall or hurt himself).
In traditional ‘Heinrichian’ language we can say that we observed many unsafe acts and one near-miss. I have to admit that I didn’t count the ratio, but then, the numbers are irrelevant and context dependent; it’s the idea that counts.
What we have here is a nice little example of more or less normal (they were safety professionals, after all) people making do with what there is. We all respected the fact that those unfortunate enough to sit too far away from the wall-mounted power outlets also needed to charge their gadgets (ETTOing in a way) and figured we were careful enough to deal with the variability. When JB tripped, he was resilient enough to catch himself.
So, who’s right?
Who really cares? One could argue that we normalised deviance or operated clearly at the margin, or one could make a case for situational awareness or some kind of safety culture. It all depends upon the narrative you choose - which was one of the main themes of the Learning Lab... It’s not so much a case of being right, but of what analytical choices you make - and in the end what you want to use it for (hopefully improvement, or possibly as inspiration for a blog).
Also published on Linkedin
The other day a friend of mine forwarded an invitation to an event on safe behaviour. This invitation advertised (apparently successfully, because the event was fully booked in no time) with, among others, the following quote:
“Experts will give you tools that enable you to have employees working safely by themselves”
(my translation, which alas doesn’t perfectly capture all nuances)
This text looks perfectly innocent, well-intended even. After all, it appears to aim at improving safety! But let’s look closer at what they are actually saying.
Decomposing the quote from the end, “have employees working safely by themselves” in fact says that people aren’t able to work safely without someone prescribing how to do it. Even more: not only are they incompetent or ignorant when it comes to safety, apparently they aren’t particularly motivated either. They cannot, or will not, work safely without an intervention.
But despair not. In the first part of the quote lies Salvation! Luckily there will be experts at this event who will teach you (the safety practitioner) how to deal with these impossible workers.
Has anyone actually thought about what kind of message and attitude towards the majority of the population speaks from this tiny quote? People are dumb, lazy and unmotivated, and need experts and safety practitioners to do their jobs properly.
As far as I know, nobody ever comes to work with the objective to do a bad job; getting up in the morning with a thought of “today I’m going to have me a nice little accident”. If this event isn’t going to be a prime example of doing safety TO people instead of doing it WITH them, then I wonder what will.
I’m not particularly surprised that people have a certain perception of health and safety. If we approach them in this condescending way, we shouldn’t be surprised that they find us arrogant and remote from reality. So my advice:
Watch your language!
Also published on Linkedin
With Peers Like This, part 5 - Skyfall, SPECTRE and Safety. Or: The Wrong Things For The Right Reasons
It’s often heard that only our lack of imagination limits our hazard identifications or risk assessments. What we cannot imagine we cannot assess and there lies a problem because we might miss something important. Indeed, missing a hazard may cause serious problems afterwards, but I do not subscribe to the view that it’s a lack of imagination that limits our hazard identifications (if it is, I pity my colleagues who suffer from this condition).
In most cases I think it’s rather a sense of potential embarrassment that keeps us from coming up with certain scenarios. We don’t dare to mention them because people might think we’re not taking our job seriously (e.g. when we come up with a scenario of UFOs landing on top of the building), or because others might not take the scenario as a serious possibility (e.g. a plane crashing into the building). And smiling while working on safety seems to be a criminal offense - according to some, anyway.
Film makers typically don’t have these restrictions, and I’ve found myself on various occasions discussing scenarios from movies in a safety context with similarly open-minded colleagues. Take the opening sequence from Skyfall with the excavator ripping open a train. I have literally participated in hundreds of risk assessments that dealt with trains and excavators, often even both at the same time, but this scenario never came up! (Although, close enough, excavators coming within the profile of the rail tracks tend to be a point of concern on almost any occasion, and rightly so.)
The new 007 movie, SPECTRE, has also provided some good material for discussion. This time it wasn’t the action or the stunts (so far - I have to watch that snow chase a few more times, however), but the organizational/managerial/political aspect; more specifically the role of C.
I don’t want to give away too much of the plot (mild Spoiler Alert), but there have been major discussions around me about whether Andrew Scott’s character is really one of the bad guys, or just a misguided (and somewhat incompetent MBA-type) bureaucrat who gets lured in the wrong direction in his idealistic quest for certainty and making the world a safer place.
Pending repeated viewings of the movie and the assembling of evidence, consensus for now leans toward the second choice. My friend Alan Quilley commented that the C character reminded him a lot of some safety guys who are doing all the wrong things for the right reasons. Examples of that are all around us… Plenty of Safety Programs build on them. Traditions and Established Beliefs are important drivers, as is a lack of Critical Thinking. Safety Slogans and Safety Absolutes like Zero Accidents and Safety First are clear examples of the Wrong Things for the Right Reasons.
Having 007 around with his licence to kill, solutions appear to be easy. In a movie at least. Solutions of that kind are not really an option in our work as Safety Professionals, of course. But it would be a good start to ask ourselves whether we are really doing the Right Things. And then maybe reconsider because the Right Reasons alone may be not enough…
Since the end of the year is nearing we might consider this as a Professional New Year’s Resolution… And while you’re at it: if applicable (you decide, but be honest) do something about your imagination, embarrassment and sense of humour!
Also published on Linkedin
The safety profession does have a bad reputation in some cases. One of the general perceptions is that safety work takes a lot of resources and time without giving a proper return on investment. While safety professionals will have a much more nuanced (and sometimes naïve and idealistic) view on the matter, one has to admit that doing a good job can be demanding and time consuming. Take risk assessments, for example. These optimally strive to be as complete as possible. After that, it’s an almost equally big job to document them in a proper, usable and understandable way.
Of course there are quite a few risk assessments (and other ‘safety’ activities) that are done purely as an administrative exercise, as window dressing, or to satisfy some requirement from regulations or internal procedures. Those should be cut out immediately to stop the unnecessary waste of time and energy. But that’s not what we are talking about here. The question is whether we can majorly increase the effectiveness of the risk assessments that do have to be done, by working smartly and creating the possibility to reallocate resources to actual improvement.
There is little literature on the subject so far (although I’m sure that many companies are practicing some form of recycling and reuse), but my friend Rune Winther has done some work in this direction, towards a systematic method. Results so far are encouraging. For a series of similar assessments, an 80% reduction in the use of resources from the first to the third assessment was indicated.
Winther’s paper describes that there is often a significant overlap of hazards between similar projects, and that the hazards and failure modes unique to a specific project may in fact represent only a minor part of all hazards and failure modes. There is a serious possibility that time is ‘wasted’ on identifying hazards and failure modes that are already well known and often adequately mitigated. Many safety professionals can probably confirm this finding from their own experience. The aim should be to use as little effort as possible on the identification process, while still being complete. Systematic reuse is one way to achieve this.
However, one must watch out that reuse and recycling of material does not just end up as a simple copy-and-paste job. I remember all too well an episode sometime in the early 2000s when we were contacted by the Health and Safety Inspectorate, which had visited a contractor and checked a couple of safety plans and other required documentation. They wondered why work on the rail tracks (which was what the project they had visited was all about) at one place included the painting of a railway bridge about 200 kilometers away.
Luckily there was no real harm done (if I recall correctly, the inspector actually thought it was funny), but of course it sends the wrong message and doesn’t really give the impression that safety management is being taken seriously (the safety plans had, after all, been approved by our project manager). Worse: other ‘copy and paste errors’ are conceivable that might even lead to dangerous, maybe even life-threatening situations. Another danger that lurks in cases like these is the possibility of systematically copying errors, which may spread an isolated problem to other situations.
Also, if one wants to reuse information, it’s important to check whether the situation or system that was assessed previously is indeed similar to the one at hand. One must watch out not to overlook things that are unique to the current situation compared to previous assessments. Important keywords are the context (which may be quite different from situation to situation, even if they seem similar) and interfaces and interactions (for example as a consequence of the integration of a sub-system into the system as a whole).
To tackle these and other problems, Winther argues that reuse should be based on hazards defined at the subsystem level, because these hazards will be more generic than hazards defined at the system level. It’s important to use clear generic subsystem definitions, because they are an aid in evaluating the relevance and completeness of generic lists of hazards and failure modes. Good definitions help to identify the differences between a generic case and each specific case.
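The idea of reuse at the subsystem level can be sketched in a few lines of code. This is a minimal, hypothetical illustration only; the subsystem names, example hazards and the function itself are my own assumptions, not taken from Winther’s paper. The point it shows: generic per-subsystem hazard lists are reused wholesale, only the project-specific delta needs fresh identification, and each reused hazard is explicitly flagged for a relevance check against the current context.

```python
# Hypothetical sketch of subsystem-level hazard reuse. All names and
# example hazards below are illustrative assumptions, not from the paper.

# Generic hazard lists keyed by subsystem type, built up across projects.
GENERIC_HAZARDS = {
    "traction_substation": [
        "electric shock during maintenance",
        "arc flash at switchgear",
    ],
    "rail_track": [
        "excavator encroaching on the track profile",
        "tripping over cables",
    ],
}

def assemble_hazard_list(subsystems, project_specific):
    """Combine generic per-subsystem hazards with project-specific ones.

    The generic part is reused; only the project-specific delta needs
    fresh identification work, which is where the effort saving lies.
    Reused entries are flagged so they still get a context check.
    """
    hazards = []
    for sub in subsystems:
        for h in GENERIC_HAZARDS.get(sub, []):
            hazards.append((sub, h, "generic - verify relevance in this context"))
    for sub, h in project_specific:
        hazards.append((sub, h, "project-specific"))
    return hazards

# Example: a project with two subsystem types and one unique hazard.
result = assemble_hazard_list(
    ["traction_substation", "rail_track"],
    [("rail_track", "bridge painting near live track")],
)
for sub, hazard, status in result:
    print(f"{sub}: {hazard} [{status}]")
```

Note that the “verify relevance in this context” flag is doing real work here: it is what keeps reuse from degenerating into the copy-and-paste job warned against above.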
As often with research, more is needed, but as many of us will assume from our own experience, there is good potential here to improve and do things in a smarter way. It’s important to go forward systematically and with caution, to accept some trial and error, and to do a good deal of tinkering. The results may be very rewarding indeed. I for one am looking forward to hearing about your experiences!
A Pragmatic Approach To The Reuse Of Qualitative Risk And Reliability Analyses - Experiences From Analyses Of Railway Traction Substations (Rune Winther, 2015, presented at this year’s ESREL conference).
Get the full article through Researchgate
Also posted on Linkedin