A chapter-by-chapter summary of:

 

Just Culture - Sidney Dekker

2007, Ashgate Publishing, ISBN 978-0-7546-7267-8

 

Summary mainly constructed from direct quotes…

 

Comments by BUCA (my comments in the text are in italics and marked with my initials):

1.     The author doesn't define the term 'just culture' at the beginning of the book. Either he takes it in the sense James Reason described in his book on organizational accidents and "hopes" the reader is familiar with it, or he wants the reader to discover how he understands 'just culture' by reading the book…

2.     The author takes the phenomenon of criminalizing error (especially in the Anglo-American legal system) as the starting point for his discussion. But with only a little thought one can see how the conclusions and discussions are relevant to other contexts, such as European legal systems and/or internal company processes (compliance focus and scapegoating versus learning and improving).

 

Preface and prologue

Dekker poses some of the themes he will discuss later in the book:

·          There is a trend towards criminalization of human error.

·          There are no objective lines between 'error' and 'crime'. These are constructed by people and are drawn differently every time. What matters is not so much where the line gets drawn, but who draws it.

·          Hindsight is an important factor in determining culpability.

·          Multiple (overlapping) interpretations of an act are possible and often necessary to capture its complexity.

·          Some interpretations (notably criminalization), however, have significant negative consequences for safety that overshadow any possible positive effects.

·          When we see errors as crimes, accountability is backward-looking and means blaming and punishing. When we instead see an error as an organizational, operational, technical, educational or political issue, accountability becomes forward-looking and can be used for improvement.

 

1: Why bother with a just culture?

The consequences of admitting or reporting a mistake can be bad. Some professionals have a "code of silence" (omertà); this is not uncommon, nor does it apply only to a "few bad apples". The problems are often structural issues of trust and relationships between parties. Trust is critical: hard to build, but easy to break. And there is little willingness to share information if there is fear of being nailed for it.

 

Responding to failure is an ethical question. Does 'just' have to do with legal criteria, or is 'just' something that takes different perspectives, interests, duties and alternative consequences into account?

 

Once the legal system gets involved there is little chance of a "just" outcome or of improved safety. Rather than investing in safety, people and organizations invest in defensive posturing so they are better protected against prosecutorial attention. Rather than increasing the flow of safety-related information, legal action has a way of cutting off that flow. Safety reporting often takes a harsh blow when things go to court.

 

Case treatment in court tends to gloss over the most difficult professional judgments where only hindsight can tell right from wrong.

 

Judicial proceedings can rudely interfere with an organization's priorities and policies. They can redirect resources into projects or protective measures that have little to do with the organization's original mandate. Instead of improvements in the primary process of the organization, they can lead to "improvements" in all the stuff swirling around the primary process: bureaucracy, involvement of the legal department, book-keeping and micromanagement. These things often make work for the people at the sharp end more difficult, lower in quality, more cumbersome and less safe.

 

Accountability is a trust issue and fundamental to human relationships. Being able to, and having to, explain why one did what one did is a basis for a decent, open, functioning society. Just culture is about balancing safety (learning and improvement) and accountability. A just culture wants people to bring information on what must be improved to groups or people who can do something about it, and to spend effort and resources on improvements with a safety dividend rather than deflecting resources into legal protection and limiting liability. This is forward-looking accountability. Accountability must not only acknowledge the mistake and the harm resulting from it, but also lay out responsibilities and opportunities for making changes so that the probability of the same mistake/harm in the future goes down.

 

Research shows that not having a just culture is bad for morale, commitment to the organization, job satisfaction, and willingness to do that little extra outside one's role. A just culture is necessary if you want to monitor the safety of an operation, have an idea of the capability of the people or organization, and effectively meet the problems coming your way. A just culture enables people to concentrate on doing a quality job and making better decisions, rather than on limiting (personal) liability and making defensive decisions. It promotes long-term investments in safety over short-term measures to limit legal or media exposure.

 

Wanting everything in the open, but not tolerating everything.

 

2: Between Culpable and Blameless

Events can have different interpretations: is a mistake just a mistake or a culpable act? Often there is no objective answer, only how you make up your mind about it.

 

Companies often impose difficult choices on their employees. On one side: "never break rules, safety first"; on the other: "don't cost us time or money, meet your operational targets, don't find reasons why you can't".

 

A single account cannot do justice to the complexity of events. A just culture:

·          accepts nobody’s account as ‘true’ or ‘right’,

·          is not about absolutes but about compromises,

·          pays attention to the “view from below”,

·          is not about achieving power goals,

·          says that disclosure matters, and that protecting those who disclose matters just as much,

·          needs proportionality and decency.

 

3: The importance, risk and protection of reporting

The point of reporting is to contribute to organizational learning, in order to prevent recurrence through systematic changes that redress some of the basic circumstances in which work went awry.

 

What to report is a matter of judgment; often only the outcome leads us to see an event as safety-relevant. Experience and "blunting" (an event becoming "normal") affect reporting. The ethical obligation: if in doubt, report.

 

In a just culture people will not be blamed for their mistakes if they honestly report them; the organization benefits much more from learning from the mistakes than from blaming the people who made them. Many people fail to report not because they are dishonest, but because they fear the consequences or have no faith that anything meaningful will be done with what they tell. One threat is that information falls into the wrong hands (e.g. a prosecutor, or the media - notably for government-related agencies, due to freedom of information legislation!). For this reason some countries give safety data certain protections and exempt this information from use in court.

 

Getting people to report is difficult, and keeping the reporting up is equally challenging. It is about maximizing accessibility (low threshold, easy system, …) and minimizing anxiety (employees who report must feel safe). It means building trust, involvement, participation and empowerment. Let the reporter be part of the process of shaping improvement, and give feedback. It may help to have a relatively independent (safety) staff to report to instead of line reporting (though line reporting does have advantages: it puts the problem where it should be treated and keeps the learning close to the process).

 

4: The importance, risk and protection of disclosure

Reporting is giving information to supervisors, regulators and other agencies. Main focus: learning and improvement.

Disclosure is about giving information to customers, clients, patients and families. Main focus: ethical obligation, trust, professionalism.

These two can often clash, as can various kinds of reporting (internal/external). If information leaves the company, there is often the danger that it will be used against the reporting employees (blame, lawsuits, criminal prosecution).

 

Often people think that not providing an account of what happened (disclosure) means one has something to hide, and suspicion arises that a mistake was not an "honest" one. This again is a breach of trust that may lead to involvement of the legal apparatus. Truth will then be the first to suffer as the parties take defensive positions.

 

Many professions have a "hidden curriculum" in which professionals learn the rhetoric that turns a mistake into something that is no longer a mistake, make up stories to explain it away, or even keep a code of silence.

 

Honesty should fulfill the goals of learning from a mistake to improve safety and achieving justice in the aftermath of an event. But “wringing honesty” out of people in vulnerable positions is neither just nor safe. This kind of pressure will not bring out the story that serves the dual goal of improvement and justice.

 

5: Are all mistakes equal?

Dekker discusses two ways of looking at errors: technical and normative. The difference is made by people: the way they look at an error, talk about it and respond to it.

Technical errors are errors in roles: the professional performs his task diligently, but his present skills fall short of what the task requires. People can be very forgiving (even of serious lapses in technique) when they see these as a natural by-product of learning-by-doing. In complex and dynamic work where resource limitations and uncertainty reign, failure is going to be a lasting statistical reality. But technical errors should decrease (in frequency and seriousness) as experience goes up. A technical error is seen as an opportunity for learning; its benefit outweighs the disadvantages. Denial by the professional, however, may lead the people around him to see the error as a normative one.

Normative errors say something about the professional himself relative to the profession: a normative error shows a professional not fulfilling his role diligently. Also: if the professional is not honest in his account of what happened, his mistake will be seen as a normative error.

Worse, however, is that people are more likely to see a mistake as culpable when its outcome is really bad. Hindsight plays a very big role in how a mistake is handled!

 

Note by BUCA: Dekker discusses normative errors rather briefly (compared to technical ones). I'm surprised that he doesn't address the "compliance" issue under the moniker of normative errors. A look at errors from a compliance-oriented point of view is also rather accusatory and little focused on improvement!

 

6: Hindsight and determining culpability

We assume that if an outcome is good, then the process leading up to it must have been good too - that people did a good job. The inverse holds as well: we often conclude that people may not have done a good job when the outcome is bad.

 

If we know that an outcome is really bad then this influences how we see the behaviour leading up to it. We will be more likely to look for mistakes, or even negligence. We will be less inclined to see the behaviour as forgivable. The worse the outcome the more likely we are to see mistakes.

 

The same actions and assessments that represent a conscientious discharge of professional responsibility can, with knowledge of the outcome, come to be seen as culpable, normative mistakes. After the fact there are always opportunities to remind professionals of what they could have done better. Hindsight means that we:

·          oversimplify causality because we can start from the outcome and reason backwards to presumed or possible causes,

·          overestimate the likelihood of the outcome because we already have the outcome in our hands,

·          overrate the role of rule/procedure violations. There is always a gap between written guidance and actual practice (which almost never leads to trouble), but that gap takes on causal significance once we have a bad outcome to look at and reason back from,

·          misjudge the prominence or relevance of data presented to the people at the time,

·          match outcome with the actions that went before it. If the outcome was bad, then the actions must have been bad too (missed opportunities, bad assessments, wrong decisions, etc).

 

Anthony Hidden QC, in the report of the Clapham Junction accident inquiry: "There is almost no human action or decision that cannot be made to look flawed in the misleading light of hindsight. It is essential that the critic should keep himself constantly aware of that fact."

 

Rasmussen: if we find ourselves asking "how could they have been so negligent, so reckless, so irresponsible?", it is not because the people were behaving bizarrely. It is because we have chosen the wrong frame of reference for understanding their behaviour. If we really want to know whether people anticipated risks correctly, we must see the world through their eyes, without knowledge of the outcome, without knowing exactly which piece of data would turn out to be critical afterward.

 

7: You have nothing to fear if you’ve done nothing wrong

A no-blame culture is neither feasible nor desirable. Most people desire some level of accountability when a mishap occurs. All proposals for a just culture emphasize establishing, and building consensus around, some kind of line between legitimate and illegitimate behaviour. People then expect cases of gross negligence to jump out by themselves, but such judgments are neither objective nor unarguable. All research on hindsight bias shows that it is very difficult for us not to take the outcome into account.

 

The legitimacy (or culpability) of an act is not inherent in the act. It merely depends on where we draw the line. What we see as a crime and how much retribution we believe it deserves is hardly a function of the behaviour. It is a function of our interpretation of that behaviour.

 

People in all kinds of operational worlds knowingly violate safe operating procedures all the time - even procedures that can be shown to have been available, workable and correct (the question, of course, being: says who?). Following all applicable procedures means, in most cases, not getting the job done. Hindsight is great for laying out exactly which procedures were relevant (and available and workable and correct) for a particular task, even if the person doing the task would be the last in the world to think so.

 

Psychological research shows that the criminal culpability of an act is likely to be constructed as a function of three things:

·          whether the act was freely chosen,

·          whether the actor knew what was going to happen,

·          the actor's causal control (his or her unique impact on the outcome).

Here factors that establish personal control intensify blame attributions whereas constraints on personal control potentially mitigate blame.

 

Almost any act can be construed as willful disregard or negligence, if only that construction comes in the right rhetoric, from the legitimated authority. Drawing the line does not solve this problem. Who gets to draw the line must be considered (very carefully!), and structural arrangements must be made about it. The question is whether the people given this task are indeed able to take an objective, unarguable, neutral view from which they can separate right from wrong.

 

Just culture thus should not give people the illusion that it is simply about drawing a line. Instead it should give people clarity about who draws the line and what rules, values, traditions, language and legitimacy this person uses.

 

8: Without prosecutors there would be no crime

We do not normally ask the professionals themselves whether they believe their behaviour "crossed the line". Yet they were there; perhaps they know more about their own intentions than we can ever hope to gather. But…:

·          we suspect that they are too biased

·          we reckon that they may try to put themselves in a more positive light

·          we see their account as one-sided, distorted, skewed, partial - as a skirting of accountability rather than embracing it.

 

No view is ever neutral or objective. No view can be taken from nowhere. All views somehow have values, interests and stakes wrapped into them. Even a court's rendering of an act is not a clear view from an objective stance; it is the negotiated outcome of a social process. So a court may claim that it has an objective view of a professional's actions, but from the perspective of the professional (and his colleagues) that account is often incomplete, unfair, biased, partial.

 

To get to the “truth” you need multiple stories. Settling for only one version amounts to an injustice to the complexity of an adverse event. A just culture always takes multiple stories into account because:

·          telling it from one angle necessarily excludes aspects from other angles

·          no single account can ever claim that it, and it alone, depicts the world as it is,

·          if you want to explore opportunities for safety improvement you need to discover the full complexity, and many stories are needed for that.

 

9: Are judicial proceedings bad for safety?

Paradoxically, when the legal system gets involved, things get neither more just nor safer.

 

As long as there is fear that information provided in good faith can end up being used by the legal system, practitioners are not likely to engage in open reporting. It is a catch-22 for professionals: either report facts and risk being prosecuted for them, or do not report facts and risk being prosecuted for not reporting them (if they end up coming out along a different route).

 

Practitioners in many industries the world over are anxious about inappropriate involvement of judicial authorities in safety investigations that, according to them, have nothing to do with unlawful acts, misbehaviour, gross negligence or violations. Many organisations, and also regulators, are concerned that their safety efforts, such as encouraging incident reporting, are undermined. Normal structural processes of organizational learning are thus eviscerated, frustrated by the mere possibility of judicial proceedings against individual people. Judicial involvement (or the threat of it) can create a climate of fear and silence. In such a climate it can be difficult - if not impossible - to get access to information that may be critical to finding out what went wrong and what to do to prevent recurrence.

 

There is no evidence that the original purposes of a judicial system (such as prevention, retribution, or rehabilitation - not to mention getting a “true” account of what happened or actually serving “justice”) are furthered by criminalizing human error:

·          The idea that the charged or convicted practitioner will serve as an example that scares others into behaving more prudently is probably misguided: instead, practitioners will simply become more careful not to disclose what they have done.

·          The rehabilitative purpose of justice does not apply, because there is little to rehabilitate in a practitioner who was basically just doing his job.

·          Above that: correctional systems are not equipped to rehabilitate the kind of professional behaviour for which such people are convicted.

·          The legal system often excludes the notion of an accident or human error, simply because there are typically no such legal concepts.

Not only is the criminalization of human error by justice systems a questionable use of tax money (money that could be spent in better ways to improve safety) - it can actually end up hurting the interests of the society that the justice system is supposed to serve. Instead: if you want people in a system to account for their mistakes in ways that help the system learn and improve, charging and convicting a practitioner is unlikely to achieve that.

 

Practitioners like nurses and pilots endanger the lives of other people every day as part of their ordinary jobs. How something in those activities slides from normal to culpable is thus a hugely difficult assessment, for which a judicial system often lacks the data and expertise. Many factors, all necessary and only jointly sufficient, are needed to push a basically safe system over the edge into breakdown; single acts by single "culprits" are neither necessary nor sufficient. If you are held accountable by somebody who does not understand the first thing about what it means to be a professional in a particular setting, then you will likely see their calls for accountability as unfair, coarse and uninformed (and thus: unjust).

 

Summing up - judicial proceedings after an incident:

·          make people stop reporting incidents,

·          create a climate of fear,

·          often interfere with regulatory work,

·          stigmatize an incident as something shameful,

·          create stress and isolation that make practitioners perform less well in their jobs,

·          impede (safety) investigatory access to information.

 

More or less the same applies to civil legal actions as to criminal ones.

 

10: Stakeholders in the legal pursuit of justice

In this chapter Dekker discusses the various stakeholders:

·          victims,

·          suspect/defendant,

·          prosecutor,

·          defense lawyer,

·          safety investigators,

·          lawmakers,

·          the employing organisation.

 

Practitioners on trial have reason to be defensive, adversarial and ultimately limited in their disclosure ("everything you say can and will…").

 

Language in investigation reports should be oriented towards explaining why it made sense for people to do what they did, rather than judging them for what they allegedly did wrong before a bad outcome. An investigation board should not do the job of a prosecutor.

 

In countries with a Napoleonic law tradition a prosecutor has a “truth-finding role”. But combining a prosecutorial and (neutral) investigative role in this way can be difficult: a magistrate or prosecutor may be inclined to highlight certain facts over others.

 

In general it can be said that establishing facts is hard. The border between facts and interpretation/values often blurs, and what a fact means in the world from which it came can easily get lost. Expert witnesses are a solution to this problem, but prosecutors and lawyers often ask them questions that lie outside their actual expertise.

 

Dekker argues there is a difference between judges and scientists in how they derive judgments from facts. Scientists are required to leave a detailed trace that shows how their facts produced or supported particular conclusions. Dekker argues that scientific conclusions thus cannot be taken on faith (implying that the decisions of a judge can - I tend to agree in part, but want to state that both scientists' and judges' conclusions can contain a great deal of interpretation and thus "faith" - BUCA).

 

For employing organizations: The importance of programs for crisis intervention/peer support/stress management to help professionals with the aftermath of an incident cannot be overestimated.

 

Most professionals do not come to work to commit crimes (and this is a major difference from common criminal acts, which are nearly always intentional! - BUCA). Their actions make sense given their pressures and goals at the time. Professionals come to work to do a job, to do a good job. They do not have a motive to kill or cause damage. On the contrary: professionals' work in the domains this book talks about focuses on the creation of care, of quality, of safety.

 

11: Three questions for a just culture

Many organizations settle on pragmatic solutions that allow them some balance in the wake of a difficult incident. These solutions boil down to three questions:

1. Who in the organization or society gets to draw the line between acceptable and unacceptable behaviour?

2. What should the role of domain expertise be in judging whether behaviour is acceptable or not?

3. How protected are safety data against judicial interference?

 

re 1 - The more clearly a society, industry, profession or organization has agreed on who gets to draw the line, the more predictable the managerial or judicial consequences of an occurrence are likely to be. That is, practitioners will suffer less anxiety and uncertainty about what may happen in the wake of an occurrence, because arrangements have been agreed on and are in place.

 

re 2 - The greater the role of domain expertise in drawing the line, the less likely practitioners and organizations are to be exposed to unfair and inappropriate judicial proceedings. Domain experts can more easily form an understanding of the situation as it looked to the person at the time, as they probably know such situations from their own experience:

·          It is easier for domain experts to understand where somebody’s attention was directed. Even though the outcome of a sequence of events will reveal (in hindsight!) what data was really important, domain experts can make better judgments about the perhaps messy or noisy context of which these (now critical) data were part and understand why it was reasonable for the person in question to be focusing on other tasks and attention demands at the time.

·          It is easier for domain experts to understand the various goals that the person in question was pursuing at the time, and whether the priorities set in case of goal conflicts were reasonable.

·          It is easier for domain experts to assess whether any unwritten rules or norms may have played a role in people's behaviour. Without conforming to these tacit rules and norms, people often could not even get their work done. The reason, of course, is that written guidance and procedures are always incomplete as a model for practice in context. Practitioners need to bridge the gap between the written rule and the actual work-in-practice, which often involves a number of expert judgments. Outsiders often have no idea these norms exist, and would perhaps not understand their importance or relevance for getting the work done.

That said, domain experts may have biases of their own that work against their ability to fairly judge the quality of another expert's performance, such as psychological defensiveness ("if I admit that my colleague made a mistake, my position is more vulnerable too").

 

re 3 - The better safety data are protected from judicial interference, the more likely it is that practitioners will feel free to report.

 

Dekker then goes on to discuss various solutions with an "increasing level of just culture". Key elements in these: trust, existing cultures, and the legal foundation for the protection of safety data.

 

12: Not individuals or systems, but individuals in systems

The old view sees human error as a cause of incidents: to do something about incidents, we need to do something about the particular human involved. The new, or systems, view sees human error as a symptom, not a cause - human error is an effect of trouble deeper inside the system. Pellegrino says that looking at systems is not enough. We should improve systems to the best of our ability, but safety-critical work is ultimately channeled through relationships between human beings, or through the direct contact of some people with the risky technology. At this sharp end there is almost always a discretionary space into which no system improvement can completely reach (and which can thus only be filled by an individual human operating the technology). Rather than individuals versus systems, we should begin to understand the relationships and roles of individuals in systems. Systems cannot substitute for the responsibility borne by individuals within this space of ambiguity, uncertainty and moral choices. But systems can:

·          Be very clear where the discretionary space begins and ends.

·          Decide how to motivate people to carry out their responsibilities conscientiously inside that discretionary space. Will the source be fear or empowerment? In case of the former: remember that neither civil litigation nor criminal prosecution works as a deterrent against human error.

 

Rather than making people afraid, systems should make people participants in change and improvement. There is evidence that empowering people to affect their work conditions, to involve them in the outlines and content of that discretionary space, most actively promotes their willingness to shoulder their responsibilities inside of it. Holding people accountable and blaming people are two quite different things. Blaming people may in fact make them less accountable: they will tell fewer accounts.

 

Blame-free is not accountability-free. But we should create accountability not by blaming people, but by getting people actively involved in the creation of a better system to work in. Accountability should lay out the opportunities (and responsibilities!) for making changes so that the probability of the same harm happening again goes down. Getting rid of a few people who made mistakes (or bore responsibility for them) may not be seen as an adequate response. Nor is it necessarily the most fruitful way for an organization to incorporate lessons about failure into what it knows about itself, and into how it should deal with such vulnerabilities in the future.

 

13: A staggered approach to building your just culture

Dekker suggests a staggered approach, which allows you to match your organisation's ambitions to the profession's possibilities and constraints, the culture of your country, and its legal traditions and imperatives.

Step 1: Start at home in your own organization. Don't count on anybody else to do it for you! Make sure people know their rights and duties. See an incident as an opportunity to focus attention and learn collectively; do not see it as a failure or crisis. Start building a just culture from the beginning, during basic education and training/introduction: make people aware of the importance of reporting. Implement debriefing and incident/stress-management programs.

Step 2: Decide who draws the line in your organization, and how to integrate practitioner peer expertise into decisions about handling the aftermath. Empowering and involving the practitioner is the best route to improvement.

Step 3: Protect your organisation’s data from undue outside probing.

Step 4: Decide who draws the line in your country. It is important to integrate domain expertise into the national authority that will draw the line, since having a non-domain expert do this is fraught with risks and difficulties.

 

Unjust responses to failure are often the result of a bad relationship rather than bad performance. Restoring that relationship, or at least managing it wisely, is often the most important ingredient of a successful response. One way forward is simply to talk together. Building good relations can be seen as a major step toward a just culture.

 

Epilogue

If professionals consider one thing "unjust", it is often this: split-second operational decisions that get evaluated, turned over, examined, picked apart and analyzed for months - by people who were not there when the decision was taken, and whose daily work does not even involve such decisions.

 

Often a single individual is made to carry the moral and explanatory load of a system failure - charges against that individual serve as a protection of “larger interests”.