Saturday, 28 January 2012

Just Culture - Sidney Dekker (A Summary)

A chapter by chapter summary of:

Just Culture - Sidney Dekker
2007, Ashgate Publishing, ISBN 978-0-7546-7267-8

Summary mainly constructed from direct quotes…

Comments by BUCA (my comments in the text are in italics and marked with my initials):
1.     The author doesn’t define the term ’just culture’ at the beginning of the book. This either means that he uses it in the sense James Reason described in his book on organizational accidents and “hopes” the reader is familiar with that, or that he wants the reader to discover how he understands ‘just culture’ by reading the book…
2.     The author takes the phenomenon of criminalizing error (especially in the Anglo-American legal system) as the starting point for his discussion. With only a little thought, however, one can see how the conclusions and discussions are also relevant to other viewpoints, such as European legal systems and/or company-internal processes (compliance focus and scapegoating versus learning and improving).

Preface and prologue
Dekker poses some of the themes he will discuss later in the book:
·          There is a trend towards criminalization of human error.
·          There are no objective lines between ‘error’ and ‘crime’. These are constructed by people and are drawn differently every time. What matters is not so much where the line gets drawn as who draws it.
·          Hindsight is an important factor in determining culpability.
·          Multiple (overlapping) interpretations of an act are possible and often necessary to capture its complexity.
·          Some interpretations (notably criminalization), however, have significant negative consequences for safety that overshadow any possible positive effects.
·          When we see errors as crimes, accountability is backward-looking and means blaming and punishing. If we instead see an error as an organizational, operational, technical, educational or political issue, accountability becomes forward-looking and can be used for improvement.

1 Why bother with a just culture?
The consequences of admitting or reporting a mistake can be bad. Some professionals maintain a “code of silence” (omertà), and this is neither uncommon nor limited to a “few bad apples”. The problems are often structural trust and relationship issues between parties. Trust is critical: hard to build, but easy to break. There is little willingness to share information if there is a fear of being nailed for it.

Responding to failure is an ethical question. Does ‘just’ have to do with legal criteria, or is ‘just’ something that takes different perspectives, interests, duties and alternative consequences into account?

Once the legal system gets involved there is little chance of a “just” outcome or of improvement in safety. Rather than investing in safety, people or organizations invest in defensive posturing so they are better protected against prosecutorial attention. Rather than increasing the flow of safety-related information, legal action has a way of cutting off that flow. Safety reporting often gets a harsh blow when things go to court.

Case treatment in court tends to gloss over the most difficult professional judgments where only hindsight can tell right from wrong.

Judicial proceedings can rudely interfere with an organization’s priorities and policies. They can redirect resources into projects or protective measures that have little to do with the organization’s original mandate. Instead of improvements in the organization’s primary process, they can lead to “improvements” in all the stuff swirling around the primary process: bureaucracy, involvement of the legal department, book-keeping and micromanagement. These things often make work for the people at the sharp end more difficult, lower in quality, more cumbersome and less safe.

Accountability is a trust issue and fundamental to human relationships. Being able to and having to explain why one did what one did is a basis for a decent, open, functioning society. Just culture is about balancing safety (learning and improvement) and accountability. Just culture wants people to bring information on what must be improved to groups or people who can do something about it, and to spend effort and resources on improvements with a safety dividend rather than deflecting resources into legal protection and limiting liability. This is forward-looking accountability. Accountability must not only acknowledge the mistake and the harm resulting from it, but also lay out responsibilities and opportunities for making changes so that the probability of the same mistake or harm occurring in the future goes down.

Research shows that not having a just culture is bad for morale, commitment to the organization, job satisfaction and willingness to do that little extra outside one’s role. A just culture is necessary if you want to monitor the safety of an operation, want to have an idea of the capability of the people or organization, and want to effectively meet the problems that are coming your way. A just culture enables people to concentrate on doing a quality job and making better decisions rather than on limiting (personal) liability and making defensive decisions. A just culture promotes long-term investments in safety over short-term measures to limit legal or media exposure.

Wanting everything in the open, but not tolerating everything.

2: Between Culpable and Blameless
Events can have different interpretations: is a mistake just a mistake or a culpable act? Often there is no objective answer, only how you make up your mind about it.

Companies often impose difficult choices on their employees. On one side “never break rules, safety first”; on the other “don’t cost us time or money, meet your operational targets, don’t find reasons why you can’t”.

A single account cannot do justice to the complexity of events. A just culture:
·          accepts nobody’s account as ‘true’ or ‘right’,
·          is not about absolutes but about compromises,
·          pays attention to the “view from below”,
·          is not about achieving power goals,
·          says that disclosure matters, and that protecting those who disclose matters just as much,
·          needs proportionality and decency.

3: The importance, risk and protection of reporting
The point of reporting is to contribute to organizational learning in order to prevent recurrence through systematic changes that aim to redress some of the basic circumstances in which work went awry.

What to report is a matter of judgment; often only the outcome leads us to see an event as safety-relevant. Experience and “blunting” (an event becoming “normal”) affect reporting. The ethical obligation: if in doubt, report.

In a just culture people will not be blamed for their mistakes if they honestly report them. The organization can benefit much more from learning from the mistakes than from blaming the people who made them. Many people fail to report not because they are dishonest, but because they fear the consequences or have no faith that anything meaningful will be done with what they tell. One threat is that information falls into the wrong hands (e.g. a prosecutor, or the media - notably for government-related agencies due to freedom of information legislation!). For this reason some countries provide certain protections for safety data and exempt this information from use in court.

Getting people to report is difficult, and keeping the reporting up is equally challenging. It is about maximizing accessibility (low threshold, easy system, …) and minimizing anxiety (employees who report must feel safe). It is about building trust, involvement, participation and empowerment. Let the reporter be part of the process of shaping improvement and give feedback. It may help to have a relatively independent (safety) staff to report to instead of line reporting (although line reporting does have advantages in putting the problem where it should be treated and keeping the learning close to the process).

4: The importance, risk and protection of disclosure
Reporting is giving information to supervisors, regulators and other agencies. Main focus: learning and improvement.
Disclosure is about giving information to customers, clients, patients and families. Main focus: ethical obligation, trust, professionalism.
These two can often clash, as can various kinds of reporting (internal/external). If information leaves the company, there is often the danger that it will be used against the reporting employees (blaming, lawsuits, criminal prosecution).

Often people think that not providing an account of what happened (disclosure) means that one has something to hide, and suspicion then rises that a mistake was not an “honest” one. This again is a breach of trust that may lead to involvement of the legal apparatus. Truth will then be the first to suffer as the parties take defensive positions.

Many professions have a “hidden curriculum” where professionals learn the rhetoric to turn a mistake into something that is no longer a mistake, to make up stories to explain it away, or even to keep a code of silence.

Honesty should fulfill the goals of learning from a mistake to improve safety and achieving justice in the aftermath of an event. But “wringing honesty” out of people in vulnerable positions is neither just nor safe. This kind of pressure will not bring out the story that serves the dual goal of improvement and justice.

5: Are all mistakes equal?
Dekker discusses two kinds of ways to look at errors: technical and normative. The difference is made by people, the way they look at it, talk about it and respond to it.
Technical errors are errors in roles. The professional performs his task diligently but his present skills fall short of what the task requires. People can be very forgiving (even of serious lapses in technique) when they see these as a natural by-product of learning-by-doing. In complex and dynamic work where resource limitations and uncertainty reign, failure is going to be a lasting statistical reality. But technical errors should decrease (in frequency and seriousness) as experience goes up. A technical error is seen as an opportunity for learning; its benefit outweighs the disadvantages. Denial by the professional may lead people around him to see the error as a normative one.
Normative errors say something about the professional himself relative to the profession. A normative error means the professional is seen as not fulfilling his role diligently. Also: if the professional is not honest in his account of what happened, his mistake will be seen as a normative error.
Worse, however, is that people are more likely to see a mistake as more culpable when the outcome of a mistake is really bad. Hindsight plays a really big role in how a mistake is handled!

Note by BUCA: Dekker discusses normative errors rather briefly (compared to technical ones). I’m surprised that he doesn’t address the “compliance” issue under the moniker of normative errors. Looking at errors from a compliance-oriented point of view is also rather accusatory and little focused on improvement!

6: Hindsight and determining culpability
We assume that if an outcome is good, then the process leading up to it must have been good too - that people did a good job. The inverse is true too: we often conclude that people may not have done a good job when the outcome is bad.

If we know that an outcome is really bad then this influences how we see the behaviour leading up to it. We will be more likely to look for mistakes, or even negligence. We will be less inclined to see the behaviour as forgivable. The worse the outcome the more likely we are to see mistakes.

The same actions and assessments that represent a conscientious discharge of professional responsibility can, with knowledge of the outcome, come to be seen as a culpable, normative mistake. After the fact there are always opportunities to remind professionals what they could have done better. Hindsight means that we:
·          oversimplify causality because we can start from the outcome and reason backwards to presumed or possible causes,
·          overestimate the likelihood of the outcome because we already have the outcome in our hands,
·          overrate the role of rule/procedure violations. There is always a gap between written guidance and actual practice (which almost never leads to trouble), but that gap takes on causal significance once we have a bad outcome to look at and reason back from,
·          misjudge the prominence or relevance of data presented to the people at the time,
·          match outcome with the actions that went before it. If the outcome was bad, then the actions must have been bad too (missed opportunities, bad assessments, wrong decisions, etc).

Lord Hidden, Clapham Junction accident: “There is almost no human action or decision that cannot be made to look flawed in the misleading light of hindsight. It is essential that the critic should keep himself constantly aware of that fact”.

Rasmussen: if we find ourselves asking “how could they have been so negligent, so reckless, so irresponsible?” then this is not because the people were behaving so bizarrely. It is because we have chosen the wrong frame of reference for understanding their behaviour. If we really want to know whether people anticipated risks correctly, we have to see the world through their eyes, without knowledge of the outcome, without knowing exactly which piece of data would turn out to be critical afterward.

7: You have nothing to fear if you’ve done nothing wrong
A no-blame culture is neither feasible nor desirable. Most people desire some level of accountability when a mishap occurs. All proposals for a just culture emphasize the establishment of, and consensus around, some kind of line between legitimate and illegitimate behaviour. People then expect that cases of gross negligence will jump out by themselves, but such judgments are neither objective nor unarguable. All research on hindsight bias shows that it turns out to be very difficult for us not to take the outcome into account.

The legitimacy (or culpability) of an act is not inherent in the act. It merely depends on where we draw the line. What we see as a crime and how much retribution we believe it deserves is hardly a function of the behaviour. It is a function of our interpretation of that behaviour.

People in all kinds of operational worlds knowingly violate safe operating procedures all the time - even procedures that can be shown to have been available, workable and correct (the question of course is: says who?). Following all applicable procedures means not getting the job done in most cases. Hindsight is great for laying out exactly which procedures were relevant (and available and workable and correct) for a particular task, even if the person doing the task would be the last in the world to think so.

Psychological research shows that the criminal culpability of an act is likely to be constructed as a function of 3 things:
·          was the act freely chosen?
·          did the actor know what was going to happen?
·          the actor’s causal control (his or her unique impact on the outcome).
Here factors that establish personal control intensify blame attributions whereas constraints on personal control potentially mitigate blame.

Almost any act can be constructed into willful disregard or negligence, if only that construction comes in the right rhetoric, from the legitimated authority. Drawing the line does not solve this problem. It has to be considered (very carefully!) who gets to draw this line, and structural arrangements have to be made about this. The question is whether the people who get this task are indeed able to take an objective, unarguable, neutral view from which they can separate right from wrong.

Just culture thus should not give people the illusion that it is simply about drawing a line. Instead it should give people clarity about who draws the line and what rules, values, traditions, language and legitimacy this person uses.

8: Without prosecutors there would be no crime
We do not normally ask professionals themselves whether they believe that their behaviour “crossed the line”. Yet they were there; perhaps they know more about their own intentions than we can ever hope to gather. But…:
·          we suspect that they are too biased
·          we reckon that they may try to put themselves in a more positive light
·          we see their account as one-sided, distorted, skewed, partial - as a skirting of accountability rather than embracing it.

No view is ever neutral or objective. No view can be taken from nowhere. All views somehow have values, interests and stakes wrapped into them. Even a court’s rendering of an act is not a clear view from an objective stance - it is the negotiated outcome of a social process. So a court may claim that it has an objective view of a professional’s actions, but from the professional’s perspective (and that of his colleagues) that account is often incomplete, unfair, biased, partial.

To get to the “truth” you need multiple stories. Settling for only one version amounts to an injustice to the complexity of an adverse event. A just culture always takes multiple stories into account because:
·          telling it from one angle necessarily excludes aspects from other angles
·          no single account can ever claim that it, and it alone, depicts the world as it is,
·          if you want to explore opportunities for safety improvement, you want to discover the full complexity, and many stories are needed for this.

9: Are judicial proceedings bad for safety?
Paradoxically when the legal system gets involved, things get neither more just nor safer.

As long as there is fear that information provided in good faith can end up being used by the legal system, practitioners are not likely to engage in open reporting. It is a catch-22 for professionals: either report facts and risk being prosecuted for them, or do not report facts and risk being prosecuted for not reporting them (if they do end up coming out along a different route).

Practitioners in many industries the world over are anxious about inappropriate involvement of judicial authorities in safety investigations that, according to them, have nothing to do with unlawful acts, misbehaviour, gross negligence or violations. Many organisations and regulators are concerned that their safety efforts, such as encouraging incident reporting, are undermined. Normal structural processes of organizational learning are thus eviscerated, frustrated by the mere possibility of judicial proceedings against individual people. Judicial involvement (or the threat of it) can create a climate of fear and silence. In such a climate it can be difficult - if not impossible - to get access to information that may be critical to finding out what went wrong and what to do to prevent recurrence.

There is no evidence that the original purposes of a judicial system (such as prevention, retribution, or rehabilitation - not to mention getting a “true” account of what happened or actually serving “justice”) are furthered by criminalizing human error:
·          The idea that the charged or convicted practitioner will serve as an example to scare others into behaving more prudently is probably misguided: instead practitioners will become more careful not to disclose what they have done.
·          The rehabilitative purpose of justice is not applicable because there is little to rehabilitate in a practitioner who was basically just doing his job.
·          Moreover, correctional systems are not equipped to deal with rehabilitation of the kind of professional behaviour for which people were convicted.
·          The legal system often excludes the notion of an accident or human error, simply because there are typically no such legal concepts.
Not only is criminalization of human error by justice systems a questionable use of tax money (which could be spent in better ways to improve safety) - it can actually end up hurting the interests of the society that the justice system is supposed to serve. Instead: if you want people in a system to account for their mistakes in ways that help the system learn and improve, then charging and convicting a practitioner is unlikely to achieve that.

Practitioners like nurses and pilots endanger the lives of other people every day as part of their ordinary jobs. How something in those activities slides from normal to culpable is therefore a hugely difficult assessment, for which a judicial system often lacks the data and expertise. Many factors, all necessary and only jointly sufficient, are needed to push a basically safe system over the edge into breakdown. Single acts by single “culprits” are neither necessary nor sufficient. If you are held accountable by somebody who does not understand the first thing about what it means to be a professional in a particular setting, then you will likely see their calls for accountability as unfair, coarse and uninformed (and thus: unjust).

Summing up - judicial proceedings after an incident:
·          make people stop reporting incidents,
·          create a climate of fear,
·          often interfere with regulatory work,
·          stigmatize an incident as something shameful,
·          create stress and isolation that make practitioners perform less well in their jobs,
·          impede (safety) investigators’ access to information.

More or less the same applies to civil legal actions as to criminal ones.

10: Stakeholders in the legal pursuit of justice
In this chapter Dekker discusses the various stakeholders:
·          victims,
·          suspect/defendant,
·          prosecutor,
·          defense lawyer,
·          safety investigators,
·          lawmakers,
·          the employing organisation.

Practitioners on trial have reason to be defensive, adversarial and ultimately limited in their disclosure (“everything you say can and will…”).

Language in investigation reports should be oriented towards explaining why it made sense for people to do what they did, rather than judging them for what they allegedly did wrong before a bad outcome. An investigation board should not do the job of a prosecutor.

In countries with a Napoleonic law tradition a prosecutor has a “truth-finding role”. But combining a prosecutorial and (neutral) investigative role in this way can be difficult: a magistrate or prosecutor may be inclined to highlight certain facts over others.

In general it can be said that establishing facts is hard. The border between facts and interpretation/values often blurs. What a fact means in the world from which it came can easily get lost. Expert witnesses are a solution to this problem, but prosecutors and lawyers often ask them questions that lie outside their actual expertise.

Dekker argues that there is a difference between judges and scientists in how they derive judgments from facts. Scientists are required to leave a detailed trace that shows how their facts produced or supported particular conclusions. Dekker argues that scientific conclusions thus cannot be taken on faith (implying that a judge’s decisions can - I tend to agree in part, but want to state that both scientists’ and judges’ conclusions can contain a large measure of interpretation and thus “faith” - BUCA).

For employing organizations: The importance of programs for crisis intervention/peer support/stress management to help professionals with the aftermath of an incident cannot be overestimated.

Most professionals do not come to work to commit crimes (and this is a major difference from common criminal acts, which are nearly always intentional! - BUCA). Their actions make sense given their pressures and goals at the time. Professionals come to work to do a job, to do a good job. They do not have a motive to kill or cause damage. On the contrary: professionals’ work in the domains that this book talks about focuses on the creation of care, of quality, of safety.

11: Three questions for a just culture
Many organizations kind of settle on pragmatic solutions that allow them some balance in the wake of a difficult incident. These solutions boil down to 3 questions:
1. Who in the organization or society gets to draw the line between acceptable and unacceptable behaviour?
2. What and where should the role of domain expertise be in judging whether behaviour is acceptable or not?
3. How protected are safety data against judicial interference?

re 1 - The clearer and more widely agreed the arrangements that society, industry, profession or organization has made about who gets to draw the line, the more predictable the managerial or judicial consequences of an occurrence are likely to be. That is, practitioners will suffer less anxiety and uncertainty about what may happen in the wake of an occurrence, as arrangements have been agreed on and are in place.

re 2 - The greater the role of domain expertise in drawing the line, the less likely practitioners and organizations are to be exposed to unfair and inappropriate judicial proceedings. Domain experts can more easily form an understanding of the situation as it looked to the person at the time, as they probably know such situations from their own experience:
·          It is easier for domain experts to understand where somebody’s attention was directed. Even though the outcome of a sequence of events will reveal (in hindsight!) what data was really important, domain experts can make better judgments about the perhaps messy or noisy context of which these (now critical) data were part and understand why it was reasonable for the person in question to be focusing on other tasks and attention demands at the time.
·          It is easier for domain experts to understand the various goals that the person in question was pursuing at the time and whether the priorities set in case of goal conflicts may have been reasonable.
·          It is easier for domain experts to assess whether any unwritten rules or norms may have played a role in people’s behaviour. Without conforming to these tacit rules and norms, people often could not even get their work done. The reason, of course, is that written guidance and procedures are always incomplete as a model for practice in context. Practitioners need to bridge the gap between the written rule and the actual work-in-practice, which often involves a number of expert judgments. Outsiders often have no idea about the existence of these norms, and would perhaps not understand their importance or relevance for getting the work done.
That said, domain experts may have other biases that work against their ability to fairly judge the quality of another expert’s performance like psychological defense (“if I admit that my colleague made a mistake my position is more vulnerable too”).

re 3 - The better safety data are protected from judicial interference, the more likely it is that practitioners will feel free to report.

Dekker then goes on to discuss various solutions with an “increasing level of just culture”. Key elements in these: trust, existing cultures and the legal foundation for protection of safety data.

12: Not individuals or systems, but individuals in systems
The old view sees human error as a cause of incidents; to do something about incidents we then need to do something about the particular human involved. The new, or systems, view sees human error as a symptom, not a cause: human error is an effect of trouble deeper inside the system. Pellegrino says that looking at systems is not enough. We should improve systems to the best of our ability, but safety-critical work is ultimately channeled through relationships between human beings or through direct contact of some people with the risky technology. At this sharp end there is almost always a discretionary space into which no system improvement can completely reach (and which can thus only be filled by an individual, the human operating the technology). Rather than individuals versus systems, we should begin to understand the relationships and roles of individuals in systems. Systems cannot substitute for the responsibility borne by individuals within the space of ambiguity, uncertainty and moral choices. But systems can:
·          Be very clear where the discretionary space begins and ends.
·          Decide how it will motivate people to carry out their responsibilities conscientiously inside of that discretionary space. Is the source going to be fear or empowerment? In case of the former: remember that neither civil litigation nor criminal prosecution works as a deterrent against human error.

Rather than making people afraid, systems should make people participants in change and improvement. There is evidence that empowering people to affect their work conditions, to involve them in the outlines and content of that discretionary space, most actively promotes their willingness to shoulder their responsibilities inside of it. Holding people accountable and blaming people are two quite different things. Blaming people may in fact make them less accountable: they will tell fewer accounts.

Blame-free is not accountability-free. But we should create accountability not by blaming people, but by getting people actively involved in the creation of a better system to work in. Accountability should lay out the opportunities (and responsibilities!) for making changes so that the probability of harm happening again goes down. Getting rid of a few people who made mistakes (or had responsibility for them) may not be seen as an adequate response. Nor is it necessarily the most fruitful way for an organization to incorporate lessons about failure into what it knows about itself, and into how it should deal with such vulnerabilities in the future.

13: A staggered approach to building your just culture
Dekker suggests a staggered approach which allows you to match your organisation’s ambitions to the profession’s possibilities and constraints, the culture of your country and its legal traditions and imperatives.
Step 1: Start at home in your own organization. Don’t count on anybody else to do it for you! Make sure people know their rights and duties. See an incident as an opportunity to focus attention and learn collectively; do not see it as a failure or crisis. Start building a just culture from the beginning, during basic education and training/introduction: make people aware of the importance of reporting. Implement debriefing and incident/stress management programs.
Step 2: Decide who draws the line in your organization and how to integrate practitioner peer expertise in the decision on how to handle the aftermath. Empowering and involving the practitioner is the best way to improvement.
Step 3: Protect your organisation’s data from undue outside probing.
Step 4: Decide who draws the line in your country. It’s important to integrate domain expertise into the national authority that will draw the line, since having a non-domain expert do this is fraught with risks and difficulties.

Unjust responses to failure are often a result of a bad relationship rather than bad performance. Restoring that relationship, or at least managing it wisely, is often the most important ingredient of a successful response. One way forward is simply to talk together. Building good relations can be seen as a major step toward a just culture.

Epilogue:
If professionals consider one thing “unjust” it is often this: split second operational decisions that get evaluated, turned over, examined, picked apart and analyzed for months - by people who were not there when the decision was taken, and whose daily work does not even involve such decisions.

Often a single individual is made to carry the moral and explanatory load of a system failure - charges against that individual serve as a protection of “larger interests”.

Sunday, 22 January 2012

New safety theory launched!

Since the Swiss Cheese Model has fallen into discredit, I hereby propose the Italian Bread Model (IBM). Not to be confused with 'Big Blue'.


I've never been fond of cheese anyway.

Thursday, 19 January 2012

Webspace available for USEFUL information

Looking for something?

I'm not sure for what, of course, but it's probably not worth the effort. Spend your time more wisely - I've wasted enough, trust me!

An example and various solutions

An example
Take the following example which hopefully demonstrates why I believe that:
  • multi-causality is a valid and usable concept (this is especially true in implementing improvements to safety systems and behaviours),
  • this may include multiple direct causes,
  • that causality does not necessarily follow strictly linear paths, but often runs in parallel paths that meet up at a certain point,
  • that management is not necessarily the root of all evil, and
  • causation models and tests that fit tort and criminal law definitions don’t necessarily align with prevention goals.
The case
Someone in a university laboratory suffered partial loss of eyesight after getting boiling acid in one of his eyes. He was not wearing the mandatory safety goggles; instead he was wearing his own glasses, which he needs in order to see details at working distance. He was blending two chemical substances but did not do this in the prescribed order and volume, causing the chemicals to react more ‘enthusiastically’ than intended and part of the acid to reach boiling point and evaporate/explode, sending drops of acid into the immediate area - one or several of them hitting the insufficiently protected eye.

The victim was in a bit of a hurry. It was Friday afternoon and he wanted to get home as soon as possible. He just needed to get this job done and clean up his part of the lab, and it would be the weekend for him. So instead of following the proper procedure he was a bit careless and used greater volumes and the wrong order (he started with the ingredient he randomly picked up first). He didn’t wear safety goggles because he never does. He tried the standard issue as provided by the laboratory several times, but those fit very badly in combination with his ordinary prescription glasses, irritating his nose and ears immensely. This had been reported to the department head on several occasions, by several employees, without resulting in a better-fitting alternative. Suggestions to provide contact lenses that would fit with the standard safety goggles had also been turned down. In addition, the victim believed that his ordinary glasses would provide sufficient protection since they cover the area in front of his eyes. The supervisor was aware of the situation and had noticed that the victim consistently does not wear his safety goggles, but chose to ignore this fact.

The victim is experienced, properly instructed, etc. The department head explains the failure to provide suitable equipment by his need to reduce expenses to the bare minimum. He is on a tight budget (government cuts in budgets to universities and a lack of corporate sponsoring due to the financial crisis) and has planned to spend all the money he has on a laser-ion-mega-spectrometer which will improve his department’s ability to analyse the substances they work with. Besides, the university does provide standard safety equipment in accordance with CE standards, and that should be good enough. The supervisor informs us that he is fairly new in his job and used to be ‘one of the boys’ before he was promoted. He was the best qualified among the applicants and the only internal applicant meeting all the formal requirements. He finds it a bit embarrassing to start enforcing safety rules that he himself didn’t follow all that closely before he was promoted.

Some solutions
Many thanks to Jeff Harris and Alan Quilley for supplying the alternative solutions listed below! Great to see some alternative versions and views - this only contributes to learning; there is no perfect way anyway.

Starting with my own approach, I’ll draw the incident as a causal tree, a method I prefer because it illustrates both causal connections and (to a lesser degree) chronology. Contrary to some opinions, I would note this down with two separate direct causes.


Alternatively (and fully justifiably) one could choose to see the point of “loss of control” as the direct cause and then picture the accident and its direct cause as follows:


There are certainly many other and different ways one could write things down, many of them correct. All of the relevant information in the case (so far) is present, after all. One drawback of this particular notation, which combines relevant elements in one ‘box’ (in this case the accident and the missing goggles), however, is that it is going to be harder to explain in a logical way how the lack of goggles played a role, which it did in my opinion. But see Jeff Harris’s solution below as well!

This is for me one reason to argue that many accidents will not have linear causal relationships. There is no way in this example to put the wrong blending of chemicals and the missing goggles in some cause/effect or why/because relationship to each other, so they must exist in parallel and join up at a certain point to produce the final effect. And mind you, neither is sufficient in itself to produce the accident. Just leaving off the goggles would result in what we could call an unsafe condition, while the act of wrong blending, and thus boiling/exploding liquid, with goggles on would lead to what we tend to call a near miss (as far as the eyes are concerned - acid in your face is no great fun either).

We could stop here, by the way, if we followed Hart & Honoré’s rule that meeting a “deliberate act” (in my notation even two, parallel, acts) forms a “barrier” to further investigation. That, however, would leave open the questions of why the victim did not follow the correct procedure and why he did not wear his safety gear.

So, I choose to continue my analysis and start gathering more facts, hoping to find an answer. What we find is the following, and I’ll continue with my first notation in building a visual presentation of the case, adding the next layer of causes (for our convenience I add a background showing the various phases or dominoes). We see that there are three underlying causes (parallel and independent) of the not wearing of goggles: 1) a conscious decision by the victim not to use the ill-fitting safety equipment because he trusts he is sufficiently covered; 2) management not supplying suitable equipment; and 3) the supervisor choosing not to enforce the safety rule.

I am open to suggestions and arguments here if someone does not agree that some of these are actually causes, or thinks that some of the factors should be considered more important than others. The most crucial one in my eyes would be the failure to supply suitable equipment (especially from a prevention point of view), but I have a hard time excluding the other two.

We will take this investigation one step further still. With regard to the victim’s decisions/violations/errors no relevant causal information is found. He is experienced, properly instructed, etc. So I choose not to add more underlying causes here. But the underlying managerial causes can be taken one useful step further back.


One could obviously take the analysis even more steps back, but I think that the causal connection between “budgeting process” and “recruitment of supervisors”, or even “government cuts” and “financial crisis”, is going to be too hazy for this case. So I choose (slightly arbitrarily) to stop here with ‘root causes’ on both the personal/employee and management/organisational levels.
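For those who prefer something more tangible than the diagrams, here is a minimal sketch in Python of how the causal tree described above could be written down as a simple data structure. The node labels are my own condensed, hypothetical phrasings of the facts in the case; the point is only to show that each event maps to the causes feeding into it, so the two parallel direct causes join up at the accident node and the analysis bottoms out at the chosen root causes.

# Sketch of the causal tree: each event maps to the list of causes that feed into it.
causal_tree = {
    "partial loss of eyesight (acid in the eye)": [
        "wrong blending (order/volume) -> liquid boils and splashes",
        "no safety goggles worn",
    ],
    "wrong blending (order/volume) -> liquid boils and splashes": [
        "victim in a hurry (Friday afternoon), careless with the procedure",
    ],
    "no safety goggles worn": [
        "victim's decision: ill-fitting goggles, trusts his own glasses",
        "management did not supply suitable equipment",
        "supervisor chose not to enforce the rule",
    ],
    "management did not supply suitable equipment": [
        "department head's budget priorities (spectrometer over safety gear)",
    ],
    "supervisor chose not to enforce the rule": [
        "newly promoted from the group, embarrassed to enforce rules he used to ignore",
    ],
}

def print_tree(event, tree, indent=0):
    # Walk from the top event down to the root causes, indenting one level per layer.
    print("  " * indent + event)
    for cause in tree.get(event, []):
        print_tree(cause, tree, indent + 1)

print_tree("partial loss of eyesight (acid in the eye)", causal_tree)

Printed out, this gives the same layered picture as the drawing: two parallel branches under the accident, with the goggles branch splitting into three underlying causes and the managerial causes extending one layer deeper to the root causes.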

Jeff Harris’s version

I break the incident down into two direct causes: the cause of the incident and the cause of the injury. The direct cause of the incident was improper mixing of the chemicals. The direct cause of the injury was not wearing the "mandatory" goggles. (Both are unsafe acts.)

Then you delve into the root causes. Why did he mix the chemicals improperly - he was in a hurry and wanted to get home - a very common theme in incidents. He was properly trained - he just didn't follow the training. So why did he not wear the "mandatory" goggles? He said they hurt his face. The company would not buy him "special" goggles to make it more comfortable. Why not? Too expensive, they said. His supervisor never made him wear goggles. Why not? Apparently the supervisor did not always wear goggles and didn't feel like he could enforce that rule on other people (no lead by example). The managers over the supervisor either did not know or did not enforce the adherence to a safety requirement.

So what to do to avoid a repeat? Start by finding out how many other safety requirements are not being followed. Start enforcing mandatory safety requirements and if supervisors are not doing their job (enforcement) maybe you need supervisors who do. I personally feel it would have been a lot cheaper and easier for the company to have bought some special goggles for the employee, but that does not relieve him of the responsibility of wearing the goggles, even if uncomfortable. (If I had a dime for every time I was told a respirator was uncomfortable!) Changing the behavior of getting in a hurry is a much harder task. There you have to reach out and engage the "hearts and minds" to change the employees' attitudes about risk and what is acceptable. You won't always succeed. That is why the safety goggles are important: to minimize the injury when someone screws up.

By the way, where is the "unsafe condition" in this case? Oh yes, the dangerous chemicals. Well, if we just shut down the lab, fired everyone, and got rid of the chemicals, this incident would not have happened. :-)

Alan Quilley’s version