Risk and safety management of ammonium nitrate fertilizer
Zsuzsanna Gyenes analyses some of the major disasters in the ammonium nitrate fertilizer industry (including Oppau, 95 years ago, and Toulouse, 15 years ago) to emphasise the importance of remembering and learning from past mistakes. The paper aims to keep the memory of these disasters alive, on the assumption that case histories foster risk awareness and the implementation of safety measures. There have been several accidents and a few disasters in the ammonium nitrate fertilizer industry, and it is worthwhile to review them from time to time, beyond the changes to regulation and practice that they triggered.
Lessons learned from major accidents involving fertilizers
This paper presents the results of an analysis of 25 major accidents involving fertilizers recorded in the European Commission’s major accident reporting system (the so-called eMARS) and other publicly available sources, including road traffic accidents. Ammonium nitrate (AN) has been involved in numerous accidents causing explosions, fires and releases of toxic fumes, and it is known that even small-scale storage of ammonium nitrate fertilizers (defined as low as 10 tonnes in some legislation) may place the population at high risk if proper safety measures and procedures are not fully in place.
A review of the Toulouse accident trials: some lessons to learn despite uncertainty about the direct causes
A number of separate inquiries were carried out in the aftermath of the Toulouse disaster to identify the generic lessons to learn. All provided findings, recommendations and lessons relating to some of the direct and root causes of the disaster. Some of these findings resulted in changes to both the French law on industrial risk prevention and to European legislation, first in 2003 with a change of classification criteria for ammonium nitrate fertiliser and off-specification materials, and later, to some extent, in Seveso III in 2013. During the third trial in 2017 — more than sixteen years after the disaster — to determine criminal responsibility, the direct causes were still being challenged by the Total group lawyers and other experts. These issues became the major debating points during the three trials. As the prosecution’s main scenario remains contested and uncertainty about the direct causes persists, doubts could be raised about the relevance of the lessons-learned process that led to changes in French and European legislation.
The imperfections of accident analysis, Erik Hollnagel & Fiona Macleod, LPB270, December 2019
Most accident analyses assume that it is possible to reason backwards in time from effect(s) to cause(s) in order to implement specific remedial actions and learn the relevant lessons. The simplicity of this way of thinking has made it attractive, but it leads to a false sense of comfort; the real world of work is too complex for such simple methods to work. This paper presents four concepts that help us understand why simple solutions are insufficient, hence why accidents recur.
Management of Change – what does a ‘good’ system look like? Ken Patterson & Gillian Wigham, LPB267, June 2019
The process industries have a history littered with the consequences of not managing change carefully and safely. Despite these lessons and a great deal of good practice, globally we continue to struggle to get it right. As time passes, our memories of the human tragedies linked to Flixborough (a temporary change) and Hickson & Welch (a series of non-assessed changes) fade and fall out of our common experience. However, we still need to learn the lessons – properly – and ensure that they are embedded in our industries’ everyday practice. So, what is it that makes a good system for change management? At heart it is a combination of robust assessment, especially risk assessment, and appropriate management, ensuring that any change has the correct authorisation. That is a start, but it is not sufficient to answer the question “What does a good Management of Change (MoC) system really look like?” The rest of this paper attempts to answer that question, especially from the viewpoint of someone auditing an MoC system. Both authors are experienced process safety and occupational health & safety auditors, having worked on sites from the Americas to the Far East. We do not claim in any way that what follows is the only or the right answer, but it does draw on our experience both of systems that are working well and of those that were clearly failing.
Limitations and misuse of LOPA, Roger Casey, Cantwell Keogh & Associates, LPB265, February 2019
Layers of Protection Analysis (LOPA) is a simplified form of numerical risk assessment. It is an order-of-magnitude approach, so precise figures are not used. The technique has significant limitations compared to more advanced techniques such as Fault Tree Analysis and QRA. This paper highlights some of the mistakes seen in its application and challenges some of the practices occurring within LOPA calculations, in particular the use of conditional modifiers related to exposure times, which causes an underestimation of the risk. Keywords: Layer of Protection Analysis, LOPA, risk assessment
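The basic LOPA arithmetic, and the pitfall the abstract alludes to, can be sketched in a few lines. This is an illustrative example with made-up figures, not taken from the paper: the mitigated event frequency is the initiating-event frequency multiplied by the probabilities of failure on demand (PFDs) of the protection layers and by any conditional modifiers. The function name and the specific numbers are assumptions for illustration only.

```python
def lopa_frequency(initiating_freq, pfds, modifiers):
    """Order-of-magnitude mitigated event frequency (per year):
    initiating frequency x product of layer PFDs x conditional modifiers."""
    f = initiating_freq
    for pfd in pfds:
        f *= pfd          # each independent protection layer reduces frequency
    for m in modifiers:
        f *= m            # conditional modifiers (occupancy, ignition, ...)
    return f

# Base case (illustrative): initiating event 0.1/yr, two protection layers
# with PFDs of 0.1 and 0.01 gives a frequency of about 1e-4 per year.
base = lopa_frequency(0.1, pfds=[0.1, 0.01], modifiers=[])

# Adding an occupancy modifier of 0.1 (person present 10% of the time)
# reduces the calculated frequency by a further factor of ten, to ~1e-5/yr.
# But if exposure time is correlated with the demand (e.g. the operator is
# present *because* the hazardous operation is running), the modifier is not
# valid and the risk is underestimated by exactly that factor.
with_occupancy = lopa_frequency(0.1, pfds=[0.1, 0.01], modifiers=[0.1])

print(base, with_occupancy)
```

The point of the sketch is that a conditional modifier multiplies straight through the calculation, so an unjustified exposure-time factor silently shifts the result a full order of magnitude in the non-conservative direction.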
Investigation and bias – procedures, Andy Brazier, AB Risk Ltd, LPB264, December 2018
Incident investigations often conclude that one of the causes was either that people did not follow a “good” procedure or that procedures were not fit for purpose. These findings are often based on an inflated opinion of what procedures can achieve. The reality is that procedures sit very low on the hierarchy of risk control and will only ever make a fairly modest contribution to safety. Avoiding hindsight bias when considering the role of procedures in incidents allows more effective recommendations to be made, leading to a set of procedures that provide effective support to competent people.
Piper Alpha - what have we learned? Fiona Macleod and Stephen Richardson, LPB261, June 2018
The investigation into the Piper Alpha disaster has much to teach us thirty years on. Most of the physical evidence sank to the bottom of the North Sea, so the testimony of survivors and witnesses had to be woven together into a coherent story. The Cullen inquiry uncovered not only what probably happened on the terrible night of 6 July 1988, but also the complex path leading up to it: the early warnings and missed opportunities that might have prevented a tragedy in which 167 people lost their lives. The lessons to be learned are applicable far beyond the offshore oil industry, across all hazardous industries, and are every bit as relevant today.
Hamlet chicken processing plant fire, Tony Fishwick, LPB260, April 2018
A major fire occurred at a chicken processing plant in North Carolina, USA, resulting in the deaths of 25 workers. The main finding of the investigation was that the consequences of the fire resulted from serious shortcomings in the management of safety, including the fact that the plant had never received a safety inspection.
Buncefield: Lessons learned on emergency preparedness, Graham Atkinson, HSE, LPB254, April 2017
Early on Sunday 11 December 2005 a gasoline storage tank was being filled from a pipeline at a fuel terminal at Buncefield. Safety systems fitted to prevent the tank overfilling failed and gasoline began to spill from the vents on the tank roof. A low-lying cloud of heavy, flammable vapour accumulated and spread out for about 250 m in all directions around the tank. The cloud ignited and a powerful vapour cloud explosion devastated the fuel depot. The ensuing fire spread to other tanks and was not fully extinguished for several days. This paper considers the lessons learned about emergency preparedness at large flammable sites as a result of this incident, including the responsibilities of the operators of fuel depots, tanker terminals and similar sites.
Hazard identification - can power engineers learn from the process industries? Matt Clay, Health & Safety Laboratory, LPB253, February 2017
The power engineering sector faces challenges similar to those of the process industries at the time they first developed hazard identification techniques. This paper summarises a pilot study carried out by the Health & Safety Laboratory (HSL), working with an electricity Distribution Network Operator (DNO) and involving engineers and managers, to determine whether process industry techniques could be adapted successfully to power engineering applications. The study focussed on the application of HAZOP and bow-tie.
Chernobyl - 30 years on, Fiona Macleod, LPB251, October 2016
The 1986 Chernobyl accident has lessons that extend beyond the nuclear industry and the former Soviet Union. These lessons are directly applicable to today’s international chemical industry: artificially imposed deadlines lead to shortcuts; simplified targets in complex environments lead to perverse incentives and unintended consequences; real experts tell leaders things they don’t want to hear, and good leaders listen; you don’t get safety by rules and regulation – it starts with the design and evolves with experience; good design is iterative, taking time, expertise and feedback; things happen differently on night shift; whatever the designers intended, sooner or later the operator will do something unimaginable – often on night shift; sharing process safety information means sharing what went right (near misses) as well as what went wrong (accidents); sharing process safety stories widely, and acting on the lessons they teach us, is how we shore up our defences faster than changes can overwhelm us; and the need for management of change, and for a sense of chronic unease, stops only when the field is green again.