
Beyond "Sab Theek Hai": Digital Safety and Operational Excellence

In my three decades investigating industrial accidents, I've learned one hard lesson: catastrophes are rarely born in a single, thunderous moment of failure. They are conceived in the quiet whispers of a shift change. This phenomenon can be described as the "Butterfly Effect of Data Loss," where minor, unrecorded anomalies in one shift metastasize into major disasters in subsequent shifts. This case study will analyze the 2017 fire at the Shell Pernis refinery in the Netherlands, presenting it as a clear, real-world example of this principle. For students of industrial safety, this incident serves as a critical lesson on how seemingly small communication gaps can cascade into catastrophe when human habits override engineered safeguards.

--------------------------------------------------------------------------------

1. The Anatomy of a Casual Catastrophe: Understanding "Sab Theek Hai"

The phrase "Sab Theek Hai" (Hindi for "All is well") is more than a simple statement; it is a universal industrial dialect spoken in control rooms worldwide. Whether expressed as "All good" in the US or "She'll be right" in Australia, its function is the same: to provide a quick, reassuring summary of a plant's status. While socially convenient, this mindset is structurally dangerous in a high-hazard environment.

The core danger of this mindset is that it shifts the incoming operator from a state of "alert investigation" to "passive acceptance." This creates a dangerous disconnect between the plant's Physical Reality (the actual pressures, temperatures, and equipment states) and the operators' Perceived Reality (the belief that everything is stable).

This casual assurance is the bedrock of a phenomenon known as the Normalization of Deviance. This is a gradual process where unsafe practices or deviations from technical standards become the accepted norm simply because they have not yet resulted in a catastrophe. The practical mechanism that fuels this normalization is often a "Chalta Hai" ("It will do") attitude. For example, an operator might use a "temporary" clamp on a minor leak instead of following a lengthy repair procedure because it's easier and keeps production running. When the clamp holds, the deviation is smoothed over by a "Sab Theek Hai" handover, and the unsafe practice begins its journey to becoming the new standard.

This culture is driven by powerful psychological factors that encourage operators to prefer comfortable narratives over uncomfortable data. The two most important drivers are:

  • Fear of Blame: In organizational cultures that punish bad news, operators are strongly incentivized to hide minor issues. Reporting a problem can invite interrogation and blame, so presenting a "clean" handover by saying "All is well" is a form of self-protection.
  • Conflict Avoidance: A shift change is a moment of social transition. Raising complex technical issues can delay the handover, require lengthy explanations, and be seen as creating work for the next team. Simply saying "Sab Theek Hai" is often the path of least resistance, preserving social harmony at the expense of technical accuracy.

This dangerous culture of verbal reassurance and normalized deviation set the stage for the specific sequence of failures that occurred at the Shell Pernis refinery.

--------------------------------------------------------------------------------

2. The Incident: A Step-by-Step Breakdown of the 2017 Pernis Fire

In August 2017, a major fire erupted at the Shell Pernis refinery in Rotterdam. The immediate cause was the rupture of a furnace tube. The process flow through the tube had stopped, but the furnace's burners continued to fire. This caused the stagnant hydrocarbons inside the tube to overheat, pressurize, and ultimately burst, releasing over 100 metric tons of flammable liquid that quickly ignited.

The chain of events leading to this disaster reveals a classic failure of both technical and human systems:

  1. The Designed Safeguard: The furnace was equipped with a critical safety interlock known as a "low-flow trip." Its specific purpose was to prevent the exact scenario that occurred by automatically cutting off the burners if the hydrocarbon flow stopped. This was the engineered barrier designed to prevent overheating (a minimal logic sketch follows this list).
  2. The Initial Deviation: At some point prior to the incident, operators bypassed this critical safety interlock. This action was likely taken to facilitate a difficult startup or to manage "nuisance" trips that were interrupting production. This decision to operate without a key safety feature was the first step in normalizing an unsafe condition.
  3. The Communication Breakdown: This was the heart of the failure. The bypassed state of the safety interlock was never formally logged or rigorously communicated across shifts. The critical data point (that the plant's primary safety net had been removed) became "dark data," existing only in the memory of the outgoing shift and evaporating at the moment of handover.
  4. The Inevitable Outcome: The subsequent shift operated the furnace under the false assumption that they were protected by the automatic low-flow trip. When the process flow stopped, they expected the safety system to engage and shut down the burners. Because the interlock was bypassed, it couldn't. The physical result was predictable: stagnant hydrocarbons in the tubes overheated, the tube ruptured, and a massive fire ensued. From an investigator's standpoint, this wasn't an accident; it was a scheduled event. The only unknown was which shift would be on duty when the bill came due.
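For readers who have not worked with safety instrumented systems, the short Python sketch below shows the kind of logic a low-flow trip implements and how a bypass flag silently defeats it. It is a teaching illustration only: the tag names, units, and the 0.5 kg/s threshold are assumptions for this article, not values from the Pernis furnace or Shell's control system.

```python
# Minimal sketch of low-flow trip logic. All names, units, and thresholds are
# illustrative assumptions, not values from the actual Pernis furnace.

from dataclasses import dataclass

LOW_FLOW_TRIP_SETPOINT_KG_S = 0.5  # hypothetical trip threshold


@dataclass
class FurnaceState:
    pass_flow_kg_s: float                  # measured hydrocarbon flow through the tube
    burners_firing: bool                   # True while fuel is admitted to the burners
    low_flow_trip_bypassed: bool = False   # startup/maintenance override


def evaluate_low_flow_trip(state: FurnaceState) -> FurnaceState:
    """Cut the burners when flow stops, unless the interlock is bypassed."""
    if state.low_flow_trip_bypassed:
        return state  # barrier silently out of service: the Pernis condition
    if state.pass_flow_kg_s < LOW_FLOW_TRIP_SETPOINT_KG_S:
        state.burners_firing = False  # trip: stops stagnant tubes from overheating
    return state


# Flow has stopped, but the trip has been bypassed: the burners keep firing.
stagnant = FurnaceState(pass_flow_kg_s=0.0, burners_firing=True,
                        low_flow_trip_bypassed=True)
print(evaluate_low_flow_trip(stagnant).burners_firing)  # True
```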

This step-by-step breakdown shows how the disaster was not an unforeseeable accident, but the logical conclusion of a series of human errors rooted in poor communication.

--------------------------------------------------------------------------------

3. Analysis: How a Verbal Handover Disabled a Safety System

The Shell Pernis fire was not just a technical failure of a furnace; it was the predictable outcome of a communication and cultural failure. The casual acceptance of a bypassed safety system, combined with an informal handover, created the exact conditions necessary for the disaster.

The list below connects the theoretical concepts of a "Sab Theek Hai" culture to the practical events of the incident.

  • Normalization of Deviance: The "temporary" bypass of the safety interlock became the new, accepted baseline for operations, as it hadn't caused an immediate accident on previous shifts.
  • The "Sab Theek Hai" Handover: The verbal assurance of "All is well" masked the critical reality that a primary safety barrier was disabled, leaving the incoming shift blind to the true risk.
  • Erosion of Barrier Integrity: In terms of the "Swiss Cheese Model," the bypassed interlock was the hole in the technical barrier, and the undocumented handover blinded the human barrier, creating a second, aligned hole that allowed the hazard to become a catastrophe (a numeric illustration follows this list).
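To see numerically why an aligned hole matters, here is a deliberately simplified calculation. The probabilities are invented and the barriers are assumed to fail independently; this is a teaching sketch of the model, not a risk assessment of the actual plant.

```python
def p_accident(p_technical_fails: float, p_human_fails: float) -> float:
    """Chance a hazard passes both barriers, assuming they fail independently."""
    return p_technical_fails * p_human_fails

# Both barriers in place (invented, illustrative numbers).
print(p_accident(0.01, 0.01))   # roughly 1 in 10,000

# Interlock bypassed (technical barrier gone) and handover undocumented
# (human barrier blind): the holes are aligned and the hazard passes freely.
print(p_accident(1.0, 1.0))     # 1.0
```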

This analysis reveals that the true failure at Pernis was not mechanical, but informational, highlighting a crucial lesson for anyone studying industrial operations.

--------------------------------------------------------------------------------

4. Conclusion: The Core Lesson of the Pernis Fire

The primary lesson from the Shell Pernis fire is stark: in high-hazard operations, the absence of robust, verified data is itself a catastrophic risk. A safety system that is disabled but not documented is functionally identical to having no safety system at all.
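One way to make that concrete is to treat bypass status as structured data that the handover itself must carry. The sketch below is illustrative only; every class, field, and rule in it is an assumption made for this article, not a description of Shell's or any vendor's system.

```python
# Minimal sketch of a handover record that refuses to let a bypass become
# "dark data". All classes, fields, and rules are illustrative assumptions.

from dataclasses import dataclass, field
from datetime import datetime
from typing import List


@dataclass
class BypassEntry:
    interlock_tag: str    # which barrier is defeated, e.g. a furnace low-flow trip
    reason: str           # why it was defeated
    authorized_by: str
    expires: datetime     # an open-ended bypass is normalized deviance in waiting


@dataclass
class ShiftHandover:
    outgoing_operator: str
    incoming_operator: str
    active_bypasses: List[BypassEntry] = field(default_factory=list)
    acknowledged: bool = False

    def acknowledge(self) -> None:
        """The incoming shift must explicitly accept every defeated barrier."""
        for bypass in self.active_bypasses:
            if bypass.expires < datetime.now():
                raise ValueError(
                    f"Bypass on {bypass.interlock_tag} has expired; restore the "
                    "interlock or re-authorize it before completing the handover."
                )
        self.acknowledged = True
```

The specific fields matter less than the design choice: the record, not the outgoing operator's memory, carries the state of the barriers, and the handover cannot be completed until the incoming shift has explicitly acknowledged every active bypass.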

This incident perfectly illustrates the central theme of the "Butterfly Effect of Data Loss." The disaster was not caused by a single, massive error on the day of the fire, but by the loss of a small but critical piece of data—the status of a single bypass—during a routine shift change sometime earlier. This lost fact cascaded through subsequent shifts until it met the exact physical conditions it was meant to prevent.

Ultimately, safety and operational excellence depend on a culture that values data over intuition and verification over reassurance. We must build systems that demand clarity and reject ambiguity. The contrast is clear: true reliability comes not from the operator who says, "It's all good," but from the one who reports, "The pressure is deviating by 5%, and here is the trend." The report gives us a problem we can solve; the reassurance gives us a silence that can destroy us.
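To close with the same contrast in code: the setpoint, units, and readings below are invented, but the message the snippet produces is the kind of specific, trend-backed handover statement this case study argues for.

```python
# Small sketch of a "deviation plus trend" report. Setpoint, units, and
# readings are invented for illustration.

SETPOINT_BARG = 12.0
readings_barg = [12.0, 12.2, 12.4, 12.6]   # last four samples, oldest first

latest = readings_barg[-1]
deviation_pct = (latest - SETPOINT_BARG) / SETPOINT_BARG * 100
trend = (readings_barg[-1] - readings_barg[0]) / (len(readings_barg) - 1)

print(f"Pressure deviating by {deviation_pct:+.1f}% from setpoint; "
      f"rising {trend:+.2f} barg per sample: {readings_barg}")
# -> Pressure deviating by +5.0% from setpoint; rising +0.20 barg per sample:
#    [12.0, 12.2, 12.4, 12.6]
```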
