Dynamic Risk Analysis Using Alarm Databases to Improve Process Safety and Product Quality: Part I – Data Compaction

Ankur Pariyani, Warren D. Seider, Ulku Oktem, and Masoud Soroush

In most industrial processes, vast amounts of data are recorded by distributed control systems (DCSs) and emergency shutdown (ESD) systems. This two-part article presents a dynamic risk analysis methodology that uses alarm databases to improve process safety and product quality. The methodology consists of three steps: (i) tracking of abnormal events over an extended period of time, (ii) event-tree and set-theoretic formulations to compact the abnormal-event data, and (iii) Bayesian analysis to calculate the likelihood of the occurrence of incidents. Steps (i) and (ii) are presented in Part I and step (iii) in Part II. The event-tree and set-theoretic formulations allow compaction of massive numbers (millions) of abnormal events. For each abnormal event, which is associated with a process or quality variable, the event's path through the safety or quality systems designed to return that variable to its normal operating range is recorded. Event trees are prepared to record the successes and failures of each safety or quality system as it acts on each abnormal event. Over several months of operation, on the order of 10⁶ paths through event trees are stored. The new set-theoretic structure condenses the paths into a single compact data record, significantly improving the efficiency of the probabilistic calculations and permitting Bayesian analysis of large alarm databases in real time. As a case study, steps (i) and (ii) are applied to an industrial fluidized catalytic cracking unit.
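The compaction idea in step (ii) can be sketched very loosely as counting event-tree paths. The sketch below is an illustration only: the layer names are hypothetical, and the paper's actual set-theoretic formulation is not reproduced here. Each abnormal event traverses a sequence of protection layers, stopping at the first layer that succeeds in returning the variable to its normal range; millions of such paths then condense into one compact record of path counts.

```python
from collections import Counter

def trace_event(layer_outcomes):
    """Follow one abnormal event through the event tree.

    Each layer either succeeds (True, and the event ends) or fails
    (False, and the next layer engages). Returns the path as a tuple,
    e.g. (False, True) means layer 1 failed and layer 2 succeeded.
    """
    path = []
    for ok in layer_outcomes:
        path.append(ok)
        if ok:  # a success returns the variable to its normal range
            break
    return tuple(path)

def compact(events):
    """Condense many event-tree paths into a single record of path counts."""
    return Counter(trace_event(e) for e in events)

# Small synthetic sample standing in for months of DCS/ESD data.
events = [
    (True,),                # first protection layer succeeded
    (False, True),          # first failed, second succeeded
    (False, True),
    (False, False, False),  # all layers failed: an incident path
]
record = compact(events)
print(record)
```

Storing only the distinct paths with their counts, rather than every raw event, is what makes repeated Bayesian updates over large alarm databases tractable in real time.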

Keywords: dynamic risk analysis, alarm databases, chemical industry, Bayesian theory, fluid catalytic cracking unit