Open access peer-reviewed chapter

From Nobel Prizes to Safety Risk Management: How to Identify Latent Failure Conditions in Risk Management Practices

Written By

Sanjeev Kumar Appicharla

Submitted: 13 June 2021 Reviewed: 17 June 2021 Published: 19 April 2022

DOI: 10.5772/intechopen.98960

From the Edited Volume

Railway Transport Planning and Management

Edited by Stefano de Luca, Roberta Di Pace and Chiara Fiori


Abstract

The aim of this chapter is to introduce readers to the cognitive biases found in the railway transport planning and management domain. As noted in the economics, business management and risk management literature, cognitive biases in the planning of railway projects lead to cost overruns and to failures to achieve performance and safety objectives. Unbiased decision making is a core goal of systems engineering, encouraging careful consideration of stakeholder needs, design alternatives, and programmatic constraints and risks. However, systems engineering practices in the railway transport planning and management field pay little attention to human factors and organisational factors at the initial stages of planning, with the result that, for example, the driveability of European Railway Traffic Management System (ERTMS) trains emerges as a concern in real-time operations. There is therefore a case for studying cognitive biases in this domain. The System for Investigation of Railways (SIRI) Cybernetic Risk Model (2006, 2017) is a systems engineering response to an internal research brief by RSSB, a GB railways safety body. The SIRI Cybernetic Risk Model (2017), incorporating the "Heuristics and Biases" approach, was published by the UK Transport Select Committee as written evidence in 2016 on the occasion of its inquiry into railway safety. The validity of the SIRI Risk Model (Swiss Cheese Model) is further illustrated through a 2019 historical survey of railway accidents and through two recent RAIB investigations, of a track worker fatal accident and of a signalling-related near miss, presented in the form of the Swiss Cheese Model. The data and information in the RAIB Reports (17/2019) and (11/2020) are supplemented by further research and the author's own past accident analyses. The results of the study show that the Guide to Railway Investment Process (GRIP) (2019) (now withdrawn by Network Rail) has no provision for incorporating measures to address the deficiencies raised by accident reports or safety analysis reports, because the RSSB (2014) Taking Safe Decisions framework does not include all the heuristics, and the biases they lead to, in the information used for taking decisions. Thus, the duty holder investment process fails to meet the mandatory regulatory requirements of the Common Safety Method for Risk Assessment (CSM-RA) process. The results of the case studies in this chapter remain valid despite the changes proposed in the Shapps-Williams reform plan (2021), as safety-related matters are not yet addressed by that plan. The author hopes that, when the lessons learnt from the case studies are embedded in railway organisations, we may see improvements in railway planning and management practices, by considering risk factors at the conceptual stage of projects and meeting the requirements of ISO Standard 27500 (2016) for the human-centred organisation. National Investigation Bodies (NIBs) may also benefit.

Keywords

  • As low as reasonably practicable (ALARP) decision making
  • AI Internal Audit
  • Bounded rationality
  • Bow Tie Modelling and Assurance Management
  • Heuristics and Biases
  • Less than adequate human and organisational factors analysis in risk assessments
  • Risk Management

1. Introduction

Thoughts without content are empty,

Intuitions without concepts are blind.

-Immanuel Kant (1724–1804) cited by [1]

The Journal of Rail Transport Planning & Management aims to stimulate the quality of service for railway passengers and freight customers by improving knowledge on the effectiveness and efficiency of capacity management, timetabling, management, and safety of railway operations [2]. It is a matter of regret that the journal does not yet have any research paper dedicated to the role of cognitive biases in transport planning and management, despite the fact that these have been under investigation for the past decade and a half. The role of cognitive biases in transport planning, leading to cost overruns due to forecasting errors, was investigated by Prof Bent Flyvbjerg, and the method of reference class forecasting was advanced to mitigate the optimism bias [3]. This method was based upon the 2002 Nobel prize-winning economic theory. Prof Bert De Reyck et al., after their study of "optimism bias" in the projects of Network Rail, the railway infrastructure manager, recommended that the correction of optimism bias be extended to all stages of the Guide to Railway Investment Process (GRIP) [4]. Despite the consideration of optimism bias in Network Rail projects, and of some of the biases (such as hindsight, outcome or loss aversion) in the RSSB (2014) Taking Safe Decisions framework, the similarity and other heuristics are not described therein. Further, the normative definition of a "rational" decision is not given either [4, 5].

Apart from mitigating the effects of optimism bias through an uplift of the cost budget, the infrastructure project planning domain needs to pay attention to human and organisational aspects, so as to consider the latent failures in risk management practices as well. For example, the failure of Crossrail to open on time and meet its safety objectives, despite the fact that an uplift of the cost budget was provided, is a sign of planning failure [6, 7, 8, 9].

Apart from these concerns, ERTMS train driveability arising from changes in the speed profile, as noted in the case of the Swedish railways, and the planning and delivery of safety critical projects and the scheduling of track maintenance tasks during the planning and operational stages, are concerns for the management and safety of railway operations [10].

This chapter is intended to draw readers' attention to these aspects through very brief case studies of a safety incident and a fatal accident. In this section, we examine the role of cognitive biases through a study of the literature in the economics and cognitive science domains, to support the SIRI Model described in Section 2 of this chapter. In doing so, we present two experiments to provide an intuitive basis for these concepts.

Insensitivity to prior probability of outcomes, insensitivity to sample size, misconceptions of chance, insensitivity to predictability, the illusion of validity, and misconceptions of regression are some of the biases due to the representativeness heuristic, one of the three heuristics that are employed in making judgements under uncertainty [11].

Within the cognitive systems engineering discipline (see [12]), the systems reliability discipline (see [13]), and organisational research disciplines (see [14]), it is accepted that decision makers are prone to use mental short cuts in complex problem-solving tasks. Technically speaking, these short cuts are known as heuristics ([15], p. 37; [16], p. 66; [17], pp. 316–324). There are five Nobel prize-winning schools of thought in relation to rationality. The first is absolute rationality, espoused primarily by economists, risk assessors and engineers such as Nobel laureate Prof G.S. Becker, who argued that decision takers act rationally to optimise specific goals such as wealth ([17], p. 315; [18]). The second is the theory of bounded or "limited" rationality, which lays emphasis on the limits on the capacity of the human mind to take rational decisions in the real world: rationality is limited by neurological conditions, memory, attention, lack of training in statistics and probability, lack of education, and so on [15, 16, 17]. Prof H.A. Simon argued against absolute rationality, observing that limitations upon human information processing give rise to a tendency known as "satisficing" behaviour: settling for a satisfactory alternative rather than exploring all alternatives to take an optimal course of action [15, 16, 17]. The third is the "Heuristics and Biases" approach of the behavioural school. Nobel laureate Prof Daniel Kahneman and Prof A. Tversky argued that, when making judgements concerning the likelihoods of uncertain events, "people rely on a limited number of heuristic principles, which reduce complex tasks of assessing probabilities and predicting values to simpler judgemental operations. In general, these heuristics are quite useful but sometimes lead to severe and systematic errors" ([11]; [15], pp. 36-52). Apart from these American schools, there is a fourth type of rationality, from the German school, named "ecological rationality". Erwin Dekker and Blaž Remic argue that there are two types of ecological rationality, advanced by Gerd Gigerenzer and by the 2002 Nobel laureate Prof Vernon Smith [19]; the details of ecological rationality may be pursued in their article [19]. The fifth type of rationality, cited by Prof Daniel Kahneman, is that based upon the emotions of regret and disappointment. As per Prof Daniel Kahneman, these models have had less influence than the prospect theory of risk ([11], p. 288).

The heuristics, as per the classical "H&B" approach, are discussed below:

1.1 Representativeness (similarity) heuristic

First is representativeness, which is usually employed when people are asked to judge the probability that an object or event A belongs to class or process B.

For example, in the GB railways domain, I. Muttram, CEO of Railtrack plc, London, stated that "the dreadful accidents at Southall, Ladbroke Grove and Hatfield do not lie outside the bounds that are predicted by long term statistical analysis" [20]. The same sentiment was echoed in another paper, by John Corrie, as well [21].

However, both papers from domain experts failed to note the role of risk in management systems [15, 20, 22]. In 1990, the transfer of rail safety regulation to the HSE, as a political fallout of the King's Cross Underground fire, invited opposition from the industry to being classified as a high hazard industry; as a result of this opposition (a safety director in HMRI opposed to the UK HSE approach to safety regulation resigning), responsibility was transferred to the DfT Office of Rail Regulation in April 2006 [23, 24]. The HMRI Yellow Book process forms the operational reliability process (Process A), while a systems management process based upon the ISO/IEC 15288 (2002) standard, with human and organisational factors included, constitutes Process B, as noted in the systems engineering standard [25, 26, 27].

Prof J. Barnett and Andrew Weyman drew attention to this heuristic in relation to the 'string' of large-scale UK railway accidents that happened in the late 1990s and early 2000s (Southall in 1997; Paddington in 1999; Hatfield in 2000 and Potters Bar in 2002). The series of accidents led to speculation about a trend associated with railway privatisation. Statistically, Prof J. Barnett and Andrew Weyman argued that the small number of cases over a short time frame makes it impossible to draw firm conclusions; but the belief in the "law of small numbers" leads to the conclusion of a statistical trend post-privatisation. This belief is not limited to lay people but was seen in the case of scientists and engineers as well. Prof J. Barnett and Andrew Weyman noted, "The structural and regulatory impacts, from the Public Inquiries into the 'string' of large-scale UK railway accidents led to far-reaching and enduring effects" [28]. Industry observers have noticed that the industry lacks the will to move away from a blame approach (see [29, 30]), and industry participants (2021) fail to recognise the fact that the Yellow Book does not meet the requirements of the mandatory risk assessment process and was hence withdrawn [23, 29, 30, 31].

Let us study the following standard problem of Bayesian inference to understand the concept of base rate neglect (base rate frequency):

"A taxicab was involved in a hit and run accident during the night. Two cab companies, the Black and the Green, operate in the city. 85% of the cabs in the city are Black and 15% are Green. An eye-witness account stated that the taxi-cab was Green. The court asked for the reliability of the witness under the same circumstances that existed on the night of the accident to be checked. The test results showed that the witness identified each one of the two colours 80% of the time correctly and failed 20% of the time. What is the probability that the cab involved in the accident was Green rather than Black knowing that this witness identified it as Green?" [11].

The answer, when the two items of information, the base rate (15%) and the unreliable testimony (80%) of the witness, are combined with the help of Bayes' rule, is that there is a 59% chance that the cab was Black, i.e. only a 41% chance that it was Green (see TEDx talks @17.24) ([32], p. 166; [11]). The common answer to the standard taxi-cab problem discussed here, as noted by Prof Daniel Kahneman, is that there is an 80% chance that the cab was Green [11].

How to combine the specific witness evidence (the cab was identified as Green) with the statistical base rate information (15% of the cabs are Green) through Bayesian inference is not commonly known; a worked calculation is sketched below. Further, Prof Daniel Kahneman concluded that when the statistical base rate is neglected in such examples, a causal base rate will be used to feed a stereotype [11]. The author notes that the colour of the cab involved in the hit and run case is not an indicator of a causal factor. This message was reinforced by Prof Daniel Kahneman in a personal communication to the author.
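For readers who wish to check the arithmetic, the following short Python sketch applies Bayes' rule to the figures quoted above; the base rate and witness reliability are those given in the problem statement, while the script itself is only illustrative and not part of the cited sources.

```python
# Bayesian update for the taxi-cab problem (a minimal sketch of the
# calculation described above; figures are those quoted in the text).
prior_green = 0.15          # statistical base rate: 15% of cabs are Green
prior_black = 0.85          # 85% of cabs are Black
witness_accuracy = 0.80     # witness identifies either colour correctly 80% of the time

# Total probability that the witness says "Green"
p_says_green = (witness_accuracy * prior_green            # correct identification of a Green cab
                + (1 - witness_accuracy) * prior_black)   # misidentification of a Black cab

posterior_green = witness_accuracy * prior_green / p_says_green
print(f"P(cab was Green | witness says Green) = {posterior_green:.2f}")      # ~0.41
print(f"P(cab was Black | witness says Green) = {1 - posterior_green:.2f}")  # ~0.59
```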

1.2 Availability heuristic

The ease with which outcomes can be brought to mind (recalled and visualised) increases their subjective salience and the perceived likelihood (probability) of their occurrence [28]. Infrequent, high-impact events can often be easily brought to mind, leading people to overestimate the likelihood of such events. Biases due to the retrievability of instances, biases due to the effectiveness of a search set, biases of imaginability, and illusory correlation are due to the availability heuristic. The definition and details of the availability heuristic and its two aspects can be learnt from the accessible Nobel prize document (2002) [33]. Biases of imaginability and biases due to the retrievability of instances are relevant to the domain of risk management, as we recall our fond memories with ease and are averse to entertaining the pitfalls in our moral behaviour.

1.2.1 “Out of sight out of mind bias”

This bias arises from the availability heuristic, in the form of omission of commonly known factors (see Kahneman et al., 1982, cited in [11]), and gains its validity from the risk analysis experiment (fault tree analysis of a car starting scenario) of Fischhoff et al. (1978), cited in ([15], p. 89). In the context of GB railways safety risk management, the omission of human and organisational factors from RSSB risk assessments is noted in the author's review of the railway risk assessment data published by the Office of Rail and Road Regulation regarding the Crossrail risk assessment [6].

The same omission can be found in the doctoral thesis of Dr. Bruce Elliott, who has practised systems engineering in the domain for over two decades and was one of the authors of the Yellow Book, but who does not even mention human factors and their integration into the systems engineering process. The Yellow Book authors do not even consider the role of human information processing and heuristics in their engineering safety management strategy [25, 34].

Despite this glaring evidence, The Safety and Reliability Society (SaRS), the UK professional body for safety, reliability and risk management practitioners, does not blink an eyelid when it reflects over the decade of experience in applying the Common Safety Method for Risk Assessment [31].

1.2.2 Definitions of active and latent errors

The following are the definitions of hidden (latent) and active errors given by Rasmussen and Pedersen (1984), cited in ([15], pp. 173–217) and ([30], pp. 26–43).

Definition: Human errors whose effects are felt almost immediately are called active errors [15]. For example, a train driver may fail to brake short of the stopping distance provided at a stop signal, or a pilot may fail to recognise a loss-of-control-in-flight situation [13, 15, 30, 35].

Definition: Latent errors are errors whose adverse consequences may lie dormant within the system for a long time, only becoming evident when they combine with other factors to breach the system (production) defences [15]. For example, the failure to provide an engineering safeguard, as in the case of the Herefordshire level crossing accident [13, 15, 36].

Sanjeev Appicharla (2015b) identified approximately eighteen biases (not an exhaustive list) from the literature and stated them in the 2015 publication [37]. However, it is to be noted that the sources of these biases were not traced to the heuristics in the 2015 publication, and that they were made a part of the MORT Assumed Risk branch [37]. This chapter therefore corrects that error by mapping them to Rasmussen's "step-ladder" model of decision making [6].

1.3 Adjustment from an anchor heuristic

In the classical paradigm, the third heuristic is adjustment from an anchor: experimental subjects and people in real-world situations are often unduly influenced by outside suggestion. People can be influenced even against their intentions, and even when they know that the suggestion is made by someone who is not an expert [38]. In experimental and real-world situations, subjects and decision takers make estimates by starting from an initial value that is adjusted to yield the final answer [11].

Subjects were asked to estimate the percentage of African nations in the United Nations after being given a starting point by the spinning of a wheel of fortune in their presence. Prof Daniel Kahneman and Prof Amos Tversky report that the median estimates were 25 and 45 for the groups that received starting points of 10 and 65, respectively. Further, they state that payoffs for improving accuracy did not reduce the anchoring effect. In further experiments involving intuitively calculating, within five seconds, the result of 8! presented as a descending sequence (8x7x6x5x4x3x2x1) and as an ascending sequence (1x2x3x4x5x6x7x8), they found, as predicted, higher estimates for the descending sequence than for the ascending sequence. The actual answer is 40,320 in both cases [11]. Insufficient adjustment, biases in the evaluation of conjunctive and disjunctive events, and anchoring in the assessment of subjective probability distributions are examples of biases due to this heuristic [11].
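The arithmetic behind the factorial task can be made explicit with a short sketch; the choice of four terms as the "anchor" is an assumption for illustration, not a figure from the cited experiment.

```python
# The factorial estimation task described above, as arithmetic: people anchor
# on the first few partial products and adjust insufficiently (illustrative sketch).
import math

ascending = [1, 2, 3, 4, 5, 6, 7, 8]
descending = list(reversed(ascending))

def partial_product(seq, k=4):
    """Product of the first k terms: a stand-in for the early 'anchor'."""
    prod = 1
    for x in seq[:k]:
        prod *= x
    return prod

print("True value of 8! =", math.factorial(8))                 # 40,320 in both cases
print("Anchor after 4 terms, ascending :", partial_product(ascending))   # 24
print("Anchor after 4 terms, descending:", partial_product(descending))  # 1,680
# The descending sequence yields a much larger early anchor, which is why
# estimates for it tend to be higher, even though both products are equal.
```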

To give a flavour of some of these biases in a perceptual and cognitive context, let us look at the following two experiments. The first one is about a perceptual bias.

1.3.1 First experiment: perceptual bias – "what you see is all there is"

Cognitive bias: WYSIATI is the acronym for "What you see is all there is", a cognitive bias as per the 2002 Nobel laureate, Prof Daniel Kahneman ([11], p. 417). Do you see an old woman or a young woman? Or both? Please note your intuitive response!

The author found that there was no consensus among system safety experts viewing the picture below at the 2011 UK IET International Safety Conference [40] (Figure 1). Further, only 3 out of 19 engineers with over 30 years of experience saw both the old woman and the young girl in an experiment conducted in a webinar for the author's college alumni [12].

Figure 1.

The picture shows an old crone or a 19th century young girl, depending upon the perspective of the person looking at it. The picture is sourced from Prof Charles Handy [39], cited in [40].

1.3.2 Second experiment: investment decision

Prof Daniel Kahneman describes the following experiment. Imagine that you face the following pair of concurrent investment decisions. First examine both decisions, then make your choices ([11], p. 334).

Decision (i) choose between:

A. sure gain of £240.

B. 25% chance to gain £1,000 and 75% chance to gain nothing.

Decision (ii) choose between:

C. sure loss of £750.

D. 75% chance to lose £1,000 and 25% chance to lose nothing ([11], p. 334).

Prof Daniel Kahneman (2012) stated, “Most people, and large majorities prefer A to B and D to C. As in many other choices that involve moderate or high probabilities, people tend to be risk averse in the domain of gains and risk taking in the domain of losses” ([11], p. 334).
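A minimal sketch of the expected-value arithmetic behind this choice pattern is given below; the calculation is illustrative and is not part of the quoted text.

```python
# Expected values of the four options in the concurrent-decision experiment
# quoted above (a minimal illustrative calculation).
options = {
    "A (sure gain of £240)":          [(1.00, 240)],
    "B (25% chance to gain £1,000)":  [(0.25, 1000), (0.75, 0)],
    "C (sure loss of £750)":          [(1.00, -750)],
    "D (75% chance to lose £1,000)":  [(0.75, -1000), (0.25, 0)],
}

for name, outcomes in options.items():
    expected_value = sum(p * v for p, v in outcomes)
    print(f"{name}: expected value = £{expected_value:.0f}")

# Most people prefer A to B (although B has the higher expected value) and
# D to C (although their expected values are equal): risk aversion in the
# domain of gains and risk seeking in the domain of losses.
```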

1.4 Importance of decision making in organisations

1.4.1 Insights from Nobel prize winners

The 1978 Nobel laureate in economic sciences, Prof H.A. Simon, asserted, on the theme of decision making in organisations, that there is a kind of Euclidean parallel between decision-making tasks and the manufacturing and distribution of an organisation's products. The decision-making task is a two-stage process: the first stage is the search for the knowledge that can provide the premises that the logic of the decision-making process requires; the second is identifying the roles that can be assigned responsibility for the goals to be realised, with the constraints and side conditions that a decision must satisfy. Organisations develop effective processes for the manufacture and distribution of products, and the same importance has to be accorded to the first stage of the decision-making task ([41], p. 43).

Prof Charles Handy stated, on the vocabulary of organisations, whilst equating vocabulary to the ecology, thus: “March and Simon (1965) mean things like structure of communication, rules and regulations, standard programmes (inventory control & purchasing), selection and promotion criteria. The “vocabulary” forms the premises for decision -making itself. In short, the ecology sets the conditions for behaviour” ([39], p. 138).

Prof Daniel Kahneman, in his 2012 work, also emphasised the parallel between decision-making tasks and the life-cycle stages of a factory manufacturing and distributing an organisation's products. Thus, an organisation can be regarded as a factory that manufactures judgements and decisions. Given the analogy, the corresponding stages in the production of decisions are the framing of the problem that is to be solved, the collection of relevant information leading to a decision, and reflection and review ([11], p. 426).

1.4.2 “Heuristics and Biases” (H and B) approach in system engineering discipline

1.4.2.1 Decision making process in the GB railway domain

Risk in management systems is a theme in the safety research literature dating back to the 1990s [15]. Risk in management systems was also the subject of a research project carried out by RSSB [42]. It is a matter of regret that the findings from these research programmes do not find mention in the RSSB (2014) Taking Safe Decisions document, and this has not been addressed to date [5]. In 2010, finding that systematic errors in risk judgements had been omitted by Dr. George Bearfield, the then RSSB Safety Risk Assessment Manager, when discussing the Taking Safe Decisions framework, Sanjeev Appicharla raised the issue of "satisficing" behaviour in a letter to the editor of the System Safety Club newsletter [14, 43]. Later, Sanjeev Appicharla published his findings from the application of a systems engineering methodology at RSSB and the cognitive biases that came to attention during the application. He found that groupthink bias was a major factor in a decision over safety standards at RSSB, the GB railways industry-level decision-making body, and that the concept of latent failures was not part of the organisation's vocabulary (see Handy's comment earlier in the text) at that time; the case still remains the same [29]. Section 2.2.3 of the RSSB (2014) Taking Safe Decisions document discusses the principles of good decisions, thus: "The choice of which option to pursue should be informed by evidence and analysis but ultimately the decision taker needs to make a judgement. When making judgements, people tend to be prone to a number of cognitive biases that have the potential to result in illogical or flawed decisions. Some of these are of particular relevance to safety management in the railway industry. An awareness of these can be useful for ensuring that decisions are rational and based on objective criteria. Annex 2 presents further information about psychological influences on decision taking" [5]. The author finds that Annex 2 describes some of the psychological traps that are particularly relevant to risk management in the GB rail industry, such as hindsight bias, loss aversion and narrow framing on the part of members of the public, but does not discuss the "optimism bias" and "out of sight, out of mind" biases discussed in this chapter.

Modern control systems engineers and managers rely upon engineering judgements based upon "the transfer function method" to decide upon the stability, observability and controllability of a system's performance. One of the methods for signal information processing is the Kalman filter ([44], chapters 12 and 13). System safety experts such as Prof Jens Rasmussen, Prof James Reason and Prof Nancy Leveson extended the "management" concept of organisational theory, as representing a control function, to the analysis of risk management in socio-technical systems [15, 45, 46]. The linear process of decision making described by Nobel laureate Prof H.A. Simon (setting the agenda, representing the problem, identifying the alternatives, and selecting a course of action) and the concepts of "bounded rationality" and "satisficing" behaviour of firms and individuals also influenced system safety experts such as Prof James Reason and Prof Jens Rasmussen [15, 16, 45]. Prof Jens Rasmussen, a cognitive systems engineering expert, recognised the rising influence of the "H and B" approach and the role of the "Management Oversight and Risk Tree", developed by W. Johnson and his team, in identifying "less than adequate" or "potentially inadequate" conditions that can lead to accidents or incidents in hazardous operations [47]. Thus, the author infers that the linearisation heuristic in the study of system failures is no longer helpful: system failures can no longer be explained as a failure of the human operator or of technology alone; the study must include all stakeholders involved and organisational factors as well [29, 48].

Prof J. Barnett and Andrew Weyman, risk management experts, noted thus; “One of the key themes of the risk literature over the last 50 years is the widely encountered claim that compared to experts, lay people often over or underestimate risk. Or more specifically, that they are prone to deviate from the axioms of formal logic orientated around utility maximisation. …casting risk assessment and its management as a technical, objective process has led many scientists and policymakers to conclude that insights on heuristics simply reflect a component of broader knowledge deficit and lack of sophistication in lay understandings of risk; however, this is not an area where there are necessarily marked distinctions between lay and expert decision makers. In fact, both are susceptible to decision bias effects and prone to apply heuristics, particularly, when dealing with unknown and uncertain issues that lie at or beyond the boundaries of their knowledge (see Kunreuther et al., 2010; MacGillivray, 2014). Expert use of heuristics runs the risk of being problematic when making decisions about complex phenomena, particularly where these are without precedent or unknown to science. Under these circumstances, science and engineering disciplines have little option other than to resort to often quite sophisticated but, nonetheless, rules of thumb, educated guesses, intuitive judgement and relatively crude theoretical models, for example selecting a subset of variables for manipulation in models designed to predict uncertain future outcomes, or assessing the degree of fit with some wider classification.

Ultimately, all scientific theories and models are heuristics – they are all simplified, although often complex and rigorously tested, approximations to reality. In engineering, failure models for complex systems are inevitably limited to the imagination of their architects. Similarly for natural phenomena, weather forecasters focus on those variables they consider to be primary influences. Under most circumstances these models satisfice. If they did not, they would not be used. It is only when unforeseen novel interrelationships and alignments of variables occur that their limitations tend to become manifest and recognised (Reason, 1997)” cited in [28].

Prof McDermott A. Thomas et al. stated "Unbiased decision making is a core goal of systems engineering, encouraging careful consideration of stakeholder needs, design alternatives, and programmatic constraints and risks. However, as systems engineers, we must understand that while our discipline encourages rational decision making, the human decision-making process is largely irrational. Systems engineers have a role to guard against either individual or group related biases in the decision process. The first step is coming to recognize and understand common biases, why they occur, and how they affect group decision making. The formal foundations of human cognition and related cognitive bias should be part of systems engineering training – they are fundamental concepts in the development of core systems engineering competencies related to critical and systems thinking, group facilitation, and team dynamics [INCOSE, 2018]. Challenging individual bias in team decisions should not be seen as undue criticism, but as part of the holistic process to arrive at sound decisions. Mentoring of team leads to help them learn to recognize common biases and call them out is critical" cited in [49].

1.4.2.2 AI systems

Andrew Smart et al., at Google noted on the lessons learnt from the aviation industry, thus: “Globally, there is one commercial airline accident per two million flights [Clarence Rodrigues and Stephen Cusick, 2011]. This remarkable safety record is the result of a joint and concerted effort over many years by aircraft and engine manufacturers, airlines, governments, regulatory bodies, and other industry stakeholders [Clarence Rodrigues and Stephen Cusick, 2011]. As modern avionic systems have increased in size and complexity (for example, the Boeing 787 software is estimated at 13 million lines of code [Paul A Judas, Lorraine E Prokop. 2011]), the standard 1-in-1,000,000,000 per use hour maximum failure probability for critical aerospace systems remains an underappreciated engineering marvel [Kevin Driscoll, Brendan Hall, Håkan Sivencrona, and Phil Zumsteg, 2003]. However, as the recent Boeing 737 MAX accidents indicate, safety is never finished, and the qualitative impact of failures cannot be ignored—even one accident can impact the lives of many and is rightfully acknowledged as a catastrophic tragedy. Complex systems tend to drift toward unsafe conditions unless constant vigilance is maintained [Nancy Leveson, 2011]. It is the sum of the tiny probabilities of individual events that matters in complex systems—if this grows without bound, the probability of catastrophe goes to one. The Borel-Cantelli Lemmas are formalizations of this statistical phenomenon [Kai Lai Chung and Paul Erdös. 1952], which means that we can never be satisfied with safety standards. Additionally, standards can be compromised if competing business interests take precedence. Because the non-zero risk of failure grows over time, without continuous active measures being developed to mitigate risk, disaster becomes inevitable” cited in [50].
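The point about small per-exposure probabilities accumulating over many exposures can be illustrated numerically; the sketch below uses assumed figures and is not taken from the quoted source.

```python
# Illustration of how small per-use failure probabilities accumulate over
# many exposures (assumed figures, not from the quoted source).
p_failure_per_hour = 1e-9    # the 1-in-1,000,000,000 per use hour figure quoted above
fleet_exposure_hours = 1e8   # assumed cumulative exposure of a large fleet

# Probability of at least one failure across all exposures,
# assuming independent uses: 1 - (1 - p)^n
p_at_least_one = 1 - (1 - p_failure_per_hour) ** fleet_exposure_hours
print(f"P(at least one failure) ≈ {p_at_least_one:.3f}")   # ≈ 0.095 for these figures
# As exposure grows without bound, this probability tends to one, which is the
# statistical point the quoted passage attributes to the Borel-Cantelli lemmas.
```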

1.4.2.3 ALARP decision making

Ale, B.J. et al. noted, thus: "For the purpose of Cost Benefit Analysis (CBA), the RSSB 'value of preventing a fatality' (VPF) was £1.826 million in June 2014. In terms of the RSSB Guidance on the use of cost–benefit analysis for ensuring safety, CBA should only provide an input to the overall decision rather than giving a definitive result. Notwithstanding the extensive guidance provided by RSSB to operators, a degree of uncertainty remains with respect to the safety decision because, as in other situations in the common law system, the adequacy of a duty holder's safety measures will ultimately be determined by the Courts after an accident. While in the case of railways in the United Kingdom, guidance on risk-decisions and CBA is provided by the industry, the guidance for dams is provided by a Government Agency. However, in both cases, the risk assessment and CBA information inform the safety decision, they do not form the sole basis for the decision" [51]. For the further basis of the "H and B" approach, readers may consult the "Allais" paradox question discussed in Appicharla, which the prospect theory of risk is an attempt to resolve [37]. RSSB (2014) (see Figure 2 of the RSSB paper) summarises the information that informs decisions, the criteria that are applied, and the distinction between decisions that are taken to meet legal obligations and those that are taken voluntarily to meet commercial objectives [5]. However, it is to be noted, as observed in Appicharla, that latent errors are not part of the decision criteria [14, 29]. This fact comes to attention in the review of the risk analysis of the RSSB Safety Risk Model, where the organisational and managerial factors are omitted (a Bow-Tie model is acknowledged in the 2012 review carried out for the regulator) [52]. That Bow-Tie models do not deal with human errors across the system's lifecycle is a fact learnt from Sir Charles Haddon-Cave's investigation as well [53]. Sanjeev Appicharla considered the "H and B" approach as a part of the "Assumed Risk" branch of the Management Oversight and Risk Tree [37]. However, realising that the regulator would not publish the full human factors analysis of the ERTMS/ETCS design and development for fear of legal challenges (informed privately) because of the evidence submitted to the regulator, Sanjeev Appicharla (2016-2017) decided to develop a dynamic representation to include these factors as disturbances, using the control theoretic representation (see Section 2).

Figure 2.

Evidence and information that support industry decisions and the legal and commercial criteria applied. The picture is sourced from [5].
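To make concrete how a VPF figure of this kind enters an ALARP-style cost-benefit test, the following sketch applies a gross disproportion comparison; the measure cost, fatalities averted and disproportion factor are illustrative assumptions, not figures from the cited guidance.

```python
# A minimal sketch of the kind of cost-benefit test that informs (but does not
# determine) an ALARP/SFAIRP decision, using the June 2014 VPF quoted above.
vpf = 1.826e6              # RSSB 'value of preventing a fatality', June 2014 (£)
fatalities_averted = 0.5   # assumed statistical fatalities averted over the measure's life
measure_cost = 2.0e6       # assumed cost of the candidate safety measure (£)
gross_disproportion = 3.0  # assumed disproportion factor for this level of risk

safety_benefit = vpf * fatalities_averted
if measure_cost > gross_disproportion * safety_benefit:
    print("Cost is grossly disproportionate to the benefit: measure may be declined.")
else:
    print("Cost is not grossly disproportionate: measure would normally be adopted.")
# As the text stresses, such a calculation is only one input to the decision,
# not its sole basis.
```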

Prof Roger Kemp asked, “Against this background (of Knightian discrimination between risk and uncertainty, emphasis added)”, it is relevant to ask whether the UK safety regulatory system relying on quantified risk assessment and a calculated value of prevented fatality is really as scientific as we like to think – or can we argue that the maths is a convenient smokescreen behind which the regulator exercises qualitative engineering judgement, as in most other European countries? [23]. Prof Roger Kemp concluded, “Until the middle of the twentieth century, risks were largely related to deterministic failures of mechanical components; they were easy to identify and were managed by ‘engineering common sense’ and learning from previous experience. The second half of the twentieth century saw the introduction of quantified risk assessment that gave a (sometimes spurious) assurance that risks had been calculated, assessed and managed against criteria of acceptable risk. In the twenty-first century we are seeing more risks in complex systems, which are less amenable to traditional analysis, and where the boundaries of the ‘system’ being analysed are fluid. This will require a different type of risk management system that will be a far cry from the number-crunching of the twentieth century” [23]. Doubts about current risk assessments were raised by Prof Anson Jack and his doctoral student as well [54]. Less than adequate mental models of the risk scenarios may lead to knowledge-based mistakes and this may be unrepresented in the current risk assessments [55].


2. The SIRI cybernetics model: validity and desirability of the SIRI safety risk model

Profs J. Reason, E. Hollnagel and J. Paries noted, thus: "The understanding of how accidents occur has during the last eighty years undergone a rather dramatic development. The initial view of accidents as the natural culmination of a series of events or circumstances, which invariably occur in a fixed and logical order (Heinrich, 1931), has in stages been replaced by a systemic view according to which accidents result from an alignment of conditions and occurrences each of which is necessary, but none alone sufficient (e.g., Bogner, 2002)" cited in [56]. Further, they noted, "If we relax the requirement that every accident must involve the failure of one or more barriers, the inescapable conclusion is that we need accident analysis methods that look equally to individual as to organisational influences. In other words, models of "human error" and organisational failures must be complemented by something that could be called socio-technical or systemic accident models… It is now broadly recognised that accidents in complex systems occur through the concatenation of multiple factors, where each may be necessary but where they are only jointly sufficient to produce the accident. All complex systems contain such potentially multi-causal conditions, but only rarely do they arise thereby creating a possible trajectory for an accident. Often these vulnerabilities are "latent", i.e. present in the organisation long before a specific incident is triggered. Furthermore, most of them are a product of the organisation itself, as a result of its design (e.g., staffing, training policy, communication patterns, hierarchical relationship) or as a result of managerial decisions" ([15], p. 2). The Swiss Cheese Model has been used for three different purposes: the first is as a heuristic explanatory device (communication); the second is as a framework for accident investigation (analysis); and the third is as a basis for measurement [56].

Prof John Adams and Prof Michael Thompson, in their summary of the UK HSE Research Report 035, noted the subjective nature of risk. Further, they characterised the HSE as a hierarchical risk manager, which faces the challenging task of managing societal concerns about risk. Given that risk is subjective in nature, they concluded in the report that attempts to manage risk that (a) ignore the rewards of risk taking, and/or (b) exclude significant stakeholders, and/or (c) fail to appreciate the type of risk it is sought to manage, are unlikely to succeed [57]. Readers may consult Reason et al. for details on the development of the Swiss Cheese Model and how its criticisms are addressed [56].

The standard UK HSE risk assessment and human error models in the form of feedback strategy are presented below (See Figures 3 and 4).

Figure 3.

The risk assessment process (Figure 1.3 in UK HSE 035, 2002 [57]) (see MB3 in the MORT M branch Figure 5: MORT tree) [58].

Figure 4.

Swiss cheese model (1990) version [56].

2.1 Swiss cheese model as a heuristic explanatory device

“The SCM is a heuristic explanatory device for communicating the interactions and concatenations that occur when a complex well-defended system suffers a catastrophic breakdown. In particular, it conveys the fact that no one failure, human or technical, is sufficient to cause an accident. Rather, it involves the unlikely and often unforeseeable conjunction of several contributing factors arising from different levels of the system. It also indicates what defines an organizational accident, namely the concurrent failure of several defences, facilitated, and in some way prepared, by sub-optimal features of the organisation design. In this regard it has proved very successful. It is a simple metaphor—easily remembered and passed on—that encompasses what is often a very complex story. A Google search on ‘Swiss cheese model of accidents’ yielded around 18,400 hits covering a wide range of hazardous domains. Many of these involve passing on the model to various professional communities. A high proportion of these messages are aimed at health carers” [56].

2.2 Management oversight and risk tree for identification of potentially less than adequate management conditions

Prof Jens Rasmussen et al. discussed the role that "less than adequate" management decisions play in the risk management of loosely coupled systems, and the usefulness of the large number and complexity of causal trees included in the Management Oversight and Risk Tree developed by William Johnson (1980) (see [29]) in helping to identify resident pathogens in management practices derived from analysis of past incidents ([16], p. 155; [47]). So, the author believes there is a justification for combining the concepts of "resident pathogens" and "less than adequate" management decisions to yield a model of the actual behaviour trace in terms of system safety engineering and management. This is supported by the comment on the MORT technique in the British Standard on root cause analysis, BS EN 62740:2015, thus: "unless the organisation to which it (MORT, emphasis added) is applied is a high reliability organisation (a sort of learning organisation, emphasis added) very large number of weaknesses are found which make it difficult to implement changes". Thus, the author infers that the SIRI Cybernetic Model (2017) is a model of the actual behaviour trace, in terms of system safety engineering and management, as described by Prof Jens Rasmussen, and that it helps provide clarity about where changes are feasible [47].

The "Heuristics and Biases" (H&B) approach made its appearance in Section 2.2 of Prof Nancy Leveson's 2015 paper on how biases are inherent in risk assessments [46]. Prof Nancy G. Leveson, after discussing "confirmation bias", "the availability heuristic", "the likelihood of risk is underestimated", and "defensive avoidance", concluded, "Successful creation and use of leading indicators will require ways to control the psychological biases involved in assessing risk" [46]. The MORT User Manual provides the accident analyst with a complete list of questions that need to be answered in the analysis of an accident. The MORT User Manual (2009) is freely accessible online and can be downloaded [26, 58, 59, 60]. Overconfident operators can be seen in the domain, lacking awareness of systems engineering concepts and making false claims about axle counters [61, 62].

Taking the comment of BS EN 62740:2015 noted above, that unless the organisation to which MORT is applied is a high reliability (learning) organisation a very large number of weaknesses are found which make it difficult to implement changes, the author infers that the SIRI Cybernetic Model (2016) is a model of the actual behaviour trace in the terms described by Prof Jens Rasmussen [47, 63]. The Management Oversight and Risk Tree (MORT) is extended into a dynamic risk model in the following manner (Figure 5).

Figure 5.

MORT tree top [58].

The application of MORT fault tree analysis and the other methods provided in the MORT User Manual, such as Energy Trace and Barrier Analysis (ETBA) and Event and Causal Factors Analysis (ECFA), will enable identification of the operational reality and latent failure conditions, and identification of the heuristics used and the resulting biases, which may feed into the accident model. The process of applying the MORT analysis was described by Sanjeev Appicharla [36, 37, 40].

That the focus of attention noted by Prof Jens Rasmussen is absent in the railway domain can be noticed from inspection of the works published in the domain by Paul Hollywell, Mike Castles, and Prof Anson Jack and Neil Barnatt, where the cognitive systems engineering discipline plays its role by means of the Swiss Cheese Model (1990), the Standard Risk Management Framework (1997) or Yellow Book-type engineering safety management, but the "Heuristics and Biases" approach is omitted [54, 64, 65].

Paul Hollywell attempted to create an accident model incorporating human and organisational factors (HOF) to enable a systems approach to enhancing railway safety assurance [64]. However, it is to be noted that this model did not seek to integrate the heuristics and biases approach and the learning-lessons activity into a single model. Prof Anson Jack and Neil Barnatt (2018) refer to criticism of the Swiss Cheese Model by defending it, and accept that the Standard Risk Management Framework (1997) applies to the railway domain, but do not refer to the "Heuristics and Biases" approach. The RSSB Ten Incident Classification System for accident investigation is not used for underlying management issues, as the RAIB Report 03/2020 assumes from the presentation of the developer of the RSSB scheme (2019), whose failure to understand the systems approach to human error can be seen in comparison with Paul Hollywell's model [64, 66, 67]. The requirement for a graphical model of accidents, depicting the structure, function and behaviour of systems, was stated by Prof Jens Rasmussen (Figure 6) [68].

Figure 6.

A detailed model of organisational failures [64].

To claim internal and external validity of the SIRI Cybernetic Model as a standard model, the argument is advanced on the basis of the UK HSE risk assessment, MORT and human error models shown here, and the metaphysics expressed by the Nobel laureate in physics, Prof Erwin Schrödinger (Figure 7) [69].

Figure 7.

SIRI cybernetic representation of societal accident risk management process [63].


3. Application of the SIRI cybernetic Model (2017)

Jean-Christophe Le Coze defined the context of risk management in the socio-technical society, thus: “A major dynamic of the contemporary world is globalization and the rise of a network society, a term coined to describe the changes over the past 20–30 years. The accidents of the 1980s occurred at a time when the notion of post-industrial society was a central description of Western societies that saw major transformations in cultural, political, economic and technological areas following the Second World War (Touraine, 1969; Bell, 1973). In the first decade of the twenty-first century, the concepts of network society or informational society have been suggested, most notably by Manuel Castells (Castells, 2001), to replace this previous scheme and to embrace current transformations. Information technology, privatisation, deregulation and financial and trade liberalisation have indeed shaped a new world for industries, leading to new opportunities as much as new challenges (Berger, 2005). Incorporating new technological developments into operations, adapting strategies to uncertain global markets, structuring organisations to obtain flexibility through subcontracting and matrix organisations, complying with new demands for accountability through international and intensified standardisation and indicators (for example, key performance indicators) or negotiating with a risk-averse civil society with stronger ecological concerns are some of the new trends of the past two or three decades that have been shaping high-risk systems environments. Our world now appears more interconnected, networked and complex than it has ever been and should as a result trigger a certain degree of reflexivity. It is something James Reason expressed following some of the critics of his contribution: ‘Is Swiss Cheese past its sell-by dates?’ (Reason et al., 2006). As the world evolves and as science evolves, so should the graphical models that serve as rally points for practitioners and researchers of socio-technological risks – and their analytical backgrounds” (Le Coze, 2013, 2015)” cited in [70].

The application of the SIRI Model to the RAIB Reports 17/2019 and 11/2020 elicited the following results. The results are presented in summary form due to space constraints.

3.1 RAIB summary of the loss of safety critical signalling data on the Cambrian Coast line

On the morning of 20 October 2017, four trains travelled over the Cambrian Coast line, Gwynedd, without the temporary speed restriction (TSR) data being transmitted by the signalling system to the on-board system. Trains approached a level crossing at 80 km/h (50 mph), significantly exceeding the temporary speed restriction of 30 km/h (19 mph). The temporary speed restriction had been applied since 2014 to give adequate warning time for level crossing users [71, 72]. At around 10:02 hrs, the driver of the fourth train, 2J03, after passing through the TSR location at approximately 80 km/h (50 mph) while travelling between Barmouth and Llanaber, reported a fault with the speed information shown on the in-cab display of the train-borne system [71, 72]. On the previous evening, after the automated restart of the signalling computer, the temporary speed restriction data had not been transmitted by the database system, but a display screen incorrectly showed the signallers that the TSRs were being loaded for transmission to trains [71, 72]. Fortunately, despite the foregoing unsafe acts (in Swiss Cheese terminology), no harm occurred.

In 2011, the Cambrian Coast line was commissioned with a pilot installation of the European Rail Traffic Management System (ERTMS), which has been in operation since then [71, 72]. This system replaced traditional lineside signals and signs with movement authorities transmitted to trains. These movement authorities include the maximum speed of the train, the speed profile of the line, speed restrictions imposed by the interlocking, and temporary speed restrictions due to work sites. These are displayed to the train driver and used for automatic monitoring and enforcement of train speed [73].

The causal factors identified by the RAIB were:

  • likelihood of a corrupted database (paragraph 46, Recommendation 5);

  • no indication of database system failure was provided to signallers (paragraph 51, Recommendation 2) [71];

  • the temporary speed restriction data was stored in volatile memory, leading to loss of data during the rollover (paragraph 62, no recommendation);

  • the required level of safety integrity for validation of temporary speed restriction data uploaded to the RBC following a rollover was not achieved by the design (paragraph 67, Recommendations 1 and 2);

  • the database system (GEST server software) was unable to detect and manage the corruption of its database (paragraph 75) [71]; and

  • the vulnerability of the signalling system to a single point of failure had neither been detected nor corrected during the design, approval and testing phases of the Cambrian ERTMS project due to a combination of the following [71]:

    i. insufficient definition of the safety related software requirements for the GEST software (paragraph 81, Recommendations 1 and 2);

    ii. the hazard analysis process did not identify and mitigate the risk of database system failure (paragraph 88, Recommendation 2);

    iii. the validation process did not assure that the safety requirement for the correct display of temporary speed restrictions was implemented (paragraph 94, Recommendations 1 and 2); and

    iv. the database software (GEST server) was accepted into service without a generic product safety case (or equivalent) (paragraph 99, actions taken paragraph 149, Recommendations 1 and 2, Learning points 2 and 3).

The underlying factors identified by the RAIB in paragraph 144 were:

  1. the signalling supplier, Ansaldo STS, did not appreciate the latent failure of a single point of failure within the GEST sub-system software (paragraph 113, Recommendation 2); and

  2. the client's (Network Rail's) input did not include effective systems engineering (emphasis added) role checks to identify the design process shortcomings (paragraph 116, Recommendation 1, Learning point 4) [71].

3.2 The application of the SIRI cybernetic model (2017) to the loss of safety critical signalling data

3.2.1 The SIRI cybernetic model (2017) stakeholder analysis

The description of the stakeholder organisations involved is given below. It is to be noted that Marius Wold Albert (2019) of NTNU studied the same incident on a STAMP/CAST basis, based upon the interim RAIB report [74].

The SIRI Cybernetic Model (2017) analysis begins with identification of the stakeholder organisations involved. As per the RAIB Report (17/2019), Network Rail owns and maintains the Cambrian lines infrastructure, and employs the Machynlleth signalling control centre staff, including the signallers and signalling technicians responsible for operation and maintenance of the Cambrian ERTMS system. Arriva Trains Wales Ltd. operated the trains and employed the drivers affected by the loss of speed restrictions; Transport for Wales took over operation of these trains in October 2018. Ansaldo STS (now part of Hitachi STS) supplied the equipment for the Cambrian ERTMS installation and provides maintenance assistance to the local Network Rail signalling maintenance staff when requested; it employed the support engineer involved in restoring train services after the incident. The Cambrian ERTMS project team designed, installed, commissioned and brought the Cambrian ERTMS system into operational use; it included representatives from both Network Rail and Ansaldo STS. Lloyd's Register Rail, now Ricardo Rail/Ricardo Certification, acted as the Independent Safety Assessor (ISA) of safety case documents issued by the Cambrian ERTMS project team. Network Rail chaired and employed the discipline experts which formed the System Review Panel (SRP). The SRP determined the acceptability of the safety case documents submitted to it by the Cambrian ERTMS project team, taking account of the issues that had been identified by the ISA (Clauses 9 to 15 [71]).

Simon Paye provided a broad picture of the history that shaped the design and development of the ERTMS standard, and a map of the stakeholders initially involved (SNCF, IRRI and UIC, and the European Commission) and their perception of the problems they intended to solve since 1986 [75]. Libor Lochman relied upon Simon Paye's M.Sc. thesis to establish the background of the ERTMS [76].

Marius Wold, in Section 9.4 of his M.Sc. thesis, provided a control structure and an operational structure of the GB railways for the CAST analysis [74]. From the website of Network Rail Consulting, we learn that the European Rail Traffic Management System (ERTMS) Cambrian Early Deployment Scheme (EDS) was carried out by them at a cost of £113 million over a period of 42 months for their client, the UK Department for Transport [77]. Sanjeev Appicharla learnt from the research incident case study of the ABCL incident (2011) on the Cambrian ERTMS railway that RSSB was involved in granting a deviation to a safety critical requirement, and that the UK HSE and the National ERTMS Programme, of which RSSB was a part, were involved in selecting the baseline D without identifying the hazards involved [37, 78]. Sanjeev Appicharla provided an architecture context diagram which did not include a map of the stakeholders initially involved (SNCF, IRRI and UIC, and the European Commission), despite having access to a copy of the original Commission plan (1996). This latent error on the part of the author is to be noted (see Section 3.1) [37, 73].

3.2.2 The SIRI cybernetic model (2017) of safety standards

The RAIB Report (17/2019) accepts that the following standards are helpful in achieving safety and system assurance:

  1. GEGN8650, 'Guidance on high integrity software-based systems for railway applications';

  2. Network Rail standards (these are unspecified by the RAIB);

  3. CENELEC EN 501xx standards;

  4. European technical standards for interoperability

  5. ISO/TS 22163:2017(en) Railway applications — Quality management system — Business management system requirements for rail organisations: ISO 9001:2015 and particular requirements for application in the rail sector

The act of omission on the part of the RAIB, RSSB, ORR and the respective duty holder organisations forming part of the safety regulation community, and of their European counterparts such as the UIC, the rail research institutes, UNISIG, the Agency and the ISO working groups, is that they fail to realise that human and organisational factors (HOF) were omitted from the 2004 EU Railway Safety Directive and the ROGS Regulations as well, and that retrospective analysis cannot be carried out if the reference architecture is followed dogmatically [71, 73, 79, 80, 81]. The RAIB has failed to raise the issue of the lack of application of the systems engineering standard ISO/IEC 15288 after its initial report [82]. The RSSB guidance note GEGN8650, recommended by the RAIB, will reveal many errors when examined thoroughly. One of the latent failure conditions is the lack of awareness of the fact that measurement of software integrity cannot be carried out (see Clause 2.3, Measuring Integrity, considered against the NASA Langley Research (1993) report) [83, 84]. Inspection of the Network Rail (Infrastructure) Ltd. (NRIL) Health & Safety Management System shows that it does not contain any system safety method to manage the complexity of safety critical systems [85]. Inspection of the NRIL System Operator (2018) plan and RSSB Taking Safe Decisions (2014) reveals no information or decision criteria by which the latent failure conditions identified by system safety research and RAIB accident investigations are addressed [67, 86]. From scrutiny of the available documentation, including the research by R.W. Butler and G.S. Finelli, the empirical research on biases, the UK HSE societal risk concerns, and the regulatory lack of awareness of the need to enforce the integration of human and organisational factors into the activity of system definition and to perform risk analysis, risk assessment and evaluation accordingly, it is inferred that the application of systems thinking based approaches in the Network Rail (Infrastructure) Ltd. (NRIL) Health & Safety Management System is less than adequate [6, 45, 74, 83, 87].
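The infeasibility of demonstrating ultra-high software integrity by testing, the point drawn from Butler and Finelli above, can be illustrated with a short calculation; the confidence level chosen below is an assumption for illustration, not a figure from the cited report.

```python
import math

# A minimal sketch of the Butler and Finelli style argument referenced above:
# demonstrating an ultra-low failure rate purely by failure-free testing
# requires infeasible test durations. Figures below are illustrative.
target_failure_rate = 1e-9   # required maximum probability of failure per operating hour
confidence = 0.99            # assumed statistical confidence required

# Failure-free test duration n needed so that (1 - p)^n <= 1 - confidence
required_hours = math.log(1 - confidence) / math.log(1 - target_failure_rate)
print(f"Failure-free test hours required: {required_hours:.3g}")
print(f"Equivalent test duration: {required_hours / (24 * 365):.0f} years")
# ~4.6e9 hours, i.e. hundreds of thousands of years of testing: hence software
# integrity at this level cannot be demonstrated by testing alone.
```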

3.2.3 The SIRI cybernetic model (2017) of Swiss cheese model barrier analysis

The sequence of activity in the Swiss Cheese Model analysis (ETBA in MORT terms), starting with the SA1 Incident/Accident event and working downwards from the Barrier Analysis, is shown in the MORT User's Manual [58]. Sanjeev Appicharla's earlier work may be consulted to learn about the application of the ETBA/MORT technique [37, 40]. The relationship between the latent failure conditions and the accident/incident is explored through the ETBA and the MORT branches, using the cybernetic model and the Swiss Cheese Model so as to include social and organisational factors and to meet the graphical representation requirements for the accident model. In the Swiss Cheese Model literature, we find that HFACS, ACCIMAP, STAMP-CAST and PDCA-SHELL/PDCA cycle-based process models are used to identify latent failure conditions [45, 88, 89]. The belated effort by the EU Agency for Railways to include HOF can be seen from its 2020 webinar [90, 91]. The EU SAMRAIL and SAMNET projects failed to improve safety culture, and the less than adequate safety culture was affirmed again by a doctoral thesis (Table 1) [103, 104, 105].

SB1: harmful energy flow or adverse agent or environmental condition. SB2: target: vulnerable person or thing. SB3: barriers and controls to separate energy and target. For IM/RU SMS categories of factors, please refer to the ROGS guidance [92] and the MORT User's Manual (2009) [58].
SB1 (system hazard): kinetic hazard of the ERTMS train moving into the crossing space in excess of the permitted speed [59]; for component level hazards, please refer to Marius Wold Albert [74]. SB2 (target): worst case risk scenario of a school bus carrying thirty or so children [93]. SB3 (barriers and controls):
Regulatory Decision-making layer: (Status quo bias)
  • LTA ALARP Decision Making [43, 51, 81]

  • LTA ORR Risk (SFAIRP) Policy [94]

  • LTA 2004 EU Railway Safety Directive [73]

  • LTA ORR ROGS Review LTA [92]

Industry body layer: (Availability heuristic – out of sight out of mind bias) & Anchoring heuristic (single point failure):
  • LTA Railway Group Standards Planning: RSSB guidance note GEGN8650 LTA [84]. Infeasibility of SIL4 Assurance by testing [83]

  • LTA CENELEC safety standards (Duty holder Standards) & UNISIG Safety Analysis LTA [95]

  • LTA System definition (RSSB, 2011) [22]

  • LTA Risk Assessment (Application of IEC 61508 standard and related UK HSE Guidance LTA), (UK HSE RR 035), (National ERTMS Board, 2003), (RSSB SRM Review LTA), Yellow Book, RSSB Taking Safe Decisions [5, 25, 27, 52, 71, 78, 80, 95]

Duty holder management and co-operation SMS layer
  • LTA Risk Management [43, 71, 77]

  • LTA RU/IM Safety Management System [42, 96, 97]

  • IM Health & Safety document LTA [85, 86]

Risk Assurance Management layer
  • LTA ERTMS Risk Assessment Review (Ricardo Rail/Ricardo Certification Review LTA) [71].

  • LTA System Assurance Management (Ricardo Rail/Ricardo Certification Review LTA) [71].

Operator layer
  • LTA Competence Management operator level [52, 71, 98, 99, 100]

Learning lessons from past failures
  • LTA Accident Modelling and analysis of human and organisational factors, risk in management systems and LTA risk management framework [28, 37, 42, 47, 67, 96, 101, 102]

Table 1.

SCM/MORT ETBA analysis [56, 58, 71].
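
For readers who wish to work with the barrier analysis programmatically, the following minimal sketch shows one possible way of recording the SB1/SB2/SB3 structure and the less than adequate (LTA) findings summarised in Table 1. The class names, fields and abbreviated findings are illustrative assumptions and are not taken from the MORT User's Manual [58].

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class BarrierLayer:
    """One defensive layer in the Swiss Cheese/ETBA representation, with its
    'holes' recorded as less than adequate (LTA) findings."""
    name: str              # e.g. "Regulatory decision-making layer"
    dominant_bias: str     # heuristic or bias judged to dominate the layer
    lta_findings: List[str] = field(default_factory=list)

@dataclass
class EtbaRecord:
    """Energy Trace and Barrier Analysis record: harmful energy flow (SB1),
    vulnerable target (SB2) and the stack of barriers and controls (SB3)."""
    energy_flow: str
    target: str
    barriers: List[BarrierLayer] = field(default_factory=list)

    def holes(self) -> List[str]:
        """List every LTA finding, i.e. the aligned holes through which the
        harmful energy flow can reach the target."""
        return [f"{layer.name} ({layer.dominant_bias}): {finding}"
                for layer in self.barriers
                for finding in layer.lta_findings]

# Abbreviated encoding of part of Table 1 (wording shortened for illustration).
cambrian = EtbaRecord(
    energy_flow="Kinetic hazard: ERTMS train entering the crossing space "
                "above the permitted speed",
    target="Level crossing users; worst case, a school bus of children",
    barriers=[
        BarrierLayer("Regulatory decision-making layer", "status quo bias",
                     ["LTA ALARP decision making", "LTA ORR SFAIRP policy"]),
        BarrierLayer("Industry body layer",
                     "availability and anchoring heuristics",
                     ["LTA Railway Group Standards planning",
                      "LTA system definition"]),
    ],
)

for hole in cambrian.holes():
    print(hole)
```

Such a record makes it straightforward to compare the holes identified across case studies; Table 2 below follows the same layer structure.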

3.3 Double fatality track worker accident

3.3.1 The RAIB summary of track worker accident

At approximately 09:52 hrs on 3 July 2019, a passenger train travelling from Swansea to London Paddington struck and fatally injured two track workers at Margam East Junction on the South Wales main line; a third track worker came very close to being struck. The driver of train 1L48, the 09:29 Swansea to London Paddington service, sounded the train horn and applied the emergency brake when he sighted three track workers ahead on the Up Main line, his own line. Prior to this sighting, he had seen three track workers walking in the same direction as his train on the adjacent line. At the time of the accident, the train was travelling at about 50 mph (80 km/h). The track workers walking on the adjacent Down Main line became aware of the approaching train and tried to warn their colleagues as the train passed them [106].

The three track workers on the Up Main line were working on a set of points, using a petrol-engine driven tool for loosening and tightening large nuts (commonly called a nut runner). CCTV images taken from a camera at the front of the train showed that two workers were standing in the four-foot while the one using the nut runner was crouching in the six-foot, tightening and loosening bolts. The three track workers did not become aware of the train until it was very close to them; by this time the train was travelling at around 50 mph. The three workers were part of a group of six staff who were carrying out a maintenance task on the track side, and the group was not aware that the train was approaching until it was too late for them to move to a position of safety. The RAIB 11/2020 Report identified the causal factors and underlying factors; for want of space they are not reported here, but the report can be accessed online [107].
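
To give a sense of the time margins involved, the short calculation below converts the reported train speed into metres per second and estimates the time from a sighting point to the work site. It is illustrative only: the 270 m sighting distance is a hypothetical figure and is not taken from the RAIB 11/2020 report.

```python
# Illustrative arithmetic only: the 270 m sighting distance is hypothetical
# and is not a value taken from the RAIB 11/2020 report [107].
train_speed_mph = 50
mph_to_metres_per_second = 0.44704
speed_m_per_s = train_speed_mph * mph_to_metres_per_second   # about 22.4 m/s

sighting_distance_m = 270   # hypothetical clear sighting distance
warning_time_s = sighting_distance_m / speed_m_per_s

print(f"Speed: {speed_m_per_s:.1f} m/s")
print(f"Time from sighting point to arrival: {warning_time_s:.1f} s")
# Roughly 12 seconds: little margin for workers relying on sighting alone
# while attention is captured by a noisy task such as operating a nut runner.
```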

3.3.2 The SIRI cybernetic model (2017) of stakeholder analysis

For want of space, the author requests readers to perform the other steps of the model analysis based upon the example given earlier see the RAIB 11/2020 Report [107].

3.3.3 The SIRI cybernetic model (2017) of safety standards

For want of space, the author requests readers to perform the other steps of the model analysis based upon the example given earlier.

3.3.4 The SIRI cybernetic model (2017) Swiss cheese model barrier analysis

See Table 2.

SB1: harmful energy flow or adverse agent or environmental condition. SB2: target: vulnerable person or thing. SB3: barriers and controls to separate energy and target, with evidence. For IM/RU SMS categories of factors, please refer to the ROGS guidance [92] and the MORT User's Manual (2009) [58].
SB1 (system hazard): kinetic hazard of a train running into a track section under maintenance [107]. SB2 (target): two track workers were struck and fatally injured [107]. SB3 (barriers and controls):
Regulatory Decision-making layer: (Status quo bias)
  • LTA ALARP decision making (UK HSE RR 035) [57, 81]

  • LTA RSSB Taking Safe Decisions [5, 43]; LTA ORR Policy [94]

  • LTA RSSB Safety Standards decision making [29]

  • LTA ORR/RU/IM Change Management [67, 107, 108].

Industry body layer: (Availability heuristic – out of sight out of mind bias) & Anchoring heuristic (single point failure):
  • LTA RSSB System definition [22, 107, 108]. LTA RSSB Safety Standards decision making [29]

  • LTA IM Application of Standards [6, 45, 110]

Duty holder management and co-operation SMS layer
  • LTA IM Risk Assessment Review [67, 107, 108].

  • LTA IM Risk Management System [67, 107, 108].

  • LTA IM Competence Management [67, 107, 108].

  • LTA IM System Assurance Management [67, 107, 108]

Learning lessons from past failures
  • LTA Accident Modelling and analysis of human and organisational factors, risk in management systems [28, 37, 42, 47, 67, 96, 101, 102, 106]

Table 2.

SCM/MORT ETBA analysis [56, 58, 106, 107].


4. Conclusions

The chapter showed that failures in safety management led to safety risk problems in both tightly and loosely coupled systems. The rush to implement AI algorithms in the domain must be checked for problems in the auditing process, in both non-AI and AI contexts, because safety upsets do occur even in low accident rate industries such as aviation, revealing culture problems in those industries [35, 50].

The vocabulary of organisational decision making suggested by Prof. H. A. Simon did not cover risk assessments that were grossly underestimated due to the imaginability bias. The idea that boundedly rational organisations do not seek to minimise harm is illustrated in both the highly automated aviation work system and the non-automated track work system(s). Organisations need to strike a balance between loss aversion and excessive optimism to benefit society where goals involving emergent properties, such as the driveability of an ERTMS train, are concerned [11, 41, 111].


Acknowledgments

The author expresses gratitude to the reviewers for the corrections they suggested, and thanks the publisher for accepting this very lengthy chapter. The author also extends gratitude to the organisations that made information freely available, among them the NRI Foundation and the GB railway organisations, and to the decision scientists and risk scholars who made their papers freely available.


Conflict of interest

There is no external funding and hence, no competing interests are involved.

References

  1. 1. Prof Alfred North Whitehead. Process and Reality, an essay in cosmology, Gifford Lectures delivered in the University of Edinburgh during the session 1927-28 (1985). (D. W. David Ray Griffin, Ed.) Edinburgh: Free Press, New York. Available from: https://en.wikipedia.org/wiki/Process_and_Reality [Retrieved: 24 March 2019]
  2. 2. The Editorial Board. Journal of Rail Transport Planning & Management Aims and Scope. 2011. Available from: https://www.sciencedirect.com/journal/journal-of-rail-transport-planning-and-management/about/aims-and-scope [Retrieved: 2 September 2021]
  3. 3. Flyvbjerg B. From Nobel prize to project management: Getting risks right. Project Management Journal. 2006;37(3):5-15. Available from: https://www.researchgate.net/profile/Bent_Flyvbjerg/publication/263747196_From_Nobel_Prize_to_Project_Management_Getting_R [Retrieved: 28 March 2020]
  4. 4. Bert De Reyck, Daniel Read, Jeremy Harrison, Ioannis Fragkos, Yael Grushka-Cockayne. (2017). Optimism Bias Study. Available from: https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/576976/dft-optimism-bias-study.pdf [Retrieved: 9 April 2021]
  5. 5. RSSB. Guidance Taking Safe Decisions—How Britain’s Railways Take Decisions that Affect Safety. 2014. Available from: https://www.rssb.co.uk/Library/risk-analysis-and-safety-reporting/2014-guidance-taking-safe-decisions.pdf [Retrieved: 02 October 2018]
  6. 6. Appicharla SK. Crossrail train protection (Plan B)—railway safety regulations. ORR; 2015. Available from: https://orr.gov.uk/__data/assets/pdf_file/0004/19894/crossrail-exemption-application-consultation-sanjeev-kumar-appicharla.pdf [Retrieved: 18 May 2019]
  7. 7. Network Rail Investment Projects. Crossrail Train Protection (Plan B)—Railway Safety Regulations 1999 Exemption Application Report. ORR Railway safety consultations; 2015. Available from: https://orr.gov.uk/__data/assets/pdf_file/0010/18856/paddington-0-12-exemption-application-report.pdf [Retrieved: 1 August 2020]
  8. 8. Simon Wright OBE. Crossrail programme organisation and management for delivering London’s Elizabeth line. Crossrail; 2017. Available from: https://learninglegacy.crossrail.co.uk/wp-content/uploads/2017/09/1R-001-Programme-organisation-and-management.pdf [Retrieved: 20 May 2019]
  9. 9. Amyas Morse KCB. HC 2106: Completing Crossrail. NAO; 2019. Available from: https://www.nao.org.uk/wp-content/uploads/2019/05/Completing-Crossrail.pdf [Retrieved: 19 May 2019]
  10. 10. Lidén MJ. Dimensioning windows for railway infrastructure maintenance: Cost efficiency versus traffic impact. 2016. Available from: https://www.sciencedirect.com/science/article/abs/pii/S221097061630004X [Retrieved: 23 October 2021]
  11. 11. Kahneman D. Thinking Fast and Slow. London: Penguin Group; 2012
  12. 12. Appicharla S. Governance of Risk Management. Google Meet KREC 82 Alumni bi-monthly Conference. Webinar. 2021
  13. 13. Appicharla S. Technical review of common safety method using system for investigating railway interfaces (SIRI) methodology. In: 8th IET International System Safety Conference incorporating the Cyber Security Conference 2013. Cardiff, UK: IET; 2013. pp. 1-9. DOI: 10.1049/cp.2013.1701
  14. 14. Appicharla SK. Letter to editor (A): Tolerability of risk. The Safety Critical Systems Club Newsletter. 2010;19(3):8-10. Available from: https://scsc.uk/scsc-112
  15. 15. Reason J. Human Error. 17th ed. New York, USA: Cambridge University Press; 1990
  16. 16. Rasmussen J. In: Sage AP, editor. Cognitive Systems Engineering. New York: John Wiley and Sons, Inc; 1994. Available from: https://www.wiley.com/en-us/Cognitive+Systems+Engineering-p-9780471011989 [Retrieved: 4 September 2019]
  17. 17. Perrow PC. Normal Accidents; Living with High Risk Technologies. 1999 ed. New Jersey: Princeton University Press; 1984. Available from: https://en.wikipedia.org/wiki/Normal_Accidents [Retrieved: 11 December 2020]
  18. 18. Becker GS. Human Capital: A Theoretical and Empirical Analysis with Special Reference to Education. London: The University of Chicago Press; 1964. Available from: https://www.nber.org/books-and-chapters/human-capital-theoretical-and-empirical-analysis-special-reference-education-third-edition [Retrieved: 15 October 2015]
  19. 19. Dekker E, Remi B. Two types of ecological rationality: Or how to best combine psychology and economics. Journal of Economic Methodology. 2019;26(4):291-306. Available from: https://www.tandfonline.com/doi/full/10.1080/1350178X.2018.1560486 [Retrieved: 28 July 2021]
  20. 20. Muttram RI. Safety. Railway Safety: Railtex—International Railway Engineering Conference—IMeche/ICE. London: IMeche Conference Transactions; 2001. pp. 3-8
  21. 21. Corrie J, Gilamartin BP. Managing safety and reliability—theory and reality. Safety and Reliability. 2001. p. 15. Available from: https://www.tandfonline.com/doi/abs/10.1080/09617353.2001.11690721 [Retrieved: 25 September 2019]
  22. 22. Bearfield GJ, Short R. Standardising safety engineering approaches in the UK Railway. In: The Sixth International System Safety Conference. Birmingham: The Institution of Engineering and Technology; 2011. p. 5. Available from: https://ieeexplore.ieee.org/document/6136922 [Retrieved: 17 March 2020]
  23. 23. Kemp R. Quantitative risk management and its limits: A UK engineering perspective. In: Routledge Handbook of Risk Studies. Abingdon, Oxon OX14 4RN: Routledge; 2016. pp. 286-307. Available from: https://play.google.com/books/reader?id=HS7eCwAAQBAJ&pg=GBS.PT285
  24. 24. The Economist. Railways: What price safety? What growing fears of safety cost the railways. 2003. Available from: https://www.economist.com/britain/2003/11/27/what-price-safety [Retrieved: 25 January 2021]
  25. 25. Hessami A, Corrie JD, Muttram R, Aylward R, Clemenston B, Davis RA, et al. Yellow Book, Engineering Safety Management. London: Railtrack on behalf of the UK Rail Industry; 2000. Available from: http://www.arbutus-tc.co.uk/docs/irse2008.pdf [Retrieved: 24 August 2019]
  26. 26. Appicharla S. System for investigation of railway interfaces. In: The First IET International Conference on System Safety. London: Institution of Engineering and Technology; 2006. pp. 7-16. DOI: 10.1049/cp:20060197
  27. 27. BS ISO/IEC 15288:2002. Annex D. System Concepts. London: International Electro-technical Commission/BSI London; 2002
  28. 28. Weyman A, Barnett J. Heuristics and biases in decision making about risk. In: Adam Burgess AA, editor. Routledge Handbook of Risk Studies. Oxon OX14 4RN: Routledge; 2016. pp. 235-249. Available from: https://www.google.co.uk/books/edition/Routledge_Handbook_of_Risk_Studies/Gi3eCwAAQBAJ?hl=en [Retrieved: 25 June 2021]
  29. 29. Appicharla S. System for investigation of railway interfaces. In: The Fifth IET International System Safety Conference. Manchester: Institution of Engineering and Technology; 2010. p. 6. Available from: https://ieeexplore.ieee.org/document/5712351
  30. 30. Whittingham R. The Blame Machine. Oxford: Elsevier; 2004
  31. 31. Safety and Reliability Society (SaRS). 10 years of using the CSM for Risk Evaluation and Assessment (CSMRA)—extended Q&A/Panel session. Safety and Reliability Society (SaRS); 2021. Available from: https://www.youtube.com/watch?v=TjzD7OFi9uE&ab_channel=SaRSociety [Retrieved: 7 June 2021]
  32. 32. TED Talks. You are a Simulation & Physics Can Prove It: George Smoot at TEDxSalford (T. Talks, Producer). You Tube; 2014. Available from: https://www.youtube.com/watch?v=Chfoo9NBEow&ab_channel=TEDxTalks [Retrieved: 15 December 2015]
  33. 33. The Royal Swedish Academy of Sciences. Advanced information on the Prize in Economic Sciences 2002. 2002. Available from: https://www.nobelprize.org/uploads/2018/06/advanced-economicsciences2002.pdf [Retrieved: 21 May 2021]
  34. 34. Elliott DB. Benefits of Adopting Systems Engineering Approaches in Rail Projects. 2014. Available from: https://etheses.bham.ac.uk/id/eprint/5322/1/Bruce14PhD.pdf [Retrieved: 27 March 2020]
  35. 35. Appicharla SK. From Nobel prize(s) to safety risk management: How to identify latent failure conditions in the safety risk management practices. 39th US International System Safety Conference. Online Event: US International System Safety Society; 2021
  36. 36. Appicharla S. Modelling and Analysis of Herefordshire Level Crossing Accident using Management Oversight and Risk Tree (MORT). IEE Explore; 2011. Available from: https://ieeexplore.ieee.org/abstract/document/6136924 [Retrieved: 23 January 2012]
  37. 37. Appicharla SK. Chapter three: Application of cognitive systems engineering approach to railway systems (system for investigation of railway interfaces). In: Zboinski K, editor. Railway Research—Selected Topics on Development, Safety and Technology. Rijeka, Croatia: Intech; 2015. pp. 81-113. DOI: 10.5772/61527
  38. 38. The Economist. Rethinking thinking. 1999. Available from: https://www.economist.com/christmas-specials/1999/12/16/rethinking-thinking [Retrieved: 27 July 2021]
  39. 39. Handy C. Understanding Organisations. 2011 ed. London: Penguin; 1976. Available from: https://books.google.co.uk/books/about/Understanding_Organizations.html?id=NitYAC2aH2sC [Retrieved: 22 October 2019]
  40. 40. Appicharla S. Chapter six: System for investigation of railway interfaces. In: Perpina PX, editor. Reliability and Safety in Railways. Croatia: Intechopen; 2012. pp. 144-192. Available from: http://cdn.intechopen.com/pdfs/34435/InTech-Railway_system_safety.pdf
  41. 41. Simon HA. Administrative Behavior: A Study of Decision-Making Processes in Administrative Organizations. 4th ed. New York: Free Press; 1997
  42. 42. Oldfield A. RSSB Report: T169—Risk in Management Systems Rev 2. London: RSSB; 2004
  43. 43. Bearfield G. Taking safe decision—railway industry ALARP guidance. Safety Systems—The Safety Critical Systems Club Newsletter. 2009;19(1):1-5
  44. 44. Nagrath IJ, Gopal M. Control Systems Engineering. 2nd ed. New Delhi: Wiley Eastern Limited; 1982
  45. 45. Svedung I, Rasmussen J. Graphic representation of accident scenarios: Mapping system structure and the causation of accidents. Safety Science. 2002;40(5):397-417. Available from: https://www.sciencedirect.com/science/article/abs/pii/S0925753500000369#! [Retrieved: 11 July 2021]
  46. 46. Leveson NG. A systems approach to risk management through leading safety indicators. Reliability Engineering & System Safety. 2015:17-34. Available from: https://dspace.mit.edu/bitstream/handle/1721.1/108601/Leveson_A%20systems%20approach.pdf?sequence=1&isAllowed=y [Retrieved: 25 March 2021]
  47. 47. Rasmussen J. Risk management in a dynamic society. Safety Science. 1997;27(2):183-213. Available from: http://sunnyday.mit.edu/16.863/rasmussen-safetyscience.pdf [Retrieved: 4 July 2020]
  48. 48. Dien Y, Llory M, Montmayeu R. Organisational accidents investigation methodology and lessons learned. Journal of Hazardous Materials. 2004;111(1-3):147-153. Available from: https://www.sciencedirect.com/science/article/abs/pii/S0304389404001037 [Retrieved: 8 January 2021]
  49. 49. McDermott TA, Folds DJ, Hallo L. Addressing cognitive bias in systems engineering teams. INCOSE International Symposium. 2020;30(1):257-271. DOI: 10.1002/j.2334-5837.2020.00721.x
  50. 50. Smart A et al. Closing the AI accountability gap: Defining an end-to-end framework for internal algorithmic auditing. In: FAccT: Fairness, Accountability, and Transparency. Barcelona, Spain: Association for Computing Machinery, New York; 2020. pp. 33-44. Available from: https://dl.acm.org/doi/10.1145/3351095.3372873 [Retrieved: 31 March 2021]
  51. 51. Ale BJM, Hartford DND, Slater D. ALARP and CBA all in the same game. Safety Science. 2015;76:90-100. Available from: https://www.sciencedirect.com/science/article/abs/pii/S0925753515000405 [Retrieved: 21 April 2021]
  52. 52. Hunt M, Taig T. Review of LU and RSSB Safety Risk Models. London: ORR; 2012. Available from: https://orr.gov.uk/__data/assets/pdf_file/0019/5059/ttac-safety-risk-models-review.pdf [Retrieved: 6 May 2019]
  53. 53. Haddon Cave C. The Nimrod Review. 2009. Available from: https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/229037/1025.pdf [Retrieved: 25 December 2019]
  54. 54. Jack PA, Barnatt N. Safety analysis in a modern railway setting. Safety Science. 2018:177-182. Available from: https://www.sciencedirect.com/science/article/pii/S092575351731130X [Retrieved: 25 August 2019]
  55. 55. Johnson-Laird PN. Mental models and human reasoning. Proceedings of the National Academy of Sciences. 2010;107(43):18243-18250. Available from: https://www.pnas.org/content/107/43/18243 [Retrieved: 26 October 2010]
  56. 56. Reason J, Hollnagel E, Paries J. Revisiting the “Swiss Cheese” Model of Accidents. Brussels: Eurocontrol Agency; 2006. Available from: https://www.eurocontrol.int/publication/revisiting-swiss-cheese-model-accidents [Retrieved: September 2011]
  57. 57. Adams J, Thompson M. UK HSE RR 035: Taking into Account Societal Concerns about Risk: Framing the Problem. Norwich: The UK HSE; 2002. Available from: http://www.hse.gov.uk/research/rrpdf/rr035.pdf [Retrieved: December 2015]
  58. 58. The Noordwijk Risk Initiative Foundation, Royal Dutch Navy. NRI MORT User’s Manual. NRI; 2009. Available from: http://www.nri.eu.com/NRI1.pdf [Retrieved: 16 March 2017]
  59. 59. Ericson CA II. Hazard Analysis Techniques for System Safety. New Jersey: Wiley & Sons; 2005
  60. 60. The Noordwijk Risk Initiative Foundation. NRI MORT User’s Manual for Use with the Management Oversight & Risk Tree Analytical Logic Diagram. NRI; 2009. Available from: https://www.nri.eu.com/NRI1.pdf [Retrieved: 4 April 2011]
  61. 61. Murphy N, Roberts S. Systems Engineering—Railway Operators get it too! 2012. Available from: https://incoseuk.org/Documents/Groups/Railway/RIG_17_07_2012_Presentation.pdf [Retrieved: 11 June 2021]
  62. 62. Bourn J. The Modernisation of the West Coast Main Line. London: The HM Stationery Office; 2006. Available from: https://www.nao.org.uk/wp-content/uploads/2006/11/060722.pdf [Retrieved: 22 August 2019]
  63. 63. Appicharla SK. RSL 013 and RSL 024: Written Evidences for the UK Transport Select Committee’s Railway Safety Inquiry. 2017. Available from: https://www.parliament.uk/business/committees/committees-a-z/commons-select/transport-committee/inquiries/parliament-2015/rail-safety-16-17/publications/ [Retrieved: 31 May 2020]
  64. 64. Hollywell P. A systems approach to enhancing railway safety assurance. AusRAIL PLUS 2014, doing it smarter. People, Power, Performance. Melbourne, Vic, Australia: The National Academy of Sciences; 2014. Available from: https://trid.trb.org/view/1341033 [Retrieved: 2 August 2019]
  65. 65. Castles M. Engineering Safety Management. PWI Half Day Seminar, Practical Control of Risk S & C, London. 2014. Available from: https://www.thepwi.org/technical_hub/presentations_for_tech_hub/141127_lul2_risk_within_s_c/27th_november_2014_london_underground_practical_control_risk_within_s_c [Retrieved: 7 June 2021]
  66. 66. Gibson H. Human Factors Performance in the Overall System and Use of the Ten Incident Factors. 2019. Available from: https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/847197/Human_Factor_Performance_in_the_Overall_System_and_Use_of_the_Ten_Incident_Factors.pdf [Retrieved: 12 July 2021]
  67. 67. RAIB. Report 03/2020: Class investigation into factors affecting safety-critical human performance in signalling operations on the national network. 2020. Available from: https://assets.publishing.service.gov.uk/media/5eb935c6e90e07083b7d75fe/R032020_200512_HPSO.pdf [Retrieved: 15 June 2021]
  68. 68. Jens R, Svedung I. Graphical Representation of accident scenarios: Mapping the system structure and causation of accidents. Safety Science. 2002;40(5):397-417. Available from: https://orbit.dtu.dk/en/publications/graphic-representation-of-accident-scenarios-mapping-system-struc;;https://www.sciencedirect.com/science/article/abs/pii/S0925753500000369?via%3Dihub
  69. 69. Schrödinger E. What Is Life? The Physical Aspect of the Living Cell. Dublin: Cambridge University Press; 1944. Available from: https://en.wikipedia.org/wiki/What_Is_Life%3F [Retrieved: 3 September 2019]
  70. 70. Le Coze JC. Risk management: Sociotechnological risks and disasters. In: Routledge Handbook of Risk Studies. Abingdon, Oxon OX14 4RN: Routledge; 2016. pp. 270-284
  71. 71. RAIB. Report 17/2019: Loss of safety critical signalling data on the Cambrian Coast line, 20 October 2017. 2019. Available from: https://assets.publishing.service.gov.uk/media/5df8fa1be5274a08de86827d/R172019_191219_Cambrian_Coast_line.pdf [Retrieved: 5 January 2020]
  72. 72. The Rail Accident Investigation Branch. Interim Report: Loss of speed restrictions on the Cambrian line. Derby: The Rail Accident Investigation Branch, Department for Transport; 2018. Available from: https://assets.publishing.service.gov.uk/media/5bc871d5e5274a0956564a41/IR012018_181018_Cambrian_TSRs.pdf [Retrieved: 25 September 2019]
  73. 73. Winter DP. Compendium on ERTMS. Hamburg: DVV Media Group GmbH; 2019
  74. 74. Albert MW. A case study to investigate accidents involving the European Rail Traffic Management System (ERTMS): Investigation of complex accidents in the digitalised railway sector. 2019. Available from: https://ntnuopen.ntnu.no/ntnu-xmlui/bitstream/handle/11250/2634920/Albert%20Marius%20Wold.pdf?sequence=1&isAllowed=y [Retrieved: 11 June 2021]
  75. 75. Simon P. Standardizing European railways: A supranational struggle against persistent national languages and emergent local dialects. Flux. 2010;79-80(1):124-136. Available from: https://www.cairn-int.info/journal-flux1-2010-1-page-124.htm [Retrieved: 19 May 2021]
  76. 76. Lochman L. Background on ERTMS. In: Winter UP, editor. Compendium on ERTMS—European Rail Traffic System. 2009th ed. Hamburg: EU Rail Press, DVV Media House; 2009. pp. 31-50
  77. 77. Network Rail Consulting. ERTMS, Early Deployment Scheme. 2018. Available from: https://www.networkrailconsulting.com/our-capabilities/network-rail-projects/ertms-early-deployment-scheme/ [Retrieved: 13 July 2021]
  78. 78. The NEL Consortium. The UK HSE Research Report 067: Train Protection—Technical review of the ERTMS Programme Team report. Norwich: The UK HSE, HMSO; 2003. Available from: http://www.hse.gov.uk/research/rrpdf/rr067.pdf [Retrieved: 6 August 2013]
  79. 79. The UK HSE. HSG 65: A Guide to Measuring Health & Safety Performance. UK: UK HSE GOV; 2001. Available from: http://www.hse.gov.uk/opsunit/perfmeas.pdf [Retrieved: August 2019]
  80. 80. The UK HSE. HSG 238: Out of Control: Why Control Systems Go Wrong and How to Prevent Failure. 2003 ed. Norwich: HMSO; 1995. Available from: http://www.hse.gov.uk/pubns/priced/hsg238.pdf [Retrieved: 15 October 2019]
  81. 81. The UK HSE. Reducing Risk Protecting People. Norwich: Her Majesty’s Stationery Office; 2001. Available from: https://www.hse.gov.uk/managing/theory/r2p2.pdf [Retrieved: 12 June 2021]
  82. 82. RAIB. Report 27/2009: Investigation into runaways of road-rail vehicles. 2009. Available from: https://assets.publishing.service.gov.uk/media/547c901ee5274a428d000173/R272009_091029_RRV.pdf [Retrieved: 3 October 2019]
  83. 83. Butler RW, Finelli GB. The infeasibility of quantifying the reliability of life-critical real-time software. IEEE Transactions on Software Engineering. 1993;19(1):3-12. Available from: https://shemesh.larc.nasa.gov/fm/papers/Butler-nonq-paper.pdf [Retrieved: 5 December 2020]
  84. 84. RSSB Control Command and Signalling Standards Committee. GE GN 8650 (2017) Guidance on High Integrity Software Based Systems for Railway Applications. 2017. Available from: https://catalogues.rssb.co.uk/rgs/standards/GEGN8650%20Iss%201.pdf [Retrieved: 31 January 2020]
  85. 85. Group Safety & Engineering Director. Network Rail (Infrastructure) Ltd (NRIL) Health & Safety Management System. 2020. Available from: https://safety.networkrail.co.uk/wp-content/uploads/2021/01/NR-HSMS-Version-5-November-2020.pdf [Retrieved: 12 June 2021]
  86. 86. Kaye J. Strategic Business Plan. London: Network Rail; 2018. Available from: https://cdn.networkrail.co.uk/wp-content/uploads/2018/02/System-Operator-Strategic-Plan.pdf [Retrieved: 12 June 2021]
  87. 87. Derek Hitchins FF. Systems Engineering. A 21st Century Systems Methodology. 2007 ed. West Sussex: John Wiley & Sons Limited; 2007. Available from: https://play.google.com/books/reader?id=tdZod1zaIeQC&pg=GBS.PA84 [Retrieved: 22 April 2020]
  88. 88. Fukuoka K, Furusho M. Relationship between latent conditions and the characteristics of holes in marine accidents based on the Swiss cheese model. WMU Journal of Maritime Affairs. 2016;15(2):267-292. Available from: https://link.springer.com/article/10.1007/s13437-015-0099-8 [Retrieved: 13 July 2021]
  89. 89. Wiegmann A, Shappell SA. DOT/FAA/AM-01/3: A Human Error Analysis of Commercial Aviation Accidents Using the Human Factors Analysis and Classification System (HFACS). 2001. Available from: https://www.faa.gov/data_research/research/med_humanfacs/oamtechreports/2000s/media/0103.pdf [Retrieved: 14 September 2019]
  90. 90. Dror IE. Cognitive and human factors in expert decision making: Six fallacies and the eight sources of bias. Analytical Chemistry. 2020;92(12):7998-8004. DOI: 10.1021/acs.analchem.0c00704
  91. 91. European Union Agency for Railways. Human and Organisational Factors (HOF) in Railway Automation. 2020. Available from: https://www.era.europa.eu/content/human-and-organisational-factors-hof-railway-automation_en#relatedDocuments [Retrieved: 11 April 2021]
  92. 92. The Office of Rail and Road (ORR). Guide to ROGS: The Railways and Other Guided Transport Systems (Safety) Regulations 2006 (as amended). London: ORR; 2020. Available from: https://www.orr.gov.uk/sites/default/files/2020-11/rogs-guidance-october-2020.pdf [Retrieved: 23 December 2020]
  93. 93. Appicharla S. SIRI Analysis of Risk Associated with Level Crossing Operations of ABCL Type. Unpublished RSSB Report. London; 2009
  94. 94. The Office of Rail and Road. RIG 2009-5: Assessing whether risks on Britain’s railways have been reduced SFAIRP. 2017. Available from: https://www.orr.gov.uk/media/10878/download [Retrieved: 14 June 2021]
  95. 95. UNISIG. SUBSET-091: Safety Requirements for the Technical Interoperability of ETCS in Levels 1 & 2, version 3.4.0. Brussels: European Railway Agency; 2015. Available from: https://www.era.europa.eu/sites/default/files/filesystem/ertms/ccs_tsi_annex_a_-_mandatory_specifications/set_of_specifications_2_etcs_b3_mr1_gsm-r_b1/index027_-_subset-091_v340.pdf [Retrieved: 30 March 2020]
  96. 96. French S. Discussion paper 20: The Investigation of Safety Management Systems and Safety Culture: The Roundtable on Safety Management Systems. Paris Cedex 16: International Transport Forum; 2017. p. 44. Available from: https://www.itf-oecd.org/sites/default/files/docs/investigation-sms-safety-culture.pdf [Retrieved: 23 August 2020]
  97. 97. Suokas J. On the reliability and validity of safety analysis. Espoo; 1985. Available from: https://cris.vtt.fi/en/publications/on-the-reliability-and-validity-of-safety-analysis-dissertation [Retrieved: 17 September 2019]
  98. 98. Hulme A, Stanton NA, Walker GH, Waterson P, Salmon PM. What do applications of systems thinking accident analysis methods tell us about accident causation? A systematic review of applications between 1990 and 2018. Safety Science. 2019;117:164-183. Available from: https://www.sciencedirect.com/science/article/pii/S0925753518319672?via%3Dihub [Retrieved: 24 November 2020]
  99. 99. Mearns KJ. Safety leadership and human and organisational factors (HOF)—Where do we go from here? In: Benoît Journé HL, editor. Human and Organisational Factors: Practices and Strategies for a Changing World. Toulouse, France: Springer Open; 2020. p. 138. Available from: https://link.springer.com/content/pdf/10.1007%2F978-3-030-25639-5.pdf [Retrieved: 23 August 2020]
  100. 100. Brendan R. Accounting for differing perspectives and values: The rail industry. In: Benoît Journé HL, editor. Human and Organisational Factors: Practices and Strategies for a Changing World. Springer Briefs in Safety Management. Toulouse, France: Springer; 2020. pp. 5-13. Available from: https://link.springer.com/book/10.1007%2F978-3-030-25639-5 [Retrieved: 23 August 2020]
  101. 101. Technical Committee ISO/TC 262. ISO 31000(en) Risk Management—Guidelines. ISO; 2018. Available from: https://www.iso.org/obp/ui/#iso:std:iso:31000:ed-2:v1:en [Retrieved: 15 April 2021]
  102. 102. Underwood P, Waterson P. Systemic accident analysis: Examining the gap between research and practice. 2013. Available from: https://www.sciencedirect.com/science/article/abs/pii/S0001457513000985 [Retrieved: 11 March 2016]
  103. 103. European Commission Fifth Framework Programme. D2.9.1: Synthesis of SAMRAIL Findings. Brussels: The European Commission; 2006. Available from: https://trimis.ec.europa.eu/sites/default/files/project/documents/20060727_155616_03705_SAMRAIL_Final_Report.pdf [Retrieved: 24 April 2019]
  104. 104. Smith DP. Safety Case for the Introduction of New Technology into an Existing Railway System. London: Imperial College, London; 2016. Available from: https://spiral.imperial.ac.uk/bitstream/10044/1/45313/1/Smith-P-2017-PhD-Thesis.pdf [Retrieved: 15 May 2019]
  105. 105. WS ATKINS RAIL LIMITED, UK. European Commission Fifth Framework Programme: SAMRAIL/SM/D2, D2.9.1: Synthesis of SAMRAIL findings. 2004. Available from: https://trimis.ec.europa.eu/sites/default/files/project/documents/20060727_155616_03705_SAMRAIL_Final_Report.pdf [Retrieved: 11 February 2020]
  106. 106. Network Rail Wales and Western Region. SMIS2317549: Report of a Level 3 (formal) Investigation. 2019. Available from: https://www.networkrail.co.uk/wp-content/uploads/2020/01/Margam-Level-3-Investigation-into-a-double-fatality-23-1-20.pdf [Retrieved: 8 February 2020]
  107. 107. RAIB. Report 11/2020: Track workers struck by a train at Margam, Neath Port Talbot, 3 July 2019. 2020. Available from: https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/934741/R112020_201112_Margam.pdf [Retrieved: 12 January 2021]
  108. 108. RAIB. Report 07/2017: Class investigation into accidents and near misses involving trains and track workers outside possessions. Derby: The Rail Accident Investigation Branch, Department for Transport; 2017. Available from: https://assets.publishing.service.gov.uk/media/58edf5aced915d06b0000149/R072017_170413_Track_workers.pdf [Retrieved: 6 June 2021]
  109. 109. Fox K. How Has the Implementation of Safety Management Systems (SMS) in the Transportation Industry Impacted on Risk Management and Decision-Making? Lund, Scania: Lund University, Sweden; 2009. Available from: https://www.humanfactors.lth.se/fileadmin/_migrated/content_uploads/thesis-2009-Fox-Impact_of_SMS_on_Risk_Management_and_Decision_Making.pdf [Retrieved: 29 May 2021]
  110. 110. Johansson J. Risk and vulnerability analysis of interdependent technical infrastructures [Doctoral thesis]. Lund: Lund University; 2010. Available from: https://www.iea.lth.se/publications/Theses/LTH-IEA-1061.pdf
  111. 111. Tomas Rosberg TC. Driveability analysis of the European rail transport management system (ERTMS)—A systematic literature review. 2021. Available from: https://www.sciencedirect.com/journal/journal-of-rail-transport-planning-and-management; https://www.sciencedirect.com/science/article/pii/S221097062100007X [Retrieved: September 2021]
