Written evidence submitted by Mr Sanjeev Appicharla (RLS0024)
I submitted written evidence to the current Transport Select Committee Inquiry on the theme of Railway Safety, and to the previous 2013 Level Crossings Inquiry. Both submissions were published, and I thank the Committee for publishing them.
This supplementary evidence provides answers to the questions raised under the theme of the Rail Safety Inquiry.
The previous Transport Committee's recommendations on competence and regulatory awareness, areas identified as deficient, were not acted upon.
The accidents show that the risk policy does not cover the major issues and problems likely to be encountered in operations.
The analysis revealed problems with leadership, the setting of examples, the definition of stakeholder and customer requirements, human factors and task analysis, the adequacy of the hazard analysis, and the knowledge deployed to assure that the duty of care responsibility is being discharged.
The conclusion follows inevitably from the results of the two case studies of latent errors presented in this submission. The action to take is to reform the process of system definition, hazard identification and analysis, and identification of causal factors, and to standardise the approach and process of risk management.
Relying upon both the “individual” and the “system” approach to human error, the case study results are presented for the latent (unmanifest) errors that led to (manifest) active errors on 11th May 2014 at the Frampton user worked level crossing site and on 14th May 2014 at the Oakwood user worked level crossing site, both on farmland.
The data for the investigation are taken from the RAIB reports, Railway Group Standards, and research on the reliability of system safety, human reliability and systems engineering analysis.
The “individual” approach to human error was applied by the RAIB in its investigation of the fatal accident at the 2014 Frampton user worked level crossing site, and the “system” approach to human error in the investigation of the accident at the 2015 Oakwood user worked level crossing site. It is illustrated that the mathematical models used to predict risk do not take into account the human failings within the human factors and risk management disciplines.
The quality of safety and risk analysis carried out within the railway domain is rated as immature, as indicated by the persistence of public concern; it fails to meet the standards expected of it to raise awareness of the accident potential inherent in proposed implementation plans.
It is hoped that the case study analysis will provoke and instil new thinking towards further reform in the way safety hazard and risk analysis, systems engineering and the decision-making process are carried out in the domain, as suggested in this paper.
The RAIB investigations did not use any of the eight formal approaches to human failure, such as:
a) Swiss Cheese Model/HFACS/GEMS (used by Prof Emeritus James Reason since 1990);
b) ECFA/HAZOP/MORT/Heuristics & Biases (used by Sanjeev Appicharla since 2006, with validation of the SCM/HFACS/GEMS approach);
c) CAST/STAMP (used by Prof Nancy Leveson since 2004[1]);
d) SCM-ATSB/STAMP/ACCIMAP-GAP (used by Dr Peter Underwood since 2013[2]);
e) FTA/ETA/HEART (used by RSSB);
f) ACCA/ESCA/SCA (used by Dr Ivan Lucic since 2010 on London Underground[3]);
g) Human factors/SMS (used by the European Railway Agency since 2013);
h) Human Error Probabilities (HEPs) for generic tasks and Performance Shaping Factors (PSFs) selected for railway operations[4] (2012).
RSSB promotes the use of the FTA/ETA/HEART methodology for pre-mortem analysis and the Swiss Cheese Model/HFACS/GEMS methodology for post-mortem analysis of fatal events/incidents[5],[6].
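The arithmetic behind the pre-mortem FTA/ETA side of such an analysis can be sketched in a few lines. The gate functions and all probabilities below are the author's illustration with entirely hypothetical figures, not RSSB's model:

```python
# Illustrative fault tree / event tree arithmetic (all probabilities hypothetical).

def and_gate(*probs):
    """Top event requires every basic event to occur (assumes independence)."""
    p = 1.0
    for x in probs:
        p *= x
    return p

def or_gate(*probs):
    """Top event occurs if any basic event occurs (assumes independence)."""
    p_none = 1.0
    for x in probs:
        p_none *= (1.0 - x)
    return 1.0 - p_none

# Fault tree (causes): hazard = user on crossing AND train approaching
p_hazard = and_gate(1e-3, 5e-2)          # hypothetical per-traversal figures

# Event tree (consequences): given the hazard, branch on "user clears in time"
p_clears = 0.9
p_collision = p_hazard * (1.0 - p_clears)
print(f"P(hazard) = {p_hazard:.2e}, P(collision) = {p_collision:.2e}")
```

Joining the causal (fault tree) and consequence (event tree) sides around the hazard in this way is what the bow-tie representation mentioned below formalises.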
The difference between the approaches to pre-mortem and post-mortem analysis leaves the gap wide open for the analysts (an individual or a team) to impose their own worldview on the flow of events, as the integration of pre-mortem and post-mortem analysis (the bow-tie model) reveals gaps in the perception and conception of the accident risk analysis team. The problem of decision taking in organisations over identified hazards that are not eliminated or controlled was raised by Prof Charles Perrow[7] (1984/99).
The gap in understanding of the RSSB human factors, risk, operations and signalling teams was revealed to the author during the application of ECFA/HAZOP/MORT/Heuristics & Biases when working for them between 2005 and 2010. Since then, the author has published case studies of latent errors in risk management and made submissions to the Transport Select Committee showing that latent errors in design and risk management do contribute to fatal accidents in several domains.
Since 2013, the European Railway Agency, as the agency responsible for integrating capacity and safety concerns, has found it difficult to integrate human factors into the safety management system. The difficulty arises because attention is not paid to the lack of pro-active risk management in the railway domain.
The lack of awareness of the implications of the knowledge base on human factors available since the mid-1960s, and the overconfident assertion that rail is the safe transport medium, do not reveal competence in safety management. The Danish Risø National Laboratory, with its man-machine research beginning in 1969, the US AEC with its 1974 Management Oversight and Risk Tree, the US Air Force with its military standards for system safety, and Boeing/NASA with FTA/ETA analysis are some of the pioneering organisations in the development of accident-potential and accident investigation methods. However, accidents in the nuclear, chemical and railway domains have led sociologists and engineering academics to research the processes and theories available to manage the risk of accidents.
The idea of human factors/error, which originated in astronomical science in the late eighteenth century, is discussed later in the text. However, the challenge of risk-based decision making, solving insurance problems and balancing risk decisions against economic decisions was noted by expert mathematical physicists such as Prof Henri Poincaré (1905/1952).
The ACCA/ESCA/SCA approach (used by Dr Ivan Lucic since 2010) does not merit application to duty-holder organisations, as the research does not recognise that construction of the risk profile resulting from hazards created by an organisation providing a service has to use the theory of probability. If the organisation uses statistical analysis, then the attendant biases that result from the use of correlations and covariations have to be accounted for. Further, the use of the conditional probability theorem (Bayes' theorem) requires a stereotyped thinking approach, and there is no way to transcend the stereotype or recursive thinking, as Prof Henri Poincaré pointed out in 1905. The way to escape the problem of circularity was pointed out in antiquity by the great Greek mathematician Archimedes[8]. The author has noted the developments in probability and statistical theories as described by Prof David Hand (2015).
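The base-rate sensitivity that makes conditional probability easy to misread can be illustrated with a short sketch; all figures below are hypothetical and chosen only to show the effect:

```python
# Bayes' theorem with hypothetical figures: a rare hazard state combined with
# an imperfect cue yields a posterior far below the cue's reliability.

def posterior(prior, p_cue_given_hazard, p_cue_given_safe):
    """P(hazard | cue observed) via Bayes' theorem."""
    p_cue = (p_cue_given_hazard * prior
             + p_cue_given_safe * (1.0 - prior))
    return p_cue_given_hazard * prior / p_cue

# A "train approaching" state is rare per traversal; the cue is imperfect.
p = posterior(prior=0.01, p_cue_given_hazard=0.95, p_cue_given_safe=0.05)
print(f"P(hazard | cue) = {p:.3f}")   # well below 0.95, driven by the base rate
```

An analyst who equates the cue's reliability (0.95) with the posterior probability of the hazard commits exactly the base-rate neglect that the heuristics-and-biases literature describes.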
From the perspective of the classification of theories in the safety sciences, the author, albeit reluctantly, accepts being placed alongside the resilience engineering domain, beyond the label attributed to academic safety scientists such as Prof Nancy Leveson and others who share similar ideas of going beyond both the “individual” and the “system” approach to human error. The reasons for the reluctance to be slotted into the same academic group will become clear later in the text.
The MORT management causal factors are listed below:
1) Less than adequate risk policy (2001 till date).
2) Less than adequate implementation of risk policy (2009 till date).
3) Less than adequate risk assessment and control system (1974 till date).
4) Less than adequate concept of hazard analysis process (2008 till date).
5) Less than adequate understanding of human error at the safety standard, regulators, duty holder as well as accident investigation organisations (2009 till date).
6) Less than adequate systems engineering capability and failure of INCOSE Rail Working Group to improve its knowledge base (2006 till date).
7) Less than adequate understanding of human error at the IOSH (2016).
8) Less than adequate action by the IET Rail Group after becoming aware of the errors in the task specification of the user worked crossing requirements (2008 till date).
9) Less than adequate understanding of systems engineering, safety engineering, signalling and telegraph engineering, human factors integration/engineering at the standards setting decision making committees (2010 till date).
10) Less than adequate awareness of errors contained in the risk models currently in use (1990 till date).
11) Less than adequate awareness of errors contained in the Railway Safety Guidance Principles (2009 till date).
12) Less than adequate awareness of the pro-active risk management process and the competence needed (2013 till date).
13) Less than adequate awareness on how to apply risk-benefit analysis as per the UK HSE Law (2006 till date).
14) Less than adequate awareness on the part of the UK HSE on how to deal with complex socio-technical systems like railways.
15) Less than adequate awareness on the part of the ICE-Rail, IRSE, IET-Rail and Ergonomics-Rail institutes of the concept of human error.
The causal factors for the above tragic events were identified through the application of models, methods and standards relating to human error, accident investigation, systems and safety engineering, and the human and empirical sciences, through the System for Identification of Railway Interfaces (SIRI) methodology, whose development for the safety body, RSSB, began in 2005/2006 and finished in 2013:
a) Swiss Cheese Model and its progenitor, the skills-rules-knowledge framework: James Reason (1990), Jens Rasmussen (1986), Mikael Krogerus, Roman Tschappeler, Jenny Pening (2008);
b) Barrier Analysis and Management Oversight and Risk Tree analysis: Jens Rasmussen, A.M. Pejtersen, L.P. Goodstein (1994), Dr Robert J. Nertney (1971/2002), F. Koorneef (2002), P. Schallier (2002), John Kingston (1985), Rudolf Frei (1975), William Johnson (1971), Jack Clark (1971), Jack Ford (1971), Trish Sentence (2002), Gordon Stevenson (2002), Graham Spencer (2002);
c) Publications of standard-setting organisations, such as ISO/IEC 15288 for systems engineering, IEC 61508 for system safety engineering, and IEC 61882 for HAZOP studies;
d) Scientific method and hypothesis: Aristotle (384-322 BCE), Francis Bacon (1605), Sir Isaac Newton (1704), David Hume (1748), G.W. Kitchin (1861), Prof Henri Poincaré (1905), Bertrand Russell (1914), Alfred North Whitehead (1926), Donald W. Sherburne (1977), David Ray Griffin (1977), Arthur Johnston (1973), Arthur Schopenhauer (1811, 1819), Valerie Roebuck (2003), Prof Nancy Leveson (2004), Prof David Hand (2015), 1950 Nobel laureate Prof Bertrand Russell, 1978 Nobel laureate Prof H.A. Simon (1978, 1986), 2002 Nobel laureate Prof Daniel Kahneman (2002, 2011), 2006 Nobel laureate Prof George Smoot III (2006, 2014), and Prof G.S. Baker (1992).
To reduce or eliminate harm at level crossings and in works carried out on the trackside, changes are necessary to the way harm at level crossings is measured (the process of system hazard and risk analysis) and to the way safety is managed (potential or actual accident risk management). In other words, an improved understanding of latent and active failures, classified as skill-, rule- and knowledge-based human performance errors, is needed in all railway organisations.
To compound the problem of lack of awareness on the part of the organisations named, both statisticians such as Prof Andrew Evans and practising level crossing experts, the former with expert statistical knowledge and the latter with meagre engineering and economics knowledge, hold the false belief that the user of the farm level crossing is at fault. This false belief is entertained without admitting that four duty-holder organisations are involved: the farm land owner, Network Rail, the train operating/freight operating firms, and the Highways Agency.
In 2003, Prof Andrew Evans, paying attention to the decision science perspective, stated that the use of FN curves in decisions under risk is seriously flawed (see section 2.3 of the 2003 UK HSE Research Report 073)[9]. That RSSB/ORR continue to rely upon the FN-curve risk model despite this evidence is a cause for concern.
In the 2013 report, notwithstanding the concern just stated on the use of FN curves, Prof Andrew Evans compared the risk faced by a pedestrian at a railway level crossing with the risk faced at a road junction, and concluded that the risk at the railway level crossing is 114% of the baseline road risk, implying that the level crossing roughly doubles the risk of the journey (page 39 of the 2014 HOC Report[10]). This risk calculation, in general terms, seeks to establish a baseline comparison of risk at level crossings against the road transport domain.
Another risk calculation can be presented to counter this general argument. The argument advanced in paragraph 55 of RAIB Report 09/2007[11], relying upon the RSSB Safety Performance Report for 2005, indicates that the overall risk per traffic moment at a UWC is forty-two times the risk at an AHB (i.e. the number of accidents is high relative to the low usage of UWCs). Since the risk to train occupants is very low, it can be concluded that the risk to a member of the public using a UWC is much higher than the risk to a user of an AHB.
Both risk-index calculations rely upon the same statistical principles used by maintenance engineers in the reliability domain to establish indices such as mean time to failure (MTTF), mean time between failures (MTBF) and the effectiveness of maintenance regimes[12].
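That shared principle, occurrences normalised by exposure, can be shown in a minimal sketch. The figures below are illustrative only; the accident and traffic-moment counts are hypothetical and were chosen simply to reproduce a forty-two-fold ratio of the kind cited above:

```python
# Both MTBF in the reliability domain and "risk per traffic moment" are
# occurrence counts normalised by exposure (all figures hypothetical).

operating_hours = [400.0, 520.0, 610.0, 470.0]   # hours between successive failures
mtbf = sum(operating_hours) / len(operating_hours)

# Risk per traffic moment = accidents / exposure
uwc_accidents, uwc_traffic_moments = 21, 1.0e6
ahb_accidents, ahb_traffic_moments = 10, 2.0e7

uwc_rate = uwc_accidents / uwc_traffic_moments
ahb_rate = ahb_accidents / ahb_traffic_moments
ratio = uwc_rate / ahb_rate

print(f"MTBF = {mtbf:.0f} h; UWC/AHB risk ratio = {ratio:.0f}")
```

The ratio is driven as much by the exposure denominators as by the accident counts, which is why low-usage UWCs can show a high per-traversal risk despite few absolute accidents.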
Despite the availability of risk indices, the problem remains that the root causes of human errors at level crossings are still not resolved. The phenomenon of “looked but failed to see” accidents, relied upon to explain the failure of rider 1 to look for the train (paragraph 43 of RAIB Report 05/2015), and the explanation that the victim may have misjudged the train's speed and/or its distance from the crossing (paragraph 47 of RAIB Report 05/2015[13]), do not explain the problem. These explanations suggest that the user task specification is understood by neither the accident investigator nor the safety and standards body.
However, the fact that the vehicle driver faces greater risk at a user worked crossing is known, and is recorded in the proceedings of the 2008 IET User Worked Crossings Conference by Prof F. Schmid[14]. The presentation outlined that:
• User worked crossings are based on laws dating back to the mid-1800s, written for a society that travelled largely on foot, and are not necessarily applicable to today's highly mechanised society;
• Users of crossings are required to cross the tracks five times in order to use the crossing in the prescribed manner, which increases both the real and perceived risk of being struck by a train;
• User worked crossings are not necessarily signposted in a way that communicates the method of operation effectively to the occasional or new user;
• User worked crossings should be completely re-thought using 21st-century psychological and human factors thinking.
Prof F. Schmid and Charles Watson assumed a double-track railway line. Their particular observations contradict the observations and calculations of Prof Andrew Evans. Prof F. Schmid and Charles Watson wished to stimulate discussion as to why it is easier to blame the user.
When we consider the human factors problems of “looming motion”, the inability to judge the speed of an approaching object (the looming threat), and “looked but failed to see” accidents, we have to ask whether the current understanding of human factors is sufficient, given that expert mathematical physicists over the last hundred years have raised questions about our theories of the Universe, and the question remains open to debate. Here, the author is concerned with natural scientists such as Prof Henri Poincaré (1905), Sir Roger Penrose (2004) and Prof Stephen Hawking (2005), who have raised the question of moral philosophy in their studies of the mathematical theory of the Universe.
On 31st July 1795, Mr David Kinnebrook[15], employed at the Greenwich Observatory, attained an unfortunate immortality in astronomy by losing his job as an observer owing to his personal equation. In the history of astronomy, the personal equation was discovered by the astronomer Friedrich Bessel. He was the first scientist to realise the effect later called the personal equation: several persons observing simultaneously determine slightly different values, especially when recording the transit times of stars.
Given the viewpoints of Prof F. Schmid and Charles Watson and of Prof Andrew Evans, and the observations on the personal equation, the chosen method needs to eliminate biases due to the personal equation. Further, R.B. Whittingham (2004[16]) observes that an individual who did not survive the accident is unable to put up any defence against the attribution of blame.
The connection between physics, optics and looming motion leads to the question of the dimensions of space involved, and whether they are compatible with the axioms of probability. The history of this question dates back to the early eighteenth century. The use of inductive arguments to improve the state of moral philosophy, in a similar way to natural philosophy, is the argument advanced by Sir Isaac Newton in his treatise on optics (1704).
The ideas of causation and the perception of objects were examined by philosophers such as Bishop Berkeley (1709), David Hume[17] (1748), Immanuel Kant (1783), Prof Henri Poincaré (1905), Sir Roger Penrose (2004), Prof Stephen Hawking (2005) and Nobel laureate Prof Daniel Kahneman (2011), considering the nature of the geometrical object, its attributes, and our perception of it, to name a few treatises that came to my attention. Is the retina a two-dimensional space-time framework for viewing objects, as claimed by Immanuel Kant (1783)? Or are the judgements of the perceiving self different from those of the knowing self, as argued by Prof Daniel Kahneman, a standard hypothesis accepted in the scientific field of psychophysics[18]?
Any risk assessment is required to be compliant with the Common Safety Method Regulation on Risk Assessment (CSM-RA), whose application has been mandatory since 2013.
The guidance states the following on the first step of the CSM-RA, viz. the system definition:
3.2 The CSM process starts with the system definition (which can use information from the preliminary system definition). This provides the key details of the system that is being changed - its purpose, functions, interfaces and the existing safety measures that apply to it. In most cases, the hazards which need to be analysed will exist at the boundary of the system with its environment.
3.3 The definition is not static and during iterations of the risk management process, it should be reviewed and updated with the additional safety requirements that are identified by the risk analysis. It therefore describes the condition (or expected condition) of the system before the change, during the change and after the change.
3.4 The Regulation states that: The system definition should address at least the following issues: (a) system objective, e.g. intended purpose; (b) system functions and elements, where relevant (including e.g. human, technical and operational elements); (c) system boundary including other interacting systems; (d) physical (i.e. interacting systems) and functional (i.e. functional input and output) interfaces; (e) system environment (e.g. energy and thermal flow, shocks, vibrations, electromagnetic interference, operational use); (f) existing safety measures and, after iterations, definition of the safety requirements identified by the risk assessment process; (g) assumptions which shall determine the limits for the risk assessment.
3.5 The system definition needs to cover not only normal mode operations but also degraded or emergency mode.
3.6 Consideration of interfaces should not be restricted to physical parameters, such as interfaces between wheel and rail. It should include human interfaces, for example the user-machine interface between the driver and driver displays in the cabs of rail vehicles. It should also include interfaces with non-railway installations and organisations, for example, the interface with road users at level crossings.
3.7 Operational procedures and rules, and staff competence should be considered as part of the system environment in addition to the more usual issues such as weather, electromagnetic interference, local conditions such as lighting levels etc.
3.8 A good test of whether the system definition is complete and sufficient is if the proposer can describe the system elements, boundaries and interfaces, as well as what the system does.
3.9 The description can effectively serve as a model of the system and should cover structural issues (how the system is constructed or made up) and operational issues (what it does, and how it behaves normally and in failure modes). The existing safety measures, which may change as the risk assessment process progresses, can be added after the structural and operational parts of the model are complete.
3.10 For some projects, the proposer may not know all the environmental or operational conditions in which the altered or new system will operate. In these circumstances, they should make assumptions on the basis of the intended or most likely environment. These assumptions will determine the initial limits of use of the system and should be recorded. When the system is put into use, the proposer (who may be different to the original proposer) should review the assumptions and analyse any differences with the intended environmental and operational conditions.
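The system definition items (a) to (g) quoted above can be captured as a simple structured record. The sketch below is the author's illustration only: the field names paraphrase the Regulation, and the UWC example values are hypothetical, not taken from any actual assessment:

```python
# A minimal record of the CSM-RA system definition items (a)-(g); field names
# paraphrase the Regulation and are illustrative, not normative.
from dataclasses import dataclass, field
from typing import List

@dataclass
class SystemDefinition:
    objective: str                          # (a) intended purpose
    functions_and_elements: List[str]       # (b) human, technical, operational
    boundary: str                           # (c) including interacting systems
    interfaces: List[str]                   # (d) physical and functional
    environment: List[str]                  # (e) operational use, EMI, etc.
    safety_measures: List[str] = field(default_factory=list)  # (f) updated each iteration
    assumptions: List[str] = field(default_factory=list)      # (g) limits of the assessment

# Hypothetical user worked crossing (UWC) example:
uwc = SystemDefinition(
    objective="Allow authorised users to cross the line safely at a UWC",
    functions_and_elements=["user", "train driver", "signaller", "gates", "phone"],
    boundary="Crossing, approach tracks and telephone link to the signaller",
    interfaces=["road user / railway", "user / signaller telephone"],
    environment=["daylight and darkness", "sighting distance", "weather"],
    assumptions=["single-line railway", "occasional agricultural use"],
)
# Per guidance 3.3, safety requirements found by the risk analysis are added back:
uwc.safety_measures.append("Whistle boards for trains approaching the crossing")
```

The point of the record is the iteration in guidance 3.3: the definition is not static, so the safety-measures field grows as the risk analysis identifies new requirements, while the recorded assumptions fix the limits of the assessment per item (g).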
From the perspective of hazards and their control, the properties of interest are preparedness, in terms of proactive safety management to anticipate problems, changes and hazards, and awareness of both the hazards the organisation faces and the adequacy (or lack thereof) of its defences to control them. Another property of interest is the ability to learn. The organisation learns from experience by systematically gathering and analysing near misses and incidents and by encouraging the reporting of incidents; in addition, Costella et al. (2009) argue that organisations can also learn from normal working practices and from disseminating and sharing best practice. For instance, focusing on how procedures are implemented during normal working practices can help identify any gaps between how managers think procedures should be used and how they are actually applied by front-line staff (Costella et al., 2009).
The three properties of awareness of hazards and their control, the adequacy (or lack thereof) of defences, and organisational learning are applicable to the adequacy of controls at several of the interfaces that are of interest to the present inquiry. Prof Jens Rasmussen (1969, 1986, 1994, 1997, 2004) was of the view that a cross-functional approach is necessary to explore the question of human error in the organisational context. In Prof Jens Rasmussen's view, socio-technical systems migrate towards the unsafe boundary, and operations may resemble the Brownian movements of molecules in a gas, within which human operatives can navigate freely. Systems may migrate towards the unsafe boundary owing to three factors: (1) the workload on local operatives exceeding that acceptable for normal working; (2) the cost boundary limiting investment; (3) the boundary of safe performance as defined by safety campaigns. If the effects of human error are reversible, then human operatives can recover if and when the effects are noticed.
However, if the effects of human error are irreversible, then accidents (small or large) are inevitable. The contribution of various actors towards this unsafe migration of the system can be represented by means of the Accimap technique. However, from a pro-active safety management perspective, the Accimap technique does not afford methods to identify and analyse hazards.
The CSM-RA affords a framework to explore the three properties stated. The CSM-RA requirements are satisfied by the following pictorial diagrams, which represent the physical, human and social interfaces. However, the CSM-RA does not recommend any particular methodology; recourse to standard methods is therefore necessary to assure the quality and reliability of the safety and risk analysis. In performing the CSM-RA, the analyst or proposer has to consider the risk acceptance criteria of a reference system, the requirements of codes of practice, or explicit risk assessment.
For loosely coupled systems, risk management by looking for resident pathogens in management practice, as derived from the analysis of past accidents, may be the strategy forward. However, errors, violations and latent factors may interact in a unique way and create a future accident in unknown ways, and constant mindfulness of all resident pathogens cannot be maintained. Further, defences based upon administrative measures aimed at controlling the division of work, supervisory monitoring and safe work procedures may be overruled by adaptation to other, active criteria governing co-operative work. Therefore, the causal trees included in the Management Oversight and Risk Tree, augmented with advances in the cognitive sciences, are the selected way forward. In accordance with the CSM-RA, this requires all components of the system to be included.
The overall socio-technical layers schema for railway operations is indicated in Figure 1.
From a human factors perspective, the stakeholders range from the various departments of the UK Government dealing with rail, the Treasury, justice, economic and safety regulation, the audit office and accident investigation, to the railway infrastructure operator (duty holder), the private firms operating in the railway passenger and freight businesses (duty holders), the NHS, labour unions, supply-chain firms, engineering and operating institutions, educational institutions, and the members of the public who use the level crossings.
Figure 4: Overall Socio-technical layers schema for railway operations
The general safety schema to guide the hazard identification activity in the railway design and operations is indicated in the Figure 5.
Figure 5: General safety schema for hazards and barriers in the railway operations
Works on epistemology, psychology and the philosophy of mind by Prof Robert Audi (1998-2002), Prof Nicky Hayes (1994-2010) and Prof Daniel Kahneman (2002-2011) support the above mental representation of the socio-technical world.
The first step in the SIRI methodology is to create a system definition, bringing together designer and operator knowledge and expertise in the form of system diagrams of context and activities/functions. The system definition so created is a form of socio-technical operational system. The system definition used here maps to the Abstraction Hierarchy (AH) framework developed by Prof Jens Rasmussen (1994).
Figure 1 shows how the three duty-holder organisations (the IM, the RU and the farm owner) operate together, and the facilities provided.
The Health and Safety at Work etc. Act 1974 places general duties on employers to ensure that employees, and others who may be affected by the work of their undertaking, are not, so far as is reasonably practicable, exposed to risks to their health and safety.
The equipment and people, as components managed by the respective duty-holders and identified as a socio-technical system, are marked in different colours. Where colour discrimination is not possible, identification can be done by means of labels attached to the diagrams.
From the diagram, based upon the information encoded into the Railway Group Standard, it is clear that the perceptions of the farm users, train driver, and signallers can vary due to their differing work experiences.
From the initial scrutiny, the object posing danger is the moving train, whose visibility may be impaired for various reasons beyond the control of the subjects involved in the situation. How users interact with the level crossing system is described next, to elicit the normal working scenario.
Figure 1: SIRI Architecture Context Diagram
The specification of the task any user is expected to perform is described in the form of activities/functions in the functional diagram below. The flow-of-events diagram explains why the described activities need to be performed by the user: to attain the function of minimising the collision risk.
The requirements placed upon the various actors involved in the flow of events shaping the performance/outcomes are described. The text within the oval shapes represents the conditions required for successful human performance, as encoded in the safety standards. The activity to be performed is recorded in the diagram by means of a rectangular shape. The spatio-temporal sequence of events is read from left to right.
Figure 2: Flow of activities at the UWC-T expected by the ORR/RSSB/duty-holder organisations, as per the RAIB Report
The above user requirements specification is made functionally active by means of physical signs installed at the site, together with the gates/barriers, phones, etc. to answer the user's query on the train's location.
For a vehicle-mounted user, the flow chart needs to be modified: the number of traverses needed increases to three in the case of a single-line railway at a single crossing.
The next lowest sub-system comprises the physical objects, the signs, gates, phones, etc. installed at the site.
From the perspective of social interpretation, as per Max Weber's legal and formal rationality[19], the provision of the following signs and facilities at the site fulfils the functions stated in the physical activities section. Instead of images, the SIRI ACD diagram captures a static representation of the human users and other animals and equipment involved, to inquire into the associative relationships.
Figure 3: Physical signs
The function of transporting people and goods, and the need for people to go across the railway line, dates back to the 1840s. Some form of restraint was needed to prevent the two sorts of movement from colliding with each other. This took the form of the 1845 legislation providing for gates.
Apart from prevention, modern industry and society need assurance that risks are managed in the best possible manner. These safety assurance and accident investigation functions are situated at the general function level.
The measuring function sets the objectives for the next lower layer in the network viewed as Ends-Means network.
The measurement of performance in managing the road-rail interface in the regulatory domain (ORR), by means of Fatalities and Weighted Injuries (FWI) or the efficacy of safety investment shown by ALARP figures, forms the basis of the economic incentives to minimise collision risk at this level. Further, various initiatives to reduce precursor events such as SPADs and harm to track workers and level crossing users, and the taking of legal action to enforce safe behaviour, may be situated at this level.
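As an illustration of such a measure, an FWI-style index can be sketched as follows. The weighting shown (one fatality equated to ten major injuries or two hundred minor injuries) is a commonly cited convention and is assumed here for illustration only, not quoted from RSSB's definition, and the annual figures are hypothetical:

```python
# Sketch of a Fatalities and Weighted Injuries (FWI) style index. The weights
# (1 fatality ~ 10 major ~ 200 minor injuries) are an assumed convention for
# illustration; the precise published weightings may differ.

def fwi(fatalities: int, major: int, minor: int) -> float:
    return fatalities + 0.1 * major + 0.005 * minor

# Hypothetical annual figures for a group of crossings:
score = fwi(fatalities=2, major=5, minor=40)
print(f"FWI = {score:.1f}")   # 2 + 0.5 + 0.2 = 2.7
```

Expressing harm on a single scale in this way is what allows year-on-year comparison and ALARP-style investment appraisal, at the cost of compressing qualitatively different outcomes into one number.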
The function of transporting people and goods needs investment and planning to keep the infrastructure in good shape and fit for purpose. The function of the goal system is to investigate and identify options to move the railway transport system to a target state better than the current state.
The goal system is the highest decision-function level of the various organisational systems involved, such as the Department for Transport (Rail), ORR (Rail), RSSB, the Rail Delivery Group, duty-holders, local councils, passenger forums, labour unions, engineering and operational institutions, and members of the public.
At this level of the system, the main goals of the whole system are located. The goals may be related to public welfare, health and safety etc. Decision making at this level is governed by the legal regulations.
The insight provided by the 1978 Nobel laureate Prof H. A. Simon, that substantive rationality (economic cost-benefit analysis) and procedural rationality (efficient computation) are different[20], highlights the dispositions expected at this level. That the socio-legal, computational and economic rationalities may not be coherent with ecological rationality is discussed in the next section. The problem of cognitive ambiguity and uncertainty is motivated in part by advancements in decision theory that distinguish three alternative sources of uncertainty: i) risk conditioned on a model, ii) ambiguity about which is the correct model among a family of alternatives, and iii) potential misspecification of a model or of a family of possible models, as discussed by the 2013 Nobel laureate Prof Lars Peter Hansen[21]. The idea of using a sophisticated statistical model to specify risk may not be compatible with the need to improve risk at level crossings, because the problem is not economic in nature but technical and cognitive in its origin.
RAIB has issued a report on its findings; a summary is given in the following paragraphs[22].
At around 18:45 hrs on Sunday 11 May 2014, train 2G98, the 18:18 hrs Swindon to Gloucester passenger service, struck a trail bike (a type of motorcycle) on a level crossing. The rider of the trail bike was fatally injured. The rider was the last of a group of three that were crossing the railway at Frampton level crossing, near the village of Frampton Mansell, Gloucestershire.
The rider was crossing the railway on a trail bike, a type of motorcycle designed for use on public roads and for off-road use. He was the last of a group of three riders who had reached the level crossing along an unsurfaced track leading from a minor road near the village of Sapperton.
Signage on the approach to the crossing instructed vehicle users (which would include trail bike riders) to use a telephone located close to the crossing. This allowed the railway signaller to tell users whether it was safe to cross the railway. The riders did not use the telephone because they believed that they could cross safely by looking for trains before crossing, and because the signs did not grab their attention sufficiently for them to read the information on them. The riders did not know that a curve in the railway meant that they could not rely on seeing an approaching train early enough to decide whether it was safe to cross. The train’s warning horn was sounded as it approached, but the trail bike riders could not hear this because they were wearing full-face crash helmets and their trail bike engines were noisy. Network Rail had received some information that trail bikers were using the crossing, but had not taken effective action to manage the associated risk of unsafe use.
Although the trail bike riders were permitted to use vehicles on both approaches to the level crossing, they were not among the people permitted to take vehicles over the crossing itself, and they were unaware of this. The signs giving instructions to vehicle users did not explain this, and there was no other indication at the crossing or on the approaches. There was no requirement for signs or other indications to be provided by Network Rail, or any other organisation, to indicate that the general public were not permitted to take vehicles onto the level crossing.
The investigation identified three observations, unrelated to the accident, relating to level crossing signage, correct sounding of train warning horns and provision of reliable images from CCTV cameras fitted to trains. The RAIB has made six recommendations: four addressed to Network Rail, one to the ORR and one to the Department for Transport, relating to improved content and positioning of information provided to level crossing users. Two of the recommendations addressed to Network Rail require it to seek a better understanding of actual (not only permitted) use of level crossings, and, in conjunction with highway authorities, to raise public awareness of locations where the general public are not permitted to take vehicles onto level crossings.
The question of human error and its role in industrial accidents rose to prominence in the 1980s and 1990s due to a series of accidents: 1979 Three Mile Island, 1984 Bhopal gas leak, 1986 Challenger, 1986 Chernobyl, 1987 Herald of Free Enterprise, 1987 King’s Cross, 1988 Clapham, 1988 Piper Alpha, 1990 Newton Junction and 1999 Ladbroke Grove, to name a few fatal accidents that gave rise to public concern in the UK and elsewhere.
Prof B. A. Turner (1976), Prof Karl Weick (1999), Prof Charles Perrow (1984-99), Prof James Reason (1990), Prof Jens Rasmussen (1994) and Prof Nancy Leveson (2004), to name a few, are academics and researchers who devoted their attention to exploring the nature of human error and developed frameworks to explain accidents occurring in technological systems. The research gave rise to several methodologies of human reliability analysis (see Section 4 of Human Error for more details).
The system definition, thus described in section 2.2.1, provides a common indication of how well the operating design can work when subjected to the Hazop study.
The terminology used by human factors specialists is illustrated by means of an example drawn from everyday life on the railways, invisible to passengers, so that different object worlds may be comprehended through a common terminology.
It is important to note that a common indicator or measurement of 750 volts on a voltmeter/digital meter can evoke different stereotype responses.
For example, measured inside the railway control panel room, it is a sign of correct operation of the railway supply to the rails if the breaker is ON. The operator is then in the region of skill-based performance, acquired through training or learnt on the job.
If the breaker is OFF, then it is necessary to check the voltmeter circuit. If the voltmeter circuit is OK, then the reading is a symbol to the operator of a dangerous situation. The operator, now in the region of knowledge-based performance, is trying to ascertain the real situation.
If the breaker is ON and the meter reading is varying around the 750 volts the indication is a signal to be monitored continuously.
The words sign, symbol and signal in the previous paragraphs, as used by human factors experts, represent how they classify the mental activity of an electrical operator into skill-, knowledge- and rule-based behaviour.
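The voltmeter example above can be sketched, purely for illustration, as a small decision function. The 750 volt nominal value is from the example; the tolerance threshold and the function name are the author's hypothetical additions, not part of any standard.

```python
# Illustrative sketch only: mapping the breaker state and meter reading
# to the sign/signal/symbol categories and the corresponding SRK
# (skill-rule-knowledge) performance level. Thresholds are hypothetical.

def classify_observation(breaker_on: bool, reading_volts: float,
                         nominal: float = 750.0, tolerance: float = 25.0):
    """Return (category, performance level) for an operator's observation."""
    if breaker_on and abs(reading_volts - nominal) <= tolerance:
        # Expected state: the reading is a *sign* of correct operation,
        # handled at the skill-based level.
        return ("sign", "skill-based")
    if breaker_on:
        # Breaker ON but the reading is varying off-nominal: a *signal*
        # to be monitored continuously (rule-based checking).
        return ("signal", "rule-based")
    # Breaker OFF yet 750 V present (voltmeter circuit OK): a *symbol*
    # of a dangerous situation, demanding knowledge-based diagnosis.
    return ("symbol", "knowledge-based")

print(classify_observation(True, 750.0))   # ('sign', 'skill-based')
print(classify_observation(False, 750.0))  # ('symbol', 'knowledge-based')
```

The point of the sketch is that one and the same indication evokes different cognitive demands depending on context, which is the distinction the human factors terminology is designed to capture.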
-Risk management policy is outdated, and the policy does not have sufficient scope to address the major issues and problems likely to be encountered.
The following legislation relating to the particular level crossing was identified by the accident investigation report, and some of the legislation omitted by the RAIB Report 05/2015 is stated.
-The 1836 Cheltenham & Great Western Union Railway Act gave the railway owners the power to create the railway (para 34 of the RAIB Report 05/2015).
-The RAIB Report 05/2015 did not cite the provisions of Sections 61 and 68 of the Railways Clauses Consolidation Act 1845 (an omission on the part of the RAIB Report 05/2015).
“Section 61 requires the Company to make sufficient approaches and fences to bridleways and footways crossing on the level.
“ if the railway shall cross any highway other than a public carriageway on the level, the company shall at their own expence, make and at all times maintain convenient ascents and descents and other convenient approaches, with handrails or other fences, and shall, if such highway be a bridleway, erect and at all times maintain good and sufficient gates, and if the same shall be a footway, good and sufficient gates or stiles, on each side of the railway, where the highway shall communicate therewith”.
“Section 68 requires the Company to make sufficient approaches and fences to bridleways and footways crossing on the level.
The company shall make and at all times thereafter maintain the following works for the accommodation of the owners and occupiers of lands adjoining the railway; (that is to say,)
Such and so many convenient gates, bridges, arches, culverts, and passages, over, under, or by the sides of or leading to or from the railway, as shall be necessary for the purpose of making good any interruptions caused by the railway to the use of the lands through which the railway shall be made; and such works shall be made forthwith after the part of the railway passing over such lands shall have been laid out or formed, or during the formation thereof;
Also sufficient posts, rails, hedges, ditches, mounds, or other fences, for separating the land taken for the use of the railway from the adjoining lands not taken, and protecting such lands from trespass, or the cattle of the owners or occupiers thereof from straying thereout, by reason of the railway, together with all necessary gates, made to open towards such adjoining lands, and not towards the railway, and all necessary stiles; and such posts, rails, and other fences shall be made forthwith after the taking of any such lands, if the owners thereof shall so require, and the said other works as soon as conveniently may be:
Also all necessary arches, tunnels, culverts, drains, or other passages, either over or under or by the sides of the railway, of such dimensions as will be sufficient at all times to convey the water as clearly from the lands lying near or affected by the railway as before the making of the railway, or as nearly so as may be; and such works shall be made from time to time as the railway works proceed:
Also proper watering places for cattle where by reason of the railway the cattle of any person occupying any lands lying near thereto shall be deprived of access to their former watering places; and such watering places shall be so made as to be at all times as sufficiently supplied with water as theretofore, and as if the railway had not been made, or as nearly so as may be; and the company shall make all necessary watercourses and drains for the purpose of conveying water to the said watering places:
Provided always, that the company shall not be required to make such accommodation works in such a manner as would prevent or obstruct the working or using of the railway, nor to make any accommodation works with respect to which the owners and occupiers of the lands shall have agreed to receive and shall have been paid compensation instead of the making them”.
Ex-BR Head of Signalling and Safety, Operating Department, BR Board, Mr Stanley Hall MBE, and Mr Peter Van Der Mark, train driver, noticed and observed the omission from the 1845 Legislative Act of any provision for the safety of those who use the bridle or footpath crossings.
For those who use the public highway on the level, the 1845 Legislative Act did make provision, in Section 47, by requiring the company to provide gates and a gatekeeper to attend to the operation of the gates.
Authorised users of occupation crossings owe a duty of care to their visitors under the Occupiers’ Liability Acts 1957 and 1984. Authorised users who are also employers also owe a duty to their employees under the Health and Safety at Work etc Act 1974. This means that authorised users are under an obligation to ensure that crossing users are instructed on the need to use the crossing correctly. It is important that UWC gates are closed when the crossing is not being used because they protect the railway by requiring the driver to stop and leave their vehicle to open the gates. Across the rail network it is frequently observed that gates are left open. However, it is an offence under the Railways Clauses Consolidation Act 1845 (the Transport and Works Act 1992 modified the penalty) for users of private crossings to leave the gates open. UWCs are also covered in paragraphs 269 and 270 of the Highway Code, where they are referred to as ‘user-operated gates or barriers’. The relevant advice to the road user when there are no lights on the crossing is to:
- open the gates or barriers on both sides of the crossing;
- stop, look both ways and listen before you cross; and
- close the gates or barriers when you are clear of the crossing.
Network Rail, as Infrastructure Manager, is responsible for the provision of UWCs, their maintenance and the recording and monitoring of incidents and accidents. In addition, the Health & Safety at Work Act 1974 imposes an obligation on Network Rail to control risk as far as is reasonably practicable. Guidance issued by the Health & Safety Executive (Reducing Risks, Protecting People, 2001, ISBN 07176 2151 0) suggests that this obligation can be met by ensuring the risks to individuals are tolerable and by implementing risk control measures where it is reasonably practicable to do so. (Paras 54, 55, 56 and 57 of the RAIB Report 09/2007.)
-The British Railways Act 1970.
The right for the general public to take vehicles over the crossing was withdrawn on 22 March 1971 using legal powers granted in the British Railways Act 1970. This Act required that the railway maintained the crossing for use by persons on foot, leading horses or on horseback. The right to take vehicles over the crossing was restricted to persons who owned or occupied land that was served by the road that crossed the railway at the crossing. In practice, Network Rail managed crossings of this type on the basis that authorised users also included people, such as employees, who needed access to adjoining land when, in similar circumstances, the owners or occupiers would have a right to use the crossing(para 36 of the RAIB Report 05/2015).
The reading of the above Act suggests the idea that the tragic fatality was a result of active failure on the part of trail bike rider 1, but this judgement is to be treated with caution due to the following acts of omission and commission.
Omission of provision for the safety of the users of the bridle or footpath crossings constitutes a “latent failure” and a “fallible decision” on the part of the 1845 Legislative Act.
Failure by the RAIB Report 05/2015 to cite the 1845 Legislative Act with the relevant Sections 61 and 68 constitutes an omission error.
a1. Methods, Criteria, Analysis LTA
At the 2006 IET First International System Safety Conference, Sanjeev Appicharla stated the main thesis of the paper: that complex issues of cost, performance and system safety can be successfully resolved using a process based on a soft systems thinking approach. To use a soft systems thinking approach means to focus efforts on the static and dynamic interconnections (interfaces) between the elements of a complex system, using a framework of tools and techniques from systems engineering, safety engineering and human factors engineering in a bottom-up manner. Traditional systems engineering approaches exclude human aspects and treat humans as actors external to the system.
Further, Sanjeev Appicharla noted the observations made by Prof James Reason (1990)/(1998), Prof Peter Ladkin (2005), Prof Suokas, J. (1985), Professors Rasmussen, J., Pejtersen, A. M., Goodstein, L. P. (1994) and Prof Leveson, N. G. (1995), (2004) on the questions of human error, objective hazard assessment, quality of safety and risk analysis, the defence-in-depth fallacy and analytical risk management, biases in accident investigation, and the need for a simple but effective mechanism to investigate and arrive at the parameters or variables that sit on the boundary of the components and may cause the system to drift towards the unsafe boundary.
Whenever an important element of a complex system is changed, the question arises of what impact the change has on the system properties. For example, does a change in a technical or operating rule have any impact on system safety or its performance?
To manage the hazards, relying upon a simple mechanism of system and safety analysis, the steps in the proposed safety analysis that Sanjeev Appicharla introduced at the Conference are as follows:
a) Developing description of an operational railway (system modelling process)
b) Identifying hazards at the boundaries (hazards identification process)
c) Modelling accident scenarios (causal analysis process)
d) Performing risk assessment and developing Countermeasures (risk assessment process)
e) Preparation of impact assessment and documentation of results (impact assessment process)
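The five steps above can be sketched, purely as an illustration, as a sequential pipeline over a shared analysis record. The function names and the record structure are the author's hypothetical rendering, not part of the published method.

```python
# Hypothetical sketch: the five-step safety analysis as a pipeline.

def model_system(description):
    # a) system modelling process
    return {"system": description}

def identify_hazards(record):
    # b) hazards identification process at the system boundaries
    record["hazards"] = [f"unsafe interaction at boundary of {record['system']}"]
    return record

def model_scenarios(record):
    # c) causal analysis process
    record["scenarios"] = [f"accident scenario for: {h}" for h in record["hazards"]]
    return record

def assess_risk(record):
    # d) risk assessment process, developing countermeasures where
    #    hazard control is found less than adequate
    record["countermeasures"] = [f"countermeasure for: {s}" for s in record["scenarios"]]
    return record

def assess_impact(record):
    # e) impact assessment process and documentation of results
    record["report"] = f"{len(record['countermeasures'])} countermeasure(s) documented"
    return record

record = model_system("user worked level crossing")
for step in (identify_hazards, model_scenarios, assess_risk, assess_impact):
    record = step(record)
print(record["report"])  # 1 countermeasure(s) documented
```

The design point is that each step consumes the outputs of the previous one, so a weak system definition in step (a) propagates into every later judgement, which is the argument made throughout this Submission.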
Alert readers will notice that the above process is quite similar to, but exceeds, the requirements of the Common Safety Method for Risk Assessment, as there is a need to develop countermeasures where less than adequate hazard situations are found during the analysis.
In the 2008 IET Seminar on User Worked Crossings, Prof F. Schmid and Mr Charles Weston outlined that:
- User Worked Crossings are based on laws that date back to the mid-1800s, which were written for a society that travelled largely on foot and are not necessarily applicable to today’s highly mechanised society;
- users of crossings are required to cross the tracks five times in order to use the crossing in the prescribed manner, which increases both the real and perceived risk of being struck by a train;
- User Worked Crossings are not necessarily signposted in a way that communicates the method of operation effectively to the occasional or new user; and
- User Worked Crossings should be completely re-thought using 21st-century psychology and human factors thinking.
Prof F. Schmid and Mr Charles Weston hoped to stimulate discussion by suggesting the idea that ‘It’s Easy to Blame the User’.
The 2013 version of Network Rail’s All Level Crossing Risk Model (ALCRM) did not consider any human factors assessment of the user task specification assumed for making risk assessments (paras 89 and 90).
The focus of attention in the research and practitioner domains related to harm arising at level crossings has thus far been based on the idea that unsafe acts committed at a user worked level crossing have their origin in the actions of the person(s) immediately using the crossing in the accident situation, as admitted in the RAIB Report 05/2015.
From the gestalt psychological perspective, the foregoing conclusion is inevitable: the phenomenal ego is held responsible for the unsafe acts, and research therefore investigates how the physical objects of active signals and passive signs can be relied upon to afford cues of danger, so that the phenomenal ego becomes aware of the danger posed to itself and others by stepping or driving into the path of trains. The inference just made is not an illusion, delusion or figment of the author’s imagination, as the input data taken from three reports, the RAIB Report 05/2015, the RAIB Bulletin 07/2010 and the RAIB Report 13/2009, attests to the conclusion drawn.
However, on the basis of the layered-self worldview, the case study of latent errors negates the RAIB conclusions and establishes the fact that the fatality at the Frampton level crossing site was preceded by two successful attempts by the rider(s) at the same site on the same occasion. Accident proneness on the part of rider 1 is discounted, as per the system approach to accident causation adopted in the paper. Therefore, by way of this tragic accident, Nature, as it were, was offering accident researchers, investigators, unions and the duty-holder organisations a chance to contemplate and learn all the causal factors for the failure of the third attempt to go across the user worked level crossing on the fateful day.
The level crossing risk assessment takes the following form of activities to determine the individual and collective risk:
1) Pre-planning.
2) On-site data collection.
3) Risk assessment by calculating the frequencies and likelihood of fatalities.
4) Optioneering.
5) Risk assessment report and implementing the selected option.
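Step 3 of the list above, calculating frequencies and likelihood of fatalities, can be illustrated with a minimal frequency-times-consequence sketch. All numbers below are hypothetical placeholders chosen by the author for illustration; they are not drawn from ALCRM, RSSB data or the RAIB reports.

```python
# Illustrative sketch only: a simplified collective risk estimate for one
# crossing, in the style of a frequency x consequence calculation.
# Every figure is a hypothetical assumption.

traverses_per_year = 2000           # assumed user traverses per year
p_conflict_per_traverse = 1e-5      # assumed chance of a train-user conflict
p_fatality_given_conflict = 0.5     # assumed severity of a conflict

collective_fwi_per_year = (traverses_per_year
                           * p_conflict_per_traverse
                           * p_fatality_given_conflict)
print(f"Estimated collective risk: {collective_fwi_per_year:.3g} FWI/year")
```

A real assessment would, of course, condition these figures on the site survey data gathered in step 2 and feed the result into the optioneering of step 4; the sketch only shows the arithmetic shape of the calculation.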
The duty-holder organisation(s) may perform the activity themselves or delegate it in the form of a sub-contract to other firms, either because this is cost-effective or because they lack the capability to evaluate and validate the risk assessments.
The occasion to perform a risk assessment may arise when an accident occurs and the investigation report raises recommendations, or when the time is due to perform the periodic risk assessment (once in three years or so). The duty-holder organisations, such as Network Rail, train operating companies and freight operating companies (FOCs), and the safety standards body, RSSB, are required by the EU Common Safety Method Regulation 402/2013 to follow a prescribed process when making significant changes. ORR published its Guidance on the Common Safety Method Regulation 402/2013 in 2013 and 2015[23] as well.
The ALCRM has 128 risk influence factors listed in its database, covering the physical environment, social environment, user actions, factors related to infrastructure operations and other miscellaneous factors[24]. The ALCRM does not take a true system approach to accident causation and thus does not explain some of the accidents at user worked crossings. It is therefore a foregone conclusion that processes such as safety assurance, risk assessment, human factors in risk management and hazard management relating to more complex systems, such as the ERTMS/ETCS signalling system and its associated human factors analysis, are prone to an even more fallible decision-making process.
The currently practised methods of measuring harm and taking corrective action at level crossing sites are, before accidents, the all level crossing risk assessment process designed by duty-holder organisations (Network Rail and train operating and freight operating firms) and, after them, the accident investigation process carried out by the statutory independent body, the RAIB. Both of these methods take an individual approach to human error, a research paradigm noted by Prof James Reason (1990), (2000), (2006), (2016).
However, it should be noted that the GB railway industry, at least in writing, focuses on the errors of individuals, namely vigilance failures, inattention and violations, but it does not place blame on the individuals concerned. The GB railway industry approach nevertheless fails to acknowledge that optics as per classical physics and as per modern physics are different, and the lack of acknowledgement of this fact can be seen in the RAIB Report 05/2015 (paragraph 47).
The “system approach” concentrates on the conditions under which individuals work and tries to build defences to avert errors or mitigate their effects, as stated by Prof James Reason (2000). One of the forms into which the system approach to accident causation has developed is the Human Factors Analysis and Classification System (HFACS), developed by behavioural scientists in the United States Navy. Prof James Reason did not contemplate the cost, to accident investigators and researchers, of targeting institutions when learning lessons from past accidents and incidents; otherwise, he might have issued a health warning.
Methods of GEMS and Human Factors Analysis and Classification System can be found in Human Error. The “system approach” may be seen, as it was assumed by Prof James Reason, to be close to the High Reliability Theory. The “system approach” assumes that there are multiple barriers, safeguards and defences based upon engineering, administrative and organisational processes. One of the forms of the “system approach” advanced by Prof James Reason follows the typical statistician’s approach to multiple failures and alignment of “active” and “latent” failures.
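The "statistician's approach" to multiple defences mentioned above can be illustrated with a minimal sketch: an accident requires a simultaneous breach of every barrier, so, under an independence assumption, the accident probability is the product of the individual barrier failure probabilities. The barrier names and probabilities below are hypothetical, and the independence assumption is precisely what aligned latent failures undermine.

```python
# Illustrative sketch only: the product rule for aligned "holes" in
# multiple defences (Swiss Cheese style). All figures are hypothetical.

from math import prod

barrier_failure_probs = {
    "instruction signs read and obeyed": 0.1,
    "crossing telephone used": 0.2,
    "train warning horn heard": 0.3,
    "sighting distance adequate": 0.5,
}

# Independence assumption: P(accident) = product of barrier failures.
p_accident_independent = prod(barrier_failure_probs.values())
print(f"P(all defences breached) = {p_accident_independent:.4f}")  # 0.0030
```

A latent organisational error can correlate several of these failures at once (for instance, poor signage degrades both the "signs obeyed" and "telephone used" barriers), so the true probability can be far higher than the independent product suggests; that is the defence-in-depth fallacy referred to earlier.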
The second case study, related in the RAIB Report 07/2016, takes a partial system approach and raises questions about safety assurance, human factors considerations, the risk assessment process, hazard records and approvals for product introduction.
Both case studies of latent errors, together with the 2011 case study of latent errors, confirm the author’s 2006 hypothesis of a lack of integration between the practices of probabilistic risk assessment (safety engineering), human reliability analysis (human factors) and economic analysis (systems engineering and economic decision making).
The auxiliary hypothesis (based upon Prof Jens Rasmussen’s 1994 insight) is that the senior manager(s) are not aware of their contribution to the accident risk: either due to lack of background (CEOs may be drawn from legal and finance backgrounds), or because they may have delegated the safety risk decisions to the committee or body modelling and analysing the health and safety risk faced by the organisation and the risk it shares with other organisations in the form of safety assurance guidance, or due to the regulator’s lack of awareness of the latent errors in the application of Bow-tie models.
The Hazop study of any design requires a baseline operational or design or concept description to learn about the deviations that can occur. Initially, this qualitative technique was used to study deviations in the chemical industry.
It cannot be used as a standalone technique, as described in the IEC 61882 (2001) Application Guide, for the following two reasons.
First, the system diagrams used in chemical plants are denoted by the acronyms P&IDs (piping and instrumentation diagrams) and PFDs (process flow diagrams). The system diagrams used on the railways, introduced by Sanjeev Appicharla in 2006, do not rely upon the standard UML diagrams used in the information processing domain, but require the emergent property residing at the interface between the human user and the equipment, to enable the analyst to reflect upon the visual space and motor space available to the user’s eye.
The problem stated was discussed initially by Prof Henri Poincaré and translated into the English language in 1905. Subsequently, the whole problem developed into a science leading to the ecological as well as constructive rationalities of human information processing activity, and to the CMU School’s ideas of computationally rational activity; see [25] for more background.
Second, the HAZOP study identifies the structure of causes and consequences, but it does not link them into a map of the flow of events resulting from the life cycle stages of the system: the design and development phase; dysfunctional interactions between parts of the system; and the operations and maintenance phases.
See attached partial HAZOP study for raising awareness of less than adequate barriers.
The formula for the physical failure case scenario (the accident), in terms of Aristotle’s entelechy of the final cause of the fatality, is represented graphically and mathematically as under:
Final Cause (Fatality at the Frampton level crossing) =
Material Cause (Train_in_the_crossing_space) ∩ Efficient Cause (Rider_in_the_crossing_space) ∩ Formal Cause (Less_than_adequate_risk_management_locally_as_well_as_globally) ----- (1)
In the above equation, the symbol ∩ denotes intersection. However, equation (1) is taken to be the result of the physical operation sub-system rather than a representation of the whole system; the whole system is represented by the Management Oversight and Risk Tree (MORT). The MORT captures and represents failures in physical, environmental, department-level, director-level, standards body level and regulatory decisions.
The formula for the success case scenario #1 of crossing, in terms of Aristotle’s entelechy (final cause) of No_Fatality, is represented as under:
Material Cause (No_Train_in_the_crossing_space) ∩ Efficient Cause (Rider_in_the_crossing_space) ∩ Formal Cause (Less_than_adequate_risk_management_locally_as_well_as_globally) ----- (2)
The formula for the success case scenario #2 of crossing, in terms of Aristotle’s entelechy (final cause) of No_Fatality, is represented as under:
Material Cause (Train_in_the_crossing_space) ∩ Efficient Cause (No_Rider_in_the_crossing_space) ∩ Formal Cause (Less_than_adequate_risk_management_locally_as_well_as_globally) ----- (3)
The formula for the success case scenario #3 of crossing, in terms of Aristotle’s entelechy (final cause) of No_Fatality, is represented as under:
Material Cause (No_Train_in_the_crossing_space) ∩ Efficient Cause (Rider_in_the_crossing_space) ∩ Formal Cause (Adequate_risk_management_locally_as_well_as_globally) ----- (4)
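For illustration only, the conjunction of causes in the formulae above can be encoded as a boolean function, with the intersection ∩ modelled as logical AND: the fatality outcome requires the co-presence of train and rider together with less than adequate risk management, and every other combination is a success case. The function and argument names are the author's hypothetical rendering.

```python
# Hypothetical boolean encoding of the accident/success scenarios above.
# Intersection of causes is modelled as logical AND.

def outcome(train_present: bool, rider_present: bool,
            risk_management_adequate: bool) -> str:
    """Return the final-cause outcome for one crossing attempt."""
    if train_present and rider_present and not risk_management_adequate:
        # All three causes align: the failure case scenario.
        return "fatality"
    # Any cause absent breaks the conjunction: a success case.
    return "no fatality"

# Failure case: train and rider co-present, risk management less than adequate.
assert outcome(True, True, False) == "fatality"
# Success cases: no train, or no rider, or adequate risk management.
assert outcome(False, True, False) == "no fatality"
assert outcome(True, False, False) == "no fatality"
assert outcome(False, True, True) == "no fatality"
```

The encoding makes the Submission's point visible: with less than adequate risk management the outcome of each attempt depends only on the chance alignment of train and rider, whereas adequate risk management removes the fatality branch altogether.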
In 2009, the RAIB, in paragraph 40 of its UWC Report[26], observed that for all types of crossing the greater part (69.4%) of level crossing risk is attributed by RSSB to misuse (errors) by users. Violations by users account for 23.6% of the risk, and the remaining 7.0% arises from railway staff errors or equipment failures. Both ‘error’ and ‘violation’ have a specific meaning in the context of accident causation. ‘Error’ implies an action which was unintentionally incorrect, caused by a lapse, a slip or a lack of knowledge.
‘Violations’ are actions which are deliberate, contravening rules or instructions to gain a perceived advantage such as increased speed or reduced effort. In the use (or misuse) of level crossings, underestimating the time available, and consequently crossing closely in front of a train, would be an error. Leaving the gates open after driving over, having seen and read the instructions on the signs at the crossing, would be a violation.
Prof James Reason (1990), and Prof Jens Rasmussen (1994) explored the question of human error from the cognitive science perspective, and they categorised the performance into cognitive levels of skills, rules, and knowledge based behaviour and the kind of error forms they lead to.
A black-box Concept of Operations, exploring the interactions between the various stakeholders and their strategies for normal operations, is needed; it goes beyond the traditional hierarchical task analysis. This type of analysis reveals the expected ‘emergent properties’ at the interactions between the user and the user worked crossing.
For the purpose of identifying the hazards that can be generated within the above socio-technical systems at the road-rail interface, a hypothetical HAZOP[27], a standard process adapted to the railway application, is applied.
A hypothetical HAZOP study is included to show how the facts that are observed can be captured. However, simply imitating the process without its underlying psychological basis can mislead the HAZOP participants.
Further, in reconstructing the accident precursors and root causes of the real accident, the analyst has to be acutely aware of hindsight bias and rely upon the knowledge available prior to the accident.
Using the taxonomies of the Swiss Cheese Model and the Management Oversight and Risk Tree[28], the following cognitive errors were discovered from the RAIB Report.
The RAIB did not identify the cognitive error type in accordance with the skill-based, rule-based and knowledge-based (SRK) human performance levels, and therefore the RAIB investigation cannot be regarded as a human reliability technique (Reason, 1990).
The emergent properties to be measured by the hypothetical HAZOP study concern the efficacy and effectiveness of the physical functions and activities to move passengers, freight and goods and to minimise the collision risk to level crossing users, train crew and others.
The emergent property at the physical function level of the Abstraction Hierarchy from the Work Domain Analysis is the availability of barriers to safeguard life. From the systems engineering perspective, the measure of harm in terms of fatalities and weighted injuries is positioned at the measuring function level.
The trail biker crossed into the path of the approaching train. In terms of the SRK human performance levels, the rider's failure to attend to the threat of the train suggests a slip of attention preceding a skill-based error in familiar circumstances, as per the theory of human error analysis (paragraph 38 of the RAIB Report).
The desired emergent property at the user worked level crossing for the user, as per the normative definition, is the detection of the signs as a proxy for the perception of the danger of the approaching train in the hypothetical HAZOP study.
The HAZOP table can be constructed to explore all of the failure scenarios in a real HAZOP, as done by the author for other case studies[29].
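As an illustration only, the mechanical core of constructing such a table — pairing guidewords with design-intent parameters to generate candidate deviations for the study team to assess — can be sketched as follows. The parameter names and design-intent wording are hypothetical examples, not drawn from the author's actual studies; and, as noted above, the mechanics alone cannot substitute for the psychological judgement of causes and consequences.

```python
# Standard HAZOP guidewords (a subset), applied to design-intent
# parameters at a user worked crossing. All entries are illustrative.
GUIDEWORDS = ["NO", "LESS", "MORE", "PART OF", "OTHER THAN"]

PARAMETERS = {
    "detection of signs": "user sees and reads the crossing signage",
    "perception of danger": "user judges that a train is approaching",
    "use of telephone": "user phones the signaller before crossing",
}

def hazop_rows(guidewords, parameters):
    """Yield (guideword, parameter, deviation) rows for the study team
    to assess for causes, consequences and safeguards."""
    for gw in guidewords:
        for param, intent in parameters.items():
            yield (gw, param, f"{gw} {param} (intent: {intent})")

rows = list(hazop_rows(GUIDEWORDS, PARAMETERS))
for _, _, deviation in rows[:3]:
    print(deviation)
```

Each generated row is only a candidate deviation; the study team must still judge, for each one, whether a credible cause and consequence exist at the crossing.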
At this stage, alarm bells should be ringing in the heads of the subject matter experts (SMEs). However, inputs from the risk experts and HF experts, with their numerical estimates and operational research expertise, may disturb the epistemic awareness of the SMEs. The social dynamics of past organisational ex-BR routines can take over the proceedings if the cultural practice of not eliminating the risk persists within the team[30]. The author's 2010/2011 observation was that decision teams co-ordinated by RSSB suffer from groupthink bias.
The claim that the difference between novice and trained driver behaviour is clearly established is discussed in the RAIB Report, drawing upon research from the road domain (paragraph 43, footnote 10). The point to be understood is that the line of sight for the rider to view the train is not available until the rider reaches some part of the railway line.
The 1988 Clapham accident in the railway domain reveals, in a very tragic way, the differences between novice and trained driver behaviour[31]. The novice driver of the Basingstoke train detected the error in the information displayed by the signals, acted at the rule-based performance level, and stopped the train at the next signal to report the irregularity to the signaller as per the Rule Book instruction.
From the human error and decision making perspective, and from the domain knowledge perspective, had the novice train driver proceeded like the other, experienced train drivers and informed the signaller later, the fatalities might have been avoided: the fault could have been rectified, without the societal risk being incurred, once the signaller and controller in charge had spoken to the expert at hand. Decision support at the control room, for example in the form of a subject matter expert, is thus crucial. Therefore, the difference between novice and trained driver behaviour cannot be established as conclusive, and the transferability of lessons from the road to the railway sector cannot be assumed, as it was when RAIB/RSSB drew lessons from failed human performance in the road domain.
Considering the idea of the human being modelled as an information processor, as conceived by the human factors experts Prof Jens Rasmussen (1994) and Prof James Reason (1990), it is clear that the dynamic mental models of the trail bike rider and of the train drivers derive from cognitive decision making models: the novice train driver acted at the rule-based level (signal information processing), whereas the trained train drivers recalled the applicable rule from memory and discounted the competing hypothesis of a failed signal sequence (symbolic information processing).
Likewise, the presence of danger may have been discounted by trail bike rider 1 because the cue of a fellow rider holding the gate open for him indicated that it was safe to go across (paragraph 45 of the RAIB Report). The eyewitness account that rider 2/3 held the gate for rider 1 (paragraph 45) seems the more plausible cue for rider 1 to assume that it was safe to go across.
The idea that the behaviour of the road driver and the bike rider is the same is contradicted by the example of the 1988 Clapham accident in the railway domain.
Visual perception of lightning in the sky, followed by the hearing of the sound of thunder, as an objective physical phenomenon has been documented by the great physicist and Nobel laureate in physics, Prof Albert Einstein[32].
Further, from a line of sight perspective, trail bike rider 1 may not have turned his head to see the approaching train, as the focus of attention activated by the cue of the other trail biker holding the gate open afforded the signal that it was safe to plan to go across.
From a heuristic decision making perspective, it is accepted in the HF literature that experts depend upon intuitive judgements, and that cues activate the action to be pursued. In the absence of the other bike riders, the behaviour of rider 1 would have been different. This is not to suggest that the other bike riders were in any way responsible for the actions of the accident victim; such a suggestion would not only compound their grief and shock, but would also indicate a dispositional tendency on the part of the author to pin responsibility on the persons closest to the accident site.
The laws of physics determining the speed of light and of sound in air cannot have changed for bike rider 1, and therefore the research conducted by RSSB cannot be true: the speed of a train cannot be estimated by anyone in an intuitive manner unless it is measured, and human factors research based on the relative movement of leading and following vehicles cannot apply to this situation, as the reference frames of motion of the observers are not identical, and so it cannot be assumed to apply to the level crossing situation (paragraph 48 of the RAIB Report). Vigilance failure cannot be accepted either, as the arrival of the train was noted by the other bike riders, one of whom held the gate open despite failing to use the telephone installed at the crossing. Lack of awareness of the utility of the telephone under emergency conditions, and of the right to take trail bikes across the crossing, suggests blind spots in the riders' knowledge base; it is therefore clear that the mitigating measures of the telephone and whistle board did not influence behaviour (paragraphs 40 and 58 of the RAIB Report). Categorising and criminalising the worked crossing user as a species of humanity with a distinct behaviour is therefore a clear sign of overconfidence, and of a lack of awareness of the laws of physics and optics on the part of the researchers and of those who accepted the RSSB research[33].
A more detailed understanding of how light stimulation works through the human eye cells and nervous system, via cellular mechanisms, can be explored through systems biology texts such as that of Prof Enrico Coen[34].
Audit questions from the barrier analysis branch of the logical tree are used to derive the answers.
To understand why the barriers were less than adequate, and why this appreciation is not reflected in the RAIB Report, a further barrier analysis as per the Management Oversight and Risk Tree, a safety assurance technique, is applied to the situation analysis.
The available options, in accordance with the traditional guidelines reduce to the following Table 3.
Table 3: Energy Barrier Trace Analysis
| Potential hazard | Victims | Mitigation or eliminative barriers | Comments |
| Train striking cars, riders (horse, bicycle, motorised vehicles) or heavy vehicles | Users | Gates / lifting barriers | Less than adequate |
| | | Visual signs | Less than adequate |
| | | Signal telephones | Less than adequate |
| | | Fencing | Less than adequate |
| | | Crossing surface | Less than adequate |
Inspection of the above table, together with the analysis presented earlier, indicates that the understanding of the RAIB/ORR/RSSB/Network Rail/RDG organisations of the hazards they generate, and of the adequacy of the risk controls proposed, is less than adequate.
The 2011 recommendation by Sir Roy McNulty to establish a Rail Systems Agency (RSA) to lead the industry in achieving technical excellence in standards management, technical integration and driving innovation has been read but not acted upon.
The author's 2010 solution to the system safety problems faced, which called for a National Hazards Committee, and the 2011 recommendation by Sir Roy McNulty for a Rail Systems Agency (RSA) are logically equivalent.
From a safety science perspective, Prof J. Suokas (1985) argued that the limitations of hazard identification methods should be known when planning a safety analysis and evaluating its results. The reliability of safety and risk analysis is a question faced by authorities when presented with its results. This Submission will show that the reliability of safety and risk analysis, whether in the form of pre-mortem or post-mortem analysis, suffers from the effects of human error.
Prof J. Suokas taught the use of multiple complementary safety methods, and the systematic questioning of the design of process equipment and instrumented systems, to overcome the limitations of safety and risk analysis in the case of chemical plants. Prof J. Suokas (1985) taught the decision criteria for how a system hazard is generated from the union or intersection of the system's determining factors (related to the stable factors of the system design and development activity) and the system operation and maintenance factors (related to deviations from the design and development norms in the system operations and maintenance activity). To the foregoing factors I have added system dysfunctional interactions, drawing upon the teachings of another safety scientist, Prof Nancy Leveson (2004). By 2005, these concepts, together with other systems engineering concepts, motivated me to develop a context diagram of safety concepts in the System Safety Poster I used to navigate the problem and solution space related to railway safety at RSSB. The safety and risk strategies to manage hazards and risks were stated in the System Safety Poster, which was published by the 2015 Transport Select Committee in 2016[35].
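A minimal sketch, assuming a set-based reading of the criteria described above: a system hazard can be generated from any of the three factor categories, or from combinations of them. The factor names below are hypothetical illustrations of the Oakwood case, and the encoding is this editor's, not Prof Suokas's or Prof Leveson's formalism.

```python
# Three factor categories, as described in the text (contents hypothetical):
determining_factors = {        # stable design/development factors
    "ambiguous gate-control design",
    "button able to open gates",
}
deviation_factors = {          # deviations in operations/maintenance
    "user does not recheck the warning light",
}
dysfunctional_interactions = { # Leveson-style interaction factors
    "gate-opening time exceeds the warning time",
}

def hazard_sources(*factor_sets):
    """Union of all contributing factor sets: any member can contribute
    to hazard generation, and the intersections across sets are the
    combinations a study team would examine scenario by scenario."""
    combined = set()
    for s in factor_sets:
        combined |= s
    return combined

all_sources = hazard_sources(determining_factors,
                             deviation_factors,
                             dysfunctional_interactions)
```

The point of the sketch is only that hazard generation spans all three categories at once, so an analysis confined to any single category is incomplete.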
The activities of system analysis in the form of system definition (as opposed to scope definition), safety analysis in the form of identification and analysis of hazards, and risk analysis to establish the adequacy of risk controls in a qualitative and quantitative manner, applied in tandem, are mandatory for implementing the Common Safety Method for Risk Assessment, and for any analyst performing system and safety analysis[36]. However, the ORR does not comprehend the fact that the level crossing user is outside the operational railway system as per the 2014 Guidance, while acting as an external barrier as per the Common Safety Method for Risk Assessment.
If the ORR proposition were accepted, for the sake of argument, then it cannot be true that the level crossing user simultaneously possesses the properties of a functioning safety barrier and of a failed barrier when assessing the credible worst-case scenario for establishing the “significance of the change” in the case of level crossings or any other technology. This error in reasoning is the same as that which occurs in common-sense reasoning: an object cannot be at rest and in motion at the same time.
If the ORR advances the counter-argument that they think in terms of modern quantum physics, then they will have to explain how, as conscious observer(s), they are able to superpose states in which W |perception of the level crossing user as a living barrier> |live level crossing user> plus Z |perception of the level crossing user as a failed barrier> |dead level crossing user> co-exist in “reality” as an entangled superposition. The fact that one of the entities is non-existent in the entangled superposition is not recognised by those who advance arguments on behalf of the ORR.
For more details of the Schrödinger thought experiment, readers are kindly requested to refer to Chapter XXIX of the 2004 publication[37]. Further, senior managers violate recommendation 12.9.19 of the Ladbroke Grove Joint Inquiry Report[38] (2001), made in relation to the ERTMS-ETCS technology, that regulations on ETCS fitment should be in absolute terms and not dependent on reasonable practicability.
The 1994 UK HSE-HMRI decision to abandon the installation of ATP on a cost-benefit analysis basis, and to shift the focus of attention and resources to SPAD management, is not recognised or questioned, and is not listed as a contributory causal factor to the subsequent accidents on the HMRI-inspected and Railtrack-controlled infrastructure; it is overlooked by the 2013 paper by Prof Andrew Evans[39] as well. From the 1995 Ministerial decision not to go ahead with BR-ATP (Prof Andrew Evans, 2013[40]), based upon utility considerations, it is a safe inference that the concepts of hazards, barriers and victims were not formally constructed by the then safety regulator to inform Ministerial decisions on public policy.
Likewise, the July 2007 HLOS[41] document, in paragraphs 2.7 to 2.9, endorsed the work of the RSSB Safety Risk Model. The Secretary of State for Transport was not informed by RSSB, a safety standards body, that the fault and event tree method of quantitative risk assessment had been negatively critiqued by Prof James Reason in 1990; this fact was made available to the then RSSB Professional Head of Signalling, Telegraph and Electrification and the other senior managers who approved the author's 2006 paper prior to its publication. Further, the author's 2006 paper noted that individual, technical and organisational factors need to be catered for in the system representation for failure analysis and the subsequent activity of safety requirements specification.
The 2009 NIMROD Review[42] again critiqued the fault and event tree method of quantitative risk assessment, but RSSB took no notice of this critique and has continued to make risk judgements with its own practice to date. From the human error perspective, this neglect of lessons learnt from past actions in the risk management domain constitutes a violation.
The July 2015 National Audit Office Report did not take account of RSSB and the Rail Delivery Group in its stakeholder diagramming of Network Rail's position in the railway industry. How does a railway firm make allocative decisions of funds, on economic grounds, to satisfy the competing demands of safety, security, environment, maintenance and other resource-demanding functions such as track, electrification, stations, rolling stock, signalling operations, track operatives, design, timetable planning and the digital railway? Answering this question has been the quest of all inquiries for the last six years, and was a lifelong quest of Prof H. A. Simon[43].
The ALARP policy, read together with the lessons on occupational safety, organisational theory and decision making taught by Prof Jens Rasmussen, James Reason, Nancy Leveson and other safety and human error experts, leads to several kinds of biases in risk judgements, as graphically illustrated in the previous Written Submission to the current Inquiry.
Further, the majority of these biases, apart from a few such as hindsight bias, do not find their place in the research on the Common Safety Method for Risk Assessment and Taking Safe Decisions published by the industry[44]. For more details on the classification of the decision sciences into normative, descriptive and related themes, see Chapters IX and X of the 1997 paper[45]. Moral hazard in teams is an idea discussed in Becker's 1992 work and has fetched its author a Nobel prize in 2016.
From the above discussion, it is clear that the agencies RAIB, RSSB and ORR all failed to observe the errors in the research commissioned by RSSB.
The idea that driving in road conditions and at a user worked level crossing is not the same, and that the risk is not the same, has not been acted upon.
Recalling the fact, from the previous Written Submission[46], that the author presented the Methodology to the Network Rail Professional Head of Signalling in 2007 and 2012, it can safely be concluded that there is no awareness of, or know-how to plan and implement, the System Approach to Safety.
The problems of railway safety and security management are not intractable or “wicked” problems[47], as is assumed in the accident risk literature. The hypothesis assumed in this paper is that safety bodies seeking to analyse and explain accidents from the human error perspective drift into the failure zone unless they are mindfully aware of the political implications of their sponsorship by either Government agencies or senior managers of non-government agencies when learning lessons from organisational accident case studies such as the 1986 NASA Space Shuttle Challenger accident, the 1988 British Rail Clapham accident, the 1990 Newton Junction accident, the 1999 Ladbroke Grove accident, the 2006 NIMROD accident, the 2007 Grayrigg accident and the 2010 Herefordshire accident.
Therefore, intervention is necessary to improve the process of safety decision making and to enhance the effectiveness and efficiency of safety management systems, by establishing an agency to implement the risk management policy and oversee its implementation, adhering to a common standard of hazard management, assessing risk controls by pre-mortem and post-mortem analysis of hazards, and confronting the organisational biases.
RAIB has issued a report of its findings; a summary of that report is given in the next three paragraphs.
On 14 May 2015, a passenger train collided with a tractor at Oakwood Farm user worked crossing near Knaresborough, North Yorkshire. The train was carrying 66 people and travelling at 65 mph (105 km/h), but did not derail. The collision caused the front of the tractor to become detached from its cab. The tractor driver suffered minor injuries, and the train driver was treated for shock. However, in different circumstances the consequences could have been much worse.
The tractor driver began crossing the railway after the illuminated warning at the crossing started to display a red light. This was probably because he was unfamiliar with the crossing’s operation; it is one of a small number in the country that had been fitted with remotely operated, powered gates. It is likely the tractor driver did not recheck the warning lights after first stopping on the approach to the crossing to press a button to open the gates. This button had not originally been intended to open the gates (it should only have been capable of being used to close them). It was situated at such a distance from the crossing that the time it took for the tractor driver to stop, open the gates and then drive onto the crossing, was greater than the time between the warning light turning red and the arrival of the train. There was no sign at the button to warn the driver to recheck the warning light before going over the crossing. The investigation also found that the warning light was not conspicuous among the many signs present at the crossing.
The underlying causes of the accident were that Network Rail did not ensure that the risks at the crossing were adequately mitigated, and that the process for the introduction of the gate operating equipment was adequately managed.
The RAIB[48] has made three recommendations to Network Rail. The first is to improve safety at Oakwood Farm user worked crossing; the second is to review the safety of other user worked crossings fitted, or planned to be fitted, with the remotely operated gate opening equipment. The third is for Network Rail to review the robustness of its processes for introducing new equipment onto its railway infrastructure.
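The timing mechanism described in the summary above — the time to stop at the button, open the gates and drive onto the crossing exceeding the time between the warning light turning red and the arrival of the train — can be illustrated with a simple arithmetic sketch. The numbers below are hypothetical and are not taken from the RAIB Report.

```python
# Hypothetical timings (seconds) illustrating the Oakwood mechanism;
# the actual values are in the RAIB Report, not reproduced here.
warning_time_s = 27       # red light shown until train arrival (assumed)
stop_and_press_s = 10     # stop at the button, press, gates open (assumed)
drive_to_crossing_s = 20  # drive from the button onto the crossing (assumed)

traversal_time_s = stop_and_press_s + drive_to_crossing_s
# If traversal exceeds the warning time, a user who checked the light
# only once, at the button, can still be on the crossing when the
# train arrives -- without ever seeing a red light before moving off.
unsafe = traversal_time_s > warning_time_s
print(unsafe)  # True for these assumed values
```

The sketch shows why the position of the button relative to the crossing, not any single act of inattention, defeats the warning arrangement.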
The management and engineering processes for introducing changes to the design and operation of the signalling systems operating at the level crossing site were less than adequate (paragraph 128 of the RAIB Report 07/2016).
In 2009, there were no level crossing managers, and there was no understanding of the changes to level crossings proposed by the Ministry of Justice in 2013. However, HMRI subscribed to the viewpoint of users' abuse of level crossings, as expressed by an HMRI Inspector in 2008[49]. HMRI failed to inspect the design of the proposed changes to the operations. This is knowledge-based behavioural performance: the understanding on the part of HMRI needed to check for hazards being seeded through product, design and operational changes was less than adequate.
A reactive process of engineering management, such as the Yellow Book, was in place.
The changes initiated through the 2004 EU Railway Safety Legislation were not appreciated by the CEOs of all organisations, as noted in the 2006 and 2010 papers by Sanjeev Appicharla.
The understanding of the risk shared between the farm owner, the railway undertaking (RU), the infrastructure manager (IM) and the highway authority (HA) is less than adequate.
The defective management and engineering processes for introducing changes to the design and operation of the signalling systems at the level crossing site, cited earlier, were a latent failure condition leading to the less than adequate implementation.
a1. Methods, Criteria, Analysis LTA
The distinction between operator task and procedure, as used in the HF domain, is not understood by railway level crossing staff and managers or by product acceptance staff, and there is no proper definition of how to manage staff engaged in introducing change to the operating infrastructure.
Network Rail’s product acceptance process is defined in its company standard NR/L2/RSE/100/05, ‘Product introduction and change’, and is managed by the Network Rail Acceptance Panel (NRAP). Its purpose is to ensure that Network Rail complies with its legal responsibilities and its Safety Management System when it introduces new equipment or systems onto its infrastructure. This can be done in a series of discrete stages, including monitored trials, which allows a controlled assessment to be made to identify any operational risks that may emerge. In this way, risk can be minimised and mitigations can be introduced before the equipment or system is given full approval, (known as full acceptance), and thereafter used more widely across the rail network. (paragraph 97 of the RAIB Report 07/2016).
Certificates of acceptance are issued both before monitored trials and to authorise equipment when fully accepted. The certificate records the details of the equipment or system, the conditions under which it may be used, and a list of documents reviewed in support of its acceptance. If a certificate is issued for trial use, a monitoring period can be specified and the criteria by which the outcome of the trial will be assessed can be defined. Certificates of acceptance are generally signed by both a member of NRAP and the professional head of the engineering discipline to which the product or system best applies, eg track, signalling. (paragraph 98 of the RAIB Report 07/2016).
The definition of the socio-technical system required for proper safety analysis, as set out in section 2.2, was not considered (paragraph 99 of the RAIB Report 07/2016).
The Network Rail assurance process and the Professional Head of Signalling did not raise an objection to the less than adequate system definition (paragraphs 99-101 of the RAIB Report 07/2016).
The HF consultant's advice to perform a hazard assessment was omitted (paragraph 104 of the RAIB Report 07/2016).
Lack of awareness of how to integrate human factors into a safety management system is a problem for those who have not managed programmes or projects, as noted by Prof Jens Rasmussen (1994).
That appointing level crossing managers, or asking RSSB to conduct research, without awareness of proactive risk management approaches does not solve the problem is learnt from the Report (paragraphs 111 and 121 of the RAIB Report 07/2016).
The accident investigation by RAIB undertook a system approach to human error.
The errors in decision making by the professional head of the signalling discipline and at the project level were identified. However, from the risk management perspective, how the budget, resources and processes for change management and investment management were sanctioned at higher levels of management was not investigated by RAIB.
Further, the lessons learnt from the past accidents were not acted upon.
The concepts of user task and procedure specification were not identified, and the difference between a hazard and its causal factors was not understood by the project team.
The idea of engineering and management supervision of the project work, to assure that hazards are not seeded into the operations, was not entertained.
That management's ability to learn further lessons appears to have become saturated is clear from reading the RAIB Report.
There was no HAZOP study conducted on the proposed changes, nor was a logical fault tree analysis performed.
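Had such an analysis been performed, even a minimal logical fault tree for the Oakwood scenario could have exposed the accident path. The sketch below is illustrative only: the gate structure and event names are this editor's hypothetical reading of the summary, not taken from the RAIB Report.

```python
# Minimal Boolean fault tree sketch (events and gates are illustrative).
def AND(*inputs):
    """AND gate: the output event occurs only if all inputs occur."""
    return all(inputs)

def OR(*inputs):
    """OR gate: the output event occurs if any input occurs."""
    return any(inputs)

def warning_ineffective(light_not_rechecked, light_not_conspicuous):
    # Either failure alone defeats the warning barrier (OR gate).
    return OR(light_not_rechecked, light_not_conspicuous)

def collision(user_on_crossing, train_approaching, warning_failed):
    # Top event: all three conditions must hold together (AND gate).
    return AND(user_on_crossing, train_approaching, warning_failed)

# Oakwood-style scenario: the driver did not recheck the light after
# operating the gate button, with a train approaching.
top = collision(True, True, warning_ineffective(True, False))
print(top)  # True: the accident path is reachable from these events
```

Even at this level of abstraction, the tree makes explicit that the warning barrier fails through an OR gate, so any single weakness, such as the button position that discouraged rechecking the light, defeats it.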
Therefore, intervention is necessary to improve the process of safety decision making and to enhance the effectiveness and efficiency of safety management systems, by establishing an agency to implement the risk management policy and oversee its implementation, adhering to a common standard of hazard management, assessing risk controls by pre-mortem and post-mortem analysis of hazards, and confronting the organisational biases.
The appendices present brief results demonstrating the lack of awareness of systems safety and human factors concerns in the domain.
The results of the study conducted in this research paper accord with the research experience of Prof N. A. Stanton (2016), an academic human factors expert, as well. Prof N. A. Stanton's research request to apply a System Approach to Safety to investigate accidents at level crossings was declined by RSSB and Network Rail.
I drew the attention of Prof N. A. Stanton to my peer-reviewed published research, which he had omitted in his research survey, and he agreed that he was not aware of the peer-reviewed accident research that I had carried out on accidents at level crossings. The Appendix contains the email correspondence I had with Prof N. A. Stanton during the first week of November 2016, in which I sought his permission to share the information freely. Thus, the lack of awareness of Human Capital in risk management in the GB railway domain, as per the definition of the concept by Prof Gary S. Becker, Nobel laureate in the field of labour economics (1964-1992)[50], to define complex systems, identify and analyse hazards and their causal factors, and assess the adequacy of risk controls, is established.
----- Forwarded Message -----
From: Stanton N.
To: sanjeev kumar Appicharla
Sent: Monday, 7 November 2016, 18:42
Subject: Re: Accident Investigation at Level Crossings
Dear Sanjeev,
Please feel free to cite our correspondence.
I was involved as an expert witness in the civil litigation following the Ladbroke Grove accident, which led to the following three journal papers:
Moray, N., Groeger, J. and Stanton, N. A. (2016) Quantitative Modelling in Cognitive Ergonomics: Predicting Signals Passed At Danger. Ergonomics, (Article in Press).
Stanton, N. A. and Walker, G. H. (2011) Exploring the psychological factors involved in the Ladbroke Grove rail accident. Accident Analysis & Prevention, 43 (3), 1117-1127.
Stanton, N. A. and Baber, C. (2008) Modelling of alarm handling responses times: A case of the Ladbroke Grove rail accident in the UK. Ergonomics, 51 (4) 423-440.
Kind regards,
Neville
Prof Neville A Stanton, PhD, DSc, C.Psychol., C.IEHF, C.Eng.,
Chair in Human Factors Engineering,
University of Southampton,
https://www.researchgate.net/profile/Neville_Stanton
https://scholar.google.com/citations?user=fq9SzJUAAAAJ
http://orcid.org/0000-0002-8562-3279
http://www.bbc.com/news/business-35169168
http://www.stitcher.com/podcast/bbc-inside-science/e/46049605
The University of Southampton Faculty of Engineering was ranked top in the UK in terms Research Power for General Engineering across all single-institution engineering submissions for REF 2015
From: sanjeev kumar Appicharla < >
Date: Monday, 7 November 2016 13:22
To: Neville Stanton
Subject: Re: Accident Investigation at Level Crossings
Dear Prof Neville
I thank you for your quick response and warm wishes.
I did cite your paper in my response to the Transport Select Committee but was not aware of the fact that you had approached RSSB and Network Rail for research funding and were refused.
I kindly seek your permission to quote the following emails exchanged as Supplementary Evidence to the ongoing Transport Select Committee Inquiry on Railway Safety and the INCOSE HF Working Group as well.
Thanking you in advance,
Best Regards
Sanjeev Kumar Appicharla
________________________________
From: Stanton N
To: sanjeev kumar Appicharla
Sent: Sunday, 6 November 2016, 8:21
Subject: Re: Accident Investigation at Level Crossings
Dear Sanjeev,
Thank you for indicating your research in the field. I was unaware of it.
Our research was deliberately taking the systems approach as we felt the reductionistic approaches have limitations.
We are focused on designing new road-rail interfaces.
I did ask RSSB and Network Rail if they would fund my research but they were not interested.
All the best,
Neville
Prof Neville A Stanton, PhD, DSc, C.Psychol., C.IEHF, C.Eng.,
Chair in Human Factors Engineering,
University of Southampton,
From: sanjeev kumar Appicharla
Reply-To: sanjeev kumar Appicharla
Date: Saturday, 5 November 2016 22:06
To: Neville Stanton
Subject: Accident Investigation at Level Crossings
Prof Staton
I found that your recent research is in applying Prof Jens Rassmussen concepts.
I wish to draw your attention to my peer-reviewed literature on wider human factors errors and failures observed within the GB Railways domain on applying System Thinking Approach(Prof Jens Rassmussen concepts) to investigate the accidents at level crossings as well as complying with the EU risk regulations .
I think this research should have been a part of the research you have supervised.
The failure to follow and resist the concepts advanced by Prof Jens Rassmussen's concepts in the GB Railways domain was my main finding for the studies carried out at RSSB in the Hazop studies I chaired between 2006-2010.
Failure to take cognisance of Hazop studies using HF concepts and neglect of these findings by the organisations in the GB Railways domain suggest to me A serious culture problem that needs to be addressed.
System for Investigation of Railway Interfaces<http://ieeexplore.ieee.org/search/searchresult.jsp?newsearch=true&queryText=appicharla>
Technical review of common safety method using system for investigation of railway interfaces (SIRI) methodology <https://www.scopus.com/authid/detail.uri?authorId=36959670200>
Regards
Sanjeev Kumar Appicharla
From: sanjeev kumar Appicharla
To: Rail Delivery Group
Sent: Tuesday, 10 September 2013, 16:02
Subject: Re: Fw: System Approach to Safety
Hi Graham
Thank you very much for your email and response.
I await response from the Technology and Operations working group.
Yours sincerely.
Regards
Sanjeev Kumar Appicharla
>________________________________
> From: Rail Delivery Group
>To: sanjeev kumar Appicharla
>Sent: Tuesday, 10 September 2013, 14:11
>Subject: Re: Fw: System Approach to Safety
>
>
>
>Sanjeev
>
>Thank you for your e-mails. I won't pretend to have any personal expertise in your areas of study. I have passed your proposals to RDG's recently formed Technology and Operations working group and asked them to make contact if they want to develop your ideas further.
>
>Graham
>
>
>Graham Smith
>Director-General, Rail Delivery Group
>
>On 10 September 2013 13:17, sanjeev kumar Appicharla wrote:
>> To
>>Mr Graham Smith
>>The Director General of Rail Delivery Group Limited
>>
>>
>>I do hope you have had chance to review the contents of my emails sent earlier.
>>
>>Further to my email sent to you on theme of System Approach to Safety, I have gathered further information on the state of capability of the GB Rail Industry to implement the National ERTMS Plan. This plan was examined by the ORR using the Independent Reporter Halcrow issued in March 2013. This report is hosted on their website. Here is the link.
>>http://www.rail-reg.gov.uk/upload/pdf/halcrow-ertms-review-mar13.pdf
>>However, the Report makes no mention of the difficulty with the national implementation of ERTMS identified by the author in 2003, of the UK HSE review of 2003, or of the observation by Imperial College London (2012) that the ERTMS technology plan suffers from errors in human factors engineering, safety engineering and systems engineering work capabilities.
>>
>>The above information supports my anxiety and concern that railway projects are planned without adequate thought given to the signalling systems engineering and information systems engineering activity, thereby endangering the idea of designing and delivering a safe and efficient railway. Thus, intervention may be necessary to make changes to the existing method of programme and project delivery.
>>Rather than just point to the problem space, I wish to offer the SIRI Methodology as part of the solution space, to identify and clarify the errors that reside in railway standards and regulations, in RAMS activities, or in the silo culture that distracts or distorts attention away from the real hazards.
>>Regards
>>Sanjeev Kumar Appicharla
>>
>>
>>
>>----- Forwarded Message -----
>>> From: sanjeev kumar Appicharla >
>>> To:
>>> Cc:
>>> Sent: Wednesday, 4 September 2013, 17:14
>>> Subject: System Approach to Safety
>>>
>>>
>>> To
>>>
>>> Mr Graham Smith
>>> The Director General of Rail Delivery Group Limited
>>>
>>> I am writing to inform you of the System Approach to Safety that has been
>>> developed by myself over the last 10 years whilst employed by Alstom Transport
>>> and RSSB. This is the only Systems Engineering Methodology that can implement
>>> the requirements of ROGS and the Common Safety Method by identifying unsafe
>>> situations created on the operational railway due to latent errors in
>>> managerial, engineering disciplines and operator decision making processes in
>>> conjunction with the active errors.
>>>
>>> I am interested in associating with the Rail Delivery Group, but I leave the
>>> nature of association to your best judgement at this stage. I am attaching the
>>> related papers on the SIRI Methodology, the Letter of Gratitude issued by the
>>> IET and my CV for your kind perusal.
>>>
>>> Further to the research on Human Error by James Reason, this methodology can
>>> identify technological errors as well. The System Approach to Safety was
>>> developed at RSSB prior to the Government Study in 2010. The SIRI Methodology
>>> highlighted the errors in decision making connected with various types of assets
>>> and rules and regulations that can lead to unsafe operating practices.
>>>
>>> The results of application of the SIRI Methodology were published at the IET
>>> International System Safety Conferences in 2006, 2010, 2011, and 2012. Further,
>>> two papers are due to be presented at the IET System Safety Conference in 2013.
>>> One of them deals with the Technical Review of the Common Safety Method and
>>> another deals with the Japanese Nuclear Accident 2011.
>>>
>>> I look forward to an early reply from you, on a positive note.
>>>
>>> Thanking you for your time and attention
>>>
>>> Yours sincerely
>>>
>>> Regards
>>> Sanjeev Kumar Appicharla
>>>
>>> Attachments: 8
>>>
>>> 1. SIRI Appicharla 2006: Paper presented at the IET System Safety Conference 2006
>>> 2. IET Gratitude Letter in 2009
>>> 3. SIRI Appicharla 2010: Paper presented at the IET System Safety Conference 2010
>>> 4. SIRI MORT Analysis of the Herefordshire Accident, IET Safety Conference 2011
>>> 5. MORT Analysis of the Challenger Accident, IET Safety Conference 2012
>>> 6. SIRI Analysis of the Fukushima Accident, Advance Copy, IET Safety Conference 2013
>>> 7. SIRI Analysis of the Common Safety Method, Advance Copy, IET Safety Conference 2013
>>> 8. CV of Sanjeev Kumar Appicharla dated 29 August 2013
Prof Bill Hannaman and his co-authors described eight desirable features for Human Reliability Analysis models (James Reason, 1990). The desirable features are that such models should:
(1) be compatible with, and complement, current probabilistic risk assessment (PRA);
(2) be scrutable, verifiable and repeatable;
(3) result in quantification of crew success probability as a function of time;
(4) take account of cognitive processing at the skill-based, rule-based and knowledge-based levels of performance;
(5) be able to model various performance shaping factors (e.g. design features affecting the human–machine interface, operator training and experience levels, stress factors, the time available for effective action, etc.);
(6) produce results comparable to the highest degree possible with existing data from plant experience, simulators, or expert judgement;
(7) be simple to implement and use;
(8) help to generate insights and understanding about the potential for operators to cope with the situations identified in PRA studies.
In 2009, the Health and Safety Laboratory published for the UK HSE a review of human reliability assessment methods. A total of 72 potential human reliability related tools and acronyms were identified within the search timeframe. Of these, 37 were excluded from further investigation and 35 were identified as potentially relevant to the HSE major hazard directorates and were investigated fully. Of the 35 potentially relevant HRA tools, 17 were considered to be of potential use to the major hazards directorates.
Accidents such as the Piper Alpha disaster illustrate that the performance of a highly complex socio-technical system is dependent upon the interaction of technical, human, social, organisational, managerial and environmental factors, and that these factors can be important co-contributors that could potentially lead to a catastrophic event. The purpose of this article is to give readers an overview of how human factors contribute to accidents in the offshore oil industry. An introduction to human errors and how they relate to human factors in general terms is given. From here the article discusses some of the human factors which were found to influence safety in other industries and describes the human factors codes used in accident reporting forms in the aviation, nuclear and marine industries. Analysis of 25 accident reporting forms from offshore oil companies in the UK sector of the North Sea was undertaken in relation to the human factors. Suggestions on how these accident reporting forms could be improved are given. Finally, this article describes the methods by which accidents can be reduced by focusing on the human factors, such as feedback from accident reporting in the oil industry, auditing of unsafe acts and auditing of latent failures.
To help navigate decision-taking situations on the theme of human error in complex or simple organisations, the author had created a Conceptual Architecture Diagram drawing on the ideas contained in the standard ANSI/IEEE 1471:2000. Thanks to time-stamping techniques, the Conceptual Architecture Diagram bears the following date and time: 22/08/2005.
However, the author cannot attribute the same lack of awareness of my published research to the current RSSB Safety Director, the ex-Chief Inspector of the RAIB, the ex-Director-General of the Rail Delivery Group, and the ORR Chief Inspector, who became aware of the research in 2011, 2010, 2013 and 2015 respectively. In September 2013 I made a request to the Rail Delivery Group (RDG) similar in nature to Prof N.A. Stanton's request. It is probably languishing in the email box of a member of the Technology and Operations Group (TOG), to which the then Director-General of the Rail Delivery Group, Graham Smith, directed my email. The Appendix contains the email correspondence I had with Graham Smith. In October 2015, the ORR Chief Inspector, Ian Prosser, did not take into account the objections I advanced that the SPAD risk assessment undertaken by Network Rail did not pay sufficient attention to the human factors, safety and systems engineering disciplines.
The ex-Chief Inspector of the RAIB, Carolyn Griffiths, did get a chance to read my 2010 paper and discussed the contents of the poster presentation at the 2010 IET System Safety International Conference, at which she delivered the keynote speech. She asked for the 2006 paper to be emailed to her, and her request was complied with via a private email. The 2010 paper stated that decision making on hazard situations at RSSB was biased, and she was surprised by the research finding that train drivers are unnecessarily blamed for accidents.
In 2011, the current holder of the RSSB Safety Director role, Dr. George Bearfield, partially presented RSSB research at the 2011 IET System Safety International Conference and left the presentation midway, as he felt uneasy making it. I was present at the conference to give a presentation, after his, on the post-mortem analysis of the 2010 Herefordshire level crossing fatal accident.
Unlike Dr. George Bearfield, who faced difficulty in generating system diagrams and concluding a system definition as part of system analysis for further safety analysis, my RSSB 2006 paper faced no such difficulty. Both mathematical and non-mathematical definitions of the operating system can be used in the SIRI Methodology, relying upon multiple methods of system safety and operational risk management.
The duty holder categories of station operators, train/freight operating companies and infrastructure controllers were known prior to 2006 via the 1994 Railway Safety Case Regulations, and it became part of railway professional knowledge that there would be only two railway duty-holder categories following the introduction of the 2004 European Railway Safety Regulation. Dr. George Bearfield was then the project manager leading one of the work packages of the SAMNET project, was aware of the SAMRAIL project leading to the 2004 European Railway Safety Regulation, and later was a safety knowledge manager at RSSB; that an observer from the GB railways participated in the SAMNET and SAMRAIL programmes can be seen from the evidence cited. The lack of understanding of the role of the infrastructure manager in the SPAD situation, as demonstrated in Dr. George Bearfield's Bayesian network model of the Ladbroke Grove hazard situation, is therefore a cause for wonder. Prof Fenton's remark about bad statisticians may apply in this case. In 2009, the author had dismissed Dr. George Bearfield's comment on the ABCL Hazard Report, in which he observed that all human error events are attributable to the level crossing users as per SIMIS data.
The main content of the 2011 presentation by Dr. George Bearfield and Roger Short (ex-HMRI Deputy Chief Inspector) was a graphical representation of the system hazard and barriers model, in the context of applying the Common Safety Method to the railway system and developing a new safety risk model. The graphical representation of the system hazard and barriers model was first proposed in my RSSB 2006 paper. The representation can also be seen in an earlier publication, Part V of the UK HSE safety standard IEC 61508 (2001). In the literature on safety analysis and decision making produced by Prof Jens Rasmussen (1997), the model can be seen as part of the Management Oversight and Risk Tree (1973), developed by the Aerojet Corporation for the US Department of Energy – Atomic Energy Commission to perform safety analysis of civil nuclear plants. The UK International Crisis Management Association observed that the Management Oversight and Risk Tree (1973) is the ultimate technique in operational risk management.
From reading the 2011 RSSB paper, the reader can safely infer that it does not qualify as a scientific safety paper, as it neither defined how the system which generates the hazards would be identified nor proposed any new methodology to support railway projects by generating knowledge to support the safety case by means of hazard identification and analysis. Further, the aim of the research, contrary to the demands of safety science, was to uncover economic criteria to minimise the cost of safety and risk analysis.
D.J. Edge (2001) spoke of departing from the hard human science approach to individuals, which defines correct performance in order to form a foundation for the concept of maintenance error, and of questioning the specification of the maintenance procedure. He wanted to develop human science on collective traits, like a social scientist.
The hard human science approach is psychometric: what can a person see, hear or feel; what is his reaction time, and how is it related to time on duty, temperature or humidity; how much force can he exert, and how is this affected by lying on his back? He was motivated by Dr. Debbie Lucas (2000), a former student of Prof James Reason and ex-HMRI human factors inspector. Dr. Debbie Lucas's statement urging railway organisations to consider human factors as a distinct element which must be recognised, assessed and managed effectively in order to control risk has been taken as though human error were a sufficient explanation for an accident or incident beyond the control of managers. Dr. Debbie Lucas (2006), it seems unable to confront senior managers' latent errors, shifted her attention to the fatigue management of train drivers, as can be seen from her presentation.
Senior managers were happy to respond by creating human factors departments at the Rail Safety and Standards Directorate and at Railtrack, as long as accident reporters shied away from critiquing senior managers' decisions.
December 2016
[1] Nancy Leveson. 2004/2012. Engineering a Safer World. https://mitpress.mit.edu/books/engineering-safer-world
[2] Peter J. Underwood. 2013. Examining the Systemic Accident Analysis Research-Practice Gap. PhD.Thesis.
<https://dspace.lboro.ac.uk/dspace-jspui/bitstream/2134/13845/3/Thesis-2013-Underwood.pdf>
[3] Lucic, I. (2010). Risk and Safety in Engineering Processes. (Unpublished Doctoral thesis, City University London). http://openaccess.city.ac.uk/8719/6/Lucic,%20Ivan.pdf
[4] Thommesen, J., & Andersen, H. B. (2012). Human Error Probabilities (HEPs) for generic tasks and Performance Shaping Factors (PSFs) selected for railway operations. Department of Management Engineering, Technical University of Denmark. (DTU Management Engineering Report; No. 3.2012).
[5] RSSB. the Incident Factor Classification System. http://www.rssb.co.uk/improving-industry-performance/human-factors/human-factors-case-studies/developing-the-incident-factor-classification-system.
[6] RSSB. 2014. Accident investigation guidance. http://www.rssb.co.uk/risk-analysis-and-safety-reporting/accident-investigation-and-learning/accident-investigation-resources.
[7] Perrow, C., 1984/1999. Normal Accidents. 1999 ed. New Jersey: Princeton University Press.
[8] Courant, R. and Robbins, H. "The Geometric Progression." §1.2.3 in What Is Mathematics?: An Elementary Approach to Ideas and Methods, 2nd ed. Oxford, England: Oxford University Press, pp. 13–14, 1996. https://en.wikipedia.org/wiki/Geometric_series
[9] Andrew Evans. 2003. Transport fatal accidents and FN-curves: 1967–2001. Norwich: HMSO.
[10] The House of Commons Transport Select Committee. 2014. Safety at level crossings. Norwich: HMSO. http://www.publications.parliament.uk/pa/cm201314/cmselect/cmtran/680/680.pdf
[11] RAIB 09/2007. Train collision with a road vehicle at Bratts Blackhouse No 1 User Worked Crossing, near Sizewell, Suffolk 22 May 2006
https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/411983/070426_R092009/2007. 07_Sizewell.pdf
[12] Lee’s Loss Prevention in the Process Industries. 2005. Burlington: Elsevier Butterworth-Heinemann.
[13] RAIB Report 05/2015, Fatal accident at Frampton level crossing 11 May 2014. https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/431138/R052015_150528_Frampton_LC.pdf
[15] Duncombe, R. L., 1945. Personal equation in astronomy: Popular Astronomy, Vol. 53, p.2.
http://adsabs.harvard.edu/full/1945PA.....53....2D
[16] Whittingham, R., 2004. The Blame Machine. Burlington: Elsevier Butterworth-Heinemann.
[17] Hume, D., 1739/1984. A treatise of Human Nature. 1984 ed. London: Penguin.
[18] Kahneman, D., 2011. Thinking Fast and Slow. London: Penguin Group .
[20] H.A Simon . From substantive to procedural rationality. http://digitalcollections.library.cmu.edu/awweb/awarchive?type=file&item=33828.
[22] RAIB. Investigation Into Fatal accident at Frampton level crossing. 05/2015. Derby: HMSO, May 2015.
< https://assets.publishing.service.gov.uk/media/5565cda540f0b63d11000004/R052015_150528_Frampton_LC.pdf> . Accessed 09th December 2016.
[23] Office of Rail Regulation. Guidance on the application of Commission Regulation (EU) 402/2013. <http://orr.gov.uk/__data/assets/pdf_file/0006/3867/common_safety_method_guidance.pdf>.
[24] http://www.lxrmtk.com/Search/Index. Accessed 10th December 2016.
[26] http://www.railwaysarchive.co.uk/documents/RAIB_UserWorkedCrossings2009.pdf
[27] International Electrotechnical Commission, 2001. IEC 61882, "Hazard and Operability Studies", s.l.: s.n.
[28] Rasmussen J, Pejtersen AM, Goodstein LP. Cognitive Systems Engineering. First Edition. New York: John Wiley & Sons, Inc, 1994.
Johnson WG. Management Oversight and Risk Tree SAN 821-2. Washington D.C: US Atomic Energy Agency, 1974.
Reason J. Human Error. 17th edn. New York: Cambridge University Press, 1990
[29] Appicharla, S., 2006. System for Investigation of Railway Interfaces. London: Institution of Engineering and Technology, pp. 7–16.
Appicharla, S., 2009. SIRI Analysis of risk Associated with Level crossing operations of ABCL Type, London: Unpublished RSSB Report.
[30] Hutter MB. Regulation and Risk. Oxford: Oxford University Press, 2001
[31] https://en.wikipedia.org/wiki/Clapham_Junction_rail_crash
[32] Einstein A. Relativity. London: Routledge, 1916–1952/2000.
[33] BBC Science. GCSE Bytesize. <http://www.bbc.co.uk/schools/gcsebitesize/science/triple_aqa/medical_applications_physics/the_eye/revision/6/ >
[34] Coen, E., 1999. The Art of Genes, How organisms make themselves. Oxford: Oxford University Press.
[35] http://data.parliament.uk/WrittenEvidence/CommitteeEvidence.svc/EvidenceDocument/Transport/Rail%20technology%20signalling%20and%20traffic%20management/written/31374.html
[36] http://orr.gov.uk/__data/assets/pdf_file/0006/3867/common_safety_method_guidance.pdf
[37] Penrose, R., 2004. The Road to Reality. London: Jonathan Cape.
[38] http://orr.gov.uk/__data/assets/pdf_file/0019/5662/incident-ladbrokegrove-jointinquiry.pdf
[39] http://www.sciencedirect.com/science/article/pii/S0739885912002077
[40] http://faculty.wcas.northwestern.edu/~ipsavage/104-12.pdf
[41] https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/243207/7176.pdf
[42] https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/229037/1025.pdf
[43] http://www.nobelprize.org/nobel_prizes/economic-sciences/laureates/1978/simon-lecture.pdf
[44] RSSB.2014 TAKING SAFE DECISIONS. http://www.rssb.co.uk/Library/risk-analysis-and-safety-reporting/2014-guidance-taking-safe-decisions.pdf
[45] http://www.ocw.nur.ac.rw/NR/rdonlyres/Aeronautics-and-Astronautics/16-358JSystem-SafetySpring2003/0E565A87-DFC1-4A54-B751-DF36BA2D6147/0/rasmussensafetyscience.pdf
[46] http://data.parliament.uk/writtenevidence/committeeevidence.svc/evidencedocument/transport-committee/rail-safety/written/41564.pdf
[47] Amy C. Edmondson. Learning from Mistakes is Easier Said Than Done: Group and Organizational Influences on the Detection and Correction of Human Error. < http://jab.sagepub.com/content/32/1/5.abstract >. Accessed 9th December 2015.
[48] RAIB Report 07/2016. Collision between a train and tractor at Oakwood Farm User Worked Crossing, Knaresborough, 14 May 2015. < https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/519117/R072016_160428_Oakwood_Farm.pdf>
[49] Adam Meredith. 2008. UWC (user worked crossings) misuse. http://ieeexplore.ieee.org/document/4470187/. Accessed 19th December 2016.
[50] Becker, Gary S. 1964/1992. Human Capital. London: The University of Chicago Press.