  Functional Human Reliability Analysis: A Systems
              Engineering Perspective

         Fabio De Felice, Federico Zomparelli                                              Antonella Petrillo
     Department of Civil and Mechanical Engineering                                     Department of Engineering
        University of Cassino and Southern Lazio                                    University of Naples “Parthenope”
                   Cassino (FR), Italy                                                      Napoli (NA), Italy
       defelice@unicas.it, f.zomparelli@unicas.it                                   antonella.petrillo@uniparthenope.it

                                                  Copyright © held by the authors

    Abstract— Human unreliability is the main cause of industrial accidents; in the petrochemical field, about 90% of accidents are due to human error. Over the years, several Human Reliability Analysis (HRA) models have been developed, and their major limitation is their static nature. The present research therefore proposes an innovative approach to evaluate the variability of the human error probability between related activities in complex systems, using a resilience engineering perspective. The research integrates an HRA evaluation model with a resilience engineering model, the Functional Resonance Analysis Method (FRAM), to assess human error variability. The methodology is applied to a real case study of emergency management in a petrochemical company.

    Keywords — Resilience Engineering, FRAM, System Engineering, HRA, Emergency Management.

                          I. INTRODUCTION

    In recent years, many organizations operating at a high level of safety (nuclear, aeronautical, chemical and petrochemical, etc.) have been studied, and many of the results are striking. The safety performance of these organizations is due to risk limitation and error reduction, but above all to the capacity to anticipate and plan for "the unexpected" [1]. At the heart of the culture of these organizations is the comprehension of the human factor. If human error originates from unsafe behavior, it is equally true that most incidents are avoided thanks to the ability of operators to handle the unexpected and adapt to dangerous situations by identifying alternative solutions. The term "Resilience Engineering" is used to indicate "a dynamic non-event" [2]. Resilience is the ability of an organization to develop robust and flexible processes, to monitor and revise the risk models it adopts, and to use the available resources proactively in the face of a production breakdown or increased economic pressure [3]. System analysis is the fundamental element of continuous improvement: the functioning of a system must be understood in order to prevent failures, also considering external factors that could affect the system, such as pressure, temperature, weathering and behavior over time (aging and degradation). The analysis of the environmental factors influencing the system makes it possible to develop a dynamic analysis [4]. To carry out a dynamic analysis, it is also necessary to analyze all components, their dependencies, energy, information, and so on. All these elements create a high degree of dependence and a large number of possible states in which the system can be found. When analyzing accidents, it is rare to have all the necessary information, and the little information obtained is influenced by secondary aspects such as prejudice and practical constraints [5].

    An important development in safety management practice has been the emergence of human reliability analysis (HRA). HRA techniques make it possible to calculate the human error probability associated with a specific task handled by the operator. However, HRA analyzes events linearly and does not identify cause-effect relationships [6]. To work around this problem, resilience engineering methods are needed. The Functional Resonance Analysis Method (FRAM) [7] in particular makes it possible to manage systems by considering the order between the activities that compose them and how a single upstream activity can affect downstream activities. The most important limitation of the FRAM methodology is its purely qualitative character [8]. The aim of this paper is to present an integrated framework that develops an HRA analysis using a FRAM model to identify the error variability while considering the cause-effect connections between activities. The rest of the paper is organized as follows. Section 2 presents a literature review on HRA models and FRAM; Section 3 explains the proposed methodological approach; Section 4 analyzes a real case study. Finally, Section 5 presents the conclusions and future developments of this research.

                          II. LITERATURE REVIEW

    HRA analyzes human reliability and measures the human error probability, considering the physical condition of the operator and the environmental conditions in which he or she works [9]. There are three generations of HRA methodologies.
    The first generation (1970-1990) studies the human error probability and is not very sensitive to the causes of behavior [10]. The most important first-generation HRA techniques are: the Systematic Human Action Reliability Procedure (SHARP), which considers the integration between man and machine; the Empirical technique to estimate the operator's errors (TESEO), which calculates the human error probability considering five influencing factors; the Technique for human error rate prediction (THERP), which builds an event tree and quantifies the related scenarios; the Success likelihood index method (SLIM), which assesses the error probability considering indicators defined by experts; the Human error assessment and reduction technique (HEART), which considers all factors that adversely affect activity performance; and finally the Probabilistic Cognitive Simulator (PROCOS), which returns a quantitative estimate of the human error probability.

    The second generation (1990-2005) integrates the internal and external factors affecting human performance and cognitive processes [11]. The most important second-generation HRA techniques are: the Cognitive reliability and error analysis method (CREAM), which evaluates the effect of the context on the risk of error; Standardized plant analysis risk-human (SPAR-H), which divides the causes of error into diagnosis and action and underlines the external influencing factors; and finally the Simulator for human error probability analysis (SHERPA), which considers the internal and external factors that influence human error and returns a quantitative value of the error probability.

    The third generation (since 2005) considers the dependence among the various factors of human performance. Third-generation models are currently applied only in nuclear plants and try to incorporate aspects of variability into analytical models [12]. HRA models are still widely used today. In fact, the evolution and growing complexity of industrial plants make it necessary to review HRA analysis practices for the management of socio-technical systems. From these new requirements a new analysis concept called "Resilience Engineering" was born. Resilience engineering has become a recognized alternative to traditional approaches to safety management. Whereas these have focused on risks and failures as the result of a degradation of normal performance, resilience engineering sees failures and successes as two sides of the same coin: different outcomes of how people and organizations cope with a complex, underspecified and therefore partly unpredictable environment [3]. All performances require people, technologies, and organizations. Since resources (information, time, etc.) are always limited, performance can vary. This variation is not necessarily negative: in some cases it can generate benefits, in other cases it can lead to unexpected effects [13]. For this reason, resilience engineering does not only investigate incidents, but studies all events, considering the different ways in which they can vary. One of the most popular resilience engineering models is the Functional Resonance Analysis Method (FRAM) developed by Hollnagel [7]. This model identifies the main macro functions of a system and combines them to evaluate performance variability, considering a cause-effect relationship between downstream (influenced) functions and upstream (influencing) functions. Although the FRAM model was developed recently, it has already been applied in the aeronautical [14], nuclear [15], petrochemical [16] and railway [17] sectors. The fundamental problem of the FRAM model is its qualitative approach. To overcome this limit, several authors have integrated FRAM with other quantitative methods to develop semi-quantitative models. Bjerga et al. [18] analyze FRAM in terms of modeling uncertainty, showing the need to integrate it with other reliable decision-making approaches. Rosa et al. [19] combine the FRAM model with AHP to reduce susceptibility to performance variability. Zheng et al. [20] combine the FRAM model with the SPIN model checker to test different variability paths. Praetorius et al. [21] combine FRAM with Formal Safety Assessment (FSA), a structured methodology for maritime safety decision-making. Patriarca et al. [8] define a semi-quantitative FRAM model for evaluating function variability by integrating the traditional FRAM model with Monte Carlo simulation. Albery [22] uses the finite element method (FEM) as an integration of the FRAM model to make it dynamic. Furfaro et al. [23] propose a methodology, called GOReM, for specifying requirements, applied to the analysis of a corporate cloud service. Garro et al. [24] develop a new modeling language based on temporal logic, called FORM-L, to allow visual modeling of system properties with verification through simulation. The last two works are clear examples of complex-system requirements analysis that could also be addressed with the FRAM model; in particular, such models can be used to define the requirements of complex systems before performing the combined FRAM-HRA analysis. The presented research integrates an HRA model with the FRAM analysis to evaluate the human error probability of an action conditioned by a previous action.

                          III. METHODOLOGY APPROACH

    As shown in Figure 1, an integrated approach has been developed to assess the risk of operations. The quantitative risk assessment is carried out with SHERPA [25] (in red in the figure), while the qualitative assessment is carried out with FRAM (in blue). SHERPA evaluates the human error probability of each action, while FRAM is applied to the human error analysis to identify the resonance on the network and the variability of human error. Finally, the performance variability of the operator is analyzed.

Fig. 1. Methodological approach

The methodological approach is divided into the following steps:

Step #1: Scope of analysis. Detailed description of the purpose of the analysis, the input data and the expected output data.

Step #2: Activity description. Description of the case study and the analyzed model. All the activities needed to manage the emergency must be described.

Step #3.1: GTTs definition. For each action, the Generic Task Type that best represents it must be identified. Generic Task Types (GTTs) are defined in the literature by Williams [26]. Table I shows the GTTs with their reliability values. GTTs capture the internal factors that influence the human error probability.
Step #3.2: PSFs choice. The calculation of the human error probability also depends on external factors, called Performance Shaping Factors (PSFs), that affect the operator. Gertman et al. [27] identify the major environmental factors affecting human reliability (Table II). The value of each multiplier increases as the corresponding environmental condition deteriorates.

                          TABLE I. GTT RELIABILITY VALUES

  Generic Task                                                          Reliability (k)
  1. Totally unfamiliar, performed at speed with no real idea of
     likely consequences                                                     0.65
  2. Shift or restore system to a new or original state on a single
     attempt without supervision or procedures                               0.86
  3. Complex task requiring high level of comprehension and skill            0.88
  4. Fairly simple task performed rapidly or given scant attention           0.94
  5. Routine, highly practised, rapid task involving relatively low
     level of skill                                                          0.993
  6. Restore or shift a system to original or new state following
     procedures, with some checking                                          0.992
  7. Completely familiar, highly practised, routine task occurring
     several times per hour, performed to highest possible standards
     by highly motivated personnel                                           0.99992
  8. Respond correctly to system command even when there is an
     augmented or automated supervisory system                               0.999994

                          TABLE II. PERFORMANCE SHAPING FACTORS

  PSF                 Level                 Multiplier
  Available time      Low                        1
                      Medium                     0.1
                      High                       0.01
  Stress              High                       5
                      Medium                     2
                      Nominal                    1
  Complexity          High                       5
                      Medium                     2
                      Nominal                    1
  Training            Low                        3
                      Nominal                    1
                      High                       0.5
  Procedures          Not available             50
                      Incomplete                20
                      Poor                       5
  Ergonomics          Missing                   50
                      Poor                      10
                      Nominal                    1
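For use in the calculations of the following steps, the two tables can be encoded directly as lookup structures. The sketch below is only a convenience encoding of Tables I and II in Python; the names are arbitrary and the GTT descriptions are abbreviated in the comments.

```python
# Table I: reliability k per Generic Task Type (descriptions abbreviated)
GTT_RELIABILITY = {
    1: 0.65,      # totally unfamiliar task performed at speed
    2: 0.86,      # shift/restore system to a new or original state without procedures
    3: 0.88,      # complex task requiring high comprehension and skill
    4: 0.94,      # fairly simple task performed rapidly or given scant attention
    5: 0.993,     # routine, highly practised, rapid task, low level of skill
    6: 0.992,     # restore/shift a system following procedures, with some checking
    7: 0.99992,   # completely familiar, highly practised routine task
    8: 0.999994,  # respond correctly under an augmented or automated supervisory system
}

# Table II: PSF multipliers by factor and level
PSF_MULTIPLIERS = {
    "available time": {"low": 1, "medium": 0.1, "high": 0.01},
    "stress":         {"high": 5, "medium": 2, "nominal": 1},
    "complexity":     {"high": 5, "medium": 2, "nominal": 1},
    "training":       {"low": 3, "nominal": 1, "high": 0.5},
    "procedures":     {"not available": 50, "incomplete": 20, "poor": 5},
    "ergonomics":     {"missing": 50, "poor": 10, "nominal": 1},
}

# Example: PSF product (PSFcomp) for high stress, medium complexity, poor ergonomics
psf_comp = (PSF_MULTIPLIERS["stress"]["high"]
            * PSF_MULTIPLIERS["complexity"]["medium"]
            * PSF_MULTIPLIERS["ergonomics"]["poor"])  # = 100
```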
                                                                        influence of the external environment. The nominal human
Step #3.3: HEP calculation. SHERPA estimates the human error probability by first considering the error probability influenced by the internal factors and then adding the influence of the external environment. The nominal human error probability (HEPnom) represents the human error probability considering only the internal factors. The following equation shows the calculation model:

    HEPnom = 1 - k * e^(-α(1-t)^β)                                       (1)

where α and β are the parameters of the Weibull function that represents human error [28]. The contextualized human error probability (HEPcont), which accounts for the external environment, is calculated as:

    HEPcont = (HEPnom * PSFcomp) / (HEPnom * (PSFcomp - 1) + 1)          (2)

where PSFcomp is the product of all the PSF values described above. This model calculates the human error probability for each action, but it does not make it possible to establish a cause-effect relationship.
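To make equations (1) and (2) concrete, the short sketch below implements them as plain Python functions. It is only an illustration of the calculation flow under stated assumptions: the reliability k comes from Table I and the PSF product from Table II, while the Weibull parameters α and β and the normalization of the working time t are not specified in the paper, so the values used here are placeholders.

```python
import math

def hep_nominal(k: float, t: float, alpha: float = 1.0, beta: float = 1.5) -> float:
    """Equation (1): nominal HEP from the GTT reliability k and normalized time t.

    k           : reliability of the Generic Task Type (Table I)
    t           : working time normalized to [0, 1] over the shift (assumption)
    alpha, beta : Weibull parameters, placeholder values (not given in the paper)
    """
    return 1.0 - k * math.exp(-alpha * (1.0 - t) ** beta)

def hep_contextual(hep_nom: float, psf_comp: float) -> float:
    """Equation (2): contextualized HEP given the product of the PSF multipliers."""
    return (hep_nom * psf_comp) / (hep_nom * (psf_comp - 1.0) + 1.0)

# Example: GTT3 (k = 0.88) with stress = 5, complexity = 2, ergonomics = 10
hep_nom = hep_nominal(k=0.88, t=0.5)
print(hep_contextual(hep_nom, psf_comp=5 * 2 * 10))
```

Note that equation (2) keeps HEPcont bounded between 0 and 1 even for very large PSF products, which is why the multipliers are not applied as a plain product.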
Step #4.1: Build the FRAM. The FRAM model must include all the actions (functions) of the analyzed model. The analysis can start from any function that is essential for the system, iteratively adding any other function needed to provide a complete description of the system. Each FRAM function is represented as a hexagon with six aspects (Figure 2): Input, Time, Control, Output, Resource and Precondition. Functions are connected to one another through these six aspects.

Fig. 2. FRAM hexagon
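As an illustration of how the six aspects and the couplings between functions can be handled in software, the following sketch defines a minimal, hypothetical data structure for FRAM functions; the class and field names are not from the paper and simply mirror the six aspects listed above.

```python
from __future__ import annotations
from dataclasses import dataclass, field
from enum import Enum

class Aspect(Enum):
    """The six aspects of a FRAM function (Hollnagel [7])."""
    INPUT = "I"
    OUTPUT = "O"
    PRECONDITION = "P"
    RESOURCE = "R"
    TIME = "T"
    CONTROL = "C"

@dataclass
class FramFunction:
    name: str
    # Couplings: (name of the upstream function, aspect of this function it feeds)
    upstream: list[tuple[str, Aspect]] = field(default_factory=list)

# A generic coupling: the Output of an upstream function feeds the Input of a downstream one
upstream_fn = FramFunction("Upstream function")
downstream_fn = FramFunction("Downstream function",
                             upstream=[("Upstream function", Aspect.INPUT)])
```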
Step #4.2: Functions variability. This step analyzes the variability of the functions that make up the FRAM model. If the output of a function does not vary, its variability is of no interest; it becomes crucial when it causes a change in the output of the function. The output of a function can vary in terms of time and accuracy.

Step #4.3: Variable aggregation. The FRAM analysis considers two coupled functions: a downstream function and an upstream function. If an upstream function is performed precisely and at the right time, it does not generate variability in the downstream function. However, if a function is performed imprecisely, or in an excessively long or short time, variability is generated in the downstream function. The variable aggregation tables by aspect are reported by Hollnagel [7]. Table III shows an example of coupling between upstream and downstream functions for output and input.

                 TABLE III. QUALITATIVE VARIABLE AGGREGATION (O-I)

  Upstream function variability        Possible effects on downstream function
  Time        Too early                Amplification / Damping
              In time                  No effect / Damping
              Too late                 Amplification
  Accuracy    Inaccurate               Amplification
              Acceptable               No effect / Damping
              Accurate                 Damping

    The limit of this model is its qualitative approach. Patriarca et al. [8] overcome this limit by introducing quantitative values, which have been used to develop the present model (Table IV). If the upstream function amplifies the effects on the downstream function, a value of 2 is associated with the coupling; if it dampens the effects, a value of 0.5 is associated; otherwise the value is 1.

                 TABLE IV. NUMERICAL VARIABLE AGGREGATION

  Effect             VAR(u,d)
  Amplification         2
  No effect             1
  Damping               0.5

Step #5: HEP variability. With the SHERPA model (Step #3) the human error probability of each function (HEPcont) has been calculated. With FRAM, a qualitative connection model between the different functions has been built, identifying the variability of accuracy (VARA(u,d)) and the variability of time (VART(u,d)) generated by an upstream function on a downstream function for a particular scenario. The product of the variability of accuracy and the variability of time is the total variability VARTOT. In conclusion, considering a particular scenario and a certain action of an operator on a downstream function, it is possible to calculate the human error probability conditioned (HEPcond) by the upstream function as:

    HEPcond = (HEPcont * VARTOT) / (HEPcont * (VARTOT - 1) + 1)          (3)
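The sketch below ties Step #4.3 and Step #5 together: it maps the qualitative effects of Table IV to their numerical multipliers, aggregates them into VARTOT, and applies equation (3). It is a minimal illustration, not the authors' implementation, and the HEPcont value in the example is arbitrary.

```python
# Table IV: numerical variable aggregation
EFFECT_MULTIPLIER = {"amplification": 2.0, "no effect": 1.0, "damping": 0.5}

def total_variability(time_effect: str, accuracy_effect: str) -> float:
    """VARTOT = VART(u,d) * VARA(u,d), each value taken from Table IV."""
    return EFFECT_MULTIPLIER[time_effect] * EFFECT_MULTIPLIER[accuracy_effect]

def hep_conditioned(hep_cont: float, var_tot: float) -> float:
    """Equation (3): HEP of the downstream function conditioned by the upstream one."""
    return (hep_cont * var_tot) / (hep_cont * (var_tot - 1.0) + 1.0)

# Example: upstream output arrives too late (amplification) but is accurate enough (no effect)
var_tot = total_variability("amplification", "no effect")   # = 2.0
print(hep_conditioned(hep_cont=0.10, var_tot=var_tot))      # illustrative HEPcont value
```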
                          IV. CASE STUDY

    The proposed model has been applied to a real case study concerning the analysis of an emergency in a petrochemical plant. The company recycles used oil, so it works with extremely hazardous materials: diesel, methane, hydrogen, etc. These substances create a highly explosive environment, so the safety management system must be studied thoroughly.

Step #1: Scope of analysis. To analyze the emergency management activities by assessing the human error probability related to each activity, and to use FRAM to detect the performance variability generated by an upstream function on the downstream function, obtaining a conditioned error probability value.

Step #2: Activity description. The case study analyzes the standard actions to be taken after the explosion of a liquid methane tank. The analysis considers the actions of the desk operator, who works in the control room, and the subsequent actions of the field operator, who works on the production site. The model analyzes the variability of the field operator's error probability when the desk operator makes a mistake earlier.
    1. Alarm signal
    2. Evacuation
    3. Closing steam systems
    4. Power shutdown 03T102A / F
    5. Closing distillation systems
    6. Cross pump stop 01P102B / C
    7. Power pump stop 01P104A / D
    8. Suction valve closure 04 04 BN192
    9. Closing the heating system
    10. Switch off oven 0H03
    11. Extraction pump stop 02P104G / H
    12. Air cooler stop 09KL198I / N
The analysis focuses on operation #3 (performed by the desk operator) and operation #4 (performed by the field operator).

Step #3.1: GTTs definition. Action #3 is associated with GTT5, "Routine, highly practised, rapid task involving relatively low level of skill", while action #4 is associated with GTT3, "Complex task requiring high level of comprehension and skill". Each operation is associated with the GTT that best represents it.
Step #3.2: PSFs choice. Table V shows the external working conditions of the two operators, considering the level of stress, complexity and ergonomics. The operator in the production plant has worse stress and ergonomics values than the desk operator in the control room, who in turn performs more complex operations. All other PSFs not listed in the table are assumed nominal and take the value 1.

                          TABLE V. PSFs FOR THE CASE STUDY

  PSF             Desk operator        Field operator
  Stress                2                     5
  Complexity            5                     2
  Ergonomics            1                    10

Step #3.3: HEP calculation. Analyzing the internal factors obtained from the GTTs and the external factors obtained from the PSFs, it is possible to calculate the human error probability of the two activities over the 8 hours of work (Figure 3).

Fig. 3. Human error probability graph
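A minimal sketch of how the curves of Figure 3 can be generated from the data above is shown below, reusing equations (1) and (2). The GTT reliabilities (0.993 for the desk operator, 0.88 for the field operator) and the PSF products (2 x 5 x 1 = 10 and 5 x 2 x 10 = 100) come from Tables I and V; the Weibull parameters and the time normalization are not reported in the paper, so the values used here are purely illustrative and will not reproduce the paper's exact results.

```python
import math

# GTT reliability (Table I) and PSF product (Table V) for the two operators
OPERATORS = {
    "desk operator":  {"k": 0.993, "psf_comp": 2 * 5 * 1},   # GTT5; stress, complexity, ergonomics
    "field operator": {"k": 0.88,  "psf_comp": 5 * 2 * 10},  # GTT3
}
ALPHA, BETA = 1.0, 1.5  # placeholder Weibull parameters (not given in the paper)

def hep_contextual(k: float, t: float, psf_comp: float) -> float:
    hep_nom = 1.0 - k * math.exp(-ALPHA * (1.0 - t) ** BETA)          # equation (1)
    return (hep_nom * psf_comp) / (hep_nom * (psf_comp - 1.0) + 1.0)  # equation (2)

for hour in range(1, 9):  # 8-hour shift
    t = hour / 8.0        # assumed normalization of the working time
    values = {name: round(hep_contextual(p["k"], t, p["psf_comp"]), 4)
              for name, p in OPERATORS.items()}
    print(hour, values)
```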
Step #4.1: Build the FRAM. The functions described in Step #2 are represented with a FRAM graph (Figure 4). The model identifies the connections between the various functions. The two analyzed functions, #3 and #4, are highlighted in red; in particular, the output of function #3 is the precondition of function #4.

Fig. 4. FRAM model
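The coupling that is explicit in the text (the output of function #3 feeding the precondition of function #4) can be recorded in the same style as the sketch in Section III; the remaining connections exist only in Figure 4 and are therefore not encoded here.

```python
# Case-study coupling that is explicit in the text:
# (upstream function, its aspect) -> (downstream function, aspect it feeds)
COUPLINGS = {
    ("3. Closing steam systems", "Output"): ("4. Power shutdown 03T102A/F", "Precondition"),
}
```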

Step #4.2: Functions variability. In the case study, only human functions are analyzed. In particular, the simulated scenario assumes that the operator performs the actions correctly but too late. The causes of this delay may be internal to the operator (psychological and physiological) but also external (social and organizational). Both kinds of causes are very frequent and have serious consequences on variability.

Step #4.3: Variable aggregation. The case study considers two functions linked as output and precondition. Table VI shows the variability of the functions: in particular, VARA(u,d) = 1 and VART(u,d) = 2.

                 TABLE VI. VARIABLE AGGREGATION (O-P)

  Upstream function variability      Possible effect on downstream function      Value
  Time        Too late               Amplification                                 2
  Accuracy    Acceptable             No effect                                     1

Step #5: HEP variability. The last step of the study calculates the conditioned human error probability for activity #4, influenced by the variability generated by activity #3. In this case study, function #3 shows variability due to a delay of the action, so the error probability of the downstream action is higher. Figure 5 compares the contextual error probability with the conditioned error probability for activity #4.
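As a worked example of this last step, the sketch below applies equation (3) with the aggregation of Table VI (VARTOT = VART x VARA = 2 x 1 = 2) to the contextual error probability of activity #4. Since the numerical HEPcont values are reported in the paper only graphically (Figures 3 and 5), the value used here is a placeholder.

```python
def hep_conditioned(hep_cont: float, var_tot: float) -> float:
    """Equation (3): conditioned HEP of the downstream function."""
    return (hep_cont * var_tot) / (hep_cont * (var_tot - 1.0) + 1.0)

VAR_T, VAR_A = 2.0, 1.0   # Table VI: output too late (amplification), accuracy acceptable
var_tot = VAR_T * VAR_A   # total variability = 2

hep_cont_4 = 0.05         # placeholder HEPcont for activity #4 (not given numerically)
print(hep_conditioned(hep_cont_4, var_tot))  # about 0.095: higher than the contextual value
```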
Fig. 5. Contextual and conditioned human error probability
                          V. CONCLUSIONS

    The complexity of modern industrial plants drives managers to continually analyze processes, especially in terms of safety management, in order to limit the number of workplace accidents and occupational diseases. Studies on technology and machine reliability have considerably reduced the percentage of accidents due to mechanical failures; today the major cause of accidents is human error. Historically, several HRA models have been developed to assess human error, but their major limitation is their static nature. In recent years, to address the complexity of industrial plants, a new type of analysis called "Resilience Engineering" has been developed, which evaluates the performance variability of dynamic functions by considering cause-effect links. One resilience engineering model is FRAM, which makes it possible to evaluate the performance variability of different functions; its most important limit is its qualitative approach. This research integrates a quantitative HRA model with the qualitative FRAM: it numerically calculates the human error probability of functions, considering the influence of the upstream function on the downstream function. The research model is applied to an emergency management analysis in a petrochemical company. The case study concerns an emergency created by the explosion of a methane tank. Two activities (Closing steam systems and Power shutdown 03T102A/F) are identified and their independent error probabilities are calculated. The FRAM of the incident is then analyzed and the error probability of action #4 is calculated considering the errors made in activity #3, whose output is a precondition of action #4. The results show a growing trend of the error probability with the passage of time. The analysis identified a higher HEPcont for function #4; after taking the performance variability into account, the HEPcond value is higher still. Future development of the model involves building a simulation model for the integrated HRA-FRAM analysis.

                          REFERENCES

[1]  A. Bahoo Toroody, F. Bahoo Toroody, and F. De Carlo, "Development of a risk based methodology to consider influence of human failure in industrial plants operation", Summer School "Francesco Turco", September 2017.
[2]  K.E. Weick and K.M. Sutcliffe, "Managing the unexpected: Resilient performance in an age of uncertainty", vol. 8, John Wiley & Sons, 2011.
[3]  E. Hollnagel, "Resilience engineering in practice: A guidebook", Ashgate Publishing, Ltd., 2013.
[4]  A. Petrillo, D. Falcone, F. De Felice, and F. Zomparelli, "Development of a risk analysis model to evaluate human error in industrial plants and in critical infrastructures", International Journal of Disaster Risk Reduction, 23, 15-24, 2017.
[5]  G. Haddow, J. Bullock, and D.P. Coppola, "Introduction to emergency management", Butterworth-Heinemann, 2017.
[6]  N. Norazahar, F. Khan, B. Veitch, and S. MacKinnon, "Prioritizing safety critical human and organizational factors of EER systems of offshore installations in a harsh environment", Safety Science, 95, 171-181, 2017.
[7]  E. Hollnagel, "FRAM, the functional resonance analysis method: modelling complex socio-technical systems", Ashgate Publishing, Ltd., 2012.
[8]  R. Patriarca, G. Di Gravio, F. Costantino, and M. Tronci, "The Functional Resonance Analysis Method for a systemic risk based environmental auditing in a sinter plant: A semi-quantitative approach", Environmental Impact Assessment Review, 63, 72-86, 2017.
[9]  I.S. Kim, "Human reliability analysis in the man-machine interface design review", Annals of Nuclear Energy, 28, 1069-1081, 2001.
[10] E. Hollnagel, "Reliability analysis and operator modelling", Reliability Engineering & System Safety, 52, 327-337, 1996.
[11] E. Hollnagel, "Cognitive reliability and error analysis method (CREAM)", Elsevier, 1998.
[12] W. Jung, J. Park, and J. Ha, "Analysis of an operators' performance time and its application to a human reliability analysis in nuclear power plants", IEEE Transactions on Nuclear Science, 54, 1801-1811, 2007.
[13] A. Azadeh, M. Partovi, M. Saberi, E. Chang, and O. Hussain, "A Bayesian Network for Improving Organizational Regulations Effectiveness: Concurrent Modeling of Organizational Resilience Engineering and Macro-Ergonomics Indicators", International Conference on Intelligent Networking and Collaborative Systems, pp. 285-295, Springer, Cham, 2017.
[14] I.A. Herrera, E. Hollnagel, and S. Håbrekke, "Proposing safety performance indicators for helicopter offshore on the Norwegian Continental Shelf", PSAM10 - Tenth Conf. Probabilistic Saf. Assess. Manag., 2010.
[15] K. Lundblad and J. Speziali, "FRAM as a risk assessment method for nuclear fuel transportation", Int. Conference Work. Saf., 2008.
[16] G. Shirali, V. Ebrahipour, and L. Mohammad Salahi, "Proactive risk assessment to identify emergent risks using Functional Resonance Analysis Method (FRAM): a case study in an oil process unit", Iran Occup. Health, 10, 33-46, 2013.
[17] R. Steen and T. Aven, "A risk perspective suitable for resilience engineering", Saf. Sci., 49, 292-297, 2011.
[18] T. Bjerga, T. Aven, and E. Zio, "Uncertainty treatment in risk analysis of complex systems: the cases of STAMP and FRAM", Reliab. Eng. Syst. Saf., 156, 2016.
[19] L.V. Rosa, A.N. Haddad, and P.V. de Carvalho, "Assessing risk in sustainable construction using the Functional Resonance Analysis Method (FRAM)", Cogn. Technol. Work, 17, 559-573, 2015.
[20] Z. Zheng, J. Tian, and T. Zhao, "Refining operation guidelines with model-checking-aided FRAM to improve manufacturing processes: a case study for aeroengine blade forging", Cogn. Technol. Work, 18, 2016.
[21] G. Praetorius, A. Graziano, J.U. Schröder-Hinrichs, and M. Baldauf, "FRAM in FSA - Introducing a function-based approach to the formal safety assessment framework", Adv. Intell. Syst. Comput., 2016.
[22] S. Albery, "Dynamic Numerical Simulation Using the Finite Element Method (LS-Dyna, Altair Hyperworks)", 2013.
[23] A. Furfaro, T. Gallo, A. Garro, D. Saccà, and A. Tundis, "Requirements specification of a cloud service for cyber security compliance analysis", 2nd International Conference on Cloud Computing Technologies and Applications (CloudTech), pp. 205-212, IEEE, 2016.
[24] A. Garro, A. Tundis, D. Bouskela, A. Jardin, N. Thuy, M. Otter, and H. Olsson, "On formal cyber physical system properties modeling: a new temporal logic language and a Modelica-based solution", IEEE International Symposium on Systems Engineering (ISSE), pp. 1-8, IEEE, 2016.
[25] V. Di Pasquale, S. Miranda, R. Iannone, and S. Riemma, "Simulative analysis of performance shaping factors impact on human reliability in manufacturing activities", 27th European Modeling and Simulation Symposium, pp. 93-102, 2015.
[26] J.C. Williams, "HEART - a proposed method for assessing and reducing human error", 9th Advances in Reliability Technology Symposium, University of Bradford, 1986.
[27] D.I. Gertman, H.S. Blackman, J.L. Marble, J. Byers, and C. Smith, "The SPAR-H Human Reliability Analysis Method", U.S. Nuclear Regulatory Commission, NUREG/CR-6883, INL/EXT-05-00509, Washington DC, USA, 2005.
[28] B. Kirwan, "The validation of three human reliability quantification techniques - THERP, HEART and JHEDI. Part 1: technique descriptions and validation issues", Applied Ergonomics, 27, 359-373, 1996.