<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta />
    <article-meta>
      <title-group>
        <article-title>Cassandra's Calling Card: Socio-technical Risk Analysis and Management in Cyber Security Systems</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Richard McEvoy</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
          <xref ref-type="aff" rid="aff1">1</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Stewart Kowalski</string-name>
          <xref ref-type="aff" rid="aff1">1</xref>
        </contrib>
        <aff id="aff0">
          <label>0</label>
          <institution>DXC Technology</institution>
          ,
          <addr-line>Royal Pavilion Wellesley Road Aldershot Hampshire GU11 1PZ</addr-line>
        </aff>
        <aff id="aff1">
          <label>1</label>
          <institution>NTNU i Gjøvik</institution>
          ,
          <addr-line>Teknologivegen 22, 2815 Gjøvik</addr-line>
          ,
          <country country="NO">Norway</country>
        </aff>
      </contrib-group>
      <fpage>65</fpage>
      <lpage>80</lpage>
      <abstract>
        <p>Current methodologies for cyber security risk analysis are largely focused on process and technology. They do not systematically incorporate sociotechnical thinking. We argue this reduces their predictive power in determining the risks of cyber threats to organizations and hence limits the range of responses. A remedy is to augment such systems using suitable socio-technical models. As an example, we propose a re-working of Rasmussen's model for safety in systems, applying it to cyber security. The updated model gives rise to a set of predictors and boundary conditions which can be used to determine an organization's resilience in the face of external and internal cyber threats, enabling analysts to propose an extended range of countermeasures. We propose using this approach as a basis to include socio-technical analysis in risk assessment. As an example, we provide a critique of the risk methodology used in SABSA against this model. We discuss practical applications of the approach and some associated issues. Future work will focus on incorporating this approach into a variety of risk methodologies and the creation of novel techniques that can be tested in the simulated cyber security environment of a cyber range or in the field.</p>
      </abstract>
      <kwd-group>
        <kwd>socio-technical systems analysis</kwd>
        <kwd>cyber security</kwd>
        <kwd>risk analysis</kwd>
        <kwd>risk management</kwd>
      </kwd-group>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
<title>Introduction</title>
      <p>Current methodologies used for cyber security risk analysis and management largely
focus on technical and process requirements. Evidence from incidents such as the Sony
hacks and failures at SingHealth (see section 3) point to social failures as much as
technical or procedural weaknesses as reasons for the occurrence of security incidents.
Threats and vulnerabilities in organizations are also not simply technical or procedural
in nature but result from complex systemic factors arising in modern organizations and
societies. We argue that the lack of socio-technical systems analysis in commonly used
risk analysis methodologies leaves organizations more vulnerable to cyber security risk
than they should be. As one possible means of addressing this, we show how
Rasmussen’s model of a complex socio-technical system for safety engineering can be adapted
for cyber security purposes. This adaptation, in turn, provides a basis for enhancing
current approaches to risk analysis and management. We give an example of applying
our approach to the risk methodology SABSA, outlining the current weaknesses in the
approach and making recommendations for its improvement. Future work will focus
both on enhancing current approaches in other methodologies and developing new
ones. Verification and validation of our techniques will be aided not only by case
studies in real life, but also the use of simulated studies in the Norwegian Cyber Range.</p>
      <p>
        Section 2, “Literature Review”, puts our approach in the context of current research.
In Section 3, “A Socio-Technical Risk Vacuum?”, we provide evidence of the current
techno-centric nature of cyber security risk analysis methodologies. In
Section 4, “Cyber Systems as Complex Socio-Technical Systems”, we model the
socio-technical nature of cyber security risk to organizations in terms of Rasmussen’s
model of complex socio-technical systems
        <xref ref-type="bibr" rid="ref8">(Rasmussen 1997)</xref>
        , and we augment the
boundary conditions and predictors provided in the model (Cassano-Piché, Vicente et
al. 2006), incorporating barriers to organizational learning (Kleiner and Corrigan 1989),
such as failure to pay reputational costs (Cassano-Piché, Vicente et al. 2006). Section
5, “Developing Socio-Technical Techniques for Cyber Security Risk Analysis”, states
the predictive properties cyber security risk analysis techniques should have in order
to address risks on a socio-technical basis. These factors can be used as criteria
to develop and assess proposed techniques. In Section 6, “Example Assessment –
SABSA”, we provide an example assessment of a commonly used risk methodology –
SABSA
        <xref ref-type="bibr" rid="ref13">(Sherwood, Clark et al. 2004)</xref>
        – against our criteria. We make some practical
recommendations for its enhancement using our approach. We conclude in Section 7,
“Conclusions and Future Work”, where we set out our proposals for developing both
current and novel techniques for risk analysis on the basis of our approach, and for their
verification and validation.
      </p>
    </sec>
    <sec id="sec-2">
      <title>Literature Review</title>
      <p>
        Socio-technical systems analysis has been an established part of safety engineering practice
for over fifty years
        <xref ref-type="bibr" rid="ref10 ref15 ref2">(Leveson 2011, Salmon, Stanton et al. 2017)</xref>
        . But despite sustained
efforts by – for example – Kowalski
        <xref ref-type="bibr" rid="ref2 ref4">(Al Sabbagh and Kowalski 2012, Al Sabbagh and
Kowalski 2015)</xref>
        , Sasse
        <xref ref-type="bibr" rid="ref1 ref11">(Adams and Sasse 1999, Sasse, Brostoff et al. 2001)</xref>
        and
Anderson (Anderson and Moore 2007), cyber security risk analysis and management
techniques commonly in use do not address socio-technical, economic or human-factors-based
analysis in any depth (see Section 3).
      </p>
      <p>
        Where socio-technical or human factors are addressed, the treatment is generally
superficial. Users are treated as hostile, lazy, or ignorant – and the highest standard is
conformance to cyber security policy, which may be out of step with actual security
requirements, or place an unacceptable burden on costs or workloads – leading to its
“shadow” abandonment
        <xref ref-type="bibr" rid="ref12 ref17 ref2 ref4">(McEvoy and Kowalski, Schlienger and Teufel 2003,
Okere, Van Niekerk et al. 2012)</xref>
        – or even be
directly mistaken (BBC 2017). A faulty epistemological standpoint can also undermine
the effectiveness of approaches – for example, the assumption that managers can stand
objectively outside the system and manipulate other agents within it
        <xref ref-type="bibr" rid="ref3">(Mowles 2016)</xref>
        .
      </p>
      <p>
        This shortcoming is starting to be addressed in academia, where there is a recognition
that cyber security requires approaches that include human and organizational factors and
that technical solutions alone no longer provide satisfactory answers – see, for example,
        <xref ref-type="bibr" rid="ref14">(Soomro, Shah et al. 2016)</xref>
        for a review of the relevant literature.
      </p>
      <p>
        This paper attempts to bridge this gap by showing how socio-technical models can
be adapted to create a practical set of criteria for judging risk analysis and management
techniques from a socio-technical viewpoint, as a basis for assessing and
developing current techniques to make them more complete. As an example of this
approach, it draws on and adapts Rasmussen’s model of a complex socio-technical
system
        <xref ref-type="bibr" rid="ref8">(Rasmussen 1997)</xref>
        and brings it up to date by including some additional predictors
drawn from a real-world scenario (Cassano-Piché, Vicente et al. 2006) and by
considering the increasingly complex intermediation of the supply of cyber services (Section
4). We use the risk methodology SABSA
        <xref ref-type="bibr" rid="ref13">(Sherwood, Clark et al. 2004)</xref>
        as a
well-known and accessible example of a commercial risk analysis and management
methodology to demonstrate our approach.
      </p>
      <p>Ultimately, we propose to validate the plausibility of these models through action
research on real-world systems and realistic simulations (Checkland and Holwell
1998).
</p>
    </sec>
    <sec id="sec-3">
      <title>A Socio-Technical Risk Vacuum?</title>
      <p>
        We argue that most current approaches to cyber security risk analysis center on
technology and procedures, reflecting their origins in engineering practice
        <xref ref-type="bibr" rid="ref9">(Saleh and
Alfantookh 2011)</xref>
        . Others reflect a more business-oriented approach, considering
different parts of the organization. But almost none systematically incorporates
socio-technical aspects – a gap that leads to risks being underestimated (see below and Section 4).
      </p>
      <p>We carried out a brief review, based on a survey of major cyber security risk analysis
methodologies in the marketplace (Ionita 2013). We define socio-technical thinking as
the consideration of a combination of factors - Culture (for example, human factors,
behavior, ethnicity, organizational culture), Structure (for example, economics, social
structures, politics, regulation, policy, procedure management and governance),
Methods (policies and procedures, working practices) and Machines (technology) and their
Interaction – see Figure 1.</p>
      <p>
        The criterion used was that, during the risk-discovery phase (as defined in
ISO 27005
        <xref ref-type="bibr" rid="ref18">(Wahlgren, Bencherifa et al. 2013)</xref>
        ), the methodology mentioned the factors
directly and dealt with them systematically. For example, if culture was mentioned in
relation to users’ knowledge and awareness of cyber security, this scored a 1. If culture was
examined at different levels (worker, manager, senior manager) and other
aspects, such as group as well as individual learning, behavior and ethics, were also examined,
it scored a 2. Finally, Interaction was considered, at the basic level, to be the building of
risks from links between different factors and components, rather than considering them
in isolation, and, more fundamentally, the identification of underlying patterns in the
social, economic and political life of the organization which contributed to its overall
risk.
      </p>
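<p>The scoring scheme above can be summarized as a simple aggregation: five factors (Culture, Structure, Methods, Machines, Interaction), each rated 0, 1 or 2, summed to a completeness score out of 10. The following sketch is ours, for illustration only; the per-factor ratings shown are hypothetical, not the paper’s actual Table 1 results.</p>

```python
# Sketch of the review's scoring scheme: five socio-technical factors,
# each scored 0 (not present), 1 (partially present) or 2 (systematically
# addressed), summed to a completeness score out of 10.
FACTORS = ("Culture", "Structure", "Methods", "Machines", "Interaction")

def completeness_score(scores: dict) -> int:
    for factor, value in scores.items():
        if factor not in FACTORS:
            raise ValueError("unknown factor: " + factor)
        if value not in (0, 1, 2):
            raise ValueError("score must be 0, 1 or 2: " + factor)
    # A factor that is not mentioned at all scores 0.
    return sum(scores.get(f, 0) for f in FACTORS)

# Hypothetical ratings for illustration; the paper's results are in Table 1.
example = {"Culture": 1, "Structure": 1, "Methods": 2,
           "Machines": 2, "Interaction": 1}
print(completeness_score(example))  # prints 7 (out of 10)
```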
      <p>
        The results of the review are shown in Table 1. The total score (out of 10) shows the
completeness of the methodology in socio-technical terms in our interpretation.
Although not exhaustive, the list shows that most common methods in the marketplace
have a shortfall in terms of socio-technical systems analysis. We have also been
informed by experienced colleagues in industry that even where an approach is
theoretically more complete (e.g., NIST
        <xref ref-type="bibr" rid="ref9">(Saleh and Alfantookh 2011)</xref>
        ), in practice, these aspects
are seldom addressed1.
      </p>
      <p>
        To demonstrate how scores were assigned, we use SABSA
        <xref ref-type="bibr" rid="ref13">(Sherwood, Clark et al.
2004)</xref>
        and OCTAVE Allegro as example methodologies. SABSA scores quite highly
because it takes a layered approach to risk analysis, based on soft systems
methodologies, considering threats in terms of four domains (people, processes, systems and
external). The assets of the organization are considered its business goals and threats
are categorized in relation to these goals – for example, threats to health and safety,
threats to information security, business failures in terms of mis-selling systems
capabilities and so forth. Threats themselves are analyzed using scenarios and the capability
and motivation of threat actors considered. The scenarios also allow some consideration
of the interaction of different factors (under the labels of catalysts and inhibitors).
      </p>
      <p>
        Having said this, the SABSA approach is business-oriented rather than
socio-technically oriented. Underlying factors which lead to the kinds of vulnerabilities identified
in the approach are not considered – for example, power relations
between groups (Bălan 2010), social and economic pressures which militate against
process maintenance, the results of changes over time
        <xref ref-type="bibr" rid="ref8">(Rasmussen 1997)</xref>
        , or
group/individual cognitive perceptions of security measures
        <xref ref-type="bibr" rid="ref5">(Oshlyansky, Cairns et al. 2007)</xref>
        .
Hence, we perceive this approach as only partially meeting the
socio-technical criteria proposed. Similar remarks can be made about NIST and IRAM
(see also the remarks above on how these factors are ignored in practice).
      </p>
      <p>OCTAVE Allegro, in contrast, takes an information-centric approach to risk analysis
where the organization’s information assets represent the organizing principle.
Information moves between various “containers” in a system which can include people as
well as technical objects which store and transmit information assets. Each type of
container is associated with a set of questions about potential vulnerabilities and various
counter-measures are proposed at container level. From a socio-technical point of
view, this model focuses on technology and procedures. For example, it does not
address social or human reasons why vulnerabilities might occur and does not consider
interaction between system components or aspects such as change over time. Similar
remarks can be made about CRAMM, other OCTAVE methodologies, FAIR, Infosec
Standard 1, attack path analysis and MEHARI. In essence, all these approaches focus
on technical, physical and procedural security control measures and do not consider
social factors leading to technical vulnerabilities or poor behavior.</p>
      <p>We believe this socio-technical deficit represents a serious gap in thinking about
cyber security risk which affects all aspects of risk analysis, from gathering threat
intelligence to contingency planning, since almost all threats have a strong socio-technical
element to them, both in terms of how and where the threat originates, who implements
it, what organizations are affected, how they are impacted, their resilience and their
ability to recover from the attack.
1 In conversation.</p>
      <p>Examples of threats with marked socio-technical elements include2:
1. Hostilities between nation-states
2. Political protestors engaging with social media to launch attacks
3. Insider attacks
4. Malware attacks making use of (ultimately disposable) human agents
5. Hardware and software backdoors
6. Insider trading
7. State-sponsored espionage</p>
      <p>Where organizations do not take account of socio-technical factors, they fail to
defend themselves effectively or respond appropriately to threats because they do not
understand the full extent of the risks they face, the vulnerabilities they have or the
potential impacts of such attacks on them, not just at the technological level, but at the
organizational level.</p>
      <p>
        For example, the “Wannacry” attack (Hillier 2018) on the NHS (National
Health Service) in the UK proved more devastating than it should have been due to a
failure to appreciate the weaknesses of the organization from a socio-technical point of
view. Central governing bodies were not aware of shortcomings in technical defenses
– demonstrating weaknesses in control and feedback within the organization and a
resultant weakness in the vertical integration of its working practices. The close-coupled
and complex nature of NHS systems (where, for example, medical equipment, patient
management systems, IoT and ICS devices and office systems reside on the same
unsegmented network) ensured a non-linear impact of any successful attacks
        <xref ref-type="bibr" rid="ref10 ref15 ref2 ref6 ref9">(Perrow
2011, Ehrenfeld 2017, Mattei 2017, Mohurle and Patil 2017, Sütterlin, Dyrkolbotn et
al. 2018)</xref>
        . Furthermore, the organization’s priorities during the attack were skewed in
terms of dealing with the attack and its costs rather than addressing the more vital issue
of patient safety – a serious cultural shortcoming where central government cost
concerns took priority over patients’ lives.
      </p>
      <p>Other attacks show similar patterns. The Sony attacks represented a set of internal
management failings as much as they did the work of sophisticated attackers (Berghel
2015), including the willingness of the director of IT to cover up audit failings. Similar
statements can be made about the failures at SingHealth3.</p>
      <p>
        These real-world examples show that socio-technical considerations are vital during
risk analysis. This is further emphasized by the complex nature of modern cyber
systems, which we consider in Section 4.
2 https://www.forbes.com/sites/firewall/2010/04/29/seven-cyber-scenarios-to-keep-you-awakeat-night/#2e701f576f7d
3 https://www.straitstimes.com/singapore/probe-report-on-singhealth-data-breach-points-tobasic-failings?utm_medium=Social&amp;utm_campaign=STFB&amp;utm_source=Facebook&amp;fbclid=IwAR0HHFtADeIC5jLquA3bZuMGUgBAxnnPK96NzmHRAXvkf6rwc7Ml-K9yhs8#Echobox=1547076196
      </p>
      <p>
        Rasmussen’s model
        <xref ref-type="bibr" rid="ref8">(Rasmussen 1997)</xref>
        demonstrates how organizations need to think
socially and systemically in addressing risk – considering the vertical integration of
systems (managerial, operational and technical) and how pressures on systems – on
the one hand, to economize on cost and, on the other, to reduce workload – can push
systems to a point where they are vulnerable to a triggering event, whether exogenous
or endogenous, which creates a catastrophic incident.
      </p>
      <p>These models are useful because they allow us not just to analyze incidents but also
to derive incident predictors and determine boundary conditions (Cassano-Piché,
Vicente et al. 2006). But these predictors are currently written with a view to
preventing accidents which threaten health and safety, rather than specifically cyber security
incidents – that is, triggering events are normally endogenous rather than exogenous.
They also do not take account of reputational costs and their influence on security
decisions, nor of failures in organizational learning due to organizational inertia (Kleiner
and Corrigan 1989) – which specifically includes the organizational response to threat
intelligence and to the introduction of new technologies in the cyber security context.</p>
      <p>Furthermore, although possibly unintentionally, the current version of the model
might lead analysts to believe that the primary focus should be on a single organization
and its integration. However, most critical systems are now provided on a multi-party
basis, which induces a requirement for lateral as well as vertical integration. This leads
us to propose two extensions to the model, shown in Figures 1 and 2.</p>
      <p>Table 1 – Socio-technical completeness scores for risk analysis and management
methodologies. Methodologies assessed: CRAMM, OCTAVE Allegro, NIST, Infosec
Standard 1, FAIR, MEHARI, STRIDE, SABSA (Risk), Attack Path Analysis and IRAM.
Scoring key: 0 – Not present; 1 – Partially present; 2 – Systematically addressed;
N/A – Not applicable.</p>
      <p>Figure 1 – Rasmussen’s Model - Adapted for Cyber Organizations</p>
      <p>Original Predictors</p>
      <p>Safety is an emergent property of a complex socio-technical system. It
is impacted by the decisions of all the actors.</p>
      <p>Threats to safety are usually caused by multiple contributing factors,
not just a single catastrophic decision or action.</p>
      <p>Threats to safety or accidents usually result from a lack of vertical
integration across all levels of a complex socio-technical system, not just
from deficiencies at any one level alone.</p>
      <p>The lack of vertical integration is caused, in part, by a lack of feedback
across levels of a complex socio-technical system. Actors at each level
cannot see how their decisions interact with those made by actors at
other levels, so the threats to safety are far from obvious before an
accident.</p>
      <p>Work practices in a complex socio-technical system are not static. They
will migrate over time under the influence of a cost gradient driven by
financial pressures in an aggressive competitive environment and
under the influence of an effort gradient driven by the psychological
pressure to follow the path of least resistance.</p>
      <p>The migration of work practices can occur at multiple levels of a
complex socio-technical system, not just one level alone.</p>
      <p>Migration of work practices causes the system’s defenses to degrade
and erode gradually over time, not all at once. Accidents are released
by a combination of this systemically-induced migration in work
practices and a triggering event, not by an unusual action or an entirely new,
one-time threat to safety.</p>
      <p>Updated Cyber Security Predictors</p>
      <p>Security is an emergent property of a complex cyber system. It is impacted by the
decisions of all the actors.</p>
      <p>Threats to security are usually caused by multiple malicious actors, and vulnerabilities
in organizations by multiple contributing factors – not just a single catastrophic
threat, decision or action.</p>
      <p>Security incidents or vulnerabilities usually result from a lack of vertical and
lateral integration across all levels of a complex cyber system, not just from
deficiencies at any one level alone.</p>
      <p>The lack of integration is caused, in part, by a lack of feedback across and between
levels of a complex cyber system. Actors at each level cannot see how their
decisions interact with those of others so the threats to security are far from
obvious before an incident.</p>
      <p>Work practices in a complex cyber system are not static. They will migrate over
time under the influence of a cost gradient driven by financial pressures in an
aggressive competitive environment and under the influence of an effort
gradient driven by the psychological pressure to follow the path of least resistance.
Organizations may resist change for a number of other cultural reasons.</p>
      <p>The migration of work practices can occur at and across multiple levels of a
complex cyber system, not just at one level or in one organization alone.</p>
      <p>Migration of work practices causes the system’s defences to degrade and erode
gradually over time, not all at once. Incidents occur due to a combination of
changes in malicious action and this systemically-induced migration in work
practices – not by a single unusual action or an entirely new, one-time threat to
security.</p>
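<p>The last predictor describes a two-factor mechanism: defences erode gradually under cost and effort pressure, and an incident occurs only when a triggering event (malicious action) arrives after practices have migrated past the security boundary. The following toy simulation is our illustration of that dynamic, not part of the paper; all parameter values are arbitrary.</p>

```python
# Toy illustration of the erosion-plus-trigger predictor: an incident
# requires BOTH systemically-induced migration of work practices (gradual
# erosion of the defence level) and a triggering event at the same step.
def first_incident_step(defence=1.0, boundary=0.5, erosion=0.05,
                        trigger_steps=frozenset({3, 9, 14})):
    """Return the first step at which a trigger meets an eroded defence."""
    for step in range(1, 20):
        defence -= erosion                       # migration of work practices
        if step in trigger_steps and boundary > defence:
            return step                          # incident: trigger + erosion
    return None                                  # no incident in the horizon

# Triggers at steps 3 and 9 find the defence still above the boundary
# (e.g. 1.0 - 9*0.05 = 0.55); only the trigger at step 14 (defence 0.30)
# coincides with a degraded defence.
print(first_incident_step())  # prints 14
```

Note that with no erosion the same triggers never produce an incident, which is the predictor's point: neither factor alone suffices.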
      <p>In Figure 1, we extend the view of cyber systems to encompass other organizations
in the service and supply chain.</p>
      <p>In Figure 2, we introduce two additional features into the conventional Rasmussen
model. We add the label “Barriers to Learning” to show that social and cultural
factors can impede the pace of organizational change in working practices in response to
threats (i.e., social cognition). We also display the barrier posed by working practices
as a jagged edge to show that such practices can degrade to the point they cross the
security boundary which – due to the constant nature of attacks – can lead to
catastrophic failure.</p>
      <p>In Table 3, we re-write and extend the predictors which can be derived from
Rasmussen’s model (Cassano-Piché, Vicente et al. 2006), both to make the wording match
the vocabulary of cyber security and to take into account the extensions we have
proposed. The major changes are the dual-factor cause of cyber security incidents
(both malicious acts and a breakdown in work practices); the incorporation of the need
for lateral as well as vertical integration; and the consideration of cultural barriers
to organizational learning.</p>
      <p>Developing and Assessing Techniques for Cyber Security Risk Analysis</p>
      <p>The predictive model proposed in Section 4 thus provides us with one possible basis
for assessing and improving cyber security risk analysis and management methods
using socio-technical analysis. We summarize the assessment criteria in Table 4.</p>
      <p>Table 4 – Assessment criteria by factor (header row: Factor; first entry:
Decision-making)</p>
      <p>
        Note that the requirement for dynamic modelling leads to the use of techniques such as
risk narratives as a sense-making account of cyber security behavioral patterns (the
term ‘figuration’ (configuration)
        <xref ref-type="bibr" rid="ref7">(Quintaneiro 2006)</xref>
        or ‘construct’
        <xref ref-type="bibr" rid="ref16">(Taylor and Lerner
1996)</xref>
        may be preferred). This is arguably a natural mode for expressing
socio-technical risks, as exemplified in
        <xref ref-type="bibr" rid="ref9">(McEvoy and Kowalski, Botta, Muldner et al. 2011)</xref>
        , which
can also be captured using techniques such as systems-dynamics diagrams
(Wolstenholme 2003).
      </p>
    </sec>
    <sec id="sec-4">
      <title>Example Assessment - SABSA</title>
      <p>
        SABSA
        <xref ref-type="bibr" rid="ref13">(Sherwood, Clark et al. 2004)</xref>
        is an open-source methodology for creating an
enterprise security architecture.
      </p>
      <sec id="sec-4-1">
        <title>SABSA Layers</title>
        <p>Contextual – Business planning and decision-making, e.g., business risk
assessment, business requirements, organizational and cultural development.</p>
        <p>Conceptual – Business operations, e.g., process development, audit and
reviews, standards and procedures.</p>
        <p>Logical – Security governance, e.g., security policymaking, information
classification, service management, audit trails.</p>
        <p>Physical – Security administration, e.g., development and execution of
security rules, access list maintenance, event log file management.</p>
        <p>Component – Technical capability, e.g., products, tools, project
management, operation of individual systems.</p>
        <p>Table 5 – SABSA Layers</p>
      </sec>
      <sec id="sec-4-2">
        <title>Assessment</title>
        <p>This is not the same as a technical architecture. Rather, it is a layered organizational
blueprint for ensuring cyber security is maintained throughout all parts of the
organization. The layers are described in Table 5.</p>
        <p>
          SABSA makes use of soft systems methodology
          <xref ref-type="bibr" rid="ref13">(Sherwood, Clark et al. 2004)</xref>
          to
analyze and represent requirements and solutions at each level, leading to an
integrated approach to creating secure organizations.
        </p>
        <p>Based on our review above, SABSA does refer to socio-technical factors, but the
coverage is not complete, as the methodology is business-oriented – making it
ultimately policy- and procedure-driven rather than sociologically motivated.</p>
        <p>In Table 6, we provide our assessment of the strengths and weaknesses of the SABSA
risk methodology in relation to our predictive criteria and its dynamic-modelling
capability.</p>
        <p>Table 6 – Assessment of SABSA against the predictive criteria</p>
        <p>Decision-making. Strength: SABSA is quite strong in terms of gathering
decision-making criteria (business requirements) from all parts of the business.
Recommendation: SABSA focuses too narrowly on a single organization and should
consider vertical or horizontal integration of requirements.</p>
        <p>Threat Actors. Strength: SABSA identifies multiple threat actors, both internal
and external, at different layers. Recommendation: the analysis would need to also
consider issues raised by changes in external actors and external pressures over time,
as well as the potential degradation in working practices.</p>
        <p>Integration. Weakness: SABSA does not use methods to promote
intra-organizational integration. Recommendation: lateral integration should be considered.</p>
        <p>Control &amp; Feedback. Strength: SABSA’s approach is likely to promote feedback
on poor decision-making. Recommendation: there should be a focus on the
completeness of feedback mechanisms, both vertically and laterally.</p>
        <p>Work Practices. Strength: SABSA does consider business requirements changing
over time. Recommendation: SABSA should consider historical or current changes to
work practices, particularly in other organizations.</p>
        <p>Security Boundaries. Strength: SABSA is likely to uncover where work practices
have violated security policy. Recommendation: SABSA should seek to uncover the
underlying reasons for the violation, such as pressure on costs or workloads or other
cultural or human factors, as well as breaches in the security boundary.</p>
        <p>Identify New Security Working Practices. Strength: SABSA is likely to identify
the need for new or updated security working practices. Recommendation: barriers to
the adoption of new practices may not be identified in SABSA; it also needs to take
responses to threat intelligence and new technology into consideration.</p>
        <p>Dynamic Modelling. Strength: the use of threat scenarios goes part way to
capturing risk narratives, but accounts of countermeasures are not connective.
Recommendation: SABSA should seek to capture the dynamic aspects of the narrative,
avoid a static ontology of threat capabilities/motivations, and relate countermeasures
to one another to form both defense-in-depth and defense-in-breadth.</p>
        <p>©Copyright held by the author(s)</p>
        <p>In practical terms, many of these issues are easy to address in the SABSA
methodology by incorporating additional layers of analysis into the risk approach – for
example, by considering the disintegrating effect of organizational issues at different
layers in the model and by providing additional means to analyze change
over time.</p>
        <p>It would be much harder to address the same issues in a methodology such as
OCTAVE Allegro where the solution would appear to be to carry out a supplementary
risk analysis from a socio-technical viewpoint in order to capture additional risks.
This kind of multi-standpoint risk assessment approach is starting to become common
in industry, suggesting that the deficiencies in purely technical and procedural
approaches are being recognized and organizations are seeking to deal with them – see
(McEvoy and Kowalski) for an example in the defense sector in the UK.
7</p>
      </sec>
    </sec>
    <sec id="sec-5">
      <title>Conclusions and Future Work</title>
      <p>We believe that current approaches to cyber security risk analysis and management are,
for the most part, inadequate due to the omission of socio-technical systems analysis.</p>
      <p>We propose using socio-technical models to enhance these methodologies, using a
variant of Rasmussen’s model, with SABSA and OCTAVE Allegro as
examples.</p>
      <p>In future, these factors also lead us on to a deeper discussion of how and why
organizations identify risks, what we are trying to protect, and what is regarded as a
threat and why. These questions point to issues of identity and survival which would be difficult
to address in the scope of this paper but lead us to believe that a model which unifies
social learning and socio-technical factors might provide deeper insights into the nature
of risk. We are aware that the very act of augmenting risk models in this manner could
change how we think about risk in such systems, pointing us to underlying factors that
are not addressed in systems design methodologies or even perceived as
relevant (Checkland and Holwell 1998).</p>
      <p>Future work will focus on developing and assessing novel approaches to both risk
identification and risk evaluation and management in line with our findings. We also
intend to develop and improve techniques for the dynamic modelling of socio-technical
risks. In this, we will make use of the simulation of socio-technical systems (Hettinger,
Kirlik et al. 2015) and their management on the Norwegian Cyber Range
(https://www.ntnu.no/ncr).</p>
      <p>Al Sabbagh, B. and S. Kowalski (2015). "A socio-technical framework for threat
modeling a software supply chain." IEEE Security &amp; Privacy 13(4): 30-39.</p>
      <p>Anderson, R. and T. Moore (2007). Information security economics–and beyond.
Annual International Cryptology Conference, Springer.</p>
      <p>Bălan, S. (2010). "M. Foucault's view on power relations." Cogito-Multidisciplinary
Research Journal 2: 55-61.</p>
      <p>BBC (2017). "http://www.bbc.co.uk/news/technology-40875534."</p>
      <p>Berghel, H. (2015). "Cyber chutzpah: The sony hack and the celebration of hyperbole."
Computer 48(2): 77-80.</p>
      <p>Botta, D., et al. (2011). "Toward understanding distributed cognition in IT security
management: the role of cues and norms." Cognition, Technology &amp; Work 13(2):
121-134.</p>
      <p>Cassano-Piché, A., et al. (2006). A sociotechnical systems analysis of the BSE
epidemic in the UK through case study. Proceedings of the Human Factors and
Ergonomics Society Annual Meeting, SAGE Publications: Los Angeles, CA.</p>
      <p>Checkland, P. and S. Holwell (1998). "Action research: its nature and validity."
Systemic Practice and Action Research 11(1): 9-21.</p>
      <p>Ehrenfeld, J. M. (2017). "WannaCry, Cybersecurity and Health Information
Technology: A Time to Act." Journal of Medical Systems 41(7): 104.</p>
      <p>Hettinger, L. J., et al. (2015). "Modelling and simulation of complex sociotechnical
systems: Envisioning and analysing work environments." Ergonomics 58(4): 600-614.</p>
      <p>Hillier (April 2018). Cyber Attack on the NHS. HC 787. House of Commons,
Public Accounts Committee.</p>
      <p>Ionita, D. (2013). Current established risk assessment methodologies and tools,
University of Twente.</p>
      <p>Kleiner, B. H. and W. A. Corrigan (1989). "Understanding organisational change."
Leadership &amp; Organization Development Journal 10(3): 25-31.</p>
      <p>Kowalski, S. (1994). "IT insecurity: a multi-discipline inquiry." Department of
Computer and System Sciences, University of Stockholm and Royal Institute of
Technology, Sweden.</p>
      <p>Leveson, N. (2011). Engineering a safer world: Systems thinking applied to safety, MIT
press.</p>
      <p>Mattei, T. A. (2017). "Privacy, Confidentiality, and Security of Health Care
Information: Lessons from the Recent WannaCry Cyberattack." World neurosurgery
104: 972-974.</p>
      <p>McEvoy, R. and S. Kowalski "Beyond Training and Awareness: From Security Culture
to Security Risk Management."</p>
      <p>
        Schlienger, T. and S.
        <xref ref-type="bibr" rid="ref17">Teufel (2003)</xref>
        . Analyzing information security culture: increased trust by an appropriate
information security culture. Proceedings of the 14th International Workshop on
Database and Expert Systems Applications, IEEE.
      </p>
      <p>Wolstenholme, E. F. (2003). "Towards the definition and use of a core set of archetypal
structures in system dynamics." System Dynamics Review 19(1): 7-26.</p>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>
          <string-name>
            <surname>Adams</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          and
          <string-name>
            <given-names>M. A.</given-names>
            <surname>Sasse</surname>
          </string-name>
          (
          <year>1999</year>
          ).
          <article-title>"Users are not the enemy."</article-title>
          <source>Communications of the ACM</source>
          <volume>42</volume>
          (
          <issue>12</issue>
          ):
          <fpage>40</fpage>
          -
          <lpage>46</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>
          <string-name>
            <surname>Al Sabbagh</surname>
            ,
            <given-names>B.</given-names>
          </string-name>
          and
          <string-name>
            <given-names>S.</given-names>
            <surname>Kowalski</surname>
          </string-name>
          (
          <year>2012</year>
          ).
          <article-title>Developing social metrics for security modeling the security culture of it workers individuals (case study)</article-title>
          .
          <source>Communications, Computers and Applications (MIC-CCA), 2012 Mosharaka International Conference on</source>
          , IEEE.
        </mixed-citation>
      </ref>
      <ref id="ref2a">
        <mixed-citation>
          <string-name>
            <surname>Mohurle</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          and
          <string-name>
            <given-names>M.</given-names>
            <surname>Patil</surname>
          </string-name>
          (
          <year>2017</year>
          ).
          <article-title>"A brief study of wannacry threat: Ransomware attack 2017."</article-title>
          <source>International Journal</source>
          <volume>8</volume>
          (
          <issue>5</issue>
          ).
        </mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>
          <string-name>
            <surname>Mowles</surname>
            ,
            <given-names>C.</given-names>
          </string-name>
          (
          <year>2016</year>
          ).
          <article-title>Rethinking management: Radical insights from the complexity sciences</article-title>
          ,
          <source>Routledge.</source>
        </mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>
          <string-name>
            <surname>Okere</surname>
            ,
            <given-names>I.</given-names>
          </string-name>
          , et al. (
          <year>2012</year>
          ).
          <article-title>Assessing information security culture: A critical analysis of current approaches</article-title>
          .
          <source>Information Security for South Africa (ISSA)</source>
          ,
          <year>2012</year>
          , IEEE.
        </mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>
          <string-name>
            <surname>Oshlyansky</surname>
            ,
            <given-names>L.</given-names>
          </string-name>
          , et al. (
          <year>2007</year>
          ).
          <article-title>Validating the Unified Theory of Acceptance and Use of Technology (UTAUT) tool cross-culturally</article-title>
          .
          <source>Proceedings of the 21st British HCI Group Annual Conference on People and Computers: HCI</source>
          ...
          <article-title>but not as we know it-Volume 2</article-title>
          , BCS Learning &amp; Development Ltd.
        </mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>
          <string-name>
            <surname>Perrow</surname>
            ,
            <given-names>C.</given-names>
          </string-name>
          (
          <year>2011</year>
          ).
          <article-title>Normal Accidents: Living with High Risk Technologies-Updated Edition</article-title>
          , Princeton university press.
        </mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>
          <string-name>
            <surname>Quintaneiro</surname>
            ,
            <given-names>T.</given-names>
          </string-name>
          (
          <year>2006</year>
          ).
          <article-title>"The concept of figuration or configuration in Norbert Elias' sociological theory." Teoria &amp; Sociedade 2(SE): 0-0</article-title>
          .
        </mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>
          <string-name>
            <surname>Rasmussen</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          (
          <year>1997</year>
          ).
          <article-title>"Risk management in a dynamic society: a modelling problem."</article-title>
          <source>Safety Science</source>
          <volume>27</volume>
          (
          <issue>2-3</issue>
          ):
          <fpage>183</fpage>
          -
          <lpage>213</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref9">
        <mixed-citation>
          <string-name>
            <surname>Saleh</surname>
            ,
            <given-names>M. S.</given-names>
          </string-name>
          and
          <string-name>
            <given-names>A.</given-names>
            <surname>Alfantookh</surname>
          </string-name>
          (
          <year>2011</year>
          ).
          <article-title>"A new comprehensive framework for enterprise information security risk management."</article-title>
          <source>Applied Computing and Informatics</source>
          <volume>9</volume>
          (
          <issue>2</issue>
          ):
          <fpage>107</fpage>
          -
          <lpage>118</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref10">
        <mixed-citation>
          <string-name>
            <surname>Salmon</surname>
            ,
            <given-names>P. M.</given-names>
          </string-name>
          , et al. (
          <year>2017</year>
          ).
          <article-title>Human Factors Methods for Accident Analysis</article-title>
          .
          <source>Human Factors Methods and Accident Analysis</source>
          , CRC Press:
          <fpage>29</fpage>
          -
          <lpage>104</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref11">
        <mixed-citation>
          <string-name>
            <surname>Sasse</surname>
            ,
            <given-names>M. A.</given-names>
          </string-name>
          , et al. (
          <year>2001</year>
          ).
          <article-title>"Transforming the 'weakest link'-a human/computer interaction approach to usable and effective security."</article-title>
          <source>BT Technology Journal</source>
          <volume>19</volume>
          (
          <issue>3</issue>
          ):
          <fpage>122</fpage>
          -
          <lpage>131</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref12">
        <mixed-citation>
          <string-name>
            <surname>Schlienger</surname>
            ,
            <given-names>T.</given-names>
          </string-name>
          and
          <string-name>
            <given-names>S.</given-names>
            <surname>Teufel</surname>
          </string-name>
          (
          <year>2003</year>
          ).
          <article-title>"Information security culture-from analysis to change."</article-title>
          <source>South African Computer Journal</source>
          <year>2003</year>
          (
          <volume>31</volume>
          ):
          <fpage>46</fpage>
          -
          <lpage>52</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref13">
        <mixed-citation>
          <string-name>
            <surname>Sherwood</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          , et al. (
          <year>2004</year>
          ).
          <article-title>"Enterprise Security Architecture-SABSA."</article-title>
          <source>Information Systems Security</source>
          <volume>6</volume>
          (
          <issue>4</issue>
          ):
          <fpage>1</fpage>
          -
          <lpage>27</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref14">
        <mixed-citation>
          <string-name>
            <surname>Soomro</surname>
            ,
            <given-names>Z. A.</given-names>
          </string-name>
          , et al. (
          <year>2016</year>
          ).
          <article-title>"Information security management needs more holistic approach: A literature review."</article-title>
          <source>International Journal of Information Management</source>
          <volume>36</volume>
          (
          <issue>2</issue>
          ):
          <fpage>215</fpage>
          -
          <lpage>225</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref15">
        <mixed-citation>
          <string-name>
            <surname>Sütterlin</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          , et al. (
          <year>2018</year>
          ).
          <article-title>Supporting the Human in Cyber Defence</article-title>
          .
          <source>Computer Security: ESORICS 2017 International Workshops, CyberICPS 2017 and SECPRE 2017, Oslo, Norway, September 14-15, 2017, Revised Selected Papers</source>
          , Springer.
        </mixed-citation>
      </ref>
      <ref id="ref16">
        <mixed-citation>
          <string-name>
            <surname>Taylor</surname>
            ,
            <given-names>J. R.</given-names>
          </string-name>
          and
          <string-name>
            <given-names>L.</given-names>
            <surname>Lerner</surname>
          </string-name>
          (
          <year>1996</year>
          ).
          <article-title>"Making sense of sensemaking: How managers construct their organisation through their talk."</article-title>
          <source>Studies in Cultures, Organizations and Societies</source>
          <volume>2</volume>
          (
          <issue>2</issue>
          ):
          <fpage>257</fpage>
          -
          <lpage>286</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref17">
        <mixed-citation>
          <string-name>
            <surname>Schlienger</surname>
            ,
            <given-names>T.</given-names>
          </string-name>
          and
          <string-name>
            <given-names>S.</given-names>
            <surname>Teufel</surname>
          </string-name>
          (
          <year>2003</year>
          ).
          <article-title>Analyzing information security culture: increased trust by an appropriate information security culture</article-title>
          .
          <source>14th International Workshop on Database and Expert Systems Applications.</source>
        </mixed-citation>
      </ref>
      <ref id="ref18">
        <mixed-citation>
          <string-name>
            <surname>Wahlgren</surname>
            ,
            <given-names>G.</given-names>
          </string-name>
          , et al. (
          <year>2013</year>
          ).
          <article-title>A framework for selecting IT security risk management methods based on ISO27005</article-title>
          .
          <source>6th International Conference on Communications, Propagation and Electronics</source>
          , Kenitra, Morocco. Academy Publisher.
        </mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>