    On the Proposal of a Unified Safety Framework
        for Industry 4.0 Multi-Robot Scenario
               (DISCUSSION PAPER)

                  Loris Roveda1 , Blerina Spahiu2 , Walter Terkaj3
1
Istituto Dalle Molle di studi sull’Intelligenza Artificiale (IDSIA), Scuola universitaria
professionale della Svizzera italiana (SUPSI), Università della Svizzera italiana (USI),
                     Via Cantonale 2c - 6928 Manno, Switzerland
                                loris.roveda@idsia.ch
         2
           University of Milano-Bicocca, Viale Sarca, 336 - 20126 Milan, Italy
                              blerina.spahiu@unimib.it
  3
    Institute of Industrial Technologies and Automation (ITIA) of Italian National
              Research Council (CNR), via Corti, 12 - 20133 Milan, Italy
                             walter.terkaj@itia.cnr.it

        Abstract. Within the context of Industry 4.0, the presence of robotic
        systems inside the production plant is increasing, thus leading to the
        need for appropriate safety rules enhancing human-robot interaction.
        However, without a systematic approach, mapping all the possible sce-
        narios becomes critical. This paper aims at laying the foundations for
        a unified safety-based semantic approach in cooperative multi-robot in-
        dustrial environments. The objective of this study is to investigate an
        approach to organize, store and re-use safety knowledge by
        means of defining a new ontology and the construction of rules for safety
        application.




1     Introduction
The Industry 4.0 paradigm is driving the innovation of manufacturing and pro-
duction processes [6] and robotics is a key element enabling such epochal change
[11]. The increasing use of robotic systems in industries is enhancing human-
robot interaction (i.e., HRI) [13]. Safety is becoming a critical issue since humans
and robots interact while sharing working space and tasks. Indeed, standards
have already been developed to regulate safety (e.g., ISO 10218 [5]). On the basis of
such standards, many works investigate the definition of safety algorithms and
rules for very specific use-cases, such as for the design of industrial work-cells,
physical human-robot interaction for the execution of cooperative applications,
etc. However, even if safety rules can be found in the standards, their application
is still not straightforward in real industrial environments and no contribution
    Copyright © 2019 for the individual papers by the papers' authors. Copying permit-
    ted for private and academic purposes. This volume is published and copyrighted by
    its editors. SEBD 2019, June 16-19, 2019, Castiglione della Pescaia, Italy.
to a unified safety framework has been proposed to address the complexity of
a production plant (i.e., the set of robots, sensors, human workers, tasks, tools
and obstacles).
The lack of knowledge related to the tool in use by the robot while cooper-
ating with a human operator may result in unpredictable accidents (e.g., if the
robot is manipulating a bulky and heavy part with a mechanical gripper while
cooperating with a human operator, such information is important to define
workspace limitations, velocity limits, etc.). Since in common industrial
applications such knowledge is not yet available, robots are used below their
achievable performance (in order to always ensure safety), resulting in reduced
production capability. The combination of Semantic Web technologies and
ontologies (which define the concepts of a certain domain and the relations
between them) can make such knowledge available and usable.
    The aim of this paper is to fill this gap and to present initial work toward
a unified knowledge framework for the definition of safety rules in production
environments by using semantic technologies. Our contributions can be summa-
rized as follows: (i) the definition of a framework for the specific case of safety
rules; (ii) the development of an ontology to describe the concepts of the working
space; (iii) a case study related to a cooperative installation task.
This paper is organized as follows: the application scenarios are introduced in
Section 2, while the state of the art is presented in Section 3. The proposed frame-
work is described in Section 4, and the conclusions end the paper in Section 5.


2    Application Scenarios

More and more industries are innovating the assembly processes of their products
by means of advanced human-robot collaborative solutions, i.e., i) a Lightweight
Mobile Arm (LMA) to perform autonomous part transportation and installation
tasks, ii) an empowering robot to perform installation tasks, and iii) a sensing
solution for cluttered environments to identify human workers inside the working
scene.
    Considering such a multi-robot environment, the definition of a unified safety
framework is mandatory. We define three scenarios related to the multi-robot
collaborative environment.
    Scenario 1 - Exoskeleton Empowering Human Worker: exoskeletons,
physically connected to the worker, are used to relieve humans from heavy tasks.
    Scenario 2 - Empowering Robot for Cooperative Installation Task:
the empowering robot is used to install bulky/heavy components (e.g., hatracks).
Such a manipulator, equipped with a force sensor, a mechanical gripper and a vi-
sion system (to track humans), physically interacts with the human operator.
    Scenario 3 - LMA for Autonomous Installation Task: the LMA is
used to autonomously install medium-size components (e.g., side-wall panels).
Such a manipulator, equipped with a force sensor, a mechanical gripper and a
vision system (to track humans), should not interact with humans during this task.
3   Related Work



A plethora of EU-funded projects have investigated and are currently investigat-
ing safety issues at different levels (physical interaction, workspace design, etc.).
In the Saphari project1, collision avoidance and prevention algorithms have been
developed to activate collaboration by using human gestures and voice com-
mands [3]. This safety control framework was tested on the KUKA LWR-IV
lightweight robot [3]. The H2020 COVR project2 investigates the development of
an intuitive toolkit and a range of testing protocols for the validation of cobot
safety.
Standards (e.g., ISO 10218-1:2011 [5]) have been defined to regulate human-robot
collaborative modalities in industrial plants. EU-funded projects3,4,5, together
with dedicated research activities [2, 7], have enhanced such developments. Al-
though many works covering safety-related topics can be found in the state of
the art, only a few contributions are devoted to defining a unified safety framework
[8, 10]. To overcome this limitation in the industrial context, some works exploit
safety representations based on semantic technologies, i.e., they represent safety
as an ontology that includes the main concepts of the safety standards and the
relationships between those concepts. Building on this representation format,
some works exploit semantic technology capabilities for safety assurance and
certification. The application of semantic technologies to improve safety in working
environments, e.g., by developing ontologies that represent safety concepts and
the relations between them, has been studied in [4]. Specifically for Industry 4.0,
the authors in [1] present a method to design new instances of collaborative cells,
taking into account ISO/TS 15066 and extending the CORA (Core Ontologies for
Robotics and Automation) ontology. However, this work is limited to specific
cooperative work-cells and it does not take into account the complete and complex
production plant environment. The work in [12] proposes to integrate task-level
planning with semantically represented workplace safety rules, but only a specific
application is considered. The authors of [15] address how an Artificial Intelligence
technique like Answer Set Programming (ASP) can be applied to support the
planning of mobile robots, while explicitly modeling rigid knowledge, time-dependent
internal knowledge, time-dependent external knowledge, and action knowledge.
However, from the analysis of the state of the art, to the best of our knowledge,
no contribution has been proposed to define a unified safety framework considering
the whole production process involving human-robot cooperation.


1 http://www.saphari.eu/
2 http://safearoundrobots.com/getcovr
3 http://safearoundrobots.com/getcovr
4 http://www.saphari.eu/
5 http://www.xact-project.eu/
4     Proposed Approach

The integration of heterogeneous knowledge is required to ensure safety. For in-
stance, combining data coming from the sensors and from the environment where
the robot is operating makes it possible to understand the context in which the
robot is working and, consequently, to define appropriate behaviors and safety
rules. However, safety also depends on the specific task assigned to the robot and
on the tools that the robot is exploiting. This paper proposes a unified robot
knowledge approach that considers knowledge coming from sensory data together
with context information. Taking inspiration from [8], we introduce an ontology
(to conceptualize the environment where the robot is working and to formalize
the knowledge accessible to the robot in terms of logical rules) with a novel focus
on the preservation of safety in the production plant. The robot's knowledge for
safety is enhanced by combining data coming from sensors with knowledge obtained
from reasoning. The framework built on top of such an approach is shown in
Figure 1. This flow is bidirectional, thus enabling the system to learn from
previous situations and to take advantage of them for further decisions.




                    Fig. 1. Unifying framework for safety rules.



4.1   Knowledge Description

A taxonomy provides an ontological structure for human understanding, defin-
ing the arrangement of things of interest in a hierarchical structure. Figure 2
shows the taxonomy proposed within a scenario where human and robot should
collaborate. This hierarchy takes inspiration from the ifcOWL ontology [9], i.e.,
the Web Ontology Language (OWL) version of the Industry Foundation Classes
(IFC) standard, a widespread open model for the information exchange of Build-
ing Information Modeling (BIM) data. Such a description therefore allows the
definition of axioms aimed at identifying the specific cases and the related safety
rules for safe human-robot cooperation inside the production environment. In
such a taxonomy we unify knowledge about context, obstacles, robots, space, tools,
people, sensors and tasks to enhance safety for physical human-robot interaction,
workspace sharing, autonomous navigation, and tool and part manipulation. The
considered working scenario can be modeled using the following key classes (a
minimal encoding sketch is given after the list):

 – Object: a generic tangible (e.g., physical product) or intangible item (e.g.,
   process) that is used to define the global spatial/temporal scenario of the
   human-robot collaboration (cf. class IfcObject in ifcOWL);
 – Person: defining the human operators that can be present in the robot working
   space (cf. class IfcActor in ifcOWL);
 – Task: defining the actions that the robot/human can perform (e.g., manip-
   ulation tasks, assembly tasks, etc.) (cf. class IfcTask in ifcOWL);
 – Product: any object that relates to a geometric or spatial context (cf. class
   IfcProduct in ifcOWL);
 – SpatialElement: any spatial element that might be used to define a spatial
   structure or spatial zones (cf. class IfcSpatialElement in ifcOWL);
 – Space: defining the specific location where the robot is working, e.g., assembly
   lines, storage areas, corridors, etc. (cf. class IfcSpace in ifcOWL);
 – Element: a physically existing object that can be characterized by a placement
   and a 3D representation (cf. class IfcElement in ifcOWL);
 – BuildingElement: any type of static element that may be an obstacle for
   the robot, e.g., walls, doors, etc. (cf. class IfcBuildingElement in ifcOWL);
 – Robot: defining the robotic systems working inside the production plant, e.g.
   mobile platforms, lightweight manipulators, exoskeletons, etc.;
 – Tool: defining the specific tool in use by the robot to perform a specific task,
   e.g., mechanical gripper, screwdriver, etc.;
 – Sensor: defining different external sensors that the robot can use to perceive
   the environment, e.g. vision systems, force sensors, etc. (cf. class IfcSensor
   in ifcOWL, class sosa:Sensor in SSN/SOSA ontology [14]).
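
As a purely illustrative sketch (not part of the implementation of the proposed
framework), the class hierarchy above could be encoded in Python with the rdflib
library; the namespace URI and the subclass links below are simplifying assump-
tions and do not reproduce the exact ifcOWL hierarchy.

# Minimal sketch: encode the key classes of the taxonomy with rdflib.
# The namespace and the subclass links are assumptions for illustration only.
from rdflib import Graph, Namespace
from rdflib.namespace import RDF, RDFS, OWL

SAFE = Namespace("http://example.org/safety#")  # hypothetical namespace
g = Graph()
g.bind("safe", SAFE)

classes = ["Object", "Person", "Task", "Product", "SpatialElement", "Space",
           "Element", "BuildingElement", "Robot", "Tool", "Sensor"]
for c in classes:
    g.add((SAFE[c], RDF.type, OWL.Class))

# Simplified hierarchy inspired by ifcOWL (assumed, not normative)
links = [("Person", "Object"), ("Task", "Object"), ("Product", "Object"),
         ("SpatialElement", "Product"), ("Space", "SpatialElement"),
         ("Element", "Product"), ("BuildingElement", "Element"),
         ("Robot", "Element"), ("Tool", "Element"), ("Sensor", "Element")]
for child, parent in links:
    g.add((SAFE[child], RDFS.subClassOf, SAFE[parent]))

print(g.serialize(format="turtle"))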

The workspace is modeled by the tools, the space where they are located and the
context where they are used. In fact, a tool can be recognized not only by its
characteristics, such as shape, size and material, but also by the spatial context
where it is located. Moreover, the knowledge of the robots, the sensors, the
obstacles and the human operators inside the working scenario, and of the tasks
allocated to the robots, is fundamental to model the working scene. Such an
ontology definition can therefore be applied to human-robot cooperation in
industrial environments, where both physical and non-physical cooperation is
required.
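
Continuing the sketch above, a concrete working scene could then be asserted as
individuals linked by object properties; all individual and property names below
(e.g., usesTool, locatedIn) are hypothetical and introduced only for illustration.

# Minimal sketch (continuing the previous one): assert one working scene.
# All individual and property names are hypothetical.
from rdflib import Graph, Namespace
from rdflib.namespace import RDF

SAFE = Namespace("http://example.org/safety#")
g = Graph()

g.add((SAFE.lma1, RDF.type, SAFE.Robot))
g.add((SAFE.gripper1, RDF.type, SAFE.Tool))
g.add((SAFE.camera1, RDF.type, SAFE.Sensor))
g.add((SAFE.installationArea, RDF.type, SAFE.Space))
g.add((SAFE.operatorA, RDF.type, SAFE.Person))
g.add((SAFE.installHatrack, RDF.type, SAFE.Task))

g.add((SAFE.lma1, SAFE.usesTool, SAFE.gripper1))         # tool in use
g.add((SAFE.lma1, SAFE.hasSensor, SAFE.camera1))         # perception
g.add((SAFE.lma1, SAFE.locatedIn, SAFE.installationArea))
g.add((SAFE.lma1, SAFE.performsTask, SAFE.installHatrack))
g.add((SAFE.operatorA, SAFE.locatedIn, SAFE.installationArea))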


4.2   Knowledge Association

Knowledge association creates and describes the relationships between ontology
classes and properties by means of axioms. Logical inference can be exploited to
automatically generate new knowledge starting from generic axioms and specific
instances.


                   Fig. 2. Taxonomy of the working environment.


Such a framework enables the robot to perceive the environment and the context
in which it is operating, so that it becomes easier to avoid obstacles and collisions
and to process knowledge coming from other sensors and from humans. The
axioms are defined using Description Logic (DL) and can also be applied to verify
that data are compliant with the ontology schema. Logic representations are
defined to identify the specific working scene in which the robot is operating and,
consequently, to derive the related safety rules from the standards. Since OWL
reasoning is monotonic because of the Open World Assumption, the exploitation
of non-monotonic logic languages (e.g., ASP) will be taken into consideration to
support the application scenarios.
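
As a purely illustrative example of such an axiom (all concept and role names are
hypothetical and not taken from the standards), a scene requiring the safe-tool
cooperation rule of Section 4.3 could be defined in DL as

   SafeCoopScene ≡ Scene ⊓ ∃involvesRobot.(Robot ⊓ ∃usesTool.SafeTool)
                 ⊓ ∃hasSensor.ForceSensor ⊓ ∃involvesPerson.Person

A DL reasoner would then classify every scene instance satisfying these restrictions
under SafeCoopScene, from which the corresponding safety rule can be retrieved.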


4.3   Application to Safety

As an example, we detail the three scenarios introduced in Section 2 and present
the logic representations used to define the safety rules for Scenario 2.


Scenario 1 - Exoskeleton Empowering Human Worker: On the basis
of the exoskeleton's internal sensor information (e.g., encoders, torque sensors,
etc.), it is possible to identify safety-critical situations (such as critical human
postures). Moreover, on the basis of the knowledge of the executed task and of
the involved tools, it is possible to check online which safety rules must be applied
during the task execution, as sketched below.
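
The following minimal Python sketch illustrates such an online check; the joint
limits, the data structures and the threshold values are assumptions made for
illustration and would in practice come from the exoskeleton specifications and
from the task/tool knowledge.

# Minimal sketch: flag a safety-critical posture from exoskeleton joint data.
# Thresholds and data structures are hypothetical.
from dataclasses import dataclass
from typing import List

@dataclass
class JointReading:
    angle_rad: float   # measured by the joint encoder
    torque_nm: float   # measured by the joint torque sensor

MAX_TORQUE_NM = 40.0            # assumed limit (task/tool dependent)
ANGLE_RANGE_RAD = (-1.0, 2.0)   # assumed admissible range per joint

def posture_is_critical(joints: List[JointReading]) -> bool:
    # Return True if any joint violates the assumed safety envelope.
    for j in joints:
        if abs(j.torque_nm) > MAX_TORQUE_NM:
            return True
        if not (ANGLE_RANGE_RAD[0] <= j.angle_rad <= ANGLE_RANGE_RAD[1]):
            return True
    return False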


Scenario 2 - Empowering Robot for Cooperative Installation Task:
the complete working scene can be defined from the information related to the
tool used by the robot, the knowledge about the human operator's motion and
the safety features implemented by the robot. The related safety rules can then
be identified and applied. The examples described by Algorithms 1 and 2 can be
exploited, considering the physical HRI installation scenario. Algorithm 1 considers
the use of a safe tool, while Algorithm 2 considers the use of a non-safe tool. The
two cases have to apply different safety rules, since the non-safe tool introduces
higher safety risks; a minimal executable sketch of this rule selection is given
after Algorithm 2.
Algorithm 1 Scenario 2 - Physical Human-Robot Cooperation with Safe Tool
1: if Robot in Installation Area &&
2:    Physical Human-Robot Cooperation Task for Installation &&
3:    Force Sensor for Human-Robot Cooperation &&
4:    Safe Tool &&
5:    Human Operator Tracked by Camera Sensor &&
6:    Low-Level Safety Emergency Stop then
7: Apply Safety Rule for Physical HRI Considering Safe Tool and Low-Level
   Safety Stop Features

Algorithm 2 Scenario 2 - Physical Human-Robot Cooperation with Non-Safe Tool
1: if Robot in Installation Area &&
2:    Physical Human-Robot Cooperation Task for Installation &&
3:    Force Sensor for Human-Robot Cooperation &&
4:    Non-Safe Tool &&
5:    Human Operator Tracked by Camera Sensor &&
6:    Low-Level Safety Emergency Stop then
7: Apply Safety Rule for Physical HRI Considering Non-Safe Tool and Low-Level
   Safety Stop Features
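
The rule selection of Algorithms 1 and 2 can be made executable as in the following
minimal Python sketch; the scene facts are assumed to be produced by the
ontology-based reasoning of Section 4.2, and the rule names are placeholders.

# Minimal sketch: executable counterpart of Algorithms 1 and 2.
# Field and rule names are placeholders; facts come from the reasoning layer.
from dataclasses import dataclass
from typing import Optional

@dataclass
class SceneFacts:
    robot_in_installation_area: bool
    physical_hri_installation_task: bool
    force_sensor_available: bool
    tool_is_safe: bool
    human_tracked_by_camera: bool
    low_level_emergency_stop: bool

def select_safety_rule(scene: SceneFacts) -> Optional[str]:
    # Mirror of Algorithms 1 and 2: pick the safety rule for Scenario 2.
    preconditions = (scene.robot_in_installation_area
                     and scene.physical_hri_installation_task
                     and scene.force_sensor_available
                     and scene.human_tracked_by_camera
                     and scene.low_level_emergency_stop)
    if not preconditions:
        return None   # the preconditions shared by both algorithms are not met
    if scene.tool_is_safe:
        return "Physical HRI rule with safe tool and low-level safety stop"
    return "Physical HRI rule with non-safe tool and low-level safety stop"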



Scenario 3 - Autonomous Installation Task Performed by Lightweight
Manipulator: since the target application is supposed to be autonomous, no
physical interaction between the robot and human operators should be estab-
lished during the task execution. However, the human operator may enter the
working area of the manipulator. Therefore, one of the key aspects in defining
the safety rules to be applied in such a case is the possibility to track the human
motion. The complete working scene can be defined from the information related
to the tool in use by the robot, the knowledge about the human operator's
position and the safety features implemented by the robot. In this way, the
related safety rules can be identified and applied.


5   Conclusions
In this paper a unified safety framework for the Industry 4.0 environment is
proposed, including (i) the definition of the framework, (ii) the development of
an ontology to describe the concepts of the working space, and (iii) the definition
of safety rules based on the use cases. A safety application, concerning the
assembly of heavy products in industry, is detailed to describe the adopted
approach. The proposed ontology will be further developed by integrating existing
ontology modules and by extending it in the human-robot collaboration context.


Acknowledgments
The work has been partially developed within the H2020 EUROBENCH STEP-
bySTEP project.
References
 1. D. Antonelli and G. Bruno. Ontology-based framework to design a collaborative
    human-robotic workcell. In Working Conference on Virtual Enterprises, pages
    167–174. Springer, 2017.
 2. S. Brending, M. Lawo, J. Pannek, T. Sprodowski, P. Zeising, and D. Zimmermann.
    Certifiable software architecture for human robot collaboration in industrial pro-
    duction environments. IFAC-PapersOnLine, 50(1):1983–1990, 2017.
 3. A. De Luca and F. Flacco. Integrated control for phri: Collision avoidance, de-
    tection, reaction and collaboration. In Biomedical Robotics and Biomechatronics
    (BioRob), 2012 4th IEEE RAS & EMBS International Conference on, pages 288–
    295. IEEE, 2012.
 4. B. Gallina and Z. Szatmári. Ontology-based identification of commonalities and
    variabilities among safety processes. In International Conference on Product-
    Focused Software Process Improvement, pages 182–189. Springer, 2015.
 5. ISO 10218-1:2011: Robots and robotic devices–safety requirements for industrial
    robots–part 1: Robots. Standard, International Organization for Standardization,
    Geneva, CH, 2011.
 6. H. Lasi, P. Fettke, H.-G. Kemper, T. Feld, and M. Hoffmann. Industry 4.0. Business
    & Information Systems Engineering, 6(4):239–242, 2014.
 7. P. A. Lasota, T. Fong, J. A. Shah, et al. A survey of methods for safe human-robot
    interaction. Foundations and Trends in Robotics, 5(4):261–349, 2017.
 8. G. H. Lim, I. H. Suh, and H. Suh. Ontology-based unified robot knowledge for
    service robots in indoor environments. IEEE Transactions on Systems, Man, and
    Cybernetics-Part A: Systems and Humans, 41(3):492–509, 2011.
 9. P. Pauwels, T. Krijnen, W. Terkaj, and J. Beetz. Enhancing the ifcOWL ontology
    with an alternative representation for geometric data. Automation in Construction,
    80:77 – 94, 2017.
10. S. Ramanathan, A. Kamoun, and C. Chassot. Ontology-based collaborative frame-
    work for disaster recovery scenarios. In Enabling Technologies: Infrastructure for
    Collaborative Enterprises (WETICE), 2012 IEEE 21st International Workshop on,
    pages 104–106. IEEE, 2012.
11. M. Rüßmann, M. Lorenz, P. Gerbert, M. Waldner, J. Justus, P. Engel, and M. Har-
    nisch. Industry 4.0: The future of productivity and growth in manufacturing in-
    dustries. Boston Consulting Group, 9, 2015.
12. A. Shafei, J. Hodges, and S. Mayer. Ensuring workplace safety in goal-based
    industrial manufacturing systems.
13. P. Tavares, J. A. Silva, P. Costa, G. Veiga, and A. P. Moreira. Flexible work cell
    simulator using digital twin methodology for highly complex systems in industry
    4.0. In Iberian Robotics conference, pages 541–552. Springer, 2017.
14. W3C.        Semantic Sensor Network Ontology, 2017.              Available online:
    https://www.w3.org/TR/vocab-ssn/ (Last accessed on 10 June 2018).
15. F. Yang, P. Khandelwal, M. Leonetti, and P. Stone. Planning in answer set pro-
    gramming while learning action costs for mobile robots. In AAAI Spring 2014
    Symposium on Knowledge Representation and Reasoning in Robotics (AAAI-SSS),
    March 2014.