Supporting Safety Assessment in Human-Robot Collaboration using Process Models

Philipp Kranz, Shaza Elbishbishy, Jeshwitha Jesus Raja and Marian Daun
Technical University of Applied Sciences Würzburg-Schweinfurt, Schweinfurt, Germany

Abstract
Future manufacturing scenarios increasingly rely on human-robot collaboration, where safety is a critical concern. To ensure safe collaboration in industrial automation, the underlying assembly process must be adequately considered. Therefore, we present a framework that uses graphically annotated process models to visualize safety hazards in collaborative assembly processes, providing experts with an easy-to-understand tool for their analysis.

Keywords
Safety Assessment, Process Models, BPMN, Human-Robot Collaboration, Collaborative Robot

ER2024: Companion Proceedings of the 43rd International Conference on Conceptual Modeling: ER Forum, Special Topics, Posters and Demos, October 28-31, 2024, Pittsburgh, Pennsylvania, USA
Email: philipp.kranz@thws.de (P. Kranz); shaza.elbishbishy@study.thws.de (S. Elbishbishy); jeshwitha.jesusraja@study.thws.de (J. Jesus Raja); marian.daun@thws.de (M. Daun)
ORCID: 0000-0002-1057-4273 (P. Kranz); 0009-0002-0975-272X (S. Elbishbishy); 0009-0008-7886-7081 (J. Jesus Raja); 0000-0002-9156-9731 (M. Daun)
© 2024 Copyright for this paper by its authors. Use permitted under Creative Commons License Attribution 4.0 International (CC BY 4.0).

1. Introduction

Robotic systems play a vital role in making manufacturing processes more flexible, with collaborative robots (cobots) increasingly being used to semi-automate production tasks. In these settings, humans and cobots share the same workspace, operate in close vicinity, spatially overlap in their actions, and collaboratively work on the same task. A significant challenge in human-robot collaboration (HRC) is therefore ensuring safety.

Research has found model-based approaches to be a promising paradigm for fostering early analyses and safety assessment of robotic systems (e.g., [1, 2]). However, for robotic systems used in production, the production process in particular needs to be considered for safety analysis, as it strongly influences overlaps and potential mishaps in the interaction between human and cobot. Therefore, as a first step, we propose using annotated process models to assess the safety of HRC. In this paper, the Business Process Model and Notation (BPMN) is used to (a) specify the production process executed by cobots and humans, and (b) assess the safety of the HRC using graphical annotations embedded in the process model. BPMN models thereby help capture the temporal dependencies between different process steps, which are crucial for safety assessment.

2. Related Work

In recent years, the rise of cobots interacting with humans in shared environments has raised safety concerns about HRC systems. In software and systems engineering, model-based approaches aid in managing complex development. Daun et al. [2] demonstrated the use of goal models for early safety analyses of HRC systems. Awad et al. [3] focused on model-driven risk assessment to identify workplace hazards and estimate the impact of safety measures. Safety modeling in manufacturing, especially under the Industry 4.0 paradigm, is crucial as failures can lead to significant harm [4].
Process models, particularly those based on the BPMN 2.0 standard, have been proposed to model collaborative behaviors and to ensure safety in such interactions. Corradini et al. introduce an approach that integrates formal verification techniques into BPMN collaboration models to ensure software quality and safety, demonstrating BPMN’s applicability in safety-critical contexts [5]. Additionally, Corradini et al. propose collaboration diagrams based on BPMN to model multi-robot collaboration and highlight the potential for extending these models to HRC [6]. The BPMN standard has also been integrated with other modeling techniques to enhance safety management. For example, Mohammadi and Heisel developed a framework combining i* with BPMN to analyze trustworthiness and safety requirements in collaborative operations, emphasizing how BPMN can be used to model responses to safety constraint violations [7].

3. Using Process Models to Support Safety Assessment in Human-Robot Collaboration

Safety analysis is a critical and complex step in planning HRC assembly sequences. This is particularly important as most hazards occur during process operation and maintenance [8]. To ensure human safety, the process must be systematically analyzed in detail to prevent overlooking any safety risks. The identification and severity of these risks significantly influence the selection of appropriate mitigation strategies and determine whether the process is suitable for close human-robot collaboration. Early identification of potential exclusion criteria in the planning process is therefore highly beneficial. The use of BPMN can address this need by providing a detailed process analysis that captures these critical dimensions, thereby reducing the likelihood of overlooking relevant safety risks.

To improve safety analysis in HRC, we propose the use of graphically annotated BPMN process models. We adapt the security-oriented BPMN extension of Salnitri et al. [9] to safety risks in the area of HRC. Specific safety risks are mapped to the corresponding assembly steps within the BPMN model, providing a detailed and systematic approach to analyzing safety. This use of BPMN ensures that safety considerations are thoroughly integrated into the process design, allowing for better identification and mitigation of potential risks. Table 1 shows graphical warning signs for eight common HRC safety risks that are used to annotate the BPMN models. In addition, a multiple-risks sign is introduced to indicate safety risks in sub-processes of the BPMN models. Figure 1 shows the use of the BPMN process model to visually highlight safety risks directly at the process steps in which they might be triggered. The example shown in the figure is taken from an assembly process for toy pickup trucks.

Table 1: Graphical warnings of common safety risks in HRC assembly with their respective descriptions (symbols were generated using the AI tool OpenAI DALL-E 2).

The BPMN model provides a detailed representation of the collaborative assembly process. The cobot initiates the process by picking and placing the load carrier, cabin, chassis, and front axle upside down in an assembly bracket. Meanwhile, the human operator prepares the axle holders by inserting two screws into each holder. The operator then fixes the front axle with the prepared axle holders using an electric screwdriver. This process is repeated for the back axle.
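How such warning-sign annotations could be attached to the process model in a machine-readable way can be illustrated with a small sketch. BPMN 2.0 provides extensionElements as its standard hook for tool-specific information; the following Python fragment is a minimal, illustrative sketch only, not part of the presented framework, and the safety namespace, element, and attribute names are assumptions chosen for the example:

```python
import xml.etree.ElementTree as ET

# Standard BPMN 2.0 model namespace; the "safety" namespace below is a
# hypothetical extension namespace assumed for this illustration only.
BPMN_NS = "http://www.omg.org/spec/BPMN/20100524/MODEL"
SAFETY_NS = "http://example.org/hrc-safety"

ET.register_namespace("bpmn", BPMN_NS)
ET.register_namespace("safety", SAFETY_NS)

def annotate_task(task: ET.Element, risk_type: str, severity: str) -> None:
    """Attach a safety-risk annotation to a BPMN task or sub-process element."""
    ext = task.find(f"{{{BPMN_NS}}}extensionElements")
    if ext is None:
        ext = ET.SubElement(task, f"{{{BPMN_NS}}}extensionElements")
    ET.SubElement(ext, f"{{{SAFETY_NS}}}risk",
                  {"type": risk_type, "severity": severity})

# Example: mark the cobot's "Picks and places load carrier" sub-process with a
# collision risk; the risk types would mirror the warning signs of Table 1.
sub_process = ET.Element(
    f"{{{BPMN_NS}}}subProcess",
    {"id": "pick_place_load_carrier", "name": "Picks and places load carrier"},
)
annotate_task(sub_process, risk_type="collision", severity="high")
print(ET.tostring(sub_process, encoding="unicode"))
```

A modeling tool could then render such machine-readable annotations as the graphical warning signs of Table 1; a representation of this kind would also be a prerequisite for the automated annotation discussed as future work.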
The robot’s five pick-and-place operations can be broken down further to assign risks to more specific actions; they are therefore shown as sub-processes. “Picks and places load carrier” exemplifies the subdivision of a process into the actions of reaching, grasping, bringing, and releasing, with the specific safety risks that can occur in each case. The warning signs embedded in the BPMN model allow safety risks to be directly associated with specific assembly steps and are easier for the user to understand than textual annotations [10]. Because the process flow is depicted, the presented BPMN models can also be used to highlight successive risks, which is particularly interesting when those risks influence each other. For example, in our toy truck use case, a possible communication breakdown is immediately followed by the risk of a collision between the operator and the robot. The lack of communication can significantly increase the likelihood of such a collision.

4. Conclusion

This paper presents a systematic safety analysis for industrial HRC that integrates safety risks into BPMN process models. Unlike other approaches that may only address specific aspects of HRC, our approach integrates safety analysis from the early stages of development. Our annotated BPMN models provide a structured notation that meticulously captures safety risks for each task and for the entire sequence, offering an easy-to-use tool for safety professionals. Our approach emphasizes using BPMN as a proactive tool for safety assessment. By visualizing the process flow in BPMN, not only potential safety risks of specific steps but also successive risks become more apparent. This enables early identification and mitigation of risks in HRC, improving the overall safety and reliability of the production process from the outset.

In future work, we want to enhance the informative value of our annotation by emphasizing the severity of a risk or a series of risks through a color-coding system. Furthermore, we aim to determine whether automated annotation is feasible for specific BPMN components, such as a communication error for the message symbol.

References

[1] A. Wortmann, O. Barais, B. Combemale, M. Wimmer, Modeling languages in Industry 4.0: an extended systematic mapping study, Software and Systems Modeling 19 (2020) 67–94.
[2] M. Daun, M. Manjunath, J. Jesus Raja, Safety analysis of human robot collaborations with GRL goal models, in: International Conference on Conceptual Modeling, Springer, 2023, pp. 317–333.
[3] R. Awad, M. Fechter, J. van Heerden, Integrated risk assessment and safety consideration during design of HRC workplaces, in: 2017 22nd IEEE International Conference on Emerging Technologies and Factory Automation (ETFA), IEEE, 2017, pp. 1–10.
[4] Z. Liu, K. Xie, L. Li, Y. Chen, A paradigm of safety management in Industry 4.0, Systems Research and Behavioral Science 37 (2020) 632–645.
[5] F. Corradini, F. Fornari, A. Polini, B. Re, F. Tiezzi, A. Vandin, A formal approach for the analysis of BPMN collaboration models, Journal of Systems and Software 180 (2021) 111007.
[6] F. Corradini, S. Pettinari, B. Re, L. Rossi, F. Tiezzi, A BPMN-driven framework for multi-robot system development, Robotics and Autonomous Systems 160 (2023).
[7] N. G. Mohammadi, M. Heisel, A framework for systematic analysis and modeling of trustworthiness requirements using i* and BPMN, in: Trust, Privacy and Security in Digital Business: 13th International Conference, TrustBus 2016, Porto, Portugal, September 7-8, 2016, Proceedings 13, Springer, 2016, pp. 3–18.
[8] K. Lee, J. Shin, J.-Y. Lim, Critical hazard factors in the risk assessments of industrial robots: causal analysis and case studies, Safety and Health at Work 12 (2021) 496–504.
[9] M. Salnitri, F. Dalpiaz, P. Giorgini, Designing secure business processes with SecBPMN, Software & Systems Modeling 16 (2017) 737–757.
[10] D. Moody, The “physics” of notations: toward a scientific basis for constructing visual notations in software engineering, IEEE Transactions on Software Engineering 35 (2009) 756–779.

Figure 1: BPMN process model for the collaborative assembly of a toy pickup truck. The safety risks of the individual assembly steps are annotated with graphical warnings to provide the expert with a quick overview and to support the safety assessment (symbols were generated using the AI tool OpenAI DALL-E 2).