The Future of Proxemic Interaction in Smart Factories

DONOVAN TOURE, Daimler AG, Germany
ROBIN WELSCH, LMU Munich, Germany
SVEN MAYER, LMU Munich, Germany

Digitalization in smart factories allows virtual and physical asset data, as well as process data, to be connected throughout their lifecycles. Here, digital twins mirror the behavior of physical assets and can simulate their spatiotemporal status. The work systems that employ digital twins have yet to address in-situ information representation for workers and ways to mitigate task information overload. Thus, the key is to present relevant information only when and where it is needed. We propose proxemic interaction patterns, i.e., using the distance from the user to the device or between devices, for visualizing this data. Here, we outline how scaling the amount and type of augmented reality visualization could be realized using the distance, angle, and orientation of users. We first showcase possible scenarios of how proxemic interaction can support workers in smart factories. We then highlight challenges and opportunities when using proxemic interaction in industrial settings such as manufacturing and warehousing. Finally, we present possible future investigations concerning proxemic interactions in the context of a smart factory.

CCS Concepts: • Human-centered computing → Mixed / augmented reality; • Applied computing → Industry and manufacturing.

Additional Key Words and Phrases: Smart Factory, Cyber-Physical Systems, Proxemics, Egocentric Interaction, Information Management, Augmented Reality, Industry 4.0, Big Data Visualization

ACM Reference Format:
Donovan Toure, Robin Welsch, and Sven Mayer. 2021. The Future of Proxemic Interaction in Smart Factories. In Proceedings of AutomationXP'21: Workshop on Automation Experience at the Workplace. In conjunction with CHI'21, May 07, 2021.

Workshop proceedings: Automation Experience at the Workplace. In conjunction with CHI'21, May 7th, 2021, Yokohama, Japan. Copyright © 2021 for this paper by its authors. Use permitted under Creative Commons License Attribution 4.0 International (CC BY 4.0). Website: http://everyday-automation.tech-experience.at

1 INTRODUCTION
Data-driven demands of smart factories are creating new opportunities to develop systems and interaction paradigms based on real-time data access. With this, the concept of a digital twin, i.e., a digital replica of a physical system or asset, has gained increased emphasis due to its capacity to integrate virtual and physical data of machine processes and lifecycles [16]. Currently, this data is used for simulation [6] and for closely connecting virtual and physical processes [16]. The sheer amount of data generated in real time by smart factory assets presents a high potential for information overload [15] if human workers want to access this information in-situ to perform tasks, particularly when using augmented reality (AR). Consequently, future system design should address how users can meaningfully interact with complex information.

This challenge can be addressed through innovative means of human-computer interaction (HCI) that carefully consider human capability and functionality in socio-technical spaces.
Fig. 1. Inspired by earlier work such as Hall [7], we envision proxemic interactions in smart factories where stack lights can be shown to workers using AR, allowing them to filter for the machines relevant to them and avoid additional workload. Furthermore, we envision stack lights serving as an anchor for further information management, allowing workers to perform their tasks more effectively. With greater distance from a machine, we scale the level of information presented to the worker.

Building on how humans make use of social boundaries for interaction [7], contemporary proxemic interaction research has moved to consider digital spaces as a way to scale information and interaction potential with the devices around a user [12]. This extension into devices and information spaces seems logical because proxemic relationships are largely intuitive, and the ubiquitous nature of modern devices allows for more dynamic interaction. Recently, proxemic relationships have found use in smart home scenarios [1], multi-user interactive exhibits [18], device location sensing for interaction enhancement [11], negotiating implicit and explicit interactions with notifications [1, 10], capturing the attention of and mitigating activity exposure to passersby [2, 19], 3D spatial orientation and navigation [13], and displaying events as spatiotemporal activities [4]. Proxemic interactions are beginning to find utility in a wide range of areas in HCI by giving users relevant information in their action spaces when it is needed.

In this paper, we propose proxemic interactions in AR for smart factories. In contrast to smart homes or general population settings, smart factories have ever-increasing and complex data flows and operate on a vastly different scale, reaching in excess of 200,000 m².¹ Thus, the scale on which workers need to interact is also different. On a macro scale, workers need to maintain overall production processes; on a micro scale, each machine and its sub-processes need to maintain optimal efficiency. Given this context, we argue that proxemic interactions can help workers maintain a real-time overview of factory operations by overlaying relevant information as different factory spaces are engaged. In this industrial context, we explore how proxemic interactions can support workers in their tasks by providing meaningful real-time information using AR. First, we map out a possible deployment scenario for proxemic interactions. Then, we outline the challenges and opportunities in operationalizing such a scenario.

¹ https://www.daimler.com/innovation/production/factory-56.html

2 THE PROXEMICS FACTORY
We propose using proxemic interactions to manage the information load presented to production and maintenance workers. We envision scaling information by displaying more detailed information as the worker maintains close vicinity and only giving sparse information for processes at greater distance, see Figure 1. We use four proxemic dimensions that can facilitate such interactions: Movement, Orientation, Distance, and Identity. Movement lets us understand when a person is walking towards a machine, how quickly (e.g., in the case of an emergency), or when they are changing direction. Orientation gives information about the direction a user is facing; it can be inferred from the positioning of faces and limbs, which can in turn suggest different postures and gaze directions [12]. Distance is used to assign zones for interaction [8, 12, 20, 21]. This allows workers to perform their specialized tasks while keeping an overview of the whole process.

Fig. 2. Sketch of a smart factory maintenance scenario with different levels of scaled information shown to a user. While the worker maintains close proximity, more machine information and relevant interaction opportunities are available. The worker can see other machines in the distance with stack light indicators of their operational status.
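To make this zone logic concrete, the following is a minimal sketch of how the Distance and Orientation dimensions could drive the level of information shown for a single machine. The zone thresholds, field-of-view value, and level-of-detail labels are illustrative assumptions, not values taken from any deployed system.

```python
from dataclasses import dataclass
import math

# Illustrative zone boundaries in meters; a real deployment would tune
# these per task and machine (see Section 3.3). Beyond the last bound -> zone 3.
ZONE_THRESHOLDS = [(2.0, 0), (5.0, 1), (15.0, 2)]

# What each zone could reveal, loosely following Figure 1.
ZONE_DETAIL = {
    0: "task-specific data and interaction (schematics, maintenance actions)",
    1: "detailed status (efficiency, downtime, sensor readings)",
    2: "stack-light status indicator only",
    3: "no overlay",
}

@dataclass
class Pose:
    x: float
    y: float
    heading_deg: float  # direction the worker is facing

def zone_for(worker: Pose, machine_x: float, machine_y: float,
             fov_deg: float = 90.0) -> int:
    """Assign a proxemic zone from distance and orientation."""
    dx, dy = machine_x - worker.x, machine_y - worker.y
    distance = math.hypot(dx, dy)
    # Orientation: only overlay machines roughly in the worker's view.
    bearing = math.degrees(math.atan2(dy, dx))
    off_axis = abs((bearing - worker.heading_deg + 180) % 360 - 180)
    if off_axis > fov_deg / 2:
        return 3  # out of view -> no overlay
    for threshold, zone in ZONE_THRESHOLDS:
        if distance <= threshold:
            return zone
    return 3

worker = Pose(x=0.0, y=0.0, heading_deg=0.0)
z = zone_for(worker, machine_x=4.0, machine_y=1.0)
print(z, ZONE_DETAIL[z])  # -> 1 detailed status (...)
```

Movement could extend this sketch by, for example, widening zones when a worker approaches a machine quickly, and Identity by filtering what each zone shows, as discussed in Section 2.3.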
2.1 Worker-Specific Visualizations
In more traditional factories, we see stack lights [9, 14] as the dominant way to communicate the status of a machine or process, often encoded using a three-tier system: 1) green signifying normal operation, 2) yellow signaling warnings such as overheating or highly pressurized conditions, and 3) red signifying failure conditions such as an emergency stop or machine fault. While stack lights give an initial indication of process status, additional or more specific information that may be relevant for situational understanding cannot be displayed within a single stack light system. Furthermore, all workers can see stack lights even if they are not relevant to their tasks, which may affect performance levels if various status indicators are consistently in view.

We propose that workers use AR headsets that provide a similar stack light logic and visualization as status indicators. In this context, only the worker or team in charge of a specific machine or process would be shown the relevant indicators, which can potentially reduce workload, as physical stack lights tend to have salient, attention-grabbing features such as blinking and beeping components when in the red state. We envision these indicators to only be visible when a worker is in view of the machine or process, see Figure 2. As the worker comes close to the error, more specific information will be displayed to assist the worker in correcting it. Special situations such as emergencies may indeed occur; these would trigger alerts and information displays that are more exigent in nature and do not depend on field of view or proximity.

2.2 Additional Information
The real-time virtualization of data in smart factories creates a new space for workers to use data in-situ for task performance. In the past, workers would need to carry or use specialized tools that may be cumbersome or require special tuning, such as a mechanic's stethoscope or an infrared thermometer. In contrast, we imagine a proxemics smart factory where a worker takes advantage of real-time data analysis to visualize detailed information as an AR overlay. Sensor readings from machines that indicate important diagnostic information, such as voltage, heat dissipation, or chemical levels, but also aggregated indicators of connected processes, such as the last service date, could be shown in a proxemics-enabled display when the worker is in the appropriate area and orientation, see Figure 2.

2.3 Task-Specific Visualizations
Getting the most accurate information in-situ is essential given the diversity of data available in the continuous information flow. As shown in Figure 1, we envision a primary egocentric "Zone 0" that encompasses the user's identity as a maintenance worker. This primary zone encapsulates and follows the user; it is also where physical work occurs, in a close-up space where assistive task-specific data is displayed. This zone is instantiated at a consistent, user-centric distance that enables the worker to see the details of a specific error. Information management in this stage is critical due to the restricted working space. The Identity dimension pre-filters the type of information for a specific user's working tasks [12] – in Figure 2, a maintenance worker.
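Sections 2.1–2.3 imply a simple rendering rule: an indicator is shown only to the worker or team assigned to a machine and only when the machine is in view, its detail grows with proximity, and emergencies bypass both conditions. The sketch below, continuing the assumptions from the earlier sketch, is one hypothetical way to express that rule; the data model, role names, and action labels are invented for illustration.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class MachineStatus:
    machine_id: str
    color: str                    # "green" | "yellow" | "red"
    emergency: bool = False
    details: dict = field(default_factory=dict)   # live readings, histories, ...

@dataclass
class Worker:
    worker_id: str
    role: str                     # e.g., "maintenance", "production"
    assigned_machines: set = field(default_factory=set)

def overlay_for(worker: Worker, status: MachineStatus,
                zone: int, in_view: bool) -> Optional[dict]:
    """Decide what (if anything) to render for one machine.

    `zone` comes from the distance/orientation sketch in Section 2;
    `in_view` from the headset's field-of-view test.
    """
    # Emergencies override identity, proximity, and field of view.
    if status.emergency:
        return {"machine": status.machine_id, "alert": "EMERGENCY",
                "color": "red", "details": status.details}
    # Identity filter: only the responsible worker or team sees the indicator.
    if status.machine_id not in worker.assigned_machines or not in_view:
        return None
    overlay = {"machine": status.machine_id, "color": status.color}
    if zone <= 1:                 # close enough for detailed status
        overlay["details"] = status.details
    if zone == 0 and worker.role == "maintenance":
        overlay["actions"] = ["show schematics", "start maintenance"]
    return overlay
```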
2.4 Interaction
Summary information alone may often not be enough to complete tasks in a smart factory. Meaningful interaction with proxemics-enabled systems could allow a worker to work more efficiently. Thus, adding interaction capabilities directly into AR visualizations, which scale with distance, adds to the usability of proxemics-enabled systems. This opens up a wide range of capabilities, such as displaying additional real-time sensor values, showing reading histories, or overlaying schematics. More importantly, this can also enable the worker to directly obtain simulated data after making physical modifications, engaging in localized quality assurance processes using real-time data visualizations.

3 DEPLOYMENT CHALLENGES
Displaying relevant information in-situ in the environment enables the worker to keep an overview of current tasks and to use real-time data as a dynamic feedback assistant to complete tasks much faster than without support. Thus, proxemic interaction will improve efficiency by reducing the time the worker spends searching for information and interaction possibilities. Although we present a sketched-out design for a proxemics factory in this work, several challenges that we believe will enable this vision still need to be addressed.

3.1 User Tracking
Allowing the user to access and work with relevant data in the correct context is key to keeping human workers in the loop. Thus, sensor networks that track user position in smart factories are essential for capturing the worker's orientation, posture, and gaze to offer the correct visual assistance. While outdoor tracking has improved massively over the last years, indoor localization is still a highly researched topic [22]. The more stable tracking systems use optical sensors; however, they bring privacy concerns with them. Regardless of the type, stable tracking is crucial to displaying real-time data in-situ to the worker.

3.2 Designing the Interaction and Visualization
The current visualization is designed around stack lights. Due to the ubiquity of stack lights, factory workers are already familiar with their functionality, and adding additional information to them is a logical next step. However, which information the worker needs, and at what level of detail and functionality, should be part of future investigations. As a next step, we envision visualizations extending beyond a single unit in the maintenance process to also assist the user through detailed steps in the repair process [5].

3.3 Defining the Visualization Zones
While Hall [7] showed four zones around the user for different interpersonal spaces, it is not yet clear how these zones change in the context of a smart factory. Thus, it is important to understand how a worker interacts effectively to fulfill tasks and to what extent the worker just needs to be aware of their surroundings. We envision this to be different for individual tasks. In the micro case, the worker needs to repair one small piece of a large machine; in the macro case, the worker has to supervise a full production line and is responsible for all sub-functions.
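As one illustration of how such zone definitions could remain task-dependent rather than fixed, the configuration sketched below loads different zone boundaries for the micro and macro cases; the profiles and numbers are hypothetical placeholders that future studies would need to establish empirically.

```python
# Hypothetical task profiles: zone boundaries (in meters) are placeholders,
# not empirically derived values; they would have to be established by the
# investigations proposed in Section 3.3 and tuned per factory layout.
TASK_ZONE_PROFILES = {
    # Micro case: repairing one small piece of a large machine.
    "micro_repair": {"zone0": 1.0, "zone1": 3.0, "zone2": 8.0},
    # Macro case: supervising a full production line.
    "macro_supervision": {"zone0": 3.0, "zone1": 10.0, "zone2": 40.0},
}

def zone_thresholds(task: str) -> list:
    """Return (max_distance, zone) pairs for a task profile, in the format
    used by the zone assignment sketch in Section 2."""
    profile = TASK_ZONE_PROFILES[task]
    return [(profile["zone0"], 0), (profile["zone1"], 1), (profile["zone2"], 2)]

print(zone_thresholds("macro_supervision"))
# -> [(3.0, 0), (10.0, 1), (40.0, 2)]
```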
3.4 Modality
Proxemic interactions in smart factories require a unique approach due to the complexity of the data being interfaced with. The data generated and the different user identities accessing and making decisions based on in-situ information will create an even more complex system of human and digital connectivity. While the fastest way to access data in-situ is by using an AR display, not all work will require head-mounted displays with immersive visualization. For some tasks, tablets or projections can be more effective [3].

3.5 Collaboration
Collaboration is critical to take on complex projects. However, as each worker has their own view and workflow, this can create conflicts during collaboration. For example, when team members have different egocentric "Zone 0" identities, such as maintenance and production workers, handling conflicts between individual interaction possibilities from disparate proxemic-aware visualizations will be a unique HCI challenge. As a simple solution to enable collaboration, we propose that the Zone 0 of the user with the larger scope may be extended to the other users while their individual scopes are incorporated. Other solutions could be prioritizing user proximity, creating composite views, or merging views [1].

4 DISCUSSION
Smart factories offer dynamic ways for workers to interact with their surroundings. Displaying relevant information in-situ for different task performance stages enables the worker to keep an overview of task engagement and complete tasks much faster than in traditional contexts. For proxemic interactions to reach full maturity, HCI investigations will have to give attention to designing visualizations and interactions for detailed steps in task performance [5]. Additionally, proper user tracking to enable seamless interactions within proxemic zones will need to be investigated in ways that effectively mitigate cognitive load and distraction. Rendering discrepancies on different devices can also affect view consistency [11], though it is unclear how consistent or real-time system updates need to be to achieve seamless interaction. Research that can identify appropriate modalities will also be necessary for seamless use. While immersive in-situ feedback using AR will enable all workers to get instant feedback, AR headsets also put extra weight on the user. Thus, we argue that for some tasks, tablets or projections [5] may be sufficient to support the worker with in-situ feedback. Considerations for different cultural groups [7, 17] that understand and utilize space differently can lead to the design of adaptive systems. Finally, considerations on group collaboration with proxemic systems will need to reconcile conflicts from multiple visualization updates and interactions from differing proxemic-enabled zones based on user identity.

5 CONCLUSION
Overall, we see great potential but also specific challenges for proxemic interactions in smart factories. The sheer amount of data generated in smart factories makes these potentials particularly relevant for information management investigations targeting specialized human tasks.
In this paper, we provided an overview of how proxemics can be used to keep humans in the loop for smart factory processes by showing how meaningful connections can be made between physical infrastructure and the complex information spaces that are necessary for operation. This work serves as a starting point for researchers and industry professionals considering proxemic interactions in smart factories.

REFERENCES
[1] Till Ballendat, Nicolai Marquardt, and Saul Greenberg. 2010. Proxemic Interaction: Designing for a Proximity and Orientation-Aware Environment. In Proceedings of ITS '10. ACM. https://doi.org/10.1145/1936652.1936676
[2] Frederik Brudy, David Ledo, Saul Greenberg, and Andreas Butz. 2014. Is Anyone Looking? Mitigating Shoulder Surfing on Public Displays through Awareness and Protection. In Proceedings of PerDis '14. ACM. https://doi.org/10.1145/2611009.2611028
[3] Sebastian Büttner, Henrik Mucha, Markus Funk, Thomas Kosch, Mario Aehnelt, Sebastian Robert, and Carsten Röcker. 2017. The Design Space of Augmented and Virtual Reality Applications for Assistive Environments in Manufacturing: A Visual Approach. In Proceedings of PETRA '17. ACM. https://doi.org/10.1145/3056540.3076193
[4] Xiang 'Anthony' Chen, Sebastian Boring, Sheelagh Carpendale, Anthony Tang, and Saul Greenberg. 2012. Spalendar: Visualizing a Group's Calendar Events over a Geographic Space on a Public Display. In Proceedings of AVI '12. ACM. https://doi.org/10.1145/2254556.2254686
[5] Markus Funk, Lars Lischke, Sven Mayer, Alireza Sahami Shirazi, and Albrecht Schmidt. 2018. Teach Me How! Interactive Assembly Instructions Using Demonstration and In-Situ Projection. Springer Singapore, Singapore, 49–73. https://doi.org/10.1007/978-981-10-6404-3_4
[6] Thomas Gabor, Lenz Belzner, Marie Kiermeier, Michael Till Beck, and Alexander Neitz. 2016. A simulation-based architecture for smart cyber-physical systems. In Proceedings of ICAC '16. IEEE. https://doi.org/10.1109/ICAC.2016.29
[7] Edward T. Hall. 1966. The hidden dimension. Anchor Books.
[8] Heiko Hecht, Robin Welsch, Jana Viehoff, and Matthew R. Longo. 2019. The shape of personal space. Acta Psychologica. https://doi.org/10.1016/j.actpsy.2018.12.009
[9] IEC 60073:2002. 2002. Basic and safety principles for man-machine interface, marking and identification – Coding principles for indicators and actuators. Standard. International Electrotechnical Commission, Geneva, CH. https://webstore.iec.ch/publication/587
[10] Wendy Ju, Brian A. Lee, and Scott R. Klemmer. 2008. Range: Exploring Implicit Interaction through Electronic Whiteboard Design. In Proceedings of CSCW '08. ACM. https://doi.org/10.1145/1460563.1460569
[11] Gerd Kortuem, Christian Kray, and Hans Gellersen. 2005. Sensing and Visualizing Spatial Relations of Mobile Devices. In Proceedings of UIST '05. ACM. https://doi.org/10.1145/1095034.1095049
[12] Nicolai Marquardt and Saul Greenberg. 2015. Proxemic interactions: From theory to practice. Synthesis Lectures on Human-Centered Informatics. https://doi.org/10.2200/S00619ED1V01Y201502HCI025
[13] Ahmed E. Mostafa, Saul Greenberg, Emilio Vital Brazil, Ehud Sharlin, and Mario C. Sousa. 2013. Interacting with Microseismic Visualizations. In Proceedings of CHI EA '13. ACM. https://doi.org/10.1145/2468356.2468670
[14] Felix Nilsson, Jens Jakobsen, and Fernando Alonso-Fernandez. 2020. Detection and Classification of Industrial Signal Lights for Factory Floors. In Proceedings of ISCV '20. IEEE. https://doi.org/10.1109/ISCV49265.2020.9204045
[15] Antonis Protopsaltis, Panagiotis Sarigiannidis, Dimitrios Margounakis, and Anastasios Lytos. 2020. Data Visualization in Internet of Things: Tools, Methodologies, and Challenges. In Proceedings of ARES '20. ACM. https://doi.org/10.1145/3407023.3409228
[16] Qinglin Qi and Fei Tao. 2018. Digital Twin and Big Data Towards Smart Manufacturing and Industry 4.0: 360 Degree Comparison. IEEE Access. https://doi.org/10.1109/ACCESS.2018.2793265
[17] Maurizio Sicorello, Jasmina Stevanov, Hiroshi Ashida, and Heiko Hecht. 2019. Effect of Gaze on Personal Space: A Japanese–German Cross-Cultural Study. Journal of Cross-Cultural Psychology. https://doi.org/10.1177/0022022118798513
[18] Scott S. Snibbe and Hayes S. Raffle. 2009. Social Immersive Media: Pursuing Best Practices for Multi-User Interactive Camera/Projector Exhibits. In Proceedings of CHI '09. ACM. https://doi.org/10.1145/1518701.1518920
[19] Miaosen Wang, Sebastian Boring, and Saul Greenberg. 2012. Proxemic Peddler: A Public Advertising Display That Captures and Preserves the Attention of a Passerby. In Proceedings of PerDis '12. ACM. https://doi.org/10.1145/2307798.2307801
[20] Robin Welsch, Heiko Hecht, and Christoph von Castell. 2018. Psychopathy and the Regulation of Interpersonal Distance. Clinical Psychological Science. https://doi.org/10.1177/2167702618788874
[21] Robin Welsch, Christoph von Castell, and Heiko Hecht. 2019. The anisotropy of personal space. PLoS ONE. https://doi.org/10.1371/journal.pone.0217587
[22] Faheem Zafari, Athanasios Gkelias, and Kin K. Leung. 2019. A Survey of Indoor Localization Systems and Technologies. IEEE Communications Surveys & Tutorials. https://doi.org/10.1109/COMST.2019.2911558