<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <article-meta>
      <title-group>
        <article-title>The Future of Proxemic Interaction in Smart Factories</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Donovan Toure</string-name>
          <xref ref-type="aff" rid="aff0">1</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Robin Welsch</string-name>
          <xref ref-type="aff" rid="aff1">2</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Sven Mayer</string-name>
          <xref ref-type="aff" rid="aff1">2</xref>
        </contrib>
        <aff id="aff0">
          <label>1</label>
          <institution>Daimler AG</institution>, <country>Germany</country>
        </aff>
        <aff id="aff1">
          <label>2</label>
          <institution>LMU Munich</institution>, <country>Germany</country>
        </aff>
      </contrib-group>
      <kwd-group>
        <kwd>Smart Factory</kwd>
        <kwd>Cyber-Physical Systems</kwd>
        <kwd>Proxemics</kwd>
        <kwd>Egocentric Interaction</kwd>
        <kwd>Information Management</kwd>
        <kwd>Augmented Reality</kwd>
        <kwd>Industry 4.0</kwd>
        <kwd>Big Data Visualization</kwd>
      </kwd-group>
      <pub-date>
        <year>2021</year>
      </pub-date>
      <volume>07</volume>
      <issue>2021</issue>
      <abstract>
        <p>Digitalization in smart factories allows virtual and physical asset data, as well as process data, to be connected throughout their lifecycles. Here, digital twins mirror the behaviors of physical assets and can simulate their spatiotemporal statuses. The work systems that employ digital twins have yet to address in-situ information representation to workers and ways to mitigate task information overload. Thus, the key is to present relevant information only when and where it is needed. We propose proxemic interaction patterns, i.e., the distance from the user to the device or between devices, for visualizing this data. Here, we outline how scaling the amount and type of augmented reality visualization could be realized using the distance, angle, and orientation of users. We first showcase possible scenarios of how proxemic interaction can support workers in smart factories. We then highlight challenges and opportunities when using proxemic interaction in industrial settings such as manufacturing and warehousing. Finally, we present possible future investigations concerning proxemic interactions in the context of a smart factory. CCS Concepts: • Human-centered computing → Mixed / augmented reality; • Applied computing → Industry and manufacturing.</p>
      </abstract>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>1 INTRODUCTION</title>
      <p>Data-driven demands of smart factories are creating new opportunities to develop systems and interaction paradigms based on real-time data access. With this, the concept of a digital twin, i.e., a digital replica of a physical system or asset, has gained increased emphasis due to its capacity to integrate virtual and physical data of machine processes and lifecycles [<xref ref-type="bibr" rid="ref16">16</xref>]. Currently, this data is used for simulation [<xref ref-type="bibr" rid="ref6">6</xref>] and for the close connection of virtual and physical processes [<xref ref-type="bibr" rid="ref16">16</xref>]. The sheer amount of data generated in real time by smart factory assets presents a high potential for information overload [<xref ref-type="bibr" rid="ref15">15</xref>] when human workers want to access this information in-situ to perform tasks, particularly when using augmented reality (AR). Consequently, future system design should address how users can meaningfully interact with complex information.</p>
      <p>This challenge can be addressed through innovative means of human-computer interaction (HCI) that carefully consider human capability and functionality in socio-technical spaces. Building on how humans make use of social boundaries for interaction [<xref ref-type="bibr" rid="ref7">7</xref>], contemporary proxemic interaction research has moved to consider digital spaces as ways to scale information and interaction potential with the devices around the user [<xref ref-type="bibr" rid="ref12">12</xref>]. This extension into devices and information spaces seems logical because proxemic relationships are largely intuitive, and the ubiquitous nature of modern devices allows for more dynamic interaction. Recently, proxemic relationships have found use in smart home scenarios [<xref ref-type="bibr" rid="ref1">1</xref>], multi-user interactive exhibits [18], device location sensing for interaction enhancement [<xref ref-type="bibr" rid="ref11">11</xref>], negotiating implicit and explicit interactions with notifications [<xref ref-type="bibr" rid="ref1 ref10">1, 10</xref>], capturing the attention of and mitigating activity exposure to passersby [<xref ref-type="bibr" rid="ref2">2, 19</xref>], 3D spatial orientation and navigation [<xref ref-type="bibr" rid="ref13">13</xref>], and displaying events as spatiotemporal activities [<xref ref-type="bibr" rid="ref4">4</xref>]. Proxemic interactions are beginning to find utility in a wide range of areas in HCI by allowing users to get relevant information in action spaces when it is needed.</p>
      <p>[Figure 1: Proxemic zones 0–3 around a worker. AR overlays scale with distance, from sparse labels (e.g., "Press #081") to detailed panels such as "Welding Robot #124, efficiency: 256 spots/h, downtime: 58 min" with "Show Schematics" and "Start Maintenance" controls.]</p>
      <p>In this paper, we propose proxemic interactions in AR for smart factories. In contrast to smart homes or general public settings, smart factories have ever-increasing, complex data flows and operate on a vastly different scale, reaching in excess of 200,000 m<sup>2</sup>, e.g., Daimler's Factory 56<sup>1</sup>. Thus, the scale on which workers need to interact is also different. On a macro scale, workers need to maintain overall production processes; on a micro scale, each machine and its sub-processes need to maintain optimal efficiency. Given this context, we argue that proxemic interactions can help workers maintain a real-time overview of factory operations by overlaying relevant information as different factory spaces are engaged. In a more industry-related context, we will explore how proxemic interactions can support workers in their tasks by providing meaningful real-time information using AR.</p>
      <p>First, we will map out a possible deployment scenario for proxemic interactions. Then, we will outline the challenges and opportunities in operationalizing such a scenario.</p>
    </sec>
    <sec id="sec-2">
      <title>2 THE PROXEMICS FACTORY</title>
      <p>We propose using proxemic interactions to manage the information load presented to production and maintenance workers. We envision scaling information by displaying more detailed information while the worker remains in close vicinity, and only giving sparse information for processes at greater distance, see Figure 1. We use four proxemic dimensions that can facilitate such interactions: Movement, Orientation, Distance, and Identity. Movement lets us understand when a person is walking towards a machine, how quickly (e.g., in the case of an emergency), or when they are changing direction. Orientation gives information about the direction a user is facing; it can be inferred from the positioning of faces and limbs, which in turn suggests different postures and gaze directions [<xref ref-type="bibr" rid="ref12">12</xref>]. Distance is used to assign zones for interaction [<xref ref-type="bibr" rid="ref12 ref8">8, 12, 20, 21</xref>]. This allows workers to perform their specialized tasks while keeping an overview of the whole process.</p>
      <p><sup>1</sup>https://www.daimler.com/innovation/production/factory-56.html</p>
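<p>The distance-based scaling described above can be sketched as a simple zone lookup. The radii and detail levels below are illustrative placeholders only; which thresholds suit a smart factory is an open question (see Section 3.3).</p>

```python
# Illustrative sketch: map worker-to-machine distance onto a proxemic
# zone and a level of AR detail. All radii and labels are assumptions
# for demonstration, not values prescribed in this paper.
ZONE_RADII = [(0, 1.5), (1, 4.0), (2, 10.0)]  # (zone, outer radius in metres)

DETAIL_BY_ZONE = {
    0: "task-specific assistance",  # egocentric close-up space
    1: "full machine diagnostics",
    2: "status indicator only",
    3: "no overlay",                # beyond the outermost radius
}

def zone_for_distance(distance_m: float) -> int:
    """Return the proxemic zone index for a given distance."""
    for zone, radius in ZONE_RADII:
        if distance_m <= radius:
            return zone
    return 3

def overlay_detail(distance_m: float) -> str:
    """Scale the amount of AR detail with distance, as in Figure 1."""
    return DETAIL_BY_ZONE[zone_for_distance(distance_m)]
```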
    </sec>
    <sec id="sec-3">
      <title>2.1 Worker-Specific Visualizations</title>
      <p>In more typical factories, we see stack lights [<xref ref-type="bibr" rid="ref14 ref9">9, 14</xref>] as the dominant way to communicate the status of a machine or process, often encoded using a three-tier system: 1) green signifying normal operation, 2) yellow signaling warnings such as overheating or highly pressurized conditions, and 3) red signifying failure conditions such as an emergency stop or machine fault. While stack lights give an initial indication of process status, a single stack light system cannot display the additional or more specific information that may be relevant for situational understanding. Furthermore, all workers can see stack lights even if they are not relevant to their tasks, which may affect performance levels if various status indicators are consistently in view.</p>
      <p>We propose that workers use AR headsets that provide a similar stack-light logic and visualization as status indicators. In this context, only the worker or team in charge of a specific machine or process would be shown the relevant indicators, which can potentially reduce workload, as stack lights tend to have salient features that grab attention, such as blinking and beeping components in the red state. We envision these indicators being visible only when a worker has the machine or process in view, see Figure 2. As the worker comes closer to the error, more specific information is displayed to assist the worker in correcting it. Special situations such as emergencies may still occur; these would trigger alerts and information displays that are more exigent in nature and do not depend on field of view or proximity.</p>
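<p>The gating described above, i.e., a three-tier stack-light status shown only to the responsible worker while the machine is in view, with emergencies bypassing the gate, can be illustrated as follows. The function and its parameters are a hypothetical sketch, not part of the proposed system's specification.</p>

```python
from enum import Enum
from typing import Optional

class Status(Enum):
    # Three-tier stack-light encoding described in the text.
    GREEN = "normal operation"
    YELLOW = "warning"
    RED = "failure"

def indicator(status: Status, in_view: bool, responsible: bool,
              emergency: bool = False) -> Optional[str]:
    """Decide whether to render an AR stack-light indicator.

    Emergencies bypass the view/role gating, as the text suggests;
    otherwise the indicator is shown only to the responsible worker
    and only while the machine is in their field of view.
    """
    if emergency:
        return "ALERT: " + Status.RED.value
    if not (in_view and responsible):
        return None  # suppress the overlay entirely
    return status.value
```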
    </sec>
    <sec id="sec-4">
      <title>2.2 Additional Information</title>
      <p>The real-time virtualization of data in smart factories creates a new space for workers to use data in-situ for task performance. In the past, workers would need to carry or use specialized tools that may be cumbersome or require special tuning, such as a mechanic's stethoscope or an infrared thermometer. In contrast, we imagine a proxemics smart factory where a worker takes advantage of real-time data analysis to visualize detailed information as an AR overlay. Visualized sensor readings from machines that indicate important diagnostic information, such as voltage, heat dissipation, or chemical levels, but also aggregated indicators on connected processes, such as the last service date, could be shown in a proxemics-enabled display when the worker is in the appropriate area and proper orientation, see Figure 2.</p>
    </sec>
    <sec id="sec-5">
      <title>2.3 Task-Specific Visualizations</title>
      <p>Getting the most accurate information in-situ is essential given the diversity of data available in the continuous information flow. As shown in Figure 1, we envision a primary egocentric “Zone 0” that encompasses the user’s identity, e.g., as a maintenance worker. This primary zone encapsulates and follows the user; it is also where physical work occurs, a close-up space where assistive task-specific data is displayed. This zone is instantiated at a consistent, user-centric distance that enables the worker to see the details of a specific error. Information management at this stage is critical due to the restricted working space. The Identity dimension is one where the information type is pre-filtered for a specific user’s working tasks [<xref ref-type="bibr" rid="ref12">12</xref>] – in Figure 2, a maintenance worker.</p>
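<p>The Identity dimension's pre-filtering can be sketched as a role-based channel filter. The roles and channel names below are hypothetical examples, not a mapping defined in this paper.</p>

```python
# Hypothetical role-to-channel mapping illustrating the Identity
# dimension: each role's "Zone 0" pre-filters the data shown.
ROLE_CHANNELS = {
    "maintenance": {"error_codes", "schematics", "service_history"},
    "production": {"throughput", "cycle_time", "downtime"},
}

def filter_for_identity(role, available):
    """Keep only the in-situ data channels relevant to the worker's role."""
    allowed = ROLE_CHANNELS.get(role, set())
    return {name: value for name, value in available.items() if name in allowed}
```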
    </sec>
    <sec id="sec-6">
      <title>2.4 Interaction</title>
      <p>Summary information alone may often not be enough to complete tasks in a smart factory. Meaningful interaction with proxemics-enabled systems could allow a worker to work more efficiently. Thus, adding interaction capabilities directly into AR visualizations, which scale with distance, adds to the usability of proxemics-enabled systems. This opens up a wide range of capabilities, such as displaying additional real-time sensor values, showing reading histories, or overlaying schematics. More importantly, it can also enable the worker to directly obtain simulated data after making physical modifications, to engage in localized quality assurance processes using real-time data visualizations.</p>
    </sec>
    <sec id="sec-7">
      <title>3 DEPLOYMENT CHALLENGES</title>
      <p>Displaying relevant information in-situ in the environment enables the worker to keep an overview of current tasks and to use real-time data as a dynamic feedback assistant to complete tasks much faster than without support. Thus, proxemic interaction can improve efficiency by lowering the time a worker spends searching for information and interaction possibilities. Although we present a sketched-out design for a proxemics factory in this work, several challenges need to be addressed that we believe will enable this vision.</p>
    </sec>
    <sec id="sec-8">
      <title>3.1 User Tracking</title>
      <p>Allowing the user to access and work with relevant data in the correct context is key to keeping human workers in the loop. Thus, sensor networks that track user position in smart factories are essential to capture the worker’s orientation, posture, and gaze and to offer the correct visual assistance. While outdoor tracking has improved massively over recent years, indoor localization is still a highly active research topic [22]. The most stable tracking systems use optical sensors; however, these bring privacy concerns with them. Regardless of the type, stable tracking is crucial for displaying real-time data in-situ to the worker.</p>
    </sec>
    <sec id="sec-9">
      <title>3.2 Designing the Interaction and Visualization</title>
      <p>The current visualization design is built around stack lights. Due to the ubiquity of stack lights, factory workers are already familiar with their functionality, and adding additional information to them is a logical next step. However, which information the worker needs, and in what detail and with what functionality, should be part of future investigations. As a next step, we envision visualizations extending beyond one unit in the maintenance process to also assist the user through detailed steps of the repair process [<xref ref-type="bibr" rid="ref5">5</xref>].</p>
    </sec>
    <sec id="sec-10">
      <title>3.3 Defining the Visualization Zones</title>
      <p>While Hall [<xref ref-type="bibr" rid="ref7">7</xref>] described four zones of interpersonal space around the user, it is not yet clear how these zones change in the context of a smart factory. Thus, it is important to understand how a worker interacts effectively to fulfill tasks, and to what extent the worker just needs to be aware of their surroundings. We envision this to differ between individual tasks: in the micro case, the worker needs to repair one small piece of a large machine; in the macro case, the worker has to supervise a full production line and is responsible for all sub-functions.</p>
    </sec>
    <sec id="sec-11">
      <title>3.4 Modality</title>
      <p>Proxemic interactions in smart factories require a unique approach due to the complexity of the data being interfaced with. The data generated, and the different user identities accessing it and making decisions based on in-situ information, will create an even more complex system of human and digital connectivity. While the fastest way to access data in-situ is through an AR display, not all work will require head-mounted displays with immersive visualization. For some tasks, tablets or projections can be more effective [<xref ref-type="bibr" rid="ref3">3</xref>].</p>
    </sec>
    <sec id="sec-12">
      <title>3.5 Collaboration</title>
      <p>Collaboration is critical for taking on complex projects. However, as each worker has their own view and workflow, this can create conflicts during collaboration. For example, when team members have different egocentric “Zone 0” identities, such as maintenance and production workers, handling conflicts between individual interaction possibilities from disparate proxemic-aware visualizations will be a unique HCI challenge. As a simple solution to enable collaboration, we propose that the Zone 0 of the user with the larger scope may be extended to other users while their individual scopes are incorporated. Other solutions could be prioritizing user proximity, creating composite views, or merging views [<xref ref-type="bibr" rid="ref1">1</xref>].</p>
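<p>One reading of the simple merge rule proposed above, where the broadest “Zone 0” scope is extended to all collaborators while individual scopes are incorporated, is a union anchored on the widest scope. The sketch below is a hypothetical illustration of that reading, not a specified algorithm.</p>

```python
def merged_view(scopes):
    """Merge collaborating workers' "Zone 0" scopes into one shared view.

    The widest individual scope is shared with everyone, and each
    worker's own channels are incorporated, so the result is the
    union of all scopes. Alternatives mentioned in the text include
    prioritizing user proximity or building composite views.
    """
    base = max(scopes.values(), key=len, default=set())
    shared = set(base)
    for scope in scopes.values():
        shared |= scope
    return shared
```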
    </sec>
    <sec id="sec-13">
      <title>4 DISCUSSION</title>
      <p>Smart factories offer dynamic ways for workers to interact with their surroundings. Displaying relevant information in-situ for different stages of task performance enables the worker to keep an overview of task engagement and to complete tasks much faster than in traditional contexts. For proxemic interactions to reach full maturity, HCI investigations will have to give attention to designing visualizations and interactions for detailed steps in task performance [<xref ref-type="bibr" rid="ref5">5</xref>].</p>
      <p>Additionally, proper user tracking to enable seamless interactions within proxemic zones will need to be investigated in ways that effectively mitigate cognitive load and distraction. Rendering discrepancies on different devices can also affect view consistency [<xref ref-type="bibr" rid="ref11">11</xref>], though it is unclear how consistent or real-time system updates need to be to achieve seamless interaction. Research that identifies appropriate modalities will also be necessary for seamless use. While immersive in-situ feedback using AR will enable all workers to get instant feedback, it also puts extra weight on the user. Thus, we argue that for some tasks tablets or projections [<xref ref-type="bibr" rid="ref5">5</xref>] may be sufficient to support the worker with in-situ feedback.</p>
      <p>Considering different cultural groups [<xref ref-type="bibr" rid="ref17 ref7">7, 17</xref>] that understand and utilize space differently can lead to the design of adaptive systems. Finally, work on group collaboration with proxemic systems will need to reconcile conflicts arising from multiple visualization updates and interactions across differing proxemic-enabled zones based on user identity.</p>
    </sec>
    <sec id="sec-14">
      <title>5 CONCLUSION</title>
      <p>Overall, we see great potential but also specific challenges for proxemic interactions in smart factories. The sheer amount of data generated in smart factories makes this potential particularly relevant for information-management investigations into specialized human tasks. In this paper, we provided an overview of how proxemics can keep humans in the loop of smart factory processes by showing how meaningful connections can be made between physical infrastructure and the complex information spaces necessary for operation. This work provides an overview for researchers and industry professionals considering proxemic interactions in smart factories.</p>
      <p>[18] Scott S. Snibbe and Hayes S. Raffle. 2009. Social Immersive Media: Pursuing Best Practices for Multi-User Interactive Camera/Projector Exhibits. In Proceedings of CHI ’09. ACM. https://doi.org/10.1145/1518701.1518920</p>
      <p>[19] Miaosen Wang, Sebastian Boring, and Saul Greenberg. 2012. Proxemic Peddler: A Public Advertising Display That Captures and Preserves the Attention of a Passerby. In Proceedings of PerDis ’12. ACM. https://doi.org/10.1145/2307798.2307801</p>
      <p>[20] Robin Welsch, Heiko Hecht, and Christoph von Castell. 2018. Psychopathy and the Regulation of Interpersonal Distance. Clinical Psychological Science. https://doi.org/10.1177/2167702618788874</p>
      <p>[21] Robin Welsch, Christoph von Castell, and Heiko Hecht. 2019. The anisotropy of personal space. PLoS ONE. https://doi.org/10.1371/journal.pone.0217587</p>
      <p>[22] Faheem Zafari, Athanasios Gkelias, and Kin K. Leung. 2019. A Survey of Indoor Localization Systems and Technologies. IEEE Communications Surveys &amp; Tutorials. https://doi.org/10.1109/COMST.2019.2911558</p>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>
          [1]
          <string-name>
            <given-names>Till</given-names>
            <surname>Ballendat</surname>
          </string-name>
          , Nicolai Marquardt, and
          <string-name>
            <given-names>Saul</given-names>
            <surname>Greenberg</surname>
          </string-name>
          .
          <year>2010</year>
          .
          <article-title>Proxemic Interaction: Designing for a Proximity and Orientation-Aware Environment</article-title>
          .
          <source>In Proceedings of ITS '10</source>
          . ACM. https://doi.org/10.1145/1936652.1936676
        </mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>
          [2]
          <string-name>
            <given-names>Frederik</given-names>
            <surname>Brudy</surname>
          </string-name>
          , David Ledo,
          <string-name>
            <given-names>Saul</given-names>
            <surname>Greenberg</surname>
          </string-name>
          , and
          <string-name>
            <given-names>Andreas</given-names>
            <surname>Butz</surname>
          </string-name>
          .
          <year>2014</year>
          .
          <article-title>Is Anyone Looking? Mitigating Shoulder Surfing on Public Displays through Awareness and Protection</article-title>
          .
          <source>In Proceedings of PerDis '14</source>
          . ACM. https://doi.org/10.1145/2611009.2611028
        </mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>
          [3]
          <string-name>
            <given-names>Sebastian</given-names>
            <surname>Büttner</surname>
          </string-name>
          , Henrik Mucha, Markus Funk, Thomas Kosch, Mario Aehnelt,
          <string-name>
            <given-names>Sebastian</given-names>
            <surname>Robert</surname>
          </string-name>
          , and
          <string-name>
            <given-names>Carsten</given-names>
            <surname>Röcker</surname>
          </string-name>
          .
          <year>2017</year>
          .
          <article-title>The Design Space of Augmented and Virtual Reality Applications for Assistive Environments in Manufacturing: A Visual Approach</article-title>
          .
          <source>In Proceedings of PETRA '17</source>
          .
          <article-title>Association for Computing Machinery</article-title>
          . https://doi.org/10.1145/3056540.3076193
        </mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>[4] <string-name><given-names>Xiang 'Anthony'</given-names> <surname>Chen</surname></string-name>, <string-name><given-names>Sebastian</given-names> <surname>Boring</surname></string-name>, <string-name><given-names>Sheelagh</given-names> <surname>Carpendale</surname></string-name>, <string-name><given-names>Anthony</given-names> <surname>Tang</surname></string-name>, and <string-name><given-names>Saul</given-names> <surname>Greenberg</surname></string-name>. <year>2012</year>. <article-title>Spalendar: Visualizing a Group's Calendar Events over a Geographic Space on a Public Display</article-title>. <source>In Proceedings of AVI '12</source>. ACM. https://doi.org/10.1145/2254556.2254686</mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>[5] <string-name><given-names>Markus</given-names> <surname>Funk</surname></string-name>, Lars Lischke, Sven Mayer, Alireza Sahami Shirazi, and <string-name><given-names>Albrecht</given-names> <surname>Schmidt</surname></string-name>. <year>2018</year>. <article-title>Teach Me How! Interactive Assembly Instructions Using Demonstration and In-Situ Projection</article-title>. Springer Singapore, Singapore, <fpage>49</fpage>-<lpage>73</lpage>. https://doi.org/10.1007/978-981-10-6404-3_4</mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>[6] <string-name><given-names>Thomas</given-names> <surname>Gabor</surname></string-name>, Lenz Belzner, Marie Kiermeier, Michael Till Beck, and <string-name><given-names>Alexander</given-names> <surname>Neitz</surname></string-name>. <year>2016</year>. <article-title>A simulation-based architecture for smart cyber-physical systems</article-title>. <source>In Proceedings of ICAC '16</source>. IEEE. https://doi.org/10.1109/ICAC.2016.29</mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>[7] <string-name><given-names>Edward T.</given-names> <surname>Hall</surname></string-name>. <year>1966</year>. <article-title>The hidden dimension</article-title>. <source>Anchor Books</source>.</mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>[8] <string-name><given-names>Heiko</given-names> <surname>Hecht</surname></string-name>, Robin Welsch, Jana Viehoff, and <string-name><given-names>Matthew R.</given-names> <surname>Longo</surname></string-name>. <year>2019</year>. <article-title>The shape of personal space</article-title>. <source>Acta Psychologica</source>. https://doi.org/10.1016/j.actpsy.2018.12.009</mixed-citation>
      </ref>
      <ref id="ref9">
        <mixed-citation>[9] <source>IEC 60073:2002</source>. <year>2002</year>. <article-title>Basic and safety principles for man-machine interface, marking and identification - Coding principles for indicators and actuators</article-title>. Standard. International Electrotechnical Commission, Geneva, CH. https://webstore.iec.ch/publication/587</mixed-citation>
      </ref>
      <ref id="ref10">
        <mixed-citation>[10] <string-name><given-names>Wendy</given-names> <surname>Ju</surname></string-name>, <string-name><given-names>Brian A.</given-names> <surname>Lee</surname></string-name>, and <string-name><given-names>Scott R.</given-names> <surname>Klemmer</surname></string-name>. <year>2008</year>. <article-title>Range: Exploring Implicit Interaction through Electronic Whiteboard Design</article-title>. <source>In Proceedings of CSCW '08</source>. ACM. https://doi.org/10.1145/1460563.1460569</mixed-citation>
      </ref>
      <ref id="ref11">
        <mixed-citation>[11] <string-name><given-names>Gerd</given-names> <surname>Kortuem</surname></string-name>, <string-name><given-names>Christian</given-names> <surname>Kray</surname></string-name>, and <string-name><given-names>Hans</given-names> <surname>Gellersen</surname></string-name>. <year>2005</year>. <article-title>Sensing and Visualizing Spatial Relations of Mobile Devices</article-title>. <source>In Proceedings of UIST '05</source>. ACM. https://doi.org/10.1145/1095034.1095049</mixed-citation>
      </ref>
      <ref id="ref12">
        <mixed-citation>
          [12]
          <string-name>
            <given-names>Nicolai</given-names>
            <surname>Marquardt</surname>
          </string-name>
          and
          <string-name>
            <given-names>Saul</given-names>
            <surname>Greenberg</surname>
          </string-name>
          .
          <year>2015</year>
          .
          <article-title>Proxemic interactions: From theory to practice. Synthesis Lectures on Human-Centered Informatics</article-title>
          . https://doi.org/10.2200/S00619ED1V01Y201502HCI025
        </mixed-citation>
      </ref>
      <ref id="ref13">
        <mixed-citation>[13] <string-name><given-names>Ahmed E.</given-names> <surname>Mostafa</surname></string-name>, Saul Greenberg, Emilio Vital Brazil, Ehud Sharlin, and <string-name><given-names>Mario C.</given-names> <surname>Sousa</surname></string-name>. <year>2013</year>. <article-title>Interacting with Microseismic Visualizations</article-title>. <source>In Proceedings of CHI EA '13</source>. ACM. https://doi.org/10.1145/2468356.2468670</mixed-citation>
      </ref>
      <ref id="ref14">
        <mixed-citation>[14] <string-name><given-names>Felix</given-names> <surname>Nilsson</surname></string-name>, Jens Jakobsen, and <string-name><given-names>Fernando</given-names> <surname>Alonso-Fernandez</surname></string-name>. <year>2020</year>. <article-title>Detection and Classification of Industrial Signal Lights for Factory Floors</article-title>. <source>In Proceedings of ISCV '20</source>. IEEE. https://doi.org/10.1109/ISCV49265.2020.9204045</mixed-citation>
      </ref>
      <ref id="ref15">
        <mixed-citation>[15] <string-name><given-names>Antonis</given-names> <surname>Protopsaltis</surname></string-name>, Panagiotis Sarigiannidis, Dimitrios Margounakis, and <string-name><given-names>Anastasios</given-names> <surname>Lytos</surname></string-name>. <year>2020</year>. <article-title>Data Visualization in Internet of Things: Tools, Methodologies, and Challenges</article-title>. <source>In Proceedings of ARES '20</source>. ACM. https://doi.org/10.1145/3407023.3409228</mixed-citation>
      </ref>
      <ref id="ref16">
        <mixed-citation>[16] <string-name><given-names>Qinglin</given-names> <surname>Qi</surname></string-name> and <string-name><given-names>Fei</given-names> <surname>Tao</surname></string-name>. <year>2018</year>. <article-title>Digital Twin and Big Data Towards Smart Manufacturing and Industry 4.0: 360 Degree Comparison</article-title>. <source>IEEE Access</source>. https://doi.org/10.1109/ACCESS.2018.2793265</mixed-citation>
      </ref>
      <ref id="ref17">
        <mixed-citation>[17] <string-name><given-names>Maurizio</given-names> <surname>Sicorello</surname></string-name>, Jasmina Stevanov, Hiroshi Ashida, and <string-name><given-names>Heiko</given-names> <surname>Hecht</surname></string-name>. <year>2019</year>. <article-title>Effect of Gaze on Personal Space: A Japanese-German Cross-Cultural Study</article-title>. <source>Journal of Cross-Cultural Psychology</source>. https://doi.org/10.1177/0022022118798513</mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>