<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta />
    <article-meta>
      <title-group>
        <article-title>Relevance of Visualization and Interaction Technologies for Industry 5.0</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Ander Garcia</string-name>
          <email>agarcia@vicomtech.org</email>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Marco Quartulli</string-name>
          <email>mquartulli@vicomtech.org</email>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Igor G. Olaizola</string-name>
          <email>iolaizola@vicomtech.org</email>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Iñigo Barandiaran</string-name>
          <email>ibarandiaran@vicomtech.org</email>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <aff id="aff0">
          <label>0</label>
          <institution>Vicomtech Foundation, Basque Research and Technology Alliance (BRTA)</institution>
          ,
          <addr-line>Mikeletegi 57, 20009 Donostia-San Sebastián</addr-line>
          ,
          <country country="ES">Spain</country>
        </aff>
      </contrib-group>
      <abstract>
        <p>New synergies between humans and Cyber-Physical Systems (CPS) are key to generating collaborators instead of competitors, strengthening the human role in Industry 5.0. Bidirectional Communication Channels (BCCs) between humans and machines lay the foundation of these synergies within manufacturing environments. This position paper reviews the main currently available visualization and interaction technologies that connect data, humans, CPS, Artificial Intelligence (AI) and machines, and presents a selection of use cases and applications for AI services. The successful design, development and integration of these bidirectional communication channels poses relevant open research challenges on the way to revolutionizing current manufacturing environments.</p>
      </abstract>
      <kwd-group>
        <kwd>Visual computing</kwd>
        <kwd>Industry 4.0</kwd>
        <kwd>Industry 5.0</kwd>
        <kwd>AI</kwd>
        <kwd>human-in-the-loop</kwd>
      </kwd-group>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>1. Introduction</title>
      <p>
        Industry 4.0 is a paradigm centered on the emergence of cyber-physical objects, offering
a promise of enhanced efficiency through digital connectivity and Artificial Intelligence (AI).
According to the European Commission, Industry 4.0, as currently conceived, “is not fit for purpose
in a context of climate crisis and planetary emergency, nor does it address deep social tensions” [
        <xref ref-type="bibr" rid="ref1">1</xref>
        ].
      </p>
      <p>
        Although the Industry 4.0 vision is not yet a reality, the European Union envisions
Industry 5.0 as a new paradigm that evolves a production- and consumption-driven economic model
into a more transformative view of growth, one focused on human progress and well-being, based on
reducing and shifting consumption towards new forms of sustainable, circular and regenerative
economic value creation and equitable prosperity. The objective is to “seek people-planet-prosperity,
combining competitiveness and sustainability, rather than simply value extraction to benefit shareholders” [
        <xref ref-type="bibr" rid="ref1">1</xref>
        ].
      </p>
      <p>
        One of the main objectives of Industry 5.0 is to bring human workers back to factory floors,
generating synergies that combine human brainpower and creativity with the automation and AI
technologies of semi- and/or fully autonomous machines [
        <xref ref-type="bibr" rid="ref2">2</xref>
        ]. This objective has not been successfully
met by Industry 4.0. For example, [
        <xref ref-type="bibr" rid="ref3">3</xref>
        ] analyze the role of operators in Industry 4.0 and identify three
main technology challenges: (i) supporting operators in performing process tasks, (ii) supporting
operators in understanding and making decisions, and (iii) learning from the activity of operators in
order to predict specific situations, optimize the process and better organize the Smart Factory.
      </p>
      <p>
        A Bidirectional Communication Channel (BCC) is a requirement to generate these synergies,
leading to effective human-in-the-loop systems. Human-in-the-loop is an anthropocentric mechanism,
which allows a direct sharing or transfer of human skills to a subset of CPS control loops [
        <xref ref-type="bibr" rid="ref4">4</xref>
        ].
      </p>
      <p>
        To effectively collaborate with humans, CPS must adequately understand human intention and
desire. Moreover, humans must have the means to understand, analyze and trust predictions and
actions from CPS. According to [
        <xref ref-type="bibr" rid="ref5">5</xref>
        ], this tight interaction between CPS and humans requires (i) a rich,
unambiguous bidirectional information flow and (ii) a proper set of abstract interactive
human-machine interfaces (HMIs). Industrial HMIs have evolved from basic lights, buttons and levers to
advanced graphical user interfaces (GUI) with touch screens. Moreover, they keep evolving to
multimodal interfaces supporting new interaction channels [
        <xref ref-type="bibr" rid="ref6">6</xref>
        ].
      </p>
      <p>
        For example, [
        <xref ref-type="bibr" rid="ref5">5</xref>
        ] introduce natural human-machine interfaces (NHMI) as interfaces that
reduce the technological barriers to a rich interaction. They present an application of
NHMI to integrate humans within the decision-making process of a cybernetic control loop of an
assembly system with cobots. They analyze expertise transfer between humans and CPS, but they do
not focus on decision-making mechanisms for CPS. This paper shares that approach, focusing on
the visualization and interaction technologies and use cases, but not on the technologies to implement
the use cases.
      </p>
      <p>
        Focusing on HMI, [
        <xref ref-type="bibr" rid="ref7">7</xref>
        ] analyze a collaborative decision-making process that integrates the acceptance and
adaptation of humans to the process. They provide an extensive literature review and
distinguish Human-Computer Interaction (HCI) from HMI and Human-Technology Interaction
(HTI). HTI encompasses the processes, actions and dialogues that a user engages in to interact with
technology, whether it is a computer, machine or robot. They distinguish three potential HTI modes:
(i) system first, where the human adapts to the actions of the system; (ii) human first, where the
system adapts to the instructions of the human; and (iii) hybrid, where human and autonomous system
share equal responsibility through existing bidirectional communication channels. This paper tackles the
hybrid mode as the one viable for successfully meeting Industry 5.0 requirements.
      </p>
      <p>Visualization and interaction technologies are the foundation of these BCCs for this new
generation of HMIs, connecting humans with machines, CPS, data and AI services. Although these
technologies are already available within Industry 4.0, their effective integration is still an open
research topic. This position paper reviews the main currently available technologies for generating these
BCCs, presenting some use cases and applications to encourage further research in this area.</p>
    </sec>
    <sec id="sec-2">
      <title>2. Visualization and interaction technologies for Industry 5.0</title>
      <p>This section introduces the most promising visualization and interaction technologies for the
generation of BCCs for Industry 5.0. Although the integration of these technologies into new
generation interfaces is an open research area, examples of the individual use of the technologies
already exist. Figure 1 summarizes the contribution and the approach followed by this paper.</p>
      <p>
        On the left, different human profiles interacting with industrial systems are represented. In
production systems, two main reference models for human activities have been identified:
Human-in-the-Loop and Human-in-the-Mesh. The first, traditionally related to operators, refers to situations
in which the worker directly participates in the product fabrication or assembly process and
its control loop. The second, traditionally related to managers and engineers, refers to
situations where the worker participates in production planning and its control loop [
        <xref ref-type="bibr" rid="ref8">8</xref>
        ]. In the
future, new profiles may arise within Industry 5.0. Although the two profiles perform different activities
within the decision-making process, interaction and visualization technologies are key to
establishing communication between humans, machines, CPS, data and AI services.
      </p>
      <p>This BCC is composed of both Human-to-Machine and Machine-to-Human channels. Although
not all visualization and interaction technologies are suitable for both channels, their integration
will foster the development of new BCCs that empower workers through the use of digital devices and
endorse a human-centric approach to technology.</p>
      <p>
        Visual Computing has already been identified as a key enabling technology for Industry 4.0.
Visual Analytics, Augmented and Virtual Reality, Computer Vision, HCI and related technologies are
central to many disruptive applications in a Smart Factory perspective [
        <xref ref-type="bibr" rid="ref9 ref10">9, 10</xref>
        ]. Voice- and gesture-based
interfaces have also been proposed to enhance communication between operators and CPS [
        <xref ref-type="bibr" rid="ref11">11</xref>
        ].
      </p>
      <p>
        These technologies connect humans with the main elements of Industry 5.0 (Figure 1). The CPS
represents the core communication element, as it merges the physical and the virtual world,
connecting both to machines and to their digital representation, commonly known as the digital twin. CPS
capture data from these elements and feed AI services executed at the edge, in the cloud, or both.
Industry 5.0 aligns the objective of these AI services towards a sustainable, human-centric and
resilient industry [
        <xref ref-type="bibr" rid="ref12">12</xref>
        ]. Reviewing the plethora of AI services for Industry 5.0 is out of the scope of
this paper; interested readers are referred to existing up-to-date reviews of industrial AI services [
        <xref ref-type="bibr" rid="ref13">13</xref>
        ].
      </p>
      <p>This paper focuses on the following visualization and interaction technologies: visual analytics;
augmented, virtual and mixed reality; voice recognition; natural language processing and gesture
recognition.</p>
      <p>
        Visual Analytics (VA) combines the strength of human cognitive abilities with analysis methods
to extract information from data. High-dimensional, real-time visualization allows the graphical
expression of complex process variables at a fraction of the cost of full-scale digitalization [
        <xref ref-type="bibr" rid="ref14">14</xref>
        ]. VA
combines machine intelligence with human intelligence to gain insight from the data to support
informed decision-making. In a recent survey on the use of VA in manufacturing scenarios [
        <xref ref-type="bibr" rid="ref15">15</xref>
        ], the
authors highlight the extensive need for professional domain-specific knowledge and see
human-in-the-loop analysis as one of the major ongoing key challenges of VA systems. Business Analytics is a
special case of VA focused on the analysis of historical raw data in order to achieve useful and
focused insights and a better understanding of business performance areas [
        <xref ref-type="bibr" rid="ref16">16</xref>
        ].
      </p>
      <p>
Challenges faced by VA include (i) the integration of process-relevant analytics and visualization,
and (ii) the integration of an aging workforce. In line with Industry 5.0 objectives, measures should be
taken to ensure the ease of use and increased accessibility of VA, with minimal training and
upskilling required to gain access to intuitive data visualization [
        <xref ref-type="bibr" rid="ref14">14</xref>
        ].
      </p>
      <p>
        Virtual Reality (VR), Augmented Reality (AR), and Mixed Reality (MR) technologies are
shaping new interaction environments. These technologies, which integrate physical and virtual
objects at different levels, have received several definitions [
        <xref ref-type="bibr" rid="ref17">17</xref>
        ]. VR replicates an environment that
simulates a physical presence in places in the real world or an imagined world, allowing the user to
interact in that world. AR is a live, direct or indirect view of a physical, real-world environment
whose elements are augmented (or supplemented) by computer-generated sensory input such as
sound, video, graphics or GPS data. The real-world content and the computer-generated content are
not able to respond to each other. MR is the merging of real and virtual worlds to produce new
environments and visualizations where physical and digital objects co-exist and interact in real time.
eXtended Reality (XR) comprises the full spectrum of technologies in the virtual-to-reality
continuum, such as AR, VR and MR. XR is an overlay of synthetic content on the real world that is
anchored to and interacts with the real world. The key characteristic of XR is that the synthetic and
the real-world content are able to react to each other in real time. XR technology is paving the way to
new forms of interaction that disrupt traditional desktop interaction. [18] describes related concepts
and presents an agenda for future research.
      </p>
      <p>Voice recognition and natural language processing systems have greatly increased their
performance in recent years due to the integration of new models based on deep neural
networks. Conversational assistants, such as general-purpose Apple Siri, Google Now, Microsoft
Cortana or Amazon Alexa, have become the main example of this improvement, simplifying and
making human-machine interaction more natural. Voice assistants could act as a central interaction
technology for Industry 5.0. As they are natural for workers, they require minimal training, lowering
the knowledge barrier for existing workforce to interact with CPS. Moreover, they are both eyes-free
and hands-free, allowing workers to perform simultaneous tasks, and they are flexible to adapt to
different communication contexts [19].</p>
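      <p>The intent-recognition step of such a voice interface can be sketched as follows. This is a minimal sketch assuming speech has already been transcribed to text by an upstream recognizer; the command grammar and machine names are hypothetical examples, not an existing system.</p>
      <preformat>
```python
import re

# Intent patterns for a toy command grammar (hypothetical examples).
# Each entry: (intent name, regex whose group 2 captures the machine id).
PATTERNS = [
    ("start", re.compile(r"\b(start|run)\b\s+(\w+)")),
    ("stop", re.compile(r"\b(stop|halt)\b\s+(\w+)")),
    ("status", re.compile(r"\b(status)\b\s+(?:of\s+)?(\w+)")),
]

def parse_command(transcript):
    """Map a transcribed utterance to an (intent, machine) pair."""
    text = transcript.lower()
    for intent, pattern in PATTERNS:
        match = pattern.search(text)
        if match:
            return intent, match.group(2)
    return None, None  # unrecognized: ask the operator to rephrase

print(parse_command("Please stop press3"))   # ('stop', 'press3')
print(parse_command("Status of welder1"))    # ('status', 'welder1')
```
      </preformat>
      <p>A production system would replace the regular expressions with a trained natural language understanding model, but the contract stays the same: utterance in, structured intent out.</p>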
      <p>The acoustic noise of industrial scenarios has been a traditional problem for this technology.
However, recent advances in noise cancelation and speech recognition have proved robust
enough for manufacturing environments [20].</p>
      <p>
        The goal of gesture recognition is to use the human body as a direct input device, eliminating the need
for intermediate media and controlling the machine directly through defined gestures [21]. The
objective is to make machines understand the meaning of human gestures, based on technologies
such as Computer Vision or glove-based gesture recognition systems [
        <xref ref-type="bibr" rid="ref10">10, 22</xref>
        ]. Gesture recognition not
only presents technological challenges; the definition of gestures that are robust against
individual variations in their performance is also an open research field.
      </p>
      <p>Scenarios combining XR with voice recognition showcase examples of BCCs. [23] proposes a system for
operators to carry out a given task through the combination of XR technologies with voice
interaction process control logic (Figure 2). The proposed work streamlines multiple input and output
XR devices into the logical scheme of a voice recognition system, describing and validating a
framework that enhances human-machine communication interfaces. The authors showcase two
examples empowering operators. The first focuses on an operator maintaining the gripper of a
Universal Robot using HoloLens glasses. As this operation requires the simultaneous use of both
hands, voice-based interaction suits its requirements. The second example focuses on the assembly
of cables for electric panels with an Augmented Reality system combining different uses of
projections and voice-based responses. Moreover, aural responses are also projected visually for operators
with hearing impairments.</p>
      <p>[24] presents an optical inspection-guiding system for electronic board manufacturing. The system
monitors in real time the mounting process of electronic components performed by an operator. It
visually guides the operator through the mounting process while checking the correctness of their
actions. Thus, mounting errors are reduced while operator comfort is enhanced. The interaction with
the operator is based on a Computer Vision system that recognizes the correct mounting of the board,
coupled with an AR system extended with a voice recognition interface, projecting information onto
the real board and a screen, where additional data and controls are located (Figure 2).</p>
    </sec>
    <sec id="sec-3">
      <title>3. Example of applications of BCCs for AI manufacturing services</title>
      <p>This section presents some examples of the application of the previous technologies to create
BCCs that generate Industry 5.0 human-centric workflows within AI-based services. While Industry 4.0
AI manufacturing services are mainly focused on optimizing operational efficiency, Industry
5.0 AI services should optimize cost functions that include more global footprints of the manufacturing
processes. This requires an evolution of both AI models and industrial KPIs and metrics, such as
OEE.</p>
      <p>Furthermore, human-machine collaboration must become more flexible. CPS, robots, cobots and even AI
models should be able to harmonize human interactions. This will lead to an Augmented Intelligence,
which can be defined as the "synergistic technology of humans and computers" [25], merging Human
Intelligence (HI) and Artificial Intelligence (AI) and understanding intelligence as a fundamentally
distributed phenomenon [26]. AI-based systems should evolve from rigid pattern recognition
capabilities to managing less structured and more chaotic scenarios, better suited to resemble the
complexity of real problems. This requires the integration of rules, laws, ontologies or functionally
equivalent technologies into AI models, which is currently a challenging open research area.</p>
      <p>Focusing on BCCs, they are critical to enrich decision flows and to integrate humans into them.
Moreover, they could improve the explainability and interpretability of current AI algorithms.
Interpretability has to do with how accurately a machine learning model can associate a cause with an
effect. Explainability has to do with the ability of the parameters, often hidden in deep networks, to
justify the results. Humans may ask about characteristics of AI algorithms before approving their
output, applying the previous technologies to lay out a natural interface adapted to humans and the
manufacturing environment.</p>
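      <p>The kind of answer a BCC could relay when a human asks about the factors behind an output can be illustrated with a deliberately simple model. This is a toy sketch: the linear scoring model, feature names and weights are illustrative assumptions, not a method from the reviewed literature.</p>
      <preformat>
```python
# Toy linear scoring model; feature names and weights are illustrative.
WEIGHTS = {"vibration": 0.8, "temperature": 0.5, "pressure": -0.3}

def explain(sample):
    """Return the model score and features ranked by their contribution."""
    contributions = {f: WEIGHTS[f] * v for f, v in sample.items()}
    ranked = sorted(contributions.items(), key=lambda kv: -abs(kv[1]))
    score = sum(contributions.values())
    return score, ranked

score, ranked = explain({"vibration": 1.2, "temperature": 0.4, "pressure": 2.0})
print(ranked[0][0])  # vibration: the dominant driver of this output
```
      </preformat>
      <p>Relayed through a visualization or voice channel, such rankings would let an operator check that a prediction rests on sensible factors before approving it.</p>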
      <p>The integration of human knowledge to improve, customize, tune and command AI algorithms
requires a dialog in which humans and machines assist each other at several tasks. This dialog will add
value to human experience and knowledge, strengthening the human role in Industry 5.0. For
example, interactive exploratory data analysis is one of the more general tasks where these channels
may generate synergies. The interaction channel can be adapted to the context of humans and tasks,
applying automatic data filters, custom visualizations and data proposals, and allowing humans to
express queries in natural language, avoiding training in query languages (for example, “show me in a
bar chart the average core temperature of the process for the last 20 production
cycles”).</p>
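      <p>A minimal sketch of how such a query could be mapped to a structured chart specification is shown below. The vocabulary of chart types, aggregates and the query pattern are assumptions for illustration; a real interface would rely on a full natural language understanding pipeline.</p>
      <preformat>
```python
import re

# Hypothetical vocabularies for a constrained query language.
CHARTS = ("bar chart", "line chart", "scatter plot")
AGGREGATES = ("average", "minimum", "maximum")

def query_to_spec(query):
    """Translate a constrained natural-language query into a chart spec."""
    q = query.lower()
    spec = {"chart": "line chart", "aggregate": None, "cycles": None}
    for chart in CHARTS:
        if chart in q:
            spec["chart"] = chart
    for agg in AGGREGATES:
        if agg in q:
            spec["aggregate"] = agg
    match = re.search(r"last\s+(\d+)\s+production\s+cycles", q)
    if match:
        spec["cycles"] = int(match.group(1))
    return spec

print(query_to_spec("show me in a bar chart the average core temperature "
                    "of the process for the last 20 production cycles"))
# {'chart': 'bar chart', 'aggregate': 'average', 'cycles': 20}
```
      </preformat>
      <p>The resulting specification can then be handed to whatever visualization layer the BCC exposes.</p>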
      <p>Data labelling and annotation is another task commonly required for the training of AI services.
Although profound advances are being made in the field of automatic labelling and annotation,
human intervention is still required to further improve the quality of the output. BCCs will allow
humans to guide automatic labelling algorithms, improving the output in iterative workflows. Humans
may select a subset of images or data (similar to current captcha systems), approve the output of the
algorithm, or guide the labelling algorithm, while the system reports its output and provides
alternatives and suggestions for following steps.</p>
      <p>The same workflow applies to interactive and active learning, where an AI algorithm generates
iterative outputs that are increasingly improved by decisions and expertise from humans. The system
automatically finds results and asks humans when it does not know the best next step or requires
feedback and validation. This feedback and validation is integrated into the knowledge base of the
algorithms, automatically improving future results. Furthermore, humans could ask algorithms about
the foundations of their decisions in order to understand and validate the reasons underneath them.</p>
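      <p>This loop can be sketched with uncertainty sampling, where the system queries the human about the item it is least sure of. The sketch below rests on strong simplifying assumptions: a one-dimensional threshold "model" and a scripted oracle stand in for a real classifier and a human expert.</p>
      <preformat>
```python
def train(labelled):
    """Fit a toy threshold classifier: midpoint between the class means."""
    pos = [x for x, y in labelled if y == 1]
    neg = [x for x, y in labelled if y == 0]
    return (sum(pos) / len(pos) + sum(neg) / len(neg)) / 2.0

def active_learning(pool, oracle, seed, rounds=3):
    """Iteratively ask the oracle about the most uncertain pool item."""
    labelled = list(seed)
    for _ in range(rounds):
        threshold = train(labelled)
        # Most uncertain item: the one closest to the decision boundary.
        query = min(pool, key=lambda x: abs(x - threshold))
        pool.remove(query)
        labelled.append((query, oracle(query)))  # the human answers
    return train(labelled)

oracle = lambda x: 1 if x > 5.0 else 0   # stand-in for the human expert
boundary = active_learning([2.0, 4.8, 5.2, 8.0], oracle, [(1.0, 0), (9.0, 1)])
```
      </preformat>
      <p>Swapping in a real model and routing the queries to the operator through one of the BCC channels reviewed above turns the same loop into the interactive workflow described in the text.</p>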
      <p>Regarding prescriptive analytics, besides integrating humans in the flow, these channels will
empower humans to ask suitable models for prescriptions of further scenarios to foster quality
data-based decisions, or to identify potentially dangerous or relevant situations that may harm productivity.</p>
    </sec>
    <sec id="sec-4">
      <title>4. Conclusions</title>
      <p>New synergies between humans and CPS are key to generating collaborators instead of
competitors, strengthening the human role in Industry 5.0. BCCs between humans and machines lay
the foundation of these synergies. This paper has analyzed the main visualization and interaction
technologies for generating these BCCs: visual analytics; augmented, virtual and mixed reality; voice
recognition; natural language processing and gesture recognition. Although the integration of these
technologies into new generation interfaces is an open research area, examples of the individual use of
these technologies within manufacturing scenarios already exist.</p>
      <p>The successful design, development and integration of these BCCs poses relevant open research
challenges on the way to revolutionizing current manufacturing environments, especially related to the
integration of humans into AI service and algorithm workflows.</p>
    </sec>
    <sec id="sec-5">
      <title>5. References</title>
      <p>[18] C. Flavián, S. Ibáñez-Sánchez, C. Orús, The impact of virtual, augmented and mixed reality
technologies on the customer experience, Journal of Business Research 100 (2019) 547–560. doi:
10.1016/j.jbusres.2018.10.050.</p>
      <p>[19] A. del Pozo, L. Garcia-Sardiña, M. Serras, A. Gonzalez-Docasal, M. I. Torres, … I. Etxebeste,
EKIN: Towards natural language interaction with industrial production machines, CEUR
Workshop Proceedings 2968 (2021) 5–8.</p>
      <p>[20] R. Gaizauskas, Investigating spoken dialogue to support manufacturing processes, 2019. URL:
https://connectedeverything.ac.uk/feasibility-studies/spoken-dialogue-manufacturing/</p>
      <p>[21] O. K. Oyedotun, A. Khashman, Deep learning in vision-based static hand gesture recognition,
Neural Computing and Applications 28 (2017) 3941–3951. doi: 10.1007/s00521-016-2294-8.</p>
      <p>[22] L. Roda-Sanchez, T. Olivares, C. Garrido-Hidalgo, A. Fernández-Caballero, Gesture Control
Wearables for Human-Machine Interaction in Industry 4.0, in: Lecture Notes in Computer
Science, vol. 11487 LNCS, Springer-Verlag, London, 2019, pp. 99–108.</p>
      <p>[23] M. Serras, L. García-Sardiña, B. Simões, H. Álvarez, J. Arambarri, Dialogue Enhanced Extended
Reality: Interactive System for the Operator 4.0, Applied Sciences 10 (2020) 3960. doi:
10.3390/app10113960.</p>
      <p>[24] M. Ojer, I. Serrano, F. A. Saiz, I. Barandiaran, I. Gill, D. Aguinaga, D. Alejandro, Real-time
automatic optical system to assist operators in the assembling of electronic components,
International Journal of Advanced Manufacturing Technology 107 (2020) 2261–2275. doi:
10.1007/s00170-020-05125-z.</p>
      <p>[25] M. N. O. Sadiku, T. J. Ashaolu, A. Ajayi-Majebi, S. M. Musa, Augmented Intelligence,
International Journal of Scientific Advances 2 (2021) 772–776. doi: 10.51542/ijscia.v2i5.17.</p>
      <p>[26] J. Ito, Extended Intelligence, 2016. URL:
https://pubpub.ito.com/pub/extendedintelligence/release/1</p>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>
          [1]
          <string-name>
            <given-names>European</given-names>
            <surname>Commission</surname>
          </string-name>
          ,
          <source>Industry</source>
          <volume>5</volume>
          .0:
          <string-name>
            <given-names>A</given-names>
            <surname>Transformative Vision for Europe</surname>
          </string-name>
          ,
          <source>Governing Systemic Transformations towards a Sustainable Industry</source>
          ,
          <year>2022</year>
          . URL: https://ec.europa.eu/info/publications/industry-50
          <string-name>
            <surname>-</surname>
          </string-name>
          transformative
          <string-name>
            <surname>-</surname>
          </string-name>
          vision-europe_en
        </mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>
          [2]
          <string-name>
            <given-names>S.</given-names>
            <surname>Nahavandi</surname>
          </string-name>
          , Industry
          <volume>5</volume>
          .
          <fpage>0</fpage>
          -
          <string-name>
            <given-names>A</given-names>
            <surname>Human-Centric</surname>
          </string-name>
          <string-name>
            <surname>Solution</surname>
          </string-name>
          ,
          <source>Sustainability</source>
          <volume>11</volume>
          (
          <year>2019</year>
          )
          <article-title>4371</article-title>
          . doi:
          <volume>10</volume>
          .3390/su11164371.
        </mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>
          [3]
          <string-name>
            <given-names>J.</given-names>
            <surname>Posada</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Zorrilla</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Dominguez</surname>
          </string-name>
          ,
          <string-name>
            <given-names>B.</given-names>
            <surname>Simoes</surname>
          </string-name>
          , P. Eisert, … M.
          <article-title>Guevara, Graphics and Media Technologies for Operators in Industry 4.0</article-title>
          ,
          <source>IEEE Computer Graphics and Applications</source>
          <volume>38</volume>
          (
          <year>2018</year>
          )
          <fpage>119</fpage>
          -
          <lpage>132</lpage>
          . doi:
          <volume>10</volume>
          .1109/
          <string-name>
            <surname>MCG</surname>
          </string-name>
          .
          <year>2018</year>
          .
          <volume>053491736</volume>
          .
        </mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>
          [4]
          <string-name>
            <given-names>M.</given-names>
            <surname>Gaham</surname>
          </string-name>
          ,
          <string-name>
            <given-names>B.</given-names>
            <surname>Bouzouia</surname>
          </string-name>
          ,
          <string-name>
            <given-names>N.</given-names>
            <surname>Achour</surname>
          </string-name>
          ,
          <article-title>Human-in-the-Loop Cyber-Physical Production Systems Control Human-in-the-Loop Cyber-Physical Production Systems Control (HiLCP2sC)</article-title>
          ,
          <source>Studies in Computational Intelligence</source>
          <volume>594</volume>
          (
          <year>2015</year>
          )
          <fpage>315</fpage>
          -
          <lpage>325</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>
          [5]
          <string-name>
            <given-names>M. A.</given-names>
            <surname>Ruiz</surname>
          </string-name>
          <string-name>
            <surname>Garcia</surname>
          </string-name>
          ,
          <string-name>
            <given-names>R.</given-names>
            <surname>Rojas</surname>
          </string-name>
          ,
          <string-name>
            <given-names>L.</given-names>
            <surname>Gualtieri</surname>
          </string-name>
          ,
          <string-name>
            <given-names>E.</given-names>
            <surname>Rauch</surname>
          </string-name>
          ,
          <string-name>
            <given-names>D.</given-names>
            <surname>Matt</surname>
          </string-name>
          ,
          <article-title>A human-in-the-loop cyberphysical system for collaborative assembly in smart manufacturing</article-title>
          ,
          <source>Procedia CIRP 81</source>
          (
          <year>2019</year>
          )
          <fpage>600</fpage>
          -
          <lpage>605</lpage>
          . doi:
          <volume>10</volume>
          .1016/j.procir.
          <year>2019</year>
          .
          <volume>03</volume>
          .162
        </mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>
          [6]
          <string-name>
            <given-names>S.</given-names>
            <surname>Oviatt</surname>
          </string-name>
          , P. Cohen,
          <article-title>Perceptual User Interfaces: Multimodal Interfaces that Process What Comes Naturally</article-title>
          ,
          <source>Communications of the ACM</source>
          <volume>43</volume>
          (
          <year>2000</year>
          )
          <fpage>45</fpage>
          -
          <lpage>53</lpage>
          . doi:
          <volume>10</volume>
          .1145/330534.330538.
        </mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>
          [7]
          <string-name>
            <given-names>J.</given-names>
            <surname>Coetzer</surname>
          </string-name>
          ,
          <string-name>
            <given-names>R. B.</given-names>
            <surname>Kuriakose</surname>
          </string-name>
          ,
          <string-name>
            <given-names>H. J.</given-names>
            <surname>Vermaak</surname>
          </string-name>
          ,
          <article-title>Collaborative Decision-Making for HumanTechnology Interaction-A Case Study Using an Automated Water Bottling Plant</article-title>
          ,
          <source>Journal of Physics: Conference Series</source>
          <volume>1577</volume>
          (
          <year>2020</year>
          )
          <fpage>012024</fpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>
          [8]
          <string-name>
            <given-names>P.</given-names>
            <surname>Fantini</surname>
          </string-name>
          ,
          <string-name>
            <given-names>P.</given-names>
            <surname>Leitao</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.</given-names>
            <surname>Barbosa</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Taisch</surname>
          </string-name>
          ,
          <article-title>Symbiotic Integration of Human Activities in Cyber-Physical Systems</article-title>
          ,
          <source>IFAC-PapersOnLine</source>
          <volume>52</volume>
          (
          <year>2019</year>
          )
          <fpage>133</fpage>
          -
          <lpage>138</lpage>
          . doi:10.1016/j.ifacol.2019.12.124.
        </mixed-citation>
      </ref>
      <ref id="ref9">
        <mixed-citation>
          [9]
          <string-name>
            <given-names>J.</given-names>
            <surname>Posada</surname>
          </string-name>
          ,
          <string-name>
            <given-names>C.</given-names>
            <surname>Toro</surname>
          </string-name>
          ,
          <string-name>
            <given-names>I.</given-names>
            <surname>Barandiaran</surname>
          </string-name>
          ,
          <string-name>
            <given-names>D.</given-names>
            <surname>Oyarzun</surname>
          </string-name>
          ,
          <string-name>
            <given-names>P.</given-names>
            <surname>Eisert</surname>
          </string-name>
          ,
          <article-title>Visual computing as key enabling technology for Industry 4.0 &amp; Industrial Internet</article-title>
          ,
          <source>IEEE Computer Graphics and Applications</source>
          <volume>35</volume>
          (
          <year>2015</year>
          )
          <fpage>26</fpage>
          -
          <lpage>40</lpage>
          . doi:10.1109/MCG.2015.45.
        </mixed-citation>
      </ref>
      <ref id="ref10">
        <mixed-citation>
          [10]
          <string-name>
            <given-names>J.</given-names>
            <surname>Posada</surname>
          </string-name>
          ,
          <string-name>
            <given-names>I.</given-names>
            <surname>Barandiaran</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J. R.</given-names>
            <surname>Sanchez</surname>
          </string-name>
          ,
          <string-name>
            <given-names>D.</given-names>
            <surname>Mejia-Parra</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Moreno</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Ojer</surname>
          </string-name>
          ,
          <string-name>
            <given-names>O.</given-names>
            <surname>Ruiz-Salguero</surname>
          </string-name>
          ,
          <article-title>Computer graphics and visual computing use cases for Industry 4.0 and Operator 4.0</article-title>
          ,
          <source>International Journal for Simulation and Multidisciplinary Design Optimization</source>
          <volume>12</volume>
          (
          <year>2021</year>
          )
          <fpage>4</fpage>
          -
          <lpage>9</lpage>
          . doi:10.1051/smdo/2021026.
        </mixed-citation>
      </ref>
      <ref id="ref11">
        <mixed-citation>
          [11]
          <string-name>
            <given-names>F.</given-names>
            <surname>Longo</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Padovano</surname>
          </string-name>
          ,
          <article-title>Voice-enabled Assistants of the Operator 4.0 in the Social Smart Factory: Prospective role and challenges for an advanced human-machine interaction</article-title>
          ,
          <source>Manufacturing Letters</source>
          <volume>26</volume>
          (
          <year>2020</year>
          )
          <fpage>12</fpage>
          -
          <lpage>16</lpage>
          . doi:10.1016/j.mfglet.2020.09.001.
        </mixed-citation>
      </ref>
      <ref id="ref12">
        <mixed-citation>
          [12]
          <string-name>
            <given-names>M.</given-names>
            <surname>Breque</surname>
          </string-name>
          ,
          <string-name>
            <given-names>L.</given-names>
            <surname>De Nul</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Petrides</surname>
          </string-name>
          ,
          <article-title>Industry 5.0 - Towards a sustainable, human-centric and resilient European industry</article-title>
          ,
          <year>2021</year>
          . URL: https://ec.europa.eu/info/news/industry-50-towards-more-sustainable-resilient-and-human-centric-industry-2021-jan-07_en
        </mixed-citation>
      </ref>
      <ref id="ref13">
        <mixed-citation>
          [13]
          <string-name>
            <given-names>R. S.</given-names>
            <surname>Peres</surname>
          </string-name>
          ,
          <string-name>
            <given-names>X.</given-names>
            <surname>Jia</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.</given-names>
            <surname>Lee</surname>
          </string-name>
          ,
          <string-name>
            <given-names>K.</given-names>
            <surname>Sun</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A. W.</given-names>
            <surname>Colombo</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.</given-names>
            <surname>Barata</surname>
          </string-name>
          ,
          <article-title>Industrial Artificial Intelligence in Industry 4.0 - Systematic Review, Challenges and Outlook</article-title>
          ,
          <source>IEEE Access</source>
          <volume>8</volume>
          (
          <year>2020</year>
          )
          <fpage>220121</fpage>
          -
          <lpage>220139</lpage>
          . doi:10.1109/ACCESS.2020.3042874.
        </mixed-citation>
      </ref>
      <ref id="ref14">
        <mixed-citation>
          [14]
          <string-name>
            <given-names>L.</given-names>
            <surname>Allen</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.</given-names>
            <surname>Atkinson</surname>
          </string-name>
          ,
          <string-name>
            <given-names>D.</given-names>
            <surname>Jayasundara</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.</given-names>
            <surname>Cordiner</surname>
          </string-name>
          ,
          <string-name>
            <given-names>P. Z.</given-names>
            <surname>Moghadam</surname>
          </string-name>
          ,
          <article-title>Data visualization for Industry 4.0: A stepping-stone toward a digital future, bridging the gap between academia and industry</article-title>
          ,
          <source>Patterns</source>
          <volume>2</volume>
          (
          <year>2021</year>
          )
          <fpage>100266</fpage>
          . doi:10.1016/j.patter.2021.100266.
        </mixed-citation>
      </ref>
      <ref id="ref15">
        <mixed-citation>
          [15]
          <string-name>
            <given-names>F.</given-names>
            <surname>Zhou</surname>
          </string-name>
          ,
          <string-name>
            <given-names>X.</given-names>
            <surname>Lin</surname>
          </string-name>
          ,
          <string-name>
            <given-names>C.</given-names>
            <surname>Liu</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Y.</given-names>
            <surname>Zhao</surname>
          </string-name>
          ,
          <string-name>
            <given-names>P.</given-names>
            <surname>Xu</surname>
          </string-name>
          ,
          <string-name>
            <given-names>L.</given-names>
            <surname>Ren</surname>
          </string-name>
          ,
          <string-name>
            <given-names>T.</given-names>
            <surname>Xue</surname>
          </string-name>
          ,
          <string-name>
            <given-names>L.</given-names>
            <surname>Ren</surname>
          </string-name>
          ,
          <article-title>A survey of visualization for smart manufacturing</article-title>
          ,
          <source>Journal of Visualization</source>
          <volume>22</volume>
          (
          <year>2019</year>
          )
          <fpage>419</fpage>
          -
          <lpage>435</lpage>
          . doi:10.1007/s12650-018-0530-2.
        </mixed-citation>
      </ref>
      <ref id="ref16">
        <mixed-citation>
          [16]
          <string-name>
            <given-names>A. J.</given-names>
            <surname>Silva</surname>
          </string-name>
          ,
          <string-name>
            <given-names>P.</given-names>
            <surname>Cortez</surname>
          </string-name>
          ,
          <string-name>
            <given-names>C.</given-names>
            <surname>Pereira</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Pilastri</surname>
          </string-name>
          ,
          <article-title>Business analytics in Industry 4.0: A systematic review</article-title>
          ,
          <source>Expert Systems</source>
          <volume>38</volume>
          (
          <year>2021</year>
          )
          <fpage>e12741</fpage>
          . doi:10.1111/exsy.12741.
        </mixed-citation>
      </ref>
      <ref id="ref17">
        <mixed-citation>
          [17]
          <string-name>
            <given-names>T.</given-names>
            <surname>Williams</surname>
          </string-name>
          ,
          <string-name>
            <given-names>D.</given-names>
            <surname>Szafir</surname>
          </string-name>
          ,
          <string-name>
            <given-names>T.</given-names>
            <surname>Chakraborti</surname>
          </string-name>
          ,
          <string-name>
            <given-names>H. Ben</given-names>
            <surname>Amor</surname>
          </string-name>
          ,
          <article-title>Virtual, Augmented, and Mixed Reality for Human-Robot Interaction</article-title>
          , in:
          <source>ACM/IEEE International Conference on Human-Robot Interaction</source>
          , IEEE, New York,
          <year>2018</year>
          , pp.
          <fpage>671</fpage>
          -
          <lpage>672</lpage>
          . doi:10.1109/HRI.2019.8673207.
        </mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>