<?xml version="1.0" encoding="UTF-8"?>
<TEI xml:space="preserve" xmlns="http://www.tei-c.org/ns/1.0" 
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" 
xsi:schemaLocation="http://www.tei-c.org/ns/1.0 https://raw.githubusercontent.com/kermitt2/grobid/master/grobid-home/schemas/xsd/Grobid.xsd"
 xmlns:xlink="http://www.w3.org/1999/xlink">
	<teiHeader xml:lang="en">
		<fileDesc>
			<titleStmt>
				<title level="a" type="main">Relevance of Visualization and Interaction Technologies for Industry 5.0</title>
			</titleStmt>
			<publicationStmt>
				<publisher/>
				<availability status="unknown"><licence/></availability>
			</publicationStmt>
			<sourceDesc>
				<biblStruct>
					<analytic>
						<author>
							<persName><forename type="first">Ander</forename><surname>Garcia</surname></persName>
							<email>agarcia@vicomtech.org</email>
							<affiliation key="aff0">
								<orgName type="department" key="dep1">Vicomtech Foundation</orgName>
								<orgName type="department" key="dep2">Basque Research and Technology Alliance (BRTA)</orgName>
								<address>
									<addrLine>Mikeletegi 57</addrLine>
									<postCode>20009</postCode>
									<settlement>Donostia-San Sebastián</settlement>
									<country key="ES">Spain</country>
								</address>
							</affiliation>
						</author>
						<author>
							<persName><forename type="first">Marco</forename><surname>Quartulli</surname></persName>
							<email>mquartulli@vicomtech.org</email>
							<affiliation key="aff0">
								<orgName type="department" key="dep1">Vicomtech Foundation</orgName>
								<orgName type="department" key="dep2">Basque Research and Technology Alliance (BRTA)</orgName>
								<address>
									<addrLine>Mikeletegi 57</addrLine>
									<postCode>20009</postCode>
									<settlement>Donostia-San Sebastián</settlement>
									<country key="ES">Spain</country>
								</address>
							</affiliation>
						</author>
						<author>
							<persName><forename type="first">Igor</forename><forename type="middle">G</forename><surname>Olaizola</surname></persName>
							<email>iolaizola@vicomtech.org</email>
							<affiliation key="aff0">
								<orgName type="department" key="dep1">Vicomtech Foundation</orgName>
								<orgName type="department" key="dep2">Basque Research and Technology Alliance (BRTA)</orgName>
								<address>
									<addrLine>Mikeletegi 57</addrLine>
									<postCode>20009</postCode>
									<settlement>Donostia-San Sebastián</settlement>
									<country key="ES">Spain</country>
								</address>
							</affiliation>
						</author>
						<author>
							<persName><forename type="first">Iñigo</forename><surname>Barandiaran</surname></persName>
							<email>ibarandiaran@vicomtech.org</email>
							<affiliation key="aff0">
								<orgName type="department" key="dep1">Vicomtech Foundation</orgName>
								<orgName type="department" key="dep2">Basque Research and Technology Alliance (BRTA)</orgName>
								<address>
									<addrLine>Mikeletegi 57</addrLine>
									<postCode>20009</postCode>
									<settlement>Donostia-San Sebastián</settlement>
									<country key="ES">Spain</country>
								</address>
							</affiliation>
						</author>
						<title level="a" type="main">Relevance of Visualization and Interaction Technologies for Industry 5.0</title>
					</analytic>
					<monogr>
						<imprint>
							<date/>
						</imprint>
					</monogr>
					<idno type="MD5">9B7E0A72E6D0FC10291F6B0EE88A8380</idno>
				</biblStruct>
			</sourceDesc>
		</fileDesc>
		<encodingDesc>
			<appInfo>
				<application version="0.7.2" ident="GROBID" when="2023-03-25T07:20+0000">
					<desc>GROBID - A machine learning software for extracting information from scholarly documents</desc>
					<ref target="https://github.com/kermitt2/grobid"/>
				</application>
			</appInfo>
		</encodingDesc>
		<profileDesc>
			<textClass>
				<keywords>
					<term>Visual computing</term>
					<term>Industry 4.0</term>
					<term>Industry 5.0</term>
					<term>AI</term>
					<term>human-in-the-loop</term>
				</keywords>
			</textClass>
			<abstract>
<div xmlns="http://www.tei-c.org/ns/1.0"><p>New synergies between humans and Cyber-Physical Systems (CPS) are key to generating collaborators instead of competitors, strengthening the human role in Industry 5.0. Bidirectional Communication Channels (BCCs) between humans and machines lay the foundation of these synergies within manufacturing environments. This position paper reviews the main currently available visualization and interaction technologies for connecting data, humans, CPS, Artificial Intelligence (AI) and machines, presenting a selection of use cases and applications for AI services. The successful design, development and integration of these bidirectional communication channels poses relevant open research challenges on the path to revolutionizing current manufacturing environments.</p></div>
			</abstract>
		</profileDesc>
	</teiHeader>
	<text xml:lang="en">
		<body>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="1.">Introduction</head><p>Industry 4.0 is a paradigm centered around the emergence of cyber-physical objects, offering a promise of enhanced efficiency through digital connectivity and Artificial Intelligence (AI). According to the European Commission, Industry 4.0, as currently conceived, "is not fit for purpose in a context of climate crisis and planetary emergency, nor does it address deep social tensions" <ref type="bibr" target="#b0">[1]</ref>.</p><p>Although the Industry 4.0 vision is not yet a reality, the European Union envisions Industry 5.0 as a new paradigm that evolves a production- and consumption-driven economic model into a more transformative view of growth, one focused on human progress and well-being and based on reducing and shifting consumption towards new forms of sustainable, circular and regenerative economic value creation and equitable prosperity. The objective is to "seek people-planet-prosperity, combining competitiveness and sustainability, rather than simply value extraction to benefit shareholders" <ref type="bibr" target="#b0">[1]</ref>.</p><p>One of the main objectives of Industry 5.0 is to bring human workers back to the factory floor, generating synergies that combine human brainpower and creativity with the automation and AI technologies of semi- and/or fully autonomous machines <ref type="bibr" target="#b1">[2]</ref>. This objective has not been successfully met by Industry 4.0.
For example, <ref type="bibr" target="#b2">[3]</ref> analyze the role of operators in Industry 4.0 and identify three main technology challenges: (i) supporting operators in performing process tasks, (ii) supporting operators in understanding and making decisions, and (iii) learning from the activity of operators in order to predict specific situations, optimize the process and better organize the Smart Factory.</p><p>A Bidirectional Communication Channel (BCC) is a requirement for generating these synergies, leading to effective human-in-the-loop systems. Human-in-the-loop is an anthropocentric mechanism which allows a direct sharing or transfer of human skills to a subset of CPS control loops <ref type="bibr" target="#b3">[4]</ref>.</p><p>To collaborate effectively with humans, CPS must adequately understand human intention and desire. Moreover, humans must have the means to understand, analyze and trust predictions and actions from CPS. According to <ref type="bibr" target="#b4">[5]</ref>, this tight interaction between CPS and humans requires (i) a rich, unambiguous bidirectional information flow and (ii) a proper set of abstract interactive human-machine interfaces (HMIs). Industrial HMIs have evolved from basic lights, buttons and levers to advanced graphical user interfaces (GUIs) with touch screens. Moreover, they keep evolving towards multimodal interfaces supporting new interaction channels <ref type="bibr" target="#b5">[6]</ref>.</p><p>For example, <ref type="bibr" target="#b4">[5]</ref> introduce natural human-machine interfaces (NHMIs) as interfaces that reduce the technological barriers to rich interaction. They present an application of NHMIs to integrate humans within the decision-making process of a cybernetic control loop of an assembly system with cobots. They analyze expertise transfer between humans and CPS, but do not focus on decision-making mechanisms for CPS.
This approach is shared by this paper, which focuses on visualization and interaction technologies and use cases, but not on the technologies to implement the use cases.</p><p>Focusing on HMI, <ref type="bibr" target="#b6">[7]</ref> analyze a collaborative decision-making process into which the acceptance and adaptation of humans is integrated. They provide an extensive literature review and distinguish Human-Computer Interaction (HCI) from HMI and Human-Technology Interaction (HTI). HTI encompasses the processes, actions and dialogues that a user engages in to interact with technology, whether it is a computer, machine or robot. They distinguish three potential HTI modes: (i) system first, where the human adapts to the actions of the system; (ii) human first, where the system adapts to the instructions of the human; and (iii) hybrid, where the human and the autonomous system share equal responsibility through bidirectional communication channels. This paper tackles the hybrid mode as the only one viable to successfully meet Industry 5.0 requirements.</p><p>Visualization and interaction technologies are the foundation of the BCCs of these new-generation HMIs, connecting humans with machines, CPS, data and AI services. Although these technologies are already available within Industry 4.0, their effective integration is still an open research topic. This position paper reviews the main currently available technologies for generating these BCCs, presenting some use cases and applications to encourage further research in this area.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="2.">Visualization and interaction technologies for Industry 5.0</head><p>This section introduces the most promising visualization and interaction technologies for the generation of BCCs for Industry 5.0. Although the integration of these technologies into new-generation interfaces is an open research area, examples of the individual use of the technologies already exist. Figure <ref type="figure" target="#fig_0">1</ref> summarizes the contribution and the approach followed by this paper. In the left corner, different human profiles interacting with industrial systems are represented. In production systems, two main reference models for human activities have been identified: Human-in-the-Loop and Human-in-the-Mesh. The first one, traditionally related to operators, refers to situations in which the worker directly participates in the product fabrication or assembly process and its control loop. The second one, traditionally related to managers and engineers, refers to situations where the worker participates in production planning and its control loop <ref type="bibr" target="#b7">[8]</ref>. In the future, new profiles may arise within Industry 5.0. Although both profiles perform different activities within the decision-making process, interaction and visualization technologies are key to establishing the communication between humans, machines, CPS, data and AI services.</p><p>This BCC is composed of both Human-to-Machine and Machine-to-Human channels. Although not all visualization and interaction technologies are suitable for both channels, their integration will foster the development of new BCCs that empower workers through the use of digital devices and endorse a human-centric approach to technology.</p><p>Visual Computing has already been identified as a key enabling technology for Industry 4.0.
Visual Analytics, Augmented and Virtual Reality, Computer Vision, HCI and related technologies are central to many disruptive applications from a Smart Factory perspective <ref type="bibr" target="#b8">[9,</ref><ref type="bibr" target="#b9">10]</ref>. Voice- and gesture-based interfaces have also been proposed to enhance communication between operators and CPS <ref type="bibr" target="#b10">[11]</ref>.</p><p>These technologies connect humans with the main elements of Industry 5.0 (Figure <ref type="figure" target="#fig_0">1</ref>). The CPS represents the core communication element, as it merges the physical and the virtual world, connecting both to machines and to their digital representation, commonly known as the digital twin. CPS capture data from these elements and feed AI services executed at the edge, in the cloud, or both. Industry 5.0 aligns the objective of these AI services towards a sustainable, human-centric and resilient industry <ref type="bibr" target="#b11">[12]</ref>. Reviewing the plethora of AI services for Industry 5.0 is out of the scope of this paper; interested readers are referred to existing updated reviews of industrial AI services <ref type="bibr" target="#b12">[13]</ref>.</p><p>This paper focuses on the following visualization and interaction technologies: visual analytics; augmented, virtual and mixed reality; voice recognition; natural language processing; and gesture recognition.</p><p>Visual Analytics (VA) combines the strength of human cognitive abilities with analysis methods to extract information from data. High-dimensional, real-time visualization allows the graphical expression of complex process variables at a fraction of the cost of full-scale digitalization <ref type="bibr" target="#b13">[14]</ref>. VA combines machine intelligence with human intelligence to gain insight from data and support informed decision-making.
In a recent survey on the use of VA in manufacturing scenarios <ref type="bibr" target="#b14">[15]</ref>, the authors highlight the extensive need for professional domain-specific knowledge and see human-in-the-loop analysis as one of the major ongoing key challenges of VA systems. Business Analytics is a special case of VA focused on the analysis of historical raw data in order to achieve useful and focused insights and a better understanding of business performance areas <ref type="bibr" target="#b15">[16]</ref>.</p><p>Challenges faced by VA include (i) the integration of process-relevant analytics and visualization, and (ii) the integration of an aging workforce. Aligned with Industry 5.0 objectives, measures should be taken to ensure the ease of use and increased accessibility of VA, with minimal training and upskilling required to gain access to intuitive data visualization <ref type="bibr" target="#b13">[14]</ref>.</p><p>Virtual Reality (VR), Augmented Reality (AR), and Mixed Reality (MR) technologies are shaping new interaction environments. These technologies, which integrate physical and virtual objects at different levels, have received several definitions <ref type="bibr" target="#b16">[17]</ref>. VR replicates an environment that simulates a physical presence in places in the real world or an imagined world, allowing the user to interact in that world. AR is a live, direct or indirect view of a physical, real-world environment whose elements are augmented (or supplemented) by computer-generated sensory input such as sound, video, graphics or GPS data; the real-world content and the computer-generated content are not able to respond to each other. MR is the merging of real and virtual worlds to produce new environments and visualizations where physical and digital objects co-exist and interact in real time. eXtended Reality (XR) comprises the full spectrum of technologies in the virtual-to-reality continuum, such as AR, VR and MR.
XR is an overlay of synthetic content on the real world that is anchored to and interacts with the real world. The key characteristic of XR is that the synthetic and the real-world content are able to react to each other in real time. XR technology is paving the way to new forms of interaction that disrupt traditional desktop interaction. <ref type="bibr" target="#b17">[18]</ref> describes related concepts and presents an agenda for future research.</p><p>Voice recognition and natural language processing systems have greatly increased their performance in recent years due to the integration of new models based on deep neural networks. Conversational assistants, such as the general-purpose Apple Siri, Google Now, Microsoft Cortana or Amazon Alexa, have become the main example of this improvement, simplifying human-machine interaction and making it more natural. Voice assistants could act as a central interaction technology for Industry 5.0. As they are natural for workers, they require minimal training, lowering the knowledge barrier for the existing workforce to interact with CPS. Moreover, they are both eyes-free and hands-free, allowing workers to perform simultaneous tasks, and they are flexible enough to adapt to different communication contexts <ref type="bibr" target="#b18">[19]</ref>.</p><p>The acoustic noise of industrial scenarios has been a traditional problem for this technology. However, recent advances in noise cancellation and speech recognition have proven robust enough for manufacturing environments <ref type="bibr" target="#b19">[20]</ref>.</p><p>The goal of gesture recognition is to use the human body as a direct input device, eliminating the need for intermediate media and controlling the machine directly through defined gestures <ref type="bibr" target="#b20">[21]</ref>.
The objective is to make machines understand the meaning of human gestures, based on technologies such as Computer Vision or glove-based gesture recognition systems <ref type="bibr" target="#b9">[10,</ref><ref type="bibr" target="#b21">22]</ref>. Gesture recognition not only presents technological challenges; the definition of gestures that are robust against individual variations in performance is also an open research field.</p><p>Scenarios combining XR with voice recognition showcase examples of BCCs. <ref type="bibr" target="#b22">[23]</ref> proposes a system for operators to carry out a given task through the combination of XR technologies with voice-interaction process-control logic (Figure <ref type="figure">2</ref>). The proposed work streamlines multiple input and output XR devices into the logical scheme of a voice recognition system, describing and validating a framework that enhances human-machine communication interfaces. The authors showcase two examples empowering operators. The first one focuses on an operator maintaining the gripper of a Universal Robot using HoloLens glasses. As this operation requires the simultaneous use of both hands, voice-based interaction suits its requirements. The second example focuses on the assembly of cables for electric panels with an Augmented Reality system combining different uses of projections and voice-based responses. Moreover, aural responses are also projected visually for operators with hearing impairments.</p></div>
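As a hypothetical illustration of the Human-to-Machine half of such a voice-driven channel, the sketch below maps recognized utterances to process-control actions with simple keyword matching. The command phrases, action names and the upstream speech-to-text engine are all assumptions for illustration, not the framework of the cited work.

```python
from typing import Callable, Dict

# Hypothetical machine actions; a real system would call the CPS API here.
def open_gripper() -> str:
    return "gripper opened"

def close_gripper() -> str:
    return "gripper closed"

def show_next_step() -> str:
    return "projecting next assembly step"

# Keyword-based intent table: recognized phrase -> machine action.
INTENTS: Dict[str, Callable[[], str]] = {
    "open gripper": open_gripper,
    "close gripper": close_gripper,
    "next step": show_next_step,
}

def handle_utterance(utterance: str) -> str:
    """Match a recognized utterance against known intents; when nothing
    matches, the Machine-to-Human half of the channel asks the operator
    to repeat the command."""
    text = utterance.lower().strip()
    for phrase, action in INTENTS.items():
        if phrase in text:
            return action()
    return "command not understood, please repeat"
```

For example, `handle_utterance("please open gripper now")` triggers the gripper action even though the utterance is not an exact match, which is why keyword matching (rather than exact-string matching) is used in this sketch; a production system would replace it with a trained intent classifier.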
<div xmlns="http://www.tei-c.org/ns/1.0"><head>Figure 2:</head><p>Examples for XR with voice recognition <ref type="bibr" target="#b23">[24]</ref> presents an optical inspection-guiding system for electronic board manufacturing. The system monitors in real time the mounting process of electronic components performed by an operator. It visually guides the operator through the mounting process while checking the correctness of their actions. Thus, mounting errors are reduced while operator comfort is enhanced. The interaction with the operator is based on a Computer Vision system that recognizes the correct mounting of the board, coupled with an AR system extended with a voice recognition interface, projecting information onto the real board and a screen, where additional data and controls are located (Figure <ref type="figure">2</ref>).</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="3.">Example of applications of BCCs for AI manufacturing services</head><p>This section presents some examples of the application of the previous technologies to create BCCs that generate Industry 5.0 human-centric workflows within AI-based services. While Industry 4.0 AI manufacturing services are mainly focused on optimizing operational efficiency, within Industry 5.0 AI services should optimize cost functions that include more global footprints of the manufacturing processes. This requires an evolution of both AI models and industrial KPIs and metrics, such as OEE.</p><p>Furthermore, human-machine collaboration must be more flexible. CPS, robots, cobots and even AI models should be able to harmonize human interactions. This will lead to an Augmented Intelligence, which can be defined as the "synergistic technology of humans and computers" <ref type="bibr" target="#b24">[25]</ref>, merging Human Intelligence (HI) and Artificial Intelligence (AI) and understanding intelligence as a fundamentally distributed phenomenon <ref type="bibr" target="#b25">[26]</ref>. AI-based systems should evolve from rigid pattern recognition capabilities towards managing less structured and more chaotic scenarios, better suited to the complexity of real problems. This requires the integration of rules, laws, ontologies or functionally equivalent technologies into AI models, which is currently a challenging open research area.</p><p>Focusing on BCCs, they are critical to enrich decision flows and to integrate humans into them. Moreover, they could improve the explainability and interpretability of current AI algorithms. Interpretability has to do with how accurately a machine learning model can associate a cause with an effect. Explainability has to do with the ability of the parameters, often hidden in deep networks, to justify the results.
Humans may ask about characteristics of AI algorithms before approving their output, applying the previous technologies to provide a natural interface adapted to humans and the manufacturing environment.</p><p>The integration of human knowledge to improve, customize, tune and command AI algorithms requires a dialog where humans and machines assist each other at several tasks. This dialog will add value to human experience and knowledge, strengthening the human role in Industry 5.0. For example, interactive exploratory data analysis is one of the more general tasks where these channels may generate synergies. The interaction channel can be adapted to the context of humans and tasks, applying automatic data filters, custom visualizations and data proposals, and allowing humans to express queries in natural language without training in query languages (for example, "show me in a bar chart the average core temperature of the process for the last 20 production cycles").</p><p>Data labelling and annotation is another task commonly required to train AI services. Although substantial advances are being made in the field of automatic labelling and annotation, human intervention is still required to further improve the quality of the output. BCCs will allow humans to guide automatic labelling algorithms, improving the output in iterative workflows. Humans may select a subset of images or data (similar to current captcha systems), approve the output of the algorithm, or guide the labelling algorithm, while the system reports its output and provides alternatives and suggestions for next steps.</p><p>The same workflow applies to interactive and active learning, where an AI algorithm generates iterative outputs that are increasingly improved by decisions and expertise from humans. The system automatically finds results and asks humans when it does not know the best next step or requires feedback and validation.
This feedback and validation is integrated into the knowledge base of the algorithms, automatically improving future results. Furthermore, humans could ask algorithms about the foundations of their decisions in order to understand and validate the reasons behind them.</p><p>Regarding prescriptive analytics, besides integrating humans in the flow, these channels will empower humans to ask suitable models for prescriptions of further scenarios to foster quality data-based decisions, or to identify potentially dangerous or relevant situations that may harm productivity.</p></div>
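The interactive and active learning workflow of this section can be illustrated with classic uncertainty sampling: the system asks the human only about the items it is least certain of and folds the validated labels back into its labeled set. The item names, scores and oracle below are hypothetical stand-ins for a real model and worker; this is a minimal sketch of the loop, not a specific system from the text.

```python
def select_queries(predictions, k=2):
    """Uncertainty sampling: pick the k items whose predicted probability
    is closest to 0.5, i.e. where the model is least certain."""
    ranked = sorted(predictions, key=lambda p: abs(p[1] - 0.5))
    return [item for item, _ in ranked[:k]]

def active_learning_round(predictions, human_label, labeled):
    """One round of the loop: query the human on the most uncertain items
    and integrate the validated labels into the labeled set, which a real
    system would then use to retrain the model."""
    queries = select_queries(predictions)
    for item in queries:
        labeled[item] = human_label(item)  # human feedback and validation
    return labeled, queries

# Hypothetical predictions: (item, probability of "defect").
preds = [("part-17", 0.52), ("part-3", 0.97),
         ("part-8", 0.49), ("part-22", 0.91)]
labeled, queries = active_learning_round(preds, lambda item: "defect", {})
```

Here the human is asked only about `part-8` and `part-17`, the two near-0.5 predictions, while the confident predictions for `part-3` and `part-22` pass through untouched; this is the sense in which the BCC bothers the worker only when the algorithm genuinely needs feedback.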
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="4.">Conclusions</head><p>New synergies between humans and CPS are key to generating collaborators instead of competitors, strengthening the human role in Industry 5.0. BCCs between humans and machines lay the foundation of these synergies. This paper has analyzed the main visualization and interaction technologies for generating these BCCs: visual analytics; augmented, virtual and mixed reality; voice recognition; natural language processing; and gesture recognition. Although the integration of these technologies into new-generation interfaces is an open research area, examples of the individual use of the technologies within manufacturing scenarios already exist.</p><p>The successful design, development and integration of these BCCs poses relevant open research challenges on the path to revolutionizing current manufacturing environments, especially those related to the integration of humans into AI service and algorithm workflows.</p></div><figure xmlns="http://www.tei-c.org/ns/1.0" xml:id="fig_0"><head>Figure 1 :</head><label>1</label><figDesc>Figure 1: Visualization and Interaction technologies for Industry 5.0</figDesc><graphic coords="2,99.08,534.50,396.73,158.55" type="bitmap" /></figure>
		</body>
		<back>
			<div type="references">

				<listBibl>

<biblStruct xml:id="b0">
	<monogr>
		<ptr target="https://ec.europa.eu/info/publications/industry-50-transformative-vision-europe_en" />
		<title level="m">European Commission, Industry 5.0: A Transformative Vision for Europe, Governing Systemic Transformations towards a Sustainable Industry</title>
				<imprint>
			<date type="published" when="2022">2022</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b1">
	<analytic>
		<title level="a" type="main">Industry 5.0 -A Human-Centric Solution</title>
		<author>
			<persName><forename type="first">S</forename><surname>Nahavandi</surname></persName>
		</author>
		<idno type="DOI">10.3390/su11164371</idno>
	</analytic>
	<monogr>
		<title level="j">Sustainability</title>
		<imprint>
			<biblScope unit="volume">11</biblScope>
			<biblScope unit="page">4371</biblScope>
			<date type="published" when="2019">2019</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b2">
	<analytic>
		<title level="a" type="main">Graphics and Media Technologies for Operators in Industry 4.0</title>
		<author>
			<persName><forename type="first">J</forename><surname>Posada</surname></persName>
		</author>
		<author>
			<persName><forename type="first">M</forename><surname>Zorrilla</surname></persName>
		</author>
		<author>
			<persName><forename type="first">A</forename><surname>Dominguez</surname></persName>
		</author>
		<author>
			<persName><forename type="first">B</forename><surname>Simoes</surname></persName>
		</author>
		<author>
			<persName><forename type="first">P</forename><surname>Eisert</surname></persName>
		</author>
		<author>
			<persName><forename type="first">…</forename><forename type="middle">M</forename><surname>Guevara</surname></persName>
		</author>
		<idno type="DOI">10.1109/MCG.2018.053491736</idno>
	</analytic>
	<monogr>
		<title level="j">IEEE Computer Graphics and Applications</title>
		<imprint>
			<biblScope unit="volume">38</biblScope>
			<biblScope unit="page" from="119" to="132" />
			<date type="published" when="2018">2018</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b3">
	<analytic>
		<title level="a" type="main">Human-in-the-Loop Cyber-Physical Production Systems Control Human-in-the-Loop Cyber-Physical Production Systems Control (HiLCP2sC)</title>
		<author>
			<persName><forename type="first">M</forename><surname>Gaham</surname></persName>
		</author>
		<author>
			<persName><forename type="first">B</forename><surname>Bouzouia</surname></persName>
		</author>
		<author>
			<persName><forename type="first">N</forename><surname>Achour</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="j">Studies in Computational Intelligence</title>
		<imprint>
			<biblScope unit="volume">594</biblScope>
			<biblScope unit="page" from="315" to="325" />
			<date type="published" when="2015">2015</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b4">
	<analytic>
		<title level="a" type="main">A human-in-the-loop cyberphysical system for collaborative assembly in smart manufacturing</title>
		<author>
			<persName><forename type="first">M</forename><forename type="middle">A</forename><surname>Ruiz</surname></persName>
		</author>
		<author>
			<persName><forename type="first">R</forename><surname>Garcia</surname></persName>
		</author>
		<author>
			<persName><forename type="first">L</forename><surname>Rojas</surname></persName>
		</author>
		<author>
			<persName><forename type="first">E</forename><surname>Gualtieri</surname></persName>
		</author>
		<author>
			<persName><forename type="first">D</forename><surname>Rauch</surname></persName>
		</author>
		<author>
			<persName><surname>Matt</surname></persName>
		</author>
		<idno type="DOI">10.1016/j.procir.2019.03.162</idno>
	</analytic>
	<monogr>
		<title level="j">Procedia CIRP</title>
		<imprint>
			<biblScope unit="volume">81</biblScope>
			<biblScope unit="page" from="600" to="605" />
			<date type="published" when="2019">2019</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b5">
	<analytic>
		<title level="a" type="main">Perceptual User Interfaces: Multimodal Interfaces that Process What Comes Naturally</title>
		<author>
			<persName><forename type="first">S</forename><surname>Oviatt</surname></persName>
		</author>
		<author>
			<persName><forename type="first">P</forename><surname>Cohen</surname></persName>
		</author>
		<idno type="DOI">10.1145/330534.330538</idno>
	</analytic>
	<monogr>
		<title level="j">Communications of the ACM</title>
		<imprint>
			<biblScope unit="volume">43</biblScope>
			<biblScope unit="page" from="45" to="53" />
			<date type="published" when="2000">2000</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b6">
	<analytic>
		<title level="a" type="main">Collaborative Decision-Making for Human-Technology Interaction-A Case Study Using an Automated Water Bottling Plant</title>
		<author>
			<persName><forename type="first">J</forename><surname>Coetzer</surname></persName>
		</author>
		<author>
			<persName><forename type="first">R</forename><forename type="middle">B</forename><surname>Kuriakose</surname></persName>
		</author>
		<author>
			<persName><forename type="first">H</forename><forename type="middle">J</forename><surname>Vermaak</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="j">Journal of Physics: Conference Series</title>
		<imprint>
			<biblScope unit="volume">1577</biblScope>
			<biblScope unit="page">12024</biblScope>
			<date type="published" when="2020">2020</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b7">
	<analytic>
		<title level="a" type="main">Symbiotic Integration of Human Activities in Cyber-Physical Systems</title>
		<author>
			<persName><forename type="first">P</forename><surname>Fantini</surname></persName>
		</author>
		<author>
			<persName><forename type="first">P</forename><surname>Leitao</surname></persName>
		</author>
		<author>
			<persName><forename type="first">J</forename><surname>Barbosa</surname></persName>
		</author>
		<author>
			<persName><forename type="first">M</forename><surname>Taisch</surname></persName>
		</author>
		<idno type="DOI">10.1016/j.ifacol.2019.12.124</idno>
	</analytic>
	<monogr>
		<title level="j">IFAC-PapersOnLine</title>
		<imprint>
			<biblScope unit="volume">52</biblScope>
			<biblScope unit="page" from="133" to="138" />
			<date type="published" when="2019">2019</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b8">
	<analytic>
		<title level="a" type="main">Visual computing as key enabling technology for Industry 4.0 &amp; Industrial Internet</title>
		<author>
			<persName><forename type="first">J</forename><surname>Posada</surname></persName>
		</author>
		<author>
			<persName><forename type="first">C</forename><surname>Toro</surname></persName>
		</author>
		<author>
			<persName><forename type="first">I</forename><surname>Barandiaran</surname></persName>
		</author>
		<author>
			<persName><forename type="first">D</forename><surname>Oyarzun</surname></persName>
		</author>
		<author>
			<persName><forename type="first">P</forename><surname>Eisert</surname></persName>
		</author>
		<idno type="DOI">10.1109/MCG.2015.45</idno>
	</analytic>
	<monogr>
		<title level="j">IEEE Computer Graphics and Applications</title>
		<imprint>
			<biblScope unit="volume">35</biblScope>
			<biblScope unit="page" from="26" to="40" />
			<date type="published" when="2015">2015</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b9">
	<analytic>
		<title level="a" type="main">Computer graphics and visual computing use cases for Industry 4.0 and Operator 4.0</title>
		<author>
			<persName><forename type="first">J</forename><surname>Posada</surname></persName>
		</author>
		<author>
			<persName><forename type="first">I</forename><surname>Barandiaran</surname></persName>
		</author>
		<author>
			<persName><forename type="first">J</forename><forename type="middle">R</forename><surname>Sanchez</surname></persName>
		</author>
		<author>
			<persName><forename type="first">D</forename><surname>Mejia-Parra</surname></persName>
		</author>
		<author>
			<persName><forename type="first">A</forename><surname>Moreno</surname></persName>
		</author>
		<author>
			<persName><forename type="first">M</forename><surname>Ojer</surname></persName>
		</author>
		<author>
			<persName><forename type="first">O</forename><surname>Ruiz-Salguero</surname></persName>
		</author>
		<idno type="DOI">10.1051/smdo/2021026</idno>
	</analytic>
	<monogr>
		<title level="j">International Journal for Simulation and Multidisciplinary Design Optimization</title>
		<imprint>
			<biblScope unit="volume">12</biblScope>
			<biblScope unit="page" from="4" to="9" />
			<date type="published" when="2021">2021</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b10">
	<analytic>
		<title level="a" type="main">Voice-enabled Assistants of the Operator 4.0 in the Social Smart Factory: Prospective role and challenges for an advanced human-machine interaction</title>
		<author>
			<persName><forename type="first">F</forename><surname>Longo</surname></persName>
		</author>
		<author>
			<persName><forename type="first">A</forename><surname>Padovano</surname></persName>
		</author>
		<idno type="DOI">10.1016/j.mfglet.2020.09.001</idno>
	</analytic>
	<monogr>
		<title level="j">Manufacturing Letters</title>
		<imprint>
			<biblScope unit="volume">26</biblScope>
			<biblScope unit="page" from="12" to="16" />
			<date type="published" when="2020">2020</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b11">
	<monogr>
		<author>
			<persName><forename type="first">M</forename><surname>Breque</surname></persName>
		</author>
		<author>
			<persName><forename type="first">L</forename><surname>De Nul</surname></persName>
		</author>
		<author>
			<persName><forename type="first">A</forename><surname>Petrides</surname></persName>
		</author>
		<ptr target="https://ec.europa.eu/info/news/industry-50-towards-more-sustainable-resilient-and-human-centric-industry-2021-jan-07_en" />
		<title level="m">Industry 5.0 - Towards a sustainable, human-centric and resilient European industry</title>
		<imprint>
			<date type="published" when="2021">2021</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b12">
	<analytic>
		<title level="a" type="main">Industrial Artificial Intelligence in Industry 4.0 - Systematic Review, Challenges and Outlook</title>
		<author>
			<persName><forename type="first">R</forename><forename type="middle">S</forename><surname>Peres</surname></persName>
		</author>
		<author>
			<persName><forename type="first">X</forename><surname>Jia</surname></persName>
		</author>
		<author>
			<persName><forename type="first">J</forename><surname>Lee</surname></persName>
		</author>
		<author>
			<persName><forename type="first">K</forename><surname>Sun</surname></persName>
		</author>
		<author>
			<persName><forename type="first">A</forename><forename type="middle">W</forename><surname>Colombo</surname></persName>
		</author>
		<author>
			<persName><forename type="first">J</forename><surname>Barata</surname></persName>
		</author>
		<idno type="DOI">10.1109/ACCESS.2020.3042874</idno>
	</analytic>
	<monogr>
		<title level="j">IEEE Access</title>
		<imprint>
			<biblScope unit="volume">8</biblScope>
			<biblScope unit="page" from="220121" to="220139" />
			<date type="published" when="2020">2020</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b13">
	<analytic>
		<title level="a" type="main">Data visualization for Industry 4.0: A stepping-stone toward a digital future, bridging the gap between academia and industry</title>
		<author>
			<persName><forename type="first">L</forename><surname>Allen</surname></persName>
		</author>
		<author>
			<persName><forename type="first">J</forename><surname>Atkinson</surname></persName>
		</author>
		<author>
			<persName><forename type="first">D</forename><surname>Jayasundara</surname></persName>
		</author>
		<author>
			<persName><forename type="first">J</forename><surname>Cordiner</surname></persName>
		</author>
		<author>
			<persName><forename type="first">P</forename><forename type="middle">Z</forename><surname>Moghadam</surname></persName>
		</author>
		<idno type="DOI">10.1016/j.patter.2021.100266</idno>
	</analytic>
	<monogr>
		<title level="j">Patterns</title>
		<imprint>
			<biblScope unit="volume">2</biblScope>
			<biblScope unit="page">100266</biblScope>
			<date type="published" when="2021">2021</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b14">
	<analytic>
		<title level="a" type="main">A survey of visualization for smart manufacturing</title>
		<author>
			<persName><forename type="first">F</forename><surname>Zhou</surname></persName>
		</author>
		<author>
			<persName><forename type="first">X</forename><surname>Lin</surname></persName>
		</author>
		<author>
			<persName><forename type="first">C</forename><surname>Liu</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Y</forename><surname>Zhao</surname></persName>
		</author>
		<author>
			<persName><forename type="first">P</forename><surname>Xu</surname></persName>
		</author>
		<author>
			<persName><forename type="first">L</forename><surname>Ren</surname></persName>
		</author>
		<author>
			<persName><forename type="first">T</forename><surname>Xue</surname></persName>
		</author>
		<author>
			<persName><forename type="first">L</forename><surname>Ren</surname></persName>
		</author>
		<idno type="DOI">10.1007/s12650-018-0530-2</idno>
	</analytic>
	<monogr>
		<title level="j">Journal of Visualization</title>
		<imprint>
			<biblScope unit="volume">22</biblScope>
			<biblScope unit="page" from="419" to="435" />
			<date type="published" when="2019">2019</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b15">
	<analytic>
		<title level="a" type="main">Business analytics in Industry 4.0: A systematic review</title>
		<author>
			<persName><forename type="first">A</forename><forename type="middle">J</forename><surname>Silva</surname></persName>
		</author>
		<author>
			<persName><forename type="first">P</forename><surname>Cortez</surname></persName>
		</author>
		<author>
			<persName><forename type="first">C</forename><surname>Pereira</surname></persName>
		</author>
		<author>
			<persName><forename type="first">A</forename><surname>Pilastri</surname></persName>
		</author>
		<idno type="DOI">10.1111/exsy.12741</idno>
	</analytic>
	<monogr>
		<title level="j">Expert Systems</title>
		<imprint>
			<biblScope unit="volume">38</biblScope>
			<biblScope unit="page">e12741</biblScope>
			<date type="published" when="2021">2021</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b16">
	<analytic>
		<title level="a" type="main">Virtual, Augmented, and Mixed Reality for Human-Robot Interaction</title>
		<author>
			<persName><forename type="first">T</forename><surname>Williams</surname></persName>
		</author>
		<author>
			<persName><forename type="first">D</forename><surname>Szafir</surname></persName>
		</author>
		<author>
			<persName><forename type="first">T</forename><surname>Chakraborti</surname></persName>
		</author>
		<author>
			<persName><forename type="first">H</forename><surname>Ben Amor</surname></persName>
		</author>
		<idno type="DOI">10.1109/HRI.2019.8673207</idno>
	</analytic>
	<monogr>
		<title level="m">ACM/IEEE International Conference on Human-Robot Interaction</title>
				<meeting><address><addrLine>New York</addrLine></address></meeting>
		<imprint>
			<publisher>IEEE</publisher>
			<date type="published" when="2019">2019</date>
			<biblScope unit="page" from="671" to="672" />
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b17">
	<analytic>
		<title level="a" type="main">The impact of virtual, augmented and mixed reality technologies on the customer experience</title>
		<author>
			<persName><forename type="first">C</forename><surname>Flavián</surname></persName>
		</author>
		<author>
			<persName><forename type="first">S</forename><surname>Ibáñez-Sánchez</surname></persName>
		</author>
		<author>
			<persName><forename type="first">C</forename><surname>Orús</surname></persName>
		</author>
		<idno type="DOI">10.1016/j.jbusres.2018.10.050</idno>
	</analytic>
	<monogr>
		<title level="j">Journal of Business Research</title>
		<imprint>
			<biblScope unit="volume">100</biblScope>
			<biblScope unit="page" from="547" to="560" />
			<date type="published" when="2019">2019</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b18">
	<analytic>
		<title level="a" type="main">EKIN: Towards natural language interaction with industrial production machines</title>
		<author>
			<persName><forename type="first">A</forename><surname>Del Pozo</surname></persName>
		</author>
		<author>
			<persName><forename type="first">L</forename><surname>Garcia-Sardiña</surname></persName>
		</author>
		<author>
			<persName><forename type="first">M</forename><surname>Serras</surname></persName>
		</author>
		<author>
			<persName><forename type="first">A</forename><surname>Gonzalez-Docasal</surname></persName>
		</author>
		<author>
			<persName><forename type="first">M</forename><forename type="middle">I</forename><surname>Torres</surname></persName>
		</author>
		<author>
			<persName><forename type="first">…</forename><forename type="middle">I</forename><surname>Etxebeste</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="m">CEUR Workshop Proceedings</title>
				<imprint>
			<date type="published" when="2021">2021</date>
			<biblScope unit="volume">2968</biblScope>
			<biblScope unit="page" from="5" to="8" />
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b19">
	<monogr>
		<author>
			<persName><forename type="first">R</forename><surname>Gaizauskas</surname></persName>
		</author>
		<ptr target="https://connectedeverything.ac.uk/feasibility-studies/spoken-dialogue-manufacturing/" />
		<title level="m">Investigating spoken dialogue to support manufacturing processes</title>
				<imprint>
			<date type="published" when="2019">2019</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b20">
	<analytic>
		<title level="a" type="main">Deep learning in vision-based static hand gesture recognition</title>
		<author>
			<persName><forename type="first">O</forename><forename type="middle">K</forename><surname>Oyedotun</surname></persName>
		</author>
		<author>
			<persName><forename type="first">A</forename><surname>Khashman</surname></persName>
		</author>
		<idno type="DOI">10.1007/s00521-016-2294-8</idno>
	</analytic>
	<monogr>
		<title level="j">Neural Computing and Applications</title>
		<imprint>
			<biblScope unit="volume">28</biblScope>
			<biblScope unit="page" from="3941" to="3951" />
			<date type="published" when="2017">2017</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b21">
	<analytic>
		<title level="a" type="main">Gesture Control Wearables for Human-Machine Interaction in Industry 4.0</title>
		<author>
			<persName><forename type="first">L</forename><surname>Roda-Sanchez</surname></persName>
		</author>
		<author>
			<persName><forename type="first">T</forename><surname>Olivares</surname></persName>
		</author>
		<author>
			<persName><forename type="first">C</forename><surname>Garrido-Hidalgo</surname></persName>
		</author>
		<author>
			<persName><forename type="first">A</forename><surname>Fernández-Caballero</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="s">Lecture Notes in Computer Science</title>
		<imprint>
			<biblScope unit="page" from="99" to="108" />
			<date type="published" when="2019">2019</date>
			<publisher>Springer-Verlag</publisher>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b22">
	<analytic>
		<title level="a" type="main">Dialogue Enhanced Extended Reality: Interactive System for the Operator 4.0</title>
		<author>
			<persName><forename type="first">M</forename><surname>Serras</surname></persName>
		</author>
		<author>
			<persName><forename type="first">L</forename><surname>García-Sardiña</surname></persName>
		</author>
		<author>
			<persName><forename type="first">B</forename><surname>Simões</surname></persName>
		</author>
		<author>
			<persName><forename type="first">H</forename><surname>Álvarez</surname></persName>
		</author>
		<author>
			<persName><forename type="first">J</forename><surname>Arambarri</surname></persName>
		</author>
		<idno type="DOI">10.3390/app10113960</idno>
	</analytic>
	<monogr>
		<title level="j">Applied Sciences</title>
		<imprint>
			<biblScope unit="volume">10</biblScope>
			<biblScope unit="page">3960</biblScope>
			<date type="published" when="2020">2020</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b23">
	<analytic>
		<title level="a" type="main">Real-time automatic optical system to assist operators in the assembling of electronic components</title>
		<author>
			<persName><forename type="first">M</forename><surname>Ojer</surname></persName>
		</author>
		<author>
			<persName><forename type="first">I</forename><surname>Serrano</surname></persName>
		</author>
		<author>
			<persName><forename type="first">F</forename><forename type="middle">A</forename><surname>Saiz</surname></persName>
		</author>
		<author>
			<persName><forename type="first">I</forename><surname>Barandiaran</surname></persName>
		</author>
		<author>
			<persName><forename type="first">I</forename><surname>Gill</surname></persName>
		</author>
		<author>
			<persName><forename type="first">D</forename><surname>Aguinaga</surname></persName>
		</author>
		<author>
			<persName><forename type="first">D</forename><surname>Alejandro</surname></persName>
		</author>
		<idno type="DOI">10.1007/s00170-020-05125-z</idno>
	</analytic>
	<monogr>
		<title level="j">International Journal of Advanced Manufacturing Technology</title>
		<imprint>
			<biblScope unit="volume">107</biblScope>
			<biblScope unit="page" from="2261" to="2275" />
			<date type="published" when="2020">2020</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b24">
	<analytic>
		<title level="a" type="main">Augmented Intelligence</title>
		<author>
			<persName><forename type="first">M</forename><forename type="middle">N O</forename><surname>Sadiku</surname></persName>
		</author>
		<author>
			<persName><forename type="first">T</forename><forename type="middle">J</forename><surname>Ashaolu</surname></persName>
		</author>
		<author>
			<persName><forename type="first">A</forename><surname>Ajayi-Majebi</surname></persName>
		</author>
		<author>
			<persName><forename type="first">S</forename><forename type="middle">M</forename><surname>Musa</surname></persName>
		</author>
		<idno type="DOI">10.51542/ijscia.v2i5.17</idno>
	</analytic>
	<monogr>
		<title level="j">International Journal of Scientific Advances</title>
		<imprint>
			<biblScope unit="volume">2</biblScope>
			<biblScope unit="page" from="772" to="776" />
			<date type="published" when="2021">2021</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b25">
	<monogr>
		<author>
			<persName><forename type="first">J</forename><surname>Ito</surname></persName>
		</author>
		<ptr target="https://pubpub.ito.com/pub/extended-intelligence/release/1" />
		<title level="m">Extended Intelligence</title>
				<imprint>
			<date type="published" when="2016">2016</date>
		</imprint>
	</monogr>
</biblStruct>

				</listBibl>
			</div>
		</back>
	</text>
</TEI>
