<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta />
    <article-meta>
      <title-group>
        <article-title>Numerical methods for handling robotic arms using augmented reality</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Edgar Iván De la Cruz Vaca</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Edgar Roberto Salazar Achig</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Jonathan Alexis Romero López</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Adriana Estefanía Tigselema Benavides</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Jacson Javier Rodriguez Conde</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <aff id="aff0">
          <label>0</label>
          <institution>Universidad de las Fuerzas Armadas ESPE</institution>
          ,
          <addr-line>S/N y Ambato, Av. Gral. Rumiñahui, Sangolquí, 171103</addr-line>
          ,
          <country country="EC">Ecuador</country>
        </aff>
      </contrib-group>
      <fpage>176</fpage>
      <lpage>192</lpage>
      <abstract>
        <p>This document presents an augmented reality application for mobile devices as a contribution to education: a technological learning tool for handling industrial robotic arms. The application implements advanced control algorithms that allow the simulation of several desired trajectories selected by the user, incorporates animations that show the robot's operation and verify the tracking of the proposed trajectory, and displays the control errors for each trajectory followed. The application is oriented to the simulation of industrial robotic arms within an intuitive and friendly augmented reality environment, which gives users close interaction with the robot's structure and brings simulation programs with new immersion technologies into the educational field. Tests of the augmented reality application demonstrate its ease of use and intuitiveness, providing a better understanding of the operation and structure of programmable manipulators.</p>
      </abstract>
      <kwd-group>
        <kwd>augmented reality</kwd>
        <kwd>industrial robots</kwd>
        <kwd>3D animation</kwd>
        <kwd>numerical control methods</kwd>
      </kwd-group>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>1 Introduction</title>
      <p>
        Education in recent years has been developing new milestones and ways of teaching;
the use of new immersive technological tools increases interest and the learning
experience [
        <xref ref-type="bibr" rid="ref17">17</xref>
        ]. These tools allow the student to interact with the environment, naturally
promoting their interest in discovering things [
        <xref ref-type="bibr" rid="ref4">4</xref>
        ]. The development of AR and VR
applications has increased enormously in the last decade. These technological tools have
as their main axis the interaction of the user with a virtual environment; the great difference
is that augmented reality provides the user with a mixed approach, since it allows
manipulating the virtual environment through a technological device without leaving
the physical world [
        <xref ref-type="bibr" rid="ref12">12</xref>
        ]. The applications of a virtual environment are vast, because it
allows the construction of any object without being limited by dimensions, which
gives students new learning tools in different contexts and topics [
        <xref ref-type="bibr" rid="ref14">14</xref>
        ].
      </p>
      <p>Copyright © 2020 for this paper by its authors. Use permitted under Creative Commons
License Attribution 4.0 International (CC BY 4.0).</p>
      <p>
        AR applications are developed for specific tasks [
        <xref ref-type="bibr" rid="ref9">9</xref>
        ], and can be grouped into two
main groups. First, applications designed as training assistants [7; 10; 11; 15; 18]
provide step-by-step assistance for manual assembly processes.
In [
        <xref ref-type="bibr" rid="ref15">15</xref>
        ] the impact of AR assistants is compared with that of video-based assistants, concluding
that AR reduces the number of errors in production and the operation failures caused by erroneous
maneuvers, requires less effort to memorize the execution steps, and
optimizes the time spent on training or on-site courses in the field. AR assistants
have a wide range of uses, allowing users to interact with different processes,
whether educational, industrial, or others. The authors of [
        <xref ref-type="bibr" rid="ref8">8</xref>
        ] present a system that allows
collaborators to work together in real time to successfully carry out a mechanical
maintenance task; this system uses an interaction paradigm that simulates
the presence of an expert alongside an operator in an assistance situation.
      </p>
      <p>
        Second, entertainment applications [3; 6; 13; 16]. In [
        <xref ref-type="bibr" rid="ref3">3</xref>
        ] an application is presented that is
intended to teach the metabolism of the human body, especially the metabolism of
glucose. The application is full of animations that provide a very immersive and
realistic environment, allowing the understanding of metabolism from the moment
food enters the mouth and throughout its journey through the digestive system. Article
[
        <xref ref-type="bibr" rid="ref16">16</xref>
        ] describes a vision-based AR game with touch-less interaction,
in which users interact with the virtual elements of the scene in front of a camera
through dynamic hand and foot gestures, triggering predefined interaction
events. The author of [
        <xref ref-type="bibr" rid="ref6">6</xref>
        ] analyzes the technological and pedagogical
characteristics of mobile AR games focused on education, describing
Calory Battle AR, a location-based game that combines physical
exercise with the possibility of including educational content, and Leometry, a
story-based geometry-learning AR game.
      </p>
      <p>This work aims to assist learning about industrial robots through the development
of an AR application. The application allows the visualization of the components of the
manipulators by means of animations and multimedia files triggered by the
2D recognition of identification codes, as well as the simulation of the behavior of the
robot structure through the implementation of advanced control algorithms that allow
the operating end to move along the path desired by the user.</p>
      <p>This work is divided into four sections: the first section is this Introduction; the
second section, Development, details the detection technique used, the
characteristics of the mobile application environment and the closed-loop
control laws; the third section presents the Results obtained when using the AR
application on a smartphone, together with their analysis; and, finally, the fourth
section describes the Conclusions of the work.</p>
    </sec>
    <sec id="sec-2">
      <title>2 Development</title>
      <p>The proposed workflow for the creation and development of the AR application is
shown in fig. 1; it considers five main stages, each with a specific task, plus one or more
processes that execute the workflow tasks of the application on the smartphone
(fig. 1: 3D modeling in Blender with an object hierarchy, creation of rotation points with
the origin point coincident with each rotation point, and *.fbx export; 2D recognition
with the AR camera and QR target; motion simulation in augmented reality with Unity;
and the control law, with manual control and path control acting on each rotation point).</p>
      <sec id="sec-3-1">
        <title>Create Rotation Points</title>
      </sec>
      <sec id="sec-3-2">
        <title>Origin Point Coicident with</title>
      </sec>
      <sec id="sec-3-3">
        <title>Rotation Point</title>
        <p>*.fbx</p>
      </sec>
      <sec id="sec-3-4">
        <title>UNITY</title>
      </sec>
      <sec id="sec-3-5">
        <title>Motion Simulation on</title>
      </sec>
      <sec id="sec-3-6">
        <title>Aumented Reality</title>
      </sec>
      <sec id="sec-3-7">
        <title>Law Control</title>
      </sec>
      <sec id="sec-3-8">
        <title>Set Manual Control</title>
      </sec>
      <sec id="sec-3-9">
        <title>Motion Control in each</title>
      </sec>
      <sec id="sec-3-10">
        <title>Rotation Point</title>
      </sec>
      <sec id="sec-3-11">
        <title>Path Control</title>
      </sec>
      <sec id="sec-3-12">
        <title>Set Trayectory</title>
      </sec>
      <sec id="sec-3-13">
        <title>Motion Control in each</title>
      </sec>
      <sec id="sec-3-14">
        <title>Rotation Point</title>
      </sec>
      <sec id="sec-3-15">
        <title>Sublayer 1.1</title>
        <p>
          i) Layer 1.1 covers 3D development using the Blender software, a program
designed for the creation of 3D objects, rendering, animation, special
effects, etc. [
          <xref ref-type="bibr" rid="ref5">5</xref>
          ]. The 3D model of the robotic arm is shown in fig. 2; the origin of each of
its elements or parts is made to coincide with its rotation point, in order to generate
correct movement of the elements within the Unity platform. The file is exported in
*.fbx format.
        </p>
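        <p>As a minimal sketch of this step (the object names and pivot coordinates below are
illustrative assumptions, not taken from the authors' model), the origin of each link can be
made coincident with its rotation point and the model exported in *.fbx using Blender's
Python API:</p>
        <preformat>
# Blender (bpy) sketch: align each link's origin with its rotation point
# and export the model as *.fbx for Unity.
import bpy

# Hypothetical rotation (pivot) points for each link, in world coordinates.
pivots = {
    "link_1": (0.0, 0.0, 0.0),
    "link_2": (0.0, 0.0, 0.2),
    "link_3": (0.0, 0.0, 0.5),
}

for name, pivot in pivots.items():
    obj = bpy.data.objects[name]
    # Place the 3D cursor on the rotation point and use it as the object's origin.
    bpy.context.scene.cursor.location = pivot
    bpy.ops.object.select_all(action='DESELECT')
    obj.select_set(True)
    bpy.context.view_layer.objects.active = obj
    bpy.ops.object.origin_set(type='ORIGIN_CURSOR')

# Export the scene in *.fbx format for the Unity project.
bpy.ops.export_scene.fbx(filepath="//robotic_arm.fbx")
        </preformat>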
        <p>ii) Layer 1.2 describes the 2D recognition technique, which allows the detection of
images and text. The recognition image is loaded into the Vuforia Developer Portal's Target
Manager, which allows the creation of databases and the generation of a Unity-compatible
file, as seen in fig. 3. Vuforia processes each image and generates characteristic points,
and the recognition quality is rated according to these characteristic points. Fig. 4 shows
the standard image to be used and fig. 5 shows the image characteristics that determine
the recognition quality.</p>
        <p>iii) Layer 2 describes the characteristics of the mobile application environment and
the incorporation of the animations.</p>
        <p>Characteristics of the environment: the implementation of a simple but intuitive
interface allows quick handling of and navigation in the mobile application. Table 1 shows
the icons used in the AR application with their respective descriptions. Three main scenes
are created: the first scene, presented in fig. 6, contains the main menu of the application;
the second scene allows the user to view the introduction to robotic arms through video and
animation, as shown in fig. 7; and the third scene allows manual and path control to be
performed, its environment is presented in fig. 8.</p>
        <sec id="sec-3-22-1">
          <title>Introduction</title>
          <p>Control
Video
Animation
Play
Graphics
Return
Exit
Name</p>
        </sec>
        <sec id="sec-3-22-2">
          <title>Icon</title>
        </sec>
        <sec id="sec-3-22-3">
          <title>Action</title>
        <p>iv) Layer 3 details the advanced control laws that are implemented in the AR
application. To meet this objective, the kinematic model of the robotic arm and its
respective stability analysis are described.</p>
        <sec id="sec-2-1">
          <title>2.1 Kinematic model</title>
          <p>
            The diagram proposed in fig. 9 shows a robotic arm with generalized coordinates q and
the position of its operating end, where h is the position of the operating end in space,
l is the length of each link and q its rotation angle; the position of the operating end is
given by the forward kinematics h = f(q). The first-order differential equation considered
is ḣ = f(h, u, t), with h(0) = h0 (1).
The implemented controller is based on numerical methods given in publication [
            <xref ref-type="bibr" rid="ref1">1</xref>
            ]; in
equation (1) the output of the system to be controlled is represented by h, ḣ is its first
derivative, u is the control action, and t is the time. The value of h(t) at the discrete time
t = kT0 is called h(k), where T0 represents the sampling time and k ∈ {0, 1, 2, 3, 4, ...}.
          </p>
          <p>The use of numerical methods to calculate the evolution of the system is mainly
based on the possibility of approximating the state of the system at instant
k+1 when the state and the control action at instant k are known; this approach
is called Euler's method.</p>
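          <p>As an illustration only (not the authors' code), one Euler step approximates h at
instant k+1 from the state and control action at instant k; a minimal Python sketch with an
arbitrary illustrative system f is:</p>
          <preformat>
import numpy as np

def euler_step(h, u, t, f, T0):
    """One Euler step: h(k+1) = h(k) + T0 * f(h(k), u(k), t(k))."""
    return h + T0 * f(h, u, t)

# Arbitrary illustrative system h_dot = f(h, u, t) and constant control action.
f = lambda h, u, t: -0.5 * h + u
h = np.array([1.0, 0.0, 0.0])   # h(0) = h0
u = np.array([0.0, 0.1, 0.2])
T0 = 0.1                        # sampling time

for k in range(3):
    h = euler_step(h, u, k * T0, f, T0)
          </preformat>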
          <p>Applying Euler's method to (1) gives
h(k+1) = h(k) + T0 f(h, u, t). (2)
Thus, the discrete model of the robotic arm can be expressed by
h(k+1) = h(k) + T0 J(q(k)) v(k), (3)
where J(q(k)) is the Jacobian matrix and v(k) is the vector of control actions (joint velocities).
The following expression is used so that the tracking error tends to zero:
h(k+1) = hd(k) – W(hd(k) – h(k)), (4)
in which W is a diagonal matrix whose values 0 &lt; diag(whx, why, whz) &lt; 1 are
parameters of the proposed controller and hd is the desired path.</p>
          <p>Considering (3) and (4), the system can be rewritten as Au = b, with A = T0 J(q(k)),
u = v(k) and b = (I – W)(hd(k) – h(k)).</p>
          <p>Therefore, the viable solution method is to formulate it as a constrained linear
optimization problem that minimizes the norm ‖Au – b‖.</p>
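          <p>A minimal numerical sketch of the resulting controller is shown below; it is not the
authors' implementation: the link lengths, gains, initial configuration and desired path are
placeholder assumptions, and the constrained optimization is approximated here by an
unconstrained least-squares solution of Au = b.</p>
          <preformat>
import numpy as np

# Illustrative 3-DOF arm (rotating base plus two links); lengths are assumptions.
L1, L2, L3 = 0.30, 0.25, 0.20
T0 = 0.05                          # sampling time
W = np.diag([0.8, 0.8, 0.8])       # controller parameters, diagonal values in (0, 1)

def f(q):
    """Forward kinematics h = f(q): position of the operating end."""
    q1, q2, q3 = q
    r = L2 * np.cos(q2) + L3 * np.cos(q2 + q3)
    return np.array([r * np.cos(q1),
                     r * np.sin(q1),
                     L1 + L2 * np.sin(q2) + L3 * np.sin(q2 + q3)])

def jacobian(q, eps=1e-6):
    """Numerical Jacobian J(q) of the forward kinematics."""
    J = np.zeros((3, 3))
    for i in range(3):
        dq = np.zeros(3)
        dq[i] = eps
        J[:, i] = (f(q + dq) - f(q - dq)) / (2 * eps)
    return J

def hd(k):
    """Placeholder desired path: a circle for the operating end."""
    t = k * T0
    return np.array([0.30 * np.cos(0.5 * t), 0.30 * np.sin(0.5 * t), 0.35])

q = np.array([0.0, 0.6, -0.4])     # initial joint configuration (assumption)
for k in range(400):
    h = f(q)
    # From h(k+1) = h(k) + T0 J(q(k)) v(k) and h(k+1) = hd(k) - W(hd(k) - h(k)):
    A = T0 * jacobian(q)
    b = (np.eye(3) - W) @ (hd(k) - h)
    v, *_ = np.linalg.lstsq(A, b, rcond=None)   # stand-in for min ||Au - b||
    q = q + T0 * v                              # integrate joint velocities (Euler)
          </preformat>
          <p>With the diagonal entries of W between 0 and 1, the tracking error hd(k) – h(k) is
contracted at every step, which is the behaviour described above and observed in the error
graphs of the following section.</p>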
        </sec>
        <p>v) Layer 4 allows the rotation angles obtained from the sliders, as well as from the
position and trajectory controllers developed, to be applied to the links (extremities) of
the 3D model in order to generate the movement that follows the trajectory entered by
the user.</p>
    </sec>
    <sec id="sec-3">
      <title>3 Results obtained</title>
      <p>This section shows the augmented reality interface and its usability as a
technological tool for managing industrial robots, as well as the simulation of a control
algorithm that allows users to follow the desired path. To get started with the app, the
APK must first be installed on the smartphone.</p>
      <p>When the application is run and the QR code is focused, the main scene is displayed,
as shown in fig. 10; to exit the AR application, press “exit”. Clicking on “introduction”
opens a new environment called Introduction, and pressing the “animation” icon starts
the animation of the constituent parts of a robotic arm, as shown in fig. 11.</p>
      <p>By clicking on “video”, an introductory video is presented that covers the general
characteristics of a robotic arm, its constitution and its mathematical modeling, as shown in
fig. 12.</p>
      <p>To control the robotic arm, press “control”; this shows a scene containing 3 sliders,
as shown in fig. 13. These sliders change the rotation angles of the 3 links, thus
performing manual control, as shown in fig. 14.</p>
      <p>For the robot to perform path control, it is necessary to focus the path target next to
the QR image of the robot arm; when the image is recognized, the path appears in 3D, and
fig. 15 shows the circular path. Clicking on “play” starts the trajectory control, in which
the operating end of the arm follows the circular trajectory, as shown in fig. 16.</p>
      <p>The graphs of the x, y and z position errors in the AR application and in Matlab can be
seen in fig. 17 and fig. 18, respectively.</p>
      <p>The spiral path of the robot in the AR application can be seen in fig. 19. Fig. 20 and
fig. 21 show the graphs of the x, y and z position errors in the AR application and in Matlab
under the same conditions, respectively.</p>
      <p>The cylindrical spiral path of the robot in the AR application can be seen in fig. 22.
Fig. 23 and fig. 24 show the graphs of the x, y and z position errors in the AR application
and in Matlab under the same conditions, respectively.</p>
      <p>
        Next, the results that indicate the validity and usability of augmented reality for the
handling of industrial robots are presented, specifically for the handling of and training on
industrial robots of three degrees of freedom. The SUS summary evaluation method was used,
whose weighting characteristics are described in document [
        <xref ref-type="bibr" rid="ref2">2</xref>
        ]. The sample for the survey is 13 students of the Universidad de las
Fuerzas Armadas ESPE, in the last semesters of the Electronic Engineering program, to whom a
survey of 10 questions was applied according to their experience when using the application.
The scale produces a single number that represents a composite average of the usability of
the application for mobile devices (smartphones), as indicated in table 2.
The weighting for the odd questions ranges from 1 to 5, 1 being the worst and 5 the best,
while the even questions are weighted from 5 to 1, 1 being the best and 5 the worst; these
values are multiplied by the number of answers to each question and the arithmetic average
is obtained. The total obtained from the sum over the 10 questions is 34.61; the SUS score
is calculated by multiplying this total by 2.5, which determines whether the application is
feasible for the handling of and training on industrial robots. The resulting percentage of
86.53% represents high usability for this type of technological tool.
      </p>
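      <p>A small sketch of this scoring procedure, following the standard SUS weighting
described in [2] (the response lists below are placeholders, not the survey data):</p>
      <preformat>
# SUS-style scoring: responses are 1..5; odd questions contribute (mean - 1),
# even questions contribute (5 - mean), and the total is multiplied by 2.5.
def sus_score(responses_per_question):
    total = 0.0
    for i, responses in enumerate(responses_per_question, start=1):
        mean = sum(responses) / len(responses)
        total += (mean - 1) if i % 2 == 1 else (5 - mean)
    return 2.5 * total   # e.g. a total of 34.61 yields 86.53

# Placeholder data: 10 questions answered by 13 respondents.
example = [[4] * 13 if i % 2 == 0 else [2] * 13 for i in range(10)]
print(sus_score(example))
      </preformat>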
    </sec>
    <sec id="sec-4">
      <title>4 Conclusions</title>
      <p>The work presents an augmented reality application for smartphones that allows
scanning 2D targets and then interacting with the robotic arm and each of its links, in
addition to getting to know its parts through animations and multimedia files. Finally, the
application allows the visualization of the 3D animation of the robotic arm in the different
trajectory-control tests established by the user (circular, spiral and cylindrical-spiral
trajectories), as well as the visualization of the 2D graphs of the control errors, in which
it can be seen that the error reaches zero, indicating that the arm reaches and performs the
desired task.</p>
      <p>In subsequent work, we will contrast the impact and influence of AR applied in
education against the general teaching methodology without AR technologies, as well as the
advantages and disadvantages of learning with this technological tool focused on the
assistance and training of industrial robotic arms.</p>
    </sec>
    <sec id="sec-5">
      <title>Acknowledgements</title>
      <p>The authors would like to thank the Universidad de las Fuerzas Armadas ESPE for its
support in the development of this work.</p>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>
          1. Andaluz, V.H., Molina, M.F., Erazo, Y.P., Ortiz, J.O.: Numerical Methods for Cooperative Control of Double Mobile Manipulators. In: Huang, Y., Wu, H., Liu, H., Yin, Z. (eds.) Intelligent Robotics and Applications. ICIRA 2017. Lecture Notes in Computer Science, vol. 10463, pp. 889-898. Springer, Cham (2017). doi:10.1007/978-3-319-65292-4_77
        </mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>
          2. Bangor, A., Kortum, P.T., Miller, J.T.: An Empirical Evaluation of the System Usability Scale. International Journal of Human-Computer Interaction 24(6), 574-594 (2008). doi:10.1080/10447310802205776
        </mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>
          3. Barrow, J., Forker, C., Sands, A., O'Hare, D., Hurst, W.: Augmented Reality for Enhancing Life Science Education. In: VISUAL 2019 - The Fourth International Conference on Applications and Systems of Visual Paradigms, Rome, Italy, 30 Jun - 4 Jul 2019. https://abdn.pure.elsevier.com/files/144953995/Augmented_Reality_for_Enhancing_Life_Science_Education_Camera_Ready.pdf (2019). Accessed 29 Nov 2019
        </mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>
          4. Billinghurst, M.: Augmented Reality in Education. New Horizons for Learning. http://solomonalexis.com/downloads/ar_edu.pdf (2002). Accessed 21 Mar 2020
        </mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>
          5. blender.org - Home of the Blender project - Free and Open 3D Creation Software. https://www.blender.org (2020). Accessed 21 Mar 2020
        </mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>
          6. Boletsis, C., McCallum, S.: The Table Mystery: An Augmented Reality Collaborative Game for Chemistry Education. In: Ma, M., Oliveira, M.F., Petersen, S., Hauge, J.B. (eds.) Serious Games Development and Applications. SGDA 2013. Lecture Notes in Computer Science, vol. 8101, pp. 86-95. Springer, Berlin, Heidelberg (2013). doi:10.1007/978-3-642-40790-1_9
        </mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>
          7. Boonbrahm, P., Kaewrat, C.: Assembly of the Virtual Model with Real Hands Using Augmented Reality Technology. In: Shumaker, R., Lackey, S. (eds.) Virtual, Augmented and Mixed Reality. Designing and Developing Virtual and Augmented Environments. VAMR 2014. Lecture Notes in Computer Science, vol. 8525, pp. 329-338. Springer, Cham (2014). doi:10.1007/978-3-319-07458-0_31
        </mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>
          8. Bottecchia, S., Cieutat, J.-M., Jessel, J.-P.: T.A.C: Augmented Reality System for Collaborative Tele-Assistance in the Field of Maintenance through Internet. In: AH'2010 (Augmented Human), Apr 2010, Megève, France, pp. 1-7 (2010). doi:10.1145/1785455.1785469
        </mixed-citation>
      </ref>
      <ref id="ref9">
        <mixed-citation>
          9. Bower, M., Howe, C., McCredie, N., Robinson, A., Grover, D.: Augmented Reality in education - cases, places and potentials. Educational Media International 51(1), 1-15 (2014). doi:10.1080/09523987.2014.889400
        </mixed-citation>
      </ref>
      <ref id="ref10">
        <mixed-citation>
          10. Chicaiza, E.A., De la Cruz, E.I., Andaluz, V.H.: Augmented Reality System for Training and Assistance in the Management of Industrial Equipment and Instruments. In: Bebis, G. et al. (eds.) Advances in Visual Computing. ISVC 2018. Lecture Notes in Computer Science, vol. 11241, pp. 675-686. Springer, Cham (2018). doi:10.1007/978-3-030-03801-4_59
        </mixed-citation>
      </ref>
      <ref id="ref11">
        <mixed-citation>
          11. Haritos, T., Macchiarella, N.D.: A mobile application of augmented reality for aerospace maintenance training. In: 24th Digital Avionics Systems Conference, 30 Oct.-3 Nov. 2005, Washington, DC, USA. IEEE (2005). doi:10.1109/DASC.2005.1563376
        </mixed-citation>
      </ref>
      <ref id="ref12">
        <mixed-citation>
          12. Kaufmann, H.: Collaborative Augmented Reality in Education. https://www.researchgate.net/publication/2555518_Collaborative_Augmented_Reality_in_Education (2003). Accessed 21 Mar 2020
        </mixed-citation>
      </ref>
      <ref id="ref13">
        <mixed-citation>
          13. Laine, T.H.: Mobile educational augmented reality games: a systematic literature review and two case studies. Computers 7(19), 2-28 (2018). doi:10.3390/computers7010019
        </mixed-citation>
      </ref>
      <ref id="ref14">
        <mixed-citation>
          14. Lee, K.: Augmented Reality in Education and Training. TechTrends 56, 13-21 (2012). doi:10.1007/s11528-012-0559-3
        </mixed-citation>
      </ref>
      <ref id="ref15">
        <mixed-citation>
          15. Loch, F., Quint, F., Brishtel, I.: Comparing Video and Augmented Reality Assistance in Manual Assembly. In: 2016 12th International Conference on Intelligent Environments (IE), 14-16 Sept. 2016, London, UK, pp. 147-150. IEEE (2016). doi:10.1109/IE.2016.31
        </mixed-citation>
      </ref>
      <ref id="ref16">
        <mixed-citation>
          16. Lv, Z., Halawani, A., Feng, S., ur Réhman, S., Li, H.: Touch-less interactive augmented reality game on vision-based wearable device. Personal and Ubiquitous Computing 19(3-4), 551-567 (2015). doi:10.1007/s00779-015-0844-1
        </mixed-citation>
      </ref>
      <ref id="ref17">
        <mixed-citation>
          17. Pantelidis, V.S.: Reasons to Use Virtual Reality in Education. Themes in Science and Technology Education 2(1-2), 59-70 (2007)
        </mixed-citation>
      </ref>
      <ref id="ref18">
        <mixed-citation>
          18. Schwald, B., de Laval, B.: An Augmented Reality System for Training and Assistance to Maintenance in the Industrial Context. Journal of WSCG 11(1-3). http://wscg.zcu.cz/wscg2003/Papers_2003/I23.pdf (2003). Accessed 21 Mar 2020
        </mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>