<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta>
      <journal-title-group>
        <journal-title>IS-EUD</journal-title>
      </journal-title-group>
    </journal-meta>
    <article-meta>
      <title-group>
        <article-title>AR TutorialKit: an Augmented Reality Toolkit to Create Tutorials</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Federico Meloni</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Alessandra Perniciano</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Giulia Cerniglia</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Vittoria Frau</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Lucio Davide Spano</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <aff id="aff0">
          <label>0</label>
          <institution>University of Cagliari, Department of Mathematics and Computer Science</institution>
          ,
          <addr-line>Via Ospedale 72, 09124 Cagliari</addr-line>
          ,
          <country country="IT">Italy</country>
        </aff>
      </contrib-group>
      <pub-date>
        <year>2023</year>
      </pub-date>
      <volume>9</volume>
      <fpage>06</fpage>
      <lpage>08</lpage>
      <abstract>
        <p>Augmented Reality (AR) is a widely used technology in fields such as medicine, engineering, and architecture, and is also prevalent in social media platforms like Snapchat, Instagram, and TikTok. In recent years, the availability of AR applications and improvements in hardware have made it affordable for educational training in various disciplines. However, limited options are available for the general construction of AR tutorials in the literature. Most solutions are specific to particular contexts, such as medical procedures or industry-specific tasks. This paper proposes an AR toolkit that enables novice programmers to create tutorials without topic restrictions. Our aim is to keep improving TutorialKit so that it can be used flexibly and effectively in a variety of different contexts, enabling it to meet the diverse needs and requirements of users.</p>
      </abstract>
      <kwd-group>
        <kwd>End-User Development</kwd>
        <kwd>Augmented Reality</kwd>
        <kwd>Training</kwd>
        <kwd>Education</kwd>
        <kwd>Tutorial</kwd>
      </kwd-group>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>1. Introduction</title>
      <p>
        Nowadays, Augmented Reality (AR) interfaces are widespread, finding applications in medicine [
        <xref ref-type="bibr" rid="ref1 ref2">1,
2</xref>
        ], engineering [
        <xref ref-type="bibr" rid="ref3 ref4">3, 4</xref>
        ] or architecture [
        <xref ref-type="bibr" rid="ref5 ref6">5, 6</xref>
        ]. In our daily life, AR features are pervasive in social
media platforms such as Snapchat, Instagram, and TikTok, which allow users, for instance, to apply
augmented reality filters to their faces or surroundings. A key AR capability is enhancing the
user’s sensory experience by seamlessly integrating virtual elements within the real-world
environment [
        <xref ref-type="bibr" rid="ref7">7</xref>
        ]. This characteristic makes AR well-suited for use in training. In the past
few years, the availability of AR toolkits and enhancements in hardware have made it easier
to implement AR in educational training. As a result, the use of AR has become affordable
for education and training across various disciplines such as industry [
        <xref ref-type="bibr" rid="ref8">8</xref>
        ], medicine [
        <xref ref-type="bibr" rid="ref9">9</xref>
        ], the
military [
        <xref ref-type="bibr" rid="ref10">10</xref>
        ], agriculture [
        <xref ref-type="bibr" rid="ref11">11</xref>
        ], etc. Augmented Reality is an innovation that has the potential to significantly
change where and when education and training take place.
      </p>
      <p>
        In the literature, there are not many options available to ease the construction of Augmented
Reality tutorials in the general case. Most of the solutions are extremely focused on a particular
domain and are created ad-hoc for a specific context. For example, in the medical field, several
applications explain how to do activities such as sterilisation of tools [
        <xref ref-type="bibr" rid="ref12">12</xref>
        ] or actual simulations
of operations such as thoracotomy [
        <xref ref-type="bibr" rid="ref13">13</xref>
        ], but the infrastructure behind each application (medical
or other) is strictly personalised for that context.
      </p>
      <p>Our project aims to provide an Augmented Reality toolkit that enables novice developers to
rapidly create tutorials in different domains, without the burden of learning all the required
aspects of a full-fledged AR toolkit. The toolkit is flexible and applicable in different contexts
and domains.</p>
    </sec>
    <sec id="sec-2">
      <title>2. Related Work</title>
      <p>
        In the past few years, numerous AR-based training tutorials have been presented [
        <xref ref-type="bibr" rid="ref11 ref14">11, 14, 15, 16</xref>
        ].
An example of an AR tutorial is provided by the American Fuel &amp; Petrochemical Manufacturers
(AFPM), who created a digital toolkit with simulations using virtual and augmented reality to
bridge the knowledge gap across employee generations and deal with competence training
issues [17]. Interesting work was provided by Eckhoff et al. [18], who developed TutAR, which
automatically converts videos of hand procedures into 3D AR tutorials with minimal user input.
Although the system is a good example of creating AR tutorials without any programming
knowledge, it only accepts videos with hand movements and, once a tutorial is created, it cannot
be modified. Another fascinating system is Meta-AR-App [19], an authoring platform
for collaborative AR that enables teachers to create AR tutorials. One of the most important
limitations of this work is related to the tutorial context: tutorials can be created
only with electronic circuit objects.
      </p>
      <p>Our work enables novice programmers to create AR tutorials without being restricted to a
specific field. The structure of tutorials supported by our toolkit is simple, composed of atomic
and elementary steps that can be applied in various contexts. As a result, our toolkit allows
novice programmers to create AR tutorials that can be tailored to the needs of individual users.</p>
    </sec>
    <sec id="sec-3">
      <title>3. The AR Tutorial Kit</title>
      <p>As mentioned earlier, our work aims to make augmented reality tutorial creation easier for less
experienced programmers, by providing tools that can be adapted as much as possible to any
topic, without necessarily having to write complex code. To do this, we created a simple AR
tutorial abstraction, consisting of two key concepts: the task and the tool.</p>
      <p>• A Task is an action that a user must take to finish the tutorial. The aim of the tutorial is
to guide the user in achieving a goal by completing all tasks.</p>
      <p>• A Tool is an object that the user controls to carry out a task.</p>
      <p>TutorialKit allows defining tutorials through a structured encoding of tasks and tools. The
underlying toolkit interprets this information and generates the interface for performing the
tutorial. The structured encoding consists of a JSON file. For each task, it includes the following
information: a title, a description, the URL of a media element (image or video, optional), and a
tool name (identifying the tool to be used for the task, optional). A tool is the object that will be
used to perform the task, and it is defined by a name, an image (needed for the tracking), and the
dimension of the hint. Tasks may be grouped in unordered and ordered lists, which express
their temporal relationships. Ordered lists define tasks that must be performed in sequence
(e.g. executing a recipe), whereas when ordering is not mandatory (e.g. decorating a Christmas
Tree), we use unordered lists.</p>
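As a concrete illustration, a tutorial description following this structure might look like the snippet below. The field names ("ordered", "tools", "tasks", "hintDimension", etc.) are our assumptions based on the description above, not the toolkit's actual schema:

```python
import json

# Hypothetical tutorial description: tasks carry a title, a description, an
# optional media URL and an optional tool name; tools carry a name, a tracking
# image and a hint dimension. Field names are illustrative assumptions.
TUTORIAL_JSON = """
{
  "ordered": true,
  "tools": [
    {"name": "whisk", "image": "whisk.png", "hintDimension": 0.1}
  ],
  "tasks": [
    {"title": "Whip the cream",
     "description": "Whip the cream until stiff peaks form.",
     "media": "https://example.org/whip.mp4",
     "tool": "whisk"}
  ]
}
"""

tutorial = json.loads(TUTORIAL_JSON)
tool_names = {t["name"] for t in tutorial["tools"]}
# Every task that references a tool should reference a declared one.
assert all(task.get("tool") in tool_names
           for task in tutorial["tasks"] if "tool" in task)
```

Grouping the tool definitions separately from the tasks lets several tasks share the same tracked object without repeating its tracking image.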
      <p>We describe the support for the execution of tutorials defined by this structure in Section 3.1,
and the user interface in Section 3.2.</p>
      <sec id="sec-3-1">
        <title>3.1. UIManager</title>
        <p>The implementation of the toolkit relies on the AR Foundation package (https://unity.com/unity/features/arfoundation) in Unity 3D (https://unity.com/). The solution consists of three modules:</p>
        <p>• The UIManager is the main class; it represents the entry point for library users and coordinates communication between the other classes;</p>
        <p>• The ClipboardManager handles the information displayed on the clipboard, such as the user's progress in the tutorial and the success or failure callbacks;</p>
        <p>• The ToolManager identifies the tool in the real environment when performing a task.</p>
        <p>To implement a tutorial, the novice developer imports the UIManager into their application,
providing the aforementioned JSON description of tasks and tools. The implementation also
requires callback functions for the correct or incorrect performance of a given task. After loading
the JSON file, the novice developer links the callbacks to trigger positive or negative feedback
on a given task, while the interface is completely generated by the underlying toolkit. The
feedback consists of a green check or a red cross on the clipboard, for a positive and a negative
outcome, respectively.</p>
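A minimal sketch of this workflow, in Python rather than the toolkit's actual Unity/C# API (class shape, method names, and signatures here are illustrative assumptions):

```python
import json

class UIManager:
    """Illustrative sketch of the entry-point class described above.
    It mirrors the described workflow only: load the JSON description,
    then link success/failure callbacks that drive clipboard feedback."""

    def __init__(self, tutorial_json: str):
        self.tutorial = json.loads(tutorial_json)
        self.on_success = None   # would show a green check on the clipboard
        self.on_failure = None   # would show a red cross on the clipboard
        self.completed = []

    def set_callbacks(self, on_success, on_failure):
        self.on_success = on_success
        self.on_failure = on_failure

    def report_task(self, index: int, ok: bool):
        # Called by the application when a task outcome is detected.
        task = self.tutorial["tasks"][index]
        if ok:
            self.completed.append(index)
            self.on_success(task)
        else:
            self.on_failure(task)

manager = UIManager('{"tasks": [{"title": "Whip the cream"}]}')
feedback = []
manager.set_callbacks(lambda t: feedback.append(("check", t["title"])),
                      lambda t: feedback.append(("cross", t["title"])))
manager.report_task(0, ok=True)
```

The point of the design is visible even in this sketch: the developer supplies only data (the JSON) and outcome callbacks, while interface generation stays inside the toolkit.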
        <p>In future work, we would like to open tutorial development to non-programmers,
through an authoring interface allowing them to specify the structure (i.e., the information in the
JSON file) and to provide a low-code or no-code definition of the callbacks invoked when a task
has been completed correctly or not. This may be achieved by including a set of predefined
triggers and actions (e.g., setting an object position or visibility on collision).</p>
      </sec>
      <sec id="sec-3-2">
        <title>3.2. Interface</title>
        <p>While the UIManager represents the programming interface for the novice developer, the
ClipboardManager and the ToolManager are two modules responsible for updating the AR
interface. The clipboard is the core of the interface, and it displays the task information and
some optional information, which depends on the current task. The Tool Manager is responsible
for highlighting the elements required for completing the task in the AR environment, including
the item that needs to be handled to finish the task and any external tools that will help do so.
Suppose we have a tutorial for making a cake in which one of the tasks is to whip the cream.
Our toolkit could highlight the cream as a necessary object for completing the task, as well as
the whisk, which is considered an external object (since an electric mixer might also be used).</p>
        <sec id="sec-3-2-1">
          <title>3.2.1. Clipboard</title>
          <p>The Clipboard is placed in the virtual space of the scene, and it is depicted as a real clipboard,
so that guidance for completing actions is delivered through an object users are already familiar
with. The information on the clipboard includes the task title, its description, a hint about
the tool to be used, warnings in case the task cannot be performed, and feedback on correct
or wrong task performance. In addition, the clipboard interface shows the progress of the
activity based on the previously completed tasks.</p>
          <p>In the case where tasks have to be performed in an ordered manner (Figure 2), we display a
progress bar whose elements change colour as the tasks are performed, ranging from red to
green as the bar fills, much like a battery icon while charging. Specifically, the
progress bar is red when few tasks have been completed, green when the tutorial is done, and
transitions through intermediate colours based on the number of completed tasks.</p>
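A red-to-green transition of this kind can be obtained with a linear interpolation on the fraction of completed tasks; the following is a minimal sketch of the idea, not the toolkit's actual implementation:

```python
def progress_colour(completed: int, total: int) -> tuple:
    """Linearly interpolate from red (no tasks done) to green (all done),
    returning an (r, g, b) triple in [0, 1]. Mirrors the described
    battery-like colour transition; illustrative only."""
    fraction = completed / total if total else 0.0
    red = (1.0, 0.0, 0.0)
    green = (0.0, 1.0, 0.0)
    return tuple(r + (g - r) * fraction for r, g in zip(red, green))
```

Halfway through a tutorial this yields a yellow-ish mix, giving the intermediate colours mentioned above.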
          <p>This feature provides users with a visual representation of their progress, which can help
motivate them to complete the tutorial.</p>
          <p>In the case where tasks can be executed in any order (Figure 3), we display a series of
spheres, where each sphere represents a task. We display uncompleted tasks with a gray colour,
which changes to green or red, respectively, on success or failure.</p>
          <p>The interface provides buttons for navigating the task list. When there are random order
tasks, the contour of the sphere that represents the task currently displayed in the clipboard
is highlighted in yellow. In both cases, the percentage value of the overall progress in task
completion is displayed.</p>
          <p>The clipboard allows displaying optional information as media elements, such as videos and
images that can aid in task performance. Such audiovisual material is displayed on the left side
of the clipboard on a virtual screen that can be turned on or off by the user and, in the case of a
video, allows the user to play and pause the content.</p>
        </sec>
        <sec id="sec-3-2-2">
          <title>3.2.2. Tool Manager</title>
          <p>The Tool Manager handles the interface guidance on the tool used for the task accomplishment.
Through image recognition techniques, we locate the correct tool for the task displayed in the
clipboard. We highlight it in AR by placing a semitransparent green cuboid in the center of it.
Our current implementation allows highlighting one tool per task. This choice was made not
for technical reasons, but to force the tutorial designer to split the activity into simple
tasks, so that it is easier for the user to understand and execute them. The highlights for a
specific task are turned off as soon as it is completed.</p>
        </sec>
      </sec>
    </sec>
    <sec id="sec-4">
      <title>4. Use Cases</title>
      <p>In this section, we explain how the application works through two different usage scenarios:
assembling a PC (with ordered tasks) and decorating a Christmas Tree (without ordered tasks).
These two use cases were chosen because the first use case requires accurate order in assembling
the components, whereas in the second it is not strictly necessary.</p>
      <sec id="sec-4-1">
        <title>4.1. Assembling a PC</title>
        <p>Suppose novice users bought all the necessary components to assemble their PC at home. They
discovered that there is an app that enables them to follow instructions to put the PC together.
They open the app on their phone and start the PC assembly tutorial. When they start the tutorial,
the app prompts them to rotate the phone screen for a better field of view (Figure 2a). Next, the
app detects the planes in the scene and places a clipboard on the real-world surface (Figure 2b).
The clipboard suggests the tools users will need to assemble the PC (Figure 2c). To tell if users
are executing the task correctly, a green mark is displayed in case of success, and a red sign
in case of failure (Figure 2d). The app guides users through each process step, suggesting the
appropriate tools and providing visual aids to help them complete the task successfully (Figure
2e). Users complete the PC assembly without any difficulties (Figure 2f).</p>
        <p>Figure 2: (a) Plane Detection; (b) The Clipboard; (c) Tool suggestion; (d) Success/Failure of the task; (e) Additional information; (f) Tutorial Completed.</p>
      </sec>
      <sec id="sec-4-2">
        <title>4.2. Decoration of a Christmas Tree</title>
        <p>Consider users who want to decorate their Christmas tree but are unsure of where to begin.
They launch an app that offers a tutorial to help them get started. They can do the tasks in
any sequence they choose because the tutorial, in this instance, is made up of unordered tasks.
When they start the tutorial, they see a clipboard on the right-hand side of the screen (Figure
3a). Each of the spheres on the clipboard is a diferent task that they must finish. The sphere
representing the task they are currently working on is highlighted in yellow (Figure 3b). Users
decide to start with the third task, which involves placing a ball on the tree even though they
haven’t completed the second task yet. As they work on each task, they can tap on the clipboard
to see a small panel appear to the left of it. The panel contains an illustrative image to help them
complete the task correctly (Figure 3c). Once users have completed all the tasks, the tutorial is
finished (Figure 3d).</p>
        <p>Figure 3: (a) The Clipboard; (b) Unordered manner; (c) Additional Information; (d) Tutorial Completed.</p>
      </sec>
      <sec id="sec-4-3">
        <title>4.3. Building the examples with TutorialKit</title>
        <p>We discuss here the information needed to construct these two examples with TutorialKit. First
of all, the type of tutorial must be specified, i.e., whether the tasks are to be performed in an
ordered or unordered manner. In the first example (Assembling a PC) they are ordered,
while in the second one (Decoration of a Christmas Tree) they are unordered. Next, we need
to collect all the information about the tasks, such as the title and the text that explains the action
the user should perform, and all the information about the tools, like the name and the associated
image. The ease of creating these two tutorials lies in the fact that this information is contained
within a JSON file that is loaded when the application is started.</p>
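The only structural difference between the two files is thus the ordering mode. A sketch of how the execution support might treat the two cases (the "ordered" flag name and task titles are our assumptions):

```python
def available_tasks(tutorial: dict, completed: set) -> list:
    """Return indices of tasks the user may attempt next. With ordered
    tasks (the PC assembly case) only the first uncompleted task is
    available; with unordered tasks (the Christmas tree case) every
    uncompleted task is. Illustrative, not the toolkit's actual code."""
    pending = [i for i in range(len(tutorial["tasks"])) if i not in completed]
    if tutorial.get("ordered", True):
        return pending[:1]
    return pending

pc = {"ordered": True,
      "tasks": [{"title": "Mount CPU"}, {"title": "Insert RAM"}]}
tree = {"ordered": False,
        "tasks": [{"title": "Hang balls"}, {"title": "Add lights"}]}
```

This is why the user in Section 4.2 can jump to the third task before finishing the second, while the PC assembly tutorial advances strictly step by step.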
      </sec>
    </sec>
    <sec id="sec-5">
      <title>5. Conclusion and Future Work</title>
      <p>In this paper, we introduced an AR toolkit to help novice developers implement tutorials. The
toolkit relies on a JSON description of the tasks involved and on callbacks for identifying the
successful or failed completion of the tasks. It manages the visualisation of a clipboard-based
guidance interface and the highlighting of the required tools in the AR environment. We
included two examples showing the required information and the resulting tutorials.</p>
      <p>In future work, we will perform a thorough examination of our prototype. We plan to conduct
user studies with novice developers to collect feedback to enhance our work. One of the most
important extensions is an authoring environment for end-user developers. It will provide
instruments to manipulate JSON files and handle the success and failure of tasks without writing
code. Minor improvements include compatibility with further operating systems and devices,
such as AR headsets, and the management of haptic feedback on task completion.</p>
      <p>impact on speed of learning and task performance in aeronautical engineering technology
education, The International Journal of Aerospace Psychology 31 (2021) 219–229. doi:10.1080/24721840.2021.1881403.
[15] J.-R. Chardonnet, G. Fromentin, J. C. M. Outeiro, Augmented reality as an aid for the use
of machine tools, Research and Science Today Supplement (2017) 25–31.
[16] C.-M. Chen, Y.-N. Tsai, Interactive augmented reality system for enhancing library
instruction in elementary schools, Computers &amp; Education 59 (2012) 638–652. doi:10.1016/j.compedu.2012.03.001.
[17] D. Forest, Training the next generation of operators: AFPM immersive learning, Process
Safety Progress 40 (2021). doi:10.1002/prs.12246.
[18] D. Eckhoff, C. Sandor, C. Lins, U. Eck, D. Kalkofen, A. Hein, TutAR: augmented reality
tutorials for hands-only procedures, Proceedings of the 16th ACM SIGGRAPH International
Conference on Virtual-Reality Continuum and its Applications in Industry (2018).
[19] A. Villanueva, Z. Zhu, Z. Liu, K. Peppler, T. Redick, K. Ramani, Meta-AR-App: An authoring
platform for collaborative augmented reality in STEM classrooms, CHI '20, Association for
Computing Machinery, New York, NY, USA, 2020. doi:10.1145/3313831.3376146.</p>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>
          [1]
          <string-name>
            <given-names>M.</given-names>
            <surname>Hanna</surname>
          </string-name>
          ,
          <string-name>
            <surname>I. Ahmed</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.</given-names>
            <surname>Nine</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>Prajapati</surname>
          </string-name>
          , L. Pantanowitz,
          <article-title>Augmented reality technology using microsoft hololens in anatomic pathology</article-title>
          ,
          <source>Archives of Pathology &amp; Laboratory Medicine</source>
          <volume>142</volume>
          (
          <year>2018</year>
          ). doi:10.5858/arpa.2017-0189-OA.
        </mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>
          [2]
          <string-name>
            <given-names>H.</given-names>
            <surname>Al Janabi</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Aydın</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>Palaneer</surname>
          </string-name>
          ,
          <string-name>
            <given-names>N.</given-names>
            <surname>Macchione</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Al-Jabir</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Khan</surname>
          </string-name>
          ,
          <string-name>
            <given-names>P.</given-names>
            <surname>Dasgupta</surname>
          </string-name>
          ,
          <string-name>
            <given-names>K.</given-names>
            <surname>Ahmed</surname>
          </string-name>
          ,
          <article-title>Effectiveness of the hololens mixed reality headset in minimally invasive surgery: A simulation-based feasibility study</article-title>
          ,
          <source>Surgical Endoscopy</source>
          <volume>34</volume>
          (
          <year>2020</year>
          ).
          doi:10.1007/s00464-019-06862-3.
        </mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>
          [3]
          <string-name>
            <given-names>A.</given-names>
            <surname>Hietanen</surname>
          </string-name>
          ,
          <string-name>
            <given-names>R.</given-names>
            <surname>Pieters</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Lanz</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.</given-names>
            <surname>Latokartano</surname>
          </string-name>
          ,
          <string-name>
            <surname>J.-K. Kämäräinen</surname>
          </string-name>
          ,
          <article-title>Ar-based interaction for human-robot collaborative manufacturing</article-title>
          ,
          <source>Robotics and Computer-Integrated Manufacturing</source>
          <volume>63</volume>
          (
          <year>2020</year>
          )
          101891. doi:10.1016/j.rcim.2019.101891.
        </mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>
          [4]
          <string-name>
            <given-names>W.</given-names>
            <surname>Vorraber</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.</given-names>
            <surname>Gasser</surname>
          </string-name>
          ,
          <string-name>
            <given-names>H.</given-names>
            <surname>Webb</surname>
          </string-name>
          ,
          <string-name>
            <given-names>D.</given-names>
            <surname>Neubacher</surname>
          </string-name>
          ,
          <string-name>
            <given-names>P.</given-names>
            <surname>Url</surname>
          </string-name>
          ,
          <article-title>Assessing augmented reality in production: remote-assisted maintenance with hololens</article-title>
          ,
          <source>Procedia CIRP 88</source>
          (
          <year>2020</year>
          )
          <fpage>139</fpage>
          -
          <lpage>144</lpage>
          . doi:10.1016/j.procir.2020.05.025.
        </mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>
          [5]
          <string-name>
            <given-names>H.</given-names>
            <surname>Bahri</surname>
          </string-name>
          ,
          <string-name>
            <given-names>D.</given-names>
            <surname>Krcmarik</surname>
          </string-name>
          ,
          <string-name>
            <given-names>R.</given-names>
            <surname>Moezzi</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.</given-names>
            <surname>Kočí</surname>
          </string-name>
          ,
          <article-title>Efficient use of mixed reality for bim system using microsoft hololens</article-title>
          ,
          <source>IFAC-PapersOnLine</source>
          <volume>52</volume>
          (
          <year>2019</year>
          )
          <fpage>235</fpage>
          -
          <lpage>239</lpage>
          . doi:10.1016/j.ifacol.2019.12.762.
        </mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>
          [6]
          <string-name>
            <given-names>L.</given-names>
            <surname>Zhang</surname>
          </string-name>
          , S. Chen,
          <string-name>
            <given-names>H.</given-names>
            <surname>Dong</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>El</surname>
          </string-name>
          <string-name>
            <surname>Saddik</surname>
          </string-name>
          ,
          <article-title>Visualizing toronto city data with hololens: Using augmented reality for a city model</article-title>
          ,
          <source>IEEE Consumer Electronics Magazine</source>
          <volume>7</volume>
          (
          <year>2018</year>
          )
          <fpage>73</fpage>
          -
          <lpage>80</lpage>
          . doi:10.1109/MCE.2018.2797658.
        </mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>
          [7]
          <string-name>
            <given-names>K.</given-names>
            <surname>Prasad</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.</given-names>
            <surname>Winter</surname>
          </string-name>
          , U. Bhat,
          <string-name>
            <given-names>R. V.</given-names>
            <surname>Acharya</surname>
          </string-name>
          , G. Prabhu,
          <article-title>Image analysis approach for development of a decision support system for detection of malaria parasites in thin blood smear images</article-title>, <source>Journal of Digital Imaging: the official journal of the Society for Computer Applications in Radiology</source> 25 (
          <year>2011</year>
          )
          <fpage>542</fpage>
          -
          <lpage>9</lpage>
          . doi:10.1007/s10278-011-9442-6.
        </mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>
          [8]
          <string-name>
            <given-names>S.</given-names>
            <surname>Feiner</surname>
          </string-name>
          ,
          <string-name>
            <given-names>B.</given-names>
            <surname>Macintyre</surname>
          </string-name>
          ,
          <string-name>
            <given-names>D.</given-names>
            <surname>Seligmann</surname>
          </string-name>
          ,
          <article-title>Knowledge-based augmented reality</article-title>
          ,
          <source>Commun. ACM</source>
          <volume>36</volume>
          (
          <year>1993</year>
          )
          <fpage>53</fpage>
          -
          <lpage>62</lpage>
          . doi:10.1145/159544.159587.
        </mixed-citation>
      </ref>
      <ref id="ref9">
        <mixed-citation>
          [9]
          <string-name>
            <given-names>M.</given-names>
            <surname>Aebersold</surname>
          </string-name>
          ,
          <string-name>
            <given-names>T.</given-names>
            <surname>Voepel-Lewis</surname>
          </string-name>
          ,
          <string-name>
            <given-names>L.</given-names>
            <surname>Cherara</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Weber</surname>
          </string-name>
          ,
          <string-name>
            <given-names>C.</given-names>
            <surname>Khouri</surname>
          </string-name>
          ,
          <string-name>
            <given-names>R.</given-names>
            <surname>Levine</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Tait</surname>
          </string-name>
          ,
          <article-title>Interactive anatomy-augmented virtual simulation training</article-title>
          ,
          <source>Clinical Simulation in Nursing</source>
          <volume>15</volume>
          (
          <year>2018</year>
          )
          <fpage>34</fpage>
          -
          <lpage>41</lpage>
          . doi:10.1016/j.ecns.2017.09.008.
        </mixed-citation>
      </ref>
      <ref id="ref10">
        <mixed-citation>
          [10]
          <string-name>
            <given-names>M.</given-names>
            <surname>Chmielewski</surname>
          </string-name>
          ,
          <string-name>
            <given-names>K.</given-names>
            <surname>Sapiejewski</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Sobolewski</surname>
          </string-name>
          ,
          <article-title>Application of augmented reality, mobile devices, and sensors for a combat entity quantitative assessment supporting decisions and situational awareness development</article-title>
          ,
          <source>Applied Sciences</source>
          <volume>9</volume>
          (
          <year>2019</year>
          ).
        </mixed-citation>
      </ref>
      <ref id="ref11">
        <mixed-citation>
          [11]
          <string-name>
            <given-names>X.</given-names>
            <surname>Yang</surname>
          </string-name>
          ,
          <string-name>
            <given-names>L.</given-names>
            <surname>Shu</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.</given-names>
            <surname>Chen</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M. A.</given-names>
            <surname>Ferrag</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.</given-names>
            <surname>Wu</surname>
          </string-name>
          ,
          <string-name>
            <given-names>E.</given-names>
            <surname>Nurellari</surname>
          </string-name>
          ,
          <string-name>
            <given-names>K.</given-names>
            <surname>Huang</surname>
          </string-name>
          ,
          <article-title>A survey on smart agriculture: Development modes, technologies, and security and privacy challenges</article-title>
          ,
          <source>IEEE/CAA Journal of Automatica Sinica</source>
          <volume>8</volume>
          (
          <year>2021</year>
          )
          <fpage>273</fpage>
          -
          <lpage>302</lpage>
          . doi:10.1109/JAS.2020.1003536.
        </mixed-citation>
      </ref>
      <ref id="ref12">
        <mixed-citation>
          [12]
          <string-name>
            <given-names>V.</given-names>
            <surname>Krauß</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Y.</given-names>
            <surname>Uzun</surname>
          </string-name>
          ,
          <article-title>Supporting medical auxiliary work: The central sterile services department as a challenging environment for augmented reality applications</article-title>
          ,
          <source>in: 2020 IEEE International Symposium on Mixed and Augmented Reality (ISMAR)</source>
          ,
          <year>2020</year>
          , pp.
          <fpage>665</fpage>
          -
          <lpage>671</lpage>
          . doi:10.1109/ISMAR50242.2020.00096.
        </mixed-citation>
      </ref>
      <ref id="ref13">
        <mixed-citation>
          [13]
          <string-name>
            <given-names>T.</given-names>
            <surname>Yonghang</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.</given-names>
            <surname>Shi</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.</given-names>
            <surname>Pan</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Hao</surname>
          </string-name>
          ,
          <string-name>
            <given-names>V.</given-names>
            <surname>Chang</surname>
          </string-name>
          ,
          <article-title>Augmented reality-based visual-haptic modeling for thoracoscopic surgery training systems</article-title>
          ,
          <source>Virtual Reality &amp; Intelligent Hardware</source>
          <volume>3</volume>
          (
          <year>2021</year>
          )
          <fpage>274</fpage>
          -
          <lpage>286</lpage>
          . doi:10.1016/j.vrih.2021.08.002.
        </mixed-citation>
      </ref>
      <ref id="ref14">
        <mixed-citation>
          [14]
          <string-name>
            <given-names>K. B.</given-names>
            <surname>Borgen</surname>
          </string-name>
          ,
          <string-name>
            <given-names>T. D.</given-names>
            <surname>Ropp</surname>
          </string-name>
          ,
          <string-name>
            <given-names>W. T.</given-names>
            <surname>Weldon</surname>
          </string-name>
          ,
          <article-title>Assessment of augmented reality technology's</article-title>
        </mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>