<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta />
    <article-meta>
      <title-group>
        <article-title>Methodology for teaching development of web-based augmented reality with integrated machine learning models</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Serhiy O. Semerikov</string-name>
          <email>semerikov@gmail.com</email>
          <xref ref-type="aff" rid="aff0">0</xref>
          <xref ref-type="aff" rid="aff1">1</xref>
          <xref ref-type="aff" rid="aff3">3</xref>
          <xref ref-type="aff" rid="aff4">4</xref>
          <xref ref-type="aff" rid="aff8">8</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Mykhailo V. Foki</string-name>
          <xref ref-type="aff" rid="aff4">4</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Dmytro S. Shepiliev</string-name>
          <xref ref-type="aff" rid="aff4">4</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Mykhailo M. Mintii</string-name>
          <email>mintii@acnsci.org</email>
          <xref ref-type="aff" rid="aff2">2</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Iryna S. Mintii</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
          <xref ref-type="aff" rid="aff1">1</xref>
          <xref ref-type="aff" rid="aff2">2</xref>
          <xref ref-type="aff" rid="aff4">4</xref>
          <xref ref-type="aff" rid="aff5">5</xref>
          <xref ref-type="aff" rid="aff7">7</xref>
          <xref ref-type="aff" rid="aff8">8</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Olena H. Kuzminska</string-name>
          <email>o.kuzminska@nubip.edu.ua</email>
          <xref ref-type="aff" rid="aff4">4</xref>
          <xref ref-type="aff" rid="aff6">6</xref>
        </contrib>
        <aff id="aff0">
          <label>0</label>
          <institution>Academy of Cognitive and Natural Sciences</institution>
          ,
          <addr-line>54 Universytetskyi Ave., Kryvyi Rih, 50086</addr-line>
          ,
          <country country="UA">Ukraine</country>
        </aff>
        <aff id="aff1">
          <label>1</label>
          <institution>Institute for Digitalisation of Education of the NAES of Ukraine</institution>
          ,
          <addr-line>9 M. Berlynskoho Str., Kyiv, 04060</addr-line>
          ,
          <country country="UA">Ukraine</country>
        </aff>
        <aff id="aff2">
          <label>2</label>
          <institution>Kremenchuk Mykhailo Ostrohradskyi National University</institution>
          ,
          <addr-line>20 University Str., Kremenchuk, 39600</addr-line>
          ,
          <country country="UA">Ukraine</country>
        </aff>
        <aff id="aff3">
          <label>3</label>
          <institution>Kryvyi Rih National University</institution>
          ,
          <addr-line>11 Vitalii Matusevych Str., Kryvyi Rih, 50027</addr-line>
          ,
          <country country="UA">Ukraine</country>
        </aff>
        <aff id="aff4">
          <label>4</label>
          <institution>Kryvyi Rih State Pedagogical University</institution>
          ,
          <addr-line>54 Universytetskyi Ave., Kryvyi Rih, 50086</addr-line>
          ,
          <country country="UA">Ukraine</country>
        </aff>
        <aff id="aff5">
          <label>5</label>
          <institution>Lviv Polytechnic National University</institution>
          ,
          <addr-line>12 Stepana Bandery Str., Lviv, 79000</addr-line>
          ,
          <country country="UA">Ukraine</country>
        </aff>
        <aff id="aff6">
          <label>6</label>
          <institution>National University of Life and Environmental Sciences of Ukraine</institution>
          ,
          <addr-line>15 Heroiv Oborony Str., Kyiv, 03041</addr-line>
          ,
          <country country="UA">Ukraine</country>
        </aff>
        <aff id="aff7">
          <label>7</label>
          <institution>University of Łódź</institution>
          ,
          <addr-line>68 Gabriela Narutowicza Str., 90-136 Łódź</addr-line>
          ,
          <country country="PL">Poland</country>
        </aff>
        <aff id="aff8">
          <label>8</label>
          <institution>Zhytomyr Polytechnic State University</institution>
          ,
          <addr-line>103 Chudnivsyka Str., Zhytomyr, 10005</addr-line>
          ,
          <country country="UA">Ukraine</country>
        </aff>
      </contrib-group>
      <fpage>118</fpage>
      <lpage>145</lpage>
      <abstract>
        <p>Augmented reality (AR) is an emerging technology with many applications in education. Web-based augmented reality (WebAR) provides a cross-platform approach to deliver immersive learning experiences on mobile devices. Integrating machine learning models into WebAR applications can enable advanced interactive effects by responding to user actions. However, little research exists on effective methodologies to teach students WebAR development with integrated machine learning. This paper proposes a methodology with three main steps: (1) Integrating standard TensorFlow.js models like handpose into WebAR scenes for gestures and interactions; (2) Developing custom image classification models with Teachable Machine and exporting to TensorFlow.js; (3) Modifying WebAR applications to load and use exported custom models, displaying model outputs as augmented reality content. The methodology is designed to incrementally introduce machine learning integration, build an understanding of model training and usage, and spark ideas for using machine learning to augment educational content. The methodology provides a starting point for further research into pedagogical frameworks, assessments, and empirical studies on teaching WebAR development with embedded intelligence.</p>
      </abstract>
      <kwd-group>
        <kwd>web-based augmented reality</kwd>
        <kwd>WebAR</kwd>
        <kwd>machine learning</kwd>
        <kwd>TensorFlow.js</kwd>
        <kwd>Teachable Machine</kwd>
        <kwd>educational technology</kwd>
      </kwd-group>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>1. Introduction</title>
      <p>
        Web-based Augmented Reality (WebAR) is one of the most common ways to combine the real and the
virtual on mobile Internet devices [
        <xref ref-type="bibr" rid="ref1 ref2">1, 2</xref>
        ]. The development of web-based augmented reality applications
differs from other development methods in that it is cross-platform and does not require the installation
of developed applications, which significantly increases the level of software mobility compared to
traditional mobile applications [
        <xref ref-type="bibr" rid="ref3 ref4">3, 4</xref>
        ].
      </p>
      <p>
        Currently, the world’s most famous non-profit library for WebAR development is AR.js [
        <xref ref-type="bibr" rid="ref5">5</xref>
        ], founded
by Jerome Etienne (for example, [
        <xref ref-type="bibr" rid="ref6">6</xref>
        ] provides a systematic description of the possibilities of using AR.js
for the development of professional competences of future teachers of STEM disciplines), but HiuKim
Yuen [
        <xref ref-type="bibr" rid="ref7">7</xref>
        ], one of the developers of AR.js, created a new library called MindAR [
        <xref ref-type="bibr" rid="ref8">8</xref>
        ], which is more compact
and technologically advanced, but, unlike AR.js, is little known.
      </p>
      <p>
        AR.js and MindAR are built on the classic ARToolKit and OpenCV engines, respectively, which are
currently the industry standard. At the same time, while AR.js is focused on processing primarily simple
markers up to 16 × 16, MindAR is focused on natural images of complex structures. Another feature
of MindAR that makes it an appropriate learning tool is the inclusion of the well-known TensorFlow
machine learning library [
        <xref ref-type="bibr" rid="ref9">9</xref>
        ], which provides potential opportunities for integrating machine learning
models into WebAR applications to create highly interactive and exciting effects, such as using hand
gestures or facial expressions to control AR content.
      </p>
      <p>The aim of the study is to develop the methodology for teaching the development of augmented
reality for the Web with integrated machine learning models.</p>
      <p>The main objectives of the study are as follows:
1. Perform a bibliometric analysis of sources from educational applications of WebAR.
2. Choose tools for developing augmented reality for the Web.
3. Develop and test a methodology for developing WebAR applications for face tracking.
4. Develop and test a methodology for integrating machine learning models into WebAR applications.</p>
    </sec>
    <sec id="sec-2">
      <title>2. Bibliometric analysis of sources from educational applications of WebAR</title>
      <p>
        To perform a systematic bibliometric analysis for the queries “WebAR” and “Web-based augmented
reality for education”, VOSviewer version 1.6.18 [
        <xref ref-type="bibr" rid="ref10">10</xref>
        ] was used.
      </p>
      <p>As a data source for the first query, Crossref was selected with a search by document titles, which
made it possible to select 19 documents from 2017-2022 (date of request: 26.11.2022). The selected
documents were analysed by the times they were co-cited with other documents.</p>
      <p>
        Out of 92 sources cited in 19 documents, 26 are cited together more than once, forming only 1 cluster
(figure 1), which includes works [
        <xref ref-type="bibr" rid="ref1 ref11 ref2">1, 2, 11</xref>
        ], performed under the supervision of Serhiy O. Semerikov.
      </p>
      <p>
        Scopus was chosen as the data source for the second query, with a search by titles, abstracts and
keywords, which made it possible to select 93 documents from 2001-2023 (figure 2), 66 of which are from
the last five years. The majority of them are journal articles (58 [
        <xref ref-type="bibr" rid="ref12 ref13 ref14 ref15 ref16 ref17 ref18 ref19 ref20 ref21 ref22 ref23 ref24 ref25 ref26 ref27 ref28 ref29 ref30 ref31 ref32 ref33 ref34 ref35 ref36 ref37 ref38 ref39 ref40 ref41 ref42 ref43 ref44 ref45 ref46 ref47 ref48 ref49 ref50 ref51 ref52 ref53 ref54 ref55 ref56 ref57 ref58 ref59 ref60 ref61 ref62 ref63 ref64 ref65 ref66 ref67 ref68 ref69">12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22,
23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47, 48, 49, 50, 51, 52,
53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 63, 64, 65, 66, 67, 68, 69</xref>
        ]), the smaller part is books (4 [
        <xref ref-type="bibr" rid="ref70 ref71 ref72 ref73">70, 71, 72, 73</xref>
        ])
and articles in conference proceedings (31 [
        <xref ref-type="bibr" rid="ref100 ref101 ref102 ref103 ref104 ref74 ref75 ref76 ref77 ref78 ref79 ref80 ref81 ref82 ref83 ref84 ref85 ref86 ref87 ref88 ref89 ref90 ref91 ref92 ref93 ref94 ref95 ref96 ref97 ref98 ref99">74, 75, 76, 77, 78, 79, 80, 81, 82, 83, 84, 85, 86, 87, 88, 89, 90,
91, 92, 93, 94, 95, 96, 97, 98, 99, 100, 101, 102, 103, 104</xref>
        ]).
      </p>
      <p>Out of 301 authors of 93 documents, 27 were cited twice or more times and 9 were cited three or more
times. Figure 3 shows the semantic network of keywords in the documents for the query “Web-based
augmented reality for education”. The distribution of keywords by clusters (figure 4) is shown in table 1.</p>
      <p>The first cluster (highlighted in red in figure 4 and table 1) connects the basic concepts of augmented
reality in education: augmented and virtual reality with education (including medical education) and
human learning, including the use of smartphones.</p>
      <p>Augmented reality is a systemic element – it connects all the clusters and is itself connected to all
other concepts.</p>
      <p>In the analysed documents, virtual reality is not linked to traditional, mobile, and Internet/web-based
learning. It is essential to distinguish virtual reality from virtual learning environments, which include
these concepts.</p>
      <p>The concept of education is also almost universal – it is not only associated with user interfaces and
AR applications.</p>
      <p>The links of medical education with other clusters are quite revealing: in the second cluster – with
the concepts of curricula, computer-aided instruction, education computing, e-learning and students; in
the third – with websites and pedagogical augmented reality technology, in the fourth – with distance
education.</p>
      <p>Learning (in the sense of studying) is related in the second cluster to teaching, students, education
computing, computer-aided instruction, e-learning and user interfaces, and in the third cluster to
websites and motivation. This concept has no direct links to distance education.</p>
      <p>The concept of a human(s) (person(s)) outside their cluster is linked to students and e-learning in the
second cluster and websites in the third.</p>
      <p>Outside of its cluster, Internet/web-based learning is only associated with traditional teaching in the
second cluster.</p>
      <p>Finally, smartphones are linked in the second cluster to teaching, students, education computing,
e-learning and engineering education, and in the third cluster to websites and augmented reality
applications.</p>
      <p>The second cluster (highlighted in green in figure 4 and table 1) connects the concepts of
learning environment design: teaching, engineering education, computer-aided instruction, e-learning,
students, mobile learning, learning environments, education computing, and curricula.</p>
      <p>Central to the second cluster are the concepts of “e-learning” and “students”, which are also almost
universal – formally, they are not associated only with Internet/web-based learning due to their
synonymity with e-learning.</p>
      <p>Computer-aided instruction is related to the concepts of the first (augmented and virtual reality,
education (including medical) and learning) and third (motivation, websites, learning systems, interactive
learning environments, augmented reality applications, augmented reality technology) clusters.</p>
      <p>The concept of teaching is linked in the first cluster to augmented reality, education and learning,
smartphones and Internet/web-based learning, and in the third cluster to websites, augmented reality
applications and augmented reality technology.</p>
      <p>Engineering education is related in the first cluster to augmented and virtual reality, education and
smartphones, and all concepts of the third and fourth clusters.</p>
      <p>Education computing is related in the first cluster to augmented and virtual reality, education (including
medical) and learning, smartphones, and in the third cluster to motivation, learning systems and websites,
and the fourth – with distance education.</p>
      <p>Outside their cluster, learning environments are only related to augmented and virtual reality education
from the first cluster and websites from the third.</p>
      <p>Similarly, mobile learning is related to education and augmented reality in the first cluster and
motivation, websites and learning systems in the third.</p>
      <p>User interfaces have links to the concepts of the first (learning, augmented and virtual reality) and
third (motivation, websites) clusters.</p>
      <p>The Curricula are related to education (including medical education), augmented and virtual reality
in the first cluster, websites in the third cluster, and distance education in the fourth cluster.</p>
      <p>The third cluster (highlighted in blue in figure 4 and table 1) connects the concepts of immersive
learning environment implementation: websites, motivation, learning systems, interactive learning
environments, augmented reality applications and augmented reality technology.</p>
      <p>Central to the third cluster are websites, which are almost universal concepts – formally, they are not
associated only with Internet/web-based learning due to the overlap of the relevant concepts.</p>
      <p>The concept of motivation is related in the first cluster to augmented and virtual reality, education
and learning, and in the third cluster to e-learning and mobile learning, education computing, user
interfaces, computer-aided instruction, students and engineering education.</p>
      <p>Learning systems are related in the first cluster to augmented and virtual reality and education and in
the third cluster to e-learning and mobile learning, education computing, computer-aided instruction,
student-centred teaching and engineering education.</p>
      <p>Interactive learning environments also have similar links: in the first cluster, with augmented and
virtual reality and education, and in the third cluster, with e-learning, computer-aided instruction,
students and engineering education.</p>
      <p>Naturally, augmented reality applications are related to augmented reality and smartphones in the
first cluster and to e-learning, computer-aided instruction, teaching, students and engineering education
in the second.</p>
      <p>Augmented reality technology is related in the first cluster to augmented and virtual reality and
education (including medical education) and in the second cluster to e-learning, computer-aided
instruction, teaching, students and engineering education.</p>
      <p>The fourth cluster (highlighted in yellow in figure 4 and table 1) contains the concept of distance
education, which is linked in the first cluster to the concepts of augmented and virtual reality and the
concept of education (including medical education), in the second cluster to the concepts of student,
engineering education, education computing, e-learning and curricula, and in the third cluster to the
concept of website.</p>
      <p>The analysis of the distribution of concepts by the density of links (figure 5) and time makes it possible
to determine that the oldest (before 2015) studies focused on user interfaces and their application in
education. In 2016, the focus shifted to studying the impact of teaching in learning environments on
students. In 2017, the research actualised the concepts of virtual reality, interactive learning environments,
curricula, and computer-aided instruction in engineering education. The focus of research in 2018
was on education computing, the use of smartphones, augmented reality applications and pedagogical
augmented reality technology.</p>
      <p>WebAR is the focus of research in 2019, with studies addressing the use of smartphones,
online/web-based learning and augmented reality. In 2020, the impact of the COVID-19 pandemic added to the
issues of learning motivation and medical education. A new element of recent research is human
augmentation.</p>
    </sec>
    <sec id="sec-4">
      <title>3. Augmented reality development tools for the Web</title>
      <sec id="sec-4-1">
        <title>3.1. Setting up a web server and remote debugger</title>
        <p>The main development tools to develop in HTML and JavaScript are a simple text editor and a web
browser, where you can open a regular HTML web page saved locally.</p>
        <p>However, this may not work for applications that require a camera. In addition, you may want to test
applications on your own mobile devices from time to time, so it is best to install a local web server like
Simple Web Server.</p>
        <p>It may be helpful to select the HTTPS protocol in the advanced settings – without it, the mobile
device may not be able to access the camera.</p>
        <p>Doing all the development and testing work directly on a desktop browser is possible, but sometimes,
it is still worth trying on a mobile phone.</p>
        <p>If the devices are connected to the same local area network that does not have a firewall, there is no
problem accessing the web server. However, if the network access point is behind a firewall, you can
use ngrok to redirect traffic from the restricted port.</p>
        <p>
          After installing ngrok and creating an account on the website [
          <xref ref-type="bibr" rid="ref105">105</xref>
          ], you need to register the ngrok
agent [
          <xref ref-type="bibr" rid="ref106">106</xref>
          ] and start it, specifying the protocol (e.g. HTTP) and the port number that the firewall denies
access to (e.g. 8887).
        </p>
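The steps above might look like this in a terminal. The port 8887 follows the example in the text; the authtoken placeholder and the exact subcommand names are assumptions against current ngrok agent versions:

```shell
# register the ngrok agent with the authtoken from the ngrok dashboard
ngrok config add-authtoken <YOUR_AUTHTOKEN>
# tunnel the locally served HTTP port that the firewall blocks (8887 in the text)
ngrok http 8887
```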
        <p>Once started, ngrok provides a global HTTPS Internet link – but only while the local web server and
the ngrok redirect are running.</p>
        <p>Traditionally, debugging web applications involves viewing the web browser console, which displays
notifications related to debugging the application.</p>
        <p>
          However, it may be challenging on a mobile device. Here, RemoteJS [
          <xref ref-type="bibr" rid="ref107">107</xref>
          ] will help – by clicking the
Start Debugging button after going to the site, we will get the RemoteJS agent code like this:
&lt;script data-consolejs-channel="9817ec3e-a3f7-fbe3-3836-e2e2d07d5c99"
src="https://remotejs.com/agent/agent.js"&gt;
&lt;/script&gt;
        </p>
        <p>This code should be copied and pasted directly into the web page.</p>
        <p>After that, all debug messages will be sent to the web page at https://remotejs.com/viewer/agent_code,
where agent_code is the value of the data-consolejs-channel variable.</p>
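As a small illustration, the viewer address can be derived from the channel id in the agent snippet (the id below is the example one from the snippet above):

```javascript
// Derive the RemoteJS viewer URL from the agent's data-consolejs-channel value.
const channel = "9817ec3e-a3f7-fbe3-3836-e2e2d07d5c99"; // from the <script> snippet
const viewerUrl = `https://remotejs.com/viewer/${channel}`;
console.log(viewerUrl); // https://remotejs.com/viewer/9817ec3e-a3f7-fbe3-3836-e2e2d07d5c99
```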
      </sec>
      <sec id="sec-4-2">
        <title>3.2. Application of a graphical library for augmented reality on the Web</title>
        <p>
          WebGL [
          <xref ref-type="bibr" rid="ref108">108</xref>
          ] is a JavaScript API for rendering 3D graphics in browsers. It is a cross-platform display
standard supported by all major browsers. However, low-level WebGL code is difficult to read and write,
so more user-friendly libraries have been created.
        </p>
        <p>
          Three.js [
          <xref ref-type="bibr" rid="ref109">109</xref>
          ] is one such library. Its author, Ricardo Miguel Cabello, also known as mrdoob, is one
of the pioneers of WebGL, so this library is often used when building other libraries. Most WebAR
SDKs support Three.js, so it is a must-have library for effectively developing augmented reality web
applications.
        </p>
        <p>To understand how Three.js works at a high level, it is useful to draw an analogy with the work of a
photographer or film director who:
1) customises the scene by placing objects on it;
2) moves the camera to capture footage from different positions and angles.</p>
        <p>Three.js is not a specialised library for augmented reality – it contains much more functionality,
including that which is more suitable for web VR (lighting, cameras, etc.) (figure 6).</p>
        <p>As shown in figure 6, the basis is a scene where objects are created in three steps:
1) determination of object geometry – position vectors, colours, etc.: e.g., BoxGeometry is responsible
for the rectangular parallelepiped;
2) definition of the material – the way the object is rendered (its optical properties – colour, texture,
gloss, etc.): for example, MeshBasicMaterial corresponds to a material that has its colour and
does not reflect rays;
3) geometry and material composition is performed using Mesh.</p>
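A minimal sketch of these three steps. The real BoxGeometry, MeshBasicMaterial and Mesh come from the three.js module in a browser project; stand-in classes are used here so the composition pattern can run anywhere:

```javascript
// Stand-ins for the Three.js classes named in the text (the real ones are
// imported from "three").
class BoxGeometry { constructor(w, h, d) { Object.assign(this, { w, h, d }); } }
class MeshBasicMaterial { constructor({ color }) { this.color = color; } }
class Mesh { constructor(geometry, material) { this.geometry = geometry; this.material = material; } }

// 1) geometry, 2) material (flat colour, no reflected rays), 3) composition
const geometry = new BoxGeometry(1, 1, 1);
const material = new MeshBasicMaterial({ color: 0x00ff00 });
const cube = new Mesh(geometry, material);
console.log(cube.geometry.w, cube.material.color); // 1 65280
```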
        <p>The renderer will display the 3D model on the canvas, considering the material, texture and lighting.
For WebAR applications to work, the scene needs to be transparent so that the video stream from the
camera can be overlaid. This is achieved by setting the alpha parameter to true in the WebGLRenderer
class constructor.</p>
        <p>Rendering itself is performed by the render method, which displays the projection of the scene onto
the canvas (canvas element) from the camera’s point of view.</p>
        <p>For WebAR applications, connect the video stream before linking a canvas to an HTML page.
Figure 7 shows the first implementation of WebAR, in which a real object from the camera is
supplemented with a virtual object.</p>
        <p>Placing a canvas over the video is the basis of WebAR. The only thing that needs to be added is
displaying the object in a more appropriate location and updating its position according to the camera
signal, i.e. object tracking.</p>
      </sec>
      <sec id="sec-4-3">
        <title>3.3. Setting up a library for augmented reality on the Web</title>
        <p>You can change the position of the image by moving the virtual camera, changing its position
(coordinates), and tilting it. Appropriate changes require tracking objects, so it is expected to classify
augmented reality into marker-based, markerless, location-based, etc.</p>
        <p>HiuKim Yuen offers a classification of augmented reality by the type of tracking.</p>
        <p>The first type is image tracking: virtual objects appear on top of target images, which can be
barcode-like (with a predefined structure) or natural (almost any image).</p>
        <p>
          The images do not have to be printed or on-screen – there can even be augmented reality T-shirts
[
          <xref ref-type="bibr" rid="ref110">110</xref>
          ].
        </p>
        <p>The second type of augmented reality is face tracking, where objects are attached to the human face.
Examples include Instagram filters, Google Meet, social media campaigns, apps for trying on virtual
accessories, etc.</p>
        <p>The third type of augmented reality is world tracking, also known as markerless augmented reality.
With this type of tracking, augmented reality objects can be placed anywhere, not limited to a specific
image, face, or physical object.</p>
        <p>World tracking applications continuously capture and track the environment and estimate the physical
position of the application user. Augmented reality objects are often attached to a specific surface, such
as the ground.</p>
        <p>Location-based augmented reality, known for Pokémon GO, Ingress etc., involves linking content to
a specific geographical location – latitude and longitude. Usually, these apps track the environment,
as the augmented content is usually attached to the ground, and the location-based part is rather an
additional condition that triggers the tracking of the environment (or a face) in a specific location.</p>
        <p>Other types of tracking can be defined, such as 3D object tracking, hand tracking, etc.</p>
        <p>Despite the variety of libraries for augmented reality, their main task is to determine the position of
the virtual camera following the tracked object, as illustrated by the following pseudocode:
const ar_engine = new SOME_AR_ENGINE();
while(true) {
await nextVideoFrameReady();
const {position, rotation} = ar_engine.computeCameraPose(video);
camera.position = position;
camera.rotation = rotation;
}</p>
        <p>First, you need to initiate a library – a specific AR engine – and get a link to it. Then, in a continuous
loop, wait for a frame from the video stream of the real camera, determine its position (tilt coordinates),
and move the virtual camera on the canvas to the same position.</p>
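A runnable version of this loop, with the engine and the frame source mocked (SOME_AR_ENGINE is a placeholder in the text as well; a real loop would await each video frame and run indefinitely):

```javascript
// Mock engine: a real one would analyse the video frame and return the
// camera pose; a fixed pose stands in here.
const ar_engine = {
  computeCameraPose(videoFrame) {
    return { position: { x: 0, y: 0, z: 5 }, rotation: { x: 0, y: 0.1, z: 0 } };
  }
};
const nextVideoFrameReady = () => ({ id: 1 }); // stand-in; the real one is awaited
const camera = { position: null, rotation: null };

for (let frame = 0; frame < 3; frame++) {      // bounded instead of while(true)
  const video = nextVideoFrameReady();
  const { position, rotation } = ar_engine.computeCameraPose(video);
  camera.position = position;                  // move the virtual camera to match
  camera.rotation = rotation;
}
console.log(camera.position.z, camera.rotation.y); // 5 0.1
```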
        <p>Often, however, it is not the virtual camera that moves but the objects on the scene. In this case, the
position of the tracked object is determined, rather than the real camera, and then the virtual reality
object is moved to the same position as the tracked object:
const ar_engine = new SOME_AR_ENGINE();
while(true) {
await nextVideoFrameReady();
const {position, rotation} = ar_engine.computeObjectPose(video);
some_object.position = position;
some_object.rotation = rotation;
}</p>
        <p>The tracked image can be of any origin, but it must be prepared: if it contains unnecessary elements,
they should be removed.</p>
        <p>To recognise an image using the MindAR library, you need to select landmarks on the image – the
elements that will be used for recognition. This can be done using the image compiler available at https:
//hiukim.github.io/mind-ar-js-doc/tools/compile. Compiling results in the binary file targets.mind,
which describes the reference points to be tracked.</p>
        <p>Other libraries have similar means of obtaining image descriptions, often called NFT (natural feature
tracking) marker compilers. Such an image should be visually complex and have a high resolution
(details matter here). A visually complex image provides the software many opportunities to track
unique and easily recognisable parts of the image.</p>
        <p>The physical size of the NFT marker also affects the quality of its recognition: small images should
be approached by the mobile device, while large ones should be kept away from it.</p>
        <p>The recognition quality also depends on the brightness of the mobile device’s screen; low-resolution
cameras usually work better when they are close to the markers.</p>
        <p>The Three.js library is a part of MindAR, which significantly simplifies their interaction: the
MindARThree class constructor creates the objects necessary for working with Three.js – renderer,
scene, and camera, which are available as renderer, scene and camera fields, respectively.</p>
        <p>The anchor objects returned by the call to the addAnchor method, whose parameter corresponds
to the number of the image to be recognised, are used to track target images and provide the position
where the object should be placed.</p>
        <p>Instead of adding Three.js objects directly to the scene, they are added to an anchor component – a
group object of the THREE.Group class that defines a set of related objects whose position, orientation,
and visibility can be controlled together. This anchor group is managed by the MindAR library, which
will continuously update the group’s position and orientation in accordance with our tracking set.</p>
        <p>The start method of the MindARThree class sets up the parameters, turns on the camera, and loads
all the necessary data into the web browser’s memory.</p>
        <p>For the renderer, camera, and scene to work, you must create a function to render them. In the
unnamed callback function created by the setAnimationLoop function, for each frame, the render
method is called from the renderer object, whose parameters are the scene and camera objects –
this is the animation on the canvas.</p>
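Putting the pieces of this section together, the wiring might be sketched as follows. The real MindARThree (from the mind-ar library) needs a browser and a camera and its start method is asynchronous, so it is mocked here; the field and method names (renderer, scene, camera, addAnchor, start, setAnimationLoop) follow the text:

```javascript
// Mock of the MindARThree wiring described above.
class MockMindARThree {
  constructor() {
    this.scene = { children: [] };
    this.camera = {};
    this.renderer = {
      frames: 0,
      render(scene, camera) { this.frames += 1; },               // draw one frame
      setAnimationLoop(cb) { for (let i = 0; i < 2; i++) cb(); } // 2 fake frames
    };
  }
  addAnchor(targetIndex) {                 // anchor = a MindAR-managed group
    const anchor = { targetIndex, group: { children: [] } };
    this.scene.children.push(anchor.group);
    return anchor;
  }
  start() { /* would turn on the camera and load targets.mind */ }
}

const mindarThree = new MockMindARThree();
const { renderer, scene, camera } = mindarThree;
const anchor = mindarThree.addAnchor(0);      // track target image #0
anchor.group.children.push({ kind: "cube" }); // content follows the target
mindarThree.start();
renderer.setAnimationLoop(() => renderer.render(scene, camera));
console.log(anchor.targetIndex, renderer.frames); // 0 2
```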
        <p>The result is a fully functional WebAR application that tracks a single image (figure 8).</p>
      </sec>
    </sec>
    <sec id="sec-5">
      <title>4. Methodology for developing WebAR applications for face tracking</title>
      <sec id="sec-5-1">
        <title>4.1. Model of facial anchor points</title>
        <p>The MindAR library has two main sets of modules – for working with images (image) and for working
with faces (face).</p>
        <p>The similarities between the image-tracking and face-tracking APIs are visible in the MindAR code.
Despite the similarities, the addAnchor method interprets the parameter differently. For image tracking,
it was the number of the target image; for face recognition, it is the number of the face reference point.</p>
        <p>
          Face landmark detection is based on the well-known TensorFlow library model [
          <xref ref-type="bibr" rid="ref111">111</xref>
          ]. MediaPipe Face
Mesh model [
          <xref ref-type="bibr" rid="ref112">112</xref>
          ] is a convolutional neural network that detects 468 three-dimensional landmarks on the
face (https://github.com/tensorflow/tfjs-models/raw/master/face-landmarks-detection/mesh_map.jpg),
and we can bind objects to any of them (figure 9).
        </p>
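        <p>The idea of binding an object to a landmark can be sketched as follows (the helper name and the validation are illustrative assumptions; only the addAnchor call and the 0..467 index range come from the text above):

```javascript
// Illustrative sketch: with MindAR's face module, the addAnchor parameter
// is a MediaPipe Face Mesh landmark index (0..467) rather than a target
// image number. The helper name and validation are assumptions.
function attachToLandmark(mindarThree, object3d, landmarkIndex) {
  // MediaPipe Face Mesh defines 468 three-dimensional landmarks
  if (!Number.isInteger(landmarkIndex) || 0 > landmarkIndex || landmarkIndex > 467) {
    throw new RangeError('landmarkIndex must be an integer in 0..467');
  }
  const anchor = mindarThree.addAnchor(landmarkIndex);
  anchor.group.add(object3d); // MindAR keeps the group on the landmark
  return anchor;
}
```
</p>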
      </sec>
      <sec id="sec-5-2">
        <title>4.2. Putting a mesh on your face</title>
        <p>A face mesh is another type of augmented reality that overlays images (textures) on all the reference
points of a person’s face rather than being linked to individual points. Face meshes are used to create
various makeup effects, tattoos, etc. – up to full face virtualisation.</p>
        <p>The face mesh is not a predefined 3D model – it is dynamically generated with constant geometry
updates.</p>
        <p>To apply the mesh to the face, we need a suitable texture.</p>
        <p>The mesh is created by calling addFaceMesh. The addFaceMesh method is similar in form to
addAnchor, but they are different: addAnchor creates an empty group to which objects whose
position is controlled by MindAR are added, while the faceMesh returned by addFaceMesh is a single
displayed object whose geometry changes in each frame.</p>
        <p>The material of the face mesh can be any texture – if you do not set it, the face mesh will look like
the one shown in the first image (figure 10).</p>
        <p>You can see the structure of this mesh in the second image (figure 10) – to do this, set the wireframe
attribute of the mesh material.</p>
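        <p>These steps can be combined as follows (a sketch following MindAR’s face-tracking interface; the helper name is an assumption):

```javascript
// Sketch combining the steps above: addFaceMesh() returns a single
// displayed object whose geometry MindAR regenerates every frame; setting
// the wireframe flag on its material reveals the triangle structure shown
// in figure 10. The helper name is an assumption.
function addWireframeFaceMesh(mindarThree) {
  const faceMesh = mindarThree.addFaceMesh();
  faceMesh.material.wireframe = true; // show mesh structure instead of texture
  mindarThree.scene.add(faceMesh);    // the mesh itself is added to the scene
  return faceMesh;
}
```
</p>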
        <p>
          The third and fourth images (figure 10) are examples of the modified texture of facial landmarks. In
the documentation for Meta Spark Studio [
          <xref ref-type="bibr" rid="ref113">113</xref>
          ] you can find a set of textures for face meshes that can
be used to create your mesh, as described in [
          <xref ref-type="bibr" rid="ref114">114</xref>
          ].
        </p>
        <p>Creating a beautiful mesh requires specific artistic skills, but using the canonical texture (figure 9) is
quite simple – apply the desired image over it and remove unnecessary lines.</p>
      </sec>
    </sec>
    <sec id="sec-6">
      <title>5. Methodology for integrating machine learning models into WebAR applications</title>
      <sec id="sec-6-1">
        <title>5.1. Integration of standard models</title>
        <p>
          For machine learning on the web, TensorFlow [
          <xref ref-type="bibr" rid="ref115">115</xref>
          ] is the most commonly used free and open-source
machine learning library developed by Google. Currently, it supports many languages, including major
ones – Python, Java, and C++ – and community-supported ones: Haskell, C#, Julia, R, Ruby, Rust, and
Scala. It is available on many platforms, including Linux, Windows, Android, and embedded platforms
– the TensorFlow Lite library version is designed to run machine learning models on mobile devices,
microcontrollers, IoT devices, etc.
        </p>
        <p>
          TensorFlow.js [
          <xref ref-type="bibr" rid="ref9">9</xref>
          ] is a JavaScript version of TensorFlow that allows you to develop and use models
using this language directly in the browser.
        </p>
        <p>
          TensorFlow.js has many pre-trained models that can be used immediately [
          <xref ref-type="bibr" rid="ref116">116</xref>
          ]. A complete list of
currently available models can be found at https://github.com/tensorflow/tfjs-models – many of them
are extremely useful and can be an excellent addition to AR applications. If the required functionality is
unavailable, you can create and train your models or modify existing ones.
        </p>
        <p>
          TensorFlow.js is part of the MindAR library. However, models are not part of TensorFlow.js, so they
need to be connected separately – as shown in the example of the handpose.js model described in
[
          <xref ref-type="bibr" rid="ref117">117</xref>
          ]. This model is used to detect the hand and its components.
        </p>
        <p>
          The handpose model is loaded from the TensorFlow Hub (since 2023, a part of Kaggle) [
          <xref ref-type="bibr" rid="ref118">118</xref>
          ]: looking
at this model repository, you can see that the models take up a considerable amount of space, so the load
method that loads them is called as an asynchronous function.
        </p>
        <p>The handpose model processes individual frames taken from the video stream. This is a rather
computationally intensive procedure, so, as long as high accuracy of hand identification is not
required, you can avoid detecting hands in every frame. The detect function creates a separate
animation loop in which, for every tenth frame, the estimateHands method of the loaded model is
called with a video frame as its argument. The method returns a predictions array containing
information about the hand images detected in the frame, so a non-zero array length is a sign that
there was a hand in the frame:
const video = mindarThree.video;
let frameCount = 1;
const detect = async () =&gt; {
  if (frameCount % 10 == 0) {
    const predictions = await model.estimateHands(video);
    if (predictions.length &gt; 0) {
      //...
    }
  }
  frameCount++;
  window.requestAnimationFrame(detect);
};</p>
        <p>ifgure 11 shows an example of setting the position of the plane in the detected image so that it reflects
the position of the bounding box of the hand in the frames – the efect is quite simple, but it provides
an idea of how to use machine learning models in AR applications.</p>
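        <p>A hedged sketch of this mapping, assuming the handpose prediction format (a boundingBox with topLeft and bottomRight corners in video pixel coordinates); the normalisation to a [-1, 1] range with a flipped y axis is an illustrative assumption, not part of the handpose API:

```javascript
// Map a handpose bounding box to a centre and size that a plane in the
// scene could mirror. The [-1, 1] normalisation and the y flip (Three.js
// has y pointing up, video pixels point down) are assumptions.
function handBoxToPlane(prediction, videoWidth, videoHeight) {
  const [x1, y1] = prediction.boundingBox.topLeft;
  const [x2, y2] = prediction.boundingBox.bottomRight;
  const cx = (x1 + x2) / 2; // bounding box centre, in pixels
  const cy = (y1 + y2) / 2;
  return {
    x: (cx / videoWidth) * 2 - 1,      // -1..1, left to right
    y: -((cy / videoHeight) * 2 - 1),  // -1..1, flipped for Three.js
    width: (x2 - x1) / videoWidth,
    height: (y2 - y1) / videoHeight,
  };
}
```
</p>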
      </sec>
      <sec id="sec-6-2">
        <title>5.2. Developing custom models</title>
        <p>
          To quickly create and train your model, you can use the Teachable Machine [
          <xref ref-type="bibr" rid="ref119">119</xref>
          ], a part of the Google
AI Experiments project (https://labs.google/ and https://experiments.withgoogle.com/), which allows
building models to solve problems of image, sound, and pose classification.
        </p>
        <p>To use the Teachable Machine, students are asked to create a new Google account or use an existing
one, and then they can choose the type of model they want to create. There are three types of models
available:
• Image recognition model allows you to identify objects in photos;
• Sound recognition model allows you to recognise audio recordings;
• Pose recognition model allows you to recognise body movements.</p>
        <p>After selecting the model type, you need to provide data for training it through photos, audio
recordings, or videos. Once the data is provided, the Teachable Machine will start training the model,
which may take some time, depending on the size and complexity of the training data. Once the model is
trained, it is advisable to test it to ensure it correctly recognises the data. If the model is inaccurate
enough, you can provide additional data to improve it. Once the model has been successfully trained
and validated, it can be exported to other projects.</p>
        <p>With Teachable Machine’s wide range of features, we can recognise sounds, poses, faces, or any image.
Nevertheless, to start using it, you must prepare photos and audio recordings for further experiments,
train the selected model, and apply it directly to the web environment.</p>
        <p>Clicking on the Get Started button on the home page will take you to a new window where you
can use a project template or create your own.</p>
        <p>When creating your project, choose which model you want to use. We choose Image Project
and click on Standard image model. As a source of images, we suggest that students use their
webcams and take a series of headshots from different angles (tilt and rotation angles), which are saved in
a pre-prepared catalogue. We take several different images from each participant in the experiment
and divide them into classes, giving them the corresponding names (figure 12).</p>
        <p>For each image class, there is a probability that a particular image belongs to that class. Students can
configure additional training parameters, such as the number of iterations and the model’s learning
speed.</p>
        <p>Next, we move on to training the model – at this stage, all images are converted to the corresponding
numerical tensors. The last step is to experiment by choosing images of different people (not just the
participants in the experiment) and discussing the recognition results (figure 13).</p>
      </sec>
      <sec id="sec-6-3">
        <title>5.3. Integration of custom models</title>
        <p>
          The libraries included in Teachable Machine are based on TensorFlow models: MobileNet for
image classification [
          <xref ref-type="bibr" rid="ref120">120</xref>
          ], Speech Commands for sound classification [
          <xref ref-type="bibr" rid="ref121">121</xref>
          ], and PoseNet for body pose
classification [
          <xref ref-type="bibr" rid="ref122">122</xref>
          ].
        </p>
        <p>Accordingly, the built face classification model can be exported and used the same way as the
previously used models of facial landmarks and hand pose.</p>
        <p>Clicking the Export Model button allows you to export in various formats:
• TensorFlow.js – placement of the model at https://teachablemachine.withgoogle.com/models/[...]
or downloading the model and the JavaScript and p5.js code (figure 14);</p>
        <p>• TensorFlow – download Python code and model in h5 (Keras) and SavedModel (TensorFlow)
formats;
• TensorFlow Lite – downloading a model in tflite format for IoT devices based on Android and
Coral.</p>
        <sec id="sec-6-3-1">
          <title>Structure of the exported TensorFlow.js model</title>
          <p>The archive with the model for TensorFlow.js contains three files:
• metadata.json – a text file in JSON format containing information about the version numbers of
TensorFlow.js (tfjsVersion), Teachable Machine (tmVersion), libraries from the Teachable
Machine (packageVersion) and its name (packageName – in our case, it is @teachablemachine/image),
date of creation (timeStamp) and model name (modelName – by default tm-my-image-model),
image size (imageSize – all images are resized to the same size) and categories (labels) used
for data labelling;
• model.json – a text file in JSON format containing information about the neural network
architecture (modelTopology);
• weights.bin – a binary file containing the weighting coefficients of the neural network.</p>
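        <p>The metadata fields listed above can be read directly, as in the following sketch; the field names follow the Teachable Machine export, while the sample values are hypothetical, not taken from the article:

```javascript
// Illustrative reading of a Teachable Machine metadata.json. The field
// names follow the export format; the values are hypothetical samples.
const metadata = JSON.parse(`{
  "tfjsVersion": "1.3.1",
  "tmVersion": "2.4.7",
  "packageVersion": "0.8.5",
  "packageName": "@teachablemachine/image",
  "timeStamp": "2024-01-01T00:00:00.000Z",
  "modelName": "tm-my-image-model",
  "labels": ["student1", "student2", "student3"],
  "imageSize": 224
}`);

// The number of categories the model distinguishes is the number of labels;
// all training images are resized to imageSize x imageSize pixels.
const totalClasses = metadata.labels.length;
console.log(metadata.modelName, totalClasses, metadata.imageSize);
```
</p>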
          <p>When exporting models, test code is offered to verify them, from which you can learn how to
connect the tmImage library and load the model by calling load, the parameters of which are the paths
to the model architecture and metadata files – model.json and metadata.json.</p>
          <p>After loading the model, a call to the getTotalClasses method determines the number
of categories that the model will distinguish – in our case, this value, stored in maxPredictions, is
three.</p>
          <p>Just as before, every tenth frame is passed to the model for analysis by calling predict, which returns
an array of objects, each containing information about a category (className) and the probability
that the image belongs to it (probability) – a string with this information is visualised.</p>
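          <p>Turning the predict() output into the visualised string can be sketched as follows; the exact formatting (one “class: percent” line per category) is an assumption for illustration:

```javascript
// Build the displayed string from the predict() output - an array of
// objects with className and probability fields. The percentage
// formatting is an illustrative choice.
function formatPredictions(predictions) {
  return predictions
    .map((p) => `${p.className}: ${(p.probability * 100).toFixed(1)}%`)
    .join('\n');
}
```
</p>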
          <p>From figure 15, we can see that the image on the left is identified correctly despite the change in
the background compared to the training set (figure 12), while the image on the right is identified
incorrectly.</p>
        </sec>
      </sec>
    </sec>
    <sec id="sec-7">
      <title>6. Conclusions</title>
      <p>The completed solution to the problem of developing a methodology for teaching the development of
augmented reality for the Web with integrated machine learning models made it possible to draw the
following conclusions:
1. The bibliometric analysis of publications (93 documents in 2001-2023) made it possible to identify
the main concepts of the study, grouped into 4 clusters:
a) The first cluster connects the basic concepts of augmented reality in education: augmented and
virtual reality with education (including medical education) and human learning, including the
use of smartphones;
b) The second cluster links the concepts of learning environment design: teaching, engineering
education, computer-aided instruction, e-learning, students, mobile learning, learning
environments, user interfaces, education computing and curricula;
c) The third cluster connects the concepts of immersive learning environment implementation:
websites, motivation, learning systems, interactive learning environments, augmented reality
applications and augmented reality technology;
d) The fourth cluster contains the concept of distance education, linked in the first cluster to the
concepts of augmented and virtual reality and the concept of education (including medical
education), in the second to the concepts of students, engineering education, education computing,
e-learning and curricula, and in the third to the concept of websites.</p>
      <p>The analysis of the distribution of concepts by the density of links and time made it possible to date
the emergence of different concepts and track their development from educational applications of
user interfaces to their augmentation.
2. The selected tools for developing augmented reality for the Web form three groups:
a) basic tools:
• Simple Web Server provides the full functionality needed without installation, meeting
the requirements of simplicity and mobility;
• ngrok traffic redirection allows access to a web server located behind a firewall (on a
student’s or teacher’s computer), which creates conditions for working together remotely;
• RemoteJS remote debugger allows you to debug JavaScript applications on mobile devices
using desktop browsers;
b) Three.js graphics library is a high-level implementation of the cross-platform WebGL display
standard in JavaScript, which allows working with high-level graphical abstractions;
c) MindAR augmented reality library allows working with natural images as augmented reality
anchors and includes the Three.js and TensorFlow.js libraries – the latter is key for integrating
machine learning models created with TensorFlow with WebAR applications built with MindAR.
3. In the process of developing and testing the methodology for developing WebAR applications for
face tracking, the expediency of joint use of the MediaPipe Face Mesh model, a convolutional neural
network that identifies 468 three-dimensional landmarks on the face, and the MindAR library, which
allows to define any of them as an anchor, is substantiated. It is shown that the complete application
of the MediaPipe Face Mesh model in the MindAR library is implemented in the form of a face
mesh that is dynamically generated with constant geometry updates – a type of augmented reality
associated with the overlay of images on all anchor points of the human face. Examples of using
face meshes to create makeup effects, tattoos, etc., are presented.
4. The methodology of integrating machine learning models into WebAR applications involves
mastering three main steps:
a) The first step, integration of standard models, involves familiarisation with pre-trained
TensorFlow.js models that can be used in WebAR applications. The article shows the feasibility of
considering the handpose.js model used to detect the hand and its components,
demonstrates the main performance problem of WebAR – a significant drop when applying the model
to each frame – and suggests a way to solve it. As a result of the first step, a WebAR application
for gestural control of the size and position of a virtual object is created;
b) The second step, custom model development, involves creating and training your TensorFlow
models using the Teachable Machine, which allows you to build models to solve problems of
image, sound, and pose classification;
c) The third step, integration of custom models, is performed by exporting the face classification
model built with the Teachable Machine and modifying the WebAR application developed
in the first step: we load our model, determine the number of categories it will classify, and
display, as the augmented reality object, information about each category and the probability that
the webcam image belongs to it. The latter provides an opportunity to discuss the issues of
classification errors and their dependence on both the settings of the model training parameters
and the way the test images are presented to the WebAR application.</p>
      <p>This study does not exhaust all the components of the problem, and further research is needed:
• history and prospects of WebAR development in education;
• a methodology for the joint use of different neural network modelling environments;
• development of WebAR libraries, in particular, in the direction of implementing ubiquitous
augmented reality;
• the relationship between real and virtual in training in the context of a pandemic, natural disaster
and military conflict.</p>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>
          [1]
          <string-name>
            <given-names>D. S.</given-names>
            <surname>Shepiliev</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S. O.</given-names>
            <surname>Semerikov</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Y. V.</given-names>
            <surname>Yechkalo</surname>
          </string-name>
          ,
          <string-name>
            <given-names>V. V.</given-names>
            <surname>Tkachuk</surname>
          </string-name>
          ,
          <string-name>
            <given-names>O. M.</given-names>
            <surname>Markova</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Y. O.</given-names>
            <surname>Modlo</surname>
          </string-name>
          ,
          <string-name>
            <given-names>I. S.</given-names>
            <surname>Mintii</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M. M.</given-names>
            <surname>Mintii</surname>
          </string-name>
          ,
          <string-name>
            <given-names>T. V.</given-names>
            <surname>Selivanova</surname>
          </string-name>
          ,
          <string-name>
            <given-names>N. K.</given-names>
            <surname>Maksyshko</surname>
          </string-name>
          ,
          <string-name>
            <given-names>T. A.</given-names>
            <surname>Vakaliuk</surname>
          </string-name>
          ,
          <string-name>
            <given-names>V. V.</given-names>
            <surname>Osadchyi</surname>
          </string-name>
          ,
          <string-name>
            <given-names>R. O.</given-names>
            <surname>Tarasenko</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S. M.</given-names>
            <surname>Amelina</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A. E.</given-names>
            <surname>Kiv</surname>
          </string-name>
          ,
          <article-title>Development of career guidance quests using WebAR</article-title>
          ,
          <source>Journal of Physics: Conference Series</source>
          <volume>1840</volume>
          (
          <year>2021</year>
          )
          <fpage>012028</fpage>
          . URL: https://doi.org/10.1088/1742-6596/1840/1/012028.
        </mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>
          [2]
          <string-name>
            <given-names>D. S.</given-names>
            <surname>Shepiliev</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Y. O.</given-names>
            <surname>Modlo</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Y. V.</given-names>
            <surname>Yechkalo</surname>
          </string-name>
          ,
          <string-name>
            <given-names>V. V.</given-names>
            <surname>Tkachuk</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M. M.</given-names>
            <surname>Mintii</surname>
          </string-name>
          ,
          <string-name>
            <given-names>I. S.</given-names>
            <surname>Mintii</surname>
          </string-name>
          ,
          <string-name>
            <given-names>O. M.</given-names>
            <surname>Markova</surname>
          </string-name>
          ,
          <string-name>
            <given-names>T. V.</given-names>
            <surname>Selivanova</surname>
          </string-name>
          ,
          <string-name>
            <given-names>O. M.</given-names>
            <surname>Drashko</surname>
          </string-name>
          ,
          <string-name>
            <given-names>O. O.</given-names>
            <surname>Kalinichenko</surname>
          </string-name>
          ,
          <string-name>
            <given-names>T. A.</given-names>
            <surname>Vakaliuk</surname>
          </string-name>
          ,
          <string-name>
            <given-names>V. V.</given-names>
            <surname>Osadchyi</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S. O.</given-names>
            <surname>Semerikov</surname>
          </string-name>
          ,
          <article-title>WebAR development tools: An overview</article-title>
          ,
          <source>CEUR Workshop Proceedings</source>
          <volume>2832</volume>
          (
          <year>2020</year>
          )
          <fpage>84</fpage>
          -
          <lpage>93</lpage>
          . URL: http://ceur-ws.org/Vol-2832/paper12.pdf.
        </mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>
          [3]
          <string-name>
            <given-names>O. V.</given-names>
            <surname>Syrovatskyi</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S. O.</given-names>
            <surname>Semerikov</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Y. O.</given-names>
            <surname>Modlo</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Y. V.</given-names>
            <surname>Yechkalo</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S. O.</given-names>
            <surname>Zelinska</surname>
          </string-name>
          ,
          <article-title>Augmented reality software design for educational purposes</article-title>
          ,
          <source>CEUR Workshop Proceedings</source>
          <volume>2292</volume>
          (
          <year>2018</year>
          )
          <fpage>193</fpage>
          -
          <lpage>225</lpage>
          . URL: http://ceur-ws.org/Vol-2292/paper20.pdf.
        </mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>
          [4]
          <string-name>
            <given-names>M. I.</given-names>
            <surname>Striuk</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A. M.</given-names>
            <surname>Striuk</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S. O.</given-names>
            <surname>Semerikov</surname>
          </string-name>
          ,
          <article-title>Mobility in the information society: a holistic model</article-title>
          ,
          <source>Educational Technology Quarterly</source>
          <volume>2023</volume>
          (
          <year>2023</year>
          )
          <fpage>277</fpage>
          -
          <lpage>301</lpage>
          . URL: https://doi.org/10.55056/etq.619.
        </mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>
          <source>[5] AR.js Documentation</source>
          ,
          <year>2024</year>
          . URL: https://ar-js-org.github.io/AR.js-Docs/.
        </mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>
          [6]
          <string-name>
            <given-names>S. O.</given-names>
            <surname>Semerikov</surname>
          </string-name>
          ,
          <string-name>
            <surname>M. M. Mintii</surname>
            ,
            <given-names>I. S.</given-names>
          </string-name>
          <string-name>
            <surname>Mintii</surname>
          </string-name>
          ,
          <article-title>Review of the course “Development of Virtual and Augmented Reality Software” for STEM teachers: implementation results and improvement potentials</article-title>
          , in: S. H.
          <string-name>
            <surname>Lytvynova</surname>
            ,
            <given-names>S. O.</given-names>
          </string-name>
          <string-name>
            <surname>Semerikov</surname>
          </string-name>
          (Eds.),
          <source>Proceedings of the 4th International Workshop on Augmented Reality in Education (AREdu 2021), Kryvyi Rih, Ukraine, May 11, 2021</source>
          , volume
          <volume>2898</volume>
          of
          <source>CEUR Workshop Proceedings</source>
          , CEUR-WS.org,
          <year>2021</year>
          , pp.
          <fpage>159</fpage>
          -
          <lpage>177</lpage>
          . URL: http://ceur-ws.org/Vol-2898/paper09.pdf.
        </mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>
          [7]
          <string-name>
            <given-names>H.</given-names>
            <surname>Yuen</surname>
          </string-name>
          , HiuKim Yuen,
          <year>2023</year>
          . URL: https://www.youtube.com/channel/UC-JyA1Z1-p0wgxj5WEX56wg/featured.
        </mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>
          [8]
          <string-name>
            <given-names>H.</given-names>
            <surname>Yuen</surname>
          </string-name>
          , MindAR,
          <year>2023</year>
          . URL: https://hiukim.github.io/mind-ar-js-doc/.
        </mixed-citation>
      </ref>
      <ref id="ref9">
        <mixed-citation>
          [9]
          <source>TensorFlow.js | Machine Learning for JavaScript Developers</source>
          ,
          <year>2024</year>
          . URL: https://www.tensorflow.org/js.
        </mixed-citation>
      </ref>
      <ref id="ref10">
        <mixed-citation>
          [10]
          <article-title>Centre for Science and Technology Studies</article-title>
          , Leiden University, The Netherlands, VOSviewer - Visualizing scientific landscapes,
          <year>2024</year>
          . URL: https://www.vosviewer.com/.
        </mixed-citation>
      </ref>
      <ref id="ref11">
        <mixed-citation>
          [11]
          <string-name>
            <given-names>V. V.</given-names>
            <surname>Tkachuk</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S. O.</given-names>
            <surname>Semerikov</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Y. V.</given-names>
            <surname>Yechkalo</surname>
          </string-name>
          ,
          <string-name>
            <given-names>O. M.</given-names>
            <surname>Markova</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M. M.</given-names>
            <surname>Mintii</surname>
          </string-name>
          ,
          <article-title>WebAR development tools: comparative analysis</article-title>
          ,
          <source>Physical and Mathematical Education</source>
          (
          <year>2020</year>
          ). URL: https://doi.org/10.31110/2413-1571-2020-024-2-021.
        </mixed-citation>
      </ref>
      <ref id="ref12">
        <mixed-citation>
          [12]
          <string-name>
            <given-names>J.</given-names>
            <surname>An</surname>
          </string-name>
          ,
          <string-name>
            <given-names>L.-P.</given-names>
            <surname>Poly</surname>
          </string-name>
          ,
          <string-name>
            <given-names>T. A.</given-names>
            <surname>Holme</surname>
          </string-name>
          ,
          <article-title>Usability testing and the development of an augmented reality application for laboratory learning</article-title>
          ,
          <source>Journal of Chemical Education</source>
          <volume>97</volume>
          (
          <year>2020</year>
          )
          <fpage>97</fpage>
          -
          <lpage>105</lpage>
          . URL: https://doi.org/10.1021/acs.jchemed.9b00453.
        </mixed-citation>
      </ref>
      <ref id="ref13">
        <mixed-citation>
          [13]
          <string-name>
            <given-names>P. E.</given-names>
            <surname>Antoniou</surname>
          </string-name>
          , E. Dafli, G. Arfaras,
          <string-name>
            <given-names>P. D.</given-names>
            <surname>Bamidis</surname>
          </string-name>
          ,
          <article-title>Versatile mixed reality medical educational spaces; requirement analysis from expert users</article-title>
          ,
          <source>Personal and Ubiquitous Computing</source>
          <volume>21</volume>
          (
          <year>2017</year>
          )
          <fpage>1015</fpage>
          -
          <lpage>1024</lpage>
          . URL: https://doi.org/10.1007/s00779-017-1074-5.
        </mixed-citation>
      </ref>
      <ref id="ref14">
        <mixed-citation>
          [14]
          <string-name>
            <given-names>J. V.</given-names>
            <surname>Arteaga</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M. L.</given-names>
            <surname>Gravini-Donado</surname>
          </string-name>
          ,
          <string-name>
            <given-names>L. D. Z.</given-names>
            <surname>Riva</surname>
          </string-name>
          ,
          <article-title>Digital technologies for heritage teaching: Trend analysis in new realities</article-title>
          ,
          <source>International Journal of Emerging Technologies in Learning</source>
          <volume>16</volume>
          (
          <year>2021</year>
          )
          <fpage>132</fpage>
          -
          <lpage>148</lpage>
          . URL: https://doi.org/10.3991/ijet.v16i21.25149.
        </mixed-citation>
      </ref>
      <ref id="ref15">
        <mixed-citation>
          [15]
          <string-name>
            <given-names>T. N.</given-names>
            <surname>Arvanitis</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Petrou</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J. F.</given-names>
            <surname>Knight</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>Savas</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>Sotiriou</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Gargalakos</surname>
          </string-name>
          , E. Gialouri,
          <article-title>Human factors and qualitative pedagogical evaluation of a mobile augmented reality system for science education used by learners with physical disabilities</article-title>
          ,
          <source>Personal and Ubiquitous Computing</source>
          <volume>13</volume>
          (
          <year>2009</year>
          )
          <fpage>243</fpage>
          -
          <lpage>250</lpage>
          . URL: https://doi.org/10.1007/s00779-007-0187-7.
        </mixed-citation>
      </ref>
      <ref id="ref16">
        <mixed-citation>
          [16]
          <string-name>
            <given-names>H. T.</given-names>
            <surname>Atmaca</surname>
          </string-name>
          ,
          <string-name>
            <given-names>O. S.</given-names>
            <surname>Terzi</surname>
          </string-name>
          ,
          <article-title>Building a web-augmented reality application for demonstration of kidney pathology for veterinary education</article-title>
          ,
          <source>Polish Journal of Veterinary Sciences</source>
          <volume>24</volume>
          (
          <year>2021</year>
          )
          <fpage>345</fpage>
          -
          <lpage>350</lpage>
          . URL: https://doi.org/10.24425/pjvs.2021.137671.
        </mixed-citation>
      </ref>
      <ref id="ref17">
        <mixed-citation>
          [17]
          <string-name>
            <given-names>Y.</given-names>
            <surname>Baashar</surname>
          </string-name>
          , G. Alkawsi,
          <string-name>
            <given-names>W. N. W.</given-names>
            <surname>Ahmad</surname>
          </string-name>
          ,
          <string-name>
            <given-names>H.</given-names>
            <surname>Alhussian</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Alwadain</surname>
          </string-name>
          ,
          <string-name>
            <given-names>L. F.</given-names>
            <surname>Capretz</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Babiker</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Alghail</surname>
          </string-name>
          ,
          <article-title>Effectiveness of using augmented reality for training in the medical professions: Meta-analysis</article-title>
          ,
          <source>JMIR Serious Games</source>
          <volume>10</volume>
          (
          <year>2022</year>
          )
          e32715
          . URL: https://doi.org/10.2196/32715.
        </mixed-citation>
      </ref>
      <ref id="ref18">
        <mixed-citation>
          [18]
          <string-name>
            <given-names>K.</given-names>
            <surname>Bhavika</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.</given-names>
            <surname>Martin</surname>
          </string-name>
          ,
          <string-name>
            <given-names>B.</given-names>
            <surname>Ardit</surname>
          </string-name>
          ,
          <article-title>Technology will never replace hands on surgical training in plastic surgery</article-title>
          ,
          <source>Journal of Plastic, Reconstructive and Aesthetic Surgery</source>
          <volume>75</volume>
          (
          <year>2022</year>
          )
          <fpage>439</fpage>
          -
          <lpage>488</lpage>
          . URL: https://doi.org/10.1016/j.bjps.2021.11.034.
        </mixed-citation>
      </ref>
      <ref id="ref19">
        <mixed-citation>
          [19]
          <string-name>
            <given-names>H. M.</given-names>
            <surname>Bradford</surname>
          </string-name>
          ,
          <string-name>
            <given-names>C. L.</given-names>
            <surname>Farley</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Escobar</surname>
          </string-name>
          ,
          <string-name>
            <given-names>E. T.</given-names>
            <surname>Heitzler</surname>
          </string-name>
          ,
          <string-name>
            <given-names>T.</given-names>
            <surname>Tringali</surname>
          </string-name>
          ,
          <string-name>
            <given-names>K. C.</given-names>
            <surname>Walker</surname>
          </string-name>
          ,
          <article-title>Rapid curricular innovations during COVID-19 clinical suspension: Maintaining student engagement with simulation experiences</article-title>
          ,
          <source>Journal of Midwifery and Women's Health</source>
          <volume>66</volume>
          (
          <year>2021</year>
          )
          <fpage>366</fpage>
          -
          <lpage>371</lpage>
          . URL: https://doi.org/10.1111/jmwh.13246.
        </mixed-citation>
      </ref>
      <ref id="ref20">
        <mixed-citation>
          [20]
          <string-name>
            <given-names>A.</given-names>
            <surname>Brunzini</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Papetti</surname>
          </string-name>
          ,
          <string-name>
            <given-names>E. B.</given-names>
            <surname>Serrani</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Scafà</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Germani</surname>
          </string-name>
          ,
          <article-title>How to Improve Medical Simulation Training: A New Methodology Based on Ergonomic Evaluation</article-title>
          , in: W. Karwowski,
          <string-name>
            <given-names>T.</given-names>
            <surname>Ahram</surname>
          </string-name>
          , S. Nazir (Eds.),
          <source>Advances in Human Factors in Training, Education, and Learning Sciences</source>
          , volume
          <volume>963</volume>
          of
          <source>Advances in Intelligent Systems and Computing</source>
          , Springer International Publishing, Cham,
          <year>2020</year>
          , pp.
          <fpage>145</fpage>
          -
          <lpage>155</lpage>
          . URL: https://doi.org/10.1007/978-3-030-20135-7_14.
        </mixed-citation>
      </ref>
      <ref id="ref21">
        <mixed-citation>
          [21]
          <string-name>
            <given-names>B. K.</given-names>
            <surname>Burian</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Ebnali</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J. M.</given-names>
            <surname>Robertson</surname>
          </string-name>
          ,
          <string-name>
            <given-names>D.</given-names>
            <surname>Musson</surname>
          </string-name>
          ,
          <string-name>
            <given-names>C. N.</given-names>
            <surname>Pozner</surname>
          </string-name>
          ,
          <string-name>
            <given-names>T.</given-names>
            <surname>Doyle</surname>
          </string-name>
          ,
          <string-name>
            <given-names>D. S.</given-names>
            <surname>Smink</surname>
          </string-name>
          ,
          <string-name>
            <given-names>C.</given-names>
            <surname>Miccile</surname>
          </string-name>
          ,
          <string-name>
            <given-names>P.</given-names>
            <surname>Paladugu</surname>
          </string-name>
          ,
          <string-name>
            <given-names>B.</given-names>
            <surname>Atamna</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>Lipsitz</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>Yule</surname>
          </string-name>
          ,
          <string-name>
            <given-names>R.</given-names>
            <surname>Dias</surname>
          </string-name>
          ,
          <article-title>Using extended reality (XR) for medical training and real-time clinical support during deep space missions</article-title>
          ,
          <source>Applied Ergonomics</source>
          <volume>106</volume>
          (
          <year>2023</year>
          )
          103902
          . URL: https://doi.org/10.1016/j.apergo.2022.103902.
        </mixed-citation>
      </ref>
      <ref id="ref22">
        <mixed-citation>
          [22]
          <string-name>
            <given-names>I.</given-names>
            <surname>Coma-Tatay</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>Casas-Yrurzum</surname>
          </string-name>
          ,
          <string-name>
            <given-names>P.</given-names>
            <surname>Casanova-Salas</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Fernández-Marín</surname>
          </string-name>
          ,
          <article-title>FI-AR learning: a web-based platform for augmented reality educational content</article-title>
          ,
          <source>Multimedia Tools and Applications</source>
          <volume>78</volume>
          (
          <year>2019</year>
          )
          <fpage>6093</fpage>
          -
          <lpage>6118</lpage>
          . URL: https://doi.org/10.1007/s11042-018-6395-5.
        </mixed-citation>
      </ref>
      <ref id="ref23">
        <mixed-citation>
          [23]
          <string-name>
            <given-names>F.</given-names>
            <surname>Cortés Rodríguez</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M. Dal</given-names>
            <surname>Peraro</surname>
          </string-name>
          ,
          <string-name>
            <given-names>L.</given-names>
            <surname>Abriata</surname>
          </string-name>
          ,
          <article-title>Online tools to easily build virtual molecular models for display in augmented and virtual reality on the web</article-title>
          ,
          <source>Journal of Molecular Graphics and Modelling</source>
          <volume>114</volume>
          (
          <year>2022</year>
          )
          108164
          . URL: https://doi.org/10.1016/j.jmgm.2022.108164.
        </mixed-citation>
      </ref>
      <ref id="ref24">
        <mixed-citation>
          [24]
          <string-name>
            <given-names>T.</given-names>
            <surname>Coughlin</surname>
          </string-name>
          ,
          <article-title>Impact of COVID-19 on the consumer electronics market</article-title>
          ,
          <source>IEEE Consumer Electronics Magazine</source>
          <volume>10</volume>
          (
          <year>2021</year>
          )
          <fpage>58</fpage>
          -
          <lpage>59</lpage>
          . URL: https://doi.org/10.1109/MCE.2020.3016753.
        </mixed-citation>
      </ref>
      <ref id="ref25">
        <mixed-citation>
          [25]
          <string-name>
            <given-names>P. G.</given-names>
            <surname>Crandall</surname>
          </string-name>
          ,
          <string-name>
            <given-names>R. K.</given-names>
            <surname>Engler</surname>
            <suffix>III</suffix>
          </string-name>
          ,
          <string-name>
            <given-names>D. E.</given-names>
            <surname>Beck</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S. A.</given-names>
            <surname>Killian</surname>
          </string-name>
          ,
          <string-name>
            <given-names>C. A.</given-names>
            <surname>O'Bryan</surname>
          </string-name>
          ,
          <string-name>
            <given-names>N.</given-names>
            <surname>Jarvis</surname>
          </string-name>
          , E. Clausen,
          <article-title>Development of an augmented reality game to teach abstract concepts in food chemistry</article-title>
          ,
          <source>Journal of Food Science Education</source>
          <volume>14</volume>
          (
          <year>2015</year>
          )
          <fpage>18</fpage>
          -
          <lpage>23</lpage>
          . URL: https://doi.org/10.1111/1541-4329.12048.
        </mixed-citation>
      </ref>
      <ref id="ref26">
        <mixed-citation>
          [26]
          <string-name>
            <given-names>S. A.</given-names>
            <surname>Dar</surname>
          </string-name>
          ,
          <article-title>Mobile library initiatives: a new way to revitalize the academic library settings</article-title>
          ,
          <source>Library Hi Tech News</source>
          <volume>36</volume>
          (
          <year>2019</year>
          )
          <fpage>15</fpage>
          -
          <lpage>21</lpage>
          . URL: https://doi.org/10.1108/LHTN-05-2019-0032.
        </mixed-citation>
      </ref>
      <ref id="ref27">
        <mixed-citation>
          [27]
          <string-name>
            <given-names>L.</given-names>
            <surname>Dunkel</surname>
          </string-name>
          ,
          <string-name>
            <given-names>L.</given-names>
            <surname>Fernandez-Luque</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>Loche</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M. O.</given-names>
            <surname>Savage</surname>
          </string-name>
          ,
          <article-title>Digital technologies to improve the precision of paediatric growth disorder diagnosis and management</article-title>
          ,
          <source>Growth Hormone and IGF Research</source>
          <volume>59</volume>
          (
          <year>2021</year>
          )
          101408
          . URL: https://doi.org/10.1016/j.ghir.2021.101408.
        </mixed-citation>
      </ref>
      <ref id="ref28">
        <mixed-citation>
          [28]
          <string-name>
            <given-names>E.</given-names>
            <surname>Erçağ</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Yasakcı</surname>
          </string-name>
          ,
          <article-title>The perception scale for the 7E model-based augmented reality enriched computer course (7EMAGBAÖ): Validity and reliability study</article-title>
          ,
          <source>Sustainability</source>
          <volume>14</volume>
          (
          <year>2022</year>
          )
          12037
          . URL: https://doi.org/10.3390/su141912037.
        </mixed-citation>
      </ref>
      <ref id="ref29">
        <mixed-citation>
          [29]
          <string-name>
            <given-names>E.</given-names>
            <surname>Faridi</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Ghaderian</surname>
          </string-name>
          ,
          <string-name>
            <given-names>F.</given-names>
            <surname>Honarasa</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Shafie</surname>
          </string-name>
          ,
          <article-title>Next generation of chemistry and biochemistry conference posters: Animation, augmented reality, visitor statistics, and visitors' attention</article-title>
          ,
          <source>Biochemistry and Molecular Biology Education</source>
          <volume>49</volume>
          (
          <year>2021</year>
          )
          <fpage>619</fpage>
          -
          <lpage>624</lpage>
          . URL: https://doi.org/10.1002/bmb.21520.
        </mixed-citation>
      </ref>
      <ref id="ref30">
        <mixed-citation>
          [30]
          <string-name>
            <given-names>S.</given-names>
            <surname>Farra</surname>
          </string-name>
          ,
          <string-name>
            <given-names>E.</given-names>
            <surname>Hodgson</surname>
          </string-name>
          ,
          <string-name>
            <given-names>E.</given-names>
            <surname>Miller</surname>
          </string-name>
          ,
          <string-name>
            <given-names>N.</given-names>
            <surname>Timm</surname>
          </string-name>
          ,
          <string-name>
            <given-names>W.</given-names>
            <surname>Brady</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Gneuhs</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.</given-names>
            <surname>Ying</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.</given-names>
            <surname>Hausfeld</surname>
          </string-name>
          ,
          <string-name>
            <given-names>E.</given-names>
            <surname>Cosgrove</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Simon</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Bottomley</surname>
          </string-name>
          ,
          <article-title>Effects of virtual reality simulation on worker emergency evacuation of neonates</article-title>
          ,
          <source>Disaster Medicine and Public Health Preparedness</source>
          <volume>13</volume>
          (
          <year>2019</year>
          )
          <fpage>301</fpage>
          -
          <lpage>308</lpage>
          . URL: https://doi.org/10.1017/dmp.2018.58.
        </mixed-citation>
      </ref>
      <ref id="ref31">
        <mixed-citation>
          [31]
          <string-name>
            <given-names>N.</given-names>
            <surname>Gordon</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Brayshaw</surname>
          </string-name>
          , T. Aljaber,
          <article-title>Heuristic Evaluation for Serious Immersive Games and M-instruction</article-title>
          , in:
          <string-name>
            <given-names>P.</given-names>
            <surname>Zaphiris</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Ioannou</surname>
          </string-name>
          (Eds.),
          <source>Learning and Collaboration Technologies</source>
          , volume
          <volume>9753</volume>
          of Lecture Notes in Computer Science, Springer International Publishing, Cham,
          <year>2016</year>
          , pp.
          <fpage>310</fpage>
          -
          <lpage>319</lpage>
          . URL: https://doi.org/10.1007/978-3-319-39483-1_29.
        </mixed-citation>
      </ref>
      <ref id="ref32">
        <mixed-citation>
          [32]
          <string-name>
            <given-names>B.</given-names>
            <surname>Hensen</surname>
          </string-name>
          ,
          <string-name>
            <surname>I. Koren</surname>
          </string-name>
          ,
          <string-name>
            <given-names>R.</given-names>
            <surname>Klamma</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Herrler</surname>
          </string-name>
          ,
          <article-title>An augmented reality framework for gamified learning</article-title>
          , in: G. Hancke,
          <string-name>
            <given-names>M.</given-names>
            <surname>Spaniol</surname>
          </string-name>
          ,
          <string-name>
            <given-names>K.</given-names>
            <surname>Osathanunkul</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>Unankard</surname>
          </string-name>
          , R. Klamma (Eds.),
          <source>Advances in Web-Based Learning - ICWL 2018</source>
          , volume
          <volume>11007</volume>
          of Lecture Notes in Computer Science, Springer International Publishing, Cham,
          <year>2018</year>
          , pp.
          <fpage>67</fpage>
          -
          <lpage>76</lpage>
          . URL: https://doi.org/10.1007/978-3-319-96565-9_7.
        </mixed-citation>
      </ref>
      <ref id="ref33">
        <mixed-citation>
          [33]
          <string-name>
            <given-names>T. G.</given-names>
            <surname>Hoog</surname>
          </string-name>
          ,
          <string-name>
            <given-names>L. M.</given-names>
            <surname>Aufdembrink</surname>
          </string-name>
          ,
          <string-name>
            <given-names>N. J.</given-names>
            <surname>Gaut</surname>
          </string-name>
          ,
          <string-name>
            <given-names>R.-J.</given-names>
            <surname>Sung</surname>
          </string-name>
          ,
          <string-name>
            <given-names>K. P.</given-names>
            <surname>Adamala</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A. E.</given-names>
            <surname>Engelhart</surname>
          </string-name>
          ,
          <article-title>Rapid deployment of smartphone-based augmented reality tools for field and online education in structural biology</article-title>
          ,
          <source>Biochemistry and Molecular Biology Education</source>
          <volume>48</volume>
          (
          <year>2020</year>
          )
          <fpage>448</fpage>
          -
          <lpage>451</lpage>
          . URL: https://doi.org/10.1002/bmb.21396.
        </mixed-citation>
      </ref>
      <ref id="ref34">
        <mixed-citation>
          [34]
          <string-name>
            <given-names>T.-C.</given-names>
            <surname>Huang</surname>
          </string-name>
          ,
          <article-title>Seeing creativity in an augmented experiential learning environment</article-title>
          ,
          <source>Universal Access in the Information Society</source>
          <volume>18</volume>
          (
          <year>2019</year>
          )
          <fpage>301</fpage>
          -
          <lpage>313</lpage>
          . URL: https://doi.org/10.1007/s10209-017-0592-2.
        </mixed-citation>
      </ref>
      <ref id="ref35">
        <mixed-citation>
          [35]
          <string-name>
            <given-names>M. B.</given-names>
            <surname>Ibáñez</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Á.</given-names>
            <surname>Di Serio</surname>
          </string-name>
          ,
          <string-name>
            <given-names>D.</given-names>
            <surname>Villarán</surname>
          </string-name>
          ,
          <string-name>
            <given-names>C.</given-names>
            <surname>Delgado Kloos</surname>
          </string-name>
          ,
          <article-title>Experimenting with electromagnetism using augmented reality: Impact on flow student experience and educational effectiveness</article-title>
          ,
          <source>Computers and Education</source>
          <volume>71</volume>
          (
          <year>2014</year>
          )
          <fpage>1</fpage>
          -
          <lpage>13</lpage>
          . URL: https://doi.org/10.1016/j.compedu.2013.09.004.
        </mixed-citation>
      </ref>
      <ref id="ref36">
        <mixed-citation>
          [36]
          <string-name>
            <given-names>M.-B.</given-names>
            <surname>Ibanez</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Di-Serio</surname>
          </string-name>
          ,
          <string-name>
            <given-names>D.</given-names>
            <surname>Villaran-Molina</surname>
          </string-name>
          ,
          <string-name>
            <given-names>C.</given-names>
            <surname>Delgado-Kloos</surname>
          </string-name>
          ,
          <article-title>Augmented reality-based simulators as discovery learning tools: An empirical study</article-title>
          ,
          <source>IEEE Transactions on Education</source>
          <volume>58</volume>
          (
          <year>2015</year>
          )
          <fpage>208</fpage>
          -
          <lpage>213</lpage>
          . URL: https://doi.org/10.1109/TE.2014.2379712.
        </mixed-citation>
      </ref>
      <ref id="ref37">
        <mixed-citation>
          [37]
          <string-name>
            <given-names>M. B.</given-names>
            <surname>Ibáñez</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.</given-names>
            <surname>Peláez</surname>
          </string-name>
          ,
          <string-name>
            <given-names>C. D.</given-names>
            <surname>Kloos</surname>
          </string-name>
          ,
          <article-title>Using an Augmented Reality Geolocalized Quiz Game as an Incentive to Overcome Academic Procrastination</article-title>
          , in: M. E. Auer, T. Tsiatsos (Eds.),
          <source>Mobile Technologies and Applications for the Internet of Things</source>
          , volume
          <volume>909</volume>
          of
          <source>Advances in Intelligent Systems and Computing</source>
          , Springer International Publishing, Cham,
          <year>2019</year>
          , pp.
          <fpage>175</fpage>
          -
          <lpage>184</lpage>
          . URL: https://doi.org/10.1007/978-3-030-11434-3_21.
        </mixed-citation>
      </ref>
      <ref id="ref38">
        <mixed-citation>
          [38]
          <string-name>
            <given-names>M.</given-names>
            <surname>Ibáñez</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Uriarte Portillo</surname>
          </string-name>
          ,
          <string-name>
            <given-names>R.</given-names>
            <surname>Zatarain Cabada</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Barrón</surname>
          </string-name>
          ,
          <article-title>Impact of augmented reality technology on academic achievement and motivation of students from public and private Mexican schools: A case study in a middle-school geometry course</article-title>
          ,
          <source>Computers and Education</source>
          <volume>145</volume>
          (
          <year>2020</year>
          )
          103734
          . URL: https://doi.org/10.1016/j.compedu.2019.103734.
        </mixed-citation>
      </ref>
      <ref id="ref39">
        <mixed-citation>
          [39]
          <string-name>
            <given-names>K.</given-names>
            <surname>Jung</surname>
          </string-name>
          ,
          <string-name>
            <given-names>V.</given-names>
            <surname>Nguyen</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.-C.</given-names>
            <surname>Yoo</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>Kim</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>Park</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Currie</surname>
          </string-name>
          ,
          <article-title>PalmitoAR: The last battle of the U.S. Civil War reenacted using augmented reality</article-title>
          ,
          <source>ISPRS International Journal of Geo-Information</source>
          <volume>9</volume>
          (
          <year>2020</year>
          )
          75
          . URL: https://doi.org/10.3390/ijgi9020075.
        </mixed-citation>
      </ref>
      <ref id="ref40">
        <mixed-citation>
          [40]
          <string-name>
            <given-names>B.</given-names>
            <surname>Kang</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.</given-names>
            <surname>Heo</surname>
          </string-name>
          ,
          <string-name>
            <given-names>H. H. S.</given-names>
            <surname>Choi</surname>
          </string-name>
          ,
          <string-name>
            <given-names>K. H.</given-names>
            <surname>Lee</surname>
          </string-name>
          ,
          <article-title>2030 toy web of the future</article-title>
          , in:
          <string-name>
            <given-names>S.</given-names>
            <surname>Kim</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.-W.</given-names>
            <surname>Jung</surname>
          </string-name>
          ,
          <string-name>
            <given-names>N.</given-names>
            <surname>Kubota</surname>
          </string-name>
          (Eds.),
          <source>Soft Computing in Intelligent Control</source>
          , volume
          <volume>272</volume>
          of
          <source>Advances in Intelligent Systems and Computing</source>
          , Springer International Publishing, Cham,
          <year>2014</year>
          , pp.
          <fpage>69</fpage>
          -
          <lpage>75</lpage>
          . URL: https://doi.org/10.1007/978-3-319-05570-1_8.
        </mixed-citation>
      </ref>
      <ref id="ref41">
        <mixed-citation>
          [41]
          <string-name>
            <given-names>S. I.</given-names>
            <surname>Karas</surname>
          </string-name>
          ,
          <string-name>
            <given-names>E. V.</given-names>
            <surname>Grakova</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M. V.</given-names>
            <surname>Balakhonova</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M. B.</given-names>
            <surname>Arzhanik</surname>
          </string-name>
          ,
          <string-name>
            <given-names>E. E.</given-names>
            <surname>Kara-Sal</surname>
          </string-name>
          ,
          <article-title>Distance learning in cardiology: The use of multimedia clinical diagnostic tasks</article-title>
          ,
          <source>Russian Journal of Cardiology</source>
          <volume>25</volume>
          (
          <year>2020</year>
          )
          <fpage>187</fpage>
          -
          <lpage>194</lpage>
          . URL: https://doi.org/10.15829/1560-4071-2020-4116.
        </mixed-citation>
      </ref>
      <ref id="ref42">
        <mixed-citation>
          [42]
          <string-name>
            <given-names>M.</given-names>
            <surname>Karayilan</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S. M.</given-names>
            <surname>McDonald</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A. J.</given-names>
            <surname>Bahnick</surname>
          </string-name>
          ,
          <string-name>
            <given-names>K. M.</given-names>
            <surname>Godwin</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Y. M.</given-names>
            <surname>Chan</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M. L.</given-names>
            <surname>Becker</surname>
          </string-name>
          ,
          <article-title>Reassessing undergraduate polymer chemistry laboratory experiments for virtual learning environments</article-title>
          ,
          <source>Journal of Chemical Education</source>
          <volume>99</volume>
          (
          <year>2022</year>
          )
          <fpage>1877</fpage>
          -
          <lpage>1889</lpage>
          . URL: https://doi.org/10.1021/acs.jchemed.1c01259.
        </mixed-citation>
      </ref>
      <ref id="ref43">
        <mixed-citation>
          [43]
          <string-name>
            <given-names>T.</given-names>
            <surname>Katika</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S. N.</given-names>
            <surname>Bolierakis</surname>
          </string-name>
          ,
          <string-name>
            <given-names>E.</given-names>
            <surname>Vasilopoulos</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Antonopoulos</surname>
          </string-name>
          ,
          <string-name>
            <given-names>G.</given-names>
            <surname>Tsimiklis</surname>
          </string-name>
          ,
          <string-name>
            <given-names>I.</given-names>
            <surname>Karaseitanidis</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Amditis</surname>
          </string-name>
          ,
          <article-title>Coupling AR with Object Detection Neural Networks for End-User Engagement</article-title>
          , in:
          <string-name>
            <given-names>G.</given-names>
            <surname>Zachmann</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Alcañiz Raya</surname>
          </string-name>
          ,
          <string-name>
            <given-names>P.</given-names>
            <surname>Bourdot</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Marchal</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.</given-names>
            <surname>Stefanucci</surname>
          </string-name>
          ,
          <string-name>
            <given-names>X.</given-names>
            <surname>Yang</surname>
          </string-name>
          (Eds.),
          <source>Virtual Reality and Mixed Reality</source>
          , volume
          <volume>13484</volume>
          of Lecture Notes in Computer Science, Springer International Publishing, Cham,
          <year>2022</year>
          , pp.
          <fpage>135</fpage>
          -
          <lpage>145</lpage>
          . URL: https://doi.org/10.1007/978-3-031-16234-3_8.
        </mixed-citation>
      </ref>
      <ref id="ref44">
        <mixed-citation>
          [44]
          <string-name>
            <given-names>I.</given-names>
            <surname>Kazanidis</surname>
          </string-name>
          ,
          <string-name>
            <given-names>N.</given-names>
            <surname>Pellas</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Christopoulos</surname>
          </string-name>
          ,
          <article-title>A learning analytics conceptual framework for augmented reality-supported educational case studies</article-title>
          ,
          <source>Multimodal Technologies and Interaction</source>
          <volume>5</volume>
          (
          <year>2021</year>
          )
          <article-title>9</article-title>
          . URL: https://doi.org/10.3390/mti5030009.
        </mixed-citation>
      </ref>
      <ref id="ref45">
        <mixed-citation>
          [45]
          <string-name>
            <given-names>H.</given-names>
            <surname>Le</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Nguyen</surname>
          </string-name>
          ,
          <article-title>An Online Platform for Enhancing Learning Experiences with Web-Based Augmented Reality and Pictorial Bar Code</article-title>
          , in:
          <string-name>
            <given-names>V.</given-names>
            <surname>Geroimenko</surname>
          </string-name>
          (Ed.),
          <source>Augmented Reality in Education: A New Technology for Teaching and Learning</source>
          , Springer Series on Cultural Computing, Springer International Publishing, Cham,
          <year>2020</year>
          , pp.
          <fpage>45</fpage>
          -
          <lpage>57</lpage>
          . URL: https://doi.org/10.1007/978-3-030-42156-4_3. doi:10.1007/978-3-030-42156-4_3.
        </mixed-citation>
      </ref>
      <ref id="ref46">
        <mixed-citation>
          [46]
          <string-name>
            <given-names>E.</given-names>
            <surname>Liu</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>Cai</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Z.</given-names>
            <surname>Liu</surname>
          </string-name>
          ,
          <string-name>
            <given-names>C.</given-names>
            <surname>Liu</surname>
          </string-name>
          ,
          <article-title>WebART: Web-based augmented reality learning resources authoring tool and its user experience study among teachers</article-title>
          ,
          <source>IEEE Transactions on Learning Technologies</source>
          <volume>16</volume>
          (
          <year>2023</year>
          )
          <fpage>53</fpage>
          -
          <lpage>65</lpage>
          . URL: https://doi.org/10.1109/TLT.2022.3214854.
        </mixed-citation>
      </ref>
      <ref id="ref47">
        <mixed-citation>
          [47]
          <string-name>
            <given-names>D.</given-names>
            <surname>Lou</surname>
          </string-name>
          ,
          <article-title>Two fast prototypes of web-based augmented reality enhancement for books</article-title>
          ,
          <source>Library Hi Tech News</source>
          <volume>36</volume>
          (
          <year>2019</year>
          )
          <fpage>19</fpage>
          -
          <lpage>24</lpage>
          . URL: https://doi.org/10.1108/LHTN-08-2019-0057.
        </mixed-citation>
      </ref>
      <ref id="ref48">
        <mixed-citation>
          [48]
          <string-name>
            <given-names>C.</given-names>
            <surname>Lytridis</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Tsinakos</surname>
          </string-name>
          ,
          <string-name>
            <given-names>I.</given-names>
            <surname>Kazanidis</surname>
          </string-name>
          ,
          <article-title>ARTutor - an augmented reality platform for interactive distance learning</article-title>
          ,
          <source>Education Sciences</source>
          <volume>8</volume>
          (
          <year>2018</year>
          )
          <article-title>6</article-title>
          . URL: https://doi.org/10.3390/educsci8010006.
        </mixed-citation>
      </ref>
      <ref id="ref49">
        <mixed-citation>
          [49]
          <string-name>
            <given-names>R.</given-names>
            <surname>Marín</surname>
          </string-name>
          ,
          <string-name>
            <given-names>P. J.</given-names>
            <surname>Sanz</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A. P.</given-names>
            <surname>Del Pobil</surname>
          </string-name>
          ,
          <article-title>The uji online robot: An education and training experience</article-title>
          ,
          <source>Autonomous Robots</source>
          <volume>15</volume>
          (
          <year>2003</year>
          )
          <fpage>283</fpage>
          -
          <lpage>297</lpage>
          . URL: https://doi.org/10.1023/A:1026220621431.
        </mixed-citation>
      </ref>
      <ref id="ref50">
        <mixed-citation>
          [50]
          <string-name>
            <given-names>D. R.</given-names>
            <surname>Nemirovsky</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A. J.</given-names>
            <surname>Garcia</surname>
          </string-name>
          ,
          <string-name>
            <given-names>P.</given-names>
            <surname>Gupta</surname>
          </string-name>
          ,
          <string-name>
            <given-names>E.</given-names>
            <surname>Shoen</surname>
          </string-name>
          ,
          <string-name>
            <given-names>N.</given-names>
            <surname>Walia</surname>
          </string-name>
          ,
          <article-title>Evaluation of surgical improvement of clinical knowledge ops (SICKO), an interactive training platform</article-title>
          ,
          <source>Journal of Digital Imaging</source>
          <volume>34</volume>
          (
          <year>2021</year>
          )
          <fpage>1067</fpage>
          -
          <lpage>1071</lpage>
          . URL: https://doi.org/10.1007/s10278-021-00482-x.
        </mixed-citation>
      </ref>
      <ref id="ref51">
        <mixed-citation>
          [51]
          <string-name>
            <given-names>V. T.</given-names>
            <surname>Nguyen</surname>
          </string-name>
          ,
          <string-name>
            <given-names>K.</given-names>
            <surname>Jung</surname>
          </string-name>
          ,
          <string-name>
            <given-names>T.</given-names>
            <surname>Dang</surname>
          </string-name>
          ,
          <article-title>BlocklyAR: A visual programming interface for creating augmented reality experiences</article-title>
          ,
          <source>Electronics</source>
          <volume>9</volume>
          (
          <year>2020</year>
          )
          <fpage>1</fpage>
          -
          <lpage>20</lpage>
          . URL: https://doi.org/10.3390/electronics9081205.
        </mixed-citation>
      </ref>
      <ref id="ref52">
        <mixed-citation>
          [52]
          <string-name>
            <given-names>S.</given-names>
            <surname>Brewster</surname>
          </string-name>
          ,
          <string-name>
            <given-names>R.</given-names>
            <surname>Murray-Smith</surname>
          </string-name>
          (Eds.), Haptic Human-Computer Interaction: First International Workshop, Glasgow, UK, August 31 - September 1,
          <year>2000</year>
          , Proceedings, volume
          <volume>2058</volume>
          of
          <source>Lecture Notes in Computer Science</source>
          , Springer-Verlag, Berlin Heidelberg,
          <year>2001</year>
          . URL: https://doi.org/10.1007/3-540-44589-7. doi:10.1007/3-540-44589-7.
        </mixed-citation>
      </ref>
      <ref id="ref53">
        <mixed-citation>
          [53]
          <string-name>
            <given-names>J. D.</given-names>
            <surname>Westwood</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S. W.</given-names>
            <surname>Westwood</surname>
          </string-name>
          ,
          <string-name>
            <given-names>L.</given-names>
            <surname>Felländer-Tsai</surname>
          </string-name>
          ,
          <string-name>
            <given-names>C. M.</given-names>
            <surname>Fidopiastis</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Liu</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>Senger</surname>
          </string-name>
          ,
          <string-name>
            <given-names>K. G.</given-names>
            <surname>Vosburgh</surname>
          </string-name>
          (Eds.),
          <source>Medicine Meets Virtual Reality 22 - NextMed, MMVR 2016, Los Angeles, California, USA, April 7-9, 2016</source>
          , volume
          <volume>220</volume>
          of
          <source>Studies in Health Technology and Informatics</source>
          , IOS Press,
          <year>2016</year>
          . URL: http://ebooks.iospress.nl/volume/medicine-meets-virtual-reality-22-nextmed-mmvr22
        </mixed-citation>
      </ref>
      <ref id="ref54">
        <mixed-citation>
          [54]
          <string-name>
            <given-names>W.</given-names>
            <surname>Budiharto</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A. A. S.</given-names>
            <surname>Gunawan</surname>
          </string-name>
          ,
          <string-name>
            <given-names>L. A.</given-names>
            <surname>Wulandhari</surname>
          </string-name>
          , Williem, Faisal,
          <string-name>
            <given-names>R.</given-names>
            <surname>Sutoyo</surname>
          </string-name>
          , Meiliana,
          <string-name>
            <given-names>D.</given-names>
            <surname>Suryani</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Y.</given-names>
            <surname>Arifin</surname>
          </string-name>
          (Eds.),
          <source>The 3rd International Conference on Computer Science and Computational Intelligence (ICCSCI 2018): Empowering Smart Technology in Digital Era for a Better Life</source>
          , volume
          <volume>135</volume>
          of
          <source>Procedia Computer Science</source>
          , Elsevier B.V.,
          <year>2018</year>
          . URL: https://www.sciencedirect.com/journal/procedia-computer-science/vol/135/suppl/C.
        </mixed-citation>
      </ref>
      <ref id="ref55">
        <mixed-citation>
          [55]
          <string-name>
            <given-names>L.</given-names>
            <surname>Rønningsbakk</surname>
          </string-name>
          ,
          <string-name>
            <given-names>T.-T.</given-names>
            <surname>Wu</surname>
          </string-name>
          ,
          <string-name>
            <given-names>F. E.</given-names>
            <surname>Sandnes</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Y.-M.</given-names>
            <surname>Huang</surname>
          </string-name>
          (Eds.),
          <source>Innovative Technologies and Learning: Second International Conference, ICITL 2019, Tromsø, Norway, December 2-5, 2019, Proceedings</source>
          , volume
          <volume>11937</volume>
          of Lecture Notes in Computer Science, Springer International Publishing,
          <year>2019</year>
          . URL: https://doi.org/10.1007/978-3-030-35343-8. doi:10.1007/978-3-030-35343-8.
        </mixed-citation>
      </ref>
      <ref id="ref56">
        <mixed-citation>
          [56]
          Preface,
          <source>Journal of Physics: Conference Series</source>
          <volume>1860</volume>
          (
          <year>2021</year>
          )
          <article-title>011001</article-title>
          . URL: https://doi.org/10.1088/1742-6596/1860/1/011001. doi:10.1088/1742-6596/1860/1/011001.
        </mixed-citation>
      </ref>
      <ref id="ref57">
        <mixed-citation>
          [57]
          <string-name>
            <given-names>N.</given-names>
            <surname>Nordin</surname>
          </string-name>
          ,
          <string-name>
            <given-names>N. R. M.</given-names>
            <surname>Nordin</surname>
          </string-name>
          ,
          <string-name>
            <given-names>W.</given-names>
            <surname>Omar</surname>
          </string-name>
          ,
          <article-title>REV-OPOLY: A study on educational board game with web-based augmented reality</article-title>
          ,
          <source>Asian Journal of University Education</source>
          <volume>18</volume>
          (
          <year>2022</year>
          )
          <fpage>81</fpage>
          -
          <lpage>90</lpage>
          . URL: https://doi.org/10.24191/ajue.v18i1.17172.
        </mixed-citation>
      </ref>
      <ref id="ref58">
        <mixed-citation>
          [58]
          <string-name>
            <given-names>M. E.</given-names>
            <surname>Rollo</surname>
          </string-name>
          ,
          <string-name>
            <given-names>E. J.</given-names>
            <surname>Aguiar</surname>
          </string-name>
          ,
          <string-name>
            <given-names>R. L.</given-names>
            <surname>Williams</surname>
          </string-name>
          ,
          <string-name>
            <given-names>K.</given-names>
            <surname>Wynne</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Kriss</surname>
          </string-name>
          ,
          <string-name>
            <given-names>R.</given-names>
            <surname>Callister</surname>
          </string-name>
          ,
          <string-name>
            <given-names>C. E.</given-names>
            <surname>Collins</surname>
          </string-name>
          ,
          <article-title>eHealth technologies to support nutrition and physical activity behaviors in diabetes self-management</article-title>
          ,
          <source>Diabetes, Metabolic Syndrome and Obesity: Targets and Therapy</source>
          <volume>9</volume>
          (
          <year>2016</year>
          )
          <fpage>381</fpage>
          -
          <lpage>390</lpage>
          . URL: https://doi.org/10.2147/DMSO.S95247.
        </mixed-citation>
      </ref>
      <ref id="ref59">
        <mixed-citation>
          [59]
          <string-name>
            <given-names>C.</given-names>
            <surname>Samat</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>Chaijaroen</surname>
          </string-name>
          ,
          <article-title>Design and Development of Constructivist Augmented Reality (AR) Book Enhancing Analytical Thinking in Computer Classroom</article-title>
          , in:
          <string-name>
            <given-names>L.</given-names>
            <surname>Rønningsbakk</surname>
          </string-name>
          ,
          <string-name>
            <given-names>T.-T.</given-names>
            <surname>Wu</surname>
          </string-name>
          ,
          <string-name>
            <given-names>F. E.</given-names>
            <surname>Sandnes</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Y.-M.</given-names>
            <surname>Huang</surname>
          </string-name>
          (Eds.),
          <source>Innovative Technologies and Learning</source>
          , volume
          <volume>11937</volume>
          of Lecture Notes in Computer Science, Springer International Publishing, Cham,
          <year>2019</year>
          , pp.
          <fpage>175</fpage>
          -
          <lpage>183</lpage>
          . URL: https://doi.org/10.1007/978-3-030-35343-8_19.
        </mixed-citation>
      </ref>
      <ref id="ref60">
        <mixed-citation>
          [60]
          <string-name>
            <given-names>S. M. E.</given-names>
            <surname>Sepasgozar</surname>
          </string-name>
          ,
          <article-title>Digital twin and web-based virtual gaming technologies for online education: A case of construction management and engineering</article-title>
          ,
          <source>Applied Sciences</source>
          <volume>10</volume>
          (
          <year>2020</year>
          )
          <article-title>4678</article-title>
          . URL: https://doi.org/10.3390/app10134678.
        </mixed-citation>
      </ref>
      <ref id="ref61">
        <mixed-citation>
          [61]
          <string-name>
            <given-names>K.</given-names>
            <surname>Sharp</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>McCorvie</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Wagner</surname>
          </string-name>
          ,
          <article-title>Sharing hidden histories: The XRchaeology at Miller Grove, a free African American community in southern Illinois</article-title>
          ,
          <source>Journal of African Diaspora Archaeology and Heritage</source>
          <volume>12</volume>
          (
          <year>2023</year>
          )
          <fpage>5</fpage>
          -
          <lpage>31</lpage>
          . URL: https://doi.org/10.1080/21619441.2021.1902706.
        </mixed-citation>
      </ref>
      <ref id="ref62">
        <mixed-citation>
          [62]
          <string-name>
            <given-names>E.</given-names>
            <surname>Smith</surname>
          </string-name>
          ,
          <string-name>
            <given-names>K.</given-names>
            <surname>McRae</surname>
          </string-name>
          ,
          <string-name>
            <given-names>G.</given-names>
            <surname>Semple</surname>
          </string-name>
          ,
          <string-name>
            <given-names>H.</given-names>
            <surname>Welsh</surname>
          </string-name>
          ,
          <string-name>
            <given-names>D.</given-names>
            <surname>Evans</surname>
          </string-name>
          ,
          <string-name>
            <given-names>P.</given-names>
            <surname>Blackwell</surname>
          </string-name>
          ,
          <article-title>Enhancing vocational training in the post-covid era through mobile mixed reality</article-title>
          ,
          <source>Sustainability</source>
          <volume>13</volume>
          (
          <year>2021</year>
          )
          <article-title>6144</article-title>
          . URL: https://doi.org/10.3390/su13116144.
        </mixed-citation>
      </ref>
      <ref id="ref63">
        <mixed-citation>
          [63]
          <string-name>
            <given-names>C.</given-names>
            <surname>Thabvithorn</surname>
          </string-name>
          ,
          <string-name>
            <given-names>C.</given-names>
            <surname>Samat</surname>
          </string-name>
          ,
          <article-title>Development of Web-Based Learning with Augmented Reality (AR) to Promote Analytical Thinking on Computational Thinking for High School</article-title>
          , in:
          <string-name>
            <given-names>Y.-M.</given-names>
            <surname>Huang</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.-C.</given-names>
            <surname>Cheng</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.</given-names>
            <surname>Barroso</surname>
          </string-name>
          ,
          <string-name>
            <given-names>F. E.</given-names>
            <surname>Sandnes</surname>
          </string-name>
          (Eds.),
          <source>Innovative Technologies and Learning</source>
          , volume
          <volume>13449</volume>
          of Lecture Notes in Computer Science, Springer International Publishing, Cham,
          <year>2022</year>
          , pp.
          <fpage>125</fpage>
          -
          <lpage>133</lpage>
          . URL: https://doi.org/10.1007/978-3-031-15273-3_14.
        </mixed-citation>
      </ref>
      <ref id="ref64">
        <mixed-citation>
          [64]
          <string-name>
            <given-names>F.</given-names>
            <surname>Turner</surname>
          </string-name>
          ,
          <string-name>
            <given-names>I.</given-names>
            <surname>Welch</surname>
          </string-name>
          ,
          <article-title>The mixed reality toolkit as the next step in the mass customization co-design experience</article-title>
          ,
          <source>International Journal of Industrial Engineering and Management</source>
          <volume>10</volume>
          (
          <year>2019</year>
          )
          <fpage>191</fpage>
          -
          <lpage>199</lpage>
          . URL: https://doi.org/10.24867/IJIEM-2019-2-239.
        </mixed-citation>
      </ref>
      <ref id="ref65">
        <mixed-citation>
          [65]
          <string-name>
            <given-names>A.</given-names>
            <surname>Vahabzadeh</surname>
          </string-name>
          ,
          <string-name>
            <given-names>N.</given-names>
            <surname>Keshav</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J. P.</given-names>
            <surname>Salisbury</surname>
          </string-name>
          ,
          <string-name>
            <given-names>N.</given-names>
            <surname>Sahin</surname>
          </string-name>
          ,
          <article-title>Improvement of attention-deficit/hyperactivity disorder symptoms in school-aged children, adolescents, and young adults with autism via a digital smartglasses-based socioemotional coaching aid: Short-term, uncontrolled pilot study</article-title>
          ,
          <source>JMIR Mental Health</source>
          <volume>5</volume>
          (
          <year>2018</year>
          )
          <article-title>e25</article-title>
          . URL: https://doi.org/10.2196/mental.9631.
        </mixed-citation>
      </ref>
      <ref id="ref66">
        <mixed-citation>
          [66]
          <string-name>
            <given-names>D.</given-names>
            <surname>Villarán</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M. B.</given-names>
            <surname>Ibáñez</surname>
          </string-name>
          ,
          <string-name>
            <given-names>C. D.</given-names>
            <surname>Kloos</surname>
          </string-name>
          ,
          <article-title>Augmented reality-based simulations embedded in problem based learning courses</article-title>
          , in: G. Conole,
          <string-name>
            <given-names>T.</given-names>
            <surname>Klobučar</surname>
          </string-name>
          ,
          <string-name>
            <given-names>C.</given-names>
            <surname>Rensing</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.</given-names>
            <surname>Konert</surname>
          </string-name>
          ,
          <string-name>
            <given-names>E.</given-names>
            <surname>Lavoué</surname>
          </string-name>
          (Eds.),
          <source>Design for Teaching and Learning in a Networked World</source>
          , volume
          <volume>9307</volume>
          of Lecture Notes in Computer Science, Springer International Publishing, Cham,
          <year>2015</year>
          , pp.
          <fpage>540</fpage>
          -
          <lpage>543</lpage>
          . URL: https://doi.org/10.1007/978-3-319-24258-3_55.
        </mixed-citation>
      </ref>
      <ref id="ref67">
        <mixed-citation>
          [67]
          <string-name>
            <given-names>S.</given-names>
            <surname>Yang</surname>
          </string-name>
          ,
          <string-name>
            <given-names>B.</given-names>
            <surname>Mei</surname>
          </string-name>
          ,
          <string-name>
            <given-names>X.</given-names>
            <surname>Yue</surname>
          </string-name>
          ,
          <article-title>Mobile augmented reality assisted chemical education: Insights from Elements 4D</article-title>
          ,
          <source>Journal of Chemical Education</source>
          <volume>95</volume>
          (
          <year>2018</year>
          )
          <fpage>1060</fpage>
          -
          <lpage>1062</lpage>
          . URL: https://doi.org/10.1021/acs.jchemed.8b00017.
        </mixed-citation>
      </ref>
      <ref id="ref68">
        <mixed-citation>
          [68]
          <string-name>
            <given-names>R.</given-names>
            <surname>Zatarain-Cabada</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Barrón-Estrada</surname>
          </string-name>
          ,
          <string-name>
            <given-names>B. A.</given-names>
            <surname>Cárdenas-Sainz</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M. E.</given-names>
            <surname>Chavez-Echeagaray</surname>
          </string-name>
          ,
          <article-title>Experiences of web-based extended reality technologies for physics education</article-title>
          ,
          <source>Computer Applications in Engineering Education</source>
          <volume>31</volume>
          (
          <year>2023</year>
          )
          <fpage>63</fpage>
          -
          <lpage>82</lpage>
          . URL: https://doi.org/10.1002/cae.22571.
        </mixed-citation>
      </ref>
      <ref id="ref69">
        <mixed-citation>
          [69]
          <string-name>
            <given-names>N. U.</given-names>
            <surname>Zitzmann</surname>
          </string-name>
          ,
          <string-name>
            <given-names>L.</given-names>
            <surname>Matthisson</surname>
          </string-name>
          ,
          <string-name>
            <given-names>H.</given-names>
            <surname>Ohla</surname>
          </string-name>
          , T. Joda,
          <article-title>Digital undergraduate education in dentistry: A systematic review</article-title>
          ,
          <source>International Journal of Environmental Research and Public Health</source>
          <volume>17</volume>
          (
          <year>2020</year>
          )
          <elocation-id>3269</elocation-id>
          . URL: https://doi.org/10.3390/ijerph17093269.
        </mixed-citation>
      </ref>
      <ref id="ref70">
        <mixed-citation>
          [70]
          <string-name>
            <given-names>S.</given-names>
            <surname>Hai-Jew</surname>
          </string-name>
          ,
          <article-title>Adult Coloring Books as Emotional Salve/Stress Relief, Tactual-Visual Learning: An Analysis from Mass-Scale Social Imagery</article-title>
          , in:
          <source>Common Visual Art in a Social Digital Age</source>
          , Nova Science Publishers, Inc.,
          <year>2022</year>
          , pp.
          <fpage>171</fpage>
          -
          <lpage>186</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref71">
        <mixed-citation>
          [71]
          <string-name>
            <given-names>L.</given-names>
            <surname>Huang</surname>
          </string-name>
          ,
          <article-title>Chemistry Apps on Smartphones and Tablets</article-title>
          , in:
          <string-name>
            <given-names>J.</given-names>
            <surname>García-Martínez</surname>
          </string-name>
          ,
          <string-name>
            <given-names>E.</given-names>
            <surname>Serrano-Torregrosa</surname>
          </string-name>
          (Eds.),
          <source>Chemistry Education</source>
          , John Wiley &amp; Sons, Ltd,
          <year>2015</year>
          , pp.
          <fpage>621</fpage>
          -
          <lpage>650</lpage>
          . URL: https://doi.org/10.1002/9783527679300.ch25.
        </mixed-citation>
      </ref>
      <ref id="ref72">
        <mixed-citation>
          [72]
          <string-name>
            <given-names>C. A.</given-names>
            <surname>Jara</surname>
          </string-name>
          ,
          <string-name>
            <given-names>F. A.</given-names>
            <surname>Candelas</surname>
          </string-name>
          ,
          <string-name>
            <given-names>F.</given-names>
            <surname>Torres</surname>
          </string-name>
          ,
          <article-title>Internet virtual and remote control interface for robotics education</article-title>
          , in:
          <source>Developments in Higher Education</source>
          , Nova Science Publishers, Inc.,
          <year>2009</year>
          , pp.
          <fpage>136</fpage>
          -
          <lpage>154</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref73">
        <mixed-citation>
          [73]
          <string-name>
            <given-names>E.</given-names>
            <surname>Redondo</surname>
          </string-name>
          ,
          <string-name>
            <given-names>I.</given-names>
            <surname>Navarro</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Sánchez</surname>
          </string-name>
          ,
          <string-name>
            <given-names>D.</given-names>
            <surname>Fonseca</surname>
          </string-name>
          ,
          <article-title>Implementation of Augmented Reality in “3.0 Learning” Methodology: Case Studies with Students of Architecture Degree</article-title>
          , in:
          <string-name>
            <given-names>B.</given-names>
            <surname>Pătruţ</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Pătruţ</surname>
          </string-name>
          ,
          <string-name>
            <given-names>C.</given-names>
            <surname>Cmeciu</surname>
          </string-name>
          (Eds.),
          <source>Social Media and the New Academic Environment: Pedagogical Challenges</source>
          , IGI Global, Hershey, PA,
          <year>2013</year>
          , pp.
          <fpage>391</fpage>
          -
          <lpage>413</lpage>
          . URL: https://doi.org/10.4018/978-1-4666-2851-9.ch019.
        </mixed-citation>
      </ref>
      <ref id="ref74">
        <mixed-citation>
          [74]
          <string-name>
            <given-names>J.</given-names>
            <surname>Al-Gharaibeh</surname>
          </string-name>
          ,
          <string-name>
            <given-names>C.</given-names>
            <surname>Jeffery</surname>
          </string-name>
          ,
          <article-title>Portable non-player character tutors with quest activities</article-title>
          , in:
          <source>2010 IEEE Virtual Reality Conference (VR)</source>
          ,
          <year>2010</year>
          , pp.
          <fpage>253</fpage>
          -
          <lpage>254</lpage>
          . URL: https://doi.org/10.1109/VR.2010.5444779.
        </mixed-citation>
      </ref>
      <ref id="ref75">
        <mixed-citation>
          [75]
          <string-name>
            <given-names>P. E.</given-names>
            <surname>Antoniou</surname>
          </string-name>
          , E. Dafli, G. Arfaras,
          <string-name>
            <given-names>P. D.</given-names>
            <surname>Bamidis</surname>
          </string-name>
          ,
          <article-title>Versatile Mixed Reality Educational Spaces; A Medical Education Implementation Case</article-title>
          , in:
          <string-name>
            <given-names>N.</given-names>
            <surname>Georgalas</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Q.</given-names>
            <surname>Jin</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.</given-names>
            <surname>Garcia-Blas</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.</given-names>
            <surname>Carretero</surname>
          </string-name>
          ,
          <string-name>
            <given-names>I.</given-names>
            <surname>Ray</surname>
          </string-name>
          (Eds.),
          <source>Proceedings - 2016 15th International Conference on Ubiquitous Computing and Communications and 2016 8th International Symposium on Cyberspace and Security, IUCC-CSS 2016</source>
          , Institute of Electrical and Electronics Engineers Inc.,
          <year>2017</year>
          , pp.
          <fpage>132</fpage>
          -
          <lpage>137</lpage>
          . URL: https://doi.org/10.1109/IUCC-CSS.2016.026.
        </mixed-citation>
      </ref>
      <ref id="ref76">
        <mixed-citation>
          [76]
          <string-name>
            <given-names>S.</given-names>
            <surname>Anwar</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.</given-names>
            <surname>LeClair</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Peskin</surname>
          </string-name>
          ,
          <article-title>Development Of Nanotechnology And Power Systems Options For An On Line BSEET Degree</article-title>
          , in: 2010 Annual Conference &amp; Exposition, ASEE Conferences, Louisville, Kentucky,
          <year>2010</year>
          , pp.
          <fpage>15.420.1</fpage>
          -
          <lpage>15.420.10</lpage>
          . URL: https://doi.org/10.18260/1-2--15776.
        </mixed-citation>
      </ref>
      <ref id="ref77">
        <mixed-citation>
          [77]
          <string-name>
            <given-names>B.</given-names>
            <surname>Cardenas-Sainz</surname>
          </string-name>
          ,
          <string-name>
            <given-names>R.</given-names>
            <surname>Zatarain-Cabada</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Barron-Estrada</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Chavez-Echeagaray</surname>
          </string-name>
          ,
          <string-name>
            <given-names>R.</given-names>
            <surname>Cabada</surname>
          </string-name>
          ,
          <article-title>FisicARtivo: Design of a learning tool for physics education using web-based XR technology</article-title>
          , in:
          <source>2022 IEEE Mexican International Conference on Computer Science, ENC 2022 - Proceedings</source>
          , Institute of Electrical and Electronics Engineers Inc.,
          <year>2022</year>
          . URL: https://doi.org/10.1109/ENC56672.2022.9882930.
        </mixed-citation>
      </ref>
      <ref id="ref78">
        <mixed-citation>
          [78]
          <string-name>
            <given-names>I.</given-names>
            <surname>Demir</surname>
          </string-name>
          ,
          <article-title>Interactive web-based hydrological simulation system as an education platform</article-title>
          , in: A. E. Rizzoli,
          <string-name>
            <given-names>N. W. T.</given-names>
            <surname>Quinn</surname>
          </string-name>
          ,
          <string-name>
            <given-names>D. P.</given-names>
            <surname>Ames</surname>
          </string-name>
          (Eds.),
          <source>Proceedings - 7th International Congress on Environmental Modelling and Software: Bold Visions for Environmental Modeling</source>
          , iEMSs
          <year>2014</year>
          , volume
          <volume>2</volume>
          ,
          <source>International Environmental Modelling and Software Society</source>
          ,
          <year>2014</year>
          , pp.
          <fpage>910</fpage>
          -
          <lpage>912</lpage>
          . URL: https://doi.org/10.17077/aseenmw2014.1008.
        </mixed-citation>
      </ref>
      <ref id="ref79">
        <mixed-citation>
          [79]
          <string-name>
            <given-names>M.</given-names>
            <surname>Farella</surname>
          </string-name>
          ,
          <string-name>
            <given-names>D.</given-names>
            <surname>Taibi</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Arrigo</surname>
          </string-name>
          ,
          <string-name>
            <given-names>G.</given-names>
            <surname>Todaro</surname>
          </string-name>
          ,
          <string-name>
            <given-names>G.</given-names>
            <surname>Fulantelli</surname>
          </string-name>
          ,
          <string-name>
            <given-names>G.</given-names>
            <surname>Chiazzese</surname>
          </string-name>
          ,
          <article-title>An augmented reality mobile learning experience based on treasure hunt serious game</article-title>
          , in:
          <string-name>
            <given-names>C.</given-names>
            <surname>Busch</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Steinicke</surname>
          </string-name>
          ,
          <string-name>
            <given-names>R.</given-names>
            <surname>Friess</surname>
          </string-name>
          ,
          <string-name>
            <given-names>T.</given-names>
            <surname>Wendler</surname>
          </string-name>
          (Eds.),
          <source>Proceedings of the European Conference on e-Learning, ECEL</source>
          , Academic Conferences and Publishing International Limited,
          <year>2021</year>
          , pp.
          <fpage>148</fpage>
          -
          <lpage>154</lpage>
          . URL: https://doi.org/10.34190/EEL.21.109.
        </mixed-citation>
      </ref>
      <ref id="ref80">
        <mixed-citation>
          [80]
          <string-name>
            <given-names>J.</given-names>
            <surname>Ferguson</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Mentzelopoulos</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Protopsaltis</surname>
          </string-name>
          ,
          <string-name>
            <given-names>D.</given-names>
            <surname>Economou</surname>
          </string-name>
          ,
          <article-title>Small and flexible web based framework for teaching QR and AR mobile learning application development</article-title>
          ,
          <source>in: Proceedings of 2015 International Conference on Interactive Mobile Communication Technologies and Learning</source>
          , IMCL 2015,
          <article-title>Institute of Electrical and Electronics Engineers Inc</article-title>
          .,
          <year>2015</year>
          , pp.
          <fpage>383</fpage>
          -
          <lpage>385</lpage>
          . URL: https: //doi.org/10.1109/IMCTL.
          <year>2015</year>
          .
          <volume>7359624</volume>
          .
        </mixed-citation>
      </ref>
      <ref id="ref81">
        <mixed-citation>
          [81]
          <string-name>
            <surname>Harun</surname>
          </string-name>
          ,
          <string-name>
            <given-names>N.</given-names>
            <surname>Tuli</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Mantri</surname>
          </string-name>
          ,
          <article-title>Experience Fleming's rule in electromagnetism using augmented reality: Analyzing impact on students' learning</article-title>
          ,
          <source>Procedia Computer Science</source>
          <volume>172</volume>
          (
          <year>2020</year>
          )
          <fpage>660</fpage>
          -
          <lpage>668</lpage>
          . URL: https://doi.org/10.1016/j.procs.2020.05.086.
        </mixed-citation>
      </ref>
      <ref id="ref82">
        <mixed-citation>
          [82]
          <string-name>
            <given-names>T.</given-names>
            <surname>Kobayashi</surname>
          </string-name>
          ,
          <string-name>
            <given-names>H.</given-names>
            <surname>Sasaki</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Toguchi</surname>
          </string-name>
          ,
          <string-name>
            <given-names>K.</given-names>
            <surname>Mizuno</surname>
          </string-name>
          ,
          <article-title>A discussion on web-based learning contents with the AR technology and its authoring tools to improve students' skills in exercise courses</article-title>
          , in:
          <string-name>
            <given-names>A. F.</given-names>
            <surname>Mohd Ayub</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Kashihara</surname>
          </string-name>
          ,
          <string-name>
            <given-names>T.</given-names>
            <surname>Matsui</surname>
          </string-name>
          ,
          <string-name>
            <given-names>C.-C.</given-names>
            <surname>Liu</surname>
          </string-name>
          ,
          <string-name>
            <given-names>H.</given-names>
            <surname>Ogata</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S. C.</given-names>
            <surname>Kong</surname>
          </string-name>
          (Eds.),
          <source>Work-In-Progress Poster - Proceedings of the 22nd International Conference on Computers in Education, ICCE 2014</source>
          , Asia-Pacific Society for Computers in Education,
          <year>2014</year>
          , pp.
          <fpage>34</fpage>
          -
          <lpage>36</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref83">
        <mixed-citation>
          [83]
          <string-name>
            <given-names>L. O.</given-names>
            <surname>Maggi</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J. M. X. N.</given-names>
            <surname>Teixeira</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J. R. F. E. S.</given-names>
            <surname>Junior</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J. P. C.</given-names>
            <surname>Cajueiro</surname>
          </string-name>
          ,
          <string-name>
            <given-names>P. V. S. G.</given-names>
            <surname>De Lima</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M. H. R.</given-names>
            <surname>De Alencar Bezerra</surname>
          </string-name>
          ,
          <string-name>
            <given-names>G. N.</given-names>
            <surname>Melo</surname>
          </string-name>
          ,
          <article-title>3DJPi: An Open-Source Web-Based 3D Simulator for Pololu's 3Pi Platform</article-title>
          , in:
          <source>Proceedings - 2019 21st Symposium on Virtual and Augmented Reality, SVR 2019</source>
          , Institute of Electrical and Electronics Engineers Inc.,
          <year>2019</year>
          , pp.
          <fpage>52</fpage>
          -
          <lpage>58</lpage>
          . URL: https://doi.org/10.1109/SVR.2019.00025.
        </mixed-citation>
      </ref>
      <ref id="ref84">
        <mixed-citation>
          [84]
          <string-name>
            <given-names>R.</given-names>
            <surname>Marín</surname>
          </string-name>
          ,
          <string-name>
            <given-names>P. J.</given-names>
            <surname>Sanz</surname>
          </string-name>
          ,
          <article-title>The Human-Machine Interaction through the UJI Telerobotic Training System</article-title>
          , in:
          <string-name>
            <given-names>M. H.</given-names>
            <surname>Hamza</surname>
          </string-name>
          (Ed.),
          <source>IASTED International Conference Robotics and Applications</source>
          , RA 2003, June 25-27, 2003, Salzburg, Austria, IASTED/ACTA Press,
          <year>2003</year>
          , pp.
          <fpage>47</fpage>
          -
          <lpage>52</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref85">
        <mixed-citation>
          [85]
          <string-name>
            <given-names>H. S.</given-names>
            <surname>Narman</surname>
          </string-name>
          ,
          <string-name>
            <given-names>C.</given-names>
            <surname>Berry</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Canfield</surname>
          </string-name>
          ,
          <string-name>
            <given-names>L.</given-names>
            <surname>Carpenter</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.</given-names>
            <surname>Giese</surname>
          </string-name>
          ,
          <string-name>
            <given-names>N.</given-names>
            <surname>Loftus</surname>
          </string-name>
          ,
          <string-name>
            <given-names>I.</given-names>
            <surname>Schrader</surname>
          </string-name>
          ,
          <article-title>Augmented Reality for Teaching Data Structures in Computer Science</article-title>
          , in:
          <source>2020 IEEE Global Humanitarian Technology Conference, GHTC 2020</source>
          , Institute of Electrical and Electronics Engineers Inc.,
          <year>2020</year>
          , p.
          <fpage>9342932</fpage>
          . URL: https://doi.org/10.1109/GHTC46280.2020.9342932.
        </mixed-citation>
      </ref>
      <ref id="ref86">
        <mixed-citation>
          [86]
          <string-name>
            <given-names>M.</given-names>
            <surname>Nguyen</surname>
          </string-name>
          ,
          <string-name>
            <given-names>H.</given-names>
            <surname>Le</surname>
          </string-name>
          ,
          <string-name>
            <given-names>P. M.</given-names>
            <surname>Lai</surname>
          </string-name>
          ,
          <string-name>
            <given-names>W. Q.</given-names>
            <surname>Yan</surname>
          </string-name>
          ,
          <article-title>A web-based augmented reality platform using pictorial QR code for educational purposes and beyond</article-title>
          , in:
          <string-name>
            <given-names>S. N.</given-names>
            <surname>Spencer</surname>
          </string-name>
          (Ed.),
          <source>Proceedings of the ACM Symposium on Virtual Reality Software and Technology, VRST</source>
          , Association for Computing Machinery,
          <year>2019</year>
          , p.
          <fpage>3364793</fpage>
          . URL: https://doi.org/10.1145/3359996.3364793.
        </mixed-citation>
      </ref>
      <ref id="ref87">
        <mixed-citation>
          [87]
          <string-name>
            <given-names>V. T.</given-names>
            <surname>Nguyen</surname>
          </string-name>
          ,
          <string-name>
            <given-names>K.</given-names>
            <surname>Jung</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>Yoo</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>Kim</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>Park</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Currie</surname>
          </string-name>
          ,
          <article-title>Civil war battlefield experience: Historical event simulation using augmented reality technology</article-title>
          , in:
          <source>Proceedings - 2019 IEEE International Conference on Artificial Intelligence and Virtual Reality, AIVR 2019</source>
          , Institute of Electrical and Electronics Engineers Inc.,
          <year>2019</year>
          , pp.
          <fpage>294</fpage>
          -
          <lpage>297</lpage>
          . URL: https://doi.org/10.1109/AIVR46125.2019.00068.
        </mixed-citation>
      </ref>
      <ref id="ref88">
        <mixed-citation>
          [88]
          <source>LATICE '14: Proceedings of the 2014 International Conference on Teaching and Learning in Computing and Engineering</source>
          , IEEE Computer Society, USA,
          <year>2014</year>
          . URL: https://www.computer.org/csdl/proceedings/latice/2014/12OmNrAdsty.
        </mixed-citation>
      </ref>
      <ref id="ref89">
        <mixed-citation>
          [89]
          <source>Proceedings of 2015 International Conference on Interactive Mobile Communication Technologies and Learning, IMCL 2015</source>
          , Institute of Electrical and Electronics Engineers Inc.,
          <year>2015</year>
          . URL: https://doi.org/10.1109/IMCL37494.2015.
        </mixed-citation>
      </ref>
      <ref id="ref90">
        <mixed-citation>
          [90]
          <string-name>
            <given-names>T.</given-names>
            <surname>Tsiatsos</surname>
          </string-name>
          , M. E. Auer (Eds.),
          <source>11th International Conference on Interactive Mobile Communication Technologies and Learning, IMCL2017</source>
          , volume
          <volume>725</volume>
          of
          <source>Advances in Intelligent Systems and Computing</source>
          , Springer Verlag,
          <year>2018</year>
          . URL: https://doi.org/10.1007/978-3-319-75175-7.
        </mixed-citation>
      </ref>
      <ref id="ref91">
        <mixed-citation>
          [91]
          <source>Innovative Technologies and Learning: 4th International Conference, ICITL 2021, Virtual Event, November 29 - December 1, 2021, Proceedings</source>
          , volume
          <volume>13117</volume>
          of Lecture Notes in Computer Science, Springer International Publishing,
          <year>2021</year>
          . URL: https://doi.org/10.1007/978-3-030-91540-7.
        </mixed-citation>
      </ref>
      <ref id="ref92">
        <mixed-citation>
          [92]
          <string-name>
            <given-names>N.</given-names>
            <surname>Nordin</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M. A.</given-names>
            <surname>Markom</surname>
          </string-name>
          ,
          <string-name>
            <given-names>F. A.</given-names>
            <surname>Suhaimi</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>Ishak</surname>
          </string-name>
          ,
          <article-title>A web-based campus navigation system with mobile augmented reality intervention</article-title>
          ,
          <source>Journal of Physics: Conference Series</source>
          <volume>1997</volume>
          (
          <year>2021</year>
          )
          <elocation-id>012038</elocation-id>
          . URL: https://doi.org/10.1088/1742-6596/1997/1/012038.
        </mixed-citation>
      </ref>
      <ref id="ref93">
        <mixed-citation>
          [93]
          <string-name>
            <given-names>S. L.</given-names>
            <surname>Proskura</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S. H.</given-names>
            <surname>Lytvynova</surname>
          </string-name>
          ,
          <article-title>The approaches to web-based education of computer science bachelors in higher education institutions</article-title>
          ,
          <source>CTE Workshop Proceedings</source>
          <volume>7</volume>
          (
          <year>2020</year>
          )
          <fpage>609</fpage>
          -
          <lpage>625</lpage>
          . URL: https://doi.org/10.55056/cte.416.
        </mixed-citation>
      </ref>
      <ref id="ref94">
        <mixed-citation>
          [94]
          <string-name>
            <given-names>S.</given-names>
            <surname>Proskura</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>Lytvynova</surname>
          </string-name>
          ,
          <string-name>
            <given-names>O.</given-names>
            <surname>Kronda</surname>
          </string-name>
          ,
          <string-name>
            <given-names>N.</given-names>
            <surname>Demeshkant</surname>
          </string-name>
          ,
          <article-title>Mobile Learning Approach as a Supplementary Approach in the Organization of the Studying Process in Educational Institutions</article-title>
          , in:
          <string-name>
            <given-names>O.</given-names>
            <surname>Sokolov</surname>
          </string-name>
          ,
          <string-name>
            <given-names>G.</given-names>
            <surname>Zholtkevych</surname>
          </string-name>
          ,
          <string-name>
            <given-names>V.</given-names>
            <surname>Yakovyna</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Y.</given-names>
            <surname>Tarasich</surname>
          </string-name>
          ,
          <string-name>
            <given-names>V.</given-names>
            <surname>Kharchenko</surname>
          </string-name>
          ,
          <string-name>
            <given-names>V.</given-names>
            <surname>Kobets</surname>
          </string-name>
          ,
          <string-name>
            <given-names>O.</given-names>
            <surname>Burov</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>Semerikov</surname>
          </string-name>
          ,
          <string-name>
            <given-names>H.</given-names>
            <surname>Kravtsov</surname>
          </string-name>
          (Eds.),
          <source>Proceedings of the 16th International Conference on ICT in Education, Research and Industrial Applications</source>
          . Integration, Harmonization and
          <string-name>
            <given-names>Knowledge</given-names>
            <surname>Transfer</surname>
          </string-name>
          . Volume II: Workshops, Kharkiv, Ukraine,
          <source>October 06-10</source>
          ,
          <year>2020</year>
          , volume
          <volume>2732</volume>
          <source>of CEUR Workshop Proceedings, CEUR-WS.org</source>
          ,
          <year>2020</year>
          , pp.
          <fpage>650</fpage>
          -
          <lpage>664</lpage>
          . URL: https://ceur-ws.
          <source>org/</source>
          Vol-
          <volume>2732</volume>
          /20200650.pdf.
        </mixed-citation>
      </ref>
      <ref id="ref95">
        <mixed-citation>
          [95]
          <string-name>
            <given-names>G.</given-names>
            <surname>Ryan</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.</given-names>
            <surname>Murphy</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Higgins</surname>
          </string-name>
          ,
          <string-name>
            <given-names>F.</given-names>
            <surname>McAulife</surname>
          </string-name>
          ,
          <string-name>
            <given-names>E.</given-names>
            <surname>Mangina</surname>
          </string-name>
          ,
          <article-title>Work-in-Progress-Development of a Virtual Reality Learning Environment: VR Baby</article-title>
          , in:
          <string-name><given-names>D.</given-names><surname>Economou</surname></string-name>
          ,
          <string-name><given-names>A.</given-names><surname>Klippel</surname></string-name>
          ,
          <string-name><given-names>H.</given-names><surname>Dodds</surname></string-name>
          ,
          <string-name><given-names>A.</given-names><surname>PenaRios</surname></string-name>
          ,
          <string-name><given-names>M. J. W.</given-names><surname>Lee</surname></string-name>
          ,
          <string-name><given-names>D.</given-names><surname>Beck</surname></string-name>
          ,
          <string-name><given-names>J.</given-names><surname>Pirker</surname></string-name>
          ,
          <string-name><given-names>A.</given-names><surname>Dengel</surname></string-name>
          ,
          <string-name><given-names>T. M.</given-names><surname>Peres</surname></string-name>
          ,
          <string-name><given-names>J.</given-names><surname>Richter</surname></string-name>
          (Eds.),
          <source>Proceedings of 6th International Conference of the Immersive Learning Research Network, iLRN 2020</source>
          , Institute of Electrical and Electronics Engineers Inc.,
          <year>2020</year>
          , pp.
          <fpage>312</fpage>
          -
          <lpage>315</lpage>
          . URL: https://doi.org/10.23919/iLRN47897.2020.9155203.
        </mixed-citation>
      </ref>
      <ref id="ref96">
        <mixed-citation>
          [96]
          <string-name>
            <given-names>S.</given-names>
            <surname>Sendari</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>Wibawanto</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.</given-names>
            <surname>Jasmine</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Jiono</surname>
          </string-name>
          ,
          <string-name>
            <given-names>P.</given-names>
            <surname>Puspitasari</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Diantoro</surname>
          </string-name>
          ,
          <string-name>
            <given-names>H.</given-names>
            <surname>Nur</surname>
          </string-name>
          ,
          <article-title>Integrating Robo-PEM with AR Application for Introducing Fuel Cell Implementation</article-title>
          , in:
          <source>7th International Conference on Electrical, Electronics and Information Engineering: Technological Breakthrough for Greater New Life, ICEEIE 2021</source>
          , Institute of Electrical and Electronics Engineers Inc.,
          <year>2021</year>
          . URL: https://doi.org/10.1109/ICEEIE52663.2021.9616683.
        </mixed-citation>
      </ref>
      <ref id="ref97">
        <mixed-citation>
          [97]
          <string-name>
            <given-names>T.</given-names>
            <surname>Sharkey</surname>
          </string-name>
          ,
          <string-name>
            <given-names>R.</given-names>
            <surname>Twomey</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Eguchi</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Sweet</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Y. C.</given-names>
            <surname>Wu</surname>
          </string-name>
          ,
          <article-title>Need Finding for an Embodied Coding Platform: Educators' Practices and Perspectives</article-title>
          , in:
          <string-name><given-names>M.</given-names><surname>Cukurova</surname></string-name>
          ,
          <string-name><given-names>N.</given-names><surname>Rummel</surname></string-name>
          ,
          <string-name><given-names>D.</given-names><surname>Gillet</surname></string-name>
          ,
          <string-name><given-names>B.</given-names><surname>McLaren</surname></string-name>
          ,
          <string-name><given-names>J.</given-names><surname>Uhomoibhi</surname></string-name>
          (Eds.),
          <source>International Conference on Computer Supported Education, CSEDU - Proceedings</source>
          , volume
          <volume>1</volume>
          , Science and Technology Publications, Lda,
          <year>2022</year>
          , pp.
          <fpage>216</fpage>
          -
          <lpage>227</lpage>
          . URL: https://doi.org/10.5220/0011000200003182.
        </mixed-citation>
      </ref>
      <ref id="ref98">
        <mixed-citation>
          [98]
          <string-name>
            <given-names>N.</given-names>
            <surname>Spasova</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Ivanova</surname>
          </string-name>
          ,
          <article-title>Towards augmented reality technology in CAD/CAM systems and engineering education</article-title>
          , in: I. Roceanu (Ed.),
          <source>eLearning and Software for Education Conference</source>
          , National Defence University - Carol I Printing House,
          <year>2020</year>
          , pp.
          <fpage>496</fpage>
          -
          <lpage>503</lpage>
          . URL: https://doi.org/10.12753/2066-026X-20-151.
        </mixed-citation>
      </ref>
      <ref id="ref99">
        <mixed-citation>
          [99]
          <string-name>
            <given-names>D.</given-names>
            <surname>Tennakoon</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A. U.</given-names>
            <surname>Usmani</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Usman</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Vasileiou</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>Latchaev</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Baljko</surname>
          </string-name>
          ,
          <string-name><given-names>U. T.</given-names><surname>Khan</surname></string-name>
          ,
          <string-name><given-names>M. A.</given-names><surname>Perras</surname></string-name>
          ,
          <string-name><given-names>M.</given-names><surname>Jadidi</surname></string-name>
          ,
          <article-title>TEaching Earth Systems Beyond the Classroom: Developing a Mixed Reality (XR) Sandbox</article-title>
          , in:
          <source>ASEE Annual Conference and Exposition, Conference Proceedings</source>
          , American Society for Engineering Education,
          <year>2022</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref100">
        <mixed-citation>
          [100]
          <string-name>
            <given-names>A.</given-names>
            <surname>Toguchi</surname>
          </string-name>
          ,
          <string-name>
            <given-names>H.</given-names>
            <surname>Sasaki</surname>
          </string-name>
          ,
          <string-name>
            <given-names>K.</given-names>
            <surname>Mizuno</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Shikoda</surname>
          </string-name>
          ,
          <article-title>Build a prototype of new e-Learning contents by using the AR technology</article-title>
          ,
          in:
          <source>IMSCI 2011 - 5th International Multi-Conference on Society, Cybernetics and Informatics, Proceedings</source>
          , volume
          <volume>1</volume>
          , International Institute of Informatics and Systemics, IIIS,
          <year>2011</year>
          , pp.
          <fpage>261</fpage>
          -
          <lpage>264</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref101">
        <mixed-citation>
          [101]
          <string-name>
            <given-names>A.</given-names>
            <surname>Toguchi</surname>
          </string-name>
          ,
          <string-name>
            <given-names>H.</given-names>
            <surname>Sasaki</surname>
          </string-name>
          ,
          <string-name>
            <given-names>K.</given-names>
            <surname>Mizuno</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Shikoda</surname>
          </string-name>
          ,
          <article-title>Development of new e-Learning contents for improvement of laboratory courses by using the AR technology</article-title>
          ,
          in:
          <source>IMSCI 2012 - 6th International Multi-Conference on Society, Cybernetics and Informatics, Proceedings</source>
          , International Institute of Informatics and Systemics, IIIS,
          <year>2012</year>
          , pp.
          <fpage>189</fpage>
          -
          <lpage>193</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref102">
        <mixed-citation>
          [102]
          <string-name>
            <given-names>N.</given-names>
            <surname>Tuli</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Mantri</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>Sharma</surname>
          </string-name>
          ,
          <article-title>Impact of augmented reality tabletop learning environment on learning and motivation of kindergarten kids</article-title>
          ,
          <source>AIP Conference Proceedings</source>
          <volume>2357</volume>
          (
          <year>2022</year>
          )
          040017
          . URL: https://doi.org/10.1063/5.0080600.
        </mixed-citation>
      </ref>
      <ref id="ref103">
        <mixed-citation>
          [103]
          <string-name>
            <given-names>I.</given-names>
            <surname>Wang</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Nguyen</surname>
          </string-name>
          ,
          <string-name>
            <given-names>H.</given-names>
            <surname>Le</surname>
          </string-name>
          ,
          <string-name>
            <given-names>W.</given-names>
            <surname>Yan</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>Hooper</surname>
          </string-name>
          ,
          <article-title>Enhancing Visualisation of Anatomical Presentation and Education Using Marker-based Augmented Reality Technology on Web-based Platform</article-title>
          ,
          in:
          <source>Proceedings of AVSS 2018 - 2018 15th IEEE International Conference on Advanced Video and Signal-Based Surveillance</source>
          , Institute of Electrical and Electronics Engineers Inc.,
          <year>2019</year>
          , p.
          <fpage>8639147</fpage>
          . URL: https://doi.org/10.1109/AVSS.2018.8639147.
        </mixed-citation>
      </ref>
      <ref id="ref104">
        <mixed-citation>
          [104]
          <string-name>
            <given-names>S.</given-names>
            <surname>Wongchiranuwat</surname>
          </string-name>
          ,
          <string-name>
            <given-names>C.</given-names>
            <surname>Samat</surname>
          </string-name>
          ,
          <article-title>Synthesis of theoretical framework for augmented reality learning environment to promote creative thinking on topic implementation of graphic design for grade 9 students</article-title>
          , in:
          <string-name><given-names>S. L.</given-names><surname>Wong</surname></string-name>
          ,
          <string-name><given-names>A. G.</given-names><surname>Barrera</surname></string-name>
          ,
          <string-name><given-names>H.</given-names><surname>Mitsuhara</surname></string-name>
          ,
          <string-name><given-names>G.</given-names><surname>Biswas</surname></string-name>
          ,
          <string-name><given-names>J.</given-names><surname>Jia</surname></string-name>
          ,
          <string-name><given-names>J.-C.</given-names><surname>Yang</surname></string-name>
          ,
          <string-name><given-names>M. P.</given-names><surname>Banawan</surname></string-name>
          ,
          <string-name><given-names>M.</given-names><surname>Demirbilek</surname></string-name>
          ,
          <string-name><given-names>M.</given-names><surname>Gaydos</surname></string-name>
          ,
          <string-name><given-names>C.-P.</given-names><surname>Lin</surname></string-name>
          ,
          <string-name><given-names>J. G.</given-names><surname>Shon</surname></string-name>
          ,
          <string-name><given-names>S.</given-names><surname>Iyer</surname></string-name>
          ,
          <string-name><given-names>A.</given-names><surname>Gulz</surname></string-name>
          ,
          <string-name><given-names>C.</given-names><surname>Holden</surname></string-name>
          ,
          <string-name><given-names>G.</given-names><surname>Kessler</surname></string-name>
          ,
          <string-name><given-names>M. M. T.</given-names><surname>Rodrigo</surname></string-name>
          ,
          <string-name><given-names>P.</given-names><surname>Sengupta</surname></string-name>
          ,
          <string-name><given-names>P.</given-names><surname>Taalas</surname></string-name>
          ,
          <string-name><given-names>W.</given-names><surname>Chen</surname></string-name>
          ,
          <string-name><given-names>S.</given-names><surname>Murthy</surname></string-name>
          ,
          <string-name><given-names>B.</given-names><surname>Kim</surname></string-name>
          ,
          <string-name><given-names>X.</given-names><surname>Ochoa</surname></string-name>
          ,
          <string-name><given-names>D.</given-names><surname>Sun</surname></string-name>
          ,
          <string-name><given-names>N.</given-names><surname>Baloian</surname></string-name>
          ,
          <string-name><given-names>T.</given-names><surname>Hoel</surname></string-name>
          ,
          <string-name><given-names>U.</given-names><surname>Hoppe</surname></string-name>
          ,
          <string-name><given-names>T.-C.</given-names><surname>Hsu</surname></string-name>
          ,
          <string-name><given-names>A.</given-names><surname>Kukulska-Hulme</surname></string-name>
          ,
          <string-name><given-names>H.-C.</given-names><surname>Chu</surname></string-name>
          ,
          <string-name><given-names>X.</given-names><surname>Gu</surname></string-name>
          ,
          <string-name><given-names>W.</given-names><surname>Chen</surname></string-name>
          ,
          <string-name><given-names>J. S.</given-names><surname>Huang</surname></string-name>
          ,
          <string-name><given-names>M.-F.</given-names><surname>Jan</surname></string-name>
          ,
          <string-name><given-names>L.-H.</given-names><surname>Wong</surname></string-name>
          ,
          <string-name><given-names>C.</given-names><surname>Yin</surname></string-name>
          (Eds.),
          <source>ICCE 2016 - 24th International Conference on Computers in Education: Think Global Act Local - Main Conference Proceedings</source>
          , Asia-Pacific Society for Computers in Education,
          <year>2016</year>
          , pp.
          <fpage>639</fpage>
          -
          <lpage>641</lpage>
          . URL: https://files.eric.ed.gov/fulltext/EJ1211500.pdf.
        </mixed-citation>
      </ref>
      <ref id="ref105">
        <mixed-citation>
          [105] ngrok,
          <source>Unified Application Delivery Platform for Developers</source>
          ,
          <year>2024</year>
          . URL: https://ngrok.com/.
        </mixed-citation>
      </ref>
      <ref id="ref106">
        <mixed-citation>
          [106] ngrok, Your Authtoken,
          <year>2024</year>
          . URL: https://dashboard.ngrok.com/get-started/your-authtoken.
        </mixed-citation>
      </ref>
      <ref id="ref107">
        <mixed-citation>
          [107] TrackJS LLC,
          <source>Remote JavaScript Debugger - RemoteJS</source>
          ,
          <year>2022</year>
          . URL: https://remotejs.com/.
        </mixed-citation>
      </ref>
      <ref id="ref108">
        <mixed-citation>
          [108]
          MDN contributors,
          <article-title>WebGL: 2D and 3D graphics for the web</article-title>
          ,
          <year>2023</year>
          . URL: https://developer.mozilla.org/en-US/docs/Web/API/WebGL_API.
        </mixed-citation>
      </ref>
      <ref id="ref109">
        <mixed-citation>
          [109]
          <source>Three.js - JavaScript 3D Library</source>
          ,
          <year>2024</year>
          . URL: https://threejs.org/.
        </mixed-citation>
      </ref>
      <ref id="ref110">
        <mixed-citation>
          [110]
          <string-name>
            <given-names>A.</given-names>
            <surname>Klavins</surname>
          </string-name>
          ,
          <article-title>9 ideas for creating tech-infused augmented reality T-shirts</article-title>
          ,
          <year>2021</year>
          . URL: https://overlyapp.com/blog/9-ideas-for-creating-tech-infused-augmented-reality-t-shirts/.
        </mixed-citation>
      </ref>
      <ref id="ref111">
        <mixed-citation>
          [111]
          <source>Face Landmarks Detection</source>
          ,
          <year>2023</year>
          . URL: https://github.com/tensorflow/tfjs-models/tree/master/face-landmarks-detection.
        </mixed-citation>
      </ref>
      <ref id="ref112">
        <mixed-citation>
          [112]
          Google LLC
          ,
          <article-title>Face landmark detection guide | MediaPipe | Google for Developers</article-title>
          ,
          <year>2023</year>
          . URL: https://developers.google.com/mediapipe/solutions/vision/face_landmarker/.
        </mixed-citation>
      </ref>
      <ref id="ref113">
        <mixed-citation>
          [113]
          <article-title>Face reference assets for Meta Spark Studio</article-title>
          ,
          <year>2023</year>
          . URL: https://spark.meta.com/learn/articles/people-tracking/face-reference-assets.
        </mixed-citation>
      </ref>
      <ref id="ref114">
        <mixed-citation>
          [114]
          <article-title>The face mask template in Adobe® Photoshop®</article-title>
          ,
          <year>2023</year>
          . URL: https://spark.meta.com/learn/articles/creating-and-prepping-assets/the-face-mask-template-in-Adobe.
        </mixed-citation>
      </ref>
      <ref id="ref115">
        <mixed-citation>
          [115]
          <source>TensorFlow</source>
          ,
          <year>2024</year>
          . URL: https://www.tensorflow.org/.
        </mixed-citation>
      </ref>
      <ref id="ref116">
        <mixed-citation>
          [116]
          <source>TensorFlow.js models</source>
          ,
          <year>2024</year>
          . URL: https://www.tensorflow.org/js/models.
        </mixed-citation>
      </ref>
      <ref id="ref117">
        <mixed-citation>
          [117]
          <source>Hand Pose Detection</source>
          ,
          <year>2023</year>
          . URL: https://github.com/tensorflow/tfjs-models/tree/master/hand-pose-detection.
        </mixed-citation>
      </ref>
      <ref id="ref118">
        <mixed-citation>
          [118]
          <source>Find Pre-trained Models | Kaggle</source>
          ,
          <year>2024</year>
          . URL: https://www.kaggle.com/models.
        </mixed-citation>
      </ref>
      <ref id="ref119">
        <mixed-citation>
          [119]
          Google,
          <source>Teachable Machine</source>
          ,
          <year>2017</year>
          . URL: https://teachablemachine.withgoogle.com/.
        </mixed-citation>
      </ref>
      <ref id="ref120">
        <mixed-citation>
          [120]
          <source>MobileNet</source>
          ,
          <year>2023</year>
          . URL: https://github.com/tensorflow/tfjs-models/tree/master/mobilenet.
        </mixed-citation>
      </ref>
      <ref id="ref121">
        <mixed-citation>
          [121]
          <source>Speech Command Recognizer</source>
          ,
          <year>2024</year>
          . URL: https://github.com/tensorflow/tfjs-models/tree/master/speech-commands.
        </mixed-citation>
      </ref>
      <ref id="ref122">
        <mixed-citation>
          [122]
          <source>Pose Detection in the Browser: PoseNet Model</source>
          ,
          <year>2024</year>
          . URL: https://github.com/tensorflow/tfjs-models/tree/master/posenet.
        </mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>