<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta />
    <article-meta>
      <title-group>
        <article-title>Harnessing immersive technologies for enhancing Japanese language acquisition: Methodological insights for prospective language educators</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Olena V. Gayevska</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <aff id="aff0">
          <label>0</label>
          <institution>Taras Shevchenko National University of Kyiv</institution>
          ,
          <addr-line>60 Volodymyrska Str., Kyiv, 01033</addr-line>
          ,
          <country country="UA">Ukraine</country>
        </aff>
      </contrib-group>
      <fpage>60</fpage>
      <lpage>69</lpage>
      <abstract>
        <p>This article explores the methodological aspects of leveraging immersive technologies to facilitate Japanese language acquisition for prospective language educators. The study analyses the application of virtual and augmented reality in supporting and organising Japanese language learning for aspiring language teachers, and identifies the primary approaches to employing augmented reality in language education. The findings suggest that immersive technologies introduce a novel paradigm for educational materials, positively influencing the development of fundamental and professional competencies in future Japanese language educators. These technologies can be particularly effective when integrated into a blended learning model that combines distance, online, traditional, and self-directed learning of Oriental languages. The study highlights the need for further research to develop guidelines for utilising immersive technologies in teaching Oriental languages at various stages of language teacher training.</p>
      </abstract>
      <kwd-group>
        <kwd>immersive technologies</kwd>
        <kwd>augmented reality</kwd>
        <kwd>virtual reality</kwd>
        <kwd>Japanese language acquisition</kwd>
        <kwd>prospective Japanese language educators</kwd>
        <kwd>blended learning</kwd>
        <kwd>oriental language learning</kwd>
        <kwd>distance education</kwd>
      </kwd-group>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>1. Introduction</title>
      <p>The rapid advancement of information and communication technologies (ICTs) and their pervasive
integration into various domains of human activity necessitate the adaptation of young individuals
to novel modes of working, living, and interacting. Contemporary technologies, employed across
diverse professional fields, hold immense significance for incorporation into the educational process,
particularly within general education institutions, and are crucial for the competitiveness of youth in
the global job market.</p>
      <p>
        In this context, immersive technologies (ITs) are gaining increasing popularity in the education
sector [
        <xref ref-type="bibr" rid="ref1 ref2 ref3">1, 2, 3</xref>
        ]. These technologies, which extend reality or create new realities by harnessing the 360°
space, are exerting a profound influence on numerous facets of 21st-century life, including commerce,
tourism, the interaction with and perception of digital information and media, science, and education
[
        <xref ref-type="bibr" rid="ref4 ref5 ref6 ref7">4, 5, 6, 7</xref>
        ]. Makransky and Petersen [
        <xref ref-type="bibr" rid="ref8">8</xref>
        ] emphasise that the application of these technologies can enhance
real-world visualisation through the incorporation of virtual objects, graphics, and object recognition
technologies.
      </p>
      <p>
        ITs encompass virtual reality (VR), augmented reality (AR), and mixed reality (MR), which are
currently being employed in a wide array of fields, ranging from gaming and entertainment; theatre and
live events; museums and cultural heritage; marketing, advertising, and tourism; architecture, product
development, and design; to simulation and healthcare [
        <xref ref-type="bibr" rid="ref10 ref11 ref12 ref13 ref14 ref9">9, 10, 11, 12, 13, 14</xref>
        ].
      </p>
      <p>While ITs are predominantly utilised in science education to cover topics such as human anatomy
(Anatomy AR-VR, AR Human Anatomy, The Brain AR App, etc.), the universe (Planets AR, EARTH
AR Poster, etc.), chemical reactions (MoleculAR, Chemistry Augmented Reality Education Arloon,
etc.), and plant anatomy (Froggipedia, Arloon Plants AR, etc.), this paper focuses on their role in
foreign language education, specifically in the context of Japanese language learning. Given the limited
availability of applications and IT content tailored for language education, we will explore all potential
applications of ITs (VR and AR) in Japanese language acquisition and the experiences of students.</p>
    </sec>
    <sec id="sec-2">
      <title>2. Theoretical backgrounds</title>
      <p>The multifunctionality of ITs and the unfamiliarity of the concept of “virtual reality” have catalysed the
actualisation of the term “virtual” and the rapid expansion of its scope, serving as an impetus for the
conceptual design of the idea of virtual reality across various domains of human activity.</p>
      <p>
        AR holds immense potential in the field of language education [
        <xref ref-type="bibr" rid="ref15 ref16">15, 16</xref>
        ], as it serves two primary
functions: contextual visualisation (i.e., the presentation of virtual information within an extended
context) and interactivity of learning (i.e., the embodiment of interaction with virtual content). VR,
on the other hand, is a virtual 3D world that enables users to experience visual simulations and feel
immersed in an environment free from temporal and spatial constraints.
      </p>
      <p>
        The popularisation of the phrase “virtual reality” can be attributed to Jaron Lanier in the late 1980s
[
        <xref ref-type="bibr" rid="ref17">17</xref>
        ].
      </p>
      <p>
        At the current stage of ICT development, immersive technologies based on VR can be categorised as
follows (figure 1):
• VR with full immersion, which provides a realistic simulation of the virtual world with a high
degree of detail (e.g., the Virtual Shooter game zone);
• Partial immersion VR, consisting of VR and real-world attributes, is achieved by embodying
computer graphics objects in a reality scene (e.g., a flight simulator) [
        <xref ref-type="bibr" rid="ref18">18</xref>
        ];
• VR without immersion, related to the virtual experience with a computer, where users can control
individual characters or their actions in the software, while the environment does not directly
interact with the user (e.g., World of Warcraft, ReHabgame);
• VR with group work, which represents a three-dimensional virtual world with elements of a
social network (e.g., Minecraft already has a version of virtual reality supported by Oculus Rift
and Gear VR helmets) [
        <xref ref-type="bibr" rid="ref19">19</xref>
        ];
• CAVE (Cave Automatic Virtual Environment), developed by students at the University of Illinois
in 1995, is a three-dimensional stage with wall projections [
        <xref ref-type="bibr" rid="ref18 ref20">18, 20</xref>
        ].
      </p>
      <p>
        The term “Augmented Reality” was coined by aircraft engineers Caudell and Mizell [
        <xref ref-type="bibr" rid="ref21">21</xref>
        ] in 1990.
They developed head-mounted displays as equipment for electricians to be used during the assembly of
complex wiring harnesses [
        <xref ref-type="bibr" rid="ref22">22</xref>
        ].
      </p>
      <p>
        Nelson [
        <xref ref-type="bibr" rid="ref23">23</xref>
        ] identifies augmented reality as an essential element of the “Bring Your Own Device”
(BYOD) approach, which entails the use of mobile devices by teachers and students in the classroom for
learning purposes.
      </p>
      <p>
        Calo et al. [
        <xref ref-type="bibr" rid="ref24">24</xref>
        ] define Augmented Reality as “. . . a mobile or embedded technology that senses,
processes, and outputs data in real-time, recognises and tracks real-world objects, and provides contextual
information by supplementing or replacing human senses.”
      </p>
      <p>
        AR is a technology that incorporates digital information such as images, video, and audio into
real-world spaces, enabling the blending of virtual environments with reality [
        <xref ref-type="bibr" rid="ref25">25</xref>
        ]. Users of this technology
have the opportunity to learn in immersive, computer-generated environments through realistic sensory
experiences.
      </p>
      <p>Mobile AR applications can be grouped into three categories based on their purpose, place of use,
and usability: marker-based, creation-based, and marker-less AR (figure 2).</p>
      <p>It is worth noting that some applications in these categories may possess both creation-based and
marker-less features. However, a marker-based application cannot also be marker-less, since it can only
function with printed markers such as flashcards.</p>
      <p>
        We can distinguish the following types of mobile AR [
        <xref ref-type="bibr" rid="ref26">26</xref>
        ]:
• marker-based, which uses a camera and a special visual marker, such as a QR code (quick response
code);
• creation-based, which uses a browser-based platform allowing users to upload 3D files and edit
them with comments, detailed instructions, and animations via a drag-and-drop interface;
• marker-less, which uses the Global Positioning System (GPS); the most common uses are to
mark destinations, search for the correct location, such as a café or office, or in location-oriented
applications.</p>
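<p>The three-way taxonomy above can be sketched as a small dispatch function. This is an illustrative sketch only: the capability flags and function name are assumptions for exposition, not drawn from the cited framework [26].</p>

```python
def classify_mobile_ar(uses_visual_marker: bool, uses_gps: bool,
                       browser_authoring: bool) -> str:
    """Classify a mobile AR application by its primary anchoring mechanism.

    The boolean flags are an illustrative simplification of the
    marker-based / creation-based / marker-less taxonomy.
    """
    if uses_visual_marker:
        # e.g. apps that scan a QR code or printed flashcard
        return "marker-based"
    if browser_authoring:
        # e.g. web platforms where users upload and annotate 3D files
        return "creation-based"
    if uses_gps:
        # e.g. location-oriented apps that mark destinations
        return "marker-less"
    return "unclassified"


print(classify_mobile_ar(True, False, False))   # marker-based
print(classify_mobile_ar(False, True, False))   # marker-less
```

<p>Note that the marker check comes first: as stated above, a marker-based application cannot simultaneously be marker-less.</p>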
      <p>
        Researchers have identified several positive effects of AR on students’ foreign language learning,
including enhancing the effectiveness of their language skills in professional translation, increasing
motivation to learn, and engaging students in collaboration with each other and with native speakers
of the foreign language being studied [
        <xref ref-type="bibr" rid="ref19 ref23 ref25 ref27 ref28 ref29 ref30 ref31 ref32 ref8">8, 19, 23, 25, 27, 28, 29, 30, 31, 32</xref>
        ]. These findings reinforce AR’s two core functions in language education: contextual visualisation
(i.e., the presentation of virtual information within an extended context) and interactivity of learning
(i.e., the embodiment of interaction with virtual content).
      </p>
      <p>
        A review of the literature by Viberg and Grönlund [
        <xref ref-type="bibr" rid="ref33">33</xref>
        ] indicates that Mobile Assisted Language
Learning (MALL), as a mobile technology that can be adapted to support language learning, is applied
in various ways but generally focuses on vocabulary acquisition, listening and speaking skills, and
language acquisition, while grammar learning, pronunciation, and writing skills were underrepresented
in the application of MALL.
      </p>
      <p>
        Hein et al. [
        <xref ref-type="bibr" rid="ref34">34</xref>
        ] analysed 2,507 sources and selected 54 articles published between 2001 and 2020 that
related to the role of immersive technology in students’ foreign language learning.
      </p>
      <p>They found that most of these studies focused on comparative analysis of traditional and blended
learning methods, the latter incorporating the use of VR and AR. The main characteristics of these technologies
that support foreign language learning are the promotion of vocabulary learning, the development
of speaking skills and intercultural competence, students’ motivation for foreign language learning,
and the ability to overcome anxiety and discomfort when communicating in a foreign language. The
advantage of learning with AR over traditional teaching methods lies in the fact that students are
given the opportunity to feel, rather than imagine, the subject, situation, or scenario, which cannot be
demonstrated or described using traditional teaching methods.</p>
      <p>The purpose of this article is to analyse the use of immersive technologies for supporting and organising
Japanese language learning for prospective language teachers and to identify the main approaches to
the use of immersive technologies in Japanese language learning.</p>
    </sec>
    <sec id="sec-3">
      <title>3. Research methods</title>
      <p>To achieve the purpose of our study and clarify the problem of utilising immersive technologies for
prospective Japanese language teachers, we employed the following methods: systematic and
comparative analysis of pedagogical, psychological, philosophical, sociological works, methodological and
specialised literature; analysis of the pedagogical experience of using immersive technologies at the
Institute of Philology of Taras Shevchenko National University of Kyiv in lectures and seminars on “Japanese
characters”; synthesis and generalisation to formulate the main points of the study; interpretation of
the research results through a student survey and comparative analysis of exam results in Japanese
lexicology of students who studied the language using ICT and immersive technologies with exam
results of students who studied the language using ICT but not immersive technologies. The research
hypothesis is based on the assumption that the training of prospective Japanese language teachers will
be effective if the following pedagogical conditions are implemented: activating the motivation of future
foreign language teachers to carry out project activities using immersive technologies as didactic tools
for learning Japanese; improving the content of training future foreign language teachers to form their
knowledge about the use of information and communication technologies and immersive technologies
for learning Japanese.</p>
    </sec>
    <sec id="sec-4">
      <title>4. Results and discussion</title>
      <p>Scientists attach special importance to the use of augmented reality in the study of Oriental languages
by students, particularly prospective teachers of the Japanese language.</p>
      <p>They note that the preparation of prospective teachers of Oriental languages (including languages
with character-based writing, such as Japanese and Chinese) for professional activities is a complex
process, as it differs significantly from the study and teaching of other foreign languages (for instance,
English, French, German, Italian, Spanish, and Turkish, which are also included in the educational
planning of the Institute of Philology of Taras Shevchenko National University of Kyiv).</p>
      <p>
        Researchers recognise the use of ITs as a solution to the problems of fast, active, correct, and
convenient Oriental language learning by students [
        <xref ref-type="bibr" rid="ref23 ref25 ref27 ref29 ref30 ref8">27, 25, 29, 8, 30, 23</xref>
        ]. They note that the use of these
technologies can improve real-world visualisation with virtual objects, graphics, and object recognition
technologies.
      </p>
      <p>
        Frazier et al. [
        <xref ref-type="bibr" rid="ref29">29</xref>
        ] highlight the application of Google Earth VR and AR for foreign language
learning, including Japanese, which allows users to visit different locations throughout the world, while
simultaneously supporting their own learning of various subjects, such as history, political studies,
international relations, etc.
      </p>
      <p>
        Google Earth AR includes numerous instruments, like Mindshow, for the creation of new exciting
places and their use in role-playing [
        <xref ref-type="bibr" rid="ref23">23</xref>
        ]. This tool is marker-less and uses GPS. Researchers note that these instruments are useful for
distance language learning, although their use should be supervised by a teacher.
      </p>
      <p>Attention should also be paid to the possibility of foreign language learning, particularly Japanese, with
the help of this service and others that present various fields of science in Japanese.</p>
      <p>
        It is important to emphasise the potential of augmented reality services that support the teaching of
various disciplines. Special emphasis should be placed on training in the fields of STEM education, which
involves integration between the disciplines of natural sciences, technological sciences, engineering,
and mathematics [
        <xref ref-type="bibr" rid="ref26 ref35 ref36">26, 35, 36</xref>
        ]. For example, many augmented reality applications offer materials in
Japanese (BioDigital Human 3D anatomy, 3D Anatomy Learning – Atlas, GeoGebraAR, Planets AR,
etc.). It is clear that the vocabulary of these applications is designed for students who have language
skills at the B1 level and above.
      </p>
      <p>
        Geng and Yamada [
        <xref ref-type="bibr" rid="ref30 ref31">30, 31</xref>
        ] offer their experience of using AR generators to create markers based on
Kanji characters as QR codes. They developed an AR compound verb learning system to support the
learning of Japanese verbs. Under this system, students can scan a card with the Kanji characters of a
particular verb and watch an animation that displays the corresponding action with the card through
the smartphone screen in the application. “In this system, the meanings of verbs, including both single
verbs and compound verbs, were represented by 3D animations created using Maya, according to the
image schemas of the verbs. Maya is a 3D computer graphics software, and it is used to create interactive
3D animations and visual effects”. The application was developed using Unity 3D and
Vuforia. In addition, a combination function was proposed based on combining two cards
with the corresponding Kanji characters (V1 + V2) to facilitate the effective study of compound verbs by
students. The researchers found the AR-based approach to Oriental language learning to be
more effective for students than the traditional method.
      </p>
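<p>The V1 + V2 card-combination idea can be illustrated with a minimal lookup table. This is a hedged sketch of the concept only: the table, glosses, and function names are illustrative assumptions, not Geng and Yamada’s actual Unity 3D/Vuforia implementation.</p>

```python
# Minimal sketch of the V1 + V2 card-combination idea: scanning two
# Kanji verb cards yields a compound verb if the pair is valid.
COMPOUNDS = {
    # (V1 card, V2 card) -> (compound verb, English gloss)
    ("飛ぶ", "込む"): ("飛び込む", "to jump into"),
    ("走る", "出す"): ("走り出す", "to start running"),
    ("書く", "直す"): ("書き直す", "to rewrite"),
}

def combine_cards(v1: str, v2: str):
    """Return the compound verb formed by two scanned Kanji cards,
    or None if the ordered pair is not a valid compound in the table."""
    return COMPOUNDS.get((v1, v2))

print(combine_cards("飛ぶ", "込む"))  # ('飛び込む', 'to jump into')
```

<p>Card order matters, mirroring the V1 + V2 structure of Japanese compound verbs: reversing the cards yields no match.</p>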
      <p>
        Platte et al. [
        <xref ref-type="bibr" rid="ref37">37</xref>
        ] suggest using ARTranslate (https://github.com/benpla/ARTranslate) for foreign
language learning with augmented reality. ARTranslate is software that recognises up to 1,000 objects
in a user’s environment using a Convolutional Neural Network (CNN) and names them
accordingly. Recognised objects are overlaid with 3D information in different languages using AR. The user can
access the surrounding everyday objects in any language by switching languages in the ARTranslate
application settings. The software runs on iOS version 12.
      </p>
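<p>The recognise-then-label flow of such a tool can be sketched as follows. The tiny label dictionary and the stubbed detector below are assumptions for illustration, not ARTranslate’s real data or API.</p>

```python
# Schematic of the flow: a CNN classifier detects an object class,
# and the app overlays its name in the currently selected language.
LABELS = {
    "apple": {"en": "apple", "ja": "りんご", "de": "Apfel"},
    "chair": {"en": "chair", "ja": "いす",   "de": "Stuhl"},
}

def detect_object(image) -> str:
    """Stand-in for the CNN recogniser; always 'sees' an apple."""
    return "apple"

def ar_label(image, language: str) -> str:
    """Name the detected object in the user's chosen language,
    falling back to English for unsupported languages."""
    obj = detect_object(image)
    return LABELS[obj].get(language, LABELS[obj]["en"])

print(ar_label(None, "ja"))  # りんご
```

<p>Switching the <code>language</code> argument corresponds to changing the language in the application settings: the detector output stays the same, only the lookup changes.</p>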
      <p>We surveyed students (31 students participated in this survey) about their attitudes towards the use
of ITs to improve the quality of Japanese language learning. We proposed the following statements,
which students rated as “Strongly disagree”, “Disagree”, “Neither agree nor disagree”, “Agree”, or “Strongly
agree”: “I have a clear understanding of what ITs are and how I can integrate them into my own
education process”, “I have heard about ITs in foreign language learning”, “I have discussed ITs for
foreign language learning with my friends”, “I have experience with teachers using approaches with
ITs for Japanese language learning”.</p>
      <p>According to the questionnaire analysis of students’ attitudes and understanding of ITs in the Japanese
language learning process, it was found that students understand what augmented reality is but have
not used these tools to learn Japanese: “I have a clear understanding of what ITs are and how they can be
integrated into my own education process”: Strongly disagree – 8% students; Disagree – 17% students;
Neither agree nor disagree – 32% students; Agree – 39% students; Strongly agree – 4% students; “I have
heard about ITs in foreign language learning”: Strongly disagree – 3%; Disagree – 16%; Neither agree
nor disagree – 28%; Agree – 49%; Strongly agree – 4%.</p>
      <p>We showed students options for using ITs for different levels of Japanese language learning
(Japanese language proficiency levels are described at https://www.jlpt.jp/), such as:
• ITs not intended for language learning, such as BioDigital Human 3D anatomy, 3D Anatomy Learning –
Atlas, GeoGebraAR, Planets AR, Google Earth AR and VR;
• ITs for language learning, such as Easy Japanese News, Triplens, ARTranslate;
• platforms for creating web projects with AR elements, such as BlippAR and Google ARCore, and
with VR, such as CoSpaces, for students to create their own examples of language learning.</p>
      <p>These tools were proposed for use by 3rd year Bachelor’s students in the study “Japanese Kanji
characters”, 4th year Bachelor’s students in the study “Linguistic Tradition of Japan”, 4th year Bachelor’s
students in the study “Japanese Language Etiquette”, 2nd year Bachelor’s students in the study “Japanese
language: Practical Course for Translators”, and 1st-2nd year Bachelor’s students in the study “Oriental
Language (Japanese language)” of the Department of Languages and Literatures of the Far East and
Southeast Asia of the Institute of Philology of Taras Shevchenko National University of Kyiv.</p>
      <p>After classes and self-study with the help of ITs, a survey of students acting as experts (27
students) was conducted on the choice of approaches to the study of Japanese characters. They were
asked to use the Likert scale method to rank approaches to language learning according to their
importance – from ineffective (1 point) to very effective (5 points).</p>
      <p>Approaches to the study of Japanese Kanji characters were determined according to traditional
methods (direct method, grammar-translation method, audio-lingual method, cognitive method) and
considering the use of information and communication technologies, in particular immersive
technologies.</p>
      <p>Our students were offered the following approaches to Japanese Kanji (漢字) learning for the
assessment:
• use of electronic dictionaries;
• search and use of Internet resources;
• usage of online educational literature;
• creation and application of their own associations (offline);
• handwriting Kanji characters (offline);
• use of AR and VR applications;
• creation of their own educational materials on the basis of ITs.</p>
      <p>The results of this questionnaire are presented in table 2 “Results of students’ questionnaires on their
opinion on the choice of approaches to the Japanese Kanji characters learning”.</p>
      <p>Thus, the results of students’ questionnaires about their opinion on the choice of methods for studying
Japanese Kanji characters showed that the most necessary approach for them was based on the creation
of students’ own learning materials using augmented reality (5). According to interviews with students
who wished to comment on their answers, this was motivated by the creation of augmented reality
Kanji characters that would be of interest to other students and reflect the most difficult cases in Oriental
language translation practice. The use of electronic dictionaries (4.8) is also important, as most AR
applications are focused on the assimilation of foreign language vocabulary by users (for example,
Triplens, ARTranslate, etc.).</p>
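<p>The per-approach scores quoted above (5 and 4.8) are Likert means, which can be computed as below; the response vectors are hypothetical placeholders, not the study’s raw data.</p>

```python
# Averaging 1-5 Likert ratings from the 27 student experts yields
# the per-approach scores reported in table 2.
def likert_mean(ratings: list[int]) -> float:
    """Mean of 1-5 Likert ratings, rounded to one decimal place."""
    assert all(1 <= r <= 5 for r in ratings), "Likert ratings are 1-5"
    return round(sum(ratings) / len(ratings), 1)

# Hypothetical response vectors shaped to reproduce the reported means.
own_ar_materials = [5, 5, 5, 5, 5]   # unanimous top rating -> 5.0
e_dictionaries   = [5, 5, 5, 4, 5]   # mean -> 4.8

print(likert_mean(own_ar_materials))  # 5.0
print(likert_mean(e_dictionaries))    # 4.8
```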
      <p>To achieve our goal, we created, organised and implemented educational content (training course)
“Information Support of Philological Research in Japanese Studies” for philology bachelor’s students of
Oriental languages, based on the use of immersive technologies. It consists of the following modules:
Module 1 “Theoretical foundations of the use of ICT in the study of foreign languages”, covering topics
such as “Basic concepts”, “Methods of using ICT in the study of foreign languages”, etc.; Module 2
“Electronic educational resources for learning a foreign language (Japanese)”, which covers topics such
as “Electronic dictionaries and their practical use in translation and teaching”, “Online tests in foreign
languages: the use of international test systems and the creation of personal tests using web services”;
Module 3 “Immersive technologies of learning a foreign language (Japanese)”, which covers such topics
as “Model of learning a foreign language using virtual reality”, “Model of learning a foreign language
using augmented reality”; Module 4 “Research activities on the establishment of Electronic Educational
Resources for the translation and teaching of Japanese”.</p>
      <p>Students were divided into groups according to their desire to learn the language using ICT, including
immersive technologies, which feature in separate modules of the course “Information Support of
Philological Research in Japanese Studies”. This course is part of a series of linguistic disciplines that form the
philological basis of the bachelor’s programme at the Institute of Philology of Taras Shevchenko National
University of Kyiv, delivered in different lectures and seminars.</p>
      <p>In response to the question “Do you want to learn a language using immersive technologies?”, 21
students answered, while 8 students did not take an active part in the survey and training due to
extreme conditions (military action in Ukraine). As a result of the survey, two groups were created: 11
students who will study language using ICT and immersive technologies, and 10 students who will
study language using ICT but not immersive technologies. The group of students studying the “Japanese
language and literature” course using ICT and immersive technologies passed the exam with an average
of 95 points, while the group of students that studied language using ICT but did not use immersive
technology passed the exam with an average of 85 points.</p>
    </sec>
    <sec id="sec-5">
      <title>5. Conclusions and prospects for further research</title>
      <p>In conclusion, immersive technologies provide a new paradigm for the presentation of educational
materials, positively impacting the formation of fundamental and professional competencies in prospective
Japanese language teachers. We can identify the following benefits of using ITs to train future teachers
of the Japanese language:
• the use of ITs makes the learning process more visual and mobile;
• the use of ITs increases students’ interest and motivation to learn the language;
• ITs improve the learning process by incorporating innovative forms of student engagement;
• ITs create conditions for the formation and development of students’ creative abilities;
• these technologies and approaches contribute to the support of the linguistic and cultural aspect
of student learning.</p>
      <p>The following approaches to the use of ITs for the study of Japanese by students should be
distinguished: 1) the use of specialised applications for language learning; 2) the use of applications for
studying other disciplines (anatomy, biology, computer science, astronomy, etc.) while simultaneously
learning a foreign language; 3) the creation of personal examples by students for learning a foreign
language with the help of special web platforms.</p>
      <p>ITs can be effective when used in blended learning that combines distance, online, traditional, and
self-directed learning of Oriental languages.</p>
      <p>The author plans to continue the longitudinal research, analysing the statistical data of students’
academic performance and expanding the research to several other subjects (taught at Taras Shevchenko
National University of Kyiv) during the academic year 2022-2023.</p>
      <p>Prospects for further research include the creation of guidelines and manuals on the use of immersive
technologies for the study of Oriental languages at different levels of training for prospective
teachers of the Japanese language.</p>
      <p>Declaration on Generative AI: The authors have not employed any Generative AI tools.</p>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>
          <source>[1] 2020 Augmented and Virtual Reality Survey Report, Technical Report, Perkins Coie</source>
          ,
          <year>2020</year>
          . URL: https://www.perkinscoie.com/images/content/2/3/v4/231654/2020-AR-VR-Survey-v3.pdf.
        </mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>
          [2]
          <string-name>
            <given-names>S. H.</given-names>
            <surname>Lytvynova</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S. O.</given-names>
            <surname>Semerikov</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A. M.</given-names>
            <surname>Striuk</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M. I.</given-names>
            <surname>Striuk</surname>
          </string-name>
          ,
          <string-name>
            <given-names>L. S.</given-names>
            <surname>Kolgatina</surname>
          </string-name>
          ,
          <string-name>
            <given-names>V. Y.</given-names>
            <surname>Velychko</surname>
          </string-name>
          ,
          <string-name>
            <given-names>I. S.</given-names>
            <surname>Mintii</surname>
          </string-name>
          ,
          <string-name>
            <given-names>O. O.</given-names>
            <surname>Kalinichenko</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S. M.</given-names>
            <surname>Tukalo</surname>
          </string-name>
          , AREdu 2021 -
          <article-title>Immersive technology today</article-title>
          ,
          <source>CEUR Workshop Proceedings</source>
          <volume>2898</volume>
          (
          <year>2021</year>
          )
          <fpage>1</fpage>
          -
          <lpage>40</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>
          [3]
          <string-name>
            <given-names>S.</given-names>
            <surname>Palamar</surname>
          </string-name>
          ,
          <string-name>
            <given-names>K.</given-names>
            <surname>Brovko</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>Semerikov</surname>
          </string-name>
          ,
          <article-title>Enhancing Foreign Language Learning in Ukraine: Immersive Technologies as Catalysts for Cognitive Interest and Achievement</article-title>
          ,
          <source>CEUR Workshop Proceedings</source>
          <volume>3624</volume>
          (
          <year>2023</year>
          )
          <fpage>69</fpage>
          -
          <lpage>81</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>
          [4]
          <string-name>
            <given-names>S. O.</given-names>
            <surname>Semerikov</surname>
          </string-name>
          ,
          <string-name>
            <given-names>T. A.</given-names>
            <surname>Vakaliuk</surname>
          </string-name>
          ,
          <string-name>
            <given-names>I. S.</given-names>
            <surname>Mintii</surname>
          </string-name>
          ,
          <string-name>
            <given-names>V. A.</given-names>
            <surname>Hamaniuk</surname>
          </string-name>
          ,
          <string-name>
            <given-names>V. N.</given-names>
            <surname>Soloviev</surname>
          </string-name>
          ,
          <string-name>
            <given-names>O. V.</given-names>
            <surname>Bondarenko</surname>
          </string-name>
          ,
          <string-name>
            <given-names>P. P.</given-names>
            <surname>Nechypurenko</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S. V.</given-names>
            <surname>Shokaliuk</surname>
          </string-name>
          ,
          <string-name>
            <given-names>N. V.</given-names>
            <surname>Moiseienko</surname>
          </string-name>
          ,
          <string-name>
            <given-names>D. S.</given-names>
            <surname>Shepiliev</surname>
          </string-name>
          ,
          <article-title>Immersive E-Learning Resources: Design Methods</article-title>
          , in: Digital Humanities Workshop, DHW 2021, Association for Computing Machinery, New York, NY, USA,
          <year>2022</year>
          , pp.
          <fpage>37</fpage>
          -
          <lpage>47</lpage>
          . doi:10.1145/3526242.3526264.
        </mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>
          [5]
          <string-name>
            <given-names>S. O.</given-names>
            <surname>Semerikov</surname>
          </string-name>
          ,
          <string-name>
            <given-names>T. A.</given-names>
            <surname>Vakaliuk</surname>
          </string-name>
          ,
          <string-name>
            <given-names>I. S.</given-names>
            <surname>Mintii</surname>
          </string-name>
          ,
          <string-name>
            <given-names>V. A.</given-names>
            <surname>Hamaniuk</surname>
          </string-name>
          ,
          <string-name>
            <given-names>O. V.</given-names>
            <surname>Bondarenko</surname>
          </string-name>
          ,
          <string-name>
            <given-names>P. P.</given-names>
            <surname>Nechypurenko</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S. V.</given-names>
            <surname>Shokaliuk</surname>
          </string-name>
          ,
          <string-name>
            <given-names>N. V.</given-names>
            <surname>Moiseienko</surname>
          </string-name>
          ,
          <article-title>Immersive cloud-based educational environment of the university: Design principles</article-title>
          ,
          <source>CEUR Workshop Proceedings</source>
          <volume>3771</volume>
          (
          <year>2024</year>
          )
          <fpage>126</fpage>
          -
          <lpage>135</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>
          [6]
          <string-name>
            <given-names>S. O.</given-names>
            <surname>Semerikov</surname>
          </string-name>
          ,
          <string-name>
            <given-names>T. A.</given-names>
            <surname>Vakaliuk</surname>
          </string-name>
          ,
          <string-name>
            <given-names>I. S.</given-names>
            <surname>Mintii</surname>
          </string-name>
          ,
          <string-name>
            <given-names>V. A.</given-names>
            <surname>Hamaniuk</surname>
          </string-name>
          ,
          <string-name>
            <given-names>O. V.</given-names>
            <surname>Bondarenko</surname>
          </string-name>
          ,
          <string-name>
            <given-names>P. P.</given-names>
            <surname>Nechypurenko</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S. V.</given-names>
            <surname>Shokaliuk</surname>
          </string-name>
          ,
          <string-name>
            <given-names>N. V.</given-names>
            <surname>Moiseienko</surname>
          </string-name>
          ,
          <article-title>Development of digital competencies in immersive cloud-based educational environment</article-title>
          ,
          <source>CEUR Workshop Proceedings</source>
          <volume>3781</volume>
          (
          <year>2024</year>
          )
          <fpage>203</fpage>
          -
          <lpage>208</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>
          [7]
          <string-name>
            <given-names>S. O.</given-names>
            <surname>Semerikov</surname>
          </string-name>
          ,
          <string-name>
            <given-names>T. A.</given-names>
            <surname>Vakaliuk</surname>
          </string-name>
          ,
          <string-name>
            <given-names>I. S.</given-names>
            <surname>Mintii</surname>
          </string-name>
          ,
          <string-name>
            <given-names>V. A.</given-names>
            <surname>Hamaniuk</surname>
          </string-name>
          ,
          <string-name>
            <given-names>O. V.</given-names>
            <surname>Bondarenko</surname>
          </string-name>
          ,
          <string-name>
            <given-names>P. P.</given-names>
            <surname>Nechypurenko</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S. V.</given-names>
            <surname>Shokaliuk</surname>
          </string-name>
          ,
          <string-name>
            <given-names>N. V.</given-names>
            <surname>Moiseienko</surname>
          </string-name>
          ,
          <article-title>Designing an immersive cloud-based educational environment for universities: a comprehensive approach</article-title>
          ,
          <source>CEUR Workshop Proceedings</source>
          <volume>3844</volume>
          (
          <year>2024</year>
          )
          <fpage>107</fpage>
          -
          <lpage>116</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>
          [8]
          <string-name>
            <given-names>G.</given-names>
            <surname>Makransky</surname>
          </string-name>
          ,
          <string-name>
            <given-names>G. B.</given-names>
            <surname>Petersen</surname>
          </string-name>
          ,
          <article-title>The Cognitive Affective Model of Immersive Learning (CAMIL): a Theoretical Research-Based Model of Learning in Immersive Virtual Reality</article-title>
          ,
          <source>Educational Psychology Review</source>
          <volume>33</volume>
          (
          <year>2021</year>
          )
          <fpage>937</fpage>
          -
          <lpage>958</lpage>
          . doi:10.1007/s10648-020-09586-2.
        </mixed-citation>
      </ref>
      <ref id="ref9">
        <mixed-citation>
          [9]
          <string-name>
            <given-names>F.</given-names>
            <surname>Buttussi</surname>
          </string-name>
          ,
          <string-name>
            <given-names>L.</given-names>
            <surname>Chittaro</surname>
          </string-name>
          ,
          <article-title>Effects of Different Types of Virtual Reality Display on Presence and Learning in a Safety Training Scenario</article-title>
          ,
          <source>IEEE Transactions on Visualization and Computer Graphics</source>
          <volume>24</volume>
          (
          <year>2018</year>
          )
          <fpage>1063</fpage>
          -
          <lpage>1076</lpage>
          . doi:10.1109/TVCG.2017.2653117.
        </mixed-citation>
      </ref>
      <ref id="ref10">
        <mixed-citation>
          [10]
          <string-name>
            <given-names>S. O.</given-names>
            <surname>Semerikov</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M. M.</given-names>
            <surname>Mintii</surname>
          </string-name>
          ,
          <string-name>
            <given-names>I. S.</given-names>
            <surname>Mintii</surname>
          </string-name>
          ,
          <article-title>Review of the course “Development of Virtual and Augmented Reality Software” for STEM teachers: Implementation results and improvement potentials</article-title>
          ,
          <source>CEUR Workshop Proceedings</source>
          <volume>2898</volume>
          (
          <year>2021</year>
          )
          <fpage>159</fpage>
          -
          <lpage>177</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref11">
        <mixed-citation>
          [11]
          <string-name>
            <given-names>S.</given-names>
            <surname>Papadakis</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A. E.</given-names>
            <surname>Kiv</surname>
          </string-name>
          ,
          <string-name>
            <given-names>H. M.</given-names>
            <surname>Kravtsov</surname>
          </string-name>
          ,
          <string-name>
            <given-names>V. V.</given-names>
            <surname>Osadchyi</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M. V.</given-names>
            <surname>Marienko</surname>
          </string-name>
          ,
          <string-name>
            <given-names>O. P.</given-names>
            <surname>Pinchuk</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M. P.</given-names>
            <surname>Shyshkina</surname>
          </string-name>
          ,
          <string-name>
            <given-names>O. M.</given-names>
            <surname>Sokolyuk</surname>
          </string-name>
          ,
          <string-name>
            <given-names>I. S.</given-names>
            <surname>Mintii</surname>
          </string-name>
          ,
          <string-name>
            <given-names>T. A.</given-names>
            <surname>Vakaliuk</surname>
          </string-name>
          ,
          <string-name>
            <given-names>L. E.</given-names>
            <surname>Azarova</surname>
          </string-name>
          ,
          <string-name>
            <given-names>L. S.</given-names>
            <surname>Kolgatina</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S. M.</given-names>
            <surname>Amelina</surname>
          </string-name>
          ,
          <string-name>
            <given-names>N. P.</given-names>
            <surname>Volkova</surname>
          </string-name>
          ,
          <string-name>
            <given-names>V. Y.</given-names>
            <surname>Velychko</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A. M.</given-names>
            <surname>Striuk</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S. O.</given-names>
            <surname>Semerikov</surname>
          </string-name>
          ,
          <article-title>Unlocking the power of synergy: the joint force of cloud technologies and augmented reality in education</article-title>
          ,
          <source>CEUR Workshop Proceedings</source>
          <volume>3364</volume>
          (
          <year>2023</year>
          )
          <fpage>1</fpage>
          -
          <lpage>23</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref12">
        <mixed-citation>
          [12]
          <string-name>
            <given-names>M. M.</given-names>
            <surname>Mintii</surname>
          </string-name>
          ,
          <string-name>
            <given-names>N. M.</given-names>
            <surname>Sharmanova</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A. O.</given-names>
            <surname>Mankuta</surname>
          </string-name>
          ,
          <string-name>
            <given-names>O. S.</given-names>
            <surname>Palchevska</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S. O.</given-names>
            <surname>Semerikov</surname>
          </string-name>
          ,
          <article-title>Selection of pedagogical conditions for training STEM teachers to use augmented reality technologies in their work</article-title>
          ,
          <source>Journal of Physics: Conference Series</source>
          <volume>2611</volume>
          (
          <year>2023</year>
          )
          <elocation-id>012022</elocation-id>
          . doi:10.1088/1742-6596/2611/1/012022.
        </mixed-citation>
      </ref>
      <ref id="ref13">
        <mixed-citation>
          [13]
          <string-name>
            <given-names>S. O.</given-names>
            <surname>Semerikov</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M. V.</given-names>
            <surname>Foki</surname>
          </string-name>
          ,
          <string-name>
            <given-names>D. S.</given-names>
            <surname>Shepiliev</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M. M.</given-names>
            <surname>Mintii</surname>
          </string-name>
          ,
          <string-name>
            <given-names>I. S.</given-names>
            <surname>Mintii</surname>
          </string-name>
          ,
          <string-name>
            <given-names>O. H.</given-names>
            <surname>Kuzminska</surname>
          </string-name>
          ,
          <article-title>Methodology for teaching development of web-based augmented reality with integrated machine learning models</article-title>
          ,
          <source>CEUR Workshop Proceedings</source>
          <volume>3820</volume>
          (
          <year>2024</year>
          )
          <fpage>118</fpage>
          -
          <lpage>145</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref14">
        <mixed-citation>
          [14]
          <string-name>
            <given-names>S. O.</given-names>
            <surname>Semerikov</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A. M.</given-names>
            <surname>Striuk</surname>
          </string-name>
          ,
          <article-title>Augmented Reality in Education 2023: innovations, applications, and future directions</article-title>
          ,
          <source>CEUR Workshop Proceedings</source>
          <volume>3844</volume>
          (
          <year>2024</year>
          )
          <fpage>1</fpage>
          -
          <lpage>22</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref15">
        <mixed-citation>
          [15]
          <string-name>
            <given-names>S. M.</given-names>
            <surname>Amelina</surname>
          </string-name>
          ,
          <string-name>
            <given-names>R. O.</given-names>
            <surname>Tarasenko</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S. O.</given-names>
            <surname>Semerikov</surname>
          </string-name>
          ,
          <string-name>
            <given-names>L.</given-names>
            <surname>Shen</surname>
          </string-name>
          ,
          <article-title>Using mobile applications with augmented reality elements in the self-study process of prospective translators</article-title>
          ,
          <source>Educational Technology Quarterly</source>
          <volume>2022</volume>
          (
          <year>2022</year>
          )
          <fpage>263</fpage>
          -
          <lpage>275</lpage>
          . doi:10.55056/etq.51.
        </mixed-citation>
      </ref>
      <ref id="ref16">
        <mixed-citation>
          [16]
          <string-name>
            <given-names>R. O.</given-names>
            <surname>Tarasenko</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S. M.</given-names>
            <surname>Amelina</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S. O.</given-names>
            <surname>Semerikov</surname>
          </string-name>
          ,
          <string-name>
            <given-names>V. D.</given-names>
            <surname>Shynkaruk</surname>
          </string-name>
          ,
          <article-title>Using interactive semantic networks as an augmented reality element in autonomous learning</article-title>
          ,
          <source>Journal of Physics: Conference Series</source>
          <volume>1946</volume>
          (
          <year>2021</year>
          )
          <elocation-id>012023</elocation-id>
          . doi:10.1088/1742-6596/1946/1/012023.
        </mixed-citation>
      </ref>
      <ref id="ref17">
        <mixed-citation>
          [17]
          <string-name>
            <given-names>N.</given-names>
            <surname>Firth</surname>
          </string-name>
          ,
          <article-title>Interview: The father of VR Jaron Lanier</article-title>
          ,
          <source>New Scientist</source>
          <volume>218</volume>
          (
          <year>2013</year>
          )
          <fpage>21</fpage>
          . doi:10.1016/S0262-4079(13)61542-0.
        </mixed-citation>
      </ref>
      <ref id="ref18">
        <mixed-citation>
          [18]
          <string-name>
            <given-names>A. C. C.</given-names>
            <surname>de Oliveira</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J. A.</given-names>
            <surname>Nascimento</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S. R.</given-names>
            <surname>Santos</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S. M. D.</given-names>
            <surname>de Queiros</surname>
          </string-name>
          ,
          <string-name>
            <given-names>P. K. G.</given-names>
            <surname>Brito</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A. Z.</given-names>
            <surname>Clericuzi</surname>
          </string-name>
          ,
          <article-title>REANIME a neonatal resuscitation simulator for evaluating team training</article-title>
          ,
          <source>in: 2020 22nd Symposium on Virtual and Augmented Reality (SVR)</source>
          ,
          <year>2020</year>
          , pp.
          <fpage>174</fpage>
          -
          <lpage>178</lpage>
          . doi:10.1109/SVR51698.2020.00038.
        </mixed-citation>
      </ref>
      <ref id="ref19">
        <mixed-citation>
          [19]
          <string-name>
            <given-names>T.</given-names>
            <surname>Monahan</surname>
          </string-name>
          ,
          <string-name>
            <given-names>G.</given-names>
            <surname>McArdle</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Bertolotto</surname>
          </string-name>
          ,
          <article-title>Virtual reality for collaborative e-learning</article-title>
          ,
          <source>Computers &amp; Education</source>
          <volume>50</volume>
          (
          <year>2008</year>
          )
          <fpage>1339</fpage>
          -
          <lpage>1353</lpage>
          . doi:10.1016/j.compedu.2006.12.008.
        </mixed-citation>
      </ref>
      <ref id="ref20">
        <mixed-citation>
          [20]
          <string-name>
            <given-names>B.</given-names>
            <surname>Chang</surname>
          </string-name>
          ,
          <string-name>
            <given-names>L.</given-names>
            <surname>Sheldon</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Si</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Hand</surname>
          </string-name>
          ,
          <article-title>Foreign language learning in immersive virtual environments</article-title>
          , in:
          <string-name>
            <given-names>I. E.</given-names>
            <surname>McDowall</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Dolinsky</surname>
          </string-name>
          (Eds.),
          <source>The Engineering Reality of Virtual Reality 2012</source>
          , volume
          <volume>8289</volume>
          , International Society for Optics and Photonics, SPIE,
          <year>2012</year>
          , p.
          <fpage>828902</fpage>
          . doi:10.1117/12.909835.
        </mixed-citation>
      </ref>
      <ref id="ref21">
        <mixed-citation>
          [21]
          <string-name>
            <given-names>T.</given-names>
            <surname>Caudell</surname>
          </string-name>
          ,
          <string-name>
            <given-names>D.</given-names>
            <surname>Mizell</surname>
          </string-name>
          ,
          <article-title>Augmented reality: An application of heads-up display technology to manual manufacturing processes</article-title>
          ,
          <source>in: Proceedings of the Twenty-Fifth Hawaii International Conference on System Sciences</source>
          , volume
          <volume>2</volume>
          ,
          <year>1992</year>
          , pp.
          <fpage>659</fpage>
          -
          <lpage>669</lpage>
          . doi:10.1109/HICSS.1992.183317.
        </mixed-citation>
      </ref>
      <ref id="ref22">
        <mixed-citation>
          [22]
          <string-name>
            <given-names>C.</given-names>
            <surname>Arth</surname>
          </string-name>
          ,
          <string-name>
            <given-names>R.</given-names>
            <surname>Grasset</surname>
          </string-name>
          ,
          <string-name>
            <given-names>L.</given-names>
            <surname>Gruber</surname>
          </string-name>
          ,
          <string-name>
            <given-names>T.</given-names>
            <surname>Langlotz</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Mulloni</surname>
          </string-name>
          ,
          <string-name>
            <given-names>D.</given-names>
            <surname>Wagner</surname>
          </string-name>
          ,
          <source>The History of Mobile Augmented Reality</source>
          ,
          <year>2015</year>
          . URL: https://arxiv.org/abs/1505.01319.
        </mixed-citation>
      </ref>
      <ref id="ref23">
        <mixed-citation>
          [23]
          <string-name>
            <given-names>D.</given-names>
            <surname>Nelson</surname>
          </string-name>
          ,
          <article-title>BYOD: An Opportunity Schools Cannot Afford to Miss</article-title>
          ,
          <source>Internet@Schools</source>
          <volume>19</volume>
          (
          <year>2012</year>
          )
          <fpage>12</fpage>
          -
          <lpage>15</lpage>
          . URL: https://tinyurl.com/bde3u7b7.
        </mixed-citation>
      </ref>
      <ref id="ref24">
        <mixed-citation>
          [24]
          <string-name>
            <given-names>R.</given-names>
            <surname>Calo</surname>
          </string-name>
          ,
          <string-name>
            <given-names>T.</given-names>
            <surname>Denning</surname>
          </string-name>
          ,
          <string-name>
            <given-names>B.</given-names>
            <surname>Friedman</surname>
          </string-name>
          ,
          <string-name>
            <given-names>T.</given-names>
            <surname>Kohno</surname>
          </string-name>
          ,
          <string-name>
            <given-names>L.</given-names>
            <surname>Magassa</surname>
          </string-name>
          ,
          <string-name>
            <given-names>E.</given-names>
            <surname>McReynolds</surname>
          </string-name>
          ,
          <string-name>
            <given-names>B.</given-names>
            <surname>Newell</surname>
          </string-name>
          ,
          <string-name>
            <given-names>F.</given-names>
            <surname>Roesner</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.</given-names>
            <surname>Woo</surname>
          </string-name>
          ,
          <article-title>Augmented reality: A technology and policy primer</article-title>
          ,
          <source>Tech Policy Lab</source>
          (University of Washington),
          <year>2015</year>
          . URL: http://techpolicylab.uw.edu/wp-content/uploads/2017/08/Augmented_Reality_Primer-TechPolicyLab.pdf.
        </mixed-citation>
      </ref>
      <ref id="ref25">
        <mixed-citation>
          [25]
          <string-name>
            <given-names>A. E.</given-names>
            <surname>Kiv</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M. P.</given-names>
            <surname>Shyshkina</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S. O.</given-names>
            <surname>Semerikov</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A. M.</given-names>
            <surname>Striuk</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Y. V.</given-names>
            <surname>Yechkalo</surname>
          </string-name>
          ,
          <article-title>AREdu 2019 - How augmented reality transforms to augmented learning</article-title>
          , in:
          <string-name>
            <given-names>A. E.</given-names>
            <surname>Kiv</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M. P.</given-names>
            <surname>Shyshkina</surname>
          </string-name>
          (Eds.),
          <source>Proceedings of the 2nd International Workshop on Augmented Reality in Education, Kryvyi Rih, Ukraine, March 22, 2019</source>
          , volume
          <volume>2547</volume>
          of CEUR Workshop Proceedings, CEUR-WS.org,
          <year>2019</year>
          , pp.
          <fpage>1</fpage>
          -
          <lpage>12</lpage>
          . URL: http://ceur-ws.org/Vol-2547/paper00.pdf.
        </mixed-citation>
      </ref>
      <ref id="ref26">
        <mixed-citation>
          [26]
          <string-name>
            <given-names>N.</given-names>
            <surname>Soroko</surname>
          </string-name>
          ,
          <article-title>The augmented reality functions to support the STEAM education at general education institutions</article-title>
          ,
          <source>Physical and Mathematical Education</source>
          <volume>29</volume>
          (
          <year>2021</year>
          )
          <fpage>24</fpage>
          -
          <lpage>30</lpage>
          . doi:10.31110/2413-1571-2021-029-3-004.
        </mixed-citation>
      </ref>
      <ref id="ref27">
        <mixed-citation>
          [27]
          <string-name>
            <given-names>H.-J.</given-names>
            <surname>Cheng</surname>
          </string-name>
          ,
          <string-name>
            <given-names>H.</given-names>
            <surname>Zhan</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Tsai</surname>
          </string-name>
          ,
          <article-title>Integrating Second Life Into a Chinese Language Teacher Training Program: A Pilot Study</article-title>
          ,
          <source>Journal of Technology and Chinese Language Teaching</source>
          <volume>1</volume>
          (
          <year>2010</year>
          ). URL: https://commons.erau.edu/publication/1099.
        </mixed-citation>
      </ref>
      <ref id="ref28">
        <mixed-citation>
          [28]
          <string-name>
            <given-names>A.</given-names>
            <surname>Chik</surname>
          </string-name>
          ,
          <article-title>Digital Gaming and Language Learning: Autonomy and Community</article-title>
          ,
          <source>Language Learning &amp; Technology</source>
          <volume>18</volume>
          (
          <year>2014</year>
          )
          <fpage>85</fpage>
          -
          <lpage>100</lpage>
          . doi:10125/44371.
        </mixed-citation>
      </ref>
      <ref id="ref29">
        <mixed-citation>
          [29]
          <string-name>
            <given-names>E.</given-names>
            <surname>Frazier</surname>
          </string-name>
          , E. Bonner,
          <string-name>
            <given-names>R.</given-names>
            <surname>Lege</surname>
          </string-name>
          ,
          <string-name>
            <surname>A Brief</surname>
          </string-name>
          <article-title>Investigation into the Potential for Virtual Reality: A Tool for 2nd Language Learning Distance Education in Japan</article-title>
          ,
          <source>in: 2018年度 言語メディア教育研 究センター年報</source>
          ,
          <year>2018</year>
          , pp.
          <fpage>189</fpage>
          -
          <lpage>194</lpage>
          . URL: https://www.kandagaigo.ac.jp/kuis/cms/wp-content/uploads/2018/04/15.pdf.
        </mixed-citation>
      </ref>
      <ref id="ref30">
        <mixed-citation>
          [30]
          <string-name>
            <given-names>X.</given-names>
            <surname>Geng</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Yamada</surname>
          </string-name>
          ,
          <article-title>An augmented reality learning system for Japanese compound verbs: study of learning performance and cognitive load</article-title>
          ,
          <source>Smart Learning Environments</source>
          <volume>7</volume>
          (
          <year>2020</year>
          )
          <fpage>27</fpage>
          . doi:10.1186/s40561-020-00137-4.
        </mixed-citation>
      </ref>
      <ref id="ref31">
        <mixed-citation>
          [31]
          <string-name>
            <given-names>X.</given-names>
            <surname>Geng</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Yamada</surname>
          </string-name>
          ,
          <article-title>The development and evaluation of an augmented reality learning system for Japanese compound verbs using learning analytics</article-title>
          ,
          <source>in: 2020 IEEE International Conference on Teaching, Assessment, and Learning for Engineering (TALE)</source>
          ,
          <year>2020</year>
          , pp.
          <fpage>71</fpage>
          -
          <lpage>76</lpage>
          . doi:10.1109/TALE48869.2020.9368345.
        </mixed-citation>
      </ref>
      <ref id="ref32">
        <mixed-citation>
          [32]
          <string-name>
            <given-names>O. V.</given-names>
            <surname>Popova</surname>
          </string-name>
          ,
          <article-title>Theoretic-and-methodic grounds of the professional speech training targeted to the future translators of Chinese under conditions of university education</article-title>
          ,
          <source>The dissertation for a doctoral degree of Pedagogical Sciences in specialties 13.00.04 - Theory and Methods of Professional Training and 13.00.02 - Theory and Methods of Teaching (Oriental Languages)</source>
          , State institution “South Ukrainian National Pedagogical University named after K. D. Ushynsky”, Odesa,
          <year>2017</year>
          . URL: https://nrat.ukrintei.ua/searchdoc/0517U000325.
        </mixed-citation>
      </ref>
      <ref id="ref33">
        <mixed-citation>
          [33]
          <string-name>
            <given-names>O.</given-names>
            <surname>Viberg</surname>
          </string-name>
          , Å. Grönlund,
          <article-title>Systematising the Field of Mobile Assisted Language Learning</article-title>
          ,
          <source>International Journal of Mobile and Blended Learning (IJMBL)</source>
          <volume>5</volume>
          (
          <year>2013</year>
          )
          <fpage>72</fpage>
          -
          <lpage>90</lpage>
          . doi:10.4018/ijmbl.2013100105.
        </mixed-citation>
      </ref>
      <ref id="ref34">
        <mixed-citation>
          [34]
          <string-name>
            <given-names>R. M.</given-names>
            <surname>Hein</surname>
          </string-name>
          ,
          <string-name>
            <given-names>C.</given-names>
            <surname>Wienrich</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M. E.</given-names>
            <surname>Latoschik</surname>
          </string-name>
          ,
          <article-title>A systematic review of foreign language learning with immersive technologies (2001-2020)</article-title>
          ,
          <source>AIMS Electronics and Electrical Engineering</source>
          <volume>5</volume>
          (
          <year>2021</year>
          )
          <fpage>117</fpage>
          -
          <lpage>145</lpage>
          . doi:10.3934/electreng.2021007.
        </mixed-citation>
      </ref>
      <ref id="ref35">
        <mixed-citation>
          [35]
          <string-name>
            <given-names>N. S.</given-names>
            <surname>Lukychova</surname>
          </string-name>
          ,
          <string-name>
            <given-names>N. V.</given-names>
            <surname>Osypova</surname>
          </string-name>
          ,
          <string-name>
            <given-names>G. S.</given-names>
            <surname>Yuzbasheva</surname>
          </string-name>
          ,
          <article-title>ICT and current trends as a path to STEM education: implementation and prospects</article-title>
          ,
          <source>CTE Workshop Proceedings</source>
          <volume>9</volume>
          (
          <year>2022</year>
          )
          <fpage>39</fpage>
          -
          <lpage>55</lpage>
          . doi:10.55056/cte.100.
        </mixed-citation>
      </ref>
      <ref id="ref36">
        <mixed-citation>
          [36]
          <string-name>
            <given-names>M. M.</given-names>
            <surname>Mintii</surname>
          </string-name>
          ,
          <article-title>Selection of pedagogical conditions for training STEM teachers to use augmented reality technologies in their work</article-title>
          ,
          <source>Educational Dimension</source>
          (
          <year>2023</year>
          ). doi:10.31812/educdim.4951.
        </mixed-citation>
      </ref>
      <ref id="ref37">
        <mixed-citation>
          [37]
          <string-name>
            <given-names>B.</given-names>
            <surname>Platte</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Platte</surname>
          </string-name>
          ,
          <string-name>
            <given-names>R.</given-names>
            <surname>Thomanek</surname>
          </string-name>
          ,
          <string-name>
            <given-names>C.</given-names>
            <surname>Roschke</surname>
          </string-name>
          ,
          <string-name>
            <given-names>T.</given-names>
            <surname>Rolletschke</surname>
          </string-name>
          ,
          <string-name>
            <given-names>F.</given-names>
            <surname>Zimmer</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Ritter</surname>
          </string-name>
          ,
          <article-title>ARTranslate - Immersive Language Exploration with Object Recognition and Augmented Reality</article-title>
          ,
          <source>in: Proceedings of The 12th Language Resources and Evaluation Conference (LREC 2020)</source>
          ,
          <year>2020</year>
          , pp.
          , pp.
          <fpage>356</fpage>
          -
          <lpage>362</lpage>
          . URL: https://www.researchgate.net/publication/354571674.
        </mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>