<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta />
    <article-meta>
      <title-group>
        <article-title>Design and validation of a virtual reality-based training program for student learning on tractor-plough setup</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Oleksandr V. Kanivets</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Irina M. Kanivets</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Sofiia V. Pidhorna</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Oleksandra I. Bilovod</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Oleksii A. Burlaka</string-name>
          <email>oleksii.burlaka@pdau.edu.ua</email>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <aff id="aff0">
          <label>0</label>
          <institution>Poltava State Agrarian University</institution>
          ,
          <addr-line>1/3 Skovorody Str., Poltava, 36003</addr-line>
          ,
          <country country="UA">Ukraine</country>
        </aff>
      </contrib-group>
      <pub-date>
        <year>2025</year>
      </pub-date>
      <fpage>108</fpage>
      <lpage>120</lpage>
      <abstract>
<p>This study focuses on developing and testing a virtual reality (VR) software system created in the Unity 3D environment, specifically designed to train students of agricultural specialities in the complex process of setting up a plough with a tractor. The project aimed to create an immersive and interactive simulator that enables students to practice essential skills related to the adjustment of agricultural machinery, simulating real field conditions. As part of the research, an in-depth analysis of the structural features of the tractor and plough was carried out, which made it possible to develop highly detailed 3D models and realistically simulate their interaction. An interactive control system was implemented, allowing the user to perform all necessary plough adjustment operations, such as changing the tillage depth and configuring the tractor's hitch system in longitudinal and transverse directions, using standard VR controllers. A key element of the program is the inclusion of various training scenarios. A comprehensive feedback and assessment system was developed to track the accuracy and sequence of student actions, task completion time, number of mistakes made, and the quality of the final result. This enables students to objectively assess their progress and identify areas that require improvement. The developed VR program was piloted at Poltava State Agrarian University with the participation of students majoring in H7 “Agroengineering”. A comparative analysis between the experimental group (trained using the VR simulator) and the control group (trained using traditional methods) revealed a significant increase in learning efficiency. Students who used the VR simulator demonstrated higher levels of theoretical knowledge, reduced the time required to acquire practical skills, and significantly decreased the number of errors during actual adjustment procedures.
In addition to improving training quality, using VR technologies contributes to substantial savings in material resources and enhances the safety of the educational process. The study results confirm the high potential of virtual reality as an innovative tool for developing practical skills in agricultural production and support its integration into the curricula of higher agricultural education institutions.</p>
      </abstract>
      <kwd-group>
        <kwd>virtual reality</kwd>
        <kwd>Unity</kwd>
        <kwd>training</kwd>
        <kwd>agricultural education</kwd>
        <kwd>simulator</kwd>
      </kwd-group>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>1. Introduction</title>
      <p>
        The modern agro-industrial sector requires specialists to possess theoretical knowledge and
well-developed practical skills, particularly in the operation and adjustment of agricultural machinery.
Setting up a plough with a tractor for operation is one of the key stages of spring fieldwork, directly
affecting the quality of soil cultivation, fuel and resource use efficiency, and overall labour productivity
[
        <xref ref-type="bibr" rid="ref1">1</xref>
        ]. Improper adjustment can lead to poor ploughing quality, increased wear of components, excessive
fuel consumption, and, consequently, significant economic losses [
        <xref ref-type="bibr" rid="ref2">2</xref>
        ].
      </p>
      <p>
        Traditional training methods, which include lectures and hands-on sessions with real machinery,
have certain limitations [
        <xref ref-type="bibr" rid="ref3">3</xref>
        ]. Firstly, access to agricultural equipment is often restricted due to its high
cost, large dimensions, and the need for specific conditions to conduct practical training. Secondly,
training on real machinery involves safety risks for students (e.g., injuries, equipment damage) and
significant time and resource expenditures for preparing the equipment and ensuring safety compliance.
Moreover, real-life training conditions do not always allow for the simulation of a wide range of possible
scenarios and errors that may occur during actual fieldwork [
        <xref ref-type="bibr" rid="ref4">4</xref>
        ].
      </p>
      <p>
        In light of these challenges, there is a pressing need to develop innovative training approaches that
enable students to effectively acquire and refine practical skills in a safe, controlled, and accessible
environment. One of the most promising directions in this context is the use of virtual reality (VR) technologies.
VR simulators can create realistic, immersive environments in which students can repeatedly practice
the procedures for plough adjustment, experiment with different parameters, and correct mistakes
without any negative consequences [
        <xref ref-type="bibr" rid="ref5">5</xref>
        ]. This fosters a deeper understanding of mechanical processes,
enhances spatial awareness, and contributes to the development of durable practical skills.
      </p>
    </sec>
    <sec id="sec-2">
      <title>2. Analysis of recent research and publications</title>
      <p>
        In recent years, there has been a rapid increase in interest in the use of virtual and augmented reality
technologies in the educational process [
        <xref ref-type="bibr" rid="ref6 ref7 ref8">6, 7, 8</xref>
        ]. This trend is driven by the recognition of VR’s
potential to enhance student engagement, visualise complex processes, and create conditions for
effective practical skill development. A thorough analysis of the existing scientific literature makes it
possible to comprehensively assess the current state of research in this field, identify key trends, and
reveal gaps that require further investigation.
      </p>
      <p>
        Several studies have been devoted to the application of VR in technical education. For instance, in [
        <xref ref-type="bibr" rid="ref9">9</xref>
        ],
the use of VR is explored for training engineers in industrial equipment maintenance, demonstrating a
significant reduction in training time and an improvement in learning outcomes through immersive fault
simulation and the practice of troubleshooting procedures. Study [
        <xref ref-type="bibr" rid="ref10">10</xref>
        ] focuses on developing virtual
laboratories for mechanical engineering students, enabling them to safely and efficiently assemble
and disassemble complex mechanisms, thereby minimising the risk of damaging expensive equipment
or potential injuries. The authors of [
        <xref ref-type="bibr" rid="ref11">11</xref>
        ] highlight the advantages of using VR simulators for training
agricultural machinery operators, emphasising the reduction of risks associated with operating large
machines and the conservation of resources (e.g., fuel, machinery wear), representing a significant
economic benefit.
      </p>
      <p>
        In the context of agricultural education, VR technologies have also been actively adopted. Article [
        <xref ref-type="bibr" rid="ref12">12</xref>
        ]
reviews existing VR solutions for training in agricultural specialities, including simulators for tractors
and harvesters. However, most available solutions primarily focus on machinery operation and basic
fieldwork, while the detailed adjustment of mounted implements, such as ploughs, receives
insufficient attention or is presented in a simplified manner. Publication [
        <xref ref-type="bibr" rid="ref13">13</xref>
        ] describes a pilot project
for the development of a VR module demonstrating the operation of irrigation systems, highlighting
the potential of VR to visualise complex agrotechnological processes and helping students gain a better
understanding of their underlying principles.
      </p>
      <p>
        Special attention should be given to studies on developing VR applications using the Unity platform.
Unity is one of the leading platforms for creating interactive 3D applications due to its flexibility and
powerful graphics and physics simulation tools. Study [
        <xref ref-type="bibr" rid="ref14">14</xref>
        ] demonstrates the capabilities of Unity in
developing realistic simulators for training in construction-related fields, where precise modelling of
physical interactions and spatial perception plays a crucial role. In [
        <xref ref-type="bibr" rid="ref15">15</xref>
        ], the use of Unity for simulating
complex engineering systems is explored, confirming its effectiveness in creating detailed and functional
VR simulators.
      </p>
      <p>
        The psychological aspects of learning through VR are equally important. Study [
        <xref ref-type="bibr" rid="ref16">16</xref>
        ] shows that
the immersiveness of virtual environments significantly increases student engagement, improves
information retention, and fosters the development of spatial thinking by allowing users to interactively
explore objects from multiple perspectives. Virtual reality also provides a safe environment for
experimentation, where mistakes have no real-world consequences, thereby reducing students’ fear of
failure and encouraging active learning [
        <xref ref-type="bibr" rid="ref17 ref18">17, 18</xref>
        ]. This is especially important when mastering complex
practical skills, where the “cost of error” in real-life settings can be very high. Moreover, VR simulations
can promote the development of critical thinking and problem-solving skills, as students are exposed
to realistic scenarios and must independently find optimal solutions [
        <xref ref-type="bibr" rid="ref19">19</xref>
        ]. However, it is important to
consider potential psychological factors, such as motion sickness, which may affect user comfort, or the
risk of excessive immersion, which may require time-use monitoring [
        <xref ref-type="bibr" rid="ref20">20</xref>
        ].
      </p>
      <p>
        Regarding pedagogical methods for using simulators, researchers identify several key approaches.
Discovery learning, where students independently explore the virtual environment and uncover
cause-and-effect relationships, has proven particularly effective in VR settings [
        <xref ref-type="bibr" rid="ref21">21</xref>
        ]. Scaffolding, which involves
the gradual increase in task complexity and the provision of guidance during initial stages, is also
well-implemented in virtual environments [
        <xref ref-type="bibr" rid="ref22">22, 23</xref>
        ]. Scenario-based learning allows for the simulation
of real-world industrial situations, preparing students to handle various challenges [24]. Another
critical aspect is feedback. An effective VR-based training program should provide immediate, clear,
and constructive feedback on the learner’s actions, helping them correct mistakes and improve their
skills [25].
      </p>
      <p>Thus, despite significant progress in the application of VR in education and the availability of
simulators for operating agricultural machinery, insufficient attention has been given to specific,
detailed tasks such as the precise adjustment of complex agricultural implements. Existing tractor
simulators often oversimplify the configuration of mounted equipment or exclude it entirely.
This creates a gap in specialist training, as correct adjustment is critical for the efficient execution of
fieldwork.</p>
      <p>Therefore, there is an urgent need to develop a specialised VR solution focused specifically on
setting up the plough with the tractor. This solution would enable students to master all necessary stages
and nuances of this process while considering the psychological and pedagogical advantages of the
virtual environment and minimising the aforementioned limitations.</p>
    </sec>
    <sec id="sec-3">
      <title>3. Formulation of the research objective</title>
      <p>The primary aim of this study is to develop and test a virtual reality software package on the Unity
platform for the effective training of students specialising in H7 “Agroengineering” in the adjustment of
a plough with a tractor for operation, as well as to evaluate its effectiveness compared to traditional
teaching methods.</p>
      <p>To achieve this aim, the following objectives were formulated:
• to analyse the features and key stages of the process of adjusting the plough with the tractor, and to
identify the main parameters requiring virtual simulation;
• to develop highly detailed 3D models of the tractor and plough, enabling accurate simulation of
their interaction and adjustment mechanisms;
• to implement interactive mechanisms for user manipulation of virtual objects during the
adjustment operations;
• to design instructional scenarios and algorithms that sequentially guide the student through the
adjustment process, providing feedback and assessing the correctness of actions;
• to develop an assessment system for knowledge and skills acquired in the VR environment;
• to conduct testing of the developed software among the target group of students specialising in
H7 “Agroengineering” and to compare the effectiveness of VR-based training with traditional
methods;
• to analyse the test results and formulate recommendations for further improvement of the VR
program and its integration into the educational process.</p>
    </sec>
    <sec id="sec-4">
      <title>4. Results</title>
      <sec id="sec-4-1">
        <title>4.1. Architecture of the software package</title>
        <p>The developed virtual reality software package features a modular architecture, ensuring flexibility and
potential for further expansion. The main components of the system include the following modules:
• visualisation module;
• interaction module;
• training scenarios module;
• physics simulation module;
• assessment module.</p>
        <p>The Visualisation Module is responsible for rendering the 3D scene, including the models of the
tractor, plough, terrain, and surrounding environment. It uses animation and physical models to depict
the interactions between objects realistically.</p>
        <p>The Interaction Module provides the interface for user interaction with the virtual environment
via virtual reality controllers. It includes mechanisms for grabbing objects, manipulating them, and
activating interactive elements.</p>
        <p>The Training Scenarios Module contains sequences of steps required for configuring the plough.
Each scenario involves a set of tasks that the student must complete. The module offers hints and
feedback throughout the process.</p>
        <p>The Physics Simulation Module simulates the physical properties of objects (mass, friction, inertia) and
allows visualisation of changes in the plough’s position during adjustments.</p>
        <p>The Assessment Module tracks the student’s progress by recording completed actions, errors, and task
completion times. Based on this data, it generates a report on the learning outcomes.</p>
        <p>The stereoscopic image creation is handled by the open-source package Google VR SDK for Unity
[26]. This package is designed for developing virtual reality applications for hardware devices such as
Google Cardboard (figure 1). In VR applications, the screen is split into two parts and exhibits a fisheye
effect. Combined with distortions from Google Cardboard’s plastic lenses, this creates the illusion of
image depth and maximises immersion in the virtual environment.
        <p>This package contains prefabs that control VR mode settings, one of which is adapting the screen
to the Cardboard lenses. Additionally, the prefabs receive data from the smartphone’s gyroscope to
track head tilts and rotations. Thus, when the user turns their head, the camera in the video player also
rotates accordingly.</p>
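        <p>The gyroscope-driven camera rotation can be illustrated with a toy model. The sketch below integrates angular velocity into yaw and pitch angles each frame; the real Google VR SDK prefabs maintain a full quaternion orientation, so this two-angle version is an illustrative simplification, not the SDK’s API:</p>
        <preformat>
```python
class HeadTracker:
    """Toy model of gyroscope-driven camera rotation: angular velocity
    (rad/s) around the vertical and lateral axes is integrated each
    frame into camera yaw and pitch."""

    def __init__(self):
        self.yaw = 0.0    # rotation around the vertical axis, radians
        self.pitch = 0.0  # head tilt up/down, radians

    def update(self, gyro_yaw_rate, gyro_pitch_rate, dt):
        self.yaw += gyro_yaw_rate * dt
        self.pitch += gyro_pitch_rate * dt
        # clamp pitch so the camera cannot flip over backwards
        self.pitch = max(-1.5, min(1.5, self.pitch))
        return self.yaw, self.pitch

cam = HeadTracker()
# user turns the head right at 0.5 rad/s for one second (60 frames)
for _ in range(60):
    cam.update(0.5, 0.0, 1.0 / 60.0)
print(round(cam.yaw, 3))  # 0.5
```
        </preformat>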
      </sec>
      <sec id="sec-4-2">
        <title>4.2. Development of 3D models and virtual environment</title>
        <p>Detailed 3D models of the MTZ-80 traction class tractor and the two-furrow plough type PN-2-30 were
used to ensure a high degree of realism. Such models can be created using CAD software or Blender,
taking into account the precise dimensions and structural features of real agricultural machinery. Developing a
digital twin of any agricultural machine, especially a tractor with many parts, is a highly complex
technical and technological task [27]. Therefore, for our educational project, we utilised freely available
3D electronic models of the John Deere 6195M tractor [28] and a two-furrow plough [29]. Special
attention was paid to the units directly involved in the adjustment process during the development of
the VR application: the tractor’s hitch system, the depth regulation mechanism, the moldboards, and
the field board.</p>
        <p>The virtual environment simulates an agricultural hangar with a realistic floor texture. Quality
lighting conditions were considered to maximise immersion.</p>
      </sec>
      <sec id="sec-4-3">
        <title>4.3. Implementation of adjustment mechanisms</title>
        <p>The core of the adjustment functionality is based on an interactive system. Each adjustable element
on the plough and tractor is implemented as a separate interactive object with corresponding scripts.</p>
        <p>
          To interact with the interactive components of the models, the avatar must aim its white reticle at a
red marker. The red markers, indicated by red dots on the models, activate information panels at the
appropriate moments for further adjustment [
          <xref ref-type="bibr" rid="ref5">5</xref>
          ]. These red markers are positioned to implement the
following key adjustment operations of the tractor-machine aggregate:
• Ploughing depth adjustment (figure 3) is achieved by changing the height of the beam under the
tractor’s left wheels, which visually alters the penetration depth of the ploughshares.
        </p>
        <p>• Aligning the plough horizontally in the longitudinal direction (figure 4) involves adjusting the
upper link so that the plough frame is parallel to the ground along its length.</p>
        <p>• Horizontal levelling of the plough in the transverse direction (figure 5) involves adjusting the left
and right braces to ensure that the plough frame is parallel to the ground across the width of the
plough.
• The working width adjustment (figure 6) involves adjusting the distance between the plough
bodies.</p>
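        <p>The effect of the longitudinal levelling operation can be approximated with simple trigonometry. The sketch below is a toy model: the 600 mm hitch-point spacing and the 1° tolerance are assumed values chosen for illustration, not parameters taken from the simulator:</p>
        <preformat>
```python
import math

def frame_pitch_deg(top_link_change_mm, mount_height_mm=600.0):
    """Toy model: lengthening or shortening the upper (top) link tilts
    the plough frame about its lower hitch points. For the assumed
    geometry the pitch is atan(delta / h), where h is the vertical
    distance between the upper and lower hitch points."""
    return math.degrees(math.atan2(top_link_change_mm, mount_height_mm))

def is_level(pitch_deg, tolerance_deg=1.0):
    """Green-tick / red-cross decision used for visual feedback."""
    return not abs(pitch_deg) > tolerance_deg

pitch = frame_pitch_deg(40.0)           # top link lengthened by 40 mm
print(round(pitch, 2))                  # 3.81
print(is_level(pitch))                  # False: red cross, adjust further
print(is_level(frame_pitch_deg(5.0)))   # True: green tick, frame is level
```
        </preformat>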
        <p>Each user action is accompanied by visual feedback. For example, a green tick appears when the
adjustment is correct, and a red cross appears when incorrect.</p>
      </sec>
      <sec id="sec-4-4">
        <title>4.4. Development of training scenarios and evaluation system</title>
        <p>Several training scenarios of varying difficulty levels were developed for the educational process.</p>
        <p>Scenario 1: Initial Plough Setup.</p>
        <p>This scenario covers basic operations such as attaching the plough to the tractor, primary adjustment
of the ploughing depth, and checking the horizontal alignment.</p>
        <p>Scenario 2: Diagnosis and Troubleshooting of Setup Issues.</p>
        <p>This scenario simulates common setup errors (e.g., plough tilt, uneven ploughing depth), and the
student must independently identify and correct them.</p>
        <p>The evaluation system automatically records:</p>
        <p>• total time spent by the student to complete the scenario;
• number of incorrect actions performed by the student;
• correctness of completed steps and the percentage of correctly performed operations.</p>
        <p>At the end of each scenario, the students receive a detailed report on their performance, which allows
them to analyse their mistakes and improve their skills.</p>
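        <p>The step-sequencing and hint logic of a training scenario can be modelled compactly. The Python sketch below is illustrative only (the step names and the hint format are assumptions, not the program’s actual strings):</p>
        <preformat>
```python
class Scenario:
    """Toy model of a training scenario: a fixed sequence of adjustment
    steps that must be completed in order, mirroring the module's hint
    and report behaviour."""

    def __init__(self, steps):
        self.steps = steps
        self.index = 0
        self.mistakes = 0

    def attempt(self, step):
        expected = self.steps[self.index]
        if step == expected:
            self.index += 1
            return "correct"
        self.mistakes += 1
        return "hint: next step is '" + expected + "'"

    def finished(self):
        return self.index == len(self.steps)

s = Scenario(["attach plough", "set ploughing depth",
              "level longitudinally", "level transversely"])
print(s.attempt("attach plough"))       # correct
print(s.attempt("level transversely"))  # hint: next step is 'set ploughing depth'
print(s.attempt("set ploughing depth")) # correct
s.attempt("level longitudinally")
s.attempt("level transversely")
print(s.finished(), s.mistakes)         # True 1
```
        </preformat>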
      </sec>
      <sec id="sec-4-5">
        <title>4.5. Program testing</title>
        <p>The developed VR program was tested at the Poltava State Agrarian University with students of the
H7 “Agroengineering” speciality during the course “Mechanisation of Agricultural Production.” Two
groups were formed: an experimental group (120 students) using the VR simulator for training, and a
control group (129 students) that underwent traditional training methods (lectures, demonstrations
on real equipment). Both groups took an initial test to assess their baseline knowledge of plough
adjustment. Then, the experimental group completed two training sessions using the VR simulator
(figure 1), practising all developed scenarios. During the same period, the control group received
traditional practical lessons. At the end of the training cycle, both groups underwent a final assessment,
which included theoretical questions and a practical task on a real plough (where possible, or through
detailed description and evaluation of the student’s actions). The following parameters were evaluated:
• average score on the theoretical test;
• time spent on adjusting the tractor-plough unit;
• number of errors made during the adjustment;
• quality of adjustment of the tractor-plough unit (according to expert assessment by the instructor).</p>
        <p>To confirm the study’s effectiveness, it is necessary to formulate a null hypothesis and test it. The
essence of hypothesis testing lies in determining whether the results of a sample are consistent with the
hypothesis and whether the observed differences are random or statistically significant. Each hypothesis
testing problem begins with formulating the main and alternative hypotheses. The main hypothesis
is called the null hypothesis and is denoted as H0. At the same time, an alternative hypothesis H1 is
considered, which competes with the null hypothesis. The null hypothesis H0 is tested using statistical
methods, which is why it is referred to as statistical hypothesis testing. The rule by which the decision
is made to accept or reject the statistical hypothesis H0 is called a statistical criterion.</p>
        <p>Pearson’s chi-square test was chosen for this study, as it ensures the lowest probability of a Type II
error compared to other goodness-of-fit tests. The criterion is based on a comparison of theoretical
and empirical frequencies.</p>
        <p>In our study, we propose the null hypothesis H0: the developed application does not influence
students’ learning outcomes during the laboratory work in the course “Mechanisation of Agricultural
Production.” The alternative hypothesis H1 states that the developed application’s use does influence
the level of students’ learning outcomes on the above topic.</p>
        <p>A total of 249 students majoring in H7 “Agroengineering” at Poltava State Agrarian University
participated in the experiment. Homogeneous academic groups were selected for the experiment:
experimental groups (EG) and control groups (CG). The control group included 129 students, while the
experimental group had 120 students.</p>
        <p>Students in both EG and CG were categorised according to their learning achievement levels based
on a 4-point grading scale:
• “High” (A) – high level;
• “Sufficient” (B, C) – sufficient level;
• “Initial” (D, E) – average level;
• “Low” (FX, F) – low level.</p>
        <p>The experiment was conducted during the 2024–2025 academic year. During this period, many
air raid alerts occurred, during which students had to leave laboratories and move to shelter areas.
While in shelters, the learning process continued through distance learning technologies — students
watched educational videos and took notes on theoretical material. At the same time, students from the
experimental group additionally used the developed application to gain practical skills in adjusting a
machine-tractor unit to the specified ploughing depth. The objectivity of the experiment was maintained
by implementing a unified testing system to assess learning achievements across all students.</p>
        <p>The data characterising the students’ performance on the topic were collected after the formative
stage of the experiment and are presented in table 1. It should be noted that the results in table 1 reflect
only the outcomes of the first test and do not include any retakes of the topic.</p>
        <p>In the experimental group (EG), the number of students who achieved a high level of academic
performance increased by 9%. Additionally, the number of students with a low level of achievement
decreased threefold. The positive trend in the average performance score among students in the
experimental group indicates a higher level of learning outcomes formed during the study of the topic
“Adjustment of the Machine-Tractor Unit to a Specified Plowing Depth” through the use of the VR
application.</p>
        <p>The performance level indicators of the students at the end of the topic are shown in the diagram
(figure 7).</p>
        <p>To test the hypothesis, we conducted a pairwise comparison of the levels of academic achievement of
students in the control and experimental groups. The results were processed using Pearson’s criterion
[30].</p>
        <p>The value of the test statistic χ² is found according to the formula:
χ² = Σ (f_e − f_t)² / f_t, (1)
where f_e is the empirical frequency and f_t is the theoretical frequency:
f_t = (R_i × C_j) / N, (2)
where R_i is the total sum for row i, C_j is the total sum for column j, and N is the total sum of all
variants. We obtained χ² = 9.25.</p>
        <p>According to statistical tables, for the significance level α = 0.05 and ν = (4 − 1) × (2 − 1) = 3
degrees of freedom, the critical value of the test statistic is χ²(α = 0.05; ν = 3) = 7.81. The calculated
data are presented in table 2. Since χ² = 9.25 &gt; χ²(0.05; 3) = 7.81, the obtained result is the basis for
rejecting the null hypothesis H0 and accepting the alternative hypothesis H1 about the positive impact
of the developed VR application on students’ learning achievements during laboratory work in the
discipline “Mechanisation of agricultural production”.</p>
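        <p>Equations (1) and (2) can be checked with a short calculation. In the sketch below the per-level counts are hypothetical (table 1 is not reproduced in this excerpt); only the column totals match the reported group sizes of 129 (CG) and 120 (EG), so the resulting statistic illustrates the procedure rather than reproducing χ² = 9.25 exactly:</p>
        <preformat>
```python
def chi_square(observed):
    """Pearson chi-square for a contingency table given as a list of
    rows (achievement levels) over columns (CG, EG)."""
    row_totals = [sum(row) for row in observed]
    col_totals = [sum(col) for col in zip(*observed)]
    n = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(observed):
        for j, f_e in enumerate(row):
            f_t = row_totals[i] * col_totals[j] / n   # equation (2)
            stat += (f_e - f_t) ** 2 / f_t            # equation (1)
    df = (len(observed) - 1) * (len(observed[0]) - 1)
    return stat, df

# Hypothetical counts: rows are High, Sufficient, Initial, Low;
# columns are (CG, EG). Column sums are 129 and 120, as in the paper.
table = [[14, 24], [52, 55], [48, 35], [15, 6]]
stat, df = chi_square(table)
print(df)            # 3 degrees of freedom, matching (4 - 1) * (2 - 1)
print(stat > 7.81)   # True: exceeds the critical value chi2(0.05; 3)
```
        </preformat>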
      </sec>
    </sec>
    <sec id="sec-5">
      <title>5. Conclusions</title>
      <p>The results of the pilot implementation demonstrated the significant effectiveness of the developed
virtual reality (VR) program for teaching students how to adjust a plough mounted to a tractor for
operation. Students in the experimental group showed higher performance in theoretical testing and
significantly better practical skills compared to the control group. The average score on the theoretical
test in the experimental group was 3.6, whereas in the control group it was 3.36.</p>
      <p>The number of mistakes made during plough adjustment in the virtual environment was minimal,
allowing students to experiment and learn from their errors safely without real-world consequences.
The experimental group made fewer critical mistakes during the practical task using a real plough.</p>
      <p>Students in the experimental group demonstrated a high level of interest and motivation in learning,
highlighting the interactivity and realism of the VR application. The use of the VR simulator also
contributes to resource savings (fuel, machinery wear, and instructor time) compared to traditional
training on real agricultural equipment.</p>
      <p>The developed software package has proven to be an effective tool for training future specialists
in the agricultural sector, enabling them to acquire practical skills in a safe, controlled, and realistic
environment. Future research may focus on expanding the simulator’s functionality (e.g., adding other
types of agricultural machinery, simulating various weather conditions), integrating it with distance
learning systems, and developing adaptive learning scenarios tailored to students’ individual needs.</p>
    </sec>
    <sec id="sec-6">
      <title>Author contributions</title>
      <p>Conceptualization, Oleksandr Kanivets and Irina Kanivets; methodology, Oleksandr Kanivets and
Oleksii Burlaka; software, Oleksandr Kanivets and Irina Kanivets; validation, Oleksandr Kanivets, Irina
Kanivets, Oleksii Burlaka, and Oleksandra Bilovod; formal analysis, Irina Kanivets; investigation,
Oleksandr Kanivets, Irina Kanivets, Oleksii Burlaka, and Oleksandra Bilovod; data curation, Sofiia
Pidhorna; writing—original draft preparation, Oleksandr Kanivets and Irina Kanivets; writing—review
and editing, Sofiia Pidhorna, Oleksii Burlaka, and Oleksandra Bilovod; visualization, Oleksandr Kanivets
and Irina Kanivets; project administration, Oleksandr Kanivets. All authors have read and agreed to the
published version of the manuscript.</p>
    </sec>
    <sec id="sec-7">
      <title>Declaration on Generative AI</title>
      <p>During the preparation of this work, the authors used GPT-4 for grammar and spelling checking. After
using this service, the authors reviewed and edited the content as needed and take full responsibility for
the publication’s content.</p>
      <p>CEUR Workshop Proceedings, CEUR-WS.org, 2020, pp. 217–238. URL: https://ceur-ws.org/Vol-2731/paper12.pdf.
[23] O. V. Kanivets, I. Kanivets, N. V. Kononets, T. Gorda, E. O. Shmeltser, Development of mobile applications of augmented reality for projects with projection drawings, in: A. E. Kiv, M. P. Shyshkina (Eds.), Proceedings of the 2nd International Workshop on Augmented Reality in Education, Kryvyi Rih, Ukraine, March 22, 2019, volume 2547 of CEUR Workshop Proceedings, CEUR-WS.org, 2019, pp. 262–273. URL: https://ceur-ws.org/Vol-2547/paper19.pdf.
[24] K. Altmeyer, S. Kapp, M. Thees, S. Malone, J. Kuhn, R. Brünken, The use of augmented reality to foster conceptual knowledge acquisition in STEM laboratory courses: theoretical background and empirical results, British Journal of Educational Technology 51 (2020) 611–628. doi:10.1111/bjet.12900.
[25] J. Liu, M. C. Ang, J. K. Chaw, A.-L. Kor, K. W. Ng, M. C. Lam, Assessing the impact and development of immersive VR technology in education: insights from telepresence, emotion, and cognition, Technological Forecasting and Social Change 213 (2025) 124024. doi:10.1016/j.techfore.2025.124024.
[26] GVR SDK for Unity v1.200.1, 2023. URL: https://github.com/googlevr/gvr-unity-sdk/releases.
[27] C. Verdouw, B. Tekinerdogan, A. Beulens, S. Wolfert, Digital twins in smart farming, Agricultural Systems 189 (2021) 103046. doi:10.1016/j.agsy.2020.103046.
[28] John Deere 6195 Series Tractor, Detailed With SCV Ports, Drawbar Hitch, &amp; 3PT, 2023. URL: https://grabcad.com/library/john-deere-6195-series-tractor-detailed-with-scv-ports-drawbar-hitch-3pt-1.
[29] Mouldboard plough, 2024. URL: https://grabcad.com/library/mouldboard-plough-1.
[30] V. V. Osadchyi, K. P. Osadcha, H. B. Varina, S. V. Shevchenko, I. S. Bulakh, Specific features of the use of augmented reality technologies in the process of the development of cognitive component of future professionals’ mental capacity, Journal of Physics: Conference Series 1946 (2021) 012022. doi:10.1088/1742-6596/1946/1/012022.</p>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>[1] O. V. Kanivets, I. M. Kanivets, T. M. Gorda, O. A. Burlaka, Development of a machine vision program to determine the completeness of wrapping plants in the soil, CEUR Workshop Proceedings 3077 (2021) 27–43. URL: https://ceur-ws.org/Vol-3077/paper04.pdf.</mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>[2] A. Dudnikov, I. Dudnikov, V. Dudnyk, V. Mykhailichnko, O. Burlaka, O. Kanivets, Increasing the resource of agricultural machines, Technology Audit and Production Reserves 5 (2021) 6–11. doi:10.15587/2706-5448.2021.242256.</mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>[3] S. Papadakis, A. E. Kiv, H. M. Kravtsov, V. V. Osadchyi, M. V. Marienko, O. P. Pinchuk, M. P. Shyshkina, O. M. Sokolyuk, I. S. Mintii, T. A. Vakaliuk, L. E. Azarova, L. S. Kolgatina, S. M. Amelina, N. P. Volkova, V. Y. Velychko, A. M. Striuk, S. O. Semerikov, Unlocking the power of synergy: the joint force of cloud technologies and augmented reality in education, in: S. O. Semerikov, A. M. Striuk (Eds.), Joint Proceedings of the 10th Workshop on Cloud Technologies in Education, and 5th International Workshop on Augmented Reality in Education (CTE+AREdu 2022), Kryvyi Rih, Ukraine, May 23, 2022, volume 3364 of CEUR Workshop Proceedings, CEUR-WS.org, 2022, pp. 1–23. URL: https://ceur-ws.org/Vol-3364/paper00.pdf.</mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>[4] C. Khansulivong, S. Wicha, P. Temdee, Adaptive of new technology for agriculture online learning by metaverse: a case study in Faculty of Agriculture, National University of Laos, in: 2022 Joint International Conference on Digital Arts, Media and Technology with ECTI Northern Section Conference on Electrical, Electronics, Computer and Telecommunications Engineering (ECTI DAMT &amp; NCON), 2022, pp. 428–432. doi:10.1109/ECTIDAMTNCON53731.2022.9720366.</mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>[5] I. M. Kanivets, O. V. Kanivets, T. M. Gorda, VR simulator for studying the structure of tractors, in: T. A. Vakaliuk, V. V. Osadchyi, O. P. Pinchuk (Eds.), Proceedings of the 2nd Workshop on Digital Transformation of Education (DigiTransfEd 2023) co-located with 18th International Conference on ICT in Education, Research and Industrial Applications (ICTERI 2023), Ivano-Frankivsk, Ukraine, September 18-22, 2023, volume 3553 of CEUR Workshop Proceedings, CEUR-WS.org, 2023, pp. 124–141. URL: https://ceur-ws.org/Vol-3553/paper2.pdf.</mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>[6] O. Burov, O. P. Pinchuk, Extended Reality in Digital Learning: Influence, Opportunities and Risks' Mitigation, in: S. H. Lytvynova, O. Y. Burov, N. Demeshkant, V. Osadchyi, S. Semerikov (Eds.), Proceedings of the VI International Workshop on Professional Retraining and Life-Long Learning using ICT: Person-oriented Approach (3L-Person 2021) co-located with 17th International Conference on ICT in Education, Research, and Industrial Applications: Integration, Harmonization, and Knowledge Transfer (ICTERI 2021), Kherson, Ukraine, October 1, 2021, volume 3104 of CEUR Workshop Proceedings, CEUR-WS.org, 2021, pp. 119–128. URL: https://ceur-ws.org/Vol-3104/paper187.pdf.</mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>[7] V. V. Osadchyi, H. B. Varina, K. P. Osadcha, O. V. Kovalova, V. V. Voloshyna, O. V. Sysoiev, M. P. Shyshkina, The use of augmented reality technologies in the development of emotional intelligence of future specialists of socionomic professions under the conditions of adaptive learning, in: S. H. Lytvynova, S. O. Semerikov (Eds.), Proceedings of the 4th International Workshop on Augmented Reality in Education (AREdu 2021), Kryvyi Rih, Ukraine, May 11, 2021, volume 2898 of CEUR Workshop Proceedings, CEUR-WS.org, 2021, pp. 269–293. URL: https://ceur-ws.org/Vol-2898/paper15.pdf.</mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>[8] O. V. Kanivets, I. M. Kanivets, T. M. Gorda, O. V. Gorbenko, A. O. Kelemesh, Using a mobile application to teach students to measure with a micrometer during remote laboratory work, in: S. O. Semerikov, A. M. Striuk (Eds.), Joint Proceedings of the 10th Workshop on Cloud Technologies in Education, and 5th International Workshop on Augmented Reality in Education (CTE+AREdu 2022), Kryvyi Rih, Ukraine, May 23, 2022, volume 3364 of CEUR Workshop Proceedings, CEUR-WS.org, 2022, pp. 87–107. URL: https://ceur-ws.org/Vol-3364/paper08.pdf.</mixed-citation>
      </ref>
      <ref id="ref9">
        <mixed-citation>[9] R. P. Perez, Ö. Keleş, Immersive Virtual Reality Environments for Embodied Learning of Engineering Students, 2025. URL: https://arxiv.org/abs/2503.16519. arXiv:2503.16519.</mixed-citation>
      </ref>
      <ref id="ref10">
        <mixed-citation>[10] State-of-the-art in real-time virtual interfaces for tractors and farm machines: A systematic review, Computers and Electronics in Agriculture 231 (2025) 109947. doi:10.1016/j.compag.2025.109947.</mixed-citation>
      </ref>
      <ref id="ref11">
        <mixed-citation>[11] M. Cutini, C. Bisaglia, M. Brambilla, A. Bragaglio, F. Pallottino, A. Assirelli, E. Romano, A. Montaghi, E. Leo, M. Pezzola, C. Maroni, P. Menesatti, A Co-Simulation Virtual Reality Machinery Simulator for Advanced Precision Agriculture Applications, Agriculture 13 (2023) 1603. doi:10.3390/agriculture13081603.</mixed-citation>
      </ref>
      <ref id="ref12">
        <mixed-citation>[12] K. S. Swanson, A. A. Brown, S. N. Brennan, C. M. LaJambe, Extending driving simulator capabilities toward hardware-in-the-loop testbeds and remote vehicle interfaces, in: 2013 IEEE Intelligent Vehicles Symposium Workshops (IV Workshops), 2013, pp. 115–120. doi:10.1109/IVWorkshops.2013.6615236.</mixed-citation>
      </ref>
      <ref id="ref13">
        <mixed-citation>[13] J. Radianti, T. A. Majchrzak, J. Fromm, I. Wohlgenannt, A systematic review of immersive virtual reality applications for higher education: Design elements, lessons learned, and research agenda, Computers &amp; Education 147 (2020) 103778. doi:10.1016/j.compedu.2019.103778.</mixed-citation>
      </ref>
      <ref id="ref14">
        <mixed-citation>[14] Y. Tan, W. Xu, S. Li, K. Chen, Augmented and Virtual Reality (AR/VR) for Education and Training in the AEC Industry: A Systematic Review of Research and Applications, Buildings 12 (2022) 1529. doi:10.3390/buildings12101529.</mixed-citation>
      </ref>
      <ref id="ref15">
        <mixed-citation>[15] D. H. Naudé, B. S. Botha, L. Hugo, H. Jordaan, W. A. Lombard, Extended Reality in Agricultural Education: A Framework for Implementation, Education Sciences 14 (2024) 1309. doi:10.3390/educsci14121309.</mixed-citation>
      </ref>
      <ref id="ref16">
        <mixed-citation>[16] J. Wang, Y. Li, X. Chen, C. Wang, Before and After: How Did the Pandemic Reshape Virtual Laboratory Research?, SAGE Open 15 (2025). doi:10.1177/21582440251339961.</mixed-citation>
      </ref>
      <ref id="ref17">
        <mixed-citation>[17] X. Huang, Q. Zhao, Y. Liu, D. Harris, M. Shawler, Learning in an Immersive VR Environment: Role of Learner Characteristics and Relations Between Learning and Psychological Outcomes, Journal of Educational Technology Systems 53 (2024) 3–29. doi:10.1177/00472395231216943.</mixed-citation>
      </ref>
      <ref id="ref18">
        <mixed-citation>[18] T. A. Vakaliuk, V. V. Kontsedailo, D. S. Antoniuk, O. V. Korotun, I. S. Mintii, A. V. Pikilnyak, Using game simulator Software Inc in the Software Engineering education, in: A. E. Kiv, M. P. Shyshkina (Eds.), Proceedings of the 2nd International Workshop on Augmented Reality in Education, Kryvyi Rih, Ukraine, March 22, 2019, volume 2547 of CEUR Workshop Proceedings, CEUR-WS.org, 2019, pp. 66–80. URL: https://ceur-ws.org/Vol-2547/paper05.pdf.</mixed-citation>
      </ref>
      <ref id="ref19">
        <mixed-citation>[19] Z. Wei, J. Liao, L.-H. Lee, H. Qu, X. Xu, Towards Enhanced Learning through Presence: A Systematic Review of Presence in Virtual Reality Across Tasks and Disciplines, 2025. URL: https://arxiv.org/abs/2504.13845. arXiv:2504.13845.</mixed-citation>
      </ref>
      <ref id="ref20">
        <mixed-citation>[20] O. Pinchuk, O. Burov, S. Ahadzhanova, V. Logvinenko, Y. Dolgikh, T. Kharchenko, O. Hlazunova, A. Shabalin, VR in education: Ergonomic features and cybersickness, in: S. Nazir, T. Ahram, W. Karwowski (Eds.), Advances in Human Factors in Training, Education, and Learning Sciences, volume 1211, 2020, pp. 350–355. doi:10.1007/978-3-030-50896-8_50.</mixed-citation>
      </ref>
      <ref id="ref21">
        <mixed-citation>[21] W. Zhang, Z. Wang, Theory and Practice of VR/AR in K-12 Science Education - A Systematic Review, Sustainability 13 (2021) 12646. doi:10.3390/su132212646.</mixed-citation>
      </ref>
      <ref id="ref22">
        <mixed-citation>[22] V. V. Hordiienko, G. V. Marchuk, T. A. Vakaliuk, A. V. Pikilnyak, Development of a model of the solar system in AR and 3D, in: O. Y. Burov, A. E. Kiv (Eds.), Proceedings of the 3rd International Workshop on Augmented Reality in Education, Kryvyi Rih, Ukraine, May 13, 2020, volume 2731 of CEUR Workshop Proceedings, CEUR-WS.org, 2020, pp. 217–238. URL: https://ceur-ws.org/Vol-2731/paper12.pdf.</mixed-citation>
      </ref>
      <ref id="ref23">
        <mixed-citation>[23] O. V. Kanivets, I. Kanivets, N. V. Kononets, T. Gorda, E. O. Shmeltser, Development of mobile applications of augmented reality for projects with projection drawings, in: A. E. Kiv, M. P. Shyshkina (Eds.), Proceedings of the 2nd International Workshop on Augmented Reality in Education, Kryvyi Rih, Ukraine, March 22, 2019, volume 2547 of CEUR Workshop Proceedings, CEUR-WS.org, 2019, pp. 262–273. URL: https://ceur-ws.org/Vol-2547/paper19.pdf.</mixed-citation>
      </ref>
      <ref id="ref24">
        <mixed-citation>[24] K. Altmeyer, S. Kapp, M. Thees, S. Malone, J. Kuhn, R. Brünken, The use of augmented reality to foster conceptual knowledge acquisition in STEM laboratory courses—Theoretical background and empirical results, British Journal of Educational Technology 51 (2020) 611–628. doi:10.1111/bjet.12900.</mixed-citation>
      </ref>
      <ref id="ref25">
        <mixed-citation>[25] J. Liu, M. C. Ang, J. K. Chaw, A.-L. Kor, K. W. Ng, M. C. Lam, Assessing the impact and development of immersive VR technology in education: Insights from telepresence, emotion, and cognition, Technological Forecasting and Social Change 213 (2025) 124024. doi:10.1016/j.techfore.2025.124024.</mixed-citation>
      </ref>
      <ref id="ref26">
        <mixed-citation>[26] GVR SDK for Unity v1.200.1, 2023. URL: https://github.com/googlevr/gvr-unity-sdk/releases.</mixed-citation>
      </ref>
      <ref id="ref27">
        <mixed-citation>[27] C. Verdouw, B. Tekinerdogan, A. Beulens, S. Wolfert, Digital twins in smart farming, Agricultural Systems 189 (2021) 103046. doi:10.1016/j.agsy.2020.103046.</mixed-citation>
      </ref>
      <ref id="ref28">
        <mixed-citation>[28] John Deere 6195 Series Tractor, Detailed With SCV Ports, Drawbar Hitch, &amp; 3PT, 2023. URL: https://grabcad.com/library/john-deere-6195-series-tractor-detailed-with-scv-ports-drawbar-hitch-3pt-1.</mixed-citation>
      </ref>
      <ref id="ref29">
        <mixed-citation>[29] Mouldboard plough, 2024. URL: https://grabcad.com/library/mouldboard-plough-1.</mixed-citation>
      </ref>
      <ref id="ref30">
        <mixed-citation>[30] V. V. Osadchyi, K. P. Osadcha, H. B. Varina, S. V. Shevchenko, I. S. Bulakh, Specific features of the use of augmented reality technologies in the process of the development of cognitive component of future professionals’ mental capacity, Journal of Physics: Conference Series 1946 (2021) 012022. doi:10.1088/1742-6596/1946/1/012022.</mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>