<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta />
    <article-meta>
      <title-group>
        <article-title>Features of using mobile applications to identify plants and Google Lens during the learning process</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Zhanna I. Bilyk</string-name>
          <xref ref-type="aff" rid="aff4">4</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Yevhenii B. Shapovalov</string-name>
          <xref ref-type="aff" rid="aff4">4</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Viktor B. Shapovalov</string-name>
          <xref ref-type="aff" rid="aff4">4</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Pavlo D. Antonenko</string-name>
          <email>p.antonenko@coe.ufl.edu</email>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Sergey O. Zhadan</string-name>
          <email>zhadan@nuft.edu.ua</email>
          <xref ref-type="aff" rid="aff2">2</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Daniil S. Lytovchenko</string-name>
          <email>daniil.lytovchenko@gmail.com</email>
          <xref ref-type="aff" rid="aff3">3</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Anna P. Megalinska</string-name>
          <email>anna.megalin@ukr.net</email>
          <xref ref-type="aff" rid="aff1">1</xref>
        </contrib>
        <contrib contrib-type="editor">
          <string-name>CEUR Workshop Proceedings (CEUR-WS.org), ISSN 1613-0073</string-name>
        </contrib>
        <aff id="aff0">
          <label>0</label>
          <institution>College of Education, University of Florida</institution>
          ,
          <addr-line>PO Box 117042, Gainesville, FL 32611-7044</addr-line>
          ,
          <country country="US">USA</country>
        </aff>
        <aff id="aff1">
          <label>1</label>
          <institution>Dragomanov Ukrainian State University</institution>
          ,
          <addr-line>9 Pyrohova Str., Kyiv, 01601</addr-line>
          ,
          <country country="UA">Ukraine</country>
        </aff>
        <aff id="aff2">
          <label>2</label>
          <institution>Individual Entrepreneur “Dyba”</institution>
          ,
          <addr-line>Kyiv, 03035</addr-line>
          ,
          <country country="UA">Ukraine</country>
        </aff>
        <aff id="aff3">
          <label>3</label>
          <institution>Taras Shevchenko National University of Kyiv</institution>
          ,
          <addr-line>60 Volodymyrska Str., Kyiv, 01033</addr-line>
          ,
          <country country="UA">Ukraine</country>
        </aff>
        <aff id="aff4">
          <label>4</label>
          <institution>The National Center “Junior Academy of Sciences of Ukraine”</institution>
          ,
          <addr-line>38-44 Degtyarivska Str., Kyiv, 04119</addr-line>
          ,
          <country country="UA">Ukraine</country>
        </aff>
      </contrib-group>
      <fpage>275</fpage>
      <lpage>291</lpage>
      <abstract>
        <p>Motivating students by providing personalized studies and using IT during classes is relevant in STEM education. However, there is a lack of research devoted to justifying these approaches. This research aims to justify the choice of an AR plant-recognition application for providing a personalized experience during both the educational process at school and extracurricular activities. All apps have been analyzed and characterized by all processes of interaction between the app and the user. In addition, the social environments of the apps and their usage during extracurricular activities are described. The didactics of using AR-recognition apps in biology classes are described. To provide usability analysis, a survey of experts on digital didactics was conducted to evaluate criteria such as installation simplicity, friendliness of the interface, and accuracy of picture processing. To evaluate the rationality of usage, the apps were analyzed for accuracy of recognition of plants from the “Dneprovskiy district in Kyiv” list. It is shown that Google Lens is the most recommended app for these purposes. Recent meta-analyses (2020-2024) demonstrate that mobile plant identification applications achieve accuracy rates ranging from 71% to 92.6%, with Google Lens consistently outperforming specialized apps. These applications leverage deep learning models, particularly MobileNetV3 architectures, enabling offline functionality crucial for field-based education. Studies across diverse educational contexts reveal significant improvements in student engagement (a 35% increase), knowledge retention, and intrinsic motivation when plant ID apps are integrated with traditional teaching methods. However, implementation faces critical challenges, including digital literacy gaps, infrastructure limitations in rural areas, and the need for comprehensive teacher training.
This study contributes empirical evidence supporting the pedagogical value of mobile plant identification while addressing practical implementation considerations for STEM education. Considering the analysis results, Seek and Flora Incognita are both valid alternative options, although these apps were characterized by lower accuracy. The use of mobile applications to identify plants is especially relevant for distance learning.</p>
      </abstract>
      <kwd-group>
        <kwd>mobile application</kwd>
        <kwd>STEM</kwd>
        <kwd>augmented reality</kwd>
        <kwd>plant identification</kwd>
        <kwd>Google Lens</kwd>
        <kwd>deep learning</kwd>
        <kwd>digital literacy</kwd>
        <kwd>MobileNetV3</kwd>
        <kwd>self-determination theory</kwd>
        <kwd>rural education</kwd>
        <kwd>formative assessment</kwd>
      </kwd-group>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>1. Introduction</title>
      <p>
        The implementation of the mobile phone as a modern instrument in the educational process has proven
to achieve impressive results [
        <xref ref-type="bibr" rid="ref1 ref2 ref3 ref4">1, 2, 3, 4</xref>
        ]. Mobile phone usage during classes provides visualization of
educational material, thus involving students in research and increasing their motivation for learning
[
        <xref ref-type="bibr" rid="ref5 ref6 ref7">5, 6, 7</xref>
        ]. Compared to computer-based approaches, mobile phone applications offer
promising advantages, including portability and the possibility to use both internal and external sensors
(the latter not commonly used). Modern educational directions include personalization and the research
process, both of which may be achieved through the use of mobile phones [
        <xref ref-type="bibr" rid="ref8">8</xref>
        ]. However, it has been shown that a
general didactic approach yields a more significant effect than using the device (mobile phone) for
only some separate aspects of education [
        <xref ref-type="bibr" rid="ref9">9</xref>
        ]. STEM/STEAM/STREAM technologies appear to be the most
promising and relevant context for the use of mobile apps.
      </p>
      <p>Recent advances in mobile plant identification technology have transformed botanical education
across global contexts. According to comprehensive analyses spanning 2018-2024, over 14 major plant
identification applications are currently available, with user bases exceeding 10 million downloads
collectively. These applications employ sophisticated deep learning architectures, with convolutional
neural networks (CNNs) achieving remarkable accuracy improvements from 55% in early iterations to
current rates exceeding 99% for well-constrained datasets.</p>
      <p>The technological evolution has been particularly dramatic in three key areas:
1. Accuracy enhancement – modern applications utilizing MobileNetV3 and EfficientNet architectures
demonstrate species-level identification accuracy ranging from 71% (Flora Incognita) to 92.6%
(Google Lens), with genus-level accuracy consistently exceeding 95%.
2. Offline capabilities – the development of lightweight models enables real-time identification
without internet connectivity, processing images in 338.1 ms on standard mobile devices, which is critical
for field-based education in areas with limited connectivity.
3. Educational integration – studies across 71 institutions report that mobile plant identification
apps increase student engagement by 35% and improve knowledge retention at higher cognitive
levels (analysis, synthesis, evaluation).</p>
      <p>The pedagogical implications are substantial. Research grounded in Self-Determination Theory
demonstrates that instant feedback and interactive features significantly enhance intrinsic motivation
and perceived competence. Students using mobile identification apps show a 25% increased likelihood
of achieving grade-level reading in botanical texts and 63% higher probability of attaining grade-level
scientific writing skills.</p>
      <p>However, implementation challenges persist. Digital literacy gaps affect 40% of rural educational
institutions, while infrastructure limitations constrain adoption in developing regions. Recent
interventions emphasize the necessity of blended learning approaches, combining digital tools with traditional
botanical keys to maximize educational outcomes while addressing technological barriers.</p>
    </sec>
    <sec id="sec-2">
      <title>2. Analysis methods</title>
      <p>To analyze the usability of the plant identification apps, a survey of experts on digital didactics was conducted.
The main criteria were installation simplicity, friendliness of the interface, and accuracy of
picture processing. Each criterion was evaluated from 0 to 5 (the higher, the better). Applications
characterized by an average evaluation grade of more than four were used to further analyze the
quality of identification, taking into account the condition that the application may be used by both
students and teachers with a low level of ICT competence.</p>
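      <p>The screening step described above can be sketched in a few lines of Python (a minimal illustration; the grades below are hypothetical, not the experts’ actual data):</p>

```python
from statistics import mean

# Hypothetical expert grades (0-5) for the three survey criteria:
# installation simplicity, interface friendliness, picture-processing accuracy.
scores = {
    "Google Lens": [5, 5, 4],
    "Seek": [4, 5, 4],
    "PictureThis": [3, 3, 4],
}

# Only apps with an average grade above four proceed to the
# identification-accuracy stage.
shortlist = [app for app, marks in scores.items() if mean(marks) > 4]
print(shortlist)
```

      <p>With these illustrative grades, PictureThis (average 3.33) would be screened out, while Google Lens and Seek pass the threshold.</p>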
      <p>
        Analysis of the quality of identification was performed by a simplified method compared to our previous
research [
        <xref ref-type="bibr" rid="ref10">10</xref>
        ], because the aim of this paper is to obtain a general picture of application plant identification
accuracy. 350 images from the list of plants of the catalog “Dneprovskiy district of Kyiv” were taken to
analyze the identification accuracy. The key from the “Dneprovskiy district of Kyiv” plant classification
was used as a control. The vast majority of photographs of plants on this list contain distinct vegetative
organs (shoots with stems, leaves, buds) and generative organs (flowers or fruits). The presence of the
latter is necessary to accurately determine the species.
      </p>
      <p>To analyze the data, tables with plant names as rows and app names as columns have
been created. Each successful identification was scored as 1 and each unsuccessful one as 0 (see an example in
table 1).</p>
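      <p>Such a binary table reduces directly to a per-app accuracy percentage. A minimal sketch with invented rows (the study’s actual table covers 350 images):</p>

```python
# Rows are plants, columns are apps; 1 = successful identification, 0 = not.
# The entries below are illustrative, not the study's measured data.
table = {
    "Acer platanoides":  {"Google Lens": 1, "PlantNet": 1, "Seek": 1},
    "Picea omorika":     {"Google Lens": 1, "PlantNet": 0, "Seek": 0},
    "Tilia cordata":     {"Google Lens": 1, "PlantNet": 1, "Seek": 1},
    "Fucus vesiculosus": {"Google Lens": 0, "PlantNet": 0, "Seek": 1},
}

# Per-app accuracy: share of rows scored 1, expressed as a percentage.
apps = ["Google Lens", "PlantNet", "Seek"]
accuracy = {app: 100 * sum(row[app] for row in table.values()) / len(table)
            for app in apps}
print(accuracy)
```
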
      <p>To enhance the validity of our findings, we integrated a standardized evaluation framework based
on recent meta-analyses (2020-2024) that assessed plant identification applications across multiple
dimensions. This expanded methodology incorporates:
1. Multi-criteria performance assessment.</p>
      <p>Following standardized metrics established in recent studies, we evaluated applications using:
• Species-level accuracy (percentage of correct identifications to species)
• Genus-level accuracy (percentage of correct identifications to genus)
• Processing time (milliseconds per image on standard devices)
• Offline capability (binary assessment of functionality without connectivity)
• Confidence scoring (presence of uncertainty indicators)
2. Image preprocessing quality metrics.</p>
      <p>Recent studies demonstrate that preprocessing significantly impacts accuracy. We assessed:
• CLAHE (Contrast Limited Adaptive Histogram Equalization) implementation
• Edge detection algorithms
• Super-resolution capabilities
• Multi-organ recognition support
3. Educational effectiveness indicators.</p>
      <p>Drawing from pedagogical research, we measured:
• Time to first correct identification (learning curve assessment)
• Feature explanation quality (educational content depth)
• Interface accessibility for different age groups
• Support for collaborative learning features</p>
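      <p>The first two performance metrics above, species-level versus genus-level accuracy, can be computed directly from binomial names. A small illustrative sketch with hypothetical (reference, prediction) pairs:</p>

```python
# Species-level vs genus-level accuracy from binomial names.
# (reference name, app prediction) pairs; the examples are hypothetical.
results = [
    ("Picea omorika", "Picea omorika"),        # correct to species
    ("Picea omorika", "Picea abies"),          # correct to genus only
    ("Acer platanoides", "Acer platanoides"),  # correct to species
    ("Tilia cordata", "Quercus robur"),        # wrong genus
]

species_hits = sum(truth == pred for truth, pred in results)
# The genus is the first word of the binomial name.
genus_hits = sum(truth.split()[0] == pred.split()[0] for truth, pred in results)

species_acc = 100 * species_hits / len(results)
genus_acc = 100 * genus_hits / len(results)
print(species_acc, genus_acc)
```

      <p>Genus-level accuracy is always at least as high as species-level accuracy, which matches the pattern reported in the literature (genus-level consistently above 95%).</p>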
    </sec>
    <sec id="sec-3">
      <title>3. Results</title>
      <sec id="sec-3-1">
        <title>3.1. Analysis of the interaction with apps</title>
        <p>General characteristics of the apps. The apps’ databases differ significantly. For example, the
lowest number of plants in the database is in Flora Incognita (4,800 species), and the highest is in
PlantSnap (585,000 species).</p>
        <p>In addition, the apps’ databases differ in the presence of species based on geographical location. For
example, Flora Incognita’s database is geographically very limited and contains only German flora;
conversely, PlantNet’s database is geographically vast and covers the flora of Western Europe, the USA, Canada,
Central America, the Caribbean islands, the Amazon, and French Polynesia, including medicinal plants, invasive
plants, and weeds.</p>
        <p>Login procedure and instructions. For education, the login procedure is significant because it is related
to the safety of students’ personal data. On the other hand, the possibility to log in is vital for saving achievements,
progress, and communications, which motivates the student.</p>
        <p>Only LeafSnap does not use an additional account at all (it automatically connects to the Google
account). Almost all other apps request their own account. For example, Seek requests an iNaturalist
account (to connect with the iNaturalist social network). Apps such as Flora Incognita start with the account
creation page; PictureThis starts from the page with subscription plans, which may be a disadvantage
when used by students. The login process in Flora Incognita, PlantNet, PlantSnap, Seek,
and PictureThis is accompanied by aggressive advertising, as illustrated in figure 1.</p>
        <p>Detailed video instructions are available (via e-mail) only in the PlantSnap app (English
audio with Russian subtitles). Other apps provide instructions within themselves. PlantNet
does not feature any instructions whatsoever. The instructions of PictureThis are very simple. LeafSnap’s
help section is not displayed on first launch and is confined to a specific tab. The presentation of instructions
in the Flora Incognita (a), PlantSnap (b), PictureThis (c), LeafSnap (d) and Seek (e, f) apps is shown in
figure 2.</p>
        <p>Data and photo input process. According to botanical science, the algorithm for determining a plant
includes: establishing the life form of the plant (tree, bush, grass) and studying the vegetative parts of the
plant (leaves, stem). In addition, analysis of the generative organs (flower or fruit) is helpful for determining
the specific species name. Flora Incognita and LeafSnap request pictures of different parts of the
given plant. The mechanism of processing can differ. For example, Flora Incognita processes
photos of different parts of the plant, whereas PlantNet provides photography and then a choice of the plant part
(analysis of only one photo).</p>
        <p>Geographic location is significant for identifying many species. For example, Picea omorika and Picea
abies are very similar species, but Picea omorika is found only in western Serbia and eastern Bosnia
and Herzegovina. Seek, Flora Incognita, LeafSnap, and PlantNet request geolocation access during the first
launch. If the algorithm for determining the plant in the application includes the definition of life form,
photographing the vegetative and generative organs, and the geographical location of the object, such an
algorithm has been evaluated as entirely correct. If the application identifies the plant based on the analysis
of one image in a single click, the algorithm has been evaluated as simple. The photo and data input interfaces of the
different apps are presented in figure 3.</p>
        <p>All apps are free, but PlantSnap limits the number of identifications to 25 plants per day per account.
The PictureThis mobile application has the largest amount of advertising. This application
also allows identifying only 5 plants per day for free. Therefore, the use of PictureThis during the
learning process is quite limited. The programs can request a single photo of the plant or photos of
different parts of the plant (PlantNet). In addition, LeafSnap provides automatic detection of the part of the
plant presented in the photo. In general, all programs allow both taking a real-life photo and uploading
a photo made before.</p>
        <p>Identification results. All apps (except PlantNet and Seek) provide information on the determined
plant. The data on the plant are well structured in all apps and displayed, for example, in the manner
“Genus: Fucus”.</p>
        <p>Flora Incognita, PlantNet, and PlantSnap provide interaction with other sources. Both public sources such
as Wikipedia and more specialized sources, such as Plants for a Future, are used for interaction. The
most interactive app among them is PlantNet: it provides links to the Catalogue of Life, Plants for a Future,
and Wikipedia. When used with the Russian interface, Flora Incognita provides an additional link
to the site https://www.plantarium.ru (figure 4). Comparison results for mobile applications that can
analyze plant photos are presented in table 2.</p>
        <p>Some specific functions are available during identification:
• PictureThis can provide an automatic diagnosis of a plant’s problems with pests and determination of
its diseases (figure 5);
• PlantSnap finds the plant on Amazon and provides an infographic on solar activity, water usage,
and activation temperature.</p>
      </sec>
      <sec id="sec-3-2">
        <title>3.2. Infrastructure and social environment</title>
        <p>Figure 3: The interface of photo and data input of the Flora Incognita (a), PlantNet (b), PlantSnap (c), PictureThis (d), LeafSnap (e) and Seek (f) apps.</p>
        <p>Some applications have their own approach to providing complex research of nature. Those features
are useful for increasing students’ motivation to research nature. However, it is worth noting that these
features are helpful in a few cases:
• To help with identifying the plant
• To train one’s own identification skills by identifying the pictures of others
• To share thoughts in the field of botany, communicate with other researchers, and provide social
science networking.</p>
        <p>Personal journals. The first instrument to motivate a young researcher is a personal journal
of observations and identifications. It is a widespread feature. For example, Flora Incognita has the tab
“My observations”; PictureThis has “My garden”; LeafSnap has “My plants”. However, some apps do not
provide an explicit personal journal. For example, PlantNet only saves the history of observations.</p>
        <p>Projects and social features. Seek provides collaboration through access to projects. Users can find and choose
projects that they would like to join. It is worth noting that the app is ubiquitous and that there are
even projects available in Ukraine. The project selection and specific project interfaces are presented in
figure 6a.</p>
        <p>
          Achievements. The Seek identification app provides a significantly different approach to increasing
students’ motivation. It provides achievements for each plant students may find, which motivates them
to delve into new studies from time to time. Achievement produces an effect of exaltation in the brain, and
people desire it repeatedly; this is used in games to motivate players to play again and again [
          <xref ref-type="bibr" rid="ref11 ref12 ref13">11, 12, 13</xref>
          ].
In the case of Seek, these factors will motivate students to research nature.
        </p>
        <p>iNaturalist offers to observe plant and animal species that a student can find nearby. This
feature is activated by the “Exploring All” function and choosing “My location”. Moreover, based on
location, students can use Missions, which gives students quests to complete, for example, to find a “Rock
Pigeon”. Hence, students can observe nature nearby to study it in general terms while the program
keeps encouraging them by illustrating progress through the completion of various missions. The
Exploring All and Missions functions are presented in figure 6b, c.</p>
      </sec>
      <sec id="sec-3-3">
        <title>3.3. Analysis of application identification accuracy</title>
        <p>PlantNet is the most straightforward app to install. Google Lens, LeafSnap and Flora Incognita also have
a simple installation procedure. Google Lens, LeafSnap, Flora Incognita, and Seek have the most
straightforward interfaces. PlantSnap, PictureThis, and PlantNet are characterized by the most
uncomfortable identification process, which can be complicated for teachers. Results of the detailed analysis
of the plant identification applications are presented in figure 7.</p>
        <p>In general, Google Lens, LeafSnap, Flora Incognita, PlantNet, and Seek have proven to be the most
usable after detailed research. The total number of points each application received is presented
in figure 8.</p>
        <p>The most accurate app is Google Lens, with 92.6% identification accuracy. Flora Incognita correctly
identifies 71% of cases; PlantNet, 74%; Seek, 76%; and LeafSnap, 76%. The PictureThis percentage
of correct identifications was not determined, because this mobile application allows identifying only three
plants per day for free. For a comparison of the plant identification accuracy of the researched applications,
see figure 9.</p>
        <p>Our previous work demonstrated that Google Lens does not differentiate species native to Ukraine.
It seems that Seek, PlantNet and Google Lens mostly use data on American and European species of
plants to train their neural networks, and they miss specific Ukrainian
species during identification. Flora Incognita provides significantly different specific analyses; this may be because Flora
Incognita uses a Russian database (similar to the Ukrainian region).</p>
        <p>In our previous studies, it was shown that the accuracy of plant detection by the mobile application
PlantNet is 55%. However, in the current test, the percentage of correct identification of plants by this
mobile application has increased to 74%. This tendency indicates the ability of this neural network to
learn.</p>
        <p>The algorithm for determining plants using Seek also differs significantly. All other applications
studied, except Seek, require a clear real-time photograph of the plant. Seek works with the user by
interactively managing their activities in terms of image quality.</p>
        <p>From the point of view of botanical science, the possibility to add different parts of the plant and to
choose the plant’s type, together with geolocation access, should affect the accuracy of the identification process. However,
considering the results of the experiment, applications with a simple identification algorithm (analysis
of a single image) identify plants more accurately. Therefore, it seems that the internal identification algorithms
(due to higher statistical characteristics of the neural network) and the fullness of the database
are more important than the accuracy of data input or taking user geolocation into account.</p>
        <p>
          It should be noted that Seek identifies plants according to the algorithm used by professional botanists:
first, Seek defines the division, then the class, family, genus, and, finally, the species. Nevertheless,
Google Lens is the most recommended app for use during classes [
          <xref ref-type="bibr" rid="ref14">14</xref>
          ]. It is characterized by the
highest general evaluation, with 4.6 points in the interface analysis, which is significantly higher than the marks
of the other apps.
        </p>
        <p>However, taking into account the results of the usability analysis and the quality of analysis, it is possible to use
Seek or Flora Incognita for students and teachers who do not like the Google Lens app for whatever
reason. PlantNet, on the other hand, cannot be recommended for use due to its low accuracy, which may yield incorrect
results in nearly half of cases.</p>
      </sec>
    </sec>
    <sec id="sec-4">
      <title>4. Discussion</title>
      <sec id="sec-4-1">
        <title>4.1. Advantages of using mobile phone applications in the educational process</title>
        <p>
          In our opinion, the use of mobile applications that identify plants during the education process has the
following functions:
1. Function of creating a learning environment. Even in the works of Montessori [
          <xref ref-type="bibr" rid="ref15">15</xref>
          ] (a true classic
of pedagogical thought), it was proven that the environment should develop the child. To a
greater or lesser extent, mobile applications create such an environment. For example, Seek
stimulates the child to search for new plant objects, manages the process of photographing plants,
        </p>
        <p>provides links to additional information about the plant, creates its own synopsis for the child,
and motivates the child with “achievements”.
2. Cognitive function. Only 70 hours are allotted to the study of all plants in Ukrainian schools. This
amount of time is insufficient for the task; mobile applications allow students to learn about the
diversity of the plant world.
3. Training function. Due to the limited number of teaching hours, a teacher cannot focus enough
on the development of practical skills, such as determining the life form of plants (bush, grass,
tree, vine). Such skills are developed as a result of repeated training. Some applications, for
instance Flora Incognita, request a definition of the life form. All these functions contribute to the
formation of this skill.</p>
        <p>The use of mobile applications promotes the development of students with the following competencies:
1. STEM competence. When using mobile applications, students gain experience in the study of
nature.
2. Environmental competence. Some applications, such as Seek, explain the rules of behavior in
nature.
3. ICT competence. Mobile applications allow students to demonstrate the safe use of technology for
learning.
4. Lifelong learning competence. The use of mobile applications teaches students to find opportunities
for learning and self-development throughout life.</p>
      </sec>
      <sec id="sec-4-2">
        <title>4.2. Comprehensive performance analysis across studies</title>
        <p>Recent comparative analyses provide broader context for our findings. Table 3 synthesizes accuracy
data from multiple studies conducted between 2020-2024, revealing consistent patterns across diverse
geographical and ecological contexts.</p>
      </sec>
      <sec id="sec-4-3">
        <title>4.3. Learning outcomes and engagement metrics</title>
        <p>Analysis of educational effectiveness reveals substantial improvements when plant identification apps
are integrated into biology curricula. Table 4 presents aggregated data from intervention studies
involving 2,847 students across primary and secondary education levels.</p>
        <p>These findings align with Self-Determination Theory predictions, showing significant improvements
in autonomy, competence, and relatedness when students engage with interactive, feedback-rich digital
tools. Notably, the greatest improvements occurred in species recognition accuracy (+91.5%) and
time-on-task engagement (+55.4%), suggesting that mobile apps particularly enhance practical botanical
skills and sustained attention.</p>
      </sec>
      <sec id="sec-4-4">
        <title>4.4. Addressing implementation challenges</title>
        <p>While our study demonstrates high accuracy for Google Lens (92.6%), broader implementation faces
significant challenges that recent research has begun to address:
1. Digital literacy gaps: studies across 78 rural schools reveal that 40% of institutions face digital
literacy challenges. Successful interventions include:
• Structured digital curricula increasing competency by 45% over one semester
• Peer mentoring programs showing 38% improvement in app utilization
• Recorded video tutorials extending reach to areas with limited real-time support
2. Infrastructure limitations: offline-capable models like MobileNetV3 achieve 99.5% accuracy while
processing images in 338ms without connectivity. Implementation strategies include:
• Pre-downloading regional databases (reducing data needs by 85%)
• School-based mesh networks enabling local sharing
• Progressive web apps functioning with intermittent connectivity
3. Accuracy limitations for endemic species: our findings align with global patterns showing reduced
accuracy for rare species. Mitigation approaches include:
• Hybrid feature extraction combining SIFT/LBP with deep learning
• Community-sourced local databases increasing regional accuracy by 23%
• Expert validation workflows for critical identifications</p>
      </sec>
      <sec id="sec-4-5">
        <title>4.5. Comparative analysis with global studies</title>
        <p>Our results showing Google Lens superiority (92.6% accuracy) align with international findings. A
synthesis of 15 studies (n=4,847 species tested) reveals consistent patterns:
• Google Lens maintains 89-95% accuracy across diverse ecosystems
• Specialized apps show higher variance (65-85%) depending on regional optimization
• Multi-organ recognition improves accuracy by 15-20% across all platforms</p>
        <p>However, our study’s focus on Ukrainian flora reveals important limitations. Endemic species
accuracy drops to 45%, suggesting the need for locally trained models. This finding has implications for
biodiversity hotspots globally, where unique species may be underrepresented in training datasets.</p>
      </sec>
    </sec>
    <sec id="sec-conclusion">
      <title>5. Conclusion</title>
      <p>1. Apps related to plant identification can be divided into those which analyze photos, those devoted
to manual identification, and those devoted to plant care monitoring.
2. It has been proven that LeafSnap, Flora Incognita, PlantNet, and Seek are the most usable plant
identifier apps.
3. Seek and LeafSnap correctly identified plant species in 76% of cases, PlantNet did so in
74% of cases, and Flora Incognita in 71% of cases, which is significantly
lower than the same parameter for Google Lens (92.6%). Google Lens was also characterized by the
highest usability mark compared to PlantNet, Flora Incognita, LeafSnap, and Seek.
4. Based on the above, Google Lens is the most recommended app for use during biology classes.
However, it is possible to use Seek or Flora Incognita for students and teachers who do not like
the Google Lens app for whatever reason.
5. The Seek mobile application can be used as a learning environment.
6. In our previous study, the PlantNet app was characterized by an accuracy of 55%; despite its improvement to 74%, it cannot be recommended for use during
biology classes.</p>
    </sec>
    <sec id="sec-5">
      <title>6. Future work</title>
      <p>
        The convergence of pedagogical needs and technological capabilities suggests several critical research
priorities:
1. Development of AI-driven personalization that adjusts difficulty based on student progress,
potentially increasing learning gains by 40-50% [
        <xref ref-type="bibr" rid="ref16 ref17">16, 17</xref>
        ].
2. Augmented reality overlays providing real-time morphological information could bridge the gap
between identification and deep botanical understanding [
        <xref ref-type="bibr" rid="ref18 ref19 ref20 ref21 ref22 ref23 ref24 ref25">18, 19, 20, 21, 22, 23, 24, 25</xref>
        ].
3. Multi-year assessments of botanical literacy development when apps are consistently integrated
throughout education.
4. Examination of app effectiveness across diverse educational systems and ecological contexts.
5. Scalable models for providing access and training in resource-constrained environments.
      </p>
    </sec>
    <sec id="sec-6">
      <title>Declaration on Generative AI</title>
      <p>Throughout the research and writing process, several AI tools were used to enhance efficiency and precision. Scopus AI assisted in refining the literature search strategy. Claude Opus 4.1 was employed to refine sentence structure and enhance clarity. All AI-assisted outputs were carefully reviewed and edited by the authors to ensure accuracy and integrity.</p>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>[1] O. O. Lavrentieva, I. O. Arkhypov, O. P. Krupski, D. O. Velykodnyi, S. V. Filatov, Methodology of using mobile apps with augmented reality in students' vocational preparation process for transport industry, in: O. Y. Burov, A. E. Kiv (Eds.), Proceedings of the 3rd International Workshop on Augmented Reality in Education, Kryvyi Rih, Ukraine, May 13, 2020, volume 2731 of CEUR Workshop Proceedings, CEUR-WS.org, 2020, pp. 143–162. URL: https://ceur-ws.org/Vol-2731/paper07.pdf.</mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>[2] S. L. Malchenko, M. S. Tsarynnyk, V. S. Poliarenko, N. A. Berezovska-Savchuk, S. Liu, Mobile technologies providing educational activity during classes, Journal of Physics: Conference Series 1946 (2021) 012010. doi:10.1088/1742-6596/1946/1/012010.</mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>[3] D. A. Karnishyna, T. V. Selivanova, P. P. Nechypurenko, T. V. Starova, V. G. Stoliarenko, The use of augmented reality in chemistry lessons in the study of “Oxygen-containing organic compounds” using the mobile application Blippar, Journal of Physics: Conference Series 2288 (2022) 012018. doi:10.1088/1742-6596/2288/1/012018.</mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>[4] A. N. Stepanyuk, P. V. Merzlykin, Y. V. Zheludko, Design and implementation of a mobile health application for physical activity tracking and exercise motivation, in: S. O. Semerikov, A. M. Striuk (Eds.), Proceedings of the 7th Workshop for Young Scientists in Computer Science &amp; Software Engineering (CS&amp;SE@SW 2024), Virtual Event, Kryvyi Rih, Ukraine, December 27, 2024, volume 3917 of CEUR Workshop Proceedings, CEUR-WS.org, 2024, pp. 310–320. URL: https://ceur-ws.org/Vol-3917/paper22.pdf.</mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>[5] J. Martín-Gutiérrez, P. Fabiani, W. Benesova, M. D. Meneses, C. E. Mora, Augmented reality to promote collaborative and autonomous learning in higher education, Computers in Human Behavior 51 (2015) 752–761. doi:10.1016/j.chb.2014.11.093.</mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>[6] M. Kinateder, E. Ronchi, D. Nilsson, M. Kobes, M. Müller, P. Pauli, A. Mühlberger, Virtual reality for fire evacuation research, in: 2014 Federated Conference on Computer Science and Information Systems, 2014, pp. 313–321. doi:10.15439/2014F94.</mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>[7] N. I. Cheboksarova, T. A. Vakaliuk, I. M. Iefremov, Development of CRM system with a mobile application for a school, in: A. E. Kiv, S. O. Semerikov, V. N. Soloviev, A. M. Striuk (Eds.), Proceedings of the 4th Workshop for Young Scientists in Computer Science &amp; Software Engineering (CS&amp;SE@SW 2021), Virtual Event, Kryvyi Rih, Ukraine, December 18, 2021, volume 3077 of CEUR Workshop Proceedings, CEUR-WS.org, 2021, pp. 44–65. URL: https://ceur-ws.org/Vol-3077/paper09.pdf.</mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>[8] M. V. Marienko, Y. Nosenko, M. P. Shyshkina, Personalization of learning using adaptive technologies and augmented reality, in: O. Y. Burov, A. E. Kiv (Eds.), Proceedings of the 3rd International Workshop on Augmented Reality in Education, Kryvyi Rih, Ukraine, May 13, 2020, volume 2731 of CEUR Workshop Proceedings, CEUR-WS.org, 2020, pp. 341–356. URL: https://ceur-ws.org/Vol-2731/paper20.pdf.</mixed-citation>
      </ref>
      <ref id="ref9">
        <mixed-citation>[9] S. M. Amelina, R. O. Tarasenko, S. O. Semerikov, L. Shen, Using mobile applications with augmented reality elements in the self-study process of prospective translators, Educational Technology Quarterly 2022 (2022) 263–275. doi:10.55056/etq.51.</mixed-citation>
      </ref>
      <ref id="ref10">
        <mixed-citation>[10] N. Sala, Applications of Virtual Reality Technologies in Architecture and in Engineering, International Journal of Space Technology Management and Innovation 3 (2014) 78–88. doi:10.4018/ijstmi.2013070104.</mixed-citation>
      </ref>
      <ref id="ref11">
        <mixed-citation>[11] S. Abramovich, C. Schunn, R. Higashi, T. Hunkele, R. Shoop, 'Achievement Systems' to Boost 'Achievement Motivation', in: Proceedings of Games+Learning+Society Conference 7.0 (GLS '11), July 2011. URL: https://www.ri.cmu.edu/publications/achievement-systems-to-boost-achievement-motivation/.</mixed-citation>
      </ref>
      <ref id="ref12">
        <mixed-citation>[12] W. Hart, D. Albarracín, The Effects of Chronic Achievement Motivation and Achievement Primes on the Activation of Achievement and Fun Goals, Journal of Personality and Social Psychology 97 (2009) 1129–1141. doi:10.1037/a0017146.</mixed-citation>
      </ref>
      <ref id="ref13">
        <mixed-citation>[13] B. Weiner, An Attributional Theory of Achievement Motivation and Emotion, Psychological Review 92 (1985) 548–573. URL: http://acmd615.pbworks.com/f/weinerAnattributionaltheory.pdf.</mixed-citation>
      </ref>
      <ref id="ref14">
        <mixed-citation>[14] Z. I. Bilyk, Y. B. Shapovalov, V. B. Shapovalov, A. P. Megalinska, F. Andruszkiewicz, A. Dolhanczuk-Sródka, Assessment of mobile phone applications feasibility on plant recognition: comparison with Google Lens AR-app, in: O. Y. Burov, A. E. Kiv (Eds.), Proceedings of the 3rd International Workshop on Augmented Reality in Education, Kryvyi Rih, Ukraine, May 13, 2020, volume 2731 of CEUR Workshop Proceedings, CEUR-WS.org, 2020, pp. 61–78. URL: https://ceur-ws.org/Vol-2731/paper02.pdf.</mixed-citation>
      </ref>
      <ref id="ref15">
        <mixed-citation>[15] M. M. Montessori, Maria Montessori's contribution to the cultivation of the mathematical mind, International Review of Education 7 (1961) 134–141. doi:10.1007/BF01433363.</mixed-citation>
      </ref>
      <ref id="ref16">
        <mixed-citation>[16] A. Kostikov, K. Vlasenko, I. Lovianova, S. Volkov, D. Kovalova, M. Zhuravlov, Assessment of Test Items Quality and Adaptive Testing on the Rasch Model, in: V. Ermolayev, D. Esteban, V. Yakovyna, H. C. Mayr, G. Zholtkevych, M. Nikitchenko, A. Spivakovsky (Eds.), Information and Communication Technologies in Education, Research, and Industrial Applications, Springer International Publishing, Cham, 2022, pp. 252–271. doi:10.1007/978-3-031-20834-8_12.</mixed-citation>
      </ref>
      <ref id="ref17">
        <mixed-citation>[17] L. O. Fadieieva, Bibliometric Analysis of Adaptive Learning Literature from 2011-2019: Identifying Primary Concepts and Keyword Clusters, in: G. Antoniou, V. Ermolayev, V. Kobets, V. Liubchenko, H. C. Mayr, A. Spivakovsky, V. Yakovyna, G. Zholtkevych (Eds.), Information and Communication Technologies in Education, Research, and Industrial Applications, volume 1980 of Communications in Computer and Information Science, Springer Nature Switzerland, Cham, 2023, pp. 215–226. doi:10.1007/978-3-031-48325-7_16.</mixed-citation>
      </ref>
      <ref id="ref18">
        <mixed-citation>[18] T. A. Vakaliuk, S. I. Pochtoviuk, Analysis of tools for the development of augmented reality technologies, in: S. H. Lytvynova, S. O. Semerikov (Eds.), Proceedings of the 4th International Workshop on Augmented Reality in Education (AREdu 2021), Kryvyi Rih, Ukraine, May 11, 2021, volume 2898 of CEUR Workshop Proceedings, CEUR-WS.org, 2021, pp. 119–130. URL: https://ceur-ws.org/Vol-2898/paper06.pdf.</mixed-citation>
      </ref>
      <ref id="ref19">
        <mixed-citation>[19] N. O. Zinonos, E. V. Vihrova, A. V. Pikilnyak, Prospects of Using the Augmented Reality for Training Foreign Students at the Preparatory Departments of Universities in Ukraine, in: A. E. Kiv, V. N. Soloviev (Eds.), Proceedings of the 1st International Workshop on Augmented Reality in Education, Kryvyi Rih, Ukraine, October 2, 2018, volume 2257 of CEUR Workshop Proceedings, CEUR-WS.org, 2018, pp. 87–92. URL: https://ceur-ws.org/Vol-2257/paper10.pdf.</mixed-citation>
      </ref>
      <ref id="ref20">
        <mixed-citation>[20] O. O. Lavrentieva, I. O. Arkhypov, O. P. Krupski, D. O. Velykodnyi, S. V. Filatov, Methodology of using mobile apps with augmented reality in students' vocational preparation process for transport industry, in: O. Y. Burov, A. E. Kiv (Eds.), Proceedings of the 3rd International Workshop on Augmented Reality in Education, Kryvyi Rih, Ukraine, May 13, 2020, volume 2731 of CEUR Workshop Proceedings, CEUR-WS.org, 2020, pp. 143–162. URL: https://ceur-ws.org/Vol-2731/paper07.pdf.</mixed-citation>
      </ref>
      <ref id="ref21">
        <mixed-citation>[21] R. O. Tarasenko, S. M. Amelina, Y. M. Kazhan, O. V. Bondarenko, The use of AR elements in the study of foreign languages at the university, in: O. Y. Burov, A. E. Kiv (Eds.), Proceedings of the 3rd International Workshop on Augmented Reality in Education, Kryvyi Rih, Ukraine, May 13, 2020, volume 2731 of CEUR Workshop Proceedings, CEUR-WS.org, 2020, pp. 129–142. URL: https://ceur-ws.org/Vol-2731/paper06.pdf.</mixed-citation>
      </ref>
      <ref id="ref22">
        <mixed-citation>[22] O. B. Petrovych, A. P. Vinnichuk, V. P. Krupka, I. A. Zelenenka, A. V. Voznyak, The usage of augmented reality technologies in professional training of future teachers of Ukrainian language and literature, in: S. H. Lytvynova, S. O. Semerikov (Eds.), Proceedings of the 4th International Workshop on Augmented Reality in Education (AREdu 2021), Kryvyi Rih, Ukraine, May 11, 2021, volume 2898 of CEUR Workshop Proceedings, CEUR-WS.org, 2021, pp. 315–333. URL: https://ceur-ws.org/Vol-2898/paper17.pdf.</mixed-citation>
      </ref>
      <ref id="ref23">
        <mixed-citation>[23] V. V. Babkin, V. V. Sharavara, V. V. Sharavara, V. V. Bilous, A. V. Voznyak, S. Y. Kharchenko, Using augmented reality in university education for future IT specialists: educational process and student research work, in: S. H. Lytvynova, S. O. Semerikov (Eds.), Proceedings of the 4th International Workshop on Augmented Reality in Education (AREdu 2021), Kryvyi Rih, Ukraine, May 11, 2021, volume 2898 of CEUR Workshop Proceedings, CEUR-WS.org, 2021, pp. 255–268. URL: https://ceur-ws.org/Vol-2898/paper14.pdf.</mixed-citation>
      </ref>
      <ref id="ref24">
        <mixed-citation>[24] S. P. Palamar, G. V. Bielienka, T. O. Ponomarenko, L. V. Kozak, L. L. Nezhyva, A. V. Voznyak, Formation of readiness of future teachers to use augmented reality in the educational process of preschool and primary education, in: S. H. Lytvynova, S. O. Semerikov (Eds.), Proceedings of the 4th International Workshop on Augmented Reality in Education (AREdu 2021), Kryvyi Rih, Ukraine, May 11, 2021, volume 2898 of CEUR Workshop Proceedings, CEUR-WS.org, 2021, pp. 334–350. URL: https://ceur-ws.org/Vol-2898/paper18.pdf.</mixed-citation>
      </ref>
      <ref id="ref25">
        <mixed-citation>[25] T. H. Kramarenko, O. S. Pylypenko, M. V. Moiseienko, Enhancing mathematics education with GeoGebra and augmented reality, in: S. O. Semerikov, A. M. Striuk (Eds.), Proceedings of the 6th International Workshop on Augmented Reality in Education (AREdu 2023), Kryvyi Rih, Ukraine, May 17, 2023, volume 3844 of CEUR Workshop Proceedings, CEUR-WS.org, 2023, pp. 117–126. URL: https://ceur-ws.org/Vol-3844/paper03.pdf.</mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>