

     The Google Lens analyzing quality: an analysis of the
         possibility to use in the educational process

    Viktor B. Shapovalov1[0000-0001-6315-649X], Yevhenii B. Shapovalov1[0000-0003-3732-9486],
         Zhanna I. Bilyk1[0000-0002-2092-5241], Anna P. Megalinska2[0000-0001-8662-8584]
                         and Ivan O. Muzyka3[0000-0002-9202-2973]
                   1 National Center “Junior Academy of Sciences of Ukraine”,

                          38/44, Dehtiarivska Str., Kyiv, 04119, Ukraine
                                gws0731512025@gmail.com
      2 National Pedagogical Dragomanov University, 9, Pyrogova Str., Kyiv, 01601, Ukraine

                                  anna.megalin@ukr.net
    3 Kryvyi Rih National University, 11, Vitaliy Matusevych Str., Kryvyi Rih, 50027, Ukraine

                                   musicvano@gmail.com



         Abstract. Biology is a fairly complicated school subject because it requires
         knowledge of biodiversity. Google Lens is a mobile application that allows
         students to recognize the genus and species of the plant they are looking at. This
         article is devoted to analyzing how efficiently Google Lens works with botanical
         objects. To perform the analysis, botanical objects were classified by plant type
         (grass, tree, bush) and by the part of the plant (stem, flower, fruit) represented in
         the analyzed photo. It was shown that Google Lens correctly identified plant
         species in 92.6 % of cases. This is a rather high result, which allows
         recommending the use of this application in teaching. The greatest accuracy of
         Google Lens was observed when analyzing trees and plant stems. The worst
         accuracy was observed when recognizing the fruits and stems of bushes.
         However, the accuracy was still high, and Google Lens can support research
         even in those cases. Google Lens was not able to recognize locally endemic
         Ukrainian flora. It was shown that the recognition efficiency depends more on
         the resolution of the photo than on the physical characteristics of the camera with
         which it was taken. The article shows that using Google Lens in the educational
         process is a simple way to bring the principles of STEM-education and the “New
         Ukrainian school” into classes.

         Keywords: Google Lens, plant recognition, New Ukrainian school, STEM-
         education, augmented reality, digital education.


1        Introduction

The school biology course is quite complicated because it includes a huge number of
abstract concepts and terms [4]. In addition, it also involves the study of species
diversity [7]. Ukraine has a rich biota with more than 25,000
species of plants (5,100 vascular plants, more than 15,000 fungi and slime moulds,
more than 1,000 lichens, almost 800 mosses and about 4,000 algae) and 45,000 species
of animals (more than 35,000 insects, almost 3,500 other arthropods, 1,800 protozoa,
1,600 roundworms, 1,280 flatworms and 440 annelids among more than 44,000
invertebrates, and about 200 fish and cyclostomes, 17 amphibians, 21 reptiles, about
400 birds and 108 mammals among the vertebrates), and it is characterized by a certain
level of endemism. A school teacher cannot know all these species perfectly and may
face the situation where students bring a photo of a plant or animal and want to
determine its species. One of the ways to solve this problem is to use Google Lens.
Leaving the question unanswered decreases students’ motivation, which matters even
more than the missing answer itself.
   According to the New Ukrainian school concept, students need to develop
informational and digital competencies, which involve the confident and meaningful
use of information technology to receive and transmit information [3]. Google Lens
allows students to develop, at their own pace and in a convenient mode during field or
classroom classes, both informational competence and competence in science and
technology.


2      Literature review and problem statement

2.1    General situation on the necessity of Google Lens in curricula
The world is becoming digital and technological, which directly affects the learning
process and creates challenges for education. The classical educational environment
is stable, based on pedagogical traditions, and aimed at the formation of hard skills. In
Ukraine, the classical educational environment is represented by curricula in various
subjects that are mandatory for all teachers, a list of textbooks recommended for use
during the educational process, and a number of legislative acts of the Ministry of
Education and Science. However, considering the New Ukrainian school concept, the
educational community faces the challenge of implementing virtual instruments
(learning environments) [2; 3].
   Unlike the classical educational environment, the virtual learning environment
changes constantly, in step with scientific and technical progress, and is aimed at the
development of creativity [12]. Virtual learning environments include digital programs
and websites. Most of these programs help to analyze experimental data and process
them mathematically. Thanks to them, a learning-through-research model can be
applied successfully: a student analyzes results obtained by himself or by others and,
by working with experimental data, discovers the basic laws of nature as if for the first
time. The most advanced of these tools include elements of virtual and augmented
reality, which increase students’ motivation [6; 19]. Previously, we substantiated the
need to implement the Google Lens approach in the educational process [16]. However,
the efficiency of Google Lens was not demonstrated there. Therefore, this work aims to
analyze the possibility of using Google Lens in educational institutions to support
STEM research projects in botany. To achieve this aim, the following tasks were set:


1. To evaluate the general quality of Google Lens recognition of plants.
2. To identify the main factors that affect recognition in real-life research and to give
   practical advice on the process.
3. To modify the pedagogical method of plant analysis based on the obtained
   knowledge.
4. To summarize and analyze the results and evaluate the possibility of implementing
   Google Lens in school botany research.
Thus, the object of the research is the pedagogical method of plant identification. The
mechanism of plant identification by Google Lens is the subject of the study.


2.2    Description of the Google Lens and its role in education
The mobile phone is nowadays a powerful scientific instrument [13]. However, its
potential is still not fully understood or exploited. One of the companies creating digital
software that can be used in education is Google, with instruments such as Google
Lens. Google Lens is an image recognition technology based on neural networks and
developed by Google. Having determined the species of an animal or plant, one can
further study its biological properties. In our opinion, the main positive aspects of using
Google Lens are:
1. Students can use their personal phones at any stage of the research.
2. It supports interaction with any objects, including biological ones.
3. Any object can be examined at any time, including during expeditionary research.
4. It creates interaction between the real and virtual worlds.
Google Lens is integrated into both Google Photos and the Google camera application
and can be used on any Android device with Android 4.4 or higher, or on iOS. Access
to the Google Lens instrument is presented in Figure 1.
   Google Lens can be used in different fields of education, such as biology, mineralogy,
architecture and history, and marketing, to obtain additional information about an
object and to increase the motivation of the students (Table 1).

                    Table 1. Using Google Lens in different fields of education
 Field of science           Way of using
 Biology                    Google Lens can currently recognize biological objects (animals,
                            plants, etc.)
 Mineralogy                 Google Lens could use the color and structure of minerals to analyze
                            them (not available now, but we expect this to be provided in the future)
 Architecture and history   Analyzing buildings and monuments
 Marketing                  Analyzing and searching for different real-life products such as clothes




                           Fig. 1. Google Lens instrument access


3      Materials and methods

3.1    Model experiment
To conduct the model experiment and compare the results with the determination keys
for each plant, 500 photos were taken from the online classifier “The list of plants of the
Dneprovskiy district of Kiev” (Fig. 2). The online classifier contains a picture of each
plant species and its determined name. Because the photos differ in quality, each photo
was characterized by the method described in Section 3.2, and the data were collected
by the method described in Section 3.3.




                Fig. 2. The list of plants of the Dneprovskiy district of Kiev


3.2     The general method of photo analysis
Photo quality is an important factor for Google Lens. Therefore, it is necessary to
classify each photo by the main quality components: composition, resolution and digital
noise. The main photo quality criteria are presented in Table 2.

                             Table 2. Main photo quality criteria
       Quality    Analyzed object’s resolution, Mpx   Gray noise   Color noise    Analyzed object
        Bad                     <0.3                     High         High       Not clearly visible
       Middle                  0.3–3                    Middle       Middle        Clearly visible
        Good                     >3                       Low          Low        Perfectly visible
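
   To make the photo classification reproducible, the criteria of Table 2 can be encoded
directly. The following Python sketch is illustrative only: the function name, the textual
noise and visibility labels, and the rule that any single “bad” criterion downgrades the
whole photo are our assumptions rather than part of the original workflow.

# A sketch of the photo quality classification from Table 2.
# The function name, the label values and the way the three criteria are
# combined into a single mark are illustrative assumptions.
def classify_photo_quality(resolution_mpx: float, noise: str, visibility: str) -> str:
    """Return 'bad', 'middle' or 'good' according to the Table 2 criteria."""
    if resolution_mpx < 0.3 or noise == "high" or visibility == "not clearly visible":
        return "bad"
    if resolution_mpx <= 3 or noise == "middle" or visibility == "clearly visible":
        return "middle"
    return "good"  # >3 Mpx, low noise, object perfectly visible

print(classify_photo_quality(5.2, "low", "perfectly visible"))   # -> good
print(classify_photo_quality(0.2, "middle", "clearly visible"))  # -> bad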


3.3     Data collection and analysis
To collect the data, we developed a database with a dedicated front end and back end.
Each photo was classified by image quality using the method described in Section 3.2
and by its characteristics, namely the plant type (tree, bush, grass) and the part of the
plant shown (flower, leaf, stem, fruit). The mark given to the analysis result was entered
as well. The input interface is presented in Fig. 3.




                                  Fig. 3. The input interface

The output interface looks like a table, which provides visualization of the progress and
dynamics of the research. The output interface is presented in Fig. 4.
   Google Lens proposes several candidate results to the user. Therefore, each Google
Lens result was scored with 0, 1, 2 or 3 points. Sometimes the photo had to be cropped;
in those cases one point was deducted. The keys for evaluating Google Lens results are
presented in Table 3.

                       Table 3. The keys of Google Lens results evaluation
Points                                       Description
  0    The object was not detected at all
  1    The genus of the object was recognized and presented in the top 6 results, but the species
       was not correctly recognized
  2    (a) The genus of the object was recognized and presented in the top 3 results, but the
       species was not correctly recognized; (b) the genus and species of the object were
       recognized and presented in the top 6 results
  3    The genus and species of the object were recognized and presented in the top 3 results
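
   The scoring key of Table 3, together with the one-point deduction for cropped photos,
can be expressed as a small function. The Python sketch below is illustrative only; its
name and signature are our assumptions, since the marks in this study were assigned
manually.

# A sketch of the Table 3 scoring key; the name and signature are assumptions.
from typing import Optional

def score_result(genus_rank: Optional[int], species_correct: bool, cropped: bool = False) -> int:
    """genus_rank is the 1-based position of the correct genus among the
    Google Lens suggestions, or None if it does not appear at all."""
    if genus_rank is None or genus_rank > 6:
        points = 0                                # object not detected
    elif genus_rank <= 3:
        points = 3 if species_correct else 2      # rows 3 and 2(a)
    else:                                         # genus in positions 4-6
        points = 2 if species_correct else 1      # rows 2(b) and 1
    if cropped and points > 0:
        points -= 1                               # one point deducted for cropping
    return points

print(score_result(genus_rank=2, species_correct=True))                 # -> 3
print(score_result(genus_rank=5, species_correct=False, cropped=True))  # -> 0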

   The results were collected in the database. To analyze them, requests to the database
were prepared and executed; the requests were designed to address the aims of the work.
MS Excel 365 was used to process the results.
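   The percentage distributions reported in Section 4 can be reproduced from such
records with a simple aggregation. Below is a minimal Python sketch, assuming the
marks are exported from the database as a CSV file with “quality”, “plant_type”,
“plant_part” and “points” columns; the file layout and column names are our
assumptions (the authors used MS Excel 365 for this step).

# A minimal aggregation sketch; the CSV layout and column names are assumptions.
import csv
from collections import Counter, defaultdict

def score_distribution(path: str, group_by: str) -> dict:
    """Share of each mark (0-3), in percent, for every value of `group_by`."""
    counts = defaultdict(Counter)
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            counts[row[group_by]][int(row["points"])] += 1
    return {group: {mark: 100 * n / sum(c.values()) for mark, n in sorted(c.items())}
            for group, c in counts.items()}

# Example: the share of totally correct (3-point) results per plant type
# dist = score_distribution("lens_marks.csv", group_by="plant_type")
# print({plant_type: shares.get(3, 0.0) for plant_type, shares in dist.items()})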




                                Fig. 4. The output interface


4      Results and discussion

4.1    The general accuracy of the Google Lens
The general inaccuracy of Google Lens in the model experiment was 8.4 %. This result
supports the use of Google Lens in the educational process: it can help pupils conduct
their own research and, in 92.6 % of cases, it can help them find the right answer. It is
worth noting that this accuracy is much higher than the accuracy of a teacher’s answers.
   In 72.8 % of cases, Google Lens gave a totally correct answer (the object being sought
appeared in the top 3 results), which is a high share. In 17 % of cases it showed the
correct result in the top 6, and in only 1.8 % of cases the result was less accurate (the
genus was presented in the top 6 results, but the species was not correctly recognized).
The general results are presented in Fig. 5.


4.2    Analyzing the importance of the criteria
Photo quality. As expected, the higher the quality of the photo, the better the analysis
results. However, even low-quality photos had a high chance of being analyzed correctly.
Only 14.3 % of low-quality photos were not recognized, compared to 4.2 % of incorrect
results for high-quality photos. Google Lens was totally accurate in 80.83, 72.7 and
62.6 % of cases with high, medium and low quality photos, respectively. The dependency
of Google Lens analysis accuracy on photo quality is presented in Figure 6.




      Fig. 5. General Google Lens accuracy (probability, %, of each analysis quality mark, 0–3 points)



      Fig. 6. The dependency of Google Lens analysis accuracy on photo quality (probability, %, of
                       each mark for high, medium and low quality photos)


Therefore, using a better camera and taking a better photo can increase the analysis
quality; however, the Google Lens algorithms work well enough with low-quality
photos, which means that the instrument can be used on any device, even one with a
poor camera that any student can afford.
   Parts of a plant. Google Lens analyzes the flowers of plants better than other parts:
flower analysis was characterized by an inaccuracy level of 7.1 %. The worst results
were observed for fruit analysis, which may be related to the similarity of some fruits to
each other; it was characterized by an inaccuracy level of 16.2 %. However, the shares
of totally correct results were similar for stems, leaves and fruits: 70.9, 70.5 and 70.3 %,
respectively. The share of totally correct results was significantly higher for flowers, at
76.0 %. Therefore, to obtain better results, the flowers of plants should be analyzed
where possible. The dependency of Google Lens analysis accuracy on the analyzed part
of the plant is presented in Figure 7.



 Fig. 7. The dependency of Google Lens analysis accuracy on the analyzed part of the plant
              (probability, %, of each mark for flowers, stems, leaves and fruits)

   Plant type. Google Lens showed similar results for grass and trees, both for totally
accurate and for inaccurate answers: 74.4 and 7.8 % for grass, and 76.4 and 8.3 % for
trees, respectively. The results for bushes were noticeably worse: the inaccuracy was
10.4 % and the share of totally correct results was 64.6 %. The dependency of Google
Lens analysis accuracy on the plant type is presented in Figure 8.



  Fig. 8. The dependency of Google Lens analysis accuracy on the plant type (probability, %, of
                            each mark for trees, bushes and grass)


4.3            Discussion
General specifics of the analysis.
   It is worth noting that some Ukrainian plant species were not recognized at all. This
is caused by the integration of Google Lens with internet services that contain no
information about these specific plants. Thus, the results should be even better in regions
for which more information about local plants is available in English.
   High results were obtained when analyzing the flowers of grasses (for example,
Taraxacum officinale), where the share of inaccurately analyzed samples was 0 % and
the share of totally successfully analyzed samples was 93 %.
   As expected from the results above, the bush analysis was the weakest. The worst
results were observed for the fruits and stems of bushes, with an inaccuracy level of
22.2 %; the lowest share of totally correct results was observed for the stems of bushes.
For all other samples, the results were close to the average. This means that using Google
Lens for the fruits and stems of bushes does not guarantee perfect results. However, the
accuracy is still relatively high, and Google Lens can be used to obtain information even
in those cases. The general results of the Google Lens analysis are presented in Fig. 9.
   Google Lens does not analyze the environment of the object and can therefore make
mistakes; for example, this was observed when analyzing photos of water mint.
   The lower quality of fruit analysis may be explained by an algorithm that first analyzes
the shape of the fruit and only then looks at its specific features. For example, guelder-
rose was identified as grapefruit. In some photos the colors differed from the real-life
samples, and in those cases Google Lens also made mistakes. This was observed for
Gladiolus, where the colors were less saturated than in real life, and for Heliopsis
helianthoides, where the samples were more saturated. In those cases Google Lens made
mistakes in the species, not in the genus.




                                  Fig. 9. General results

Therefore, it seems that shape mainly affects genus determination, while the color and
finer details of the plant parts affect species determination within the genus.
   Google Lens looks for the most eye-catching object in the frame, and there were cases
where the plant part was less eye-catching than other objects, so Google Lens made
mistakes. This effect influences the result even more than other aspects of photo quality.
It means that it is not the camera or its lens that plays the most important role in photo
quality, but the photography skills of the user. Cropping the photo may be used to reduce
this effect; at the same time, this will stimulate students to improve their photography
skills.


Google Lens in STEM-education.
   Google Lens is a powerful STEM instrument that can improve the quantity and quality
of knowledge and increase the motivation of visually oriented students [5; 6]. As noted
before, it has great potential for implementation in different educational fields and can
support the transdisciplinarity of the educational process through its integration with
Wikipedia (by default) and other resources (via picture search).
   The teacher can achieve even better results by offering a “find the mistakes” challenge
to excellent students, in which students try to find mistakes in the Google Lens analysis.
   It is worth noting that STEM-education is one of the priorities of the Ukrainian
secondary school [1; 15; 17; 18], together with the implementation of the New Ukrainian
school principles, and both can easily be supported by using Google Lens in classes.
   Nowadays every teacher in Ukraine can easily use methods based on Google Lens
through the online guides located on the stemua.science open web portal and can share
their own methods based on it [14]. In addition, STEM principles are nowadays being
introduced into university courses due to their efficiency [8; 9; 10; 11; 15].


5      Conclusions

1. Google Lens shows high analysis results, which gives reason to recommend its
   implementation in the educational process.
2. It is better to plan classes in gardens and parks, because Google Lens shows better
   results for the analysis of grass and trees.
3. Based on the results of this article, the methods located on stemua.science have been
   updated.
4. Using Google Lens in the educational process is a simple way to include the principles
   of STEM-education and the “New Ukrainian school” in classes.


References
 1. Bilyk, Z., Shapovalov, Ye., Shapovalov, V., Atamas, A.: Vykorystannia ontolohichnykh
    resursiv yedynoho merezhetsentrychnoho osvitnoho informatsiinoho seredovyshcha dlia
    provedennia STEM/STEAM-zaniat (Use of Ontological Resources of the Universal
    Network Information Educational Media for STEM/STEAM-lessons). Osvita ta rozvytok
    obdarovanoi osobystosti 1(72), 30–36 (2019). doi:10.32405/2309-3935-2019-1(72)-30-36
 2. Budnyk, O.: Theoretical Principles of Using Steam-Technologies in the Preparation of the
    Teacher of the New Ukrainian School. Journal of Vasyl Stefanyk Precarpathian National
    University 5(1), 23–30 (2018). doi:10.15330/jpnu.5.1.23-30
 3. Elkin, O., Hrynevych, L., Kalashnikova, S., Khobzey, P., Kobernyk, I., Kovtunets, V.,
    Makarenko, O., Malakhova, O., Nanayeva, T., Shiyan, R., Usatenko, H.: The New
     Ukrainian School: Conceptual principles of secondary school reform. Ministry of Education
     and Science of Ukraine, Kyiv (2016)
 4. Gilbert, J.K.: Models and modelling: Routes to more authentic science education.
    International Journal of Science and Mathematics Education 2(2), 115–130 (2004).
    doi:10.1007/s10763-004-3186-4
 5. Keller, J.M.: ARCS Model of Motivation. In: Seel, N.M. (ed.) Encyclopedia of the Sciences
    of Learning, pp. 304–305. Springer, Boston (2012). doi:10.1007/978-1-4419-1428-6_217
 6. Khan, T., Johnston, K., Ophoff, J.: The Impact of an Augmented Reality Application on
    Learning Motivation of Students. Advances in Human-Computer Interaction 7208494, 1–
    14 (2019). doi:10.1155/2019/7208494
 7. Ministry of education and science of Ukraine: Biolohiia, 6–9 klasy: navchalna prohrama
    dlia zahalnoosvitnikh navchalnykh zakladiv (Biology, Grades 6-9: curriculum for
    secondary schools). https://mon.gov.ua/storage/app/media/zagalna%20serednya/programy-
    5-9-klas/onovlennya-12-2017/15.biologiya-6-9.docx (2017)
 8. Modlo, Ye.O., Semerikov, S.O., Bondarevskyi, S.L., Tolmachev, S.T., Markova, O.M.,
    Nechypurenko, P.P.: Methods of using mobile Internet devices in the formation of the
    general scientific component of bachelor in electromechanics competency in modeling of
    technical objects. In: Kiv, A.E., Shyshkina, M.P. (eds.) Proceedings of the 2nd International
    Workshop on Augmented Reality in Education (AREdu 2019), Kryvyi Rih, Ukraine, March
    22, 2019, CEUR-WS.org, online (2020, in press)


 9. Modlo, Ye.O., Semerikov, S.O., Nechypurenko, P.P., Bondarevskyi, S.L., Bondarevska,
    O.M., Tolmachev, S.T.: The use of mobile Internet devices in the formation of ICT
    component of bachelors in electromechanics competency in modeling of technical objects.
    In: Kiv, A.E., Soloviev, V.N. (eds.) Proceedings of the 6th Workshop on Cloud Technologies
    in Education (CTE 2018), Kryvyi Rih, Ukraine, December 21, 2018. CEUR Workshop
    Proceedings 2433, 413–428. http://ceur-ws.org/Vol-2433/paper28.pdf (2019). Accessed 10
    Sep 2019
10. Modlo, Ye.O., Semerikov, S.O., Shmeltzer, E.O.: Modernization of Professional Training
    of Electromechanics Bachelors: ICT-based Competence Approach. In: Kiv, A.E., Soloviev,
    V.N. (eds.) Proceedings of the 1st International Workshop on Augmented Reality in
    Education (AREdu 2018), Kryvyi Rih, Ukraine, October 2, 2018. CEUR Workshop
    Proceedings 2257, 148–172. http://ceur-ws.org/Vol-2257/paper15.pdf (2018). Accessed 21
    Mar 2019
11. Nechypurenko, P.P., Stoliarenko, V.G., Starova, T.V., Selivanova, T.V., Markova, O.M.,
    Modlo, Ye.O., Shmeltser, E.O.: Development and implementation of educational resources
    in chemistry with elements of augmented reality. In: Kiv, A.E., Shyshkina, M.P. (eds.)
    Proceedings of the 2nd International Workshop on Augmented Reality in Education
    (AREdu 2019), Kryvyi Rih, Ukraine, March 22, 2019, CEUR-WS.org, online (2020, in
    press)
12. Noskova, T., Pavlova, T., Yakovleva, O., Morze, N., Drlík, M.: Information environment of
    blended learning: aspects of teaching and quality. In: Smyrnova-Trybulska, E. (ed.) E-
    learning and Intercultural Competences Development in Different Countries, p. 73–94.
    Studio-Noa for University of Silesia, Katowice-Cieszyn (2014)
13. Quesada-González, D., Merkoçi, A.: Mobile phone-based biosensing: An emerging
    “diagnostic and communication” technology. Biosensors and Bioelectronics 92, 549–562
    (2017). doi:10.1016/j.bios.2016.10.062
14. Shapovalov, V.B., Atamas, A.I., Bilyk, Zh.I., Shapovalov, Ye.B., Uchitel, A.D.: Structuring
    Augmented Reality Information on the stemua.science. In: Kiv, A.E., Soloviev, V.N. (eds.)
    Proceedings of the 1st International Workshop on Augmented Reality in Education (AREdu
    2018), Kryvyi Rih, Ukraine, October 2, 2018. CEUR Workshop Proceedings 2257, 75–86.
    http://ceur-ws.org/Vol-2257/paper09.pdf (2018). Accessed 30 Nov 2018
15. Shapovalov, V.B., Shapovalov, Ye.B., Bilyk, Zh.I., Atamas, A.I., Tarasenko, R.A., Tron,
    V.V.: Centralized information web-oriented educational environment of Ukraine. In: Kiv,
    A.E., Soloviev, V.N. (eds.) Proceedings of the 6th Workshop on Cloud Technologies in
    Education (CTE 2018), Kryvyi Rih, Ukraine, December 21, 2018. CEUR Workshop
    Proceedings 2433, 246–255. http://ceur-ws.org/Vol-2433/paper15.pdf (2019). Accessed 10
    Sep 2019
16. Shapovalov, Ye.B., Bilyk, Zh.I., Atamas, A.I., Shapovalov, V.B., Uchitel, A.D.: The
    Potential of Using Google Expeditions and Google Lens Tools under STEM-education in
    Ukraine. In: Kiv, A.E., Soloviev, V.N. (eds.) Proceedings of the 1st International Workshop
    on Augmented Reality in Education (AREdu 2018), Kryvyi Rih, Ukraine, October 2, 2018.
    CEUR Workshop Proceedings 2257, 66–74. http://ceur-ws.org/Vol-2257/paper08.pdf
    (2018). Accessed 30 Nov 2018
17. Shapovalov, Ye.B., Shapovalov, V.B., Zaselskiy, V.I.: TODOS as digital science-support
    environment to provide STEM-education. In: Kiv, A.E., Soloviev, V.N. (eds.) Proceedings
    of the 6th Workshop on Cloud Technologies in Education (CTE 2018), Kryvyi Rih, Ukraine,
    December 21, 2018. CEUR Workshop Proceedings 2433, 232–245. http://ceur-ws.org/Vol-
    2433/paper14.pdf (2019). Accessed 10 Sep 2019


18. Shyshkina, M.P.: The Problems of Personnel Training for STEM Education in the Modern
    Innovative Learning and Research Environment. In: Kiv, A.E., Soloviev, V.N. (eds.)
    Proceedings of the 1st International Workshop on Augmented Reality in Education (AREdu
    2018), Kryvyi Rih, Ukraine, October 2, 2018. CEUR Workshop Proceedings 2257, 61–65.
    http://ceur-ws.org/Vol-2257/paper07.pdf (2018). Accessed 30 Nov 2018
19. Syrovatskyi, O.V., Semerikov, S.O., Modlo, Ye.O., Yechkalo, Yu.V., Zelinska, S.O.:
    Augmented reality software design for educational purposes. In: Kiv, A.E., Semerikov,
    S.O., Soloviev, V.N., Striuk, A.M. (eds.) Proceedings of the 1st Student Workshop on
    Computer Science & Software Engineering (CS&SE@SW 2018), Kryvyi Rih, Ukraine,
    November 30, 2018. CEUR Workshop Proceedings 2292, 193–225. http://ceur-ws.org/Vol-
    2292/paper20.pdf (2018). Accessed 21 Mar 2019