                  Ukrainian Students' Digital Competencies: Various
                Aspects of Formation and Impact on Students' Learning
                                    Achievements

                      Mariia Mazorchuk1 [0000-0002-4416-8361], Olena Kuzminska2 [0000-0002-8849-9648],
                      Lucia Tramonte3 [0000-0002-3914-4306], Fernando Cartwright4 [0000-0002-7959-941X],
                                        Tetyana Vakulenko1 [0000-0002-7403-1075]
                            1 Ukrainian Center for Educational Quality Assessment, Kyiv, Ukraine,
                              mazorchuk.mary@gmail.com, vakulenko_tetyana@ukr.net
                            2 National University of Life and Environmental Sciences of Ukraine, Kyiv, Ukraine,
                              o.kuzminska@nubip.edu.ua
                            3 University of New Brunswick, Fredericton, New Brunswick, Canada,
                              lucia@unb.ca
                            4 Principal, Polymetrika, Inc., Ottawa, Ontario, Canada,
                              fernando.cartwright@polymetrika.com



                      Abstract. In different countries and economies, we can observe differences in
                      students' learning achievements depending on their digital competencies. International
                      studies such as PISA (Programme for International Student Assessment) and ICILS
                      (International Computer and Information Literacy Study) make it possible to evaluate
                      associative relationships and draw conclusions about the use of digital technologies in
                      the learning environment. In 2018 Ukraine participated in PISA for the first time, in the
                      Paper-Based Assessment (PBA) format, and did not participate in ICILS. At present,
                      there are no national or international studies that can answer the question of Ukrainian
                      students' readiness for Computer-Based Assessment (CBA); we only have some results
                      from PISA 2018. This research presents an analysis of the PISA 2018 database and of
                      some other studies. We examined how digital technologies influence students'
                      achievement in the main literacy domains; in other words, we wanted to evaluate how
                      the digital environment affects students' readiness to act effectively in a changing
                      digital world. The results of Ukraine and other countries are useful for shaping key
                      policies for Ukraine concerning the development of Ukrainian students' digital
                      competencies.

                      Keywords: Programme for International Student Assessment (PISA), Student
                      Achievement, Information and Communication Technologies (ICT), Digital
                      Competence.


              1       Introduction

Digital competence is becoming increasingly relevant amid the digital transformation of
the current decade [1].




Copyright © 2020 for this paper by its authors. Use permitted under Creative Commons License Attribution 4.0 International (CC BY 4.0).
The EU recognizes digital competence as one of the eight key competencies required for
any person to live a full life [2]. Thus, building students' digital competence becomes a
vital task for educational institutions. Ubiquitous digitization raises the requirements
imposed on specialists' digital competence. While computers and the internet play a
central role in our personal and professional lives, students who do not gain the basic
digital skills of reading, writing, and navigating the digital environment cannot fully
participate in economic, social and cultural life [3, p. 15].
   Researchers in the field of education have examined digital technologies and
information and communication technologies (ICT) in Ukraine, studying different aspects
of ICT influence on academic progress, motivation, and readiness for digital learning
in both higher education [4] and secondary schools [5]. However, the previous research
does not evaluate the digital competency level of different groups of learners and their
readiness to advance and make progress within the digital environment.
   International research projects such as PISA (Programme for International Student
Assessment) and ICILS (International Computer and Information Literacy Study), which
directly assess both digital and general competencies, allow us to evaluate associative
relationships and draw conclusions about the use of digital technologies in the learning
environment.
   The cross-national nature of the PISA and ICILS data is the basis of a growing body
of analyses that replicate and extend models describing these relationships in different
contexts. By comparing results across countries and economies, researchers can develop
generalizable models that can guide future educational policy. Based on PISA 2009 data
[6] from 17 countries and economies, J. Naumann built a model of online reading
engagement that outlines the connection between digital competence and digital reading
[7]. That model served to develop practical recommendations for improving the process
of education.
   Unfortunately, research in Ukraine does not adequately evaluate the digital
environment and the quality of the digital competence gained by students. At the country
level, we still do not have the tools to monitor and evaluate digital skills and
competence. At the legislative level, the basic terms such as “digital skills” and
“digital competence” are not defined [8, p. 10]. As a result, previous research conducted
by Ukrainian contributors [9] does not describe the whole picture of students' digital
competence and does not allow comparison with other countries.
   Ukraine participated in PISA for the first time in 2018. However, the assessment in
Ukraine was held in paper format. Thus, we could not evaluate the students' readiness
for the digital world, as they did not have a chance to demonstrate their digital skills
(in particular, digital reading).
   Here, we use the analysis of international data to generalize about the influence of
ICT on the progress of students' learning in Ukraine.
   This research aims to define the factors related to the students' digital environment
and to assess their influence on the reading, mathematics, and natural science literacy
of Ukrainian students according to the PISA 2018 results.
2      Theoretical Background

PISA (https://www.oecd.org/pisa/) is the Programme for International Student Assessment,
which includes over 90 countries and economies. The program takes place every three
years. In each three-year cycle, PISA uses direct assessments to measure students'
literacy in reading, mathematics, and natural sciences. In selected cycles, the program
also assesses additional domains such as global competence and financial literacy. The
main goal of PISA is not to rate the students' level of knowledge, but to assess their
readiness to live in the digital world and solve the practical tasks they may face on
their career path. In most countries, the assessment occurs in digital format.
   In addition to the direct assessments of competencies, PISA also uses questionnaires
to collect information about the students' contextual characteristics. Combining
information from the tests and the questionnaires enables analysis of how different
contextual factors correlate with student progress.
   In PISA 2012, 29 OECD countries and 13 partner countries and economies chose to
distribute the optional ICT familiarity component of the student questionnaire. PISA
2012 research [3] assessed the effectiveness of ICT applicability in school and at home
and its influence on assessment results on the main literacy frameworks.
   The results of that research do not indicate any direct correlation between the level
of financial investment in digital technologies and the quality of learning outcomes.
Rather, students' achievements depend more on how often and for what purpose digital
equipment is used during the education process, how motivated students are to use
computers for solving their tasks, and whether teachers are capable of teaching
effectively with computers and digital technologies.
   Thematic studies [10] and local research [11] that rely on PISA analytical reports
[12] indicate that providing students and teachers with computers is not enough to
guarantee digital competence and better progress. This finding is also seen in the
results of the international ICILS 2018 study conducted by the IEA [13].
   This paper examines the Ukrainian students’ readiness to live in the digital environ-
ment by assessing the influence of several factors that characterize digital competence
on students’ performance according to PISA 2018 results. Although the survey did not
contain direct questions to students about leveraging ICT when studying, other data
provide a snapshot of the digital environment in Ukrainian schools.
   Within this work, we set the following tasks:
1. Define the conditions under which Ukrainian students' digital competence is formed:
   access to the Internet and availability of computers at school and at home, digital
   learning at school, and the extent of Internet use.
2. Define the influence of digital competence on Ukrainian students' performance in the
   main subjects and, in particular, estimate the contribution of school education.
3      Methods and Study Materials

This work presents our own analysis of PISA materials and also draws on OECD analytical
reports and the PISA datasets in SPSS format
(https://www.oecd.org/pisa/data/2018database/). To analyze the Ukrainian students'
results, we utilized data from the PISA 2018 study [14]. These data include computed
scale scores for the different skill assessments, responses of students and school
administrators to individual questions, and computed index scores based on multiple
questionnaire responses. Computed index scores are standardized to have a mean of zero
and a variance of one across OECD countries, and the scores are usually approximately
normally distributed in the range from -3.5 to 3.5 [15, p. 22].
    The index of computer availability at home (ICTRES) is computed from items ST011 and
ST012 [16], [15, p. 305]. If the ICTRES index is greater than zero, the student's
availability of computers, educational software, and Internet access is higher than the
OECD average. If the ICTRES index is less than zero, the student has fewer digital
resources than the average student in OECD countries.
    The Index of Economic, Social and Cultural Status (ESCS) is calculated using a
variety of student background variables that describe the relative economic, social and
cultural advantage of different students’ families. This index is historically a strong
predictor of student outcomes. If the ESCS index is less than 0, the students have less
advantaged family backgrounds [15, p. 339].
    The index of computer availability in schools (RATCMP1) is calculated from item
SC004 [16]. This index describes the number of available computers per student in a
school. If the index is less than 1, there are more students than computers; if it is
greater than 1, there are more computers than students.
    The main statistics and the strength of statistical relationships were calculated
based on the test results in the main subjects, the student questionnaires [16], and the
questionnaires of school principals [17]. We used frequency analysis, calculation of mean
values, estimation of standard errors, and the methods of correlation and regression
analysis recommended for processing PISA results [15].
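    For reference, statistics computed from the M = 10 plausible values provided in the
database are combined with the standard Rubin-type rule described in [15] and [21], where
the sampling variance U is obtained with the 80 balanced repeated replication weights:

$$\bar{\theta}=\frac{1}{M}\sum_{m=1}^{M}\hat{\theta}_m,\qquad
B=\frac{1}{M-1}\sum_{m=1}^{M}\bigl(\hat{\theta}_m-\bar{\theta}\bigr)^2,\qquad
V=U+\Bigl(1+\frac{1}{M}\Bigr)B,$$

where $\hat{\theta}_m$ is the estimate based on the m-th plausible value, B is the
imputation variance, and the standard error of the combined estimate is $\sqrt{V}$.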
    For all calculations, we used the R statistical computing environment, including the
intsvy package for processing international assessment data [18], and followed the
procedures described in [15]. To analyze the influence of the school system on students'
ability to gain the main digital competences, we applied hierarchical (multilevel)
regression analysis in R with the lme4 package [19]. We also used MS Excel for some
calculations and for the presentation of the results.
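    As an illustration of this workflow (a minimal sketch rather than the exact script
used in this study), the code below reads the PISA 2018 student file, selects the
Ukrainian sample, and fits a two-level model of reading scores on ESCS with lme4. The
file name is illustrative, the haven package is assumed for reading the SPSS files, and,
for brevity, only the first plausible value is used, while the student weights and the
replicate-weight variance estimation are omitted.

  library(haven)   # assumed here for reading the SPSS (.sav) data files
  library(dplyr)
  library(lme4)    # mixed (hierarchical) regression models [19]

  stu <- read_sav("pisa2018_student.sav")   # illustrative file name
  ukr <- filter(stu, CNT == "UKR")          # Ukrainian sample (country code UKR)

  # Two-level model: reading score (first plausible value, PV1READ) regressed on
  # socioeconomic status (ESCS), with school-specific intercepts and slopes.
  m_read <- lmer(PV1READ ~ ESCS + (1 + ESCS | CNTSCHID), data = ukr)
  summary(m_read)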


4      Main Findings

4.1    The Digital Environment of the Ukrainian Students
According to the PISA 2012 results [3], the necessary (albeit insufficient) conditions
for gaining skills and knowledge in the digital environment are the availability of
digital equipment and Internet access. These factors largely depend on the students'
socioeconomic status.
    According to PISA 2018, 89.2% of Ukrainian students reported they have computers
at their homes to do homework, 58.6% of students said they own educational software,
and 97.7% of students have access to the Internet.
    The level of digital resource availability in Ukraine is lower [20] than in the OECD
countries and other countries that perform highly on literacy outcomes. Figure 1 compares
Ukraine with neighboring countries and the OECD average.




        Fig. 1. The ICTRES index values for different countries (Source: Own work)

In Ukraine, almost 70% of principals of schools attended by 15-year-old students reported
that they lack digital equipment with Internet access, that Internet speed is poor, and
that their software is outdated. Despite the lack of physical resources, principals
reported adequate access to effective professional resources that teachers can use to
learn how to work with digital equipment, as well as access to useful educational online
platforms [20, p. 167].
   Internationally, the countries with higher academic performance rates are better
equipped with computers and other digital tools. Comparing the number of computers per
student (RATCMP1), we can see that in Ukraine only about 6 computers are available for
every 10 students (Fig. 2). The number of computers per student in Estonia and Slovakia,
as well as the OECD average, is higher.
 Fig. 2. Index of computer availability per student (RATCMP1) in different countries (Source:
                                           Own work)

The availability of computers and Internet access is a necessary precondition for
developing students' digital competence. The sufficient condition is self-teaching and
the actual application of the gained skills and knowledge.
   According to the survey results on item ST176 [16], Ukrainian students mostly uti-
lize the Internet to communicate in messengers and to search for information. Less of-
ten, they search for some practical “know-how” information, read news and emails, or
participate in forums (Fig. 3).




Fig. 3. Percentage distribution of students’ answers on the usage of Internet resources (Source:
                                           Own work)
Also, 54.2% of students answered that they use digital devices for reading, and 65.3% of
students reported some knowledge of data security in the digital environment (calculated
on the basis of answers to items ST168 and ST166 [16]).
A large proportion of students do not acquire their digital skills at school (Fig. 4). On
average, 50% of students learned about data security and 40% learned the basic digital
competences related to using the Internet and searching for data outside of school
(calculated based on answers to the ST158 items [16]).




 Fig. 4. Percentage distribution of students’ answers on teaching digital competence in schools
                                       (Source: Own work)
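    For illustration, percentages like those above can be obtained as weighted
proportions using the final student weight W_FSTUWT. The sketch below is hedged: the item
variable name is a placeholder and should be replaced with the actual ST158 item code
from the codebook [16], and the ukr data frame is the Ukrainian subset built in the
Section 3 sketch.

  # Weighted share of students selecting a given answer option.
  # ST158_EXAMPLE is a placeholder for an actual ST158 item variable from [16].
  pct <- 100 * weighted.mean(ukr$ST158_EXAMPLE == 1,
                             w = ukr$W_FSTUWT, na.rm = TRUE)
  pct   # weighted percentage of students who gave that answer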


4.2    Digital Skills and Students' Performance in the Main Disciplines
We assessed which digital resources, knowledge, and skills influence Ukrainian students'
performance in the main disciplines. We built separate regression models to estimate
these relationships, where the target variables were students' scores in reading,
mathematics, or natural sciences in PISA points, and the predictor was the ICTRES or
RATCMP1 index.
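   For illustration, a simplified version of such a model for reading is sketched below
(not the exact script used in this study): one weighted regression per plausible value,
with the coefficients averaged according to the combination rule given in Section 3. The
replicate-weight standard errors, which the intsvy package [18] computes automatically,
are omitted for brevity; ukr is the Ukrainian subset built in the Section 3 sketch.

  # Reading score regressed on home ICT resources (ICTRES), pooled over the 10
  # plausible values; W_FSTUWT is the final student weight.
  pv_read <- paste0("PV", 1:10, "READ")
  ictres_coefs <- sapply(pv_read, function(pv) {
    fit <- lm(reformulate("ICTRES", response = pv),
              data = ukr, weights = W_FSTUWT)
    coef(fit)["ICTRES"]
  })
  mean(ictres_coefs)   # pooled ICTRES effect, in PISA points per index unit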
   Students with a higher ICTRES index score, on average, 20-25 PISA points higher in
reading, mathematics, and natural sciences. We observed the same pattern with school
resources: in schools where principals reported a higher index of computer availability
per student, students score on average 17 PISA points higher.
   However, because ICTRES is correlated with ESCS, it is difficult to estimate an effect
of ICT resources that is clearly independent of student background or the average school
socioeconomic context. If we build a regression model of students' performance (in PISA
points) on both the ESCS and ICTRES indexes, the influence of the ICTRES index is not
significant. Ideally, the model should include interaction terms to estimate the degree
to which ICTRES has a magnifying effect on ESCS with respect to literacy. Unfortunately,
given the small between-school variation in the effect of ESCS and the skewed ICTRES
distribution, potential interaction effects cannot be estimated with the current data
using this model. Based on the current results, the most likely interpretation is that
the students' performance overall depends more on their socioeconomic status, with ICT
resources playing a redundant role; likewise, the average performance of a school depends
largely on the socioeconomic status of the students studying in that school.
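   To illustrate the specification discussed above, the same pooling can be repeated with
ESCS added as a second predictor; as reported above, the ICTRES coefficient is then no
longer significant. The sketch again omits the replicate-weight standard errors needed to
judge significance properly and reuses ukr from the Section 3 sketch.

  # Reading regressed jointly on ESCS and ICTRES, pooled over the 10 plausible values.
  pv_read <- paste0("PV", 1:10, "READ")
  joint_coefs <- sapply(pv_read, function(pv) {
    fit <- lm(reformulate(c("ESCS", "ICTRES"), response = pv),
              data = ukr, weights = W_FSTUWT)
    coef(fit)
  })
  rowMeans(joint_coefs)   # pooled intercept, ESCS and ICTRES coefficients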
   Ukrainian students who use the Internet for communication and for searching for data
are considered active readers of information available online [20, p. 109]. These active
online readers have higher scores in reading literacy than the students who do not use
online resources, regardless of their socioeconomic status (Fig. 5).




   Fig. 5. Average scores in digital reading of students with different socioeconomic status
                                      (Source: Own work)

Considering the influence of the book format (based on the ST168 question about paper vs
digital books [16]), we can claim that the format does not influence the results.
Although students who do not read books tend to have lower scores and students who read
in both formats have higher scores, the difference in performance between students who
read paper books and those who read digital ones is not significant (Fig. 6).
   The students were also asked about challenges with the tasks from the reading section.
They had to agree or disagree with the statement: “I was confused when I had to work with
the pieces on different pages” (ST163). Such situations often happen when a person reads
from digital devices and must switch between two files or open hyperlinks. The students
who confirmed they had difficulties reading in that format received lower reading scores
(Fig. 7).
 Fig. 6. The average scores in reading depending on the format of the books that students read
                                     (Source: Own work)




 Fig. 7. Average scores for reading and perceiving text when having to switch between pages
                                     (Source: Own work)

The analysis of performance in reading, mathematics, and natural science cannot confirm
whether school education has a positive or negative influence on students' acquisition of
knowledge and skills for working with digital information. Self-educated students tend to
have higher scores than students who received digital skills at school, but this
difference can be explained by ESCS: advantaged students are more likely to be
self-educated in ICT and also more likely to have higher performance in literacy. These
relationships are independent of any potential school effects. Nevertheless, we can see
that the more successful students acquired their digital skills outside of school
(Table 1). Values in bold are statistically significant.
 Table 1. Score differences between students who mastered digital skills inside and
          outside of school

                                                                  Reading    Math    Science
 Average scores of students who answered that they mastered
 these skills at school in Ukraine                                 449.9    437.63    452.6

 Score difference for students who answered that they mastered these skills outside of school:
 How to use keywords when using a search engine such as <…>, etc.   6.5     10.68      7.42
 How to decide whether to trust information from the Internet      -8.86    -5.45     -5.65
 How to compare different web pages and decide what information
 is more relevant for your school work                             19.04    14.41     14.6
 To understand the consequences of making information publicly
 available online on <…>, [...]                                    -0.86     1.79      1.88
 How to use the short description below the links in the list
 of results of a search                                             5.4      6.75      5.44
 How to detect whether the information is subjective or biased     10.06     4.62      9.14
 How to detect phishing or spam emails                             11.32    10.57     10.44
We also examined the data by constructing multilevel models [21, p. 201]. The advantage
of multilevel models (or mixed regression models) is that they can jointly estimate
relationships between variables at the school level and at the student level. Thus, we
can determine the degree to which differences in student outcomes are the result of
differences between students or of differences between the schools that they attend. The
results of a mixed regression model include coefficients describing the average
performance and predictor effects within each school, as well as the degree to which
these vary between schools. The intraclass correlation coefficient (ICC) is the ratio of
the variability between schools to the total variation (within and between schools) [22].
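   As an illustration of how such a coefficient is obtained, the sketch below fits an
intercept-only (null) two-level model with lme4 and computes the ICC from its variance
components. It reuses ukr from the Section 3 sketch; the ICC values reported below come
from the full models described in the text, so the number produced by this sketch will
differ.

  library(lme4)
  # Null model: reading score (first plausible value) with school-specific intercepts.
  m0 <- lmer(PV1READ ~ 1 + (1 | CNTSCHID), data = ukr)
  vc <- as.data.frame(VarCorr(m0))          # between-school and residual variances
  icc <- vc$vcov[vc$grp == "CNTSCHID"] / sum(vc$vcov)
  icc                                       # share of total variance between schools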
   Considering the dependence of Ukrainian students' success in the main disciplines on
their socioeconomic status, we can claim that the variability of the results is explained
primarily by student-level differences.
   Figure 8 illustrates the variation of PISA reading scores in accordance with the
students' socioeconomic status. The lines on the graph depict the regression lines for
individual schools. Mostly, these lines are situated in the center of the scatter plot
and have similar slopes. The similarity of the lines indicates that the relationship
between socioeconomic background and student performance is relatively consistent, even
across schools with very different average levels of socioeconomic background.
   The intraclass correlation coefficient (ICC) of this model is 0.28. The main
interpretation of this result is that approximately 28% of the variability in the results
(a little under a third) can be attributed to the influence of the school after the
effects of socioeconomic conditions are considered. The diagram shows that in most
schools, variability in students' performance depends on their socioeconomic status.
However, there is also room for the influence of location and segregation: some schools
provide a high or middle level of performance regardless of socioeconomic status.




Fig. 8. The variation of PISA reading scores (PV1READ) in accordance with the students’ so-
                            cioeconomic status (Source: Own work)

The availability of digital devices at home does not, by itself, influence students'
reading performance (Fig. 9). According to the regression model, its influence on
performance is consistently minor across schools, though the variability of results among
schools is large, and the ICC equals 0.46.




Fig. 9. Dependence of the PISA reading scores (PV1READ) from the index of computer availa-
                            bility (ICTRES) (Source: Own work)
The model of the dependence of online reading activity on the availability of digital
devices at home (Fig. 10) indicates that, on average, there is little variation in either
online reading habits (ICC = 0.041) or ICT resources between schools. However, there
appears to be substantial variability in the relationship across schools, suggesting that
in some contexts differences in the availability of ICT for students may explain the
frequency of online reading, while in other contexts they do not. This relationship is
difficult to interpret because both variables have skewed distributions (skewness of
ICTRES = 0.44; skewness of onlineread = -0.43). The non-normality produces an artificial
restriction of range that has a minimizing effect on observed relationships; if the
indices were measured using items that are more sensitive to variations within Ukraine,
the observed relationships would likely be higher, but it is not possible to determine
the degree of underestimation using the current data.




Fig. 10. Dependence of online reading activity (onlineread) from the index of computer availa-
                            bility (ICTRES) (Source: Own work)
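   The skewness values quoted above can be checked with the usual third standardized
moment. The sketch below is a minimal version (the exact estimator used for the reported
values may differ slightly) and assumes the derived onlineread index is available as a
column of that name in the ukr data frame from the Section 3 sketch.

  # Sample skewness as the third standardized moment (one common estimator).
  skewness <- function(x) {
    x <- x[!is.na(x)]
    mean((x - mean(x))^3) / sd(x)^3
  }
  skewness(ukr$ICTRES)       # expected to be positive (right-skewed)
  skewness(ukr$onlineread)   # expected to be negative (left-skewed)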


5      Discussion: Computer vs Paper Assessment

The PISA migration to computer testing was phased (in 2018, 9 countries still held their
testing in paper format). Switching to computer testing was motivated by several factors:
the tests are simpler and cheaper to administer, it facilitates more complex tasks, it
simplifies data collection and analysis, and it makes it possible to obtain more
information about the testing process (for example, how much time students spent on
certain tasks), etc. [23].
   In addition, the move to computer testing reflects global changes in our society. The
widespread availability of digital devices changes the methods of finding and accessing
information. Some competences, such as the ability to process large volumes of data or to
work with an overabundance of data, become essential. Utilizing digital devices for
reading and for solving practical tasks such as adding funds, registering in electronic
systems, and participating in economic and social processes is a must. Thus, PISA's
implementation of a computer format is a logical consequence of digitalization.
   The differences between PISA tests in paper and digital formats are in most cases
caused by the variety of task types and ways of interaction. For instance, the reading
tests in digital format contain elements (menus, scroll bars) for navigation between
parts of the text [24], and the natural science tasks contain computer simulations that
help to assess the students' ability to conduct research and interpret the results.
   In all these tests, the content that students see on the screen is part of their
interaction with the task, such as clicking or scrolling. Thus, all tests in digital
format require digital competence and skills in working with a computer.
   From 2021, PISA testing in Ukraine will be held in digital format, and students will
have to complete tasks that could not exist in paper format. Thus, the PISA results of
Ukrainian students will also depend on their level of digital competence.
   Since we cannot rely on PISA 2018 results to predict the outcome of digital testing
in 2021, the experience of other countries can be useful for our research. However,
different countries have their own peculiarities, which may make generalizing their re-
sults problematic.
   Some countries demonstrated much better results after switching to digital testing,
while several researchers attributed a decrease in literacy scores precisely to the
switch to digital PISA [25].


6      Conclusions and the Research Perspective

The analysis of the international PISA research, local Ukrainian studies, and additional
data collected on this topic allows us to make some generalizations.
    From the results obtained we can conclude that Ukraine, compared to other countries,
is less well provided with computers and digital devices, and that school education (in
the students' opinion) only partly builds digital competence.
    However, the level of Ukrainian students' digital competence cannot be assessed
directly from the PISA 2018 results. We can only observe associative links between
students' performance in the main disciplines and the factors connected to the use of
digital devices and the Internet in the process of education.
    In most cases, students with lower PISA scores in subject literacy are also
characterized by a lack of digital devices and Internet access, as well as low activity
in using online materials.
    What is more, although there is a difference between the levels of digital competence
of students who have free access to digital devices and those who do not, we consider
this difference to be connected with the students' socioeconomic status, with which the
availability of digital devices correlates. Currently, the availability and instruction
of ICT in schools may be less effective than non-scholastic resources in supporting the
development of ICT skills.
   To define the main strategies for improving students' digital competence, we consider
it advantageous to extend the research to the experience of using digital devices and
up-to-date teaching methods in countries whose conditions of growth and development are
similar to Ukraine's.
   In addition, comparing the results of countries that switched from the paper to the
digital PISA format reveals several differences between categories of students that we
still have to study and analyze.
   Another topic for future investigation and analysis is the effectiveness of utilizing
various teaching methodologies meant to improve the students’ digital competence.
   Additional research on other countries' experience will allow us to develop
recommendations on how to improve digital education and thus to raise the general level
of Ukrainian students' performance.


References
 1. Europe 2020 strategy, https://www.eesc.europa.eu/sites/default/files/resources/docs/qe-01-
    14-110-en-c.pdf, last accessed 2020/15/01.
 2. Proposal for a Council recommendation on Key Competences for LifeLong Learning,
    https://eur-lex.europa.eu/legal-con-
    tent/EN/TXT/PDF/?uri=CELEX:52018SC0014&from=EN, last accessed 2020/15/01.
 3. Students, Computers and Learning: Making the Connection, PISA, OECD Publishing.
    http://dx.doi.org/10.1787/9789264239555-en (2015).
 4. Kuzminska, O., Mazorchuk, M., Morze, N., Pavlenko, V., Prokhorov, A.: Digital compe-
    tency of the students and teachers in Ukraine: measurement, analysis, development pro-
    spects. In: Information and Communication Technologies in Education, Research, and In-
    dustrial Applications. Communications in Computer and Information Science, vol. 2104, pp.
    366–379 (2018).
 5. Hatlevik, O.: Examining the Relationship between Teachers’ Self-Efficacy, their Digital
    Competence, Strategies to Evaluate Information, and use of ICT at School. Scandinavian
    Journal          of         Educational        Research,          61(5),         555–567,
    https://doi.org/10.1080/00313831.2016.1172501 (2017).
 6. PISA 2009 Results (Volume VI): Students on Line: Digital Technologies and Perfor-
    mance. Paris, France: OECD Publishing, DOI: 10.1787/9789264112995-en (2011).
 7. Naumann, J.: A model of online reading engagement: Linking engagement, navigation, and
    performance in digital reading. Computers in Human Behavior, 53, 263–277 (2015).
 8. Problems and Prospects for Harmonization of Ukraine's Digital Market with EU and EaP
    Markets: An Analytical Report, https://www.civicsynergy.org.ua/wp-content/up-
    loads/2018/04/Problemy-ta-perspektyvygarmonizatsiyi-tsyfrovogorynku-Ukrayinyz-ryn-
    kamy-YES-ta-krayin-ShP.pdf, last accessed 2020/15/01 (2018) (in Ukrainian).
 9. 3D MAPPING OF UKRAINIAN EDUCATION SYSTEM (Modernization of Pedagogical
    Higher Education by Innovative Teaching Instruments (MoPED)), http://mo-
    ped.kubg.edu.ua/wp-content/uploads/2014/03/MoPED_D1.2-3DMapping.pdf (2018).
10. Leroux, G., Monteil, J. & Huguet, P.: Apprentissages scolaires et technologies numériques:
    une revue critique des méta-analyses. L’Année psychologique, vol. 117(4), 433-465.
    doi:10.4074/S0003503317004018 (2017).
11. Tan, C., & Hew, K.: The Impact of Digital Divides on Student Mathematics Achievement
    in Confucian Heritage Cultures: A Critical Examination Using PISA 2012 Data. Interna-
    tional Journal of Science and Mathematics Education, 17(6), 1213–1232,
    https://doi.org/10.1007/s10763-018-9917-8 (2019).
12. Biagi, F., Loi, M.: ICT and Learning: Results from PISA 2009, https://publications.jrc.ec.eu-
    ropa.eu/repository/bitstream/JRC76061/lbna25581enn.pdf (2012).
13. ICILS 2018. International Computer and Information Literacy Study 2018,
    https://www.iea.nl/studies/iea/icils/2018, last accessed 2020/15/01.
14. PISA 2018 Database, https://www.oecd.org/pisa/data/2018database/, last accessed
    2020/15/01.
15. PISA 2015. Technical Report, https://www.oecd.org/pisa/data/2015-technical-re-
    port/PISA2015_TechRep_Final.pdf, last accessed 2020/15/01.
16. Student       questionnaire    for     PISA       2018.           Main      survey    version.
    https://www.oecd.org/pisa/data/2018data-
    base/CY7_201710_QST_MS_STQ_NoNotes_final.pdf, last accessed 2020/15/01.
17. School        questionnaire     for      Pisa      2018.        Main       survey     version,
    https://www.oecd.org/pisa/data/2018data-
    base/CY7_201710_QST_MS_SCQ_NoNotes_final.pdf, last accessed 2020/15/01.
18. Caro, D.: intsvy: International Assessment Data Manager, https://cran.r-pro-
    ject.org/web/packages/intsvy/intsvy.pdf (2019).
19. Bates, D.: Package ‘lme4’, https://cran.r-project.org/web/packages/lme4/lme4.pdf (2019).
20. National Report on the Results of the International PISA-2018 Education Quality Survey /
    Team. ed.: M. Mazorchuk (main author), T. Vakulenko, V. Tereshchenko, G. Bychko, K.
    Shumova, S. Rakov, V. Gorokh and others. ; Ukrainian Center for Educational Quality As-
    sessment.           Kiev:        UCEO,            http://pisa.testportal.gov.ua/wp-content/up-
    loads/2019/12/PISA_2018_Report_UKR.pdf (2019) (in Ukrainian).
21. PISA Data Analysis Manual SPSS, Second edition, https://www.oecd-ili-
    brary.org/docserver/9789264056275-en.pdf?expires=1579771834&id=id&ac-
    cname=guest&checksum=5E865CACD91A58B3B970F2449608C568 (2009).
22. Multilevel Modeling Tutorial Using SAS, Stata, HLM, R, SPSS, and Mplus. The Depart-
    ment of Statistics and Data Sciences, The University of Texas at Austin,
    https://stat.utexas.edu/images/SSC/documents/SoftwareTutorials/MultilevelModeling.pdf
    (2015).
23. PISA Computer-Based Assessment of Student Skills in Science, PISA, OECD Publishing,
    https://www.oecd-ilibrary.org/docserver/9789264082038-en.pdf?ex-
    pires=1577974503&id=id&accname=guest&check-
    sum=A9DCF608DF1852C5A44F158C9425000C (2010), last accessed 2020/15/01.
24. PISA 2018 Results (Volume I): What Students Know and Can Do, PISA, OECD Publishing,
    Paris, https://doi.org/10.1787/5f07c754-en (2019), Annex A5.
25. Komatsu, H., & Rappleye, J.: Did the shift to computer-based testing in PISA 2015 affect
    reading scores? A View from East Asia, Compare: A Journal of Comparative and Interna-
    tional Education, 47:4, 616-623, DOI: 10.1080/03057925.2017.1309864 (2017).