=Paper=
{{Paper
|id=Vol-3059/paper1
|storemode=property
|title=Adopting Learning Analytics in a Brazilian Higher Education Institution: Ideal and Predicted Expectations
|pdfUrl=https://ceur-ws.org/Vol-3059/paper1.pdf
|volume=Vol-3059
|authors=Samantha Garcia,Elaine Cristina Moreira Marques,Rafael Ferreira Mello,Dragan Gasevic,Rodrigo Lins Rodrigues,Taciana Pontual Falcão
|dblpUrl=https://dblp.org/rec/conf/lala/GarciaMMGRF21
}}
==Adopting Learning Analytics in a Brazilian Higher Education Institution: Ideal and Predicted Expectations==
Samantha Garcia¹, Elaine Marques¹, Rafael Ferreira Mello¹,², Dragan Gašević³, Rodrigo Lins Rodrigues¹ and Taciana Pontual Falcão¹

¹ Departamento de Computação, Universidade Federal Rural de Pernambuco, Brazil
² Cesar School, Brazil
³ Centre for Learning Analytics, Faculty of Information Technology, Monash University, Australia

Abstract. Learning Analytics (LA) consists of using educational data to inform teaching strategies and management decisions, aiming to improve students' learning. The successful implementation of LA in Higher Education Institutions (HEIs) involves technical aspects and infrastructure, but also stakeholders' acceptance. The SHEILA framework proposes instruments for the diagnosis of HEIs for LA adoption, including stakeholders' views. In this paper, we present the results of the application of SHEILA's surveys to identify the highest and lowest expectations about LA adoption, in the views of students and instructors, and compare their ideal and realistic expectations. Results confirmed the high interest in using LA to improve the learning experience, but with ideal expectations higher than realistic expectations, and point out key challenges and opportunities for Latin American researchers to join efforts towards building solid evidence that can inform educational policy-makers and managers, and support the development of strategies for LA services in the region.

Keywords: Learning Analytics, higher education, student expectations, instructor expectations

1. Introduction

As the amount of educational data increases, and tools for analysis become more available, Learning Analytics (LA) becomes more popular [1].
LA is defined by the Society for Learning Analytics Research as the "measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs" [2]. The implementation of these educational analyses in higher education institutions (HEIs) aims to optimize learning and its environments [3]. The amount of data available about students in HEIs is growing fast: exam grades, duration and frequency of interactions with virtual learning environments, and discussions in forums are some examples of very useful data sources used in educational analysis. LA can potentially help to overcome important educational challenges, such as student drop-out, failure, and the provision of personalized feedback at scale [4].

In Latin America (LATAM), LA adoption is still much lower than in North America and Europe [5, 6]. Still, the amount of data currently available indicates that LATAM countries have the possibility to implement LA strategies in order to improve educational systems [7], addressing known problems in the region like student dropout and program quality [8].

LALA'21: IV Latin American Conference on Learning Analytics, October 19–21, 2021, Arequipa, Peru. Contact: samanthamcgarcia@gmail.com (S. Garcia); elaine.marques557@gmail.com (E. Marques); rafael.mello@ufrpe.br (R. F. Mello); dragan.gasevic@monash.edu (D. Gašević); rodrigo.linsrodrigues@ufrpe.br (R. L. Rodrigues); taciana.pontual@ufrpe.br (T. P. Falcão). ORCID: 0000-0001-8558-4946 (S. Garcia); 0000-0001-8549-529X (E. Marques); 0000-0003-3548-9670 (R. F. Mello); 0000-0001-9265-1908 (D. Gašević); 0000-0002-3598-5204 (R. L. Rodrigues); 0000-0003-2775-4913 (T. P. Falcão). © 2021 Copyright for this paper by its authors. Use permitted under Creative Commons License Attribution 4.0 International (CC BY 4.0).
In Brazil, interest in LA is growing, along with the expansion and popularization of online and blended learning, and the increasing use of Learning Management Systems (LMS) [5]. A large amount of data is produced daily by HEI students in Brazil, and the collection and analysis of this educational data can be crucial for the development of new strategies for improving teaching-learning processes.

However, LA implementation is not straightforward, and is highly dependent on context [9, 10]. Although interest in LA has grown considerably around the world [11, 9], few studies specifically address the key role of contextual factors in implementing LA successfully at the institutional level [8]. The SHEILA framework (Supporting Higher Education to Integrate Learning Analytics) [12] is the main such initiative, providing instruments to build a diagnosis of HEIs in terms of several aspects that impact the successful adoption of LA. As SHEILA is grounded in empirical research undertaken in the European context [12], the LALA project (Learning Analytics in Latin America) [13] encourages local adaptations of its methods and instruments, aiming at generating a corpus of knowledge and contextual evidence for the region.

The SHEILA framework comprises dimensions that include political context, internal capacity, engagement strategy and learning frameworks [12]. Perhaps most importantly, it recommends the identification of key stakeholders and their needs and desires. As a matter of fact, stakeholder engagement and buy-in is considered a challenge for successfully implementing LA, alongside pedagogical grounding, resources, and ethics and privacy [9]. As stakeholder diagnosis is highly sensitive to regional specificities, including, for example, culture, bureaucracy, and social inequality, existing research based on SHEILA [12, 9] may not account for LATAM HEIs. There are as yet few findings about the impact of stakeholders' opinions and behaviors on LA adoption in Latin America.
In this paper, we address this gap with empirical research in a Brazilian HEI, presenting stakeholders' opinions and perceptions that can help increase buy-in in the process of implementing LA. Such a collective effort in gathering empirical evidence has been called for by other LATAM researchers [8]. Previous research performed through focus groups indicates students' and instructors' interest in LA, in particular for improving the learning process, providing and receiving personalized feedback, adapting teaching practices to students' needs, and making evidence-based pedagogical decisions [14, 15]. The present research complements such qualitative findings with quantitative data from a survey using a questionnaire focused on stakeholders' ideal and predicted expectations [16, 10]. We aimed to answer the following research questions:

RQ1: What are the highest and lowest expectations regarding the adoption of LA, in the views of students and instructors?

RQ2: What are the differences and similarities between students' and instructors' ideal and predicted expectations about the adoption of LA?

2. Method

2.1. Instrument

The instrument used for data collection was based on SHEILA's survey [16, 10], which has been empirically tested and aims at a diagnosis of HEIs at scale, by providing a comparison between ideal and predicted (or realistic) expectations of the main stakeholder groups (students and instructors). The instrument prompts participants to rate their expectations on two separate 7-point Likert scales: ideal and predicted (explained in the instrument). In this paper, ideal expectations are desired outcomes based on the hope stakeholders have, while predicted expectations are realistic beliefs about what is perceived as viable to be implemented. By analyzing these two kinds of expectations, a deeper understanding of stakeholders' perspectives can be reached, identifying the main areas to focus on.
Generally speaking, topics that receive the highest ratings in realistic expectations are considered a priority in service planning [16]. We translated the questionnaire into Brazilian Portuguese, making small semantic adaptations to fit the context of Brazilian HEIs. We also removed a question about sharing students' data with a third-party company, as in Brazil it is not possible for public universities to share data with private companies. Questions from the adapted questionnaire for the instructors were maintained. The questionnaire included a brief introduction to LA and the purpose of the study, asking for informed consent for participation. We also collected demographic information, such as age and gender, and educational data (course, study field, degree, among other information). The themes addressed by the survey were:

(i) Data Privacy (4 items for students): whether the university is allowed to collect, use and analyze the data obtained from the students, and for what purpose the institution may use these data.

(ii) Academic Progress (6 items for instructors, 2 for students): what kind of information could benefit students and instructors by helping to check on students' progress in the courses.

(iii) Feedback (4 items for instructors, 3 for students): how students would like to receive feedback / what ways of giving feedback instructors find the most appropriate.

(iv) Decision-making (2 items for instructors, 1 for students): how educational data can help students and instructors take action upon problematic situations identified.

(v) Intervention (1 item for instructors, 1 for students): whether the instructors or the institution should intervene when notified by the system of a student at risk, and how this should be approached.

(vi) Training (3 items for instructors): what kind of training will be provided for instructors to be capable of analyzing data effectively.

2.2. Context and Participants

This study was undertaken in an HEI that offers face-to-face and online courses, with access to the same LMS (Moodle). While online courses occur fully through this platform, in the face-to-face courses the LMS is used as support to share materials, submit assignments and interact in online discussions. The questionnaires were created using Google Forms and sent through the university's official communication channels, including social networks, email lists from departments, and direct contact with course coordinators. The survey had 241 participants from the HEI (192 students and 49 instructors), from several areas of knowledge and courses (online and face-to-face) (Tables 1 and 2). The higher number of participants from Information Technology (IT) courses is due to the authors belonging to this area and thus having better reach there.

Table 1: Overview of instructors

  Major                        Quantity
  IT Related                   26
  Education                    11
  Mathematics and Statistics   5
  Agrarian sciences            4
  Others                       3

Table 2: Overview of students

  Major                        Quantity
  IT Related                   132
  Education                    45
  Mathematics and Statistics   4
  Agrarian sciences            2
  Others                       9

2.3. Data Analysis

The quantitative analysis adopted to answer the first research question focused on describing the survey results in two boxplots, which include the median rating score of each item for ideal and predicted expectations, and the outliers for each scale. In order to address RQ2, we compared the ideal and realistic expectations of students and instructors. For this analysis, only participants inclined to agreement were considered, i.e. those who answered 5 to 7 on the Likert scale. We performed statistical analysis over this sample, assessing the percentage of agreement in instructors' and students' responses (separately) and comparing ideal and realistic expectations. More specifically, we applied the McNemar test [17], which performs a statistical comparison of two related samples.
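The dichotomization and paired comparison described above can be sketched in a few lines of Python. The ratings below are invented for illustration (they are not the study's data), and the exact binomial form of the McNemar test is implemented directly rather than taken from a statistics package:

```python
from math import comb

def agrees(rating):
    # "Inclined to agreement" = rating of 5, 6 or 7 on the 7-point Likert scale
    return rating >= 5

def mcnemar_exact(ideal, predicted):
    """Exact McNemar test on paired agree/disagree responses.

    Only the discordant pairs matter: b = agrees ideally but not
    realistically, c = the reverse. Under the null hypothesis each
    discordant pair falls either way with probability 0.5 (binomial).
    """
    b = sum(1 for i, p in zip(ideal, predicted) if agrees(i) and not agrees(p))
    c = sum(1 for i, p in zip(ideal, predicted) if not agrees(i) and agrees(p))
    n = b + c
    if n == 0:
        return 1.0
    tail = sum(comb(n, k) for k in range(min(b, c) + 1)) / 2 ** n
    return min(1.0, 2 * tail)  # two-sided p-value

# Hypothetical paired ratings for one survey item (same respondents twice)
ideal     = [7, 7, 6, 7, 5, 6, 7, 4, 7, 6, 7, 5]
predicted = [5, 4, 6, 3, 5, 4, 6, 4, 5, 6, 4, 5]
print(mcnemar_exact(ideal, predicted))  # → 0.125
```

In practice a library routine, such as the McNemar implementation available in statsmodels, would typically be used instead of a hand-rolled version.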
In this analysis, we adopted a 95% confidence level.

3. Results

3.1. Highest and lowest expectations regarding the adoption of LA

Instructors' responses are shown in the boxplot in Figure 1, where the vertical lines mark the lowest, median and highest values; the outer limits of the boxes show the first and third quartiles; and the dots correspond to outliers. So, for instance, in Q4, about visualizing students' progress, almost all instructors rated their ideal expectations as 7 (with 3 outliers only); but regarding realistic expectations the answers ranged mainly from 5 to 7 (2 being the lowest rating).

Figure 1: Box plot of instructors' responses

Instructors had high ideal expectations about their institution adopting LA, but were less optimistic about the viability (median rating scores between 5 and 6). The items with the highest, almost unanimous ideal expectations were: access to students' progress (Q4-I and Q5-I), university support on data analysis (Q7-I), understanding of data (Q11-I), learning profile (Q12-I) and visualization of learning performance (Q16-I). Some of these also had the highest median ratings of perceived feasibility (Q4-I, Q5-I, Q11-I, Q12-I and Q16-I). The item about university support on data analysis (Q7-I) oscillated between agreement and neutrality, with the biggest interval (answers between 3 and 7).

Figure 2: Box plot of students' responses

Students had high ideal expectations as well, but lower than instructors' (Figure 2). The items with the highest realistic expectations had median rating scores between 5 and 7, i.e., higher values than those expressed by the instructors. Students' highest expectations regarded consent for use of their educational data (Q2-S) and use of data for other purposes (Q5-S); accessing their educational progress (Q3-S); and educational goals (Q7-S).
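The boxplot conventions used in this section (median line, quartile box, whiskers, outlier dots) can be reproduced from raw Likert ratings with the standard library alone. A minimal sketch with invented ratings, assuming the common 1.5×IQR whisker rule used by most plotting tools:

```python
import statistics

def boxplot_summary(ratings):
    """Five-number summary in the style of the paper's boxplots: box edges
    at Q1/Q3, a line at the median, whiskers at the most extreme points
    within 1.5*IQR of the box, and remaining points flagged as outliers."""
    q1, median, q3 = statistics.quantiles(ratings, n=4)
    iqr = q3 - q1
    lo, hi = q1 - 1.5 * iqr, q3 + 1.5 * iqr
    inside = [r for r in ratings if lo <= r <= hi]
    outliers = sorted(r for r in ratings if r < lo or r > hi)
    return {"whiskers": (min(inside), max(inside)),
            "box": (q1, q3),
            "median": median,
            "outliers": outliers}

# Hypothetical 7-point Likert ratings for one item: near-unanimous 7s,
# with a couple of low outliers (the pattern described for Q4-I)
ratings = [7, 7, 7, 7, 7, 6, 7, 7, 2, 7, 7, 7]
print(boxplot_summary(ratings))
```

Note that `statistics.quantiles` defaults to the "exclusive" interpolation method; plotting libraries may compute quartiles slightly differently on small samples.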
The biggest gap between median ratings (3-7) was found in Q10-S, regarding intervention based on LA indicating that a student is at risk of failing or dropping out.

3.2. Ideal versus realistic expectations

Table 3 shows the results of the analysis of instructors' answers, where "n" refers to the number of participants inclined to agree with the item (having answered 5-7 on the Likert scale) and "%" is the corresponding percentage of the total number of participating instructors. There were significant differences between instructors' ideal and realistic expectations for the majority of items, with ideal expectations being higher. Q4-I and Q5-I were the only two items with similarity between expectation and reality, with high levels of agreement. These items were about instructors accessing students' data on courses they are teaching or have taught previously, indicating that they think this is viable in their present context. It was not necessary to perform a Bonferroni adjustment, as all statistical tests had significance values less than or equal to 0.01.

Table 3: Instructors' ideal and realistic expectations

               Ideal expectations   Realistic expectations
  Item         n      %             n      %               p-value
  Q1-I         45     91.8          35     71.4            0.006
  Q2-I         44     89.8          29     59.2            < 0.001
  Q3-I         44     89.8          35     71.4            0.004
  Q4-I         48     98.0          43     87.8            0.063
  Q5-I         47     95.9          42     85.7            0.063
  Q6-I         47     95.9          35     71.4            < 0.001
  Q7-I         45     91.8          29     59.2            < 0.001
  Q8-I         46     93.9          33     67.3            < 0.001
  Q9-I         44     89.8          34     69.4            0.002
  Q10-I        43     87.8          31     63.3            < 0.001
  Q11-I        47     95.9          37     75.5            0.002
  Q12-I        44     89.8          37     75.5            0.016
  Q13-I        46     93.9          33     67.3            < 0.001
  Q14-I        42     85.7          27     55.1            < 0.001
  Q15-I        43     87.8          31     63.3            < 0.001
  Q16-I        46     93.9          37     75.5            < 0.001

Table 4 shows the results of the analysis of students' answers. Students' ideal expectations were statistically higher than their realistic expectations in the majority of cases.
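The Bonferroni adjustment mentioned above simply divides the significance level by the number of comparisons performed. A minimal sketch with illustrative p-values (not the study's):

```python
def bonferroni_reject(p_values, alpha=0.05):
    # Bonferroni correction: each individual test is judged against
    # alpha / m, where m is the number of comparisons performed.
    threshold = alpha / len(p_values)
    return [p <= threshold for p in p_values]

# Illustrative p-values for four hypothetical survey items
pvals = [0.001, 0.004, 0.02, 0.6]
print(bonferroni_reject(pvals))  # threshold = 0.05 / 4 = 0.0125
```

Equivalently, each p-value can be multiplied by the number of tests and compared with alpha; multiple-testing helpers in packages such as statsmodels implement this and related corrections.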
The only item without significant differences was Q1-S (about the university asking for consent to use identifiable data), indicating similarity between ideal and realistic expectations. Items Q2-S and Q10-S showed a significant distance between ideal and realistic expectations, the ideal expectations exhibiting a stronger ceiling effect than other items, especially for Q10-S. Item Q2-S (university ensuring that educational data will be kept safe) had the highest ideal expectation. Item Q10-S (instructors' obligation to act on the results of LA methods if students underperform or are identified as at risk of failing) had the lowest rating for realistic expectations.

4. Discussion

In this section, we discuss the survey results considering research in other countries using the same instrument [8, 10, 16], as well as our previous qualitative results from the same HEI, which used the SHEILA instruments for focus groups to investigate similar themes [18, 15].

Table 4: Students' ideal and realistic expectations

               Ideal expectations   Realistic expectations
  Item         n      %             n      %               p-value
  Q1-S         145    75.5          148    77.1            0.678
  Q2-S         162    84.4          149    77.6            < 0.001
  Q3-S         158    82.3          135    70.3            < 0.001
  Q4-S         138    71.9          127    66.1            0.035
  Q5-S         156    81.3          139    72.4            < 0.001
  Q6-S         152    79.2          128    66.7            < 0.001
  Q7-S         154    80.2          133    69.3            < 0.001
  Q8-S         146    76.0          133    69.3            0.011
  Q9-S         145    75.5          126    65.6            0.001
  Q10-S        146    76.0          110    57.3            < 0.001
  Q11-S        151    78.6          136    70.8            0.001

Data analysis showed that instructors and students had positive views about the adoption of LA in their institution, which confirms results from other contexts [10, 16] and our previous findings [18, 15]. The survey results add that these stakeholders have ideal expectations higher than realistic expectations, i.e. they wish for LA to be implemented, but are unsure about its viability in the foreseeable future, considering the context of their institution.
Previous research using the same survey instrument in other HEIs [10, 16] also showed a ceiling effect on the ideal expectations scale, with ideal expectations higher than realistic ones, reinforcing the tendency of stakeholders to be uncertain about what can be achieved in their present context.

According to the survey, instructors are particularly interested in visualizing students' progress, learning profiles and performance, consonant with findings from the previously performed focus groups [18, 15], which indicated instructors' particular interest in: decreasing student dropout; improving students' learning and their own teaching; and viewing students' progress. Although in the focus groups instructors were somewhat reluctant about the access to and use of students' data (in line with other research findings [10]), fearing that this could become intrusive, the survey shows that they consider access to student data viable, even at present (items related to this topic – Q4-I and Q5-I – showed similarity between ideal and realistic expectations). Meanwhile, they were less optimistic about the support they can get from HEIs to help them analyze and understand this data, and act upon it (Q7-I), also previously identified in the literature as an important challenge [10].

According to the survey, students were also especially interested in visualizing their progress and keeping track of their learning goals. This is in line with qualitative findings, which indicate that students particularly support the adoption of LA with the purpose of improving their learning experience. The use of such educational data was of little concern for students in the focus groups [18, 15], but the survey indicates very high ideal expectations that the HEIs will keep this data safe (Q2-S), reinforcing previous similar results [16].
As for the use of personal data, students were more cautious, which was confirmed by the survey results, where asking for consent to use their data (Q1-S) appeared as an important aspect, and one that they considered rather feasible in their present context. In the focus groups, students were interested in better feedback through the identification of weaknesses in their learning and suggestions to improve it (confirming findings in [8]), which is aligned with previous evidence that students need meaningful information about their progress to motivate them to improve and remain engaged [16]. Students were in favor of the system alerting instructors early if they were at risk of failing a course or could improve, but there were also reflections on their own responsibility for their learning.

For their part, instructors in the focus groups mostly agreed with the obligation for teaching staff and/or HEIs to take action when difficulties in students' learning are identified by LA methods, consonant with [8]. However, in the survey, this same topic (Q14-I) presented a large difference between instructors' ideal and realistic expectations, and had the lowest ratings of agreement, indicating that instructors were in fact unsure about this obligation, as also identified in [10]. Students were also uncertain about the viability of instructors being obliged to take action when students are identified as underperforming or at risk (Q10-S, with the lowest percentage of agreement and the largest difference between ideal and realistic expectations). These somewhat contradictory findings reflect a debate that remains open: the moral obligation instructors would have to act, versus students' need to be autonomous and responsible for their learning [10, 19, 16].

5. Conclusions, limitations and research directions

This study presented the findings of a survey aimed at investigating stakeholders' expectations about the adoption of LA in a Brazilian HEI, thus adding empirical evidence to the research efforts towards guiding the development of LA services in LATAM [8]. Following qualitative research undertaken previously through focus groups [18, 15], the present study aimed to complement that evidence with a quantitative analysis that included a larger number of participants and a comparison between ideal and realistic expectations of key stakeholders. The main limitation of the research is the small size of the sample, given that the HEI's populations of instructors and students are around 1,200 and 17,000, respectively. Additionally, a large part of the responses came from students and instructors of IT-related courses, who are the most likely to accept the use of new technologies in their context.

Our evidence, taken in perspective along with other research within the LALA and SHEILA projects, in LATAM [8] and globally [16, 10], reinforces the importance of stakeholder engagement for a successful implementation of LA. Together, the empirical evidence collected so far by researchers reveals convergent findings, such as: the need for HEIs to ensure all collected data is safely kept, within a transparent process with stakeholders' consent; the benefits that LA can bring to the learning process by shedding light on students' needs and making them visible to students and instructors; students' wish for timely, quality feedback; and the need expressed by instructors for institutional support to help them understand data and take effective action upon it. Our research and other surveys on the same topic using the same instrument [10, 16] show ideal expectations above realistic ones.
The reasons for this disparity may vary substantially across contexts, and include instructors' self-efficacy, familiarity with technology and analytics, institutional resources, bureaucracy, and data privacy legislation. Given the particularities of Latin America since colonization, which led to deep socioeconomic inequality, lack of resources and systemic institutional inefficiency [8], stakeholders' wishes may be more distant from their actual beliefs than in regions such as Europe and North America. The lack of belief in the country's institutions, the lack of self-belief, and low levels of familiarity with technology can be barriers to stakeholder buy-in, and are thus important aspects to be considered and addressed by administrators.

Another key topic, with divergent expectations in the literature, is the responsibility to act once data become available. Instructors' opinions vary about how much they should be expected to take action, for example to contact and help students at risk. Some researchers and educators argue that students, on being informed of their progress with rich information, should take responsibility for their learning, with instructors' support. In other words, who should be the protagonist once data is visualized by all? Instructors' "obligation to act" is still in debate [19], along with discussions on the risk of discouraging students' autonomy and creating a culture of passivity. This involves complex pedagogical and political decisions that need to be carefully considered, while maintaining instructors' and students' autonomy.

For future work, we intend to broaden the survey and extend the study to managers and institutional leaders, based on the SHEILA framework. Additionally, we want to establish partnerships with other Brazilian and Latin American institutions, to run similar studies and further compare the results.
In this way, we hope to help create evidence that reflects the identity(ies) of Latin America [6, 8], and leads to effective strategies that promote the adoption of LA in LATAM HEIs.

References

[1] S. Joksimović, V. Kovanović, S. Dawson, The journey of learning analytics, HERDSA Review of Higher Education 6 (2019) 27–63.

[2] P. D. Long, G. Siemens, G. Conole, D. Gašević (Eds.), Proceedings of the 1st International Conference on Learning Analytics and Knowledge (LAK'11), ACM, New York, NY, USA, 2011.

[3] A. Whitelock-Wainwright, D. Gašević, R. Tejeiro, What do students want? Towards an instrument for students' evaluation of quality of learning analytics services, in: Proceedings of the Seventh International Conference on Learning Analytics & Knowledge, 2017, pp. 368–372.

[4] A. Pardo, J. Jovanovic, S. Dawson, D. Gašević, N. Mirriahi, Using learning analytics to scale the provision of personalised feedback, British Journal of Educational Technology 50 (2019) 128–138.

[5] C. Cechinel, X. Ochoa, H. Lemos dos Santos, J. B. Carvalho Nunes, V. Rodés, E. Marques Queiroga, Mapping learning analytics initiatives in Latin America, British Journal of Educational Technology 51 (2020) 892–914.

[6] I. Hilliger, M. Ortiz-Rojas, P. Pesántez-Cabrera, E. Scheihing, Y.-S. Tsai, P. J. Muñoz-Merino, T. Broos, A. Whitelock-Wainwright, D. Gašević, M. Pérez-Sanagustín, Towards learning analytics adoption: A mixed methods study of data-related practices and policies in Latin American universities, British Journal of Educational Technology 51 (2020) 915–937.

[7] C. Cobo, C. Aguerrebere, Building capacity for learning analytics in Latin America, Include us all! Directions for adoption of Learning Analytics in the global south (2017) 58.

[8] I. Hilliger, M. Ortiz-Rojas, P. Pesántez-Cabrera, E. Scheihing, Y.-S. Tsai, P. J. Muñoz-Merino, T. Broos, A. Whitelock-Wainwright, M. Pérez-Sanagustín, Identifying needs for learning analytics adoption in Latin American universities: A mixed-methods approach, The Internet and Higher Education 45 (2020) 100726.

[9] Y.-S. Tsai, D. Rates, P. M. Moreno-Marcos, P. J. Munoz-Merino, I. Jivet, M. Scheffel, H. Drachsler, C. D. Kloos, D. Gašević, Learning analytics in European higher education—trends and barriers, Computers & Education 155 (2020) 103933.

[10] K. Kollom, K. Tammets, M. Scheffel, Y.-S. Tsai, I. Jivet, P. J. Muñoz-Merino, P. M. Moreno-Marcos, A. Whitelock-Wainwright, A. R. Calleja, D. Gasevic, et al., A four-country cross-case analysis of academic staff expectations about learning analytics in higher education, The Internet and Higher Education 49 (2021) 100788.

[11] O. Viberg, M. Hatakka, O. Bälter, A. Mavroudi, The current landscape of learning analytics in higher education, Computers in Human Behavior 89 (2018) 98–110.

[12] Y.-S. Tsai, P. M. Moreno-Marcos, I. Jivet, M. Scheffel, K. Tammets, K. Kollom, D. Gašević, The SHEILA framework: Informing institutional strategies and policy processes of learning analytics, Journal of Learning Analytics 5 (2018) 5–20.

[13] J. Maldonado-Mahauad, I. Hilliger, T. De Laet, M. Millecamp, K. Verbert, X. Ochoa, M. Pérez-Sanagustín, The LALA project: Building capacity to use learning analytics to improve higher education in Latin America, in: Companion Proceedings of the 8th International Learning Analytics & Knowledge Conference, 2018, pp. 630–637.

[14] T. P. Falcao, R. Ferreira, R. L. Rodrigues, J. Diniz, D. Gasevic, Students' perceptions about learning analytics in a Brazilian higher education institution, in: 2019 IEEE 19th International Conference on Advanced Learning Technologies (ICALT), IEEE, 2019, pp. 204–206.

[15] T. P. Falcão, R. F. Mello, R. L. Rodrigues, J. R. B. Diniz, Y.-S. Tsai, D. Gašević, Perceptions and expectations about learning analytics from a Brazilian higher education institution, in: Proceedings of the Tenth International Conference on Learning Analytics & Knowledge, 2020, pp. 240–249.

[16] A. Whitelock-Wainwright, D. Gašević, R. Tejeiro, Y.-S. Tsai, K. Bennett, The student expectations of learning analytics questionnaire, Journal of Computer Assisted Learning 35 (2019) 633–666.

[17] P. A. Lachenbruch, McNemar test, Wiley StatsRef: Statistics Reference Online (2014).

[18] T. Pontual Falcão, R. Ferreira, R. Lins Rodrigues, J. Diniz, D. Gasevic, Students' perceptions about learning analytics in a Brazilian higher education institution, in: 2019 IEEE 19th International Conference on Advanced Learning Technologies (ICALT), 2019, pp. 204–206. doi:10.1109/ICALT.2019.00049.

[19] P. Prinsloo, S. Slade, An elephant in the learning analytics room: The obligation to act, in: Proceedings of the Seventh International Conference on Learning Analytics & Knowledge, ACM, New York, 2017, pp. 46–55.