Monitoring students' self-regulation as a basis for an early warning system

Martín Liz-Domínguez, Manuel Caeiro-Rodríguez, Martín Llamas-Nistal and Fernando Mikic-Fonte
AtlanTTic Research Center, Universidade de Vigo, Campus Lagoas-Marcosende, 36310 Vigo, Spain
[mliz, mcaeiro, martin, mikic]@gist.uvigo.es

Copyright © 2021 for this paper by its authors. Use permitted under Creative Commons License Attribution 4.0 International (CC BY 4.0).

Abstract. Among the elements that determine a student's academic success, their ability to regulate their own learning processes is an important, yet typically underrated factor. It is possible for students to improve their self-regulated learning skills, even at the university level. However, they are often unaware of their own behavior. Moreover, instructors are usually not prepared to assess students' self-regulation. This paper presents a learning analytics solution that focuses on rating self-regulation skills, separated into several categories, using activity and performance data from an LMS as well as self-reported student data gathered via questionnaires. It is implemented as an early warning system, offering the possibility of detecting students whose poor SRL profile puts them at risk of academic underperformance. At the time of writing, this is still a work in progress, being tested in the context of a first-year college engineering course.

Keywords: learning analytics, self-regulated learning, early warning systems

1 Introduction

Self-regulation skills are key to achieving successful learning results. Many studies have shown a good correlation between self-regulation skills and academic performance, including at the higher education level [16]. Good performance is related to a proper acquisition of self-regulation skills, while poor performance and drop-out are associated with poor management of them. Therefore, for the purposes of early warning systems, it is very valuable to know how students are regulating themselves. This can be a useful indicator to identify students who are struggling because of poor management of these skills.

Throughout the history of educational research, many authors have invested effort in understanding how students regulate their own learning behavior, and how this affects their performance and learning outcomes. Initial approaches were based on questionnaires, usually very long ones, used to ask students about their beliefs and strategies regarding the several categories involved in self-regulation. These instruments have two main issues [5]: first, as self-regulation involves a large variety of categories, the questionnaires include many questions and take considerable time for students to answer properly. To mitigate this problem, many published works use reduced questionnaires to limit this burden. Second, the answers provided by students may not be consistent with their actual behavior during a course: even if a student knows what they should be doing in order to be successful, their actual behavior may differ significantly from that idea. A different approach to measuring a student's level of self-regulation is to infer it from their actual behavior, which can be achieved through learning analytics.
The following are some examples of authors who used learning analytics with the goal of assessing different aspects of students' self-regulation:

– Several papers by Dragan Gašević, Jelena Jovanović and Abelardo Pardo focus on the analysis of LMS trace data in order to identify students' learning strategies regarding the use of online resources. In [8], Gašević et al. establish the basis of this line of research, defining several patterns in learning behavior (such as a focus on formative or summative assessment, or a preference for learning via videos) which allowed them to cluster students depending on their LMS activity. Around the same time, this group of authors published another work [9] that expands upon this methodology, identifying a clear correlation between learning strategies and performance.
– A study by Asarta and Schmidt [1] focused on students' time management and procrastination, an important area within self-regulation. The context of this study is a blended learning course, in which students needed to listen to recorded speech over slides instead of attending traditional lectures. Factors such as the moments at which students elected to access these online contents and the length of study sessions were useful for assessing students' use of time. In particular, the authors highlight that regularity (the ability of students to keep up to date with the lectures and evenly balance their workload throughout the course) is the exact opposite of procrastination and is generally favorable to student performance.
– Mega et al. [12] used several questionnaires to collect self-reported data from students, highlighting aspects related to self-regulation, emotions and motivation. The authors were able to demonstrate a positive correlation between these aspects and academic achievement through a structural equation model.

As these examples show, many authors make use of the data that online tools such as LMS make available. However, the collection of self-reported data using surveys and questionnaires is still a widely used technique: even though it is more susceptible to bias, it can provide information that is very difficult or impossible to obtain otherwise.

A common trait of these studies is that they usually focus on one particular aspect of self-regulation. The examples above address learning strategies, time management and student motivation, respectively.

The study presented in this paper aims to cover a wide spectrum of self-regulated learning components with a simple approach, the goal being to obtain general SRL profiles of students that can be easily interpreted by a non-expert user. Moreover, the analysis procedure behind this objective can work as the basis for an early warning system: profiles are generated as the course is taking place, allowing teachers to understand the particular SRL aspects in which students struggle and helping them improve. In the end, the generation and presentation of these SRL profiles aims to be a learning tool for students, since as their self-regulation capabilities improve, so will their learning outcomes [13].
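As a concrete illustration of what such a profile might look like, the following minimal sketch represents a student's per-category scores and flags weak categories. It is only an illustration: the category names anticipate Section 2.1, while the 1 to 5 scale, the threshold and all identifiers are assumptions of ours rather than the system's actual implementation.

```python
from dataclasses import dataclass

# Illustrative only: category names follow Section 2.1; the 1-5 scale and
# the 2.5 threshold are assumptions for this sketch, not the system's values.
@dataclass
class SRLProfile:
    student_id: str
    learning_strategies: float  # each score on a 1-5 scale
    time_management: float
    resource_management: float
    self_monitoring: float
    motivation: float

    def weak_categories(self, threshold: float = 2.5) -> list[str]:
        """Return the categories whose score falls below the threshold."""
        scores = {
            "Learning strategies": self.learning_strategies,
            "Time management": self.time_management,
            "Resource management": self.resource_management,
            "Self-monitoring and self-assessment": self.self_monitoring,
            "Motivation and self-confidence": self.motivation,
        }
        return [name for name, value in scores.items() if value < threshold]

# A profile with low time management and motivation scores would be flagged:
profile = SRLProfile("s001", 3.8, 2.1, 3.5, 3.0, 2.4)
print(profile.weak_categories())
# ['Time management', 'Motivation and self-confidence']
```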
In order to generate these profiles, we use a combination of self-reported data (questionnaires) and observational data (LMS and similar online tools), which can be directly related to certain self-regulation aspects. For example, students' usage patterns of an LMS can give us insight into how they manage their time and how they use the available learning resources.

This paper is structured as follows: after the present introduction, Section 2 explains the foundations of the study and how the analysis procedure works. Section 3 details how this instrument is being used in a first-year university course, and the results that have been observed so far. Finally, Section 4 offers a conclusion and possible lines of future work.

2 Study foundations

The following subsections provide some brief reasoning regarding our approach to the division of different SRL aspects into categories, as well as detailing the types of both self-reported and observational data that we have at our disposal.

2.1 Self-regulated learning categories

The classification of different SRL components into categories is not a particularly novel concept. For example, Zimmerman and Martínez-Pons proposed a detailed category list in their 1986 study [19], complementing the definition of one of the first widely known SRL questionnaires, the Self-Regulated Learning Interview Schedule (SRLIS). In particular, these authors distinguished between 15 different categories, including items such as self-evaluation, information seeking, goal-setting, record keeping, or rehearsal and memorization.

This category definition, however, is not a standard among educational researchers, as authors who work with SRL categories typically define and use a set that best fits their particular experiment. For example, Perels et al. [14] work with just six categories: goal setting, motivation, learning strategies, self-efficacy, self-reflection and problem-solving. On the other hand, Fabriz et al. [6] use 19 much more specific categories in their study, such as help seeking, procrastination or reflection.

For our purposes, we wanted to define a simple, reduced set of SRL categories. This is because of the mid-term goal of reporting SRL information to students and teachers via an early warning system: it is important that the reported data is presented in a way that is easy to understand and interpret. As for which categories to choose, we considered the ones that have been observed to be most correlated with academic performance, according to studies such as the ones just cited. Furthermore, we made sure that our available data could be directly associated with these categories. In the end, we settled on the following five categories:

1. Learning strategies. These encompass the variety of ways in which students interact with course resources and undertake tasks. Depending on their learning strategies, students may take superficial or deep approaches to learning (focusing on repetition and memorization, or making an effort to understand contents), which may be more or less effective depending on the specific subject. This category also includes the student's own awareness of the learning strategies that they use and of how effective they are.
2. Time management. The effectiveness of a student's time management is defined by the amount of time they spend on academic tasks, as well as the time frames they choose for doing so. Ideally, students should be aware of the amount of time they need to properly prepare their subjects and plan their study sessions around that. Additionally, they should avoid unnecessary delays in task performance, also known as procrastination. The ability of students to allocate time to personal activities or breaks also falls under this category.
3. Resource management. We define as resources not only the different types of learning materials at the student's disposal, but also elements such as interactions with teachers and other students, or the use of libraries and other study spaces. This category measures the ability of the student to use all of these resources to their advantage in order to improve their learning performance.
4. Self-monitoring and self-assessment. Self-monitoring is the student's capability to recognize whether they are making progress towards their academic objectives as they study or perform tasks. Meanwhile, self-assessment skills involve reflecting on a previous task or study session, making sure that all goals established for that session were accomplished. In both cases, the student must be able to detect deficiencies in their work methods and apply solutions in order to improve them.
5. Motivation and self-confidence. These include several types of emotional factors that directly affect students' learning, performance and self-regulation capabilities. These factors are reflected in aspects and actions such as setting and pursuing learning goals, which milestones the student considers reachable or unreachable, the value they assign to tasks and subjects, or the mental strength to overcome the difficulties that the course poses.

2.2 Data collection

Like any other learning analytics task, this study relies on the availability of student data. We make a distinction between two kinds of data: observational data, which includes online student activity data gathered from the LMS and other similar platforms, and self-reported data, which refers to information that is directly provided by students via surveys or questionnaires.

Observational data. This type of data is typically used in learning analytics studies. In our case, we have two different sources of observational data. On the one hand, course resources were made available to students via Moodle, and as such, access logs provide useful activity data. While finding an ideal way to process log data is a very complex problem in and of itself, we have found the methods detailed by Jovanović et al. [9], already mentioned in Section 1, very interesting. These authors transform their LMS log data into learning sequences, which let them analyze each individual study session, including the online resources that students use and the order in which they access them. This transformation gives us a good idea of how students make use of online learning resources, and lets us infer some information regarding the learning strategies that they follow.
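As an illustration of this kind of transformation, the minimal sketch below groups one student's log events into study sessions and reduces each session to the sequence of resource types accessed. The simplified event format, the function name and the 30-minute inactivity threshold are assumptions of ours, not details taken from [9] or from the actual Moodle log schema.

```python
from datetime import datetime, timedelta

# Each log record for a given student: (timestamp, resource_type), where the
# resource type could be "video", "formative_quiz", "slides", etc.
# (simplified format assumed for this sketch).
SESSION_GAP = timedelta(minutes=30)  # inactivity threshold, also an assumption

def build_learning_sequences(events):
    """Group time-ordered log events into study sessions: a new session
    starts whenever the gap between consecutive events exceeds SESSION_GAP.
    Returns a list of resource-type sequences, one per session."""
    sessions = []
    current, last_time = [], None
    for timestamp, resource_type in sorted(events):
        if last_time is not None and timestamp - last_time > SESSION_GAP:
            sessions.append(current)
            current = []
        current.append(resource_type)
        last_time = timestamp
    if current:
        sessions.append(current)
    return sessions

events = [
    (datetime(2021, 3, 1, 10, 0), "video"),
    (datetime(2021, 3, 1, 10, 20), "formative_quiz"),
    (datetime(2021, 3, 2, 18, 5), "slides"),
]
print(build_learning_sequences(events))
# [['video', 'formative_quiz'], ['slides']]
```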
On the other hand, the course used the Blended e-Assessment platform (BeA) [10] to manage exams and any activity related to them. Data from BeA provides not only grade information, but also insight into the kinds of mistakes students make during exams, as well as aspects related to teacher-student communication in exam reviews. This information is not typically available in a regular LMS, and serves as a good complement to the data obtained from Moodle.

Self-reported data. Self-regulated learning questionnaires have been widely used by educational researchers for decades, and are still very popular to this day due to their ability to provide data that is not easily obtainable via observation. For example, information regarding motivational aspects is easy to gather using questionnaires, but very difficult to infer from LMS logs. This is why we consider self-reported data to be a necessary complement to observational data in order to get a complete picture of a student's SRL profile.

The main problem with self-reported data is the inherent bias of students when answering questionnaires or surveys. Information directly provided by students may not be accurate due to different factors, such as misconceptions about their own behavior, or even students willfully lying in their answers. This is why self-reported data must be contrasted with observational data whenever possible.

As a result of their popularity, many different questionnaires have been designed by a variety of authors throughout the years. For our own questionnaires, we have adapted questions from previously existing ones, modifying them to better suit our particular context. The questionnaires that served as inspiration were:

– Study Process Questionnaire (SPQ) [3].
– Motivated Strategies for Learning Questionnaire (MSLQ) [15].
– Metacognitive Awareness Inventory (MAI) [17].
– Learning Strategies Questionnaire (LSQ) [18].
– Revised Two-Factor Study Process Questionnaire (R-SPQ-2F) [4].
– Questionnaire for the Assessment of Learning Strategies of University Students (CEVEAPEU, originally in Spanish: Cuestionario de Evaluación de las Estrategias de Aprendizaje de los Estudiantes Universitarios) [7].
– Online Self-regulated Learning Questionnaire (OSLQ) [2].

3 Execution and results

The following subsections summarize the context in which this experiment was carried out, the ways data were collected throughout the course, and the provisional results obtained so far.

3.1 Context

The focus of this study is a Computer Architecture course, part of the Degree in Telecommunications Engineering taught at the University of Vigo, Spain [11]. At the time of writing, the course has yet to finish; thus, only partial results are described. The course is one of five taught simultaneously during the second semester of the degree's first year, and it spans a total of 16 weeks.

The course has two separate parts that students need to pass, theory and practice, both implementing a continuous assessment system. In the latter, students are presented with weekly assembly programming assignments to solve, and the assessment consists of three exams performed throughout the semester.
The theory part, instead of traditional lectures, follows a flipped classroom system: students are provided with videos covering the subject contents to watch at home, and classroom sessions are used for questions and problem solving. Additionally, students take short exams every two weeks, which may allow them to pass the subject without sitting a final exam.

A final assessment system is also available if the student so prefers, but following the continuous assessment system is encouraged. During the academic year 2020/2021, out of 212 students enrolled in the subject, 123 followed the continuous assessment system. It is worth noting that this rate is lower than the degree average, which is explained by the fact that students who are retaking the subject often choose to follow the final assessment system.

In the practical part of the subject, assignments are made available to students via the institutional LMS, based on Moodle. Grades for this part are also reported through this medium. As for the theory part, videos, slides, self-assessment tests and all other kinds of learning material are likewise available on the institutional LMS. However, exam handling is performed using BeA, including exam signups, grade reporting and reviews.

3.2 Experiment structure

As this is the first year in which this experiment is being performed, the main goal is to gather data from students, both self-reported and observed, and try to define basic self-regulated learning profiles. Additionally, identifying correlations between the gathered data and student performance will set the foundations for the implementation of an early warning system.

The use of Moodle in this course allows us to collect data related to student activity. Moodle logs provide information regarding when students log into the platform and which resources they visit. Additionally, the use of BeA provides the possibility of gathering assessment-related data that would be very difficult to obtain and process otherwise.

As explained in Section 2, SRL questionnaires are used to collect self-reported data from students. The questions use a 1 to 5 Likert-style scale, through which the student expresses their level of agreement or disagreement with the statement posed in each item. Each question can be directly linked to one of the five self-regulated learning categories defined in Section 2.1. Students are never required to answer these questionnaires, but are encouraged to do so.

At the beginning of the course (during its second week), a 20-item SRL questionnaire was administered during in-person theory sessions. This initial questionnaire includes 4 questions related to each of the five SRL categories, and has the main purpose of providing basic information for SRL profiling. Appendix A lists the 20 items that were included in this initial questionnaire.

Additionally, several shorter questionnaires of 7 items each are made available to students through BeA at different points of the course. Three of these smaller questionnaires were scheduled throughout the semester, one every 4 weeks. Their intended purpose is to track the evolution of students' views regarding their self-regulation abilities.
On top of this, they are designed as brief self-reflection exercises for students.

Regarding the SRL questionnaires, students who followed the continuous assessment system were split into two groups of equal size: an experimental group, which has access to the questionnaires and is encouraged to answer them as they are made available, and a control group, which was asked to fill in the first questionnaire at the beginning of the course, but none of the subsequent ones.

3.3 Provisional results

As of the writing of this paper, we have processed the self-reported data obtained from the initial questionnaire. As it was administered during an in-person session, a total of 113 students completed it, a very significant fraction of those following the continuous assessment system. Figure 1 displays the answer distribution for each of the 20 items in the questionnaire, classified by their respective self-regulated learning category. The number inside each tile in the graph represents the number of students who provided a particular answer to the corresponding question. Figure 2, in turn, represents the mean and standard deviation observed in the answers to each question.

Fig. 1. Answer distribution for the initial questionnaire.

Fig. 2. Mean and standard deviation for each item in the questionnaire.

As can be observed, while there is usually a clearly preferred answer for each question, the variation in answers is not insignificant. The standard deviation of the answers ranges from 0.583 for the most homogeneous question (question 10) to 1.318 for the most heterogeneous (question 11), and lies between 0.8 and 1.0 for most questions. This suggests that it may be possible to identify student clusters depending on their answers to the questionnaire, and possibly link them to strengths or weaknesses regarding specific self-regulated learning categories.

It is worth noting that there is a negative correlation between the average and the standard deviation of each question. Question 10, which as mentioned above has the lowest standard deviation, is also the question with the highest average answer value (4.46). Likewise, question 11, the one with the most deviation in its answers, had the lowest mean answer value (2.68). Generally, this means that there are items for which most students agree with the option that represents "best practices" in terms of self-regulation, while other questions yield more divided answers.

If we group questions into their respective SRL categories, the average answer values for each one can also be calculated (a computational sketch follows the list):

– Learning strategies: 3.62
– Time management: 3.28
– Resource management: 3.66
– Self-monitoring and self-assessment: 3.66
– Motivation and self-confidence: 3.42
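The per-item statistics and per-category averages above can be computed along the following lines. This is a minimal sketch: the answer matrix is filled with synthetic random values as a stand-in for the real 113 x 20 response data, and the CATEGORY mapping (a name of ours) follows the item-to-category assignment listed in Appendix A.

```python
import numpy as np

# Synthetic stand-in for the real answer matrix: one row per student,
# one column per questionnaire item, values on the 1-5 Likert scale.
rng = np.random.default_rng(0)
answers = rng.integers(1, 6, size=(113, 20))

item_means = answers.mean(axis=0)
item_stds = answers.std(axis=0, ddof=1)

# Correlation between item means and item standard deviations
# (negative in our real data, as noted above; sign depends on the data):
print(np.corrcoef(item_means, item_stds)[0, 1])

# Per-category averages, using the item-to-category mapping of Appendix A:
CATEGORY = {1: "LS", 6: "LS", 14: "LS", 16: "LS",
            4: "TM", 5: "TM", 8: "TM", 20: "TM",
            2: "RM", 13: "RM", 18: "RM", 19: "RM",
            3: "MA", 7: "MA", 12: "MA", 15: "MA",
            9: "MC", 10: "MC", 11: "MC", 17: "MC"}
for cat in ("LS", "TM", "RM", "MA", "MC"):
    columns = [item - 1 for item, c in CATEGORY.items() if c == cat]
    print(cat, answers[:, columns].mean())
```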
From the results of this questionnaire alone, it is not possible to discern which SRL categories the average student is weakest in. However, the results were useful for determining what the focus should be in the following, smaller questionnaires. The categories with a slightly lower overall score were Time management and Motivation and self-confidence. Additionally, as explained in Section 2.2, Motivation and self-confidence is arguably the category that is hardest to assess using observational data. Thus, we decided to draw mostly on questions from this category for future questionnaires.

Figure 3 represents the distribution of students' answers when grouped into their respective categories, and confirms the conclusion drawn from the average values: the figures for Learning strategies, Resource management and Self-monitoring and self-assessment look almost identical to each other, while Time management and Motivation and self-confidence show slightly lower overall values.

Fig. 3. Average answers given by students in the questionnaire, grouped by self-regulated learning category.

3.4 Upcoming analyses

So far, we have only fully processed the results from the first questionnaire. We will progressively incorporate data obtained from the course, both observational and self-reported, in order to properly assess students' SRL profiles. We then intend to look for correlations with course performance data, and determine what kinds of SRL deficiencies put a student at most risk of failing or abandoning the course. This will allow us to build an early warning system based on self-regulation data. These are the ways in which we intend to use the data at our disposal:

– Extra SRL questionnaires administered at different points of the course provide further information on the evolution of students' SRL abilities. As explained in Section 3.2, students who follow the continuous assessment are split into control and experimental groups of roughly 60 students each, and only those in the latter are allowed to view and answer the extra questionnaires. So far, we have observed that only about one third of the students in the experimental group actually answered the first extra questionnaire. Thus, additional measures to foster participation may be required in order to improve the usefulness of the extra SRL questionnaires. Once the results for all questionnaires during the course have been collected, we intend to validate the questions using Cronbach's alpha coefficient (a sketch of this computation follows the list). In particular, we will check whether questions that address the same SRL category receive reliable and consistent answers. The validation outcome will be taken into account to improve the surveys to be administered during the next academic year.
– Moodle activity data can be used to track students' use of learning resources. On top of being useful for assessing students' Time management and Resource management, these activity logs can also provide hints towards identifying Learning strategies: for example, observing whether a student prioritizes some kinds of resources or activities over others, or whether there are topics that a student deliberately avoids.
– Finally, BeA data can provide insights towards assessing students' Self-monitoring and self-assessment. With the help of these data, we could identify the kinds of mistakes that students make most often during exams, and even whether they repeat similar mistakes across multiple questions or different exams.
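The Cronbach's alpha computation mentioned in the first point is the standard one; a minimal sketch follows, applied per category. The toy answer matrix and the choice of the four Time management items are illustrative only.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for a (students x items) answer matrix; intended
    to be applied to the items of a single SRL category at a time."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1.0 - item_variances.sum() / total_variance)

# Toy example: six students' answers to the four Time management items
# (items 4, 5, 8 and 20 of the initial questionnaire; values are made up).
tm_items = np.array([
    [4, 4, 3, 4],
    [2, 3, 2, 2],
    [5, 4, 4, 5],
    [3, 3, 3, 2],
    [4, 5, 4, 4],
    [1, 2, 2, 1],
])
print(round(cronbach_alpha(tm_items), 3))
```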
4 Conclusion and future work

It is unquestionable that self-regulation plays a pivotal role in students' performance and quality of learning at any level of education. However, this aspect is often overlooked due to its relative obscurity, going unconsidered by students and instructors alike. This project aims to raise awareness of self-regulation among the educational community, providing a way to assess the strengths and weaknesses of students in different self-regulated learning aspects. While this work is still at an early stage, we expect that the volumes of data we handle, both self-reported and observational, can help us build reasonably reliable SRL profiles at early stages of a course.

The lines for immediate future work were outlined in Section 3.4. We will continue to work with data from the target Computer Architecture course in future academic years. Additionally, we have been contacting other academic institutions of different educational levels in order to seek lines of cooperation. It would be ideal to test the ways in which the knowledge acquired from the experiments in Computer Architecture could be applied in other contexts.

Acknowledgment. We want to thank Javier Montoto Urrabieta for his support in the development and maintenance of BeA.

This work is partially financed by public funds granted by the Galician regional government with the purpose of supporting research activities carried out by PhD students ("Programa de axudas á etapa predoutoral da Xunta de Galicia — Consellería de Educación, Universidade e Formación Profesional").

This work has received financial support from the Xunta de Galicia (Centro singular de investigación de Galicia accreditation 2019-2022) and the European Union (European Regional Development Fund, ERDF), and from the Galician Regional Government under project ED431B 2020/33.

A Initial questionnaire items

The following is a list of the questions that were part of the initial questionnaire, which students filled in at the start of the course (originally in Spanish). Beside each question is the SRL category it is associated with: learning strategies (LS), time management (TM), resource management (RM), self-monitoring and self-assessment (MA) or motivation and self-confidence (MC).

1. I often write summaries of the subjects' learning material. (LS)
2. I try to find classmates who I can trust and ask for help if I need it. (RM)
3. If I do not understand something of what I am studying, I go back and review it to make sure I comprehend everything. (MA)
4. I try to finish assigned tasks as soon as possible. (TM)
5. I often invest more time and effort in harder subjects. (TM)
6. When studying a subject, I try to identify the key concepts, as well as the contents that are not as important. (LS)
7. When I finish a study session, I ask myself questions to check that I have understood everything. (MA)
8. I get enough sleep and take the breaks I need. (TM)
9. I am capable of making an effort to focus on my task when I start getting distracted. (MC)
10. It is very important to me to understand the contents of the subjects. (MC)
11. I get very nervous when I am doing an exam. (MC)
12. I try to establish relationships between what I learn in one subject and the contents of others. (MA)
13. I have an adequate place to study, where I can fully focus on my task. (RM)
14. I generally only study what I need to pass the subject, since I think it is useless to do extra work. (LS)
15. When I finish an exam, I am aware of how well I did. (MA)
16. When studying my subjects, I set goals to reach in each study session. (LS)
17. I am confident that I can handle even the hardest parts of the subjects. (MC)
18. If I do not understand something, I ask the teacher. (RM)
19. I am aware of the assessment criteria that the teachers will use in the different subjects. (RM)
20. I have a weekly study schedule and try my best to follow it. (TM)

Note: questions 11 and 14 are phrased in such a way that "negative" answers are those with a higher level of agreement. Therefore, the values of their answers were inverted before analysis (1 becomes 5, 2 becomes 4, and so on).
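The inversion described in the note amounts to mapping a raw answer x to 6 - x. A minimal sketch follows; the names are ours, for illustration only.

```python
# Reverse-score the negatively worded items (11 and 14), so that higher
# values always represent better self-regulation practice.
REVERSED_ITEMS = {11, 14}

def adjust_answers(raw):
    """raw: list of the 20 answers (1-5) in questionnaire order."""
    return [6 - x if i + 1 in REVERSED_ITEMS else x
            for i, x in enumerate(raw)]

raw = [3] * 20
raw[10], raw[13] = 2, 5      # raw answers to items 11 and 14
print(adjust_answers(raw))   # items 11 and 14 become 4 and 1
```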
References

1. Asarta, C.J., Schmidt, J.R.: Access Patterns of Online Materials in a Blended Course. Decision Sciences Journal of Innovative Education 11(1), 107–123 (jan 2013), http://doi.wiley.com/10.1111/j.1540-4609.2012.00366.x
2. Barnard, L., Lan, W.Y., To, Y.M., Paton, V.O., Lai, S.L.: Measuring self-regulation in online and blended learning environments. The Internet and Higher Education 12(1), 1–6 (jan 2009), https://linkinghub.elsevier.com/retrieve/pii/S1096751608000675
3. Biggs, J.: The Study Process Questionnaire (SPQ): Manual. Tech. rep., Australian Council for Educational Research, Hawthorn, Australia (1987), https://eric.ed.gov/?id=ED308200
4. Biggs, J., Kember, D., Leung, D.Y.: The revised two-factor Study Process Questionnaire: R-SPQ-2F. British Journal of Educational Psychology 71(1), 133–149 (mar 2001), https://onlinelibrary.wiley.com/doi/abs/10.1348/000709901158433
5. ElSayed, A.A., Caeiro-Rodríguez, M., Mikic-Fonte, F.A., Llamas-Nistal, M.: Research in Learning Analytics and Educational Data Mining to Measure Self... In: World Conference on Mobile and Contextual Learning. pp. 46–53 (2019)
6. Fabriz, S., Dignath-van Ewijk, C., Poarch, G., Büttner, G.: Fostering self-monitoring of university students by means of a standardized learning journal—a longitudinal study with process analyses. European Journal of Psychology of Education 29(2), 239–255 (jun 2014), https://link.springer.com/article/10.1007/s10212-013-0196-z
7. Gargallo, B., Suárez-Rodríguez, J.M., Pérez-Pérez, C.: El cuestionario CEVEAPEU. Un instrumento para la evaluación de las estrategias de aprendizaje de los estudiantes universitarios. RELIEVE - Revista Electrónica de Investigación y Evaluación Educativa 15(2), 1–31 (2009), https://ojs.uv.es/index.php/RELIEVE/article/view/4156
8. Gašević, D., Jovanović, J., Pardo, A., Dawson, S.: Detecting Learning Strategies with Analytics: Links with Self-Reported Measures and Academic Performance. Journal of Learning Analytics 4(2), 113–128 (jul 2017), https://epress.lib.uts.edu.au/journals/index.php/JLA/article/view/5085
9. Jovanović, J., Gašević, D., Dawson, S., Pardo, A., Mirriahi, N.: Learning analytics to unveil learning strategies in a flipped classroom. The Internet and Higher Education 33, 74–85 (apr 2017), https://linkinghub.elsevier.com/retrieve/pii/S1096751617300684
10. Llamas-Nistal, M., Fernández-Iglesias, M.J., González-Tato, J., Mikic-Fonte, F.A.: Blended e-assessment: Migrating classical exams to the digital world. Computers & Education 62, 72–87 (mar 2013), https://linkinghub.elsevier.com/retrieve/pii/S0360131512002497
11. Llamas-Nistal, M., Mikic-Fonte, F.A., Caeiro-Rodriguez, M., Liz-Dominguez, M.: Supporting Intensive Continuous Assessment With BeA in a Flipped Classroom Experience. IEEE Access 7, 150022–150036 (2019), https://ieeexplore.ieee.org/document/8865067/
12. Mega, C., Ronconi, L., De Beni, R.: What makes a good student? How emotions, self-regulated learning, and motivation contribute to academic achievement. Journal of Educational Psychology 106(1), 121–131 (feb 2014), http://doi.apa.org/getdoi.cfm?doi=10.1037/a0033546
13. Nilson, L.: Creating Self-Regulated Learners: Strategies to Strengthen Students' Self-Awareness and Learning Skills. Stylus Publishing (2013), https://books.google.es/books?id=ZeBaAQAAQBAJ
14. Perels, F., Gürtler, T., Schmitz, B.: Training of self-regulatory and problem-solving competence. Learning and Instruction 15(2), 123–139 (apr 2005), https://linkinghub.elsevier.com/retrieve/pii/S095947520500023X
15. Pintrich, P.R., Smith, D., Garcia, T., McKeachie, W.: A manual for the use of the Motivated Strategies for Learning Questionnaire (MSLQ). Tech. rep., National Center for Research to Improve Postsecondary Teaching and Learning, Ann Arbor, MI, USA (1991), https://eric.ed.gov/?id=ED338122
16. Roth, A., Ogrin, S., Schmitz, B.: Assessing self-regulated learning in higher education: a systematic literature review of self-report instruments. Educational Assessment, Evaluation and Accountability 28(3), 225–250 (aug 2016), https://link.springer.com/article/10.1007/s11092-015-9229-2
17. Schraw, G., Dennison, R.S.: Assessing Metacognitive Awareness. Contemporary Educational Psychology 19(4), 460–475 (oct 1994), https://linkinghub.elsevier.com/retrieve/pii/S0361476X84710332
18. Warr, P., Downing, J.: Learning strategies, learning anxiety and knowledge acquisition. British Journal of Psychology 91(3), 311–333 (aug 2000), http://doi.wiley.com/10.1348/000712600161853
19. Zimmerman, B.J., Pons, M.M.: Development of a Structured Interview for Assessing Student Use of Self-Regulated Learning Strategies. American Educational Research Journal 23(4), 614 (1986), http://links.jstor.org/sici?sici=0002-8312%28198624%2923%3A4%3C614%3ADOASIF%3E2.0.CO%3B2-P&origin=crossref