Supporting self-regulated learning in a blended learning environment using prompts and learning analytics

Sabina Rako a,b, Diana Šimić a and Bart Rienties c

a University of Zagreb, Faculty of Organization and Informatics, Pavlinska 2, Varazdin, 42000, Croatia
b University of Zagreb, University Computing Centre, Josipa Marohnica 5, Zagreb, 10000, Croatia
c Open University, Milton Keynes MK7 6AA, United Kingdom

Abstract
Higher education institutions, teachers, and students face new difficulties and opportunities resulting from the introduction of modern technology into the learning process. The growing prevalence of learning environments that integrate online and face-to-face learning may create opportunities as well as difficulties for the self-regulation skills of some groups of students. Providing automated prompts may help support students with insufficient self-regulation skills. The use of learning analytics together with multiple methods and data sources (data triangulation) may give better insight into the self-regulation process. The objective of the proposed research is to explore students’ evaluation of the usefulness of prompts implemented in a blended learning environment. A secondary objective is to develop and evaluate a real-time dashboard designed to notify teachers of student responses to deployed prompts. The research methodology will be grounded in action research and empirical research. The scientific contribution will be achieved through the development of artefacts and through empirical research that advances understanding of students’ self-regulation in a blended learning environment.

Keywords
learning analytics, self-regulated learning, prompts, blended learning, dashboards, higher education

Proceedings of the Doctoral Consortium of the Seventeenth European Conference on Technology Enhanced Learning, September 12–16, 2022, Toulouse, France
EMAIL: sabina.rako@srce.hr (A. 1); diana.simic@foi.unizg.hr (A. 2); bart.rienties@open.ac.uk (A. 3)
ORCID: 0000-0002-8457-3089 (A. 1); 0000-0002-6721-7250 (A. 2); 0000-0003-3749-9629 (A. 3)
© 2022 Copyright for this paper by its authors. Use permitted under Creative Commons License Attribution 4.0 International (CC BY 4.0).
CEUR Workshop Proceedings (CEUR-WS.org)

1. Introduction

Over the past two decades, blended learning has become increasingly widespread in higher education [1]. The effectiveness of blended learning in relation to traditional learning is continuously reviewed [2,3]. Recently, Müller and Mildenberger [4] conducted a meta-analysis of scientific papers published from 2008 to 2019 and found that identical learning outcomes were achieved in blended learning as in a conventional classroom setting, with a reduction of 30 to 79% in time spent in the physical classroom (classification according to Allen et al. [5]). This research also revealed that it is not yet possible to identify for which specific competencies (or disciplines) a blended learning format is most appropriate.

Several teachers and institutions strive to develop personalised learning approaches in an effort to meet the needs of each student to the greatest extent possible. To be able to customise the approach, it is necessary to examine the views and habits of students. For example, information systems deployed in the teaching and learning process are sources of valuable educational data that may be used to monitor and assess the teaching and learning process [6], and they play a vital part in the development of personalised solutions.
Learning analytics as a research area is focused on the "measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimising learning and the environments in which it occurs" [7]. The implementation of learning analytics is a complex process that requires capability building and certain specific competencies of stakeholders in the education system. In practice, learning analytics examples can be found at several levels (e.g., students, courses, programmes, institutions, and consortiums of institutions) [8]. When applying learning analytics, technology should be used wisely, taking into account existing educational concepts and research knowledge [9].

Tsai et al. [10] provided an overview of trends and limits in the deployment of learning analytics in the European higher education system. According to their research, teachers and teaching staff are the primary users of learning analytics, and there is limited evidence of active engagement with students and of the use of learning analytics to improve self-regulated learning skills.

Self-regulated learning includes cognitive, metacognitive, behavioural, motivational, and emotional aspects of learning. This area has been extensively researched in the field of educational psychology, and among the best known and most applied models is Zimmerman's model of self-regulated learning, which consists of three main phases: (a) forethought, (b) performance, and (c) self-reflection [11]. Wong et al. [12], in a systematic review of self-regulated learning in online environments and massive open online courses (MOOCs), demonstrated the need for further research on self-regulated learning in an online environment, particularly through an empirical approach. Furthermore, Viberg et al. [13] examined empirical research in which learning analytics were used to improve self-regulated learning and concluded that few studies related to the self-reflection phase of the Zimmerman model, and that the majority of research focused on measuring self-regulated learning rather than supporting it.

In previous research, feedback and prompts have been identified as the most important elements that encourage self-regulated learning [12]. Prompts are "visual, textual, or spoken elements that the teacher uses to encourage understanding and are most often in a form of questions, although they can also be formulated in the form of advice or instructions" [14]. Another definition of prompts is "short hints or questions presented to students in order to activate knowledge, strategies or skills that students have already available but do not use" [15]. Additionally, students do not usually manifest self-regulated behaviour spontaneously without guidance [16]. Although research has revealed a number of potential advantages of prompts for self-regulated learning, Schumacher and Ifenthaler [17] reported that learning analytics approaches have not been thoroughly examined in prompt implementation, and that future studies should also focus on students' responses to prompts.

The proposed research will also consider learning design as an important element in educational interventions.

Specifically, the following research questions will drive the proposed research:

RQ1: To what extent are students aware of self-regulation elements, such as metacognitive activities before/during/after learning, environmental structuring, help seeking, and time management, in the blended learning environment?

RQ2: In a blended learning environment, which types of prompts (cognitive, metacognitive, motivational, or content-related) do groups of students find most useful?

RQ3: Is there a difference in the perceived usefulness of the same type of prompt based on the mode of learning (online and face-to-face)?

RQ4: How does the implementation of specific prompts affect (a) students' engagement, (b) results achieved in formative assessment, and (c) overall learning satisfaction? What distinctions exist amongst student groups?

RQ5: Which components of the real-time dashboard for displaying student feedback on prompt implementation are important to students and/or teachers?

2. Methodology

This proposed research will utilise a mixed-method practical action research design. According to Creswell [18], action research is used to address specific, practical issues that seek solutions to a problem, and both quantitative and qualitative methods may be employed. Somekh [19] proposes a four-step process for action research: planning, acting, observing, and reflecting. The proposed activities in each action research step and the key artefacts are shown in Figure 1. Several research methods, including descriptive statistics, natural language processing methods (for open-ended questions), statistical analysis, and nonparametric tests, will be utilised for data analysis. For statistical analysis, the statistical programming language R [20] will be used.
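To illustrate the kind of nonparametric analysis envisaged here, the sketch below compares perceived-usefulness ratings of prompts between two simulated student groups in R. The data frame, variable names, group labels, and rating scale are illustrative assumptions and do not correspond to the study's actual instruments or data.

```r
# Minimal sketch of a nonparametric group comparison in R.
# All data below are simulated; names and scales are assumptions.
set.seed(42)

ratings <- data.frame(
  group  = rep(c("intervention", "comparison"), each = 30),
  rating = c(sample(1:5, 30, replace = TRUE, prob = c(.05, .10, .20, .35, .30)),
             sample(1:5, 30, replace = TRUE, prob = c(.15, .25, .30, .20, .10)))
)

# Descriptive statistics: median perceived usefulness per group
aggregate(rating ~ group, data = ratings, FUN = median)

# Wilcoxon rank-sum (Mann-Whitney) test: do the two groups differ
# in how useful they perceive the prompts to be?
wilcox.test(rating ~ group, data = ratings)

# With more than two groups (e.g., several prompt types), a
# Kruskal-Wallis test could be used instead:
# kruskal.test(rating ~ prompt_type, data = ratings_by_type)
```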
Figure 1: Proposed activities and key artefacts based on steps in Somekh's action research process (Source: Author)

The intervention will be designed as an iterative process, with a pilot trial followed by the main study. The interventions are intended to be implemented at two higher education institutions in Croatia, targeting around 340 students and 3 teachers. Ethical approval from the participating higher education institutions will be obtained.

2.1. Planning

The initial literature review showed a research gap in the area of learning analytics approaches to investigating prompts for supporting students' self-regulation. During the preparation phase, an additional literature review will be conducted to synthesise the findings of prior research, identify appropriate measurement instruments, and provide an overview of the outcomes of prior empirical interventions.

Teachers will be closely involved in preparations for implementation (analysis of the current learning design of a course, defining specific goals of prompt implementation, finding appropriate learning types, and defining prompts based on selected models).

During this phase, the appropriate measurement instruments will be evaluated (linguistic evaluation) or, if necessary, a new measurement instrument will be developed.

2.2. Acting

This activity is a key component of the research proposal. During this phase, the developed artefacts will be used in a real environment.

The dominant research method will be a pretest-posttest nonequivalent groups design, a type of quasi-experimental design. One group of students will be exposed to an intervention, while the other group will not. The two groups will then be compared. According to previous research [21], in order to eliminate confounding variables, the duration of exposure should not be excessively long (preferably 2–4 weeks).

Before the intervention, an a priori statistical power analysis will be conducted to determine the required number of outcome observations. During this stage, the measurement instruments will be evaluated in a real environment.
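As a sketch of what such an a priori power analysis might look like in R, the example below uses the base stats function power.t.test to estimate the per-group sample size for a two-group comparison. The effect size, significance level, and power targets are illustrative assumptions rather than the study's actual design parameters.

```r
# Illustrative a priori power analysis; the parameters below
# (effect size, alpha, power) are assumptions for demonstration only.

# Approximate sample size per group needed to detect a medium
# standardised effect (Cohen's d = 0.5) with alpha = .05 and
# power = .80 in a two-sample comparison of post-test scores.
power.t.test(delta = 0.5, sd = 1, sig.level = 0.05, power = 0.80,
             type = "two.sample", alternative = "two.sided")

# The pwr package offers analogous functions, e.g.:
# pwr::pwr.t.test(d = 0.5, sig.level = 0.05, power = 0.80)
```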
2.3. Observing

In this phase, monitoring activities and providing teachers with adequate technical support will be the primary activities. Data will be collected via system logs, measurement instruments, and prompt feedback.

To monitor student progress, teachers will have access to a real-time dashboard with visualisations of student responses.

2.4. Reflecting

Teachers will receive the intervention results during the reflection phase. In addition, they will assess the real-time dashboard that was accessible during the observing phase.

A think-aloud protocol [22] will also be implemented to collect specific information about students' and teachers' experiences with the prompt implementations.

3. Current results

A literature review focused on available measurement instruments (self-regulated learning, engagement, satisfaction, and other relevant constructs) is currently in progress.

Based on the initial reading of the literature and the good practice identified, a prototype of a plug-in for prompt implementation has been developed for the Moodle LMS platform (Figure 2). The plug-in makes it possible to embed prompts wherever an HTML editor is available.

Figure 2: Prompt prototype. Students could rate prompts and give textual feedback (Source: Author)

A prototype of the teachers' dashboard has also been developed (Figure 3).

Figure 3: Prototype of the teachers' dashboard providing real-time monitoring of students' responses (Source: Author)
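The kind of per-prompt summary such a dashboard could display is sketched below in R. The feedback table (prompt identifier, rating, free-text comment) is a hypothetical simplification for illustration and does not reflect the plug-in's actual data model.

```r
# Hypothetical prompt-feedback log, simplified for illustration;
# the real plug-in schema may differ.
feedback <- data.frame(
  prompt_id = c("P1", "P1", "P2", "P2", "P2", "P3"),
  rating    = c(4, 5, 2, 3, 3, 5),
  comment   = c("clear", "helpful", "too vague", "", "ok", "useful"),
  stringsAsFactors = FALSE
)

# Per-prompt summary a teacher-facing dashboard might show:
# number of responses, median rating, and share of ratings >= 4.
summary_by_prompt <- do.call(rbind, lapply(
  split(feedback, feedback$prompt_id),
  function(d) data.frame(
    prompt_id      = d$prompt_id[1],
    n_responses    = nrow(d),
    median_rating  = median(d$rating),
    share_positive = mean(d$rating >= 4)
  )
))

print(summary_by_prompt)
```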
In order to test the feasibility of the proposed study, a pre-pilot study has been conducted. Thirty-eight students gave consent to participate in the pre-pilot study. The students were second-year students of the informatology programme at the Faculty of Humanities and Social Sciences. Of the 38 students, 36 were female and two were male.

Lessons learned from the pre-pilot study:

• the suggested plug-in is appropriate for prompt implementation and gives considerable design flexibility with respect to learning design
• students are more likely to rate prompts during face-to-face meetings than during online sessions
• the teacher acknowledged the advantages of monitoring student responses, and the input gained could be useful for designing course improvements
• think-aloud sessions conducted with two students gave valuable insights into the perception of the implemented prompts
• adjustment of the rating scale should be considered (a 10- or 7-level scale)
• it would be useful to collect additional demographic information in order to better understand behavioural differences among students.

4. Contribution to TEL domain

The expected contributions of the proposed research to the Technology Enhanced Learning (TEL) domain are:

• a synthesis of empirical interventions and their results on supporting self-regulated learning with prompts using learning analytics in a blended learning environment
• the development and evaluation of artefacts related to prompt implementation in a real environment
• a better understanding of students' self-regulation in a blended learning environment using prompts
• results of empirical research on supporting self-regulated learning in a blended learning environment using prompts and learning analytics. After completing the experimental part of the proposed research, differences across student groups can be expected in terms of student engagement, formative assessment outcomes, and overall learning satisfaction. The combination of accessible students' demographic information with their responses and system data will provide insight into students' self-regulation practices and awareness.

5. Acknowledgments

This work has been fully supported by the Croatian Science Foundation under the project IP-2020-02-5071.

6. References

[1] M. Lundin, A. Bergviken Rensfeldt, T. Hillman, A. Lantz-Andersson, L. Peterson, Higher education dominance and siloed knowledge: a systematic review of flipped classroom research, International Journal of Educational Technology in Higher Education 20 (2018). doi: 10.1186/s41239-018-0101-6
[2] R. M. Bernard, E. Borokhovski, R. F. Schmid, R. M. Tamim, P. C. Abrami, A meta-analysis of blended learning and technology use in higher education: From the general to the applied, Journal of Computing in Higher Education 26(1) (2014) 87-122. doi: 10.1007/s12528-013-9077-3
[3] B. Anthony Jr., A. Kamaludin, A. Romli, A. F. M. Raffei, D. Nincarean A/L Eh Phon, A. Abdullah, G. L. Ming, N. A. Shukor, M. S. Nordin, S. Baba, Exploring the role of blended learning for teaching and learning effectiveness in institutions of higher learning: An empirical investigation, Education and Information Technologies 24(6) (2019) 3433-3466. doi: 10.1007/s10639-019-09941-z
[4] C. Müller, T. Mildenberger, Facilitating flexible learning by replacing classroom time with an online learning environment: A systematic review of blended learning in higher education, Educational Research Review 34 (2021). ISSN 1747-938X. doi: 10.1016/j.edurev.2021.100394
[5] I. E. Allen, J. Seaman, R. Garrett, Blending in: The extent and promise of blended education in the United States, Newburyport, MA: Sloan Consortium (2007)
[6] G. Siemens, Learning Analytics: The Emergence of a Discipline, American Behavioral Scientist 57(10) (2013) 1380–1400. doi: 10.1177/0002764213498851
[7] Society for Learning Analytics Research, What is Learning Analytics?, [Online]. Available: https://www.solaresearch.org/about/what-is-learning-analytics/ [Accessed June 29, 2021]
[8] Tyton, Learning analytics strategy toolkit, [Online]. Available: https://www.everylearnereverywhere.org/resources/learning-analytics-strategy-toolkit/ [Accessed June 29, 2021]
[9] D. Gašević, S. Dawson, G. Siemens, Let's not forget: Learning analytics are about learning, TechTrends 59 (2015) 64-71. doi: 10.1007/s11528-014-0822-x
[10] Y. Tsai, D. Rates, P. M. Moreno-Marcos, P. J. Muñoz-Merino, I. Jivet, M. Scheffel, H. Drachsler, C. D. Kloos, D. Gašević, Learning analytics in European higher education - Trends and barriers, Computers & Education 155 (2020) 103933. ISSN 0360-1315. doi: 10.1016/j.compedu.2020.103933
[11] E. Panadero, A review of self-regulated learning: Six models and four directions for research, Frontiers in Psychology 8, Article 422 (2017)
[12] J. Wong, M. Baars, D. Davis, T. Van Der Zee, G. Houben, F. Paas, Supporting Self-Regulated Learning in Online Learning Environments and MOOCs: A Systematic Review, International Journal of Human-Computer Interaction 35(4-5) (2018) 356-373
[13] O. Viberg, M. Khalil, M. Baars, Self-Regulated Learning and Learning Analytics in Online Learning Environments: A Review of Empirical Research, in: Proceedings of the 10th International Learning Analytics and Knowledge Conference, LAK'20, Association for Computing Machinery, New York, NY, 2020, pp. 173–186. ISBN: 978-1-4503-7712-6
[14] British Council, Prompts, [Online]. Available: https://www.teachingenglish.org.uk/article/prompts [Accessed Jun. 27, 2021]
[15] J. Wirth, Promoting self-regulated learning through prompts, Zeitschrift für Pädagogische Psychologie 23(2) (2009) 91-94
[16] C. Sonnenberg, M. Bannert, Evaluating the impact of instructional support using data mining and process mining: A micro-level analysis of the effectiveness of metacognitive prompts, Journal of Educational Data Mining 8(2) (2016) 51-83
[17] C. Schumacher, D. Ifenthaler, Investigating prompts for supporting students' self-regulation - A remaining challenge for learning analytics approaches?, The Internet and Higher Education 49 (2021). doi: 10.1016/j.iheduc.2020.100791
[18] J. W. Creswell, Educational Research: Planning, Conducting, and Evaluating Quantitative and Qualitative Research, fourth edition, Pearson, 2012
[19] B. Somekh, Action Research: A Methodology for Change and Development, 1st edition, Open University Press, 2005
[20] R Core Team, R: A language and environment for statistical computing, 2022. URL: https://www.R-project.org/
[21] L. Zheng, The effectiveness of self-regulated learning scaffolds on academic performance in computer-based learning environments: a meta-analysis, Asia Pacific Education Review 17 (2016) 187–202. doi: 10.1007/s12564-016-9426
[22] M. E. Fonteyn, B. Kuipers, S. J. Grobe, A description of think aloud method and protocol analysis, Qualitative Health Research 3(4) (1993) 430-441. doi: 10.1177/104973239300300403