A proposal for Monitoring the Intervention Strategy on the learning of MOOC learners

Ruth Cobos1[0000-0002-3411-3009] and Juan Soberón1

1 Computer Science Department, Universidad Autónoma de Madrid, Spain
ruth.cobos@uam.es, juan.soberon@estudiante.uam.es

Abstract. Insufficient feedback and the lack of interaction between instructors and learners negatively affect learner retention and engagement in MOOCs. These factors, paired with a feeling of isolation, strongly influence learners who do not complete the course in which they have enrolled. To address this situation, we propose a system that periodically provides MOOC learners with visual information on a Web-based Learner Dashboard, showing them their progress and engagement in the MOOC. This information is part of an Intervention Strategy on the learning of these learners. The system also offers MOOC instructors access to a Web-based Instructor Dashboard that shows the learners' interest in this service and, therefore, facilitates evaluating the success or failure of the Intervention Strategy.

Keywords: Learning Analytics, Massive Open Online Course, Learning Strategy, Web-based Technology, Feedback, Engagement, Intervention.

1 Introduction and motivation

Over the years, Massive Open Online Courses (MOOCs) have continuously grown in supply (number of courses offered), demand (number of learners enrolled) and relevance within the e-Learning discipline [8]. Nowadays, they are accepted as a valid source of educational resources, sometimes even as official training recognized by higher education institutions. The Universidad Autónoma de Madrid (UAM, Spain) has offered several MOOCs on edX since 2015 (https://www.edx.org/school/uamx). In this article, the approach presented was tested in the MOOC "Introduction to Development of Web Applications" (WebApp MOOC), as in previous works.

Several research studies agree that insufficient feedback and the lack of interaction between instructors and learners negatively affect learner retention and engagement in MOOCs [5][11]. These factors, paired with a feeling of isolation, strongly influence learners who do not complete the course in which they have enrolled. It is therefore necessary to address these issues in order to adequately direct research towards learners at risk of dropout.

1 Copyright © 2020 for this paper by its authors. Use permitted under Creative Commons License Attribution 4.0 International (CC BY 4.0)

It is well known that the main research discipline that can provide ideas to solve this problem is Learning Analytics [7][10]. A wide variety of Learning Analytics tools can be found in higher education institutions, addressing prediction and visualisation, among other aspects of learning in MOOCs [6][9]. Furthermore, one of the authors of this article has previously developed several Learning Analytics tools [1][2][3][4] to solve related problems.

With the aim of decreasing the aforementioned feeling of isolation in MOOCs, at UAM we have developed a Learning Analytics system that periodically provides feedback messages to learners. This system is called edX-LIS (an acronym of Learning Intervention System for edX MOOCs). It provides the course instructors with feedback messages that are based on each learner's performance in the course and contain information on their progress as well as recommendations on how they could improve it. The instructors can then send these messages to the MOOC learners.
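To make the idea of these feedback messages concrete, the following minimal sketch shows how such a message could be composed from a learner's indicators. It is only an illustration: the indicator names, thresholds and wording are hypothetical and do not correspond to the actual edX-LIS implementation.

```python
# Illustrative sketch only: the indicator names, thresholds and wording are
# hypothetical and do not correspond to the actual edX-LIS implementation.
def compose_feedback_message(name: str, indicators: dict) -> str:
    """Build a short progress/recommendation message from a learner's indicators."""
    lines = [f"Dear {name}, here is your weekly progress summary:",
             f"- Current grade: {indicators['current_grade']:.2f}",
             f"- Days connected this week: {indicators['days_connected']}"]
    # Hypothetical recommendation rules based on the learner's recent activity.
    if indicators["consecutive_inactivity_days"] >= 7:
        lines.append("Recommendation: you have been inactive for a week; "
                     "try to resume the course videos.")
    if indicators["assignments_solved"] < indicators["assignments_total"]:
        lines.append("Recommendation: there are graded assignments you have not attempted yet.")
    return "\n".join(lines)

print(compose_feedback_message("Alex", {
    "current_grade": 0.42, "days_connected": 2,
    "consecutive_inactivity_days": 8,
    "assignments_solved": 3, "assignments_total": 5,
}))
```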
In order to test the intervention strategy on learners' learning supported by edX-LIS, a research study was conducted with the learners of the first run of the aforementioned MOOC. The results obtained from this study demonstrated that the proposed intervention strategy had a positive impact on the motivation, persistence and engagement of the learners in the MOOC. Nevertheless, we obtained information about the usefulness of the intervention strategy only through the answers to a satisfaction questionnaire, which just a few of the enrolled learners who received these feedback messages answered.

Therefore, in order to monitor the intervention strategy on the learning of the MOOC learners, we propose in this article the extension of the previous system into a Web-based Learning Analytics System called edX-LIMS (an acronym of System for Learning Intervention and its Monitoring for edX MOOCs). Among other services, edX-LIMS provides MOOC learners with an easy way to visualise their engagement in the course in a "Web-based Learner Dashboard", and it provides MOOC instructors with an easy visualisation of the learners' interest in the aforementioned visualisation in a "Web-based Instructor Dashboard".

The structure of this article is as follows: in the next section, we present a detailed description of the proposed Learning Analytics tool, edX-LIMS; in Section 3, we present the current use of the tool in a case study with learners of the aforementioned MOOC. Finally, the article ends with conclusions and future work.

2 The approach proposed: a System for Learning Intervention and its Monitoring for edX MOOCs

We propose a Web-based Learning Analytics System called edX-LIMS, which provides an intervention strategy on the learners' learning and the monitoring of this strategy by the instructors. We have developed the system for any edX MOOC; however, we have tested it on a specific course (see Section 3 for more details).

In addition, in MOOCs on the edX platform, only the learners enrolled in the verified itinerary (i.e. the learners who pay a fee) can carry out course assessment activities and, therefore, be evaluated and obtain a final grade in the course. Our approach deals only with these verified learners. Therefore, in this article, when we mention MOOC learners, we refer exclusively to MOOC learners enrolled in the verified itinerary.

The user roles for a specific MOOC in our system are: i) Learner (any learner enrolled in the verified itinerary), ii) Instructor (any member of the course instructor team) and iii) Admin (any instructor who maintains and manages the course data in the system).

Five interconnected services provide the system functionality. These services are coded in Python (https://www.python.org/) and are presented in the following subsections.

2.1 Introduction to the system proposed

The system services are (see Fig. 1):

• Course Data Processing Service: the system, based on the course data, calculates the learners' indicators, which represent a summary of the learners' interactions.
• Learning Intervention Generation Service: the system, based on the learners' indicators, provides the intervention strategy on the learners' learning. That is, it generates interactive Web-based Dashboards (one per learner) and sends the learners the corresponding access information for these Dashboards via email.
• Intervention Visualisation Service: the learners have the opportunity to access their Learner Dashboards, which show them their engagement and performance in the course.
• Course Data Monitoring Service: the system allows instructors to monitor summary course data.
• Learning Intervention Monitoring Service: the system allows instructors to monitor the learners' interest in their received Learner Dashboards.

Fig. 1. edX-LIMS services.

The system provides its users with three types of Web-based Dashboards. Each of them can be accessed by the corresponding user role and provides access to the appropriate system services:

• Learner Dashboard: Intervention Visualisation Service.
• Instructor Dashboard: the service contained in the Learner Dashboard, plus the Learning Intervention Generation Service, the Course Data Monitoring Service and the Learning Intervention Monitoring Service.
• Admin Dashboard: the services contained in the Instructor Dashboard, plus the Course Data Processing Service.

2.2 Course Data Processing Service

This service extracts, cleans, selects and preprocesses course data (MOOC log tracks) to obtain the relevant learners' activity. Moreover, it supports the creation of input variables or indicators and manages data storage. The following subservices compose this service.

Extraction and Preprocessing Subservice
This subservice is designed to extract and clean course data, and to preprocess and change its representation. In more detail, it builds the learners' activity matrix considering their registered events. Event data are filtered by type and user, and only the events of users who are verified learners are collected.

Indicators Generation Subservice
This subservice calculates the indicators for the course. These indicators measure, for each learner, the following: time spent in the course; time invested in graded assignments, videos and the forum; number of sessions and events registered; number of days connected; number of consecutive inactivity days; and number of different graded assignments solved and videos accessed. In addition, it calculates the current final grade of each learner considering the partial grades of their solved graded assignments.

Store Subservice
The generated indicators are stored as data collections in a MongoDB database.
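As an illustration of the kind of aggregation performed by the Indicators Generation Subservice described above, the sketch below derives a few indicators from a simplified event log. The column names and event types are assumptions made for the example, not the actual edX log-track schema.

```python
# Sketch of indicator aggregation over a simplified event log. The column
# names and event types are assumptions, not the real edX log-track schema.
import pandas as pd

events = pd.DataFrame({
    "user_id":    [1, 1, 1, 2, 2],
    "event_type": ["play_video", "problem_check", "play_video",
                   "play_video", "problem_check"],
    "timestamp":  pd.to_datetime([
        "2020-03-02 10:00", "2020-03-02 10:20", "2020-03-04 09:00",
        "2020-03-03 18:00", "2020-03-03 18:30",
    ]),
})

# Per-learner indicators: total events, distinct days connected,
# videos accessed and graded-assignment attempts.
indicators = events.groupby("user_id").agg(
    num_events=("event_type", "size"),
    days_connected=("timestamp", lambda s: s.dt.date.nunique()),
    videos_accessed=("event_type", lambda s: (s == "play_video").sum()),
    assignments_attempted=("event_type", lambda s: (s == "problem_check").sum()),
)
print(indicators)
```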
2.3 Learning Intervention Generation Service

This service generates the intervention strategy on the learners' learning considering the learners' indicators extracted by the previous service. The following subservices compose this service.

Learner Dashboard Generation Subservice
This subservice creates one visualisation per learner that reflects their performance in the course. With this aim, it generates and organises visual components (graphs) using the Dash framework (https://plotly.com/dash/), which is based on Flask and React.js.

Mailing Subservice
The aim of this subservice is to send the learners, via email, the information needed to access their Learner Dashboards. In addition, this subservice provides templates to configure, among other things, the content of the emails and the recipients.
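The following minimal Dash sketch illustrates how such a per-learner view can be assembled. The values and layout are purely illustrative and much simpler than the actual Learner Dashboard described in the next subsection.

```python
# Minimal Dash sketch of a per-learner view (illustrative values only; the
# real Learner Dashboard contains several interactive graphs and help pop-ups).
import plotly.graph_objects as go
from dash import Dash, dcc, html  # Dash >= 2 style imports

learner_grade, course_average = 0.42, 0.55  # hypothetical values

fig = go.Figure(go.Bar(x=["Your grade", "Course average"],
                       y=[learner_grade, course_average]))
fig.update_yaxes(range=[0, 1], title_text="Grade")

app = Dash(__name__)
app.layout = html.Div([
    html.H3("Learner Dashboard (sketch)"),
    dcc.Graph(figure=fig),
])

if __name__ == "__main__":
    app.run_server(debug=True)  # app.run(debug=True) in recent Dash versions
```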
2.4 Intervention Visualisation Service

This service provides the graphical visualisation of each Learner Dashboard, which can be viewed in any web browser. Each Learner Dashboard shows the engagement and performance of the learner in the course through the following visualisations:

• The current grade of the learner in the course (see Fig. 2), which is a value between zero and one (one is the maximum value and zero the minimum). This value (the number in black, shown as a blue line in the graph) is compared with the average grade of all the learners in the course (yellow line in the graph). In addition, the learner can see the difference between their grade and the average grade of all the learners (number in green).

Fig. 2. Graph with the current grade of the learner.

• Data about the learner's interactions, in the form of their daily calculated indicators (see Fig. 3). The learner can select the indicators to show in the graphs, and these indicators can be compared with the averages of the same metrics over all the learners. The left graph shows the selected metrics and the right graph shows these values accumulated over time. The learner can click on the question mark icon to access help, i.e. a pop-up window with descriptions of the values in these graphs.

Fig. 3. Graphs with the daily indicators of the learner.

• Data about the performance of the learner (see Fig. 4). The left graph shows the learner's progress on the course content units and the right graph shows their progress on the graded assignments. In addition, the learner can compare their metrics with the average values of the same metrics over all the learners.

Fig. 4. Graphs with the progress of the learner.

Learner Interest in their Engagement Subservice
The Intervention Visualisation Service integrates a functionality that registers in the MongoDB database all learner interactions with the graphs and their parameters available in their Learner Dashboard. This information is then used by the Learning Intervention Monitoring Service.

2.5 Course Data Monitoring Service

This service depicts course data in several graphs and tables as follows:

• General data of the course (see Fig. 5), through these metrics: total number of learners (the learners enrolled in the verified itinerary of the course), number of certified learners (the verified learners who have passed the course because their grade is equal to or greater than 0.5), average grade of the learners and average grade of the certified learners.

Fig. 5. Table with general course data.

• Demographic data of the learners in the course (see Fig. 6). There are four graphs with the distribution of learners by country, gender, age and academic level.

Fig. 6. Graphs with learners' demographic data.

• Data about the learners' interactions, in the form of their daily calculated indicators (see Fig. 7). It is possible to apply several aggregation operations to these values in the graphs, such as the average value or the total value of the metrics. The left graph shows the selected metrics and the right graph shows these values accumulated over time.

Fig. 7. Graphs with the daily indicators of the learners.

• Data related to the most recent activity of the learners, together with their grades per unit and in the whole course, shown in a table (see Fig. 8). In addition, it is possible to access the Learner Dashboard of each learner from this table.

Fig. 8. Table with the most recent learners' activity and access to their Learner Dashboards.

2.6 Learning Intervention Monitoring Service

This service allows monitoring of the learners' interest in the intervention strategy on their learning. It shows the learners' interactions with the different graphs of their Learner Dashboards. It is possible to monitor this interest individually (selecting a single learner, see Fig. 10) or collectively (for all the learners, see Fig. 11). The next section provides more details.
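As a sketch of how this daily-interest view can be obtained from the interaction records stored by the Learner Interest in their Engagement Subservice, the query below counts interactions per learner and per day. The database, collection and field names are assumptions made for the example, not the system's actual MongoDB schema.

```python
# Sketch of the daily-interest aggregation. The database, collection and field
# names ("edx_lims", "dashboard_interactions", "user_id", "timestamp") are
# assumptions for the example, not the system's actual MongoDB schema.
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")
interactions = client["edx_lims"]["dashboard_interactions"]

pipeline = [
    # {"$match": {"user_id": 42}},  # uncomment for the individual (per-learner) view
    {"$group": {
        "_id": {
            "user_id": "$user_id",
            "day": {"$dateToString": {"format": "%Y-%m-%d", "date": "$timestamp"}},
        },
        "interactions": {"$sum": 1},
    }},
    {"$sort": {"_id.day": 1}},
]

for row in interactions.aggregate(pipeline):
    print(row["_id"]["user_id"], row["_id"]["day"], row["interactions"])
```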
3 Case Study with the WebApp MOOC

Since 2019, the Universidad Autónoma de Madrid has offered the Massive Open Online Course (MOOC) entitled "Introduction to Development of Web Applications", known as "WebApp" (https://www.edx.org/course/introduccion-al-desarrollo-de-aplicaciones-web-2). The first run of this course took place from February 5th to March 26th, 2019. Its second run is in self-paced mode, i.e. anyone has been able to enrol in it for free since April 9th, 2019. The course is currently running, and the data presented in this article were obtained in March 2020.

"WebApp" is a five-week course whose contents are structured in five units (one unit per week of work). With this course, learners can learn to develop Web Applications (Web Apps) based on HTML, CSS, Python, JSON, JavaScript and Ajax. In each course unit, there are multimedia resources, discussion forums and course evaluation activities in the form of graded assignments.

We explain the monitoring of the intervention strategy applied to the learning of the learners of this MOOC with the following use case diagram.

Fig. 9. Use Case Diagram (IVS: Intervention Visualisation Service; LIES: Learner Interest in their Engagement Subservice; LIMS: Learning Intervention Monitoring Service).

Firstly, the instructor can monitor the interest of any individual learner in their engagement in the course, that is, their interactions with their Learner Dashboard (see Fig. 10). As we can see, the selected learner visited their Dashboard several times in the last week. This learner shows a constant interest in their progress in the course content and in the graded assignments.

Fig. 10. Graph with the daily interest of a learner in their engagement in the course (individual view).

Secondly, the instructor can monitor the interest of all learners in their engagement in the course, that is, all the learners' interactions with their Learner Dashboards are shown (see Fig. 11). As we can see, the maximum numbers of interactions with the dashboards occur on two dates separated by a period of one week. This is reasonable, since the MOOC learners receive the access information for their Learner Dashboards weekly. It confirms that learners are interested in this intervention strategy, because they tend to access their Dashboards at least weekly.

Fig. 11. Graph with the daily interest of all learners in their engagement in the course (collective view).

4 Conclusions and future work

MOOC learners are often affected by insufficient feedback, a lack of interaction with the instructors and a feeling of isolation while learning, which can lead to dropping out of the course. To mitigate these risk factors, we propose edX-LIMS, a multi-tool Learning Analytics System that facilitates: i) an intervention strategy on learners' learning, by visualising their performance and activity in the course in a Learner Dashboard; and ii) a graphical monitoring, in an Instructor Dashboard, that illustrates the learners' interest in this service and, therefore, facilitates evaluating the success or failure of the aforementioned strategy.

The main contribution of edX-LIMS in comparison with the previously developed system (edX-LIS) is that it gathers information about how engaged each learner is in the course, not only through their grades, but also through the monitoring of their interest in it.
By monitoring the learners' activity in the Learner Dashboard, the course instructor can easily identify which users are responsive to the intervention strategy on their learning.

Since the system is independent of the edX platform, there is no direct synchronisation with edX data; hence, the course data (log tracks) must be obtained weekly in order to import them into the system. This is a transmission chain whose disruption could cause the monitoring process to be interrupted, as there would be no new data available to provide feedback to users. This situation is happening now: due to the circumstances caused by COVID-19, we have not been provided with new course data (log tracks) since March, so the system information is currently outdated until new data reach us.

The future development of edX-LIMS aims to increase the quality of the information it provides, since the information currently shown on the Learner Dashboard is the result of the answers obtained from the satisfaction questionnaire given to the edX-LIS users. We are now working on a new satisfaction questionnaire for the edX-LIMS users in order to learn how we can improve their experience with the Learner Dashboard and, therefore, improve the Learning Intervention Monitoring Service.

ACKNOWLEDGMENTS
This work has been co-funded by the Madrid Regional Government through the project e-Madrid-CM (S2018/TCS-4307). The e-Madrid-CM project is also co-financed by the Structural Funds (FSE and FEDER). Special thanks are due to the Universidad Autónoma de Madrid for facilitating access to the data of the MOOC used in this article.

References
1. Alaman, X., Carro, R., Cobos, R., Gómez, J., Jurado, F., Molins, P., Montoro, G., Moreno, J., Ortigosa, A., Rodriguez, P., Torrado, J.C. GHIA: Student Modeling, Learning Analytics, Attention to Diversity and e-Learning. IE Communications. Iberoamerican Journal of Educational Informatics, Number 30, July-December 2019, 1-12.
2. Cobos, R., Gil, S., Lareo, A., Vargas, F. Open-DLAs: An open dashboard for learning analytics, in: L@S 2016 - Proceedings of the 3rd 2016 ACM Conference on Learning at Scale, ACM, 2016: pp. 265–268. https://doi.org/10.1145/2876034.2893430
3. Cobos, R., Jurado, F., Blázquez-Herranz, A. (2019). A Content Analysis System that supports Sentiment Analysis for Subjectivity and Polarity detection in Online Courses. IEEE Iberoamerican Journal of Learning Technologies, 14(4), 177–187. https://doi.org/10.1109/rita.2019.2952298
4. Cobos, R., Olmos, L. (2018). A Learning Analytics Tool for Predictive Modeling of Dropout and Certificate Acquisition on MOOCs for Professional Learning, in: 2018 IEEE International Conference on Industrial Engineering and Engineering Management (IEEM), IEEE, 2018: pp. 1533–1537. https://doi.org/10.1109/IEEM.2018.8607541
5. Davis, D., Chen, G., Jivet, I., Hauff, C., Houben, G.-J. Encouraging Metacognition & Self-Regulation in MOOCs through Increased Learner Feedback, in: LAL 2016 - Learning Analytics for Learners, (2016) pp. 17-22.
6. Ferguson, R., Brasher, A., Clow, D., Cooper, A., Hillaire, G., Mittelmeier, J., Rienties, B., Ullmann, T., Vuorikari, R. Research Evidence on the Use of Learning Analytics - Implications for Education Policy, in: R. Vuorikari, J. Castaño Muñoz (Eds.), European Commission's Joint Research Centre Science for Policy Report (JRC), JRC, 2016.
7. Lang, C., Siemens, G., Wise, A., Gasevic, D. Handbook of Learning Analytics, SOLAR, Society for Learning Analytics and Research, 2017.
8. Ma, L., Lee, C.S.
Investigating the adoption of MOOCs: A technology–user–environment perspective, Journal of Computer Assisted Learning, 35 (2019) 89–98.
9. Martínez-Monés, A., Dimitriadis, Y., Acquila-Natale, E., Álvarez, A., Caeiro-Rodríguez, M., Cobos, R., Conde-González, M. A., García-Peñalvo, F. J., Hernández-Leo, D., Menchaca, I., Muñoz-Merino, P. J., Ros, S., Sancho-Vinuesa, T. (2020). Achievements and challenges in learning analytics in Spain: The view of SNOLA. RIED. Revista Iberoamericana de Educación a Distancia, 23(2), (preprint). doi: http://dx.doi.org/10.5944/ried.23.2.26541
10. Romero, C., Ventura, S. Educational data mining and learning analytics: An updated survey. WIREs Data Mining Knowl Discov. 2020; e1355. https://doi.org/10.1002/widm.1355
11. Topali, P., Ortega-Arranz, A., Er, E., Martínez-Monés, A., Villagrá-Sobrino, G.-J., Dimitriadis, Y. Exploring the Problems Experienced by Learners in a MOOC Implementing Active Learning Pedagogies, in: M. Calise, C. Delgado Kloos, J. Reich, J.A. Ruiperez-Valiente, M. Wirsing (Eds.), Digital Education: At the MOOC Crossroads Where the Interests of Academia and Business Converge, Springer International Publishing, Cham, 2019: pp. 81–90.