CADA: A learning analytics dashboard to support teachers with visualizations about students' participation and discourse in online discussions

Rogers Kaliisa, Jan Arild Dolonen
Department of Education, University of Oslo, Oslo, Norway

Abstract
This paper introduces a Canvas discussion analytics tool (CADA), designed following a human-computer interaction approach to provide teachers with real-time insights about students' participation and discourse in online discussions. CADA supports the automatic extraction and analysis of discussion forum posts and interactions from the Canvas LMS and provides visualizations that communicate, at a glance and in detail, the participation rate, the concepts being used and their epistemic connections, the contributions per participant, and sentiment scores. The outputs provided by CADA make students' thinking visible to the teacher, which provides an informed basis for intervening and changing course activities. This work-in-progress paper outlines the functional features included in CADA, preliminary results, and the next steps for evaluating, redesigning, and researching CADA in authentic teaching environments.

Keywords
Learning analytics, teacher dashboards, design-based research

1. Introduction
The emergence of information and communication technologies (ICT) in higher education has significantly changed how teachers teach and students learn. In particular, the increasing use of digital learning tools and platforms (e.g., learning management systems (LMSs)) has opened up the possibility of transforming face-to-face courses into blended courses, in which a significant amount of the information is delivered and accessible online, or fully online courses, in which all of it is [1]. This trend has gained more significance during the COVID-19 global pandemic, which shifted teaching and learning programs fully online.
Nordic Learning Analytics Summer Institute (NLASI), August 23, 2021, Stockholm, Sweden
rogers.kaliisa@iped.uio (R. Kaliisa); janad@uv.uio.no (J. A. Dolonen)
ORCID: 0000-0001-6528-8517 (R. Kaliisa); 0000-0001-5596-1405 (J. A. Dolonen)
© 2020 Copyright for this paper by its authors. Use permitted under Creative Commons License Attribution 4.0 International (CC BY 4.0).
CEUR Workshop Proceedings (CEUR-WS.org), ISSN 1613-0073

LMSs such as Canvas, Moodle, and Blackboard support student learning by providing content online and by allowing for online collaborative activities (e.g., discussion forums and peer assessments) beyond physical classrooms. However, despite the vast array of benefits associated with digital platforms, research has shown that higher education teachers find it difficult to support students' active learning through these platforms [2], partly because they struggle to find relevant insights about students' level of participation in course activities and their misconceptions about a given task. The current versions of LMSs tend to provide only general, high-level statistics (e.g., students' login history, page views), which offer little insight into students' participation and engagement levels. This leaves teachers with limited grounds for identifying students who require help and areas of the course that need improvement. In this short paper, we introduce a Canvas discussion analytics tool (CADA), which can be integrated within the Canvas LMS to automatically analyze discussion forums in a way that offers teachers actionable insights into how students are approaching the learning task in relation to the intended pedagogical intent.
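The paper does not detail CADA's extraction pipeline. As an illustration only, the sketch below shows one way discussion data can be pulled from the Canvas REST API's full-topic endpoint (`/api/v1/courses/:course_id/discussion_topics/:topic_id/view`), which returns discussion entries with nested `replies`. The base URL, token, and sample payload are hypothetical, and the network call is defined but not executed here.

```python
"""Sketch: extracting discussion posts from the Canvas REST API (illustrative)."""
import json
import urllib.request

BASE_URL = "https://canvas.example.edu"  # hypothetical institution URL
TOKEN = "<api-token>"                    # hypothetical access token


def fetch_topic_view(course_id, topic_id):
    """GET the full discussion tree for one forum topic (not called in this sketch)."""
    url = f"{BASE_URL}/api/v1/courses/{course_id}/discussion_topics/{topic_id}/view"
    req = urllib.request.Request(url, headers={"Authorization": f"Bearer {TOKEN}"})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)


def flatten_entries(entries, parent_id=None):
    """Walk the nested 'view' structure into a flat list of posts,
    keeping who wrote what in reply to whom (the input for later analyses)."""
    posts = []
    for e in entries:
        posts.append({"id": e["id"], "user_id": e.get("user_id"),
                      "message": e.get("message", ""), "parent_id": parent_id})
        posts.extend(flatten_entries(e.get("replies", []), parent_id=e["id"]))
    return posts


# Demo on a small hypothetical payload shaped like the API's "view" array
sample_view = [
    {"id": 1, "user_id": 10, "message": "<p>What is scaffolding?</p>",
     "replies": [{"id": 2, "user_id": 11, "message": "<p>Support that fades.</p>"}]},
]
posts = flatten_entries(sample_view)
```

A flat list like `posts`, with each reply linked to its parent via `parent_id`, is the kind of input from which participation counts, reply networks, and per-post sentiment can then be computed.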
These kinds of formative feedback mechanisms are critical for teachers to better support students in real time and to adapt the design, content, and structure of lessons during the run of the course, rather than relying on evidence from summative assessments (e.g., course grades), which usually comes at the end of the teaching period.

1.1. The design and development of CADA
The design of CADA draws on principles from the learning sciences and human-computer interaction (HCI) [3] to create an interpretable and actionable learning dashboard based on students' discussion activities (Figure 1). The purpose of adopting these approaches was to ensure that CADA had a theoretical foundation and that the real-life needs of stakeholders (e.g., teachers) were met [3]. Thus, the meanings, interaction opportunities, functions, and attributes associated with the tool were defined together with the teachers for whom CADA is intended. The design process was conducted through several iterative phases based on design-based research (DBR) and the Learning Awareness Tools – User eXperience (LATUX) method [4]. The former, a method often used in the learning sciences, takes theoretical constructs about learning as a starting point and then iteratively develops the tool with stakeholders through testing in real settings and analysis and evaluation of the results, before redesigning the tool. LATUX structures parts of the DBR process by emphasizing the development of interfaces and awareness through five iterative design stages: 1) problem statement and requirements identification; 2) creation of a low-fidelity prototype; 3) creation of a high-fidelity prototype; 4) evaluation of the prototype in pilot classroom studies; and 5) evaluation in the real-world classroom. In the design of CADA, the first stage was a qualitative study with 16 teachers from two Norwegian universities, which highlighted teachers' LA needs [5].
The second stage explored a range of candidate visualizations based on paper prototypes [6] and laid the foundations for the concepts of participation and discourse, which were identified as the leading learning-theoretical constructs. The third stage involved the creation of a high-fidelity prototype (CADA) (Figure 1) based on insights gained from the first two stages.

1.2. CADA: An overview
CADA is a dashboard that visualizes participation, social networks, text/epistemic networks, and the concepts used by students within the Canvas LMS discussion forum on a need-to-know basis. Teachers can then use the insights gained as a diagnostic tool to improve teaching and inform learning design. CADA focuses on analyzing students' discussions because the discussion forum is one of the most frequently used collaborative tools within an LMS and provides students with opportunities to actively partake in subject-specific discourses with peers [7]. However, keeping track of students' engagement in online discussions is challenging, and there is a need to provide teachers with tools that enable them to support students' subject-specific knowledge construction in collaboration with others, which is key to learning and to modern knowledge work [1]. Thus, a fundamental premise of CADA is that teachers could benefit from formative insights about students' participation in online discussions by using them to determine the extent to which the pedagogical intent of the task is reflected in students' interactions and the content they produce. The features of CADA include:

1) The dashboard: provides teachers with a quick overview of discussion activity within the course and access to filtering functions such as the percentage of active and inactive participants, the total interactions, and an aggregated sentiment score for a particular thread, without the need for scrolling.
2) Discourse analytics: displays the key topics discussed by the students within the selected discussion forum and the context in which they have been used.
3) Participation: contains information on students' participation metrics within the discussion forum.
4) Network: provides details about students' social interactions in a discussion forum, which might be useful for teachers interested in understanding how students relate to one another in a course.
5) Sentiment analysis: analyzes the sentiment attached to each discussion post using sentiment classification at the document level of granularity.

Figure 1: The CADA interface.

1.3. Preliminary findings and conclusion
CADA was implemented in four courses with four teachers at a Norwegian university during Autumn 2020. The teachers involved took part in an interview at the end of the semester, during which they were asked to reflect on their use of the dashboard. Preliminary findings from this pilot evaluation study show that the teachers found that CADA provided them with at-a-glance insights about students' engagement and participation. At the same time, the teachers found the dashboard lacking in actionable insights that would allow them to make timely changes and interventions. The teachers also identified several features that need refinement to improve CADA's functionality. For example, CADA analyzes the content of discussions by providing a list of the key concepts used and their context. However, this might not be enough, and there is a need to capture the epistemic connections of the content produced in discussion forums by showing how the different topics are connected. In a future version of CADA, we plan to leverage the potential of epistemic network analysis (ENA), a quantitative ethnography technique for analyzing the structure of connections among coded data by quantifying and modelling the co-occurrence of codes [8].
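The co-occurrence quantification at the heart of ENA can be illustrated with a minimal sketch; this is not the ENA toolkit itself, and the codes and posts below are hypothetical. Given posts that have already been coded, the idea is to count how often each pair of codes appears in the same post; these counts form the weighted edges of an epistemic network.

```python
from collections import Counter
from itertools import combinations

# Hypothetical coded posts: each post is the set of discourse codes assigned to it
coded_posts = [
    {"scaffolding", "zone_of_proximal_development"},
    {"scaffolding", "feedback"},
    {"feedback", "zone_of_proximal_development", "scaffolding"},
]


def cooccurrence_counts(posts):
    """Count, across all posts, how often each pair of codes co-occurs.
    ENA builds and visualizes its network model from this kind of data."""
    counts = Counter()
    for codes in posts:
        for pair in combinations(sorted(codes), 2):  # sort for a canonical pair order
            counts[pair] += 1
    return counts


edges = cooccurrence_counts(coded_posts)
# e.g. edges[("feedback", "scaffolding")] == 2
```

Each entry in `edges` would correspond to one weighted connection in the epistemic network that a future CADA version could visualize for the teacher.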
In this way, CADA will provide additional value to teachers by visualizing the epistemic connections of students' posts. The next step for CADA is a pilot with eight courses and seven teachers at a Norwegian university. The teachers involved will take part in an interview at the end of the semester, during which they will be asked to reflect on their use of the tool. The findings from this data collection will inform any changes to the tool before it is released as an LTI tool in Canvas for other institutions to use. The evaluation will also result in practical and theoretical principles for the development of LA dashboards, which could be utilised by other researchers.

2. References
1. Lillejord, S., Børte, K., Nesje, K., Ruud, E. (2018). Learning and teaching with technology in higher education – a systematic review. Oslo: Knowledge Centre for Education, http://www.kunnskapssenter.no
2. Damşa, C., de Lange, T. (2019). Student-centred learning environments in higher education. Uniped, 42(01), 9-26.
3. Barab, S. (2006). Design-based research: A methodological toolkit for the learning scientist. In R. K. Sawyer (Ed.), The Cambridge Handbook of the Learning Sciences (pp. 153-169). Cambridge University Press.
4. Martinez-Maldonado, R., Pardo, A., Mirriahi, N., Yacef, K., Kay, J., Clayphan, A. (2015). LATUX: An iterative workflow for designing, validating and deploying learning analytics visualisations. Journal of Learning Analytics, 2(3), 9-39.
5. Kaliisa, R., Mørch, A. I., Kluge, A. (2021). 'My point of departure for analytics is extreme skepticism': Implications derived from an investigation of university teachers' learning analytics perspectives and design practices. Technology, Knowledge and Learning, 1-22.
6. Kaliisa, R., Kluge, A., Mørch, A. I. (2020). Combining checkpoint and process learning analytics to support learning design decisions in blended learning environments. Journal of Learning Analytics, 7(3), 33-47.
7. Rosé, C. P., Ferschke, O. (2016). Technology support for discussion based learning: From computer supported collaborative learning to the future of massive open online courses. International Journal of Artificial Intelligence in Education, 26(2), 660-678.
8. Kaliisa, R., Misiejuk, K., Irgens, G. A., Misfeldt, M. (2021, February). Scoping the emerging field of quantitative ethnography: Opportunities, challenges and future directions. In International Conference on Quantitative Ethnography (pp. 3-17). Springer, Cham.