<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta />
    <article-meta>
      <title-group>
        <article-title>CADA: A learning analytics dashboard to support teachers with visualizations about students' participation and discourse in online discussions</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Rogers Kaliisa</string-name>
          <email>rogers.kaliisa@iped.uio</email>
          <xref ref-type="aff" rid="aff0">0</xref>
          <xref ref-type="aff" rid="aff1">1</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Jan Arild Dolonen</string-name>
          <email>janad@uv.uio.no</email>
          <xref ref-type="aff" rid="aff0">0</xref>
          <xref ref-type="aff" rid="aff1">1</xref>
        </contrib>
        <aff id="aff0">
          <label>0</label>
          <institution>Department of Education, University of Oslo</institution>
          ,
          <addr-line>Oslo</addr-line>
          ,
          <country country="NO">Norway</country>
        </aff>
        <aff id="aff1">
          <label>1</label>
          <institution>Nordic Learning Analytics Summer Institute</institution>
          ,
          <addr-line>NLASI</addr-line>
        </aff>
      </contrib-group>
      <pub-date>
        <year>2020</year>
      </pub-date>
      <abstract>
<p>This paper introduces a Canvas discussion analytics tool (CADA), designed following a human-computer interaction approach to provide teachers with real-time insights into students' participation and discourse in online discussions. CADA supports the automatic extraction and analysis of discussion forum posts and interactions from the Canvas LMS and provides visualizations that communicate, at a glance and in detail, participation rates, the concepts being used and their epistemic connections, contributions per participant, and sentiment scores. The outputs provided by CADA make students' thinking visible to the teacher, providing an informed basis to intervene and change course activities. This work-in-progress paper outlines the functional features included in CADA, preliminary results, and the next steps for evaluation, redesign, and research with CADA in authentic teaching environments.</p>
      </abstract>
      <kwd-group>
        <kwd>Learning analytics</kwd>
        <kwd>teacher dashboards</kwd>
        <kwd>design based research</kwd>
      </kwd-group>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>1. Introduction</title>
      <p>
The integration of information and communications technologies (ICT) into higher education
has significantly changed how teachers teach and students learn. In particular, the increasing
use of digital learning tools and platforms (e.g., learning management systems (LMSs)) has
opened up the possibility of transforming face-to-face courses into blended courses, in which
a significant amount of information is delivered online, or fully online courses, in which all
information is delivered and accessible online [
        <xref ref-type="bibr" rid="ref1">1</xref>
        ]. This
trend has gained more significance during the COVID-19 global pandemic, which has shifted
teaching and learning programs to being fully online. LMSs such as Canvas, Moodle, and
Blackboard support student learning by providing content online and by allowing
for online collaborative activities (e.g., discussion forums and peer assessments) beyond physical
classrooms. However, despite the vast array of benefits associated with digital platforms,
research has shown that higher education teachers find it difficult to support students' active
learning through these platforms [
        <xref ref-type="bibr" rid="ref2">2</xref>
        ], partly because they struggle to find relevant insights
about students' level of participation in course activities and their misconceptions on a given
task. Current versions of LMSs tend to provide only general, high-level statistics
(e.g., students' login history, page views), which offer little insight into
students' participation and engagement levels. This leaves teachers with limited
opportunities to identify students who require help and areas of the course that need improvement.
      </p>
      <p>In this short paper, we introduce a Canvas discussion analytics tool (CADA), which can
be integrated within the Canvas LMS to automatically analyze discussion forums in a way
that offers actionable insights to teachers on how students are approaching the learning
task in relation to its pedagogical intent. Such formative feedback
mechanisms are critical for teachers to better support students in real time and to adapt the design,
content, and structure of lessons while the course is running, rather than relying on evidence
from summative assessments (e.g., course grades) that usually comes at the end of the teaching
period.</p>
      <sec id="sec-1-1">
        <title>1.1. The design and development of CADA</title>
        <p>
          The design of CADA draws on principles from the learning sciences and human-computer
interaction (HCI) [
          <xref ref-type="bibr" rid="ref3">3</xref>
          ] to produce an interpretable and actionable learning dashboard based on students'
discussion activities (Figure X). These approaches were adopted to ensure both that
CADA had a theoretical foundation and that the real-life needs of stakeholders (e.g., teachers) were
met [
          <xref ref-type="bibr" rid="ref3">3</xref>
          ]. Thus, the meanings, interaction opportunities, functions, and attributes associated
with the tool were defined together with the teachers for whom CADA is intended. The design
process was conducted through several iterative phases based on design-based research (DBR)
and the Learning Awareness Tools – User eXperience (LATUX) method [
          <xref ref-type="bibr" rid="ref4">4</xref>
          ]. The former, a
method often used in the learning sciences, takes theoretical constructs about learning as a starting
point and then iteratively develops the tool with stakeholders through testing in real settings
and analysis and evaluation of results before re-designing the tool. LATUX structures parts of the
DBR process by emphasizing the development of the interface and of awareness through five iterative
design stages: 1) problem statement and requirements identification; 2) creation of a low-fidelity
prototype; 3) creation of a high-fidelity prototype; 4) evaluation of the prototype in pilot classroom
studies; and 5) evaluation in the real-world classroom. In the design of CADA, the first stage
was a qualitative study with 16 teachers from two Norwegian universities, which highlighted
teachers' LA needs [
          <xref ref-type="bibr" rid="ref5">5</xref>
          ]. The second stage explored a range of candidate visualizations based on
paper prototypes [
          <xref ref-type="bibr" rid="ref6">6</xref>
          ] and laid the foundations for the concepts of participation and discourse,
which were identified as the leading theoretical constructs about learning. The third stage involved the
creation of a high-fidelity prototype (CADA) (Figure 1) based on insights gained from the first
two stages.
        </p>
      </sec>
      <sec id="sec-1-2">
        <title>1.2. CADA: An overview</title>
        <p>
          CADA is a dashboard that visualizes participation, social networks, text/epistemic networks,
and the concepts used by students within the Canvas LMS discussion forum on a need-to-know basis.
Teachers can then use the insights gained as a diagnostic tool to improve teaching and inform
learning design. CADA focuses on analyzing students' discussions because
the discussion forum is one of the most frequently used collaborative tools within an LMS and
provides students with opportunities to actively partake in subject-specific discourses with
peers [7]. However, keeping track of students' engagement in online discussions is challenging,
and there is a need to provide teachers with tools that enable them to support students'
subject-specific knowledge construction in collaboration with others, which is key to learning and
modern knowledge work [
          <xref ref-type="bibr" rid="ref1">1</xref>
          ]. Thus, a fundamental premise of CADA is that teachers could
benefit from formative insights about students' participation in online discussions by using them
to determine the extent to which the pedagogical intent of the task has been reflected in students'
interactions and the content they produce. The features of CADA include: 1) The dashboard:
provides teachers with a quick overview of discussion activity within the course and
access to filtering functions such as the percentage of active and inactive participants, the total
number of interactions, and an aggregated sentiment score for a particular thread, without the need for
scrolling. 2) Discourse analytics: displays the key topics discussed by the students
within the selected discussion forum and the context in which they have been used. 3) Participation:
contains information on students' participation metrics within the discussion forum.
4) Network: provides details about students' social interactions in a discussion
forum, which might be useful for teachers interested in understanding how students relate to
one another in a course. 5) Sentiment analysis: analyzes the sentiment attached to
each discussion post at a document-level granularity of sentiment classification.
        </p>
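As a concrete illustration of the kinds of metrics the dashboard surfaces, the sketch below derives posts per participant, a reply network, and a document-level sentiment score from a list of discussion posts. The post structure, the helper functions, and the tiny sentiment lexicon are illustrative assumptions for this sketch only; CADA's actual pipeline extracts posts from the Canvas LMS and uses its own sentiment classifier.

```python
from collections import Counter

# Illustrative post structure (assumption): each post records its author,
# the author it replies to (None for a thread starter), and its text.
POSTS = [
    {"author": "ana", "reply_to": None, "text": "I think the theory is useful"},
    {"author": "ben", "reply_to": "ana", "text": "I disagree, it is confusing"},
    {"author": "ana", "reply_to": "ben", "text": "Good point, that part is unclear"},
    {"author": "cem", "reply_to": "ana", "text": "The examples are useful and clear"},
]

# Tiny illustrative sentiment lexicon; a real classifier would replace this.
POSITIVE = {"useful", "good", "clear"}
NEGATIVE = {"confusing", "unclear", "disagree"}

def participation(posts):
    """Posts per participant, as in the participation view."""
    return Counter(p["author"] for p in posts)

def reply_network(posts):
    """Directed reply edges (who responds to whom), as in the network view."""
    return Counter(
        (p["author"], p["reply_to"]) for p in posts if p["reply_to"] is not None
    )

def sentiment_score(text):
    """Document-level sentiment: positive minus negative word counts."""
    words = (w.strip(".,!?") for w in text.lower().split())
    score = 0
    for w in words:
        if w in POSITIVE:
            score += 1
        elif w in NEGATIVE:
            score -= 1
    return score

metrics = {
    "posts_per_student": dict(participation(POSTS)),
    "reply_edges": dict(reply_network(POSTS)),
    "thread_sentiment": sum(sentiment_score(p["text"]) for p in POSTS),
}
```

The aggregated thread-level sentiment shown on the dashboard would correspond to summing per-post scores, as in `thread_sentiment` above.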
      </sec>
      <sec id="sec-1-3">
        <title>1.3. Preliminary findings and conclusion</title>
        <p>CADA was implemented in four courses with four teachers at a Norwegian university during
Autumn 2020. The teachers involved took part in an interview at the end of the semester during
which they were asked to reflect on their use of the dashboard. Preliminary findings from this
pilot evaluation study show that teachers perceived CADA as a dashboard that gives them
at-a-glance insights into students' engagement and participation. At the same time, teachers
found the dashboard lacking in actionable insights that would allow them to make timely changes
and interventions. The teachers also identified several features that need to be refined to
improve CADA's functionality. For example, CADA analyses the content of discussions by
providing a list of the key concepts used and their context. However, this might not be enough,
and there is a need to capture the epistemic connections of the content produced in discussion
forums by showing how the different topics are connected. In a future version of CADA,
we plan to leverage the potential of epistemic network analysis, a quantitative ethnography
technique for analyzing the structure of connections among coded data by
quantifying and modelling the co-occurrence of codes [9]. In this case, CADA will provide
additional value to teachers by visualizing the epistemic connections among students' posts. The
next step for CADA is a pilot with eight courses and seven teachers at a Norwegian university.
The teachers involved will take part in an interview at the end of the semester during which
they will be asked to reflect on their use of the tool. The findings of this data collection process
will inform any changes to the tool before it is released as an LTI (Learning Tools Interoperability)
integration in Canvas for other institutions to use. The evaluation will also result in the
development of practical and theoretical principles for the design of LA dashboards, which
could be utilised by other researchers.</p>
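The core operation of the planned epistemic network analysis step can be illustrated with a minimal sketch: counting how often pairs of codes co-occur within the same unit of discourse (here, a post). The coded posts and the `co_occurrence` helper below are hypothetical; a full ENA implementation would additionally normalise these counts and project them into a network space.

```python
from collections import Counter
from itertools import combinations

# Hypothetical coded posts (assumption): each post is represented by the
# set of codes (topics/concepts) it was tagged with during coding.
CODED_POSTS = [
    {"theory", "evidence"},
    {"theory", "method"},
    {"evidence", "method", "theory"},
]

def co_occurrence(coded_posts):
    """Count how often each pair of codes appears in the same post --
    the co-occurrence model that epistemic network analysis builds on."""
    pairs = Counter()
    for codes in coded_posts:
        # Sort so each unordered pair is counted under one canonical key.
        for a, b in combinations(sorted(codes), 2):
            pairs[(a, b)] += 1
    return pairs

edges = co_occurrence(CODED_POSTS)
```

In a visualization, each pair count would become the weight of an edge between two code nodes, making the epistemic connections among students' posts visible to the teacher.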
      </sec>
    </sec>
    <sec id="sec-2">
      <title>2. References</title>
      <p>7. Rosé, C. P., Ferschke, O. (2016). Technology support for discussion based learning:
From computer supported collaborative learning to the future of massive open online courses.
International Journal of Artificial Intelligence in Education, 26(2), 660-678.</p>
      <p>8. Kaliisa, R., Misiejuk, K., Irgens, G. A., Misfeldt, M. (2021, February). Scoping the
Emerging Field of Quantitative Ethnography: Opportunities, Challenges and Future Directions. In
International Conference on Quantitative Ethnography (pp. 3-17). Springer, Cham.</p>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>
          1.
          <string-name>
            <surname>Lillejord</surname>
            <given-names>S.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Børte</surname>
            <given-names>K.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Nesje</surname>
            <given-names>K.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Ruud</surname>
            <given-names>E.</given-names>
          </string-name>
          (
          <year>2018</year>
          ).
          <article-title>Learning and teaching with technology in higher education - a systematic review</article-title>
          . Oslo: Knowledge Centre for Education, http://www.kunnskapssenter.no
        </mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>
          2.
          <string-name>
            <surname>Damşa</surname>
            ,
            <given-names>C.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>de Lange</surname>
            ,
            <given-names>T.</given-names>
          </string-name>
          (
          <year>2019</year>
          ).
          <article-title>Student-centred learning environments in higher education</article-title>
          .
          <source>Uniped</source>
          ,
          <volume>42</volume>
          (
          <issue>01</issue>
          ),
          <fpage>9</fpage>
          -
          <lpage>26</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>
          3.
          <string-name>
            <surname>Barab</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          (
          <year>2006</year>
          ).
          <article-title>Design-Based Research: A Methodological Toolkit for the Learning Scientist</article-title>
          . In R. K. Sawyer (Ed.),
          <source>The Cambridge handbook of the learning sciences</source>
          (pp.
          <fpage>153</fpage>
          -
          <lpage>169</lpage>
          ). Cambridge University Press
        </mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>
          4.
          <string-name>
            <surname>Martinez-Maldonado</surname>
            ,
            <given-names>R.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Pardo</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Mirriahi</surname>
            ,
            <given-names>N.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Yacef</surname>
            ,
            <given-names>K.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Kay</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Clayphan</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          (
          <year>2015</year>
          ).
          <article-title>LATUX: An iterative workflow for designing, validating and deploying learning analytics visualisations</article-title>
          .
          <source>Journal of Learning Analytics</source>
          ,
          <volume>2</volume>
          (
          <issue>3</issue>
          ),
          <fpage>9</fpage>
          -
          <lpage>39</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>
          5.
          <string-name>
            <surname>Kaliisa</surname>
            ,
            <given-names>R.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Mørch</surname>
            ,
            <given-names>A. I.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Kluge</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          (
          <year>2021</year>
          ).
          <article-title>'My Point of Departure for Analytics is Extreme Skepticism': Implications Derived from An Investigation of University Teachers' Learning Analytics Perspectives and Design Practices</article-title>
          .
          <source>Technology, Knowledge and Learning</source>
          ,
          <fpage>1</fpage>
          -
          <lpage>22</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>
          6.
          <string-name>
            <surname>Kaliisa</surname>
            ,
            <given-names>R.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Kluge</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Mørch</surname>
            ,
            <given-names>A. I.</given-names>
          </string-name>
          (
          <year>2020</year>
          ).
          <article-title>Combining Checkpoint and Process Learning Analytics to Support Learning Design Decisions in Blended Learning Environments</article-title>
          .
          <source>Journal of Learning Analytics</source>
          ,
          <volume>7</volume>
          (
          <issue>3</issue>
          ),
          <fpage>33</fpage>
          -
          <lpage>47</lpage>
          .
        </mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>