    Self and Peer Monitoring during Peer Feedback: The
                   Instructor Perspective

                                          Erkan Er
                          School of Telecommunication Engineering,
                              Universidad de Valladolid, Spain
                                  erkan@gsic.uva.es



       Abstract. Monitoring is a crucial skill that can trigger regulation of learning for
       achieving better learning outcomes. While monitoring the self can enhance self-
       regulation, monitoring peers in collaborative activities can support co-regulation.
       Student-facing dashboards have often been used to support monitoring. These
       dashboards intend to provide students with timely feedback on how well they are
       doing, which students are then expected to use to evaluate their progress and
       update their learning strategies as needed. This study investigates instructors'
       perspectives on self and peer monitoring enabled via a student-facing dashboard
       in the specific context of peer reviews, an underexplored area of research. The
       findings suggest that although instructors believe the proposed approach to
       monitoring holds potential to enhance students' (both self- and co-) regulation
       of learning, they acknowledge that activating regulation in real-world practice
       might be a challenge. Implications for the activity design and learning analytics
       support are discussed.

       Keywords: student-facing dashboards, self-monitoring, peer-monitoring,
       regulation of learning, peer feedback.


1      Introduction

Self-regulation of learning is a critical process that is strongly associated with student
learning and achievement [1]. During self-regulation, students first set goals and then
plan and implement the necessary strategies (or tactics) to accomplish those goals [2].
An important component of self-regulation is the ability to monitor and assess the
ongoing progress and accordingly to refine the learning process with iterative
adaptations in the strategies and tactics [3]. That is, effective monitoring and accurate
interpretation of what is being monitored play a substantial role in self-regulation.
   Co-regulation of learning is another form of learning regulation that often occurs in
collaborative learning activities, in which (possibly more advanced) students help peers
regulate their learning [4]. As with self-regulation of learning, facilitating
co-regulation requires that students monitor their peers' activities and assess their
progress, which can then enable them to identify the guidance needed to enhance
peers' learning engagement. Successful co-regulation can augment the success and
learning benefits of collaborative learning [5].


Copyright © 2020 for this paper by its authors. Use permitted under Creative Commons License
Attribution 4.0 International (CC BY 4.0).


   The field of learning analytics provides a rich set of tools and techniques to support
regulation of learning [6], [7]. Among them, learning analytics dashboards (for
students), i.e., student-facing dashboards, have been the primary approach to providing
students with feedback about their progress and helping facilitate their regulation of
learning [7]. Several limitations of dashboards are noted in the literature. First, such
dashboards are mostly targeted at enhancing students' overall learning process
throughout a course [6], and their use for improving the student experience in a specific
activity has been underexplored. Dashboards could provide more refined and targeted
support and feedback to help students improve in a specific learning task (e.g.,
collaborative learning). Moreover, such dashboards are often implemented without
taking stakeholders' perspectives into account in the design [8], and therefore their use
has remained limited, with only a minor impact on student learning [6]. Also, the use
of dashboards to enable peer monitoring has been rarely investigated.
   To address these limitations, this paper reports on the preliminary results of a
research study investigating instructor perspectives on self and peer monitoring
enabled through a student-facing dashboard in the specific context of peer feedback.
The peer feedback activity is implemented through Synergy [9], [10], an online
platform to facilitate collaborative peer feedback [11]. The student dashboard is
integrated into Synergy to support self and co-regulation of learning (through self and
peer monitoring). Opinions from instructors in various domains helped us incorporate
the stakeholder perspective into the design and enhance both the activity itself and the
learning analytics support (i.e., the dashboard).
   The rest of the paper is structured as follows. First, we discuss the related literature,
followed by the methods used. Then, we present the results along with the discussion
of the key findings. Finally, directions for future research are shared.


2      Related Research

Student-facing dashboards have been the most popular application of learning analytics
to support regulation of learning with timely feedback. Despite their popularity
in research and practice, the literature notes the limited capacity of dashboards in terms
of triggering and informing learners' future actions and supporting regulation of
learning [6]. One factor that plays a critical role in this low impact is the lack of
stakeholder involvement in the design process [8]. The latest approaches have begun to
embrace user-centered approaches to the design of learning analytics solutions [12].
The current work aims to contribute to this line of research by integrating instructor
perspective into the design of student-facing dashboards.
   Peer monitoring, although considered undesirable in the early literature, can be an
effective peer learning strategy in collaborative learning activities [13]. However, most
dashboards in the literature focus on enabling students to monitor themselves. There
have been a few approaches and tools in the collaborative learning literature to support
peer monitoring with the goal of increasing group awareness [14]. Yet, no study has
explored peer monitoring in the context of peer feedback. With this
work, we aim to address this gap in the literature.


3      Methods

3.1    Learning Analytics Support for Monitoring

Learning analytics support for monitoring is integrated into Synergy, an online platform
designed to facilitate collaborative peer feedback. In Synergy, students determine an
action plan by identifying specific learning actions that they plan to take based on the
peer feedback received. For each action, students are required to set an expected
completion date. This action plan is, in effect, the set of revisions that students intend
to make to improve the work under review. As students implement the planned
changes in their work (using Google Docs), Synergy enables them to record their
ongoing progress on each action or to mark an action as complete (i.e., 100% progress).
Students and the corresponding reviewing peers can access the monitoring page at any
time from the home page of the reviews. The first component of the dashboard, as
shown in Fig. 1, is the list of actions with the current progress displayed for each. Peer
monitoring can be set as an optional activity.




Fig. 1. Action plan page in Synergy.
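   For illustration, the following minimal sketch shows one way an action record of
this kind could be represented; the class and field names are our assumptions, not
Synergy's actual data model:

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class LearningAction:
    """One planned revision in a student's action plan (hypothetical schema)."""
    description: str               # e.g., "enhance the existing example of personalization"
    expected_completion: date     # completion date set by the student
    progress: int = 0             # manually recorded progress, 0-100
    history: dict = field(default_factory=dict)  # {date: progress} updates over time

    def record_progress(self, on: date, percent: int) -> None:
        """Record a manual progress update; 100 marks the action as complete."""
        self.progress = percent
        self.history[on] = percent
```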

   Besides recording students’ manual progress updates, Synergy uses the Google
Drive API to fetch and store the number of revisions each student makes to their work.
Using both progress and revision data, a simple visualization is generated for each
action, as the second component of the dashboard, which can be accessed by clicking
on the Monitor button (for the corresponding action). Fig. 2 shows the visualization
generated for the learning action “enhance the existing example of personalization.”
The list of actions and the associated dashboard can be accessed at any time by both the
owner student and the reviewing peer(s).
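   For concreteness, the sketch below illustrates how revision counts could be fetched
and aggregated per day with the Drive v3 API; the function name and the aggregation
step are our own, and the OAuth authorization flow is omitted:

```python
from googleapiclient.discovery import build  # pip install google-api-python-client

def count_daily_revisions(creds, file_id):
    """Count how many revisions a Google Doc received per day.

    `creds` is an authorized credentials object (authorization flow omitted);
    `file_id` identifies the student's document.
    """
    service = build("drive", "v3", credentials=creds)
    revisions, page_token = [], None
    while True:  # revisions.list is paginated
        resp = service.revisions().list(
            fileId=file_id,
            fields="nextPageToken, revisions(id, modifiedTime)",
            pageToken=page_token,
        ).execute()
        revisions.extend(resp.get("revisions", []))
        page_token = resp.get("nextPageToken")
        if not page_token:
            break
    # modifiedTime is RFC 3339 (e.g., "2020-03-01T10:15:00.000Z");
    # the first 10 characters are the date.
    daily = {}
    for rev in revisions:
        day = rev["modifiedTime"][:10]
        daily[day] = daily.get(day, 0) + 1
    return daily
```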
   In this visualization, the history of the number of revisions made in the work
(represented in gray) and the progress recorded on the chosen action (represented in
blue) are plotted as a line graph to help capture the change in each variable over time
and the interaction between them. The deadline for completing the action (set by the
students themselves) is also indicated by a vertical red dashed line. For example,
according to Fig. 2, the student appears to be struggling to complete the action:
progress remains at 50% even though the intended deadline has passed and effort in
revising the work has been increasing.




Fig. 2. Temporal visualization of student progress on actions versus number of revisions.
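   To make the design of this visualization concrete, the following is a minimal sketch
of how such a dual-axis line graph could be produced with matplotlib; the function
signature, the example data, and the twin-axis layout are our assumptions rather than
Synergy's actual implementation:

```python
import matplotlib.pyplot as plt

def plot_action_monitor(dates, revisions, progress, deadline):
    """Plot daily revision counts (gray) against recorded progress (blue),
    with the student-set deadline as a red dashed vertical line.

    `dates` is a list of date labels; `deadline` must be one of them.
    """
    fig, ax_rev = plt.subplots(figsize=(8, 3))
    ax_rev.plot(dates, revisions, color="gray", marker="o", label="Revisions")
    ax_rev.set_ylabel("Number of revisions")

    ax_prog = ax_rev.twinx()  # progress gets its own 0-100% axis
    ax_prog.plot(dates, progress, color="tab:blue", marker="o", label="Progress")
    ax_prog.set_ylabel("Progress (%)")
    ax_prog.set_ylim(0, 100)

    ax_rev.axvline(x=dates.index(deadline), color="red", linestyle="--")
    fig.legend(loc="upper left")
    fig.autofmt_xdate()
    fig.tight_layout()
    plt.show()

# Example: progress stuck at 50% past the deadline despite ongoing revisions
plot_action_monitor(
    ["Mar 01", "Mar 02", "Mar 03", "Mar 04", "Mar 05"],
    [1, 3, 2, 4, 6],
    [10, 30, 50, 50, 50],
    deadline="Mar 03",
)
```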


3.2    Participants

The targeted participant population was faculty or instructors with teaching experience
in higher education (for which the Synergy platform is intended). Participants were
recruited through email lists and an online invitation on social media. Eleven
participants agreed to take part in this study. They came from a variety of fields (such
as computer science, instructional technology, science education, and telematic
engineering) and had teaching experience ranging from 5 to 25 years, mostly in both
face-to-face and online settings. All but two had implemented a peer review activity at
least once in their past classes.

3.3    Data Collection and Analysis

Qualitative data were collected through an online survey comprising five open-ended
questions, given in Table 1. While the first question (Q1) inquires into participants’
opinions about self and peer monitoring as a component of the peer feedback activity,
the rest (Q2-Q5) focus on the way the monitoring capacity is implemented in Synergy.
We applied grounded theory [15] to analyze the responses to all questions. During the
analysis, open coding was first conducted to identify the emergent categories, followed
by axial coding to refine them and determine whether there were any subcategories.
Then, selective coding was performed to determine the final set of categories that best
describe instructors’ perspectives toward self and peer monitoring enabled through a
student-facing dashboard.

                            Table 1. The online survey questions.

 Q1    What is your opinion about students’ self-monitoring and peers’ monitoring of the
       student’s progress? Please provide your answer below.
 Q2    What do you think about the way students and peers are enabled to monitor the student
       progress on learning actions in Synergy (i.e., the page where all actions are listed along
       with the current progress)? Do you have any recommendations for a better
       implementation?
 Q3    What do you think about the temporal visualization of daily student progress and
       number of revisions? Is it useful and necessary? Do you think students and peers can
       make sense of it? Do you have any recommendations to enhance it?
 Q4    What do you think about the capacity of this visualization in terms of informing student
       actions (e.g., peers may contact the student to inquire about their progress, or students
       may change their study tactics themselves to progress better)?
 Q5    Would you prefer other types of support (e.g., automatic warnings, recommendations,
       etc.) to trigger peer or student action toward a better progress? Please provide your
       answer below.


4      Results

The analysis of the answers to all survey questions yielded four main categories: (1)
general perceptions, (2) benefits, (3) pitfalls, and (4) suggestions.
   First, instructors’ perceptions of self-monitoring were positive overall. The general
opinion was that the dashboard was clear and might therefore be useful for
self-monitoring. On the other hand, perceptions of peer monitoring were mixed.
Although several instructors considered it useful, most suggested including it as an
optional activity or discarding it altogether. Accordingly, they did not see much value
in the dashboard for peer monitoring.
   The instructors highlighted several benefits of self and peer monitoring. Self-
monitoring was considered useful mainly because it can help students track their
progress, identify and resolve any problems early, and organize their time effectively
to complete the task. One instructor also noted that it can motivate students and
increase their accountability during peer reviews, which is particularly critical in
online learning. The main benefit noted for peer monitoring was that being aware of
peers’ progress can enhance the collaborative learning process.
   The instructors noted several pitfalls, mostly associated with peer monitoring.
Some instructors noted that students are unlikely to engage in peer monitoring for
several reasons. For example, they may be inclined to think that it is their peers’
responsibility to monitor themselves, or that the instructor should be responsible for
monitoring the whole class. Several participants mentioned that peer review is itself a
time-consuming activity and students may not have time to monitor peers. Another
concern was that the feeling of being monitored by others might be bothersome for
some students and lead to undesired emotional states. One particular pitfall regarding
the dashboard was that the visualizations were built on subjective data (e.g., progress
updates entered by students); the participant raising this issue considered that it may
limit the impact of the dashboard in informing student actions. Another participant
indicated that students might have trouble interpreting the visualization.
   Last, there were several suggestions about the monitoring activity and the way it
was supported by the dashboard. The first was to enable instructors to turn monitoring
on or off according to their pedagogical intentions and learning designs. Second, it was
suggested to make the dashboards in Synergy more visible to encourage students’ use
of them for monitoring (e.g., by including them on the front page). Similarly, it was
advised to implement automatic reminders about the dashboards. Moreover, many
instructors recommended including automatic warnings as a complement to the
dashboards (such as warning peers about low progress, thus motivating them to check
the dashboard for the details). The last suggestion was to allow students to report their
progress in terms of “not started”, “in progress”, or “completed”, which would also
allow reporting which days the student has been active and how much time has been
devoted to each action. Synergy could then provide a separate visualization showing
the percentage of actions in each state.
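   To illustrate this last suggestion, the minimal sketch below shows how such a state
summary could be computed; the function is our own illustration, assuming the three
state labels proposed by the instructors:

```python
from collections import Counter

def state_percentages(action_states):
    """Share of actions in each reporting state, e.g., for a summary chart.

    `action_states` is a list of labels, one per action in the plan.
    """
    counts = Counter(action_states)
    total = len(action_states) or 1
    return {state: round(100 * counts.get(state, 0) / total, 1)
            for state in ("not started", "in progress", "completed")}

# -> {'not started': 25.0, 'in progress': 25.0, 'completed': 50.0}
print(state_percentages(["completed", "in progress", "not started", "completed"]))
```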


5      Discussion

The results provided some evidence that although monitoring holds potential to
enhance students’ regulation of learning, activating regulation in real-world practice
(with learning analytics) might be a challenge. Although the participating instructors
recognized the value of dashboard-enabled self and peer monitoring, they provided
several critical arguments highlighting some emergent issues.
   The participants agreed that self-monitoring can enhance student learning by
allowing students to be aware of their progress and to refine their study strategies in a
timely manner to achieve their learning goals (i.e., to self-regulate). To motivate
learners, a common suggestion was to make the self-monitoring support (via the
dashboard) more visible within the system. Relatedly, another recommendation was to
include automated system messages to remind students of the dashboards. That is, the
way the dashboards are integrated into Synergy may affect students’ access to them
and their use for self-monitoring; if the dashboards are more reachable by design,
students are more likely to use them. Currently, the dashboards are implemented as
embedded learning analytics [16], which students can access only by clicking to view
the learning actions. Based on this finding, we plan to implement a feature that sends
automatic reminders to students that they can monitor their own progress through the
dashboards in a separate interface.


   Several barriers to the use of the dashboard for monitoring were mentioned by the
instructors (e.g., lack of time, trouble interpreting the visualizations, uneasy feelings
about being monitored, and resistance to taking on the role of monitoring peers). We
argue that these barriers can be prevented through careful design of the activity
accompanied by effective training of the students. The feedback activity should itself
place greater emphasis on monitoring, which is not common in peer review activities.
Students may need further instruction on why monitoring themselves and their peers
is a critical component of the activity. This may also help increase students’ sense of
responsibility when self-reporting their progress. Thus, monitoring through the
dashboard should be integrated into the workflow of the peer feedback activity [17],
and students’ monitoring efforts can be graded as a subtask of the feedback activity.
   Moreover, students can be trained by letting them practice interpreting the
visualizations through several worked examples. We plan to create materials (e.g.,
videos) that instructors can use to provide effective training for students. In addition to
training, as suggested by the instructors, automatic warnings could be sent out to
students about their own or their peers’ progress, which would then prompt them to
explore what is happening through the dashboard. We plan to implement such
prescriptive analytics as a complement to the functionality of the existing dashboard.
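   As an illustration of such prescriptive support, the sketch below flags overdue
actions under a simple rule; the dictionary schema and the threshold logic are our
assumptions, not an implemented Synergy feature:

```python
from datetime import date

def overdue_actions(actions, today=None):
    """Flag actions whose deadline has passed while recorded progress is
    below 100%: candidates for an automatic warning to student and peers.

    Each action is assumed to be a dict like
    {"description": ..., "deadline": date(...), "progress": 50}.
    """
    today = today or date.today()
    return [a for a in actions
            if a["deadline"] < today and a["progress"] < 100]

# Example: one overdue action triggers a warning message
pending = overdue_actions(
    [{"description": "enhance the personalization example",
      "deadline": date(2020, 3, 3), "progress": 50}],
    today=date(2020, 3, 5),
)
for action in pending:
    print(f"Warning: '{action['description']}' is past its deadline "
          f"at {action['progress']}% progress.")
```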


6      Future Research

This work has some limitations that open opportunities for future research. First, the
number of participants is relatively low, and the sample includes only instructors (thus
missing the students, who are the targeted users of the dashboards), which limits the
generalizability of the findings. A future study with a larger set of participants is
planned. Additionally, we plan to evaluate the effectiveness of monitoring in a real-
world context where students can use the learning analytics dashboards for self and
peer monitoring during a peer feedback activity. In terms of methodology, focus group
interviews are planned in the evaluation studies to obtain richer data for a deeper
understanding of participants’ experiences and perspectives. Moreover, instructor-
facing dashboards can be implemented to support instructors in their efforts to
intervene and facilitate better regulation of learning.


References

[1]     R. F. Kizilcec, M. Pérez-Sanagustín, and J. J. Maldonado, “Self-regulated
        learning strategies predict learner behavior and goal attainment in Massive
        Open Online Courses,” Comput. Educ., vol. 104, pp. 18–33, 2016.
[2]     P. H. Winne and A. F. Hadwin, “Studying as self-regulated learning,” in
        Metacognition in educational theory and practice, D. J. Hacker, J. Dunlosky,
        and A. C. Graesser, Eds. Hillsdale, NJ: Erlbaum, 1998, pp. 277–304.
[3]     D. L. Butler and P. H. Winne, “Feedback and self-regulated learning: A
        theoretical synthesis,” Rev. Educ. Res., vol. 65, no. 3, pp. 245–281, 1995.
[4]     A. F. Hadwin, S. Järvelä, and M. Miller, “Self-regulation, co-regulation and
        shared regulation in collaborative learning environments,” in Handbook of self-
        regulation of learning and performance, 2nd ed., D. H. Schunk and J. Greene,
        Eds. New York, NY: Routledge, 2017.
[5]    J. Malmberg, S. Järvelä, and H. Järvenoja, “Capturing temporal and sequential
       patterns of self-, co-, and socially shared regulation in the context of
       collaborative learning,” Contemp. Educ. Psychol., vol. 49, pp. 160–174, 2017.
[6]    W. Matcha, A. Uzir, D. Gašević, and A. Pardo, “A systematic review of
       empirical studies on learning analytics dashboards: A self-regulated learning
        perspective,” IEEE Trans. Learn. Technol., in press, 2019.
[7]    I. Jivet, M. Scheffel, H. Drachsler, and M. Specht, “Awareness is not enough:
       Pitfalls of learning analytics dashboards in the educational practice,” in
       Proceedings of the 12th European Conference on Technology-Enhanced
       Learning, 2017, pp. 82–96.
[8]     S. B. Shum, R. Ferguson, and R. Martinez-Maldonado, “Human-centred
       learning analytics,” J. Learn. Anal., vol. 6, no. 2, pp. 1–9, 2019.
[9]    E. Er, Y. Dimitriadis, and D. Gašević, “Synergy: An Online Platform for
       Dialogic Peer Feedback at Scale,” in 13th International Conference on
       Computer Supported Collaborative Learning, Conference Proceedings Volume
       2, 2019, pp. 1005–1008.
[10]   E. Er, Y. Dimitriadis, and D. Gašević, “Synergy: A Web-Based Tool to
       Facilitate Dialogic Peer Feedback,” in 14th European Conference on
       Technology Enhanced Learning, EC-TEL 2019, 2019, pp. 709–713.
[11]   E. Er, Y. Dimitriadis, and D. Gašević, “Collaborative peer feedback and
        learning analytics: Theory-oriented design for supporting class-wide
        interventions,” Assess. Eval. High. Educ., in press, 2020.
[12]   C. P. Alvarez, R. Martinez-Maldonado, and S. B. Shum, “LA-DECK: A card-
       based learning analytics co-design tool,” in Proceedings of the Tenth
       International Conference on Learning Analytics & Knowledge, 2020, pp. 63–
       72.
[13]   K. J. Topping, “Trends in peer learning,” Educ. Psychol., vol. 25, no. 6, pp.
       631–645, 2005.
[14]   C. Phielix, F. J. Prins, P. A. Kirschner, G. Erkens, and J. Jaspers, “Group
       awareness of social and cognitive performance in a CSCL environment: Effects
       of a peer feedback and reflection tool,” Comput. Human Behav., vol. 27, no. 3,
       pp. 1087–1102, 2011.
[15]   B. G. Glaser and A. L. Strauss, The Discovery of Grounded Theory: Strategies
        for Qualitative Research. Chicago: Aldine, 1967.
[16]   A. Wise, Y. Zhao, and S. Hausknecht, “Learning Analytics for Online
       Discussions: Embedded and Extracted Approaches,” J. Learn. Anal., vol. 1,
       no. 2, pp. 48–71, 2014.
[17]   S. Charleer, J. Klerkx, E. Duval, and T. De Laet, “Creating effective learning
       analytics dashboards: Lessons learnt,” in 13th European Conference on
       Technology Enhanced Learning (EC-TEL 2016), 2016, pp. 42–56.