           Tales of a Companion Teacher Analytics

    Paul Libbrecht¹, Ulrich Kortenkamp¹, Sandra Rebholz², Wolfgang Müller²

    ¹ Center for Educational Research in Mathematics And Technology (CERMAT),
      Martin Luther University Halle-Wittenberg, Germany, http://cermat.org/
    ² MEVis, University of Education, Weingarten, Germany



          Abstract. In this paper we sketch a multi-step scenario of a teacher
          and her classroom using a few computer-based learning tools and
          their attached teaching-analytics views. The tools described are
          extensions of the authors' toolset (ComIn-M and SMALA) and
          others; the scenario should thus serve as a user story in the design of
          future analytical software that supports the teacher's management of
          her courses. Our sketch attempts to apply concepts of the instrumental
          approach, in which the computer-based artifact becomes an instrument as
          its usage acquires meaning in the user's practice, as well as concepts
          of classroom orchestration.

          Keywords: teaching analytics, math learning, teachers, adaptations,
          learning process, web-based learning


1        Introduction

The adoption of computer-based learning tools in schools has been slow. Most
teachers have met difficulties of multiple sorts: technology-enhanced learning
is often a nice excursion with multiple pitfalls, the permanent requirement of
a plan B, and the frequent experience of overwhelming technology. Some
categories of usage, especially in higher education, have been acquired (too
often only for communication purposes), but the teachers' lack of trust in
learning tools remains widespread.
    In order to alleviate this mistrust, approaches to model the usage of
computer-based tools by teachers have emerged, notably the instrumental
approach [Trouche 2005], which describes how the users and the learning and
teaching tools evolve through their mutual enrichment: the tool shapes the
user's perception of the feasible actions (the instrumentation), while the
user builds representations of the tool's states and adapts the tool to her
purposes (the instrumentalization). While this model helps to understand the
broad variability of achievements in using learning tools, or any
computer-based tools, it does not provide methods to empower the learners in
using the learning tools or to monitor the progress achieved.
    Derived from this approach, the concept of classroom orchestration has
been proposed to describe how the activities of the teachers, students, and
their use of technology-based tools are coordinated. Among the latest
publications on this subject, [Tabach 2013] attempts to list types of
orchestration, several of which we shall use in the scenario below. Classroom
orchestration explains the ways for the teacher to manage the activities so
that learning happens. However, it does not really explain how to assess the
learning performance and how to interpret this information to improve teaching
and learning appropriately.
    Teaching analytics [Vatrapu et al. 2011] provides ways for the learning to
be tracked at the scale of a classroom, that is, it provides ways to monitor
the effects of the orchestration. The analytical activity that the teacher can
perform leveraging log views [Rebholz et al. 2012] can lead her to better adapt
the teaching, taking into account the instrumentation and instrumentalization
that happened thus far and the one that she wishes to happen.
    The scenario we sketch below is based, mostly, on the SMALA toolset,
described in [Rebholz et al. 2012], with all mentioned and depicted
functionalities being implemented, except for Figure 5 and the facility to
spot and show. SMALA provides an infrastructure for automatically tracking
learning activities and recording the solutions and solution steps taken by
learners when they are working with computer-based tools. In addition to the
documentation of individual learning processes, SMALA also provides summaries
of the performance of the whole learning group. Visualizations support the
teacher in getting a quick overview of the class' activities and point to
common problems and possible misconceptions by representing the results
detected by the automatic assessment component of the various learning tools.
As opposed to the log views provided by other learning analytics toolsets
[Dyckhoff et al. 2012, Govaerts et al. 2011], SMALA does not focus on data
typically collected by learning management systems (e.g., the number of used
resources, the number of earned credit points) or advanced activity metrics,
but allows the representation and analysis of learning data down to the level
of the individual solution process.
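
    To make the kind of data such an infrastructure records more concrete, the
following minimal sketch shows one plausible shape of a tracked event and a
per-learner summary derived from it. The names (LogEvent, summarize_learner)
and the event kinds are purely illustrative assumptions, not SMALA's actual
data model or API.

```python
from collections import Counter
from dataclasses import dataclass
from typing import Iterable


@dataclass
class LogEvent:
    """One hypothetical tracked step of a learner working in a web-based tool."""
    user: str          # pseudonymous user label, e.g. "u01"
    exercise: str      # exercise identifier, e.g. "Ex1"
    kind: str          # assumed kinds: "input", "error-report", "hint-shown", ...
    correct: bool      # outcome of the automatic assessment, for "input" events
    timestamp: float   # seconds since the start of the session
    detail: str = ""   # e.g. an error identifier, when applicable


def summarize_learner(events: Iterable[LogEvent], user: str) -> Counter:
    """Count event kinds for one learner: the raw material of an individual log view."""
    return Counter(e.kind for e in events if e.user == user)
```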
    Based on this perspective, the following usage scenario outlines the
teaching practice in mathematics classrooms when using computer-based learning
tools integrated in a learning analytics infrastructure. Real-life examples of
technology-enhanced learning environments and details from current SMALA log
view visualizations illustrate the scenario and demonstrate the practical
relevance of the described setting.


2   Scenario: A Teacher’s Instrumenting Activity

Our hero is a university teacher educating future mathematics teachers. She
wishes to introduce the proper use of proofs by induction, a topic that is
well known to create lots of confusion in young students but remains quite
important for many proofs in mathematics. Thus, she decides that the usage of
a learning tool to train such proofs is desirable. ComIn-M [Rebholz &
Zimmermann 2011] is such a learning tool. It can be run in the web browsers of
contemporary laptops and desktops, which also access the learning management
system of the university for all students.
   The teacher contacts the editors of the learning tool, who provide her with
an online learning activity. There, she can read the instructions to deploy
the learning tool within the learning management system: simply uploading a
content package will create an online resource from which students can start
the learning tool. She will make it visible a bit later.
    Following the didactical design pattern technology on demand [Bescherer et
al. 2010], our teacher first presents a few situations of proofs by induction
and their typical errors in class and then introduces the usage of the tool.
To do so, she presents the tool and performs one complete exercise with it.
Somewhat similarly to the presentation in Figure 1,¹ our teacher is able to
connect in words and graphics the learning tools' representations and the
course's concepts. Her approach follows mostly the orchestration
explain-the-screen [Tabach 2013].

Fig. 1. Demonstrating the usage of a tool in class.

    At the end of the session, she invites the students to use the tool,
demonstrating how it can be started in the learning management system; one of
the exercises of this week's assignment is based on the learning tool.
    Because the exercises are optional, she cannot be sure that they will be
performed. As she feared, very few students actually attempted the requested
exercise, from what she can see in the log views in the figure below: only 4
of her 150 students have tried, and, as she can see in the assessment results
table on the right, none have succeeded. In the graphic below, red cells
represent wrong solutions, light red cells incomplete solutions, while (the
missing) blue cells would represent correct solutions.




Fig. 2. Simple summary log-views for the class: a few numbers, a graphical display by
user (u01, u02, ..) and exercise (Ex1, Ex2, ...).
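
    Continuing the hypothetical LogEvent record from the earlier sketch, a
cell of such a user-by-exercise grid could be classified along the following
lines. The classification rule is an assumption made for illustration, not the
actual SMALA assessment logic.

```python
def exercise_status(events, user, exercise):
    """Classify one cell of the user-by-exercise grid (illustrative rule only).

    Any correct submission wins; otherwise a submitted-but-wrong attempt shows
    as "wrong" (red), started-but-unsubmitted work as "incomplete" (light red),
    and no events at all as "not attempted" (empty cell).
    """
    relevant = [e for e in events if e.user == user and e.exercise == exercise]
    if not relevant:
        return "not attempted"
    if any(e.kind == "input" and e.correct for e in relevant):
        return "correct"      # blue cell
    if any(e.kind == "input" for e in relevant):
        return "wrong"        # red cell
    return "incomplete"       # light red cell
```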

   For the following exercise session, thus, a plan change is communicated so
that students come with their laptops to the university. The objective of the
teacher is to let the students go through as many of the ComIn-M exercises as
possible, in small groups in front of the laptops, while keeping her eyes wide
open to ensure that they are progressing.
1
    This presentation, about another learning tool, is available at
    http://www.youtube.com/watch?v=jHBYtbdEic4.
    During the help session, students are first given a briefing on the
mission they are to pursue, together with a set of practical and strategic
instructions. Most of the rest of the class is spent in the classroom
orchestration monitor-and-guide, where the teacher, and possibly teaching
assistants, come to each screen providing individualized help on demand.
Typical help requests are answered in just a few minutes, in an attitude
similar to that pictured in Figure 3.

Fig. 3. Monitor and guide.
    The teacher's work there generally involves understanding the students'
states, what they have done to reach them (which can be shown or told by the
students, e.g., a particular type of problem which keeps being reported by the
learning tool), and what understanding led them to make these manipulations.
Understanding the user's actions involves the expertise of the teacher,
guessing the steps the learners have done or asking about them, and advising
on the best possible continuation. In the situations where we have attempted
this orchestration (described in [Libbrecht et al. 2012]), the teachers have
not used the my log feature of SMALA as it would take them away from the
learning tools.

Fig. 4. An example student screen.
    The decision to stop and help can follow either a student's initiative or
a teacher's observation. This observation can be over the shoulder or based on
some analytics representations.
    Hypothetically, the prototype log view in Figure 5 could be displayed on a
screen that the teacher has in her pocket, on her desk in a course, or on the
class's projector when named logging is agreed upon. It indicates, in one row
per group, the amount of manipulations (buttons, clicks, in formulæ input and
elsewhere, ...), the amount of correct inputs, the amount of reported errors,
and the amount of displayed hints. Such a view, for example, indicates that
the Riemann group is employing many hints without reaching many successful
inputs, or that the Gauss group is encountering multiple errors but is
nonetheless progressing. This information supports the students in situating
each other and supports the teacher in deciding where to best offer guidance.

Fig. 5. A prototype log view, displaying facets of the recent activity of each
group that could direct a teacher in a classroom to help the group that needs
it most.
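
    A per-group summary of this kind could, hypothetically, be aggregated from
the same illustrative event records as follows. The group membership set, the
event kinds, and the time cut-off are assumptions for the sketch, not a
description of the prototype's implementation.

```python
def group_activity(events, members, since):
    """Aggregate the recent events of one group into the four facets of the
    prototype view: manipulations, correct inputs, reported errors, hints.

    `members` is the set of user labels in the group and `since` a timestamp
    cut-off; both are illustrative assumptions.
    """
    recent = [e for e in events if e.user in members and e.timestamp >= since]
    return {
        "manipulations": sum(1 for e in recent if e.kind == "input"),
        "correct inputs": sum(1 for e in recent if e.kind == "input" and e.correct),
        "errors": sum(1 for e in recent if e.kind == "error-report"),
        "hints": sum(1 for e in recent if e.kind == "hint-shown"),
    }
```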
    The classroom of our teacher is pretty large, however, and overall
measures are necessary to assess the overall advances of the class. One of
them can be the display of the successes as in Figure 6, from the SMALA tool.
One should note that users are not named there and that, although the teacher
could drill down and analyze the sessions in more detail to identify the
student, this precise display remains suitable for the overhead projector of
the classroom to show progress. This design for anonymity has been a central
element of the SMALA toolset, targeted to minimize students' fear of being
constantly observed.

Fig. 6. An overall classroom progress view.
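
    A minimal sketch of this anonymity idea, assuming that the server assigns
stable course-local labels such as u01 and u02 and never exposes the real
names to the displayed views; the function below is illustrative, not SMALA's
actual mechanism.

```python
def pseudonymize(real_user_ids):
    """Assign stable course-local labels "u01", "u02", ... to real user ids.

    Only the label is ever written to log views or projected displays; the
    mapping stays on the server (illustrative sketch of the design, not the
    actual SMALA mechanism).
    """
    ordered = sorted(real_user_ids)
    return {uid: "u{:02d}".format(i) for i, uid in enumerate(ordered, start=1)}
```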
    Similarly, this classroom benefits from the display of individual problems
that the learning tool has automatically reported. ComIn-M is an intelligent
tool, with a computer algebra system behind it and a broad set of errors that
are automatically reported to the students on the basis of individual
problem-solving steps. The SMALA log view below, Figure 7, displays counts of
error reports. Hovering the mouse over an individual error displays its text.




Fig. 7. Statistics of the problems reported by the learning tool, with the
tooltip text indicating the complete text reported to the students. This text
can be translated as: Complete Induction: Basis: provide a start value for the
variable.
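
    Such per-error statistics could, in principle, be derived from the same
illustrative event records by counting error identifiers and pairing them with
their student-facing text; error_statistics and the error_texts mapping are
assumptions for the sketch, not ComIn-M's or SMALA's actual API.

```python
from collections import Counter


def error_statistics(events, error_texts):
    """Count how often each automatically reported error occurred and pair the
    error identifier with the full student-facing text (the tooltip content).

    `error_texts` maps hypothetical error identifiers to their message text.
    """
    counts = Counter(e.detail for e in events if e.kind == "error-report")
    return [(error_id, n, error_texts.get(error_id, "?"))
            for error_id, n in counts.most_common()]
```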
    This list informs the teacher of typical errors and advises her to pause
the monitor-and-guide orchestration to perform an explain-the-screen
orchestration, explaining the typical errors and how they can be avoided. The
potential feature to continue a started session on the projector would even
allow this error to be explained in the context of realistic student work,
applying the spot-and-show orchestration.
    Moreover, such pauses are likely to be very necessary for the teacher to
stop the students' attempts on individual exercises, demonstrating a solution
to them on the projector or blackboard so that they can apply it on the
computer successfully and attempt other exercises.



3     Conclusion

The example scenario described in this paper covers just a handful of the
analytics features that one would wish a teacher to use to enhance her course.
It is likely that other features would support other adaptation mechanisms.
    Among others, multiple didactical designs exist which involve learning
tools that perform less automatic evaluation than the ComIn-M tool. Many of
them are described in the learning scenarios on http://remath.cti.gr, for
example. The assessment of the achievements is then left to the teachers.
Analytics tools can still apply in this situation, but in quite different
ways. For example, it is likely that summarization of the input documents is
quite important. As another example, the simple display of miniaturized
geometry constructions delivered as assignment results may help the teacher to
rapidly spot an interesting contribution that she can analyze with the
classroom.
    The scenario we have described included references to the classroom
orchestration types in action. We claim that this provides an effective way to
describe the organization of the learning situation using the computer-based
tools in the learning place. It is likely that the summary of orchestration
types presented in [Tabach 2013] is insufficient to cover the more scattered
use of learning tools, blended between exercise rooms, home, libraries, or
multiple other locations enabled by mobile devices, especially at the higher
education level. However, it still provides a good language to describe the
practices of the teachers and learners.

Future work

The scenario that we have described essentially extends the SMALA toolset
[Rebholz et al. 2012]. This toolset has been evaluated by 150 students and 7
teachers, as described in [Libbrecht et al. 2012], but with a much more
preliminary set of log views. Following these evaluations and interviews with
teachers, the features described here are all implemented, except for the
future work below.² We see the following desirable extra features:
2
    SMALA is open source software documented at
    http://sail-m.de/sail-m/SMALA_en
 – A possibility to enroll the students in a fully tracked environment where
   the teacher is able to follow a dashboard summarizing the individual users'
   work by name and decide to advise so as to make the students advance in a
   more precise direction. However, in such an environment, concerns about
   privacy and acceptance on the side of the learners have to be carefully
   checked and taken into account.
 – A possibility for the learning tool to transport the solution path taken by
   a student and transmit it to another place where other students can see it
   (such as the class projector). SMALA offers this function already (see
   scenario 3 of [Rebholz et al. 2012]), but the ability to resume the
   exercise from there would better complement the spot-and-show
   orchestration, as well as other forms of learning collaboration (for
   example, the request for help between peers).
 – The scenario we have described shows strong differences between the
   analytics views, each linked to a particular classroom orchestration, and
   potentially to a particular position (teacher private, student shared,
   student private, ...). It is likely that orchestration types serve as good
   names for the teacher to denote the various analytics views.


References

[Bescherer et al. 2010] Bescherer, Christine, Spannagel, Christian &
Zimmermann, Marc: The TECHNOLOGY ON DEMAND Pattern. http://sail-m.de/sail-m/TechOD_en
    [Dyckhoff et al. 2012] Dyckhoff, A.L., Zielke, D., Bültmann, M., Chatti,
M.A. & Schroeder, U. (2012): Design and Implementation of a Learning Analytics
Toolkit for Teachers. In: Journal of Educational Technology & Society, Vol.
15/3, 58-76.
    [Libbrecht et al. 2012] Libbrecht, Paul, Rebholz, Sandra, Herding, Daniel,
Müller, Wolfgang & Tscheulin, Felix (2012): Understanding the Learners' Actions
when Using Mathematics Learning Tools. In: Intelligent Computer Mathematics,
Proceedings of CICM 2012, Bremen, Germany, July 8-13, 2012, Lecture Notes in
Computer Science, Vol. 7362, Jeuring, J.; Campbell, J.; Carette, J.; Dos Reis,
G.; Sojka, P.; Wenzel, M.; Sorge, V. (Eds.), Springer.
    [Rebholz et al. 2012] Rebholz, Sandra, Libbrecht, Paul & Müller, Wolfgang
(2012): Learning Analytics as an Investigation Tool for Teaching Practitioners.
In: Towards Theory and Practice of Teaching Analytics 2012, Proceedings of
TaPTA 2012, CEUR-WS Volume 894, Ravi Vatrapu, Wolfgang Halb, Susan Bull (eds).
Available from http://ceur-ws.org/Vol-894/.
    [Rebholz & Zimmermann 2011] Rebholz, Sandra & Zimmermann, Marc (2011):
Applying Computer-Aided Intelligent Assessment in the Context of Mathematical
Induction. In: eLearning Baltics 2011: Proceedings of the 4th International
eLBa Conference, pages 43-51, Stuttgart: Fraunhofer Verlag.
    [Govaerts et al. 2011] Govaerts, S., Verbert, K. & Duval, E. (2011):
Evaluating the Student Activity Meter: Two Case Studies. In: Proceedings of
the 9th International Conference on Advances in Web-based Learning - ICWL
2011, volume 7048, pages 188-197, Springer.
    [Trouche 2005] Trouche, Luc (2005): An Instrumental Approach to
Mathematics Learning. In: K. Ruthven, D. Guin and L. Trouche (eds.), The
Didactical Challenge of Symbolic Calculators, 137-162, Springer.
    [Tabach 2013] Tabach, Michal (2013): Developing a General Framework for
Instrumental Orchestration. In: Conference on European Research on Mathematics
Education, CERME8, Jana Trgalova and Hans-Georg Weigand (editors). Available
from http://cerme8.metu.edu.tr/wgpapers/wg15_papers.html
    [Vatrapu et al. 2011] Vatrapu, R., Teplovs, C., Fujita, N., & Bull, S.
(2011): Towards Visual Analytics for Teachers' Dynamic Diagnostic Pedagogical
Decision-Making. In: LAK '11 Proceedings of the 1st International Conference
on Learning Analytics and Knowledge, 93-98. ACM, New York, NY, USA.