    Opening the Black Box of Practice-Based Learning: Human-
              Centred Design of Learning Analytics

         Nina Valkanova 1, Mutlu Cukurova 2, Annelie Berner 1, Katerina Avramides 2, Manolis Mavrikis 2
              1 Copenhagen Institute of Interaction Design, {n.valkanova, a.berner}@ciid.dk
             2 UCL Knowledge Lab, University College London, {m.cukurova, k.avramides, m.mavrikis}@ucl.ac.uk

         Abstract: Practice-based learning activities are an important aspect of education,
         particularly in science, technology, engineering and mathematics (STEM) subjects. Their
         importance to STEM curricula is unequivocal, and so is teachers' and students' need for
         support during these activities. However, given the open-ended and hands-on nature of
         practice-based learning activities, designing, deploying and validating learning analytics
         visualisations remains a significant challenge for state-of-the-art learning analytics. In
         this paper, we present our human-centred contextual inquiry approach for generating
         requirements, together with its preliminary results in the form of visualisations that have
         the potential to support facilitators and learners. Although there have been several
         attempts to provide learning analytics for increasing awareness, supporting reflection and
         facilitating decision-making and intervention, to our knowledge the research presented
         here is the first attempt to provide such information about students' progress during
         practice-based learning activities.

        Keywords: practice-based learning, contextual inquiry, learning analytics, feedback, visualizations

Introduction and Background
Practice-based learning is considered an essential part of STEM teaching and learning (Millar, 2004).
Guidance is essential in these activities (Clark, 2009), since allowing students to work independently
does not always lead to meaningful learning outcomes (Cukurova & Bennett, 2014). Facilitators of
practice-based learning activities, as well as learners, need tools that can provide them with indicators of
learning processes, in order to support teacher monitoring and learner self-reflection and self-regulation
(Dillenbourg et al., 2011). Yet little is known about what should and can be presented to teachers and
learners in practice-based learning environments. In this paper, we present our visualisation tool, based
on the outcomes of a contextual inquiry into practice-based STEM teaching and a human-centred,
iterative design methodology.

Learning Visualizations
A number of visualisation tools have been developed for online, face-to-face, and blended learning
settings, where digital trace data is more readily available. Most of these attempts aim to support teachers
(Verbert et al., 2014), but some applications have also been developed to support students' awareness and
self-reflection (e.g. Govaerts, Verbert, Klerkx, & Duval, 2010). Researchers have developed visualisations
of students' access to resources, their communication patterns in forums, and the frequency and timing of
their activities (e.g. Coffrin, Corrin, de Barba, & Kennedy, 2014). Such visualisations enable teachers to
provide better support, for example by identifying patterns of participation and intervening in problematic
groups (Van Leeuwen, Janssen, Erkens, & Brekelmans, 2014). Similarly, student learning in intelligent
tutoring systems is more easily tracked, and several visualisation tools have been developed to provide
students with information on their progress (e.g. Lafford, 2004).
         These solutions, however, do not necessarily transfer to open-ended, practice-based learning,
where the technical challenges are very different and the usability and pedagogical requirements are not
yet well understood. First, practice-based learning activities usually take place simultaneously in multiple
groups of students, sometimes across a range of physical spaces and a large time-span. In addition, the
diverse set of digital and non-digital activities cannot always be tracked, which keeps practice-based
learning largely out of the scope of current learning analytics trends, despite its immense importance to
STEM curricula. We are interested in investigating whether learning analytics can support the challenging
role of the teacher or facilitator in such settings and/or help students reflect on their own practice. The
challenges teachers face during practice-based learning, particularly in formal education, are well
documented.

Teachers are rarely aware of the processes followed by students during these activities (Race, 2001), and it
is challenging for them to provide appropriate support to individual students, who have different needs,
strengths and weaknesses (Zhang, Zhao, Zhou, & Nunamaker, 2004).

Teachers can only be aware of what a small number of students are doing at any one time in the classroom.
It is therefore hard for teachers to know which students are making progress and which are in difficulty
and in need of additional support. It is a challenge for teachers to understand the process by which students
have arrived at the current state of their practice-based activities, and thus to provide appropriate guidance.

Assistance Tools for Collaborative Digital Learning Environments
An area with similar challenges, from which we have sought inspiration, is that of teacher assistance or
awareness and reflection tools for collaborative or open-ended digital learning environments. As in
practice-based learning, this requires much more than providing simple descriptive statistics of students'
activities. For example, with the aim of supporting students' meta-cognitive processes in science and
mathematics education, the METAFORA project developed a bespoke digital platform where students
undertake collaborative challenges and describe and enact their plans while working with open-ended
environments or games (Dragon et al., 2013). Tracking student activity allows data to be aggregated and
visualised for the teacher as timelines or other charts. Earlier work looked into providing synchronous
information to support timely teacher intervention, utilising the now-familiar traffic-light metaphor to
show which students are active, inactive or in need of help in an exploratory digital environment for
mathematics (Gutierrez-Santos, Geraniou, Pearce-Lazard, & Poulovassilis, 2012). Roman et al. (2012)
explored patterns of collaborative conversation at a non-interactive table, aiming to provide information
about students' learning process; Martinez-Maldonado et al. (2013) investigated students' collaborative
interactions during their work on an interactive tabletop; Gutierrez-Santos et al. (2012) looked at students'
learning progress and need for help in the context of learning programming; and Mercier et al. (2015)
studied the collaborative problem-solving process within the context of multi-touch technology. Although
the aforementioned work points to the potential of tools for increasing awareness, supporting reflection
and facilitating decision-making and intervention, to our knowledge the research presented in this paper is
the first attempt to provide such information about students' progress during practice-based learning
activities.

Contextual Inquiry into Practice-based STEM Learning
It is by now well understood that the design and evaluation of learning analytics tools targeted at teachers
(or facilitators in general) and learners require techniques and methods from different disciplines, such as
software engineering, human-computer interaction and education (Martinez-Maldonado et al., 2015). As
discussed in detail by Martinez-Maldonado et al. (2015), while software engineering and human-computer
interaction have many methods to offer for establishing technical or usability requirements and for
evaluating systems, they may underestimate the learning context. In our previous experience from
participatory design, particularly with teachers, the lack of prior experience with tools that can support
decision-making makes it very difficult to elicit requirements (Mavrikis, Gutierrez-Santos, Geraniou,
Noss, & Poulovassilis, 2013). Instead, on such occasions it is necessary to adopt methodologies that
appreciate the need to give participants the opportunity to directly experience a situation and provide
meaningful feedback (Mavrikis et al., 2013).
          Several methodologies have been used in the last few years for designing and evaluating learning
analytics tools. One approach that is particularly well suited to our aims is the so-called Learning
Awareness Tools User eXperience (LATUX) workflow (Martinez-Maldonado et al., 2015). It was recently
put forward as an approach to designing and deploying awareness tools in the classroom through an
iterative process of problem definition, low- and higher-fidelity prototypes, pilot studies and validation
in-the-wild sessions, which can help designers pay attention to the pedagogical requirements underlying
the use of the awareness tools under design. However, even the initial 'problem identification' stage
requires recognising that an in-depth understanding of user behaviour can only be achieved by following a
human-centred design process that observes and analyses situations in their actual contexts. This is the
main advantage of contextual design approaches or contextual inquiry (Beyer & Holtzblatt, 1998). Hence,
in order to understand practice-based learning practices, situate our work in the context of real users and
uncover potential for technology support, we commenced by conducting a contextual inquiry into several
STEM learning environments.


Method
Our contextual inquiry was based on the ethnographic method (Hammersley & Atkinson, 1995),
combining participative and observational approaches. We visited ten formal educational institutions in
four European countries and interviewed 25 STEM teachers and facilitators. We asked questions about the
learning environment and the people, spaces and materials involved in the learning process. Each
interview lasted 1.5 to 2 hours and was digitally audio-recorded with participants' permission. The
interviews were conducted face-to-face and were later transcribed verbatim for analysis. Additionally, we
conducted a total of nine hours of in-situ observations during STEM classes in the same educational
environments.
          We focused on gaining insights into class dynamics and interaction with learning materials, as
well as between peers and teachers, within different learning settings. We complemented our data with
opportunistic, conversational interviews with a total of 15 students at the end of the observational sessions.
Our contextual inquiry was guided by two main research objectives: 1) to understand the practices of
teachers and learners and their attitudes to learning in the face of material, spatial and logistic constraints,
and how technological tracing and data-analytic augmentation could support them; and subsequently 2) to
explore the design of visualizations of practice-based learning activities that can capture aspects of the
hands-on, open-ended, collaborative nature of practice-based STEM learning.
          Thematic analysis was performed, applying an iterative coding scheme with a mix of deductive
and inductive codes (Braun & Clarke, 2006). The resulting coding scheme included learning activities;
motivations and attitudes towards tutoring, assessment and the learning process; challenges; and the
socio-material and socio-spatial relationships between users, materials and spaces in the learning process.
While a detailed discussion of the thick descriptions of the resulting findings is beyond the scope of this
paper, in the following we present a summarised set of opportunity areas for research and design of
technological data-driven augmentations for practice-based learning.

Findings from the Contextual Inquiry Study
     1) Support Replay and Self-tracking: Hands-on demonstrations are an often-employed teaching
        strategy, as teachers believe it is necessary as well as stimulating for students to see the correct
        step-by-step execution of a hands-on activity (e.g. building a circuit) and to comprehend and
        reflect on the steps behind it. This practice also applies to teachers' in-class tutoring patterns,
        which often include conducting hands-on mini-demonstrations with individual groups, live-coding
        in front of the class to highlight specific problems or error patterns, or 'reverse engineering'
        students' current outcome in order to find coding or circuitry problems. However, with several
        individual groups at different levels of knowledge, it is often difficult (or impossible) to trace
        their mistakes and 'replay' the errors.
     2) Capture and Visualize Programming / Hands-on Issues: Teachers report that they often become
        aware of students' difficulties during programming and hands-on activities too late, when
        students are already stuck on larger, more complex issues. They feel unable to supervise several
        student groups simultaneously, and students also often lack the motivation and self-regulation
        skills to identify and report issues.
     3) Promote and Leverage Documentation: According to teachers, documentation is increasingly
        integrated into curricula and assessment criteria. Although valued, its implementation during the
        learning process was found to be a challenge. Students, on the other hand, find it tedious and
        make incomplete, unreflective posts. Nevertheless, they enjoy documenting with digital tools
        (e.g. Facebook).
     4) Support Immediate, Opportunistic Means for Feedback and Documentation: Documenting can
        be disruptive to learners, especially in hands-on learning environments. Playful, opportunistic
        mobile documentation could facilitate the process, complemented by a system that tracks and
        captures important learning events.
     5) Support Non-linear Tutoring and Orchestration: Teachers note that in-class tutoring of hands-on
        activities is highly intense and dynamic, requiring teachers' attention and engagement at
        multiple levels. Teachers need to walk around, observe and visit students, and attend to their
        questions and problems, while keeping track of and giving feedback to other students (who
        sometimes might not even need it). Yet teachers can be in only one place at a time, which makes
        it challenging to attend to specific students' needs and orchestrate their feedback well.
        Combining a tracking system that is aware of students' issues or feedback requests with
        on-demand visual feedback through situated, distributed devices could provide the means to
        overcome the inevitably 'sequential' nature of teachers' feedback dynamics and allow them to
        prioritise and orchestrate their tutoring.
     6) Multi-purpose Spaces and Dynamics: Students often use school spaces for multiple purposes,
        such as the workshop for brainstorming rather than just product work. Tracing their presence in
        these various spaces might yield information about their project development paths.
     7) Capture and Visualize Collaboration: Teachers believe that collaboration is an important
        process for learning and an effective way to expand and reinforce one's knowledge. They try to
        develop a positive attitude in students towards cooperation with others by organising teamwork
        activities. Collaboration is assessed through continuous observation of teamwork; teachers
        usually keep track of it through personal observation notes that feed into the learner's overall
        'assessment' profile at the end of the course. However, as teachers point out, this assessment
        strategy is highly subjective and difficult to track, and thus it remains challenging to capture
        collaborative skills effectively.

These findings were then mapped to the design features of our visualisations, as presented in Figure 1
below.




                      Figure 1. Mapping the contextual inquiry findings to design features

Prototyping Visualisations
After two prototyping iterations, we developed the visualisation presented in Figure 2, which corresponds
to the findings of our contextual inquiry study in practice-based learning environments.




      Figure 2. Snapshot of the visualization designed taking into account the findings of the contextual
                                               inquiry study


As can be seen, it includes labels for each component of the visualisation:
A. Visualization of Physical Computing Kit Activity: Using an Arduino-based Smart Learning Kit, we
    were able to visualize the hardware and software components in use and the time spent using them.
    For example, "BTN" represents the students' use of the button component while making a physical
    computing project. They clearly use it throughout the whole working session, yet they use the "ACR"
    (accelerometer) much less frequently, revealing project development patterns.
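To make the underlying data concrete, the following TypeScript sketch illustrates one way such kit
activity could be modelled and aggregated into per-component timeline tracks. The KitEvent and
UsageInterval types, the field names and the toIntervals helper are our own illustrative assumptions, not
the actual PELARS data format.

```typescript
// A minimal sketch, assuming a simple (dis)connection event stream from the
// kit; names and types are illustrative, not the actual PELARS data format.
type ComponentKind = "input" | "output";

interface KitEvent {
  componentId: string;              // e.g. "BTN" (button), "ACR" (accelerometer)
  kind: ComponentKind;
  action: "connect" | "disconnect"; // physical or software (dis)connection
  timestamp: number;                // milliseconds since the session started
}

interface UsageInterval {
  componentId: string;
  kind: ComponentKind;
  start: number;
  end: number;
}

// Pair each connect event with the next disconnect of the same component,
// yielding the intervals drawn along that component's timeline track.
function toIntervals(events: KitEvent[], sessionEnd: number): UsageInterval[] {
  const open = new Map<string, KitEvent>();
  const intervals: UsageInterval[] = [];
  const sorted = [...events].sort((a, b) => a.timestamp - b.timestamp);
  for (const e of sorted) {
    if (e.action === "connect") {
      open.set(e.componentId, e);
    } else {
      const started = open.get(e.componentId);
      if (started) {
        intervals.push({ componentId: e.componentId, kind: e.kind,
                         start: started.timestamp, end: e.timestamp });
        open.delete(e.componentId);
      }
    }
  }
  // Components still connected when the session ends run to sessionEnd.
  for (const started of open.values()) {
    intervals.push({ componentId: started.componentId, kind: started.kind,
                     start: started.timestamp, end: sessionEnd });
  }
  return intervals;
}
```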




                   Figure 3. Physical computing tools' presentation in the visualisation
B. Representation of Physical Computing Tools: We chose to represent the physical connection of a
    component as a solid thin line and its software use as a rectangle, each extending for the period of
    time during which the component was physically or digitally connected (Figure 3). The colour of a
    component's visual representation depends on whether it is an input (button, sensor, etc.) or an
    output, thus aligning with the physical elements' own placards. Each connection made is represented
    as a triangle on the connected element, and each disconnection as a square, both placed in line with
    that element's linear representation track.
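The encoding rules above can be summarised in code. The sketch below reuses the UsageInterval type
from the previous sketch and maps each interval to the rectangle, triangle and square marks just described;
the Mark type, coordinate scheme and colour values are assumed placeholders, not the prototype's actual
styling.

```typescript
// A sketch of the visual encoding described above, reusing UsageInterval
// from the previous sketch. Shapes, colours and coordinates are assumed
// placeholder values.
interface Mark {
  shape: "rect" | "triangle" | "square";
  x: number;      // pixels from the start of the session timeline
  width: number;  // 0 for the point-like connection/disconnection markers
  y: number;      // vertical position of the component's track
  color: string;
}

const TRACK_HEIGHT = 14; // assumed pixel height of one component track

function encodeInterval(iv: UsageInterval, trackIndex: number,
                        pxPerMs: number): Mark[] {
  const y = trackIndex * TRACK_HEIGHT;
  // Colour distinguishes inputs (buttons, sensors) from outputs.
  const color = iv.kind === "input" ? "#2b8cbe" : "#e6550d";
  return [
    // The period of use extends as a rectangle along the track...
    { shape: "rect", x: iv.start * pxPerMs,
      width: (iv.end - iv.start) * pxPerMs, y, color },
    // ...with a triangle marking the moment of connection...
    { shape: "triangle", x: iv.start * pxPerMs, width: 0, y, color },
    // ...and a square marking the moment of disconnection.
    { shape: "square", x: iv.end * pxPerMs, width: 0, y, color },
  ];
}
```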
C. Sentiment Feedback Visualisation: We visualized the button presses from the Sentiment Feedback
     Buttons (designed as part of the prototype) with a lightbulb icon (positive sentiment, e.g. "eureka
     idea") and a storm-cloud icon (negative sentiment, e.g. "frustrated", "stuck"). The icons were
     displayed over the visualization timeline at the moment of the corresponding button press.
D. Screenshots From the Workstation and the Computer Screen: We implemented a snapshot capability
     in each of the Sentiment Buttons such that, when pressed, an overview camera at the workstation is
     triggered to take a picture of the students' working environment. At the same time, the button press
     triggers the system to take a screenshot of the computer screen. Snapshots expand on mouse hover
     and swap on click to show the other image associated with the same moment.
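A hedged sketch of this capture pipeline is given below. The captureWorkstationPhoto and
captureScreenshot functions are assumed hooks for the prototype's camera and screen-grabbing facilities
(not an actual documented API); only the triggering logic reflects the behaviour described above.

```typescript
// A sketch of the capture pipeline; the two capture functions are assumed
// hooks, declared rather than implemented here.
type Sentiment = "positive" | "negative"; // lightbulb vs. storm-cloud icon

interface SentimentEvent {
  sentiment: Sentiment;
  timestamp: number;       // position of the icon on the timeline
  workstationPhoto: Blob;  // overview camera shot of the workspace
  screenshot: Blob;        // computer screen at the same moment
}

declare function captureWorkstationPhoto(): Promise<Blob>; // assumed hook
declare function captureScreenshot(): Promise<Blob>;       // assumed hook

// One button press logs the sentiment and triggers both captures, so the
// image pair documents the moment the student signalled it.
async function onSentimentButtonPress(
    sentiment: Sentiment, log: SentimentEvent[]): Promise<void> {
  const timestamp = Date.now();
  const [workstationPhoto, screenshot] = await Promise.all([
    captureWorkstationPhoto(),
    captureScreenshot(),
  ]);
  log.push({ sentiment, timestamp, workstationPhoto, screenshot });
}
```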
E. Overall Timeline with a Manipulable Interface: A student or teacher can choose a slice of time as
     small as one minute or expand the slice to the full length of the session. They can look at the minute
     of a 'frustration' button press and see which modules were in use. They can also zoom out and look
     at the data patterns over the full period of project development. This view reveals patterns of usage
     behaviour, such as progression in complexity.
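The time-slice interaction can be thought of as a simple filter over the logged events and usage intervals.
The sketch below, reusing the types from the earlier sketches, shows one plausible implementation; the
TimeSlice type and the clipping behaviour are our assumptions about a sensible default, not the
prototype's actual code.

```typescript
// A sketch of the time-slice selection behind the manipulable timeline,
// reusing UsageInterval from the earlier sketches.
interface TimeSlice { start: number; end: number; } // ms since session start

// Point events (e.g. sentiment button presses) inside the selected slice.
function eventsInSlice<T extends { timestamp: number }>(
    events: T[], slice: TimeSlice): T[] {
  return events.filter(
    e => e.timestamp >= slice.start && e.timestamp <= slice.end);
}

// Usage intervals overlapping the slice, clipped to its bounds, so that a
// one-minute view still shows components that stay connected across it.
function intervalsInSlice(intervals: UsageInterval[],
                          slice: TimeSlice): UsageInterval[] {
  return intervals
    .filter(iv => iv.end >= slice.start && iv.start <= slice.end)
    .map(iv => ({
      ...iv,
      start: Math.max(iv.start, slice.start),
      end: Math.min(iv.end, slice.end),
    }));
}
```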
          We are now at the stage of evaluating our visualisation in real-world teaching environments. We
are interested in finding out how educators and students engage with learning visualizations of data
originating from their practice-based work, and in particular how such visualizations support students'
reflection, discussion and self-regulation, as well as educators' awareness and assessment of the learning
process. Our initial feedback from teachers and students suggests that the visualization could support
valuable processes within practice-based STEM learning and teaching. Some of the most salient are
students' collective post-reflection on, and debriefing of, specific difficulties within a project, and the
facilitation of communication about those issues within the group and with the teacher. However, we
would like to evaluate the visualisation in formal and informal teaching environments, using robust
research criteria and larger samples, in order to draw firmer conclusions about its potential use in
classrooms.



Conclusions and Implications
In this paper, we presented our human-centred process for generating visualisations of face-to-face,
practice-based learning activities, based on a contextual inquiry study of real-world settings. Like our
colleagues (Yu & Nakamura, 2010), we believe that technology can capture only certain aspects of student
interactions in learning situations as rich as practice-based learning activities. Hence, it is challenging to
present the practice-based learning process as a whole. However, our visualisation reflects aspects of the
learning process that teachers consider important and, as our initial feedback sessions demonstrate, this
approach can be valuable for providing support to both teachers and students. Yet there are other elements,
identified in our studies as relevant, that encourage future investigation, such as capturing and visualising
more heterogeneous types of activities (e.g. sketching), or surfacing collaboration patterns. We hope that
our visualisation will generate a productive discussion at the workshop, and that we will receive feedback
both on our visualisation of the practice-based learning process and on our approach to designing it.

Acknowledgments
This work was funded by the PELARS project (GA No. 619738) under the Seventh Framework
Programme of the European Commission.

References
Beyer, H., & Holtzblatt, K. (1998). Contextual Design: Defining Customer-Centered Systems. San Francisco:
         Morgan Kaufmann.
Braun, V., & Clarke, V. (2006). Using thematic analysis in psychology. Qualitative Research in Psychology,
         3(2), 77-101.
Clark, R. E. (2009). How much and what type of guidance is optimal for learning from instruction? In S.
         Tobias & T. M. Duffy (Eds.), Constructivist instruction: Success or failure? (pp. 158-183). New
         York: Routledge, Taylor and Francis.
Coffrin, C., Corrin, L., de Barba, P., & Kennedy, G. (2014). Visualizing patterns of student engagement and
         performance in MOOCs. Paper presented at the Fourth International Conference on Learning
         Analytics and Knowledge.
Cukurova, M., & Bennett, J. (2014). An investigation of the effects of a novel teaching approach on students’
         learning of chemical ideas. Paper presented at the ESERA 2013 Conference: Science Education
         Research For Evidence-based Teaching and Coherence in Learning, Nicosia, Cyprus.
Dillenbourg, P., Zufferey, G., Alavi, H., Jermann, P., Do-Lenh, S., & Bonnard, Q. (2011). Classroom
         orchestration: The third circle of usability. Paper presented at the International Conference on
         Computer Supported Collaborative Learning 2011 (CSCL 2011), Hong Kong.
Dragon, T., Mavrikis, M., McLaren, B. M., Harrer, A., Kynigos, C., Wegerif, R., & Yang, Y. (2013).
         Metafora: A web-based platform for learning to learn together in science and mathematics.
         IEEE Transactions on Learning Technologies, 6(3), 197-207.
Govaerts, S., Verbert, K., Klerkx, J., & Duval, E. (2010). Visualizing activities for self-reflection and
         awareness. In Proceedings of the International Conference on Web-based Learning (ICWL) (pp.
         91-100).
Gutierrez-Santos, S., Geraniou, E., Pearce-Lazard, D., & Poulovassilis, A. (2012). Design of teacher
         assistance tools in an exploratory learning environment for algebraic generalization. IEEE
         Transactions on Learning Technologies, 5(4), 366-376.
Hammersley, M., & Atkinson, P. (1995). Ethnography: Principles in practice. London: Routledge.
Lafford, B. A. (2004). Review of Tell Me More Spanish. Language Learning & Technology, 8(3), 21-34.
Martinez-Maldonado, R., Dimitriadis, Y., Martinez-Monés, A., Kay, J., & Yacef, K. (2013). Capturing
         and analyzing verbal and physical collaborative learning interactions at an enriched interactive
         tabletop. International Journal of Computer-Supported Collaborative Learning, 8(4), 455-485.
Martinez-Maldonado, R., Pardo, A., Mirriahi, N., Yacef, K., Kay, J., & Clayphan, A. (2015). The LATUX
         workflow: Designing and deploying awareness tools in technology-enabled learning settings. Paper
         presented at the Fifth International Conference on Learning Analytics and Knowledge.
Mavrikis, M., Gutierrez-Santos, S., Geraniou, E., Noss, R., & Poulovassilis, A. (2013). Iterative context
         engineering to inform the design of intelligent exploratory learning environments for the
         classroom. In R. Luckin, S. Puntambekar, P. Goodyear, B. L. Grabowski, J. Underwood, & N.
         Winters (Eds.), Handbook of Design in Educational Technology (pp. 80-92). London: Routledge.

Mercier, E., Vourloumi, G., & Higgins, S. (2015). Student interactions and the development of ideas in
         multi-touch and paper-based collaborative mathematical problem solving. British Journal of
         Educational Technology. doi:10.1111/bjet.12351
Millar, R. (2004). The role of practical work in the teaching and learning of science. Washington, DC:
         National Academy of Sciences.
Race, P. (2001). A briefing on self, peer and group assessment. York: Higher Education Academy.
Roman, F., Mastrogiacomo, S., Mlotkowski, D., Kaplan, F., & Dillenbourg, P. (2012). Can a table regulate
         participation in top level managers' meetings? Paper presented at the 17th International Conference
         on Supporting Group Work, New York.
Van Leeuwen, A., Janssen, J., Erkens, G., & Brekelmans, M. (2014). Supporting teachers in guiding
         collaborating students: Effects of learning analytics in CSCL. Computers & Education, 79, 28-
         39.
Verbert, K., Govaerts, S., Duval, E., Santos, J. L., van Assche, F., Parra, G., & Klerkx, J. (2014).
         Learning dashboards: An overview and future research opportunities. Personal and Ubiquitous
         Computing, 18(6), 1499-1514.
Yu, Z., & Nakamura, Y. (2010). Smart Meeting Systems: A Survey of State-of-the-Art and Open Issues.
         ACM Computing Surveys, 42(2), 1-20.
Zhang, D., Zhao, J. L., Zhou, L., & Nunamaker, J. F. (2004). Can e-learning replace classroom
         learning? Communications of the ACM, 47(5), 75-81.



