Seeing Learning Analytics Tools as Orchestration Technologies: Towards Supporting Learning Activities across Physical and Digital Spaces

Roberto Martinez-Maldonado
Connected Intelligence Centre, University of Technology Sydney, Chippendale, NSW, 2007
Roberto.Martinez-Maldonado@uts.edu.au

Abstract: This panel paper proposes considering the process that learners and educators commonly follow while interacting with learning analytics tools as part of an orchestration loop. This framing may be particularly valuable for understanding the key role that learning analytics can play in providing sustained support to learners and educators. The complexity of learning situations in which learning occurs across varied physical spaces and involves multiple educational tools requires a holistic and practical approach. The proposal is to build on principles of orchestration that can help link technical and theoretical aspects of learning analytics with practitioners. The panel paper provides: 1) a brief description of the relevance of the notions of orchestration and orchestrable technologies for learning analytics; and 2) an illustration of the orchestration loop as a process followed by learners or educators when they use learning analytics tools.

         Keywords: learning analytics, classroom orchestration, Cross-LAK, physical and digital spaces


Introduction
It has been emphasised that learning analytics research and practice take a holistic and human-centred perspective, primarily aimed at leveraging human judgement and understanding (Siemens & Baker, 2012). Learning analytics is also focused on empowering educators and learners, and thus requires keeping the human in the loop. Moreover, the learning analytics community is differentiated from other educational data science perspectives by its particular view of learning as a whole, complex activity. This includes understanding that students’ activity and their actual learning occur not only while they interact with a single learning tool (e.g. an intelligent tutoring system or the learning management system alone), but with a variety of tools, connected or unconnected with each other, and commonly distributed across different physical spaces (Pérez-Sanagustín et al., 2012).
          A promising yet underexplored perspective for understanding how students and educators may interact with learning analytics tools to gain a more holistic view of activity across physical and digital spaces is the metaphor of orchestration. Orchestration takes account of the variability and complexity of classrooms (and blended learning scenarios) by treating them as a question of “usability in which the classroom is the user” (Dillenbourg et al., 2011). It also recognises the key role of educators in adapting the available pedagogic and technological resources to help students achieve their intended learning goals (Dillenbourg et al., 2011). This perspective emphasises that technology should be practical, minimalist, controllable and flexible in order to facilitate rather than hinder learning activities (Dillenbourg, 2013). An evolved notion of this approach has been embraced by the Technology Enhanced Learning and Computer-Supported Collaborative Learning communities (Prieto et al., 2015), partly because it has shown potential to link research-based results with everyday educational practice.
          Only a small number of research outputs have discussed orchestration and learning analytics together (e.g. Martinez-Maldonado et al., 2016; Rodríguez Triana et al., 2014; Verbert et al., 2013). However, there is an implicit overlap between the two perspectives, particularly because learning analytics tools commonly support educators and learners by making aspects of their learning visible so that they can take some action as a consequence.
          This panel paper aims to generate discussion about the relevance of the notion of orchestration technology for learning analytics, and about the notion of the orchestration loop as a process followed by learners or educators when they use learning analytics tools.




Orchestration Technology in Learning Analytics
Prieto et al.’s (2015) orchestration framework identifies four main orchestration tasks that educators commonly have to perform: 1) design and planning; 2) regulation and management; 3) adaptation, flexibility and intervention; and 4) awareness and assessment. Orchestration technology may support one or more of these tasks, in whole or in part. This includes, for example, interfaces that help teachers manage the class workflow, enhance their awareness, track students’ progress, or re-design tasks after looking at the data generated in previous activities. Learning analytics tools are currently used mostly to support awareness and different sorts of assessment; they can thus be considered a special type of orchestration tool almost by definition. However, Martinez-Maldonado et al. (2016) demonstrated that learning analytics tools can also provide support in the other orchestration tasks (e.g. during learning design, to regulate class scripts, or to perform semi-automated interventions). Sharples (2013) also introduced the notion of shared orchestration, which suggests that these tasks are not limited to what educators have to do, but can be distributed among other stakeholders to different extents. For example, in self-regulated learning scenarios, the teacher may still play a role, but students have to orchestrate their own learning. This is particularly important for learning analytics for learners (Bull et al., 2016).
          By contrast, an orchestrable technology allows teachers to configure or adapt the technology for different purposes, before the class and/or while the class is being conducted (Tchounikine, 2013). This can help teachers target the technology at a range of pedagogical objectives rather than restricting the learning analytics tool to specific usages by educators (or students). Examples of this kind of tool include efforts to create configurable open dashboards that educators can customise to meet their particular needs (see Open Learning Analytics; Siemens et al., 2011). There is also a nascent interest in collecting data from multiple sources and trying to make sense of learners’ heterogeneous data at a higher level. An example of this is the CLA Toolkit (Kitto et al., 2015), which provides an infrastructure to gather information about learners’ activity across multiple social media tools (e.g. Facebook, Twitter, YouTube). The challenge for an educator is not only how to coordinate a pedagogical approach that uses multiple tools, but also how to make sense of partially captured data, given that the learner’s activity may be spread across multiple platforms. One possible shape of such a pipeline is sketched below.
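
As a minimal sketch of the data side of this challenge, the following Python code normalises platform-specific events into a common actor-verb-object record (a shape that echoes activity-stream specifications such as xAPI) so that traces from several platforms can be analysed as one chronological stream. All field names, event formats and the normalise/merged_stream functions are illustrative assumptions, not the CLA Toolkit’s actual implementation.

# Hypothetical sketch: aligning learner activity from heterogeneous
# platforms into a common record. Field names and event formats are
# illustrative assumptions; timestamps are assumed to be ISO 8601 strings.
from typing import Iterable

def normalise(source: str, raw: dict) -> dict:
    """Map a platform-specific event onto a shared actor-verb-object form."""
    if source == "twitter":
        return {"actor": raw["user"], "verb": "posted",
                "object": raw["text"], "time": raw["created_at"]}
    if source == "youtube":
        return {"actor": raw["author"], "verb": "commented",
                "object": raw["comment"], "time": raw["published"]}
    raise ValueError(f"unknown source: {source}")

def merged_stream(events: Iterable[tuple[str, dict]]) -> list[dict]:
    """Combine events from all platforms into one chronological stream."""
    records = [normalise(src, raw) for src, raw in events]
    return sorted(records, key=lambda r: r["time"])

A pipeline of this kind only aligns the data; the sensemaking of the merged stream remains the work of the educator or the learner.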
          In short, an orchestration approach to learning analytics offers a dynamic perspective with the potential to address authentic issues, recognising that learning activities can occur in the classroom or in other spaces. Moreover, when multiple tools are used, the orchestration load increases as well (Prieto et al., 2012). For learning analytics, this may generate additional technical and pedagogic challenges in creating tools that can support educators or students in making sense of learning data coming from multiple heterogeneous learning systems.

Iterative Orchestration of Learning Analytics Tools
Verbert et al. (2013) proposed that visual learning analytics tools (such as dashboards) can be designed and developed following the orchestration idea of a “modest computing” approach (Dillenbourg et al., 2011). This approach tries to empower people with key tools and/or information so they can make their own decisions, rather than automating decisions on their behalf. With this perspective in mind, the user has a crucial role in the loop in which the educational technology and the learning analytics tool sit. As a starting point, we can understand the notion of iteration in the orchestration of learning activities through learning analytics tools from a personal informatics perspective. Verbert et al. (2013) describe this as the process users follow to: access data (i. awareness); ask questions and assess the relevance of the data (ii. reflection); answer those questions and gain new insights (iii. sensemaking); and finally to induce new meaning or insights (iv. impact).
          This four-stage iterative process occurs while users interact with a learning analytics tool in a given phase. From an orchestration perspective, it mirrors the orchestration loop: the teacher or the student monitors the classroom or learning situation (possibly aided by a learning analytics tool), compares its state to some intended state (assessment), and adapts the scenario accordingly (intervention). This loop highlights two key tasks in the orchestration function that can be aided by learning analytics tools: state awareness (which can be improved by tools that make visible aspects of the learning activity that may otherwise be hard to see) and workflow manipulation (which can be improved by enhancing the decision-making process of the teacher, or of students regulating their own learning). A schematic rendering of this loop is sketched below.
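
As a minimal sketch, the Python code below renders the monitor-assess-intervene loop just described; every name in it (LearningState, monitor, assess, the analytics feed and its latest() method) is a hypothetical illustration rather than part of any cited tool. In the spirit of modest computing, the code stops at flagging a mismatch: deciding how to intervene is deliberately left to the human in the loop.

# Illustrative sketch of the orchestration loop: monitor the learning
# situation, compare it to an intended state, and flag when an
# adaptation is warranted. All names here are hypothetical.
from dataclasses import dataclass

@dataclass
class LearningState:
    """A snapshot of a class or learner, as surfaced by an analytics tool."""
    progress: float       # proportion of the task completed (0.0 to 1.0)
    participation: float  # e.g. share of group members actively contributing

def monitor(analytics_feed) -> LearningState:
    """State awareness: the analytics tool makes the activity visible."""
    return LearningState(**analytics_feed.latest())  # hypothetical feed API

def assess(observed: LearningState, intended: LearningState) -> bool:
    """Assessment: compare the observed state to the intended state."""
    return (observed.progress >= intended.progress
            and observed.participation >= intended.participation)

def orchestration_step(analytics_feed, intended: LearningState) -> bool:
    """One pass of the loop; returns True if an intervention is warranted.

    How to intervene (e.g. reallocating time, prompting a quiet group
    member) is left to the teacher or the self-regulating learner.
    """
    observed = monitor(analytics_feed)     # i. awareness
    return not assess(observed, intended)  # ii-iii. reflection/sensemaking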




Implications
Learning analytics can have a key role in supporting both face-to-face and blended learning activities. Learning activity is physically and socially situated, and is thus strongly shaped by the tools, space and social dynamics in which it sits. However, non-online learning activities have been considerably neglected by learning analytics efforts.
          The orchestration metaphor may be relevant for generating learning analytics solutions in authentic learning settings. However, how can we start the conversation between two very different academic communities? Moreover, how can the different actors (e.g. educators, students, developers and designers of learning analytics tools, and researchers) communicate and gain a common understanding of the particular needs and mutual objectives each has?
          Orchestration may be particularly important for scenarios where learning occurs in different physical and digital spaces because of its holistic perspective on the different tasks that educators and/or learners need to perform that can shape learning (e.g. design, regulation, management, intervention, evaluation and maintaining awareness). How, then, can the metaphor of orchestration facilitate understanding of the complexity of learning activities that occur across multiple digital environments and physical locations? A better understanding of the commonalities and particularities of each field is much needed in order to connect the technical and theoretical aspects of learning analytics research with real-world practitioners.


References
Bull, Susan, Blandine Ginon, Judy Kay, and Michael Kickmeier-Rust. (2016). Learning Analytics for Learners.
         In Proceedings of the International Conference on Learning Analytics and Knowledge, (pp. to appear).
Dillenbourg, Pierre. (2013). Design for classroom orchestration. Computers & Education, 69, 485-492.
Dillenbourg, Pierre, Guillaume Zufferey, Hamed Alavi, Patrick Jermann, Son Do-Lenh, Quentin Bonnard,
         Sébastien Cuendet, and Frédéric Kaplan. (2011). Classroom orchestration: The third circle of usability.
         In Proceedings of the International Conference on Computer Supported Collaborative Learning, (pp.
         510-517). Hong Kong, 4-8 July 2011. New York: Springer.
Kitto, Kirsty, Sebastian Cross, Zak Waters, and Mandy Lupton. (2015). Learning analytics beyond the LMS: the
         connected learning analytics toolkit. In Proceedings of the Fifth International Conference on Learning
         Analytics and Knowledge, (pp. 11-15). Poughkeepsie, New York: ACM.
Martinez-Maldonado, Roberto, Bertrand Schneider, Sven Charleer, Simon Buckingham Shum, Joris Klerkx, and
         Erik Duval. (2016). Interactive Surfaces and Learning Analytics: Data, Orchestration Aspects,
         Pedagogical Uses and Challenges. In Proceedings of the International Conference on Learning
         Analytics and Knowledge, (pp. to appear).
Pérez-Sanagustín, Mar, Patricia Santos, Davinia Hernández-Leo, and Josep Blat. (2012). 4SPPIces: A case study
         of factors in a scripted collaborative-learning blended course across spatial locations. International
         Journal of Computer-Supported Collaborative Learning, 7(3), 443-465.
Prieto, Luis P, Yannis Dimitriadis, Juan I Asensio-Pérez, and Chee-Kit Looi. (2015). Orchestration in learning
         technology research: evaluation of a conceptual framework. Research in Learning Technology, 23.
Prieto, Luis Pablo, Juan Alberto Muñoz-Cristóbal, Juan Ignacio Asensio-Pérez, and Yannis Dimitriadis. (2012).
         Making Learning Designs Happen in Distributed Learning Environments with GLUE!-PS. In A.
         Ravenscroft, S. Lindstaedt, C. Kloos & D. Hernández-Leo (Eds.), 21st Century Learning for 21st
         Century Skills, (pp. 489-494). Saarbrücken, Germany: Springer Berlin Heidelberg.
Rodríguez Triana, María Jesús, Alejandra Martínez Monés, Juan I Asensio Pérez, and Yannis Dimitriadis.
         (2014). Scripting and monitoring meet each other: Aligning learning analytics and learning design to
         support teachers in orchestrating CSCL situations. British Journal of Educational Technology, 46(2),
         330–343.
Sharples, Mike. (2013). Shared orchestration within and beyond the classroom. Computers & Education, 69,
         504-506.
Siemens, George, and Ryan S. J. d. Baker. (2012). Learning analytics and educational data mining: towards
         communication and collaboration. In Proceedings of the International Conference on Learning
         Analytics and Knowledge, (pp. 252-254). Vancouver, Canada, April 29 - May 2. New York: ACM.
Siemens, George, Dragan Gasevic, Caroline Haythornthwaite, Shane Dawson, Simon Buckingham Shum, Rebecca
         Ferguson, Erik Duval, Katrien Verbert, and Ryan S. J. d. Baker. (2011). Open Learning Analytics: an
         integrated & modularized platform. Proposal to design, implement and evaluate an open platform to
         integrate heterogeneous learning analytics techniques.


Tchounikine, Pierre. (2013). Clarifying design for orchestration: orchestration and orchestrable technology,
         scripting and conducting. Computers & Education, 69, 500-503.
Verbert, Katrien, Erik Duval, Joris Klerkx, Sten Govaerts, and José Luis Santos. (2013). Learning Analytics
         Dashboard Applications. American Behavioral Scientist, 57(10), 1500-1509.



