Action-oriented, Accountable, and inter(Active) Learning Analytics for Learners

Simon Knight, Theresa D. Anderson
Connected Intelligence Centre, University of Technology Sydney
Broadway, Ultimo, NSW 2007, AUS
Simon.Knight@uts.edu.au, Theresa.Anderson@uts.edu.au

ABSTRACT
This short paper describes our developing theorizing around the nature of learning analytics, and specifically 'learning analytics for learners'. We describe a value sensitive, participatory design process for the development of a learning platform and learning analytics. Preliminary design sessions with students illustrate the approach we have taken to developing analytics in one masters level course at the University of Technology Sydney. We highlight 'three As' in our approach. We argue that: (A1) learning analytics for learners should be action oriented, with a focus on process-based analytics that lead to actionable insight; (A2) accountable, supporting sensemaking around learning data across stakeholders; and (A3) (inter)active, involving students in understanding their own learning through analysis of processes (per A1), made visible and accessible to them (per A2), and in which they have a say. We thus argue that engaging students in participatory design of learning analytics and their platforms is a key potential of LAL.

Categories and Subject Descriptors
K.3.1 [Computers and Education]: Computer Uses in Education

General Terms
Measurement, Design, Human Factors, Theory

Keywords
Learning Analytics, Participatory Design, Algorithmic Accountability, Social Learning Analytics, Value Sensitive Design, Human Data Interaction, Collaborative Sensemaking, Learning Analytics for Learners

1 INTRODUCTION
Learning analytics is the "measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimising learning and the environments in which it occurs" [1]. However, there has been concern that learning analytic technologies focus on passive interventions for 'predictions' around 'underperforming' and 'at risk' students, rather than empowering students to create and use their own learning analytic tools [14]. Many learning technologies are pedagogically neutral, with little user-centered or participatory design involved in their conception or implementation [15]. In parallel, there is a desire to move 'beyond the LMS' in understanding student learning data [10], with calls for development of an open and modularised approach – making use of a variety of open source tools which might be linked in ad hoc ways across different, social, learning contexts [23], with openness entailing [23]:

1. Openness of process (algorithms and tech)
2. Modularized integration
3. Openness of data and platforms across stakeholders such that the needs and values of respective stakeholders are met – a key focus of our own work

In our work we have drawn an association between this desire for open learning analytics (OLA) and the value sensitive design (VSD) approach [4], in particular regarding the third point above. In VSD, there is a focus on the role of values and how they are undermined or promoted in the design of computer systems. For example, Friedman notes that the design decision not to include an 'off' switch on systems that monitor behavior (for whatever legitimate work or leisure reasons) removes a freedom from users to maintain their own privacy. Of high relevance to learning analytics, Friedman [4] also notes that user autonomy can be maintained in cases where some design decisions encode particular ways of working (for example, technical implementations of search functions) into a system, while maintaining user freedom over other elements (for example, the formatting of their texts); what is key is that "autonomy is protected when users are given control over the right things at the right time" [4 p.18].

A number of key foci emerge from the VSD approach [5] that are of relevance to learning analytics for learners; thus value sensitive design:

1. Is 'proactive' – it should run through a whole design process
2. Has a broad focus, including the role of technology in educational contexts
3. Encompasses a broad set of values (e.g. beyond 'cooperation' in computer supported cooperative work (CSCW) research)
4. Makes use of an integrated methodology involving analysis of conceptual, empirical, and technical concerns
5. Takes an interactional approach, in which it is understood that values emerge in the interaction between technologies and social systems, but are not determined by either in isolation
6. Holds that some moral values are independent of the particular group or individual (e.g. values relating to human welfare and justice)
7. Holds that some values are universally held, but vary in instantiation across cultures and contexts (e.g. how privacy is understood and implemented)

Copyright © 2016 for the individual papers by the papers' authors. Copying permitted only for private and academic purposes. This volume is published and copyrighted by its editors. LAL 2016 workshop at LAK '16, April 26, 2016, Edinburgh, Scotland.
For learning analytics, the implications are – at least – that we should:

1. Consider values in the design and implementation of learning analytics, throughout the process, considering how technologies can reify values, and their interpretive flexibility. In earlier work we have considered this concern in light of underlying theoretical positionings [12], with more recent explication [13] pointing to the potential of 'Claims Analysis' – analysis of the ways values are implemented in systems – to clarify and critique implicit models of user-tool interactions [see 17].
2. Consider the ways that analytic devices might capture, operationalize, and represent constructs of significance in the learning sciences across contexts and cultures, and the role of learners and educators in that.

With regard to the iterative process taken in VSD, three kinds of analysis are conducted: conceptual, empirical, and technical. In conceptual investigations the nature of values from different stakeholders, and the ways that technologies support or diminish them, are analysed. Conceptual investigations are thus analyses of the key constructs of interest in the design process, and their weight and balance. Empirical investigations, then, investigate specific social contexts for the designed technology, and – iteratively – the impact of the technology on those contexts. Third, technical investigations provide analysis of the suitability of particular technological designs for the values and context targeted.

This approach to design meets some of the ethical concerns raised around learning analytics [20, 25], with calls for students as collaborators at varying intervals through a design and analytic process in a student-centric approach to learning analytics. As such, VSD may be targeted at maintaining student autonomy, and ensuring students are included in analytic devices (including through provision of educational resources regarding those analytic devices). Through involvement in the design of analytics, stakeholder needs (and acceptable constraints) can be conceptualized and operationalized into technologies in ways that support, rather than diminish, their values.

2 THE MDSI CONTEXT
At the University of Technology Sydney, the authors' centre (the Connected Intelligence Centre) runs a new transdisciplinary Masters in Data Science and Innovation (MDSI). In support of that course, the authors and other UTS colleagues have begun a participatory design process with a subset of students from the MDSI (for which UTS ethics approval has been granted).

Using a participatory design and action research [3] process, UTS staff and MDSI students are co-developing a space for creatively exploring transdisciplinary and professional connections across their course, supported by a 'community steward'. The intention is that this open environment will enable students to actively participate with professionals and shape an online community to supplement their more traditional online offerings (e.g. Blackboard). Learning analytics will provide students with data about their learning to interrogate and respond to for formative purposes, and academics with data about the value of this model of engagement for postgraduates. The research thus aims to engage a participatory co-design methodology through which researchers and students develop deeper understandings of learning 'beyond the LMS'.

Framed as participatory action research, the project pursues the iterative design of an online environment for learning. Interested students will volunteer to attend challenge days to develop the online environment, with all students invited to use and feed back on the developed technologies on an ongoing basis, informally and using established course feedback mechanisms. We aim to establish a co-design method for development of an online learning environment (the process), with a learning environment as an end product, for use by students in their own learning. Design artefacts will be collated through the iterative process, with participants and academic co-designers encouraged to reflect on the process and the needs a developing online environment might meet.

In both the initial specification and ongoing implementation process, a community steward will support the iterative design. The steward will act in line with Wenger et al.'s [8] description of technology stewards:

    Stewarding technology involves knowing a lot but it also involves a lot of intuition, guesswork, and the patience to tolerate uncertainty and not knowing. Tech stewards face fundamental questions that can't be answered in advance or from a distance. This uncertainty requires insight and inventiveness on the part of tech stewards and the community, whether through making do with what's available, inventing technical workarounds, or forging ahead with new design efforts… Determining what communities will tolerate or demand, including their needs, interests and motivations, makes stewarding interesting work. This kind of work cannot be reduced to one formula [8 p.146]

The steward's role is one of advocacy and responsiveness, supporting student activity within the community (and its technologies) to foster a participatory value sensitive design process. They will thus use their knowledge and intuition in the fluid design of the online space, supporting effective community use of the space, developing workarounds, and co-developing new designs. Critically, this role requires understandings about the human and the technological contexts of the learning space we are developing for the MDSI program. The steward will work to understand the community's needs and values through interaction online, supporting platform and learning analytic design and community concerns for UTS staff, professional partners, and students.

At the time of writing, the first 'design day' has been conducted, with 8 physically co-located students and 7 contributions from online survey responses. An ideation process was used which asked participants to consider the following questions individually and in groups:

1. Why do you participate in online environments?
2. Thinking about specific activities involving tools or online spaces, do you have any examples of great practice?
3. In your MDSI experience thus far, what obstacles have you encountered in online learning? What has prevented you from participating as you would have liked?
4. If you could design anything to support your learning, what would it be?

These questions were designed to elicit responses that: considered the range of tools and platforms available; would focus on examples of 'getting it right', of design spaces that have successfully met challenges; and that this consideration would be targeted at specific challenges (3) to be met through actionable design changes (4). A final question was asked: "Now to sum this up – how do we reconcile all this? All the support is now in place – what has to happen next?" This last question was designed to encourage the students to 'get concrete', and particularly to consider actions we could take to support them in their design process.

Through the process of the design day, a number of themes (expressed as questions) have been identified; these are now being discussed further in an ongoing online design process:

• How do we ensure our site is responsive, and well designed?
• How do we tackle the need for a sustainable, searchable, tagged knowledge base & Q&A space?
• How do we integrate external tools and platforms effectively?
• How do we guide learners through resources? (E.g. sticky posts and collaborative filtering)
• How do we manage identities internally and externally, integrating existing profiles, and crediting engagement & participation (reputation management)?
• How do we create a constructive feedback and discussion area (possibly with multimedia tools)?
• How do we engage with industry through the site, and understand what they're looking for?
• How do we build a space for constructive community-based feedback and formative iteration, with possible 'employer-ready' output?
• How do we gamify and show participation to support learning and effective community?

In addressing these design questions we have taken an (inter)active participatory approach, focusing on action orientation, and understanding the various lenses and levels through which the design will be seen. For example, rather than imposing a perspective of gamification which foregrounds data only to lecturers, and focuses on content learning over interaction, we are engaging in a value sensitive approach to understanding what 'gamification' and 'participation' might mean in this learning context and community, and whether or not learners see it as adding value to their experience.

3 AAA APPROACH TO LAL
Through our work with students, and the VSD approach we have taken, we have begun to think of learning analytics for learners (LAL) in terms of three 'A's, in brief:

1. Action oriented, integrating (social) processes – LAL should focus on what we do, not just what we know, and how we change, not just where we are. We see learning as fundamentally interactional, and tool-mediated in nature; learning analytics brings new potential for process oriented feedback and support.
2. Accountable, Accessible, and Multi-layered – LAL should be accountable, and accessible, at various levels of the learning analytic system, from the micro (individual teachers and students) to the macro (institutions and collections of institutions). New challenges around collaborative sensemaking are foregrounded by learning analytics, but this multi-layered feature should be embraced and remain visible rather than shied away from.
3. (Inter)active, Participatory, and Engaging – LAL should involve learners in understanding their own learning, through analysis of processes (A1), made visible and accessible to them (A2), and in which they have a say (A3). Engaging students in participatory design of learning analytics and their platforms is a key potential of LAL.

The potential of such a shift is to bring students into active discussion about their own learning, and the diversity of experiences of that learning (as is explicit in VSD). For example, our approach might explore the means through which diversity of experience can be valued in the application of models of social learning analytics, which have a focus on learners as producers (for example, through blogs where learners are encouraged to share and discuss learning as it is unfolding and not just showcase outcomes) [2]. One aim, then, is for systems of learning process analytics to understand "what is going on in a learning scenario" [21 p.1632] rather than predictive models of future outcomes (or, to shift to process rather than 'checkpoint' analytics [16]).

While these process-level analyses afford new and important potential to support student learning – and their own understanding of that learning – they also introduce complexities. In earlier work (by the first author, [11]) it was noted that conveying learning information to multiple audiences – from students, teachers and parents, to vice-chancellors, heads of professional associations, and government ministers – is complex. This complexity is compounded by the various skill levels and needs of the audiences, with users who wish to gain different insights from any data (from personal learning improvement, to systems-level change), and who have differing skills to interpret and make use of that data towards their needs. There we noted that "LA may in part be about personalization of learning through analytics, but it is also about engaging learners and educators in a sensemaking process around the data" [11 p.3]. Understanding of learners' and educators' interpretations of learning, and of the value of the data, may be explored through analysis of this sensemaking process.

An emerging field of 'Human Data Interaction' (HDI) builds on work in human computer interaction (HCI) to explore the specific interactions of agents with data to "support end-users in the day-to-day management of their personal digital data..." with an understanding of data as of an "inherently social and relational character" [3 p.1]. Thus, "HDI is a distinctively socio-technical problematic, driven as much by a range of social concerns with the emerging personal data 'ecosystem' as it is by technological concerns, to develop digital technologies that support future practices of personal data interaction within it" [3 p.3].

HDI, then, highlights the tensions between 'our' data and 'my' data, and the corresponding issues of data ownership and control. These issues are of course key in learning analytics, where data is 'produced' by individuals through their learning processes, and analysed (and contextualized) through comparison with other groups and individuals within the specific learning activity, often through the use of institutionally owned technologies.

HDI, then, is concerned not only with how people use and create data, but with how they both visualise and understand the data, and how that data is made use of within social relational systems (by data creators and processors); the problems of connecting learning analytics across levels from the macro, meso, and micro can thus be seen in terms of HDI.
In learning analytics contexts one of the things we're interested in is how stakeholders – managers, educators, students, parents, etc. – interact with 'their' data at the various levels of granularity. Of course part of that is about how that data is represented and visualised, and the kinds of collaborative sensemaking processes that stakeholders engage in.

The challenges – flagged in [3 p.3] – of relevance to learning analytics, then, are:

• Personal data discovery, including meta-data publication, consumer analytics, discoverability policies, identity mechanisms, and app store models supporting discovery of data processors
• Personal data ownership and control, including group management of data sources, negotiation, delegation and transparency/awareness mechanisms, and rights management
• Personal data legibility, including visualisation of what processors would take from data sources and visualisations that help users make sense of data usage, and recipient design to support data editing and data presentation
• Personal data tracking, including real time articulation of data sharing processes (e.g., current status reports and aggregated outputs), and data tracking (e.g., subsequent consumer processing or data transfer)

[3 p.18] (emphasis added)

In the learning analytics context, the particularly interesting challenge is to make these concerns legible in such a way as to make it clear to learners not only what behaviour or change is expected/observed in them, but how their data has been collated and used, how their data-feedback is both an end-product and a fundamental component of the analytic process, and how changes to the data (for whatever reason) might relate to them and the fuller analytic set. Of course part of HDI must be how we facilitate data subjects to understand their data-relations; some of this will be difficult, and understanding the balance of clarity and accessibility alongside conceptual (and methodological) complexity is an important challenge. Some ideas are hard, and working with their coarseness is exactly what makes them productive.

Learning analytics for learners, then, must include accountability and accessibility considerations. Yet, while algorithms are key to learning analytics, their design and implementation are restricted to a small group of individuals, often excluding students and even academic educators. Thus, a concern has been raised regarding the pedagogic and ethical imperative for "algorithmic accountability" (Diakopoulos, 2014). This concern implies the need to ensure appropriately accountable and accessible (or, legible) HDI across the range of stakeholders. In considering a broader discourse around the nature of programming and code as 'actors' in education, Williamson [24, citing 6] notes the construct of calculated publics:

    as algorithms are increasingly being designed to anticipate users and make predictions about their future behaviours, users are now reshaping their practices to suit the algorithms they depend on. This constructs 'calculated publics,' the algorithmic presentation of a public that shapes its sense of itself. [24 p.30]

That is, learning analytics have the potential to impact on how learners and educators (and administrators) act, and interact (as HDI foregrounds). Consideration of these changes, and of the actors' understanding of them, is important to building learning analytics. Other communities have tackled such issues; for example, the end-user customization community has explored the ways in which end-users modify software applications through their embedded eco-systems, and the ways in which interfaces enable such adaptation (MacLean, Young, Bellotti, & Moran, 1991).

In other work reviewing 'collaborative visualisation' [8] the relationships between visualization and computer supported cooperative work (CSCW) are highlighted, with CSCW holding key potential in understanding:

• The relationships between users and their roles (for example, student, administrator) and how their tasks and needs (and, per VSD, their values) are defined
• The kinds of learning gain, insight, consensus, etc. gained in the process of collaborative visualization (as compared, say, to a focus on creation of fixed 'products')
• The processes of data interaction (or, as discussed above, HDI), and visualization development
• The insights groups gain through collaborative visualisations, and how this is understood in the context of group success, and the qualities of the visualisations themselves

In learning analytics, understanding these concerns offers an additional site for analytics in itself. Understanding the ways in which stakeholders at various levels make sense of, and draw value from, data affords opportunity to investigate that sensemaking as a learning process. The potential is to understand both how stakeholders extract meaning from data and action this, and how best to support these processes across and within stakeholder levels.

In our perspective, one means through which to engage in the process of developing effective means for collaborative sensemaking is through engaging in participatory design processes. By co-designing, learners are engaged in understanding the kinds of values technologies can instantiate, and their connection to the social and technological context of their learning. The potential outcome is for learners to be involved in open sensemaking around their own learning processes, as they are made visible and accessible to them in ways that they have been involved in designing.

While earlier research has analysed participatory processes in understanding the learning context [7], it has not, to our knowledge, involved development of the platforms and analytic approaches for that learning. In that earlier work [7], processes of peer interaction and public development of learning artefacts alongside 'badges' (credits given for particular kinds of in-course behaviours) were central. As the Open University's Innovating Pedagogy 2013 report highlighted, there is untapped potential in mobilising badges and learning analytics for the support of learning [22]. Their potential is in the recognition of learning across sites and diverse sets of knowledge and skills, in support of novel assessments [9]. Moreover, there is potential for peer-badging in participatory collaborative contexts [19], bringing together social learning, participatory learning design, and learning analytics.

In forthcoming work of this kind, McPherson et al. [18] use focus group analysis, asking participants "what data related to their learning they would like to have and why they would like to have it" [18 p.2], suggesting that through analysis of disciplinary differences, student data needs (in their specific contexts) can be assessed and met. Designing in partnership with learners what 'meaningful' participation is (and how it should be credited) helps with elusive measurement in blended learning, where 'activity' is often limited to what actions are visible to the tool and the teacher. We thus see great potential in the kind of participatory, value sensitive design process we describe here, which builds on open learning analytics, to take an ethical approach to human data interaction and collaborative sensemaking.

4 DISCUSSION
Participatory design approaches support human values by embedding a practice of transparency and openness into the design process. By foregrounding values and helping teachers as well as learners navigate the value-laden terrain of systems designed for learning, VSD adds another critical dimension to the design of learning analytics that are meaningful for learners. What is particularly significant about VSD is the focus on supporting enduring human values. Unlike many other design techniques that focus on the workplace or the classroom context, VSD enlarges the arena in which one considers ethical issues and the values that centre on human well-being, dignity, justice, welfare, and rights. It is not just about designing technology; it is about recognising the (often invisible) impact and implication of protocols and policies that surround and inform the use of any technology. Applying a VSD mindset helps us – as researchers and teachers – and our student co-designers articulate the human values we seek to account for in the 'design' of the MDSI learning experience, and in the process the LAL that will make that experience visible to all stakeholders. Thinking about our design intentions can inform not only the design of the blended learning environment we are aspiring to co-create, but also the institutional practices and protocols that will shape its use. It invites us to have conversations and discuss the relative overlaps and potential contradictions of our value systems in the design of learning analytics for learners.

5 REFERENCES
[1] 1st International Conference on Learning Analytics and Knowledge 2011 | Connecting the technical, pedagogical, and social dimensions of learning analytics: 2011. https://tekri.athabascau.ca/analytics/about. Accessed: 2013-05-21.
[2] Buckingham Shum, S. and Ferguson, R. 2012. Social Learning Analytics. Educational Technology & Society. 15, 3 (2012), 3–26.
[3] Crabtree, A. and Mortier, R. 2015. Human data interaction: Historical lessons from social studies and CSCW. ECSCW 2015: Proceedings of the 14th European Conference on Computer Supported Cooperative Work, 19-23 September 2015, Oslo, Norway (2015), 3–21.
[4] Friedman, B. 1996. Value-sensitive design. interactions. 3, 6 (1996), 16–23.
[5] Friedman, B. et al. 2002. Value sensitive design: Theory and methods. University of Washington technical report. (2002), 02–12.
[6] Gillespie, T. 2014. The relevance of algorithms. Media Technologies: Essays on communication, materiality and society. T. Gillespie et al., eds. MIT Press. 167–194.
[7] Hickey, D.T. et al. 2014. Small to Big Before Massive: Scaling Up Participatory Learning Analytics. Proceedings of the Fourth International Conference on Learning Analytics And Knowledge (New York, NY, USA, 2014), 93–97.
[8] Isenberg, P. et al. 2011. Collaborative visualization: Definition, challenges, and research agenda. Information Visualization. 10, 4 (Oct. 2011), 310–326.
[9] Jovanovic, J. and Devedzic, V. 2014. Open Badges: Novel Means to Motivate, Scaffold and Recognize Learning. Technology, Knowledge and Learning. 20, 1 (Aug. 2014), 115–122.
[10] Kitto, K. et al. 2015. Learning analytics beyond the LMS: the connected learning analytics toolkit. (2015), 11–15.
[11] Knight, S. et al. 2013. Collaborative Sensemaking in Learning Analytics. CSCW and Education Workshop (San Antonio, Texas, USA, 2013).
[12] Knight, S. et al. 2014. Epistemology, assessment, pedagogy: where learning meets analytics in the middle space. Journal of Learning Analytics. 1, 2 (2014).
[13] Knight, S. and Buckingham Shum, S. in submission. Theory and Learning Analytics.
[14] Kruse, A. and Pongsajapan, R. 2012. Student-centered learning analytics. CNDLS Thought Papers. (2012), 1–9.
[15] Laanpere, M. et al. 2012. Pedagogy-Driven Design of Digital Learning Ecosystems: The Case Study of Dippler. Advances in Web-Based Learning - ICWL 2012. E. Popescu et al., eds. Springer Berlin Heidelberg. 307–317.
[16] Lockyer, L. et al. 2013. Informing Pedagogical Action: Aligning Learning Analytics With Learning Design. American Behavioral Scientist. (Mar. 2013), 0002764213479367.
[17] McCrickard, D.S. 2012. Making Claims: The Claim as a Knowledge Design, Capture, and Sharing Tool in HCI. Morgan & Claypool Publishers.
[18] McPherson, J. et al. Forthcoming. Learning analytics and disciplinarity: Building a typology of disciplinary differences from student voices. Proceedings of the 6th International ACM Conference on Learning Analytics and Knowledge (Edinburgh, UK, Forthcoming).
[19] Pedro, L. et al. 2015. Peer-supported badge attribution in a collaborative learning platform: The SAPO Campus case. Computers in Human Behavior. 51, Part B (Oct. 2015), 562–567.
[20] Prinsloo, P. and Slade, S. 2015. Student privacy self-management: implications for learning analytics. Proceedings of the Fifth International Conference on Learning Analytics And Knowledge (Poughkeepsie, NY, USA, 2015), 83–92.
[21] Schneider, D. et al. 2012. Requirements for learning scenario and learning process analytics. (2012), 1632–1641.
[22] Sharples, M. et al. 2013. Innovating Pedagogy 2013. Technical Report #2. The Open University.
[23] Siemens, G. et al. 2011. Open Learning Analytics: an integrated & modularized platform. Society for Learning Analytics Research (SoLAR).
[24] Williamson, B. 2015. Digital education governance: data visualization, predictive analytics, and "real-time" policy instruments. Journal of Education Policy. 0, 0 (Apr. 2015), 1–19.
[25] Willis, J. et al. forthcoming. Ethical oversight of student data in learning analytics: a typology derived from a cross-continental, cross-institutional perspective. Educational Technology Research and Development. (forthcoming).