Learning Analytics for Learners: Preface to Proceedings of First LAL Workshop at LAK'16

Susan BULL, University College London, UK
Blandine GINON, University of Birmingham, UK
Judy KAY, University of Sydney, Australia
Michael KICKMEIER-RUST, Technische Universität Graz, Austria
Matthew D. JOHNSON, University of Birmingham, UK

1. MOTIVATION
With the arrival of 'big data' in education, the potential was recognised for learning analytics to track students' learning, to reveal patterns in their learning, or to identify at-risk students, in addition to guiding reform and supporting educators in improving teaching and learning processes [1]. Learning analytics dashboards have been used at all levels, including the institutional, regional and national levels [2]. In classroom use, while learning visualisations are often based on counts of activity data or interaction patterns, there is increasing recognition that learning analytics relate to learning, and should therefore provide pedagogically useful information [3]. While increasing numbers of technology-enhanced learning applications are embracing the potential of learning analytics at the classroom level, these are often aimed at teachers. However, learners can also benefit from learning analytics data (e.g. [4][5]).

Learner models hold data about an individual's understanding or skills, inferred during an interaction, and are at the core of educational systems that personalise the learning interaction to suit the needs of the learner [6]. Open learner models externalise the learner model to the user, and have long been showing learners information about their own learning, often with the aim of encouraging metacognitive behaviours such as reflection, planning, self-assessment and self-directed learning [7]. Benefits of showing learning data to learners for such purposes are now also being investigated in learning analytics (e.g. [8][9]). Nevertheless, despite a few exceptions (e.g. [9][10][11][12]), there is limited reference to both open learner models and learning analytics in the same publications. One of the aims of the Learning Analytics for Learners workshop, therefore, was to raise awareness of the overlap, as well as the differences, in approaches to, and purposes of, visualising and/or using learning data in these two fields.

2. SUBMISSION AND REVIEWING
Submissions were sought on any aspect of learning analytics aimed at learners. Submissions were reviewed by three members of the Program Committee, and papers and reviews were also scrutinised by members of the organising team. The papers were then discussed by the organisers, with particular attention given to cases where there was any disagreement amongst the reviewers. Of the ten submissions received, eight were accepted for presentation at the workshop.

We thank the members of the Learning Analytics for Learners Program Committee for their substantial efforts in making the workshop a success. Program Committee members were:

- Simon Buckingham Shum, University of Technology, Sydney, Australia
- Susan Bull, University College London, UK
- Eva Durall, Aalto University, Finland
- Albrecht Fortenbacher, HTW Berlin, Germany
- Alyssa Friend Wise, Simon Fraser University, Canada
- Dragan Gasevic, University of Edinburgh, UK
- Blandine Ginon, University of Birmingham, UK
- Dai Griffiths, University of Bolton, UK
- Sharon Hsiao, Arizona State University, USA
- Stéphanie Jean-Daubias, University Claude Bernard of Lyon, France
- Matthew Johnson, University of Birmingham, UK
- Judy Kay, University of Sydney, Australia
- Michael Kickmeier-Rust, Technische Universität Graz, Austria
- Symeon Retalis, University of Piraeus, Greece
- Ravi Vatrapu, Copenhagen Business School, Denmark

The workshop quickly sold out at full capacity (40 participants), highlighting the timeliness of this topic in learning analytics.

3. WORKSHOP PAPERS
The main themes addressed in the workshop papers were visualisation/dashboards, metacognition/awareness, and social learning; several papers considered more than one of these themes. Hatala et al.'s paper examines students' approaches to learning in relation to learning analytics visualisations and the quality of messages posted. Al-Shanfari et al.'s paper proposes ways to visualise uncertainty in data in an open learner model context. Marzouk et al.'s paper investigates facilitating self-monitoring and the type of analytics that may meaningfully prompt changes to learning, including social learning. Venant et al.'s paper also considers metacognition, awareness and deep learning, as well as social awareness; and Davis et al.'s demonstration paper explores self-regulation and comparison to previously successful learners. Knight and Anderson take a theoretical perspective, arguing for participatory design of learning analytics for learners. Wasson et al.'s position paper argues for the need to address data literacy and to train learners in the new approaches and the learning analytics and/or open learner model tools available to them. Finally, Martinez-Maldonado et al.'s paper also explores both learning analytics and open learner models, in their case to support behavioural change in a health context.

We thank all the authors for their contributions, as well as the other workshop participants who contributed substantially to the discussions throughout the day.

REFERENCES
[1] Siemens, G. & Long, P. 2011. Penetrating the Fog: Analytics in Learning and Education. EDUCAUSE Review 46(5), 30-38.
[2] West, D. 2012. Big Data for Education: Data Mining, Data Analytics, and Web Dashboards. Governance Studies at Brookings, 1-10.
[3] Gašević, D., Dawson, S. & Siemens, G. 2015. Let's Not Forget: Learning Analytics are about Learning. TechTrends 59(1), 64-71.
[4] Ferguson, R. & Buckingham Shum, S. 2012. Social Learning Analytics: Five Approaches. LAK 2012, 23-33.
[5] Vozniuk, A., Govaerts, S. & Gillet, D. 2013. Towards Portable Learning Analytics Dashboards. ICALT 2013, IEEE, 412-416.
[6] Woolf, B.P. 2010. Student Modeling, in R. Nkambou, J. Bourdeau & R. Mizoguchi (eds), Advances in Intelligent Tutoring Systems, Springer-Verlag, Berlin Heidelberg, 267-279.
[7] Bull, S. & Kay, J. 2013. Open Learner Models as Drivers for Metacognitive Processes, in R. Azevedo & V. Aleven (eds), International Handbook of Metacognition and Learning Technologies, Springer, New York, 349-365.
[8] Dawson, S., Macfadyen, L., Risko, E., Foulsham, T. & Kingstone, A. 2012. Using Technology to Encourage Self-Directed Learning: The Collaborative Lecture Annotation System (CLAS). ASCILITE 2012.
[9] Durall, E. & Gros, B. 2014. Learning Analytics as a Metacognitive Tool. Proceedings of CSEDU 2014, 380-384.
[10] Bull, S. & Kay, J. 2016. SMILI: A Framework for Interfaces to Learning Data in Open Learner Models (OLMs), Learning Analytics and Related Fields. International Journal of Artificial Intelligence in Education 26(1), 293-331.
[11] Ferguson, R. 2012. Learning Analytics: Drivers, Developments and Challenges. International Journal of Technology Enhanced Learning 4(5/6), 304-317.
[12] Kalz, M. 2014. Lifelong Learning and its Support with New Technologies, in N.J. Smelser & P.B. Baltes (eds), International Encyclopedia of the Social and Behavioral Sciences, Pergamon, Oxford.