=Paper=
{{Paper
|id=Vol-1738/IWTA_2016_paper1
|storemode=property
|title=Towards Understanding the Potential of Teaching Analytics within Educational Communities
|pdfUrl=https://ceur-ws.org/Vol-1738/IWTA_2016_paper1.pdf
|volume=Vol-1738
|authors=Konstantinos Michos,Davinia Hernandez-Leo
|dblpUrl=https://dblp.org/rec/conf/ectel/MichosH16a
}}
==Towards Understanding the Potential of Teaching Analytics within Educational Communities==
Konstantinos Michos
ICT Department, Universitat Pompeu Fabra
kostas.michos@upf.edu

Davinia Hernández-Leo
Serra Hunter, ICT Department, Universitat Pompeu Fabra
davinia.hernandez@upf.edu

ABSTRACT

The use of learning analytics in ICT-rich learning environments assists teachers in (re)designing their learning scenarios. Teacher inquiry is a process of intentional and systematic research by teachers into their students' learning. When teachers work in small groups or communities and present the results of their practice, more interpretations are generated around the use and meaning of this data. In this workshop paper we present preliminary research on four dimensions of learning analytics (engagement, assessment, progression, satisfaction) and their visualization as teaching analytics, which are hypothesized to be relevant in helping teachers (re)design their learning scenarios. Moreover, we evaluate teachers' acceptance of exchanging these types of analytics within their teaching community. A workshop on blended MOOC design (N=20 participants) showed that although all the analytics dimensions were valuable, assessment data was the most useful dimension for (re)designing, while data about the engagement of students was the least useful. Educational practitioners also showed interest in knowing combinations of specific data (e.g. achievements related with the satisfaction of students). Last, most participants expressed their willingness to share visual learning analytics related to their designs with their colleagues. The role of contextual information in interpreting the learning analytics was recognized as important.

General Terms: Teaching analytics, Learning analytics, Communities of educators

Keywords: Teacher inquiry, professional learning communities, learning design

1. INTRODUCTION

There is a growing interest in the way teachers and learning designers prepare ICT-rich learning arrangements and how they use students' data for the accountability and the (re)design of their learning scenarios. Teaching analytics has been proposed as the design, development and evaluation of visual analytics methods and tools for teachers to understand learning and teaching processes [28]. Current research has focused on different directions. This includes real-time learning analytics collected during the learning process and presented to teachers so they can intervene "on the fly" and better orchestrate their teaching [29], and data gathering based on the affordances of specific learning analytics tools, presented to the teacher after the learning sessions [13].

Although those approaches provide valuable information to teachers, in this paper we argue that a communicative approach to teacher inquiry within groups or professional communities can generate additional insights into the way teachers can improve learning scenarios and benefit from teaching analytics methods. We present our preliminary work on four dimensions of learning analytics data with the aim of generating discussions between teachers on how they plan their inquiry and reflect on their teaching plans with other practitioners. To extract requirements for the support of teachers within groups or professional learning communities, we evaluate the perceived usefulness of learning analytics data for the improvement of learning designs. Moreover, we evaluate the acceptance of exchanging visualizations between educators. A case study took place within a workshop on blended learning scenarios that incorporate resources from Massive Open Online Courses (MOOCs) [1].

The remainder of the paper is organized as follows. In section 2 we describe teacher inquiry within professional learning communities, specifying the challenges addressed in the paper. In section 3 we explain our methodology and the four dimensions of students' data which can be aligned with a learning design. Section 4 describes the evaluation study we conducted to extract requirements from educational practitioners and the results of the study. The last section, 5, is devoted to conclusions and implications for future work.
2. TEACHER INQUIRY WITHIN PROFESSIONAL LEARNING COMMUNITIES

There is evidence that data use is helpful in improving educators' attitudes towards teaching practice and their students [3, 21]. This is empowered when educational teams learn about the inquiry process and are engaged in collaborative informed decisions. Changes in a teacher culture that has often been described as isolationist include the development of professional learning communities, which encourage sharing, reflection and the deprivatization of practice [12]. Research in professional learning communities acknowledges that active teacher participation and collaborative activities have an impact on teaching practice [7] and students' learning [9].

Teacher groups or wider communities can be formed within the same or different educational institutions with the aim of improving educational practices [8]. Currently, a vast amount of networked technologies and investigation tools [23] provide many opportunities for knowledge sharing and reflection over teaching practice. The term teacher inquiry has been defined as "a systematic, intentional research by teachers" [10] which aims at improving instruction on four levels [14]:

1. By defining important instructional problems specific to the local context of the participating teachers
2. By planning and implementing instructional solutions - connecting theory to action
3. By using evidence to drive reflection and analysis
4. By working towards detectable improvements and specific cause-effect findings about teaching and learning

As such, teacher inquiry consists of a cyclical approach which is connected with teachers' planning and investigation and promotes changes in the way teachers design and rethink their students' learning. Moreover, the practice of collecting data about teaching and learning has recently emerged. As Roschelle & Krumm [24] describe, evidence which could inform instructional improvement was previously infrequent and separated in time, because it required an extensive time period and additional teams of people who could carry out, for instance, classroom observation and paperwork. However, with the integration of ICT in teaching and learning, data can be collected from both teachers and students more frequently and integrated into everyday activities. The research field of learning analytics, defined as "the measurement, collection, analysis, and reporting of data about learners and their contexts, for purposes of understanding and optimising learning and the environments in which it occurs" [15], facilitates the practical application of extracting useful information from a learning environment.

However, despite the positive factors of investigating teaching and learning to improve future students' experiences, we identify specific challenges within a wider framework of professional communities for educators. There are currently few works on how to support collaborative teacher inquiry [27] within communities, on which students' information is relevant to extract in order to improve teaching and inform other colleagues, and on which extra factors influence a community of educators. For instance, the concept of equivocality [17] deals with the possible multiple meanings and interpretations of the same data based on different contexts. Moreover, educators often face the problem of information overload from the data deluge, and the solution may be not to gather more data but rather to better highlight the reasons for collecting the data, understand the context from which it comes, and locate better frames of reference [31]. It is also useful to differentiate between individual and collective sensemaking of data. The reason is that this process is considerably influenced by the context of the situation in which it takes place, as well as the wider organization in which the individual participates. Prior knowledge of the sense maker and routines of action between individuals may also influence the way they interpret information. Thus, having more labels, explanations and related experiences provides the ability to see and connect different data together and to develop different narratives on what the data mean. However, developing a richer schema requires learning from others and the externalization of knowledge between educational practitioners.
3. LEARNING ANALYTICS DIMENSIONS FOR A COMMUNITY OF INQUIRY

Teaching analytics was previously proposed as the support of diagnostic decision-making by teachers with the use of learning analytics [28], and as the understanding of online teachers' interactions when they search for and create educational resources [32]. In both cases, educational practitioners are considered in small working groups with divergent backgrounds or in larger communities which aim to reach common ground or learn from each other. However, little research addresses how communities of teachers could be supported for better collective performance, and which analytics from students' activity are most useful to consider when reflecting about improvements to their practice. Schnellert et al. [26] examined how teachers are engaged in collaborative cycles of inquiry within authentic communities of practice. Teachers were co-constructing and analyzing situated assessment based on formative assessment data. Avramides et al. [4] describe and evaluate a collaborative approach to teacher inquiry into student learning, and they emphasize the need to define what data to collect and what they tell us about the learning process.

3.1 Methodology: first LATUX steps

In this paper, our aim is to understand how to support teachers' reflection on their teaching plan with the use of teaching analytics displayed within communities. Our research context leads us to follow a Design-Based Research [6] approach, as it provides flexibility and proposes the analysis of requirements through collaboration between educators and researchers in real-life settings in order to improve educational practices. More specifically, because we focus on visual analytics, after analyzing different frameworks for the design of visualizations we decided to follow the iterative workflow LATUX (Learning Awareness Tools User eXperience) [20] for designing, validating and deploying learning analytics visualizations. LATUX proposes a workflow for projects aiming to develop awareness tools for instructors regarding the learning activities of students. The authors explain four steps, which include problem identification, low-fidelity prototyping, higher-fidelity prototyping and pilot studies. In the first steps of problem identification and low-fidelity prototyping, the designers extract requirements, investigate stakeholders' needs, identify data sources aligned with intended pedagogies and develop possible visualizations. Our aim is to cover these first steps of problem identification and low-fidelity prototyping. For this reason, we define our problem as supporting teacher inquiry within communities with visual learning analytics. We propose learning analytics data and visualizations which can drive reflections, and we investigate stakeholders' needs.
3.2 Description of the problem and low-fidelity prototyping

Examples of learning tools which can be integrated in face-to-face and online teaching sessions include Learning Management Systems (e.g., Moodle, Blackboard, Sakai), discussion forums for social learning, or wikis and Google Docs for deploying students' writing activities. Those kinds of tools store information about student-to-student interaction and student-content interaction. However, the information provided by those tools with learning analytics visualizations often does not align with the pedagogical intentions expressed by teachers in a learning design and is not consistent with their aims of investigating their students [13, 22, 25]. Moreover, the possible reasons for teachers' inquiry into student learning [22] and the sense-making of information about students may vary according to the specific educational context. In this paper, we focus on four learning analytics types which are relevant to the monitoring of students' engagement, the assessment of students' work, their progression through the timeline of a learning design, and the understanding of their overall satisfaction with the learning activities. Our aim is to connect common objectives of learning designs which promote active learning, such as cognitive, behavioral, social and affective goals, with the aims of learning analytics tools, which have been stressed as assistance for educators to identify cognitive, social and behavioral aspects of students' activities [2]. Moreover, we aim to address teacher information needs which can be extracted from three sources: the learning process, the learning outcomes and the teaching practice [13]. These learning analytics dimensions may be classified at different levels of granularity, from higher-order values to concrete metrics, according to specific tools' affordances and indicators of student learning. We propose four higher-level categories which may be able to help teachers plan the inquiry process and evaluate a learning design within communities. In each category we present examples of low-fidelity prototype visualizations and explain the connection with the learning design as teaching representations.

3.2.1 Engagement

Engagement of students with the learning content and their peers constitutes a prerequisite for their learning. Lockyer et al. [18] explain two types of engagement data which can inform the (re)design of learning scenarios. First, checkpoint analytics, which are relevant to the engagement of students with the course resources and can show how students prepare to learn. Examples can be metrics for the submission of learning assignments, online access to resources and downloads of course content. Second, process analytics, like participation in activities per group and interaction analysis, can show how students are engaged in specific tasks (see figure 1).

Figure 1. Examples of checkpoint and process analytics visualizations

For instance, regarding checkpoints, the left graph shows the percentage of students who submitted a learning assignment at different levels of completeness. Regarding the process, the right graph shows the level of participation in the assignment from different groups of students. A teacher may estimate whether students fulfilled the requirements to proceed to an upcoming activity.
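To make the two types of engagement data concrete, the following minimal sketch derives a checkpoint metric (submission rate) and a process metric (actions per group) from a hypothetical LMS event log. The column names, event labels and class size are illustrative assumptions, not data or tooling from our study.

<pre>
# Minimal sketch (Python/pandas), assuming a hypothetical LMS event
# log with one row per student action; all names are illustrative.
import pandas as pd

log = pd.DataFrame([
    {"student": "s1", "group": "A", "event": "assignment_submitted"},
    {"student": "s2", "group": "A", "event": "resource_viewed"},
    {"student": "s3", "group": "B", "event": "assignment_submitted"},
    {"student": "s4", "group": "B", "event": "forum_post"},
])
enrolled = 4  # assumed class size

# Checkpoint analytics: who reached the milestone of submitting?
submitted = log.loc[log.event == "assignment_submitted", "student"].nunique()
print(f"Submission rate: {submitted / enrolled:.0%}")

# Process analytics: how actively does each group participate?
print(log.groupby("group")["event"].count().rename("actions"))
</pre>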
3.2.2 Achievements and assessment

The achievement of students may be assessed through the evaluation of students' products and artifacts. Thus, access to e-portfolios can generate valuable insights on how to (re)design future learning activities [19]. However, since this requires time, qualitative information about students' work through the use of rubrics may be able to inform educators about how to improve their design. Moreover, automatic analysis of tests can also show where the students struggle and the cognitive impacts of the learning design [16].

Figure 2. Sample visualization of assessment rubrics per group based on different criteria

Figure 2 shows a visualization of assessment rubrics based on different evaluation criteria, which can be contrasted with the goals of a writing assignment. Values correspond to the grades given by the teacher and show comparisons between different groups of students.
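As a hedged illustration of how rubric data of this kind could be aggregated for such a visualization, the sketch below pivots teacher-given grades into one row per group and one column per criterion; the criteria names and grades are invented for the example.

<pre>
# Minimal sketch, assuming invented rubric criteria and grades; the
# pivoted table is the data behind a grouped chart like Figure 2.
import pandas as pd

grades = pd.DataFrame([
    ("A", "structure", 4), ("A", "argumentation", 3), ("A", "style", 5),
    ("B", "structure", 3), ("B", "argumentation", 4), ("B", "style", 3),
], columns=["group", "criterion", "grade"])

# One row per group, one column per rubric criterion.
print(grades.pivot(index="group", columns="criterion", values="grade"))
</pre>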
3.2.3 Progression through time

Learning progressions can help guide teachers in designing their objectives and choices in the classroom [11]. Bakharia et al. [5] describe a framework for the alignment of learning analytics with learning design, in which one dimension deals with temporal analytics relevant to course, content and tool access during the timeline of the course. Tracking the progression of students through time may help teachers better orient their decisions based on temporal planning (see figure 3).

Figure 3. Example of progression through the time of a learning design

Figure 3 shows the progression of a whole class regarding access to resources and participation in a forum during the timetable of a learning design. Low participation in specific weeks may orient the design of future activities.
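A temporal view like the one in Figure 3 can be approximated by bucketing logged events per week, as in the following sketch; the timestamps and event labels are again assumptions made for illustration.

<pre>
# Minimal sketch: weekly counts of resource access and forum posts
# over the timeline of a design; synthetic timestamps and labels.
import pandas as pd

events = pd.DataFrame({
    "timestamp": pd.to_datetime(
        ["2016-02-01", "2016-02-02", "2016-02-09", "2016-02-23"]),
    "event": ["resource_viewed", "forum_post",
              "resource_viewed", "forum_post"],
})

weekly = (events.set_index("timestamp")
                .groupby("event")
                .resample("W")["event"].count()
                .unstack(level=0, fill_value=0))
print(weekly)  # low-activity weeks can flag where to redesign
</pre>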
3.2.4 Satisfaction rates

Student interest and satisfaction is referred to as another factor for evaluating the effectiveness of learning environments [33]. The term student satisfaction can refer to whether students liked to participate in the learning environment, whether it was enjoyable to work in groups, and their overall experience in each learning activity (see figure 4).

Figure 4. Satisfaction of students with different elements of the learning design

Figure 4 shows the percentage of students' satisfaction regarding different elements of a blended learning scenario. Each element can be estimated in the design of an upcoming learning scenario.
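A summary like the one in Figure 4 can be computed as the share of positive responses per design element, sketched below under the assumption of 1-5 Likert answers and invented element labels.

<pre>
# Minimal sketch: percentage of satisfied students (rating >= 4)
# per element of a blended design; labels and ratings are invented.
import pandas as pd

responses = pd.DataFrame([
    ("video lectures", 5), ("video lectures", 4), ("video lectures", 2),
    ("group wiki", 3), ("group wiki", 5), ("group wiki", 4),
], columns=["element", "rating"])

satisfied = (responses.assign(positive=responses.rating >= 4)
                      .groupby("element")["positive"].mean())
print(satisfied.map("{:.0%}".format))
</pre>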
The alignment of those learning analytics dimensions with a learning design may require the teacher to be involved in the inquiry process. This presumes planning in advance how to collect this data, which learning objectives to evaluate, and on which instance of the design to focus. Although different types of data may need to be collected during the learning activities, teachers are often overloaded with multiple tasks and thus need to focus on a specific dimension in each case. These multiple types of learning analytics collected during the learning process may be able to evaluate a learning design and serve as support for intentionally collecting data when designing for students' learning.

3.3 Research focus

In this paper we provide low-fidelity prototypical examples of analytics for teachers, but our aim is not to evaluate the design of the visualizations; rather, it is to understand which of those learning analytics dimensions are relevant for educational practitioners. More specifically, we explore which information is useful in a community of educators to drive the improvement and customization of their learning designs. To address these issues, educators' usage beliefs (usefulness) about learning analytics dimensions for the (re)design of learning scenarios may provide insights into the adoption of this approach in teachers' practice. Moreover, by evaluating those dimensions together rather than separately, we explore relations between the usage beliefs about different learning analytics data and between their contexts. Last, since our framework is situated within teachers' groups or communities, we evaluate the acceptance of exchanging teaching analytics and additional useful information with other colleagues.

The research question explored in this paper is:

RQ: Which learning analytics are useful to (re)design or to re-use a learning design?

This research question is investigated through the following more specific questions:

RQ1: Are the above learning analytics dimensions (engagement, assessment, progress, satisfaction) or other information perceived as valuable by educational practitioners?

RQ2: Is there any relation between the four dimensions, and between the dimensions and the contexts of the students?

RQ3: At a collective level, are educators willing to share learning analytics visualizations or to look at the results of their colleagues?

4. EVALUATION

A case study was used to evaluate how educational practitioners perceive the use of learning analytics for the improvement and reuse of learning designs. The setting was a teacher-training workshop about designing blended MOOCs, held in conjunction with a MOOC platform conference. 24 participants took part in the workshop, including 8 professors, 12 university assistants devoted to the design of courses, and 4 educational researchers. The use of technology in blended learning approaches allows the collection of data about students, representing a feasible case where teachers can have access to learning analytics data. The aim of the workshop was to introduce a framework for the design of blended MOOCs to a group of educational practitioners and to evaluate which different levels of analytics or additional information from colleagues can drive decisions for learning design improvement.

Regarding the profile and interests of the participants, 60% of them were conceptualizing an idea for a blended MOOC course to be implemented in the future, while 35% were preparing or running a blended MOOC course at the time of the workshop, and only 5% did not intend to implement a MOOC course. Their interest in participating in the workshop was primarily to learn how to blend MOOC resources into face-to-face classrooms and apply this in their practice.

To facilitate the workshop, participants were provided with different example cases of blended MOOC designs (e.g. a flipped classroom case) which were analogous to their own ideas about course design. Each case was enriched with low-fidelity prototypes of learning analytics data in each of the above categories (engagement, assessment, progress, satisfaction). The examples included the figures shown in the previous section and, among others, histograms, bar charts and line graphs of temporal analysis of students' access to course resources, satisfaction rates for face-to-face and online activities, students' pass rates, and group participation in wiki assignments. Both the example cases and the visual analytics were provided as paper material.

To generate discussions within the workshop's groups, after an initial introduction to the topic the participants were asked to look at the example cases and the learning analytics dimensions and to think about which information would help them to re-design or reuse these cases. Moreover, they were asked to discuss which information they would be willing to share within their educational community after the implementation of their course.

Figure 5. Working groups discussing the use of learning analytics in different cases

Figure 5 shows the low-fidelity prototypes of the four learning analytics dimensions as paper material. Groups of participants were provided with example cases of blended MOOC designs and the four learning analytics dimensions.

For the evaluation of this approach, we used two data sources: a questionnaire and observations carried out by one individual researcher. We constructed a questionnaire based on the Learning Analytics Acceptance Model described in [2] for the perceived usefulness of learning analytics dimensions to improve learning designs. The questionnaire included four questions, one for each of the learning analytics dimensions, and one question regarding the usefulness of knowing about the context and students' profile. Additional open questions aimed to extract which additional information could be useful, from the perspective of the participants, to support reflection for the improvement of a learning design. Finally, to evaluate the acceptance of collective practices when teachers present the results of their inquiry in the form of visualizations, two additional questions referred to the acceptance of sharing learning analytics visualizations with other colleagues and the acceptance of having access to the results of other educators.
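As a hedged sketch of the kind of analysis reported below (item means, Cronbach's alpha and pairwise Pearson correlations over 1-5 Likert items), the following code runs on synthetic responses, not on the workshop data.

<pre>
# Minimal sketch of the questionnaire analysis on synthetic data:
# descriptives, Cronbach's alpha and one Pearson correlation.
import numpy as np
import pandas as pd
from scipy import stats

rng = np.random.default_rng(0)
items = pd.DataFrame(
    rng.integers(1, 6, size=(20, 5)),  # 20 respondents, 1-5 Likert
    columns=["engagement", "assessment", "progression",
             "satisfaction", "context"])

print(items.agg(["mean", "std"]).round(2))

# Cronbach's alpha over the k items.
k = items.shape[1]
alpha = k / (k - 1) * (1 - items.var(ddof=1).sum()
                       / items.sum(axis=1).var(ddof=1))
print(f"Cronbach's alpha = {alpha:.2f}")

# One pairwise correlation with its p-value, as in Table 1.
r, p = stats.pearsonr(items["engagement"], items["progression"])
print(f"engagement~progression: r = {r:.3f}, p = {p:.3f}")
</pre>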
Descriptive statistics and correlation analysis between the constructs were used to explore the results of the questionnaire. A total of N=20 participants responded to the questionnaire, with an acceptable reliability of α = .76. The results regarding the perceived usefulness of the learning analytics dimensions (RQ1) showed that these categories receive high value from the participants, with means ranging between 3.6 and 3.95 on a 1-5 Likert scale (see Table 1). An interesting result was the fact that the assessment category had the highest mean (M = 3.95) whereas the engagement of students had the lowest mean (M = 3.6), while progression and satisfaction were at similar levels. One interpretation could be that participants perceived high value in past students' achievements when designing a blended MOOC, whereas engagement with course material and online interactions is a secondary priority.

The question concerning the perceived usefulness of knowing the context (RQ2) of the course (e.g. the profile of the students, level of education, and the domain of knowledge) for the understanding and analysis of learning analytics visualizations received high value, with a mean M = 4.4 (SD = .68) on a 1-5 Likert scale. This may show the high relevance of providing information about the students and the overall context of a learning design in order to interpret visualizations given by others.

Correlation analysis between those dimensions (see Table 1) showed that the perceived usefulness of engagement analytics was correlated with progression, and assessment with satisfaction. Moreover, interest in knowing the educational context was correlated with interest in engagement and assessment. The relation between the value of engagement and progression awareness may show how participants anticipate and combine the efforts of the students with their progress. The relation between assessment and satisfaction can be interpreted from the perspective that students' achievements are perceived as consistent with their overall satisfaction. Finally, since we found a correlation only between the usefulness of context information and assessment and engagement analytics, we can interpret that those types of data are especially relevant within the context in which they are collected.

Table 1. Descriptive statistics and correlation matrix: usefulness of each learning analytics dimension and the context

<pre>
                  Mean (SD)    1       2       3      4
 1. Engagement    3.6 (1.04)
 2. Assessment    3.95 (.82)  .402
 3. Progression   3.8 (.89)   .585**  .271
 4. Satisfaction  3.85 (.81)  .297    .616**  .391
 5. Context       4.44 (.68)  .532*   .506*  -.035   .304

 n = 20; *p < 0.05, **p < 0.01
</pre>

Table 1 provides descriptive statistics regarding the usefulness of each learning analytics dimension (1-4) and the usefulness of knowing the context and students' profile in the example cases (5). Moreover, columns 1-4 show the correlations between the five items of the questionnaire.

The qualitative responses of the participants regarding additional information which could help them to redesign their course or re-use an implemented design showed the importance of having descriptive qualitative information about face-to-face sessions, such as teacher reports and observations about the levels of students' interactions. Some other interesting responses included the idea that online connection time does not necessarily indicate useful work, but that the actual time spent in each activity is useful for redesigning a course (see figure 6). In general, learning designers may often need a combination of data regarding face-to-face and online interactions and qualitative feedback from their colleagues.

Figure 6. Word cloud of participants' responses

Figure 6 shows key words of participants' responses regarding information that would help them to re-use or re-design their course or another's implemented design. Interaction of students, time duration of activities and face-to-face observations were among the key information.

Regarding the willingness to share learning analytics results (RQ3) in the form of visualizations with other colleagues, the results showed high acceptance, as 75% of the participants gave positive responses. Two of the participants indicated that they would be willing to share specific data and on-demand information if asked by other colleagues, and two were not willing to exchange aggregated analytics from their scenarios. The participants were also asked which type of information would be useful to help other colleagues design a similar experience. Although this question received few responses, the useful information mentioned was related to the details of the teaching strategy (similar to the representation of a learning design or a teaching notation), explanations of difficulties faced and positive experiences from other educators, and aspects of the four dimensions we proposed. This highlights the need to inform other educators about the way courses are designed and about the experiences after their implementation, as statistics and visualization may not be enough for the interpretation of learning analytics results.

The willingness to see the results of the implementation of other learning designs also received high acceptance (75%). However, this time 4 participants indicated that they would not like to have access to these kinds of visualizations. This opens up questions about the way data can be presented to educators and about which additional information would help them to re-design their course. The limited responses concerning useful information from other colleagues do not allow us to draw conclusions. However, many participants inquired about concrete related learning design examples and about students' satisfaction levels for each specific part of the course.

Last, the observations carried out by the individual researcher showed that participants were particularly interested in having learning analytics results for each specific case. The discussions of the groups varied according to the participants' beliefs about the different analytics dimensions, and participants often had different understandings of the same results and of possible learning design improvements.
5. CONCLUSION

Data-driven reflection on teaching practice can impact the way in which educators design for learning and deliver their teaching. Educational teams or communities can be formed around situated activities such as teacher planning, the analysis of students' data and the improvement of learning designs. In this paper we analyzed which learning analytics data or additional information is useful to help educational practitioners redesign their learning scenarios. We considered our analysis within teachers' inquiry teams or wider communities, and thus we proposed four learning analytics data types which can be aligned with teachers' pedagogical intentions expressed in a learning design and can drive discussions.

Our case study within a workshop for the design of blended MOOC courses showed that the dimensions of engagement, achievement, progression and satisfaction were perceived as of high value by the participants. This suggests that in this context these learning analytics dimensions are considered relevant to drive reflections. The assessment of students was the most useful information for developing decisions on how to improve future courses. However, the limitation of our case to blended MOOCs, and the fact that the participants were provided with a learning design of high granularity (representing the whole course rather than the design of partial phases of the course), may influence the value of having this data. For instance, teaching representations for a collaborative learning activity may require more data about the learning process and the engagement of students to show interesting information to the teacher.

Second, the experience of the participants with the implementation of blended courses with MOOCs, positive or negative, may influence the interpretation of our results. Most of them were preparing the content of a blended course but had limited experience in implementing it. Further studies should consider interviewing educational practitioners during or after the implementation of their own learning scenarios, as accessibility and the effort to interpret data will provide better insights into the usefulness of this approach.

At a collective level, educational practitioners were interested in viewing learning analytics visualizations from other colleagues or in sharing their own results to inform educational teams. However, the context of the learning design was valuable information for interpreting this data. This suggests that educators are interested in collaborating with others on issues such as the use of students' data to improve their practice, data collection, data visualization and learning design. However, we need to consider that there is a share of practitioners who are not willing to open their practice to data-driven reflections in open educational teams and thus prefer to share practice on demand, if asked by others.

Regarding the four dimensions we proposed, we can conclude that educators may need to search for relations between their data according to their actual meaning. For instance, in our case the value of assessment data was correlated with information about students' satisfaction, and engagement with their progression. Moreover, in our workshop participants asked for teachers' reports regarding the students' discussions in the classroom, and for the exchange of positive or negative experiences from other colleagues. This suggests that additional work is needed on how teachers connect different sources of visual learning analytics and qualitative data to decide how to improve their scenarios. Studies that evaluate practitioners during their design, their use of learning analytics data and their collaboration with other educators can identify patterns of data-driven reflection.

Last, the design implications of our evaluation suggest that educators' teams can be supported with learning analytics visualizations when they have access to the specific learning design of a course and to additional teachers' reports or exchanges of teaching experiences. Educational communities need to concentrate on specific learning analytics data that show the impacts of learning designs, in order to collaboratively formulate important meanings for their teaching practice.
6. ACKNOWLEDGEMENTS

This research is partly funded by RecerCaixa and the Spanish Ministry of Economy and Competitiveness under RESET (TIN2014-53199-C3-3-R) and the Maria de Maeztu Units of Excellence Programme (MDM-2015-0502). The authors want to thank Laia Albó for the global organization of the UCATX workshop and all the professors and researchers who participated.

7. REFERENCES

[1] Albó, L., Hernández-Leo, D., & Oliver, M. 2016. Blended MOOCs: university teachers' perspective. EC-TEL-WS 2015: Trends in Digital Education, Toledo, Spain, September 18, 2015 (pp. 11-15). Published on CEUR-WS: 30-May-2016. http://ceur-ws.org/Vol-1599/

[2] Ali, L., Asadi, M., Gašević, D., Jovanović, J., & Hatala, M. 2013. Factors influencing beliefs for adoption of a learning analytics tool: An empirical study. Computers & Education, 62, 130-148.

[3] Armstrong, J., & Anthes, K. 2001. How data can help. American School Board Journal, 188(11), 38-41.

[4] Avramides, K., Hunter, J., Oliver, M., & Luckin, R. 2015. A method for teacher inquiry in cross-curricular projects: Lessons from a case study. British Journal of Educational Technology, 46(2), 249-264.

[5] Bakharia, A., Corrin, L., de Barba, P., Kennedy, G., Gašević, D., Mulder, R., Williams, D., Dawson, S., & Lockyer, L. 2016. A conceptual framework linking learning design with learning analytics. In Proceedings of the Sixth International Conference on Learning Analytics & Knowledge (pp. 329-338). ACM.

[6] Barab, S., & Squire, K. 2004. Design-Based Research: Putting a stake in the ground. The Journal of the Learning Sciences, 13(1), 1-14.

[7] Berry, B., Johnson, D., & Montgomery, D. 2005. The power of teacher leadership. Educational Leadership, 62(5), 56-60.

[8] Binkhorst, F., Handelzalts, A., Poortman, C. L., & van Joolingen, W. R. 2015. Understanding teacher design teams - A mixed methods approach to developing a descriptive framework. Teaching and Teacher Education, 51, 213-224.

[9] Bolman, R., McMahon, A., Stoll, L., Thomas, S., & Wallace, M. 2005. Creating and sustaining professional learning communities (Research Report 637). London, UK: General Teaching Council for England, Department for Education and Skills.

[10] Cochran-Smith, M., & Lytle, S. L. 1993. Inside/outside: Teacher research and knowledge. New York: Teachers College Press.
[11] Corcoran, T., Mosher, F., & Rogat, A. 2009. Learning progressions in science: An evidence-based approach to reform. Report of the Center on Continuous Instructional Improvement, Teachers College, Columbia University, New York.

[12] Dana, N., & Yendol-Hoppey, D. 2014. The Reflective Educator's Guide to Classroom Research: Learning to Teach and Teaching to Learn Through Practitioner Inquiry. Corwin: London.

[13] Dyckhoff, A. L., Lukarov, V., Muslim, A., Chatti, M. A., & Schroeder, U. 2013. Supporting action research with learning analytics. In Proceedings of the Third International Conference on Learning Analytics and Knowledge (pp. 220-229). ACM.

[14] Ermeling, B. A. 2010. Tracing the effects of teacher inquiry on classroom practice. Teaching and Teacher Education, 26(3), 377-388.

[15] Ferguson, R. 2012. Learning analytics: drivers, developments and challenges. International Journal of Technology Enhanced Learning (IJTEL), 4(5/6), 304-317.

[16] Hernández-Leo, D., & Pardo, A. 2016. Towards Integrated Learning Design with Across-spaces Learning Analytics: A Flipped Classroom Example. In Proceedings of the First International Workshop on Learning Analytics Across Physical and Digital Spaces, co-located with the 6th International Conference on Learning Analytics & Knowledge (LAK 2016), Edinburgh, Scotland, UK, April 25-29, 2016 (pp. 44-78). Published on CEUR-WS: 01-Jun-2016. http://ceur-ws.org/Vol-1601/

[17] Lim, K. H., & Benbasat, I. 2000. The effect of multimedia on perceived equivocality and perceived usefulness of information systems. MIS Quarterly, 449-471.

[18] Lockyer, L., Heathcote, E., & Dawson, S. 2013. Informing Pedagogical Action: Aligning Learning Analytics With Learning Design. American Behavioral Scientist, 57(10), 1439-1459.

[19] Lozano-Alvarez, A., Asensio-Pérez, J. I., Vega-Gorgojo, G., & Martínez-Monés, A. 2015. Helping teachers align learning objectives and evidence: integration of ePortfolios in Distributed Learning Environments. Journal of Universal Computer Science, 21(8), 1022-1041.

[20] Martinez-Maldonado, R., Pardo, A., Mirriahi, N., Yacef, K., Kay, J., & Clayphan, A. 2015. The LATUX workflow: Designing and deploying awareness tools in technology-enabled learning settings. In Proceedings of the Fifth International Conference on Learning Analytics and Knowledge (pp. 1-10). ACM.

[21] Massell, D. 2001. The theory and practice of using data to build capacity: State and local strategies and their effects. In S. H. Fuhrman (Ed.), From the capitol to the classroom: Standards-based reform in the states (pp. 148-169). Chicago: University of Chicago Press.

[22] Mor, Y., Ferguson, R., & Wasson, B. 2015. Editorial: Learning design, teacher inquiry into student learning and learning analytics: A call for action. British Journal of Educational Technology, 46(2), 221-229. http://doi.org/10.1111/bjet.12273

[23] Rebholz, S., Libbrecht, P., & Müller, W. 2012. Learning analytics as an investigation tool for teaching practitioners. In Proceedings of the Workshop Towards Theory and Practice of Teaching Analytics 2012 (TaPTA-2012), Saarbrücken, Germany. CEUR-WS.

[24] Roschelle, J., & Krumm, A. 2015. Infrastructures for Improving Learning in Information-Rich Classrooms. In Measuring and Visualizing Learning in the Information-Rich Classroom (pp. 3-11).

[25] Rodríguez-Triana, M. J., Martínez-Monés, A., Asensio-Pérez, J. I., & Dimitriadis, Y. 2015. Scripting and monitoring meet each other: Aligning learning analytics and learning design to support teachers in orchestrating CSCL situations. British Journal of Educational Technology, 46(2), 330-343.

[26] Schnellert, L. M., Butler, D. L., & Higginson, S. K. 2008. Co-constructors of data, co-constructors of meaning: Teacher professional development in an age of accountability. Teaching and Teacher Education, 24(3), 725-750.

[27] Slavit, D., Nelson, T. H., & Deuel, A. 2013. Teacher groups' conceptions and uses of student-learning data. Journal of Teacher Education, 64(1), 8-21.
[28] Vatrapu, R., Teplovs, C., Fujita, N., & Bull, S. 2011. Towards visual analytics for teachers' dynamic diagnostic pedagogical decision-making. In Proceedings of the 1st International Conference on Learning Analytics and Knowledge (pp. 93-98). ACM.

[29] Vatrapu, R., Kocherla, K., & Pantazos, K. 2013. iKlassroom: Real-Time, Real-Place Teaching Analytics. In 2nd International Workshop on Teaching Analytics (IWTA), Proceedings of the 3rd International Conference on Learning Analytics and Knowledge, 9 April 2013, University of Leuven. ISBN: 978-1-4503-1785-6.

[30] Vescio, V., Ross, D., & Adams, A. 2008. A review of research on the impact of professional learning communities on teaching practice and student learning. Teaching and Teacher Education, 24(1), 80-91.

[31] Weick, K. E., & Sutcliffe, K. M. 2008. Information overload revisited. In The Oxford Handbook of Organizational Decision Making (pp. 56-75). Oxford: Oxford University Press.

[32] Xu, B., & Recker, M. 2012. Teaching Analytics: A Clustering and Triangulation Study of Digital Library User Data. Educational Technology & Society, 15(3), 103-115.

[33] Zhu, C. 2012. Student Satisfaction, Performance, and Knowledge Construction in Online Collaborative Learning. Educational Technology & Society, 15(1), 127-136.