=Paper=
{{Paper
|id=None
|storemode=property
|title=Learning Analytics as an Investigation Tool for Teaching Practitioners
|pdfUrl=https://ceur-ws.org/Vol-894/paper7.pdf
|volume=Vol-894
}}
==Learning Analytics as an Investigation Tool for Teaching Practitioners==
Learning Analytics as an Investigation Tool for Teaching Practitioners

Sandra Rebholz¹, Paul Libbrecht², Wolfgang Müller¹

¹ University of Education Weingarten, Media Education and Visualization Group (MEVis)
² MLU Halle, Center for Educational Research in Mathematics and Technology

Abstract: Formative assessment plays an important role in teaching, both to provide learners with valuable feedback and to improve the teaching process. Recently, novel tools have been developed to support teachers in this context. These tools allow one to analyze the students' activity in detail and to guide and improve the teaching process. We present one such tool and, following an evaluation, the scenarios in which this tool would be used. From these scenarios, we deduce the structure of a general analytical process for teachers that yields general requirements for the corresponding analytical tools.

1 Introduction

Feedback on the learner's performance represents an indispensable element in learning scenarios. Adequate feedback presented in a timely fashion may improve and accelerate student learning. However, the task of providing such feedback is challenging in classical learning scenarios due to the effort required to analyze students' performance. Novel techniques and tools from the field of computer-based assessment (CBA) may overcome this problem. These allow for solving exercises at a computer, recording not only the final outcome but also the whole of a student's learning process on the level of individual solution steps. This is especially important in cases where feedback is required not only on the reproduction of factual knowledge, but also on the application of strategies and algorithmic thinking, as in various fields of mathematics.

Applying corresponding forms of formative CBA may easily result in large amounts of log data from learners. In principle, this data may allow teachers to analyze a learner's performance in detail, to provide helpful feedback to individuals, and to adapt teaching in general. Thus, approaches to analyze these vast amounts of data are required. For instance, summary views are required to detect symptoms such as misunderstandings or excessive successes, while sampling methods have to be supported to investigate problems seen in the summaries.

The contribution of this paper is an analysis of the analytical process of the teachers we aim to support, based on a range of scenarios. Moreover, we describe demands on teaching and learning analytics tools and visualizations.

The remainder of the paper is structured as follows: First, we give an overview of the theoretical background and related work. Then, we introduce tools developed in the SAiL-M project, showcasing novel opportunities to include CBA in the classroom and how this enables teachers to analyze students' performance in a new way. We present three different scenarios, which explain in more detail the different motivations of teachers for analyzing data, and analyze the processes observed. Open questions form the conclusion.

2 Background and Related Work

Formative assessment is intended to improve teaching and learning. This kind of assessment aims at providing an overview of the students' understanding and at identifying areas of misconception. When conducting formative assessments, the teacher faces the challenge of interpreting the data obtained as assessment results and adapting instruction based on that information.
When applying the concept of formative assessment in a university setting, the challenge already arises when trying to organize assignments for a large number of participants: assignments have to be corrected, errors have to be analyzed, data has to be collected, and feedback has to be given to the students. When this is done on a regular basis, very little time remains for interpreting the obtained data and for thinking about strategies of instructional change.

A teacher's role as a researcher or investigator is increasingly being highlighted. For example, the research review [10] stresses the importance of the teachers' ability to synthesize how students use the tools. Corresponding approaches are connected to the field of action research [6], which aims to improve teaching through reflection and the development of adequate actions in the teachers' own classroom setting. Adequate formative assessment typically also represents a crucial element in such approaches.

Computers have frequently been proposed as a means to perform assessments regularly and effectively. In fact, early approaches in the area of programmed instruction include this notion [9]. Intelligent tutoring systems (ITS) also integrate different types of assessment components as part of their inner loop [12]. In both cases, the results of the assessment are used to adapt the learning process. However, the corresponding information is rarely provided to a teacher or tutor for further analysis. CBA refers to a set of different approaches for educational assessment, both in the classroom and in large-scale testing situations, based on question types with which computers can effectively interact, including scoring and score reporting, while still gathering meaningful measurement evidence [7]. In practice, most of the applied techniques boil down to multiple-choice-like questionnaires, which may provide limited detail on students' conceptions and misconceptions, especially when the aim is to convey knowledge beyond the factual level and to address higher-level competencies. Intelligent CBA [2] represents a recent approach to overcome these limitations, utilizing methods from the field of ITS to assess more complex solution processes on the level of individual solution steps. As a result, such assessments may provide much more detailed information about students' performance, raising the demand for approaches to analyze the large amounts of recorded data efficiently. In contrast to ITS approaches, intelligent assessment does not rely on automatic assessment alone, but rather introduces a semi-automatic approach, thus creating a further demand for adequate methods to support a teacher in the assessment of solution processes.

There have been a number of proposals for defining Learning Analytics, which to some extent pursue different objectives and only partially overlap [8]. We connect to the field of Visual Analytics [5, 11], and for this reason we understand Learning Analytics as a specific focus and application area of Visual Analytics. That is, Learning Analytics relates to approaches and technologies that enable analytical reasoning, facilitated by visual interfaces, in teaching and learning. Objectives are the detection of interesting aspects and patterns in learner and learning data, building hypotheses based on these detected structures, confirming such hypotheses, drawing conclusions, and possibly communicating the results of this analytical process.
In the context of this paper we discuss how Learning Analytics relates to formative assessment, and what specific requirements corresponding solutions should fulfil from the perspective of a teacher-as-investigator.

3 Intelligent Assessment Tools

In the context of the research project SAiL-M (Semi-automatic Analysis of Individual Learning Processes in Mathematics), we have developed various interactive learning tools for the field of mathematics. Our learning tools implement the approach of intelligent assessment and use the general-purpose logging architecture SMALA (SAiL-M Architecture for Learning Analytics) for recording all semantically relevant interactions between the learners and the tools.

The SAiL-M learning tools are web-based software applications that can be accessed as learning activities from within a learning management system (LMS). Authenticated users of the LMS can use the learning tools and solve exercises interactively. The actions of the learner are analyzed and automated feedback is provided (generally detecting standard errors or standard solution paths).

The SMALA logging infrastructure provides the learning tools with the extra functionality of recording all interactions that occur between the learner and the learning tool. In order to document individual learning processes, the learning tools send all semantically relevant interactions as events to the SMALA logging service, with information such as the pseudonym, the input, and the displayed feedback. The events are stored in the SMALA database, and from there the data logs are analyzed and represented by suitable log views. Authorized teachers can access these log views from the SMALA web server; the SMALA logging architecture is shown in Figure 1. Available log views include both summary views on the activities and performance of the whole group of learners, and session views on step-by-step recordings of individual learning processes. We describe them below.

[Fig. 1: The architecture of SMALA.]

In the winter term 2011/2012 the learning tools were used and evaluated by about 200 students and 6 teachers at the Universities of Education in Heidelberg, Karlsruhe and Ludwigsburg. The goal of the evaluation was to investigate both the acceptance and the usefulness of the learning tools from the students' point of view and from the teachers' point of view. The evaluations showed the toolset and observation mechanism to be acceptable to students and to allow them to seek help effectively (see [4]). Interviews with four participating teachers confirmed that individual learning processes were reproducible by means of the SMALA session views. However, teachers requested other statistical indicators and richer summary views for getting a general overview of the students' activity and performance. In particular, they were interested in statistics on assessment results (e.g., type and number of detected problems), the number of feedback requests, and the level of activity per student, which were not available during the evaluation. Based on these outcomes, we have developed analytical process scenarios as a means to illustrate the integration of SMALA log views into teaching practice and to get a better understanding of the teachers' requirements for such log views.
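To make the event recording described above more concrete, the following is a minimal sketch of how a learning tool might report one semantically relevant interaction to a SMALA-like logging service. The endpoint URL, field names, and event structure are illustrative assumptions for this sketch; the paper does not specify SMALA's actual API.

<syntaxhighlight lang="python">
# Sketch only: a learning tool posting one interaction event to a logging service.
# Endpoint URL and field names are hypothetical, not SMALA's documented interface.
import json
from dataclasses import dataclass, asdict
from datetime import datetime, timezone
from urllib.request import Request, urlopen

@dataclass
class ToolEvent:
    pseudonym: str       # pseudonymous learner identifier
    tool: str            # e.g. "ComIn-M" or "Squiggle-M"
    exercise: str        # exercise identifier within the tool
    action: str          # e.g. "verify", "ask-tutor", "input"
    student_input: str   # the term or step the learner entered
    feedback: str        # the feedback the tool displayed, if any
    timestamp: str       # ISO-8601 time of the interaction

def send_event(event: ToolEvent, endpoint: str) -> None:
    """POST one semantically relevant interaction to the logging service."""
    payload = json.dumps(asdict(event)).encode("utf-8")
    req = Request(endpoint, data=payload,
                  headers={"Content-Type": "application/json"})
    with urlopen(req) as response:
        response.read()  # the service stores the event for later log views

if __name__ == "__main__":
    send_event(
        ToolEvent(
            pseudonym="learner-4711",
            tool="ComIn-M",
            exercise="exercise-1",
            action="verify",
            student_input="n*(n+1)/2",
            feedback="base case correct",
            timestamp=datetime.now(timezone.utc).isoformat(),
        ),
        endpoint="https://example.org/smala/events",  # hypothetical URL
    )
</syntaxhighlight>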
4 Analytical Process Scenario

When integrating formative CBA tools in a classical teaching environment, the teachers typically introduce the learning tools during the lecture. The tools can then be used at home or in the lab. Based on the assessment results, the teachers can then decide on appropriate adaptations of their teaching strategy. In this section, we describe illustrative scenarios involving the usage of learning analytics tools.

Scenario I: Checking the students' activity. The teacher introduces the topic of functions and relations. In order to demonstrate the concepts of injectivity and surjectivity, she interactively constructs relations using the tool Squiggle-M, a tool for exploratively discovering the properties of finite mappings. At the end of the session, the teacher asks the students to do exercises 1 and 2 from the learning tool as homework and to perform their own explorations with the tool as they wish, within one week. One day before the next session, the teacher checks whether the students have worked with the learning tool. She opens the SMALA log views for the Squiggle-M learning activity and gets a visual representation of the learners' activities. According to the diagram, most of the students have solved both exercises. Their level of activity shows that they were interacting intensely with the tool. Only a few students have done just one exercise, with a low level of activity. Based on her experience, she decides that 74% is a reasonable proportion of high involvement.

Scenario II: Detecting possible areas of misconception. The teacher introduces the concept of proofs by complete induction. After explaining the underlying idea, she demonstrates how to apply the principle by proving the Gauss summation formula. At the end of the session, the teacher asks the students to do exercises 1 to 3 from the online training tool ComIn-M as homework. The homework should be completed before next week's session. One day before the next session, the teacher checks whether the students were successful in solving the exercises using a tabular overview (such as the one in Figure 2), which shows counts of successes for each exercise; this is one of the SMALA log views for the ComIn-M learning activity. According to the view, several students have worked on the first exercises.

[Fig. 2: A summary view in SMALA.]

In order to get an impression of the students' performance, the teacher opens a second view that shows the successful sessions. She realizes that only very few students have come to a correct solution. When drilling down to a representation of the types of problems that were detected by the automatic analysis component of the learning tool, which is linked to a detailed session view, the teacher realizes that most of the problems occurred in the step of finding the correct induction statement. According to this report, 80% of the students gave up at this stage of the solution process. Because the teacher considers these findings critical, she decides to explain this part of complete induction again in more detail in the coming session.

Scenario III: Providing individual feedback. A student is doing the homework for the mathematics class and opens the learning tool ComIn-M in order to solve the exercises online. She selects the first exercise that the teacher asked the students to do. After some minor difficulties in entering the mathematical formulae, she successfully enters the base case for the proof by mathematical induction.
In order to find out whether her work is correct so far, she requests an automatic analysis of the current solution by pressing the "Verify" button. A green check mark appears on the screen, confirming that her intermediate solution is correct. She continues by selecting the correct assumption and then tries to figure out the induction statement. As she is not sure what to enter, she guesses a statement and requests an automatic analysis from the tool. This time, the tool marks the current solution as wrong and displays a short description of the problem that was detected. The student now tries another solution and requests an analysis again, but again the tool reports a problem. The student is afraid that she cannot find the correct solution on her own, so she uses the "Ask Tutor" feature of the learning tool. By simply clicking the corresponding link, a dialog window opens and lets her enter a message to the tutor. When she submits the dialog, her message is sent to the responsible tutor along with a link to her SMALA logging session.

Later that day, the tutor checks her email and finds a notification that a ComIn-M user needs personal help from her. She reads the message and follows the link to the SMALA logging session. This log view shows the recorded interaction sequence between the student and the learning tool up to the point of the help request, as illustrated in Figure 3: it shows an easily readable overview of the terms the student has input and all the problems that were reported by the tool. Investigating the last state of the solution process, the tutor quickly finds out that the student did not replace the index variable correctly. So she sends her advice back to the requesting student, addressing the concrete problem that she detected in the student's solution.

[Fig. 3: SMALA session view.]

4.1 The Analytical Processes

In the scenarios described above, the teachers typically perform four kinds of reasoning, which can be carried out in alternation or in parallel: based on their knowledge of the domain and the learning tools, they have expectations of the learners' activities; these expectations are compared to the analytical views by explorative browsing; this browsing leads to interpretations of the learning processes, which result in strategies for further teaching actions being assembled.

Process 1: Determine the expectations. Based on the course plan and the assignments, teachers have expectations about the students' usage of the learning tools. Typical expectations of interest would be: this assignment should have been fully (or barely) completed since it is easy (or challenging); typical problems may (or may not) appear; this technical problem is likely to happen; or the analytics views should provide sufficient evidence to decide whether or not to deepen a subject. These expectations are constantly adjusted based on the processes below.

Process 2: Log views analysis. Typically, teachers perform a multi-step analysis of the assessment results [3]. First, teachers look at overall scores and learning outcomes to get an overview of the general class performance. Such summaries of the assessment results should highlight weaknesses both by content area and by student. Thereby, it is possible to detect common problems and difficulties. It is also possible to identify low-performing students who need special support and further guidance. In a second step, teachers perform an in-depth analysis of selected individual solutions and errors. This detailed analysis should reveal insights into the reasons for errors. Ideally, not only the product of learning should be considered in this analysis step, but the whole process leading to the final product of learning.
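As an illustration of this two-step analysis over SMALA-style logs, the following minimal sketch first builds an overview of success counts per exercise and then flags exercises whose success rate falls below a cut-off value, anticipating the notion of thresholds discussed in Process 3 below. The field names and the 70% cut-off are illustrative assumptions, not SMALA's actual data schema.

<syntaxhighlight lang="python">
# Sketch only: overview and threshold check over hypothetical logged events.
from collections import defaultdict
from typing import Iterable, Mapping

def summarize_by_exercise(events: Iterable[Mapping]) -> dict:
    """Count, per exercise, how many distinct students attempted and solved it."""
    attempted: dict[str, set] = defaultdict(set)
    solved: dict[str, set] = defaultdict(set)
    for ev in events:
        attempted[ev["exercise"]].add(ev["pseudonym"])
        if ev.get("result") == "correct":
            solved[ev["exercise"]].add(ev["pseudonym"])
    return {ex: {"attempted": len(attempted[ex]), "solved": len(solved[ex])}
            for ex in attempted}

def below_threshold(summary: dict, threshold: float = 0.7) -> list[str]:
    """Exercises whose success rate falls below the teacher's threshold."""
    return [ex for ex, s in summary.items()
            if s["attempted"] and s["solved"] / s["attempted"] < threshold]

# Example: with logs like these, exercise-3 is flagged for a drill-down
# into the individual session views.
events = [
    {"pseudonym": "p1", "exercise": "exercise-1", "result": "correct"},
    {"pseudonym": "p2", "exercise": "exercise-1", "result": "correct"},
    {"pseudonym": "p1", "exercise": "exercise-3", "result": "wrong"},
    {"pseudonym": "p2", "exercise": "exercise-3", "result": "wrong"},
]
summary = summarize_by_exercise(events)
print(summary, below_threshold(summary))
</syntaxhighlight>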
Process 3: Interpret the learning processes. The interpretation of the log views leads to an understanding of the learning process. The interpretations depend on the professional experience of the teacher and are often based on so-called "thresholds". A threshold in this context is defined as a "criteria for determining whether student[s] performance[s] require[s] an instructional response" [3]. Teachers can use thresholds as an indicator of whether a student has mastered the content covered by the assessment. This allows teachers to decide on the need for an adaptation of the teaching plan, for the classroom or for the individual student.

Process 4: Preparing the instructional response. The outcome of the analytical process is the most important and most challenging objective: how can the interpreted analysis be turned into action? What are the necessary measures to address the detected problems? Again, it is the experience of the practicing teacher that can help in answering these questions. In this step, research meets practice by developing concrete teaching strategies from research findings. Although it is the individual teacher who is ultimately responsible for selecting an appropriate action strategy, there are numerous sources of ideas and suggestions for finding such strategies [1].

5 Open Questions and Conclusions

Teachers require dedicated learning analytics tools that support them in the analysis of data from formative CBAs. Analytical processes such as the four-step procedure described in this paper have to be supported. In addition, exploratory approaches with no initial hypotheses in mind must be supported; here, hyperlinking between the representations certainly supports an exploratory investigation.

Currently, our SMALA interface is restricted to simple depictions of data corresponding to typical goals and analysis steps, as well as to displaying the usage of the learning tools. We are currently extending this by developing standard graphs for detailed information and special visualizations supporting overview/detail representations and the interactive filtering and thresholding of data (e.g., table lens techniques [13]), as well as by analyzing temporal aspects in the data [14]. A more detailed analysis is required to evaluate the potential of integrating such visualizations into the interface for regular teachers, who are not experts in exploiting visual representations.

References

1. Altrichter, H. & Posch, P. (2007). Lehrerinnen und Lehrer erforschen ihren Unterricht (4. Auflage). Bad Heilbrunn: Julius Klinkhardt.
2. Bescherer, C., Herding, D., Kortenkamp, U., Müller, W., & Zimmermann, M. (2011). E-Learning Tools with Intelligent Assessment and Feedback. In S. Graf et al. (Eds.): Adaptivity and Intelligent Support in Learning Environments. IGI Global.
3. Goertz, M.E., Oláh, L.N., & Riggan, M. (2009). Can interim assessments be used for instructional change? (CPRE Policy Brief R-51). Philadelphia, PA: University of Pennsylvania, Consortium for Policy Research in Education.
4. Libbrecht, P., Rebholz, S., Herding, D., Müller, W., & Tscheulin, F. (2012). Understanding the Learners' Actions when using Mathematics Learning Tools. In: Jeuring, J. et al. (Eds.): Proceedings CICM 2012. LNCS 7362. Springer.
5. Keim, D., Mansmann, F., Schneidewind, J., Thomas, J., & Ziegler, H. (2008). Visual analytics: Scope and challenges. In S. Simoff, M. Böhlen, & A. Mazeika (Eds.): Visual Data Mining, volume 4404 of LNCS, pages 76–90. Springer. See http://www.springerlink.com/content/5275180h84863347/.
6. Kemmis, S., & McTaggart, R. (1982). The action research planner. Victoria, Australia: Deakin University Press.
7. Scalise, K. & Gifford, B. (2006). Computer-Based Assessment in E-Learning: A Framework for Constructing "Intermediate Constraint" Questions and Tasks for Technology Platforms. The Journal of Technology, Learning, and Assessment, 4(6). http://www.dokeos.com/doc/thirdparty/Computer%20Based%20Assessment.pdf
8. Siemens, G. (2011). What are learning analytics? See http://www.elearnspace.org/blog/2010/08/25/what-are-learning-analytics/.
9. Skinner, B. F. (1958). Teaching machines. Science, 128, 969–977.
10. Slavin, R. E., Lake, C., & Groff, C. (2009). Effective programs in middle and high school mathematics: A best-evidence synthesis. Review of Educational Research, 79(2), 839–911.
11. Thomas, J.J. & Cook, K. A. (2006). A visual analytics agenda. IEEE Computer Graphics and Applications, 26(1), 10–13.
12. VanLehn, K. (2006). The Behavior of Tutoring Systems. Int. J. Artif. Intell. Ed., 16(3), 227–265.
13. Rao, R. & Card, S. K. (1994). The table lens: merging graphical and symbolic representations in an interactive focus + context visualization for tabular information. In CHI '94 (pp. 318–322).
14. Aigner, W., Miksch, S., Müller, W., Schumann, H., & Tominski, Ch. (2008). Visual Methods for Analyzing Time-Oriented Data. IEEE Transactions on Visualization and Computer Graphics, 14(1).