Using Teaching Analytics to Inform Assessment Practices in Technology-Mediated Problem-Solving Tasks

Geneviève Gauthier
Department of Educational Psychology, University of Alberta, Canada
Genevieve.gauthier@ualberta.ca

Abstract: Teaching analytics can provide a useful framework for conceptualising the visual methods used in previous work on teachers' assessment practices. I describe two examples of visual methods and discuss their assumptions and their potential usefulness beyond the original context in which they were designed. Rethinking visual data analysis around the teaching process provides different lenses and ways to use data to inform and improve both the teaching and the learning process.

1 Introduction

Teaching analytics focuses on the development or adaptation of visual methods and technological tools to inform and support teaching practices within technology-enhanced learning contexts [1]. This perspective on data analysis and interpretation in support of teachers' instructional decision making acknowledges the importance of designing and thinking about technology-mediated learning in ways that give teachers an active role. In a discourse that has emphasised the power of educational technology around an individualistic and personalised approach to learning [2], teaching analytics brings a new perspective: it reframes educational technology as enabling teachers and their teaching to occur in new ways, instead of trying to replace the teacher variable in the learning equation. As argued by Goggins [3], isolating teaching analytics from the broader learning analytics perspective makes this layer explicit and acknowledges that learning happens in a social context in which teachers' expertise is instrumental and beneficial to that learning. In terms of research, this perspective shifts the focus from analysing how teachers can make sense of technology to finding ways that technology can enhance or inform teachers' practices. Teachers' assessment practice relates to learners' data because it is built and developed through years of interaction with learners and content in similar contexts. Investigating teachers' practice, instead of focusing solely on concrete instances of learners' data, enables us to explore teachers' pedagogical content knowledge [4].

This type of knowledge, which experienced teachers possess, can be defined as their ability to select, represent and communicate components of the domain knowledge in a way that stimulates learning for novices. Some aspects of pedagogical content knowledge, such as an understanding of what is difficult for learners, the ability to provide good examples and explanations, or the ability to predict the difficulties and challenges that novice learners are likely to encounter, can be linked to the concept of the student model in intelligent tutoring systems [5]. Excellent teachers have an implicit understanding of how students tend to learn and struggle with the content knowledge in their domain. An in-depth study of these predictive aspects of experienced teachers' pedagogical content knowledge represents an indirect way to investigate typical learning patterns for specific tasks.

In this paper I discuss how teaching analytics provides a useful framework to conceptualise the tools I have developed in previous work on teachers' assessment practices.
Thinking of data analysis in ways that can improve teaching allows me to rethink these tools and methods beyond the research context in which they were developed and to propose ways to apply them in other contexts and domains. I begin by situating the context and purpose of our research before describing two examples of visual methods that can be described as teaching analytics. I then use these examples to build and expand on the concept of teaching analytics as a tool to improve teaching, as expressed by previous authors [3, 6].

2 Background and purpose of research

The research is situated in a higher education setting where teachers design and use interactive teaching cases, conceived as problem-solving activities or tools to help students anchor knowledge [7]. In this context, a teaching case typically consists of a story about one or more problems affecting a patient and a scenario for addressing the diagnosis and management of these problems. The development of interactive cases is a technology-mediated task that changes the sequence in which teachers need to plan and implement their assessment. Teachers need to specify the expected outcomes or answers for each case ahead of time, not in reaction to students' productions. This answer, along with the rules and arguments sustaining it, is used to give feedback to students.

The development and testing of interactive teaching cases is done through BioWorld and its Case Builder companion (Lajoie, Lavigne, Guerrera, & Munsie, 2001). BioWorld is a computer-based learning environment in which participants are presented with patient cases to solve. The structure of the environment is non-linear: participants can interact with the problem by selecting potential hypotheses, ordering tests, checking vital signs and scrutinizing the patient problem in any sequence they want. While solving the case, participants collect evidence supporting their reasoning, and they are asked to sort and prioritize the items supporting their final diagnosis.

The development of teaching cases and their corresponding answers and grading rules for the computer-based learning environment led us to unravel interesting assessment issues that highlight the tacit nature of teachers' assessment knowledge [8, 9]. When experts and case creators were asked to solve relatively easy cases using the computer-based learning environment, we encountered validity and reliability issues regarding the proposed "good" answers for the cases. Experts and case creators, who were both experienced practitioners and teachers, could neither reproduce the proposed "good" answers for the cases nor, upon repeating the cases, reproduce their own first answers. Subsequent analysis of the computer logs recording the different diagnostic tests and actions did not reveal any meaningful patterns when different performances were compared (a minimal sketch of how such interaction logs might be represented and compared is given at the end of this section).

To gain insight into how teaching cases could be assessed while allowing for the variability in the reasoning process that we observed in competent practitioners, we investigated how experienced teachers, who manage this variability in their assessment practice on a daily basis, conceptualize the notion of a good answer to teaching cases. Through technology-mediated interactions structured to capture teachers' knowledge in action, we studied how teachers plan, design and interpret students' reasoning performance in the context of open-ended interactive scenarios.
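To give a concrete sense of what such log data looks like, and why simple comparisons of performances can come up empty, the sketch below shows one way interaction traces might be represented and contrasted. It is purely illustrative: the event schema, action labels and comparison measure are assumptions made for this example and do not reflect BioWorld's actual log format or the analyses performed in the study.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class LogEvent:
    """One time-stamped action captured by the learning environment (hypothetical schema)."""
    t: float     # seconds since the case was opened
    action: str  # e.g. "hypothesis", "order_test", "check_vitals"
    target: str  # the test, sign or hypothesis the action refers to

def actions_of(log: List[LogEvent]) -> List[str]:
    """Reduce a log to its ordered sequence of (action, target) labels."""
    return [f"{e.action}:{e.target}" for e in log]

def compare_performances(a: List[LogEvent], b: List[LogEvent]) -> dict:
    """Contrast two resolutions of the same case: what overlaps vs. whether it was sequenced alike."""
    seq_a, seq_b = actions_of(a), actions_of(b)
    shared = set(seq_a) & set(seq_b)
    return {
        "shared_actions": shared,
        "same_order": seq_a == seq_b,
        "coverage_a": len(shared) / len(set(seq_a)) if seq_a else 0.0,
        "coverage_b": len(shared) / len(set(seq_b)) if seq_b else 0.0,
    }

# Two fictitious resolutions of the same case: similar actions, different sequencing.
first = [LogEvent(12, "hypothesis", "hyperthyroidism"),
         LogEvent(40, "order_test", "TSH"),
         LogEvent(95, "check_vitals", "heart_rate")]
second = [LogEvent(8, "check_vitals", "heart_rate"),
          LogEvent(33, "order_test", "TSH"),
          LogEvent(70, "hypothesis", "hyperthyroidism")]

print(compare_performances(first, second))
```

Even in this toy example, the two performances cover the same actions while following different orders, which hints at why sequence-level comparisons of competent performances may fail to converge on a single "good" answer.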
3 Teaching analytics as a tool for practitioners to reflect on their assessment practices

This first example of how methods developed for our research on assessment can benefit from being framed as teaching analytics builds on the idea, expressed by Rebholz, Libbrecht and Müller [6], of using visual representations of data as a tool for practitioners to improve their decision-making process. We propose to use the concept of a tool to go beyond the investigation of learners' individual performance and to include analyses related to the practitioner's judgment over time or over a series of different tasks. As expressed by Goggins [3], teaching analytics has the potential to enhance teaching and consequently to have a positive impact on learning. Going beyond the raw data produced by learners and focusing on teachers' practices and their experiences of interacting with these learners opens up possibilities in the ways we use different sources of data. In this example we describe and explain how teaching analytics can be thought of as a tool to help practitioners reflect on and gain insight into their practice. We use the concept of the student model, understood as the internalised tacit knowledge teachers gain through experience, as an assumption with which to interrogate their assessment practice for specific teaching cases.

Teachers' instructional practice happens in a constant flux of decision making in action. Teachers make many decisions while interacting with students, content and context, but they are not always able to remember or reason about these decisions [10]. When investigating teachers' knowledge about assessment we realised that most of it was implicit; teachers tend to know what a good performance looks like when they see it, but they cannot easily articulate the criteria they use to make this judgment [11]. Across a number of our studies, teachers expressed surprise at the variability in the problem-solving process for relatively easy cases [9, 12]. They were puzzled by the discrepancy between the answers they proposed in advance and the ones they actually produced when solving the case in the computer-based learning environment. They were incredulous about the computer logs and the recordings of their performances when we asked them to solve the same cases twice. Their beliefs about the assessment of teaching cases were not aligned with the reality captured by the computer. As a result, we had to design a method to capture their knowledge in action instead of relying on what they thought the answer should be. The method aimed at capturing teachers' implicit knowledge about their assessment practice, anchored in concrete events rather than in memory or predictions about the problem-solving process and answer [13].

We build on teachers' case-based knowledge by relying on their verbal abilities, which tend to be more developed than their written abilities. Teachers' ability to verbalize their thinking for an external audience is well developed, and using think-aloud protocols with them provides richer verbal descriptions and explanations than with non-teaching experts. To gain insight into the problem-solving process within the technology-mediated problem-solving task, we framed a think-aloud protocol analysis as a teach-aloud task, anchored in a case presentation task that is familiar to these teachers [14].
The use of verbal data combined with video and computer logs provides a better retrospective understanding of the meaning of the data in relation to both the global and the contextual nature of the problem-solving performance. This labour-intensive strategy addresses the incompleteness of data collected through computer log interactions, as expressed by Goggins [3], and it helps teachers reflect back on the data because the verbalisations provide rich narrative context and cues.

The method, developed through a number of experiments in my doctoral studies [15], can be conceptualised as an example of teaching analytics since it uses a visual method to display computer log data in the context of a think-aloud protocol. Although we have mainly used data on teachers' own performance, given our research questions about assessment judgment, we use this information to inform teachers' understanding of the assessment process, which has had an impact on their actual assessment practice. The use of their own performance data is motivated by evidence that teachers' assessment judgments are anchored in personal knowledge about the task [16] and that inferences about their own behaviour are less prone to the various biases that affect their assessments of students' performance [17].

3.1 Brief description of the design and use of the visual representation of data

Data collection and analysis to develop, validate and interact with the visual representation occur in two phases. In phase 1, teachers solve the cases while performing a think-aloud protocol [18] during their interaction with the computer-based learning environment. In the first phase of analysis, the computer log data and the think-aloud protocol are combined and transformed into a sequential representation of the problem-solving performance (a sketch of this merging step is given at the end of this section). Framing the think-aloud as a presentation for a specific audience of learners enables the use of conversation analysis [19] for these monologues, where the focus is on the intentions and meaning(s) of the utterances and actions performed by the participants. The goal in building these visual representations is to use empirical qualitative models as tools to study the complexity of problem-solving performance with participants.

We use these visual representations with participants to have them validate the analysis before asking them to use the representation as a tool to reflect on the assessment of the specific teaching case. The validation phase, in which teachers can inspect their entire verbal protocol, is similar to a retrospective think-aloud protocol in which participants have the opportunity to add to or comment on their previous performance [18]. After the validation task, participants are asked to reflect on their problem-solving process by categorizing the key features of their resolution process that are indicative of a good performance. Figure 1 shows a section of a categorized individual visual representation.

Fig. 1. Extract of a teacher's visual representation after validation and categorization

This visual teach-aloud method was designed to study how experienced teachers use their contextual case knowledge. We aimed at extracting their concept of competent reasoning for specific cases to inform a reflection on their individual and shared assessment practice. We do not intend to question teachers' expertise and knowledge; rather, we rely on retrospective contemplation of their actions after the fact. This strategy is what Schön [20] refers to as "reflection-on-action": it builds on their implicit knowledge and promotes a better understanding of the strengths and limitations of their assessment judgment.
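As a purely illustrative complement to the description above, the following sketch shows one way the phase 1 merging step could be implemented: time-stamped log actions and transcribed utterances are interleaved into a sequential structure that a teacher can later categorise. The data structures, field names and alignment rule are assumptions made for this example and are not the actual representation or tooling used in the study.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Utterance:
    t: float   # onset time (seconds) of the transcribed segment
    text: str  # what the teacher said while presenting the case

@dataclass
class Action:
    t: float    # time of the logged action
    label: str  # e.g. "order_test:TSH"

@dataclass
class Step:
    """One row of the sequential representation: a logged action plus the talk around it."""
    action: Action
    talk: List[Utterance] = field(default_factory=list)
    category: Optional[str] = None  # added later by the teacher during validation/categorisation

def build_sequence(actions: List[Action], utterances: List[Utterance]) -> List[Step]:
    """Attach each utterance to the most recent logged action (or the first, if it precedes all actions)."""
    steps = [Step(a) for a in sorted(actions, key=lambda a: a.t)]
    if not steps:
        return []
    for u in sorted(utterances, key=lambda u: u.t):
        owner = steps[0]
        for s in steps:
            if s.action.t <= u.t:
                owner = s
        owner.talk.append(u)
    return steps

# Fictitious fragment of a case resolution
sequence = build_sequence(
    [Action(40, "order_test:TSH"), Action(95, "check_vitals:heart_rate")],
    [Utterance(42, "I want to rule out a thyroid problem first."),
     Utterance(97, "The tachycardia fits with that hypothesis.")],
)
for step in sequence:
    print(step.action.label, [u.text for u in step.talk])
```

The resulting list of steps is the kind of chronological backbone onto which a visual representation, and later the teacher's own categories, can be layered.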
4 Beyond the individual learner: analysing different units of data

In this second example we build on Goggins' [3] idea of using teaching analytics as a tool to bring a social perspective into learning analytics. Our concept of a group is, however, more elusive and is not limited to groups that interact and learn together in time and space. Teaching analytics can enable teachers to capitalize on the social component of learning beyond the analysis of the individual learner's performance. This perspective on the analysis and use of data has the potential to open the door of the classroom by rethinking the unit, purpose and use of the data within the learning process. Individual performance data may be relevant in some situations, but different groupings of data may be more useful or informative when we frame different questions about the learning process. What defines a "meaningful unit" depends on the context and the aspect of the learning process that we decide to explore. We suggest that a group of teachers grading the same task is a useful unit of analysis, as is an entire class solving the same task or problem. Analysing data from these two groups can provide insight and better contextualize the judgment and interpretation of performance in a computer-based learning environment. We briefly define the different groups or units of analysis that we have used in our experiments and discuss how these uses of data can inform teachers about their assessment practices.

4.1 Triangulating data and comparing teachers' assessment criteria

In our work we used a group of teachers who teach the same course to different students as a unit of analysis. These teachers sometimes discuss or build teaching cases together, but they do not teach or learn together per se. We compared their concepts of a good performance for each case and analysed the convergence in their judgments. We created combined visual representations for each case resolution by merging each participant's individual representation into one complex multi-layered representation. These representations were the result of analysing their convergence in decision making about the key elements required for students to demonstrate successful reasoning (a sketch of one way such convergence could be quantified is given at the end of this subsection).

The analysis process was very insightful for the teachers, as they had never had the opportunity to see how competent colleagues would solve these types of problems. We did not formally capture their reflections on the process, but teachers' comments on a number of occasions gave us insight into its impact on their perspective on assessment. For example, at the beginning of the analysis, one participant asked which student had produced one of the visual representations I had on the wall. As this representation happened to have been produced by one of their colleagues, it completely changed the way they looked at it. This comment was revealing of their typical assessment experience, in which they mostly compare and evaluate students' performances based solely on their own understanding. The exercise of comparing with colleagues opened their minds to different "good" ways to solve these cases, which in turn affects how teachers assess them.
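To make the idea of convergence concrete, the sketch below computes, for each key element flagged by at least one teacher, the proportion of teachers who flagged it. This is an illustrative simplification under assumed data (the teacher labels and key elements are fictitious); the study itself relied on qualitative analysis of the merged representations rather than on this particular measure.

```python
from collections import Counter
from typing import Dict, Set

def convergence(key_elements: Dict[str, Set[str]]) -> Dict[str, float]:
    """For each element flagged by at least one teacher, the share of teachers who flagged it."""
    counts = Counter(e for elements in key_elements.values() for e in elements)
    n = len(key_elements)
    return {element: c / n for element, c in counts.most_common()}

# Fictitious key elements three teachers marked as indicative of a good resolution of the same case
marks = {
    "teacher_A": {"order TSH", "link tachycardia to hypothesis", "rule out anemia"},
    "teacher_B": {"order TSH", "link tachycardia to hypothesis"},
    "teacher_C": {"order TSH", "rule out anemia", "check medication history"},
}

for element, share in convergence(marks).items():
    print(f"{share:.0%} of teachers required: {element}")
```

Elements with high shares point to features of a resolution that colleagues agree are essential, while low-share elements flag exactly the kind of legitimate variability that a single-answer grading scheme would miss.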
4.2 Visual representations as a tool to contextualize performance and its assessment

The use of visual representations has the potential to inform teachers about the nature of competent performance. It provides a concrete trace of the context and process of the problem-solving performance, including tangents and mistakes, that differs from their ideal image of what the answer should be. This reminder of the complexity and potential variation in the performance of a competent self or of competent colleagues improves the transparency of the assessment procedure. It enables a better evaluation of the validity claims and the corresponding inferences of proficiency related to scoring in small-scale educational settings [21]. While collecting students' performances on the cases, we experimented with using the visual representation tools described above as assessment tools to promote a critical perspective among teachers about their own judgment. Teachers asked more questions in their feedback, and they also ended up using the visual representations as tools to communicate their assessment criteria to students when they gave them their grades and feedback. Students said that the representations of the problem-solving process helped them see how they could have solved the case in different ways and made them aware of how much more depth they could have gone into.

5 Conclusion and discussion

The two examples described above show how visual displays of data analysis can provide insight into the teaching process in the context of research on assessment. If we think about these tools beyond the context in which they were designed and used, they can become more generic tools with the potential to support the teaching and learning process in other contexts. Teaching analytics provides a new way to frame questions related to the teaching process by acknowledging the role of the teacher as the orchestrator of learning in classroom settings. A better understanding of the socio-technical context in which learning occurs will lead to the design and implementation of technologies that better align technical requirements and affordances with teachers' analytic strengths and weaknesses.

Future research using teaching analytics as a tool to explore assessment practices will go beyond the realm of computer-mediated interaction to examine whether these tools can also influence classroom assessment judgment in contexts where no technology is involved. We will also explore the use of visual representations to anchor discussions and training about assessment with groups of teachers in disciplinary and interdisciplinary health contexts.

References

1. Vatrapu, R.K.: Towards semiology of teaching analytics. TaPTA Workshop at EC-TEL, Saarbrücken, Germany (2012)
2. Cuban, L.: Oversold and Underused: Computers in the Classroom. Harvard University Press, Cambridge, MA (2001)
3. Goggins, S.P.: Group informatics: A multi-domain perspective on the development of teaching analytics. TaPTA Workshop at EC-TEL, Saarbrücken, Germany (2012)
4. Shulman, L.: Knowledge and teaching: Foundations of the new reform. Harvard Educational Review 57, 1-23 (1987)
5. Murray, T.: Authoring intelligent tutoring systems: An analysis of the state of the art. International Journal of Artificial Intelligence in Education 10, 98-129 (1999)
6. Rebholz, S., Libbrecht, P., Müller, W.: Learning analytics as an investigation tool for teaching practitioners. TaPTA Workshop at EC-TEL, Saarbrücken, Germany (2012)
7. Kolodner, J.L., Camp, P.J., Crismond, D., Fasse, B., Gray, J., Holbrook, J., Puntambekar, S., Ryan, M.: Problem-based learning meets case-based reasoning in the middle-school science classroom: Putting Learning by Design (TM) into practice. Journal of the Learning Sciences 12, 495-547 (2003)
8. Gauthier, G., Lajoie, S.P., Richard, S., Wiseman, J.: Mapping and validating diagnostic reasoning through interactive case creation. In: E-Learn 2007, pp. 2553-2562 (2007)
9. Gauthier, G., Conway, J.M., Taylor, R.: Assessment planning for interactive case-based learning scenarios. European Association for Research on Learning and Instruction (EARLI) Special Interest Group on Assessment and Evaluation, Brussels, Belgium (2012)
10. Fenstermacher, K.D.: The tyranny of tacit knowledge: What artificial intelligence tells us about knowledge representation. In: 38th Hawaii International Conference on System Sciences, pp. 1530-1605. IEEE (2005)
11. Sadler, D.R.: Specifying and promulgating achievement standards. Oxford Review of Education 13, 191-209 (1987)
12. Gauthier, G., Conway, J.M., Taylor, R.: Variability in the expert solution for case-based learning scenarios: Reliability issues. Canadian Society for the Study of Education (CSSE), Waterloo, Ontario (2012)
13. Gauthier, G.: Teach aloud: A modified version of the think-aloud protocol to study the teaching of clinical reasoning. Qualitative Health Research (QHR), Montreal, Quebec (2012)
14. Anspach, R.R.: Notes on the sociology of medical discourse: The language of case presentation. Journal of Health and Social Behavior 29, 357-375 (1988)
15. Gauthier, G.: Capturing and Representing the Reasoning Processes of Expert Clinical Teachers for Case-Based Teaching. McGill University, Montreal (2009)
16. Wyatt-Smith, C., Castleton, G., Freebody, P., Cooksey, R.W.: The nature of teachers' qualitative judgements: A matter of context and salience. Part one: "In-context" judgements. Australian Journal of Language and Literacy 26, 11-32 (2003)
17. Bowers, A.J.: What's in a grade? The multidimensional nature of what teacher-assigned grades assess in high school. Educational Research and Evaluation 17, 141-159 (2011)
18. Ericsson, K.A., Simon, H.A.: Protocol Analysis: Verbal Reports as Data. MIT Press, Cambridge, MA (1993)
19. Hutchby, I., Wooffitt, R.: Conversation Analysis: Principles, Practices and Applications. Blackwell Publishers, Oxford (1998)
20. Schön, D.A.: The Reflective Practitioner: How Professionals Think in Action. Basic Books, New York (1983)
21. Kane, M.T.: An argument-based approach to validation. Psychological Bulletin 112, 527-535 (1992)