=Paper=
{{Paper
|id=Vol-1738/IWTA_2016_paper3
|storemode=property
|title=Luna: A Dashboard for Teachers Using Intelligent Tutoring Systems
|pdfUrl=https://ceur-ws.org/Vol-1738/IWTA_2016_paper3.pdf
|volume=Vol-1738
|authors=Kenneth Holstein,Franceska Xhakaj,Vincent Aleven,Bruce McLaren
|dblpUrl=https://dblp.org/rec/conf/ectel/HolsteinXAM16
}}
==Luna: A Dashboard for Teachers Using Intelligent Tutoring Systems==
Kenneth Holstein, Franceska Xhakaj, Vincent Aleven, Bruce McLaren
Human-Computer Interaction Institute, Carnegie Mellon University
kjholste@cs.cmu.edu, francesx@cs.cmu.edu, aleven@cs.cmu.edu, bmclaren@cs.cmu.edu

ABSTRACT
Intelligent Tutoring Systems (ITS) generate a wealth of fine-grained student interaction data. Although it seems likely that teachers could benefit from access to advanced analytics generated from these data, ITSs do not typically come with dashboards designed for teachers' needs. In this project, we follow a user-centered design approach to create a dashboard for teachers using ITSs.

CCS Concepts
• Human-centered computing~Human computer interaction (HCI) • Applied computing~Education

Keywords
Intelligent tutoring systems, learning analytics, user-centered design, dashboards, blended learning, student modeling

1. THE LUNA DASHBOARD
Intelligent Tutoring Systems (ITS) [4, 8, 9] typically generate a wealth of fine-grained data about student progress and learning. The analytics that can be derived from these data include, for example, estimates of student knowledge, decomposed by skills and misconceptions within a domain, as well as time and progress on various activities within the ITS. Although it seems highly likely that human teachers could benefit from access to these analytics, ITSs do not typically come with teacher dashboards that are designed with a thorough understanding of teachers' needs for actionable information in various contexts. More often, analytics from ITSs are used for research purposes or in Open Learner Models shown to the student. While some ITSs show reports that may be useful for teachers (e.g., [1, 2]), these are not typically designed specifically to address teachers' needs.
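The paper does not specify which student model underlies these skill estimates; Bayesian Knowledge Tracing (BKT) is one common choice in step-based tutors, and the sketch below is a minimal, hypothetical illustration of how a per-skill mastery estimate of that kind could be updated from step-level correctness data. The class, parameter values, and mastery threshold are assumptions for illustration only, not taken from Luna, Lynnette, or CTAT/TutorShop.

<pre>
# Illustrative sketch (not the authors' implementation): Bayesian Knowledge
# Tracing, one common way ITSs estimate per-skill student knowledge from
# step-level correctness data. All parameter values below are hypothetical.

from dataclasses import dataclass

@dataclass
class BKTSkill:
    p_known: float = 0.2   # P(L0): prior probability the skill is already known
    p_learn: float = 0.15  # P(T): probability of learning at each opportunity
    p_slip: float = 0.1    # P(S): probability of an error despite knowing the skill
    p_guess: float = 0.2   # P(G): probability of a correct answer by guessing

    def update(self, correct: bool) -> float:
        """Update the mastery estimate after one observed step attempt."""
        if correct:
            evidence = self.p_known * (1 - self.p_slip)
            total = evidence + (1 - self.p_known) * self.p_guess
        else:
            evidence = self.p_known * self.p_slip
            total = evidence + (1 - self.p_known) * (1 - self.p_guess)
        posterior = evidence / total
        # Account for the chance of learning on this opportunity.
        self.p_known = posterior + (1 - posterior) * self.p_learn
        return self.p_known

# Example: one student's attempts on a hypothetical equation-solving skill.
skill = BKTSkill()
for outcome in [False, True, True, True]:
    estimate = skill.update(outcome)
print(f"Estimated mastery: {estimate:.2f}")  # e.g., flag as 'mastered' above 0.95
</pre>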
In our project [3], our goal is to support teacher decision-making and self-reflection in blended learning environments that use ITSs in conjunction with classroom instruction, particularly in contexts where ITS use and classroom instruction occur at separate times. We are in the process of creating a dashboard for an ITS that supports step-based problem solving (as many ITSs do [4]). As our initial test bed, we use Lynnette, a simple but highly effective ITS for basic equation solving, built in our lab [5, 6]. Lynnette has been used in a number of middle schools in our region, in research studies with students in grades 6 through 8 (11-14 year olds). However, our goal is to create a dashboard that can be used with any ITS that, like Lynnette, is built within the CTAT/TutorShop environment for authoring and deployment of ITSs [7]. This environment provides both efficient authoring tools and a system for web-based deployment with learning management facilities for teachers. The CTAT/TutorShop environment has previously been used to build many ITSs that have been shown to be effective in classrooms.

In creating the Luna dashboard, we follow a user-centered design approach, grounded in data collected about teacher and student needs. So far, we have used Contextual Inquiry together with other design methods such as Interpretation Sessions and Affinity Diagramming to collect user data from middle-school teachers [10]. We created a medium-fidelity prototype, which we then used in a classroom study with real student data.

We are currently redesigning this early dashboard prototype (see Figure 1) based on extensive feedback and usage data from teachers. We then plan to move to an implementation, within the CTAT/TutorShop environment, of a fully functioning initial version. During the workshop we will show some of the design iterations the dashboard underwent, and we will demo a functioning early prototype. In addition, we will introduce extensions we are making to the CTAT/TutorShop environment, which facilitate the iterative prototyping and deployment of learning analytics tools for ITSs.

Although the notion of a teacher dashboard for advanced learning technologies is not in itself new, our project may have a somewhat unique combination of characteristics. As mentioned, few ITSs have teacher dashboards. Further, we are carefully considering the unique needs of different usage scenarios and designing to address them. Specifically, in our design process we are considering teachers' needs in two usage scenarios within a single project: exploratory/reflective use (a dashboard that offers formative reports, accessible anytime by the teacher) and real-time decision support (a dashboard that helps teachers monitor and help their students during live, in-class use of ITSs). Each of these scenarios leads to different teacher needs and designs. Finally, we ultimately aim to study how teachers use these dashboards, and how student learning is affected by teachers' use of the dashboards.

Figure 1. One of two screens of an interactive, medium-fidelity prototype of the dashboard that was used in a classroom study with data from a teacher's own students. This screen displays information about the performance of the class as a whole, in the form of counts of students who are estimated to have 'mastered' particular skills (top-left), skill levels plotted against amount of practice (right), and prevalence of particular misconceptions (bottom-left). Early design feedback from teachers suggests that they perceive information about student misconceptions to be particularly actionable.
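As a rough illustration of how class-level summaries like those described in the Figure 1 caption might be aggregated from per-student analytics, the sketch below counts the students estimated to have mastered each skill and tallies misconception prevalence across a class. The data shapes, skill and misconception names, and mastery cutoff are hypothetical and do not reflect Luna's actual data model.

<pre>
# Minimal sketch (assumed data shapes, not Luna's code) of aggregating
# per-student estimates into class-level dashboard summaries: counts of
# students at mastery per skill, and prevalence of each misconception.

from collections import Counter

MASTERY_THRESHOLD = 0.95  # hypothetical cutoff for treating a skill as 'mastered'

# Hypothetical per-student estimates, e.g., produced by a student model.
students = {
    "s1": {"skills": {"combine-like-terms": 0.97, "divide-both-sides": 0.62},
           "misconceptions": ["subtracts-from-one-side-only"]},
    "s2": {"skills": {"combine-like-terms": 0.88, "divide-both-sides": 0.96},
           "misconceptions": []},
    "s3": {"skills": {"combine-like-terms": 0.99, "divide-both-sides": 0.91},
           "misconceptions": ["subtracts-from-one-side-only"]},
}

mastery_counts = Counter()
misconception_counts = Counter()

for data in students.values():
    for skill, p_known in data["skills"].items():
        if p_known >= MASTERY_THRESHOLD:
            mastery_counts[skill] += 1
    misconception_counts.update(data["misconceptions"])

print("Students at mastery per skill:", dict(mastery_counts))
print("Misconception prevalence:", dict(misconception_counts))
</pre>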
2. ACKNOWLEDGMENTS
We thank Gail Kusbit, Octav Popescu, Jonathan Sewall, Cindy Tipper, and all participating teachers for their help with this project. The research reported here was supported by NSF Award #1530726 and by the Institute of Education Sciences, U.S. Department of Education, through Grant R305B150008 to Carnegie Mellon University. The opinions expressed are those of the authors and do not represent the views of the Institute or the U.S. Department of Education.

3. REFERENCES
[1] Kelly, K., Heffernan, N., Heffernan, C., Goldman, S., Pellegrino, J., & Goldstein, D. S. (2013). Estimating the effect of web-based homework. In H. C. Lane, K. Yacef, J. Mostow, & P. Pavlik (Eds.), Proceedings of the 16th International Conference on Artificial Intelligence in Education, AIED 2013 (pp. 824-827). Berlin, Heidelberg: Springer.
[2] Arroyo, I., Woolf, B. P., Burleson, W., Muldner, K., Rai, D., & Tai, M. (2014). A multimedia adaptive tutoring system for mathematics that addresses cognition, metacognition and affect. International Journal of Artificial Intelligence in Education, 24(4), 387-426. doi:10.1007/s40593-014-0023-y
[3] Xhakaj, F., Aleven, V., & McLaren, B. M. (2016). How teachers use data to help students learn: Contextual inquiry for the design of a dashboard. To appear in Proceedings of the 11th European Conference on Technology-Enhanced Learning, EC-TEL 2016.
[4] VanLehn, K. (2006). The behavior of tutoring systems. International Journal of Artificial Intelligence in Education, 16(3), 227-265.
[5] Waalkens, M., Aleven, V., & Taatgen, N. (2013). Does supporting multiple student strategies lead to greater learning and motivation? Investigating a source of complexity in the architecture of intelligent tutoring systems. Computers & Education, 60(1), 159-171. doi:10.1016/j.compedu.2012.07.016
[6] Long, Y., & Aleven, V. (2016). Mastery-oriented shared student/system control over problem selection in a linear equation tutor. In A. Micarelli, J. Stamper, & K. Panourgia (Eds.), Proceedings of the 13th International Conference on Intelligent Tutoring Systems, ITS 2016 (pp. 90-100). Springer International Publishing. doi:10.1007/978-3-319-39583-8_9
[7] Aleven, V., McLaren, B. M., Sewall, J., van Velsen, M., Popescu, O., Demi, S., & Koedinger, K. R. (2016). Example-tracing tutors: Intelligent tutor development for non-programmers. International Journal of Artificial Intelligence in Education, 26(1), 224-269.
[8] Koedinger, K. R., & Corbett, A. T. (2006). Cognitive tutors: Technology bringing learning sciences to the classroom. In The Cambridge Handbook of the Learning Sciences (pp. 61-78). New York: Cambridge University Press.
[9] Woolf, B. P. (2009). Building intelligent interactive tutors: Student-centered strategies for revolutionizing e-learning. Burlington, MA: Morgan Kaufmann.
[10] Beyer, H., & Holtzblatt, K. (1997). Contextual design: Defining customer-centered systems. Elsevier.