=Paper=
{{Paper
|id=Vol-1738/IWTA_2016_paper4
|storemode=property
|title=Developing a Teacher Dashboard for Use with Intelligent Tutoring Systems
|pdfUrl=https://ceur-ws.org/Vol-1738/IWTA_2016_paper4.pdf
|volume=Vol-1738
|authors=Vincent Aleven,Franceska Xhakaj,Kenneth Holstein,Bruce M. McLaren
|dblpUrl=https://dblp.org/rec/conf/ectel/AlevenXHM16
}}
==Developing a Teacher Dashboard for Use with Intelligent Tutoring Systems==
Vincent Aleven, Franceska Xhakaj, Kenneth Holstein, Bruce M. McLaren
Human-Computer Interaction Institute, Carnegie Mellon University, Pittsburgh, PA, USA
{aleven, francesx, kjholste, bmclaren}@cs.cmu.edu

ABSTRACT

Many dashboards display analytics generated by educational technologies, but few of them work with intelligent tutoring systems (ITSs). We are creating a teacher dashboard for use with ITSs built and used within our CTAT/Tutorshop infrastructure, an environment for authoring and deploying ITSs. The dashboard will take advantage of the fine-grained interaction data and derived analytics that CTAT-built ITSs produce. We are taking a user-centered design approach in which we target two usage scenarios for the dashboard. In one scenario, a teacher uses the dashboard while helping a class of students working with the tutoring software in the school's computer lab. In the other, the teacher uses the dashboard to prepare for an upcoming class session. So far, we have completed a Contextual Inquiry, ideation, Speed Dating sessions in which teachers evaluated storyboards, usability testing, and a classroom study with a mocked-up version of the dashboard populated with real data from the teacher's current classes and students. We are currently analyzing the data produced in these activities, iterating on the design of the dashboard, and implementing a full version of the dashboard. Unique characteristics of this dashboard may be that it leverages fine-grained interaction data produced by an ITS and that it will be fully integrated with an ITS development and deployment environment, and therefore available for use with many ITSs.

CCS Concepts

• Applied computing → Interactive learning environments

Keywords

Intelligent tutoring systems, learning analytics, user-centered design, dashboards, blended learning, student modeling.

1. INTRODUCTION

In the field of learning analytics, dashboards are often viewed as an important way in which data about students' learning processes can be used to make instruction more effective [18,48]. Dashboards are often used in college-level online courses or blended courses (e.g., [32]). They have also been used to support computer-supported collaborative learning scenarios [24,38], learning with mobile devices [16,25], and tabletop instructional technology [34,44].

Many papers describe dashboard designs and present evidence that users found these designs useful [1,17,22,39]. However, there has been almost no empirical work that shows how teacher dashboards influence student learning. Some studies came close. For example, Lovett, Meyer, and Thille [32] showed that a redesigned college-level online statistics course led to greater and more efficient learning, compared to the original course. The redesign involved adding a new dashboard, but the course was changed in other ways as well, so the better results cannot be attributed solely to the dashboard.

We are creating a dashboard for teachers who use intelligent tutoring software in their classrooms. Intelligent tutoring systems (ITSs) have led to improved learning outcomes in many domains [28,33,40,45-47] but often are not designed to involve teachers. ITSs might be even more effective if they were designed not only to help students directly, but to provide data to teachers to help them help their students. In fact, they already produce a wealth of sophisticated analytics, based on student modeling methods, that might be useful for this purpose. In our current project, we take a user-centered design approach to create a teacher's dashboard for intelligent tutoring software, focusing on realistic classroom scenarios.

The work differs from past work on teacher dashboards in that it focuses on intelligent tutoring technology rather than typical online course materials. This difference is significant because ITSs record student interaction data at a very fine-grained level, enabling advanced student modeling. These models often capture aspects of student knowledge, affect, metacognition, and other variables. However, there are many interesting open questions as to how such a dashboard can be designed to fit with classroom practice and whether teachers can take advantage of it to help their students learn more effectively.

Our project focuses on the following research questions:
1. What up-to-the-minute data about student learning that ITSs can provide is helpful to teachers, and how can it best be presented in an actionable manner?
2. How do teachers use actionable analytics presented in a dashboard to help their students?
3. Do students learn better when their teacher monitors a dashboard and uses it to adjust the instruction?

In the current paper, we report on the steps taken so far in our user-centered design process and on an experimental study for which we have completed data collection. At the time of this writing, we have preliminary answers for the first two questions, and are still working on the third.

2. BACKGROUND: THE CTAT/TUTORSHOP ENVIRONMENT FOR ITS RESEARCH AND DEVELOPMENT

The dashboard we create will be integrated in our general infrastructure for ITS authoring and deployment, the CTAT/Tutorshop infrastructure [7,8]. The CTAT tool suite makes it possible to develop intelligent tutors without programming and to deploy and use them on the web. It is proven and mature, having been used by over 600 authors for projects of various levels of ambition and in a variety of domains. Tutors built with CTAT have been used in at least 50 research studies, most of which took place in real educational settings. The Tutorshop is a learning management system specifically designed to support classroom use of CTAT-built ITSs. It provides teachers with tools for creating class lists, assigning work (i.e., tutor problem sets) to students, and viewing reports on student progress and learning. It hosts a variety of tutors, including Mathtutor [6,9], Lynnette [30,31,49] (see Figure 1), and tutors for genetics problem solving [20], stoichiometry [36,37], decimals [23,35], and fractions [41-43]. Tutorshop is implemented in Ruby on Rails with a database in MySQL. Tutors built in this infrastructure are compatible with DataShop, a large online service that provides data sets and tools for researchers in educational data mining (EDM) [26].
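To make the notion of fine-grained interaction data concrete, the sketch below shows the kind of per-step transaction record that DataShop-style logging [26] captures. This is a minimal illustration in Python; the field names are ours, not the actual DataShop schema.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import List

@dataclass
class TutorTransaction:
    """One student attempt at one step of a tutor problem.

    Illustrative of DataShop-style transaction logs [26];
    field names are hypothetical, not the actual schema.
    """
    student_id: str
    problem_name: str                # e.g., "3x + 5 = 11"
    step_name: str                   # e.g., "subtract 5 from both sides"
    knowledge_components: List[str]  # skills credited/blamed for this step
    outcome: str                     # "CORRECT", "INCORRECT", or "HINT"
    hint_level: int                  # 0 if no hint was requested
    timestamp: datetime
    duration_seconds: float          # time from step start to this attempt
```

Every analytic discussed below can, in principle, be derived from streams of records like these.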
Figure 1. Lynnette is an intelligent tutoring system for basic equation solving, implemented within the CTAT/Tutorshop architecture.

Building on the CTAT/Tutorshop infrastructure facilitates the development of the dashboard, for two reasons. First, any tutor built within this infrastructure generates a wealth of data from which informative analytics can be calculated. Second, the infrastructure is geared towards feeding back information to teachers, though in elaborate reports rather than the use-specific, actionable form we foresee for the dashboard. Importantly, the dashboard and the newly developed learning analytics will become part of the CTAT/Tutorshop infrastructure. Thus, they will be available in many CTAT-built tutors.

In our research, we will use a tutoring system called Lynnette, designed to help 7th and 8th grade students learn basic skills in equation solving [30,31,49] (see Figure 1). As ITSs typically do, Lynnette supports learning by doing. It presents problems that are matched to each individual student's evolving skill level. It also provides detailed, step-by-step guidance as students solve these problems. That is, it gives feedback as students attempt to take steps in each problem. Also, upon request, it gives strategic hints suggesting what transformation to try next, even if the student follows an unusual strategy. Lynnette is flexible enough to follow along with students regardless of what sequence of reasonable transformations they try as they solve equations. Lynnette has been shown in five classroom studies to help students learn effectively [29-31,49].

The idea to build a dashboard was inspired by an informal observation by Yanjin Long, a former PhD student at our institution, during one of her classroom studies with Lynnette. During a session in which middle-school students used Lynnette in their school's computer lab, the teacher of this class, who was walking around in the lab to keep a close eye on how her students were progressing with the tutoring system, repeatedly saw her students make the same error. Although the tutoring software flagged this error and helped students recover, the teacher wisely decided that more was needed. Perhaps key conceptual knowledge was missing. Right then and there, she inserted a brief mini-lesson in front of the lab's white board, explaining not just the correct procedure (as Lynnette would do) but highlighting conceptual background knowledge regarding why this procedure is the way it is and why the error is wrong. This illustrates one of the scenarios for which we are designing the dashboard. The dashboard may make this kind of scenario more frequent and more effective.

3. USER-CENTERED DESIGN

We are implementing a user-centered design process in which we identify needs of teachers in different usage scenarios and design to address these needs. We also explore the utility of analytics currently used for research but not, typically, in practice, such as learning curves [26]: graphs that track the gradual increase in correct execution of targeted knowledge components over successive practice opportunities. We focus on dashboard use within blended courses in which students use intelligent tutoring software several times a week, and in which the remaining classroom time is spent on lectures, group work, and seat work. This approach is typical of Cognitive Tutor courses, a type of ITS that is widely used in American middle schools and high schools [27]. Within this broader context, we focus on two specific scenarios in which a teacher uses the dashboard: exploratory/reflective use of analytics to inform decisions about what to do during subsequent class periods (we refer to this as the "next-day" scenario), and real-time decision support, in which the dashboard displays up-to-the-second analytics as a class of students is working (in the school's computer lab) with the tutoring software (we refer to this as the "on-the-spot" scenario).

So far, we have carried out the following activities:
• Contextual Inquiry with teachers
• Interpretation Sessions and building work models, followed by creating an Affinity Diagram
• Speed Dating to explore design ideas captured in storyboards
• Developing prototype designs
• Prototyping sessions with teachers
• A classroom experiment in which a mocked-up dashboard was populated with real data from the teacher's current classes and students.

Figure 2. A storyboard depicting the focus question (at the top), the storyboard images (the first row), and the description of each image and the story (second row).

A key design challenge is figuring out which of the many analytics that ITSs produce will be most useful for teachers, as well as how they can be presented to teachers in an actionable way. We explore this question throughout the user-centered design process. Below we list possible analytics, to illustrate the range of possibilities (a sketch after the list illustrates how one of them can be computed from the interaction stream). This list was drawn up based on our knowledge of teacher reports in Mathtutor and Cognitive Tutor, our knowledge of the literature on learning analytics and educational data mining, and suggestions from two teachers. Some of these analytics can be distilled or aggregated in a straightforward manner from the interaction stream with an ITS. Others require more sophisticated detectors or metacognitive tutor agents. However, all items listed below are realistic in that they have been demonstrated in prior ITS or EDM work.

• Progress through problem units in the tutoring software
  o Overall progress (e.g., list of units completed)
  o Progress rate (e.g., problem-solving steps completed per unit of time)
  o Progress during the current session or past sessions
  o Progress since a particular benchmark date (suggested by a teacher whom we interviewed)
• Skill mastery and rate of learning
  o Learning curves [26]
  o Skills mastered [19]
  o Skills students are about to start working on
  o Most/least difficult skills, determined through learning curve analysis [26]
  o "Wheel spinning," that is, not learning a skill despite repeated practice [13]
  o Generality of knowledge learned – statistical fit with different knowledge component models may indicate whether students make or miss key generalizations, such as treating constant and variable terms the same where appropriate [3,15]
• Learning behaviors
  o Effective help use [4,5]
  o Frequent use of bottom-out hints (gaming the system) [2,12]
  o Being on/off task [11]
  o Being frustrated or bored frequently (affect) [21]
  o Effort (e.g., evidence of steady work without maladaptive strategies) [10]
  o Being stuck on a problem for a long time (brought up by one of the interviewed teachers)
• Where are the challenges for students?
  o Which problem types, problems, or steps are hardest? (suggested by one of the interviewed teachers)
  o Which problems are harder than the most similar problems?
  o Which error types are most frequent across problems?
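As an illustration of how such analytics can be distilled from the interaction stream, the sketch below computes empirical learning curves in the spirit of [26]: for each knowledge component, the error rate at each successive practice opportunity, aggregated over students. It assumes the hypothetical TutorTransaction records sketched in Section 2 and follows the common EDM convention of counting only a student's first attempt at each step.

```python
from collections import defaultdict

def learning_curves(transactions):
    """For each knowledge component (KC), compute the error rate at each
    successive practice opportunity, aggregated over students.

    `transactions` must be sorted by timestamp; only first attempts at a
    step are counted, per common EDM practice. Illustrative sketch only.
    """
    opportunity = defaultdict(int)  # (kc, student) -> opportunities so far
    # kc -> opportunity index -> [errors, attempts]
    counts = defaultdict(lambda: defaultdict(lambda: [0, 0]))
    seen_steps = set()

    for tx in transactions:
        key = (tx.student_id, tx.problem_name, tx.step_name)
        if key in seen_steps:
            continue  # not a first attempt at this step
        seen_steps.add(key)
        is_error = tx.outcome in ("INCORRECT", "HINT")
        for kc in tx.knowledge_components:
            opp = opportunity[(kc, tx.student_id)]
            opportunity[(kc, tx.student_id)] += 1
            counts[kc][opp][1] += 1
            if is_error:
                counts[kc][opp][0] += 1

    return {kc: [errors / attempts
                 for errors, attempts in (opps[i] for i in sorted(opps))]
            for kc, opps in counts.items()}
```

A smoothly declining curve suggests students are learning the knowledge component; a flat curve flags a skill worth a teacher's attention (the "most/least difficult skills" item above).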
3.1 Contextual Inquiry

We started with Contextual Inquiry sessions to investigate how teachers currently use data in order to inform their pedagogical decisions. Contextual Inquiry is a form of semi-structured interview within the context of a specific task [14]. The participants were 6 middle school teachers in 3 schools. We collected a total of 11.5 hours of video data. Some of our main findings were that teachers use data extensively, often analytics they generate themselves. These analytics influence their decisions both at the class level and the individual level. We also found that teachers paid a great amount of attention to student errors, perhaps because (in a domain such as algebra) errors tend to be very actionable (e.g., the teacher might discuss the given error in class). The methods, data, and findings are described in more detail in [50].

Figure 3. A medium-fidelity prototype created using Contextual Inquiry and Speed Dating data. It displays information (from top to bottom, left to right) on the number of students who have mastered each skill or have misconceptions, skill mastery and misconceptions per student, average skill mastery plotted against average amount of practice, and student time in tutor plotted against student progress.

3.2 Ideation and Speed Dating Through Storyboarding

Following Contextual Inquiry, we generated broad design concepts and created storyboards that captured them in the form of illustrated stories addressing a central question (see Figure 2). These storyboards were then reviewed with teachers during Speed Dating sessions: high-paced sessions in which each teacher gave their quick impressions of each of the storyboards.

We conducted Speed Dating with 3 middle-school teachers from two suburban, medium-achieving schools (2 male teachers from one school, 1 female teacher from the other). We created 22 storyboards with focus questions that aimed to explore different types of data that the teacher might need in the dashboard but currently does not have, such as wheel-spinning information (e.g., "Does information on students' wheel spinning in the tutor help guide your instruction?"). The questions also focused on whether the data should be shown at the class or the individual level (as shown in Figure 2), and how this data could help the teacher drive and differentiate instruction (e.g., "What notes and reminders from the dashboard help you make decisions as you prepare for the next class?"). Lastly, we wanted to test some futuristic ideas, in particular regarding the power separation between the teacher and the dashboard. From Speed Dating we found that teachers think it would be useful to see data and analytics provided by ITSs that they do not currently have, such as wheel-spinning information. In addition, we found that teachers like to have power over the dashboard and its decisions, and would not want the dashboard to have full control or power over the students.
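Wheel-spinning information, which teachers singled out as useful, can be flagged directly from the interaction stream. The sketch below is a heuristic in the spirit of Beck and Gong [13], who treat a skill as mastered after a run of consecutive correct first attempts; the cutoff values here are illustrative placeholders, whereas [13] fits its criteria from data.

```python
def is_wheel_spinning(first_attempt_correct, mastery_run=3, cutoff=10):
    """Heuristic wheel-spinning flag in the spirit of Beck & Gong [13].

    first_attempt_correct: booleans, one per successive practice
    opportunity for one (student, skill) pair, in chronological order.
    Returns True if the student has had many opportunities without ever
    reaching the mastery criterion. Cutoff values are illustrative only.
    """
    run = 0
    for correct in first_attempt_correct:
        run = run + 1 if correct else 0
        if run >= mastery_run:
            return False  # mastery criterion reached: not wheel spinning
    return len(first_attempt_correct) >= cutoff
```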
3.3 Prototyping

Based on our findings from Contextual Inquiry and Speed Dating, we created an initial medium-fidelity prototype of the dashboard for use in the next-day scenario (shown in Figure 3). Recall that in this scenario, the teacher uses the dashboard "offline" (i.e., outside of class) to prepare for an upcoming class session.

We conducted prototyping sessions with this medium-fidelity prototype with three middle-school faculty (two teachers, one educational technology specialist), in which we showed them a paper printout of this prototype and asked them to pretend they were preparing for a next-day lecture, while also "thinking aloud" as they walked through the interface. We also encouraged the participating teachers to ask the interviewer questions about any components of the dashboard interface that they did not understand, as well as to provide criticism and generate design alternatives (e.g., by drawing on the mockup). The interviewer also asked for elaborations throughout each prototyping session, based on the participants' questions and feedback. For example, two teachers requested that the dashboard generate high-level summaries (e.g., lists displaying the students, skills, and misconceptions that most require the teacher's attention) to help teachers reach actionable insights more quickly. In each case, however, further discussion suggested that these teachers would find it difficult to trust such summaries without being able to view the "raw data" upon which these summaries were based, or to better understand how these summaries were generated. We are currently in the process of analyzing data from these prototyping sessions, to inform future design iterations. We are also conducting additional Speed Dating sessions to inform the design of a dashboard used in the on-the-spot scenario. In our current Speed Dating sessions, we are exploring the potential usefulness of a broader range of analytics, while also exploring some of the interesting tensions and trade-offs that teachers highlighted during our previous Speed Dating and prototyping sessions.

Figure 4. One of the two screens of the high-fidelity prototype of the dashboard that was used in a classroom study with real student data from the teacher's current classes. This screen displays information about the performance of the class as a whole, in the form of the number of students who have mastered each skill (top-left), average skill mastery plotted against average amount of practice (right), and prevalence of particular misconceptions (bottom-left).

3.4 Classroom Evaluation Study with Dashboard Mockup and Real Data

Finally, we conducted a classroom evaluation study to test out our initial design for a dashboard for the next-day scenario. As mentioned, in this scenario, a teacher uses the dashboard to plan what to do the next day in class, or the next day that the class will be in the computer lab working with the tutoring software.

We iterated on the medium-fidelity design of the dashboard based on feedback from a design professor at our institution, and created a high-fidelity design of the dashboard (as shown in Figure 4). This high-fidelity design has separate screens for class and individual level information; both screens display information about students' skills and categories of errors. These design decisions were grounded in the data gathered during the Contextual Inquiry and Speed Dating sessions. In this study, we used the high-fidelity design of the dashboard mocked up with Tableau, a data visualization tool (http://www.tableau.com/). Using Tableau, we created a realistic-looking dashboard with very limited interactive capabilities (e.g., tooltips) but without hooking up the dashboard to the Tutorshop backend. We populated the dashboard with real data from the teacher's current classes and students, but did so through a combination of Python scripts, Excel use, and Tableau code.
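The paper does not detail this data-preparation pipeline, but a minimal sketch of the kind of Python step involved might aggregate per-(student, skill) records into a flat CSV that Tableau can ingest. The file layout and the `estimate_mastery` callback are hypothetical.

```python
import csv
from collections import defaultdict

def export_mastery_csv(transactions, estimate_mastery, out_path="mastery.csv"):
    """Flatten per-(student, skill) transaction histories into a CSV for
    a Tableau mockup. `estimate_mastery` is any function mapping a list
    of transactions to a 0-1 mastery estimate. Hypothetical sketch; not
    the actual pipeline used in the study.
    """
    by_pair = defaultdict(list)
    for tx in sorted(transactions, key=lambda t: t.timestamp):
        for kc in tx.knowledge_components:
            by_pair[(tx.student_id, kc)].append(tx)

    with open(out_path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["student_id", "skill", "opportunities", "mastery"])
        for (student, skill), txs in sorted(by_pair.items()):
            writer.writerow([student, skill, len(txs),
                             round(estimate_mastery(txs), 3)])
```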
Our goal for the study was to (1) understand how teachers use actionable analytics presented in a dashboard to drive their instruction and (2) explore whether students learn better when the teacher uses a dashboard to monitor their performance and adjust instruction. At the time of this writing, we have completed the data collection and are starting to analyze the data.

We conducted the classroom evaluation study with 5 teachers from two different suburban, medium-achieving schools in our area. The 2 teachers from the first school participated with 3 of their classes each, while the 3 teachers from the other school participated with 2, 4, and 5 of their classes, respectively. Students were required to take a 20-minute pre-test, followed by 1.5 periods of work with Lynnette (1 period is 40 min) and a 20-minute mid-test. Each teacher was given 20 minutes to prepare for a full class period, and their classes were assigned in counterbalanced fashion to the experimental or control condition. After the teacher conducted the lecture, students took a 20-minute post-test, followed by a delayed post-test one week after the lecture.

The sole difference between the two conditions was whether or not the teacher had the dashboard available during their 20-minute class preparation session. In the experimental condition, teachers were shown two next-day dashboards, one with overall class-level information (as shown in Figure 4) and another one with individual-level information. We asked them to prepare for class using the two dashboards as they saw appropriate. In the control condition, teachers were not given any information on their students' performance and were asked to prepare as they normally would for the topic of Linear Equations in middle-school mathematics.
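The skill-mastery displays in both dashboards (e.g., the number of students who have mastered each skill in Figure 4) rest on per-student mastery estimates of the kind that knowledge tracing [19] provides. As a reference point, here is a minimal sketch of the Bayesian Knowledge Tracing update; the parameter values are illustrative placeholders (in practice they are fit per skill from data), and the paper does not specify the student model used.

```python
def bkt_update(p_known, correct, p_slip=0.1, p_guess=0.2, p_transit=0.15):
    """One Bayesian Knowledge Tracing update [19].

    p_known: prior probability that the student knows the skill.
    correct: whether the first attempt at this step was correct.
    Parameter values are illustrative; in practice they are fit per skill.
    """
    if correct:
        evidence = p_known * (1 - p_slip)
        marginal = evidence + (1 - p_known) * p_guess
    else:
        evidence = p_known * p_slip
        marginal = evidence + (1 - p_known) * (1 - p_guess)
    posterior = evidence / marginal
    # Allow for learning the skill on this practice opportunity.
    return posterior + (1 - posterior) * p_transit

# Example: starting from a prior of 0.3, two correct answers in a row.
p = 0.3
for outcome in (True, True):
    p = bkt_update(p, outcome)
# A skill is commonly flagged "mastered" once p exceeds a threshold
# such as 0.95 [19].
```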
4. DISCUSSION AND CONCLUSION

Teacher dashboards are emerging as a key way in which learning analytics might have a positive influence on educational practice. Although by now many dashboards have been created, we know of few projects that have focused on creating a dashboard for ITSs. These systems produce rich interaction data. Many analytics derived from these data have been used in research (e.g., in the EDM community), but use in a teacher dashboard is less common. There are many interesting open questions regarding whether and how analytics used in ITS research might be useful for teachers and in what form they need to be presented to be easily understood and actionable. We explore this question through a user-centered design approach, combined with experimental classroom studies. We consider multiple usage scenarios, focused on supporting teacher decision-making and self-reflection in blended learning environments that use intelligent tutoring software. Another aspect of our project that is somewhat unusual in comparison to other dashboard projects is that we are creating a dashboard for use in schools, rather than for the college level.

A technical challenge of the current project is that we are implementing a dashboard for a general infrastructure for ITS research and development: the CTAT/Tutorshop infrastructure. This means that, by and large, the dashboard we create will be general to all intelligent tutors created within this infrastructure. It may thus become a testbed for further research into teacher dashboards for blended courses that use intelligent tutoring software.

Our ongoing work focuses on the design for the "on-the-spot" usage scenario, in which the teacher uses the dashboard while the students (as a class) are working with the tutoring software. We are following the same approach as described above, soliciting teacher feedback on storyboards and increasingly sophisticated prototypes. We expect this design to be substantially different from that of the dashboard designed for the next-day scenario. Identifying these differences may be a research contribution in itself. We are currently analyzing the feedback and results of the experimental study presented above. These results will inform a planned redesign of the dashboard for the next-day scenario.
In parallel, we are working to create the dashboard front end and integrate it with the CTAT/Tutorshop infrastructure. We are using Ember.js as our framework for the front end. On the back end, we are building on the existing Ruby on Rails CTAT/Tutorshop infrastructure and the MySQL database. Our aim in extending Tutorshop is to (a) support additional analytics we intend to display on the dashboard, (b) provide updates to the dashboard in real time, and (c) allow for relatively easy plug-in of additional "detectors" (e.g., detectors of students' help-use behavior and affective states). The latter is one way in which a dashboard project can push an ITS architecture towards wider functionality and generality.
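To make aim (c) concrete, one plausible shape for such a plug-in interface is a detector that consumes the live transaction stream and maintains per-student state for the dashboard to poll. This is purely illustrative, since the paper does not specify the Tutorshop extension API.

```python
class Detector:
    """Illustrative detector plug-in interface (not the actual Tutorshop
    API): consume the live transaction stream and expose per-student
    state for the dashboard to poll."""
    name = "abstract"

    def observe(self, tx):        # called once per incoming transaction
        raise NotImplementedError

    def state(self, student_id):  # polled by the dashboard
        raise NotImplementedError


class HintAbuseDetector(Detector):
    """Flags heavy hint use, a proxy for 'gaming the system' [2,12].
    Window size and threshold are illustrative placeholders."""
    name = "hint-abuse"

    def __init__(self, window=20, max_hint_share=0.5):
        self.window = window
        self.max_hint_share = max_hint_share
        self.recent = {}  # student_id -> recent outcomes (True = hint)

    def observe(self, tx):
        buf = self.recent.setdefault(tx.student_id, [])
        buf.append(tx.outcome == "HINT")
        del buf[:-self.window]  # keep only the sliding window

    def state(self, student_id):
        buf = self.recent.get(student_id, [])
        return bool(buf) and sum(buf) / len(buf) > self.max_hint_share
```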
Finally, we are planning to evaluate both dashboards (for the next-day and on-the-spot scenarios) through experimental studies in real classroom environments. In these studies, we will test whether a teacher dashboard can lead to increased learning gains on students' work in an ITS, through teacher intervention informed by the dashboard. Thus far, very little research has attempted to evaluate learning gains attributable to teacher dashboards.

5. ACKNOWLEDGMENTS

We thank Gail Kusbit, Octav Popescu, Jonathan Sewall, Cindy Tipper, and all participating teachers for their help with this project. The research reported here was supported by NSF Award #1530726 and by the Institute of Education Sciences, U.S. Department of Education, through Grant R305B150008 to Carnegie Mellon University. The opinions expressed are those of the authors and do not represent the views of the Institute or the U.S. Department of Education.

6. REFERENCES
[1] Abel, T.D. and Evans, M.A. 2013. Cross-disciplinary participatory & contextual design research: Creating a teacher dashboard application. Interaction Design and Architecture Journal 19, 63-78.
[2] Aleven, V. and Koedinger, K.R. 2000. Limitations of student control: Do students know when they need help? In Proceedings of the 5th International Conference on Intelligent Tutoring Systems, ITS 2000, Springer, Berlin, 292-303.
[3] Aleven, V. and Koedinger, K.R. 2013. Knowledge component approaches to learner modeling. In Design Recommendations for Adaptive Intelligent Tutoring Systems, US Army Research Laboratory, Orlando, FL, 165-182.
[4] Aleven, V., McLaren, B.M., Roll, I., and Koedinger, K.R. 2006. Toward meta-cognitive tutoring: A model of help seeking with a cognitive tutor. International Journal of Artificial Intelligence in Education 16, 101-128.
[5] Aleven, V., McLaren, B.M., Roll, I., and Koedinger, K.R. 2016. Help helps, but only so much: Research on help seeking with intelligent tutoring systems. International Journal of Artificial Intelligence in Education 26, 1, 205-223.
[6] Aleven, V., McLaren, B.M., and Sewall, J. 2009. Scaling up programming by demonstration for intelligent tutoring systems development: An open-access web site for middle school mathematics learning. IEEE Transactions on Learning Technologies 2, 2, 64-78.
[7] Aleven, V., McLaren, B.M., Sewall, J., and Koedinger, K.R. 2009. A new paradigm for intelligent tutoring systems: Example-tracing tutors. International Journal of Artificial Intelligence in Education 19, 2, 105-154.
[8] Aleven, V., McLaren, B.M., Sewall, J., van Velsen, M., et al. 2016. Example-tracing tutors: Intelligent tutor development for non-programmers. International Journal of Artificial Intelligence in Education 26, 1, 224-269.
[9] Aleven, V. and Sewall, J. 2016. The frequency of tutor behaviors: A case study. In Proceedings of the 13th International Conference on Intelligent Tutoring Systems, ITS 2016, Springer International Publishing, 396-401.
[10] Arroyo, I., Woolf, B.P., Burleson, W., Muldner, K., et al. 2014. A multimedia adaptive tutoring system for mathematics that addresses cognition, metacognition and affect. International Journal of Artificial Intelligence in Education 24, 4, 387-426.
[11] Baker, R.S.J.d. 2007. Modeling and understanding students' off-task behavior in intelligent tutoring systems. In Proceedings of ACM CHI 2007: Computer-Human Interaction, 1059-1068.
[12] Baker, R., Walonoski, J., Heffernan, N., Roll, I., et al. 2008. Why students engage in "gaming the system" behavior in interactive learning environments. Journal of Interactive Learning Research 19, 2, 185-224.
[13] Beck, J.E. and Gong, Y. 2013. Wheel-spinning: Students who fail to master a skill. In Proceedings of the 16th International Conference on Artificial Intelligence in Education, AIED 2013, Springer, Berlin Heidelberg, 431-440.
[14] Beyer, H. and Holtzblatt, K. 1997. Contextual Design: Defining Customer-Centered Systems. Morgan Kaufmann, San Francisco, CA.
[15] Cen, H., Koedinger, K., and Junker, B. 2006. Learning Factors Analysis: A general method for cognitive model evaluation and improvement. In Proceedings of the 8th International Conference on Intelligent Tutoring Systems, ITS 2006, Springer, Berlin, 164-175.
[16] Chen, G.D., Chang, C.K., and Wang, C.Y. 2008. Ubiquitous learning website: Scaffold learners by mobile devices with information-aware techniques. Computers & Education 50, 1, 77-90.
[17] Clarke, J. and Dede, C. 2009. Design for scalability: A case study of the River City curriculum. Journal of Science Education and Technology 18, 4, 353-365.
[18] Clow, D. 2012. The learning analytics cycle: Closing the loop effectively. In Proceedings of the 2nd International Conference on Learning Analytics and Knowledge, ACM, New York, 134-138.
[19] Corbett, A.T. and Anderson, J.R. 1995. Knowledge tracing: Modeling the acquisition of procedural knowledge. User Modeling and User-Adapted Interaction 4, 4, 253-278.
[20] Corbett, A., Kauffman, L., MacLaren, B., Wagner, A., and Jones, E. 2010. A Cognitive Tutor for genetics problem solving: Learning gains and student modeling. Journal of Educational Computing Research 42, 2, 219-239.
[21] Baker, R.S.J.d., Gowda, S.M., Wixon, M., Kalka, J., et al. 2012. Towards sensor-free affect detection in Cognitive Tutor Algebra. In Proceedings of the 5th International Conference on Educational Data Mining, EDM 2012, International Educational Data Mining Society, Worcester, MA, 126-133.
[22] Duval, E. 2011. Attention please! Learning analytics for visualization and recommendation. In Proceedings of the 1st International Conference on Learning Analytics and Knowledge, ACM, 9-17.
[23] Forlizzi, J., McLaren, B.M., Ganoe, C., McLaren, P.B., et al. 2014. Decimal Point: Designing and developing an educational game to teach decimals to middle school students. In 8th European Conference on Games Based Learning, ECGBL 2014, Academic Conferences and Publishing International, Reading, UK, 128-135.
[24] De Groot, R., Drachman, R., Hever, R., Schwarz, B.B., et al. 2007. Computer supported moderation of e-discussions: The ARGUNAUT approach. In Proceedings of the 8th International Conference on Computer Supported Collaborative Learning, International Society of the Learning Sciences, 168-170.
[25] Kamin, S.N., Capitanu, B., Twidale, M., and Peiper, C. 2009. A teacher's dashboard for a high school algebra class. In The Impact of Tablet PCs and Pen-based Technology on Education: Evidence and Outcomes, Purdue University Press, 63-71.
[26] Koedinger, K.R., Baker, R.S.J.d., Cunningham, K., Skogsholm, A., et al. 2010. A data repository for the EDM community: The PSLC DataShop. In Handbook of Educational Data Mining, CRC Press, Boca Raton, FL, 43-55.
[27] Koedinger, K.R. and Corbett, A.T. 2006. Cognitive tutors: Technology bringing learning sciences to the classroom. In The Cambridge Handbook of the Learning Sciences, Cambridge University Press, New York, 61-78.
[28] Kulik, J.A. and Fletcher, J.D. 2015. Effectiveness of intelligent tutoring systems. Review of Educational Research, advance online publication, 0034654315581420.
[29] Long, Y. and Aleven, V. 2013. Supporting students' self-regulated learning with an open learner model in a linear equation tutor. In Proceedings of the 16th International Conference on Artificial Intelligence in Education, AIED 2013, Springer, Berlin, 249-258.
[30] Long, Y. and Aleven, V. 2016. Mastery-oriented shared student/system control over problem selection in a linear equation tutor. In Proceedings of the 13th International Conference on Intelligent Tutoring Systems, ITS 2016, Springer International Publishing, 90-100.
[31] Long, Y. and Aleven, V. 2014. Gamification of joint student/system control over problem selection in a linear equation tutor. In Proceedings of the 12th International Conference on Intelligent Tutoring Systems, ITS 2014, Springer, New York, 378-387.
[32] Lovett, M., Meyer, O., and Thille, C. 2008. The Open Learning Initiative: Measuring the effectiveness of the OLI statistics course in accelerating student learning. Journal of Interactive Media in Education 2008, 1.
[33] Ma, W., Adesope, O.O., Nesbit, J.C., and Liu, Q. 2014. Intelligent tutoring systems and learning outcomes: A meta-analysis. Journal of Educational Psychology 106, 4, 901.
[34] Martinez Maldonado, R., Kay, J., Yacef, K., and Schwendimann, B. 2012. An interactive teacher's dashboard for monitoring groups in a multi-tabletop learning environment. In Proceedings of the 11th International Conference on Intelligent Tutoring Systems, ITS 2012, Springer, Berlin Heidelberg, 482-492.
[35] McLaren, B.M., Adams, D.M., and Mayer, R.E. 2015. Delayed learning effects with erroneous examples: A study of learning decimals with a web-based tutor. International Journal of Artificial Intelligence in Education 25, 4, 520-542.
[36] McLaren, B.M., DeLeeuw, K.E., and Mayer, R.E. 2011. A politeness effect in learning with web-based intelligent tutors. International Journal of Human-Computer Studies 69, 1-2, 70-79.
[37] McLaren, B.M., DeLeeuw, K.E., and Mayer, R.E. 2011. Polite web-based intelligent tutors: Can they improve learning in classrooms? Computers & Education 56, 3, 574-584.
[38] McLaren, B.M., Scheuer, O., and Mikšátko, J. 2010. Supporting collaborative learning and e-discussions using artificial intelligence techniques. International Journal of Artificial Intelligence in Education 20, 1, 1-46.
[39] Mottus, A., Kinshuk, Graf, S., and Chen, N.-S. 2013. Visualization and interactivity in the teacher decision support system. In 2013 IEEE 13th International Conference on Advanced Learning Technologies, IEEE, 502-503.
[40] Pane, J.F., Griffin, B.A., McCaffrey, D.F., and Karam, R. 2014. Effectiveness of Cognitive Tutor Algebra I at scale. Educational Evaluation and Policy Analysis 36, 2, 127-144.
[41] Rau, M., Aleven, V., and Rummel, N. 2013. Interleaved practice in multi-dimensional learning tasks: Which dimension should we interleave? Learning and Instruction 23, 98-114.
[42] Rau, M.A., Aleven, V., and Rummel, N. 2015. Successful learning with multiple graphical representations and self-explanation prompts. Journal of Educational Psychology 107, 1, 30-46.
[43] Rau, M.A., Aleven, V., Rummel, N., and Pardos, Z. 2014. How should intelligent tutoring systems sequence multiple graphical representations of fractions? A multi-methods study. International Journal of Artificial Intelligence in Education 24, 2, 125-161.
[44] Son, L.H. 2012. Supporting Reflection and Classroom Orchestration with Tangible Tabletops. Unpublished Ph.D. thesis, École Polytechnique Fédérale de Lausanne (EPFL), Lausanne.
[45] Steenbergen-Hu, S. and Cooper, H. 2013. A meta-analysis of the effectiveness of intelligent tutoring systems on K–12 students' mathematical learning. Journal of Educational Psychology 105, 4, 970-987.
[46] Steenbergen-Hu, S. and Cooper, H. 2014. A meta-analysis of the effectiveness of intelligent tutoring systems on college students' academic learning. Journal of Educational Psychology 106, 2, 331-347.
[47] VanLehn, K. 2011. The relative effectiveness of human tutoring, intelligent tutoring systems, and other tutoring systems. Educational Psychologist 46, 4, 197-221.
[48] Verbert, K., Govaerts, S., Duval, E., Santos, J.L., et al. 2013. Learning dashboards: An overview and future research opportunities. Personal and Ubiquitous Computing 18, 6, 1499-1514.
[49] Waalkens, M., Aleven, V., and Taatgen, N. 2013. Does supporting multiple student strategies lead to greater learning and motivation? Investigating a source of complexity in the architecture of intelligent tutoring systems. Computers & Education 60, 1, 159-171.
[50] Xhakaj, F., Aleven, V., and McLaren, B.M. 2016. How teachers use data to help students learn: Contextual inquiry for the design of a dashboard. In Proceedings of the 11th European Conference on Technology Enhanced Learning, EC-TEL 2016.