LiveHint: Intelligent Digital Support for Analog Learning Experiences

Joshua D. Fisher, Stephen E. Fancsali, Amy Jones Lewis, Victoria Anne Fisher, Robert G.M. Hausmann, Martina Pavelko, Sandy Bartle Finocchi, Steven Ritter

Carnegie Learning, Inc., Pittsburgh PA 15219, USA
{jfisher, sfancsali, alewis, vfisher, bhausmann, mpavelko, sandy, sritter}@carnegielearning.com

Abstract. Among the critical expectations that learning stakeholders have for K-12 curriculum providers during the current global pandemic are that they provide: (a) support to rectify students' learning loss, (b) resources to help parents support student learning, and (c) greater access to open educational resources. We introduce a mobile-friendly, digital support for analog learning experiences called LiveHint, which currently supports students as they work on assignments in Carnegie Learning's physical worktexts via a chatbot with access to thousands of context-sensitive hints. In addition to expanding the number of courses supported by LiveHint, we discuss possibilities for expanding the scope of activities supported by LiveHint within Carnegie Learning's existing content. We also lay out possibilities for expanding the approach beyond Carnegie Learning's content to teacher-created artifacts (e.g., custom worksheets), hand-offs between instructional modalities, and potential research use-cases for data collected from such a platform.

Keywords: Hints, Homework, Textbooks, Dialogue-Based Tutoring, Intelligent Tutoring Systems, Parent Support, Caregiver Support, K-12 Pandemic Response, K-12 COVID-19 Response, Mathematics Education.

Copyright © 2020 for this paper by its authors. Use permitted under Creative Commons License Attribution 4.0 International (CC BY 4.0).

1 Introduction

1.1 Equity & Support for Learning in a Global Pandemic

During the flu pandemic of 1918, schools in the United States closed for as many as fifteen weeks [1, 2]. During that time, some teachers likely sent home assignments, and students could practice with the learning materials of the time (e.g., practicing math by writing with chalk or charcoal on a slate), but, by and large, time devoted to chores or to paid work (prior to changes in child-labor laws made in subsequent decades) was perceived as more valuable for both students and their families [3]. Despite substantial increases in the technology used for learning and the extent to which K-12 education is prioritized in the 21st century, many solutions deployed by schools, both under normal operations and in response to the COVID-19 pandemic, are necessarily analog (e.g., paper worksheets, textbook work, and work in consumable worktexts like Carnegie Learning's MATHbook), owing especially to disparities in access to certain technologies (e.g., laptop computers and broadband internet access).

Whether digital or analog, solutions provided by curriculum providers and educational technology developers face serious challenges as learning transitions from largely synchronous K-12 classroom environments to more asynchronous, remote learning contexts. Nascent research efforts are only beginning to understand this transition (e.g., [4, 5]).
Among the most important expectations for curriculum providers, according to a recent survey of 900 teachers and administrators conducted by EdWeek Market Brief [5], are "support to make up for students' learning loss," "resources to help parents support student learning," "creating more opportunities for equitable learning," and "greater access to open educational resources." Any technology that claims to meet these expectations must be delivered on platforms that are sensitive to persistent disparities in the availability of online access and digital technology between lower- and higher-income households. Only around 54% of lower-income Americans had access to a desktop or laptop computer in the home in 2019, while 71% had access to smartphones, and the percentage of these Americans who rely on their smartphone to access the internet has more than doubled over the past six years [6]. Given the more widespread adoption of mobile technologies (e.g., smartphones with some form of internet access), Carnegie Learning seeks to develop more equitable technological solutions to support learning during such analog experiences, not only to meet the immediate need created by the novel coronavirus but also to enhance learning (and our understanding of remote learning) more generally.

Capitalizing on this relatively broad access to internet-connected smartphones, in what follows we describe a dialogue-driven, mobile application called LiveHint. Carnegie Learning deployed LiveHint as a rapid-response solution to support student work (as well as caregivers supporting those students) with its MATHbook worktexts, which are an integral part of its widely deployed blended curriculum solutions that include the MATHia intelligent tutoring system (formerly Cognitive Tutor) [7]. In addition to describing the current deployment of LiveHint, we describe important avenues for future research and development, including ways that LiveHint points toward a new conception of textbooks that combines the familiarity and convenience of traditional analog text with the flexibility and support provided by intelligent tutoring systems.

1.2 Carnegie Learning's MATHbook & Worktexts

Carnegie Learning's print worktexts and e-textbooks (collectively referred to as MATHbook) are designed to promote active learning and are structured to allow students to collaboratively engage with others, think critically, and gain a deeper understanding of math at the middle school and high school levels. Teacher support materials for these write-in consumable worktexts or e-books include topic introductions, pacing support tools, suggestions for grouping students, and recommendations for how to connect group and individual learning. Digital slide decks are available for each lesson for teachers to use in class or in remote-learning settings. In addition, a Skills Practice companion to the basal worktext provides targeted practice of skills and mathematical concepts for each topic.

In Carnegie Learning's recommended, blended implementation, collaborative student work guided by instructors and MATHbook is coupled with individual work in the MATHia adaptive software in a 60%-40% time split (60% collaborative work and 40% individual work in MATHia).
The current global pandemic has upended the extent to which this blended implementation can be realized in face-to-face settings, but various technologies (e.g., remote conferencing and video chat software, among others) provide means by which key facets of these blended implementations can be maintained and, hopefully, flourish.

1.3 Intelligent Digital Support for Traditional Learning Resources

To help address remote-learning needs in response to the global pandemic, Carnegie Learning introduced direct-instruction videos for each worktext lesson, openly available to support students and parents at home, along with other resources collected in an @Home Learning Library. In addition, a "live coach on-call" human tutor has been made openly available to students and parents via email or text. Teacher support delivered by Carnegie Learning's professional learning and development experts now includes virtual 1:1 coaching and troubleshooting. In addition to these resources, Carnegie Learning seeks to provide more support for students (and the parents or caregivers supporting them), especially those students and families experiencing poverty, as they use physical worktexts, possibly in the absence of a laptop computer or broadband internet access. To further enhance the experience of learners using print worktexts, we designed LiveHint, a digital companion to textbook homework assignments.

2 LiveHint

Lessons in Carnegie Learning's MATHbook generally start by tapping into students' prior knowledge and using worked examples and step-by-step instruction to support students as they build concepts. Next, tasks transition to challenging students to reason about the implications of what they just learned and to make connections that go further. When students work on homework assignments associated with lessons, they need to consolidate what they've learned and embed it within practice, which is provided as a section of MATHbook homework assignments. While working in this practice section, students need support as they strengthen and maintain already existing, but brittle and weakly constructed, problem-solving schemata [8].

LiveHint supports students as they work on assignments by first reiterating the directions and then providing three to five hints on each practice problem in the homework assignment via a chatbot. Students can access these hints on a smartphone, desktop or laptop computer, or tablet, or, in the future, potentially through smart speaker devices. A desktop/laptop demo is also available [9]. The initial release of LiveHint covers five of Carnegie Learning's standard middle/high school courses (Grade 6, Grade 7, Grade 8, Algebra I, and Integrated Math I, with hints for the latter deployed several weeks after those for Grades 6-8 and Algebra I) and includes nearly 3,900 hints for close to 1,000 homework practice questions. In the 2021-2022 school year, students will be able to access LiveHint via QR codes printed in their worktexts; such QR code functionality also opens up opportunities to digitally support crowd-sourced (especially teacher-created) content.

Hint authoring for LiveHint is informed by research [10, 11] suggesting that effective hints:
• are concrete,
• model the process of determining the sought-after solution,
• suggest courses of action rather than just state general principles, and
• are specifically related to the problem at hand.
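At its core, serving these authored hints amounts to a lookup from a referenced practice question to an ordered sequence of three to five hints, delivered one at a time. The following minimal sketch illustrates one way such a question-keyed hint bank might be represented and queried; the class, function, and identifier names are hypothetical and do not describe LiveHint's actual implementation.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class PracticeQuestion:
    """One MATHbook practice question and its authored hint sequence."""
    question_id: str                                # identifier the student references in the chat
    directions: str                                 # directions the chatbot reiterates first
    hints: list[str] = field(default_factory=list)  # three to five ordered, authored hints

# Hypothetical in-memory hint bank keyed by question identifier.
HINT_BANK: dict[str, PracticeQuestion] = {}

def get_directions(question_id: str) -> str:
    """First chatbot turn: restate the directions for the referenced question."""
    return HINT_BANK[question_id].directions

def get_next_hint(question_id: str, hints_already_seen: int) -> Optional[str]:
    """Return the next authored hint in sequence, or None once hints are exhausted."""
    hints = HINT_BANK[question_id].hints
    return hints[hints_already_seen] if hints_already_seen < len(hints) else None
```

In such a scheme, the QR code printed next to an assignment could simply encode the relevant question identifiers, so that scanning it lands the student directly in the right hint sequence.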
When they are free to do so, students often do not ask for assistance (e.g., after making an error [12]), and students often do not realize what kind of help they should seek (e.g., hints vs. answers [13]). In their conversation with the LiveHint chatbot, students do not enter a natural-language query but simply reference the question they are working on, and the hints developed for that question are provided. Students are then given the opportunity to rate each hint according to how useful they found it. Math experts, practitioners, instructional designers, and researchers can respond to usage and rating data, along with written feedback, to improve the quality of the hints provided. We return to the potential of this continuous improvement process later.

Providing hints in assignments may help induce both teachers and students who might not otherwise have done so to provide or engage in post-lesson retrieval opportunities, regardless of ability. Vaughn and Kornell [14], for example, found that students provided with hints were more likely to engage in retrieval rather than restudy; in contrast, students who were not provided hints chose to restudy. Thus, providing hints may induce students to use a more effective (but more difficult) study technique [15].

2.1 A LiveHint Example

An example of a LiveHint sequence of directions and a hint (Fig. 1) is taken from the current Carnegie Learning MATHbook for Grade 8 (Course 3), in a homework assignment for a lesson on classifying numbers as rational, irrational, integers, natural numbers, terminating or repeating decimals, and so on. The problem scenario is as follows:

"Ling groups the following numbers together with the rationale that they are all repeating decimals: 12/18, –16/3, 3 1/3. Do you agree with Ling's grouping? Explain your reasoning."

MATHbook provides the following directions to the student, which are reiterated by LiveHint, as illustrated in Fig. 1:

"Read the scenario. Ling groups the numbers 12/18, –16/3, and 3 1/3 together with the rationale that they are all repeating decimals. Do you agree with Ling's grouping? Explain your reasoning."

The first hint provided by LiveHint, after a student selects "See a hint" (also illustrated in Fig. 1), is:

"To determine whether a fraction is a repeating decimal, divide the numerator by the denominator. Think about the fraction –16/3. The negative sign doesn't matter to whether or not it can be written as a repeating decimal."

Fig. 1. LiveHint directions and the first hint for a question from the practice section of a homework assignment in Carnegie Learning's MATHbook for Grade 8.

The second hint provided by LiveHint is:

"Think about the mixed number 3 1/3. The whole number doesn't matter to whether or not it can be written as a repeating decimal. If 1/3 is a repeating decimal, then 3 1/3 is a repeating decimal."

The third, and final, hint provided by LiveHint is:

"The fraction 12/18 in lowest terms is 2/3. Is that a repeating decimal?"

2.2 Student Feedback

After each hint is provided, LiveHint's chatbot asks the student, "How helpful was that hint?" (Fig. 2). The student's response is captured on a 5-point Likert scale with emoji facial expressions ranging from an angry face (a rating of 1, "not at all helpful") to a wide-mouthed grin with hearts for eyes (a rating of 5, "exceedingly helpful"). While not a direct measure of learning, we take learner satisfaction as a preliminary proxy for hint effectiveness.
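As a small sketch of how such a response might be captured for later analysis, an emoji tap can be mapped to its Likert value and logged as a rating event. The emoji labels for the intermediate ratings and all identifiers below are assumptions; only the two endpoint emoji are described above.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

# Hypothetical mapping of the five emoji responses to Likert values; only the
# endpoints (angry face = 1, hearts-for-eyes grin = 5) are described in the text.
EMOJI_TO_RATING = {"angry": 1, "frown": 2, "neutral": 3, "smile": 4, "heart_eyes": 5}

@dataclass
class HintRatingEvent:
    """One student response to the 'How helpful was that hint?' prompt."""
    session_id: str
    question_id: str
    hint_index: int    # position of the hint in its authored sequence
    rating: int        # 1 ("not at all helpful") to 5 ("exceedingly helpful")
    timestamp: datetime

def record_rating(session_id: str, question_id: str,
                  hint_index: int, emoji: str) -> HintRatingEvent:
    """Convert an emoji tap into a logged Likert rating event."""
    return HintRatingEvent(
        session_id=session_id,
        question_id=question_id,
        hint_index=hint_index,
        rating=EMOJI_TO_RATING[emoji],
        timestamp=datetime.now(timezone.utc),
    )
```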
In the future, we plan to explore whether (and the extent to which) learner satisfaction ratings for hints correspond to the effectiveness of those hints in improving problem solving and other learning outcomes.

Fig. 2. LiveHint's prompt for the question, "How helpful was that hint?" The prompt provides five emoji facial expressions for possible student responses to the question.

2.3 Preliminary Data

We deployed LiveHint and publicized its availability to a small set of pilot school districts in May and June of 2020. Despite the substantial disruption of the COVID-19 pandemic and how late in the school year this piloting took place, we consider data from 173 LiveHint sessions in which learners¹ received at least one hint, having identified a particular problem in MATHbook, viewed the directions, and selected to receive a hint. The median session time across sessions with at least one hint was 1 minute, 17 seconds, suggesting that the design intent of keeping learner interactions with LiveHint relatively brief is being met overall. During these 173 sessions, learners provided ratings for 124 of the hints they were provided (approximately 72% of provided hints), suggesting that students fairly frequently do not rate the hints they receive.

Following research that considers learner response times in the context of help-seeking and hint use (e.g., [16, 17]), and as an illustration of the analyses we expect to conduct with LiveHint data (and the types of research we hope to catalyze by sharing LiveHint data), Fig. 3 plots the mean hint rating provided by students over these 124 rated hints (across Grades 6-8, Algebra I, and Integrated Math I²) against the amount of time learners spent (presumably) working with the hints before requesting another hint, seeking help on another problem, or logging off. While certainly not definitive, a preliminary pattern indicates that learners tend to be more satisfied with the hints on which they spend less time (hints rated ≥ 3 tend to have average times of less than a minute), suggesting that hints on which learners spend more time may be less helpful and are targets for revision. We look forward to collecting more data during the summer and fall as we support learners, instructors, parents, and caregivers adjusting to new learning contexts as a result of the COVID-19 pandemic.

¹ Logging functionality to identify unique users was implemented part-way through piloting and suggests that most users launched LiveHint once or twice. This logging functionality suggests that greater than 104 unique users are represented among the 173 LiveHint sessions with at least one hint that we consider.
² Integrated Math I content deployed late in the pilot and consequently has only two rated hints in these preliminary data.

Fig. 3. Learners' mean rating of hints plotted against mean total time on these hints across Courses 1-3 (Grades 6-8), Algebra I, and Integrated Math I (n = 124 rated hints across all courses).
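As an illustration of the kind of analysis behind Fig. 3, the sketch below aggregates a hypothetical export of rated hints into per-hint mean ratings and mean time on hint, and flags low-rated, long-dwell hints as revision candidates. The file name and column names are assumptions, not a description of LiveHint's actual logging format.

```python
import pandas as pd

# Hypothetical export of rated hints: one row per rating, with the 1-5 rating and
# the seconds the learner spent on the hint before their next action.
events = pd.read_csv("livehint_rated_hints.csv")

per_hint = (
    events.groupby("hint_id")
          .agg(mean_rating=("rating", "mean"),
               mean_seconds_on_hint=("seconds_on_hint", "mean"),
               n_ratings=("rating", "size"))
          .reset_index()
)

# Hints that learners linger on but still rate poorly are natural revision candidates.
revision_candidates = per_hint.query("mean_seconds_on_hint > 60 and mean_rating < 3")

# Scatter plot analogous to Fig. 3: mean rating versus mean time on hint.
ax = per_hint.plot.scatter(x="mean_seconds_on_hint", y="mean_rating")
ax.set_xlabel("Mean time on hint (seconds)")
ax.set_ylabel("Mean hint rating (1-5)")
```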
3 Future Work

3.1 Extending Content Reach & More Sophisticated Tutoring

In addition to expanding the number of published courses supported by LiveHint, we are exploring expanding its reach within courses, moving beyond support for practice activities in the worktexts to other activities, including those that students often complete independently or together with parents or caregivers, whether remotely or at school. Future plans may also include widening the scope of student and other stakeholder needs that LiveHint supports, including providing hints in Spanish and other languages and increasing the use of video hints and hints accompanied by static or dynamic images.

In addition to providing hints and querying students about the perceived value of those hints, intermediary problem steps or partial solutions to MATHbook problems could also be accepted as LiveHint input, both to better understand learner problem solving (e.g., the strategies students adopt in solving particular problems) and to assess hint effectiveness via more direct measures of learning (e.g., correctness of solutions). Although current versions of LiveHint do not accept student-generated input, we are actively exploring the application of dialogue-based tutoring in this context [18].

LiveHint affords the opportunity to develop new supplemental products that can be connected to and "tutored" through LiveHint, including test-preparation materials, intervention resources (e.g., for Response to Intervention and/or Tiers 2 and 3 in multi-tiered systems of support, or MTSS [19, 20]), games (e.g., scavenger hunts), and other extension materials. Authoring tools could be built that would allow teachers to create LiveHint support for their own custom-made worksheets or other analog learning resources. Open repositories of such custom materials could be tagged with data about the standards, competencies, and/or fine-grained knowledge components or skills [21] to which particular elements of those materials correspond. The availability of this kind of content, coupled with such meta-data, in an open repository would make assigning appropriate content easier, and far less time-consuming, for instructors. Further, mappings of such content to knowledge components or other forms of competencies could enable linking LiveHint to other instructional technologies, including intelligent tutoring systems like MATHia, for example, by updating a student's estimates of skill mastery (i.e., MATHia's skillometer [7]) based on their work in the analog MATHbook materials and interactions with LiveHint. Such potential linkages of analog and digital learning experiences represent important avenues for the future of the textbook.

3.2 Improved Artificial Intelligence & Instructional Hand-Offs

Previous work has considered using data to predict and potentially inform "instructional hand-offs" [22], or transitions between instructional applications or between the use of an instructional application and engaging with a human resource (e.g., an online human tutor, a video chat with a teacher, or, in less pandemic-addled times, a face-to-face interaction with a teacher or peer). In the context of LiveHint, instructional hand-offs could happen in at least three ways: (1) connecting a struggling student with a human teacher or tutor (e.g., allowing a student to launch a video chat with a teacher or tutor from within LiveHint), (2) connecting a struggling student with a peer to engage in collaborative learning, or (3) connecting a struggling student with an appropriate instructional technology or application (e.g., suggesting that a student work in a particular topical workspace in Carnegie Learning's MATHia based on patterns of responses to hints and/or other questions or problem-steps in LiveHint).

Any of these hand-offs would require technical integration work (e.g., integrating live video chat functionality into LiveHint) as well as the development of statistical models to predict:
• when such hand-offs are most likely to benefit the student,
• whether a particular peer, online tutor, or instructor (assuming access to instructors who are not the student's classroom teacher) is a good match, and
• how hand-offs can make efficient use of the scarce, valuable time available to instructors, tutors, and/or a student's peers.
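To make the first of these predictions concrete, the sketch below fits a simple logistic regression over hypothetical per-session LiveHint features to estimate whether a hand-off is likely to benefit a student, and suggests one only when the estimated probability is high. The data file, feature names, label, and threshold are illustrative assumptions; no such model has yet been built or validated.

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Hypothetical training data: one row per LiveHint session, with simple interaction
# features and a label indicating whether a hand-off (e.g., to a human tutor) was
# judged beneficial.
sessions = pd.read_csv("livehint_sessions.csv")
FEATURES = ["hints_requested", "mean_hint_rating", "mean_seconds_on_hint", "problems_attempted"]

X_train, X_test, y_train, y_test = train_test_split(
    sessions[FEATURES], sessions["handoff_benefited"], test_size=0.2, random_state=0)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("Held-out AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))

def suggest_handoff(session: pd.DataFrame, threshold: float = 0.8) -> bool:
    """Suggest a hand-off only when the predicted benefit is high, to conserve the
    scarce time of instructors, tutors, and peers."""
    return bool(model.predict_proba(session[FEATURES])[0, 1] >= threshold)
```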
3.3 Prospects for Research, Learning Engineering, & Data Sharing

LiveHint represents a new facet of support for students' interactions with analog/physical learning artifacts (i.e., Carnegie Learning's consumable MATHbook) and also serves as an evidence-gathering tool for continuous improvement and learning engineering efforts, pointing to particular content in MATHbook that may require review and revision. For example, practice questions that are the subject of LiveHint sessions with greater frequency than others may be highlighted as requiring revision to better facilitate student practice. In addition, student feedback is likely to provide insight into better hint construction and authoring, and experimental A/B tests can be conducted to determine whether particular alternative strategies for providing and/or authoring hints are likely to yield better learning. Further, data from the platform will be made available via mechanisms like the Learner Data Institute (http://www.learnerdatainstitute.org) and LearnSphere (http://www.learnsphere.org) to provide resources to the educational data science research community for exploring the so-called "assistance dilemma" (i.e., the tension between providing and withholding assistance like hints in ways that are conducive to learning) [23], modeling hint response time [16, 17], the effectiveness of hint-authoring schemata and of particular hints, dialogue-based tutoring, and other pressing questions about how to deliver effective, adaptive instruction across both analog and digital learning modalities. We are excited to explore opportunities to better understand and improve real-world, classroom-based, and remote learning with datasets collected from authentic, multi-modal learning contexts.

4 Acknowledgments

This work is generously supported by Schmidt Futures and the National Science Foundation via The Learner Data Institute (Award #1934745). Opinions expressed herein are those of the authors and do not necessarily reflect those of Schmidt Futures or the National Science Foundation.

References

1. Markel, H., Lipman, H.B., Navarro, J.A., Sloan, A., Michalsen, J.R., Stern, A.M., Cetron, M.S.: Nonpharmaceutical interventions implemented by US cities during the 1918-1919 influenza pandemic. JAMA 298(6), 644–654 (2007).
2. Stern, A.M., Cetron, M.S., Markel, H.: Closing the schools: Lessons from the 1918-19 U.S. influenza pandemic. Health Affairs 28(Supplement 1), w1066–w1078 (2009).
3. Rich, G.: During the 1918 flu pandemic, at-home learning meant little schoolwork. Washington Post (May 13, 2020), https://wapo.st/2Wqao8z, last accessed 2020/06/11.
4. Reich, J., Buttimer, C., Larke, L.R., Coleman, D., Colwell, R.D., Faruqi, F.: Emergency Remote Instruction Study (2020), https://osf.io/2fjtc, last accessed 2020/06/11.
5. Molnar, M.: Wanted: Curriculum providers who can help with students' COVID-19 learning loss. EdWeek Market Brief (May 15, 2020), https://marketbrief.edweek.org/market-trends/wanted-curriculum-providers-can-help-students-covid-19-learning-loss/, last accessed 2020/06/11.
6. Anderson, M., Kumar, M.: Digital divide persists even as lower-income Americans make gains in tech adoption. Pew Research Center (May 7, 2019), https://pewrsr.ch/36sWplO, last accessed 2020/06/11.
7. Ritter, S., Anderson, J.R., Koedinger, K.R., Corbett, A.: Cognitive Tutor: Applied research in mathematics education. Psychonomic Bulletin & Review 14, 249–255 (2007).
8. Anderson, J.R.: Cognitive psychology and its implications. 5th edn. Worth Publishers, New York (2000).
9. LiveHint Demo, https://sineof1.github.io/livehintdemo/livehint_demo.html, last accessed 2020/06/11.
10. Trismen, D.A.: Hints: An aid to diagnosis in mathematical problem solving. Journal for Research in Mathematics Education 19(4), 358–361 (1988).
11. Perrenet, J., Groen, W.: A hint is not always a help. Educational Studies in Mathematics 25, 307–329 (1993).
12. Wood, H., Wood, D.: Help seeking, learning and contingent tutoring. Computers & Education 33(2–3), 153–169 (1999).
13. Nelson-Le Gall, S.: Help-seeking behavior in learning. Review of Research in Education 12, 55–90 (1985).
14. Vaughn, K.E., Kornell, N.: How to activate students' natural desire to test themselves. Cognitive Research: Principles and Implications 4(35) (2019).
15. Karpicke, J.D.: Retrieval-based learning: Active retrieval promotes meaningful learning. Current Directions in Psychological Science 21(3), 157–163 (2012).
16. Aleven, V., McLaren, B.M., Roll, I., Koedinger, K.R.: Toward tutoring help seeking: Applying cognitive modeling to meta-cognitive skills. In: Lester, J.C., Vicari, R.M., Paraguaçu, F. (eds.) Proceedings of the 7th International Conference on Intelligent Tutoring Systems, pp. 227–239. Springer-Verlag, Berlin/Heidelberg (2004).
17. Shih, B., Koedinger, K.R., Scheines, R.: A response time model for bottom-out hints as worked examples. In: Baker, R.S.J.d., Barnes, T., Beck, J.E. (eds.) Proceedings of the 1st International Conference on Educational Data Mining, pp. 117–126. International Educational Data Mining Society (2008).
18. Nye, B.D., Graesser, A.C., Hu, X.: AutoTutor and family: A review of 17 years of natural language tutoring. International Journal of Artificial Intelligence in Education 24, 427–469 (2014).
19. Fuchs, D., Mock, D., Morgan, P.L., Young, C.L.: Responsiveness-to-intervention: Definitions, evidence, and implications for the learning disabilities construct. Learning Disabilities Research and Practice 18, 157–172 (2003).
20. Sugai, G., Horner, R.H.: Responsiveness-to-intervention and school-wide positive behavior supports: Integration of multi-tiered system approaches. Exceptionality 17, 223–237 (2009).
21. Koedinger, K.R., Corbett, A.T., Perfetti, C.: The Knowledge-Learning-Instruction (KLI) framework: Bridging the science-practice chasm to enhance robust student learning. Cognitive Science 36(5), 757–798 (2012).
22. Fancsali, S.E., Yudelson, M.V., Berman, S.R., Ritter, S.: Intelligent instructional hand offs. In: Boyer, K.E., Yudelson, M.V. (eds.) Proceedings of the 11th International Conference on Educational Data Mining, pp. 198–207. International Educational Data Mining Society (2018).
23. Koedinger, K.R., Aleven, V.: Exploring the assistance dilemma in experiments with cognitive tutors. Educational Psychology Review 19, 239–264 (2007).