Indirectly Visible Bayesian Student Models

Diego Zapata-Rivera
Research and Development Division
Educational Testing Service
Princeton, NJ 08536

Abstract

Inspectable Bayesian student models have been used to support student reflection, knowledge awareness, and communication among teachers, students, and parents. This paper presents a new approach to interacting with inspectable Bayesian student models called indirectly visible Bayesian student models. In this approach, the student model is seen through the eyes of a pedagogical agent (e.g., a virtual student). The approach has been implemented in the context of an Assessment-Based Learning Environment for English grammar (English ABLE), where the student is asked to help a pedagogical agent find grammatical errors in various sentences. Since the pedagogical agent's knowledge levels, which are also the student's knowledge levels, are always visible, the student can see how much the pedagogical agent "knows" based on his/her actions. Initial reactions to this approach have been positive. We are planning to integrate it into assessment-based learning and gaming environments as an indicator of progress that continuously changes in light of new evidence.

1 INTRODUCTION

Assessment information can be obtained from a variety of sources including standardized assessments, classroom quizzes, group activities, and self- or negotiated assessment activities. Intelligent Tutoring Systems (ITSs) continuously monitor student performance and adapt their behavior to a changing view of the student maintained by the system (i.e., a student model). Student models generally maintain rich student assessment information. Assessment information, when shared with students, teachers, and parents, can be used to support formative dialogue in the classroom that can promote student learning. Black and Wiliam (1998a, 1998b), for example, established a clear link between formative assessment (assessment for learning) and classroom learning.

Open student models (OSMs) consider teachers, students, and sometimes parents to be more than just consumers of assessment information. In OSMs, these participants play an active role by observing, updating, and acting upon student model assessment information. OSMs have been used to support student reflection, knowledge awareness, group formation, student model accuracy, and learning (Brna et al., 1999; Bull & Pain, 1995; Hartley & Mitrovic, 2002; Kay, 1998; Dimitrova, 2003; Zapata-Rivera & Greer, 2004).

Inspectable, interactive Bayesian student models have been used to integrate various sources of evidence (e.g., the system's and the student's views of the student model). Several visualization techniques, including animation, have been used to show how evidence of student performance is added to and propagated throughout the Bayesian student model (Zapata-Rivera & Greer, 2001, 2004).
Although various representational and interaction techniques have been used to implement OSMs, students always see the student model as the system's view of their knowledge, skills, and abilities. This direct approach to OSMs confronts learners with a view of the student model that may or may not match their own, requiring them to react to it. Students could react in a variety of ways depending on many factors, including self-esteem, personality traits, and personal beliefs regarding computers in general. For example, while some students could respond negatively, categorically rejecting the system's claims and leaving no room for negotiation, some could instead try to understand the system's claims in detail and perhaps even challenge them, some would simply accept them, and some would ignore them completely without even looking at them.

What if the system refers to a third person instead, for example, someone the student wants to help? Could such an approach avoid, or at least attenuate, some of these possible negative reactions? How would students react to it? We have implemented an indirect approach to interacting with Bayesian student models that capitalizes on the idea of learning by teaching. In this approach, students "teach" a pedagogical agent by helping it find grammatical errors. Students can see whether the pedagogical agent is making progress (or not) by observing how the indirectly visible student model changes and how the pedagogical agent reacts.

The indirectly visible Bayesian student modeling approach has been implemented in the context of an Assessment-Based Learning Environment for English grammar called English ABLE. English ABLE makes use of a Bayesian student model that is used by pedagogical agents to provide adaptive feedback and adaptive sequencing of tasks. A view of the Bayesian student model is presented to the student through the eyes of a pedagogical agent.

This paper describes the Bayesian student model used in English ABLE, explains how the indirectly visible student model was implemented, describes its potential to be integrated into existing games, reports on initial student reactions, and concludes by discussing some open research issues and plans for future work.

2 ENGLISH ABLE

English ABLE is an Assessment-Based Learning Environment for English grammar. Assessment-based learning environments make use of assessment information to guide instruction.

English ABLE demonstrates the reuse of existing high-stakes tasks in lower-stakes learning contexts. English ABLE currently draws upon a database of TOEFL® Computer-Based Testing (CBT) tasks to create new packages of enhanced tasks targeted towards particular component ELL skills.

In English ABLE, students try to help a virtual student (Carmen or Jorge) learn English by correcting this student's writing from a notebook of facts (sentences, i.e., enhanced TOEFL® tasks). Supplemental educational materials about specific grammatical structures are offered by a virtual tutor (Dr. Grammar).

Figure 1 shows a screenshot of English ABLE. The student is helping Jorge find grammatical errors within several sentences. The student selects an option and clicks on "Check Answer." Dr. Grammar offers verification feedback ("I see you have selected 'created'. However, this part of the sentence is correct.") and additional adaptive instructional feedback (i.e., rules, procedures, examples, and definitions). Students can ask Dr. Grammar for hints ("Ask for a hint") before committing to a particular choice; in that case, Dr. Grammar provides a general rule related to the current grammatical structure. Students can also type a possible correction ("Suggested word"). Both asking for help and providing corrections are treated differently when adding evidence of student performance to the Bayesian model. Jorge's knowledge levels, which are also the student's knowledge levels (the indirectly visible Bayesian student model), show a lack of knowledge for agreement. Jorge seems confused and expresses it: "I don't understand how to make the verb agree with the rest of the sentence."

Knowledge levels representing the pedagogical student's knowledge of English grammar are taken directly from the Bayesian network that supports the system (i.e., the Bayesian student model). Although only three knowledge bars are shown in Figure 1 (i.e., Agreement, Wrong Form, and Omission/Inclusion), a detailed view of the Bayesian student model containing information about low-level concepts is available upon student request (Details button).

Figure 1: English ABLE.

2.1 Bayesian Student Model

Several authors in different areas have explored the use of Bayesian belief networks to represent student models (Collins et al., 1996; Conati et al., 2002; Horvitz et al., 1998; Mislevy & Gitomer, 1996; VanLehn & Martin, 1997; Reye, 2004).

English grammar can be divided into three main categories: use, form, and meaning (Celce-Murcia & Larsen-Freeman, 1999). We worked with experts to elicit an initial Bayesian structure for a student model (see Figure 2). The current structure of the Bayesian student model deals with English grammar form, although it could be extended to cover use and meaning.

Three sentence-level grammatical categories (i.e., Agreement, FormofWord or Wrong Form, and OmissionInclusion) have been chosen based upon a difficulty analysis that was performed using student data from native Spanish speakers. These three sentence-level grammatical categories are further divided into low-level sub-categories (leaf nodes) according to parts of speech (e.g., agreement has been divided into three leaf nodes: noun agreement, verb agreement, and pronoun agreement). Leaf nodes are linked to two main knowledge areas (i.e., individual parts of speech: noun, verb, and pronoun, and sentence-level grammatical categories).

Figure 2: Bayesian student model.
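To make this structure concrete, the following minimal Python sketch encodes the agreement branch described above as a set of directed edges. The node names and the edge directions among latent nodes are illustrative assumptions (the text only fixes the leaf-to-task direction, since tasks are modeled via Pr(task | leaf), as in Table 1 below); the Wrong Form and Omission/Inclusion branches, whose leaf nodes are not enumerated here, would be encoded analogously.

```python
# A minimal, illustrative encoding of the agreement branch of the Bayesian
# student model described above. Node names and the direction of the edges
# among latent nodes are assumptions; only the leaf-to-task direction is
# given explicitly in the paper (tasks are modeled via Pr(task | leaf)).
EDGES = [
    # Sentence-level grammatical category -> leaf nodes (split by part of speech)
    ("Agreement", "NounAgreement"),
    ("Agreement", "VerbAgreement"),
    ("Agreement", "PronounAgreement"),
    # Part-of-speech knowledge areas are also linked to the leaf nodes
    ("Noun", "NounAgreement"),
    ("Verb", "VerbAgreement"),
    ("Pronoun", "PronounAgreement"),
    # Tasks attach to leaf nodes through IRT-based conditional probabilities
    ("VerbAgreement", "Task1"),
    ("VerbAgreement", "Task2"),
    ("VerbAgreement", "Task3"),
]

def parents(node):
    """Direct parents of a node; each leaf has a category and a part of speech."""
    return [p for p, c in EDGES if c == node]

print(parents("VerbAgreement"))  # ['Agreement', 'Verb']
```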
Preliminary difficulty analysis plus data from experts were used to generate prior and conditional probabilities for the latent structure. Experts used a qualitatively inspired method to produce probability values based on estimates of the strength of the relationship between any two variables in the model (Daniel, Zapata-Rivera & McCalla, 2003).

Each task was attached to a single category using existing classification metadata and corresponding Item Response Theory (IRT) discrimination and difficulty parameters (Lord & Novick, 1968; Embretson & Reise, 2000). Tasks were recalibrated (i.e., new IRT parameters were computed) based on data from all native Spanish speakers who took the test. The IRT-2PL model is described by the following formula:

$$\Pr(\mathit{task}_i = \text{correct} \mid \mathit{prof}_j) = \frac{1}{1 + e^{-1.701\,a\,(\mathit{prof}_j - b)}},$$

where b is the difficulty parameter (typical values: −3 ≤ b ≤ +3), a is the discrimination parameter (typical values: −2.80 ≤ a ≤ +2.80), and prof_j represents an ability level. Continuous proficiency variables were discretized using the following ability values: Advanced = 0.96, IntermediateAdvanced = 0, and Intermediate = −0.96; these values come from quantiles of a normal distribution (Almond et al., 2001).

Figure 3: Three tasks connected to a leaf-node.

Figure 3 shows how tasks are connected to leaf nodes using IRT parameters. Table 1 shows the resulting conditional probability table for Task 2 (a = 1.5, b = 0.4).

Table 1: Conditional probability table for Task 2 (a = 1.5, b = 0.4)

                       Pr(Task2 | VerbAgreement)
VerbAgreement          Correct   Incorrect
Advanced               0.807     0.193
IntermediateAdvanced   0.265     0.735
Intermediate           0.030     0.970
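As a check on this construction, the following short Python sketch evaluates the IRT-2PL formula at the three discretized ability values; it reproduces the Task 2 entries shown in Table 1.

```python
import math

# Discretized ability values for the three proficiency states
# (quantiles of a normal distribution; Almond et al., 2001).
ABILITY = {"Advanced": 0.96, "IntermediateAdvanced": 0.0, "Intermediate": -0.96}

def p_correct(a, b, prof):
    """IRT-2PL: Pr(task = correct | proficiency), with the usual 1.701
    scaling constant that approximates the normal ogive."""
    return 1.0 / (1.0 + math.exp(-1.701 * a * (prof - b)))

# Reproduce the conditional probability table for Task 2 (a = 1.5, b = 0.4).
for state, prof in ABILITY.items():
    p = p_correct(a=1.5, b=0.4, prof=prof)
    print(f"{state:20s}  correct={p:.3f}  incorrect={1 - p:.3f}")
# Advanced              correct=0.807  incorrect=0.193
# IntermediateAdvanced  correct=0.265  incorrect=0.735
# Intermediate          correct=0.030  incorrect=0.970
```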
As the student makes progress (i.e., answers additional tasks), more tasks are dynamically added to the model. Observed values per task (i.e., correct or incorrect) provide evidence (as defined by the task's conditional probability table) to update the student model. Asking for help ("Ask for a hint") and providing corrections ("Suggested word") are handled by slightly adjusting the difficulty level of the task.

This underlying Bayesian network supports the knowledge levels and the pedagogical agents' behavior. That is, indirect knowledge levels are computed based on the corresponding probability distribution of a particular node, and pedagogical agents query the Bayesian student model to implement adaptive algorithms (i.e., adaptive feedback, adaptive sequencing of items, and adaptive behavior).
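The update step can be illustrated in isolation. The sketch below applies Bayes' rule to a single proficiency node given one observed task outcome, using the Task 2 table above and an assumed uniform prior. This is a simplification for illustration only: in English ABLE the evidence is propagated through the entire network, so the actual posteriors differ from these numbers.

```python
# A minimal single-node Bayes update from one task observation, using the
# Task 2 probabilities from Table 1 and an assumed uniform prior. This is a
# deliberate simplification: English ABLE propagates evidence through the
# whole Bayesian network rather than updating one node in isolation.
P_CORRECT = {"Intermediate": 0.030, "IntermediateAdvanced": 0.265, "Advanced": 0.807}
PRIOR = {state: 1.0 / 3.0 for state in P_CORRECT}

def update(prior, observed_correct):
    """Posterior over proficiency states after one correct/incorrect response."""
    likelihood = {s: (p if observed_correct else 1.0 - p)
                  for s, p in P_CORRECT.items()}
    unnormalized = {s: prior[s] * likelihood[s] for s in prior}
    z = sum(unnormalized.values())
    return {s: v / z for s, v in unnormalized.items()}

posterior = update(PRIOR, observed_correct=True)
# A correct response shifts most of the mass to Advanced (about 0.73 here).
print({s: round(p, 3) for s, p in posterior.items()})
```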
This Bayesian student model can be made available to students using a variety of approaches. For example, we could have used ViSMod (Zapata-Rivera & Greer, 2003) to show students a complete view of the graphical structure, using visualization techniques such as node color, link size, and animation to represent marginal and conditional probabilities. Although presenting the whole Bayesian network can help students understand how the Bayesian student model works (e.g., understanding the integration and propagation of evidence), it requires students to spend some time understanding and interacting with the student model. Interactive, collaborative, and negotiated approaches to open student models use the student model as a communication tool, engaging students in a formative dialogue aimed at supporting metacognition.

We do not have to show the whole Bayesian network to provide students with a sense of progress (e.g., weak and strong areas). We can show an overall view covering the main concepts/nodes, or only the relevant ones, depending on the tasks that the student is currently working on. Although in this approach just a piece of the Bayesian student model is open to students at a particular time, the whole internal Bayesian network remains available to other components in the system (e.g., pedagogical agents). Different views of the Bayesian structure can be created to support the goals of the learning environment. These views can range from static student or teacher reports to interactive adaptive applications.

2.2 Pedagogical Agents

Pedagogical agents (e.g., Chan & Baskin, 1990; Graesser, Person, Harter, & TRG, 2001; Johnson, Rickel, & Lester, 2000) have been used to facilitate learning by supporting human-like interaction with computer-based systems. Pedagogical agents can act as virtual peers or virtual tutors. They can also model human emotions and use this information to facilitate learning (e.g., Picard, 1997; Nkambou et al., 2003).

An interesting variant of pedagogical agents are teachable agents (Biswas et al., 2001), which have been used to facilitate student learning. The student's role in these environments is to teach an artificial student how to act in a simulated environment. Students in English ABLE are asked to help a pedagogical agent (i.e., Carmen or Jorge) find grammar errors. Carmen and Jorge "learn" based on the student's performance. Students can see how much the pedagogical agent knows about a particular concept by looking at the indirectly visible Bayesian student model and by observing Carmen's and Jorge's changes in emotional states and associated utterances (Zapata-Rivera et al., 2007). Figure 4 depicts Jorge, Carmen, and Dr. Grammar.

Figure 4: Jorge, Carmen and Dr. Grammar.

2.3 Indirectly Visible Bayesian Student Model

Bull et al. (2005) reported that children, university students, and instructors understood and used a variety of student model external representations. However, they also warn of possible negative effects when low-performing students explore the student models of more capable students (i.e., some of these students reported a negative effect on their motivation and self-esteem).

English ABLE supports indirect inspection of Bayesian student models. We believe that exploring one's student model via a pedagogical agent is less intimidating and has the potential to foster student learning without the possible negative effects on self-esteem and motivation, especially for those students who are having a hard time with the system.

Previous research on inspecting Bayesian student models through the use of guiding artificial agents showed that agents can facilitate student interaction with the model by helping students navigate and find conflicting nodes. Guided agent interaction was linked to higher levels of student reflection (Zapata-Rivera & Greer, 2004).

Changes in marginal probability distributions can be depicted by showing a graphical indicator for each state of a node (e.g., three bars, one per state of a proficiency node). This approach uses a great deal of screen space and requires users to have some familiarity with probability distributions to make sense of the multiple changes that occur as more evidence becomes available and is added to the Bayesian student model. Alternatively, we could choose one state (e.g., Pr(proficiency_j = Advanced | evidence)) and show just one bar. However, this approach would not necessarily be sensitive to variations in the marginal probability values of the neglected states of the node.

In English ABLE, the length of each bar is calculated based on an Expected A Posteriori (EAP) score that takes into account the whole marginal probability distribution of a particular node, producing a value that ranges from zero to one. This EAP-length score is computed using the following formula:

$$\mathit{Length}_i = \frac{\sum_{j=0}^{n} C_j \Pr(\mathit{proficiency}_i = \mathit{state}_j)}{n},$$

where C_j is a constant numerical value assigned to each state of a node based on its proficiency level (i.e., Intermediate = 0, IntermediateAdvanced = 1, and Advanced = 2) and n is the index of the highest proficiency state (e.g., n = 2, in this case).

Figure 5 shows a detailed view of Jorge's knowledge levels. This view of the student model appears when the student clicks on "Details" (see Figure 1).

Figure 5: Jorge's knowledge levels.
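The bar-length computation is easy to verify directly. The following Python sketch implements the EAP-length formula above and reproduces the first EAP values reported in Tables 2 and 3 below.

```python
# EAP-length: collapse a node's marginal distribution into a single bar
# length in [0, 1]. State weights follow the text: Intermediate C=0,
# IntermediateAdvanced C=1, Advanced C=2; n = 2 is the highest state index.
C = {"Intermediate": 0, "IntermediateAdvanced": 1, "Advanced": 2}
N = max(C.values())

def eap_length(marginal):
    """Expected A Posteriori score scaled to [0, 1]."""
    return sum(C[state] * p for state, p in marginal.items()) / N

# First row of Table 2 (NounAgreement): expected 0.21
print(round(eap_length(
    {"Intermediate": 0.647, "IntermediateAdvanced": 0.280, "Advanced": 0.073}), 2))
# First row of Table 3 (NounWordForm): expected 0.66
print(round(eap_length(
    {"Intermediate": 0.156, "IntermediateAdvanced": 0.368, "Advanced": 0.476}), 2))
```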
Tables 2 and 3 show how marginal probability and EAP values change based on the student's responses to a series of tasks. EAP values capture slight variations in the marginal probabilities. The final effect is an indicator bar that continuously adjusts as new evidence is added to the model.

Table 2: Sequence of probability and EAP-length values for a student solving NounAgreement tasks. Marginal probability values converge to the Intermediate state as EAP-length values get closer to zero.

P(Int)   P(IntAdv)   P(Adv)   EAP    Resp
0.647    0.280       0.073    0.21   Cor
0.429    0.423       0.147    0.36   Inc
0.676    0.297       0.027    0.18   Inc
0.833    0.164       0.004    0.09   Cor
0.684    0.306       0.010    0.16   Inc
0.832    0.167       0.001    0.08   Cor
0.686    0.311       0.003    0.16   Inc
0.831    0.169       0.000    0.08   Inc
0.918    0.082       0.000    0.04

Table 3: Sequence of probability and EAP-length values for a student solving NounWordForm tasks. Marginal probability values converge to the IntermediateAdvanced state as EAP-length values get closer to 0.5.

P(Int)   P(IntAdv)   P(Adv)   EAP    Resp
0.156    0.368       0.476    0.66   Cor
0.064    0.343       0.593    0.76   Cor
0.024    0.295       0.681    0.83   Inc
0.102    0.560       0.338    0.62   Cor
0.004    0.593       0.403    0.70   Cor
0.001    0.521       0.478    0.74   Inc
0.005    0.801       0.194    0.59   Cor
0.002    0.754       0.244    0.62   Inc
0.005    0.916       0.078    0.54

We are currently experimenting with fading as a mechanism for forgetting old pieces of evidence and assigning more weight to more recent evidence. Views of past data can be handled by using windows of various sizes that implement various fading policies. These views of the student model can be maintained and dynamically adjusted based on student performance. For example, pedagogical agents and other consumers of student model information can maintain their own view into the past based on how important evidence of past performance is for accomplishing their student learning goals.
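The fading idea can be sketched as a weighting policy over a window of recent observations. The geometric decay below is only one possible policy and is our assumption for illustration; the paper does not specify the windowing or weighting scheme actually used.

```python
# One possible fading policy, sketched as an assumption for illustration:
# keep a sliding window of the most recent observations and down-weight
# older evidence geometrically before summarizing it for a model consumer.
def faded_weights(num_obs, window=5, decay=0.7):
    """Per-observation weights, oldest first; evidence that has fallen out
    of the window is forgotten entirely (weight 0)."""
    forgotten = [0.0] * max(0, num_obs - window)
    in_window = min(num_obs, window)
    # The most recent observation gets weight 1; older ones decay geometrically.
    recent = [decay ** (in_window - 1 - i) for i in range(in_window)]
    return forgotten + recent

print(faded_weights(7))  # approximately [0.0, 0.0, 0.24, 0.34, 0.49, 0.7, 1.0]
```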
Pedagogical agents (e.g., virtual tutors) implementing various forms of adaptive instruction use their own view of the student model to keep track of students' progress. Some of these pedagogical agents can implement some form of collaborative or negotiated assessment, using a view of the student model to support formative dialogue between students and teachers. Evidence gathered from these educational stakeholders can then be integrated with existing evidence of student performance into an aggregate view of the student model that implements a particular policy for the integration of evidence. This framework can be used as a research testbed for studying the effects of several adaptive instructional and assessment strategies on student learning.

2.4 Indirectly Visible Bayesian Student Models and Games

Indirectly visible Bayesian student models can be integrated into first-person role-playing games. In these games, each player chooses a character that represents him/her in the game. Each character has a particular personality, skills, and abilities. Some of these traits change as the player makes progress in the game. Up-to-date estimates of players' competencies based on a Bayesian student model can be integrated into the game as progress/state indicators. Using these indicators, players see how their competencies are changing based on their performance in the game. This level of self-awareness can be linked to the development of meta-cognitive abilities.

We are planning to use embedded assessments to capture valued information without disrupting the flow and engagement of the game. We have started applying some of these ideas in the context of a popular first-person role-playing game called The Elder Scrolls® IV: Oblivion™ (Bethesda Softworks, 2006). For more information about how indirectly visible Bayesian student models can potentially be integrated into existing games, see Shute, Ventura, Bauer & Zapata-Rivera (in press).

3 INITIAL STUDENT REACTIONS

We recently conducted a study focusing on usability issues and learning effects in relation to English ABLE's tools and interface. Here we report the results of the usability study; information regarding learning effects can be found in Zapata-Rivera et al. (2007).

Participants included 149 native Spanish speakers (ESL students) who were assigned to three different conditions (i.e., test preparation, English ABLE simple, and English ABLE enhanced). Forty-six of the participants were assigned to the enhanced version of English ABLE, which included a Bayesian student model, an indirectly visible student model, and pedagogical agents.

In general, we were interested in knowing how students reacted to the indirectly visible Bayesian student modeling approach. In particular, we wanted to know how students reacted to the pedagogical agents and their knowledge levels. Students were asked to respond to a series of questions using a Likert scale with the following choices: strongly agree, agree, disagree, and strongly disagree.

Results from the usability study showed that 88% of the participants assigned to English ABLE enhanced understood the knowledge levels presented in the indirectly visible student model, 86% thought that the knowledge levels were useful, and 86% agreed that the knowledge levels helped them understand what Jorge/Carmen knew.
Participants agreed with the following statements: (a) "I liked helping Carmen/Jorge find grammar errors" (90%), (b) "Carmen's/Jorge's comments were useful" (78%), (c) "Helping Carmen/Jorge motivated me to keep going" (90%), (d) "I have helped Carmen/Jorge a lot by finding the grammar errors" (73%), (e) "I have learned by helping Carmen/Jorge with his/her sentences" (90%), (f) "The feedback provided by Dr. Grammar helped me learn" (87%), and (g) "I think Carmen and Jorge liked my help" (81%).

In addition, some of the students' comments seemed to indicate that they understood their role as teachers and used student model information to continuously assess learning progress. For example, a motivated student mentioned, "My Carmen is happy. Her knowledge levels are increasing," while a struggling student exclaimed, "Poor Carmen, she is not learning a lot from me."

Initial results show that students enjoyed the current implementation of the indirectly visible Bayesian modeling approach. We believe that teaching someone else, and seeing how he/she makes progress (or not), can be a strong motivational factor that helps keep students engaged in the learning process. Although these initial results are encouraging, more studies are needed.

4 DISCUSSION & FUTURE WORK

Different external representations can be used to offer views of the student model, and interaction techniques can be implemented to help students and teachers interact with it. It is important to take into account the goals of the learning session and the need for an accurate student model in order to decide which kind of support is most appropriate for a particular situation (Zapata-Rivera & Greer, 2004).

Although students seemed to enjoy helping pedagogical agents find grammatical errors, the current implementation of the agents is limited to providing additional scaffolding in a language accessible to students and showing various emotional states based on the current state of the Bayesian student model. Interaction with these pedagogical agents could be enhanced by supporting dialogue-based interaction. For example, pedagogical agents could ask students to explain particular actions, or elicit additional information from students aimed at mapping the limits of their understanding of a particular topic.
Students could also question the estimates of knowledge assigned to the pedagogical agent. Does the agent really know about a particular grammatical structure? A student could think: "Let's ask the agent some questions to see how he/she answers." Testing the pedagogical agent on particular topics would also provide interesting evidence of student knowledge that can be added to the model. Should the pedagogical agent answer the questions at the level of the student or act as a weaker student? Should Dr. Grammar intervene if/when the student is teaching a wrong concept to the pedagogical agent or trying to game the system? How should agents respond to the questions raised by students? How do we convince students that their help is really helping the pedagogical agent "learn" the concepts? Although highly motivated students can engage in this kind of interaction with pedagogical agents, what kinds of mechanisms should be in place to maintain and encourage such high levels of motivation? How do we implement this level of interaction without negatively affecting the flow of a game? These are all interesting open research questions that motivate and inform our plans for future work.

Future work includes using assessment information to support learning in various contexts, and harnessing the power of games and technology to provide highly interactive, adaptive learning environments that seamlessly use assessment information to improve student learning, skills, and performance in valued domain areas.

Acknowledgements

I would like to thank the members of the English ABLE team, including Malcolm Bauer, Thomas Florek, Waverly VanWinkle, Meg Powers, Debbie Pisacreta, Janet Stumper, James Purpura, Valerie Shute, Russell Almond, Jody Underwood, Margaret Redman, Christopher Pfister, Hae-Jin Kim, Linda Tyler, Cathrael Kazin, Jan Plante, Yong-Won Lee, Victor Aluise, Feng Yu, Mary Enright, and Maurice Hauck. I also want to thank Valerie Shute, Russell Almond, Eric Hansen, and three anonymous reviewers for providing valuable feedback on earlier versions of this paper. Also, I would like to extend my sincere appreciation to the students, teachers, and school administrators who participated in our study.

References

Almond, R., DiBello, L., Jenkins, F., Mislevy, R., Senturk, D., Steinberg, L., & Yan, D. (2001). Models for conditional probability tables in educational assessment. In T. Jaakkola & T. Richardson (Eds.), Artificial Intelligence and Statistics, 137-143.

Bethesda Softworks (2006). The Elder Scrolls® IV: Oblivion™. Bethesda Softworks LLC, a ZeniMax Media company.

Biswas, G., Schwartz, D., Bransford, J., & the Teachable Agent Group at Vanderbilt (TAG-V) (2001). Technology support for complex problem solving: From SAD environments to AI. In K. D. Forbus & P. J. Feltovich (Eds.), Smart machines in education: The coming revolution in educational technology. Menlo Park, CA: AAAI/MIT Press, 71-97.

Black, P., & Wiliam, D. (1998a). Assessment and classroom learning. Assessment in Education: Principles, Policy, and Practice, 5(1), 7-74.

Black, P., & Wiliam, D. (1998b). Inside the black box: Raising standards through classroom assessment. London: School of Education, King's College.

Brna, P., Self, J., Bull, S., & Pain, H. (1999). Negotiated collaborative assessment through collaborative student modelling. Proceedings of the Workshop on Open, Interactive, and Other Overt Approaches to Learner Modelling at AIED'99, Le Mans, France, 35-44.

Bull, S., & Pain, H. (1995). "Did I say what I think I said, and do you agree with me?": Inspecting and questioning the student model. Proceedings of the World Conference on Artificial Intelligence in Education (AACE), Charlottesville, VA, 501-508.

Bull, S., Mabbott, A., Abu Issa, A. S., & Marsh, J. (2005). Reactions to inspectable learner models: Seven year olds to university students. AIED'05 Workshop on Learner Modelling for Reflection, to Support Learner Control, Metacognition and Improved Communication between Teachers and Learners, 23-32.

Celce-Murcia, M., & Larsen-Freeman, D. (1999). The Grammar Book (2nd ed.). Heinle & Heinle Publishers.

Chan, T. W., & Baskin, A. B. (1990). Learning companion systems. In C. Frasson & G. Gauthier (Eds.), Intelligent Tutoring Systems: At the Crossroads of AI and Education. Ablex, 6-33.

Collins, J. A., Greer, J. E., & Huang, S. X. (1996). Adaptive assessment using granularity hierarchies and Bayesian nets. In C. Frasson, G. Gauthier, & A. Lesgold (Eds.), Proceedings of Intelligent Tutoring Systems ITS'96. Berlin: Springer, 569-577.

Conati, C., Gertner, A. S., & VanLehn, K. (2002). Using Bayesian networks to manage uncertainty in student modeling. User Modeling and User-Adapted Interaction, 12, 371-417.

Daniel, B., Zapata-Rivera, J. D., & McCalla, G. (2003). A Bayesian computational model of social capital in virtual communities. Proceedings of the First International Conference on Communities and Technologies, C&T 2003.
Kluwer Academic Publishers. ISBN 1-4020-1611-5.

Dimitrova, V. (2003). StyLE-OLM: Interactive open learner modelling. International Journal of Artificial Intelligence in Education, 13(1), 35-78.

Embretson, S., & Reise, S. (2000). Item response theory for psychologists. Mahwah, NJ: Erlbaum.

Graesser, A. C., Person, N., Harter, D., & TRG (2001). Teaching tactics and dialog in AutoTutor. International Journal of Artificial Intelligence in Education, 12, 257-279.

Hartley, D., & Mitrovic, A. (2002). Supporting learning by opening the student model. In Proceedings of ITS 2002, 453-462.

Horvitz, E., Breese, J. S., Heckerman, D., Hovel, D., & Rommelse, K. (1998). The Lumiere Project: Bayesian user modeling for inferring the goals and needs of software users. Fourteenth Conference on Uncertainty in Artificial Intelligence. San Francisco: Morgan Kaufmann, 256-265.

Johnson, W. L., Rickel, J. W., & Lester, J. C. (2000). Animated pedagogical agents: Face-to-face interaction in interactive learning environments. International Journal of Artificial Intelligence in Education, 11(1), 47-78.

Kay, J. (1998). A scrutable user modelling shell for user-adapted interaction. Ph.D. thesis, Basser Department of Computer Science, University of Sydney, Sydney, Australia.

Lord, F. M., & Novick, M. R. (1968). Statistical theories of mental test scores. Reading, MA: Addison-Wesley.

Mislevy, R. J., & Gitomer, D. H. (1996). The role of probability-based inference in an intelligent tutoring system. User Modeling and User-Adaptive Interaction, Special Issue on Numerical Uncertainty Management in User and Student Modeling, 5(3), 253-282.

Nkambou, R., Laporte, Y., Yatchou, R., & Gouardres, G. (2003). Embodied emotional agent in intelligent training system. In J. Kacprzyk, A. Abraham, & L. C. Jain (Eds.), Recent Advances in Intelligent Paradigms and Applications, Studies in Fuzziness and Soft Computing. Heidelberg: Physica-Verlag, 235-253.

Picard, R. W. (1997). Affective computing. Cambridge, MA: MIT Press.

Reye, J. (2004). Student modelling based on belief networks. International Journal of Artificial Intelligence in Education, 14, 63-96.

Shute, V. J., Ventura, M., Bauer, M., & Zapata-Rivera, D. (in press). Melding the power of serious games and embedded assessment to monitor and foster learning: Flow and grow. To appear in L. Miller (Ed.), Serious Games: Learning, Development and Change.

VanLehn, K., & Martin, J. (1997). Evaluation of an assessment system based on Bayesian student modeling. International Journal of Artificial Intelligence in Education, 8, 179-221.

Zapata-Rivera, J. D., & Greer, J. (2001). Visualizing and inspecting Bayesian belief models. Workshop on Effective Interactive AI Resources, International Joint Conference on Artificial Intelligence (IJCAI 2001), 47-49.

Zapata-Rivera, J. D., & Greer, J. (2004). Interacting with Bayesian student models. International Journal of Artificial Intelligence in Education, 14(2), 127-163.

Zapata-Rivera, J. D., VanWinkle, W., Shute, V. J., Underwood, J. S., & Bauer, M. I. (2007). English ABLE. In Proceedings of the 13th International Conference on Artificial Intelligence in Education. 8pp.