Proceedings of EMOOCs 2017: Work in Progress Papers of the Experience and Research Tracks and Position Papers of the Policy Track

Assessment in an Online Mathematics Course

Ana Moura Santos1 and Pedro Ribeiro2
1 CEAFEL and Dept. of Mathematics, Instituto Superior Técnico, Universidade de Lisboa (Portugal)
2 Dept. of Physics, Instituto Superior Técnico, Universidade de Lisboa (Portugal)
ana.moura.santos@tecnico.ulisboa.pt

Abstract. We describe an assessment strategy that helped to ensure high participation levels, together with relevant certification, in a MOOC Mathematics course. We report on the first run of the Markov Matrices course, emphasizing the preparation of exercises with random parameters for the graded weekly tests. A brief description is given of the forum interactions during self-assessment moments, followed by a discussion of how quizzes and assessment tests positively influenced the MOOC completion rates.

Keywords: MOOC Mathematics course, Formative and summative assessment, Online parametrized exercises, MOOC forums, MOOC free certification.

1 Context

Assessment moments, in particular for students aiming at free certification in online courses, play a key role in audience participation rates, e.g. in Computer Science MOOCs [1, 3, 9]. Moreover, from our successful on-campus experiences combining formative and summative assessment [5], we saw great potential in aligning grading with the learning objectives, and therefore aimed to carry this winning approach over to a MOOC Mathematics course on Markov Matrices released on the Técnico Lisboa1 platform, MOOC Técnico2.

1 https://tecnico.ulisboa.pt/
2 https://mooc.tecnico.ulisboa.pt/

Online electronic exercises with random parameters and automatic feedback have been created and developed by one of the authors since 2000, and have been applied intensively for formative purposes since 2009, to encourage on-campus students to practice, with feedback from teachers and peers, throughout the whole semester [5]. Students enrolled in courses such as Linear Algebra, Vector Calculus, and Probability and Statistics are successfully exposed to both types of assessment, formative and summative, combined so as to achieve deeper and more personalized learning [6, 10].

The MOOCs of Técnico Lisboa3 are open online courses running on a customized platform built on top of Open edX. Starting in October 2016, three courses were released: Markov Matrices, Energy Services and Experimental Physics. They address several topics in basic engineering sciences and technology, organized into subtopics delivered sequentially over four or five consecutive weeks, according to well-structured guidelines (see [2]).

3 https://tecnico.ulisboa.pt/en/education/courses/online-courses/

More than 420 participants enrolled in the first edition of the Markov Matrices course, aged between 14 and 58 (average age 32), mostly from Portugal (87%), employed (57%), many of them holding a Master's degree (42%), and almost half without any link to Técnico Lisboa (48%). The course assessment activities fostered active participation, with 25% of the participants earning the free certificate.

2 Assessment strategies in the Markov Matrices course

We wanted to apply our winning on-campus assessment strategies [5] to a MOOC.
In granting certification, we also aimed to verify that each participant competently applies the concepts and algorithms conveyed in the course, rather than simply copying answers found by others.

The Markov Matrices course (mmX for short) was the first course to be designed and produced under the MOOC Técnico directives and delivered on the platform. The course was designed to serve two goals: to complement an introductory Linear Algebra4 course, and to address a broader audience with no specific previous knowledge of Linear Algebra. The overall structure consists of four weekly topics, each partitioned into several subtopics identified with a video module, in a sequence of contents and corresponding assessment moments of increasing complexity.

During the design and curriculum development of the course, several choices [8] were made by the instructor (one of the authors of this paper), not only in planning the videos (21 in total) as learning content units [4, 7], but also in designing the following assessment moments: formative self-assessment quizzes integrated and linked with practice exercises (18 quizzes in total), based on models chosen to illustrate the mathematical concepts relevant to the course; and a pool of coded exercises with random parameters (34 exercises in total), covering the MOOC's mathematical concepts and applications.

To strengthen the discussion and help participants internalize what they learned, practice exercises identical for all students were made available in the course. However, for evaluation purposes in the mmX online course, we wanted to prevent cheating and be able to recognize individual work. For that, similarly to our previous online exercises, we developed a system that generates several variations of the same type of exercise. The exercises with random parameters created specifically for the mmX course were built with the help of the Mathematica programming language5, and were conceived to be aligned with the video learning units, using similar graphics (see Fig. 2) for the main illustrated models and assessing all mathematical concepts, theorems included.

4 Linear Algebra is an undergraduate first-year course at Técnico Lisboa.
5 https://www.wolfram.com/mathematica

Most of the created exercises include graphs, elaborate model descriptions and theorem hypotheses, and therefore could not simply be randomized using the Python-based exercises produced directly in Open edX, which only supports randomization of numerical data6. The system we developed takes advantage of Open edX libraries and their import/export functionality7, which enables exercises to be stored in the platform for later use in a course (see Fig. 1). The first step consists in identifying the exact knowledge to be assessed by the exercise ("exercise identification") and which parameters, figures and/or statements of the exercise can be randomized. After that, a notebook must be created containing the Mathematica code that generates the body of the exercise, as well as any randomized figure or part of the model in the statement (a minimal illustrative sketch is given after Fig. 1).

Fig. 1. Diagram of the workflow for creating an exercise with randomized parameters.
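To make this step concrete, the following Wolfram Language sketch illustrates how such a generator notebook might build a pool of multiple-choice instances of a signal-transmission exercise; the function name generateInstance, the parameter values and the association keys are illustrative assumptions, not the course's actual notebook code.

(* Hypothetical sketch of a generator notebook, in the spirit of the workflow of Fig. 1. *)
generateInstance[] := Module[{p, transition, steady},
  p = RandomChoice[{0.90, 0.95, 0.99}];      (* randomized transmission fidelity *)
  transition = {{p, 1 - p}, {1 - p, p}};     (* 2x2 Markov matrix of the channel *)
  steady = {{0.5, 0.5}, {0.5, 0.5}};         (* limit of the powers of the matrix *)
  <|
    "statement" -> "A channel keeps a bit unchanged with probability p = " <>
      ToString[p] <> ". Which matrix do the powers of the transition matrix approach?",
    "correct" -> steady,
    "distractors" -> {transition, IdentityMatrix[2]}
  |>
]
pool = Table[generateInstance[], {10}];      (* ten variants, later exported to an edX library *)

In this sketch each instance keeps the statement, the correct answer and the distractors together in a single expression, so that a separate export step can turn every instance into an edX problem.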
To produce randomly generated answers, the code that generates them, for all correct and incorrect answer hypotheses, must also be present in the same notebook. Presently, three types of exercises can be generated: multiple-choice exercises, checkbox questions and numerical input questions (see Fig. 2). Once a notebook containing the exercise has been created, a zipped folder in a format that can be imported into a library on the edX platform must be generated, as indicated in Fig. 1. This is achieved using "LEIA", a separate notebook developed at Técnico that runs the notebooks containing the exercise parameter code and returns a zipped folder with all the instances of the respective exercise in an edX-compatible format. Once the zipped file is uploaded to the platform, the library can be chosen as an evaluation exercise, and the platform automatically assigns one of its exercises to a participant the first time he/she visits that element of the course. After an answer is submitted, the correction is revealed to the student, and the teacher receives an automatically generated gradebook with each student's score.

6 http://edx.readthedocs.io/projects/open-edx-building-and-running-a-course/en/latest/exercises_tools/numerical_input.html
7 http://edx.readthedocs.io/projects/open-edx-building-and-running-a-course/en/latest/exercises_tools/randomized_content_blocks.html

3 Running the course

The first edition of the Markov Matrices course started on October 19 and ended on November 23, 2016. In addition to the four consecutive weekly topics, it included an introductory half week with two videos which, together with a short quiz (diagnostic test), helped participants become acquainted with the organization of the contents, the type and timing of the assessment moments, and the communication channels of the course.

PARTICIPANT 1: The steady-state vectors in the signal transmission model
  [ p    1-p ]
  [ 1-p  p   ]
always converge to this matrix?
  [ 0.5  0.5 ]
  [ 0.5  0.5 ]
PARTICIPANT 2: I had also reached that conclusion, but I'm not sure if it is valid!
STAFF: Yes, it is true. After many steps, the message has only a 50% probability of staying the same as the original one... Of course, in real transmission lines the p coefficient is extremely high, and thus error dissipation is very slow (it only becomes noticeable after many transmission steps).

Table 1. Posts in the discussion forum concerning one of the practice exercises of Topic 2.

From day one, the discussion forum was the scene of lively exchanges of questions and answers (380 posts in total), with vivid and in-depth discussions. In contrast with the parametrized exercises, the practice exercises (18 quizzes in total) were identical for all students, so they could serve not only as preparation material for the evaluation units but also as a focal point for student interaction and cooperation. This discussion was encouraged on the MOOC Técnico forums by pinning a discussion thread to every practice exercise. The forum was also the channel of communication chosen for contact between students and course staff, for solving issues related to assessment and for clarifying the contents when doubts arose, mainly in connection with the exercises (see, e.g., Table 1).
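The convergence questioned in Table 1 can be made precise by a standard diagonalization argument (our remark, not part of the course materials reproduced above): for the symmetric transmission matrix with 0 < p < 1,

\[
M = \begin{pmatrix} p & 1-p \\ 1-p & p \end{pmatrix}, \qquad
M^k = \frac{1}{2}\begin{pmatrix} 1+(2p-1)^k & 1-(2p-1)^k \\ 1-(2p-1)^k & 1+(2p-1)^k \end{pmatrix}
\longrightarrow
\begin{pmatrix} 1/2 & 1/2 \\ 1/2 & 1/2 \end{pmatrix}
\quad \text{as } k \to \infty,
\]

since the eigenvalues of M are 1 and 2p-1, with |2p-1| < 1. When p is close to 1, the factor (2p-1)^k decays slowly, which is exactly the slow "error dissipation" mentioned by the staff.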
At the end of Topic 1, participants realized for the first time that the exercises in the assessment tests (5 tests in total), unlike the practice exercises in the self-assessment quizzes, were not the same for each of them (see an example in Fig. 2). Posts raised in the forums were then handled on a one-by-one basis, preserving the confidentiality of each contacting student's particular problem.

Participants' performance was evaluated through five assessment tests: one at the end of each of the four topics and one final test. The first assessment test, Test 1, was released at the same time as the contents of Topic 1 (October 21) and had a one-week submission deadline (October 28). The remaining weekly tests, Test 2, Test 3 and Test 4, were released simultaneously with the corresponding topics, but had due dates scheduled a little more than a week later.

Fig. 2. Two variations of the same exercise for two different students.

During this process, participants could check their individual results on the course progress bar. The weekly topic evaluation moments, counting together for 80% of the final grade, saw high participation when compared to standard numbers for MOOCs. The final assessment test (20% of the final grade) allowed us to evaluate the extent to which participants understood all the fundamental contents taught during the course. We emphasize once more that 25% of the participants earned a free honor certificate by successfully achieving at least 60% in the course's graded activities, which, compared with the 7.7% median certification rate of courses offering free certification reported in [3], represents a high completion rate.

4 Conclusions

Beyond these percentages, from the circa 50 answers to the questionnaire applied at the end of the course, we can conclude that mmX participants considered the assessment tests adequate to the contents addressed, reasonably simple and comprehensible in terms of complexity, and easier rather than demanding. They also added comments such as: "This model of assessment tests facilitates fast and successful learning of all concepts", and "…the existence of weekly assessment tests made me dedicate a little more of my time to the contents". Building on the success of the present experience with parametrized exercises, we expect to make widespread use of such exercises in future online courses produced for the MOOC Técnico platform.

References

1. Admiraal, W., Huisman, B. & Pilli, O. (2015). Assessment in Massive Open Online Courses. The Electronic Journal of e-Learning, vol. 13(4), pp. 207-216.
2. Costa, F.A., Moura Santos, A., Silva, A.G. & Viana, J. (2015). Guiões para desenho de cursos MOOC [Guidelines for the design of MOOC courses]. In MEC, Experiências de Inovação Didática no Ensino Superior. Lisboa: MEC, pp. 327-342.
3. Chuang, I. & Ho, A.D. (2016). HarvardX and MITx: Four Years of Open Online Courses -- Fall 2012-Summer 2016 (December 23, 2016). Available at SSRN: https://ssrn.com/abstract=2889436 or http://dx.doi.org/10.2139/ssrn.2889436
4. Diwanji, P., Simon, B.P., Märki, M., Korkut, S. & Dornberger, R. (2014). Success Factors of Online Learning Videos. Proceedings of the 2014 International Conference on Interactive Mobile Communication Technologies and Learning (IMCL), pp. 125-132.
5. Finamore, A.C., Moura Santos, A. & Ribeiro, P. (2016). Fostering STEM Formative Assessment for Lifelong Learners. In ICERI2016 Proceedings, ISBN: 978-84-617-5895-1, DOI: 10.21125/iceri.2016.1031.
6. Glazer, N. (2014). Formative plus Summative Assessment in Large Undergraduate Courses: Why Both? International Journal of Teaching and Learning in Higher Education, vol. 26(2), pp. 276-286.
7. Guo, P., Kim, J. & Rubin, R. (2014). How Video Production Affects Student Engagement: An Empirical Study of MOOC Videos. Proceedings of the First ACM Conference on Learning@Scale, pp. 41-50.
8. Moura Santos, A. & Viana, J. (2016). From Design to Production: First Course Experiences Within MOOC Técnico. In ICERI2016 Proceedings, ISBN: 978-84-617-5895-1, DOI: 10.21125/iceri.2016.1032.
9. Sharma, K., Kidzinski, L., Jermann, P. & Dillenbourg, P. (2016). Towards Predicting Success in MOOCs. In M. Khalil, M. Ebner, M. Kopp, A. Lorenz & M. Kalz (Eds.), Proceedings of the European Stakeholder Summit on Experiences and Best Practices in and around MOOCs (EMOOCs 2016). Graz: University of Graz, pp. 107-122.
10. Taras, M. (2002). Using Assessment for Learning and Learning from Assessment. Assessment and Evaluation in Higher Education, vol. 27, pp. 501-510.