Design of CoTAS: Automated Computational Thinking Assessment System

Sabiha Yeni                                  Felienne Hermans
Leiden University                            Leiden University
Leiden Institute of Advanced Computer Science  Leiden Institute of Advanced Computer Science
Leiden, The Netherlands                      Leiden, The Netherlands
s.yeni@umail.leidenuniv.nl                   f.f.j.hermans@liacs.leidenuniv.nl

Abstract

Computational thinking (CT) is widely accepted as a fundamental practice for equipping students to formulate and solve problems in the digital era. Many countries are adopting mandatory curricula for computer science (CS) lessons and CT education at high school. However, assessing how well students perform in CT activities is hard. Teachers face many challenges in the assessment process, because there are limited resources for assessment and a lack of online access to resources. Therefore, the goal of this paper is to support teachers by developing an effective automated Computational Thinking Assessment System (CoTAS) for instructing and evaluating the CT skills of high school students in a Python course. CoTAS facilitates the assessment of students' CT skills. Supported by CoTAS, teachers will be able to determine students' CT skill levels and shape their learning by continuously observing students' individual levels of development during the learning process. Teachers can access different resources to evaluate CT concepts, practices and perspectives. CoTAS can provide automatic feedback, so teachers can guide students directly when misconceptions arise. Moreover, CoTAS offers multi-disciplinary assessment tools which can be used not only in programming lessons, but also in other disciplines, such as science, mathematics and social sciences, in which CT skills are integrated.

Copyright © 2019 for this paper by its authors. Use permitted under Creative Commons License Attribution 4.0 International (CC BY 4.0).
In: I. Fronza, C. Pahl (eds.): Proceedings of the 2nd Systems of Assessments for Computational Thinking Learning Workshop (TACKLE 2019), co-located with 14th European Conference on Technology Enhanced Learning (EC-TEL 2019), 17-09-2019, published at http://ceur-ws.org.

1 Introduction

Assessment plays a key role in the development of many educational reform movements. The Horizon 2017 report defines the key trends accelerating the adoption of technology in higher education [1]. According to this report, there is a growing focus on measuring learning. This trend describes an interest in assessment and in the wide variety of methods and tools that educators use to evaluate, measure and document academic readiness, learning progress, skill acquisition and other educational needs of students. On the other hand, in a study supported by the European Commission called "Developing computational thinking in compulsory education", the authors emphasize that the evaluation of CT skills is at an early stage and needs further study, and that current assessment methods are insufficient for evaluating all aspects of CT skills [2]. Research on assessment literacy indicates that teachers who are new to a content area and teaching practice in particular often face many challenges when it comes to engaging in robust assessment practices in their instruction [3], [4], and many teachers who teach CT lack a strong background in programming [5]. In addition, constructionist approaches are actively used in CS education. Evaluating students in a constructionist learning environment is difficult, because it is an open-ended and ill-defined situation. Design-oriented learning environments based on constructionist approaches require frequent evaluation in various forms.
For assessing CT, various evaluation tools are used: tests, observations, open-ended questions and computer-based coding exams to determine levels of comprehension of programming-related terms and to shape teaching processes [6], [7], [8], [9]; artifact-based interviews, portfolios, the thinking-aloud method, projects and rubrics to assess students' effort and development in the process [10], [11], [12], [13]; performance-based assessment, problem-based assessment and design scenarios to evaluate problem solving, algorithmic thinking and abstraction skills [14], [15], [16]; automated programming assessment tools to provide quick feedback [17], [18], [19], [20]; and automated CT assessment tools to determine the CT skill levels of students [21], [22]. However, finding and validating measures that assess CT with a holistic approach remains challenging. Brennan and Resnick [23] make six suggestions for assessing CT: supporting further learning, incorporating artifacts (portfolio analysis), illuminating artifact development processes, checking in at multiple waypoints, valuing multiple ways of knowing and including multiple viewpoints.

Therefore, CoTAS aims to evaluate the CT concepts, practices and perspectives of high school students during education in a text-based programming language (Python). For evaluating students' comprehension of CT concepts, an automated quality assessment tool and a test tool will be used [21], [24], [25], [26], [27]. CoTAS will offer problem-based assignments and problem-based tests to evaluate CT practices. Finally, CoTAS will present surveys and interviews for evaluating students' CT perspectives.

2 General features of CoTAS

The goal of CoTAS is to improve the effectiveness of CT education and to support teachers during the evaluation of the CT skills of high school students. CoTAS provides both summative and formative evaluation tools for evaluating students' CT concepts, practices and perspectives [23], [28].

2.1 CoTAS Tools for CT Concepts

In order to follow students' comprehension of CT concepts (such as data structures, operators, conditionals, sequences, loops and functions), the automated quality assessment tool and the test tool of CoTAS will be used (Table 1). (1.1) The automated quality assessment tool will measure the proficiency level of students' code without the need for human supervision. CT metrics are identified to measure the proficiency and complexity level of code [21], [22], [24], [25], [26]. This tool of CoTAS will alleviate teachers' struggle with the manual assessment of students' CT skills, as well as providing real-time feedback on how students develop CT competency over time. (1.2) Students' knowledge of CT concepts will be evaluated at the end of each unit with the test tool, which consists of multiple-choice questions, in order to follow development during the training, to detect misconceptions and to identify issues that are not understood. The contents of the test will be programming-related questions at Bloom's lower-order thinking levels (knowledge, comprehension and application).

The evaluation results for CT concepts: On the "My Progress" page of CoTAS, four different types of assessment scores are shown in Figure 1 (1 to 4). The first part (Fig. 1.1) shows the proficiency level of students' projects. The percentage of projects at each proficiency level will be presented in three levels (basic, developing and proficient). The second part (Fig. 1.2) shows the usage frequency of CT concepts according to proficiency levels.

Figure 1: My Progress Page of CoTAS

2.2 CoTAS Tools for CT Practices

In order to evaluate students' proficiency level in CT practices (such as formulation, abstraction, algorithmic thinking, reusing and remixing, being iterative and incremental, and debugging), the problem-based assignment tool and the problem-based test tool of CoTAS will be used (Table 1). (2.1) The problem-based assignment tool will include authentic assessment questions [13] and problem-solving scenarios to evaluate students' CT practices. As automatic feedback, information comprising the number of attempts performed while solving a problem, the duration of code writing and the similarity of the final output to the desired output will be provided to analyze students' problem-solving process. Teachers can also score assignments manually according to predetermined rubrics. The rubrics will offer descriptors of performance levels for each CT practice. (2.2) The problem-based test tool will consist of multiple-choice questions at Bloom's higher-order thinking levels (analysis, synthesis and evaluation) in order to detect development in CT practices. This tool can be used not only in programming lessons, but also in other disciplines, such as science, mathematics and social sciences, in which CT skills are integrated.

The evaluation results for CT practices: The third part of the "My Progress" page (Fig. 1.3) presents problem-based assignment and test scores for CT practices.

2.3 CoTAS Tools for CT Perspectives

In order to evaluate students' CT perspectives (such as computational identity, programming empowerment, and the perspectives of expressing, connecting and questioning), CoTAS will offer survey and interview questions (Table 1). (3.1) Students will be required to rate their agreement or disagreement with the statements in the survey. Surveys will be conducted at different time points to capture the development of students' CT perspectives. (3.2) Interviews will be used to obtain more details on students' CT perspectives; however, the interview results will be evaluated manually according to predetermined rubrics, so this will require time and effort.

The evaluation results for CT perspectives: The fourth part of the "My Progress" page (Fig. 1.4) shows the survey scores for students' CT perspectives at different time points. Finally, Fig. 1.5 shows all actions a student can perform in CoTAS.

3 Benefits of CoTAS

CoTAS will provide different facilities for teachers, students and researchers during the assessment of the CT skills of high school students. With the help of CoTAS, teachers will be able to access CT evaluation resources. The time spent on evaluation will be reduced through automatic feedback and the provided resources. Teachers will be able to follow the progress of students during learning, and they can easily manage the evaluation content. Teachers will see the mistakes and misconceptions that students frequently make.

Students will be able to follow their own progress with the help of CoTAS. Students will receive instant and guiding feedback during learning and evaluation. CoTAS will provide students the opportunity to access resources anytime and anywhere. Through different assessment tools, students will be able to recognize their own weaknesses and receive guidance to support their individual development.

Researchers will be able to identify the factors that are effective in improving students' CT skills by making predictive assessments. They will be able to examine whether there is a relationship between the data obtained from CoTAS (such as the frequency of access to learning resources or the number of shared projects) and students' level of CT skills. Researchers will have the opportunity to examine the effectiveness of different assessment tools in predicting students' final achievements.

4 Conclusion

Many countries have taken steps to bring CT concepts into their curricula. Although there is a high level of consensus regarding the inclusion of CT-related concepts in the curriculum, it is known that there is a shortage of resources and uncertainty about how this higher-order thinking skill should be evaluated. In this context, CoTAS will contribute to the (inter)national CS education field. Moreover, CoTAS will provide different advantages for improving the CT skills of students. Coding is generally perceived as difficult by students, and it is one of the most difficult learning outcomes for teachers to evaluate. CoTAS will provide guided and instant feedback to students to improve the CT learning process. CoTAS proposes to combine formative and summative evaluation tools in a holistic approach. Thus, it can be used not only in programming lessons, but also in other disciplines, such as science, mathematics and social sciences, in which CT skills are integrated. The continuation of learning with the help of CoTAS (using automatic feedback) outside of regular lessons will provide the maximum benefit for students in reaching their learning goals for CT skills.

Table 1: CoTAS Tools Used for CT Components

| CT Components | CoTAS Tools | Feedback Type | Availability for Other Disciplines |
| Concepts (data structures, operators, conditionals, sequences, loops, functions) | (1.1) The automated quality assessment tool | Automatic | Python |
| | (1.2) The test tool (knowledge, comprehension and application level questions) | Automatic | Python |
| Practices (formulation, abstraction, algorithmic thinking, reusing and remixing, being iterative and incremental, debugging) | (2.1) The problem-based assignment tool | Manual & automatic | Python |
| | (2.2) The problem-based test tool (analysis, synthesis and evaluation level questions) | Automatic | All disciplines developing CT |
| Perspectives (computational identity, programming empowerment, perspectives of expressing, connecting and questioning) | (3.1) The survey | Automatic | All disciplines developing CT |
| | (3.2) Interviews | Manual | All disciplines developing CT |
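To illustrate the kind of analysis an automated quality assessment tool like (1.1) could perform, the sketch below counts how often a student's Python submission uses the CT concepts listed in Table 1 and maps the coverage to the three proficiency levels shown in Fig. 1. This is a minimal sketch in the spirit of Dr. Scratch-style scoring [21], not the actual CoTAS implementation; the mapping from AST node types to concepts and the level cut-offs are assumptions made here for illustration only.

```python
# Hypothetical sketch: count CT-concept usage in student Python code with the
# standard-library ast module, then map coverage to a proficiency level.
import ast

# Assumed mapping from AST node types to the CT concepts of Table 1.
CONCEPT_NODES = {
    "conditionals": (ast.If,),
    "loops": (ast.For, ast.While),
    "functions": (ast.FunctionDef,),
    "operators": (ast.BinOp, ast.BoolOp, ast.Compare),
    "data structures": (ast.List, ast.Dict, ast.Tuple, ast.Set),
}

def concept_usage(source: str) -> dict:
    """Count how often each CT concept occurs in the submitted code."""
    tree = ast.parse(source)
    counts = {concept: 0 for concept in CONCEPT_NODES}
    for node in ast.walk(tree):
        for concept, node_types in CONCEPT_NODES.items():
            if isinstance(node, node_types):
                counts[concept] += 1
    return counts

def proficiency_level(counts: dict) -> str:
    """Map concept coverage to basic/developing/proficient (invented cut-offs)."""
    used = sum(1 for c in counts.values() if c > 0)
    if used >= 4:
        return "proficient"
    if used >= 2:
        return "developing"
    return "basic"
```

For example, a submission defining a function that loops over a list and sums its positive elements exercises functions, loops, conditionals and operators, which this toy scheme would rate as "proficient". A production tool would of course need validated metrics rather than these invented thresholds.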
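The automatic feedback described for the problem-based assignment tool (2.1) combines the number of attempts, the time spent writing code and the similarity of the produced output to the desired output. A minimal sketch of such a feedback record follows, assuming a difflib-based similarity ratio and a report format invented here; the paper does not specify either.

```python
# Hypothetical sketch: summarize one problem-solving attempt for the teacher,
# using difflib's similarity ratio as the output-similarity measure.
from difflib import SequenceMatcher

def attempt_feedback(student_output: str, desired_output: str,
                     attempts: int, seconds_spent: float) -> dict:
    """Build a feedback record from one submission's process data."""
    similarity = SequenceMatcher(None, student_output, desired_output).ratio()
    return {
        "attempts": attempts,                        # tries before this run
        "minutes_spent": round(seconds_spent / 60, 1),
        "output_similarity": round(similarity, 2),   # 0.0 (no match) .. 1.0 (exact)
        "solved": similarity == 1.0,
    }
```

A record such as `attempt_feedback(out, expected, attempts=3, seconds_spent=420)` would let a teacher see at a glance that a student needed several tries and seven minutes but came close to the desired output, which is exactly the process view Section 2.1 argues for.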
References

[1] Becker, S. A., Cummins, M., Davis, A., Freeman, A., Hall, C. G., Ananthanarayanan, V. (2017). NMC horizon report: 2017 higher education edition. The New Media Consortium, 1-60.

[2] Bocconi, S., Chioccariello, A., Dettori, G., Ferrari, A., Engelhardt, K., Kampylis, P., Punie, Y. (2016). Developing computational thinking in compulsory education. European Commission, JRC Science for Policy Report.

[3] DeLuca, C., Klinger, D. A. (2010). Assessment literacy development: Identifying gaps in teacher candidates' learning. Assessment in Education: Principles, Policy and Practice, 17(4), 419-438.

[4] Popham, W. J. (2009). Assessment literacy for teachers: Faddish or fundamental? Theory into Practice, 48(1), 4-11.

[5] De Groot, J. (2018). Teaching Computational Thinking - What do our educators need? Delft University of Technology. Master thesis.

[6] Yadav, A., Burkhart, D., Moix, D., Snow, E., Bandaru, P., Clayborn, L. (2015). Sowing the seeds: A landscape study on assessment in secondary computer science education. Computer Science Teachers Association, NY.

[7] Grover, S., Cooper, S., Pea, R. (2014). Assessing computational learning in K-12. In Proceedings of the 2014 conference on Innovation and technology in computer science education, 57-62, ACM.

[8] Atmatzidou, S., Demetriadis, S. (2016). Advancing students' computational thinking skills through educational robotics: A study on age and gender relevant differences. Robotics and Autonomous Systems, 75, 661-670.

[9] Mishra, S., Iyer, S. (2015). An exploration of problem posing-based activities as an assessment tool and as an instructional strategy. Research and Practice in Technology Enhanced Learning, 10(1), 5.

[10] Kong, S. C. (2016). A framework of curriculum design for computational thinking development in K-12 education. Journal of Computers in Education, 3(4), 377-394.

[11] Kotini, I., Tzelepi, S. (2015). A gamification-based framework for developing learning activities of computational thinking. In Gamification in Education and Business, 219-252, Springer.

[12] Bers, M. U. (2010). The TangibleK Robotics program: Applied computational thinking for young children. Early Childhood Research and Practice, 12(2).

[13] Fronza, I., Ioini, N. E., Corral, L. (2017). Teaching computational thinking using agile software engineering methods: A framework for middle schools. ACM Transactions on Computing Education (TOCE), 17(4), 19.

[14] Werner, L., Denner, J., Campe, S., Kawamoto, D. C. (2012). The fairy performance assessment: measuring computational thinking in middle school. In Proceedings of the 43rd ACM technical symposium on Computer Science Education, 215-220, ACM.

[15] Webb, D. C. (2010). Troubleshooting assessment: an authentic problem solving activity for IT education. Procedia - Social and Behavioral Sciences, 9, 903-907.

[16] Djambong, T., Freiman, V. (2016). Task-based assessment of students' computational thinking skills developed through visual programming or tangible coding environments. International Association for Development of the Information Society.

[17] Ala-Mutka, K. (2005). A survey of automated assessment approaches for programming assignments. Computer Science Education, 15(2), 83-102.

[18] Edwards, S. H., Perez-Quinones, M. A. (2008). Web-CAT: automatically grading programming assignments. ACM SIGCSE Bulletin, 40(3), 328-328, ACM.

[19] Joy, M., Griffiths, N., Boyatt, R. (2005). The BOSS online submission and assessment system. Journal on Educational Resources in Computing (JERIC), 5(3), 2.

[20] Korhonen, A., Malmi, L., Silvasti, P. (2003). TRAKLA2: a framework for automatically assessed visual algorithm simulation exercises. In Proceedings of Kolin Kolistelut/Koli Calling - Third Annual Baltic Conference on Computer Science Education, 48-56.

[21] Moreno-León, J., Robles, G., Román-González, M. (2015). Dr. Scratch: Automatic analysis of Scratch projects to assess and foster computational thinking. Revista de Educación a Distancia, (46), 1-23.

[22] Aivaloglou, E., Hermans, F., Moreno-León, J., Robles, G. (2017). A dataset of Scratch programs: scraped, shaped and scored. In Proceedings of the 14th International Conference on Mining Software Repositories, 511-514, IEEE Press.

[23] Brennan, K., Resnick, M. (2012). New frameworks for studying and assessing the development of computational thinking. In Proceedings of the 2012 annual meeting of the American Educational Research Association, Vancouver, Canada, 1, 25.

[24] Seiter, L., Foreman, B. (2013). Modeling the learning progressions of computational thinking of primary grade students. In Proceedings of the ninth annual international ACM conference on International computing education research, 59-66, ACM.

[25] Wolz, U., Hallberg, C., Taylor, B. (2011). Scrape: A tool for visualizing the code of Scratch programs. Poster presented at the 42nd ACM Technical Symposium on Computer Science Education, Dallas, TX.

[26] Basawapatna, A. R., Repenning, A., Koh, K. H. (2015). Closing the cyberlearning loop: Enabling teachers to formatively assess student programming projects. In Proceedings of the 46th ACM Technical Symposium on Computer Science Education, 12-17, ACM.

[27] Boe, B., Hill, C., Len, M., Dreschler, G., Conrad, P., Franklin, D. (2013). Hairball: Lint-inspired static analysis of Scratch projects. In Proceedings of the 44th ACM technical symposium on Computer Science Education, 215-220, ACM.

[28] Kong, S. C. (2019). Components and methods of evaluating computational thinking for fostering creative problem-solvers in senior primary school education. In: Kong, S. C., Abelson, H. (eds) Computational Thinking Education. Springer, Singapore.