=Paper=
{{Paper
|id=Vol-3292/paper02
|storemode=property
|title=Towards a computer-assisted Computational Thinking assessment system in higher education
|pdfUrl=https://ceur-ws.org/Vol-3292/DCECTEL2022_paper02.pdf
|volume=Vol-3292
|authors=Xiaoling Zhang,Marcus Specht
|dblpUrl=https://dblp.org/rec/conf/ectel/ZhangS22
}}
==Towards a computer-assisted Computational Thinking assessment system in higher education==
Xiaoling Zhang, Marcus Specht
Delft University of Technology, Van Mourik Broekmanweg 6, Delft, 2628 XE, The Netherlands
Abstract
With the vision of promoting CT to a wider audience, this PhD project explores the formative assessment of CT skills in Programming Education to support students in learning CT skills in Higher Education. In this project, we plan to investigate the importance of CT in the context of Higher Education, explore the relationship between CT skills and programming skills, build a model to assess learners' CT skills, and develop a computer-assisted assessment system with automated components to enhance students' CT competences in Higher Education. Mixed-method research methodologies will be employed in the distinct phases of the project. A system which allows formative assessment of CT skills will be iteratively designed and constructed throughout the project. The outcome of the project should support the CT learning process, make CT more visible to people from diverse backgrounds, and empower them with a CT mindset to embrace the digitalisation of society.
Keywords
Computational Thinking, Computer-Assisted Assessment, Higher Education, Educational Technology
1. Introduction

1.1. Digitalisation and Computational Thinking

Living in an era of digitalisation, digital elements are everywhere. Education, healthcare and governance, fundamentals of a modern society, are all developing in a digital direction [1-3]. This has a huge influence on employment and skills, such as the increasing unemployment rate and the increasing demand for digital skills in the labour market [4]. To give people the capability of living and working in such a digitalised society, governments and educational institutions at distinct levels worldwide have been striving to promote education in computer-based technologies and skills, from academia to industry. Among the skills mentioned, digital skills, problem-solving skills, and computational thinking (CT) are the most frequently cited and are regarded as fundamental skills in workplaces [5-7, 28].

Computational Thinking is closely related to the development of digitalisation in different domains and changes the professional competencies needed for these professions. First proposed by Papert as procedural thinking [8] and then promoted by Wing [9], CT has been the subject of a considerable amount of definitional research over the past few decades. Though there is no agreed-upon theoretical or operational definition so far, existing works share the main components of CT: problem decomposition, abstraction, pattern recognition and algorithm design [9-15]. Besides studying the operational and theoretical definition of CT, many studies have been conducted globally to investigate topics around CT education, such as pedagogical content, didactic strategies, and the integration of CT into other disciplines [16-26].

Proceedings of the Doctoral Consortium of the Seventeenth European Conference on Technology Enhanced Learning, September 12-16, 2022, Toulouse, France
EMAIL: x.zhang-14@tudelft.nl (A. 1); m.m.specht@tudelft.nl (A. 2)
ORCID: 0000-0003-0951-0771 (A. 1); 0000-0002-6086-8480 (A. 2)
© 2022 Copyright for this paper by its authors. Use permitted under Creative Commons License Attribution 4.0 International (CC BY 4.0).
CEUR Workshop Proceedings (CEUR-WS.org)

People of almost all ages can participate in such studies; however, most existing research focuses specifically on K-12 settings, with an increasing number of studies conducted in Higher Education over the last decade. Work in K-12 settings has explored a considerable range of topics on learning and teaching CT in both science, technology, engineering, and mathematics (STEM) and non-STEM disciplines, resulting in a flourishing of tools and activities for teaching and learning CT, both CS-unplugged, such as the Bebras challenge and Lego construction, and CS-oriented, such as programmable robotics, micro:bits, code.org, Scratch and Alice [20]. While CT is regarded as a crucial competence for learners in higher education, its development there, compared to K-12 settings, is still in its infancy. Increased attention has been paid to CT in Higher Education in recent years, most of which is related to the Computer Science (CS) major, and little of which is in non-CS disciplines [26]. In their literature review, Lyon and Magana identified several issues in current CT education that make it difficult for students to understand CT, including an unclear definition, a lack of assessment methods, and unclear use of CT in classrooms [26]. They also stressed the necessity of a clearer definition of CT and called for more implementations and studies of CT in Higher Education.

From current insights into the existing literature, it is clear that CT is closely related to the development of digitalisation in different domains and changes the professional competencies needed for these professions. However, it is still unclear how to embed CT in different curricula and how to develop transdisciplinary CT skills. Therefore, researchers need to conduct studies to establish a comprehensive and more complete system for enhancing people's CT competencies.

1.2. Computational Thinking and Programming Education in Higher Education

Learners of diverse backgrounds learn CT for various purposes, and learners' target objectives differ according to their level of proficiency. Therefore, it is important to know which skills need to be developed in higher education, what proficiency level of CT is expected of people from distinct domains, and in what way CT should be incorporated into different domains in Higher Education. Programming education is frequently used for fostering CT in higher education; visual programming in Scratch and Alice as well as text programming in Python, C, C++ and Java have been used for teaching CT in K-12 as well as in Higher Education settings [39-40]. However, whether everyone should learn to code remains a controversial topic. For example, Shein claimed that "Not everyone needs coding skills but learning how to think like a programmer can be useful in many disciplines" [35]. Therefore, it is important to study the role of Programming Education.

CT and programming skills are closely interlinked and are both challenging for novice learners [29, 30]. A significant drop-out rate can be found among novice learners in programming education, due to the distinct difficulties students meet during their learning process [31]. Pane et al. [32] found that the ability to solve problems using programming skills, such that the solution can be transformed and executed by computing agents, does not come naturally to learners in CS studies. Additionally, studies suggest that the absence of strategic tools can lead to deficient performance in learning to program [33-34].

To overcome these challenges, it is necessary to study both programming skills and CT skills and the relationship between them, which has seldom been researched.

Through qualitative and quantitative analyses, Selby [38] built a preliminary model to reveal connections between CT skills and programming activities using Bloom's taxonomy. However, it does not demonstrate in detail how CT can be measured in programming. Thus, it is necessary to carry out studies on how to empower students to use CT as a strategic tool for programming and to gain CT knowledge through learning to program.

In brief, the following questions should be studied regarding CT and Programming Education in Higher Education:
• What skills are necessary for students in different domains in Higher Education?
• What is the role of Programming Education for students from different domains in Higher Education?
• How are programming skills and CT skills related, and how can CT skills be fostered via programming?

1.3. Formative Assessment and Feedback Generation

Novice programmers face challenges such as misunderstanding programming concepts, misusing the language syntax, and poorly understanding the feedback generated by the interpreter or compiler [31]. Approaches to overcoming these issues include enhancing teachers' pedagogical content knowledge, developing more effective didactic strategies, and using formative assessment to provide feedback.

Assessment and feedback are essential elements of the learning theories used to assist students in the learning process [41]. Assessment generally falls into two categories: formative assessment and summative assessment. Formative assessment is defined as assessment for learning, while summative assessment is assessment of learning [42]. Formative assessment generally consists of teacher observation, conventional assessment, oral presentation and so on. According to Black and Wiliam [43], formative assessment remains incomplete until it has resulted in feedback and action on the part of the instructor and/or learner. Therefore, formative assessment is all about feedback. According to Hattie and Timperley [45], feedback is one of the most crucial factors for efficient learning.

The development of formative assessment in Programming Education is still at an early stage, though there has been much research in recent years on intelligent tutoring systems that assess students' solutions. Computer-assisted learning environments provide the opportunity to automate assessment, and considerable work has been conducted to assess work in STEM disciplines automatically [44]. In terms of Programming Education, Grover [42], in the Raspberry Pi Foundation Computing Education Research Seminar, strived to promote the concept of formative assessment in CS for K-12. In contrast, no existing study explicitly facilitates formative assessment in computing education or, specifically, in Programming Education in Higher Education.

While most assessments conducted on CT and Programming Education are summative, there is some work that applies formative assessment measures. These implementations focus on only part of programming education, and none of them incorporates CT into programming education, making them infeasible for assessing CT in Programming Education. Meanwhile, some studies aim at supporting students in learning to program, mostly in the form of automated assessment systems and intelligent tutoring systems for programming exercises. In their literature review, Keuning et al. [47] reported that most of the elaborate feedback provided by the systems reviewed focuses on identifying mistakes, with no further suggestions on how to proceed and fix the problem. This, however, can impede students from enhancing their performance, according to the feedback model defined by Hattie and Timperley [45]. Therefore, it is necessary to explore formative assessment of CT in Programming Education in order to assist students in the learning process and enhance their CT.

With the vision to make CT skills more accessible and tangible in the context of Programming Education for learners from different domains, this project aims to develop formative assessment components to improve students' performance in learning to program and gaining CT skills.

2. Theoretical Background

To address the questions mentioned in the last section, theories on formative assessment and theoretical models of CT and Programming Education are crucial. They are therefore being investigated to ensure the reliability of the project's conduct. CT and Programming Education are first introduced, with a focus on Brennan and Resnick's operational framework [16] and Bloom's taxonomy applied to Programming Education. Then follow theories of formative assessment and feedback models, with a focus on Hattie's feedback model and the theory of formative assessment from Black and Wiliam [43]. These theories are identified as the backbone of the implementation of this project.

2.1. Computational thinking and programming education (Bloom's Taxonomy)

Although there are no agreed-upon operational and theoretical definitions, the definitions given by researchers and educators share the same elements. Wing defined CT operationally with the concepts of abstraction and automation [9]. Using the components of Wing's definition, Barr and Stephenson [46] also included problem decomposition, algorithmic thinking, data collection, analysis and representation, and simulation in defining CT. Similarly, Selby's definition of CT consists of abstraction, decomposition, generalization, evaluation and algorithmic design [38]. Four main components of CT can be identified from existing definitions: problem decomposition, pattern recognition, abstraction and algorithmic design.

Deriving from these main CT components, Brennan and Resnick [16] proposed an operational framework of CT which is frequently used in CT studies and which relates closely to programming concepts and skills. Three dimensions constitute the framework: computational concepts, computational practices and computational perspectives. These components are recognizable in other disciplines and practices as well, which is consistent with Denning's description of CT: it is nothing new; it is the way of thinking about the world shaped by current technologies [50]. This framework considers elements comprehensively from both a knowledge perspective and a psychology perspective, and it can be practically used for setting learning objectives, designing pedagogical content, and assessing students' performance [48].

The CT concepts and CT practices in this framework [48] are some of the indicators that measure CT competences through programming concepts and practices. Studies have been conducted to map programming skills to CT skills, as well as to use Bloom's taxonomy and the SOLO taxonomy to differentiate various levels of cognition for both CT and programming skills [36, 37]. The assessment of CT by analysing Scratch code in Dr. Scratch with the framework presented by Brennan [16] is an example of how CT can be matched in Programming Education [49]. Selby [39] developed a model which describes the relationship between CT skills and programming activities using Bloom's taxonomy. This model can serve as the backbone for fostering CT via programming and vice versa.

2.2. Formative assessment and feedback generation

Having a CT framework and a model which maps CT to programming using the cognitive levels of Bloom's taxonomy is insufficient for this project, as the aim is to enhance students' CT skills via formative assessment. Therefore, this subsection introduces theories of formative assessment and models for generating feedback, as formative assessment is said to be all about feedback [42].

Assessment is identified as one of the fundamental elements in all learning theories in education [41]. Formative assessment is defined as assessment for learning, and it is expected to result in feedback and action on the part of the instructor and/or learner. Thus, feedback is crucial in formative assessment, which is consistent with the claim that "feedback plays a crucial role in learning" [27].

The efficiency of feedback is influenced by the kind of formative feedback provided and by learner characteristics. Under the definition given by Boud and Molloy [51], feedback is formative and can be used to improve learners' performance. Another type of feedback is summative feedback, which typically consists of grades or an evaluation percentage and informs the learner about their performance. However, this type of feedback is usually too superficial to be useful for learners. Therefore, formative feedback is of more importance for the purpose of improving learning.
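To make the contrast concrete, the toy sketch below renders one and the same unit-test result as summative and as formative feedback; the function names and messages are invented for this illustration and are not part of any system described in this proposal.

```python
# Toy illustration only: the same test result rendered as summative
# versus formative feedback. All names and wording are invented for
# this sketch.

def summative_feedback(passed: int, total: int) -> str:
    """Summative: reports a grade and nothing else."""
    return f"Score: {passed}/{total} ({100 * passed // total}%)"

def formative_feedback(passed: int, total: int,
                       mistake: str, next_step: str) -> str:
    """Formative: names the mistake and suggests how to proceed."""
    return (f"{passed} of {total} tests pass.\n"
            f"What went wrong: {mistake}\n"
            f"How to proceed: {next_step}")

print(summative_feedback(3, 5))
print(formative_feedback(
    3, 5,
    mistake="the loop stops one element early (off-by-one bound)",
    next_step="compare the range() end value with the list length"))
```

The summative variant informs the learner only of a percentage, while the formative variant carries the information a learner can act on; this is the distinction the rest of this subsection builds on.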
Different definitions and models have been investigated regarding feedback generation, both in general and for studies in specific domains. Boud and Molloy define feedback as a process in which learners improve their work with given information that presents the discrepancies and similarities between their work and the expected standards [51]. Hattie and Timperley [45] described a model for feedback which is also formative in nature. The model aims to answer learners' questions about where they are, how they should proceed and where they should arrive. In this model, feedback is categorized into "task level", "process level", "self-regulation level" and "self level", with findings indicating that the self level is the least effective one.

Having a model of feedback is insufficient for generating the most effective feedback for learners; extra facets should be considered when generating feedback. In Le and Pinkwart's work [52], programming exercises supported in learning environments were categorized into three classes according to the degree of ill-definedness of the programming problem. As Hattie and Timperley [45] pointed out that feedback should target students at appropriate levels, it is also necessary to consider Narciss's [53] categorization of feedback in computer-assisted learning environments according to aspects of the instructional context. Narciss [53] identified eight types of feedback components, five of which are elaborated feedback components intended to "improve learner's performance": knowledge about task constraints (KTC), knowledge about concepts (KC), knowledge about mistakes (KM), knowledge about how to proceed (KH) and knowledge about meta-cognition (KMC). By combining the context to be assessed, the type of exercises to be assessed and the feedback level to provide, a strategy for generating feedback can be devised.

In sum, this project will first focus on identifying the need for CT and the role of Programming Education in different disciplines. Then the focus will shift to the measurement of CT skills and programming skills and the relationship between these two sets of skills. Based on the studies conducted, the project will then explore feedback generation and develop feedback generation strategies to promote CT for students from different domains and to enhance their performance in CT skills and programming skills. The following definitions will be used for the remainder of the proposal:

• CT competencies: according to Brennan's framework, CT competencies refer to CT concepts, CT practices and CT perspectives.
• Programming skills: including conceptual knowledge, syntactic knowledge, strategic knowledge and programming style.
• Indicators for CT skills and programming skills: any features or instruments that provide a sign or signal of CT competence and programming skills.
• Formative assessment: a kind of assessment which provides feedback to the learner; it is an assessment for learning.

3. Research Questions

The research will be guided by the following research questions:

RQ1. How are CT skills and programming skills being conceptualised and measured?
1. What are indicators and assessment methods for CT competence and programming skills?
2. What systems and domains are using the indicators and assessments for CT competence and programming skills?
3. How can the validity of the indicators/assessments be evaluated?

After collecting the indicators for CT competencies and the assessment methods, the techniques used for formative assessment and feedback generation, and the effects of feedback, should be investigated to provide the basis for designing feedback generation strategies. Therefore, the second research question is:

RQ2. How should feedback be provided to support developing CT skills and programming skills, and how should formative assessment be implemented in this process?
1. What formative assessment and feedback generation strategies are used for the development of programming skills and CT competence?
2. What are the effects of different types of feedback on motivation, learning gain, and CT performance?
3. What empirical knowledge has been established regarding the effect of providing feedback on the development of CT competence and programming skills?
4. How can formative assessment and feedback generation be used to support the development of CT and programming skills?

Based on the results obtained by answering the questions above, the next step is to contextualize the feedback and thus employ formative assessments for learners from different educational backgrounds. To achieve this goal, the following questions should be studied:

RQ3. How can Programming Education and the learning of CT be contextualised and embedded in different educational domains?
1. How important are links between curricular tasks and CT skills?
2. What role can transfer learning play in the contextualisation of CT?
3. What are the means to contextualise and embed CT learning in different domains?
4. What is the impact of contextualised teaching of CT skills on student motivation and understanding?

4. Design and Methods

The research is organized in four phases. In the first phase, desktop research in the form of a systematic literature review will be used to identify relevant works and obtain an overview of the state of the art regarding the topic studied in this project: formative assessment for supporting students from different disciplines in learning CT in the context of Programming Education in Higher Education. The following factors will be identified in this phase: indicators and assessment methods used for CT in Programming Education; formative assessment and feedback generation; and empirical experiences of CT in different domains. The indicators identified in the first phase can then be used to develop an assessment model for CT in the context of Programming Education and a CT dashboard to present learners' progress and CT level. Exploratory research in the form of formative studies will be employed in this phase. Phase three will focus on the development of strategies for feedback generation and formative assessment based on the assessment model and the CT dashboard built in phase two. In the last phase, an integrated study will be conducted to evaluate the tool developed and refine the system according to the needs of people from different backgrounds. In parallel, the design and development of the formative assessment tool for CT in the context of Programming Education will be carried out throughout the lifecycle of the project. In addition, the design, development and testing of the prototype will proceed iteratively. The planned workflow is shown in Figure 1 (in the Appendix).

Phase 1 Desktop research - Literature review

In this phase, a systematic literature review will be conducted to obtain a holistic overview of formative assessments for supporting learners in different disciplines in learning CT in the context of Programming Education. This process will follow the PRISMA statement and the PRISMA diagram, including defining research questions, collecting literature, screening, checking the eligibility of the literature, and extracting and analysing data. RQ1.1, RQ1.2, RQ2.1 and RQ3.1 will be addressed in this phase. The outcomes of this phase will be the indicators and assessment methods used for CT in Programming Education; a comprehensive overview of formative assessment and feedback generation; and empirical experiences of CT in different domains.

Phase 2 Exploratory research/Formative studies - Build up the assessment model and a CT Dashboard

This phase begins with interviews with different target groups. The aim of the interviews is to identify the necessity of CT skills and the role of Programming Education for learners with diverse backgrounds. In combination with the indicators and assessment methods identified in Phase 1, assessment models can then be prototyped according to the results of a qualitative analysis of the interviews. The interviews should also clarify the embedding of CT skills in the different study contexts and their relevance for students' and educators' goals in the different curricula. According to these goals and models, a CT dashboard will be developed. To ensure the usability of the models and the CT dashboard, a usability study will be conducted in a programming course, and the models and CT dashboard will be refined accordingly. Once the usability of the model is verified, quasi-experimental studies will then examine the effect of using the assessment model and CT dashboard.
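As a purely illustrative sketch of the mechanical step such an assessment model would need, the fragment below counts a few CT-related constructs in a student's Python submission, loosely in the spirit of the indicator counting used by tools like Dr. Scratch; the indicator names and the construct-to-indicator mapping are invented here and are not the project's actual model.

```python
# Hypothetical sketch: derive indicator counts for a CT dashboard entry
# from static analysis of student Python code. The mapping below is
# invented for illustration, not the assessment model of this project.
import ast

# Map syntactic constructs to illustrative CT-related indicators.
INDICATORS = {
    ast.If: "logic",                   # conditional reasoning
    ast.For: "iteration",              # loops
    ast.While: "iteration",
    ast.FunctionDef: "decomposition",  # splitting the problem into parts
}

def indicator_counts(source: str) -> dict:
    """Count each indicator's occurrences in the submitted code."""
    counts = {"logic": 0, "iteration": 0, "decomposition": 0}
    for node in ast.walk(ast.parse(source)):
        label = INDICATORS.get(type(node))
        if label:
            counts[label] += 1
    return counts

submission = """
def mean(xs):
    total = 0
    for x in xs:
        total += x
    if not xs:
        return 0
    return total / len(xs)
"""
print(indicator_counts(submission))  # {'logic': 1, 'iteration': 1, 'decomposition': 1}
```

A real assessment model would of course weight such static indicators and combine them with process data and the interview-derived goals described above; the sketch only shows how code can be turned into indicator counts that a dashboard could display.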
In this phase, RQ1.3, RQ2.2 and RQ2.3 will be studied, and an assessment model based on the indicators and assessment methods found in Phase 1 will be developed. This will include a participatory design and a prototype of a CT dashboard. The design and development of the models and the CT dashboard will proceed iteratively.

Phase 3 Develop feedback and formative assessment based on assessment model and CT Dashboard

This phase will focus on addressing RQ2.4, which is about developing a proper feedback generation strategy to present to students their CT competencies and programming skills, based on the strategies for feedback generation and formative assessment identified in Phase 1 and the CT assessment prototype and CT dashboard developed in Phase 2. Formative studies will be conducted to iteratively develop the feedback generation model. Student models will be identified in this phase using data such as analysis of students' code, students' competence profiles and analysis of students' performance. At the end of this phase, strategies for providing feedback and formative assessment should be identified.

Phase 4 Evaluation - Integrated study on the developed formative assessment tool

The results from Phase 3 will provide a basis to address RQ3.2 to RQ3.4 in this phase. Considering the factors important in adapting feedback for learners from different domains identified in Phase 1, RQ3.2 to RQ3.4 will be addressed by conducting an integrated study which includes both case studies and an evaluation study, to contextualise the model developed, embed it into different educational domains, and verify the validity and effectiveness of the designed system. This integrated study aims to evaluate the tool developed and refine the system according to the diverse needs of people from different backgrounds, such that CT can be promoted further to a wider audience.

5. Acknowledgements

Xiaoling Zhang: Conceptualization, Methodology, Data Collection, Analysis, Writing – Original Draft, Writing – Review & Editing, Visualization, Resources
Marcus Specht: Conceptualization, Methodology, Writing – Review & Editing

This work is part of a PhD project funded by the Center for Education and Learning at Leiden-Erasmus-Delft Universities (LDE-CEL).

6. References

[1] Dillenbourg, P. (2016). The Evolution of Research on Digital Education. Int J Artif Intell Educ 26, 544–560. https://doi.org/10.1007/s40593-016-0106-z.
[2] Duggal, R., Brindle, I., Bagenal, J. (2018, January 15). Digital healthcare: regulating the revolution. https://doi.org/10.1136/bmj.k6.
[3] Holzer, M., Kim, S.-T. (2006). Digital Governance in Municipalities Worldwide (2005): A Longitudinal Assessment of Municipal Websites Throughout the World. United Nations Public Administration Network. http://unpan1.un.org/intradoc/groups/public/documents/aspa/unpan022839.pdf.
[4] Schwab, K., Sala-i-Martín, X. (2013). The Global Competitiveness Report 2013–2014: Full Data Edition. URI: http://hdl.handle.net/11146/223.
[5] Barr, D., Harrison, J., Conery, L. (2011). Computational thinking: A digital age skill for everyone. Learning & Leading with Technology.
[6] García-Peñalvo, F. J., Mendes, A. J. (2018). Exploring the computational thinking effects in pre-university education. Computers in Human Behavior, 80, 407-411. ISSN 0747-5632. https://doi.org/10.1016/j.chb.2017.12.005.
[7] Juškevičienė, A., Dagienė, V. (2018). Computational Thinking Relationship with Digital Competence.
[8] Papert, S. (1980). Mindstorms: Children, Computers, and Powerful Ideas. Basic Books.
[9] Wing, J. (2006). Computational thinking. Commun. ACM 49(3), 33–35.
[10] Wing, J. (2008). Computational thinking and thinking about computing. Philosophical Transactions of The Royal Society A, 366, 3717-3725.
[11] Wing, J. (2011). Research Notebook: Computational Thinking - What and Why? The Link. Pittsburgh, PA: Carnegie Mellon.
[12] Computer Science Teachers Association Task Force. (2011). K–12 Computer Science Standards. New York: ACM.
[13] Hu, C. (2011). Computational thinking: what it might mean and what we might do about it. Proceedings of the 16th annual joint conference on Innovation and technology in computer science education. Darmstadt, Germany: ACM.
[14] Guzdial, M. (2011). A Definition of Computational Thinking from Jeannette Wing. Computing Education Blog [Online]. Available from: http://computinged.wordpress.com/2011/03/22/a-definition-of-computational-thinking-from-jeanette-wing/ [Accessed 30-11-2020].
[15] Guzdial, M. (2012). A nice definition of computational thinking, including risks and cyber-security. Computing Education Blog [Online]. Available from: http://computinged.wordpress.com/2012/04/06/a-nice-definition-of-computational-thinking-including-risks-and-cyber-security/ [Accessed 30-11-2020].
[16] Brennan, K., Resnick, M. (2012). New frameworks for studying and assessing the development of computational thinking. Paper presented at the Annual Meeting of the American Educational Research Association, Vancouver, BC.
[17] Berkaliev, Z., et al. (2014). Initiating a programmatic assessment report. Primus 24, 403–420. https://doi.org/10.1080/10511970.2014.893939.
[18] Curzon, P., et al. (2014). Developing computational thinking in the classroom: a framework. Comput. Sch. http://eprints.soton.ac.uk/369594/10/DevelopingComputationalThinkingInTheClassroomaFramework.pdf.
[19] Evia, C., Sharp, M. R., Perez-Quinones, M. A. (2015). Teaching structured authoring and DITA through rhetorical and computational thinking. IEEE Trans. Prof. Commun. 58, 328–343. https://doi.org/10.1109/TPC.2016.2516639.
[20] Grover, S., Pea, R. (2013). Computational thinking in K-12: A review of the state of the field. Educ. Res. 42, 38–43. https://doi.org/10.3102/0013189X12463051.
[21] Jaipal-Jamani, K., Angeli, C. (2017). Effect of robotics on elementary preservice teachers' self-efficacy. Sci. Learn. Comput. Think., 175–192. https://doi.org/10.1007/s10956-016-9663-z.
[22] Jeon, Y., Kim, T. (2017). The effects of the computational thinking-based programming class on the computer learning attitude of non-major students in the teacher training college. J. Theor. Appl. Inf. Technol. 95, 4330–4339.
[23] Kim, B., Kim, T., Kim, J. (2013). Paper-and-pencil programming strategy toward computational thinking for non-majors: Design your solution. J. Educ. Comput. Res. 49, 437–459. https://doi.org/10.2190/EC.49.4.b.
[24] Lan, Y. (2017). Exploration on database teaching based on computational thinking. Bol. Tec. Bull. 55, 363–370. https://www.scopus.com/inward/record.uri?eid=2-s2.0-85038935540&partnerID=40&md5=010e1ae2c24dbf9eaa18b000844ece22.
[25] Mouza, C., et al. (2017). Resetting educational technology coursework for pre-service teachers: A computational thinking approach to the development of technological pedagogical content knowledge (TPACK). Australas. J. Educ. Technol. 33. https://doi.org/10.14742/ajet.3521.
[26] Lyon, J., Magana, A. (2020). Computational thinking in higher education: A review of literature. Computer Applications in Engineering Education, 28. https://doi.org/10.1002/cae.22295.
[27] de Araujo, A. L. S. O., Andrade, W. L., Guerrero, D. D. S. (2016). A systematic mapping study on assessing computational thinking abilities. In Frontiers in Education Conference (FIE), 2016 IEEE, pp. 1–9.
[28] García-Peñalvo, F. J., Mendes, A. J. (2018). Exploring computational thinking effects in pre-university education. Computers in Human Behavior, 80, 407-411. ISSN 0747-5632. https://doi.org/10.1016/j.chb.2017.12.005.
[29] Tedre, M. (2017). Many paths to computational thinking. Paper presented at the TACCLE 3 final conference, Brussels, Belgium.
[30] Denning, P., Tedre, M., Yongpradit, P. (2017). Misconceptions about computer science. Communications of the ACM, 60(3), 31-33.
[31] Qian, Y., Lehman, J. (2017). Students' Misconceptions and Other Difficulties in Introductory Programming: A Literature Review. ACM Trans. Comput. Educ. 18(1), Article 1, 24 pages. https://doi.org/10.1145/3077618.
[32] Pane, J. F., Ratanamahatana, C. A., Myers, B. A. (2001). Studying the
language and structure in non- [40] Valerie J. Shute, Chen Sun, Jodi Asbell-
programmers' solutions to programming Clarke. Demystifying computational
problems. International Journal of Human- thinking. Educational Research Review.
Computer Studies, 54, 237-264. Volume 22. 2017. Pages 142-158. ISSN
[33] Robins, A., Rountree, J. & Rountree, N. 1747-938X.
2003. Learning and Teaching https://doi.org/10.1016/j.edurev.2017.09.0
Programming: A Review and Discussion. 03.
Computer Science Education, 13, 137 - [41] Dale H. Schunk. (2012). Learning theories
172. an educational perspective sixth edition.
[34] Saknini, V. & Hazzan, O. 2008. Reducing [42] Shuchi Grover. (2020). Formative
Abstraction in High School Computer Assessment for Students in CS
Science Education: The Case of Classrooms,
Definition, Implementation, and Use of https://www.youtube.com/watch?v=0ZuS
Abstract Data Types. J. Educ. Resour. qsJQRFg&feature=emb_title. [last access:
Comput., 8, 1-13. 2020-11-16]
[35] Esther Shein. 2014. Should everybody [43] Black, P., Wiliam, D. Developing the
learn to code? Commun. ACM 57, 2 theory of formative assessment. Educ Asse
(February 2014), 16–18. Eval Acc 21, 5 (2009).
DOI:https://doi.org/10.1145/2557447. https://doi.org/10.1007/s11092-008-9068-
[36] Susana Masapanta-Carrión and J. Ángel 5.
Velázquez-Iturbide. 2018. A Systematic [44] Barana, A., Conte, A., Fioravera, M.,
Review of the Use of Bloom's Taxonomy Marchisio, M., Rabellino, S.; A model of
in Computer Science Education. In formative automatic assessment and
Proceedings of the 49th ACM Technical interactive feedback for STEM. In:
Symposium on Computer Science Proceedings of 2018 IEEE 42nd Annual
Education (SIGCSE '18). Association for Computer Software and Applications
Computing Machinery, New York, NY, Conference, pp. 1016–1025. IEEE
USA, 441–446. Computer Society Conference Publishing
DOI:https://doi.org/10.1145/3159450.315 Services (CPS), Tokyo, Japan (2018).
9491. [45] Hattie, J., & Timperley, H. (2007). The
[37] David Ginat and Eti Menashe. 2015. power of feedback. Review of educational
SOLO Taxonomy for Assessing Novices' research, 77(1), 81-112.
Algorithmic Design. In Proceedings of the [46] V. Barr and C. Stephenson, Bringing
46th ACM Technical Symposium on computational thinking to K‐12: What is
Computer Science Education (SIGCSE involved and what is the role of the
'15). Association for Computing computer science education community?
Machinery, New York, NY, USA, 452– ACM Inroads 2 (2011), 48–54.
457. [47] Keuning, H., Jeuring, J. T., & Heeren, B.
DOI:https://doi.org/10.1145/2676723.267 J. (2019). A Systematic Literature Review
7311. of Automated Feedback Generation for
Programming Exercises. ACM USA, 132–133.
Transactions on Computing Education, DOI:https://doi.org/10.1145/2818314.281
19(1), [3]. 8338.
https://doi.org/10.1145/3231711 [50] Peter J. Denning and Matti Tedre. (2019).
[48] Yeni, S., & Hermans, F. (2019). Design of Computational Thinking.
CoTAS: Automated computational [51] Boud, D., & Molloy, E. K. (2013).
thinking assessment system. In TACKLE Feedback in higher and professional
2019: 2nd Systems of Assessments for education: Understanding it and doing it
Computational Thinking Learning well. Routledge.
workshop: Proceedings of the 2nd Systems https://doi.org/10.4324/9780203074336.
of Assessments for Computational [52] Sebastian Gross, Bassam Mokbel, Barbara
Thinking Learning workshop (TACKLE Hammer, and Niels Pinkwart. 2015.
2019) (Vol. 2434). (CEUR Workshop Learning Feedback in Intelligent Tutoring
Proceedings). Systems. Künstliche Intelligenz 29, 4
[49] Jesús Moreno-León and Gregorio Robles. (2015), 413–418.
2015. Dr. Scratch: a Web Tool to [53] Susanne Narciss. 2008. Feedback
Automatically Evaluate Scratch Projects. strategies for interactive learning tasks.
In Proceedings of the Workshop in Handbook ofresearch on educational
Primary and Secondary Computing communications and technology (2008),
Education (WiPSCE '15). Association for 125–144.
Computing Machinery, New York, NY,
7. Appendix
Research questions:
RQ1: How are CT skills and programming skills being conceptualised and measured?
RQ2: How should feedback be provided to support developing CT skills and programming skills, and how should formative assessment be implemented in this process?
RQ3: How can Programming Education and learning of CT be contextualised and embedded in different educational domains?

Study 1 (RQ1)
Method: Systematic Literature Review using a PRISMA diagram
Objectives:
• Relationship between CT and programming skills
• Indicators for CT competence
• Feedback generation strategies
• Systems / models / prototypes
• Empirical knowledge
Deliverable: Conference/journal paper

Study 2 (RQ2)
Method: Mixed methods
Objectives:
• Focus groups reflection on the mapping of CT and programming skills
• Validated mapping of CT and programming skills (considering different domains)
• Assessment prototype & CT Dashboard
• Usability of the prototype
Deliverable: Conference/journal paper

Study 3 (RQ2)
Method: Mixed methods
Objectives:
• Feedback generation strategy for students based on findings in S2
• Refinement of the assessment model built in S2
• Student models from different disciplines
• Usability of the assessment component
Deliverable: Conference/journal paper

Study 4 (RQ3)
Method: Mixed methods
Objectives:
• Usability of the developed assessment component
• Validity and reliability of the assessment component
• Refinement of the developed component
Deliverable: Conference/journal paper

Technical Development Track (across all studies): iterative design process, development, and testing

Figure 1. The whole PhD research plan, with the main goals presented for each year. The system for providing feedback will be iteratively designed and developed throughout the project lifecycle.