=Paper=
{{Paper
|id=Vol-3292/paper07
|storemode=property
|title=Trade-off model for supporting educators' digital competence assessment
|pdfUrl=https://ceur-ws.org/Vol-3292/DCECTEL2022_paper07.pdf
|volume=Vol-3292
|authors=Linda Helene Sillat,Kairit Tammets,Mart Laanpere
|dblpUrl=https://dblp.org/rec/conf/ectel/SillatTL22
}}
==Trade-off model for supporting educators' digital competence assessment==
Linda Helene Sillat, Kairit Tammets, and Mart Laanpere
Tallinn University, Narva Road 25, Tallinn, 10120, Estonia
Abstract
The majority of efforts in assessing educators' digital competence over the past decade have focused on developing evidence-based and scientifically reliable assessment instruments. These instruments are often created ad hoc by research groups without a deeper understanding of educators' needs and the expected benefits of digital competence assessment. This implies that although an instrument might give valid and reliable results for the researchers, it disregards all other related stakeholders: educators, school leaders, educational technologists, teacher trainers etc. To understand and guide evidence-informed decision-making when developing, adapting or implementing digital competence assessment instruments, it is important to accommodate all stakeholders and to provide meaningful assessment results and data. To address this problem we have designed a trade-off model which focuses on mapping digital competence assessment instruments to stakeholder needs and expected benefits. Our research is divided into three main phases. First, we focused on understanding the concept and domain of educators' digital competence, for which we analysed the existing educators' digital competence frameworks, models and similar previous mappings from the literature. Secondly, to explain the alternative digital competence assessment approaches and instruments, we mapped the underlying assessment processes and piloted alternative instruments with different educator groups. The third and final phase focused on designing, developing and validating the trade-off model. The following describes all three phases and provides an overview of the initial findings, accompanied by suggestions for further research in the field of educators' digital competence assessment.
Keywords
Digital competence, assessment, instruments, educators, trade-off model.
1. Introduction

Using technologies in teaching and learning is no longer considered a novel practice, but is rather presented as a norm for quality education. Innovative and pedagogically sound ways of implementing technologies have, on the other hand, proven difficult for teachers, and the discussion on educators' digital competence has therefore gained popularity. However, it is evident that we not only need to map the digital competence educators require but, more importantly, need to understand educators' current level of digital competence in order to support meaningful professional development. Digital competence is considered the goal-oriented, confident and critical use of technologies for work, employability, learning, leisure and inclusive participation in society [1].
Educational assessment has been central to the discussion of overall quality assurance in educational settings and of understanding knowledge development [2]. Harlen & James [3] have stated that there are three general assessment approaches, all of which also relate to digital competence assessment: formative, summative and diagnostic assessment. Within these assessment approaches there is a variety of instruments, most notably self-assessment, knowledge-based tests and authentic assessment instruments such as e-portfolios or reflective journals. It can be argued that over the past decade the efforts have mainly gone towards developing self-assessment instruments, which are cost-effective, mostly adaptable and cover a variety of educator groups (i.e. primary to higher and vocational education). However, research piloting and implementing these self-assessment instruments raises the question of whether educators assess their digital competence or something else entirely. Benali et al. [4] propose that the majority of educators often assess their self-confidence in integrating technologies into their pedagogical practice and fail to give suitable evidence of their current practices. It is also argued that many digital competence assessment instruments based on self-assessment do not cover digital competence but rather focus on lower-order cognitive skills [5], [6].

Previous research has also revealed that knowledge-based testing and authentic assessment require a higher volume of resources, both financial and human, and are difficult to monitor [7].

Regardless of the form of assessment and the type of instruments used, there is a sustainability issue: a contradiction between the number of digital competence frameworks and models and the number of corresponding instruments.

Another dimension in educators' digital competence assessment is the understanding of the related stakeholder groups who require access to the assessment results or data. Adhering to these stakeholder groups' needs and expectations has proven to be a difficult task [8]. On the one hand we lack a clear understanding of these stakeholder profiles; more importantly, there is little research describing their needs.

2. Research methodology

The doctoral research was done in three phases implementing design-based research methodology [9] – (1) domain analysis, (2) exploration of alternative assessment and (3) developing and validating the trade-off model. To better focus the research, we examined the research problem through three research questions:
[RQ1] What are the implications and alternative approaches of assessing educators' digital competence?
[RQ2] What are the stakeholder requirements and needs for educators' digital competence assessment?
[RQ3] How are the alternative assessment approaches established and sustained?

2.1. Research context

The doctoral research focuses on the Estonian educational setting and educators. Following Lucas et al. [10], educators' digital competence is considered a complex concept due to a set of factors which include personal characteristics as well as social, cultural, pedagogical and ethical considerations.

Estonia operates a decentralized educational system which allows competition between schools but also provides school and educator autonomy [11]. Autonomy is considered the educators' collective right to determine the way they implement the school's curriculum in their classes while choosing suitable pedagogical methods, tools, materials and technologies [12]. Educator autonomy is closely linked to professionalism: after the initial teacher training period, no form of examination or testing is expected or accepted by educators. Although teachers are required to regularly commit to professional development activities, there is minimal monitoring or control.

3. Phase 1 - Educators' digital competence

The first phase of the research was to understand and delineate the concept and domain of educators' digital competence and its assessment. This phase was guided by the research question:
[RQ1] What are the implications and alternative approaches of assessing educators' digital competence?
We carried out a systematic literature review (SLR) [13] following the methodological example of Siddiq et al. [14]. The SLR database search was carried out between March 2018 and January 2019. For a clear overview of the field we first identified the underlying synonyms and alternative phrases for the database search. The terms used for digital competence included: digital competency, ICT literacy, digital literacy, ICT skills, digital skills, computer skills, technology literacies, digital competencies and 21st century skills. To get an overview of the instruments developed on the basis of the frameworks and models, we also limited the database search with terminology related to measurement: assessment, evaluation, testing, measuring and questionnaire. Literature screening resulted in 40 suitable studies, which made up the literature used in the SLR.
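To illustrate, the two term groups above combine into a single Boolean search string of the kind accepted by most bibliographic databases. The snippet below is a minimal Python sketch of that construction, assuming simple OR-groups joined by AND; the function and variable names are our illustration and not part of the published search protocol [13].

    # Sketch: build a Boolean database query from the two term groups above.
    # The term lists follow the text; everything else is illustrative.
    competence_terms = [
        "digital competence", "digital competency", "ICT literacy",
        "digital literacy", "ICT skills", "digital skills",
        "computer skills", "technology literacies",
        "digital competencies", "21st century skills",
    ]
    measurement_terms = [
        "assessment", "evaluation", "testing", "measuring", "questionnaire",
    ]

    def or_clause(terms):
        # Quote multi-word phrases so the database treats them as units.
        quoted = ['"%s"' % t if " " in t else t for t in terms]
        return "(" + " OR ".join(quoted) + ")"

    query = or_clause(competence_terms) + " AND " + or_clause(measurement_terms)
    print(query)

Running the sketch yields a query of the form ("digital competence" OR ... OR "21st century skills") AND (assessment OR ... OR questionnaire), which can then be adapted to the syntax of each individual database.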
Based on the analysis, the SLR provided four key results which helped to better define the concept of educators' digital competence. Additionally, the results provided a first insight into the implications related to the alternative assessment approaches and instruments.

First, the SLR confirmed that the majority of research related to educators' digital competence assessment focuses on quantitative studies implementing self-assessment instruments, and that there is a clear lack of qualitative research to accompany the results and explain the reliability and validity of the instruments.

Secondly, the self-assessment instruments used are created ad hoc, often based on a country-specific framework and targeted at a specific group of educators (i.e. in-service teachers, student teachers etc.).

The third and arguably most fundamental result revealed that self-assessment is often one-dimensional, meaning that there is relatively little possibility to understand and explain why and how educators approach digital competence self-assessment. To this end it is important to embed alternative assessment approaches such as testing or authentic assessment – including portfolios, reflective journals and observations – to understand educators' perceptions of their competence and to make sense of the evidence they provide. Furthermore, alternative and combined competence assessment would help establish whether educators assess their digital competence or rather their self-efficacy or self-confidence.

The final key result of the SLR presented the need for validated guidelines for digital competence assessment processes. One of the proposed solutions was large-scale participatory research which would focus on piloting alternative assessment instruments and approaches.

Based on the SLR results we concluded that future research should follow the DigCompEdu framework [15], which covers EU-level specifics of educators' pedagogical practice and whose derivatives or predecessors were present in the majority of the analysed literature. The results also pulled focus towards piloting and analysing alternative assessment approaches to self-assessment to better understand their implications.

4. Phase 2 – Alternatives in digital competence assessment

The second and most extensive phase of the study focused on implementing alternative digital competence assessment instruments based on the DigCompEdu framework [15], which was the contextual basis for the following research. The second phase of the study followed two research questions:
[RQ1] What are the implications and alternative approaches of assessing educators' digital competence?
[RQ2] What are the stakeholder requirements and needs for educators' digital competence assessment?
While the main focus of this phase was to identify the implications of alternative approaches, the research done also gave input on the related stakeholder groups and their respective needs.
During this phase four studies were conducted which included self-assessment instruments, knowledge-based testing and e-portfolio based digital competence assessment approaches. The focus of the four studies was the following:
Study 1 – In-service teachers' perceptions of digital competence during the distance learning period.
Study 2 – Comparative multiple-case study of three combined self-assessment and knowledge-based testing digital competence assessment approaches.
Study 3 – SELFIE4Teachers [16] instrument based mixed methods study combining self-assessment and a nominal group technique (NGT) [17] group interview.
Study 4 – Competence-based LMS (https://edidaktikum.ee) focusing on e-portfolio based assessment of digital competence.
Table 1 describes the methodology, research instruments, samples and timeline of these studies.

Table 1
Second phase studies.
               Study 1   Study 2     Study 3   Study 4
Methodology    Quan      Quan        MM        Qual
Instrument     SA        SA&KB       SA&NGT    Auth.
Sample         1125      2248        18        84
Study time     2020      2019-2021   2022      2022

Quan – Quantitative study; Qual – Qualitative study; MM – Mixed methods study.
SA – Self-assessment.
KB – Knowledge-based test.
NGT – Nominal Group Technique group interview.
Auth. – Authentic assessment using e-portfolio.

The main results of the four studies can be summarised in the following key ideas. First, when completing self-assessment instruments, educators on average assess their digital competence as that of average technology users. In some cases this reflects the educators' inability to assess their own competence, and it once again raises the question of whether they assess digital competence or perceived self-confidence.

The second outcome of the studies revealed that educators are unable to provide appropriate evidence to describe their digital competence. As always there are exceptions, but the main issue lies in the fact that educators do not differentiate between the digital competence dimensions [15] (professional engagement; digital resources; teaching and learning; assessment; empowering learners; and facilitating learners' digital competence) and provide low-level, generic evidence.

The third result describes the educators' expectations towards the assessment instrument: the instruments used often include hard-to-understand concepts and definitions. The educators also raised issues with instrument length, time spent on completion and the usability of the feedback report.

The final contribution of the four studies relates to the validity, reliability and sustainability of the instruments used. Based on the research we concluded that although a lot of effort goes into designing and developing these assessment instruments, they often lack reliability. Additionally, as instrument validity is a multifaceted concept (i.e. face validity, construct validity etc.), it comes down to the stakeholder needs. The second phase of the doctoral research also confirmed a continuous issue with digital competence assessment instrument sustainability, where re-designing and developing new instruments is considered a higher priority than updating existing instruments.

5. Phase 3 - Trade-offs in digital competence assessment

The third and final phase of the research focuses on identifying the stakeholder-specific trade-offs in educators' digital competence assessment and on developing and validating the trade-off model. This phase followed two research questions:
[RQ2] What are the stakeholder requirements and needs for educators' digital competence assessment?
[RQ3] How are the alternative assessment approaches established and sustained?
The third phase included two main studies. The first focused on identifying the stakeholder profiles (in-service teacher, student teacher, advanced teacher, teacher trainer, educational technologist, school leader, qualification examination assessment board member) and scenarios, and on the stakeholder expectations and needs, resulting in the first version of the trade-off model. The study combined quantitative (N=1125) and qualitative (N=4) methodology.

The second and final study of the doctoral research included the validation of the stakeholder profiles and the trade-off model. The study followed the Nominal Group Technique and included representatives of each stakeholder profile (N=6).

As this phase of the research is still underway, the following describes initial outcomes. We consider it noteworthy that all stakeholders regard the process of digital competence assessment as valuable in helping to understand the professional development needs of educators. Furthermore, the inductive analysis of the differences in stakeholder needs gave us a clear indication that it is nearly impossible to provide a reliable and highly valid universal digital competence assessment instrument. This means that a trade-off model could provide a solution to adhere to the stakeholder needs. The results also provide a deeper understanding of the stakeholder-specific scope and dimension of educators' digital competence assessment expectations.
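Since the trade-off model maps assessment instruments to stakeholder needs rather than seeking one universal instrument, one way to make the idea concrete is a weighted scoring of instrument properties against stakeholder priorities. The sketch below is a hypothetical illustration of such a mapping, not the validated model itself; the property names, weights and scores are assumptions made up for the example, loosely echoing the findings above (self-assessment is cheap and scalable, e-portfolios yield richer evidence at higher cost).

    # Hypothetical sketch of a stakeholder/instrument trade-off mapping.
    # Property names, weights and scores are invented for illustration;
    # the validated trade-off model is the subject of the ongoing phase 3.
    instruments = {
        "self-assessment":      {"cost": 0.9, "evidence": 0.3, "scale": 0.9, "feedback": 0.4},
        "knowledge-based test": {"cost": 0.5, "evidence": 0.7, "scale": 0.6, "feedback": 0.5},
        "e-portfolio":          {"cost": 0.2, "evidence": 0.9, "scale": 0.3, "feedback": 0.9},
    }
    stakeholder_weights = {
        "school leader":            {"cost": 0.4, "evidence": 0.2, "scale": 0.3, "feedback": 0.1},
        "teacher trainer":          {"cost": 0.1, "evidence": 0.4, "scale": 0.1, "feedback": 0.4},
        "educational technologist": {"cost": 0.2, "evidence": 0.3, "scale": 0.2, "feedback": 0.3},
    }

    def best_instrument(weights, instruments):
        # A weighted sum makes the trade-off explicit in the weights.
        def score(props):
            return sum(weights[p] * props[p] for p in weights)
        return max(instruments, key=lambda name: score(instruments[name]))

    for profile, weights in stakeholder_weights.items():
        print(profile, "->", best_instrument(weights, instruments))

Under such a weighting, different profiles can legitimately land on different instruments, which is exactly the kind of stakeholder-specific trade-off the model is meant to surface.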
6. Conclusion

The doctoral research is currently in its final stages, where our efforts are focused on publishing the results of the finalized studies and formulating the analytical overview and main scientific contributions.

While digital competence assessment, and more specifically educators' digital competence, has been an ongoing discussion and research topic for more than 15 years, our research provides a new dimension to understanding the assessment instruments, approaches and processes. This doctoral research can be described as metalevel research which aims to describe and provide solutions for digital competence assessment through a multiple-stakeholder lens rather than trying to provide one universal solution to a multifaceted research problem.

7. Acknowledgements

The doctoral research has received funding from:
1. The European Union's Horizon 2020 research and innovation programme under grant agreement No. 669074. The activity was supported through and according to the Mobilitas Plus MOBEC001 CEITER action plan.
2. The European Union's Horizon 2020 research and innovation programme under grant agreement No. 856954.

8. References

[1] Ferrari, A. Digital Competence in Practice: An Analysis of Frameworks. Joint Research Centre of the European Commission, 91. (2013). https://doi.org/10.2791/82116
[2] Broadfoot, P., & Black, P. Redefining assessment? The first ten years of assessment in education. Assessment in Education: Principles, Policy and Practice, 11(1), 7–26. (2004). https://doi.org/10.1080/0969594042000208976
[3] Harlen, W., & James, M. Assessment and learning: Differences and relationships between formative and summative assessment. Assessment in Education: Principles, Policy and Practice, 4(3), 365–379. (1997). https://doi.org/10.1080/0969594970040304
[4] Benali, M., Kaddouri, M., & Azzimani, T. Digital competence of Moroccan teachers of English. International Journal of Education and Development using ICT, 14(2). (2018). Open Campus, The University of the West Indies. Retrieved May 6, 2022 from https://www.learntechlib.org/p/184691/
[5] Kluzer, S., & Priego, L. P. DigComp into action: Get inspired, make it happen. JRC Science for Policy Report, EUR 29115 EN. Eds.: S. Carretero, Y. Punie, R. Vuorikari, M. Cabrera, & W. O'Keefe. Publications Office of the European Union. (2018). https://doi.org/10.2760/112945
[6] Siddiq, F., Hatlevik, O. E., Olsen, R. V., Throndsen, I., & Scherer, R. Taking a future perspective by learning from the past – A systematic review of assessment instruments that aim to measure primary and secondary school students' ICT literacy. Educational Research Review, 19, 58–84. (2016). https://doi.org/10.1016/j.edurev.2016.05.002
[7] Zenouzagh, M. Z. The effect of online summative and formative teacher assessment on teacher competences. Asia Pacific Education Review, 20(3), 343–359. (2019). https://doi.org/10.1007/s12564-018-9566-1
[8] Struyven, K., Blieck, Y., & De Roeck, V. The electronic portfolio as a tool to develop and assess pre-service student teaching competences: Challenges for quality. Studies in Educational Evaluation, 43, 40–54. (2014). https://doi.org/10.1016/j.stueduc.2014.06.001
[9] Edelson, D. C. Design research: What we learn when we engage in design. The Journal of the Learning Sciences, 11(1), 105–121. (2002). https://doi.org/10.1207/S15327809JLS1101_4
[10] Lucas, M., Bem-Haja, P., Siddiq, F., Moreira, A., & Redecker, C. The relation between in-service teachers' digital competence and personal and contextual factors: What matters most? Computers and Education, 160. (2021). https://doi.org/10.1016/j.compedu.2020.104052
[11] Erss, M. 'Complete freedom to choose within limits' – teachers' views of curricular autonomy, agency and control in Estonia, Finland and Germany. Curriculum Journal, 29(2), 238–256. (2018). https://doi.org/10.1080/09585176.2018.1445514
[12] Lawson, T. Teacher autonomy: Power or control? Education 3-13, 32(3), 3–18. (2004).
[13] Sillat, L. H., Tammets, K., & Laanpere, M. Digital competence assessment methods in higher education: A systematic literature review. Education Sciences, 11(8). (2021). https://doi.org/10.3390/educsci11080402
[14] Siddiq, F., Hatlevik, O. E., Olsen, R. V., Throndsen, I., & Scherer, R. Taking a future perspective by learning from the past – A systematic review of assessment instruments that aim to measure primary and secondary school students' ICT literacy. Educational Research Review, 19, 58–84. (2016). https://doi.org/10.1016/j.edurev.2016.05.002
[15] Redecker, C., & Punie, Y. European Framework for the Digital Competence of Educators: DigCompEdu. Publications Office of the European Union. (2017).
[16] European Commission. Digital Education Action Plan 2021-2027: Resetting education and training for the digital age. Commission Staff Working Document, SWD(2020) 209 final, 1–103. (2020).
[17] Delbecq, A. L., & Van de Ven, A. H. A group process model for problem identification and program planning. The Journal of Applied Behavioral Science, 7(4), 466–492. (1971).