CEUR Workshop Proceedings Vol-2985, paper 6: https://ceur-ws.org/Vol-2985/paper6.pdf
How deployment processes affect the adoption of
learning analytics in higher education institutions:
Improving potential for impact with better
deployment practices
Amanda Sjöblom a, Anni Silvola b and Jiri Lallimo c
a Aalto University, 02150 Espoo, Finland
b University of Oulu, 90570 Oulu, Finland
c Aalto University, 02150 Espoo, Finland


Abstract
As the development and implementation of new learning analytics and other digital tools in higher education appear to be on a continual rise, we examined attitudes toward new tools, and particularly what affects the adoption and experienced usefulness of new tools. An increasing amount of knowledge has accumulated on the aspects of learning analytics solutions that affect impact and user experiences, but the effects of the processes surrounding the deployment of new solutions appear less clear. We aimed to discover what factors higher education staff find useful when required to take a new tool into use, and what factors can hinder the learning and use of new learning analytics tools. Results indicated that deployments often fail to account for user characteristics, and that deployment processes should be more tailored, accounting for users’ skills, roles, and tasks. HE staff indicated that new LA tools often lack adequate support, communication, instructions, and consideration of user needs, and fail to communicate clear use cases and expected value. These identified shortcomings provide lessons for future learning analytics deployment projects, urging developers of analytics tools to invest time and effort in smooth deployment and support, so that the tools can have an impact in higher education institutions. Improvements at a relatively easy-to-deliver level, such as good tailoring of instructions and communication of practical value, could improve the acceptance and use of learning analytics, and thus also the impact on the intended learning or teaching targets of the learning analytics solution.

Keywords
learning analytics, user needs, user training, digital tool deployment in HE




1. Introduction
Online environments, learning analytics (LA) and other digital tools have become an integral
part of the various learning, teaching, research, and administrative processes in higher education
(HE). Institutions are increasingly utilising various LA solutions and other digital tools to better
understand and support students and teachers (Larrabee Sonderlund, Hughes, & Smith, 2019;
Viberg, Hatakka, Bälter, & Mavroudi, 2018). Despite evidence of the potential benefits of various
learning analytics solutions and digital tools, there appears to be little evidence in literature of

NORDIC LASI 2021
amanda.sjoblom@aalto.fi (A. Sjöblom); anni.silvola@oulu.fi (A. Silvola); jiri.lallimo@aalto.fi (J. Lallimo)
                                       © 2020 Copyright for this paper by its authors. Use permitted under Creative Commons License Attribution 4.0 International (CC BY 4.0).
CEUR Workshop Proceedings (CEUR-WS.org), http://ceur-ws.org, ISSN 1613-0073
wide use of these tools (Viberg et al., 2018). Additionally, there is a need to better understand
the perspectives of HE institutions’ staff on LA deployment. For example, Howell et al. (2017)
identified that teachers and advisors may need clarification on the purposes of using LA tools
as a support for guidance and teaching practices. These findings indicate a need to better
understand the mechanisms of the adoption process of digital tools, including the role of different
background variables in explaining the adoption experiences of users in different roles in
educational institutions.
   We examined the factors that both teachers and non-academic staff in Finnish HE institutions
associate with the deployment of new tools. In Finland, some trials and implementations of
LA tools in HE have been done, but wider deployment of different tools and the culture of
using LA in different educational practices is still emerging. The key aim was to identify good
and bad practices associated with introduction of new digital and LA tools, as experienced by
the expected user-groups, as these users have a key role in guiding and developing the future
processes where learning analytics tools will be implemented. In order to examine deployment
conditions in a broad sense, we investigated the conditions surrounding the deployment and
learning of new LA and digital tools in general, rather than in relation to any specific tool or
type of learning analytics, to identify common factors. We attempt primarily to cover the
experienced challenges and good practices
surrounding implementations of LA and digital tools, for the purpose of supporting future LA
deployment processes, so that the potential of new tools and solutions might be actualized more
effectively. This approach is intended to supplement the previous research that has identified
aspects of the LA tools that affect user experiences and attitudes in HE (e.g., Viberg et al., 2018),
with a view on how those aspects could be better supported with good deployment practices.
This aim was achieved through a questionnaire aimed at HE staff, investigating their experiences
of LA and other digital tool deployment in relation to their user characteristics, such as basic
ICT self-efficacy and general use of ICT (Hatlevik, 2017), and background of working in higher
education. ICT self-efficacy is an important concept in understanding HE staffs’ ICT use. It
refers to HE employees’ beliefs about their capabilities to utilise ICT for teaching and other
work-related practices (Krumsvik, 2011). The assumption underlying the concept of
self-efficacy is that when people are confident about being able to complete a task, they are
more willing to direct their effort to the task (Bandura, 2006). Previous studies have shown
that teachers with low ICT self-efficacy are less frequent users of technology in their
work (Hammond, Reynolds & Ingram, 2011). Additionally, it has been shown that the intention
to use technology can be influenced by self-efficacy and technological complexity (Teo, 2014).
In the context of LA implementation, ICT self-efficacy may play an important role for users’
willingness to adopt new digital tools.
   We were interested in whether ICT self-efficacy and use of ICT, together with roles in HE
would affect how the conditions surrounding the deployment of new tools are experienced, and
how the importance and effectiveness of different forms of training materials for new tools are
perceived. We also examined participants’ views with a qualitative approach regarding what
good practices for LA and digital tool deployment they identify, and what challenges they face
when required to adopt new tools and solutions.
   The key motivation of this study was to identify the aspects of support that help academic
staff adopt new LA tools in their daily educational practices. The practical goal
is to gain information on how we might approach new deployment processes to make them
more approachable and user-friendly for the academic staff. This knowledge will improve the
opportunities of LA tools to reach their potential impact in HE, by providing lower thresholds for
adoption of LA tools into wider use. To investigate these, four key questions were formulated:
   1. How do HE work background, basic ICT self-efficacy and use of ICT affect how the adoption
of new learning analytics solutions and digital tools is experienced?
   2. How do HE work background, basic ICT self-efficacy and use of ICT affect how important
training materials are for new tool adoption?
   3. What are the key aspects that training materials and support services should have in order
to be helpful when adopting and using new digital tools?
   4. What are the key challenges associated with adopting new learning analytics solutions
and digital tools?

1.1. Background
Some reviews have indicated relatively little impact of LA solutions on learning or teaching,
despite their identified potential to improve learning, support or teaching (Larrabee Sonderlund
et al., 2019; Viberg et al., 2018). There are also clear indications that user characteristics affect
the impact of LA solutions (e.g., Hatlevik, 2017), as do aspects of the LA tools themselves.
Regarding LA design, it appears that when the impact of LA tools and solutions is evaluated, the
important factors focus on the visual aspects of the tool and general usability (e.g., Park &
Jo, 2019), as well as the functions and design of the tool (e.g., Charleer, Klerkx, Duval, De Laet, &
Verbert, 2016; Herodotou, Rienties, Boroowa, Zdrahal, & Hlosta, 2019), but more understanding
of the practices related to the implementation and use of the tools should be achieved.
Investigating how new LA and other digital tools should be deployed so that key stakeholders
find the tools useful, and so that users are helped to apply the solutions in ways relevant to
them, could also contribute to the impact of the tools. Coupled with an understanding of how
digital skills and related characteristics affect the experience with new digital tools, we may
be able to facilitate changes in different educational processes, involving new ways of working
with LA that better support learning, teaching and support processes.
   Implementation of LA tools requires the development of practices and processes related to
the planned use of the designed tools. With a clear understanding of the experienced conditions
surrounding typical deployment processes, developing good deployment processes could be
an opportune way to impact the experiences of and the attitudes towards adopting new LA
tools. Although some previous findings have indicated that involving intended user groups in
LA development in higher education has a positive impact on the users’ experiences (Lallimo &
Sjöblom, 2019), it is unclear what deployment factors affect LA tool adoption and impact. The
tools and solutions that implement different kinds of LA methods generally involve various
predictive or descriptive models and/or visualisations, categorisations and supervised and
unsupervised learning methods (Klein, Lester, Rangwala, & Johri, 2019; Larrabee Sonderlund et
al., 2019). This may produce challenges for those in higher education who are less used to using
ICT tools. Previous research has identified a gap between the capabilities of LA solutions and
user needs (Klein et al., 2019), amplifying the potential problems in introducing new tools, and
reducing the possibilities of the new solutions improving teaching, learning and support services.
Additionally, it has been demonstrated that for teachers, digital competence, ICT self-efficacy
and general use of ICT are interrelated (Hatlevik, 2017), which in turn may suggest a relation
with readiness to adopt new ICT tools. Given the quickly developing nature of LA, it is likely
that continuous effort is required from HE institutions in training the affected stakeholders
in order for new tools to be used effectively. It has already been identified in some specific
circumstances, such as teacher education, that continual development of digital competences and
ICT self-efficacy is important, given the increasing digitalisation (Gudmundsdottir & Hatlevik,
2018).


2. Methods
2.1. Participants
Ninety-nine people from HE institutions in Finland completed the questionnaire. The majority of
the participants were from universities (83), with the rest from universities of applied sciences.
Sixty-eight had positions that involved teaching, and the majority (78) had at least some
pedagogical training. Thirty-two (32.3%) of the participants were from science, technology and
mathematics (STEM) fields. Participation in the study was voluntary and data were collected
anonymously.

2.2. Materials and procedures
Data were collected with an online questionnaire that was circulated at higher education events
and networks, as well as internal information channels. The participants were informed about
the purpose of the questionnaire. The questionnaire consisted of three parts. First, participants
were asked to provide information about their roles in HE (teaching/administrative role,
STEM/non-STEM field, university/applied university), and to answer Likert-scales measuring
their basic ICT self-efficacy and use of ICT. The questions were based on previously used
questionnaires (Hatlevik, 2017) and translated and modified for current purposes. Questions
are presented in the Appendix. Next, participants were asked to think about their experiences
of adopting LA and other digital tools. Regarding new LA and digital tool deployment, the
participants were asked about their attitudes and experiences in taking new LA and digital tools
into use, and about their experiences of training material usability for digital tool adoption
with five Likert-scale questions. The participants were also asked about the type of support they
find useful when taking a new tool into use with a multiple-choice question with an opportunity
to provide other open-ended answers. The third part contained two open-ended questions for
thematic analysis (Braun & Clarke, 2012), concerning what training materials should contain in
order to be useful, and what factors typically make learning and using a new digital tool
challenging.
Figure 1: Use of ICT plotted against the experience of taking new LA and digital tools into use.


3. Results
3.1. Quality of experiences with new LA tools and the experienced
     importance of training materials
As the experience with new LA and digital tools was expected to be simultaneously affected by
the background variables of teaching, institution, and field, together with basic ICT
self-efficacy and use of ICT, it was modelled with a multiple regression. A significant regression
equation was found, F(5,93) = 4.13, p = .002, with an adjusted R2 = .14. The predicted experience
was modelled as 3.54 + (-0.37) * teaching + (-0.3) * university + (-0.11) * STEM + 0.01 * basic
ICT self-efficacy + 0.35 * use of ICT. However, only use of ICT was found to be a significant
predictor (p = .003), with the relation depicted in Figure 1. The result indicated that experiences
of new tools are primarily affected by use of ICT, with increased use of ICT associated with
more positively rated experiences with adopting and using new LA tools.
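The reported model can be reproduced with an ordinary least squares fit. The sketch below illustrates the approach on synthetic data: the variable names follow the paper, but all values are invented for illustration, and the analysis software actually used in the study is not specified.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 99  # sample size matching the study

# Hypothetical predictors (names follow the paper; the data are synthetic)
teaching = rng.integers(0, 2, n)            # 1 = teaching role
university = rng.integers(0, 2, n)          # 1 = university (vs. applied)
stem = rng.integers(0, 2, n)                # 1 = STEM field
ict_self_efficacy = rng.normal(3.5, 0.8, n)
use_of_ict = rng.normal(3.0, 0.9, n)

# Synthetic outcome loosely echoing the reported coefficients, plus noise
experience = (3.54 - 0.37 * teaching - 0.30 * university - 0.11 * stem
              + 0.01 * ict_self_efficacy + 0.35 * use_of_ict
              + rng.normal(0, 0.6, n))

# Ordinary least squares: design matrix with an intercept column
X = np.column_stack([np.ones(n), teaching, university, stem,
                     ict_self_efficacy, use_of_ict])
beta, *_ = np.linalg.lstsq(X, experience, rcond=None)

# Adjusted R^2, the fit statistic reported in the paper
resid = experience - X @ beta
ss_res = resid @ resid
ss_tot = ((experience - experience.mean()) ** 2).sum()
r2 = 1 - ss_res / ss_tot
k = X.shape[1] - 1  # number of predictors
adj_r2 = 1 - (1 - r2) * (n - 1) / (n - k - 1)
print(beta.round(2), round(adj_r2, 2))
```

With real data one would also test each coefficient for significance, as the paper does when singling out use of ICT as the only significant predictor.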
   Experienced importance of training materials was expected to be affected by the same
independent variables. A similar multiple regression was used to investigate the experienced
importance of training material usability and availability, indicating a significant regression
equation, F(5,93) = 3.74, p = .004, with an adjusted R2 = .12. The predicted relevance of training
materials was modelled as 7.3 + (-0.32) * teaching + 0.3 * university + (-0.11) * STEM + (-0.23)
* basic ICT self-efficacy + 0.05 * use of ICT, but only basic ICT self-efficacy was found to be a
significant predictor (p = .011), shown in Figure 2. This result indicated that the experienced
importance of training materials is predicted by basic ICT self-efficacy, with lower ICT self-
efficacy associated with increased rating for importance of instructional materials.
   Regarding the usefulness of different forms of support, out of the provided options, quick
guides (found useful by 79.8% of participants), video instructions (72.7%), and available
support staff (72.7%) were rated useful by most participants. FAQs (57.8%), instruction
sessions (53.3%), written instructions (43.4%) and discussion forums (42.4%) also received
support, while only 16.2% considered learning games to be useful.
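Because the support-type question was multiple-select, the percentages above sum to more than 100%: each figure is the share of respondents who selected that option. A minimal sketch of the tally, with invented responses:

```python
from collections import Counter

# Hypothetical raw answers (one set of selections per respondent);
# option names mirror the paper, the data are invented for illustration.
responses = [
    {"quick guide", "video instructions"},
    {"quick guide", "support staff", "FAQ"},
    {"video instructions", "support staff"},
    {"quick guide"},
]

# Count each option across respondents, then express as a share of respondents
counts = Counter(opt for selections in responses for opt in selections)
n = len(responses)
share = {opt: round(100 * c / n, 1) for opt, c in counts.items()}
print(share["quick guide"])  # 75.0 — selected by 3 of the 4 respondents
```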
Figure 2: Basic ICT self-efficacy plotted against the experienced importance of training materials.


3.2. Qualitative results
Two qualitative questions were analysed in depth, employing a thematic analysis. Two coders
analysed the answers, and identified similar themes for both important training material aspects
and key challenges. Regarding the aspects that make training materials for new LA tools useful,
participants systematically identified several key themes, relating to training materials’ contents
and qualities: 1) pictures to make instruction more concrete, 2) both quick guide and more
elaborate instructions, 3) examples of how the tool works for specific tasks, 4) role-specific step-
by-step guides, 5) good structure, and 6) communication channels between users, developers
and support.
   First, participants commonly mentioned a need for pictures and videos to accompany
explanations, to make the instructions more concrete, and to ease the translation from instruction
to the tool itself. This was often associated with a described need to better relate instructions
to actual tool use.
   “Requires clarity, good relations between images, text, functions, structure, functionality and
systematicity.”
   “Videos based on good scripts. Instructions must not be jumpy or illogical. Clear images and the
possibility of zooming in closer.”
   Another key theme was a need for both a quick guide for the basic functionalities of the tool,
and also for longer detailed instructions that allow the user to go into finer details according
to their interests. This was often related to the theme of a need for better examples of how
the tool can be applied to different academic or pedagogical tasks, and what kind of results to
expect in relation to functionalities. These issues were discussed commonly in the context of
understanding how the tools could or should be used in relation to specific tasks that the user
must undertake, depending on their role, as well as a need for relevant progression examples of
how the user might expand their knowledge once they understood the basics.
   “A basic users’ guide for the tool, but in particular good use cases, how the tools have been utilised
before.”
   “Written instructions should contain various good keywords, to make it easier to find the in-
formation you need. A good search engine connected to all materials. Clear pictures of the user
interface. Example cases to give the user an idea about all of the things that can be done with
the different properties of the tool. A possibility to comment on all instructions, so that they can
be updated and supplemented and you can get tips from other users’ problems. Good self-learn
online materials, that you can complete while taking the tool into use. For example step-by-step
instructions for different functionalities, with dummy-versions for practicing, while simultaneously
taking the real tool to use.”
   One of the key themes was a need for a clear, logical progression, with the answers often including
descriptions of unfulfilled needs for role-specific step-by-step guides as well as for clear language
free of jargon, and a good thematic structure. A communicative aspect was also commonly
mentioned, indicating a need for clear communication channels between users, developers and
support, both in terms of the tool and any instructional material, so that both the tools and the
relevant materials would suit the processes they were designed to support.
   “Usually instructions are not on the right level, and do not show how the tool should be used in
my role. Instructions often have a lot of inessential details, and they can make the use of the tool
sound too complicated.”
   “It would be useful to know some background and bases about why this [tool] is important.
Attention is usually directed too quickly to learning a new tool and performing some actions, and it
is forgotten why these things are done.”
   “A support person should always be named. It is important to have someone you can contact and
ask for advice, as videos or written instructions cannot be trusted to contain all the information
regarding specific roles you have or tasks you do.”
   Second, five key challenges were identified from the answers. These answers were
relatively similar to the answers above regarding what good materials and support should
contain, now detailed from the perspective that participants often do not receive the kind of
support they would find helpful. Many participants also identified the LA tools themselves as
a key challenge (1), along with either the quality or non-existence of support or instructional
materials (2). These were often associated with experiences that a lack of communication made
the learning and use of the new digital tools challenging (3). Many participants also explicitly
linked these answers to their own answers to the earlier question.
   “Inadequate instructions, difficult user interface.”
   “Communication is missing or too delayed. Organised information and communicated benefits
are essential. Not enough time for familiarisation. Poor instructions.”
   Participants also commonly found that the actual usability of the tools to support the teaching,
academic advising and administrative and other tasks was challenging, and that the tools could
not be tailored to suit their needs (4). In particular, many participants indicated that the new
tools generally meant a higher workload with new tasks added to the requirements of their role,
rather than the tools supporting them in their existing tasks (5). The most commonly mentioned
theme centered on workload, generally citing a lack of time to properly investigate new LA
tools, a constantly growing and thus excessive number of different LA and other digital
tools, and the time commitment required both by learning a new tool and by its potentially
mandatory use. Indeed, 52.5% of the participants mentioned time constraints as one of the key
challenges for adopting new LA tools.
   “There should be a real need for the tool, based on actual professional requirements and tasks,
rather than administrative or organisational pressure to use tools.”
   “Unsuitability of tools and systems to real needs and tasks. Poor instructions. Tools that seem
incomplete/untested.”
   “Not enough time. No available evaluations of tools, no readily available, genuine estimates
about how useful the tool would be in my own work. No good instructions or support.”


4. Discussion
The first two hypotheses regarding variables affecting experiences of using new LA tools
and the experienced importance of training materials were only partially supported. The key
quantitative results indicated that participants’ general use of ICT influenced how they
experienced the deployment of new LA solutions and digital tools, with higher indicated ICT
use related to more positive experiences. The findings also indicated that lower basic ICT
self-efficacy was related to heightened need for training materials. Together these findings
indicate that user characteristics associated with ICT use and ICT self-efficacy have an impact
on adopting new LA solutions and digital tools, and that accounting for them might improve
the acceptance and use of new solutions. More specifically, the results suggest that general
fluency in using ICT tools gives users more readiness to adopt new LA tools, while
less fluent ICT user groups would benefit from being the focus when deployment support and
instructions are prepared, so that they can receive the additional support they are likely to
need. This also indicates that when requiring users to adopt new tools, assessing their familiarity
with ICT use and adoption and their ICT self-efficacy would aid in preparing the appropriate
deployment processes and support for users. Along the same lines, when deployment of new
LA solutions is targeted for user groups where lower ICT self-efficacy might be expected, focus
and resources should be directed at providing good, targeted instructional material and support,
as these groups in particular are likely to require support in order to benefit from new LA tools.
Additionally, it is important for educational institutions to consider whether high-quality
training materials could be used to improve academic staff’s ICT self-efficacy beliefs
and thereby support different user groups in fluently adopting new technologies.
   However, the hypothesised predictive effect of HE background variables was not supported:
the chosen HE work background variables did not account for a significant amount of variance
in either experiences with new LA tools or how instructional materials are viewed. This may
indicate that, in general, the experiences of different groups within the higher education
context are rather similar, regardless of whether the users are teachers or not, or in STEM or
other fields. At the same time, this speaks to the qualitative analysis that indicated that both
LA tools and the associated instruction materials and support should be specifiable according
to roles and tasks. Together, these findings suggest that this is usually not the case, making
the experiences of the different groups similar precisely in the aspects where they require more
specificity. The general message from the open answers was a common shared experience of
new LA tools being deployed for them to use without providing enough support, and without
good demonstrations of how these new tools should be adopted to suit their roles and needs.
   For the third research question about key aspects of good training materials, the thematic
analyses identified several factors: need for pictures and videos, quick guides and detailed,
task-specific instructions, role-specific examples and step-by-step guides, good structure, and
communication between users and developers. These would appear obvious, yet findings
regarding the fourth research question about challenges to adopting new tools indicate that
many of these aspects are often lacking or missing when new LA tools are deployed. Regarding
the types of training materials, videos, quick guides and support persons in particular were
found to be potentially the most useful, linking to the qualitative findings, which indicated
that training materials are often missing, unsuitable or unclear. The participants also expressed that
the contents of training materials should be easily related to the use of the actual tool, with
examples to demonstrate how the new LA tool supports the tasks that the user needs to do,
based on the kind of teaching, advising, administration or learning support processes that they
undertake. Participants often mentioned that instructional material needs to be clearer and
better structured, and that it should be specifiable according to roles and tasks. Communication
aspects were also highlighted, both in terms of support and in informing the new users of the
intended purpose and value of using the new LA tools. Interestingly, very similar topics were
mentioned both when participants were asked about what training materials should contain,
and when they were asked about the challenges of adopting new LA tools. This appears to
indicate that the deployment of new tools and solutions often lacks the critical support
aspects that users identify. These aspects also point to the key issues to solve when aiming for a
successful deployment of a new LA tool in higher education. If the adoption and use of new
solutions and tools can be boosted and supported by addressing the users’ key concerns by
providing adequate and well-tailored support, we might expect an increase in both the usage
and the impact of new solutions. Some negative experiences with new LA tools may also
be circumvented more effectively, if teachers and other expected user groups for new tools
are invited to participate early on in the planning and development processes for new tools,
rather than first experiencing them when the new tool or solution is implemented. It has
been shown previously that student-centered LA development processes improve the students’
experiences of new learning analytics tools (Lallimo & Sjöblom, 2019), and this could be expected
to generalise to other higher education stakeholders as well.

4.1. The effect of new LA tools on experienced workload
Perhaps the most difficult theme that was identified was time requirements associated with
adopting and using new LA solutions and digital tools. New tools generally require learning
time, and the experiences indicate that the continued use of new tools also adds to the workload
and requires additional time, rather than helping with the essential tasks
associated with teaching, learning support or administrative processes. Time requirement
concerns have also been identified before (Hatlevik, 2017), and are likely one of the key issues
that must be addressed. Concerns about the additional time cost and workload caused by the
requirement to use new LA tools likely tie in with the other concerns raised regarding
usability, the suitability of the tools for the actual teaching, learning and learning support
tasks, and the lack of appropriate instructions and support, and may be alleviated if these are addressed.
Workload concerns do, however, also suggest that a critical evaluation of new LA tools and their
values and requirements in higher education is in order, so we may avoid a negative impact of
additional workload associated with LA solutions and digital tools, in addition to addressing the
specific concerns on how to lower the workload caused by each individual tool. We propose,
however, that time requirement problems may be partially solved by providing users with clear
and task-appropriate instructional material that aids them both in adopting the tool more
quickly and in making the use of the tool more effective and useful for their primary teaching,
learning support and administrative tasks.

4.2. Views on future deployments
A stakeholder-centric approach has previously been found to be an appropriate, effective and
user-friendly approach for providing a deeper insight into how value can be created for users
when developing learning analytics concepts and solutions (Klein et al., 2019; Lallimo & Sjöblom,
2019; Silvola, Näykki, Kaveri, & Muukkonen, 2021). The current findings indicate that a similar
approach could be useful also when planning and implementing the deployment of LA tools
and solutions, and could further the actualisation of the solutions’ potential for supporting
teaching and learning and other processes in higher education. The current findings on the
importance of communication are also in line with the previously identified perceptions that
LA may be beneficial, if collaboration exists amongst the HE stakeholders: teachers, students,
and their institution (Howell, Roberts, Seaman, & Gibson, 2018). Simultaneously, the findings
suggest that HE institutions may benefit from providing ICT skills training in general, as it may
positively influence the staff’s ability to adopt new LA and other digital tools into use, given
that use of ICT was shown to positively affect experiences with learning and using new LA
tools. Engaging and exciting staff to adopt new methods and tools is essential for the tool to
have an impact in the learning, teaching and learning support environments and experiences
in higher education (Zilvinskis, Willis, & Borden, 2017). We propose that if we can be more
sensitive to the needs of the intended user groups, both in terms of the LA tools and solutions
as well as the surrounding implementation conditions, e.g., training, support, communication,
and integration to the primary functions of teaching, learning and learning support, we can
improve the attitudes towards and the adoption of new LA and digital solutions. This will also
increase the opportunity for the tool to have recognisable impact in HE institutions.


Acknowledgments
This work was funded by the Finnish Ministry of Education and Culture, [grant number
OKM/272/523/2017].


5. References
Bandura, A. (2006). Guide for constructing self-efficacy scales. In F. Pajares, & T. Urdan (Eds.),
Adolescence and education: Vol. 5. Self-efficacy and adolescence (pp. 307–337). Greenwich, CT:
Information Age.
   Braun, V., & Clarke, V. (2012). Thematic analysis. In H. Cooper, P. M. Camic, D. L. Long, A. T.
Panter, D. Rindskopf, & K. J. Sher (Eds.), APA handbook of research methods in psychology, Vol.
2. Research designs: Quantitative, qualitative, neuropsychological, and biological (pp. 57–71).
American Psychological Association.
   Charleer, S., Klerkx, J., Duval, E., De Laet, T., & Verbert, K. (2016). Creating effective learning
analytics dashboards: Lessons learnt. In European conference on technology enhanced learning
(42-56). Springer, Cham.
   Gudmundsdottir, G. B., & Hatlevik, O. E. (2018). Newly qualified teachers’ professional digital
competence: implications for teacher education. European Journal of Teacher Education, 41(2),
214-231.
   Hammond, M., Reynolds, L., & Ingram, J. (2011). How and why do student teachers use ICT?
Journal of Computer Assisted Learning, 27, 191–203.
   Hatlevik, O. E. (2017). Examining the relationship between teachers’ self-efficacy, their digital
competence, strategies to evaluate information, and use of ICT at school. Scandinavian Journal
of Educational Research, 61(5), 555-567.
   Herodotou, C., Rienties, B., Boroowa, A., Zdrahal, Z., & Hlosta, M. (2019). A large-scale
implementation of predictive learning analytics in higher education: the teachers’ role and
perspective. Educational Technology Research and Development, 67(5), 1273-1306.
   Howell, J. A., Roberts, L. D., Seaman, K., & Gibson, D. C. (2018). Are we on our way to
becoming a “helicopter university”? Academics’ views on learning analytics. Technology,
Knowledge and Learning, 23(1), 1-20.
   Klein, C., Lester, J., Rangwala, H., & Johri, A. (2019). Technological barriers and incentives to
learning analytics adoption in higher education: insights from users. Journal of Computing in
Higher Education, 31(3), 604-625.
   Krumsvik, R. J. (2011). Digital competence in Norwegian teacher education and schools.
Högre utbildning, 1, 39–51.
   Lallimo, J. & Sjöblom, A. (2020). Student-centered development of learning analytics at a
higher education institution. Companion Proceedings of the 10th International Conference on
Learning Analytics & Knowledge, 176-177.
   Larrabee Sønderlund, A., Hughes, E., & Smith, J. (2019). The efficacy of learning analytics
interventions in higher education: A systematic review. British Journal of Educational Technology,
50(5), 2594-2618.
   Park, Y., & Jo, I. H. (2019). Factors that affect the success of learning analytics dashboards.
Educational Technology Research and Development, 67(6), 1547-1571.
   Silvola, A., Näykki, P., Kaveri, A., & Muukkonen, H. (2021). Expectations for supporting
student engagement with learning analytics: An academic path perspective. Computers &
Education, 168, 104192.
   Teo, T. (2014). Unpacking teachers’ acceptance of technology: Tests of measurement invariance
and latent mean differences. Computers & Education, 75, 127–135.
   Viberg, O., Hatakka, M., Bälter, O., & Mavroudi, A. (2018). The current landscape of learning
analytics in higher education. Computers in Human Behavior, 89, 98-110.
   Zilvinskis, J., Willis III, J., & Borden, V. M. (2017). An overview of learning analytics. New
Directions for Higher Education, 2017(179), 9-17.
A. Questionnaire items for basic ICT self-efficacy (1), use of ICT (2), experience relating to
deployment of new LA/digital tools (3), forms of support (4) and experienced needs for materials
and challenges (5 & 6)
1. Evaluate the following statements about digital skills (scale: 1 (completely disagree) – 7
(completely agree))

    • I can use spreadsheets to draw figures
    • I can download and install programs
    • I can edit digital photos and graphics
    • I can form a database

   2. Evaluate the following statements about use of technology (scale: 1 (completely disagree)
– 7 (completely agree))

    • I find it fluent to use information technology for administrative tasks
    • I find it fluent to use information technology as a part of teaching during lectures and
      exercises
    • I find it fluent to use complex statistical figures
    • I find it fluent to start using new information technology tools

   3. Evaluate the following statements about new LA/digital tools: (scale: 1 (completely
disagree) – 7 (completely agree))
   General experience with new tools:

    • I find it inspiring that there are new tools available to support advising / teaching
    • The necessary amount of support is usually offered with new tools
    • In general, the time required to learn new tools is reasonable in relation to the benefits of
      the tool

  Training materials:

    • Availability of instructional material is important for fluent use of new tools
    • The usability of training materials (ease of following instructions, relevance) is important
      for smooth implementation

  4. What kind of deployment instructions / support services are / would be helpful? (binary
yes/no)

    • Instructional video
    • Thorough written instructions
    • Quick guide for use
    • Frequently asked questions
    • Learning game
    • Personal / group instruction sessions
    • Support person available to assist
    • Forum for users to share their experiences
    • Other, what?

  5. In general, what should the materials contain in order to be useful? What might be missing?
(open-ended)
  6. What things typically make the introduction of new LA/digital tools challenging in your
work? (open-ended)