<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta />
    <article-meta>
      <title-group>
        <article-title>Evaluating Usage of an Analytics Tool to Support Continuous Curriculum Improvement</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Isabel Hilliger</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Constanza Miranda</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Sergio Celis</string-name>
          <email>scelis@ing.uchile.cl</email>
          <xref ref-type="aff" rid="aff1">1</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Mar Pérez-SanAgustín</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
          <xref ref-type="aff" rid="aff2">2</xref>
        </contrib>
        <aff id="aff0">
          <label>0</label>
          <institution>Pontificia Universidad Católica de Chile</institution>
          ,
          <addr-line>Santiago</addr-line>
          ,
          <country country="CL">Chile</country>
        </aff>
        <aff id="aff1">
          <label>1</label>
          <institution>Universidad de Chile</institution>
          ,
          <addr-line>Santiago</addr-line>
          ,
          <country country="CL">Chile</country>
        </aff>
        <aff id="aff2">
          <label>2</label>
          <institution>Université Paul Sabatier Toulouse III, Institut de Recherche en Informatique de Toulouse (IRIT)</institution>
          ,
          <addr-line>Toulouse</addr-line>
          ,
          <country country="FR">France</country>
        </aff>
      </contrib-group>
      <pub-date>
        <year>2019</year>
      </pub-date>
      <abstract>
<p>Curriculum analytics (CA) consists of using analytical tools to collect and analyse educational data, such as program structure and course grading, to improve curriculum development and program quality. This paper presents an instrumental case study to illustrate the usage of a CA tool to help teaching staff collect evidence of competency attainment in an engineering school in Latin America. The CA tool was implemented during a 3-year continuous improvement process, in the context of an international accreditation. We collected and analysed data before and after tool implementation to evaluate its use by 124 teaching staff from 96 course sections. Data collection techniques included analysis of the documentary evidence collected for the continuous improvement process and teaching staff questionnaires. Findings show that the tool supported staff tasks related to the assessment of competency attainment at a program level. However, usability and functionality issues would have to be addressed for it to also support course redesign by providing actionable information about students' performance at an individual level. Lessons learned imply that institutions could adapt and adopt existing CA tools to support curriculum analysis by investing not only in tool development, but also in capacity building for their use in continuous improvement processes.</p>
      </abstract>
      <kwd-group>
        <kwd>Curriculum Analytics</kwd>
        <kwd>Case Study</kwd>
        <kwd>Competency-based Curriculum</kwd>
        <kwd>Continuous Improvement</kwd>
        <kwd>Higher Education</kwd>
      </kwd-group>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
<title>Introduction</title>
<p>Within higher education, different types of analytics have been implemented to tackle the painstaking work required to improve learning results at different levels [1], [2]. One of these analytical approaches is Curriculum Analytics (CA), defined as the collection, analysis and visualization of administrative curricular data, such as course enrolment and student grades, to inform and support decision making at a program level [3]. In specific educational contexts, CA has been proposed as a good solution for lightening the workload required to identify courses where the improvement of learning results is crucial to program success [4]. This workload is particularly high in competency-based curriculums because each course emphasizes specific core competencies within an academic plan [5]. Traditionally, the analysis of this type of curriculum requires assessing the alignment between program- and course-level competencies, determining whether both types of competencies are attained, and formulating action plans to improve teaching and learning [4]. The emergence of CA techniques opens new possibilities to perform these tasks in a timely and cost-effective manner [4].</p>
<p>However, the CA promise of supporting curriculum analysis is far from fulfilled. Several tools have been proposed to identify gateway courses in a curriculum and improve its outcomes [4], but managers and teaching staff still perceive that they lack systematic information for course improvement, such as students' academic results in the courses they have taken and core competency attainment [5]. Research on CA adoption is still at an early stage, so most of the tools developed propose solutions to institutional needs that are not necessarily related to any existing improvement process [6]. In some cases, using technology to help stakeholders visualize available institutional data is a good means of supporting data-driven decision-making and promoting transparency in certain educational processes. However, prior work suggests that, in order to leverage the potential of CA tools, an institution needs to be ready to address data-driven changes [6], having already implemented processes to examine what is and is not working at different levels [1].</p>
      <p>
        To illustrate how the use of CA tools could systematically support continuous curriculum improvement over an extended period of time, we present an instrumental case study about a continuous improvement process implemented in an engineering school in Latin America (UC-Engineering). This process was implemented between 2015 and 2017 to comply with the North American Accreditation Board for Engineering and Technology (ABET), and in mid-2016 a CA tool was implemented to support teaching staff. To evaluate the teaching staff's usage of this tool in five competency-based programs, we collected and analysed data before and after tool implementation. The following data collection techniques were used: 1) analysis of documentary evidence reported for the continuous improvement process by 124 teaching staff in 96 course sections, and 2) teaching staff questionnaires applied to 25 out of 63 teaching assistants who interacted with the CA tool. These two types of data sources were triangulated to evaluate tool usage from the teaching staff's perspective, exploring its implications in terms of tool usefulness and ease of use.
      </p>
    </sec>
    <sec id="sec-2">
      <title>Curriculum Analytics Tools</title>
      <p>Over the past decade, researchers have developed CA tools to address multiple
educational challenges from the perspective of different stakeholders. Table 1 summarizes
some of the tools that have been documented in recent conference proceedings and
journal articles describing the institution in which they were developed, and their
distinctive features for different users. Concerning students, tools 1 and 2 were developed
with the objective of changing their approach to study, using visualizations of their
performance to motivate them to adopt help-seeking behaviours. Regarding teaching
staff and managers, tools 3-10 were developed with the objective of providing
information to identify students who are facing difficulties in their studies, expecting staff
to reach out to their students and provide guidance. Finally, tools 11-14 were developed
for students or staff, aiming to help them to identify crucial courses in a curriculum,
and anticipate the impact of course-level improvements in competency attainment at a
program level.</p>
      <p>Most of the tools presented in Table 1 do not provide information about the
methodology used to evaluate its impact on any existing institutional processes. By
methodology, we mean experimental approaches to determine whether the tool is effectively
fulfilling its objective [7]. For instance, little is known about how students perceived
the tools developed to promote help-seeking behaviours (such as tool 1) [8]. Regarding
tools that aim to identify students at risk (such as tool 9), there is no clear relationship
between the use of the tool and student outcomes yet [8]. And, concerning the tools for
monitoring course-level improvement and competency attainment, researchers have
just started to evaluate their adoption by collecting information about user perceptions
[5], without necessarily reporting data about its actual use to drive improvements in
program design or academic program delivery [9].</p>
      <p>Building upon this latter idea, we present a case study that evaluates teaching staff usage of a CA tool in the context of a continuous improvement process that had already been installed at an institutional level. Our aim is not only to present the results of evaluating the tool from the users' perspective, but also to analyse how it was used to support an existing process for curriculum improvement.</p>
      <table-wrap id="table1">
        <label>Table 1</label>
        <caption>
          <p>CA tools documented in recent conference proceedings and journal articles, the institutions in which they were developed, and their distinctive features for different users.</p>
        </caption>
        <table>
          <thead>
            <tr>
              <th>Tool name and user</th>
              <th>Developer</th>
              <th>Distinctive features [References]</th>
            </tr>
          </thead>
          <tbody>
            <tr>
              <td>9. Course Signals (focused on teaching staff)</td>
              <td>Purdue University, USA (licensed to Ellucian)</td>
              <td>A set of risk indicators to classify students in subgroups [1], [8]</td>
            </tr>
            <tr>
              <td>10. Student Flow Diagrams (focused on managers and teaching staff)</td>
              <td>University of New Mexico, USA</td>
              <td>Sankey diagram to understand students' academic trajectories [1]</td>
            </tr>
            <tr>
              <td>11. Curricular Analytics (focused on managers and teaching staff)</td>
              <td>University of New Mexico, USA</td>
              <td>Interactive graphical representation of the curriculum [1]</td>
            </tr>
            <tr>
              <td>12. Visualized Analytics of Core Competencies-VACCs (focused on students)</td>
              <td>Yuan Ze University, Taiwan (currently used at Yuan Ze University)</td>
              <td>Visualization of competency attainment in radar charts regarding grades, credit hours and peer performance [5]</td>
            </tr>
            <tr>
              <td>13. Competency Analytics Tool-CAT (focused on managers and staff)</td>
              <td>Singapore Management University, Singapore</td>
              <td>Curriculum progression statistics based on competency map and course information [4]</td>
            </tr>
            <tr>
              <td>14. Course University Study Portal-CUSP (focused on managers and teaching staff)</td>
              <td>University of Sydney, Australia</td>
              <td>Web-based application to model competency development in 5 maturity levels [11]</td>
            </tr>
          </tbody>
        </table>
      </table-wrap>
    </sec>
    <sec id="sec-3">
      <title>Methods</title>
      <sec id="sec-3-1">
        <title>Study Design and Objectives</title>
        <p>According to Zelkowitz [7], instrumental case studies are useful to determine if a technological tool makes it easier to produce something compared to prior scenarios. Since the objective of this study is to evaluate the usage of a CA tool throughout a continuous improvement process, we selected the instrumental case study as the most appropriate methodology. The case study took place at UC-Engineering between 2015 and 2017. In this case, we specifically evaluated whether the CA tool supported teaching staff efforts to collect documentary evidence to account for competency attainment.</p>
      </sec>
      <sec id="sec-3-2">
        <title>Case Study Context and Proposition</title>
        <p>
          In the early 2000s, UC-Engineering decided to accredit five academic programs by ABET, which concentrated 35% of its undergraduate enrolment (1,500 students out of a total of 4,000): 1) Civil Engineering, 2) Electrical Engineering, 3) Computer Engineering, 4) Mechanical Engineering, and 5) Chemical Engineering. In 2015, the managers of the Office for Undergraduate Studies and the Engineering Education Unit designed a continuous improvement process to be implemented for the renewal of the ABET accreditation of these five programs. The continuous improvement process was organized in six semesters between the first semester of 2015 and the second semester of 2017. Every semester, teaching staff were required to undertake the tasks described in Fig. 1. Before the semester started, they had to participate in a workshop to revise their course syllabuses and make sure the competencies declared at a course level were aligned with those declared at a program level. At the beginning of the semester, they had to report an assessment plan to account for competency attainment in their courses (two competencies per course in most cases). Once the semester finished, they had to report competency assessment results, which were transformed into percentages of competency attainment to be revised in an end-of-semester curriculum meeting (curriculum discussions). Additionally, they might have been required to submit a sample of the assessment methods used to assess the assigned competencies, to inform curriculum decision-making (if needed).
        </p>
        <p>At the end of the continuous improvement process, 104 assessment plans
(http://bit.ly/2SYxWxc) and spreadsheets of competency assessment results
(http://bit.ly/2VOdUKx) were expected to be reported by 124 teaching staff from 96
course sections, analysing the attainment of the 11 competencies proposed by ABET
Criterion 3 (http://bit.ly/2SeVzRj).</p>
        <p>Fig. 1. Semester tasks that teaching staff had to undertake for the continuous
improvement process implemented at UC-Engineering between 2015 and 2017.</p>
        <p>
          By the end of 2015, 38 assessment plans and competency assessment results had already been collected from 29 course sections for an interim report to be sent to ABET in June 2016. These documents were attached and sent in e-mails by teaching staff, and then uploaded to Dropbox folders by UC-Engineering managers. For curriculum discussions, analysts of the Engineering Education Unit had to transform graded assessment results into percentages of competency attainment, so this information was only available at the end of the semester. By automating this analysis, program chairs and teaching staff would have access to competency attainment results whenever they needed them for their decision-making. This motivated UC-Engineering managers to implement a CA tool developed jointly by the University of Sydney and U-Planner (https://www.u-planner.com/en-us/home), a Latin American company that provides technological solutions and research services to higher education institutions. This tool was originally developed to facilitate the alignment between the competencies in a graduate profile and the teaching and assessment methods of the different courses within a program (see Fig. 2), and to collect documentary evidence of competency assessment at a course level as attachments. In order to include an automated visualization of competency attainment in the CA tool, U-Planner had to implement an ETL process to integrate course grading results from the LMS, and then create a report that transformed these results into percentages of competency attainment (see Fig. 3). After validating the report for 12 courses during the first semester of 2016, the CA tool was implemented to support the continuous improvement process from the second semester of 2016 onwards. At the end of 2017, the CA tool was not only used to collect all the documentary evidence required for accreditation purposes (syllabuses, course descriptions, and assessment plans), but also to visualize competency attainment results in each course for different academic periods. The percentages of competency attainment were calculated by dividing each student grade by the maximum score of an assessment method, and then multiplying this result by 100. Subsequently, each percentage of competency attainment was classified into one of four performance levels according to thresholds established by teaching staff. These performance levels were: 1) unsatisfactory, 2) developing, 3) satisfactory, and 4) exemplary (see Fig. 3).
        </p>
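        <p>The attainment calculation just described can be sketched in a few lines of Python; the threshold values below are illustrative assumptions, since the actual cut-offs were established by the teaching staff:</p>
        <preformat>
```python
import bisect

# Boundaries between the four performance levels (illustrative assumption;
# the paper states the actual thresholds were set by teaching staff)
THRESHOLDS = [50, 70, 85]
LEVELS = ["unsatisfactory", "developing", "satisfactory", "exemplary"]

def attainment_percentage(grade, max_score):
    """Divide a student grade by the maximum score of an assessment
    method, then multiply the result by 100."""
    return grade / max_score * 100

def performance_level(percentage):
    """Classify a percentage of competency attainment into one of the
    four performance levels defined by the thresholds."""
    return LEVELS[bisect.bisect_right(THRESHOLDS, percentage)]
```
        </preformat>
        <p>With a grade of 5.5 out of 7.0, for instance, the attainment percentage is roughly 78.6, which these illustrative thresholds would classify as 'satisfactory'.</p>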
        <p>
          Fig. 2. Screenshot of the UC-Engineering CA tool for supporting continuous improvement. Through this tool, teaching staff performed the tasks required for the continuous improvement process since the first semester of 2016.
        </p>
        <p>Fig. 4 depicts the period considered for the case study, indicating the period in which the tool was implemented. The tool aimed to support the continuous curriculum improvement process by facilitating the following activities (see Fig. 1 for the semester tasks):
(1) Filling in a course description form to broadly describe the teaching and assessment methods.
(2) Indicating the relationship between competencies at both program and course level (Fig. 2).
(3) Listing performance indicators for competencies at a program level that could be assessed at a course level.
(4) Aligning performance indicators with graded assessment methods at a course level.
(5) Generating automated reports on percentages of competency attainment (Fig. 3). This functionality is integrated with the institutional LMS to automatically capture the students' grades for the aligned graded assessment methods.
(6) Uploading the following documentary evidence as attachments: course syllabus; assessment plans; competency assessment results; sample of assessment methods.</p>
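        <p>Activities (4) and (5) can be sketched as follows; the data structures, method names and grades are hypothetical illustrations, not the tool's actual data model:</p>
        <preformat>
```python
# Hypothetical alignment of one performance indicator with the graded
# assessment methods that measure it (activity 4); names are illustrative
alignment = {
    "Communicates constructively with other classmates": [
        "peer_review",
        "team_report",
    ],
}

# Grades captured automatically from the LMS:
# assessment method -> {student: (grade, max_score)}
lms_grades = {
    "peer_review": {"s1": (6.0, 7.0), "s2": (4.0, 7.0)},
    "team_report": {"s1": (80, 100), "s2": (60, 100)},
}

def indicator_attainment(indicator):
    """Average percentage of competency attainment across all graded
    assessment methods aligned with one performance indicator (activity 5)."""
    percentages = [
        grade / max_score * 100
        for method in alignment[indicator]
        for grade, max_score in lms_grades[method].values()
    ]
    return sum(percentages) / len(percentages)
```
        </preformat>
        <p>A report like the one in Fig. 3 would then aggregate such per-indicator percentages for each course and academic period.</p>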
        <p>Fig. 3. Screenshot of automated report of percentages of competency attainment at a
course level generated by the CA tool regarding the performance indicator
’Communicates constructively with other classmates’ for the effective communication
competency.</p>
        <p>Fig. 4. Summary of the continuous improvement process implemented at UC-Engineering. Light grey dots indicate the semesters in which the teaching staff tasks were not supported by the CA tool, and dark grey dots the periods in which the tool was implemented as part of the process.</p>
      </sec>
      <sec id="sec-3-3">
        <title>Case Study Participants, Data Gathering Techniques and Analysis</title>
        <p>Between 2015 and 2017, 124 members of the teaching staff participated in the continuous improvement process implemented at UC-Engineering. These 124 teaching staff included 61 teachers (44 faculty members and 17 part-time instructors) who reported documentary evidence of competency attainment in 96 course sections, and 63 teaching assistants who supported teachers in outcome assessment tasks (see Table 2).
(*) Total number of staff members who reported documentary evidence between 2015 and 2017.
(**) Number of staff members who were involved in CA use after tool implementation (42 faculty members and part-time instructors, and 63 teaching assistants).
The evaluation process was organized into two phases. The first phase consisted of evaluating how the CA tool was used to facilitate the collection of documentary evidence for analysing competency attainment in curriculum discussions. The documentary evidence consisted of any documentation reported to account for competency attainment at a course level, such as: course syllabus, course description, competency assessment results, and samples of assessment methods (see Fig. 1). In order to compare the number and type of documentary evidence generated before and after the CA tool was implemented, three researchers used a coding scheme to classify the evidence reported by each teaching staff member in each course section. This scheme was developed through a bottom-up coding approach, and each category was defined by examining the files uploaded to both Dropbox and the CA tool. From this bottom-up approach, six categories emerged, and the three researchers used these categories to assign scores of 0 or 1 to account for the type of documentary evidence reported every semester (see Table 3): 1) reported assessment plans, 2) reported a sample of assessment methods, 3) reported competency attainment results, 4) reported course syllabus, 5) included a course description, and 6) used the automated report of the CA tool. The scores assigned therefore ranged from 0 to 6 in each course section, in which a score equal to 0 indicates a minimum amount and variety of evidence reported, and a score equal to 6 indicates a maximum amount and variety of evidence.</p>
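        <p>The per-section evidence score described above can be sketched as follows; the shorthand category names are assumptions for illustration:</p>
        <preformat>
```python
# The binary (0/1) categories from the coding scheme; the identifiers are
# shorthand assumptions for the categories listed in the text
CATEGORIES = [
    "assessment_plan",
    "sample_of_assessment_methods",
    "competency_attainment_results",
    "course_syllabus",
    "course_description",
    "automated_report",
]

def evidence_score(reported):
    """Sum the 0/1 scores across the categories for one course section:
    0 means a minimum amount and variety of evidence, 6 a maximum."""
    return sum(1 for category in CATEGORIES if reported.get(category))
```
        </preformat>
        <p>For example, a course section that reported only an assessment plan and a course syllabus would receive a score of 2.</p>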
        <table-wrap id="table3">
          <label>Table 3</label>
          <caption>
            <p>Coding scheme used to classify the documentary evidence reported in each course section.</p>
          </caption>
          <table>
            <thead>
              <tr>
                <th>Category (score)</th>
                <th>Description</th>
              </tr>
            </thead>
            <tbody>
              <tr>
                <td>Reported competency attainment results (0)</td>
                <td>The teaching staff did not report competency attainment results at a course section level.</td>
              </tr>
              <tr>
                <td>Reported competency attainment results (1)</td>
                <td>The course reported competency attainment results based on graded assessments at a course level.</td>
              </tr>
              <tr>
                <td>Reported course syllabus (0)</td>
                <td>The teaching staff did not report the course syllabus to complement the evidence items for the accreditation process.</td>
              </tr>
              <tr>
                <td>Reported course syllabus (1)</td>
                <td>The teaching staff reported the course syllabus as a complement to other evidence items reported for the accreditation process.</td>
              </tr>
              <tr>
                <td>Included a course description (0)</td>
                <td>The teaching staff did not include a course description among the evidence items uploaded in the CA tool.</td>
              </tr>
              <tr>
                <td>Included a course description (1)</td>
                <td>The teaching staff included a course description among the evidence items uploaded in the CA tool.</td>
              </tr>
              <tr>
                <td>Reported percentages of competency attainment (0)</td>
                <td>The teaching staff did not report percentages of competency attainment.</td>
              </tr>
              <tr>
                <td>Reported percentages of competency attainment (1)</td>
                <td>The teaching staff reported percentages of competency attainment at a program level.</td>
              </tr>
            </tbody>
          </table>
        </table-wrap>
        <p>
          The second phase consisted of exploring the perceived usefulness and ease of use of the CA tool from the viewpoint of its users. Researchers have recently proposed instruments to evaluate the impact of analytical tools on educational practices [2], but these instruments have so far been implemented only at a course level in controlled environments. Therefore, we decided to develop a paper-based questionnaire based on the prior work of Ali et al. [12], considering that their objective was also to explore teaching staff's perspectives in a real-life context. Our questionnaire consisted of a closed-ended and an open-ended question section (http://bit.ly/2Jh3VVG). The closed-ended section consisted of a 5-point Likert scale to determine the level of staff's agreement with different items related to perceived usefulness and perceived ease of use, while the open-ended section included the following questions to understand usability and ease-of-use implications from an exploratory approach:
• What use would you give to the CA tool after interacting with it?
• What kind of information would you expect from this tool?
• What do you think the CA tool lacks in terms of information and functionality?
In order to gather information on the user experience once the CA tool implementation process was well advanced, we applied the questionnaire during a workshop for teaching assistants held at the beginning of the second semester of 2017 (see semester tasks in Fig. 1). A total of 25 teaching assistants who attended this workshop responded to the questionnaire voluntarily (representing 25 out of the 63 teaching assistants who had supported outcome assessment tasks since the second semester of 2016). After transcribing their responses, we estimated the percentage of respondents who agreed with these items by counting the number of respondents whose scores were equal to or higher than 4, and then dividing that count by the total number of respondents (see Fig. 5).
        </p>
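        <p>The agreement estimate described above amounts to a one-line calculation; a minimal sketch:</p>
        <preformat>
```python
def agreement_percentage(scores):
    """Count respondents whose Likert score is equal to or higher than 4,
    then divide by the total number of respondents."""
    agreed = sum(1 for s in scores if s >= 4)
    return 100 * agreed / len(scores)
```
        </preformat>
        <p>For instance, 23 agreeing respondents out of 25 yields the 92% reported for the item on usefulness for institutional management.</p>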
        <p>Fig. 5. Percentage of questionnaire respondents who agreed with each item: 'In general, the CA tool seems useful for institutional management' (92%); 'It is easy to learn how to use the CA tool' (84%); 'The purpose of using the CA tool is clear and understandable' (80%); 'The CA tool allows me to obtain information easily' (60%); 'The CA tool allows me to obtain more information about courses than other tools' (56%).</p>
        <p>Once all the evidence items and questionnaires were analysed, we triangulated the questionnaire and document analysis results from phase 1 (see Fig. 5 and Fig. 6). This process consisted of contrasting the amount and variety of documentary evidence reported by teaching staff before and after tool implementation, in addition to analysing the perspectives of questionnaire respondents on tool usage for curriculum analysis based on evidence of competency attainment.</p>
      </sec>
    </sec>
    <sec id="sec-4">
      <title>Case Study Findings</title>
      <p>
        The main findings of the case study are summarized in Table 5. Firstly, the CA tool helped teaching staff collect a greater number and variety of evidence, providing visualizations of competency attainment at a program level (Finding 1 in Table 5). Document analysis results show that the number and variety of evidence items reported per course section increased from two to five after the CA tool was implemented (see Fig. 6). In most cases, the three additional items included course syllabuses, course descriptions, and the percentages of competency attainment. Before the CA tool was implemented, teaching staff delegated the transformation of graded assessment results into percentages of competency attainment to professionals in the Engineering Education Unit. However, once the CA tool was implemented, 89% of the course sections relied on the automated report provided by the tool to account for competency attainment at a course level. This effort to collect a greater number and variety of evidence was not due to greater administrative pressure, since UC-Engineering managers presented a report to ABET in the first semester of 2016, and all subsequent work was done solely with the motivation to continuously improve the curriculum.
      </p>
      <p>Additionally, the results of the questionnaire show that 92% of respondents agreed with the item 'In general, the CA tool seems useful for institutional management' (see Fig. 5). In the open-ended questions, teaching assistants mentioned that the CA tool facilitated the use of evidence to account for the implementation of a competency-based curriculum throughout the accreditation process. They also mentioned its potential use to provide students with information about course methods and their alignment with competencies from the graduate profile.</p>
      <p>Secondly, teaching staff identified usability and functionality issues that affect the use of evidence to inform course redesign and students' understanding of their performance (Finding 2 in Table 5). The results of the teaching staff questionnaires show that 56% agreed with the item 'The CA tool allows me to obtain more information about courses than other tools (such as the institutional LMS and a web application to search for course information).' In the open-ended section, respondents claimed that the CA tool had usability issues. For example, respondents indicated that the tool views had too many tabs and fields, which hindered the loading of information. Respondents also mentioned functionality issues, such as the lack of feedback about the quality of the course information uploaded and the documentary evidence reported. In addition, the tool lacked actionable information on student performance for providing timely and specific feedback throughout the semester.</p>
      <p>Furthermore, questionnaire respondents indicated that, although they were able to upload the documentary evidence to account for competency attainment, they perceived that the CA tool site was not intuitive enough. The fields to be completed, as well as the files to be uploaded, were not clearly explained. This may partly explain why only 16% of course sections reported samples of assessment tools. These results suggest that the CA tool could be improved by including 'help messages' or indications related to the process within the platform.</p>
      <table-wrap id="table5">
        <label>Table 5</label>
        <caption>
          <p>Main findings of the case study, with document analysis and questionnaire results and their supporting data.</p>
        </caption>
        <table>
          <thead>
            <tr>
              <th>Findings</th>
              <th>Document analysis and questionnaire results</th>
              <th>Supporting data</th>
            </tr>
          </thead>
          <tbody>
            <tr>
              <td rowspan="3">1. The CA tool helped teaching staff collect a greater number and variety of evidence, providing visualizations of competency attainment at a program level.</td>
              <td>Course sections reported 3 additional evidence items for the accreditation process after the CA tool was implemented.</td>
              <td>Document analysis results (Fig. 6)</td>
            </tr>
            <tr>
              <td>89% of course sections used the automated reporting feature of the CA tool to account for competency attainment at a course level (40 out of 45).</td>
              <td>Screenshot of automated report of competency attainment (Fig. 3)</td>
            </tr>
            <tr>
              <td>92% of teaching staff agreed with the questionnaire item 'In general, the CA tool seems useful for institutional management' (23 out of 25).</td>
              <td>Teaching staff questionnaire (Fig. 5)</td>
            </tr>
            <tr>
              <td rowspan="2">2. Teaching staff identified usability and functionality issues in the CA tool that affect the use of evidence to inform course redesign and students' understanding of performance.</td>
              <td>56% of teaching staff agreed with the questionnaire item 'The CA tool allows me to obtain more information about courses than other tools.' (14 out of 25)</td>
              <td>Teaching staff questionnaire (Fig. 5)</td>
            </tr>
            <tr>
              <td>16% of course sections reported samples of assessment methods after the CA tool was implemented (7 out of 45).</td>
              <td>Document analysis results (Fig. 6)</td>
            </tr>
          </tbody>
        </table>
      </table-wrap>
      <p>This case study showed the results of the usage of a CA tool in a long-term continuous improvement process. The main finding shows that the use of the CA tool helped teaching staff collect a greater number and variety of documentary evidence of competency attainment at a program level, allowing staff to upload documents that were previously shared or stored elsewhere, such as course syllabuses.</p>
      <p>Furthermore, the CA tool provided staff with a visualization of competency attainment throughout the whole continuous improvement process, information that was previously shown and discussed only in curriculum meetings at the end of the semester. This finding not only reflects that CA tools could be a good solution to reduce the task workload of curriculum analysis [2], but also that this type of solution provides visualizations of student performance that are typically hidden in institutional processes. This is particularly valuable for competency-based curriculums because it provides an overview of the competencies attained by the students at a program level. Teaching staff usually face difficulties in competency assessment because competencies tend to be abstract and complicated to assess at a course level, while managers deal with the complexity of understanding learning results in a hierarchical structure of courses [3]. With this new tool, managers and teaching staff could have more actionable information for making better decisions to reinforce the required competencies at a program level.</p>
      <p>Although this work has illustrated that a CA tool can support curriculum analysis in
the context of an accreditation process, some needs of teaching staff have still not
been met by the tool under study. Finding 2 in Table 5 indicates that there are
usability and functionality issues that prevent teaching staff from using the tool to obtain
information to redesign courses and to understand students’ performance at an
individual level. These staff needs have already motivated the development of other tools
[8]. To address these issues, we could specify redesign requirements for the CA
tool by using the information we have already collected to evaluate its usage. Thus, the lessons
learned from this case study imply that institutions could adopt and adapt existing CA
tools to support curriculum analysis, investing not only in tool development and
redesign, but also in capacity building for evidence-informed continuous improvement [1].</p>
      <p>Along these lines, one of the main contributions of this paper is that it evaluates the
use of a CA tool in an existing institutional process over an extended period of time,
going beyond current evaluation strategies, which mostly employ self-reported data
without relying on a technology validation methodology [5]. The document analysis
considered evidence items from the whole 3-year period in which the continuous
improvement process was implemented. This long-term period ensured that enough
information was collected to determine whether the CA tool facilitated teaching staff efforts to collect
evidence of competency attainment before and after its implementation [7].</p>
      <p>Yet, this study has its limitations. Questionnaire results represented only a small
sample of the teaching staff members who interacted with the CA tool. To understand the
contribution of the CA tool for a larger group of teachers, future work would have to
explore the type of information most staff need to inform course or curriculum redesign.
We anticipate that this work might require monitoring learning results over an extended
period of time, as we did in this case study [5], so that we can further expand current
knowledge on the impact of CA tools on curriculum improvement.</p>
    </sec>
    <sec id="sec-5">
      <title>Conclusions</title>
      <p>This case study illustrates the usage of a CA tool to support a continuous improvement
process for a competency-based curriculum in a university setting. Findings show that
CA tools support curriculum analysis when they are implemented to help teaching staff
to cope with tasks they are already performing for an existing institutional process, such
as course planning and competency assessment. In this study, the CA tool not only
facilitated the collection of documentary evidence to account for competency
attainment at a program level, but also provided visualizations of competency attainment
results that are usually not available to staff. Although staff felt comfortable
using the tool to upload evidence of competency attainment, usability and
functionality issues should still be addressed in future versions of this tool. Future work on CA
adoption should focus not only on tool development, but also on evaluating its
impact on the formulation of curriculum improvement actions.</p>
    </sec>
    <sec id="sec-6">
      <title>Acknowledgements</title>
      <p>This work was funded by the EU LALA project (grant no.
586120-EPP-1-2017-1-ES-EPPKA2-CBHE-JP). The authors would like to thank Camila Aguirre from U-Planner
for collaborating with this study, and the reviewers for their constructive suggestions.</p>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>
          <string-name>
            <given-names>M. D.</given-names>
            <surname>Pistilli</surname>
          </string-name>
          and
          <string-name>
            <given-names>G. L.</given-names>
            <surname>Heileman</surname>
          </string-name>
          , “
          <article-title>Guiding Early and Often: Using Curricular and Learning Analytics to Shape Teaching, Learning and Student Success in Gateway Courses</article-title>
          ,”
          <source>New Dir. High. Educ.</source>
          , vol.
          <volume>180</volume>
          , pp.
          <fpage>21</fpage>
          -
          <lpage>30</lpage>
          ,
          <year>2017</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>
          <string-name>
            <given-names>M.</given-names>
            <surname>Scheffel</surname>
          </string-name>
          , “
          <article-title>The Evaluation Framework for Learning Analytics</article-title>
          ,” PhD thesis, Open Universiteit, The Netherlands,
          <year>2017</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>
          <string-name>
            <given-names>X.</given-names>
            <surname>Ochoa</surname>
          </string-name>
          , “
          <article-title>Simple metrics for curricular analytics</article-title>
          ,”
          in
          <source>CEUR Workshop Proceedings</source>
          ,
          <year>2016</year>
          , vol.
          <volume>1590</volume>
          , pp.
          <fpage>20</fpage>
          -
          <lpage>26</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>
          <string-name>
            <given-names>S.</given-names>
            <surname>Gottipati</surname>
          </string-name>
          and
          <string-name>
            <given-names>V.</given-names>
            <surname>Shankararaman</surname>
          </string-name>
          , “
          <article-title>Competency analytics tool: Analyzing curriculum using course competencies</article-title>
          ,”
          <source>Educ. Inf. Technol.</source>
          , pp.
          <fpage>1</fpage>
          -
          <lpage>20</lpage>
          ,
          <year>2017</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>
          Comput., vol.
          <volume>5</volume>
          , no.
          <issue>1</issue>
          , pp.
          <fpage>32</fpage>
          -
          <lpage>44</lpage>
          ,
          <year>2015</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>
          <string-name>
            <given-names>C.</given-names>
            <surname>Klein</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.</given-names>
            <surname>Lester</surname>
          </string-name>
          ,
          <string-name>
            <given-names>H.</given-names>
            <surname>Rangwala</surname>
          </string-name>
          ,
          and
          <string-name>
            <given-names>A.</given-names>
            <surname>Johri</surname>
          </string-name>
          , “
          <article-title>Learning Analytics Tools in Higher Education: Adoption at the Intersection of Institutional Commitment and Individual Action</article-title>
          ,”
          <source>Rev. High. Educ.</source>
          , vol.
          <volume>42</volume>
          , no.
          <issue>2</issue>
          , pp.
          <fpage>565</fpage>
          -
          <lpage>593</lpage>
          ,
          <year>2018</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>
          <string-name>
            <given-names>M. V.</given-names>
            <surname>Zelkowitz</surname>
          </string-name>
          , “
          <article-title>An update to experimental models for validating computer technology</article-title>
          ,”
          <source>J. Syst. Softw.</source>
          , vol.
          <volume>82</volume>
          , no.
          <issue>3</issue>
          , pp.
          <fpage>373</fpage>
          -
          <lpage>376</lpage>
          ,
          <year>2009</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>
          <string-name>
            <given-names>N.</given-names>
            <surname>Sclater</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Peasgood</surname>
          </string-name>
          , and
          <string-name>
            <given-names>J.</given-names>
            <surname>Mullan</surname>
          </string-name>
          , “
          <article-title>Learning Analytics in Higher Education: A review of UK and international practice</article-title>
          ,”
          <year>2016</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref9">
        <mixed-citation>
          <string-name>
            <given-names>J.</given-names>
            <surname>Greer</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Molinaro</surname>
          </string-name>
          ,
          <string-name>
            <given-names>X.</given-names>
            <surname>Ochoa</surname>
          </string-name>
          , and
          <string-name>
            <given-names>T.</given-names>
            <surname>Mckay</surname>
          </string-name>
          ,
          “
          <article-title>Proceedings of 1st Learning Analytics for Curriculum and Program Quality Improvement Workshop</article-title>
          ,” in
          <source>6th International Learning Analytics &amp; Knowledge Conference</source>
          ,
          <year>2016</year>
          , pp.
          <fpage>1</fpage>
          -
          <lpage>24</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref10">
        <mixed-citation>
          Technol., vol.
          <volume>11</volume>
          , no.
          <issue>3</issue>
          , pp.
          <fpage>389</fpage>
          -
          <lpage>399</lpage>
          ,
          <year>2018</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref11">
        <mixed-citation>
          <string-name>
            <given-names>R.</given-names>
            <surname>Gluga</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.</given-names>
            <surname>Kay</surname>
          </string-name>
          , and
          <string-name>
            <given-names>T.</given-names>
            <surname>Lever</surname>
          </string-name>
          , “
          <article-title>Is Five Enough? Modeling Learning Progression in Ill-Defined Domains at Tertiary Level</article-title>
          ,” in
          <source>IllDef2010: ITS2010 Workshop on Intelligent Tutoring Technologies for Ill-Defined Problems and Ill-Defined Domains</source>
          ,
          <year>2010</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref12">
        <mixed-citation>
          <string-name>
            <given-names>L.</given-names>
            <surname>Ali</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Asadi</surname>
          </string-name>
          , and
          <string-name>
            <given-names>D.</given-names>
            <surname>Gašević</surname>
          </string-name>
          , “
          <article-title>Factors influencing beliefs for adoption of a learning analytics tool: An empirical study</article-title>
          ,
          <source>” Comput. Educ.</source>
          , vol.
          <volume>62</volume>
          , pp.
          <fpage>130</fpage>
          -
          <lpage>148</lpage>
          ,
          <year>2013</year>
          .
        </mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>