<?xml version="1.0" encoding="UTF-8"?>
<TEI xml:space="preserve" xmlns="http://www.tei-c.org/ns/1.0" 
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" 
xsi:schemaLocation="http://www.tei-c.org/ns/1.0 https://raw.githubusercontent.com/kermitt2/grobid/master/grobid-home/schemas/xsd/Grobid.xsd"
 xmlns:xlink="http://www.w3.org/1999/xlink">
	<teiHeader xml:lang="en">
		<fileDesc>
			<titleStmt>
				<title level="a" type="main">Evaluating Usage of an Analytics Tool to Support Continuous Curriculum Improvement</title>
			</titleStmt>
			<publicationStmt>
				<publisher/>
				<availability status="unknown"><licence/></availability>
			</publicationStmt>
			<sourceDesc>
				<biblStruct>
					<analytic>
						<author>
							<persName><forename type="first">Isabel</forename><surname>Hilliger</surname></persName>
							<email>ihillige@uc.cl</email>
							<affiliation key="aff0">
								<orgName type="institution">Pontificia Universidad Católica de Chile</orgName>
								<address>
									<settlement>Santiago</settlement>
									<country key="CL">Chile</country>
								</address>
							</affiliation>
						</author>
						<author>
							<persName><forename type="first">Constanza</forename><surname>Miranda</surname></persName>
							<email>csmirand@uc.cl</email>
							<affiliation key="aff0">
								<orgName type="institution">Pontificia Universidad Católica de Chile</orgName>
								<address>
									<settlement>Santiago</settlement>
									<country key="CL">Chile</country>
								</address>
							</affiliation>
						</author>
						<author>
							<persName><forename type="first">Sergio</forename><surname>Celis</surname></persName>
<email>scelis@ing.uchile.cl</email>
							<affiliation key="aff1">
								<orgName type="institution">Universidad de Chile</orgName>
								<address>
									<settlement>Santiago</settlement>
									<country key="CL">Chile</country>
								</address>
							</affiliation>
						</author>
						<author>
							<persName><forename type="first">Mar</forename><surname>Pérez-Sanagustín</surname></persName>
							<affiliation key="aff0">
								<orgName type="institution">Pontificia Universidad Católica de Chile</orgName>
								<address>
									<settlement>Santiago</settlement>
									<country key="CL">Chile</country>
								</address>
							</affiliation>
							<affiliation key="aff2">
								<orgName type="department">Institut de Recherche en Informatique de Toulouse (IRIT)</orgName>
								<orgName type="institution">Université Paul Sabatier Toulouse III</orgName>
								<address>
									<settlement>Toulouse</settlement>
									<country key="FR">France</country>
								</address>
							</affiliation>
						</author>
						<title level="a" type="main">Evaluating Usage of an Analytics Tool to Support Continuous Curriculum Improvement</title>
					</analytic>
					<monogr>
						<imprint>
							<date/>
						</imprint>
					</monogr>
					<idno type="MD5">30D94B242FE7D939E443E6FA6D84B651</idno>
				</biblStruct>
			</sourceDesc>
		</fileDesc>
		<encodingDesc>
			<appInfo>
				<application version="0.7.2" ident="GROBID" when="2023-03-24T02:23+0000">
					<desc>GROBID - A machine learning software for extracting information from scholarly documents</desc>
					<ref target="https://github.com/kermitt2/grobid"/>
				</application>
			</appInfo>
		</encodingDesc>
		<profileDesc>
			<textClass>
				<keywords>
					<term>Curriculum Analytics</term>
					<term>Case Study</term>
					<term>Competency-based Curriculum</term>
					<term>Continuous Improvement</term>
					<term>Higher Education</term>
				</keywords>
			</textClass>
			<abstract>
<div xmlns="http://www.tei-c.org/ns/1.0"><p>Curriculum analytics (CA) consists of using analytical tools to collect and analyse educational data, such as program structure and course grading, to improve curriculum development and program quality. This paper presents an instrumental case study to illustrate the usage of a CA tool to help teaching staff collect evidence of competency attainment in an engineering school in Latin America. The CA tool was implemented during a 3-year continuous improvement process, in the context of an international accreditation. We collected and analysed data before and after tool implementation to evaluate its use by 124 teaching staff from 96 course sections. Data collection techniques included: analysis of documentary evidence collected for the continuous improvement process and teaching staff questionnaires. Findings show that the tool supported staff tasks related to the assessment of competency attainment at a program level. However, usability and functionality issues would have to be addressed to also support course redesign, providing actionable information about students' performance at an individual level. Lessons learned imply that institutions could adapt and adopt existing CA tools to support curriculum analysis by not only investing in tool development, but also in capacity building for its use for continuous improvement processes.</p></div>
			</abstract>
		</profileDesc>
	</teiHeader>
	<text xml:lang="en">
		<body>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="1">Introduction</head><p>Within higher education, different types of analytics have been implemented to tackle the painstaking work required to improve learning results at different levels <ref type="bibr" target="#b0">[1]</ref>, <ref type="bibr" target="#b1">[2]</ref>. One of these analytical approaches is Curriculum Analytics (CA), defined as the collection, analysis and visualization of administrative curricular data, such as course enrolment and student grades, to inform and support decision making at a program level <ref type="bibr" target="#b2">[3]</ref>. In specific educational contexts, CA has been proposed as a good solution for lightening the workload required to identify courses where the improvement of learning results is crucial to program success <ref type="bibr" target="#b3">[4]</ref>. This workload is particularly high in competency-based curriculums because each course emphasizes specific core competencies within an academic plan <ref type="bibr" target="#b4">[5]</ref>. Traditionally, analysing this type of curriculum requires assessing the alignment between program-level and course-level competencies, determining whether both types of competencies are attained, and formulating action plans to improve teaching and learning <ref type="bibr" target="#b3">[4]</ref>. The emergence of CA techniques opens new possibilities to perform these tasks in a timely and cost-effective manner <ref type="bibr" target="#b3">[4]</ref>. However, the CA promise of supporting curriculum analysis is far from fulfilled. Several tools have been proposed to identify gateway courses in a curriculum and improve its outcomes <ref type="bibr" target="#b3">[4]</ref>, but managers and teaching staff still perceive that they lack systematic information for course improvement, such as students' academic results in the courses they have taken and the attainment of core competencies <ref type="bibr" target="#b4">[5]</ref>. 
Research about CA adoption is still at an early stage, so most of the tools developed propose solutions to institutional needs that are not necessarily related to any existing improvement process <ref type="bibr" target="#b5">[6]</ref>. In some cases, using technology to help stakeholders visualize available institutional data is a good means of supporting data-driven decision-making and promoting transparency in certain educational processes. However, prior work suggests that, in order to leverage the potential of CA tools, an institution needs to be ready to address data-driven changes <ref type="bibr" target="#b5">[6]</ref>, having already implemented processes to examine what is and is not working at different levels <ref type="bibr" target="#b0">[1]</ref>.</p><p>To illustrate how the use of CA tools could systematically support continuous curriculum improvement over an extended period of time, we present an instrumental case study about a continuous improvement process implemented in an engineering school in Latin America (UC-Engineering). This process was implemented between 2015 and 2017 to comply with the North American Accreditation Board for Engineering and Technology (ABET), and in mid-2016 a CA tool was introduced to support teaching staff. To evaluate the teaching staff's usage of this tool in five competency-based programs, we collected and analysed data before and after tool implementation. The following data collection techniques were used: 1) analysis of documentary evidence reported for the continuous improvement process by 124 teaching staff in 96 course sections, and 2) teaching staff questionnaires applied to 25 out of 63 teaching assistants who interacted with the CA tool. These two types of data sources were triangulated to evaluate tool usage from the teaching staff's perspective, exploring its implications in terms of tool usefulness and ease-of-use.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="2">Curriculum Analytics Tools</head><p>Over the past decade, researchers have developed CA tools to address multiple educational challenges from the perspective of different stakeholders. Table <ref type="table" target="#tab_0">1</ref> summarizes some of the tools documented in recent conference proceedings and journal articles, describing the institution in which each was developed and its distinctive features for different users. Concerning students, tools 1 and 2 were developed with the objective of changing their approach to study, using visualizations of their performance to motivate them to adopt help-seeking behaviours. Regarding teaching staff and managers, tools 3-10 were developed with the objective of providing information to identify students who are facing difficulties in their studies, expecting staff to reach out to their students and provide guidance. Finally, tools 11-14 were developed for students or staff, aiming to help them identify crucial courses in a curriculum and anticipate the impact of course-level improvements on competency attainment at a program level.</p><p>Most of the tools presented in Table <ref type="table" target="#tab_0">1</ref> do not provide information about the methodology used to evaluate their impact on any existing institutional processes. By methodology, we mean experimental approaches to determine whether the tool is effectively fulfilling its objective <ref type="bibr" target="#b6">[7]</ref>. For instance, little is known about how students perceived the tools developed to promote help-seeking behaviours (such as tool 1) <ref type="bibr" target="#b7">[8]</ref>. Regarding tools that aim to identify students at risk (such as tool 9), there is no clear relationship between the use of the tool and student outcomes yet <ref type="bibr" target="#b7">[8]</ref>. 
And, concerning the tools for monitoring course-level improvement and competency attainment, researchers have just started to evaluate their adoption by collecting information about user perceptions <ref type="bibr" target="#b4">[5]</ref>, without necessarily reporting data about their actual use to drive improvements in program design or academic program delivery <ref type="bibr" target="#b8">[9]</ref>.</p><p>Building upon this latter idea, we present a case study that evaluates teaching staff usage of a CA tool in the context of a continuous improvement process that had already been installed at an institutional level. Our aim is not only to present the results of evaluating the tool from the users' perspective, but also to analyse how it was used to support an existing process for curriculum improvement.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="3">Methods</head></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="3.1">Study Design and Objectives</head><p>According to Zelkowitz <ref type="bibr" target="#b6">[7]</ref>, instrumental case studies are useful to determine whether a technological tool makes it easier to produce something compared to prior scenarios. Since the objective of this study was to evaluate usage of a CA tool throughout a continuous improvement process, we selected the instrumental case study as the most appropriate methodology. The case study took place at UC-Engineering between 2015 and 2017. In this case, we specifically evaluated whether the CA tool supported teaching staff efforts to collect documentary evidence to account for competency attainment.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="3.2">Case Study Context and Proposition</head><p>Copyright © 2019 for this paper by its authors. Use permitted under Creative Commons License Attribution 4.0 International (CC BY 4.0).</p><p>In the early 2000s, UC-Engineering decided to accredit five academic programs by ABET, which concentrated 35% of its undergraduate enrolment (1,500 students out of a total of 4,000): 1) Civil Engineering, 2) Electrical Engineering, 3) Computer Engineering, 4) Mechanical Engineering, and 5) Chemical Engineering. In 2015, the managers at the Office for Undergraduate Studies and the Engineering Education Unit designed a continuous improvement process to be implemented for the renewal of the ABET accreditation of these five programs. The continuous improvement process was organized into six semesters, between the first semester of 2015 and the second semester of 2017. Every semester, teaching staff were required to undertake the tasks described in Fig. <ref type="figure" target="#fig_0">1</ref>. Before the semester started, they had to participate in a workshop to revise their course syllabuses, making sure the competencies declared at a course level were aligned with those assigned at a program level. At the beginning of the semester, they had to report an assessment plan to account for competency attainment in their courses (two competencies per course in most cases). Once the semester finished, they had to report competency assessment results, which were transformed into percentages of competency attainment to be revised in an end-of-semester curriculum meeting (curriculum discussions). 
Additionally, if needed, they might be required to submit a sample of the assessment methods used to assess the assigned competencies, to inform curriculum decision-making.</p><p>At the end of the continuous improvement process, 104 assessment plans (http://bit.ly/2SYxWxc) and spreadsheets of competency assessment results (http://bit.ly/2VOdUKx) were expected to be reported by 124 teaching staff from 96 course sections, analysing the attainment of the 11 competencies proposed by ABET Criterion 3 (http://bit.ly/2SeVzRj). By the end of 2015, 38 assessment plans and competency assessment results had already been collected from 29 course sections for an interim report to be sent to ABET in June 2016. These documents were e-mailed as attachments by teaching staff and then uploaded to Dropbox folders by UC-Engineering managers. For curriculum discussions, analysts of the Engineering Education Unit had to transform graded assessment results into percentages of competency attainment, so this information was only available at the end of the semester. By automating this analysis, program chairs and teaching staff would have access to competency attainment results whenever they needed them for their decision-making. This motivated UC-Engineering managers to implement a CA tool developed jointly by the University of Sydney and U-Planner (https://www.u-planner.com/en-us/home), a Latin American company that provides technological solutions and research services to higher education institutions. This tool was originally developed to facilitate the alignment between the competencies in a graduate profile and the teaching and assessment methods of the different courses within a program (see Fig. <ref type="figure">2</ref>), and to collect documentary evidence of competency assessment at a course level as attachments. 
In order to include an automated visualization of competency attainment in the CA tool, U-Planner had to implement an ETL process to integrate course grading results from the institutional LMS, and then create a report that transformed these results into percentages of competency attainment (see Fig. <ref type="figure">3</ref>). After validating the report for 12 courses during the first semester of 2016, the CA tool was deployed to support the continuous improvement process from the second semester of 2016 onwards. By the end of 2017, the CA tool was used not only to collect all the documentary evidence required for accreditation purposes (syllabuses, course descriptions, and assessment plans), but also to visualize competency attainment results in each course across different academic periods. The percentages of competency attainment were calculated by dividing each student's grade by the maximum score of an assessment method, and then multiplying this result by 100. Each percentage of competency attainment was subsequently classified into one of four performance levels according to thresholds established by teaching staff. These performance levels were: 1) unsatisfactory, 2) developing, 3) satisfactory, and 4) exemplary (see Fig. <ref type="figure">3</ref>).</p><p>Fig. <ref type="figure">2</ref>. Screenshot of the UC-Engineering CA tool for supporting continuous improvement. Through this tool, teaching staff performed the tasks required for the continuous improvement process since the first semester of 2016.</p><p>Fig. <ref type="figure">4</ref> depicts the period considered for the case study, indicating when the tool was implemented. The tool aimed to support the continuous improvement process by facilitating the following activities (see Fig. 
<ref type="figure" target="#fig_0">1</ref> with the semester tasks):</p><p>(1) Filling in a course description form to broadly describe the teaching and assessment methods.</p><p>(2) Indicating the relationship between competencies at the program and course levels (Fig. <ref type="figure">2</ref>).</p><p>(3) Listing performance indicators for program-level competencies that could be assessed at a course level.</p><p>(4) Aligning performance indicators with graded assessment methods at a course level.</p><p>(5) Generating automated reports on percentages of competency attainment (Fig. <ref type="figure">3</ref>). This functionality is integrated with the institutional LMS to automatically capture the students' grades for the aligned graded assessment methods.</p><p>(6) Uploading the following documentary evidence as attachments: course syllabus, assessment plans, competency assessment results, and a sample of assessment methods.</p></div>
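The attainment computation described in this section (each student's grade divided by the assessment method's maximum score, multiplied by 100, then bucketed into one of four performance levels) can be sketched in Python as follows. This is a minimal sketch, not the tool's implementation; the cut-off values 40/60/80 are hypothetical placeholders, since teaching staff established their own thresholds.

```python
def attainment_percentage(grade, max_score):
    """Transform a graded assessment result into a percentage of competency attainment."""
    return grade / max_score * 100

def performance_level(percentage, thresholds=(40, 60, 80)):
    """Classify an attainment percentage into the four performance levels used by
    the CA tool. The cut-offs (40/60/80) are illustrative assumptions; in practice
    teaching staff defined their own thresholds per course."""
    levels = ["unsatisfactory", "developing", "satisfactory", "exemplary"]
    for cutoff, level in zip(thresholds, levels):
        if percentage < cutoff:
            return level
    return levels[-1]  # at or above the highest threshold

# e.g. a grade of 4.5 out of a maximum score of 6 yields 75.0,
# which falls in the 'satisfactory' band under these illustrative cut-offs
```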
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="3.3">Case Study Participants, Data Gathering Techniques and Analysis</head><p>Between 2015 and 2017, 124 members of the teaching staff participated in the continuous improvement process implemented at UC-Engineering. These 124 teaching staff included 61 teachers (44 faculty members and 17 part-time instructors) who reported documentary evidence of competency attainment in 96 course sections, and 63 teaching assistants who supported teachers in outcome assessment tasks (see Table <ref type="table" target="#tab_2">2</ref>). The evaluation process was organized into two phases. The first phase consisted of evaluating how the CA tool was used to facilitate the collection of documentary evidence for analysing competency attainment in curriculum discussions. The documentary evidence consisted of any documentation reported to account for competency attainment at a course level, such as: course syllabus, course description, competency assessment results, and samples of assessment methods (see Fig. <ref type="figure" target="#fig_0">1</ref>). In order to compare the number and type of documentary evidence generated before and after the CA tool was implemented, three researchers used a coding scheme to classify the evidence reported by each teaching staff member in each course section. This scheme was developed using a bottom-up coding approach, and each category was defined by examining the files uploaded to both Dropbox and the CA tool. From this bottom-up approach, six categories emerged, so the three researchers used these categories to assign a score of 0 or 1 per category to account for the type of documentary evidence reported every semester (see Table <ref type="table">3</ref>): 1) reported assessment plans, 2) reported a sample of assessment methods, 3) reported competency attainment results, 4) reported course syllabus, 5) included a course description, and 6) used the automated report of the CA tool. 
Then, the total score assigned to each course section ranged from 0 to 6, in which a score of 0 indicates a minimum amount and variety of evidence reported, and a score of 6 indicates a maximum amount and variety of evidence.</p><p>Table <ref type="table">3</ref>. Coding scheme to analyse documentary evidence reported throughout the continuous improvement process implemented at UC-Engineering (before and after tool implementation)</p></div>
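The per-section scoring described above (six binary categories summed into a 0-6 evidence score) amounts to a simple indicator sum. A minimal sketch follows; the category identifiers are hypothetical labels we introduce for the categories of the coding scheme in Table 3.

```python
# Hypothetical identifiers for the six binary categories of the coding scheme
# (Table 3); each category contributes 0 or 1 to a course section's score.
CATEGORIES = [
    "assessment_plan",
    "sample_of_assessment_methods",
    "competency_attainment_results",
    "course_syllabus",
    "course_description",
    "automated_report",
]

def evidence_score(reported):
    """Score a course section from 0 (minimum amount and variety of documentary
    evidence) to 6 (maximum), given the set of evidence categories it reported."""
    return sum(1 for category in CATEGORIES if category in reported)

# e.g. a section that reported only an assessment plan and a course syllabus scores 2
```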
<div xmlns="http://www.tei-c.org/ns/1.0"><head>Categories</head><p>Category description Reported assessment plans (0)</p><p>The teaching staff did not report an assessment plan informing how course assessment methods were used to measure competency attainment. Reported assessment plans (1) The teaching staff reported an assessment plan to inform how course assessment methods were used to measure competency attainment at a course section level. Reported a sample of assessment methods (0)</p><p>The teaching staff did not report a sample of assessment methods to account for different levels of competency attainment among different students. Reported a sample of assessment methods (1) The teaching staff reported a sample of assessment methods to account for different levels of competency attainment, such as developing or satisfactory.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head>Reported competency attainment results (0)</head><p>The teaching staff did not report competency attainment results at a course section level.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head>Reported competency attainment results (1)</head><p>The teaching staff reported competency attainment results based on graded assessments at a course level. Reported course syllabus (0)</p><p>The teaching staff did not report the course syllabus to complement the evidence items for the accreditation process. Reported course syllabus (1) The teaching staff reported the course syllabus as a complement to other evidence items reported for the accreditation process. Included a course description (0) The teaching staff did not include a course description among the evidence items uploaded in the CA tool. Included a course description (1) The teaching staff included a course description among the evidence items uploaded in the CA tool. Reported percentages of competency attainment (0)</p><p>The teaching staff did not report percentages of competency attainment.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head>Reported percentages of competency attainment (1)</head><p>The teaching staff reported percentages of competency attainment at a program level.</p><p>The second phase consisted of exploring the perceived usefulness and ease-of-use of the CA tool from the viewpoint of its users. Researchers have recently proposed instruments to evaluate the impact of analytical tools on educational practices <ref type="bibr" target="#b1">[2]</ref>, but these instruments have so far only been implemented at a course level in controlled environments. Therefore, we decided to develop a paper-based questionnaire based on the prior work of Ali et al. <ref type="bibr" target="#b11">[12]</ref>, whose objective was also to explore teaching staff's perspectives in a real-life context. Our questionnaire consisted of a closed-ended and an open-ended question section (http://bit.ly/2Jh3VVG). The closed-ended section consisted of a 5-point Likert scale to determine the level of staff agreement with different items related to perceived usefulness and perceived ease-of-use, while the open-ended section included the following questions to understand usability and ease-of-use implications from an exploratory approach:</p><p>• What use would you give to the CA tool after interacting with it?</p><p>• What kind of information would you expect from this tool?</p><p>• What do you think the CA tool lacks in terms of information and functionality?</p><p>In order to gather information on the user experience once the CA tool implementation process was well advanced, we applied the questionnaire during a workshop for teaching assistants held at the beginning of the second semester of 2017 (see semester tasks in Fig. <ref type="figure" target="#fig_0">1</ref>). A total of 25 teaching assistants who attended this workshop responded to the questionnaire voluntarily (representing 25 out of the 63 teaching assistants who had supported outcome assessment tasks since the second semester of 2016). After transcribing their responses, we estimated the percentage of respondents who agreed with each item by counting the number of respondents whose scores were equal to or higher than 4, and then dividing that count by the total number of respondents (see Fig. <ref type="figure">5</ref>).</p><p>Fig. <ref type="figure">5</ref>. Percentage of teaching assistants who agreed with the questionnaire items related to the CA tool usefulness and ease of use (N=25, see questionnaire in: http://bit.ly/2Jh3VVG).</p><p>Once all the evidence items and questionnaires were analysed, we triangulated the questionnaire results with the document analysis results from phase 1 (see Fig. <ref type="figure">5 and 6</ref>). This process consisted of contrasting the amount and variety of documentary evidence reported by teaching staff before and after tool implementation, in addition to analysing the perspectives of questionnaire respondents on tool usage for curriculum analysis based on evidence of competency attainment.</p></div>
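The agreement estimate described above (the share of respondents scoring an item at 4 or higher on the 5-point scale) reduces to a one-line computation. The sketch below is ours; the function and parameter names are not from the study.

```python
def agreement_percentage(likert_scores, threshold=4):
    """Percentage of respondents whose 5-point Likert score for an item is
    greater than or equal to the agreement threshold (4 by default)."""
    agreed = sum(1 for score in likert_scores if score >= threshold)
    return 100 * agreed / len(likert_scores)

# e.g. 23 agreeing respondents out of 25 gives 92.0, the figure reported
# for the item on usefulness for institutional management
```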
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="4">Case Study Findings</head><p>The main findings of the case study are summarized in Table <ref type="table">4</ref>. Firstly, the CA tool helped teaching staff collect a greater number and variety of evidence, providing visualizations of competency attainment at a program level (Finding 1 in Table <ref type="table">4</ref>). Document analysis results show that the number and variety of evidence items reported per course section increased from two to five after the CA tool was implemented (see Fig. <ref type="figure">6</ref>). In most cases, the three additional items were course syllabuses, course descriptions, and percentages of competency attainment. Before the CA tool was implemented, teaching staff delegated the transformation of graded assessment results into percentages of competency attainment to professionals in the Engineering Education Unit. However, once the CA tool was implemented, 89% of the course sections relied on the automated report provided by the tool to account for competency attainment at a course level. [Fig. 5 bar chart data: agreement per questionnaire item: 'In general, the CA tool seems useful for institutional management' (92%); 'It is easy to learn how to use the CA tool' (84%); 'The purpose of using CA tool is clear and understandable' (80%); 'The CA tool allows me to obtain information easily' (60%); 'The CA tool allows me to obtain more information about courses than other tools' (56%).] This effort to collect a greater number and variety of evidence cannot be attributed to greater administrative pressure: UC-Engineering managers presented a report to ABET in the first semester of 2016, and all subsequent work was done solely with the motivation to continuously improve the curriculum. 
Additionally, the results of the questionnaire show that 92% of respondents agreed with the item 'In general, the CA tool seems useful for institutional management' (see Fig. <ref type="figure">5</ref>). In the open-ended questions, teaching assistants mentioned that the CA tool facilitated the use of evidence to account for the implementation of a competency-based curriculum throughout the accreditation process. They also mentioned its potential use to provide students with information about course methods and their alignment with competencies from the graduate profile.</p><p>Secondly, teaching staff identified usability and functionality issues that affect the use of evidence to inform course redesign and students' understanding of performance (Finding 2 in Table <ref type="table">4</ref>). The results of the teaching staff questionnaires show that 56% agreed with the item 'The CA tool allows me to obtain more information about courses than other tools (such as the institutional LMS and a web application to search for course information).' In the open-ended sections, respondents claimed that the CA tool had usability issues. For example, respondents indicated that the tool views had too many tabs and fields, which hindered the loading of information. Respondents also mentioned functionality issues, such as the lack of feedback about the quality of the course information uploaded and the documentary evidence reported. The tool also lacked actionable information on student performance for providing timely and specific feedback throughout the semester.</p><p>Furthermore, questionnaire respondents indicated that, although they were able to upload the documentary evidence to account for competency attainment, they perceived that the CA tool site was not intuitive enough. The fields to be completed, as well as the files to be uploaded, were not clearly explained. This may partly explain why only 16% of course sections reported samples of assessment methods. 
These results suggest that the CA tool could be improved by including 'help messages' or indications related to the process within the platform.</p><p>Table <ref type="table">4</ref>. Main findings about the usefulness and the ease-of-use of the CA tool.</p><p>Finding 1: The CA tool helped teaching staff collect a greater number and variety of evidence, providing visualizations of competency attainment at a program level. Supporting results: course sections reported 3 additional evidence items for the accreditation process after the CA tool was implemented (document analysis results, Fig. <ref type="figure">6</ref>); 89% of course sections used the automated reporting feature of the CA tool to account for competency attainment at a course level (40 out of 45; screenshot of the automated report in Fig. <ref type="figure">3</ref>); 92% of teaching staff agreed with the questionnaire item 'In general, the CA tool seems useful for institutional management' (23 out of 25; teaching staff questionnaire, Fig. <ref type="figure">5</ref>).</p><p>Finding 2: Teaching staff identified usability and functionality issues in the CA tool that affect the use of evidence to inform course redesign and students' understanding of performance. Supporting results: 56% of teaching staff agreed with the questionnaire item 'The CA tool allows me to obtain more information about courses than other tools' (14 out of 25; teaching staff questionnaire, Fig. <ref type="figure">5</ref>); 16% of course sections reported samples of assessment methods after the CA tool was implemented (7 out of 45; document analysis results, Fig. <ref type="figure">6</ref>).</p><p>Fig. <ref type="figure">6</ref>. 
Average number of evidence items per course submitted each semester.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="5">Discussion and Limitations</head><p>This case study showed the results of using a CA tool in a long-term continuous improvement process. The main finding is that the CA tool helped teaching staff collect a greater number and variety of documentary evidence of competency attainment at a program level, allowing staff to upload documents that were previously shared or stored elsewhere, such as course syllabi. Furthermore, the CA tool provided staff with a visualization of competency attainment throughout the whole continuous improvement process, information that was previously shown and discussed only in curriculum meetings at the end of the semester. This finding reflects not only that CA tools can be a good solution for reducing the task workload of curriculum analysis <ref type="bibr" target="#b1">[2]</ref>, but also that this type of solution provides visualizations of student performance that are typically hidden in institutional processes. This is particularly valuable for competency-based curricula because it provides an overview of the competencies attained by students at a program level. Teaching staff usually face difficulties in competency assessment because competencies tend to be abstract and complicated to assess at a course level, while managers deal with the complexity of understanding learning results in a hierarchical structure of courses <ref type="bibr" target="#b2">[3]</ref>. With this new tool, managers and teaching staff could have more actionable information for making better decisions to reinforce the required competencies at a program level.</p><p>Although this work has illustrated that a CA tool can support curriculum analysis in the context of an accreditation process, there are still teaching staff needs that have not been met by the tool under study. 
Finding 2 in Table <ref type="table">5</ref> indicates that there are usability and functionality issues that prevent teaching staff from using the tool to obtain information to redesign courses and to understand students' performance at an individual level. These staff needs have already motivated the development of other tools <ref type="bibr" target="#b7">[8]</ref>. To solve these issues, we could specify redesign requirements for the CA tool using the information we have already collected to evaluate its usage. Thus, the lessons learned from this case study imply that institutions could adopt and adapt existing CA tools to support curriculum analysis, investing not only in tool development and redesign, but also in capacity building for evidence-informed continuous improvement <ref type="bibr" target="#b0">[1]</ref>.</p><p>Along these lines, one of the main contributions of this paper is that it evaluates the use of a CA tool in an existing institutional process over an extended period of time, going beyond current evaluation strategies that mostly employ self-reported data without relying on a technology validation methodology <ref type="bibr" target="#b4">[5]</ref>. The document analysis considered evidence items from the whole 3-year period in which the continuous improvement process was implemented. This long-term period ensured that enough information was collected to determine whether the CA tool facilitated teaching staff efforts to collect evidence of competency attainment before and after its implementation <ref type="bibr" target="#b6">[7]</ref>.</p><p>Yet, this study has its limitations. Questionnaire results represented only a small sample of the teaching staff members who interacted with the CA tool. To understand the contribution of the CA tool for a larger group of teachers, future work would have to explore the type of information most staff need to inform course or curriculum redesign. 
We anticipate that this work might require monitoring learning results over an extended period of time, as we did in this case study <ref type="bibr" target="#b4">[5]</ref>, so that we can further expand current knowledge on the impact of CA tools on curriculum improvement.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="6">Conclusions</head><p>This case study illustrates the usage of a CA tool to support a continuous improvement process for a competency-based curriculum in a university setting. Findings show that CA tools support curriculum analysis when they are implemented to help teaching staff cope with tasks they already perform for an existing institutional process, such as course planning and competency assessment. In this study, the CA tool not only facilitated the collection of documentary evidence to account for competency attainment at a program level, but also provided visualizations of competency attainment results that are usually not available to staff. Although staff felt comfortable using the tool to upload evidence of competency attainment, usability and functionality issues should still be addressed in future versions of this tool. Future work on CA adoption should focus not only on tool development, but also on evaluating its impact on the formulation of curriculum improvement actions.</p></div><figure xmlns="http://www.tei-c.org/ns/1.0" xml:id="fig_0"><head>Fig. 1.</head><label>1</label><figDesc>Fig. 1. Semester tasks that teaching staff had to undertake for the continuous improvement process implemented at UC-Engineering between 2015 and 2017.</figDesc><graphic coords="5,124.70,273.40,345.90,194.55" type="bitmap" /></figure>
<figure xmlns="http://www.tei-c.org/ns/1.0" xml:id="fig_1"><head>Fig. 3. Fig. 4.</head><label>3, 4</label><figDesc>Fig. 3. Screenshot of automated report of percentages of competency attainment at a course level generated by the CA tool regarding the performance indicator 'Communicates constructively with other classmates' for the effective communication competency.</figDesc><graphic coords="7,198.65,309.40,197.70,131.80" type="bitmap" /></figure>
<figure xmlns="http://www.tei-c.org/ns/1.0"><head></head><label></label><figDesc></figDesc><graphic coords="6,124.70,339.40,345.90,237.55" type="bitmap" /></figure>
<figure xmlns="http://www.tei-c.org/ns/1.0" type="table" xml:id="tab_0"><head>Table 1.</head><label>1</label><figDesc>Curriculum analytics tools documented in the recent literature</figDesc><table /></figure>
<figure xmlns="http://www.tei-c.org/ns/1.0" type="table" xml:id="tab_1"><head>Tool name and user Developer Distinctive features [References]</head><label></label><figDesc></figDesc><table><row><cell>Tool name and user</cell><cell>Developer</cell><cell>Distinctive features [References]</cell></row><row><cell>1. Check My Activity (focused on students)</cell><cell>University of Maryland, Baltimore, USA</cell><cell>Student visualizations of LMS logs compared to their peers and their grades [1], [8]</cell></row><row><cell>2. E2Coach (focused on students)</cell><cell>University of Michigan, USA</cell><cell>Student visualizations of feedback based on peer performance [1]</cell></row><row><cell>3. Student Relationship Engagement System (focused on teaching staff)</cell><cell>University of Sydney, Australia (used in 58 courses)</cell><cell>Interface for customizable analysis of students' academic performance datasets and visualizations [9]</cell></row><row><cell>4. Risk management model (focused on managers and teaching staff)</cell><cell>The University of Queensland, Australia (piloting phase)</cell><cell>A set of risk indicators at a program and course level [9]</cell></row><row><cell>5. The Ribbon Tool (focused on managers and teaching staff)</cell><cell>UC Davis, USA (disseminated to be used by other higher education institutions)</cell><cell>Sankey diagram to understand students' academic trajectories [9]</cell></row><row><cell>6. Know your students (focused on teaching staff)</cell><cell>UC Davis, USA</cell><cell>Dashboard interface with students' demographic data at a course level [9]</cell></row><row><cell>7. Departmental Diagnostic Dashboard (focused on managers)</cell><cell>UC Davis, USA</cell><cell>Dashboard interface with students' demographic data at a course level [9]</cell></row><row><cell>8. Learning dashboard for Insights and Support during Study Advice-LISSA (focused on student advisers)</cell><cell>KU Leuven, Belgium</cell><cell>Dashboard with information about student course enrolment, course credits earned, and grades of one or more students [10]</cell></row><row><cell>9. Course Signals (focused on teaching staff)</cell><cell>Purdue University, USA (licensed to Ellucian)</cell><cell>A set of risk indicators to classify students in subgroups [1], [8]</cell></row><row><cell>10. Student Flow Diagrams (focused on managers and teaching staff)</cell><cell>University of New Mexico, USA</cell><cell>Sankey diagram to understand students' academic trajectories [1]</cell></row><row><cell>11. Curricular Analytics (focused on managers, teaching staff)</cell><cell>University of New Mexico, USA</cell><cell>Interactive graphical representation of the curriculum [1]</cell></row><row><cell>12. Visualized Analytics of Core Competencies-VACCs (focused on students)</cell><cell>Yuan Ze University, Taiwan (currently used in Yuan Ze University)</cell><cell>Visualization of competency attainment in radar charts regarding grades, credit hours and peer performance [5]</cell></row><row><cell>13. Competency Analytics Tool-CAT (focused on managers and staff)</cell><cell>Singapore Management University, Singapore</cell><cell>Curriculum progression statistics based on competency map and course information [4]</cell></row><row><cell>14. Course University Study Portal-CUSP (focused on managers and teaching staff)</cell><cell>University of Sydney, Australia</cell><cell>Web-based application to model competency development in 5 maturity levels [11]</cell></row></table></figure>
<figure xmlns="http://www.tei-c.org/ns/1.0" type="table" xml:id="tab_2"><head>Table 2.</head><label>2</label><figDesc>Teaching staff involved throughout the continuous improvement process at UC-Engineering (before and after tool implementation)</figDesc><table><row><cell>Faculty members and part-time instructors</cell><cell>Teaching Assistants</cell></row></table><note>(*) Total number of staff members who reported documentary evidence between 2015 and 2017. (**) Number of staff members who were involved as CA tool users after its implementation (42 faculty members and part-time instructors, and 63 teaching assistants)</note></figure>
		</body>
		<back>

			<div type="acknowledgement">
<div xmlns="http://www.tei-c.org/ns/1.0"><head>Acknowledgements</head><p>This work was funded by the EU LALA project (grant no. 586120-EPP-1-2017-1-ES-EPPKA2-CBHE-JP). The authors would like to thank Camila Aguirre from U-Planner for collaborating with this study, and the reviewers for their constructive suggestions.</p></div>
			</div>

			<div type="references">

				<listBibl>

<biblStruct xml:id="b0">
	<analytic>
		<title level="a" type="main">Guiding Early and Often: Using Curricular and Learning Analytics to Shape Teaching, Learning and Student Success in Gateway Courses</title>
		<author>
			<persName><forename type="first">M</forename><forename type="middle">D</forename><surname>Pistilli</surname></persName>
		</author>
		<author>
			<persName><forename type="first">G</forename><forename type="middle">L</forename><surname>Heileman</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="j">New Dir. High. Educ</title>
		<imprint>
			<biblScope unit="volume">180</biblScope>
			<biblScope unit="page" from="21" to="30" />
			<date type="published" when="2017">2017</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b1">
	<analytic>
		<title level="a" type="main">The Evaluation Framework for Learning Analytics</title>
		<author>
			<persName><forename type="first">M</forename><surname>Scheffel</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="m">Doctoral dissertation, Open Universiteit in the Netherlands, Welten Institute -Research Centre for Learning, Teaching and Technology</title>
				<imprint>
			<date type="published" when="2017">2017</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b2">
	<analytic>
		<title level="a" type="main">Simple metrics for curricular analytics</title>
		<author>
			<persName><forename type="first">X</forename><surname>Ochoa</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="m">CEUR Proceedings</title>
				<imprint>
			<date type="published" when="2016">2016</date>
			<biblScope unit="volume">1590</biblScope>
			<biblScope unit="page" from="20" to="26" />
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b3">
	<analytic>
		<title level="a" type="main">Competency analytics tool: Analyzing curriculum using course competencies</title>
		<author>
			<persName><forename type="first">S</forename><surname>Gottipati</surname></persName>
		</author>
		<author>
			<persName><forename type="first">V</forename><surname>Shankararaman</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="j">Educ. Inf. Technol</title>
		<imprint>
			<biblScope unit="page" from="1" to="20" />
			<date type="published" when="2017">2017</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b4">
	<analytic>
		<title level="a" type="main">Open Student Models of Core Competencies at the Curriculum Level: Using Learning Analytics for Student Reflection</title>
		<author>
			<persName><forename type="first">C</forename><forename type="middle">Y</forename><surname>Chou</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="j">IEEE Trans. Emerg. Top. Comput</title>
		<imprint>
			<biblScope unit="volume">5</biblScope>
			<biblScope unit="issue">1</biblScope>
			<biblScope unit="page" from="32" to="44" />
			<date type="published" when="2015">2015</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b5">
	<analytic>
		<title level="a" type="main">Learning Analytics Tools in Higher Education: Adoption at the Intersection of Institutional Commitment and Individual Action</title>
		<author>
			<persName><forename type="first">C</forename><surname>Klein</surname></persName>
		</author>
		<author>
			<persName><forename type="first">J</forename><surname>Lester</surname></persName>
		</author>
		<author>
			<persName><forename type="first">H</forename><surname>Rangwala</surname></persName>
		</author>
		<author>
			<persName><forename type="first">A</forename><surname>Johri</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="j">Rev. High. Educ</title>
		<imprint>
			<biblScope unit="volume">42</biblScope>
			<biblScope unit="issue">2</biblScope>
			<biblScope unit="page" from="565" to="593" />
			<date type="published" when="2018">2018</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b6">
	<analytic>
		<title level="a" type="main">An update to experimental models for validating computer technology</title>
		<author>
			<persName><forename type="first">M</forename><forename type="middle">V</forename><surname>Zelkowitz</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="j">J. Syst. Softw</title>
		<imprint>
			<biblScope unit="volume">82</biblScope>
			<biblScope unit="issue">3</biblScope>
			<biblScope unit="page" from="373" to="376" />
			<date type="published" when="2009">2009</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b7">
	<monogr>
		<author>
			<persName><forename type="first">N</forename><surname>Sclater</surname></persName>
		</author>
		<author>
			<persName><forename type="first">A</forename><surname>Peasgood</surname></persName>
		</author>
		<author>
			<persName><forename type="first">J</forename><surname>Mullan</surname></persName>
		</author>
		<title level="m">Learning Analytics in Higher Education: A review of UK and international practice</title>
				<imprint>
			<date type="published" when="2016">2016</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b8">
	<monogr>
		<author>
			<persName><forename type="first">J</forename><surname>Greer</surname></persName>
		</author>
		<author>
			<persName><forename type="first">M</forename><surname>Molinaro</surname></persName>
		</author>
		<author>
			<persName><forename type="first">X</forename><surname>Ochoa</surname></persName>
		</author>
		<author>
			<persName><forename type="first">T</forename><surname>Mckay</surname></persName>
		</author>
		<title level="m">Proceedings of 1st Learning Analytics for Curriculum and Program Quality Improvement Workshop</title>
				<meeting>1st Learning Analytics for Curriculum and Program Quality Improvement Workshop</meeting>
		<imprint>
			<date type="published" when="2016">2016</date>
			<biblScope unit="page" from="1" to="24" />
		</imprint>
	</monogr>
	<note>6th International Learning Analytics &amp; Knowledge Conference</note>
</biblStruct>

<biblStruct xml:id="b9">
	<analytic>
		<title level="a" type="main">Learning Analytics Dashboards to Support Adviser-Student Dialogue</title>
		<author>
			<persName><forename type="first">S</forename><surname>Charleer</surname></persName>
		</author>
		<author>
			<persName><forename type="first">A</forename><surname>Vande Moere</surname></persName>
		</author>
		<author>
			<persName><forename type="first">J</forename><surname>Klerkx</surname></persName>
		</author>
		<author>
			<persName><forename type="first">K</forename><surname>Verbert</surname></persName>
		</author>
		<author>
			<persName><forename type="first">T</forename><forename type="middle">De</forename><surname>Laet</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="j">IEEE Trans. Learn. Technol</title>
		<imprint>
			<biblScope unit="volume">11</biblScope>
			<biblScope unit="issue">3</biblScope>
			<biblScope unit="page" from="389" to="399" />
			<date type="published" when="2018">2018</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b10">
	<analytic>
		<title level="a" type="main">Is Five Enough? Modeling Learning Progression in Ill-Defined Domains at Tertiary Level</title>
		<author>
			<persName><forename type="first">R</forename><surname>Gluga</surname></persName>
		</author>
		<author>
			<persName><forename type="first">J</forename><surname>Kay</surname></persName>
		</author>
		<author>
			<persName><forename type="first">T</forename><surname>Lever</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="m">IllDef2010: ITS2010 Workshop on Intelligent Tutoring Technologies for Ill-Defined Problems and Ill-Defined Domains</title>
				<imprint>
			<date type="published" when="2010">2010</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b11">
	<analytic>
		<title level="a" type="main">Factors influencing beliefs for adoption of a learning analytics tool: An empirical study</title>
		<author>
			<persName><forename type="first">L</forename><surname>Ali</surname></persName>
		</author>
		<author>
			<persName><forename type="first">M</forename><surname>Asadi</surname></persName>
		</author>
		<author>
			<persName><forename type="first">D</forename><surname>Gašević</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="j">Comput. Educ</title>
		<imprint>
			<biblScope unit="volume">62</biblScope>
			<biblScope unit="page" from="130" to="148" />
			<date type="published" when="2013">2013</date>
		</imprint>
	</monogr>
</biblStruct>

				</listBibl>
			</div>
		</back>
	</text>
</TEI>
