 Learning Analytics at UC-Engineering: Lessons learned
   about Infrastructure and Organizational Structure

    Mar Pérez-Sanagustín1,2, Isabel Hilliger1, Jorge Maldonado-Mahauad1,3, Ronald Pérez-Álvarez1, Josefina Hernández-Correa1

    1 School of Engineering, Pontificia Universidad Católica de Chile, Av. Vicuña Mackenna 4860, Macul, Santiago (RM), Chile
    2 Université de Toulouse III Paul Sabatier, France
    3 Universidad de Cuenca, Ecuador

    mar.perez-sanagustin@irit.fr, {ihillige, jjmaldonado, raperez13, josefina.hernandez}@uc.cl

Abstract: The development of Learning Analytics (LA) capabilities in a Higher Education institution is challenging. On the one hand, the institution requires a technological infrastructure for adapting and/or developing LA services. On the other hand, it also needs an organizational structure for designing and implementing new processes that ensure the adoption of these services. There are two different approaches to developing the necessary infrastructure and organizational structure. One consists of following a top-down process, in which the leadership of the LA initiative is mainly driven by institutional managers, who provide the necessary means. The other is bottom-up, where initiatives are led by ground-level teaching staff without involving institutional managers. This article presents both approaches through two LA initiatives of the Engineering School at the Pontificia Universidad Católica de Chile (UC-Engineering). We show how these two initiatives emerged and were integrated into existing academic processes to improve teaching and learning at an institutional level. The infrastructure and organizational structure resulting from each initiative are presented, as well as the lessons learned. This paper aims to serve as an example for other universities in Latin America interested in developing and incorporating LA capabilities.

Keywords: Learning Analytics, Higher Education, Latin America


1      Introduction

   Incorporating Learning Analytics (LA) at an institutional level is challenging, especially in Higher Education institutions in Latin America (Latam). Although some efforts have been made in this region to incorporate LA services [1], these initiatives are still too immature to move from experimentation to full institutional integration [2]. In Latam, institutions face two main challenges. On the one hand, they often lack the
technological infrastructure for developing and/or adapting LA services [1]. On the other hand, they also lack the organizational structures to integrate the use of educational data into existing institutional processes to inform strategic planning or guide teaching and learning practices [2].
   Regarding the technological infrastructure required for building new LA tools, a report by JISC (2013) reviews the technologies needed for developing LA capacity at an institutional level [3]. This analysis is based on the logical analytical workflow identified by Elias (2011) [4], who describes the technologies that have traditionally been used for decision making, such as business analytics. This workflow consists of seven phases: (1) Collecting and acquiring data, i.e., extracting data from its source; (2) Storing data; (3) Cleaning data to rectify anomalies; (4) Integrating data, i.e., aligning data with other existing databases or with a common vocabulary used in the institution; (5) Analyzing data to build descriptive or predictive models; (6) Representing and visualizing data for the appropriate audience; and (7) Proposing alert systems for relevant stakeholders. For each of these phases, the report reviews the different technologies that could serve as a reference for installing LA capacities in Higher Education institutions. Although the report does not cover the latest advances in the area, it is possible to infer the key elements required for building LA capacity. First, institutions require an appropriate infrastructure to store educational data. This storage should allow the integration of data collected from the different services within the university, such as academic registration or virtual learning environments. However, data integration requires a common vocabulary, as well as knowledge of how to link data and how to use it in different contexts [1]. Second, tools and services are needed for analyzing the data according to the stakeholders' needs. And third, a technical team able to maintain, organize, and prepare data for analysis is required.
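   To make the workflow concrete, the following minimal Python sketch chains the seven phases into a single pipeline. All function, table, and column names are our own illustrative assumptions; neither the JISC report nor Elias prescribes a concrete implementation.

```python
# Minimal sketch of the seven-phase analytics workflow (Elias, 2011).
# All names are illustrative; no specific institutional system is assumed.

import sqlite3
import pandas as pd

def collect(source_csv: str) -> pd.DataFrame:
    """(1) Collect: extract raw event data from its source."""
    return pd.read_csv(source_csv)

def store(df: pd.DataFrame, conn: sqlite3.Connection) -> None:
    """(2) Store: persist raw data in an institutional warehouse."""
    df.to_sql("raw_events", conn, if_exists="append", index=False)

def clean(df: pd.DataFrame) -> pd.DataFrame:
    """(3) Clean: rectify anomalies such as duplicates and missing ids."""
    return df.drop_duplicates().dropna(subset=["student_id"])

def integrate(events: pd.DataFrame, registry: pd.DataFrame) -> pd.DataFrame:
    """(4) Integrate: align events with other institutional databases."""
    return events.merge(registry, on="student_id", how="inner")

def analyze(df: pd.DataFrame) -> pd.DataFrame:
    """(5) Analyze: build a simple descriptive model (activity per student)."""
    return df.groupby("student_id").size().rename("n_events").reset_index()

def visualize(model: pd.DataFrame) -> None:
    """(6) Represent: summarize results for the appropriate audience."""
    print(model.describe())

def alert(model: pd.DataFrame, threshold: int = 5) -> pd.DataFrame:
    """(7) Alert: flag students whose activity falls below a threshold."""
    return model[model["n_events"] < threshold]
```

   In practice, each function would wrap a far more elaborate service, but the chaining order reflects the dependency between phases: data cannot be integrated before it is cleaned, nor analyzed before it is integrated.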
   Regarding the organizational structure, some researchers point out that enabling leadership is key to overcoming the challenges inherent to the institutional adoption of Learning Analytics [5]. However, finding the appropriate balance between different types of leadership for fostering LA adoption is not easy. Dawson et al. (2018) [5], who analyzed leadership for LA adoption based on Complexity Leadership Theory (CLT), define a spectrum between top-down and bottom-up leadership approaches. Top-down approaches correspond to LA initiatives led by senior managers and/or institutional leaders such as vice provosts, who do not necessarily involve stakeholders at a ground level, such as teaching staff or students. Bottom-up approaches, in contrast, are those led by ground-level staff, without involving managers from higher institutional hierarchies. Thus, organizational structures should be defined so as to flexibly coordinate the needs and efforts of stakeholders at different levels of the institution, and to articulate the exploratory LA initiatives emerging from the bottom with more institutionally driven activities coming from the top.
   Since experiences of LA capacity building in Latam are scarce, institutions in this region are forced to take as references experiences and cases conducted worldwide, which usually do not face problems and challenges of a similar nature. With this paper, we aim to present how LA capabilities have been developed at a Latam institution, specifically at the Engineering School of the Pontificia Universidad Católica de Chile (UC-Engineering), one of the best-ranked institutions according to the 2019 QS ranking. Concretely, this paper presents two different LA initiatives: one that emerged as a bottom-up process and another that followed a top-down process. The bottom-up LA initiative emerged from the interest of a group of researchers who aimed to analyze the data collected through Coursera, the institutional platform for deploying Massive Open Online Courses (MOOCs). The top-down LA initiative followed a continuous improvement process installed in the institution to inform curricular changes at a program level, in the context of international accreditation. For both initiatives, this paper describes the context, the technological infrastructure developed and/or adapted, and the organizational structure that supported them. Then, we present the lessons learned from both initiatives so as to guide other institutions from Latam with similar needs in LA capacity building.


2         Bottom-up LA initiative: Supporting students’ self-
          regulation in MOOCs

This section describes an LA initiative that emerged as a bottom-up process led by a group of researchers at UC-Engineering. The initiative aimed at proposing an LA tool capable of supporting students' self-regulatory strategies in MOOCs, in order to improve their learning experience and help them better achieve their objectives. The resulting tool, called NoteMyProgress, has been used in all MOOC courses produced by UC-Engineering, besides supporting a blended learning course.

    2.1    Context
   In 2015, UC-Engineering launched the UC Online Engineering initiative. This initiative aims to create open online courses (MOOCs) for Coursera and digital content to transform traditional teaching-learning practices. Since the initiative began, UC-Engineering has produced 17 MOOCs with more than 400,000 registered students, and several projects have reused MOOCs as a complement to curricular courses [6], [7].
   As a result of the initiative, UC-Engineering began to collect a large amount of data on students from all over the world, ranging from demographics to their interactions with online resources. This large volume of data was seen at the institution as an opportunity to launch research initiatives around LA, aimed at improving the student experience in these new digital learning environments.
   In this context, a group of UC-Engineering researchers proposed a project to support students' study strategies in digital learning environments, in order to improve MOOC learners' engagement and performance. This project was funded by the National Commission of Science and Technology of Chile (CONICYT) between 2015 and 2018. One of the results of the project was NoteMyProgress (NMP) [8], an LA tool for supporting students' self-regulation strategies in online environments in an automatic and personalized way. Through interactive visualizations, it provides actionable aggregated information about students' activity in the online course and their interaction with its contents. The objective of this tool was to promote students' reflection on their learning strategies, motivating informed decision-making to improve their performance.
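   To give a concrete flavor of the kind of aggregated indicators such a tool visualizes, the sketch below derives weekly activity summaries from raw clickstream events. It is a minimal illustration under assumed column names (student_id, timestamp, session_id, resource_type), not NoteMyProgress's actual code.

```python
# Illustrative only: aggregating raw MOOC clickstream events into weekly
# activity indicators, the kind of aggregated information an SRL-support
# tool could visualize. The schema is a hypothetical assumption.

import pandas as pd

def weekly_activity(events: pd.DataFrame) -> pd.DataFrame:
    """Summarize sessions and interactions per student per course week.

    `events` is assumed to hold one row per logged interaction with
    columns: student_id, timestamp, session_id, resource_type.
    """
    events = events.copy()
    events["timestamp"] = pd.to_datetime(events["timestamp"])
    events["week"] = events["timestamp"].dt.isocalendar().week
    return (
        events.groupby(["student_id", "week"])
        .agg(
            sessions=("session_id", "nunique"),
            interactions=("resource_type", "count"),
        )
        .reset_index()
    )
```

   Indicators of this kind, shown back to each learner week by week, are what allow students to reflect on how their actual study behavior matches their intentions.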




 2.2     Technological infrastructure
   The development of NMP was conducted by a software company specialized in visualizations, and coordinated by one of the researchers of the team involved in the CONICYT project. For its development, the researchers followed a design-based research approach called the Integrative Learning Design Framework (ILD) by Bannan (2003) [9]. This framework organizes tool development into an iterative process in which the requirements of the tool are defined after an Informed Exploration phase. This phase consisted of a literature review of papers on tools for supporting self-regulated learning (SRL) in online environments [8]. According to the requirements identified in this phase, a first version of the tool was designed and implemented on a local server of the company involved in the development. This first version was evaluated locally in one of the UC-Engineering MOOCs to identify usability and functional problems. The conclusions of this evaluation were used to design a second version of the tool. This second version was installed on a web server at UC-Engineering and deployed in the three MOOCs that were part of the UC Online Engineering initiative at that time. Three researchers led a pilot study to analyze the data collection process and evaluate the broader impact of the tool. The whole development process took one year, and the data analysis for preparing a report summarizing the main results of the pilot took one more year.
   The infrastructure involved in this initiative was therefore: (1) an external local server to host the first version; (2) a web server at UC-Engineering for installing the second version; and (3) a Google Apps account for uploading the last version of the tool and making it available to end users.

 2.3     Organizational structure
   Different stakeholders were involved in the process:
   • Three researchers participated in the literature review process for defining the requirements of the tool and in the final analysis for reporting the impact of the tool;
   • Two developers from an external company participated in the development of the tool in collaboration with the research team;
   • Seven teachers and teaching assistants took part in the evaluation process;
   • The Director of Engineering Education coordinated the relationship between the research team and the teaching staff involved in the evaluation process;
   • The Ethics Committee of the University validated the consent forms provided to the users who downloaded and installed the tool, as well as the agreement for using the data in the analysis.
   Since this process emerged from a group of researchers, the first phase of the LA initiative involved mostly ground-level stakeholders, while the Director of Engineering Education, a higher-level stakeholder, was involved when scaling up the usage of the tool.




    2.4    Challenges encountered in the process of developing LA
   During the development and deployment of the NMP tool, the research team encountered two main challenges, especially in the deployment and piloting of the tool. Regarding the technological infrastructure, the implementation of the tool at a university level required coordination with the technical team of the university via the Engineering Education leader. This step required meetings to convince intermediate managers that piloting an innovative initiative could lead to potential benefits.
   Regarding the organizational structure, the researchers involved in the project were already familiar with the potential of LA services. However, making senior managers and teaching staff aware of how the NMP tool could impact students' performance required empirical evidence of the tool's potential. In fact, the research team continues to work on new versions of the tool, in order to facilitate its incorporation into institutional processes, such as undergraduate blended teaching.


3         Top-down LA initiative: Supporting continuous improvement
          processes at a program level

This section describes an LA initiative that emerged as a top-down process led by the Office for Undergraduate Studies and the Engineering Education Unit of the UC-Engineering School. The aim of this initiative was to propose an LA tool for supporting a continuous improvement process in five programs at UC-Engineering, in order to comply with one criterion of an international accreditation process [10]. This tool, called the Curriculum Analytics (CA) tool, has been used by 124 teaching staff in 96 course sections.

    3.1    Context
   In 2007 and 2011, after an institutional strategic decision, UC-Engineering decided to align its programs with the quality standards of the Accreditation Board for Engineering and Technology (ABET). Five out of the eleven engineering degrees were accredited by ABET during this period. In 2015, ABET mandated a continuous improvement process to renew the accreditation of these programs, requiring evidence of how competency attainment data had been used to improve curriculum and teaching practices at a program level.
   To facilitate the accreditation process, the Office for Undergraduate Studies and the Engineering Education Unit designed a process for collecting data about students' competency attainment. This process consisted of supporting teaching staff as they determined assessment plans to measure competency attainment at a course level, and of analyzing the results of course assessments so as to discuss students' competency attainment at program meetings [10]. As a starting point, the assessment plans and the competency attainment results were stored in Dropbox folders. However, the collection of assessment evidence became overwhelming for teaching staff, and the analysis of the collected data was not readily available for program meetings. To alleviate this process, the Director of the Office for Undergraduate Studies decided to invest in an LA tool. This tool was called the CA tool, and its design aimed to facilitate the storage of assessment evidence, such as assessment plans and competency attainment results, besides providing visualizations of competency attainment for program meetings.

 3.2     Technological infrastructure
   The CA tool was also developed by a Chilean company following the Integrative Learning Design Framework (ILD) by Bannan (2003) [9], adapting a tool previously developed by an Australian university. The Informed Exploration phase of this framework was led by a team member of the Engineering Education Unit, who collected data from 25 teachers and 51 students affiliated with UC-Engineering by means of an open-ended questionnaire about the information and functionalities they would expect from a Curriculum Analytics tool. The result of the analysis of the qualitative information collected was a list of features and interfaces to be included in the tool. With these requirements, the software development company designed a first version of the tool, including information about the students' competency attainment at a course level, that is, the percentage of students who achieved a competency at a satisfactory level according to their learning results in a course assessment method. Once the first version of the tool was ready, the member of the Engineering Education Unit conducted an instrumental case study to evaluate how the tool supported 124 teaching staff in 96 course sections. The results showed that the teachers valued the use of the tool for collecting information about their courses, besides having automated reports of the students' competency attainment [11].
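   As an illustration of the indicator just described, the following sketch computes the percentage of students attaining each competency from course assessment results. The threshold and column names are assumptions made for the example (a 4.0 pass mark on the Chilean 1.0-7.0 grading scale), not the CA tool's actual parameters.

```python
# Hedged sketch of the attainment indicator described above: the share of
# students whose score in an assessment linked to a competency reaches a
# satisfactory threshold. Schema and threshold are assumptions.

import pandas as pd

def attainment_rate(scores: pd.DataFrame, threshold: float = 4.0) -> pd.DataFrame:
    """Percentage of students at or above `threshold` per competency.

    `scores` is assumed to have columns: student_id, competency, score,
    where `score` is on a 1.0-7.0 grading scale with 4.0 as the pass mark.
    """
    scores = scores.copy()
    scores["attained"] = scores["score"] >= threshold
    # Mean of a boolean column is the attainment proportion per competency.
    rate = scores.groupby("competency")["attained"].mean().mul(100)
    return rate.round(1).rename("pct_attained").reset_index()
```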
   The infrastructure involved in this initiative was: (1) a web-based application from another university, which provided a preliminary idea of how a Curriculum Analytics tool could support a continuous improvement process; and (2) an internal university server for deploying the pilot tests and the final version of the tool.

 3.3     Organizational structure
   Different stakeholders from different teams of the UC-Engineering School were involved in the tool development and evaluation process:

   •     The Director of the Office for Undergraduate Studies, who decided to invest in the tool;
   •     One manager from the Engineering Education Unit, who collected and analyzed data to identify requirements and to evaluate how the tool was used by teaching staff during the continuous improvement process, and one academic who collaborated with this manager in defining the requirements needed to adapt the tool to the final stakeholders;
   •     Two project managers of an external company, who coordinated the development process of the Continuous Improvement Platform;
   •     124 teaching staff, who participated in the evaluation process.
   Since this process emerged from the Director of the Office for Undergraduate Studies and was led by a manager from the Engineering Education Unit, the stakeholders involved in the first phases of the LA initiative were high-level, while more ground-level stakeholders participated in the design and evaluation phases of the initiative.


    3.4    Challenges encountered in the process
   During the development and deployment of the Curriculum Analytics tool, two main challenges were encountered. Regarding the technological infrastructure, the most challenging aspect was to integrate data from different sources into the CA tool. In order to obtain automated reports about students' competency attainment, the CA tool had to integrate course partial grading with course enrolment, in addition to linking manually defined parameters that indicated which partial grades reflected learning results for a specific competency. During its implementation, managers had to conduct several validation exercises to check whether the data in the reports reflected the competency attainment results that were supposed to be visualized during program meetings.
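   A minimal sketch of this integration step, under an assumed schema, is shown below: partial grades are joined with enrolment records and with a manually maintained assessment-to-competency mapping, and a basic consistency check flags unmapped rows, the kind of mismatch the validation exercises were meant to catch. Table and column names are assumptions, not the CA tool's actual schema.

```python
# Illustrative integration of grading, enrolment, and competency-mapping
# data, with a consistency check. All names are hypothetical.

import pandas as pd

def build_attainment_report(
    grades: pd.DataFrame,     # student_id, course_id, assessment_id, score
    enrolment: pd.DataFrame,  # student_id, course_id, program
    mapping: pd.DataFrame,    # assessment_id, competency
) -> pd.DataFrame:
    merged = (
        grades.merge(enrolment, on=["student_id", "course_id"], how="inner")
        .merge(mapping, on="assessment_id", how="left")
    )
    # Unmapped assessments would silently drop out of the attainment
    # report, so fail loudly instead of producing a misleading result.
    unmapped = merged["competency"].isna().sum()
    if unmapped:
        raise ValueError(f"{unmapped} grade rows lack a competency mapping")
    return merged
```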
   Regarding the organizational structure, the Engineering Education Unit and the Office for Undergraduate Studies had to find mechanisms for integrating the use of the LA tool into already existing processes to avoid increasing the teaching workload. Since the implementation of the CA tool responded to a top-down initiative, teaching staff were reluctant to conduct tasks additional to the ones they already undertook to assess competency attainment and learning results in their courses. For this reason, managers of the Engineering Education Unit trained teaching assistants in how to use the CA tool, so they could help teaching staff upload the evidence required for the international accreditation process.


4         Conclusions and lessons learned

   This paper summarizes two of the LA initiatives that were developed at the UC-Engineering School: (1) NoteMyProgress, an initiative to support students' self-regulation in MOOCs; and (2) the Curriculum Analytics tool, an initiative to support continuous improvement processes at a program level based on competency attainment evidence. Both initiatives were successfully deployed, but some challenges were encountered along the way in each of the projects.
   The first initiative followed a bottom-up approach, where ground-level researchers were involved in the design and evaluation of the LA tool. Two main challenges were identified: (1) regarding technical aspects, the lack of intermediate managers to support the incorporation of the tool into an existing institutional process; and (2) regarding organizational aspects, the lack of awareness of the potential of this tool to support online learners, due to the lack of involvement of managers during the early phases. The second initiative followed a top-down approach that emerged from senior managers. Two main challenges were identified: (1) regarding technical aspects, the difficulty of integrating educational data from different sources into an analytical tool developed by an external company; and (2) regarding organizational aspects, the resistance of teaching staff to undertaking additional tasks, and the need to involve teaching assistants to support them as they collected evidence of competency attainment.
   From the two approaches, the following lessons were extracted for future initiatives. First, it is important to combine top-down and bottom-up approaches to facilitate the involvement of varied stakeholders during the design and implementation of LA tools. In the bottom-up LA initiative, ground-level staff played a key role during the design process, providing feedback to tool developers regarding their needs and preferences. In the top-down initiative, in contrast, senior managers played a key role during the incorporation of the tool into an existing academic process, managing resources and training to engage teaching staff. Second, it is important to anticipate the need for servers and data warehouses in order to integrate data from different sources. In the bottom-up LA initiative, UC-Engineering had to move the tool from a company server to a web server managed by the university. In the top-down LA initiative, data integration was crucial, as was the successive validation of automated reports. Third and finally, it is important to communicate the potential of LA tools for addressing institutional needs, besides building the expertise required to organize, clean, and manage educational data responsibly. We expect these and other initiatives to motivate other universities to take paths like the one taken by UC-Engineering.


5      Acknowledgements
Work funded by the LALA project (grant no. 586120-EPP-1-2017-1-ES-EPPKA2-
CBHE-JP). This project has been funded with support from the European Commission.
This publication reflects only the views of the authors, and the Commission cannot be
held responsible for any use which may be made of the information contained therein.


6      References

[1]     C. Cobo and C. Aguerrebere, “Building capacity for learning analytics in Latin
        America,” in Learning Analytics for the Global South, C. Ping Lim and V. L. Tinio,
        Eds. Quezon City, Philippines: Foundation for Information Technology Education
        and Development, Inc., 2018, pp. 63–67.
[2]     H. Lemos dos Santos, C. Cechinel, J. B. Carvalho Nunes, and X. Ochoa, “An Initial
        Review of Learning Analytics in Latin America,” in 12th Latin American Conference
        on Learning Technologies (LACLO), 2017.
[3]     W. Kraan and D. Sherlock, “Analytics tools and infrastructure,” JISC CETIS Anal. Ser.,
        vol. 1, no. 11, pp. 1–24, 2013.
[4]     T. Elias, “Learning Analytics: Definitions, Processes and Potential,” 2011.
[5]     S. Dawson, O. Poquet, C. Colvin, T. Rogers, A. Pardo, and D. Gasevic, “Rethinking
        learning analytics adoption through complexity leadership theory,” in LAK’18:
        International Conference on Learning Analytics and Knowledge, 2018.
[6]     M. Pérez-Sanagustín, I. Hilliger, C. Alario-Hoyos, C. Delgado Kloos, and S. Rayyan,
        “H-MOOC framework: reusing MOOCs for hybrid education,” J. Comput. High. Educ.,
        vol. 29, no. 1, pp. 47–64, 2017.
[7]     J. Hernández, F. Rodríguez, I. Hilliger, and M. Pérez-Sanagustín, “MOOCs as a
        Remedial Complement: Students’ Adoption and Learning Outcomes,” IEEE Trans.
        Learn. Technol., vol. 12, no. 1, pp. 133–141, 2019.
[8]     R. Pérez-Álvarez, J. Maldonado-Mahauad, and M. Pérez-Sanagustín, “Design of a tool
        to support self-regulated learning strategies in MOOCs,” J. Univers. Comput. Sci., vol.
        24, no. 8, pp. 1090–1109, 2018.
[9]     B. Bannan, “The Role of Design in Research: The Integrative Learning Design
        Framework,” Educ. Res., pp. 22–24, 2003.
[10]    I. Hilliger, S. Celis, and M. Pérez-Sanagustín, “Work in Progress: Engaging Engineering
        Teaching Staff in Continuous Improvement Processes,” in ASEE Annual Conference &
        Exposition, 2019.
[11]   I. Hilliger, C. Miranda, S. Celis, and M. Pérez-Sanagustín, “Evaluating usage of an
       analytics tool to support continuous curriculum improvement,” in European Conference
       on Technology Enhanced Learning, 2019, vol. 2437, pp. 1–14.



