Using a Risk Management Approach in Analytics for Curriculum and Program Quality Improvement

Wai Yee Wong
The University of Queensland
Institute for Teaching and Learning Innovation
St Lucia, QLD Australia
+617 3365 6731
amywong@uq.edu.au

Marcel Lavrencic
The University of Queensland
Institute for Teaching and Learning Innovation
St Lucia, QLD Australia
+617 3365 3169
m.lavrencic@uq.edu.au


ABSTRACT
Learning analytics, with a risk management approach, provides relevant and actionable information to teaching and administrative staff to make evidence-based decisions in curriculum and program quality improvement. This paper outlines the development and pilot implementation of a risk management model with an online feedback system in a research-intensive Australian university. Providing teachers and executives with the opportunity, facilitated by the essential IT infrastructure, to contextualise data and to document their response to identified risks is a proactive approach that empowers staff to enhance their teaching practices and to influence academic management. In addition, the opportunity for individual teaching staff to examine the progress of their own courses is a fundamental step in curriculum and program quality improvement. Positive feedback has been received on the ease of access and the opportunity provided to contextualise risk. Future development will incorporate dynamic data from different sources, such as student participation in the learning management system, to build a holistic risk management framework in teaching and learning.

CCS Concepts
• Social and professional topics → Professional topics → Management of computing and information systems → Project and people management → Systems analysis and design

Keywords
Risk management; analytics; teaching; curriculum; quality assurance.

1. INTRODUCTION
In the current highly competitive environment, new modes of governance that emphasise performance, quality and accountability of student learning and experience have become common practice in higher education institutions (HEIs) [1, 2, 3]. HEIs are under pressure to demonstrate their teaching quality with increasing degrees of accountability and quality assurance expectations [4]. In the Australian higher education system, the Australian Qualifications Framework (AQF) provides criteria for different types of qualifications, as well as the expected learning outcomes, skills and knowledge required for each qualification level [5]. Together with the Tertiary Education Quality and Standards Agency's (TEQSA) risk assessment framework [6], these national frameworks evaluate and monitor the teaching, learning and assessment quality of HEIs [7]. Linking these national requirements to the field of learning analytics, the emergent question is how to best use the "measurement, collection, analysis and reporting of data about learners and their contexts, for the purposes of understanding and optimizing learning and environments in which it occurs", a definition of learning analytics by the Society for Learning Analytics Research [8], in the context of curriculum and program quality enhancement. Curriculum-based analytics is defined as the actions of collecting, analysing and interpreting key stakeholder data, such as student admission, retention, satisfaction, course and program structure, and assessment, across multiple offerings to enhance the development, implementation and evaluation of curriculum and program quality [9]. Active engagement from university executives, academics and students in using evidence-based practices to evaluate curriculum design and make decisions about curriculum and program reforms is pivotal to the success and sustainability of efforts towards curriculum and program quality improvement [9].

This paper outlines the development of a risk management framework in the revised Curriculum and Teaching Quality Appraisal (CTQA) process at a research-intensive Australian university, which will be fully implemented for the academic year 2016. The pilot phase of implementation concluded in January 2016. The paper also discusses how a risk management model better facilitates data-driven decision making, and curriculum and program quality improvement, compared with the traditional performance management framework. Alongside the risk management framework, a series of interactive reports and dashboards for University Executives, Program Convenors, Course Coordinators and teaching staff has also been developed. This is an attempt to provide comprehensive, relevant and actionable information to key stakeholders to encourage the use of evidence-based practices, as well as to assist individual teaching staff to examine the success of a course, which is fundamental to curriculum and program quality improvement. Last, but not least, an online feedback system acts as an effective means to close the loop of the risk management process: staff are provided with the opportunity to document their response to the data provided. Risk management with active participation from staff empowers the University community to make data-driven decisions about student learning and experience.

2. BACKGROUND
The CTQA is a key component of this University's overall quality assurance process in teaching and learning. It is undertaken on an annual basis, and involves an evidence-based consideration of the overall quality of the University's teaching programs. The previous CTQA process was established in 2008 and was based on a performance management model, which identified programs that did not meet the specified performance indicators. Since 2008, there have been changes in both the external and internal higher education environment. In order to align the University's teaching and learning quality assurance process to the national agenda, and to maximise the internal benefits of this quality assurance process, a decision was made to revise the CTQA process.
3. THE REVISED CTQA PROCESS
The principle of the revised CTQA process is to collect relevant data, and to undertake critical and diagnostic data analyses which focus on trends, issues, actions taken and outcomes to support ongoing curriculum and program quality improvement. The rationale for selecting a risk management framework, instead of a performance management framework, is the concept that identifying and managing risk can, in turn, improve performance. A performance management framework focuses on the measurement of actual results and their deviation from targets [10]. Academic staff reactively respond to the identified areas for improvement and implement strategies in an attempt to reach the university's targets. A number of academic staff previously expressed resentment towards the performance management framework, as they felt that they should not be penalised for poor performance on indicators over which they have limited control, such as student load. In contrast, a risk management framework emphasises the importance of proactive actions for risk mitigation [10]. The premise of this framework is that when an indicator is identified as at risk, it may not necessarily signal poor performance of a specified course/program. Instead, the identification of risk provides an opportunity for staff to mitigate and contextualise the risk, and to conclude whether current actions are adequate to address the identified risk or further actions are required. Academic staff who participated in the pilot welcomed the change from a performance to a risk management framework, as it lessens the punitive perception of the process and encourages conversations between staff and senior executives to investigate the identified risks.

The first step in developing the revised process was key stakeholder consultation, to ensure that relevant and actionable information is provided to teaching staff and University executives. A broad consultation was conducted with the Associate Deans (Academic) of each Faculty, Chairs of Teaching and Learning Committees of each School, Heads of Schools, Program Convenors and Course Coordinators. Through committee meetings, presentations and individual discussions, a community of teaching and administrative staff was encouraged to engage in making evidence-based decisions to improve student learning. Based on the outcomes of the consultation, and in alignment with the TEQSA risk assessment framework [6] and the University's strategic plan and policies, separate sets of risk indicators were defined for courses and programs. The future plan is to include dynamic data from other sources, such as the student learning management system, as the model evolves over time.
3.1 Risk Indicators for Programs
The set of risk indicators for programs, and their rationale, based on the TEQSA risk assessment framework [6] and the University's strategic plan and policies, are outlined as follows:

1. Year 12 Student First Preferences to a Program with an Overall Position (OP) 1-5 (OP ranges from 1, the highest, to 25, the lowest): This indicator shows the ability of a program at this University to attract students with high academic achievements in comparison to its competitors. A significant decrease may signal a decline in the quality or value of the program offered. However, recruitment strategies and employment in a profession need to be considered when interpreting this indicator.

2. Student Load: An unplanned significant increase in student load could potentially impact on the quality of the student experience. Conversely, an unplanned significant and continuing decrease may signal a decline in the quality of the programs offered as perceived by prospective students.

3. Domestic Retention: A low retention rate may suggest that there are potential quality issues in the process of student admission, teaching and learning, and the overall student experience. Prompt actions to address early attrition are critical to minimise the compound effect on attrition in the later years of the program.

4. International Retention: Rationale same as Indicator 3.

5. Full-Time Employment after Graduation: A very low employment rate could indicate that students may not be well equipped with the necessary graduate attributes for a successful transition to the next stage of their chosen profession. However, volatility in the labour market needs to be factored in when interpreting this indicator.

6. Overall Satisfaction: A core quality indicator in higher education that provides an overall guide as to whether the program met student expectations. Poor satisfaction is a risk to the institution's future market demand.

7. Pass Rate: A core indicator of student success and the quality of the academic environment. When the pass rate is at very high or very low levels, it may suggest that there are potential quality issues in student teaching and learning, and/or the overall student experience.

8. Completion Times: This indicator represents one dimension of the effectiveness of the delivery of educational services. The number of students in different study modes (full-time or part-time) needs to be factored in when interpreting the results. Prompt actions to identify, at an early stage, students who are at risk of not completing a program, and to provide them with appropriate support, are essential to minimise the possibility of non-completion.

3.2 Risk Indicators for Courses
The set of risk indicators for courses, and their rationale, based on the TEQSA risk assessment framework [6] and the University's strategic plan and policies, are outlined as follows:

1. Enrolments: An unplanned significant increase in student enrolments could potentially impact on the quality of the student experience. Conversely, an unplanned significant and continuing decrease may signal a decline in the quality of the courses offered as perceived by prospective students.

2. Pass Rate: A core indicator of student success and the quality of the academic environment. When the pass rate is at very high or very low levels, it may suggest that there are potential quality issues in student teaching and learning, and/or the overall student experience.

3. Student Evaluation of Course and Teacher (SECaT) Response Rate: This indicator reflects student engagement with the course through the provision of feedback. However, the strategies implemented and the timing at which the SECaT was administered need to be considered when interpreting this indicator.

4. Average SECaT Score for Q1: I had a clear understanding of the aims and goals of the course.

5. Average SECaT Score for Q2: The course was intellectually stimulating.

6. Average SECaT Score for Q3: The course was well structured.

7. Average SECaT Score for Q4: The learning materials assisted me in this course.

8. Average SECaT Score for Q5: Assessment requirements were made clear to me.

9. Average SECaT Score for Q6: I received helpful feedback on how I was going in the course.

10. Average SECaT Score for Q7: I learned a lot in this course.

11. Average SECaT Score for Q8: Overall, how would you rate this course?

Indicators 4 to 11 are core quality indicators that provide a guide as to whether a course met student expectations. Prompt actions to address low student satisfaction scores in specific areas will assist in identifying issues and implementing appropriate strategies to minimise student attrition and increase overall student satisfaction over time.
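To make the banding concrete, the Python sketch below shows one way a course's indicators could be rolled up into the four bands used throughout this paper (minimal-, neutral-, increasing- and at-risk): an indicator is flagged when its latest value drifts beyond a tolerance from its historical baseline, and the flag count is mapped to a band. The tolerance, the drift rule and the simplification that only one direction of drift signals risk (the indicators above note that very high pass rates can also warrant attention) are all assumptions for illustration; the CTQA's actual thresholds and weightings are not published here.

# Illustrative sketch only: all cut-off values below are hypothetical,
# not the University's actual CTQA scoring rules.
from dataclasses import dataclass
from statistics import mean

@dataclass
class Indicator:
    name: str
    history: list[float]        # values for past offerings, oldest first
    latest: float               # value for the current offering
    higher_is_better: bool = True

def is_flagged(ind: Indicator, tolerance: float = 0.10) -> bool:
    """Flag an indicator when the latest value drifts more than `tolerance`
    (10%, hypothetical) from the historical baseline, in the direction
    that signals risk for that indicator."""
    baseline = mean(ind.history)
    if baseline == 0:
        return False
    drift = (ind.latest - baseline) / abs(baseline)
    return drift < -tolerance if ind.higher_is_better else drift > tolerance

def risk_band(indicators: list[Indicator]) -> str:
    """Map the count of flagged indicators onto the four CTQA bands."""
    flags = sum(is_flagged(i) for i in indicators)
    if flags == 0:
        return "minimal-risk"
    if flags == 1:
        return "neutral-risk"
    if flags == 2:
        return "increasing-risk"
    return "at-risk"

course = [
    Indicator("Pass Rate", [0.91, 0.90, 0.92], 0.78),
    Indicator("SECaT Q8 (overall)", [4.2, 4.1, 4.3], 4.2),
    Indicator("SECaT Response Rate", [0.55, 0.52, 0.50], 0.51),
]
print(risk_band(course))  # -> "neutral-risk": only the pass rate is flagged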
Using separate sets of risk indicators for courses and programs enables individual Course Coordinators and teaching staff to examine the success of the courses that they have taught in a semester. This is an obvious progression from the former CTQA, as previously only faculty- and school-level data were available, with limited individual course/program information. Moreover, individual courses are the building blocks of the curriculum and program; the provision of course-level data will further engage teaching staff in curriculum and program quality improvement. Most importantly, the key feature of this risk management model is the opportunity provided for teaching and administrative staff to contextualise and mitigate the identified risk, and to decide whether the identified risk should be closely managed, or whether the risk is expected and actions are already in place to minimise its impact. Staff can also document their feedback on the data provided via an online feedback system, which is further discussed in Section 5. This active engagement of teaching and administrative staff in the revised CTQA process encourages them to reflect on the relevant student learning data and adopt a continuous improvement approach to teaching and learning. Staff are able to review individual program data on an annual basis, and individual course data on a semester basis. By using trend data for each program and course, teaching and administrative staff proactively manage risks rather than reactively manage performance. The revised process identifies not only the at-risk courses and programs, but also the minimal-, neutral- and increasing-risk courses and programs. The opportunity to explore the risk indicators which contribute to a heightened risk for increasing-risk courses and programs, as well as those that result in a lesser risk for neutral- and minimal-risk courses and programs, allows staff to adopt a proactive approach to managing risks. For example, course staff are able to modify their teaching practices, such as using a flipped classroom model to allow more interactive sessions with students, in anticipation of an increasing trend in student enrolments. Under the reactive management approach, in contrast, staff would only formulate a solution after an increase in student enrolments is evident. The revised CTQA is an annual process that focuses on data-driven decision making through contextualising and mitigating risks, evidence-based action planning, and revisiting and evaluating proposed actions in subsequent annual reviews.

This section outlined the development of the revised CTQA process. The next section focuses on how to create visualisations that encourage a community of teaching and administrative staff to engage in making evidence-based decisions to improve student learning at both course and program levels.

4. DATA VISUALISATION
The ultimate goal of data visualisation is to provide clear and useful information to the targeted audience. However, finding the best way to visually present data to meet the needs of the stakeholders is an iterative process [11]. Being able to easily access the required data is the key starting point for making data-driven decisions in teaching practices, curriculum design and academic program delivery. Therefore, the aim of the first iteration of data visualisation for the revised CTQA process is to provide University executives, academic and administrative staff with quick and easy access to both high-level overview and detailed-level information about the courses and programs offered, with the incorporation of simple visual cues, such as differential colour coding, to make the interpretation of risks easier. Three levels of data visualisation are created. The first level is the new executive dashboards and reports (see Figure 1), which provide University executives with an overview of the minimal-, neutral-, increasing- and at-risk courses and programs.

Figure 1. A snapshot of a program executive dashboard.

The second level is the new Faculty and School dashboards and reports (see Figure 2), which provide the Associate Dean (Academic) of each Faculty, Heads of Schools, Chairs of Teaching and Learning Committees, Program Convenors and Course Coordinators with an overview of the minimal-, neutral-, increasing- and at-risk courses and programs offered within their Faculty and School.

Figure 2: A snapshot of a Faculty dashboard.
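As a small illustration of the "differential colour coding" mentioned above, the sketch below maps each risk band to a colour for a single dashboard cell, so a reader can scan a Faculty's courses at a glance. The palette, the course code and the HTML rendering are assumptions; the paper does not describe how the University's dashboards are implemented.

# Hypothetical colour scheme and rendering, for illustration only.
RISK_COLOURS = {
    "minimal-risk":    "#2e7d32",  # green
    "neutral-risk":    "#f9a825",  # amber
    "increasing-risk": "#ef6c00",  # orange
    "at-risk":         "#c62828",  # red
}

def dashboard_cell(course_code: str, band: str) -> str:
    """Render one dashboard cell as an HTML fragment coloured by risk band."""
    colour = RISK_COLOURS[band]
    return f'<td style="background:{colour};color:#fff">{course_code}: {band}</td>'

print(dashboard_cell("ABCD1234", "increasing-risk"))  # hypothetical course code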
The third level is the detailed course/program report for an individual course/program (see Figure 3). Previously, Course Coordinators or individual teaching staff were required to collate and compile their own reports from the available and relevant teaching and learning data about a course/program. The new reports consolidate all the required data and provide the stakeholders with an integrated report for each course/program.

Figure 3: A snapshot of a detailed program report.
Staff who have access to these modified detailed course/program-level reports are already actively using them to explore the strengths and limitations of their courses/programs. They have also provided positive feedback about the reports and process. This unified approach saves a considerable amount of administrative time in collating data. As a result, staff can use that time to engage in data-rich conversations focused on improving curriculum and pedagogical practices, and on reflection and decision-making about how to improve student learning in their course/program.

In addition, these three levels of reports and dashboards are interrelated, which provides the opportunity for key stakeholders either to drill down to the details of the strengths and limitations of a course/program, or to zoom out to look at the relationship of a particular course/program to the relevant group of courses/programs. These three levels of data visualisation aim to generate conversations, initially between individual teaching staff, then gradually expanding to the Course Coordinators and Program Convenors, who collaborate to make evidence-based decisions to improve teaching practices and curriculum and program quality.
Apart from the three levels of data visualisation, it is essential that reasonable requests for teaching and learning data from individual teaching staff are adequately addressed. As courses are the building blocks of a curriculum and program, providing individual teaching staff with customised reports could, in fact, extend their engagement in the curriculum and program quality improvement process. The additional data that an individual teacher requests may also be beneficial to other courses/programs; hence, consideration should be made to incorporate those data in the next iteration of the reports and dashboards. An example is a request to analyse the distribution of assessment types (such as examinations, presentations and essay writing) in the compulsory courses of a program. These relevant and actionable data about assessment allow teaching staff and Program Convenors to have a holistic view of the student learning and assessment experience in a program. When the data reveal that a large percentage of assessment consists of examinations, one would expect an investigation into the rationale of the existing assessment regime to be conducted, and changes to be made to provide students with the opportunity to demonstrate their knowledge and skills via different modes of assessment. This process is the start of a continuous improvement approach to teaching and learning, in which assessment is a core component, and should be encouraged in other Faculties/Schools.
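As an illustration of such a request, the sketch below tallies the share of each assessment type across the compulsory courses of a hypothetical program. The course codes and assessment items are invented; the real reports would draw these data from University systems.

# Hypothetical data: (course, assessment type) pairs for a program's
# compulsory courses.
from collections import Counter

assessments = [
    ("ABCD1000", "examination"), ("ABCD1000", "essay"),
    ("ABCD2000", "examination"), ("ABCD2000", "examination"),
    ("ABCD3000", "presentation"), ("ABCD3000", "examination"),
]

counts = Counter(kind for _, kind in assessments)
total = sum(counts.values())
for kind, n in counts.most_common():
    print(f"{kind:<13}{n / total:6.1%}")
# examination    66.7%  -> a share this high might prompt the review of the
# essay          16.7%     assessment regime discussed above
# presentation   16.7%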
The first iteration of data visualisation for the revised CTQA process only includes static and historical data about student learning. The aim of the second iteration is to create interactive reports and dashboards with automatic drill-down functions that reveal dynamic data, such as student access patterns to online resources and assessment, and student and teacher engagement patterns within the Learning Management System (LMS). As part of the curriculum and program quality improvement, these additional data about student interactions with online resources and technologies would provide insight into the optimal structure of a course/program that will engage and motivate students to learn.

5. ONLINE FEEDBACK SYSTEM
The continuous process of reviewing, reflecting and proposing new solutions is a core part of the quality improvement process. One of the strategies to engage a community of teaching staff in curriculum and program quality improvement is to empower them to close the revised CTQA process loop via an online feedback system (see Figure 4). The purpose of this online feedback system is to provide staff with the opportunity, firstly, to provide contextualised information around selected courses/programs, such as those identified as increasing- or at-risk; secondly, to confirm or disconfirm the identified risk and determine the residual risk for relevant courses/programs as minimal-, neutral-, increasing- or at-risk; and finally, to document the proposed actions that will be undertaken to address the confirmed risks.

Figure 4: A snapshot of the online feedback system.

The documentation of feedback is pivotal in the continual cycle of curriculum and program quality improvement, as the feedback collected from academic staff, Course Coordinators/Program Convenors, and Faculty Executives establishes the basis for the actions required to address the risks. All key stakeholders can review their feedback and document progress in comparison to the previous release of data. The program reports and dashboards are updated on an annual basis, whereas the course reports and dashboards are released after the conclusion of each semester. Once these reports are available, each Faculty and School has the autonomy to decide which group/s of courses or programs to focus on in order to enhance their delivery, and the approach they use in response to the data provided. This autonomy provides opportunities to generate conversations among staff to develop a Faculty/School-wide response to the issues identified and raised during the review process, and to apply the learnings of best practice to other courses or programs requiring intervention and/or reward. In summary, this online feedback system was developed to enable the collection and consolidation of feedback and proposed actions to address risk.
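A minimal sketch of the record such an online feedback system might store is shown below, covering the three purposes just described: contextualisation, residual-risk determination and proposed actions. The field names and types are assumptions, not the University's actual schema.

# Hypothetical feedback record; field names are assumptions.
from dataclasses import dataclass, field
from datetime import date

@dataclass
class RiskFeedback:
    course_or_program: str
    identified_band: str           # band assigned from the CTQA data
    context: str                   # staff contextualisation of the risk
    residual_band: str             # band confirmed or revised by staff
    proposed_actions: list[str] = field(default_factory=list)
    submitted: date = field(default_factory=date.today)

entry = RiskFeedback(
    course_or_program="ABCD1234",  # hypothetical course code
    identified_band="increasing-risk",
    context="Enrolment growth was planned: the course became compulsory "
            "for a new major this year.",
    residual_band="neutral-risk",
    proposed_actions=["Add two tutorial streams", "Review in Semester 2"],
)
print(entry.residual_band)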
6. FEEDBACK FROM PILOT PROCESS
The purpose of the pilot was to ascertain the effectiveness of the new process and the associated communication strategy. The information gathered provided an opportunity for the Learning Analytics and Evaluations teams to mitigate risks associated with a University-wide implementation, and to facilitate resolutions to any identified issues prior to the formal rollout of the new process across the University.

Feedback from the participants was positive. They appreciated the integrated course/program reports, which provide all the relevant data for a particular course/program. This unified approach saves a considerable amount of administrative time in collating data from different sources. In addition, the Faculty/School reports provided an overview of the minimal-, neutral-, increasing- and at-risk courses/programs in a Faculty/School, which assists in directing attention, resources or recognition to particular groups of courses/programs. The courses/programs identified in the risk dashboards appeared to have face validity based on the participants' knowledge and experience. Participants also acknowledged that the revised process provides them with the opportunity to contextualise and mitigate the identified risk, and to decide, via the online feedback system, whether the risk should be closely managed, or whether the risk was expected and actions were already in place to minimise its impact.
7. CHALLENGES
This paper has presented how learning analytics methodologies play a pivotal role in understanding, optimising and transforming courses/programs, using a risk management framework with an online feedback system. The two major challenges encountered in the development of the revised CTQA process were the institutional culture change from a performance management to a risk management framework, and collaboration with the business intelligence and IT departments. The lessons learnt in developing and implementing the pilot revised CTQA process revealed that effective communication, with the support of the University's senior executives, is the best strategy for dealing with these challenges. Although an institution-wide cultural shift can take up to a few years, consistent communication and clear expectations from all key stakeholders involved are important incremental steps in shifting the culture from a performance to a risk management model. In terms of collaboration with the business intelligence and IT departments, the message needs to focus on the value-adding role of learning analytics to current business intelligence and IT functions, so that learning analytics is not perceived as a threat to their operation.

The development of the risk management framework, with its associated reports, dashboards and online feedback system, is still evolving. Continual support for teaching and administrative staff, in terms of understanding the data as well as possible pedagogical enhancements that they could implement in their courses/programs, is required to sustain their engagement with the data to make evidence-based decisions in the curriculum and program improvement process. Future development will incorporate dynamic data from additional sources, such as student participation in the LMS, to build a holistic risk management framework in teaching and learning in higher education.

8. REFERENCES
[1] Cohen, P. 2004. The crisis of the university. Campus Review. 14, 15 (Apr. 2004), 9-12.
[2] Knight, P. T. and Trowler, P. R. 2000. Departmental leadership in higher education. Society for Research into Higher Education, Philadelphia, Pa.
[3] Tremblay, K., Lalancette, D. and Roseveare, D. 2012. Assessment of Higher Education Learning Outcomes: Feasibility Study Report. OECD. Retrieved February 10, 2015 from http://www.oecd.org/edu/skills-beyond-school/AHELOFSReportVolume1.pdf
[4] Lockyer, L. and Dawson, S. 2011. Learning designs and learning analytics. In Proceedings of the 1st International Conference on Learning Analytics and Knowledge (Banff, AB, Canada, February 27 - March 01, 2011). ACM, 153-156. DOI= http://dx.doi.org/10.1145/2090116.2090140
[5] Australian Qualifications Framework Council. 2013. Australian Qualifications Framework (January 2013). Retrieved February 10, 2015 from http://www.aqf.edu.au/wp-content/uploads/2013/05/AQF-2nd-Edition-January-2013.pdf
[6] Tertiary Education Quality and Standards Agency. 2016. TEQSA's Risk Assessment Framework Version 2.1 (February 2016). Retrieved February 20, 2016 from http://www.teqsa.gov.au/sites/default/files/publication-documents/TEQSARiskAssessFramework_v2.1_0.pdf
[7] Marshall, S. J., Orrell, J., Cameron, A., Bosanquet, A. and Thomas, S. 2011. Leading and managing learning and teaching in higher education. Higher Education Research & Development. 30, 2 (Apr. 2011), 87-103. DOI= 10.1080/07294360.2010.512631
[8] Siemens, G. and Baker, R. 2012. Learning analytics and educational data mining: Towards communication and collaboration. In Proceedings of the 2nd International Conference on Learning Analytics and Knowledge (Vancouver, British Columbia, Canada, April 29 - May 02, 2012). DOI= 10.1145/2330601.2330661
[9] Dawson, S. and Hubball, H. 2014. Curriculum Analytics: Application of Social Network Analysis for Improving Strategic Curriculum Decision-Making in a Research-Intensive University. Teaching and Learning Inquiry: The ISSOTL Journal. 2, 2 (2014), 59-74.
[10] Arena, M. and Arnaboldi, M. 2014. Risk and performance management: Are they easy partners? Management Research Review. 72, 2 (2014), 152-166. DOI= 10.1108/MRR-08-2012-0180
[11] Olmos, M. M. and Corrin, L. 2012. Learning analytics: A case study of the process of design of visualizations. Journal of Asynchronous Learning Networks. 16, 3 (2012), 39-49.