Data-Driven Programmatic Change at Universities: What Works and How

Jim Greer, University of Saskatchewan, Jim.greer@usask.ca
Craig Thompson, University of Saskatchewan, Craig.thompson@usask.ca
Ryan Banow, University of Saskatchewan, Ryan.banow@usask.ca
Stephanie Frost, University of Saskatchewan, Stephanie.frost@usask.ca

ABSTRACT
In this paper, we present some of our recent experiences with a data visualization tool and offer some use cases where the visualization tool can potentially drive programmatic change in universities. The Ribbon Tool provides an interactive visualization of student flows through academic programs, progressing over time to either successful completion (graduation) or attrition. Effective use of the Ribbon Tool by those who can effect curriculum change enhances their ability to generate persuasive arguments for change. This paper presents some use cases and commentary on actual usage of the Ribbon Tool to call for programmatic change across a university.

Keywords
Learning Analytics, Visualization, Programmatic Change, Ribbon Tool

1. INTRODUCTION
Academics pride themselves on evidence-informed decision making, but when it comes to making changes in their teaching practices, curricula, or academic programs, data and evidence seem to hold little sway. Perhaps this stems from the belief that an expert in a subject area is automatically an expert in how, where, and what of that subject area should be taught. Perhaps it stems from the outdated "mini-me" assumption that students are either faculty in training or destined for attrition. Or perhaps it stems from the discipline-based belief that teaching practices, curricula and academic programs were carved in stone tablets by the ancestors and never meant to change.

Instigating change in university programs is difficult, in part because it is easy to throw sand in the wheels of change, but also in part because the agents of change and the influencers are rarely the same people. Sadly, evidence-informed arguments to justify changes in teaching or curriculum often have no more persuasive effect, or perhaps even less effect, than anecdotal stories about "in my day" or what "my son or daughter experienced". While skepticism can be healthy when evaluating evidence gathered from others' observations and statistical analysis, it can also be used to stonewall or stymie change.

Confronting academics and administrators with cold facts, such as "One third of your students from certain diversity groups are leaving your program within the first two years" or "One quarter of your students are failing their required first mathematics course", is met with retorts like "Tell me something I haven't heard before!" or "Bring me some evidence that is actionable!". Studies of decision-making indicate that decisions are sometimes made very quickly based on instinct, ignoring the actual deeper problem underneath [1].

With learning analytics and data visualization tools it is now easier to put into the hands of academics powerful interactive tools to dig into data, to discover for themselves the facts and relationships that matter to them, to experiment with models that can answer some of their questions, and to develop persuasive arguments that can support the case for change. We have found that interactive data visualizations can support academic leaders in initiating data-driven and evidence-informed change.

2. DATA-DRIVEN VISUALIZATIONS WITH THE RIBBON TOOL
A data visualization tool called the "Ribbon Tool" has been developed at UC Davis (http://t4eba.com/ribbon/), building upon the Sankey diagram functionality of the Data-Driven Documents (D3) data visualization library [2]. This tool has been utilized for visualizing student flows through academic programs in universities, with groups of students represented as coloured ribbons as they move from admission to graduation or attrition. An example Ribbon Tool screenshot is shown in Figure 1.

Vertical bars within the tool indicate the status of students in a particular year and term of an academic program. The ribbons that flow from bar to bar correspond to the numbers of students moving from state to state. For example, in Figure 2, the three bars indicate September snapshots in 2011, 2014 and 2015. The red ribbons show the students who began in Engineering in September 2011, tracking them as they move forward in time.

In the Ribbon Tool, a mouse-over in the diagram reveals a text box showing the number of students in a particular ribbon. The text boxes in Figure 2 show that of the 351 students who began in Engineering in the fall of 2011, some 240 were still enrolled in the fall of 2014. Another 33 students had transferred in from Arts and Science. Some students had transferred out of Engineering, to Arts and Science or another faculty; some had dropped out of the University, and a few were on a "stop-out". By fall 2015 (after 4 years), one can see that 88 students had graduated with an Engineering degree. A few others had degrees in other faculties, and 177 were still enrolled for their 5th year. Note that a substantial number of Engineering students complete a one-year paid internship, which naturally extends the degree to a minimum 5-year duration.

The vertical bars represent a hierarchy of information at each point in time. In the above example, the top level of the hierarchy represents whether students were enrolled, had been granted a degree, or had left the institution. At the next level, we show the college or school in which they had been enrolled (or which had granted them a degree, or from which they left or stopped out). Drilling down to a third level, the data show the department (Electrical, Mechanical, Civil, etc.) in which the student is enrolled or was awarded a degree. Expanding or collapsing the hierarchy gives a more or less refined view. The interactive visualization allows the user to isolate a particular group in the hierarchy (for example, students who were enrolled in Mechanical Engineering in 2014) and project backward to see where they came from and forward to see where they went next. Moving through the hierarchy and isolating views allows the user to focus in on areas of potential interest.

Along with the visualization, the user is provided a set of filters. For example, if one were interested in examining gender differences in student flows through Engineering, one could filter to obtain separate diagrams for female and male students. These can be quickly compared visually to see whether proportions of degrees granted, attrition, time to graduation, or departmental breakdowns are impacted by gender. Other filters based on any set of categorical demographic or academic characteristics can be added. For example, the program flow-through for female students entering Engineering directly from high school with SAT scores in the top decile can be examined with a few mouse clicks.

This flexible and powerful visualization tool has been used extensively at UC Davis and is now being disseminated to other universities through the "Tools for Evidence-Based Action (TEA) Community" [3], funded in part by the Helmsley Foundation. The Ribbon Tool has been greeted with great enthusiasm by deans and other administrators at our University as a tool to augment their other data analysis efforts and as a means to explore elements of their academic programs.

Figure 1: Screenshot of the UC Davis Ribbon Tool

Figure 2: Screenshot of the Ribbon Tool for Engineering students, 2011-2015
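The ribbons described above are, in essence, counts of students transitioning between states at consecutive term snapshots. As a rough illustration of that idea only (this is not the Ribbon Tool's actual implementation, and the student records and state labels here are hypothetical), such transition counts can be tabulated directly from per-term status data:

```python
from collections import Counter

# Hypothetical per-student status snapshots: student id -> {term: state}.
# State strings mirror the hierarchy described above: top-level status,
# then college, joined for brevity.
snapshots = {
    "S0001": {"2014-09": "Enrolled/Engineering", "2015-09": "Degree/Engineering"},
    "S0002": {"2014-09": "Enrolled/Arts&Science", "2015-09": "Left/Drop-out"},
    "S0003": {"2014-09": "Enrolled/Engineering", "2015-09": "Enrolled/Engineering"},
}

def ribbon_counts(snapshots, term_a, term_b):
    """Count students flowing from each state in term_a to each state in
    term_b. Each (state_a, state_b) count is the width of one ribbon
    drawn between the two vertical bars."""
    counts = Counter()
    for states in snapshots.values():
        if term_a in states and term_b in states:
            counts[(states[term_a], states[term_b])] += 1
    return counts

flows = ribbon_counts(snapshots, "2014-09", "2015-09")
# e.g. flows[("Enrolled/Engineering", "Degree/Engineering")] -> 1
```

A Sankey layout such as D3's then only needs these (source state, target state, count) triples to draw the bars and ribbons.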
3. POPULATING THE RIBBON TOOL WITH DATA
The Ribbon Tool requires two chunks of data: a set of filter values and a data hierarchy. There can be an arbitrary number of filter variables, each with a label and a set of nominal values. The data hierarchy can have arbitrary depth, and at each level of the hierarchy a value must exist for each student. The branching factor at each level of the hierarchy must be fixed in terms of its subcategory options. Each student represented in the visualization must have a full set of values corresponding to the filter variables; further, each student must have a value for each level in the hierarchy. Data can be imported into the Ribbon Tool from either a pair of CSV files or from a JSON file.

In the datasets we have prepared for our institution, students are not individually identified, other than by a sequential index. As a result, the data held in the Ribbon Tool, although hosted in the Amazon cloud, carries low risk of abuse. Nevertheless, efforts are underway to offer a local data storage option for universities hesitant to store even de-identified student data off-site.
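To illustrate the kind of preparation this involves (the column names and file layout below are hypothetical, not the Ribbon Tool's documented schema), one might assemble de-identified records in which every student carries a value for each filter variable and a state value for each snapshot, replacing student identifiers with a sequential index:

```python
import csv

# Hypothetical source records; real data would be extracted from the
# student information system.
students = [
    {"gender": "F", "entry": "HighSchool",
     "2011-09": "Enrolled|Engineering|Electrical"},
    {"gender": "M", "entry": "Transfer",
     "2011-09": "Enrolled|Engineering|Mechanical"},
]

FILTERS = ["gender", "entry"]   # nominal filter variables
TERMS = ["2011-09"]             # one column per term snapshot; each value
                                # encodes the hierarchy levels joined by "|"

# Filter values: one row per student, keyed by a sequential index
# (the index, not a student number, de-identifies the records).
with open("filters.csv", "w", newline="") as f:
    w = csv.writer(f)
    w.writerow(["student"] + FILTERS)
    for i, s in enumerate(students):
        w.writerow([i] + [s[k] for k in FILTERS])

# Hierarchy values: the student's state at each snapshot.
with open("hierarchy.csv", "w", newline="") as f:
    w = csv.writer(f)
    w.writerow(["student"] + TERMS)
    for i, s in enumerate(students):
        w.writerow([i] + [s[t] for t in TERMS])
```

The sketch reflects the constraints stated above: every student has a value for every filter variable and for every level of the hierarchy, and no personally identifying field is exported.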
4. SOME USE CASES AND EXPERIENCES
We have been using the Ribbon Tool at the University of Saskatchewan for only a few months now. During that time, the tool has been further enhanced in its capabilities and features, and has improved in reliability and robustness, thanks to the development team at UC Davis. Below are some use cases that have proven useful in our experience to date.

4.1 Examining Degree Completion and Time to Graduation
Timely degree completion is a key component of enrolment management. For example, university funding is often associated with 6-year completion rates in undergraduate programs. Students stuck in a program for an extended time can reduce the number of available spaces in critical courses, and can face compounded delays due to rigid, prerequisite-bound course sequences.

Using the Ribbon Tool, it is easy to see degree completion times and to determine the number of students in a cohort who are not completing degrees within 6 years or who are embarking on a 7th or 8th year. Furthermore, it is possible, using filters, to see whether the students failing to complete within 6 years have had stop-outs, academic probation actions, internships, etc. It is possible to differentiate completion rates for students with different demographic factors, such as first-in-family (first-generation) students, international students, under-represented minority students, etc. It is possible to display student GPAs within the hierarchy to determine whether students slow to graduate have reached certain academic achievement levels.

The combination of filters and hierarchy refinements has permitted our Engineering School to discover new insights and bottlenecks regarding time to graduation.

4.2 Retention and Attrition
Analyzing student attrition and retention factors is of interest in some part of every university. Universities focused on broad access in Arts and Sciences often face retention challenges. Students unprepared for the change in culture of university life and those with academic deficiencies are not the only students who sometimes leave the institution. Established retention risk factors such as lower socio-economic status, being a first-generation student, and being a member of an under-represented minority all need to be considered. But when comparing different academic programs, such as Engineering and the Humanities, there may be different factors leading to attrition. For example, belief in the benefit of completing a university degree may be a factor in some areas, whereas the rigor of completing the degree may be a factor in others [4].

We have used the Ribbon Tool to track attrition and to differentiate students moving to a different program from stop-outs and drop-outs. Furthermore, in areas where there are various entry points into programs, it is possible to examine retention factors for students who have entered through different paths. In doing a retention analysis with the Ribbon Tool, demographic filter variables corresponding to expected causes of attrition can be quickly examined to see which factors, or combinations of factors, seem to make a difference. Being able to isolate a particular collection of students (e.g. those who drop out after sophomore year in a program) to further investigate their demographic makeup and their pathways has proven useful.

Ribbon can also be used to determine whether the flow of students through academic programs has been affected by changes in the demographics of entering students, whether as a result of changes to the feeder system or changes in admission policies.

The Ribbon Tool has enabled our Engineering School as well as our Faculty of Arts and Sciences to study retention issues (in STEM and elsewhere) more closely and to gain a better understanding of attrition patterns, particularly among under-represented minority students.

4.3 Program Innovation, Monitoring and Evaluation
As academic programs evolve and as new learner supports are introduced, there is a need for ongoing program monitoring and evaluation. The Ribbon Tool provides a mechanism for supporting the early phase of program evaluation through its rapid means of detecting differences across cohorts of students. For example, it is easy to compare student flows before and after the implementation of some program change. It is also possible to differentiate with a filter those students who were selected for participation in a pilot program, and further to filter those who did or did not engage.

We have begun to explore the impact of changes to our academic advising processes, the introduction of a freshman learning communities program, and the impact of increased academic support services in mathematics and writing. In such programs, where the macro effects may take many years to be realized, where effects may be differential across different student demographics, and where levels of participation and engagement are vital indicators, the Ribbon Tool is helping us develop and refine program-impact hypotheses that can then be tested statistically.

5. ACTIONABLE DECISIONS
Of course, all of the kinds of comparisons and descriptions presented in the use cases above can be achieved with a comprehensive set of reports, bar charts or tables, or with facile use of a statistics package. The difference with the Ribbon Tool is the speed with which one can mock up a scenario and try different filters and breakdowns to get an impression of where problems may be lurking or where impact may be seen. Furthermore, with the Ribbon Tool, an Associate Dean or Department Chair can take the reins and drive the visualization tool to explore exactly what is interesting - to follow a hunch or to confirm or refute a commonly held view.

Putting a powerful visualization tool in the hands of agents of change can empower them to make more persuasive cases for change with their colleagues. We have seen how visualizations that show no perceptible difference, where conventional wisdom would predict a difference, help people to confront and question their biases. These are precisely the kinds of evidence that can change minds, and actionable decisions can arise from changed minds.

6. CONCLUSION
Our experiences with the Ribbon Tool confirm that visualizations of student progression can be highly informative and powerfully persuasive in moving administrative staff to action. Uncovering the factors affiliated with undesired outcomes and discovering those connected with positive outcomes sets the stage for change.

The Ribbon Tool is one tool that can help move people to action, but like any tool it has its limitations. It is best suited to analyzing historical patterns and flows, and is not well suited to forecasting or modeling the future effects of potential changes. It is also a tool that naturally looks over relatively long time scales; we have not yet produced data to explore a more granular time scale. Finally, like any other tool, it can be misused to oversimplify relationships or to misrepresent realities. Just as with any power tool, much persuasive power is placed in the hands of the tool operator.

REFERENCES
[1] Kahneman, D. (2011). Thinking, Fast and Slow. New York: Farrar, Straus and Giroux.
[2] D3: Data-Driven Documents (http://d3js.org/).
[3] Tools for Evidence-Based Action (TEA) Community (http://t4eba.com).
[4] Kuley, E. A., Maw, S., & Fonstad, T. (2015). Engineering Student Retention and Attrition Literature Review. Proceedings of the Canadian Engineering Education Association.