=Paper=
{{Paper
|id=Vol-1590/paper-01
|storemode=property
|title=Empowering Instructors Through Customizable Collection and Analyses of Actionable Information
|pdfUrl=https://ceur-ws.org/Vol-1590/paper-01.pdf
|volume=Vol-1590
|authors=Danny Y.T. Liu,Charlotte E. Taylor,Adam J. Bridgeman,Kathryn Bartimote-Aufflick,Abelardo Pardo
|dblpUrl=https://dblp.org/rec/conf/lak/LiuTBBP16
}}
==Empowering Instructors Through Customizable Collection and Analyses of Actionable Information==
Danny Y.T. Liu, Faculty of Science, The University of Sydney, danny.liu@sydney.edu.au
Charlotte E. Taylor, Faculty of Science, The University of Sydney, charlotte.taylor@sydney.edu.au
Adam J. Bridgeman, Educational Innovation, The University of Sydney, adam.bridgeman@sydney.edu.au
Kathryn Bartimote-Aufflick, Quality and Analytics Group, The University of Sydney, kathryn.aufflick@sydney.edu.au
Abelardo Pardo, Faculty of Engineering and IT, The University of Sydney, abelardo.pardo@sydney.edu.au
ABSTRACT
The use of analytics to support learning has been increasing over the last few years. However, there is still a significant disconnect between what algorithms and technology offer and what everyday instructors need to integrate actionable items from these tools into their learning environments. In this paper we present the evolution of the Student Relationship Engagement System, a platform to support instructors to select, collect, and analyze student data. The approach provides instructors the ultimate control over the decision process to deploy various actions. The approach has two objectives: to increase instructor data literacies and competencies, and to provide a low adoption barrier to promote a data-driven pedagogical improvement culture in educational institutions. The system is currently being used in 58 courses and 14 disciplines, and reaches over 20,000 students.

CCS Concepts
• Information systems~Decision support systems • Human-centered computing~Visual analytics • Computing methodologies~Machine learning approaches • Applied computing~Education • Software and its engineering~Software creation and management

Keywords
Learning analytics adoption; scaling up; instructors; curriculum design and delivery; teaching approaches; machine learning.
1. INTRODUCTION
Since the early days of learning analytics (LA), the promise has been that the collection and analysis of large educational datasets could yield “actionable intelligence” [8, p41] to improve the overall student learning experience. At some of the institutions that have adopted LA, this intelligence typically takes the form of algorithms that predict student outcomes and aim to reduce attrition and failure rates [10; 16; 44; 53]. The higher education sector has been one of the first to explore the adoption of these techniques [22]. Despite these initiatives, recent reviews highlight the lack of widespread adoption of LA in the higher education sector [10; 44]. Various explanations have been suggested for this. At a high level, these include policy and ethical challenges [41; 54], institutional leaders’ misconceptions of LA [10], and the sector’s general culture of resistance to change [19; 40]. At an operational level, other authors have reported the inflexibility of vendor solutions, difficulties in accessing data [38], and concerns about the accuracy of such data [6]. To add complexity to this situation, evidence is mounting that the one-size-fits-all approach, typical in LA, may be inadequate in explaining student outcomes [21; 34; 55] and addressing the needs of students in different disciplines [43].

Notwithstanding, there is increasing interest in the instructor-facing benefits of LA. These include detecting patterns and trends, using data to support decision making, testing assumptions, and understanding the effect of learning designs [25]. Tools that display and analyze student data can help instructors reflect on their designs and better understand the relationships between variables [15; 51]. Moreover, new tools are being developed that address a long-held appeal to connect LA with the learning sciences [18], by helping instructors understand how learner behaviors correspond with their pedagogical intent [11]. Recent results in the area of artificial intelligence in education suggest a shift in focus away from fully self-contained decision systems to a paradigm based on human intelligence amplification [5]. However, low data literacies and competencies pose a significant barrier to addressing this shift and achieving wider LA acceptance and adoption [6; 24].

Taken together, these suggest that greater impact of LA (e.g. insight into curricular design and delivery versus prediction of retention) may be catalyzed by addressing, and indeed leveraging, identified adoption barriers. In this paper, we take the position that, to be effective, LA must empower instructors with tangible solutions to address pressing needs [15; 37]. For some, this may mean addressing immediate retention issues [10], that is, “to satisfy a tangible, small-scale problem” [38, p236], while pushing instructors along the adoption pipeline [35] to more involved insights. This builds on findings from early adoption of computers in teaching, where “use of computers for one purpose may encourage enthusiasm for further computer use” [26, p7]. We present a case study of a bespoke web-based LA solution at the University of Sydney, outline its capabilities and impact to date, and highlight the flow-on impacts for shifting teaching practices, curricular design and delivery, and growing a culture of LA use. We use Greller and Drachsler’s [24] generic LA design framework to situate our work in terms of stakeholders, objectives, data, instruments, and limitations.

2. OVERVIEW OF OUR APPROACH
We opted for a bottom-up approach where a basic but high-utility system was developed and improved collaboratively with instructors. From an early stage, this meant that our system addressed pressing objectives of key stakeholders [14]. Our design philosophy shared common themes with other LA developments, including usability, usefulness, data interoperability, real-time operation, flexibility, and generalizability [8; 15; 23]. However, in contrast to other approaches, our system’s growth was instructor-centered and ‘organic’, initially addressing a small-scale need (originally, tracking class attendance) and iteratively building features into the system (e.g. personalized interventions, data mining to uncover hidden relationships in course design) as instructors’ data literacies and competencies grew. A recent review of LA implementations at Australian institutions suggests that such early small-scale applications can have large impacts on capacity building [10].
2.1 Data collection
The importance of having the right data in the right place is a central issue for LA [28]. Most practical LA implementations involve collecting data into a central database available to the instrument [e.g. 3; 15; 38] or building analytics directly into the data source [e.g. 33]. Recognizing that both LMS and student information system (SIS) data have shortcomings [21; 31], and in keeping with our instructor-empowering philosophy, we opted for a hybrid approach where instructors could decide which data were most important for their contexts. For example, our discussions with instructors identified that class engagement and attendance data were important, in keeping with evidence-based practice for student outcomes [42; 47]. Unsurprisingly, interim grade and other performance data were also relevant [9]. Therefore, we started by developing a web-based, smartphone-friendly system that was easy and efficient to use and met these contextual needs (Figure 1). Since technology acceptance and adoption are closely linked with usefulness and usability [12], this was a first step in empowering instructors’ data usage.

Figure 1. A smartphone-friendly in situ data recording and display interface.
Due to technical limitations of our institution’s information technology infrastructure and capabilities, our system could not programmatically access LMS or SIS data. Other authors have solved this issue by capitulating to vendor-locked solutions, which offer a level of automatization but at the cost of flexibility, customizability, and possibly even scalability [38]. We addressed the issue by building in an additional facility to import any student-matched data required through semi-automated data file uploads. This is a similar design philosophy to Graf et al. [23] in allowing free choice of data, and addresses realistic instructional situations where course-specific nuances can confound less flexible systems [38]. Serendipitously, this had the unintended advantage of forcing instructors to consider the data they were entering, in terms of its relevance to their context and pedagogical design. In fact, the criticality of these contextual factors is becoming much clearer [e.g. 15; 21], lending strong support to our approach. In terms of Greller and Drachsler’s [24] framework, our approach addressed the direct objectives of stakeholders in providing a stable, easy-to-use instrument that collected immediately relevant data.
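As a minimal sketch of what such a semi-automated, student-matched import can look like (the field names and file layout are hypothetical assumptions, not the SRES code base), the following Python snippet merges an instructor-uploaded CSV into existing per-student records by identifier and returns unmatched rows for manual review:

```python
import csv

# Hypothetical enrolment records for one course: student ID -> stored data.
enrolled = {
    "309123456": {"name": "A. Student"},
    "309987654": {"name": "B. Student"},
}

def import_student_matched_csv(path, id_column, column_map):
    """Merge an instructor-uploaded CSV into the enrolment records.

    id_column  -- CSV header holding the student identifier.
    column_map -- maps other CSV headers to internal field names,
                  e.g. {"Quiz 1": "quiz_1_score"}.
    Returns the rows whose identifier did not match an enrolled student,
    so the instructor can review them rather than lose data silently.
    """
    unmatched = []
    with open(path, newline="") as fh:
        for row in csv.DictReader(fh):
            sid = row[id_column].strip()
            if sid not in enrolled:
                unmatched.append(row)
                continue
            for csv_col, field in column_map.items():
                enrolled[sid][field] = row[csv_col].strip()
    return unmatched

# e.g. review = import_student_matched_csv("quiz1.csv", "SID", {"Quiz 1": "quiz_1_score"})
```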
2.2 Data extraction and affordances for action to stakeholders, and the transparent analysis engine also forced
instructors to develop data interpretation and decision-making
Once the right data are in the right place, the typical progression in
competencies [24]. Moreover, we saw our approach as reflecting
LA usually involves visualization via dashboards [45]. However,
the human judgment and instructor empowerment roots of LA [52].
there is a danger that these visually appealing interfaces may
distract users (such as instructors, students, and management) from
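A minimal sketch of this kind of condition-based selection and personalized messaging is shown below; the field names, thresholds, and message template are illustrative assumptions rather than the SRES rules engine itself, which instructors configure through the interface in Figure 2 rather than in code:

```python
# Each student record holds whatever data the instructor chose to collect.
students = [
    {"name": "Alex", "email": "alex@example.edu", "quiz_avg": 42, "attendance": 2},
    {"name": "Sam",  "email": "sam@example.edu",  "quiz_avg": 78, "attendance": 4},
]

# A 'rule' is a list of conditions; all must hold for a student to be selected.
rule = [
    lambda s: s["quiz_avg"] < 50,    # struggling on early quizzes
    lambda s: s["attendance"] <= 2,  # low attendance in the first weeks
]

template = (
    "Dear {name}, we noticed your early quiz average is {quiz_avg}% and you have "
    "attended {attendance} classes so far. These resources may help: ..."
)

selected = [s for s in students if all(cond(s) for cond in rule)]
for s in selected:
    message = template.format(**s)
    # send_email(s["email"], message)  # or queue an SMS for delivery
    print(s["email"], "->", message)
```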
In addition to this approach to extracting information at scale, we also focused on a seldom-raised concern, namely that “the focus of LA appears fixed to an institutional scale rather than a human scale” [31, p4]. We therefore wished to promote the power of LA in augmenting human interaction. To this end, our system design allowed instructors to customize the information that could be immediately extracted and displayed to other staff (such as tutors and support staff) as they worked directly with students in face-to-face contexts (e.g. Figure 1). In a similar application, Lonn et al. [37] empowered academic advisors with pertinent student data. While use of our system in this way has been predominantly operational (e.g. redirecting students in class if they have not completed assigned pre-work), we envisage that, as more relevant data are available, this ‘mini human dashboard’ approach will spark deep human conversations supported by the relevant data.

In terms of Greller and Drachsler’s [24] framework, our approach allowed both staff (faculty as well as student support staff) and student stakeholders to take advantage of data through the instrument. In this process, information was prepared and presented to stakeholders, and the transparent analysis engine also forced instructors to develop data interpretation and decision-making competencies [24]. Moreover, we saw our approach as reflecting the human judgment and instructor empowerment roots of LA [52].

2.3 Guided semi-automated discovery
The closely related field of educational data mining has a greater focus than LA on automated methods of discovering meaning in educational data [4]; such methods address one of the key opportunities for LA, namely “to unveil and contextualize so far hidden information out of the educational data” [24, p47]. Data mining techniques in LA [4] have primarily focused on outcome prediction through regression and classification [e.g. 21], semantic analyses [29], and social network analysis [e.g. 36]. However, data mining techniques typically require substantial technical understanding and are beyond the capabilities of most instructors [56]. Additionally, input variables are differentially predictive for each instructional context [21], necessitating a more nuanced and contextualized approach to information discovery.

To this end, we are in the initial stages of testing an approach that helps instructors uncover hidden relationships in data about their students. We are combining the data they have already collected in our system with the machine learning application programming interfaces (APIs) provided by BigML (https://bigml.com). Our approach involves instructors selecting data to analyze, based on their pedagogical context and intent, using a drag-and-drop graphical user interface where they can also transform and/or combine data (Figure 3) and select a target (dependent) variable (e.g. an interim grade). The system then runs a series of machine learning algorithms (see section 3.2) against these data and returns analysis results for instructors to interpret in their context. This approach is more user-friendly than a similar system designed by Pedraza-Perez et al. [46], and can also include data beyond the LMS. This process may provide novel insights into curriculum design and delivery, such as visual and statistical identification of factors that impact student outcomes, and identification of patterns in performance across multiple courses with different course designs. Other possible insights are outlined in section 3.2.

Figure 3. Attribute selection interface allowing instructors to select, transform, and combine data they wish to analyze.
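On the server side, this workflow could be driven by BigML’s Python bindings roughly as sketched below; the file name is a hypothetical export, the choice of resources mirrors section 3.2, and the exact options would depend on the data and target an instructor selects:

```python
from bigml.api import BigML

# Credentials are read from the BIGML_USERNAME / BIGML_API_KEY environment variables.
api = BigML()

# 1. Upload the instructor-selected, student-matched data exported from the system.
source = api.create_source("course_data.csv")  # hypothetical export file
api.ok(source)                                 # wait until the resource is ready

# 2. Build a dataset; by default the last field is treated as the objective
#    (target) variable, and an instructor-chosen target can instead be set
#    through the "objective_field" creation option.
dataset = api.create_dataset(source)
api.ok(dataset)

# 3. Supervised and unsupervised analyses discussed in section 3.2.
model = api.create_model(dataset)              # decision tree (section 3.2.1)
association = api.create_association(dataset)  # association rules (section 3.2.2)
cluster = api.create_cluster(dataset)          # clustering (section 3.2.3)
for resource in (model, association, cluster):
    api.ok(resource)

# The finished resources are then retrieved and rendered for instructors
# to interpret in their own pedagogical context.
```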
In terms of Greller and Drachsler’s [24] framework, this nascent approach adds algorithmic capability to the instrument to provide certain stakeholders with possibly hidden information, beyond that of prediction. However, it requires higher data literacies and competencies, such as critical evaluation skills (internal limitations [24]). By working through the other steps of the process already outlined (namely data selection, collection, extraction, and basic analyses), our presumption is that instructors will have gained some of these competencies. Together, we see this as a combination of LA and educational data mining, where instructor judgment is empowered through leveraging machine learning [52].

2.4 Preliminary outcomes
The first version of our system was trialed with four courses in 2012. Since then, it has been adopted in 14 disciplines and 58 courses, covering over 20,000 students. This approach has allowed our system to evolve functionality through collaboration with the instructors who are using it. Although lacking empirical data, anecdotal feedback indicates that uptake is, in part, due to the customizability and afforded actions (i.e. usefulness [12]) and ease of use of the system. This contrasts with the issues highlighted by Lonn et al. [38] around their scaled-up LA system with a vendor-locked approach not being “nimble enough to be responsive to idiosyncratic cases” [38, p238]. The interventions for students, using our system, have contributed to sustained improvements in retention as well as overall performance (Figure 4). Now that instructors have more experience working with their data, we are collaborating with them to expand the opportunities afforded by our system to further understand, optimize, and transform their teaching.

Figure 4. Outcomes from a representative Science course. Percentage of students (y-axis) in each outcome category (HD, high distinction; DI, distinction; CR, credit; PS, pass; FA, fail; attrited, i.e. left the course) is presented against the calendar years in which the course was offered.
3. UNDERSTANDING, OPTIMIZING, AND TRANSFORMING TEACHING
3.1 Teaching practices
Too often the student experience at university is one of isolation from instructors, which is especially poignant for students transitioning to higher education, where instructors can appear disconnected [30]. While LA may exacerbate this situation by defocusing the human aspects of learning [31], our approach encourages instructors to break this pattern: hence the name of our system, the Student Relationship Engagement System (SRES). The strength of the SRES lies in the ability for instructors to customize analyses to the needs of their course and students. One of the primary goals of the SRES is to personalize communication with students and engage them in conversations about their learning. This is particularly important when operating at scale with large cohorts, as data-driven personalizations are a key factor in promoting student engagement [7]. We see this as a blending of Greller and Drachsler’s [24] objectives of reflection and prediction, where timely data are extracted to aid co-reflection by instructors and students. We find that this approach can also encourage more meaningful student-faculty contact, thus addressing a constant warning in the field that students’ internal conditions must be taken into account [20].

3.2 Instructional and curricular design and delivery
Currently, we are trialing several newer developments in the SRES in our own courses to explore further ways to support decision making [24] about instructional and curricular design and delivery. Here, we present three proof-of-concept examples that attempt to derive meaning in our contexts by analyzing real course data (Table 1) using machine learning tools. Instructors can select (Figure 3) the data that are most relevant in their contexts (for example, mid-term test grade, session length in the LMS, attendance count early in the semester, average grade of online quizzes early in the semester, activity in online forums, etc.), and apply these tools to uncover hidden patterns. For example, what relationship is there between class attendance, different aspects of online engagement, and test grades?

Table 1. Description of sample variables.
Data/variable: Description
Piazza_questions: Number of questions asked on the online forum
C_COURSEACTIVITYINHOURS: Total session length in the LMS
online_worksheets: Total score in formative online quizzes
final_grade: Final course grade
early_attendance: Attendance pattern at the first four practical classes of semester
Test_1: Mark in the first mid-term exam/test
early_prework_quizzes: Average of the first four pre-class online quizzes
Piazza_answers: Number of replies posted to the online forum
3.2.1 Decision trees
Decision tree algorithms generate hierarchical, conditions-based predictive models that attempt to explain the conditions or patterns in data that lead to a particular outcome [49]. In our context, the decision tree discovered through machine learning suggested that early quiz performance (which was only worth a low proportion of the final grade) was an important factor in student success (Figure 5). While instructor intuition about their students may predict this, there is value in having data demonstrating various ‘paths to success’. Additionally, when one considers that each of these quizzes is worth only 0.65% of a student’s final grade (again emphasizing the importance of context and design), this data-enabled discovery becomes the grounds for supporting the evidence-based practices of emphasizing time on task and continuous assessment. These analyses are now driving pedagogical changes (e.g. decisions on provision of feedback in these quizzes versus no feedback) to improve student performance. For instructors, this approach not only helps identify struggling students, but also supports decisions about learning activities and assessing course effectiveness [50; 51].

In many cases in LA and educational data mining, decision tree algorithms are used purely as opaque models for prediction of student outcomes [e.g. 27; 32]. However, this does not take full advantage of the fact that decision trees are one of the few machine learning algorithms that can produce easily human-interpretable and -understandable predictive models, in the form of choices and rules [49]. As in our example, analysis of LMS interaction and completion data with decision trees can reveal behavioral and early-performance characteristics of high- and low-performing students, and allows instructors to adapt their courses and interventions based on this information [17; 50].

Figure 5. Example decision tree classification results from BigML API as accessed by our system. As an example, the highlighted branch leads to a fail (FA) classification.
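A comparable, locally runnable illustration of this idea (using scikit-learn rather than the BigML API, and made-up data shaped like the Table 1 variables) fits a small decision tree and prints its human-readable rules:

```python
import pandas as pd
from sklearn.tree import DecisionTreeClassifier, export_text

# Illustrative data shaped like the Table 1 variables (not real course data).
df = pd.DataFrame({
    "early_prework_quizzes": [85, 40, 90, 30, 70, 55],
    "early_attendance":      [4,  1,  4,  2,  3,  2],
    "Test_1":                [75, 35, 80, 40, 60, 50],
    "final_grade":           ["CR", "FA", "HD", "FA", "PS", "PS"],
})

features = ["early_prework_quizzes", "early_attendance", "Test_1"]
tree = DecisionTreeClassifier(max_depth=3, random_state=0)
tree.fit(df[features], df["final_grade"])

# export_text yields the choice/rule structure that instructors can read directly.
print(export_text(tree, feature_names=features))
```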
3.2.2 Association rule mining
Association rule mining reveals typically hidden patterns in data that commonly occur together [4; 51]. These patterns are expressed as rules or relationships of varying strength from antecedent to consequent conditions. Our application leverages a BigML visualization to graphically represent these rules. In our context, association rule mining provided evidence that lower in-class attendance was associated with lower online activity, and that lower online activity was a central node between other disengagement measures (Figure 6, main network). On the other hand, common relationships were also found between strong mid-term test marks, high online quiz marks, and strong pre-class quiz performance (Figure 6, bottom-left network), although interestingly high online activity was not included. While again this might seem obvious, this data-driven finding could trigger curriculum or instructional design changes to better engage students [48]. The associations discovered could also inform intervention strategies by identifying linked problem areas [50].

Figure 6. Example visualization of association rule mining results from BigML API as accessed by our system.
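The underlying idea can be illustrated with a hand-rolled support/confidence/lift calculation over binarized engagement indicators, as sketched below; the indicator names and thresholds are assumptions, and our system calls the BigML association discovery API rather than this code:

```python
import pandas as pd
from itertools import combinations

# Illustrative binarized indicators (True = the condition holds for a student).
df = pd.DataFrame({
    "low_attendance": [1, 1, 0, 0, 1, 0],
    "low_online":     [1, 1, 0, 0, 1, 1],
    "strong_Test_1":  [0, 0, 1, 1, 0, 0],
    "strong_prework": [0, 0, 1, 1, 0, 1],
}).astype(bool)

n = len(df)
# One direction (antecedent -> consequent) per pair, for brevity.
for antecedent, consequent in combinations(df.columns, 2):
    both = (df[antecedent] & df[consequent]).sum()
    support = both / n
    confidence = both / df[antecedent].sum()
    lift = confidence / (df[consequent].sum() / n)
    if support >= 0.3 and confidence >= 0.6:
        print(f"{antecedent} -> {consequent}: "
              f"support={support:.2f} confidence={confidence:.2f} lift={lift:.2f}")
```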
3.2.3 Clustering
Clustering algorithms group members of a dataset (in this case, students) together based on similarity between their data [4]. In our context, the clustering algorithm identified a group of mid-performing students who had high engagement with an online forum (Piazza_questions, Figure 7, cluster 4), compared to relatively low engagement from higher-performing students. Interestingly, this cluster was differentiated from another cluster of mid-performing students who had, overall, much lower online engagement (Figure 7, cluster 0). This finding counters the common understanding that higher discussion forum engagement is correlated with higher performance [e.g. 39], and again re-emphasizes the importance of considering contextual and pedagogical factors [21]. In our context, the online forum functioned in a question-and-answer format, which may help to explain why a cluster of poorer-performing students had higher engagement, i.e. posting of questions. Together, these analyses and their data-driven findings can be powerful for instructors because they help to support or refute a priori assumptions about their students, pedagogical strategies, and curricular design. Clustering may also provide insight into behaviors common to groups of differentially-performing students [1]. Some have even suggested that clustering students based on observed behaviors may assist the formation of congruous student groups [50].

Figure 7. Example clustering output from BigML API as accessed within our system.
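A minimal illustration of this kind of grouping (k-means via scikit-learn on made-up data with Table 1-style column names, rather than the BigML clustering we actually call) is:

```python
import pandas as pd
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Illustrative per-student features (column names echo Table 1; values are made up).
df = pd.DataFrame({
    "Piazza_questions":  [12, 0, 1, 15, 2, 0, 9, 1],
    "online_worksheets": [60, 20, 80, 55, 85, 25, 50, 90],
    "Test_1":            [55, 40, 75, 58, 80, 35, 52, 82],
})

# Standardize so that no single variable dominates the distance metric.
X = StandardScaler().fit_transform(df)

kmeans = KMeans(n_clusters=3, n_init=10, random_state=0)
df["cluster"] = kmeans.fit_predict(X)

# Per-cluster means let instructors inspect what characterizes each group,
# e.g. a mid-performing cluster with unusually high forum activity.
print(df.groupby("cluster").mean().round(1))
```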
3.3 Cultural shifts
Our approach leveraged existing instructor needs to introduce them to a data-driven LA system, the SRES. A consequence of doing so has been to force them to think about their contexts and the relevant data. We are currently analyzing these instructor capability outcomes, as others have suggested that “implementing early and to small scale, even if inadequately, will build capacity” [10, p38]. Our approach certainly started small-scale, and was perhaps somewhat inadequate in not providing automatic access to the plethora of data available in LMS logs and the SIS. However, our hope is that by starting small and introducing instructors to data-driven ways of operating, we can introduce them to deeper LA ‘by stealth’ and gradually expand their capabilities, in parallel with expansion of the system’s capabilities.

4. CONCLUSION
The field of learning analytics is under unprecedented pressure to effectively bridge the gap between technological capacity and tangible improvements of the student experience. The shift towards tools that enhance current instructional practice is occurring. In this paper we have presented the evolution of the Student Relationship Engagement System following an organic and instructor-centric approach. The platform provides a high level of control over data collection and processing as well as direct control over the actions derived from the analysis. The current uptake of the tool across disciplines suggests its suitability to promote data literacy skills and a culture of data-supported innovation. As further avenues to explore, we have identified the need to increase the understanding of how instructors are empowered through data-driven analysis of learning designs and delivery.

5. ACKNOWLEDGMENTS
We thank many instructors and student support staff for their input into the process, countless students for their enthusiasm, and the Australasian Society for Computers in Learning in Tertiary Education (ascilite) Learning Analytics Special Interest Group for their support.
6. REFERENCES
[1] Amershi, S. and Conati, C. (2009) Combining Unsupervised and Supervised Classification to Build User Models for Exploratory Learning Environments. Journal of Educational Data Mining, 1(1), 18-71.
[2] Arnold, K. E. (2010) Signals: Applying Academic Analytics. EDUCAUSE Quarterly, 33(1), n1.
[3] Arnold, K. E. and Pistilli, M. D. (2012) Course signals at Purdue: using learning analytics to increase student success. In Proceedings of the 2nd International Conference on Learning Analytics and Knowledge, Vancouver.
[4] Baker, R. S. and Inventado, P. S. (2014) Educational data mining and learning analytics. In Learning Analytics (pp. 61-75). Springer.
[5] Baker, R. S. (2016) Stupid Tutoring Systems, Intelligent Humans. International Journal of Artificial Intelligence in Education, 1-15.
[6] Bichsel, J. (2012) Analytics in higher education: Benefits, barriers, progress, and recommendations. Available: https://net.educause.edu/ir/library/pdf/ERS1207/ers1207.pdf
[7] Bridgeman, A. and Rutledge, P. (2010) Getting personal - feedback for the masses. Synergy, 30, 61-68.
[8] Campbell, J. P., DeBlois, P. B. and Oblinger, D. G. (2007) Academic analytics: A new tool for a new era. EDUCAUSE Review, 42(4), 40-57.
[9] Clow, D. (2012) The learning analytics cycle: closing the loop effectively. In Proceedings of the 2nd International Conference on Learning Analytics and Knowledge, Vancouver.
[10] Colvin, C., Rogers, T., Wade, A., Dawson, S., Gašević, D., Buckingham Shum, S., Nelson, K. J., Alexander, S., Lockyer, L., Kennedy, G., Corrin, L. and Fisher, J. (2015) Student retention and learning analytics: a snapshot of Australian practices and a framework for advancement (draft report). Australian Office for Learning and Teaching, Sydney, Australia.
[11] Corrin, L., Kennedy, G., Barba, P. D., Bakharia, A., Lockyer, L., Gasevic, D., Williams, D., Dawson, S. and Copeland, S. (2015) Loop: A learning analytics tool to provide teachers with useful data visualisations. In Proceedings of the 32nd Conference of the Australasian Society for Computers in Learning in Tertiary Education, Perth.
[12] Davis, F. D. (1993) User acceptance of information technology: system characteristics, user perceptions and behavioral impacts. International Journal of Man-Machine Studies, 38(3), 475-487.
[13] Dietz-Uhler, B. and Hurn, J. E. (2013) Using learning analytics to predict (and improve) student success: A faculty perspective. Journal of Interactive Online Learning, 12(1), 17-26.
[14] Drachsler, H. and Greller, W. (2012) Confidence in learning analytics. In Proceedings of the 2nd International Conference on Learning Analytics and Knowledge, Vancouver.
[15] Dyckhoff, A. L., Zielke, D., Bültmann, M., Chatti, M. A. and Schroeder, U. (2012) Design and implementation of a learning analytics toolkit for teachers. Journal of Educational Technology & Society, 15(3), 58-76.
[16] ECAR-ANALYTICS Working Group (2015) The Predictive Learning Analytics Revolution: Leveraging Learning Data for Student Success. EDUCAUSE Center for Analysis and Research, Louisville.
[17] Falakmasir, M. H. and Habibi, J. (2010) Using educational data mining methods to study the impact of virtual classroom in e-learning. In Proceedings of the 3rd International Conference on Educational Data Mining, Pittsburgh.
[18] Ferguson, R. (2012) Learning analytics: drivers, developments and challenges. International Journal of Technology Enhanced Learning, 4(5-6), 304-317.
[19] Ferguson, R., Clow, D., Macfadyen, L., Essa, A., Dawson, S. and Alexander, S. (2014) Setting learning analytics in context: overcoming the barriers to large-scale adoption. In Proceedings of the Fourth International Conference on Learning Analytics and Knowledge.
[20] Gašević, D., Dawson, S. and Siemens, G. (2015) Let’s not forget: Learning analytics are about learning. TechTrends, 59(1), 64-71. http://dx.doi.org/10.1007/s11528-014-0822-x
[21] Gašević, D., Dawson, S., Rogers, T. and Gasevic, D. (2016) Learning analytics should not promote one size fits all: The effects of instructional conditions in predicting academic success. The Internet and Higher Education, 28, 68-84. http://dx.doi.org/10.1016/j.iheduc.2015.10.002
[22] Goldstein, P. J. and Katz, R. N. (2005) Academic analytics: The uses of management information and technology in higher education. Available: http://www.educause.edu/library/resources/academic-analytics-uses-management-information-and-technology-higher-education
[23] Graf, S., Ives, C., Rahman, N. and Ferri, A. (2011) AAT: a tool for accessing and analysing students' behaviour data in learning systems. In Proceedings of the First International Conference on Learning Analytics and Knowledge, Banff.
[24] Greller, W. and Drachsler, H. (2012) Translating learning into numbers: A generic framework for learning analytics. Journal of Educational Technology & Society, 15(3), 42-57.
[25] Gunn, C., McDonald, J., Donald, C., Milne, J., Nichols, M. and Heinrich, E. (2015) A practitioner's guide to learning analytics. In Proceedings of the 32nd Conference of the Australasian Society for Computers in Learning in Tertiary Education, Perth.
[26] Jacobsen, D. M. (1998) Adoption Patterns of Faculty Who Integrate Computer Technology for Teaching and Learning in Higher Education. In Proceedings of the 98th World Conference on Educational Multimedia and Hypermedia, Freiburg.
[27] Jayaprakash, S. M., Moody, E. W., Lauría, E. J., Regan, J. R. and Baron, J. D. (2014) Early alert of academically at-risk students: An open source analytics initiative. Journal of Learning Analytics, 1(1), 6-47.
[28] Jones, D., Beer, C. and Clark, D. (2013) The IRAC framework: Locating the performance zone for learning analytics. In Proceedings of the 30th Conference of the Australasian Society for Computers in Learning in Tertiary Education, Sydney.
[29] Knight, S. and Littleton, K. (2015) Discourse-centric learning analytics: Mapping the terrain. Journal of Learning Analytics, 2(1), 185-209.
[30] Krause, K. (2005) Understanding and promoting student engagement in university learning communities. Paper presented as keynote address: Engaged, Inert or Otherwise Occupied, 21-22.
[31] Kruse, A. and Pongsajapan, R. (2012) Student-Centered Learning Analytics. CNDLS Thought Papers, Georgetown University.
[32] Lauría, E. J., Moody, E. W., Jayaprakash, S. M., Jonnalagadda, N. and Baron, J. D. (2013) Open academic analytics initiative: initial research findings. In Proceedings of the Third International Conference on Learning Analytics and Knowledge, Leuven.
[33] Liu, D. Y. T., Froissard, J.-C., Richards, D. and Atif, A. (2015) An enhanced learning analytics plugin for Moodle: student engagement and personalised intervention. In Proceedings of the 32nd Conference of the Australasian Society for Computers in Learning in Tertiary Education, Perth.
[34] Liu, D. Y. T., Froissard, J.-C., Richards, D. and Atif, A. (2015) Validating the Effectiveness of the Moodle Engagement Analytics Plugin to Predict Student Academic Performance. In Proceedings of the 2015 Americas Conference on Information Systems, Puerto Rico.
[35] Liu, D. Y. T., Rogers, T. and Pardo, A. (2015) Learning analytics - are we at risk of missing the point? In Proceedings of the 32nd Conference of the Australasian Society for Computers in Learning in Tertiary Education, Perth.
[36] Lockyer, L., Heathcote, E. and Dawson, S. (2013) Informing pedagogical action: Aligning learning analytics with learning design. American Behavioral Scientist, 57(10), 1439-1459.
[37] Lonn, S., Krumm, A. E., Waddington, R. J. and Teasley, S. D. (2012) Bridging the gap from knowledge to action: Putting analytics in the hands of academic advisors. In Proceedings of the 2nd International Conference on Learning Analytics and Knowledge, Vancouver.
[38] Lonn, S., Aguilar, S. and Teasley, S. D. (2013) Issues, challenges, and lessons learned when scaling up a learning analytics intervention. In Proceedings of the Third International Conference on Learning Analytics and Knowledge, Leuven.
[39] Macfadyen, L. P. and Dawson, S. (2010) Mining LMS data to develop an “early warning system” for educators: A proof of concept. Computers & Education, 54(2), 588-599.
[40] Macfadyen, L. P. and Dawson, S. (2012) Numbers are not enough. Why e-learning analytics failed to inform an institutional strategic plan. Journal of Educational Technology & Society, 15(3), 149-163.
[41] Macfadyen, L. P., Dawson, S., Pardo, A. and Gašević, D. (2014) Embracing big data in complex educational systems: The learning analytics imperative and the policy challenge. Research & Practice in Assessment, 9(2), 17-28.
[42] Massingham, P. and Herrington, T. (2006) Does attendance matter? An examination of student attitudes, participation, performance and attendance. Journal of University Teaching & Learning Practice, 3(2), 3.
[43] McPherson, J., Tong, H. L., Fatt, S. J. and Liu, D. Y. T. (2016) Learning analytics and disciplinarity: Building a typology of disciplinary differences from student voices. In Proceedings of the 6th International Conference on Learning Analytics and Knowledge, Edinburgh.
[44] Newland, B., Martin, L. and Ringan, N. (2015) Learning analytics in UK HE 2015: A HeLF survey report. HeLF Heads of e-Learning Forum. Available: http://www.helf.ac.uk/
[45] Pardo, A. (2014) Designing Learning Analytics Experiences. In J. A. Larusson and B. White (Eds.), Learning Analytics (pp. 15-38). Springer.
[46] Pedraza-Perez, R., Romero, C. and Ventura, S. (2010) A Java desktop tool for mining Moodle data. In Proceedings of the 4th International Conference on Educational Data Mining, Eindhoven.
[47] Rodgers, J. R. (2001) A panel-data study of the effect of student attendance on university performance. Australian Journal of Education, 45(3), 284-295.
[48] Romero, C., Ventura, S. and De Bra, P. (2004) Knowledge discovery with genetic programming for providing feedback to courseware authors. User Modeling and User-Adapted Interaction, 14(5), 425-464.
[49] Romero, C., Ventura, S., Espejo, P. G. and Hervás, C. (2008) Data Mining Algorithms to Classify Students. In Proceedings of the 1st International Conference on Educational Data Mining, Montreal.
[50] Romero, C., Ventura, S. and García, E. (2008) Data mining in course management systems: Moodle case study and tutorial. Computers & Education, 51(1), 368-384.
[51] Romero, C. and Ventura, S. (2010) Educational data mining: a review of the state of the art. IEEE Transactions on Systems, Man, and Cybernetics, Part C: Applications and Reviews, 40(6), 601-618.
[52] Siemens, G. and Baker, R. S. (2012) Learning analytics and educational data mining: towards communication and collaboration. In Proceedings of the 2nd International Conference on Learning Analytics and Knowledge, Vancouver.
[53] Siemens, G., Dawson, S. and Lynch, G. (2013) Improving the quality and productivity of the higher education sector. Society for Learning Analytics Research.
[54] Slade, S. and Prinsloo, P. (2013) Learning analytics ethical issues and dilemmas. American Behavioral Scientist, 57(10), 1510-1529.
[55] Wolff, A., Zdrahal, Z., Nikolov, A. and Pantucek, M. (2013) Improving retention: predicting at-risk students by analysing clicking behaviour in a virtual learning environment. In Proceedings of the 3rd International Conference on Learning Analytics and Knowledge, Leuven. http://dx.doi.org/10.1145/2460296.2460324
[56] Zorilla, M. E., García-Saiz, D. and Balcázar, J. L. (2010) Towards parameter-free data mining: Mining educational data with yacaree. In Proceedings of the 4th International Conference on Educational Data Mining, Eindhoven.