Dashboard for Actionable Feedback on Learning Skills: How Learner Profile Affects Use

Tom Broos, Laurie Peeters, Katrien Verbert, Carolien Van Soom, Greet Langie, and Tinne De Laet
KU Leuven, Leuven, Belgium
tom.broos@kuleuven.be

Abstract. Learning Analytics Dashboards (LAD) provide a means to leverage data to support learners, teachers, and counselors. This paper reports on an in-depth analysis of how learners interact with a LAD. N=1,406 first-year students in 12 different study programs were invited to use a LAD to support them in their transition from secondary to higher education. The LAD provides actionable feedback about five of the learning skills assessed by the Learning and Study Strategies Inventory (LASSI): concentration, anxiety, motivation, test strategies, and time management. We logged access to and behavior within the LAD and analyzed their relationship with these learning skills. While eight out of ten students accessed the LAD, students with lower time management scores tend to have a lower click-through rate. Once within the LAD, students with lower scores for specific learning skills access the corresponding information and remediation possibilities more often. Regardless of their scores for any of the other learning skills, learners with higher motivation scores read the remediation possibilities for the other four learning skills more often. Gender and study program have an influence on how learners use the LAD. Our findings may help both researchers and practitioners by creating awareness about how LAD use in itself may depend on the context and profile of the learner.

Keywords: Learning Analytics Dashboard · First-Year · Higher Education · Learning Skills

1 Introduction

This paper reports on a Learning Analytics Dashboard (LAD) that was offered to N=1,406 first-year students in 12 different STEM (Science, Technology, Engineering, and Mathematics) study programs at the University of Leuven, Belgium.
The LAD aimed at supporting students in transition from secondary to higher education with actionable information about five meta-cognitive abilities –further referred to as 'learning skills'– that contribute to academic achievement: concentration, anxiety, motivation, test strategies, and time management. Students' learning skills were assessed using the Learning and Study Strategies Inventory (LASSI) at the beginning of the academic year. In total 80.7% of the invited students clicked through to the dashboard, but the response rate differed depending on the profile of the student, as did the level of activity within the LAD. Previously [3] we introduced the dashboard and focused on its scalability and perceived usefulness. This paper presents an in-depth analysis of the interplay between the message the dashboard aims to convey and the profile of the targeted student. The relatively large number of students involved permits a quantitative study of in-dashboard behavior to address three research questions:

RQ-1: Who are we reaching? → How do learning skills affect the click-through rate (CTR) of the LAD?
RQ-2: Is our message invoking questions? → How do learning skill levels influence user activity with regard to dashboard sections related to those (corresponding) learning skills?
RQ-3: Are learning skills relevant to LAD use? → Do some learning skills affect LAD user activity related to other (non-corresponding) learning skills?

These questions were inspired by the context of student counseling services in which the project was embedded. Counselors had divergent expectations about whether the LAD could reach students with different profiles equally, and especially about its appeal to students with lower learning skill scores. In our opinion, understanding how different types of students – in this case, students with different learning skill scores – perceive and interact with LADs provides useful information to improve the design of future student-facing LADs.
2 Related work

Learning Analytics (LA) is a relatively young field at the intersection of theory, design, and data science, and borrows from many related disciplines [8]. A definition of LA in simple terms by Duval [6] states that LA is "about collecting traces that learners leave behind and using those traces to improve learning". It involves "measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs" [14]. The reporting part is often fulfilled by dashboards. Learning Analytics Dashboards use data visualization as a technique to deliver actionable information. An overview of interesting LADs was presented in [21] and extended in [22]. A first systematic literature review was provided by [17]. It defines the LAD as "a single display that aggregates multiple visualizations of different indicators about learner(s), learning process(es) and/or learning context(s)" and concluded that few studies provide strong evidence for effective impact on learning. In an overview of the state of the art of LA in higher education [13], the authors underlined the limited group size of most LA designs and called the scalability of current systems to a wider context into question. Another recent systematic overview of LADs [2] noted that understanding students' use of LA systems is essential to improve recommendations and to accommodate dashboards to students' needs.

Learning dispositions are closely related to what we refer to in this paper as 'learning skills'. They are described as a concept that "refers to a relatively enduring tendency to behave in a certain way", "identified in the action a person takes in a particular situation" and closely related to 'competence', 'style', or 'capability' [18].
These authors discuss the use of a self-report questionnaire, ELLI (Effective Lifelong Learning Inventory [4]), to collect the disposition data and the provision of learners with a visualization, the 'ELLI spider diagram', to support reflection. Recent work [20] combines learning dispositions data with digital traces available from learning management systems (LMS) and student information systems (SIS) to explain tool use, to improve the predictive power of LA predictive models, and to improve learning feedback.

3 Student Dashboard

To facilitate future comparison and generalization, this section follows the recommendation [2] to describe student-facing LADs using nine categories of questions.

Intended goal The goal of the dashboard was to help students in transition from secondary to higher education by providing actionable insight on their learning skills and information on how to improve. Boundary conditions were that (1) the dashboard was embedded in a traditional context of higher education strongly depending on face-to-face education, (2) it needed to be scalable across students from several study programs, and (3) it had to avoid dependency on supplementary, fine-grained information about specific courses or programs.

Information Selection The dashboard focused on five learning skills that were found to be valuable predictors of study success for first-year students in STEM-oriented higher education programs [16]: concentration (CON), (failure) anxiety (ANX), motivation (MOT), test strategies (TST) and time management (TMT). (The level of anxiety is measured on an inverted scale: the higher the anxiety score, the lower the level of anxiety.) The dashboard used data about students, their learning skills, and scores that were already available within the institution, but fragmented across services and not fed back to the data subjects in a direct way. Fig. 2 gives a high-level overview of data flows and systems involved.
The Learning and Study Strategies Inventory (LASSI) [11, 23] was used to assess learning skills using a survey taken within the first weeks of the academic year. For some study programs an online version of the questionnaire was used, but in most cases a pen-and-paper alternative was preferred for practical reasons. Paper questionnaires were scanned in and processed using Optical Mark Recognition (OMR). Subsequently, all questionnaires were scored and results were fed into a LA Data Mart, which also contained data about students, courses and exam scores, extracted from the university's campus management system. Student counselors of the study programs involved provided snippets of textual information using an extension to the markdown format. Within the dashboard, these snippets were combined using scenarios based on the individual profile of the student. The dashboard system did not require direct access to the student's name or other characteristics that allow for straightforward identification, as these were made available at access time by the single sign-on infrastructure.

Needs Assessment As in many higher education programs, the first examination period at the University of Leuven takes place in the middle of the academic year. Awaiting a first formal assessment, first-year students have only limited information at their disposal to estimate their own academic abilities. The outcome of their learning strategies is therefore uncertain. Social-comparison theory suggests that in absence of objective knowledge about their own position, individuals turn to comparing themselves to others [7] to reduce uncertainty. However, in transition to higher education, students lose the familiar benchmarking opportunities provided by the classroom context of secondary education.
The dashboard attempted to support these students by providing additional information, not just about their personal learning skill levels, but also about how these compare to those of peers. Additionally, the comparison was extended to previous year's students' learning skills and academic results to demonstrate the association with academic performance.

Visual Design The dashboard contained extensive textual information accompanied by easy-to-understand charts. Fig. 1 shows the dashboard from the perspective of a random student in the 'Engineering Technology' program. The content was divided into six tabs (marked by A in the screenshot), the first one containing an introduction about the objective of the LAD, the origin of the data, and the connection to the research project. The five subsequent tabs each went into detail about one specific learning skill. These tabs were offered in alphabetical order (in Dutch). The upper part of each learning skill tab contained information about the learning skill definition, the student's level, and comparison to peers. The lower part contained similar information about students from the previous year and linked it to their academic performance measured by the cumulative study efficiency (CSE), the percentage of obtained credits out of total credits. CSE is a measure of academic progress that is subject to binding ('30% rule') and soft ('50% rule') institutional regulations [1]. At the bottom of each tab (marked by B in the screenshot), a button labeled 'Okay, what now?' was available. When clicked, the tab expanded to include additional textual information on how to improve the learning skill level, including a range of options going from simple tips and tricks to enrollment in remediation and counseling programs offered by the university.

Fig. 1. Screenshot of a student's view on the dashboard.
Translation of the Dutch text labels of the six tabs (A), from left to right: Introduction, Concentration, Failure Anxiety, Motivation, Test Strategy and Time Management. The active tab is 'Motivation'. The text above the first chart introduces the learning skill at hand, introduces the individual score and an interpretation thereof, and explains how to read the positioning chart right below. The text in between charts prefaces the visualization of previous year's students' results (in terms of study efficiency). At the bottom, students are presented with a button (B), labeled 'Okay, what now?', to show tips and information on how to improve the specific learning skill.

Fig. 2. High-level overview of systems and data flows to support the dashboard.

Visualization Unit charts were selected to visualize all data for reasons of simplicity. The reasoning was that students who are just entering higher education may not have built adequate data literacy skills yet. By using simple dots to represent students (one dot is one student), the visualization tried to connect to a more intuitive notion of concepts such as comparability of groups and significance.

Student perceptions Once a user opened a third tab within the dashboard, a yellow message box was shown at the bottom of the screen, asking the student to answer three simple questions on a 1–5 scale. A minority (14.7%) of students provided complete feedback. Results (see Fig. 3) were generally positive (4–5/5) with respect to the perceived usefulness (89%) and clearness (71%) of the information, and positive but less outspoken (55%) when students were asked if they wanted to receive more of this type of information.

Usability test In-house Human-Computer Interaction researchers and data visualization experts were consulted to improve upon a working prototype of the dashboard. Some, but not all, of their suggestions were subsequently implemented, keeping timely delivery in mind. A new iteration of the dashboard is scheduled for the next academic year. This offers an opportunity for additional usability testing and improvement.

Actual effects At the time of design of the dashboard and study, randomized controlled trials (RCTs) were found difficult to justify in the context of first-year student support. The procedure would require the exclusion of some students from access to the information, which was deemed ethically questionable. For an interesting discussion of RCTs and alternatives in the context of distance learning, see [9].

Fig. 3. Student perceptions of usefulness, clearness and their wish to receive more of this type of information ('I find this information useful', n=170; 'I find this information clear', n=169; 'I would like to receive more of this type of information', n=172).

Student use In [3] we presented the details of when students accessed the dashboard, which devices they used to do so, and how the device type influenced behavior. The breakdown of content in tabs and expandable content (marked by A and B in Fig. 1) permitted easy registration of user actions. How students use the system in relation to their profile was briefly discussed before in [3], but is explored in detail by this paper.
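The unit-chart idea described in the Visualization paragraph (one dot per student, stacked by score) can be rendered in a few lines. The following is a minimal sketch in Python with matplotlib, not the dashboard's actual rendering code; the scores, the highlighted 'own score', and the file name are invented for illustration:

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # render off-screen; no display needed
import matplotlib.pyplot as plt

rng = np.random.default_rng(3)

# Invented LASSI-style motivation scores for 120 peers; 'own_score' marks
# the current student's position among them.
scores = rng.integers(15, 36, 120)
own_score = 22

fig, ax = plt.subplots(figsize=(8, 2))
for x in np.unique(scores):
    cnt = int(np.sum(scores == x))
    # Stack one dot per student sharing the same score.
    ax.scatter([x] * cnt, np.arange(cnt),
               color="tab:red" if x == own_score else "tab:blue", s=20)
ax.set_xlabel("Motivation score (LASSI)")
ax.set_yticks([])
fig.savefig("unit_chart.png")
```

Because every dot is a student, group sizes and overlap between distributions stay visible without requiring any statistical vocabulary.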
4 Results

The LAD was built on recent work in educational sciences on learning skills, their role in explaining learning performance in higher education, and the specifics thereof in STEM education. At the same time, the LAD is also an artifact in the "learning through the act of building" tradition of design science research [10, 12]. In this paper, we focus on reporting results obtained from data collected by the fine-grained logging facilities of the system.

Several student counselors were involved. They shared their expectations about user activity in relation to the learning skills and profile of the student. However, we did not start our analysis from strong a priori theory-driven hypotheses. Rather, we applied a more inductive approach, using exploratory data analysis to spot links, while working toward an integrated model.

We reuse the same type of plot throughout the paper. Figures 4–6 and 8 represent logistic regression information in a compact format [19]. We used the implementation for R by [5], with a few minor changes. The left-hand side vertical axis of these figures shows the predicted probability of event occurrence. The horizontal axis depicts the learning skill scores as assessed by the LASSI. Frequency histograms for each category of the dependent variable facilitate interpretation of the effect of the data points on the logistic regression curve. The right-hand side vertical axis represents the frequency count. The (red) curve shows the predicted probability that a student will exhibit a certain behavior (click through, read tips, return to tab). An upward curve suggests a positive relationship: the more students master the learning skill, the higher the predicted probability they will engage. A downward curve suggests the opposite: the less students master the learning skill, the more they are predicted to engage. We extended the plot by adding two (blue) vertical lines.
The solid line represents the average learning skill level of students who did not display the behavior in question; the dashed line represents the average for students who did.

4.1 Click through

Fig. 4 shows an upward slope for each of the five learning skills, indicating an increased probability of clicking through for students with better learning skill levels. The difference was found to be significant for each of the learning skills separately [3].

Fig. 4. Predicted probability of click-through depending on each learning skill score.

4.2 Corresponding learning skill behavior

Once students clicked through, the relationship between learning skill levels and user activity seemed to be inverted: the top row of plots in Fig. 5 shows a downward slope for each of the learning skills with regard to reading corresponding tips: the lower students' levels, the more likely they were to read the improvement advice. The difference was found to be significant for each of the learning skills separately [3]. A similar picture is sketched for returning a second time (or more) to reread the content related to a weaker learning skill. A one-tailed Mann–Whitney test was applied. For concentration (p=1e−4), motivation (p=0.0025), test strategy (p=3e−4) and time management (p=0.0021), the difference in distributions of learning skill scores for students that did return to the corresponding tabs was found to be significant at the 5% level, but not for anxiety (p=0.1546).
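For readers who want to reproduce this kind of group comparison, the test takes only a few lines. The sketch below uses Python's scipy (the analyses in the paper were done in R), and the two score samples are invented stand-ins for the real data:

```python
from scipy.stats import mannwhitneyu

# Invented LASSI-style scores: students who returned to a tab vs. those
# who did not (the real data are not reproduced here).
returned     = [14, 18, 19, 21, 22, 23, 25, 26, 28, 31]
not_returned = [19, 22, 24, 26, 27, 28, 29, 31, 33, 35]

# One-tailed test: are the scores of returning students stochastically lower?
stat, p = mannwhitneyu(returned, not_returned, alternative="less")
print(f"U = {stat}, one-tailed p = {p:.4f}")
```

The `alternative="less"` argument encodes the directional hypothesis (lower skill scores among returners), matching the one-tailed tests reported above.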
Fig. 5. Predicted probability of reading corresponding tips (top) or returning to corresponding tab (bottom) depending on each learning skill score.

4.3 Other learning skills behavior

For some learning skills, an association was found with students' behavior (reading tips, returning to tabs) for other, non-corresponding learning skills.

Reading tips. Figure 6 (top) shows an upward slope for two out of five learning skills, indicating an increased probability of opening at least one of the tips related to the other learning skills for students with higher motivation and test strategy scores. A one-tailed Mann–Whitney test was conducted; only for motivation (p=0.0017) did the distributions differ significantly between the group that accessed at least one of the non-corresponding tips and the group that did not read any of the non-corresponding tips (p=0.2719 for concentration, p=0.4994 for anxiety, p=0.0975 for test strategy, p=0.4648 for time management).

Returning to tabs. Figure 6 (bottom) shows a downward slope for each of the learning skills, suggesting an increased probability of returning to at least one of the tabs related to the other learning skills for students with lower scores for any of the learning skills. A one-tailed Mann–Whitney test resulted in support of this at the 5% level for concentration (p=0.0112), anxiety (p=0.0262), motivation (p=0.0369) and time management (p=0.0142), but not for test strategy (p=0.0848).

Fig. 6. Predicted probability of reading at least one non-corresponding tip (top row) or returning to at least one non-corresponding tab (bottom row) depending on each learning skill score.

4.4 Integrated model

Fig. 7 shows that learning skills are not uncorrelated. Some of the outcomes attributed above to one learning skill may be due to underlying effects of another, correlated learning skill. In order to further analyze students' engagement with the dashboard, we constructed several logistic regression models. We added information about study program and gender, and interrelationships between the learning skills. Our goal was not so much to look for absolute numbers, but rather to check if earlier results remain valid outside of isolation. The initial predictors for each of the models are the students' scores for each of the learning skills (CON, ANX, MOT, TST, and TMT), the study program in which they are enrolled (using 'Bio Engineers' as reference group) and gender (female=1). In order to find the best fitting predictors, a stepwise regression procedure in both directions based on the Akaike information criterion (AIC) was applied. Model fit was assessed using likelihood ratio chi-square tests. For a summary of results, see Table 1.

The clicked model aimed to predict the probability of students clicking through to the dashboard depending on their profile.
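Such a bidirectional stepwise AIC selection can be sketched as follows. This is a minimal illustration in Python with statsmodels on synthetic data, not the authors' R code; the effect sizes and the construction of the outcome variable are invented:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)
n = 500

# Synthetic stand-in for the student data: five LASSI scores plus gender.
df = pd.DataFrame({s: rng.normal(25, 5, n) for s in ["CON", "ANX", "MOT", "TST", "TMT"]})
df["gender"] = rng.integers(0, 2, n)
# By construction, click-through depends mainly on TMT here.
true_logit = -2.0 + 0.08 * df["TMT"]
df["clicked"] = (rng.random(n) < 1 / (1 + np.exp(-true_logit))).astype(int)

def aic(predictors):
    """AIC of a logistic regression of 'clicked' on the given predictors."""
    formula = "clicked ~ " + (" + ".join(predictors) if predictors else "1")
    return smf.logit(formula, data=df).fit(disp=0).aic

candidates = ["CON", "ANX", "MOT", "TST", "TMT", "gender"]
selected = []
while True:
    best_aic, best_move = aic(selected), None
    # Forward step: try adding each unused predictor ...
    for v in set(candidates) - set(selected):
        trial = aic(selected + [v])
        if trial < best_aic:
            best_aic, best_move = trial, ("add", v)
    # ... and backward step: try dropping each selected predictor.
    for v in list(selected):
        trial = aic([q for q in selected if q != v])
        if trial < best_aic:
            best_aic, best_move = trial, ("drop", v)
    if best_move is None:  # no move improves AIC: stop
        break
    action, v = best_move
    selected = selected + [v] if action == "add" else [q for q in selected if q != v]

print("Selected predictors:", selected)
```

Because AIC strictly decreases at every step, the loop is guaranteed to terminate; on this synthetic data it reliably retains the TMT score that actually drives the outcome.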
Of all students that received an invitation, 80.73% clicked through to the dashboard. Students with higher test strategy scores and students with higher time management scores seemed to have an increased probability to click through, but only for the latter did the model yield a statistically significant result. The study program the student is enrolled in also played a significant role. Gender and the other learning skill scores were not significant. (Twelve study programs were grouped into six study program groups: Bio-Engineering; CBBGG (Chemistry, Biology, Biochemistry-Biotechnology, Geography, Geology); Engineering Science; Engineering Science: Architecture; Engineering Technology; and MIP (Mathematics, Informatics, Physics).)

Fig. 7. Ellipse matrix [15] of correlation for the five learning skills, suggesting that learning skills are not entirely independent.

Table 1. User behavior and profile, coefficients for logistic regression models. Diagonally marked (blue) are coefficients for dashboard content related to the learning skill corresponding to the score. Marked vertically (light red) are the coefficients associating the motivation score to dashboard contents for the four other learning skills.
                n (no/yes)  Intercept  | CON         ANX        MOT        TST         TMT        | EngSc    EngSc-Arch  CBBGG      EngTech   MIP      | Gender (female)
Clicked         271/1135    -0.806**   | –           –          –          0.0363      0.0501*    | 0.570    0.724       -0.942***  -0.238    0.194    | –
Return to tab
  CON           792/307     0.909*     | -0.0633***  –          –          –           –          | –        –           –          –         –        | -0.387*
  ANX           804/277     0.801      | –           -0.0363**  -0.0447*   –           –          | 0.625*   0.0515      0.124      0.39      0.697*   | –
  MOT           804/265     1.08       | –           -0.0351*   -0.0512**  –           –          | 0.596*   0.396       -0.109     0.264     0.57     | -0.458*
  TST           796/273     0.937      | –           –          0.0322     -0.0693***  -0.0308    | –        –           –          –         –        | -0.365*
  TMT           890/173     -0.746     | -0.0563**   –          –          –           –          | 1.06**   0.656       0.56       0.516     1.10**   | -0.34
Tips
  CON           700/399     -0.438     | -0.0629***  –          0.0563**   –           –          | –        –           –          –         –        | –
  ANX           881/200     0.251      | –           -0.120***  0.0525*    –           –          | 0.217    -0.286      -0.162     -0.646*   -0.690*  | 0.378*
  MOT           897/172     0.607      | –           –          -0.0627**  –           –          | -0.378   -0.226      -0.504     -0.939*** -0.552   | –
  TST           896/173     -0.233     | –           –          0.0850***  -0.135***   –          | –        –           –          –         –        | 0.394*
  TMT           834/229     -2.47**    | –           –          0.0598*    0.0828***   -0.119***  | 0.0598   -0.529      -0.707*    -0.477    -0.477   | 0.324

* p ≤ 0.05; ** p ≤ 0.01; *** p ≤ 0.001

The return to tab models aimed to expose which student characteristics play a role in revisiting the tab for learning skill i, compared to visiting that tab only once or not at all. An interesting finding was that students with a lower score for a specific learning skill i tended to return more often to the corresponding tab. This finding was present for all return to tab models except for time management. A possible explanation for this could be that most students accessed the tabs from left to right, in the same alphabetical order in which they were presented. Time management happened to be discussed on the last, rightmost tab of the dashboard, so students did not need to return to the time management tab after a first skimming of the dashboard because they were already on it. Study program was a significant predictor for returning to the anxiety tab, the motivation tab, and the time management tab.
Also, male students seemed to return more often to a specific tab compared to female students. This was significant at the 5% level for returning to the concentration tab, the motivation tab, and the test strategy tab.

The tips models aimed to predict which students clicked on the 'Okay, what now?' button for each tab, which gave them practical tips to improve the corresponding learning skill. Similar to return to tab, students with a lower score for learning skill i were predicted to be more likely to click on the 'Okay, what now?' button corresponding to that learning skill. The students' study program was significant for the anxiety tips, the motivation tips, and the time management tips. Female students seemed to click more often on the tips compared to male students. This was significant at the 5% level for the anxiety and the test strategy tip clicks. Motivation played a remarkable role for all four of the other learning skills: for any given level of concentration, anxiety, test strategy, and time management, an increased level of motivation led to a higher user interest in the improvement tips.

4.5 Interaction effects

Several extended models were tested to include interaction effects of learning skills with gender, interaction of motivation with the other four learning skills, and cross-interaction of all learning skills, whether or not combined with gender interactions. Most of these models provided only limited additional information at the expense of increased complexity. Therefore, we confine ourselves here to the interesting case of the interaction between anxiety and gender. Remember that anxiety is measured on an inverted scale: the higher the anxiety score, the lower the level of anxiety. As shown by Fig. 8, anxiety exhibits an interaction effect with gender for the probability of returning to the anxiety tab. The slope is more or less flat for male students (left-hand side) and clearly downward for female students.
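Such an interaction model can be expressed compactly in formula notation. The sketch below is a minimal illustration in Python with statsmodels on synthetic data; the paper's models were fit in R, and the effect sizes here (anxiety mattering only for female students) are invented for illustration:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(11)
n = 800

# Synthetic stand-in: anxiety score (inverted scale, 10-40) and gender (female=1).
df = pd.DataFrame({
    "ANX": rng.uniform(10, 40, n),
    "female": rng.integers(0, 2, n),
})
# By construction, ANX influences returning only for female students.
true_logit = 0.3 + df["female"] * (2.0 - 0.10 * df["ANX"])
df["returned"] = (rng.random(n) < 1 / (1 + np.exp(-true_logit))).astype(int)

# 'ANX * female' expands to ANX + female + ANX:female (the interaction term).
model = smf.logit("returned ~ ANX * female", data=df).fit(disp=0)
print(model.params)
print("AIC:", model.aic)
```

In the fitted model, the `ANX` coefficient describes the slope for male students, while `ANX:female` captures how much steeper (here, more negative) the slope is for female students, mirroring the pattern in Table 2.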
Also, the difference in means as shown by the blue vertical lines is negligible for male students, while it is distinct for female students.

To illustrate the impact of the interaction effect, Table 2 provides two simplified logistic regression models: one that tries to predict the return to the anxiety tab using only the anxiety level and gender as predictors, and a second model that also includes the interaction term. The influence of the anxiety level disappears almost completely for male students in the second model and strongly increases for female students.

Table 2. Comparison of simplified logistic regression models for return to anxiety tab behavior, with and without interaction terms.

Return to tab: ANX      Intercept   ANX        gender     ANX·gender   AIC
Without interaction     -0.2459*    -0.0287*   -0.3462    –            1259.7
With interaction        -1.0514*    -0.0002    2.3896**   -0.1069***   1248.8

* p ≤ 0.05; ** p ≤ 0.01; *** p ≤ 0.001

Fig. 8. Illustration of interaction effects. The chart on the left-hand side shows the predicted probabilities for male students returning to the anxiety tab in relation to their anxiety score. The right-hand side chart depicts the same information for female students. The interaction effect of gender is clearly visible from the differences between left and right.

5 Discussion and Conclusion

This paper presented an in-depth analysis of how the learner profile, in particular the learning skill levels of STEM students in transition from secondary to higher education, affected the use of a LAD.
Regarding the question of which audience was reached by the LAD, we showed that learning skills may indeed affect the click-through (response) rate of the LAD (RQ-1): the higher the score for a particular learning skill, the more likely students were to access the dashboard. While most learning skills seemed to exhibit such an effect in isolation, only time management did so with statistical significance in a reduced model. The higher the time management skills of the student, the more likely the student was to access the dashboard. One unexplored interpretation may be that students with inadequate time management skills simply missed or forgot our invitation.

As an indicator of the ability of the dashboard to invoke self-reflection, we determined that learning skill levels influenced user activity related to content about those learning skills in our LAD (RQ-2). In most cases, with the exception of returning to the time management tab, which is possibly explained by an order effect, a lower learning skill level tended to lead to increased user activity (revisiting tabs, reading tips) concerning this particular learning skill.

Furthermore, we showed that particular learning skills affected user activity in connection to other learning skills (RQ-3). This was especially the case for motivation. We have reason to believe that motivated students engage with the dashboard more intensively because they see it as an opportunity to improve, something we would like to see further explored.

Additionally, we found indications of the influence of study program and gender. Especially the role of gender deserves to be studied more thoroughly. For example, we noted that male students were more likely to reread some of the learning skill tabs, while female students accessed the tips more frequently. Moreover, gender demonstrated an interaction effect that was especially outspoken in relation to anxiety and returning to the anxiety tab.
Our results point in the direction of students using the LAD for self-reflection and to gather actionable insight. However, while our study targeted a relatively high number of students (N=1,406) in comparison to most LAD-related work, we did not include a formal assessment of the effective impact on learning. Our work suggests that the design of LADs may be improved by understanding how students interact with them and how this interaction pattern differs depending on the profile of the student. Future work may include a more fine-grained tracking of how students use the LAD, for instance to learn whether students spend more time reading texts or are predominantly looking at the data visualizations. For this study, only actions within the LAD were tracked. To probe into the actionability of the information provided, data collection could be extended to some of the actions suggested by the LAD: increased LMS activity, electronic scheduling of counseling meetings, or registration for a workshop to improve a learning skill. The analysis may also be extended to include study achievement following dashboard usage, for instance to determine whether non-use of the LAD is indicative of students at risk of failure.

Acknowledgement. This research is co-funded by the Erasmus+ program of the European Union (562167-EPP-1-2015-1-BE-EPPKA3-PI-FORWARD).

References

1. Cumulative study efficiency – KU Leuven. http://www.kuleuven.be/english/education/student/studyprogress/cse, [Online; accessed 24-April-2017]
2. Bodily, R., Verbert, K.: Trends and issues in student-facing learning analytics reporting systems research. In: Proceedings of the Seventh International Learning Analytics & Knowledge Conference. pp. 309–318. ACM (2017)
3. Broos, T., Peeters, L., Verbert, K., Van Soom, C., Langie, G., De Laet, T.: Dashboard for actionable feedback on learning skills: Scalability and usefulness (accepted for publication).
In: International Conference on Learning and Collaboration Technologies. Springer (2017)
4. Crick, R.D., Broadfoot, P., Claxton, G.: Developing an effective lifelong learning inventory: The ELLI project. Assessment in Education: Principles, Policy & Practice 11(3), 247–272 (2004)
5. de la Cruz Rot, M.: Improving the presentation of results of logistic regression with R. Bulletin of the Ecological Society of America 86(1), 41–48 (2005), http://www.jstor.org/stable/bullecosociamer.86.1.41
6. Duval, E.: Learning analytics and educational data mining (online only). Retrieved from https://erikduval.wordpress.com/2012/01/30/learning-analytics-and-educational-datamining (2012)
7. Festinger, L.: A theory of social comparison processes. Human Relations 7(2), 117–140 (1954)
8. Gašević, D., Kovanović, V., Joksimović, S.: Piecing the learning analytics puzzle: A consolidated model of a field of research and practice (2017)
9. Herodotou, C., Heiser, S., Rienties, B.: Implementing randomised control trials in open and distance learning: a feasibility study. Open Learning: The Journal of Open, Distance and e-Learning pp. 1–16 (2017)
10. Hevner, A.R.: A three cycle view of design science research. Scandinavian Journal of Information Systems 19(2), 4 (2007)
11. H&H Publishing: LASSI, Dutch version. © H&H Publishing Company, Inc., 1231 Kapp Drive, Clearwater, Florida 33765. Authors: Weinstein, Claire Ellen (1987-2002-2016); Dutch version: Lacante, Lens, Briers (1999) (2017), http://www.hhpublishing.com/_assessments/lassi
12. Kuechler, B., Vaishnavi, V.: On theory development in design science research: anatomy of a research project. European Journal of Information Systems 17(5), 489–504 (2008)
13. Leitner, P., Khalil, M., Ebner, M.: Learning analytics in higher education—a literature review. In: Learning Analytics: Fundaments, Applications, and Trends, pp. 1–23. Springer (2017)
14. Long, P.D., Siemens, G., Conole, G., Gašević, D.: Proceedings of the 1st International Conference on Learning Analytics and Knowledge. ACM (2011)
15. Murdoch, D., Chow, E.: A graphical display of large correlation matrices. The American Statistician 50(2), 178–180 (1996)
16. Pinxten, M.: At-risk at the gate: Prediction of study success of first-year science and engineering students in an open-admission university in Flanders. Any incremental validity of study strategies? (submitted for publication)
17. Schwendimann, B.A., Rodríguez-Triana, M.J., Vozniuk, A., Prieto, L.P., Boroujeni, M.S., Holzer, A., Gillet, D., Dillenbourg, P.: Understanding learning at a glance: An overview of learning dashboard studies. In: Proceedings of the Sixth International Conference on Learning Analytics & Knowledge. pp. 532–533. ACM (2016)
18. Shum, S.B., Crick, R.D.: Learning dispositions and transferable competencies: pedagogy, modelling and learning analytics. In: Proceedings of the 2nd International Conference on Learning Analytics and Knowledge. pp. 92–101. ACM (2012)
19. Smart, J., Sutherland, W.J., Watkinson, A.R., Gill, J.A.: A new means of presenting the results of logistic regression. The Bulletin of the Ecological Society of America 85(3), 100–102 (2004)
20. Tempelaar, D.T., Rienties, B., Nguyen, Q.: Towards actionable learning analytics using dispositions. IEEE Transactions on Learning Technologies 10(1), 6–16 (2017)
21. Verbert, K., Duval, E., Klerkx, J., Govaerts, S., Santos, J.L.: Learning analytics dashboard applications. American Behavioral Scientist 57(10), 1500–1509 (2013)
22. Verbert, K., Govaerts, S., Duval, E., Santos, J.L., Van Assche, F., Parra, G., Klerkx, J.: Learning dashboards: an overview and future research opportunities. Personal and Ubiquitous Computing 18(6), 1499–1514 (2014)
23. Weinstein, C.E., Zimmerman, S., Palmer, D.: Assessing learning strategies: The design and development of the LASSI. Learning and study strategies: Issues in assessment, instruction, and evaluation pp. 25–40 (1988)