Improving student inclusion through learning
analytics: a Step-Wise approach
Dr Hildo Bijl1
1
    Utrecht University of Applied Sciences, Padualaan 99, Utrecht, The Netherlands


                                         Abstract
                                         At the Utrecht University of Applied Sciences, students from a non-standard background often have
                                         deficiencies in their mathematics, physics and engineering skills. To improve inclusiveness, a practice
                                         support app called Step-Wise has been set up that automatically detects these deficiencies and advises
                                         students on which skills to practice. Through student interviews structured using CIMO logic, it has
                                         been established that the app mainly improves the effectiveness of practice, although it also has a
                                         beneficial effect on the number of exercises practiced and on student motivation.

                                         Keywords
                                         Inclusion, Learning analytics, Bayesian user modeling, Engineering education




1. Introduction
At the Utrecht University of Applied Sciences, the Mechanical Engineering programme receives
students from a variety of backgrounds. A majority of students come fresh out of high school,
but another significant group comes from an apprenticeship or labour background. These
students have a large amount of practical experience, but often lack certain fundamental skills
in mathematics, physics and/or engineering mechanics. This results in a relatively large number
of drop-outs, specifically from this group of students.
   The usual way to combat this high drop-out rate is through more supervision and guidance.
Together with each student, the teachers identify the skill deficiencies and make a plan to tackle
them. This is a very time-consuming process, and due to the limited time available to the staff,
the effects are limited. As a result, interest arose in automating part of this process through
learning analytics. An extensive experiment with a brand-new Smart Learning Environment
(SLE) called Step-Wise has been set up. The goal of this paper is to present the set-up of this
learning environment and to discuss the results of the experiment.
   In the literature on the sociology of education, the "socioeconomic achievement gap" – the dispar-
ity in academic achievement between students from high- and low-socioeconomic backgrounds
– is well-known [1]. It is present and increasing, even in The Netherlands. The best way to
combat this is equal access to educational opportunities [2], but a difference in initial skills can


LAS4SLE @ EC-TEL 2021: Learning Analytics for Smart Learning Environments, September 21, 2021, Bolzano, Italy
" hildo.bijl@hu.nl (H. Bijl)
~ https://www.hildobijl.com/ (H. Bijl)
 0000-0002-7021-7120 (H. Bijl)
                                       © 2021 Copyright for this paper by its authors. Use permitted under Creative Commons License Attribution 4.0 International (CC BY 4.0).
    CEUR Workshop Proceedings (CEUR-WS.org), ISSN 1613-0073, http://ceur-ws.org



Hildo Bijl CEUR Workshop Proceedings                                                          1–9


prevent these equal opportunities: if a student does not have the prior knowledge and skills
that are expected, education will inevitably be less effective.
   A variety of methods to combat this have been discussed in the literature. There are tools to
support general study planning [3], but these have not been proven effective. Other tools focus
on deciding when teachers need to perform an intervention [4]. These methods have shown
more promise, but in the absence of sufficient teacher availability, it would help if the
intervention itself were also automated. It is hard to accurately judge the effectiveness of such
automatic programs, mainly due to the lack of large-scale long-term studies and the difficulty
of objective evaluation [5, 6, 7]. However, initial studies have shown promise, leading to the
experiment described in this paper.
   When setting up an SLE, there are various important concepts to keep in mind. The main goal
is to use differentiation between students to improve learning outcomes. This differentiation has
shown promise, but also complexities [8, 9, 10]. Direct feedback has shown positive effects [11],
but so have the principles of Just-In-Time Teaching [12], High-Impact Learning [13] and 4C/ID
[14]. In addition, principles of gamification have also been shown to contribute to student
motivation [15]. All these ideas have been taken together and implemented in a new SLE: Step-Wise.
   This paper is set up as follows. Section 2 discusses the set-up of the Step-Wise practice
platform, including its didactic background. Section 3 discusses the learning analytics side of
the platform, detailing the functioning of the algorithms. Afterwards, Section 4 explains the
application of the platform in various courses and the lessons learned from it. The paper closes
with conclusions and recommendations in Section 5.


2. The practice system: physics engines and direct feedback
The Step-Wise practice platform was designed with three main goals in mind.

   1. Provide students with practice experiences that are as close as possible to what they need
      to do in their final assessments.
   2. Support students during practice, giving feedback on work and guidance/support when
      they get stuck.
   3. Coach students in their general learning process, detecting deficiencies and helping them
      tackle them.

   Goal 1 is the least innovative. To meet it, the Step-Wise platform has input fields that
allow the easy entering of numbers with units, shown in Figure 1. Behind these input fields is an
intelligent physics engine that checks the numbers, ensuring that different ways of writing the
right answer are all considered correct. The input fields also provide customized feedback
to the student based on the provided answer, in line with goal 2. (Again, see Figure 1.) In addition,
the app also supports interactive plots and diagrams. Just like the regular input fields, these
interactive diagrams provide feedback on the given solutions as well.
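   The paper does not detail the engine's internals, but the core check can be pictured as unit normalization followed by a tolerance comparison. The following Python sketch is purely illustrative: the conversion table, tolerance and function names are assumptions, not the actual Step-Wise physics engine.

```python
# A minimal sketch of a unit-aware answer check. The real Step-Wise
# engine is more sophisticated; the conversion table and tolerance
# here are illustrative assumptions.

SI_FACTORS = {
    "Pa": 1.0, "kPa": 1e3, "bar": 1e5,   # pressure    -> Pa
    "m^3": 1.0, "L": 1e-3,               # volume      -> m^3
    "K": 1.0,                            # temperature (absolute)
}

def to_si(value: float, unit: str) -> float:
    """Normalize a number-with-unit to its SI base value."""
    return value * SI_FACTORS[unit]

def is_correct(answer, solution, rel_tol=0.01):
    """Accept any equivalent way of writing the right answer,
    e.g. 200 kPa versus 2 bar, within a relative tolerance."""
    a = to_si(*answer)
    s = to_si(*solution)
    return abs(a - s) <= rel_tol * abs(s)

print(is_correct((200.0, "kPa"), (2.0, "bar")))   # True: same pressure
print(is_correct((1.0, "bar"), (2.0, "bar")))     # False
```

   In this set-up, adding support for a new quantity only requires extending the conversion table, which is presumably why such an engine scales well across physics courses.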
   However, direct feedback on student work is not the only way goal 2 is achieved. The
main idea behind Step-Wise is that each exercise consists of steps. If a student understands
the exercise, they can immediately enter the final answer. If they do not, they can
request to solve the exercise Step-Wise. In this case, they are presented with the steps needed to







Figure 1: Input fields have support for entering units. These are checked by a smart physics engine.
Submissions automatically get customized feedback. (The platform's interface is currently in Dutch.)




Figure 2: An example of an exercise. Step 1 asks the student to apply the gas law, step 2 asks them to
recognize the process type, and step 3 asks how they would want to continue: calculating pressure or
temperature. The corresponding input field then appears, including later on the corresponding solution.


solve the exercise, one by one, and subsequently get feedback on those steps as well. Some
exercises can be solved in multiple ways. In that case the student can indicate
which method they will apply – see Figure 2 – and the rest of the exercise adapts.
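   As an illustration of this structure, an exercise can be modeled as a final question plus a list of steps, each coupled to a skill, that is only revealed when the student asks for it. This Python sketch uses hypothetical names (`Step`, `Exercise`, the skill identifiers) that are not the actual Step-Wise schema.

```python
# Sketch of an exercise split into steps, each coupled to a skill.
# Class and field names are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Step:
    skill: str    # the skill this step is coupled to
    prompt: str

@dataclass
class Exercise:
    final_prompt: str
    steps: list            # shown one by one only on request
    expanded: bool = False

    def request_steps(self):
        """Student got stuck: switch to step-wise solving."""
        self.expanded = True
        return self.steps

ex = Exercise(
    final_prompt="Find the final pressure of the gas.",
    steps=[Step("gasLaw", "Apply the gas law to the initial state."),
           Step("processType", "Recognize the process type."),
           Step("gasLaw", "Compute the final pressure.")],
)
assert not ex.expanded          # student first tries the final answer
steps = ex.request_steps()      # only now are the steps revealed
print([s.skill for s in steps]) # ['gasLaw', 'processType', 'gasLaw']
```

   Because each step carries a skill identifier, every submission can feed the learning analytics described in the next section.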
   Experience has shown that this Step-Wise approach is strongly appreciated by
students. However, it is not what makes Step-Wise innovative. That is the learning analytics
system behind the exercises.


3. Learning analytics: skill trees and Bayesian modeling
In Step-Wise, each course is divided into a large number of skills. Typically, every 15-20
minutes of lecture time represents one skill. These skills are then hierarchically linked in a large
skill tree. A small example is shown in Figure 3.







Figure 3: A small and simplified part of the course skill tree. This roughly corresponds to two lectures.


   The innovative part of Step-Wise is that every step of every exercise is coupled to a skill
from the respective skill tree. Whenever a student submits anything, their success or failure at
the respective step is immediately registered. Of course, a single success does not
directly imply mastery. Instead, the system uses a specially designed Bayesian user modeling
algorithm to keep track of the chance that the student will perform a given skill correctly the
next time. Unlike Bayesian Knowledge Tracing, this algorithm does not take into account
the (virtually non-existent) chance of correct guesses, but instead models how the probability
of success changes throughout the learning process. The result is an estimate, for every skill
at every point in time, of the future success rate. This idea is visualized in Figure 4; a
thorough discussion of the mathematics behind the algorithm is available in [16].
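   The full algorithm is described in [16]. As a rough illustration of the underlying idea only (not the actual algorithm), one could maintain a Beta distribution over each skill's success probability and fade old evidence, so the estimate can follow a student whose level rises as they learn.

```python
# Illustrative sketch of Bayesian success-rate tracking. The actual
# algorithm in [16] differs; this only conveys the idea. The skill
# level is a Beta(a, b) distribution over the probability of success.

class SkillEstimate:
    def __init__(self):
        self.a, self.b = 1.0, 1.0   # uniform prior: no data yet

    def observe(self, success: bool, decay: float = 0.9):
        # Fade old evidence first, so the estimate can track a
        # student whose level changes throughout the learning process.
        self.a = 1.0 + decay * (self.a - 1.0)
        self.b = 1.0 + decay * (self.b - 1.0)
        if success:
            self.a += 1.0
        else:
            self.b += 1.0

    @property
    def success_rate(self) -> float:
        """Expected probability of performing the skill correctly next time."""
        return self.a / (self.a + self.b)

skill = SkillEstimate()
for outcome in [False, False, True, True, True]:
    skill.observe(outcome)
print(round(skill.success_rate, 2))   # prints 0.61
```

   The decay step is what distinguishes this from a plain Beta-Bernoulli update: recent attempts weigh more than old ones, mirroring the paper's point that the success probability itself changes while learning.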
   Through this Bayesian user modeling, the Step-Wise platform always has up-to-date informa-
tion on the level of each student for each skill. This allows the platform to advise the student on
which skill to practice next. Whenever a student is practicing a skill without having mastered
all the prerequisites, they are recommended to practice those prerequisites first. Or, if the student
is practicing a skill that they have already mastered, they are recommended to practice the
first follow-up skill that is not mastered yet. In practice this means that, when a student is
practicing an exercise and consistently fails at a certain subskill, they get a pop-up: "You are
recommended to practice [this subskill] first." Do keep in mind that Step-Wise is meant as a
practice support platform: students are always free to ignore the advice and practice whatever
they deem appropriate for them at the given time.
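   The prerequisite half of this recommendation rule amounts to a simple walk over the skill tree. The following Python sketch is a simplified illustration; the prerequisite table, the mastery threshold of 0.8 and the skill names are invented for the example, not taken from the platform.

```python
# Sketch of prerequisite-based skill recommendation. The tree,
# threshold and skill names are illustrative assumptions.

PREREQUISITES = {                 # skill -> required subskills
    "solveGasLaw": ["gasLaw", "solveEquation"],
    "gasLaw": [],
    "solveEquation": [],
}
MASTERY_THRESHOLD = 0.8           # success rate counted as mastery

def recommend(skill, success_rates):
    """Advise which skill to practice: an unmastered prerequisite
    first; otherwise the requested skill itself."""
    for prereq in PREREQUISITES.get(skill, []):
        if success_rates.get(prereq, 0.0) < MASTERY_THRESHOLD:
            return prereq         # "practice this subskill first"
    return skill

rates = {"gasLaw": 0.9, "solveEquation": 0.4}
print(recommend("solveGasLaw", rates))   # prints solveEquation
```

   The "already mastered, move to a follow-up skill" case described above would be the mirror image: walk up the tree instead of down. Crucially, as in the platform itself, the result is only advice; nothing stops the student from practicing the requested skill anyway.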
   The Step-Wise platform does not stop at skill recommendations. Whenever a student
practices a skill, Step-Wise also selects the optimal practice exercise. Every skill has a large
number of exercises coupled to it. However, some exercises have more steps or more complex
steps than others, and as a result are more complicated. Whenever a student wants to practice a
certain skill, Step-Wise calculates the chance that the student will solve each exercise correctly. It
then filters out exercises with a very high estimated success rate (too easy) and with a very low
estimated success rate (too difficult). This ensures that a student always receives exercises at
their own level.
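   This selection step can be sketched as a filter on the per-exercise success estimates. The thresholds and fallback rule below are illustrative assumptions, not the platform's actual values.

```python
# Sketch of exercise selection: drop exercises whose estimated success
# rate is too high (too easy) or too low (too difficult), then pick
# randomly among the rest. Thresholds are illustrative assumptions.
import random

def select_exercise(estimated_rates, low=0.3, high=0.9, rng=random):
    """estimated_rates: {exercise_id: estimated success probability}."""
    suitable = [e for e, p in estimated_rates.items() if low <= p <= high]
    if not suitable:   # fallback: take the closest match to a target rate
        suitable = [min(estimated_rates,
                        key=lambda e: abs(estimated_rates[e] - 0.6))]
    return rng.choice(suitable)

rates = {"ex1": 0.95, "ex2": 0.60, "ex3": 0.15, "ex4": 0.45}
print(select_exercise(rates))   # either ex2 or ex4
```

   With only a handful of exercises per skill, such a filter can repeatedly pick the same exercise; this matches the monotony students reported in Section 4, which should diminish as more exercises are added.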







Figure 4: A simplified visualization of how the machine learning algorithm tracks a student’s level.


  There are various other ways in which Step-Wise uses the available user data, but it is beyond
the scope of this paper to elaborate on them. Further details can be found in the Step-Wise
explainer [17].


4. Application to education
The Step-Wise platform has so far been applied to three courses.

    • Alpha test: a second-year thermodynamics course for 60 full-time students. The goal was
      to filter out bugs.
    • Beta test: a second-year part-time thermodynamics course for 20 part-time students. The
      goal was to improve the user experience.
    • Main test: a first-year thermodynamics course for 80 full-time students. The goal was to
      gauge the effectiveness of the system at improving students' learning experience.

   A numerical analysis of the effectiveness of the platform could not be performed. Due to the
limited number of students and for ethical reasons, it was not possible to split the student
body into a part with access to the app and a part without. In addition, due to the coronavirus
crisis, a comparison of this group with last year's students would be inappropriate.
   Instead, the evaluation was done through CIMO logic [18]: in a certain Context, check if
a given Intervention activates a Mechanism that results in an adjusted Outcome. Because the
app has a variety of functionalities, this analysis is done per feature. What was the mechanism
activated by each functionality? And what were the resulting outcomes? To answer these
questions, a dozen interviews were held with volunteering students, focusing on how the
app changed their behavior, their understanding and their motivation.





    • Automatic checking of exercises: Nearly all students noted,1 "The app is very strict!
      One small mistake and the exercise is incorrect." Students indicated that this decreased
      their motivation, but at the same time did force them to be more thorough in their work.
      Since performing fault-free calculations is an important part of engineering education, this
      is a useful learning outcome. One student even mentioned, "This is the first university
      physics course I passed, ever!2 And it's because the app always pointed out my mistakes."
    • Automatic feedback: Students indicated that most of the time the automatic feedback
      gave them some hint on why their answer was wrong. This allowed them to try and find
      their mistake. It encouraged them to give the exercise another try.
    • Interactive input fields and diagrams: A small majority of students spontaneously
      noted the ease of use of the app, and specifically the method of solution input. They
      said that other apps they had used in the past (Canvas, MapleTA) do not allow for the
      intuitive use of units, but this app did. It made practice more like real life. This increased
      the effectiveness of practice and the motivation to practice.
    • Step-Wise approach to exercises: The most appreciated feature of the app was the
      Step-Wise approach. Students mentioned that splitting an exercise up into steps and
      checking each step one by one provided them with a much-desired structure. And by
      seeing this consistent solution method behind exercises, students were subsequently
      motivated to practice more.
    • Skill tracking: When practicing a skill, students continuously see a small globe in the
      top right of the app, which fills up when they do well. This skill globe was a continuous
      source of both desire and frustration to many students. Seeing it fill up encouraged them
      to practice more and fill it up further, but seeing it empty a bit on every tiny mistake
      also occasionally resulted in bouts of anger. Nothing conclusive can hence be said about
      the effect on students' motivation, but it certainly resulted in an increased amount of
      practice.
    • Skill recommendations: Various students indicated that the course overview (a clear
      image of which course skills they had mastered) helped show them what they still
      needed to practice, improving the effectiveness of practice. Students also, almost without
      exception, indicated that they initially followed the recommendations of the app. However,
      as they grew more accustomed to using the app, they did start to deviate from the
      recommendations. Especially after the app "Completely destroyed my score after one
      tiny pointless mistake" students were tempted to continue with the next skill, having
      decided for themselves they had obtained sufficient mastery.
    • Exercise selection: None of the students were aware of any intelligent exercise selection
      script that gave them easier exercises at the start and harder exercises as they improved.
      Students did notice, however, that the app was quite monotonous in providing them
      with exercises. They often received the same exercise, albeit with different randomly
      generated numbers, twice in a row. (This is an inconvenient side effect of the exercise
      selection strategy, which should diminish or disappear once more exercises are added to the
    1
      All quotes mentioned are student quotes. Quotes were originally in Dutch and have been translated.
    2
      The Step-Wise trial was run on the fourth physics course the students had. The student involved came from
a labour school, anecdotally showing that the app can be useful for students with an alternative background.





Table 1
Overview of the estimated effects of various app features on the desired outcomes, based on student
interviews. A plus indicates a positive estimated effect, a minus a negative one, and an "o" a neutral or
non-existent effect.
 Functionality                             Amount practiced      Effectiveness of practice    Motivation
 Automatic checking of exercises                    o                        +                     −
 Automatic feedback                                 +                        +                     o
 Interactive input fields and diagrams              o                        +                     +
 Step-Wise approach to exercises                    +                       ++                     +
 Skill tracking                                    ++                        o                     o
 Skill recommendations                              +                        +                     +
 Exercise selection                                 −                        +                     −


       app.) As a result, student motivation dropped and a few students indicated they practiced
       slightly less as a result.

   If we summarize the outcomes along the three categories "Amount practiced", "Effectiveness
of practice" and "Motivation", we arrive at Table 1. The results match the main student
sentiment about the app: "The app definitely made practice more useful, and encouraged me to
work harder. It even made it a bit more fun."


5. Conclusions and recommendations
Overall, it can be concluded that the Step-Wise app helped students practice and improved
learning outcomes, especially for students from an alternative background. This is mainly
due to an increased quality of practice. It must be noted that some of these effects could
also be obtained without any app. After all, it was especially the Step-Wise approach of the app
that was appreciated, and it is also possible to write a PDF solution manual in a similarly
structured fashion. Nevertheless, the interactive elements of the app also had demonstrated
positive effects, showing that an app can have added value. Combining the interactive elements
and the Step-Wise approach in a single app naturally combines the best of both worlds.
   The app did have a few interesting downsides. A main philosophy of Step-Wise is that all
students will eventually master all skills. However, there are many students at university who
only want to loosely master three quarters of the course and then get a bare passing grade.
Especially those students struggled with the strict exercise-checking algorithms of the app. The
app is hence not completely inclusive to students who, as one student said, "just want to wing it."
It is not yet clear whether it is better to adjust Step-Wise to this student culture, or to adjust
the student culture to the philosophy behind the app.
   Another downside of every student constantly getting personalized, randomly generated
exercises is that it prevents students from working together. This problem is inherent to
randomly generated exercises, as was also experienced in [19]. A follow-up project could focus
on expanding the app to include a collaboration mode.






Acknowledgments
This project was funded by the Dutch Ministry of Education, Culture and Science (OCW) through Comenius grant 405.20865.254.


References
 [1] A. K. Chmielewski, The global increase in the socioeconomic achievement gap, 1964
     to 2015, American Sociological Review 84 (2019) 517–544. URL: https://doi.org/10.1177/
     0003122419847165. doi:10.1177/0003122419847165.
 [2] W. Crenna-Jennings, Key drivers of the disadvantage gap: Literature Review, Education in
     England: Annual Report 2018, Education Policy Institute, 2018.
 [3] T. De Laet, M. Millecamp, M. Ortiz-Rojas, A. Jimenez, R. Maya, K. Verbert, Adoption and
     impact of a learning analytics dashboard supporting the advisor – student dialogue in a
     higher education institute in Latin America, British Journal of Educational Technology 51
     (2020) 1002–1018. URL: https://doi.org/10.1111/bjet.12962. doi:10.1111/bjet.12962.
 [4] M. Hlosta, C. Herodotou, V. Bayer, M. Fernandez, Impact of predictive learning analytics
     on course awarding gap of disadvantaged students in STEM, in: I. Roll, D. McNamara,
     S. Sosnovsky, R. Luckin, V. Dimitrova (Eds.), Artificial Intelligence in Education, Springer
     International Publishing, Cham, 2021, pp. 190–195.
 [5] J. M. Spector, Conceptualizing the emerging field of smart learning environments, Smart
     Learning Environments 1 (2014) 1–10. URL: https://doi.org/10.1186/s40561-014-0002-7.
     doi:10.1186/s40561-014-0002-7.
 [6] C. Herodotou, B. Rienties, M. Hlosta, A. Boroowa, C. Mangafa, Z. Zdrahal, The scalable
     implementation of predictive learning analytics at a distance learning university: Insights
     from a longitudinal case study, The Internet and Higher Education 45 (2020). URL: https://
     www.sciencedirect.com/science/article/pii/S1096751620300014. doi:10.1016/j.iheduc.
     2020.100725.
 [7] Z. Papamitsiou, A. Economides, Learning Analytics for Smart Learning Environments: A
     Meta-Analysis of Empirical Research Results from 2009 to 2015, 2016, pp. 1–23. doi:10.
     1007/978-3-319-17727-4_15-1.
 [8] K. Scalise, Differentiated e-learning: Five approaches through instructional technology,
     Int. J. Learn. Technol. 3 (2007) 169–182.
 [9] J. Laskaris, 5 tips for applying differentiated instruction in eLearning, 2015. URL: https:
     //www.talentlms.com/blog/5-tips-for-applying-differentiated-instruction-in-elearning/.
[10] C. Pappas, Differentiated instruction in elearning: What elearning professionals
     should know, 2015. URL: https://elearningindustry.com/
     differentiated-instruction-in-elearning-what-elearning-professionals-should-know.
[11] J. Laskaris, The elearning feedback power: Personal, specific, and timely, 2016. URL:
     https://www.talentlms.com/blog/elearning-feedback-power/.
[12] G. Novak, E. Patterson, A. Gavrin, W. Christian, Just-in-Time Teaching: Blending active
     learning and web technology, Prentice Hall, Saddle River, NJ, 1999.
[13] F. Dochy, I. Berghmans, A. Koenen, M. Segers, Bouwstenen voor High Impact Learning,
     Boom uitgevers, Utrecht, 2015.






[14] J. J. G. van Merriënboer, P. A. Kirschner, 4C/ID in the Context of Instructional Design and
     the Learning Sciences, Routledge, 2018, pp. 169–179. URL: https://www.routledgehandbooks.
     com/doi/10.4324/9781315617572-17. doi:10.4324/9781315617572-17.
[15] E. McLaughlin, 6 reasons why gamification enhances the learning experience, 2017.
     URL: https://elearningindustry.com/
     gamification-enhances-the-learning-experience-6-reasons-why.
[16] H. Bijl, Automatic skill tracking, 2021. URL: https://github.com/HildoBijl/stepwise/blob/
     master/frontend/public/SkillTracking.pdf.
[17] H. Bijl, Step-Wise: an interactive practice platform, 2021. URL: https://step-wise.com/
     Explainer-EN.pdf.
[18] J. Holmström, T. Tuunanen, J. Kauremaa, Logic for design science research theory ac-
     cumulation, in: Proceedings of the Annual Hawaii International Conference on System
     Sciences, 2014, pp. 3697–3706. doi:10.1109/HICSS.2014.460.
[19] M. Meijers, P. Verkoeijen, The relationship between ICT based formative assessment and
     academic achievement in a Mechanics of Materials course, in: Proceedings of the SEFI 47th
     Annual Conference, 2019, pp. 1753–1762. URL: https://www.sefi.be/wp-content/uploads/
     2019/10/SEFI2019_Proceedings.pdf.



