<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta>
      <journal-title-group>
        <journal-title>Kyoto, Japan, March</journal-title>
      </journal-title-group>
    </journal-meta>
    <article-meta>
      <title-group>
        <article-title>A Learning Analytics Dashboard to Investigate the Influence of Interaction in a VR Learning Application⋆</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Birte Heinemann</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Sergej Görzen</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Ana Dragoljić</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Lars Florian Meiendresch</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Marc Troll</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Ulrik Schroeder</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <aff id="aff0">
          <label>0</label>
          <institution>RWTH Aachen University</institution>
          ,
          <addr-line>Ahornstraße 55, 52074 Aachen</addr-line>
          ,
          <country country="DE">Germany</country>
        </aff>
      </contrib-group>
      <pub-date>
        <year>2024</year>
      </pub-date>
      <volume>1</volume>
      <fpage>8</fpage>
      <lpage>22</lpage>
      <abstract>
        <p>Learning in Virtual Reality ofers various ways to make the learning process interactive, but the implementation of such features is complex, time-consuming and expensive. In order to evaluate the eficiency of interactive tasks, a Learning Analytics dashboard, presented in this paper, was created for both teachers/educators and content creators. The dashboard presents data from a study with diferent interactive/immersive and non-interactive/non-immersive variants of a learning application for the rendering pipeline, a showcase topic from computer graphics. The dashboard has been implemented with transferability in mind by using xAPI as a data format and can thus be easily transferred to other contexts.</p>
      </abstract>
      <kwd-group>
        <kwd>eol&gt;Virtual Reality</kwd>
        <kwd>Learning Analytics</kwd>
        <kwd>Dashboard</kwd>
        <kwd>Multi-modal Learning Analytics</kwd>
        <kwd>Rendering Pipeline</kwd>
      </kwd-group>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>1. Introduction and Background</title>
      <p>
        In recent years, the integration of Virtual Reality (VR) into educational settings has gained
more and more interest [
        <xref ref-type="bibr" rid="ref1">1</xref>
        ]. This surge in popularity is not without reason; on one hand, the
technology gets afordable. On the other hand, the immersive and interactive nature of VR
provides a unique learning experience. As educators and content creators explore the growing
possibilities of VR in education, the need for assessment, design guidelines, best practices, and
efective tools to assess and optimize these experiences becomes increasingly important, e.g.
Ansone et al. discussing the need for a pedagogic framework for Usability in VR [
        <xref ref-type="bibr" rid="ref2">2</xref>
        ].
      </p>
      <p>With Learning Analytics, we can go beyond the goal of providing learners with feedback
on their performance using superficial descriptive analytics, a gap identified by Susnjak et al.
[3]. Rather it is possible to give teachers and content creators interesting insights into the
behavior of learners by visualizing learning traces in a Learning Analytics dashboard (LAD).
This way, we will enable the creation of content that can be adapted to the actual needs of
learners and teachers, create actionable insights for educators and gain fundamental knowledge
about learning (behaviour) in VR environments, which should also be generalizable to other
use cases.</p>
      <p>The assumption that interactive tasks enhance learning outcomes is not new (see [4] and [5]),
but we still investigate the design of learning environments. Interactive learning is the basis
for many VR educational applications, e.g. to develop spatial abilities [6]. However, the gap
persists in the uncertainty surrounding which interactive tasks are most efective in conveying
educational content. Addressing this question, we introduce a LAD designed explicitly for
educators and content creators (persons creating educational material without teaching them
directly). This dashboard serves as a tool for investigating the impact of interactivity in VR
learning applications.</p>
      <p>While the primary focus is on understanding the efects of interactivity, the learning
environment, the data and the study also embark on a nuanced exploration by collecting data in both
immersive VR and a desktop VR version without the goal of achieving a media comparison,
see [7]. The comparative analysis within the LAD facilitates a comprehensive examination
of the two variants, presenting usability issues but also the nuances and similarities between
immersive and desktop VR. The dashboard allows for the testing of interactive scenarios of
varying complexity, as the application is engineered to be modular.</p>
      <p>In essence, this paper presents a LAD to investigate the usage of VR in education. Primary
stakeholders are teachers and content creators who are interested in the relationship between
interactivity and learning outcomes. The insights gained from this study contribute not only
to the refinement of existing VR educational practices but also pave the way for innovative,
informed, and economical advancements in future applications.</p>
    </sec>
    <sec id="sec-2">
      <title>2. Related Work</title>
      <p>Related work to this can be examined from diferent angles, primarily from the perspective of
Learning Analytics in VR, but also from the perspective of interactive learning and dashboard
design. Related work in the field of computer graphics education is summarized in Heinemann
et al. [8]. To summarize the preliminary work, it can be said that very little use has been made
of VR as a technology in teaching, but many approaches in computer graphics are interactive
and practical.</p>
      <p>Multimodal Learning Analytics and VR Learning Analytics in Virtual Reality can be based
on various data sources. The technology itself can be used to collect information beyond "simple
log data" (often referred to as clickstreams in traditional learning environments), for example,
movement data. This means that VR learning applications can, in contrast to classic setups,
be evaluated without external sensors using multimodal Learning Analytics (MMLA), see [9].
Oucaichi et al. state that VR is an ideal space to capture learners’ behaviour and body movement,
but they question whether VR is an ideal learning space [10]. Besides this, the general analysis
(as well as the collection, pre-processing, annotation and interpretation) of multimodal data
remains a challenge [11]. According to Oucaichi et al. [10], VR is one of the still rather small
emerging technologies considered in MMLA research. This work provides approaches to answer
the gap regarding the success of VR as a learning medium [10].</p>
      <p>Learning Analytics Dashboards There are many studies on dashboards in general, as
they are an important tool for involving people in the decision-making process for learning
activity and data analysis [12]. Initially, visualizations and predictions were the focus of
research. Currently, a research focus is to extend the dashboards to include processes that
involve stakeholders and, for example, multimodal data [12]. In addition to current research
topics, gaps identified in research on LADs, for example, include the lack of integration of JEDI
concepts (justice, equity, diversity, and inclusion) [13]. Including JEDI concepts comes with
its own challenges and opportunities; we’ll discuss mainly the theme of software development
resources.</p>
      <p>Learning Analytics Data Format One of the possible data formats specialized for tracking
learner activities is the eXperience API (xAPI) specification. To briefly introduce xAPI here, the
reason for using this specification is the following: we aim to leverage the eXperience API (xAPI)
specification across our work in various forms and flavours of Learning Analytics. Through
the use of xAPI statements, user activities are articulated using the syntax: an actor performs
a specific verb on an object/activity. The incorporation of xAPI extensions helps to provide
additional information about the learning scenario (termed context extension), details regarding
the target activity (or object) (activity extension), or additional information concerning task
progress (result extensions) [14]. However, xAPI has a degree of freedom on how to use it and
isolated solutions are often created. To achieve a more consistent usage standard, it is possible
to use xAPI profiles and registries. Ehlenz et al. explore the perspectives aforded by existing
and former xAPI registries, especially in an academic area [15]. Possible solutions like an xAPI
Registry are already implemented; see [16]. The xAPI Registry was institutionalized to provide
a profound basis for the scientific application of xAPI across various disciplines. Developers and
researchers have the opportunity to suggest modifications, participate in the maintenance or add
new definitions through either GitLab or the web interface. The registry is implemented
humanfriendly for interdisciplinary usage, but also implements straightforward machine-readable
interfaces. Furthermore, each unique identifier (IRI in xAPI lingo) for definitions is also a valid
pointer to a readable version of all stored metadata for both machines and humans, making the
data itself even more accessible and hopefully self-explaining.</p>
    </sec>
    <sec id="sec-3">
      <title>3. Research Prototype: RePiX VR</title>
      <p>The research prototype used for this study is the modular application Rendering Pipeline
eXperience in Virtual Reality (RePiX VR) [8], demonstration link (YouTube). RePiX VR is
an interactive, immersive learning scenario which focuses on teaching the nine core stages
(explained in [17]) of the rendering pipeline, a process of converting a 3D environment into a
2D image. It can be used either in a VR environment, using a head-mounted display, or on a
computer (desktop VR).</p>
      <p>One goal of the application is to address challenges for teachers, such as the didactic reduction
for the diferent target groups, the dificult visualization of the procedural, basic concepts
especially for individual steps of the pipeline - and the balance between details of the individual
steps of the pipeline and the overview of the overall process. Especially at an abstract level, it is
dificult to involve students and provide interactions that promote learning actively.</p>
      <p>
        The learning objectives of the simulation are located at diferent taxonomy levels and thus
enable both a gentle learning curve for beginners and an in-depth examination for advanced
learners (revision of Bloom’s taxonomy [
        <xref ref-type="bibr" rid="ref3">18</xref>
        ]). The target dimensions of the application include
typical declarative factual knowledge, learning the terminology and acquiring conceptual and
procedural knowledge at various taxonomy levels. Examples of the range of learning objectives
are conceivable from "Learners name the results of the individual steps of the pipeline" to
"Learners can compare diferent lighting algorithms on a visual level and make a reasoned
decision". In order to address learners with heterogeneous knowledge as intuitively as possible,
the application was designed as a guided tour in which a robot serves as a contact person and
teacher.
      </p>
      <p>
        To achieve the goal of gathering data for our learning applications, we created the framework
OmiLAXR (Open and modular integration of Learning Analytics in XR) [
        <xref ref-type="bibr" rid="ref4">19</xref>
        ]. This framework
was implemented in several development cycles and was based on requirements derived from
experiences gained from various applications and use cases. For the tracking, we chose to use
the xAPI specification and an xAPI Registry; see [16].
      </p>
      <p>The OmiLAXR framework focuses on supporting an easy integration of xAPI in XR
applications (in Unity). We created components and concepts for enabling the automatized gathering
of VR environment activities by a set of tracking systems, e.g. interacting with an object, eye
tracking data, movement and head gestures (nodding and shaking). By integrating OmiLAXR
in our research object RePiX VR, we achieved a rich collection of user behaviour for our study
(explained next).</p>
    </sec>
    <sec id="sec-4">
      <title>4. Study</title>
      <p>
        In October 2022, the RePiX VR project was tested with 128 (VR) + 159 (desktop) students who
were participating in the computer graphics lecture of the RWTH Aachen University, resulting
in 120 complete data records for the immersive VR variant, 56 complete data sets for the desktop
variant. The high number of incomplete data sets in the desktop version is due to the voluntary
completion of questionnaires and duplicate registrations from the circle of VR users (to study
for the exam). To gain some insights into knowledge acquisition, each participant had to
answer surveys directly before and after the learning experience (pre- and post-test design). The
content-related questions were the same in both surveys. Still, the second survey also included
the UXIVE questionnaire ([
        <xref ref-type="bibr" rid="ref5">20</xref>
        ]) with questions related to 10 concepts of VR experiences, like
usability, flow, and presence questions.
      </p>
      <p>To gain some insights into the efects of interactivity while learning, the application, texturing,
lighting, and rasterization stage had two versions, an interactive and a non-interactive one;
see Table 1. To create a controlled environment, six diferent application modes were created,
where each game mode had diferent interactive stages. Since the interactive versions of each
Variant</p>
      <p>Application</p>
      <p>Texturing</p>
      <p>Lighting</p>
      <p>Rasterization
marks the interactive variant of a stage.
stage are more time-consuming than the non-interactive ones, each participant was shown two
interactive and two non-interactive stages.</p>
      <p>The non-interactive version of the application stage shows an animation of how each variable
of a matrix changes the appearance of an object. The interactive version builds on the
noninteractive version. After showing the animation, the learner should use the gained knowledge
about matrices to move a sun, earth, and moon as they would move in a solar system.</p>
      <p>The non-interactive texturing stage describes how a texture is stored in a 2D texture map
and then applied to a 3D object. In the interactive version of the texturing stage, the learner can
use a virtual spray can, which allows him to spray in diferent colours on the 3D object or the
2D texture map. If the participant colours the 3D object, the tint is applied to the texture, and
vice versa.</p>
      <p>The non-interactive version of the lighting stage gives a short simulation of what happens in a
scenario without lighting by turning of the light in the whole learning scenario. The interactive
version of the lighting stage shows an object and gives the learner flashlights. Through the
lfashlights, the learner can light the object from diferent angles with diferent light sources.
The learner can also switch and compare the efects of diferent shading models (Flat shading,
Gouraud shading, and Phong shading).</p>
      <p>The non-interactive rasterization stage explains how, from the 3D scenario, a colour for each
screen pixel is calculated so that a monitor can show a 3D scenario. In the interactive version
of the rasterization stage, the learner can create triangles and modify the pixel density of a
screen. Through those changes, the learner can experience how the created triangles appear on
a monitor.</p>
    </sec>
    <sec id="sec-5">
      <title>5. Results and Dashboard</title>
      <sec id="sec-5-1">
        <title>This section will explain the LAD and present insights from the study.</title>
        <p>Research Question The dashboard is created to give educators and content creators insights
into two research questions: (1) In which aspects difer the learning experiences when using
immersive Virtual Reality instead of 2D desktop applications? and (2) In which aspects difer
the learning experiences for interactive scenarios instead of behaviouristic informative texts or
animations?.</p>
        <p>Indicators To create a dashboard that could answer the given research questions, the xAPI
statements were filtered and pre-processed to create indicators. For both research questions,
the indicators time spent per stage, quantity of specific interactions , task score, and survey score
were used. For the diference between the learning experiences in VR compared to the desktop
application, further indicators of immersion and health evaluation from the UXIVE questionnaire
were of interest.</p>
        <p>Data Analysis The data pre-processing was done using Jupyter Notebook1 and the Python
library NumPy2.</p>
        <p>The data of the xAPI statements that were collected during the RePiX VR experience was
exported into a JSON file format. The xAPI statements contain an extension with the timestamp,
which enables the calculation of the diference between the timestamp when a stage started
and it finished. For each interaction, an xAPI statement was sent. Therefore, the question of
how often each interaction was tracked, an indicator was created that counted how often each
verb of the xAPI statement was sent. For each interactive task in the RePiX VR application that
was solved correctly, the learner gained a score. The score is higher the faster a task is solved.
The final task score is the average of all scores gained during the learning experience.</p>
        <p>The survey data was used in a CSV data format export from LimeSurvey3. Data from the
UXIVE was used for the survey score, health evaluation, and immersion evaluation. The survey
consists of questions that are either positively connoted, e.g. "Please indicate to what extent
you agree with each statement: [ I enjoyed the experience so much that I was energetic.]",
or negatively connoted, e.g. "Please indicate to what extent you agree with each statement:
[During the experience my eyes hurt.]". For each question, the learner should give a score using
an 8-point Likert Scale.</p>
        <p>Furthermore, the survey score was calculated by taking the diference of the average points
for all positively and negatively connoted questions. The health and immersion evaluation was
done by calculating the score for each question related to the topic of health or immersion.
Dashboard Implementation After the analysis and pre-processing of the data, the LAD was
developed (active URL in Appendix A). The pre-processing results were exported into a JSON
ifle, which was then imported into the dashboard. The dashboard was used to create interactive
visualizations and was developed using Vue.js4 for the frontend development and Apexcharts5
for the visualizations.</p>
        <p>Dashboard Visualization This section presents the finished dashboard. The dashboard
consists of three diferent pages: overview (Figure 1), position analysis (Figure 3), and comparison
(Figure 2). Each page presents the research questions on the left side, while the top right corner
is used as a menu bar to switch between the diferent pages. Below the research question, an
explanation for the selected box plots is given. Each box plot only shows data of the participants
1https://jupyter.org/, accessed: 14.12.2023
2https://numpy.org/, accessed: 14.12.2023
3https://www.limesurvey.org/, accessed: 14.12.2023
4https://vuejs.org/, accessed: 14.12.2023
5https://apexcharts.com/, accessed: 14.12.2023
that were between 5% and 95% of the data set, and their quantiles presenting the students from
30% to 70% of the data set. The visualizations for immersion and health evaluation use only the
participants’ data between 40% to 60% of the data because the data for these two visualizations
are widely spread, and the selected data still provides a good approximation of the average
participant. Next, each view provides general data information about the study. First of all, it
shows for which group which of the stages was interactive (like Table 1). Also, a distribution
of how many data sets for each group exist is given. Lastly, information about how many
participants did not finish the study is given.</p>
        <p>In addition to the information that each page of the dashboard provides, the overview page
(Figure 1) also provides global filters on the right side. These filters are used to change the
visualizations interactively. For example, when deselecting the checkbox of Group A, the
collected data of Group A will no longer be considered for the current visualizations.</p>
        <p>Through the development process, it was found that this approach makes it dificult to
compare the visualizations of diferent filter combinations, which is the reason why the comparison
page was added. The comparison page (Figure 2) is split into two halves. Each half of the page
has its local filter, independently configuring the visualizations on the same half. Therefore,
visualizations with diferent filters are shown directly beside each other and make a direct and
adaptive comparison of the data set possible. Using the comparison page makes it possible to
answer the research questions from educators and content creators.</p>
        <p>On the last page, the position analysis was
added to analyze the movements inside the
application. The goal was to create a heat
map to indicate the most common position in
a single time step, The overall position
distribution is shown on the dashboard in Figure
3. This visualization gives teachers an
impression beyond the interaction data as to whether
diferent perspectives were used to solve the
three-dimensional tasks and to check certain
objects. Position data can also help to
distinguish users who moved themselves using the
physical VR area (or WASD keyboard navi- Figure 3: The position page of the LAD provides
gation) versus learners who preferred to tele- a distribution of the position in the
port. virtual environment of all participants
in the RePiX VR study.</p>
        <p>Results To get answers to the given
research questions, diferent filters were set on
the implemented dashboard. The first research question (1) In which aspects difer the learning
experiences when using Virtual Reality instead of 2D desktop applications?, can be investigated
by selecting the VR filter on one side of the comparison page and the desktop filter on the
other side of the comparison page. The filter selection showed that RePiX VR was perceived
as more immersive, the survey score was higher, and the participants moved more in the VR
environment. At the same time, the VR environment resulted in more health issues, a lower
task score, and fewer interactions than the desktop environment.</p>
        <p>For the second research question (2) In which aspects difer the learning experiences for
interactive scenarios instead of behaviouristic informative texts?, the only quantitative indicator was
that participants spent more time in interactive than non-interactive scenarios.</p>
        <p>However, these are only the results that can be taken directly from the quantitative data.
Educators and content creators can already reflect on the teaching application RePiX VR at
a meta-level. This study already provided us with some valuable insights showing usability
issues and further steps. When quantitative data is also (manually) integrated, it shows that the
combination of data, like the time needed with assessment, is feasible and sensible. Interactive
tasks can thereby be used to provide learners with in-app feedback.</p>
        <p>
          Another publication, which analyzed quantitative data from an earlier study, has already
shown that a reliable text-to-speech system for the guiding robot is useful to provide feedback
to the learner without having to rely on text boxes [
          <xref ref-type="bibr" rid="ref6">21</xref>
          ]. This study confirms that people with
usability problems take longer and wish for more help in the free text answers, which can be
implemented most simply, for example, via timers.
        </p>
      </sec>
    </sec>
    <sec id="sec-6">
      <title>6. Discussion</title>
      <p>
        This paper presents a dashboard in closer detail, which can be used to evaluate various aspects
of the VR experience. By using it to reflect on the learning application, the dashboard itself is
implicitly put to the test. As indicated in the description, the basic structure of the dashboard can
also be transferred to learners. Necessary adjustments include anonymization and highlighting
one’s own performance, as well as a study on the connection between user behaviour and
outcome, which can be used to generate feedback. Check Van Leeuwen et al. for more difering
aspects [
        <xref ref-type="bibr" rid="ref7">22</xref>
        ].
      </p>
      <p>Other transfers are also feasible but are beyond the scope of this paper.</p>
      <p>
        One of the key educational challenges for usage poses the duration of the VR experience, so
introducing interactivity at every stage is not the solution. Educators should emphasize selected
learning objectives and create a guided tour which fits the curriculum. One option is to split
up the VR experience or ofer elective excursions for content beyond the core curriculum, e.g.
concerning the light stage or in combination with other applications like Virtual Ray Tracer 2.0
[
        <xref ref-type="bibr" rid="ref8">23</xref>
        ] or Rayground [
        <xref ref-type="bibr" rid="ref9">24</xref>
        ].
      </p>
      <p>
        Too many interactive tasks risk losing the bigger picture, meaning that the sum of
interactive tasks on basal levels of learning goal taxonomies fades the higher-level goals into the
background. Like reading code line by line without grasping the algorithm itself, learners
could lose themselves in individual steps without developing an understanding of the rendering
pipeline as a whole - based on the theory of the block model, an educational model for program
comprehension, which also argues on diferent dimensions of knowledge in computer science,
see [
        <xref ref-type="bibr" rid="ref10">25</xref>
        ].
      </p>
      <p>
        An open question remains about the efectiveness of the didactical setup (single usage): Can
this application be used for learning in groups, engaging in a collaborative reflection process,
or is the individual experience suficient, especially regarding the principles of self-regulated
learning [
        <xref ref-type="bibr" rid="ref11">26</xref>
        ].
      </p>
      <p>The evaluation and the insights generated by the dashboard should, for now, be considered as
an individual case study (with no claims for generalizability), as the user of the dashboard has
to know both the content being taught as well as the VR application itself. Therefore, it is only
used by a small number of persons yet. Besides, interpreting the presented data is a complex
process [11], deriving actions for the didactic decisions is a reflective process. Consequently,
only actual educators or educational content creators are viable focus groups for evaluating the
dashboard.</p>
      <p>In the future, this dashboard could be extended not only by providing new filters and
visualizations but also by customization possibilities based on the well-structured xAPI format.</p>
      <p>Another possibility is to transfer it to diferent use cases while keeping the same interfaces,
data collectors, and indicators. And thereby fulfilling the criteria for shared software resources,
cross-border collaborations and adoption of open-source software, see [13]. This is particularly
easy due to the standardized data specification xAPI and the standardized interfaces provided
by OmiLAXR. Incorporating new filters and additional information could further incorporate
JEDI into the dashboard, e.g. by highlighting information certain students have in common [13].
In our case, it could be possible that there is a cluster of students with shared experiences in
prior VR usage.</p>
      <p>Furthermore, we found that the application can be a valuable resource for students apart
from the accompanying integration during the course, e.g. as interactive preparation for the
ifnal exams, which is an insight from the qualitative data we collected in the survey.</p>
    </sec>
    <sec id="sec-7">
      <title>7. Conclusion</title>
      <p>
        In this paper, we presented a study and the first attempt to create a meaningful dashboard for
educators using the RePiX VR application. We tracked user behaviour (object interaction, eye
tracking, movement, head gestures). We conducted a pre- and post-test before and after the
learning experience to gain more insights about interactivity and learning efects. In the study,
we distinguished between diferent application variants (with diferent stages, interactive or
not). Finally, the PPDAC (Problem, Plan, Data, Analysis and Conclusion) cycle was used to
create an interactive LA dashboard; see [
        <xref ref-type="bibr" rid="ref12">27</xref>
        ]. Although the dashboard delivers a good overview
of our study, the interpretation needs some expertise in the learning content. As a result, our
research prototype has been identified as a valuable additional learning material for students
of the computer graphics course at our university and further steps for the development of
the application and didactic decisions were made. The collected data, in combination with our
dashboard, also emphasized xAPI as a useful data format for realizing LA dashboards because
of its flexibility and expandability.
[3] T. Susnjak, G. S. Ramaswami, A. Mathrani, Learning analytics dashboard: a tool for
providing actionable insights to learners, International Journal of Educational Technology
in Higher Education 19 (2022) 12. doi:10.1186/s41239-021-00313-7.
[4] T. C. Reeves, A Research Agenda for Interactive Learning in the New Millennium,
Association for the Advancement of Computing in Education (AACE), 1999, pp. 15–20. URL:
https://www.learntechlib.org/primary/p/17393/.
[5] R. Moreno, R. Mayer, Interactive Multimodal Learning Environments, Educational
Psychology Review 19 (2007) 309–326. doi:10.1007/s10648-007-9047-2.
[6] M. Gittinger, D. Wiesche, Systematic review of spatial abilities and virtual reality: The role
of interaction, Journal of Engineering Education n/a (2023). doi:10.1002/jee.20568.
[7] J. Buchner, M. Kerres, Media comparison studies dominate comparative research on
augmented reality in education, Computers &amp; Education 195 (2023) 104711. doi:10.1016/
j.compedu.2022.104711.
[8] B. Heinemann, S. Görzen, U. Schroeder, Teaching the basics of computer graphics in virtual
reality, Computers &amp; Graphics 112 (2023) 1–12. doi:10.1016/j.cag.2023.03.001.
[9] P. Blikstein, M. Worsley, Multimodal Learning Analytics and Education Data Mining: Using
Computational Technologies to Measure Complex Learning Tasks, Journal of Learning
Analytics 3 (2016) 220–238. doi:10.18608/jla.2016.32.11.
[10] H. Ouhaichi, D. Spikol, B. Vogel, Research trends in multimodal learning analytics: A
systematic mapping study, Computers and Education: Artificial Intelligence 4 (2023)
100136. doi:10.1016/j.caeai.2023.100136.
[11] X. Ochoa, Multimodal learning analytics - rationale, process, examples, and
direction, in: C. Lang, G. Siemens, A. F. Wise, D. Gašević, A. Merceron (Eds.), The
handbook of learning analytics, 2 ed., SoLAR, Vancouver, Canada, 2022, pp. 54–65. doi:
10.18608/hla22.006.
[12] K. Verbert, X. Ochoa, R. De Croon, R. A. Dourado, T. De Laet, Learning analytics dashboards:
the past, the present and the future, in: Proceedings of the Tenth International Conference
on Learning Analytics &amp; Knowledge, LAK ’20, Association for Computing Machinery,
New York, NY, USA, 2020, pp. 35–40. doi:10.1145/3375462.3375504.
[13] K. Williamson, R. Kizilcec, A Review of Learning Analytics Dashboard Research in Higher
Education: Implications for Justice, Equity, Diversity, and Inclusion, in: LAK22: 12th
International Learning Analytics and Knowledge Conference, LAK22, Association for
Computing Machinery, New York, NY, USA, 2022, pp. 260–270. doi:10.1145/3506860.
3506900.
[14] Deep Dive: Result, https://xapi.com/blog/deep-dive-result/, 2013.
[15] M. Ehlenz, B. Heinemann, T. Leonhardt, R. Röpke, V. Lukarov, U. Schroeder, Eine
forschungspraktische Perspektive auf xAPI-Registries, in: DELFI 2020 – Die 18. Fachtagung
Bildungstechnologien der Gesellschaft für Informatik e.V., Gesellschaft für Informatik e.V.,
Bonn, 2020, p. 6.
[16] B. Heinemann, M. Ehlenz, S. Görzen, U. Schroeder, xAPI Made Easy: A Learning Analytics
Infrastructure for Interdisciplinary Projects, International Journal of Online and Biomedical
Engineering (iJOE) 18 (2022) 99–113. doi:10.3991/ijoe.v18i14.35079.
[17] B. Heinemann, S. Görzen, U. Schroeder, RePiX VR - Learning environment for the
Rendering Pipeline in Virtual Reality, in: J.-J. Bourdin, E. Paquette (Eds.), Eurographics 2022
- Education Papers, Reims, France, 2022, p. 8. doi:10.2312/eged.20221040.
      </p>
    </sec>
    <sec id="sec-8">
      <title>A. Online Resources</title>
      <sec id="sec-8-1">
        <title>The application, data and all sources are available via</title>
        <p>• Dashboard code and online demo instance
• VR Learning Application (RePiX VR) code and website</p>
      </sec>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
<mixed-citation>[1] J. Radianti, T. A. Majchrzak, J. Fromm, I. Wohlgenannt, A systematic review of immersive virtual reality applications for higher education: Design elements, lessons learned, and research agenda, Computers &amp; Education 147 (2020) 103778. doi:10.1016/j.compedu.2019.103778.</mixed-citation>
      </ref>
      <ref id="ref2">
<mixed-citation>[2] A. Ansone, L. F. Dreimane, Z. Zalite-Supe, Framework of Pedagogic and Usability Principles for Effective Multi-user VR Learning Applications, in: M.-L. Bourguet, J. M. Krüger, D. Pedrosa, A. Dengel, A. Peña-Rios, J. Richter (Eds.), Immersive Learning Research Network, Communications in Computer and Information Science, Springer Nature Switzerland, Cham, 2024, pp. 96–110. doi:10.1007/978-3-031-47328-9_7.</mixed-citation>
      </ref>
      <ref id="ref3">
<mixed-citation>[18] D. R. Krathwohl, A Revision of Bloom's Taxonomy: An Overview, Theory Into Practice 41 (2002) 212–218. doi:10.1207/s15430421tip4104_2. Publisher: Routledge.</mixed-citation>
      </ref>
      <ref id="ref4">
<mixed-citation>[19] S. Görzen, B. Heinemann, U. Schroeder, Towards using the xAPI specification for Learning Analytics in VR, in: Learning Analytics for Virtual Reality (LAVR) Workshop of the 14th International Conference on Learning Analytics and Knowledge (LAK24), Kyoto, Japan, 2024, p. 9.</mixed-citation>
      </ref>
      <ref id="ref5">
<mixed-citation>[20] K. Tcha-Tokey, O. Christmann, E. Loup-Escande, G. Loup, S. Richir, Towards a Model of User Experience in Immersive Virtual Environments, Advances in Human-Computer Interaction 2018 (2018) e7827286. doi:10.1155/2018/7827286. Publisher: Hindawi.</mixed-citation>
      </ref>
      <ref id="ref6">
<mixed-citation>[21] B. Heinemann, U. Schroeder, Evaluating Usability and User Feedback in an Immersive Virtual Reality Environment for Computer Science Education, in: O. Viberg, I. Jivet, P.-J. Muñoz-Merino, M. Perifanou, T. Papathoma (Eds.), Responsive and Sustainable Educational Futures, Lecture Notes in Computer Science, Springer Nature Switzerland, Cham, 2023, pp. 718–724. doi:10.1007/978-3-031-42682-7_67.</mixed-citation>
      </ref>
      <ref id="ref7">
<mixed-citation>[22] A. van Leeuwen, S. D. Teasley, A. F. Wise, Teacher and Student Facing Learning Analytics, in: C. Lang, G. Siemens, A. F. Wise (Eds.), The Handbook of Learning Analytics, 2 ed., SoLAR, 2022, pp. 130–140. doi:10.18608/hla22.013.</mixed-citation>
      </ref>
      <ref id="ref8">
<mixed-citation>[23] C. S. van Wezel, W. A. Verschoore de la Houssaije, S. Frey, J. Kosinka, Virtual Ray Tracer 2.0, Computers &amp; Graphics 111 (2023) 89–102. doi:10.1016/j.cag.2023.01.005.</mixed-citation>
      </ref>
      <ref id="ref9">
<mixed-citation>[24] A. A. Vasilakis, G. Papaioannou, N. Vitsas, A. Gkaravelis, Remote Teaching Advanced Rendering Topics Using the Rayground Platform, IEEE Computer Graphics and Applications 41 (2021) 99–103. doi:10/gpq9pj.</mixed-citation>
      </ref>
      <ref id="ref10">
<mixed-citation>[25] C. Schulte, Block model: An educational model of program comprehension as a tool for a scholarly approach to teaching, in: Proceedings of the Fourth International Workshop on Computing Education Research, ACM, New York, NY, USA, 2008, pp. 149–160. doi:10.1145/1404520.1404535.</mixed-citation>
      </ref>
      <ref id="ref11">
<mixed-citation>[26] P. Winne, Learning Analytics for Self-Regulated Learning, in: C. Lang, G. Siemens, A. F. Wise, D. Gašević (Eds.), The Handbook of Learning Analytics, 1 ed., Society for Learning Analytics Research (SoLAR), Alberta, Canada, 2017, pp. 241–249. URL: http://solaresearch.org/hla-17/hla17-chapter1.</mixed-citation>
      </ref>
      <ref id="ref12">
<mixed-citation>[27] A. Wolf, D. Gooch, J. J. C. Montaner, U. Rashid, G. Kortuem, Creating an Understanding of Data Literacy for a Data-driven Society, The Journal of Community Informatics 12 (2016). doi:10.15353/joci.v12i3.3275. Number: 3.</mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>