<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta>
      <issn pub-type="ppub">1613-0073</issn>
    </journal-meta>
    <article-meta>
      <title-group>
        <article-title>Towards using the xAPI specification for Learning Analytics in Virtual Reality</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Sergej Görzen</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Birte Heinemann</string-name>
          <email>heinemann@cs.rwth-aachen.de</email>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Ulrik Schroeder</string-name>
          <email>schroeder@cs.rwth-aachen.de</email>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="editor">
          <string-name>Virtual Reality, Learning Analytics, xAPI, OmiLAXR Framework, Infrastructure</string-name>
        </contrib>
        <aff id="aff0">
          <label>0</label>
          <institution>RWTH Aachen University</institution>
          ,
          <addr-line>Ahornstraße 55, 52074 Aachen</addr-line>
          ,
          <country country="DE">Germany</country>
        </aff>
      </contrib-group>
      <abstract>
        <p>Virtual Reality (VR) learning applications enable innovative learning opportunities whose effectiveness can be investigated with Learning Analytics (LA). Implementing Learning Analytics in Virtual Reality poses challenges, and isolated solutions are being created. This paper looks at current state-of-the-art data tracking technologies and presents an approach to facilitate the development process of integrating xAPI for Learning Analytics in VR. It advocates the necessity of restrictions and concepts, fostering discourse on additional requirements.</p>
      </abstract>
      <kwd-group>
        <kwd>Virtual Reality</kwd>
        <kwd>Learning Analytics</kwd>
        <kwd>xAPI</kwd>
        <kwd>OmiLAXR Framework</kwd>
        <kwd>Infrastructure</kwd>
      </kwd-group>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>-</title>
      <p>CEUR
ceur-ws.org</p>
    </sec>
    <sec id="sec-2">
      <title>1. Introduction</title>
      <p>
        Virtual Reality (VR) applications for educational purposes have garnered significant attention
in various research domains, demonstrating positive impacts in educational contexts [
        <xref ref-type="bibr" rid="ref1">1</xref>
        ].
      </p>
      <p>
        The integration of VR technology for educational purposes, considering its multi-modal
aspects, presents developers, content creators, and designers with a myriad of challenges
spanning diverse hardware setups, didactic and instructional design, software development, and the
identification of meaningful metrics for evaluations. Learning Analytics (LA) is emerging as
a valuable option for evaluating multi-modal scenarios. With diverse objectives, such as
enhancing the learning process, identifying learning behaviors or difficulties, and recommending
interventions, LA design is intricately nuanced. The correct tracking of VR activities introduces
additional challenges (including the diverse array of VR approaches and equipment) [
        <xref ref-type="bibr" rid="ref2">2</xref>
        ].
Developers may struggle not only with the complexities of the LA design process but also contend
with challenges related to multi-modal LA.
      </p>
      <p>
        One possible approach is to use the four dimensions of the Learning Analytics reference
model [
        <xref ref-type="bibr" rid="ref3">3</xref>
        ]. A concrete definition of
      </p>
      <p>what data to track (environment) and how to achieve this
(method) is needed considering all stakeholders (who) and goals (why). Further, a ”correct” and
https://elearn.rwth-aachen.de/schroeder (U. Schroeder)
CEUR
Workshop
Proceedings
https://elearn.rwth-aachen.de/goerzen (S. Görzen); https://elearn.rwth-aachen.de/heinemann (B. Heinemann);
© 2022 Copyright for this paper by its authors. Use permitted under Creative Commons License Attribution 4.0 International (CC BY 4.0).
”complete” integration of LA data could influence further results. That makes this part of the
LA design step very important for all use cases.</p>
      <p>
        However, technological standards already exist (see [
        <xref ref-type="bibr" rid="ref4 ref5">4, 5</xref>
        ]), yet overcoming interdisciplinary
and multi-modal challenges and limitations remains a huge task [
        <xref ref-type="bibr" rid="ref6">6</xref>
        ]. This paper describes
research focusing on reducing the common work for developers while enabling Learning
Analytics for Virtual Reality scenarios. We chose the eXperience API (xAPI) specification for the
Learning Analytics data format. According to [
        <xref ref-type="bibr" rid="ref6">6</xref>
        ], xAPI shows promise for multi-modal contexts.
However, the research community needs a consensus on working with it. Thus, we developed
a software ecosystem that makes working with xAPI more convenient and consistent. This
ecosystem contains a tool set and a framework called OmiLAXR (more in section 4).
      </p>
      <p>As technical needs for such tasks are not well published, this paper aims to contribute to
the direction of requirements and challenges for enabling Learning Analytics (especially with
xAPI) in educational VR scenarios. To achieve this, we present a concept of our approach using
xAPI for Virtual Reality and how we mapped a representation of a VR scenario into the xAPI
specification. Further, this paper delivers the first results of a study where the participants used
our concept in practice supported by a framework we’ve implemented.</p>
    </sec>
    <sec id="sec-3">
      <title>2. Using xAPI for Learning Analytics in VR</title>
      <p>
        The eXperience API is organized in the JSON data format. It is designed to collect data from
a wide range of experiences. Utilizing xAPI, we articulate actors’ activities (agent or group)
through structured statements: an actor is doing (verb) something (object/activity). Augmenting
these statements with xAPI extensions enables the incorporation of additional details, such as
learning scenario specifics (context extension), detailed information about the target activity
or object (activity extension), or supplementary insights into task progress (result extensions).
Each statement fragment has a URI as a unique identifier and an additional description. While
working with xAPI, it is helpful to use xAPI Registries1 for statement construction. The xAPI
specification was derived from SCORM, which was designed for Learning Management Systems
(LMSs); with the changes made for xAPI, it became freer in its use and independent of
any platform [
        <xref ref-type="bibr" rid="ref7">7</xref>
        ].
      </p>
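      <p>To make this structure concrete, the following minimal sketch (in C#, matching the Unity context discussed later) shows how an actor, a verb, an object, and a result extension could be assembled and serialized into the JSON format the specification requires. The verb and extension IRIs as well as all values are illustrative placeholders, not entries of an existing registry.</p>
      <preformat>
using System;
using System.Collections.Generic;
using Newtonsoft.Json;

public static class XapiStatementSketch
{
    // Builds a minimal actor-verb-object statement with one result extension.
    // All IRIs and values below are illustrative placeholders.
    public static string BuildTeleportedStatement()
    {
        var statement = new
        {
            actor = new
            {
                objectType = "Agent",
                name = "Learner 42",
                account = new { homePage = "https://example.org", name = "learner-42" }
            },
            verb = new
            {
                id = "https://xapi.example.org/definitions/virtualReality/verbs/teleported",
                display = new Dictionary&lt;string, string&gt; { ["en-US"] = "teleported" }
            },
            // "object" is a C# keyword, hence the @ prefix.
            @object = new
            {
                objectType = "Activity",
                id = "https://example.org/activities/renderingPipeline/stage-1"
            },
            result = new
            {
                extensions = new Dictionary&lt;string, object&gt;
                {
                    ["https://xapi.example.org/extensions/targetPosition"] = new { x = 1.0f, y = 0.0f, z = 2.5f }
                }
            },
            timestamp = DateTime.UtcNow.ToString("o")
        };

        // Serialize to the JSON wire format (e.g. with Json.NET, commonly available in Unity projects).
        return JsonConvert.SerializeObject(statement, Formatting.Indented);
    }
}
      </preformat>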
      <p>
        At first glance, xAPI is easy to use, but its freedom makes the usage not trivial for virtual
reality. According to [
        <xref ref-type="bibr" rid="ref6">6</xref>
        ] xAPI has the potential for multi-modal learning scenarios (and VR
is one). However, more specifications and research on how to define statements in detail are
needed.
      </p>
      <p>After the definition of what interactions to track, developers should know exactly how to
design them in the form of an xAPI statement. This leads (in our experience) to questions like:
How do we call the activities? How generic or specific shall they be? What further information
about the interaction is needed? What extensions do we need, and how do we name them?
What value format shall the extension have (e.g. tuple, number, struct, etc.)? Unlike in an LMS,
where a mouse click triggers an interaction, there are some special challenges in VR. Developers
need to decide when users trigger new activities. This includes (for example) defining when
users are moving or are nodding their head, excluding jittering effects. Further, knowing how to</p>
      <sec id="sec-3-1">
        <title>1 https://xapi.com/registry/, accessed 23.01.2024</title>
        <p>handle time-based sensor information like heart rate is important. Options are, e.g. to translate
them into activity-based data, make them a part of an xAPI statement in the form of extensions,
or ask if an additional data format is needed. Finally, developers may need more complex
statements that refer to each other. All of this is possible using xAPI, and xAPI profiles may
help with some of these challenges. However, designing xAPI profiles itself is not an easy task.
We decided that the usage and maintenance of xAPI registries may be enough for exploration
until a unification is found.</p>
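        <p>As an illustration of the first option, translating a time-based sensor stream into activity-based data, the following sketch aggregates heart-rate samples collected during a task into a single summary that could be attached as a result extension. The extension IRI, class, and member names are hypothetical.</p>
        <preformat>
using System.Collections.Generic;
using System.Linq;

// One possible way to turn time-based sensor data (here: heart-rate samples collected
// during a task) into activity-based data: aggregate the stream and attach the summary
// as a single result extension instead of emitting one statement per sample.
// The extension IRI is an illustrative placeholder.
public static class HeartRateAggregationSketch
{
    public const string HeartRateExtensionIri =
        "https://xapi.example.org/extensions/heartRateSummary";

    public static Dictionary&lt;string, object&gt; Summarize(IReadOnlyList&lt;float&gt; samplesBpm)
    {
        return new Dictionary&lt;string, object&gt;
        {
            [HeartRateExtensionIri] = new
            {
                min = samplesBpm.Min(),
                max = samplesBpm.Max(),
                mean = samplesBpm.Average(),
                sampleCount = samplesBpm.Count
            }
        };
    }
}
        </preformat>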
      </sec>
    </sec>
    <sec id="sec-4">
      <title>3. Related Work</title>
      <p>
        There are established models, specifications, and different frameworks for working with LA
(see [
        <xref ref-type="bibr" rid="ref5">5</xref>
        ]). For example, in [
        <xref ref-type="bibr" rid="ref2">2</xref>
        ], the authors proposed a framework for STEM education in VR with
the four dimensions of Technology, Pedagogy, Psychology, and Learning Analytics. All four
dimensions are important for our framework. Thus, discussing related work in all disciplines
would be fair, but we limit ourselves here to a small set focusing on data gathering.
      </p>
      <p>
        Related frameworks and tools The Unity Experiment Framework (UXF) for observing
human behavior in virtual environments is explained in [
        <xref ref-type="bibr" rid="ref8">8</xref>
        ]. UXF supports real-time data
collection with configurable settings and tools, allowing the integration of sensors like eye tracking
and EEG. The framework streamlines experiment development. UXF is beneficial for supporting
research setups and enhancing data gathering. In contrast, VRSTK [
        <xref ref-type="bibr" rid="ref5">5</xref>
        ] takes a holistic
approach to support VR experiment creation in Unity. It provides scripts, components, and
tools for VR application development. It features scene replay, data import/export, multiplayer
support, and tracking of various elements such as movement, gaze, eyes, game objects, and EEG.
The UnityGBLxAPI2 framework already implements xAPI for Unity, focusing on game-based
learning and virtual worlds, especially in K-12 education. Its implementation of xAPI is,
however, very raw and may lead to typos or inconsistencies.
      </p>
      <p>Previous work The xAPI Registry was initiated to create conventions in the multidisciplinary
use of xAPI. Developers and researchers can propose changes via GitLab or a web interface
whose URL equals the IRI of a definition
(e.g. https://xapi.elearn.rwth-aachen.de/definitions/virtualReality/verbs/teleported). Each definition
is written in JSON using a strict folder structure that is also represented by the IRI path
(e.g. {rootFolder}/definitions/virtualReality/verbs/teleported.json).</p>
      <p>
        Using this xAPI Registry in web projects may be straightforward, but in VR projects, it is still
challenging (see introduction). Besides conceptual challenges, the manual application of the
registry can still lead to inconsistencies and coding overhead. As presented in [
        <xref ref-type="bibr" rid="ref9">9</xref>
        ], we developed
the xAPI Definitions Fetcher Tool for synchronizing the xAPI Registry and VR projects. For a
better Unity workflow for this tool, we also created the Unity package "xAPI 4 Unity". Instead
of providing verbs via strings and dictionaries, we offer a strict syntax that leverages the developers'
IDE (strict types, avoiding typos, showing field and method descriptions, consistency, correct usage).
Passing an xAPI verb can be done by calling, e.g., xAPI_Definitions.virtualReality.verbs.teleported.
This approach helps to ensure consistent xAPI development.
      </p>
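      <p>The following sketch illustrates how a registry definition can be surfaced as such a strict C# syntax, so that the IDE offers completion and typos become compile errors. Only the access path xAPI_Definitions.virtualReality.verbs.teleported and its IRI are taken from the text above; the class layout and the VerbDefinition type are assumptions for illustration.</p>
      <preformat>
// Hypothetical sketch: a generated, strictly typed view of the xAPI Registry.
public sealed class VerbDefinition
{
    public string Iri { get; }
    public string DisplayEnUs { get; }

    public VerbDefinition(string iri, string displayEnUs)
    {
        Iri = iri;
        DisplayEnUs = displayEnUs;
    }
}

public static class xAPI_Definitions
{
    public static class virtualReality
    {
        public static class verbs
        {
            public static readonly VerbDefinition teleported = new VerbDefinition(
                "https://xapi.elearn.rwth-aachen.de/definitions/virtualReality/verbs/teleported",
                "teleported");
        }
    }
}

// Usage (hypothetical tracker API): pass the strict definition instead of a free-form string.
// tracker.Send(xAPI_Definitions.virtualReality.verbs.teleported, targetActivity);
      </preformat>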
      <sec id="sec-4-1">
        <title>2 https://github.com/gblxapi/UnityGBLxAPI, accessed 15.12.2023</title>
        <p>
          When developing the VR application RePiX VR, we started to extend the xAPI Registry with
VR-related vocabulary for assessing learning progress. We aimed to collect data that help us
to understand overall performance in the scenario, actions taken (activities like button presses),
gaze direction (eye tracking), head movement (nodding and shaking), and to gain insights into the
learning environment itself, see [
          <xref ref-type="bibr" rid="ref10">10</xref>
          ].
        </p>
        <p>Using a design-based research method, we implemented a data collection library using
the strict classes of the xAPI registry specialized for our use case and needs. Based on our
experience with this work, we extended the idea to a more generic framework. Considering
further technologies, we started to plan in the direction of eXtended Reality (XR). The concepts
will be explained in the next section.</p>
      </sec>
    </sec>
    <sec id="sec-5">
      <title>4. A proposal for supporting LA in XR</title>
      <p>
        To develop a sustainable solution for various applications, one needs to think outside of the
scope of a specific application stack. Through thesis projects, analyses with different
dashboards, and conducted studies, we identified additional technical requirements from diverse
perspectives. Starting with basic data-gathering components for activities, especially for the
learning application RePiX VR [
        <xref ref-type="bibr" rid="ref10">10</xref>
        ], the result is a dedicated framework, known as OmiLAXR
(Open and modular integration of Learning Analytics in XR), which facilitates the seamless integration
of Learning Analytics in XR applications using a "Plug &amp; Play" principle. In Fig. 1, the
approach is presented in an abstract representation, aiming to foster discussion while bypassing
technical details about Unity. It shows how we have mapped components from a learning
context into xAPI.
      </p>
      <p>XR Adapters System The XR Adapters System is the idea of having a ”Plug &amp; Play” mechanism.
Adapters can specify through interfaces (for example) how an XR User behaves (teleport, interact,
gaze, ...) inside a specific XR framework. Subsequently, these interfaces are utilized to generate
xAPI statements by using a model of the current learning context. With the idea of adapters, the
integration of third-party libraries is enabled. They are allowed to directly communicate with
the Main Tracking System, opening the potential to create a bridge between the frameworks
mentioned in Related Work and the approach for connecting measuring technologies like
heartbeat, EEG, etc. Compatibility is an important criterion, and the idea of having adapters
may cover it.</p>
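      <p>A minimal sketch of the adapter idea follows: an interface describes XR user behaviours independently of a concrete XR framework, and a framework-specific adapter raises these events so the tracking side can subscribe without knowing the underlying toolkit. Interface and member names are assumptions, not the OmiLAXR API.</p>
      <preformat>
using System;
using UnityEngine;

// Hypothetical adapter interface: each XR framework gets its own implementation,
// while the tracking systems only depend on these framework-agnostic events.
public interface IXRUserAdapter
{
    event Action&lt;Vector3&gt; Teleported;      // target position of a teleport
    event Action&lt;GameObject&gt; Interacted;   // object the user interacted with
    event Action&lt;GameObject&gt; Gazed;        // object the user's gaze rested on
}
      </preformat>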
      <p>
        Learning Context Representation Our concept relies on a comprehensive representation of
a learning context (independently from any XR framework), which enriches an xAPI statement
with further information. Besides some statements from the System, in our design, the statements
are generated ”Learner-centered”. The Learner component serves as the primary representation,
embodying an XR and desktop representation (XR User and Non-XR User), including head, body,
and hands. This component takes the role of the xAPI Actor (highlighted in green). Inside
our learning context model, we designed a Learning Scenario consisting of multiple Learning
Units, each containing Assignments with numerous Tasks (and recursively nested Sub-Tasks;
highlighted in orange). These components contribute to creating a structured learning path
within a learning context and need to be defined by developers. In our application, for example,
the stages of the rendering pipeline represent learning units, and learners must complete
assignments and tasks within each stage to continue the experience of the rendering pipeline
(see [
        <xref ref-type="bibr" rid="ref10">10</xref>
        ]). In VR, task completion requires interaction. As not all Virtual Objects,
which are distinctly categorized as Pointables (for laser pointers) and Interactables (for direct
interactions), are interesting for analytics, we define the important ones as Trackable or Gazeable.
Additionally, roles such as Guide and Collaborator, represented in blue, denote significant player
or non-player participants and may potentially serve as xAPI Activities/Objects.
      </p>
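      <p>A minimal sketch of this nesting is shown below; class and property names are assumptions for illustration, not the OmiLAXR types.</p>
      <preformat>
using System.Collections.Generic;

// Sketch of the learning-context model described above: a scenario consists of learning
// units, which contain assignments, which contain (possibly nested) tasks.
public class LearningTask
{
    public string Id;
    public List&lt;LearningTask&gt; SubTasks = new List&lt;LearningTask&gt;();
}

public class Assignment
{
    public string Id;
    public List&lt;LearningTask&gt; Tasks = new List&lt;LearningTask&gt;();
}

public class LearningUnit
{
    public string Id;
    public List&lt;Assignment&gt; Assignments = new List&lt;Assignment&gt;();
}

public class LearningScenario
{
    public string Id;
    public List&lt;LearningUnit&gt; Units = new List&lt;LearningUnit&gt;();
}
      </preformat>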
      <p>Main Tracking System The Main Tracking System manages a default and extensible
collection of modular tracking systems, each holding a collection of modular tracking controllers
that handle a small scope. For example, the Interaction Tracking System has separate controllers for
laser pointer, mouse, keyboard, or hand interactions.</p>
      <p>All tracking events get composed through an xAPI interface into a standardized xAPI format.
Final statements get forwarded asynchronously to storage controllers, e.g. for the Learning
Record Store, and are cached (in case of connection issues) in local storage on the hard
drive.</p>
      <p>xAPI Statement Representation As already mentioned, a learning context model is used
to set some xAPI fragments (like context, authority, or actor). But verbs and activities have to
be defined in (default or custom) tracking systems (including result- and activity extensions).
Following the xAPI specification, a context contains information about the platform in free
form (in our case, in the format of {framework}:{vr_type}:{operating_system}). This records the
framework and the VR setup in which the statement was created (e.g. desktop mode and
Windows 10). Besides simple information like language and instructor, with the help of context
extensions, we added more information about the learning scenario. For example, we track
(1) which XR application (game) was used, (2) the version, (3) if it has a specific game mode,
and (4) the user's location in the learning path (learning scenario, learning unit, assignment, and
task). We defined all needed parts in the xAPI registry (verbs, activities, and extensions for
Eye Tracking and Virtual Reality), but we also reused existing contexts, e.g. ”seriousGames” or
”generic”.</p>
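      <p>The following sketch illustrates how such a context could be filled: the platform field in the {framework}:{vr_type}:{operating_system} format and context extensions for the application, its version, the game mode, and the position in the learning path. The extension IRIs and all values are illustrative placeholders.</p>
      <preformat>
using System.Collections.Generic;

// Hypothetical sketch of building the xAPI context part of a statement as described above.
public static class ContextSketch
{
    public static object Build()
    {
        // {framework}:{vr_type}:{operating_system}, e.g. created in desktop mode on Windows 10.
        string platform = "UnityXR:desktop:Windows 10";

        return new
        {
            platform,
            language = "en-US",
            extensions = new Dictionary&lt;string, object&gt;
            {
                ["https://xapi.example.org/extensions/application"] = "RePiX VR",
                ["https://xapi.example.org/extensions/applicationVersion"] = "1.2.0",
                ["https://xapi.example.org/extensions/gameMode"] = "exploration",
                ["https://xapi.example.org/extensions/learningPath"] = new
                {
                    learningScenario = "rendering-pipeline",
                    learningUnit = "rasterization",
                    assignment = "assignment-1",
                    task = "task-3"
                }
            }
        };
    }
}
      </preformat>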
    </sec>
    <sec id="sec-6">
      <title>5. Work in progress and results</title>
      <p>
        As already described, the former concept is an abstract view of what we have realized for
applying xAPI for virtual reality as a framework. Thus, an evaluation of this framework is
a practical evaluation of this concept. Accordingly, the data-collection mechanism was
(pre)evaluated through smaller studies (see [
        <xref ref-type="bibr" rid="ref10">10</xref>
        ]), study courses, and thesis projects for further use
cases. Following design-based research cycles, we addressed shortcomings and extended our framework
(e.g. by the learning scenario model) for experiments in learning research. In these studies, the
usefulness of the generated xAPI content was validated by creating several explorations. One
result, for example, is a Learning Analytics dashboard (see [
        <xref ref-type="bibr" rid="ref11">11</xref>
        ]), with which it’s possible to
compare diferent variants of a learning scenario. For this, the idea of using context and results
extensions of Fig. 1 was used. Each statement contains information about the point (at which
task, assignment, and learning unit) at which it was created and distinguishes between different variants
of the research object used. In reflection, this concept worked well for conducting research with
different variants of the learning scenario. In addition to evaluating the data results, we wanted
to evaluate developers’ workflow using our approach by finding out challenges and how they
are satisfied with the generated statements, considering the effort they needed to make. As it is
not trivial how to evaluate a framework (and an ecosystem), we created a concept of how to
design the study [
        <xref ref-type="bibr" rid="ref12">12</xref>
        ] and conducted the study in the summer term of 2023.
      </p>
      <p>For six months, we observed seven computer science master's students (with prior knowledge of LA)
as developers using OmiLAXR for the integration of xAPI into the VR application Teach-R3 and
analyzed the OmiLAXR ecosystem according to the guiding criteria: productivity, workflow,
usability, functionality, and challenges. As the final results of this observation study are still in
progress, we summarize some qualitative results from post-interviews and observations in this
paper.</p>
      <p>After initial struggles with setting up the framework, the students created xAPI statements
very easily. They said they enjoyed having a rich set of existing
tracking controllers and welcomed the automatically gathered information. They appreciated
creating additional xAPI statements using the xAPI Registry and a strict C# syntax. The students
could use xAPI extensions easily but had uncertainties regarding providing the correct types
for the values. Overall, they worked with xAPI, knowing the basics without diving deep into
specifications.</p>
      <p>Using the framework, the students created VR visualizations (e.g. a heat map on a surface).
In their use cases, different sources of position data were most important (e.g. positions of the player,
head, and hands). Positions are logged only on changes and at a fixed interval.
This behaviour fits well for visualizing the player's path and the heat map, but, e.g., visualizing
the head or hands showed some jittering effects. This is unambiguously a technical challenge.
One challenge was handling system activities (e.g. "system-triggered behaviour"). The xAPI
specification is not concrete about what an actor can be. So, we allowed the actor to be exchanged
with a system actor; the concept and framework were extended accordingly during
the study.</p>
      <p>Even though the idea of how to work with xAPI in Virtual Reality is still a work in progress, all
goals of the participants for preparing Learning Analytics for Teach-R were completely achieved.
Missing components could be added easily due to the modular design. A developer-oriented
representation for xAPI management (and the corresponding mechanisms realized by the framework)
helped well here. Diving deep into the xAPI specification was unnecessary except for a few basics.</p>
    </sec>
    <sec id="sec-7">
      <title>6. Discussion</title>
      <p>The xAPI specification gives some freedom in how to use it. We consistently observed beginners
struggling to use xAPI. In addition, repetitive work often has to be done, and in bigger projects,
the usage without any guidelines may result in inconsistencies.</p>
      <p>
        Guided by a framework, the design of how to map a VR scenario into xAPI statements worked
well for our use cases. Further, the idea of the Learning Context Representation was clear on
both sides: our developer participants understood it quickly, and also, for analysis by educators,
it was useful [
        <xref ref-type="bibr" rid="ref11">11</xref>
        ]. But at the same time, the system needed to be explained because of bad
documentation. Here, it is important to be careful and find a good balance for instructional
designers and developers. The aim shall be to find an easy interface for designing a learning
scenario and to generate useful xAPI statements from it.
      </p>
      <p>Further, this approach does not explain how to handle time-based information best, like
movement or heart rate, while avoiding a huge collection of meaningless or repetitive data. We think
this is clearly a technical challenge that has to be explored more.</p>
      <sec id="sec-7-1">
        <title>3 https://teach-r.de, accessed 23.02.2024</title>
        <p>
          In addition, this approach focuses on ”simple logs” reflecting users’ ”breadcrumbs” [
          <xref ref-type="bibr" rid="ref13">13</xref>
          ]
without more complex interdependencies. Although simple metrics may suffice for many
analytics goals, incorporating more complex xAPI statements, such as those involving context
activities [
          <xref ref-type="bibr" rid="ref7">7</xref>
          ] or combining multiple statements semantically into a new one, could be beneficial.
For example, the combination of looking at the instruction, nodding the head, and solving
the task may lead to a statement like ”actor understood the task”. To some degree, this can be
done in post-analyses, but we believe that more interdependent tasks may open more doors to
information and, thus, deeper analytics.
        </p>
        <p>However, while we’ve delved into specific requirements and presented a concept for our xAPI
creation in VR, several questions still exist: What are the specific requirements for LA in VR?
Are more complex xAPI statements necessary, and if so, what form should they take, and how
can they be implemented? How can we ensure good quality in using xAPI for VR?</p>
      </sec>
    </sec>
    <sec id="sec-8">
      <title>7. Conclusion</title>
      <p>In this paper, we presented an approach to how to use xAPI for VR. This idea was guided by a
framework and was evaluated directly using two VR scenarios. Implementing xAPI effectively
in Virtual Reality is possible but demands constraints and tool support. These constraints could
be a specific syntax, a proper xAPI registry, profiles, or a combination of these elements. The
presented framework supports the first of these. Although the framework is built for Unity, we
presented our technical concepts on a more abstract level to make them transferable to other
platforms. For example, we plan an exploration of WebXR technology.</p>
      <p>Supporting programmers in creating good quality Learning Analytics data is an iterative
process that is difficult to evaluate and depends on discussion with others. The effort shows
that a modular framework design may support many applications but makes the startup harder.</p>
      <p>Nevertheless, xAPI may be suited for VR, but the learning community is confronted with
conceptual and technological challenges. We need to explore more use cases and find the
limitations of xAPI in Virtual Reality. Focus groups with Virtual Reality and Learning Analytics
experts are important to discuss findings and gather valuable input.</p>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>
          [1]
          <string-name>
            <given-names>J.</given-names>
            <surname>Radianti</surname>
          </string-name>
          ,
          <string-name>
            <given-names>T. A.</given-names>
            <surname>Majchrzak</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.</given-names>
            <surname>Fromm</surname>
          </string-name>
          ,
          <string-name>
            <given-names>I.</given-names>
            <surname>Wohlgenannt</surname>
          </string-name>
          ,
          <article-title>A systematic review of immersive virtual reality applications for higher education: Design elements, lessons learned, and research agenda</article-title>
          ,
          <source>Computers &amp; Education</source>
          <volume>147</volume>
          (
          <year>2020</year>
          ) 103778. doi:10.1016/j.compedu.2019.103778.
        </mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>
          [2]
          <string-name>
            <given-names>A.</given-names>
            <surname>Christopoulos</surname>
          </string-name>
          ,
          <string-name>
            <given-names>N.</given-names>
            <surname>Pellas</surname>
          </string-name>
          ,
          <string-name>
            <surname>M.-J. Laakso</surname>
          </string-name>
          ,
          <article-title>A Learning Analytics Theoretical Framework for STEM Education Virtual Reality Applications</article-title>
          ,
          <source>Education Sciences</source>
          <volume>10</volume>
          (
          <year>2020</year>
          )
          <fpage>317</fpage>
          . doi:10.3390/educsci10110317.
        </mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>
          [3]
          <string-name>
            <given-names>M. A.</given-names>
            <surname>Chatti</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A. L.</given-names>
            <surname>Dyckhoff</surname>
          </string-name>
          , U. Schroeder,
          <string-name>
            <given-names>H.</given-names>
            <surname>Thüs</surname>
          </string-name>
          ,
          <article-title>A reference model for learning analytics</article-title>
          ,
          <source>International Journal of Technology Enhanced Learning</source>
          <volume>4</volume>
          (
          <year>2012</year>
          )
          <fpage>318</fpage>
          . doi:10/gdm24h.
        </mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>
          [4]
          <string-name>
            <given-names>S.</given-names>
            <surname>Schürstedt</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Geiger</surname>
          </string-name>
          ,
          <article-title>Einsatz von VR-Technologien in BIM/GIS</article-title>
          , in: Proceedings of the
          <source>31. Forum Bauinformatik</source>
          , Universitätsverlag der TU Berlin, Berlin,
          <year>2019</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>
          [5]
          <string-name>
            <given-names>M.</given-names>
            <surname>Wölfel</surname>
          </string-name>
          ,
          <string-name>
            <given-names>D.</given-names>
            <surname>Hepperle</surname>
          </string-name>
          ,
          <string-name>
            <given-names>C. F.</given-names>
            <surname>Purps</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.</given-names>
            <surname>Deuchler</surname>
          </string-name>
          , W. Hettmann,
          <article-title>Entering a new Dimension in Virtual Reality Research: An Overview of Existing Toolkits, their Features and Challenges</article-title>
          , in: 2021
          <source>International Conference on Cyberworlds (CW)</source>
          , IEEE, Caen, France,
          <year>2021</year>
          , pp.
          <fpage>180</fpage>
          -
          <lpage>187</lpage>
          . doi:10.1109/CW52790.2021.00038.
        </mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>
          [6]
          <string-name>
            <given-names>M.</given-names>
            <surname>Ehlenz</surname>
          </string-name>
          ,
          <string-name>
            <given-names>B.</given-names>
            <surname>Heinemann</surname>
          </string-name>
          ,
          <string-name>
            <given-names>T.</given-names>
            <surname>Leonhardt</surname>
          </string-name>
          ,
          <string-name>
            <given-names>R.</given-names>
            <surname>Röpke</surname>
          </string-name>
          ,
          <string-name>
            <given-names>V.</given-names>
            <surname>Lukarov</surname>
          </string-name>
          , U. Schroeder,
          <article-title>Eine forschungspraktische Perspektive auf xAPI-Registries</article-title>
          , in:
          <source>DELFI 2020 - Die 18. Fachtagung Bildungstechnologien der Gesellschaft für Informatik e.V.</source>
          , Gesellschaft für Informatik e.V., Bonn,
          <year>2020</year>
          , p.
          <fpage>6</fpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>
          [7]
          <string-name>
            <given-names>B.</given-names>
            <surname>Miller</surname>
          </string-name>
          , Deep Dive: Result, https://xapi.com/blog/deep-dive-result/,
          <year>2013</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>
          [8]
          <string-name>
            <given-names>J.</given-names>
            <surname>Brookes</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Warburton</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Alghadier</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Mon-Williams</surname>
          </string-name>
          ,
          <string-name>
            <given-names>F.</given-names>
            <surname>Mushtaq</surname>
          </string-name>
          ,
          <article-title>Studying human behavior with virtual reality: The Unity Experiment Framework</article-title>
          ,
          <source>Behavior Research Methods</source>
          <volume>52</volume>
          (
          <year>2020</year>
          )
          <fpage>455</fpage>
          -
          <lpage>463</lpage>
          . doi:10.3758/s13428-019-01242-0.
        </mixed-citation>
      </ref>
      <ref id="ref9">
        <mixed-citation>
          [9]
          <string-name>
            <given-names>B.</given-names>
            <surname>Heinemann</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Ehlenz</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>Görzen</surname>
          </string-name>
          , U. Schroeder, xAPI Made Easy:
          <article-title>A Learning Analytics Infrastructure for Interdisciplinary Projects</article-title>
          ,
          <source>International Journal of Online and Biomedical Engineering (iJOE) 18</source>
          (
          <year>2022</year>
          )
          <fpage>99</fpage>
          -
          <lpage>113</lpage>
          . doi:10.3991/ijoe.v18i14.35079.
        </mixed-citation>
      </ref>
      <ref id="ref10">
        <mixed-citation>
          [10]
          <string-name>
            <given-names>B.</given-names>
            <surname>Heinemann</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>Görzen</surname>
          </string-name>
          , U. Schroeder,
          <article-title>Teaching the basics of computer graphics in virtual reality</article-title>
          ,
          <source>Computers &amp; Graphics</source>
          <volume>112</volume>
          (
          <year>2023</year>
          ). doi:10.1016/j.cag.2023.03.001.
        </mixed-citation>
      </ref>
      <ref id="ref11">
        <mixed-citation>
          [11]
          <string-name>
            <given-names>B.</given-names>
            <surname>Heinemann</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>Görzen</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Dragoljić</surname>
          </string-name>
          ,
          <string-name>
            <given-names>L. F.</given-names>
            <surname>Meiendresch</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Troll</surname>
          </string-name>
          ,
          <string-name>
            <given-names>U.</given-names>
            <surname>Schroeder</surname>
          </string-name>
          ,
          <article-title>A Learning Analytics Dashboard to Investigate the Influence of Interaction in a VR Learning Application</article-title>
          , in:
          <source>Learning Analytics for Virtual Reality (LAVR) Workshop at the 14th International Conference on Learning Analytics and Knowledge (LAK24)</source>
          , Kyoto, Japan,
          <year>2024</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref12">
        <mixed-citation>
          [12]
          <string-name>
            <given-names>S.</given-names>
            <surname>Görzen</surname>
          </string-name>
          ,
          <string-name>
            <given-names>B.</given-names>
            <surname>Heinemann</surname>
          </string-name>
          , U. Schroeder,
          <article-title>Ein Konzept zur Evaluierung eines Ökosystems für die Integration von Learning Analytics in Virtual Reality</article-title>
          , Gesellschaft für Informatik e.V.,
          <year>2023</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref13">
        <mixed-citation>
          [13]
          <string-name>
            <given-names>V.</given-names>
            <surname>Camilleri</surname>
          </string-name>
          , S. de Freitas,
          <string-name>
            <given-names>M.</given-names>
            <surname>Montebello</surname>
          </string-name>
          ,
          <string-name>
            <given-names>P.</given-names>
            <surname>McDonagh-Smith</surname>
          </string-name>
          ,
          <article-title>A case study inside virtual worlds: Use of analytics for immersive spaces</article-title>
          ,
          <source>in: Proceedings of the Third International Conference on Learning Analytics and Knowledge - LAK '13</source>
          , ACM Press, Leuven, Belgium,
          <year>2013</year>
          , p.
          <fpage>230</fpage>
          . doi:10.1145/2460296.2460341.
        </mixed-citation>
      </ref>
      <ref id="ref14">
        <mixed-citation>
          Online resources: OmiLAXR Website; OmiLAXR Docs for Developers; OmiLAXR: xAPI Data Tracking Ecosystem (GitLab); "xAPI 4 Unity" Package; xAPI Definition Fetcher; VR Learning Application (RePiX VR).
        </mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>