<?xml version="1.0" encoding="UTF-8"?>
<TEI xml:space="preserve" xmlns="http://www.tei-c.org/ns/1.0" 
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" 
xsi:schemaLocation="http://www.tei-c.org/ns/1.0 https://raw.githubusercontent.com/kermitt2/grobid/master/grobid-home/schemas/xsd/Grobid.xsd"
 xmlns:xlink="http://www.w3.org/1999/xlink">
	<teiHeader xml:lang="en">
		<fileDesc>
			<titleStmt>
				<title level="a" type="main">A Learning Analytics Dashboard to Investigate the Influence of Interaction in a VR Learning Application</title>
			</titleStmt>
			<publicationStmt>
				<publisher/>
				<availability status="unknown"><licence/></availability>
			</publicationStmt>
			<sourceDesc>
				<biblStruct>
					<analytic>
						<author>
							<persName><forename type="first">Birte</forename><surname>Heinemann</surname></persName>
							<email>heinemann@cs.rwth-aachen.de</email>
							<affiliation key="aff0">
								<orgName type="institution">RWTH Aachen University</orgName>
								<address>
									<addrLine>Ahornstraße 55</addrLine>
									<postCode>52074</postCode>
									<settlement>Aachen</settlement>
									<country key="DE">Germany</country>
								</address>
							</affiliation>
						</author>
						<author>
							<persName><forename type="first">Sergej</forename><surname>Görzen</surname></persName>
							<email>goerzen@cs.rwth-aachen.de</email>
							<affiliation key="aff0">
								<orgName type="institution">RWTH Aachen University</orgName>
								<address>
									<addrLine>Ahornstraße 55</addrLine>
									<postCode>52074</postCode>
									<settlement>Aachen</settlement>
									<country key="DE">Germany</country>
								</address>
							</affiliation>
						</author>
						<author>
							<persName><forename type="first">Ana</forename><surname>Dragoljić</surname></persName>
							<email>ana.dragoljic@rwth-aachen.de</email>
							<affiliation key="aff0">
								<orgName type="institution">RWTH Aachen University</orgName>
								<address>
									<addrLine>Ahornstraße 55</addrLine>
									<postCode>52074</postCode>
									<settlement>Aachen</settlement>
									<country key="DE">Germany</country>
								</address>
							</affiliation>
						</author>
						<author>
							<persName><forename type="first">Lars</forename><forename type="middle">Florian</forename><surname>Meiendresch</surname></persName>
							<email>lars.meiendresch@rwth-aachen.de</email>
							<affiliation key="aff0">
								<orgName type="institution">RWTH Aachen University</orgName>
								<address>
									<addrLine>Ahornstraße 55</addrLine>
									<postCode>52074</postCode>
									<settlement>Aachen</settlement>
									<country key="DE">Germany</country>
								</address>
							</affiliation>
						</author>
						<author>
							<persName><forename type="first">Marc</forename><surname>Troll</surname></persName>
							<email>marc.troll@rwth-aachen.de</email>
							<affiliation key="aff0">
								<orgName type="institution">RWTH Aachen University</orgName>
								<address>
									<addrLine>Ahornstraße 55</addrLine>
									<postCode>52074</postCode>
									<settlement>Aachen</settlement>
									<country key="DE">Germany</country>
								</address>
							</affiliation>
						</author>
						<author>
							<persName><forename type="first">Ulrik</forename><surname>Schroeder</surname></persName>
							<email>schroeder@cs.rwth-aachen.de</email>
							<affiliation key="aff0">
								<orgName type="institution">RWTH Aachen University</orgName>
								<address>
									<addrLine>Ahornstraße 55</addrLine>
									<postCode>52074</postCode>
									<settlement>Aachen</settlement>
									<country key="DE">Germany</country>
								</address>
							</affiliation>
						</author>
						<title level="a" type="main">A Learning Analytics Dashboard to Investigate the Influence of Interaction in a VR Learning Application</title>
					</analytic>
					<monogr>
						<idno type="ISSN">1613-0073</idno>
					</monogr>
					<idno type="MD5">97B767365CBC36C3C2D656D9AD5D6685</idno>
				</biblStruct>
			</sourceDesc>
		</fileDesc>
		<encodingDesc>
			<appInfo>
				<application version="0.7.2" ident="GROBID" when="2025-04-23T18:52+0000">
					<desc>GROBID - A machine learning software for extracting information from scholarly documents</desc>
					<ref target="https://github.com/kermitt2/grobid"/>
				</application>
			</appInfo>
		</encodingDesc>
		<profileDesc>
			<textClass>
				<keywords>
					<term>Virtual Reality</term>
					<term>Learning Analytics</term>
					<term>Dashboard</term>
					<term>Multi-modal Learning Analytics</term>
					<term>Rendering</term>
				</keywords>
			</textClass>
			<abstract>
<div xmlns="http://www.tei-c.org/ns/1.0"><p>Learning in Virtual Reality offers various ways to make the learning process interactive, but the implementation of such features is complex, time-consuming and expensive. In order to evaluate the efficiency of interactive tasks, a Learning Analytics dashboard, presented in this paper, was created for both teachers/educators and content creators. The dashboard presents data from a study with different interactive/immersive and non-interactive/non-immersive variants of a learning application for the rendering pipeline, a showcase topic from computer graphics. The dashboard has been implemented with transferability in mind by using xAPI as a data format and can thus be easily transferred to other contexts.</p></div>
			</abstract>
		</profileDesc>
	</teiHeader>
	<text xml:lang="en">
		<body>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="1.">Introduction and Background</head><p>Joint Proceedings of LAK 2024 Workshops, co-located with the 14th International Conference on Learning Analytics and Knowledge (LAK 2024), Kyoto, Japan, <ref type="bibr">March 18-22, 2024.</ref></p><p>In recent years, the integration of Virtual Reality (VR) into educational settings has gained more and more interest <ref type="bibr" target="#b0">[1]</ref>. This surge in popularity is not without reason: on the one hand, the technology is becoming more affordable; on the other hand, the immersive and interactive nature of VR provides a unique learning experience. As educators and content creators explore the growing possibilities of VR in education, the need for assessment, design guidelines, best practices, and effective tools to assess and optimize these experiences becomes increasingly important; for example, Ansone et al. discuss the need for a pedagogic framework for usability in VR <ref type="bibr" target="#b1">[2]</ref>.</p><p>With Learning Analytics, we can go beyond the goal of providing learners with feedback on their performance using superficial descriptive analytics, a gap identified by Susnjak et al. <ref type="bibr" target="#b2">[3]</ref>. Rather, it is possible to give teachers and content creators interesting insights into the behaviour of learners by visualizing learning traces in a Learning Analytics dashboard (LAD).
This way, we will enable the creation of content that can be adapted to the actual needs of learners and teachers, create actionable insights for educators, and gain fundamental knowledge about learning (behaviour) in VR environments, which should also be generalizable to other use cases.</p><p>The assumption that interactive tasks enhance learning outcomes is not new (see <ref type="bibr" target="#b3">[4]</ref> and <ref type="bibr" target="#b4">[5]</ref>), but the design of such learning environments is still under investigation. Interactive learning is the basis for many VR educational applications, e.g. to develop spatial abilities <ref type="bibr" target="#b5">[6]</ref>. However, a gap persists in the uncertainty surrounding which interactive tasks are most effective in conveying educational content. Addressing this question, we introduce a LAD designed explicitly for educators and content creators (people who create educational material without teaching it directly). This dashboard serves as a tool for investigating the impact of interactivity in VR learning applications.</p><p>While the primary focus is on understanding the effects of interactivity, the learning environment, the collected data, and the study also allow a more nuanced exploration: data was collected in both an immersive VR and a desktop VR version, without the goal of achieving a media comparison; see <ref type="bibr" target="#b6">[7]</ref>. The comparative analysis within the LAD facilitates a comprehensive examination of the two variants, revealing usability issues as well as the differences and similarities between immersive and desktop VR. The dashboard allows for the testing of interactive scenarios of varying complexity, as the application is engineered to be modular.</p><p>In essence, this paper presents a LAD to investigate the usage of VR in education. Primary stakeholders are teachers and content creators who are interested in the relationship between interactivity and learning outcomes.
The insights gained from this study contribute not only to the refinement of existing VR educational practices but also pave the way for innovative, informed, and economical advancements in future applications.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="2.">Related Work</head><p>Related work can be examined from different angles, primarily from the perspective of Learning Analytics in VR, but also from the perspectives of interactive learning and dashboard design. Related work in the field of computer graphics education is summarized in Heinemann et al. <ref type="bibr" target="#b7">[8]</ref>. To summarize the preliminary work, very little use has been made of VR as a technology in teaching, but many approaches in computer graphics are interactive and practical.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head>Multimodal Learning Analytics and VR</head><p>Learning Analytics in Virtual Reality can be based on various data sources. The technology itself can be used to collect information beyond "simple log data" (often referred to as clickstreams in traditional learning environments), for example, movement data. This means that VR learning applications can, in contrast to classic setups, be evaluated without external sensors using multimodal Learning Analytics (MMLA); see <ref type="bibr" target="#b8">[9]</ref>. Oucaichi et al. state that VR is an ideal space to capture learners' behaviour and body movement, but they question whether VR is an ideal learning space <ref type="bibr" target="#b9">[10]</ref>. Besides this, the general analysis (as well as the collection, pre-processing, annotation and interpretation) of multimodal data remains a challenge <ref type="bibr" target="#b10">[11]</ref>. According to Oucaichi et al. <ref type="bibr" target="#b9">[10]</ref>, VR is one of the emerging technologies that is still only rarely considered in MMLA research. This work provides approaches to address the gap regarding the success of VR as a learning medium <ref type="bibr" target="#b9">[10]</ref>.</p><p>Learning Analytics Dashboards There are many studies on dashboards in general, as they are an important tool for involving people in the decision-making process for learning activity and data analysis <ref type="bibr" target="#b11">[12]</ref>. Initially, visualizations and predictions were the focus of research. Currently, one research focus is to extend dashboards to include processes that involve stakeholders and, for example, multimodal data <ref type="bibr" target="#b11">[12]</ref>. In addition to current research topics, gaps identified in research on LADs include, for example, the lack of integration of JEDI concepts (justice, equity, diversity, and inclusion) <ref type="bibr" target="#b12">[13]</ref>.
Including JEDI concepts comes with its own challenges and opportunities; here, we will mainly address the theme of software development resources.</p><p>Learning Analytics Data Format One of the possible data formats specialized for tracking learner activities is the eXperience API (xAPI) specification. The reason for using this specification is that we aim to leverage xAPI across our work in various forms and flavours of Learning Analytics. In xAPI statements, user activities are articulated using the syntax: an actor performs a specific verb on an object/activity. The incorporation of xAPI extensions helps to provide additional information about the learning scenario (termed context extension), details regarding the target activity or object (activity extension), or additional information concerning task progress (result extensions) <ref type="bibr">[14]</ref>. However, xAPI leaves a degree of freedom in how it is used, and isolated solutions are often created. To achieve a more consistent usage standard, it is possible to use xAPI profiles and registries. Ehlenz et al. explore the perspectives afforded by existing and former xAPI registries, especially in an academic context <ref type="bibr" target="#b13">[15]</ref>. Possible solutions like an xAPI Registry are already implemented; see <ref type="bibr" target="#b14">[16]</ref>. The xAPI Registry was institutionalized to provide a profound basis for the scientific application of xAPI across various disciplines. Developers and researchers have the opportunity to suggest modifications, participate in the maintenance, or add new definitions through either GitLab or the web interface. The registry is implemented in a human-friendly way for interdisciplinary usage, but also provides straightforward machine-readable interfaces.
Furthermore, each unique identifier (IRI in xAPI lingo) for definitions is also a valid pointer to a readable version of all stored metadata for both machines and humans, making the data itself even more accessible and, hopefully, self-explanatory.</p></div>
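To illustrate the actor-verb-object syntax and the extension mechanism described above, the following is a minimal sketch of an xAPI statement as a Python dictionary. All IRIs, the verb, and the extension keys are hypothetical placeholders for illustration only, not definitions taken from the actual xAPI Registry.

```python
# Hypothetical xAPI statement following the actor-verb-object pattern.
# Every IRI below is an illustrative placeholder.
statement = {
    "actor": {
        "objectType": "Agent",
        "account": {"homePage": "https://example.org/lms", "name": "student-042"},
    },
    "verb": {
        "id": "https://example.org/xapi/verbs/grabbed",
        "display": {"en-US": "grabbed"},
    },
    "object": {
        "id": "https://example.org/xapi/activities/texturing-spray-can",
        "definition": {"name": {"en-US": "Virtual spray can"}},
    },
    "context": {
        "extensions": {
            # context extension: information about the learning scenario
            "https://example.org/xapi/ext/stage": "texturing",
        }
    },
    "result": {
        "extensions": {
            # result extension: information about task progress
            "https://example.org/xapi/ext/score": 0.8,
        }
    },
    "timestamp": "2022-10-12T10:15:30Z",
}


def is_wellformed(stmt):
    """Check that the minimal actor/verb/object triple is present."""
    return all(key in stmt for key in ("actor", "verb", "object"))
```

A registry (as discussed above) would pin down which verb and extension IRIs are allowed, so that statements from different applications stay comparable.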
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="3.">Research Prototype: RePiX VR</head><p>The research prototype used for this study is the modular application Rendering Pipeline eXperience in Virtual Reality (RePiX VR) <ref type="bibr" target="#b7">[8]</ref>, demonstration link (YouTube). RePiX VR is an interactive, immersive learning scenario which focuses on teaching the nine core stages (explained in <ref type="bibr" target="#b15">[17]</ref>) of the rendering pipeline, the process of converting a 3D environment into a 2D image. It can be used either in a VR environment, using a head-mounted display, or on a computer (desktop VR).</p><p>One goal of the application is to address challenges for teachers, such as the didactic reduction for different target groups, the difficult visualization of the procedural basic concepts - especially for individual steps of the pipeline - and the balance between the details of the individual steps of the pipeline and the overview of the overall process. Especially at an abstract level, it is difficult to involve students and provide interactions that actively promote learning.</p><p>The learning objectives of the simulation are located at different taxonomy levels and thus enable both a gentle learning curve for beginners and an in-depth examination for advanced learners (revision of Bloom's taxonomy <ref type="bibr" target="#b16">[18]</ref>). The target dimensions of the application include typical declarative factual knowledge, learning the terminology, and acquiring conceptual and procedural knowledge at various taxonomy levels. Examples of the range of learning objectives span from "Learners name the results of the individual steps of the pipeline" to "Learners can compare different lighting algorithms on a visual level and make a reasoned decision".
In order to address learners with heterogeneous knowledge as intuitively as possible, the application was designed as a guided tour in which a robot serves as a contact person and teacher.</p><p>To achieve the goal of gathering data for our learning applications, we created the framework OmiLAXR (Open and modular integration of Learning Analytics in XR) <ref type="bibr" target="#b17">[19]</ref>. This framework was implemented in several development cycles and was based on requirements derived from experiences gained from various applications and use cases. For the tracking, we chose to use the xAPI specification and an xAPI Registry; see <ref type="bibr" target="#b14">[16]</ref>.</p><p>The OmiLAXR framework focuses on supporting an easy integration of xAPI in XR applications (in Unity). We created components and concepts for enabling the automated gathering of VR environment activities by a set of tracking systems, e.g. interacting with an object, eye tracking data, movement, and head gestures (nodding and shaking). By integrating OmiLAXR into our research object RePiX VR, we achieved a rich collection of user behaviour data for our study (explained next).</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="4.">Study</head><p>In October 2022, the RePiX VR project was tested with 128 (VR) + 159 (desktop) students who were participating in the computer graphics lecture at RWTH Aachen University, resulting in 120 complete data sets for the immersive VR variant and 56 complete data sets for the desktop variant. The high number of incomplete data sets in the desktop version is due to the voluntary completion of questionnaires and duplicate registrations from the circle of VR users (to study for the exam). To gain some insights into knowledge acquisition, each participant had to answer surveys directly before and after the learning experience (pre- and post-test design). The content-related questions were the same in both surveys. Still, the second survey also included the UXIVE questionnaire (<ref type="bibr" target="#b18">[20]</ref>) with questions related to 10 concepts of VR experiences, like usability, flow, and presence.</p><p>To gain some insights into the effects of interactivity while learning, the application, texturing, lighting, and rasterization stages each had two versions, an interactive and a non-interactive one; see Table <ref type="table">1</ref>. To create a controlled environment, six different application modes were created, where each mode had different interactive stages. Since the interactive versions of each</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head>Variant Application Texturing Lighting Rasterization</head><formula xml:id="formula_0">A B C D E F</formula></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head>Table 1</head><p>Overview of all six different interaction modes/groups. The check mark indicates the interactive variant of a stage.</p><p>stage are more time-consuming than the non-interactive ones, each participant was shown two interactive and two non-interactive stages.</p><p>The non-interactive version of the application stage shows an animation of how each variable of a matrix changes the appearance of an object. The interactive version builds on the non-interactive version: after showing the animation, the learner should use the gained knowledge about matrices to move a sun, earth, and moon as they would move in a solar system.</p><p>The non-interactive texturing stage describes how a texture is stored in a 2D texture map and then applied to a 3D object. In the interactive version of the texturing stage, the learner can use a virtual spray can, which allows them to spray different colours on the 3D object or the 2D texture map. If the participant colours the 3D object, the tint is applied to the texture, and vice versa.</p><p>The non-interactive version of the lighting stage gives a short simulation of what happens in a scenario without lighting by turning off the light in the whole learning scenario. The interactive version of the lighting stage shows an object and gives the learner flashlights. Through the flashlights, the learner can light the object from different angles with different light sources. The learner can also switch between and compare the effects of different shading models (flat shading, Gouraud shading, and Phong shading).</p><p>The non-interactive rasterization stage explains how a colour for each screen pixel is calculated from the 3D scene so that a monitor can display it. In the interactive version of the rasterization stage, the learner can create triangles and modify the pixel density of a screen.
Through those changes, the learner can experience how the created triangles appear on a monitor.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="5.">Results and Dashboard</head><p>This section will explain the LAD and present insights from the study.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head>Research Questions</head><p>The dashboard was created to give educators and content creators insights into two research questions: (1) In which aspects do the learning experiences differ when using immersive Virtual Reality instead of 2D desktop applications? and (2) In which aspects do the learning experiences differ for interactive scenarios instead of behaviouristic informative texts or animations?</p><p>Indicators To create a dashboard that could answer the given research questions, the xAPI statements were filtered and pre-processed to create indicators. For both research questions, the indicators time spent per stage, quantity of specific interactions, task score, and survey score were used. For the difference between the learning experiences in VR compared to the desktop application, further indicators of immersion and health evaluation from the UXIVE questionnaire were of interest.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head>Data Analysis</head><p>The data pre-processing was done using Jupyter Notebook<ref type="foot" target="#foot_0">1</ref> and the Python library NumPy 2 .</p><p>The xAPI statements that were collected during the RePiX VR experience were exported into a JSON file. The xAPI statements contain an extension with the timestamp, which enables the calculation of the difference between the timestamp when a stage started and when it finished. For each interaction, an xAPI statement was sent. Therefore, to determine how often each interaction was performed, an indicator was created that counts how often each verb of the xAPI statements was sent. For each interactive task in the RePiX VR application that was solved correctly, the learner gained a score. The score is higher the faster a task is solved. The final task score is the average of all scores gained during the learning experience.</p><p>The survey data was exported from LimeSurvey 3 in CSV format. Data from the UXIVE questionnaire was used for the survey score, health evaluation, and immersion evaluation. The survey consists of questions that are either positively or negatively connoted. Furthermore, the survey score was calculated by taking the difference of the average points for all positively and negatively connoted questions. The health and immersion evaluations were done by calculating the score for each question related to the topic of health or immersion, respectively.</p></div>
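The indicator computations described above (time per stage from timestamp differences, interaction frequency from verb counts, and the survey score as the difference of the positive and negative item means) can be sketched as follows. The flattened statement layout and the field names are assumptions made for illustration; the authors' actual JSON export and NumPy-based notebook are richer than this.

```python
from collections import Counter
from datetime import datetime

# Toy stand-in for the exported xAPI statements; the real export
# nests verbs and timestamps inside full xAPI structures.
statements = [
    {"verb": "started",  "stage": "lighting", "timestamp": "2022-10-12T10:00:00"},
    {"verb": "grabbed",  "stage": "lighting", "timestamp": "2022-10-12T10:01:10"},
    {"verb": "grabbed",  "stage": "lighting", "timestamp": "2022-10-12T10:02:05"},
    {"verb": "finished", "stage": "lighting", "timestamp": "2022-10-12T10:04:30"},
]


def time_per_stage(stmts):
    """Seconds between a stage's 'started' and 'finished' timestamps."""
    marks = {}
    for s in stmts:
        ts = datetime.fromisoformat(s["timestamp"])
        marks.setdefault(s["stage"], {})[s["verb"]] = ts
    return {
        stage: (v["finished"] - v["started"]).total_seconds()
        for stage, v in marks.items()
        if "started" in v and "finished" in v
    }


def verb_counts(stmts):
    """How often each verb was sent (interaction-frequency indicator)."""
    return Counter(s["verb"] for s in stmts)


def survey_score(positive, negative):
    """Mean of positively connoted items minus mean of negative ones."""
    return sum(positive) / len(positive) - sum(negative) / len(negative)
```

For example, `time_per_stage(statements)` yields 270 seconds for the lighting stage of this toy record, and `verb_counts` shows two `grabbed` interactions.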
<div xmlns="http://www.tei-c.org/ns/1.0"><head>Dashboard Implementation</head><p>After the analysis and pre-processing of the data, the LAD was developed (active URL in Appendix A). The pre-processing results were exported into a JSON file, which was then imported into the dashboard. The dashboard provides interactive visualizations and was developed using Vue.js 4 for the frontend and Apexcharts 5 for the visualizations.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head>Dashboard Visualization</head><p>This section presents the finished dashboard. The dashboard consists of three different pages: overview (Figure <ref type="figure" target="#fig_1">1</ref>), position analysis (Figure <ref type="figure" target="#fig_3">3</ref>), and comparison (Figure <ref type="figure" target="#fig_2">2</ref>). Each page presents the research questions on the left side, while the top right corner is used as a menu bar to switch between the different pages. Below the research questions, an explanation of the selected box plots is given. Each box plot only shows the data of the participants between the 5th and 95th percentiles of the data set, with quartiles representing the students from the 30th to the 70th percentile. The visualizations for immersion and health evaluation use only the participants' data between the 40th and 60th percentiles because the data for these two visualizations is widely spread, and the selected data still provides a good approximation of the average participant. Next, each view provides general information about the study. First, it shows which of the stages was interactive for each group (as in Table <ref type="table">1</ref>). Also, a distribution of how many data sets exist for each group is given. Lastly, information about how many participants did not finish the study is given.</p><p>In addition to the information that each page of the dashboard provides, the overview page (Figure <ref type="figure" target="#fig_1">1</ref>) also provides global filters on the right side. These filters are used to change the visualizations interactively.
For example, when deselecting the checkbox of Group A, the collected data of Group A will no longer be considered for the current visualizations.</p><p>During the development process, it was found that this approach makes it difficult to compare the visualizations of different filter combinations, which is the reason why the comparison page was added. The comparison page (Figure <ref type="figure" target="#fig_2">2</ref>) is split into two halves. Each half of the page has its own local filter, which independently configures the visualizations on that half. Therefore, visualizations with different filters are shown directly beside each other, making a direct and adaptive comparison of the data set possible. Using the comparison page makes it possible for educators and content creators to answer the research questions. On the last page, the position analysis was added to analyze the movements inside the application. The goal was to create a heat map indicating the most common positions at a single time step. The overall position distribution is shown on the dashboard in Figure <ref type="figure" target="#fig_3">3</ref>. This visualization gives teachers an impression beyond the interaction data as to whether different perspectives were used to solve the three-dimensional tasks and to inspect certain objects. Position data can also help to distinguish users who moved themselves using the physical VR area (or WASD keyboard navigation) from learners who preferred to teleport.</p></div>
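A position heat map like the one on the position analysis page can be sketched by binning tracked floor positions into a coarse grid and counting visits per cell. The grid size, the coordinate ranges, and the (x, z) tuple format are assumptions for illustration, not the dashboard's actual implementation (which renders its charts with Apexcharts).

```python
# Sketch: aggregate tracked (x, z) floor positions into a bins x bins
# occupancy grid. Coordinates outside the play area are clamped to the
# border cells. All parameters are illustrative assumptions.
def position_heatmap(positions, xmin, xmax, zmin, zmax, bins=4):
    grid = [[0] * bins for _ in range(bins)]
    for x, z in positions:
        # Map each coordinate to a bin index, clamping to the last bin.
        i = min(int((x - xmin) / (xmax - xmin) * bins), bins - 1)
        j = min(int((z - zmin) / (zmax - zmin) * bins), bins - 1)
        grid[j][i] += 1
    return grid
```

The resulting grid of counts is exactly the kind of cell-wise intensity data a heat map chart consumes; frequently visited cells (e.g. in front of an interactive object) show up as hot spots.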
<div xmlns="http://www.tei-c.org/ns/1.0"><head>Results</head><p>To get answers to the given research questions, different filters were set on the implemented dashboard. The first research question, <ref type="bibr" target="#b0">(1)</ref> In which aspects do the learning experiences differ when using Virtual Reality instead of 2D desktop applications?, can be investigated by selecting the VR filter on one side of the comparison page and the desktop filter on the other side. The filter selection showed that RePiX VR was perceived as more immersive, the survey score was higher, and the participants moved more in the VR environment. At the same time, the VR environment resulted in more health issues, a lower task score, and fewer interactions than the desktop environment.</p><p>For the second research question, <ref type="bibr" target="#b1">(2)</ref> In which aspects do the learning experiences differ for interactive scenarios instead of behaviouristic informative texts?, the only quantitative finding was that participants spent more time in interactive than in non-interactive scenarios.</p><p>However, these are only the results that can be taken directly from the quantitative data.</p><p>Educators and content creators can already reflect on the teaching application RePiX VR at a meta-level. This study already provided us with some valuable insights showing usability issues and further steps. When qualitative data is also (manually) integrated, it shows that combining data, like the time needed together with assessment results, is feasible and sensible. Interactive tasks can thereby be used to provide learners with in-app feedback.</p><p>Another publication, which analyzed quantitative data from an earlier study, has already shown that a reliable text-to-speech system for the guiding robot is useful to provide feedback to the learner without having to rely on text boxes <ref type="bibr" target="#b19">[21]</ref>.
This study confirms that people with usability problems take longer and, in the free-text answers, wish for more help, which could be implemented most simply via timers, for example.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="6.">Discussion</head><p>This paper presents a dashboard in closer detail, which can be used to evaluate various aspects of the VR experience. By using it to reflect on the learning application, the dashboard itself is implicitly put to the test. As indicated in the description, the basic structure of the dashboard can also be transferred to learners. Necessary adjustments include anonymization and highlighting one's own performance, as well as a study on the connection between user behaviour and outcome, which can be used to generate feedback. See Van Leeuwen et al. for further differing aspects <ref type="bibr" target="#b20">[22]</ref>.</p><p>Other transfers are also feasible but are beyond the scope of this paper. One of the key educational challenges for usage is the duration of the VR experience, so introducing interactivity at every stage is not the solution. Educators should emphasize selected learning objectives and create a guided tour which fits the curriculum. One option is to split up the VR experience or offer elective excursions for content beyond the core curriculum, e.g. concerning the lighting stage or in combination with other applications like Virtual Ray Tracer 2.0 <ref type="bibr" target="#b21">[23]</ref> or Rayground <ref type="bibr" target="#b22">[24]</ref>.</p><p>Too many interactive tasks risk losing the bigger picture, meaning that the sum of interactive tasks at basal levels of learning goal taxonomies pushes the higher-level goals into the background.
Like reading code line by line without grasping the algorithm itself, learners could lose themselves in individual steps without developing an understanding of the rendering pipeline as a whole - based on the theory of the block model, an educational model for program comprehension, which also argues on different dimensions of knowledge in computer science; see <ref type="bibr" target="#b23">[25]</ref>.</p><p>An open question remains about the effectiveness of the didactical setup (single usage): Can this application be used for learning in groups, engaging in a collaborative reflection process, or is the individual experience sufficient, especially regarding the principles of self-regulated learning <ref type="bibr" target="#b24">[26]</ref>?</p><p>The evaluation and the insights generated by the dashboard should, for now, be considered an individual case study (with no claims of generalizability), as the user of the dashboard has to know both the content being taught and the VR application itself. Therefore, it has only been used by a small number of people so far. Besides, interpreting the presented data is a complex process <ref type="bibr" target="#b10">[11]</ref>, and deriving actions for didactic decisions is a reflective one. Consequently, only actual educators or educational content creators are viable focus groups for evaluating the dashboard.</p><p>In the future, this dashboard could be extended not only by providing new filters and visualizations but also by customization possibilities based on the well-structured xAPI format.</p><p>Another possibility is to transfer it to different use cases while keeping the same interfaces, data collectors, and indicators, thereby fulfilling the criteria for shared software resources, cross-border collaborations, and adoption of open-source software; see <ref type="bibr" target="#b12">[13]</ref>.
This is particularly straightforward thanks to the standardized xAPI data specification and the standardized interfaces provided by OmiLAXR. New filters and additional information could also help integrate JEDI considerations into the dashboard, e.g. by highlighting information certain students have in common <ref type="bibr" target="#b12">[13]</ref>.</p><p>In our case, there could, for example, be a cluster of students who share prior VR experience.</p><p>Furthermore, we found that the application can be a valuable resource for students beyond its accompanying use during the course, e.g. as interactive preparation for the final exams, an insight drawn from the qualitative survey data.</p></div>
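The flexibility attributed to xAPI here comes from its actor–verb–object statement structure, which the specification requires in every statement. A minimal sketch in Python (the learner address, activity IRI, and helper name are illustrative assumptions, not part of RePiX VR or OmiLAXR) of a statement such as might be emitted when a learner interacts with a VR object:

```python
import json


def make_statement(actor_email: str, verb_id: str, verb_name: str, activity_id: str) -> dict:
    """Build a minimal xAPI statement; actor, verb, and object are the required parts."""
    return {
        "actor": {"objectType": "Agent", "mbox": f"mailto:{actor_email}"},
        "verb": {"id": verb_id, "display": {"en-US": verb_name}},
        "object": {"objectType": "Activity", "id": activity_id},
    }


# Hypothetical identifiers; real deployments reuse IRIs from a shared verb registry.
stmt = make_statement(
    "learner42@example.org",
    "http://adlnet.gov/expapi/verbs/interacted",
    "interacted",
    "https://example.org/repixvr/stages/rasterization/cube",
)
print(json.dumps(stmt, indent=2))
```

Because every tracked event shares this shape, new filters or visualizations only need to match on verb or activity IRIs rather than parse application-specific log formats.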
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="7.">Conclusion</head><p>In this paper, we presented a study and a first attempt to create a meaningful dashboard for educators using the RePiX VR application. We tracked user behaviour (object interaction, eye tracking, movement, head gestures). We conducted a pre- and post-test before and after the learning experience to gain more insights into interactivity and learning effects. In the study, we distinguished between different application variants (with different stages, interactive or not). Finally, the PPDAC (Problem, Plan, Data, Analysis and Conclusion) cycle was used to create an interactive LA dashboard; see <ref type="bibr" target="#b25">[27]</ref>. Although the dashboard delivers a good overview of our study, interpreting it requires some expertise in the learning content. As a result, our research prototype has been identified as valuable additional learning material for students of the computer graphics course at our university, and further steps for the development of the application and for didactic decisions were taken. The collected data, in combination with our dashboard, also showed xAPI to be a useful data format for realizing LA dashboards because of its flexibility and expandability.</p></div><figure xmlns="http://www.tei-c.org/ns/1.0" xml:id="fig_0"><head></head><label></label><figDesc>E.g. "Please indicate to what extent you agree with each statement: [I enjoyed the experience so much that I was energetic.]", or negatively connoted, e.g. "Please indicate to what extent you agree with each statement: [During the experience my eyes hurt.]". For each question, the learner should give a score using an 8-point Likert scale.</figDesc></figure>
<figure xmlns="http://www.tei-c.org/ns/1.0" xml:id="fig_1"><head>Figure 1 :</head><label>1</label><figDesc>Figure 1: The overview page of the LAD provides a general overview and allows filtering data.</figDesc><graphic coords="7,89.29,84.19,416.70,200.15" type="bitmap" /></figure>
<figure xmlns="http://www.tei-c.org/ns/1.0" xml:id="fig_2"><head>Figure 2 :</head><label>2</label><figDesc>Figure 2:The comparison page of the LAD provides the same functions as the overview page but shows the visualizations and filters twice. The blue local filter changes the visualization of the blue data analysis block, while the green local filter changes the green data analysis block. This provides the possibility to compare the visualizations for different filter configurations directly.</figDesc><graphic coords="7,89.29,323.76,416.69,200.53" type="bitmap" /></figure>
<figure xmlns="http://www.tei-c.org/ns/1.0" xml:id="fig_3"><head>Figure 3 :</head><label>3</label><figDesc>Figure 3: The position page of the LAD shows the distribution of all participants' positions in the virtual environment of the RePiX VR study.</figDesc><graphic coords="8,318.47,290.24,166.67,136.44" type="bitmap" /></figure>
			<note xmlns="http://www.tei-c.org/ns/1.0" place="foot" n="1" xml:id="foot_0">https://jupyter.org/, accessed: 14.12.2023</note>
		</body>
		<back>
			<div type="annex">
<div xmlns="http://www.tei-c.org/ns/1.0"><head>A. Online Resources</head><p>The application, data and all sources are available via • Dashboard code and online demo instance • VR Learning Application (RePiX VR) code and website</p></div>			</div>
			<div type="references">

				<listBibl>

<biblStruct xml:id="b0">
	<analytic>
		<title level="a" type="main">A systematic review of immersive virtual reality applications for higher education: Design elements, lessons learned, and research agenda</title>
		<author>
			<persName><forename type="first">J</forename><surname>Radianti</surname></persName>
		</author>
		<author>
			<persName><forename type="first">T</forename><forename type="middle">A</forename><surname>Majchrzak</surname></persName>
		</author>
		<author>
			<persName><forename type="first">J</forename><surname>Fromm</surname></persName>
		</author>
		<author>
			<persName><forename type="first">I</forename><surname>Wohlgenannt</surname></persName>
		</author>
		<idno type="DOI">10.1016/j.compedu.2019.103778</idno>
	</analytic>
	<monogr>
		<title level="j">Computers &amp; Education</title>
		<imprint>
			<biblScope unit="volume">147</biblScope>
			<biblScope unit="page">103778</biblScope>
			<date type="published" when="2020">2020</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b1">
	<analytic>
		<title level="a" type="main">Framework of Pedagogic and Usability Principles for Effective Multi-user VR Learning Applications</title>
		<author>
			<persName><forename type="first">A</forename><surname>Ansone</surname></persName>
		</author>
		<author>
			<persName><forename type="first">L</forename><forename type="middle">F</forename><surname>Dreimane</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Z</forename><surname>Zalite-Supe</surname></persName>
		</author>
		<idno type="DOI">10.1007/978-3-031-47328-9_7</idno>
	</analytic>
	<monogr>
		<title level="m">Immersive Learning Research Network, Communications in Computer and Information Science</title>
				<editor>
			<persName><forename type="first">M.-L</forename><surname>Bourguet</surname></persName>
		</editor>
		<editor>
			<persName><forename type="first">J</forename><forename type="middle">M</forename><surname>Krüger</surname></persName>
		</editor>
		<editor>
			<persName><forename type="first">D</forename><surname>Pedrosa</surname></persName>
		</editor>
		<editor>
			<persName><forename type="first">A</forename><surname>Dengel</surname></persName>
		</editor>
		<editor>
			<persName><forename type="first">A</forename><surname>Peña-Rios</surname></persName>
		</editor>
		<editor>
			<persName><forename type="first">J</forename><surname>Richter</surname></persName>
		</editor>
		<meeting><address><addrLine>Nature Switzerland; Cham</addrLine></address></meeting>
		<imprint>
			<publisher>Springer</publisher>
			<date type="published" when="2024">2024</date>
			<biblScope unit="page" from="96" to="110" />
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b2">
	<analytic>
		<title level="a" type="main">Learning analytics dashboard: a tool for providing actionable insights to learners</title>
		<author>
			<persName><forename type="first">T</forename><surname>Susnjak</surname></persName>
		</author>
		<author>
			<persName><forename type="first">G</forename><forename type="middle">S</forename><surname>Ramaswami</surname></persName>
		</author>
		<author>
			<persName><forename type="first">A</forename><surname>Mathrani</surname></persName>
		</author>
		<idno type="DOI">10.1186/s41239-021-00313-7</idno>
	</analytic>
	<monogr>
		<title level="j">International Journal of Educational Technology in Higher Education</title>
		<imprint>
			<biblScope unit="volume">19</biblScope>
			<biblScope unit="page">12</biblScope>
			<date type="published" when="2022">2022</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b3">
	<analytic>
		<title level="a" type="main">A Research Agenda for Interactive Learning in the New Millennium</title>
		<author>
			<persName><forename type="first">T</forename><forename type="middle">C</forename><surname>Reeves</surname></persName>
		</author>
		<ptr target="https://www.learntechlib.org/primary/p/17393/" />
	</analytic>
	<monogr>
		<title level="m">Association for the Advancement of Computing in Education</title>
				<imprint>
			<publisher>AACE</publisher>
			<date type="published" when="1999">1999</date>
			<biblScope unit="page" from="15" to="20" />
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b4">
	<analytic>
		<title level="a" type="main">Interactive Multimodal Learning Environments</title>
		<author>
			<persName><forename type="first">R</forename><surname>Moreno</surname></persName>
		</author>
		<author>
			<persName><forename type="first">R</forename><surname>Mayer</surname></persName>
		</author>
		<idno type="DOI">10.1007/s10648-007-9047-2</idno>
	</analytic>
	<monogr>
		<title level="j">Educational Psychology Review</title>
		<imprint>
			<biblScope unit="volume">19</biblScope>
			<biblScope unit="page" from="309" to="326" />
			<date type="published" when="2007">2007</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b5">
	<analytic>
		<title level="a" type="main">Systematic review of spatial abilities and virtual reality: The role of interaction</title>
		<author>
			<persName><forename type="first">M</forename><surname>Gittinger</surname></persName>
		</author>
		<author>
			<persName><forename type="first">D</forename><surname>Wiesche</surname></persName>
		</author>
		<idno type="DOI">10.1002/jee.20568</idno>
	</analytic>
	<monogr>
		<title level="j">Journal of Engineering Education</title>
		<imprint>
			<date type="published" when="2023">2023</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b6">
	<analytic>
		<title level="a" type="main">Media comparison studies dominate comparative research on augmented reality in education</title>
		<author>
			<persName><forename type="first">J</forename><surname>Buchner</surname></persName>
		</author>
		<author>
			<persName><forename type="first">M</forename><surname>Kerres</surname></persName>
		</author>
		<idno type="DOI">10.1016/j.compedu.2022.104711</idno>
	</analytic>
	<monogr>
		<title level="j">Computers &amp; Education</title>
		<imprint>
			<biblScope unit="volume">195</biblScope>
			<biblScope unit="page">104711</biblScope>
			<date type="published" when="2023">2023</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b7">
	<analytic>
		<title level="a" type="main">Teaching the basics of computer graphics in virtual reality</title>
		<author>
			<persName><forename type="first">B</forename><surname>Heinemann</surname></persName>
		</author>
		<author>
			<persName><forename type="first">S</forename><surname>Görzen</surname></persName>
		</author>
		<author>
			<persName><forename type="first">U</forename><surname>Schroeder</surname></persName>
		</author>
		<idno type="DOI">10.1016/j.cag.2023.03.001</idno>
	</analytic>
	<monogr>
		<title level="j">Computers &amp; Graphics</title>
		<imprint>
			<biblScope unit="volume">112</biblScope>
			<biblScope unit="page" from="1" to="12" />
			<date type="published" when="2023">2023</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b8">
	<analytic>
		<title level="a" type="main">Multimodal Learning Analytics and Education Data Mining: Using Computational Technologies to Measure Complex Learning Tasks</title>
		<author>
			<persName><forename type="first">P</forename><surname>Blikstein</surname></persName>
		</author>
		<author>
			<persName><forename type="first">M</forename><surname>Worsley</surname></persName>
		</author>
		<idno type="DOI">10.18608/jla.2016.32.11</idno>
	</analytic>
	<monogr>
		<title level="j">Journal of Learning Analytics</title>
		<imprint>
			<biblScope unit="volume">3</biblScope>
			<biblScope unit="page" from="220" to="238" />
			<date type="published" when="2016">2016</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b9">
	<analytic>
		<title level="a" type="main">Research trends in multimodal learning analytics: A systematic mapping study</title>
		<author>
			<persName><forename type="first">H</forename><surname>Ouhaichi</surname></persName>
		</author>
		<author>
			<persName><forename type="first">D</forename><surname>Spikol</surname></persName>
		</author>
		<author>
			<persName><forename type="first">B</forename><surname>Vogel</surname></persName>
		</author>
		<idno type="DOI">10.1016/j.caeai.2023.100136</idno>
	</analytic>
	<monogr>
		<title level="j">Computers and Education: Artificial Intelligence</title>
		<imprint>
			<biblScope unit="volume">4</biblScope>
			<biblScope unit="page">100136</biblScope>
			<date type="published" when="2023">2023</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b10">
	<analytic>
		<title level="a" type="main">Multimodal learning analytics -rationale, process, examples, and direction</title>
		<author>
			<persName><forename type="first">X</forename><surname>Ochoa</surname></persName>
		</author>
		<idno type="DOI">10.18608/hla22.006</idno>
	</analytic>
	<monogr>
		<title level="m">The handbook of learning analytics</title>
				<editor>
			<persName><forename type="first">C</forename><surname>Lang</surname></persName>
		</editor>
		<editor>
			<persName><forename type="first">G</forename><surname>Siemens</surname></persName>
		</editor>
		<editor>
			<persName><forename type="first">A</forename><forename type="middle">F</forename><surname>Wise</surname></persName>
		</editor>
		<editor>
			<persName><forename type="first">D</forename><surname>Gašević</surname></persName>
		</editor>
		<editor>
			<persName><forename type="first">A</forename><surname>Merceron</surname></persName>
		</editor>
		<meeting><address><addrLine>SoLAR, Vancouver, Canada</addrLine></address></meeting>
		<imprint>
			<date type="published" when="2022">2022</date>
			<biblScope unit="page" from="54" to="65" />
		</imprint>
	</monogr>
	<note>2 ed</note>
</biblStruct>

<biblStruct xml:id="b11">
	<analytic>
		<title level="a" type="main">Learning analytics dashboards: the past, the present and the future</title>
		<author>
			<persName><forename type="first">K</forename><surname>Verbert</surname></persName>
		</author>
		<author>
			<persName><forename type="first">X</forename><surname>Ochoa</surname></persName>
		</author>
		<author>
			<persName><forename type="first">R</forename><surname>De Croon</surname></persName>
		</author>
		<author>
			<persName><forename type="first">R</forename><forename type="middle">A</forename><surname>Dourado</surname></persName>
		</author>
		<author>
			<persName><forename type="first">T</forename><forename type="middle">De</forename><surname>Laet</surname></persName>
		</author>
		<idno type="DOI">10.1145/3375462.3375504</idno>
	</analytic>
	<monogr>
		<title level="m">Proceedings of the Tenth International Conference on Learning Analytics &amp; Knowledge, LAK &apos;20</title>
				<meeting>the Tenth International Conference on Learning Analytics &amp; Knowledge, LAK &apos;20<address><addrLine>New York, NY, USA</addrLine></address></meeting>
		<imprint>
			<publisher>Association for Computing Machinery</publisher>
			<date type="published" when="2020">2020</date>
			<biblScope unit="page" from="35" to="40" />
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b12">
	<analytic>
		<title level="a" type="main">A Review of Learning Analytics Dashboard Research in Higher Education: Implications for Justice, Equity, Diversity, and Inclusion</title>
		<author>
			<persName><forename type="first">K</forename><surname>Williamson</surname></persName>
		</author>
		<author>
			<persName><forename type="first">R</forename><surname>Kizilcec</surname></persName>
		</author>
		<idno type="DOI">10.1145/3506860.3506900</idno>
	</analytic>
	<monogr>
		<title level="m">LAK22: 12th International Learning Analytics and Knowledge Conference</title>
				<meeting><address><addrLine>New York, NY, USA</addrLine></address></meeting>
		<imprint>
			<publisher>Association for Computing Machinery</publisher>
			<date type="published" when="2022">2022</date>
			<biblScope unit="page" from="260" to="270" />
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b13">
	<analytic>
		<author>
			<persName><forename type="first">M</forename><surname>Ehlenz</surname></persName>
		</author>
		<author>
			<persName><forename type="first">B</forename><surname>Heinemann</surname></persName>
		</author>
		<author>
			<persName><forename type="first">T</forename><surname>Leonhardt</surname></persName>
		</author>
		<author>
			<persName><forename type="first">R</forename><surname>Röpke</surname></persName>
		</author>
		<author>
			<persName><forename type="first">V</forename><surname>Lukarov</surname></persName>
		</author>
		<author>
			<persName><forename type="first">U</forename><surname>Schroeder</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="m">Eine forschungspraktische Perspektive auf xAPI-Registries</title>
		<title level="s">Fachtagung Bildungstechnologien der Gesellschaft für Informatik e.V.</title>
		<meeting><address><addrLine>Bonn</addrLine></address></meeting>
		<imprint>
			<date type="published" when="2020">2020</date>
			<biblScope unit="page">6</biblScope>
		</imprint>
		<respStmt>
			<orgName>Gesellschaft für Informatik e.V.</orgName>
		</respStmt>
	</monogr>
	<note>DELFI 2020 -Die 18</note>
</biblStruct>

<biblStruct xml:id="b14">
	<analytic>
		<title level="a" type="main">xAPI Made Easy: A Learning Analytics Infrastructure for Interdisciplinary Projects</title>
		<author>
			<persName><forename type="first">B</forename><surname>Heinemann</surname></persName>
		</author>
		<author>
			<persName><forename type="first">M</forename><surname>Ehlenz</surname></persName>
		</author>
		<author>
			<persName><forename type="first">S</forename><surname>Görzen</surname></persName>
		</author>
		<author>
			<persName><forename type="first">U</forename><surname>Schroeder</surname></persName>
		</author>
		<idno type="DOI">10.3991/ijoe.v18i14.35079</idno>
	</analytic>
	<monogr>
		<title level="j">International Journal of Online and Biomedical Engineering (iJOE)</title>
		<imprint>
			<biblScope unit="volume">18</biblScope>
			<biblScope unit="page" from="99" to="113" />
			<date type="published" when="2022">2022</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b15">
	<analytic>
		<title level="a" type="main">RePiX VR -Learning environment for the Rendering Pipeline in Virtual Reality</title>
		<author>
			<persName><forename type="first">B</forename><surname>Heinemann</surname></persName>
		</author>
		<author>
			<persName><forename type="first">S</forename><surname>Görzen</surname></persName>
		</author>
		<author>
			<persName><forename type="first">U</forename><surname>Schroeder</surname></persName>
		</author>
		<idno type="DOI">10.2312/eged.20221040</idno>
	</analytic>
	<monogr>
		<title level="m">Eurographics 2022 -Education Papers</title>
				<editor>
			<persName><forename type="first">J.-J</forename><surname>Bourdin</surname></persName>
		</editor>
		<editor>
			<persName><forename type="first">E</forename><surname>Paquette</surname></persName>
		</editor>
		<meeting><address><addrLine>Reims, France</addrLine></address></meeting>
		<imprint>
			<date type="published" when="2022">2022</date>
			<biblScope unit="page">8</biblScope>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b16">
	<analytic>
		<title level="a" type="main">A Revision of Bloom&apos;s Taxonomy: An Overview</title>
		<author>
			<persName><forename type="first">D</forename><forename type="middle">R</forename><surname>Krathwohl</surname></persName>
		</author>
		<idno type="DOI">10.1207/s15430421tip4104_2</idno>
	</analytic>
	<monogr>
		<title level="j">Theory Into Practice</title>
		<imprint>
			<biblScope unit="volume">41</biblScope>
			<biblScope unit="page" from="212" to="218" />
			<date type="published" when="2002">2002</date>
			<publisher>Routledge</publisher>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b17">
	<analytic>
		<title level="a" type="main">Towards using the xAPI specification for Learning Analytics in VR</title>
		<author>
			<persName><forename type="first">S</forename><surname>Görzen</surname></persName>
		</author>
		<author>
			<persName><forename type="first">B</forename><surname>Heinemann</surname></persName>
		</author>
		<author>
			<persName><forename type="first">U</forename><surname>Schroeder</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="m">Learning Analytics for Virtual Reality (LAVR) Workshop of the 14th International Conference on Learning Analytics and Knowledge (LAK24)</title>
				<meeting><address><addrLine>Kyoto, Japan</addrLine></address></meeting>
		<imprint>
			<date type="published" when="2024">2024</date>
			<biblScope unit="page">9</biblScope>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b18">
	<analytic>
		<title level="a" type="main">Towards a Model of User Experience in Immersive Virtual Environments</title>
		<author>
			<persName><forename type="first">K</forename><surname>Tcha-Tokey</surname></persName>
		</author>
		<author>
			<persName><forename type="first">O</forename><surname>Christmann</surname></persName>
		</author>
		<author>
			<persName><forename type="first">E</forename><surname>Loup-Escande</surname></persName>
		</author>
		<author>
			<persName><forename type="first">G</forename><surname>Loup</surname></persName>
		</author>
		<author>
			<persName><forename type="first">S</forename><surname>Richir</surname></persName>
		</author>
		<idno type="DOI">10.1155/2018/7827286</idno>
	</analytic>
	<monogr>
		<title level="j">Advances in Human-Computer Interaction</title>
		<imprint>
			<biblScope unit="page">e7827286</biblScope>
			<date type="published" when="2018">2018. 2018</date>
			<publisher>Hindawi</publisher>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b19">
	<analytic>
		<title level="a" type="main">Evaluating Usability and User Feedback in an Immersive Virtual Reality Environment for Computer Science Education</title>
		<author>
			<persName><forename type="first">B</forename><surname>Heinemann</surname></persName>
		</author>
		<author>
			<persName><forename type="first">U</forename><surname>Schroeder</surname></persName>
		</author>
		<idno type="DOI">10.1007/978-3-031-42682-7_67</idno>
	</analytic>
	<monogr>
		<title level="m">Responsive and Sustainable Educational Futures</title>
		<title level="s">Lecture Notes in Computer Science</title>
		<editor>
			<persName><forename type="first">O</forename><surname>Viberg</surname></persName>
		</editor>
		<editor>
			<persName><forename type="first">I</forename><surname>Jivet</surname></persName>
		</editor>
		<editor>
			<persName><forename type="first">P.-J</forename><surname>Muñoz-Merino</surname></persName>
		</editor>
		<editor>
			<persName><forename type="first">M</forename><surname>Perifanou</surname></persName>
		</editor>
		<editor>
			<persName><forename type="first">T</forename><surname>Papathoma</surname></persName>
		</editor>
		<meeting><address><addrLine>Nature Switzerland; Cham</addrLine></address></meeting>
		<imprint>
			<publisher>Springer</publisher>
			<date type="published" when="2023">2023</date>
			<biblScope unit="page" from="718" to="724" />
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b20">
	<analytic>
		<title level="a" type="main">Teacher and Student Facing Learning Analytics</title>
		<author>
			<persName><forename type="first">A</forename><surname>Van Leeuwen</surname></persName>
		</author>
		<author>
			<persName><forename type="first">S</forename><forename type="middle">D</forename><surname>Teasley</surname></persName>
		</author>
		<author>
			<persName><forename type="first">A</forename><forename type="middle">F</forename><surname>Wise</surname></persName>
		</author>
		<idno type="DOI">10.18608/hla22.013</idno>
	</analytic>
	<monogr>
		<title level="m">The Handbook of Learning Analytics</title>
				<editor>
			<persName><forename type="first">C</forename><surname>Lang</surname></persName>
		</editor>
		<editor>
			<persName><forename type="first">G</forename><surname>Siemens</surname></persName>
		</editor>
		<editor>
			<persName><forename type="first">A</forename><forename type="middle">F</forename><surname>Wise</surname></persName>
		</editor>
		<meeting><address><addrLine>SOLAR</addrLine></address></meeting>
		<imprint>
			<date type="published" when="2022">2022</date>
			<biblScope unit="page" from="130" to="140" />
		</imprint>
	</monogr>
	<note>2 ed</note>
</biblStruct>

<biblStruct xml:id="b21">
	<analytic>
		<title level="a" type="main">Virtual Ray Tracer 2.0</title>
		<author>
			<persName><forename type="first">C</forename><forename type="middle">S</forename><surname>Van Wezel</surname></persName>
		</author>
		<author>
			<persName><forename type="first">W</forename><forename type="middle">A</forename><surname>Verschoore De La Houssaije</surname></persName>
		</author>
		<author>
			<persName><forename type="first">S</forename><surname>Frey</surname></persName>
		</author>
		<author>
			<persName><forename type="first">J</forename><surname>Kosinka</surname></persName>
		</author>
		<idno type="DOI">10.1016/j.cag.2023.01.005</idno>
	</analytic>
	<monogr>
		<title level="j">Computers &amp; Graphics</title>
		<imprint>
			<biblScope unit="volume">111</biblScope>
			<biblScope unit="page" from="89" to="102" />
			<date type="published" when="2023">2023</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b22">
	<analytic>
		<title level="a" type="main">Remote Teaching Advanced Rendering Topics Using the Rayground Platform</title>
		<author>
			<persName><forename type="first">A</forename><forename type="middle">A</forename><surname>Vasilakis</surname></persName>
		</author>
		<author>
			<persName><forename type="first">G</forename><surname>Papaioannou</surname></persName>
		</author>
		<author>
			<persName><forename type="first">N</forename><surname>Vitsas</surname></persName>
		</author>
		<author>
			<persName><forename type="first">A</forename><surname>Gkaravelis</surname></persName>
		</author>
		<idno>doi:10/gpq9pj</idno>
	</analytic>
	<monogr>
		<title level="j">IEEE Computer Graphics and Applications</title>
		<imprint>
			<biblScope unit="volume">41</biblScope>
			<biblScope unit="page" from="99" to="103" />
			<date type="published" when="2021">2021</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b23">
	<analytic>
		<title level="a" type="main">Block model: An educational model of program comprehension as a tool for a scholarly approach to teaching</title>
		<author>
			<persName><forename type="first">C</forename><surname>Schulte</surname></persName>
		</author>
		<idno type="DOI">10.1145/1404520.1404535</idno>
	</analytic>
	<monogr>
		<title level="m">Proceedings of the Fourth International Workshop on Computing Education Research</title>
				<meeting>the Fourth International Workshop on Computing Education Research<address><addrLine>New York, NY, USA</addrLine></address></meeting>
		<imprint>
			<publisher>ACM</publisher>
			<date type="published" when="2008">2008</date>
			<biblScope unit="page" from="149" to="160" />
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b24">
	<analytic>
		<title level="a" type="main">Learning Analytics for Self-Regulated Learning</title>
		<author>
			<persName><forename type="first">P</forename><surname>Winne</surname></persName>
		</author>
		<ptr target="http://solaresearch.org/hla-17/hla17-chapter1" />
	</analytic>
	<monogr>
		<title level="m">Society for Learning Analytics Research (SoLAR)</title>
				<editor>
			<persName><forename type="first">C</forename><surname>Lang</surname></persName>
		</editor>
		<editor>
			<persName><forename type="first">G</forename><surname>Siemens</surname></persName>
		</editor>
		<editor>
			<persName><forename type="first">A</forename><forename type="middle">F</forename><surname>Wise</surname></persName>
		</editor>
		<editor>
			<persName><forename type="first">D</forename><surname>Gašević</surname></persName>
		</editor>
		<meeting><address><addrLine>Alberta, Canada</addrLine></address></meeting>
		<imprint>
			<date type="published" when="2017">2017</date>
			<biblScope unit="page" from="241" to="249" />
		</imprint>
	</monogr>
	<note>The Handbook of Learning Analytics</note>
</biblStruct>

<biblStruct xml:id="b25">
	<analytic>
		<title level="a" type="main">Creating an Understanding of Data Literacy for a Data-driven Society</title>
		<author>
			<persName><forename type="first">A</forename><surname>Wolff</surname></persName>
		</author>
		<author>
			<persName><forename type="first">D</forename><surname>Gooch</surname></persName>
		</author>
		<author>
			<persName><forename type="first">J</forename><forename type="middle">J C</forename><surname>Montaner</surname></persName>
		</author>
		<author>
			<persName><forename type="first">U</forename><surname>Rashid</surname></persName>
		</author>
		<author>
			<persName><forename type="first">G</forename><surname>Kortuem</surname></persName>
		</author>
		<idno type="DOI">10.15353/joci.v12i3.3275</idno>
	</analytic>
	<monogr>
		<title level="j">The Journal of Community Informatics</title>
		<imprint>
			<biblScope unit="volume">12</biblScope>
			<date type="published" when="2016">2016</date>
		</imprint>
	</monogr>
	<note>3</note>
</biblStruct>

				</listBibl>
			</div>
		</back>
	</text>
</TEI>
