<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta />
    <article-meta>
      <title-group>
        <article-title>A VR End-User Development Toolbox for Media Study Students - An Initial Experience Report</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Sebastian Krois</string-name>
          <email>sebastian.krois@upb.de</email>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Kevin Scharke</string-name>
          <email>kscharke@mail.upb.de</email>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Enes Yigitbas</string-name>
          <email>enes.yigitbas@upb.de</email>
          <xref ref-type="aff" rid="aff1">1</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Gudrun Oevel</string-name>
          <email>gudrun.oevel@upb.de</email>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <aff id="aff0">
          <label>0</label>
          <institution>Paderborn University</institution>
          ,
          <addr-line>Warburger Straße 100, 33098, Paderborn</addr-line>
          ,
          <country country="DE">Germany</country>
        </aff>
        <aff id="aff1">
          <label>1</label>
          <institution>Paderborn University</institution>
          ,
          <addr-line>Zukunftsmeile 2, 33102, Paderborn</addr-line>
          ,
          <country country="DE">Germany</country>
        </aff>
      </contrib-group>
      <abstract>
<p>Virtual Reality (VR) has a wide range of possible applications, one of which is education. In a course of our university's media studies programme, students create VR applications to be used in future iterations of courses they have already passed, so that younger students benefit from their experience. The students learn VR development and think about concepts for integrating VR into teaching. Students may, but need not, have previous development experience, so we need a tool that is usable for beginners but does not restrict experienced users. To achieve that, we develop a toolbox as an additional layer on top of Unity's XR Interaction Toolkit Examples. In this paper, we describe the concept, the current development state, and the results of an initial evaluation during the latest iteration of the course. The tool itself was, apart from smaller bugs, rather easy to use, but we identified the need to create (better) materials for introduction and guidance.</p>
      </abstract>
      <kwd-group>
<kwd>Virtual Reality</kwd>
        <kwd>End-User Development</kwd>
        <kwd>VR Toolbox</kwd>
        <kwd>VR Development in Unity</kwd>
      </kwd-group>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>1. Introduction</title>
<p>Based on guidelines for End-User Development (EUD), we derived the following requirements for our tool.</p>
      <p>R1: Ease of Use As the course is part of media studies, we cannot rely on the students' programming skills or knowledge about game/VR development in general. Hence, the tool needs to be easy enough to understand that only a small amount of time is needed for training on the tool. Additionally, users in EUD scenarios should be able to do the work themselves, so the application should not need additional work or support to run.</p>
<p>R2: Free Choice of Assets We do not know in advance what the application will be about; the students will develop the concept during the course. So we cannot provide an editor with predefined assets, but need to allow the creation of assets or the use of publicly available ones.</p>
      <p>R3: Variety of Interaction By using VR, we can explore a wide range of interaction types. The tool should allow different types of simple interaction out-of-the-box but also offer the possibility of extensions with complex interactions.</p>
<p>R4: Create Logic Flow Not only should there be ways to interact with assets; we also need to configure effects for the user's actions. To create meaningful gameplay that utilizes the features of VR, the tool needs an easy-to-use interface to define what happens when.</p>
<p>R5: Wide Platform Support The developed applications are intended to be used in different courses at the university. As different VR platforms exist, we want the applications to run on as many of them as possible.</p>
<p>R6: Ease of Initialization The tool's setup should be easy and not require many steps to complete.</p>
    </sec>
    <sec id="sec-2">
      <title>2. Related Work</title>
      <p>The idea of creating end-user editors for VR is not new. There are multiple applications, created for
research projects as well as commercially available ones. In this section, we give an overview of projects
that aim to provide such an editor from both categories.</p>
<p>
        Research Projects In 2022, Coelho et al. [
        <xref ref-type="bibr" rid="ref3">3</xref>
        ] conducted a systematic literature review of VR authoring tools. It shows that most academic projects focus on the creation of 360° images or videos [
        <xref ref-type="bibr" rid="ref4 ref5">4, 5</xref>
        ]. Interacting with objects and the environment is a crucial part and one of the main advantages of using VR, so we cannot use projects that aim only at creating 360° views. Another category of works created domain-specific tools: even when most of our requirements were fulfilled, they are tied to their domain [
        <xref ref-type="bibr" rid="ref6 ref7 ref8">6, 7, 8</xref>
        ] or are designed for concepts going beyond VR and thus need more setup [
        <xref ref-type="bibr" rid="ref7 ref9">9, 7</xref>
        ]. There are also tools which need to be hosted on a web server [
        <xref ref-type="bibr" rid="ref10">10</xref>
        ]. These need to be set up by a developer, and the server needs to be maintained so the tool is available when it is used; that creates the need for a developer with a larger skill set than the users the application is created for. Some projects, like XRSpotlight by Frau et al. [11], are designed to help users get started with XR, but they expect users to have some prior programming knowledge.
      </p>
<p>Finally, the tool will be used in the lecture, so we need the project to be available and running on modern hardware. Most of the projects presented above are either not available anymore, or we were not able to set them up and get them running, as they use out-of-date libraries and have not been updated since publication.</p>
<p>Commercial In addition to tools emerging from research, there are some commercially available end-user editors for VR. Similar to the scientific projects, there are multiple tools allowing the creation of scenes containing 360° images or videos (Mobfish, https://mobfish.net). Other products (SimLab Soft VR Studio, https://simlab-soft.com, or Spatial, https://spatial.io) offer the possibility to create interactive VR scenes: predefined or custom models can be used, and multiple platforms are supported. However, they come with only limited access to features in the free version and, even with paid subscriptions, offer only a small set of possible interactions. In previous iterations of the course, we used Spatial to create virtual environments. It is designed to be simple and provides only the most necessary controls for creating a VR environment, like grabbing an object to move it and scaling or rotating it using gestures; users can also upload their own 3D models and other assets. However, the free version of Spatial only allows 500 MB of storage space and does not provide a way to completely delete models. The VR and web versions provide only grabbing. With the Unity framework, a few further interactions are possible, but they are still very limited, and editing via VR or web is not possible. Finally, everything is stored on and loaded from a server; our students experienced technical difficulties, such as the server not loading the level or changes not being saved.</p>
      <p>Discussion As presented above, there are some applications for creating VR applications available. But the academic projects are often not available anymore or are designed to work only in a very specific domain. The commercial applications are mostly available but, even with a paid subscription, do not fulfill our requirements. Additionally, when tools rely on a server connection, connection errors will most likely occur after some time. To have a reliable tool available during the course which satisfies our requirements, we created our own VR-Toolkit on top of Unity, which we currently use and evaluate. The tool is independent of the domain it is used for, so, when the project is published upon completion, it can be used for all kinds of domains, not just teaching.</p>
    </sec>
    <sec id="sec-3">
      <title>3. Concept and Implementation</title>
<p>The requirements defined in Section 1 can be grouped into two types. First, there are requirements to ensure that working with the toolkit is easy (R1, R5, R6, simplicity). Second, we ensure that users have great freedom to create the application they want in its respective domain (R2, R3, R4, support creativity).</p>
      <p>Support Creativity To balance simplicity and the support of creativity, we searched for a base to start from which does not restrict developers' possibilities. As we use the Unity game engine for multiple courses and lectures, we decided to also use it for the toolbox. To communicate with the headsets, we used Unity's XR Interaction Toolkit with its examples project. It comes with a complete character controller and a wide range of interactions which are fully implemented and configured to work with their corresponding 3D models. To add 3D models (R2), Unity can import some model formats by default (e.g. .fbx). When creating models themselves, users can use one of the supported file formats. Models from an asset library often use formats like .glb or .gltf; by adding glTFast to the project, these can be used as well. Users have full access to all of Unity's and the SDK's APIs and features (R3, R4), so experienced users or programmers are not restricted in what they want to create. We therefore have a base which fully supports the creativity requirements.</p>
      <p>Provide Simplicity By using Unity's XR Interaction Toolkit, the application can run on all major VR platforms which support OpenXR (R5). For the setup, users need to open Unity's XR Interaction Toolkit Examples and drag-and-drop a .unitypackage into the project. With that, the toolbox is fully set up (R6). To provide the desired simplicity (R1), we take the features provided and add a layer on top to simplify their use. We remove information and configuration options when they are not necessary. For the remaining information and options, we create a simplified visualization and extend the toolkit to support a simple, event-based logic flow.</p>
      <p>Figure 1: Architecture Overview</p>
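<p>For the .glb/.gltf formats mentioned above, glTFast also exposes a runtime scripting API in addition to its editor import. The following is a minimal loading sketch, not part of our toolkit; the class and method names reflect the glTFast 5.x API, and the file path is purely illustrative.</p>
      <preformat>
using GLTFast;
using UnityEngine;

// Minimal runtime-loading sketch using glTFast's scripting API.
// The path below is illustrative only.
public class ModelLoader : MonoBehaviour
{
    async void Start()
    {
        var gltf = new GltfImport();
        bool success = await gltf.Load("file:///path/to/model.glb");
        if (success)
        {
            // Instantiate the model's main scene under this object.
            await gltf.InstantiateMainSceneAsync(transform);
        }
    }
}
      </preformat>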
<p>In Section 3.1 we describe the architecture we used to realize the logic flow, followed by Section 3.2, where we explain the simplified inspector. Finally, in Section 3.3, we present the objects and functionalities we made available in the toolkit. The most recent version of the toolkit is available at https://github.com/ZIM-VR/VR-Toolkit and will be updated until completion.</p>
      <sec id="sec-3-1">
        <title>3.1. Architecture Overview</title>
<p>Figure 1 gives an overview of the architecture. The toolkit distinguishes two kinds of objects created by the developing user (developer). With some of them (Interactable), the player can interact, e.g. grab them. The others (Non-Interactable) can be purely decorative or provide different functionalities. In Section 3.3, we describe the available objects in more detail. All objects can be created and configured (e.g. positioned) by the developer. To allow the creation of a logic flow, we created an event-based workflow. Interactable objects have an Interactor Script (light blue) attached, which observes whether its specific interaction happened. Such an interaction can be the object being grabbed by the player or being placed in a specific spot; when it occurs, an associated event is triggered. Developers can configure what happens after the event occurs in the edit mode (green) using a custom interface (Section 3.2). For that, they can define a set of actions (Event List, orange) which are executed after the event occurs. Each action needs the object on which it should happen and a configuration of what should happen. Each pair of object and action is stored in a data holder called Event Data (white). When, during runtime (yellow), the Interactor Script observes an interaction with the player, it passes the corresponding Event List to the Event Handler (violet), which iterates over all Event Data elements of the list and executes the actions.</p>
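<p>The workflow described above can be sketched in plain Unity C#. All class and member names here are illustrative assumptions, not the toolkit's published API; the sketch only shows how Event Data, Event List, Interactor Script, and Event Handler relate.</p>
        <preformat>
using UnityEngine;
using UnityEngine.Events;

// One Event Data entry pairs a target object with the action to run on it.
[System.Serializable]
public class EventData
{
    public GameObject target;   // the object where the action should happen
    public UnityEvent action;   // the configuration of what should happen
}

// The Event Handler iterates over all Event Data elements and executes them.
public static class VrEventHandler
{
    public static void Execute(EventData[] eventList)
    {
        foreach (var data in eventList)
            data.action.Invoke();
    }
}

// An Interactor Script observes its specific interaction and, once it
// happens at runtime, passes the configured Event List to the handler.
public class GrabInteractor : MonoBehaviour
{
    public EventData[] onGrabEvents;  // Event List, filled in edit mode

    public void OnGrabbed()           // invoked when the player grabs the object
    {
        VrEventHandler.Execute(onGrabEvents);
    }
}
        </preformat>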
      </sec>
      <sec id="sec-3-2">
        <title>3.2. Simplified Inspector</title>
<p>As described above, we want to simplify the configuration as much as possible. For that, we decided to show only as much information as necessary to configure an object, and to let users configure only what they really need. As we want to stay inside Unity, we override its default inspector by creating custom PropertyDrawers. We implemented two types of custom inspector windows. The first is made for objects developers do not have to interact with, e.g. decorative objects. In that case, we override the existing inspector window with an empty one, so no data is displayed. Should developers nevertheless need access to an overridden and therefore hidden inspector, we provide a separate settings window where specific components can be whitelisted so that they are shown.</p>
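<p>In Unity, hiding a whole component's inspector in this way is most directly done with a custom Editor whose OnInspectorGUI draws nothing (a PropertyDrawer serves the same purpose for single fields). A minimal sketch with an assumed component name, not our toolkit's actual code:</p>
        <preformat>
using UnityEngine;
using UnityEditor;

// Hypothetical decorative component whose inspector should stay hidden.
public class DecorativeObject : MonoBehaviour { }

// Overrides the default inspector with an empty one, so no data is displayed.
// (In our toolkit, a settings window can whitelist components to show again.)
[CustomEditor(typeof(DecorativeObject))]
public class DecorativeObjectEditor : Editor
{
    public override void OnInspectorGUI()
    {
        // intentionally empty
    }
}
        </preformat>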
<p>The second type of inspector is made for objects the developer configures (Figure 2). The inspectors for interactable objects follow the same layout, which consists of multiple fields, each describing an action. In these fields, events can be configured which are executed as soon as the action triggers. Each event field consists of three parts.</p>
<p>The upper section (green) consists of a field where the event's type can be specified, i.e. which kind of action is executed when the event triggers. The types currently include activation (turn an object on or off), audio (play/stop audio), video (play/stop a video), light, and scene (load a scene), cf. Section 3.3. Additionally, it is possible to select the type Unity Event, which shows the default UnityEvent user interface. This allows experienced developers to create and execute their own actions. Depending on the event's type, the PropertyDrawer displays different user interfaces which all follow the same visual hierarchy. For actions that are executed on a specific object, the object can be referenced at the top. For some types (e.g. scene), the execution is not bound to one object, so none needs to be configured. To provide simplicity, the object field only accepts objects of the corresponding type, e.g., an audio event's object field would only accept an Audio Player object. Below, there is a drop-down field where users can choose one of the actions available (red) on the selected object. When an action is selected, further fields for configuration are displayed (violet). Selecting 'stop audio' needs no additional parameters, unlike the action 'set audio clip', which needs an audio clip field.</p>
        <p>Figure 2: View in Unity</p>
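<p>A type-dependent inspector like this can be sketched with Unity's PropertyDrawer API. The names below (VrEventType, the serialized fields, the EventData-like class) are illustrative assumptions, not the toolkit's actual code; a full implementation would also override GetPropertyHeight.</p>
        <preformat>
using UnityEngine;
using UnityEditor;

public enum VrEventType { Activation, Audio, Video, Light, Scene, UnityEvent }

// Hypothetical serialized data the drawer operates on.
[System.Serializable]
public class EventData
{
    public VrEventType eventType;
    public AudioSource audioPlayer;  // only relevant when eventType == Audio
}

// Draws different fields depending on the selected event type,
// mirroring the visual hierarchy described above.
[CustomPropertyDrawer(typeof(EventData))]
public class EventDataDrawer : PropertyDrawer
{
    public override void OnGUI(Rect pos, SerializedProperty prop, GUIContent label)
    {
        var type = prop.FindPropertyRelative("eventType");
        var line = new Rect(pos.x, pos.y, pos.width, EditorGUIUtility.singleLineHeight);
        EditorGUI.PropertyField(line, type);          // upper section: event type

        if ((VrEventType)type.enumValueIndex == VrEventType.Audio)
        {
            line.y += EditorGUIUtility.singleLineHeight + 2;
            // object field restricted to the corresponding type
            EditorGUI.PropertyField(line, prop.FindPropertyRelative("audioPlayer"));
        }
        // further types would add their own fields here
    }
}
        </preformat>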
      </sec>
      <sec id="sec-3-3">
        <title>3.3. Available Prefabs</title>
<p>In previous semesters, we asked students which kinds of interactions and functionalities they would like to have available for creating VR applications. Based on their suggestions and on the tools provided by Unity's XR Interaction Toolkit Examples, we created the following prefabs.</p>
<p>Interactable An example of an interactable object is the PushButton, a button that can be pressed in the virtual environment. One of the available events is 'OnButtonPress', which is triggered when the button is pushed in; another is 'OnButtonRelease', which triggers when the button is released. Here, developers can configure the actions as described in Section 3.2. Other interactable objects are Grabbable Objects, Socket Interactors (Socket Shapes on which grabbable objects can be placed), and Player Triggers (areas that invoke an event when the player enters or exits them).</p>
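<p>In code, events like 'OnButtonPress' correspond to plain UnityEvents. The sketch below is illustrative, not the toolkit's actual API: it shows what wiring a button press to an audio player amounts to, using hypothetical component names.</p>
        <preformat>
using UnityEngine;
using UnityEngine.Events;

// Hypothetical button component exposing the two events named above.
public class PushButton : MonoBehaviour
{
    public UnityEvent OnButtonPress;
    public UnityEvent OnButtonRelease;

    void Press()   { OnButtonPress.Invoke(); }    // button pushed in
    void Release() { OnButtonRelease.Invoke(); }  // button released
}

// Wiring the button to an audio player, as the simplified inspector does.
public class ExampleSetup : MonoBehaviour
{
    public PushButton button;
    public AudioSource audioPlayer;

    void Start()
    {
        button.OnButtonPress.AddListener(audioPlayer.Play);
        button.OnButtonRelease.AddListener(audioPlayer.Stop);
    }
}
        </preformat>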
<p>Non-Interactable Players cannot interact with such objects, but the objects can still execute actions. In most cases, the action is not configured on the Non-Interactable object itself, but within the Event Data of the Interactable object invoking the event. A Non-Interactable object is, for example, the AudioPlayer. Players cannot directly interact with it, but it can be configured, e.g., to play a sound after a PushButton is pressed. The AudioPlayer could also be used to play background music; in this case, it would not play an audio file after an event occurred, but would be configured to play from application startup. Other non-interactable objects are PlayerSpawnpoints (determining where the player starts), TextFields, and VideoPlayers. All objects presented above use a simplified inspector (Section 3.2). As we build upon Unity's tools, experienced users can create their own (Non-)Interactable objects or commission that feature from other developers. Those objects can then be used together with the ones we already provide. Finally, as described at the beginning of this section, we support different formats of custom-imported 3D models. All functionalities described above can easily be used with custom models: the custom model can be dragged and dropped into the corresponding prefab and, if necessary, its scale and position can be adjusted.</p>
      </sec>
    </sec>
    <sec id="sec-4">
      <title>4. Initial Evaluation</title>
<p>For an initial evaluation, we let our students use the work-in-progress version of our tool in the latest iteration of the course. 12 students were divided into five groups, each of which created a VR application. To give an example of how the tool was used, we briefly describe one project. The application lets players listen to music using different playback devices. One of them is the record player shown in Figure 3. The record sticking out of the cover on the right is a Grabbable Object, the record player a Socket Interactor. Players can grab the record and place it on the player. If done correctly, two PushButtons are enabled. Real record players usually have two speed settings, one for long-playing records and one for singles. Depending on which button is pressed, the corresponding speed is simulated. Figure 4 shows the inspector of the button playing the faster track.</p>
<p>Two events are used to play the track. The first Audio Event has the action Set Audio Clip selected, which tells the Audio Player to use the sped-up version. The second event triggers the Audio Player to start playing. The group also implemented a Walkman and a cassette player analogously. As this was the first time a larger group of developers used the tool, some small bugs occurred during use (e.g. Socket Interactors snapping to the wrong position), which we will not describe in detail here.</p>
      <p>Figure 4: Event Config</p>
      <p>Learnings We introduced the students to the tool at the start of the semester, before they started the conception phase, as we wanted them to know the possibilities while creating the concept. But some time passed between the introduction and development. Additionally, when starting development, students first created the models and levels, and even more time passed before they started creating interactions or logic. Hence, we needed to repeat the introduction to the creation of interactions and logic. For logic that is more complex than simple if-then relations, some additional explanation was necessary. However, after a short second introduction and some guidance in the beginning, the students were able to create the majority of their application on their own. This indicates that the tool itself is rather easy to use, but we need to work on the initial explanation so it is easier to get started with development. We provided a small demo scene and a textual description of the different objects, but students wished for a more sophisticated demo and manual. Also, we noticed that, in comparison to the previous semester where Spatial was used, the students managed to develop considerably more interactive applications while not needing more support. When the final version of the tool is released, we plan to perform a more in-depth evaluation, where we evaluate our tool against other available tools (like Spatial) and Unity's XR Interaction Toolkit Examples without the layer our tool provides.</p>
    </sec>
    <sec id="sec-5">
      <title>5. Summary and Further Development</title>
<p>In this paper, we presented the concept of a course for our media studies students, in which they create a VR application that can be used in other courses during their studies. We defined our requirements based on guidelines for EUD and created a first version of a tool which fulfills them. We performed an initial evaluation during the latest iteration of the course, which resulted in positive feedback: all students were able to create the applications they planned with little help. We found several software bugs which will be fixed during further development. Also, we noticed the need to prepare a better introduction, demo, and manual for the final version. Afterwards, we plan to perform a larger evaluation which measures the tool's performance against other tools.</p>
    </sec>
<sec id="sec-6">
      <title>Declaration on Generative AI</title>
      <p>The authors have not employed any Generative AI tools.</p>
    </sec>
  </body>
  <back>
    <ref-list>
<ref id="ref1">
        <mixed-citation>[1] Y. Tan, W. Xu, S. Li, K. Chen, Augmented and virtual reality (AR/VR) for education and training in the AEC industry: A systematic review of research and applications, Buildings 12 (2022). doi:10.3390/buildings12101529.</mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>[2] H. Lieberman, F. Paternò, M. Klann, V. Wulf, End-User Development: An Emerging Paradigm, Springer Netherlands, Dordrecht, 2006, pp. 1-8. doi:10.1007/1-4020-5386-X_1.</mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>[3] H. Coelho, P. Monteiro, G. Gonçalves, M. Melo, M. Bessa, Authoring tools for virtual reality experiences: a systematic review, Multimedia Tools and Applications 81 (2022) 28037-28060.</mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>[4] S. H. H. Shah, K. Han, J. W. Lee, Real-time application for generating multiple experiences from 360° panoramic video by tracking arbitrary objects and viewer's orientations, Applied Sciences 10 (2020) 2248. doi:10.3390/app10072248.</mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>[5] Z. Zhao, X. Ma, Shadowplay2.5d: A 360-degree video authoring tool for immersive appreciation of classical Chinese poetry, J. Comput. Cult. Herit. 13 (2020) 5:1-5:20. doi:10.1145/3352590.</mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>[6] R. Blonna, M. S. Tan, V. Tan, A. P. Mora, R. Atienza, Vrex: A framework for immersive virtual reality experiences, in: 2018 IEEE Region Ten Symposium (Tensymp), 2018, pp. 118-123. doi:10.1109/TENCONSpring.2018.8692018.</mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>[7] S. Krois, E. Yigitbas, Prototyping cross-reality escape rooms, in: M. K. Lárusdóttir, B. Naqvi, R. Bernhaupt, C. Ardito, S. Sauer (Eds.), Human-Centered Software Engineering, Springer Nature Switzerland, Cham, 2024, pp. 84-104. doi:10.1007/978-3-031-64576-1_5.</mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>[8] E. Zidianakis, N. Partarakis, S. Ntoa, A. Dimopoulos, S. Kopidaki, A. Ntagianta, E. Ntafotis, A. Xhako, Z. Pervolarakis, E. Kontaki, I. Zidianaki, A. Michelakis, M. Foukarakis, C. Stephanidis, The invisible museum: A user-centric platform for creating virtual 3D exhibitions with VR support, Electronics 10 (2021) 363. doi:10.3390/electronics10030363.</mixed-citation>
      </ref>
      <ref id="ref9">
        <mixed-citation>[9] A. Bellucci, T. Zarraonandia, P. Díaz, I. Aedo, End-user prototyping of cross-reality environments, in: Proceedings of the Eleventh International Conference on Tangible, Embedded, and Embodied Interaction, TEI '17, Association for Computing Machinery, New York, NY, USA, 2017, pp. 173-182. doi:10.1145/3024969.3024975.</mixed-citation>
      </ref>
      <ref id="ref10">
        <mixed-citation>[10] E. Yigitbas, J. Klauke, S. Gottschalk, G. Engels, VREUD - an end-user development tool to simplify the creation of interactive VR scenes, in: 2021 IEEE Symposium on Visual Languages and Human-Centric Computing (VL/HCC), 2021, pp. 1-10. doi:10.1109/VL/HCC51201.2021.9576372.</mixed-citation>
      </ref>
      <ref id="ref11">
        <mixed-citation>[11] V. Frau, L. D. Spano, V. Artizzu, M. Nebeling, XRSpotlight: Example-based programming of XR interactions using a rule-based approach, Proc. ACM Hum.-Comput. Interact. 7 (2023). doi:10.1145/3593237.</mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>