<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta />
    <article-meta>
      <title-group>
        <article-title>Social and emotional presence of characters in virtual reality video games</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Rafael Márquez</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
          <xref ref-type="aff" rid="aff1">1</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Federico Peinado</string-name>
          <email>email@federicopeinado.com</email>
          <xref ref-type="aff" rid="aff0">0</xref>
          <xref ref-type="aff" rid="aff1">1</xref>
        </contrib>
        <aff id="aff0">
          <label>0</label>
          <institution>Complutense University of Madrid, c/ Profesor José García Santesmases 9</institution>
          ,
          <addr-line>28040 Madrid</addr-line>
          ,
          <country country="ES">Spain</country>
        </aff>
        <aff id="aff1">
          <label>1</label>
          <institution>I Congreso Español de Videojuegos</institution>
        </aff>
      </contrib-group>
      <abstract>
        <p>The future of video games lies in Virtual Reality, and this new medium demands greater detail not only in the environments but also in the interactive characters that inhabit them. Our project consists of developing a software framework that aims to incorporate emotions into existing non-player characters to increase their social and emotional presence, improving the immersion achieved in video games. To do so, it provides a range of behaviors that the characters adopt based mainly on a computational model of emotions. This emotional model modulates the character's behavior according to the events that the character continuously receives during execution. In turn, the emotional state at any given moment depends on the emotional tendency or personality that has been defined for the character. The framework, called VR NPC, is implemented as an extension of Behavior Designer and Unity. A preliminary experiment conducted on a demonstration scenario suggests that players are able to perceive and appreciate such emotions in non-player characters.</p>
      </abstract>
      <kwd-group>
        <kwd>Video Game Development</kwd>
        <kwd>Affective Computing</kwd>
        <kwd>Immersion</kwd>
        <kwd>Development Tool</kwd>
        <kwd>Unity</kwd>
        <kwd>Non-Player Character</kwd>
        <kwd>Gestures</kwd>
      </kwd-group>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>1. Introduction</title>
    </sec>
    <sec id="sec-2">
      <title>2. Related Work</title>
      <p>
        There are not many works that study the figure of the non-player character (NPC) in videogames,
with notable exceptions [
        <xref ref-type="bibr" rid="ref1">1</xref>
        ].
      </p>
      <p>
        OZ Project [
        <xref ref-type="bibr" rid="ref2 ref3">2, 3</xref>
        ] was one of the first projects that sought to represent believable characters
in VR, since they saw that a lot of researchers tend to give more importance to the physical
aspect of the environment. The goal of this project was to create simulations of characters with
a series of characteristics that could help artists create truly interactive and dramatic worlds.
They use the OCC model of emotions, basically modeling one specific emotion and a cause that
provoked it. One of their early examples was Lyotard, a simulation of an apartment in which
we control its residents and can interact with a cat that responds to our actions.
      </p>
      <p>
        Lit sens [
        <xref ref-type="bibr" rid="ref4">4</xref>
        ] is a system focused on sound immersion with the goal of improving the feeling
of presence of VR players through ambient music. It mixes diferent pieces of melodies to form
an abstract song according to the performance and the situation in the game, accompanying
emotions that are tagged to the game event that is occurring.
      </p>
      <p>
        Finally, we are building on earlier work [
        <xref ref-type="bibr" rid="ref5">5</xref>
        ] that aims to create a NPC guide for an escape
room in VR with a strong social presence, giving the necessary indications to the player for
solving the game. In the absence of an emotional model, we wanted to extend this work and
create VR NPC, a more complete toolkit for building believable characters for VR games.
      </p>
    </sec>
    <sec id="sec-3">
      <title>3. VR NPC Framework</title>
      <p>To develop this project we used Scrum, an agile methodology, with two-week sprints, working
over an academic year with three major milestones to assess progress.</p>
      <p>
        The tools used in this project are Unity, Behavior Designer [
        <xref ref-type="bibr" rid="ref6">6</xref>
        ] (an extension for using behavior
trees), and other plugins such as VRGestureRecognizer [
        <xref ref-type="bibr" rid="ref7">7</xref>
        ], MiVRy [
        <xref ref-type="bibr" rid="ref8">8</xref>
        ], DOTween [
        <xref ref-type="bibr" rid="ref9">9</xref>
        ], plus the
example scenario VR Escape Room [
        <xref ref-type="bibr" rid="ref10">10</xref>
        ].
      </p>
      <p>Behavioral systems usually have these elements in common, which are the ones we have
considered: the scenery or environment, the medium (VR, in our case) and the NPCs (see Figure 1).</p>
      <p>For the VR NPC framework, an architecture with three main parts is used (an illustrative sketch of how they could fit together is given after this list):
• Emotional Selector: It connects to the Emotional Controller and chooses the appropriate
behavior according to the data received. A new type of Behavior Designer sequence node
has been implemented in order to select the branch of the tree to be executed.
• Emotional Controller: It manages the data of each emotion and makes the
necessary changes when an Event occurs. It is implemented as a Singleton object, and six
MonoBehaviour classes model the emotions.
• Events: A series of scripts that are executed when certain conditions are met. They have
been implemented by means of Behavior Designer conditional nodes and actions, plus
some Unity scripts.</p>
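      <p>As a minimal illustration, the following C# sketch shows how a singleton controller could hold the six emotion intensities, apply the character's emotional tendency to the impulses reported by Events, and expose the dominant emotion that the Emotional Selector queries to choose a branch of the behavior tree. The names (EmotionalController, ApplyImpulse, DominantEmotion, the Emotion enumeration and the tendency values) are hypothetical assumptions, not the literal code of the framework, which models each emotion with its own MonoBehaviour class rather than a single array.</p>
      <preformat>
using UnityEngine;

// Illustrative sketch of the Emotional Controller: a singleton MonoBehaviour that keeps
// the intensity of each emotion and updates it when an Event reports an impulse.
public class EmotionalController : MonoBehaviour
{
    public static EmotionalController Instance { get; private set; }

    // Six basic emotions tracked by the model (illustrative ordering).
    public enum Emotion { Joy, Sadness, Anger, Fear, Surprise, Disgust }

    // Current intensity of each emotion, kept in the range [0, 1].
    private readonly float[] intensities = new float[6];

    // Personality or emotional tendency: a per-emotion bias applied to incoming impulses.
    [SerializeField] private float[] tendency = { 0.5f, 0.5f, 0.5f, 0.5f, 0.5f, 0.5f };

    // Emotions slowly decay back towards a neutral state.
    [SerializeField] private float decayPerSecond = 0.1f;

    private void Awake()
    {
        // Keep a single controller instance in the scene.
        if (Instance != null) { Destroy(gameObject); return; }
        Instance = this;
    }

    private void Update()
    {
        // Gradually return every emotion towards neutral.
        for (int i = 0; i != intensities.Length; i++)
            intensities[i] = Mathf.MoveTowards(intensities[i], 0f, decayPerSecond * Time.deltaTime);
    }

    // Called by Event scripts when something relevant happens in the scenario.
    public void ApplyImpulse(Emotion emotion, float strength)
    {
        int i = (int)emotion;
        intensities[i] = Mathf.Clamp01(intensities[i] + strength * tendency[i]);
    }

    // Queried by the Emotional Selector node to pick the branch of the behavior tree.
    public Emotion DominantEmotion()
    {
        int best = 0;
        for (int i = 1; i != intensities.Length; i++)
            if (intensities[i] > intensities[best]) best = i;
        return (Emotion)best;
    }
}
      </preformat>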
    </sec>
    <sec id="sec-4">
      <title>4. Results and Discussion</title>
      <p>To test the framework, it has been used to develop an example scenario in which players must
solve an escape room with only the help of a robotic NPC whose personality (or emotional
tendency) can be chosen at the start. The hypothesis is that the manifestation of emotions,
according to the chosen model, will give greater credibility to the character and allow the player
to establish stronger bonds with it, becoming more involved in the game (via the so-called
social presence) and even achieving a better performance.</p>
      <p>An A/B test was carried out with 7 users, dividing the participants into 2 groups and conducting
a slightly different experiment with each one.</p>
      <p>• Group A: This control group had the emotional enhancements applied to the character
DEACTIVATED, so it only shows emotions in specific situations predefined in the game
script. In the rest of the cases the character shows a neutral expression, both in its face
and in its bodily expressiveness.
• Group B: This group had the character's emotional enhancements ACTIVATED, so when
the character reacts to different events that occur in the environment, such as the player
dropping an important object on the ground (see the sketch after this list), it always shows relevant emotions.</p>
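      <p>As a minimal illustration of such an environment Event, the following sketch shows how a dropped object could report an impulse to the controller sketched in Section 3. The class name, the impulse value and the "Floor" tag are hypothetical assumptions, not the actual scripts of the demo scenario.</p>
      <preformat>
using UnityEngine;

// Illustrative Event script for the demo scenario: when an important object
// hits the floor, it reports an impulse so the NPC can react with a relevant emotion.
public class ImportantObjectDropped : MonoBehaviour
{
    // Strength of the impulse sent to the Emotional Controller (illustrative value).
    [SerializeField] private float impulseStrength = 0.6f;

    private void OnCollisionEnter(Collision collision)
    {
        if (EmotionalController.Instance == null) return;

        // "Floor" is an illustrative tag; the demo may detect the drop differently.
        if (collision.gameObject.CompareTag("Floor"))
        {
            EmotionalController.Instance.ApplyImpulse(
                EmotionalController.Emotion.Surprise, impulseStrength);
        }
    }
}
      </preformat>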
      <p>The responses of both groups to the post-experiment questionnaire are summarised in Table 1.</p>
    </sec>
    <sec id="sec-5">
      <title>5. Conclusions</title>
      <p>From what we have been able to see in the results obtained from the test, and setting aside the
fact that we do not yet have a large set of testers with which to generalise the results, we can
appreciate a noticeable improvement with the emotional model compared to the unemotional one.</p>
      <p>Given this situation, it can be said that the desired objective of increasing the emotional and
social presence of the NPC, as observed in the performance of the test, is getting closer.</p>
      <p>Table 1: Responses of both groups to the post-experiment questionnaire. Questions: players enjoying the game; players escaping the room; players assisted by the NPC; NPC’s personalities chosen; players identifying the NPC’s emotions; emotions identified (of 7).</p>
      <p>In general, it can be said that this framework has been successfully integrated with the
previous work, since it has not caused problems in the actions performed by the NPC in the
demo scenario and the whole system has worked normally.</p>
      <p>We are able to offer the community a computational model of emotions for VR characters
that players will be able to identify and enjoy.</p>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>
          [1]
          <string-name>
            <given-names>D.</given-names>
            <surname>Pinchbeck</surname>
          </string-name>
          ,
          <article-title>An analysis of persistent non-player characters in the first-person gaming genre 1998-2007: a case for the fusion of mechanics and diegetics</article-title>
          ,
          <source>Eludamos: Journal for Computer Game Culture</source>
          <volume>3</volume>
          (
          <year>2009</year>
          )
          <fpage>261</fpage>
          -
          <lpage>279</lpage>
          . doi:10.7557/23.6009.
        </mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>
          [2]
          <string-name>
            <given-names>J.</given-names>
            <surname>Bates</surname>
          </string-name>
          ,
          <source>The Nature of Characters in Interactive Worlds and The Oz Project</source>
          ,
          <year>1992</year>
          . URL: https://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.50.6500&amp;rep=rep1&amp;type=pdf.
        </mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>
          [3]
          <string-name>
            <given-names>J.</given-names>
            <surname>Bates</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Witkin</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.</given-names>
            <surname>Altucher</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Hauptman</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Kantrowitz</surname>
          </string-name>
          ,
          <string-name>
            <given-names>B.</given-names>
            <surname>Loyall</surname>
          </string-name>
          ,
          <string-name>
            <given-names>K.</given-names>
            <surname>Murakami</surname>
          </string-name>
          ,
          <string-name>
            <given-names>P.</given-names>
            <surname>Olbrich</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Z.</given-names>
            <surname>Popovic</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>Reilly</surname>
          </string-name>
          ,
          <string-name>
            <given-names>P.</given-names>
            <surname>Sengers</surname>
          </string-name>
          ,
          <string-name>
            <given-names>W.</given-names>
            <surname>Welch</surname>
          </string-name>
          ,
          <string-name>
            <given-names>P.</given-names>
            <surname>Weyhrauch</surname>
          </string-name>
          ,
          <article-title>Worlds and images (</article-title>
          <year>2002</year>
          ). URL: https://www.cs.cmu.edu/afs/cs/project/oz/web/worlds.html.
        </mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>
          [4]
          <string-name>
            <given-names>M. L.</given-names>
            <surname>Ibáñez</surname>
          </string-name>
          ,
          <article-title>Incrementar la presencia en entornos virtuales</article-title>
          ,
          <year>2019</year>
          . URL: https://eprints.ucm.es/id/eprint/60856/1/T41835.pdf.
        </mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>
          [5]
          <string-name>
            <given-names>A. del Castillo</given-names>
            <surname>Espejo-Saavedra</surname>
          </string-name>
          ,
          <string-name>
            <given-names>R. S.</given-names>
            <surname>Gómez</surname>
          </string-name>
          , Diseño y Desarrollo de Personajes con Presencia Social en Videojuegos de Realidad Virtual,
          <year>2019</year>
          . URL: https://eprints.ucm.es/id/eprint/61964/1/SERRANO_GOMEZ_Diseno_y_Desarrollo_de_Personajes_con_Presencia_Social_en_Videojuegos_de_Realidad_Virtual_4398577_953433568.pdf.
        </mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>
          [6]
          <string-name>
            <surname>Opsive</surname>
          </string-name>
          , Behavior Designer,
          <year>2014</year>
          . URL: https://opsive.com/assets/behavior-designer.
        </mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>
          [7] KorinVR, VR Gesture Recognizer,
          <year>2013</year>
          . URL: https://github.com/korinVR/VRGestureRecognizer.
        </mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>
          [8]
          <string-name>
            <surname>MARUI-PlugIn</surname>
          </string-name>
          , MiVRy,
          <year>2019</year>
          . URL: https://assetstore.unity.com/packages/templates/systems/mivry-3d-gesture-recognition-143176#releases.
        </mixed-citation>
      </ref>
      <ref id="ref9">
        <mixed-citation>
          [9]
          <string-name>
            <surname>Demigiant</surname>
          </string-name>
          , DOTween,
          <year>2015</year>
          . URL: https://assetstore.unity.com/packages/tools/animation/dotween-hotween-v2-27676#description.
        </mixed-citation>
      </ref>
      <ref id="ref10">
        <mixed-citation>
          [10]
          <string-name>
            <surname>Unity</surname>
            <given-names>Technologies</given-names>
          </string-name>
          ,
          <source>VR Beginner: The Escape Room</source>
          ,
          <year>2019</year>
          . URL: https://learn.unity.com/project/vr-beginner-the-escape-room.
        </mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>