Social and emotional presence of characters in
virtual reality video games
Rafael Márquez, Federico Peinado*
Complutense University of Madrid, c/ Profesor José García Santesmases 9, 28040 Madrid, Spain

                                      Abstract
                                      The future of video games lies in Virtual Reality, and this new medium demands greater detail not only
                                      in the environments but also in the interactive characters that inhabit them. Our project consists of
                                      developing a software framework that incorporates emotions into existing non-player characters to
                                      increase their social and emotional presence, improving the immersion achieved in video games. To do
                                      so, it provides a range of behaviors that the characters adopt based mainly on a computational model
                                      of emotions. This emotional model modulates the character's behavior according to the events that
                                      the character continuously receives during execution. In turn, the emotional state at any given
                                      moment depends on the emotional tendency, or personality, that has been defined for the character. The
                                      framework, called VR NPC, is implemented as an extension of Behavior Designer and Unity. A preliminary
                                      experiment conducted on a demonstration scenario suggests that players are able to perceive and
                                      appreciate such emotions in non-player characters.

                                      Keywords
                                      Video Game Development, Affective Computing, Immersion, Development Tool, Unity, Non-Player
                                      Character, Gestures




1. Introduction
Virtual Reality (VR) has become a field of study that encompasses many facets, from
communication between people to leisure and virtual work. The ambitious Metaverse project
promoted by Meta (formerly known as Facebook) is one example of the current boom, and
companies with great weight in the video game industry, such as Sony and Microsoft, are also
investing in this technology.
   The problem addressed by this work is how to improve the behaviors of Non-Player
Characters (NPCs) in VR so that the player perceives them in a socially and emotionally
appropriate way. Emotionless or lifeless characters produce feelings that are the opposite of
what one seeks in a truly immersive environment.




I Congreso Español de Videojuegos, December 1–2, 2022, Madrid, Spain
*Corresponding author.
rafmarqu@ucm.es (R. Márquez); email@federicopeinado.com (F. Peinado)
http://federicopeinado.com/ (F. Peinado)
ORCID: 0000-0002-8893-0020 (F. Peinado)
© 2022 Copyright for this paper by its authors. Use permitted under Creative Commons License Attribution 4.0 International (CC BY 4.0).
CEUR Workshop Proceedings (CEUR-WS.org), ISSN 1613-0073
2. Related Work
Few works study the figure of the non-player character (NPC) in video games, with notable
exceptions [1].
   The Oz Project [2, 3] was one of the first projects that sought to represent believable
characters in VR, observing that many researchers tend to give more importance to the physical
aspect of the environment. Its goal was to create simulations of characters with a series of
characteristics that could help artists create truly interactive and dramatic worlds. They use
the OCC model of emotions, essentially modeling one specific emotion together with the cause
that provoked it. One of their early examples was Lyotard, a simulation of an apartment in
which we control its residents and can interact with a cat that responds to our actions.
   LitSens [4] is a system focused on sound immersion, with the goal of improving VR players'
feeling of presence through ambient music. It mixes different melodic fragments into an
abstract song according to the player's performance and the situation in the game, accompanying
the emotions tagged to the game event that is occurring.
   Finally, we build on earlier work [5] that created an NPC guide with a strong social presence
for a VR escape room, giving the player the indications needed to solve the game. Since that
work lacked an emotional model, we extend it with VR NPC, a more complete toolkit for building
believable characters for VR games.


3. VR NPC Framework
To develop this project we used Scrum, an agile methodology, with two-week sprints, working
over an academic year with three major milestones to assess progress.
   The tools used in this project are Unity, Behavior Designer [6] (an extension for using behavior
trees), and other plugins such as VRGestureRecognizer [7], MiVRy [8], Dotween [9], plus the
example scenario VR Escape Room [10].
   Behavioral systems usually have these elements in common, which are the ones we have
considered: the scenery or environment, the medium (VR, in our case) and the NPCs (see Figure 1).
   For the VR NPC framework, an architecture with three important parts is used:

    • Emotional Selector: connects to the Emotional Controller and chooses the appropriate
      behavior according to the data received. A new type of Behavior Designer sequence node
      has been implemented to select the branch of the tree to be executed.
    • Emotional Controller: manages the data of each emotion and applies the necessary
      changes when an Event occurs. It is implemented as a Singleton object, with six
      MonoBehaviour classes modeling the emotions.
    • Events: a series of scripts that are executed when certain conditions are met, imple-
      mented by means of Behavior Designer conditional nodes and actions, plus some Unity
      scripts.
Figure 1: Our robotic NPC manifesting the emotion of sadness at a medium level of intensity.
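The interplay of the three parts above can be sketched as follows. This is a minimal illustration in Python, not the framework itself (the actual implementation is C# code built on Unity and Behavior Designer), and all class, method and emotion names here are hypothetical: a controller holds personality-weighted emotion intensities, Events push changes into it, and a selector picks the behavior branch matching the dominant emotion.

```python
# Hypothetical sketch of the VR NPC control flow described in the text.
EMOTIONS = ["joy", "sadness", "anger", "fear", "surprise", "disgust"]


class EmotionalController:
    """Singleton holding the intensity of each emotion.

    The personality (emotional tendency) weights how strongly each
    incoming Event modifies the corresponding emotion.
    """
    _instance = None

    def __new__(cls, personality=None):
        if cls._instance is None:
            inst = super().__new__(cls)
            inst.intensity = {e: 0.0 for e in EMOTIONS}
            inst.personality = personality or {e: 1.0 for e in EMOTIONS}
            cls._instance = inst
        return cls._instance

    def on_event(self, emotion, delta):
        # Events report an emotional impulse; the personality weight
        # modulates it, and the result is clamped to [0, 1].
        weight = self.personality.get(emotion, 1.0)
        value = self.intensity[emotion] + weight * delta
        self.intensity[emotion] = max(0.0, min(1.0, value))

    def dominant_emotion(self):
        return max(self.intensity, key=self.intensity.get)


class EmotionalSelector:
    """Chooses the behavior branch matching the dominant emotion,
    falling back to a neutral branch when no emotion stands out."""

    def __init__(self, controller, branches, threshold=0.1):
        self.controller = controller
        self.branches = branches  # emotion name -> behavior callable
        self.threshold = threshold

    def tick(self):
        emotion = self.controller.dominant_emotion()
        if self.controller.intensity[emotion] < self.threshold:
            emotion = "neutral"
        return self.branches.get(emotion, self.branches["neutral"])()
```

For instance, a character whose personality weights sadness more heavily than joy will accumulate sadness faster when events affect both, so the selector ends up executing the sad branch of the tree.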


4. Results and Discussion
To test the framework, it has been used to develop an example scenario in which players must
solve an escape room with only the help of a robotic NPC whose personality (or emotional
tendency) can be chosen at the start. The hypothesis is that the manifestation of emotions,
according to the chosen model, will give greater credibility to the character and allow players
to establish stronger bonds with it, involving them more in the game (via the so-called social
presence) and even improving their performance.
   An A/B test was carried out with 7 users, dividing the participants into 2 groups, conducting
a slightly different experiment with each one.
    • Group A: This control group had the character's emotional enhancements DEACTI-
      VATED, so it only shows emotions in specific situations predefined in the game script.
      In all other cases the character shows a neutral expression, both in its face and in its
      body language.
    • Group B: This group had the character's emotional enhancements ACTIVATED, so when
      the character reacts to the different events that occur in the environment, such as the
      player dropping an important object on the ground, it always shows the relevant emotions.
  The responses of both groups to the post-experiment questionnaire are summarised in Table 1.


5. Conclusions
From the results obtained in the test, and bearing in mind that we do not yet have a large
enough set of testers to generalise, we can appreciate a noticeable improvement of the
emotional model compared to the unemotional one.
   Given this situation, it can be said that the desired objective of increasing the emotional
and social presence of the NPC in the test is getting closer.
    Question                              Group A               Group B
    Players enjoying the game             2/3                   4/4
    Players escaping the room             1/3                   4/4
    Players assisted by the NPC           2/3                   4/4
    NPC personalities chosen              -                     Joyful, Angry, 2x Sad
    Players identifying NPC emotions      2/3                   3/4
    Emotions identified (of 7)            Fear, Joy, Surprise   Fear, Joy, Surprise, Anger, Sadness
Table 1
Summary of A/B test responses.


  In general, the framework has been successfully integrated with the previous work: it has
not caused problems in the actions performed by the NPC in the demo scenario, and the whole
system has worked normally.
  We are thus able to offer the community a computational model of emotions for VR characters
that players are able to identify and enjoy.


References
 [1] D. Pinchbeck, An analysis of persistent non-player characters in the first-person gaming
     genre 1998-2007: a case for the fusion of mechanics and diegetics, Eludamos: Journal for
     Computer Game Culture 3 (2009) 261–279. doi:10.7557/23.6009.
 [2] J. Bates, The Nature of Characters in Interactive Worlds and The Oz Project, 1992.
     URL: https://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.50.6500&rep=rep1&type=
     pdf.
 [3] J. Bates, A. Witkin, J. Altucher, A. Hauptman, M. Kantrowitz, B. Loyall, K. Murakami,
     P. Olbrich, Z. Popovic, S. Reilly, P. Sengers, W. Welch, P. Weyhrauch, Worlds and images
     (2002). URL: https://www.cs.cmu.edu/afs/cs/project/oz/web/worlds.html.
 [4] M. L. Ibáñez, Incrementar la presencia en entornos virtuales, 2019. URL: https://eprints.
     ucm.es/id/eprint/60856/1/T41835.pdf.
 [5] A. del Castillo Espejo-Saavedra, R. S. Gómez, Diseño y Desarrollo de Personajes con
     Presencia Social en Videojuegos de Realidad Virtual, 2019. URL: https://eprints.ucm.
     es/id/eprint/61964/1/SERRANO_GOMEZ_Diseno_y_Desarrollo_de_Personajes_con_
     Presencia_Social_en_Videojuegos_de_Realidad_Virtual_4398577_953433568.pdf.
 [6] Opsive, Behavior Designer, 2014. URL: https://opsive.com/assets/behavior-designer.
 [7] KorinVR, VR Gesture Recognizer, 2013. URL: https://github.com/korinVR/
     VRGestureRecognizer.
 [8] MARUI-PlugIn, MiVRy, 2019. URL: https://assetstore.unity.com/packages/templates/
     systems/mivry-3d-gesture-recognition-143176#releases.
 [9] Demigiant, DOTween, 2015. URL: https://assetstore.unity.com/packages/tools/animation/
     dotween-hotween-v2-27676#description.
[10] Unity Technologies, VR Beginner: The Escape Room, 2019. URL: https://learn.unity.com/
     project/vr-beginner-the-escape-room.