=Paper= {{Paper |id=Vol-2246/GHItaly18_paper_04 |storemode=property |title=Things from Another World. VR, UI and UX through Run of Mydan |pdfUrl=https://ceur-ws.org/Vol-2246/GHItaly18_paper_04.pdf |volume=Vol-2246 |authors=Ilaria Mariani,Alan Mattiassi |dblpUrl=https://dblp.org/rec/conf/avi/MarianiM18 }} ==Things from Another World. VR, UI and UX through Run of Mydan== https://ceur-ws.org/Vol-2246/GHItaly18_paper_04.pdf
                                Things from Another World.
                            VR, UI and UX through Run of Mydan
                                     Ilaria Mariani                                   Alan Mattiassi
                                 Department of Design,                          Dipartimento di Economia
                                 Politecnico di Milano                        "Marco Biagi", Università degli
                                       Milan, Italy                          Studi di Modena e Reggio Emilia
                               ilaria1.mariani@polimi.it                               Modena, Italy
                                                                                alan.mattiassi@unimore.it
ABSTRACT                                                                     headsets and some must-have contents (revenue up from
When it comes to games in Virtual Reality (VR), User                         1.8 in 2016, to 2.2B in 2017, and it is estimated to grow to
Interfaces (UI) require peculiar attention, since they imply                 4.5B in 2018 [23]). The main profile devices are
different interactions and uses than games experienced on                    PlayStation VR, Oculus Rift and HTC Vive.
two dimensional screens. Through the examination of the
case study Run of Mydan, a first person single-player and                    What changes – and what is also challenged – in
multiplayer flying VR shooter, we discuss and ruminate on                    comparison with the traditional way of playing video
VR UI, and its influence on players in terms of UX (user                     games, is the player sense of interaction and immersion.
experience). Drawing specific attention on affordances,                      Experiencing VR games, players interact with their
usability, discoverability and feedback, we analyse how the                  surroundings (artificial environment) through senses and
developers of this game dealt with the UI as embedded into                   limbs, and the information flow is bidirectional: through
the environment or displayed on the avatar’s body. We                        senses, it goes from the environment to players; through
focus on how a diegetic interface facilitates the player in                  limbs, it goes backward. Conceptually, this way of
effortlessly understanding the virtual world and reaching                    interacting with the virtual world is more realistic than the
immersion. Based on the findings, we conclude that UX                        mouse-and-keyboard (for pc games) or pad/joystick (for
benefits from the intuitive diegetic solutions that the                      consoles and arcades games) mediated way: indeed, players
developers adopted, providing support for the “zero                          use motor schemes that have been previously acquired in
interface” approach in conveying information in virtual,                     the naturalistic setting of the physical environment to
three dimensional environments.                                              perform everyday actions. On the contrary, traditional ways
                                                                             of mediating interaction involve acquiring motor patterns
Author Keywords                                                              that are only loosely related with the resulting in-game
Games; Virtual Reality; User Interface; User Experience;                     meaning (as pressing the “w” key for moving the avatar
Immersion.                                                                   forward) and often suffer from cross-modality interferences
ACM Classification Keywords                                                  (as using buttons on the left to have the avatar perform
K.8.0. Personal Computing: General – Games; H.5.m.                           actions on the right [13]). However, even by taking realism
Information interfaces and presentation (e.g., HCI):                         into account, there are recurrent perceptual UX problems.
Miscellaneous.                                                               Ever since their conception in the 60’s, VR via head-
                                                                             mounted systems has been involved firstly in visual and
INTRODUCTION                                                                 secondary auditory senses [24], revealing a quite persistent
Virtual reality (VR) is a system of principles, methods and                  lack of coverage on other senses. In particular, the absence
techniques intended to give the player a more realistic way                  of haptic stimuli and of a physicality of the virtual world
of perceiving and experiencing the surroundings by                           impacts on VR realism and immersion [15,16]. Although
reproducing via modelling and simulation an artificial,                      the use of wearable technology, physical props, and the
three-dimensional space. Especially speaking of games, VR                    possibility of having players walking while navigating VR
is now on the rise, benefitting from price cuts on VR                        environments (e.g.: [5]), the problem is still far from an
                                                                             easy solution. In fact, such approaches cannot fully nor
                                                                             smoothly recreate the experience of touching objects.
                                                                             That said, we are in front of haptic and proprioception
                                                                             issues, where the first refers to the sensory domain of touch,
                                                                             that have already been mentioned, while the second relates
                                                                             to those stimuli that are produced and perceived because of
GHItaly18: 2nd Workshop on Games-Human Interaction, May 29th, 2018,          the position of our body in the environment/space and its
Castiglione della Pescaia, Grosseto (Italy)
Copyright © 2018 for the individual papers by the papers' authors. Copying
                                                                             locomotion. Indeed, our brain continuously checks the
permitted for private and academic purposes. This volume is published and    proprioceptive and visual consequences of motor
copyrighted by its editors.                                                  commands (e.g.: [2]). In some cases, a mismatch occurs,
such as when patients with an amputated limb try to move          throughout its design process (from the first demo to the
it. The majority of these patients experience the vivid           current version). We used variable methods: informal
presence of a “phantom limb” associated with extreme pain.        interviews, direct observation, participant observation via
Crucially, by restoring the visual feedback, the pain is also     moderate participation in the design phase (impacting on
instantly reduced [19], suggesting that the sensorimotor          the implementation of the game) and playing with the
mismatch may be interpreted by the brain as pain [18].            current version, collective discussions, and self-analysis. As
Similarly, when looking at virtual reality, the matching          a matter of fact, one of the authors of this paper has been
between what is being perceived via visual and non-visual         the first playtester of the early access release, the first
channels may evoke bizarre experiences. One such                  public version of the game. The benefits of conducting
experience is known as the cybersickness or VR sickness, a        observation and interacting over an extend span of time
feeling that closely resembles motion sickness.                   (about 1 year, at the time of writing) lies in the collection of
Interestingly, while for motion sickness vestibular               those reasonings that are not influenced by a posteriori fact,
stimulation is necessary, with visual stimulation being a         but rather progressive improvements, for example how
possible contributing factor, in VR sickness only the visual      discrepancies have been resolved.
stimulation occurs [11]. In line with the phantom limb
                                                                  According to Howell [7], we conducted the research: 1)
example, this suggests that the mismatch between the
                                                                  Establishing encounters with the developer team before
information flowing through the visual channel and that
                                                                  starting the study; 2) In the field, entering the community
elaborated through another channel (in this case,
                                                                  since the game was in a demo phase; 3) Recording
proprioceptive information) is the triggering factor. This
                                                                  observations and data via (a) field notes and (b) semi-
issue is still being dealt with.
                                                                  structured interviews, being aware of possible subjective
In developing the first VR contents, multiple UX problems         biases and prejudices [1,6,21]; 4) Analyzing data by (a)
popped out concerning UI. To lessen its impact, answering         thematic and (b) narrative analysis.
the necessity of delivering complex information, a design
                                                                  RESULTS AND DISCUSSION
solution can be embedding information into the                    Run of Mydan is a first person single-player and multiplayer
environment, following the idea that the best interface is no     flying VR shooter. The two game modes differ not only in
interface [10] – trying not to expose it, and not even to refer   the number of players involved but also in the navigation
to the way it is used in two dimensional games. A three-          mechanics. Thus, we go from describing common features
dimensional environment offers different affordances than a       to vetting into the main differences. Finally, we analyse the
two-dimensional one, including those that refer to our            menu interface and its UI, as an element deliberately
innate perception-action patterns (since we live in a three
                                                                  designed as separate from the game environment. The game
dimensional space) [4,14,17]. Taking advantage of this, VR
                                                                  can be played either with the HTC Vive VR system or with
designers could approach UX problems simply demanding             the Oculus Rift one, alongside a pair of controllers and
the interaction learning to already known patterns.               turrets to track the player. In the following, we embrace a
In the light of this reasoning, our research question regards     twofold perspective: the one of the player who experiences
how players grasp information from the VR environment,            the game, and the one of the developers who took specific
and hence understand how to interact with the game                decisions in terms of interaction, aesthetics and so on.
elements. Sense-making issues (as wayfinding or                   General features
interaction with the environment), complications with             The entire gameplay is based on a singular assumption: the
actual navigation in real spaces, and interaction with the UI     player’s avatar and the enemies can be damaged until they
(techniques from handheld to full-body), are just some of         die. As such, the main goal of the player is to survive and
the main problems. Recognizing their existence, as well as        kill enemies. In Run of Mydan players can perform the
the inconsistencies and discrepancies that tag along, we          following actions: moving, attacking (with the currently
propose to go through a case study for ruminating how             selected weapon) or blocking (by generating and using a
hands-on experiences can serve to unpack some recurrent           shield). Depending on the game mode, these actions have
problem and overcome frequent usability issue. Then, based        different effects in terms of gameplay, since can affect
on our observations we discuss the game affordances as            different game elements.
usability, discoverability, feedback, and what kind of
information the UI convey to those who play.                      Acknowledging Lee and colleagues’ research on avatars’
                                                                  spatial navigation of virtual environments [12], and relying
METHODOLOGY                                                       on the concept of peripersonal space alongside the
From a methodological perspective, the research conducted         extrapersonal and personal ones, we draw our attention on
on the artifact is based on qualitative research on Virtew’s      its implication in terms of interactions between avatar,
Run of Mydan (2017) as our case study. The analysis               player, and environment. According to the authors,
follows how the investigation of the game experience              navigating the space with their avatar, players tend to
according to user-centered approach resulted into                 ignore visual stimuli located “outside of the avatars’
implications and implementations of the VR game,
                                                                  peripersonal spaces in which the avatars cannot interact,
thereby irrelevant informational space” [12]. To increase       configurations are irrelevant to the current rumination, what
immersivity, the game uses no head-up display (HUD) to          concerns us is that in the multiplayer mode, the interaction
convey information, but it is either embeded it in the          with others consists of fights and occurs just through
environment or embodied it in the three-dimensional             weapons – not barehanded harm can be provided. When
elements themselves. Following the "zero interface"             damage is received, the avatar health decreases
principle of the VR medium, the UI and its elements have        proportionally to the hits; recovery starts few seconds after
been made contextual to the environment itself, therefore       the last hit. When the avatar dies, it respawns with full
fluidly merged with the it or with the avatar, as explained     health, and a point is given to the opposing enemy/team.
below. In terms of affordances, embedded or embodied UIs
                                                                Single-player mode
have an impact on usability: they do not communicate their      The single-player mode differs from the multiplayer mode
presence to players, but players get to intuitively know        mainly because it does not allow “free flight” movement:
about their existence and function. Run of Mydan UI mainly      the avatar is indeed enchained to a floating platform. As
relies on discoverability, as the degree of ease with which     such, the movement system previously seen for the
the player discovers the elements and features of the game      multiplayer mode only moves the platform slightly on the
system as far as they are first encountered, and on the game    left or right or makes it accelerate or slow down; the
ability to contextually and timely provide understandable       platform follows an invisible path (a sort of rail) over which
feedback of what is going on. For example, if the flying        the player has little control. However, the player can move
modes are grounded on quite intuitive and graspable             across the two-dimensional plane of the platform,
interactions, especially because they rely on the well-known
                                                                corresponding to about 2.5x2.5 meters. By walking around
Ironman and Superman imaginary with equally clear
                                                                the tracked area, the movement, centered on the avatar, is
affordances, the use of weapons is otherwise based on a         recreated in the VR space. Because of some chains, which
different and “less natural” reasoning and affordances that     are a simple, narrative-based and very effective visual
require to start a learning process – as discussed below.       stratagem to communicate the game affordances, the player
Multiplayer mode(s)                                             is led to know that the platform is the only walkable space.
The multiplayer mode features a 3D, gravity free                As such, the player is informed that by being on the
environment in which player’s avatar floats and moves.          platform s/he can move as if s/he was affected by gravity
Such environment is a finished world, and its extension is      (even if the platform is not, since it moves floating along an
signaled by an invisible wall that appears once encountered:    invisible rail). This aspect alongside the properties of some
when a player reaches such border, a visual effect that         environmental elements and enemies are a source of
could be described as a disintegrating net of floating          ambiguity, since some elements are inexplicably subject to
orange-stroked triangles appears. If the avatar does not        gravity or not. This is certainly an unresolved issue that
touch such fringe, nothing signals its existence neither its    produces a cognitive dissonance, due to the environmental
proximity: matter-of-factly players are allowed to see the      physics; however, the chain expedient provides a diegetic
rest of the world through such net, but they cannot             reason to the player who cognitively matches what s/he sees
reach/explore it. This brings to a situation that provides      to what s/he understands of the world: in-game position and
situated information just when it is needed, and in the         locomotion, as well as the feeling of gravity itself.
meanwhile it is providing enhancing player’s feeling of
                                                                Focusing on the interaction with the world, the player
being in a full world rather than in a mere portion.
                                                                simultaneously needs to 1) defend from environmental
Then, focusing on in-game locomotion, players can perform       perils and assaults/offences by standard enemies as well as
360 movements to navigate the environment. Movements            giant end-level bosses, through the use of shields and flying
can be performed in two modalities, selected in the settings.   skills, and 2) attack such enemies or dangerous
Selecting the Ironman mode, for moving players need to          environmental elements using the available weapons. In
point the controllers at the direction that they want to be     certain occasions, players also need to use an appropriate
pushed from (i.e., pointing in front of myself to be            combination of attack, defense and movement. In this
propelled backwards); in the Superman mode, players point       mode, enemies do not recover from damage, but the
at the direction they want to be pulled in (i.e., pointing in   player’s avatar does. Then, when the avatar dies, it
front of myself to go forward). In the early design phases,     respawns in a previous checkpoint, full health. However,
the Ironman mode was the only modality. However, tests          checkpoints presence and position are not communicated,
ran users showed that such modality of fly was not easily       and players are not aware of their existence, until the avatar
grasped and handled by all those who played it. In answer       dies and respawns. This choice undermines the player’
to such issue, the “superman” mode has been introduced.         sense of consistency of actions, since s/he is not informed
                                                                about a mechanic that is available in the game.
That said, we switch from the interaction with the
environment to the one with other online players. Players       Menu UI and player UX
can enter several configurations (1 on 1 deathmatch, team       The menu on which the player selects the game mode,
vs team deathmatch, dominion, etc), however while such          configures settings, customizes avatars and so on is located
in a separate space from the one in which the the gameplay       facilitating the path from engagement to engrossment and
occurs. By entering this space (fig. 1), the player leaves the   total immersion [3]. In fact, all the information regarding
game environment and a new nightly environment appears,          in-game meaningful statuses are embedded with
with a menu consisting of a set of three-dimensional             appropriately diegetic representations [8]. Recognizing the
buttons and writings spanning 180° of the visual field and       central role of immersion, in the following we expand some
being centered on the player (fig. 1). The menu appears in       of the concepts that contribute to it (and its maintenance).
the peripersonal space [12], so that buttons are reachable by    The first point regards information overload, possibly due
extending the hand with no need for additional movements         to an excessive amount of informative elements that could
such as steps. However, by not having physicality nor            affect in a negative way the player decision-making
haptic feedback, the virtual hand can use the same space         process. In fact, dealing with a virtual representation, we
occupied by the button. The interaction with the buttons         undergo a peculiar contradiction that stands “between our
requires to have the virtual index finger “inside” the virtual   impression of virtually unlimited perceptual content and the
button, where the status of “being pressed” is signaled by a     existence of severe attentional limitations” [22]. To reduce
rather unexpected reaction: the button becomes partially         the information overload and smoot as much as possible the
transparent. This interaction results counterintuitive and       play experience, the game developers firstly developed an
unrealistic, and produces a certain ambiguity.                   effective navigation system for both the single and
                                                                 multiplayer modes, fitting the game general coherence and
                                                                 timely providing answers to the task to accomplish in the
                                                                 game space. Then the 3D environment has been designed
                                                                 for balancing (and ameliorating) how information is
                                                                 provided. Aptly, the way in which the game system shows
                                                                 visual information should regard only the items needed to
                                                                 accomplish a task (at hand), coherently and timely, instead
                                                                 of forming an extended, detailed representation of the full
                                                                 variety of objects in the surrounding environment [20]. The
                                                                 UI should enable to handle multiple and dynamic
                                                                 information, also exploiting our spatial cognition
                                                                 capabilities. Just the player’s damage information is
                                                                 traditionally provided, responding to the very habits of
               Figure 1. The menu and its UI.                    players: when the player is repeatedly hit and damaged, the
                                                                 vision of the world turns red with a contrast that gets
                                                                 stronger the more serious the damage is. When life is
                                                                 recovered the colour returns to normal and the life bar on
                                                                 the forearm fills up. That said, during the gameplay, the
                                                                 player is provided with some information that rather than
                                                                 being overlaid on the screen, are wisely situated into the
                                                                 virtual space. This allows players to bypass them by
                                                                 “walking across the information themselves”, providing the
                                                                 conceptual, and cognitive, implication that they are part of
                                                                 the VR world. According to our direct experience, the way
                                                                 in which the UI has been embedded in the avatar’s body as
                                                                 well as in the environment itself contributes to increase
                                                                 immersivity rather than producing a sort of detachment due
Figure 2. The button getting transparent when being presses.
                                                                 overlay of information. Indeed, recognizing the
Moreover, to perform every choice here mediated by these         potentialities of the avatar’s body in being a diegetic
three-dimensional buttons (fig. 2), the player has to push       element that can be used for providing supplementary
the trigger on the controller with the index finger. In so       information, basic information as health and weapon
doing, we obtain a sensorimotor mismatch: even if the            readiness states are displayed on the arm of the avatar (in
player is pressing the button on the controller, the virtual     line with how it has been done in Dead Space 2, a solution
hand does not move, and the button does not look as “being       already discussed in [9], [8] and [25]).
pressed”. According to our experience, this mismatch is
                                                                 On the contrary, more advanced information is conveyed by
perceived as a friction with the interaction, resulting into a
                                                                 means of several intuitive affordances. For example, the
fracture of the immersion [3,15].
                                                                 avatar status is represented on the arm and the line of fire
UI embedded in the gameplay                                      can be inferred by aligning a set of three-dimensional
The UI has been designed and implemented to let the              floating triangles resembling the behaviours of aiming with
player’s experience be as immersive as possible by               a shotgun (fig. 3). From an UX point of view, these UI
design choices result as consistent as meaningful, in           overall manipulation. Reducing the number of choices to a
addition to be diegetic. They allow players to behave in a      maximum of three weapons to be selected among all those
natural way, and simply check their arm for information         present in the game while in the the menu space, the
about their health and weapon recharge states, or point their   cognitive load is relatively low and the manipulation
weapon using triangles to aim, rather than adding layers of     happening directly on the interested hand keeps the
information in the environment – as a non-diegetic life bar     selection intuitive. This iteration certainly took into
or aiming cross in the middle of the field of view.             consideration the principle of discoverability, while the
                                                                second and third ones could not be described as user-
                                                                friendly, even if we recognise the attempt to maintain a
                                                                diegetic coherence. In Run of Mydan, the first playtests
                                                                highlighted troublesome interactions, showing a persistent
                                                                discrepancy between perceived affordances and unexpected
                                                                results. To obtain a coherence between perceived and real
                                                                affordances, the developers modified certain interactions
                                                                (as the weapon selection one) and introduced specific
                                                                feedback that are consistent throughout the gameplay [4],
                                                                but also meaningful in narrative terms.
    Figure 3. The avatar arm with the health and weapon
      readiness state, and the weapon pointing system.          Finally, to convey further information they introduced the
                                                                haptic feedback. Controllers vibrate when a player uses a
Like navigation, also the interaction with the environment
                                                                weapon to shoot, but also when some weapons are ready or,
occurs through movements of the upper part of the body:
                                                                conversely, when some other weapons are fully discharged.
pushing the controller buttons, or orienting them as an
                                                                Additionally, vibration occurs also in one instance in which
extension of the player’ arms in the space to fly or shoot,
                                                                the shield is broken. In our experience, while vibration feels
avoiding complications due to composite actions. The
                                                                like a nice feature of these few actions, the coherence with
weapon selection currently involves the use of a dedicated
                                                                which it is implemented conveys little meaning.
button on the controller that can be pressed with each
thumb to change the selected weapon on the corresponding        CONCLUSION
virtual hand. The selected weapon is communicated by an         We analysed the use of UI in a three dimensional, virtual
icon on the back of the corresponding hand of the avatar,       environment in the game Run of Mydan, in which the
becoming in turn an embedded information. While no              developer’s attempt was to adopt the diegetic approach to
visual feedback in the VR space corresponds to the thumb        facilitate both players’ immersion and their understanding
movement, this design solution solves a number of issues        of the game world. While the resulting product reaches this
that were detected during the playtest sections. The first      goal in many aspects, some issues are still left to be
iteration to select the weapon was a swipe on the controller    satisfyingly solved. In fact, the diegetic informative
pad, that provided no feedback, but the appearance of the       elements embedded in the environment/body are coherent
selected weapon in the avatar hand in a following moment        with both the design and psychological guidelines that
than the selection itself. The second iteration involved the    suggest to use intuitive patterns and affordances. These, in
appearance of a semi-transparent fan-like panel presenting      turn, trigger already known motor patterns and facilitate the
the possible choices on the back of the hand on which the       learning process, while rendering the gameplay more
weapon was being selected. In this case, the player had to      intuitive and the immersion deeper. Along with these
reach the back of that hand with the opposing hand, and act     benefits, the sensorimotor matching and the cognitive
on it with a complex manipulation involving a spline            match between the bodily feelings and the visual
generated on the wrist that needed to be connected with the     stimulation are taken into account and exploited with
weapon icon. Then, after performing such a complex              diegetic solutions. The result is a game in which most of the
manipulation with the opposing hand, the weapon was             information is conveyed in an intuitive and straightforward
selected for the hand that was not manipulating. Playtesters    manner, and players can quickly grasp it and effortlessly
reported this solution as very counterintuitive. The third      interact with the world. However, a number of issues
iteration of the weapon-choice interface involved less          remain open: non-diegetic elements present various degrees
manipulation by the opposing hand. The panel of choices         of interference with total immersion. A great deal of work
was placed on the back of the shield: in so doing, to change    has been devoted to the weapon-selection interface, but the
the weapon on one hand the player had to press the              “non-diegetic button press” solution still leaves the
controller shield button (a thumb press) with that hand.        cognitive load low while representing the current status in a
However, the problem remained, as the hand performing           diegetic way. On the opposite, the menu interface is by
the selection still wasn’t the one being affected by it. The    definition non-diegetic, in that the game needs to be paused
second and third iterations did not solve the issue, since      to access it. However, in this space the UI is represented in
they produced expectations later disappointed. The last and     a three dimensional space with convenient affordances. In a
current iteration simplifies the problem by limiting the        sense, while the menu space is separate from the game
space, both have their own diegetic, but incompatible,               http://www.gdcvault.com/play/1017723/CraftingDestru
meanings. Unfortunately, the interaction with the menu               ction-The-Evolution-of
breaks the immersion, forcing the player to use body             10. Golden Krishna. 2015. The Best Interface is No
movements with no in-game reconstruction and based on                Interface: The Simple Path to Brilliant Technology.
the counterintuitive assumption that two objects may                 New Riders, USA.
occupy the same spatial position. This analysis suggests
that the diegetic, no interface is the best interface approach   11. Joseph J. and LaViola Jr. 2000. A Discussion of
[10] is useful in providing a barrier-less path to total             Cybersickness in Virtual Environments. SIGCHI Bull.
immersion in VR. However, while single case studies are              32, 1: 47-56.
useful in exploring state-of-the-art solutions, whereas          12. Jooyeon Lee, Manri Cheon, Seong-Eun Moon and
evidence-based directions for UI and UX design are                   Jong-Seok Lee. 2016. Peripersonal Space in Virtual
required the topic needs further quantitative-methods                Reality. Proceedings of the 29th Annual Symposium on
explorations.                                                        User Interface Software and Technology, ACM, 207-
ACKNOWLEDGMENTS
                                                                     208.
We thank Virtew and its team for their time and valuable         13. Alan Mattiassi. 2017. Command Systems and Player-
contribution, for providing us with materials, screenshots           Avatar Interaction in Successful Fighting Games in
and access to the game.                                              Light of Neuroscientific Theories and Models. GHItaly
                                                                     CEUR Proceedings.
REFERENCES
1.   Anne-Marie Ambert, Patricia A. Adler, Peter Adler,          14. Joanna McGrenere and Wayne Ho. 2000. Affordances:
     and Daniel F. Detzner. 1995. Understanding and                  Clarifying and evolving a concept. Graphics Interface,
     Evaluating Qualitative Research. Journal of Marriage            179-186.
     and Family 57, 4: 879–893.                                  15. Alison McMahan. 2003. Immersion, engagement and
2.   S. J. Blakemore and J. Decety. 2001. From the                   presence. In The Video Game Theory Reader, Mark JP
     perception of action to the understanding of intention.         Wolf and Bernard Perron (eds.). Routledge,
     Nature Reviews. Neuroscience 2, 8: 561–567.                     London/New York, 67-86.
3.   Emily Brown and Paul Cairns. 2004. A Grounded               16. Janet Horowitz Murray. 1997. Hamlet on the
     Investigation of Game Immersion. CHI ’04 Extended               Holodeck: The Future of Narrative in Cyberspace. MIT
     Abstracts on Human Factors in Computing Systems,                Press, Cambridge, MA.
     ACM, 1297–1300.                                             17. Donald A. Norman. 1999. Affordance, Conventions,
4.   Cardona-Rivera Rogelio Enrique and Young R.                     and Design. Interactions 6, 3: 38-43.
     Michael. 2014. A Cognitivist Theory of Affordances          18. Vilayanur S. Ramachandran and Eric L. Altschuler.
     for Games. DiGRA Conference. 2013.                              2009. The use of visual feedback, in particular mirror
5.   Lung-Pan Cheng, Thijs Roumen, Hannes Rantzsch, et               visual feedback, in restoring brain function. Brain: A
     al. 2015. TurkDeck: Physical Virtual Reality Based on           Journal of Neurology 132, Pt 7: 1693-1710.
     People. Proceedings of the 28th Annual ACM                  19. Vilayanur S. Ramachandran, David Brang and Paul D.
     Symposium on User Interface Software & Technology,              McGeoch. 2009. Size reduction using Mirror Visual
     ACM, 417-426.                                                   Feedback (MVF) reduces phantom pain. Neurocase 15,
6.   Kathleen M. DeWalt and Billie R. DeWalt. 1998.                  5: 357-360.
     Participant observation. In Handbook of Methods in          20. Ronald A. Rensink. 2000. The dynamic representation
     Cultural Anthropology, H. Russel Bernard (ed.).                 of scenes. Visual Cognition 7, 1–3: 17-42.
     AltaMira Press, Walnut Creek, CA, 259-299.
                                                                 21. Norman K. Denzin and Yvonna S. Lincoln (eds.),
7.   Joseph T. Howell. 1972. Hard Living on Clay Street:             Handbook of Qualitative Research (2nd Ed.). Sage
     Portraits of Blue Collar Families. Waveland Press,              Publications, Thousand Oaks, CA.
     Inc., Prospect Heights, IL, 392-403.
                                                                 22. Claudia Roda. 2011. Human Attention in Digital
8.   Ioanna Iacovides, Anna Cox, Richard Kennedy, Paul               Environments. Cambridge Univ. Press, New York, NY.
     Cairns and Charlene Jennett. 2015. Removing the
     HUD: The Impact of Non-Diegetic Game Elements               23. SuperData 2018. Year in Review 2017. Digital and
     and Expertise on Player Involvement. Proceedings of             Interactive Media.
     the 2015 Annual Symposium on Computer-Human                 24. Ivan E. Sutherland 1968. A head-mounted three
     Interaction in Play - CHI PLAY ’15, ACM, 13-22.                 dimensional display. Proceedings AFIPS ’68, 757-776.
9.   Dino Ignacio. Crafting Destruction: The Evolution of        25. Max Taylor. 2017. Augmenting The HUD: A Mixed
     Dead Space User Interface. Game Developers                      Methods Analysis on the Impact of Extending the
     Conference 2013 Talk. Retrieved from:                           Game UI Beyond the Screen