<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta />
    <article-meta>
      <title-group>
        <article-title>An experimental protocol to assess immersiveness in video games</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Marika Malaspina</string-name>
          <email>m.malaspina11@campus.unimib.it</email>
          <xref ref-type="aff" rid="aff1">1</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Jessica Amianto Barbato</string-name>
          <email>jessica.amiantobarbato@unimib.it</email>
          <xref ref-type="aff" rid="aff1">1</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Marco Cremaschi</string-name>
          <email>marco.cremaschi@unimib.it</email>
          <xref ref-type="aff" rid="aff1">1</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Francesca Gasparini</string-name>
          <email>francesca.gasparini@unimib.it</email>
          <xref ref-type="aff" rid="aff0">0</xref>
          <xref ref-type="aff" rid="aff1">1</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Alessandra Grossi</string-name>
          <email>alessandra.grossi@unimib.it</email>
          <xref ref-type="aff" rid="aff0">0</xref>
          <xref ref-type="aff" rid="aff1">1</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Aurora Saibene</string-name>
          <email>aurora.saibene@unimib.it</email>
          <xref ref-type="aff" rid="aff0">0</xref>
          <xref ref-type="aff" rid="aff1">1</xref>
        </contrib>
        <aff id="aff0">
          <label>0</label>
          <institution>NeuroMI, Milan Center for Neuroscience</institution>
          ,
          <addr-line>Piazza dell'Ateneo Nuovo 1, 20126, Milano</addr-line>
          ,
          <country country="IT">Italy</country>
        </aff>
        <aff id="aff1">
          <label>1</label>
          <institution>University of Milano-Bicocca</institution>
          ,
          <addr-line>Viale Sarca 336, 20126, Milano</addr-line>
          ,
          <country country="IT">Italy</country>
        </aff>
      </contrib-group>
      <abstract>
        <p>In the video game industry, great importance is given to the experience that the user has while playing a game. In particular, this experience benefits from the players' perceived sense of being in the game, or immersion. The level of user immersion depends not only on the game's content but also on how the game is displayed, thus on its User Interface (UI) and Heads-Up Display (HUD). Another factor influencing immersiveness reported in the literature is the player's expertise: the more experience the user has with a specific game, the less on-screen information they need to be immersed in the game. A player's level of immersion can be assessed both by using questionnaires on their perceived experience and by exploiting their behavioural and physiological responses while playing the target game. Therefore, in this paper, we propose an experimental protocol to assess the immersiveness of gamers while playing a third-person shooter (Fortnite) with a standard, a diegetic, and a proposed HUD. A subjective evaluation of immersion will be provided by completing the Immersive Experience Questionnaire (IEQ), while objective indicators will be provided by face tracking, behaviour, and physiological response analyses. The ultimate goal of this study is to define guidelines for video game UI development that can enhance the players' immersion.</p>
      </abstract>
      <kwd-group>
        <kwd>Immersiveness</kwd>
        <kwd>Heads-Up Displays</kwd>
        <kwd>Affective gaming</kwd>
        <kwd>Third-person shooter</kwd>
        <kwd>PlayStation</kwd>
        <kwd>Fortnite</kwd>
      </kwd-group>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>1. Introduction</title>
      <p>
        In the gaming sector, immersion is considered an important element of the interaction experience
[
        <xref ref-type="bibr" rid="ref1">1</xref>
        ]. In the literature, the concept of immersion is generally connected to and defined through the
concept of flow, established by the psychologist Mihály Csikszentmihalyi in 1975 [
        <xref ref-type="bibr" rid="ref2 ref3">2, 3</xref>
        ]. The author
has described flow as an optimal experience, i.e., an experience achieved by people
when they enter a state of euphoria characterised by total attention on what they are
doing, by loss of the sense of self, and by an alteration of time perception.
      </p>
      <p>
        While Csikszentmihalyi’s studies were performed on different types of people, such as
sportsmen and artists, the concept has more recently been applied to other fields. In particular, considering
video games, the literature presents different opinions about the actual overlap between the flow
and immersion concepts [
        <xref ref-type="bibr" rid="ref1 ref4">1, 4</xref>
        ], where immersion can in this case be simply described as “the
sense of being in the game”. This means that immersion can be considered not only as a feeling
of presence in the game but also as an actual cognitive state in which people are immersed in
their activity (though not as deeply as in flow) [
        <xref ref-type="bibr" rid="ref5">5</xref>
        ].
      </p>
      <p>
        The subject of immersion in the video game context has prompted numerous studies and
projects. For example, in Sanders and Cairns’s work [
        <xref ref-type="bibr" rid="ref6">6</xref>
        ], video game immersion is analysed
considering the timing of the game session and the use of music, while Christou [
        <xref ref-type="bibr" rid="ref7">7</xref>
        ] provides an
analysis of the relationship between immersion and appeal.
      </p>
      <p>A significant aspect of this topic involves analysing immersion starting from the video
game UI. The UI plays a crucial role as it is the point of interaction between the player and the
video game. It also acts as a filter for the player’s game experience.</p>
      <p>
        Among the works that focus on the study of immersion in video games by considering their
UIs, Iacovides et al.’s work [
        <xref ref-type="bibr" rid="ref8">8</xref>
        ] compares the diegetic and non-diegetic UI versions of the case
study, which is Battlefield 3 1, and evaluates whether both UIs lead to a positive game experience using
questionnaires. Afterwards, they examine the involvement experienced by novices and experts
for each of the proposed UI versions. Notice that, in this case, an expert is defined as a player
who has played a specific game or game category for at least half an hour per week. The authors find
that expert players can be more immersed when the HUD is diegetic, i.e., when no
information is displayed on screen.
      </p>
      <p>
        In gaming, another aspect that can be investigated to analyse immersivity is the players’
physiological response during the gameplay experience. As part of the research on this topic,
Lages [
        <xref ref-type="bibr" rid="ref9">9</xref>
        ] introduces the concept of immersive entertainment to identify all the entertainment
applications devised for immersive systems. The author presents eight factors influencing
immersive entertainment: (i) presence and user experience understanding, (ii) characters and
storytelling, (iii) interaction technique design, (iv) virtual environment rendering, (v)
physiological sensing and biofeedback introduction, (vi) social experience improvement, (vii) user
safeguarding and responsible design, and (viii) content creation tool availability.
      </p>
      <p>In particular, Lages points out for factor (v), on which we are particularly interested, that
there is currently (2021, year of publication) a vast use of facial feature trackers, but that
newer headsets integrating these sensors with physiological ones, i.e., electroencephalogram
(EEG), electrooculogram (EOG), electromyogram (EMG), electrodermal activity (EDA), and
photoplethysmogram (PPG), are being developed, especially to detect users’ emotional states.</p>
      <p>
        On this topic, Hughes and Jorda [
        <xref ref-type="bibr" rid="ref10">10</xref>
        ] present biosignals commonly used in research on gaming,
especially for rehabilitation purposes and/or targeting people with disabilities. In particular, the
authors find that mostly EMG, EEG, EDA, eye tracking with EOG, and electrocardiogram ( ECG)
non-invasive technologies are employed considering both user monitoring (passive mode) and
1Electronic Arts Inc. (https://www.ea.com/en-gb/games/battlefield/battlefield-3?setLocale=en-gb accessed
September 21, 2023)
afective gaming (active mode). Considering the provided analyses, EMG is involved in an active
modality, and thus for game control or rehabilitation exercises; EEG is efficient for emotion
detection and neurological assessment in a passive mode, while its signals can be used actively
for game control and virtual environment change; eye tracking and EOG provide information
on the user’s fatigue, the points of interest in the visualised content, and more actively these
signals can be used to control games; ECG allows stress monitoring and the level of engagement,
as well as changes of environment and gameplay experience according to the monitored signals;
these changes can be also enabled by analysing the EDA signals, which are good indicators of
stress, panic, and other negative emotions.
      </p>
      <p>
        Despite the extensive research conducted using physiological signals, their implementation
within the context of video gaming remains limited. Bian et al. [
        <xref ref-type="bibr" rid="ref11">11</xref>
        ] propose a model based on
the idea that flow is a state of moderate arousal accompanied by a state of joy. For this reason,
physiological parameters related to these processes can predict the flow state. In the proposed
model, five indicators are included, i.e., facial EMG, cardiovascular activity, EDA, respiratory
activity, and EEG, accompanied by their features. Notice that this model is also based on a
previous study [
        <xref ref-type="bibr" rid="ref12">12</xref>
        ], which found the connection between flow and heart rate variability using
a driving simulator and is applied in the context of Virtual Reality (VR).
      </p>
      <p>Therefore, inspired by the works of Iacovides et al. and Bian et al., we propose an experimental
protocol to evaluate immersion, considering the characteristics of the UIs in the Fortnite2 video
game, the level of expertise of Fortnite users, and their physiological responses.</p>
      <p>In particular, the IEQ introduced by Iacovides et al. will be used to evaluate the subjects’
perceived level of immersion. Moreover, face tracking, EEG, Galvanic Skin Response (GSR), PPG,
and respiratory data will be collected for a physiological evaluation of the phenomenon under
analysis, following the guidelines derived from Bian et al.’s physiological model.</p>
      <p>This study aims to define guidelines for creating video game UIs that can optimise the
immersion of expert and novice players within the target video game category, i.e., third-person
shooter (TPS). Thus, the research coming from the proposed experimental protocol is intended
to answer the following research questions:
• How can we discriminate different levels of immersiveness?
• Do the physiological signals contribute to the interpretation of the levels of immersiveness
when considered separately or as an ensemble?
• Does the players’ expertise influence the outcome, and to what extent?
• Does the information present on the UI affect the players’ immersiveness?</p>
      <p>The rest of the paper is organised as follows. Section 2 provides a brief overview of the role
of Artificial Intelligence (AI) in physiological signal interpretation, especially in the fields of
affective computing and gaming. Section 3 and Section 4 report the experimental settings and
protocols. Finally, conclusions are drawn in Section 5.
2Epic Games Inc. (https://store.epicgames.com/en-US/p/fortnite?lang=en-US accessed September 19, 2023).</p>
    </sec>
    <sec id="sec-2">
      <title>2. The role of artificial intelligence</title>
      <p>
        Video games elicit emotions in their players, influencing their sense of being in the game [
        <xref ref-type="bibr" rid="ref13">13</xref>
        ].
Moreover, the player’s behaviour directly influences the gameplay and game objectives.
      </p>
      <p>
        Affective computing automatically recognises human emotions by studying implicit measures
derived from physiological sensors and facial expressions, allowing close interaction between
humans and machines [
        <xref ref-type="bibr" rid="ref14 ref15">14, 15</xref>
        ]. In fact, affective models can be designed to exploit and translate
the data collected from the previously cited sensors (Section 1). This enables computer systems
to perceive and interpret human emotions and consequently provide intelligent, empathetic,
and responsive interactions [
        <xref ref-type="bibr" rid="ref16">16</xref>
        ].
      </p>
      <p>
        Therefore, affective computing can be leveraged to dynamically change a video game,
adjusting the storyline by identifying a player’s emotional state and actions. This particular
application, which may also include game control by means of physiological signals, is called
affective gaming [
        <xref ref-type="bibr" rid="ref17">17</xref>
        ] and has as its main aim the creation of more profound and immersive
gaming experiences that can evoke strong emotions.
      </p>
      <p>A notable and recent example of this approach is Before Your Eyes3, a narrative-driven
adventure game that takes players on an emotional journey through the life of a character
named Benjamin. Players navigate the story by blinking their eyes, whose movements are
tracked by a webcam or an eye-tracking technology.</p>
      <p>Besides being a powerful storytelling medium, the game becomes a means to evoke genuine
emotions. In particular, Before Your Eyes tries to elicit empathy and understanding of Benjamin’s
story in the players. This capacity may represent a good starting point to drive social change and
promote a more compassionate society. Moreover, affective gaming can represent a potential
therapeutic tool, allowing users to explore and process their emotions.</p>
      <p>
        Another important application of affective gaming is related to Dynamic Difficulty
Adjustment (DDA), a mechanism that adapts the game’s difficulty
level to the player’s performance. Performing this operation manually
during gameplay could additionally demoralise the player, who may feel like they are
resorting to this expedient to continue the game. Using information from different sensors, it is
possible to tune the game difficulty and maximise one of four emotional responses: challenge,
competence, flow, and valence [
        <xref ref-type="bibr" rid="ref18">18</xref>
        ].
      </p>
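      <p>As an illustration of the DDA mechanism described above, the following minimal Python sketch nudges the difficulty so that the player's recent success rate stays near a target value. All parameters and the adjustment rule are hypothetical, chosen for illustration rather than taken from the cited work [18].</p>

```python
# Illustrative sketch of a Dynamic Difficulty Adjustment (DDA) loop.
# TARGET_SUCCESS, WINDOW, and STEP are hypothetical tuning parameters.

from collections import deque

TARGET_SUCCESS = 0.5      # aim for a balanced challenge
WINDOW = 10               # number of recent attempts considered
STEP = 0.1                # difficulty increment per adjustment

def adjust_difficulty(difficulty, outcomes):
    """Return the new difficulty given recent win/loss outcomes (1/0)."""
    if len(outcomes) < WINDOW:
        return difficulty                  # not enough evidence yet
    rate = sum(outcomes) / len(outcomes)
    if rate > TARGET_SUCCESS + 0.1:        # player dominating: make it harder
        difficulty += STEP
    elif rate < TARGET_SUCCESS - 0.1:      # player struggling: make it easier
        difficulty -= STEP
    return max(0.0, min(1.0, difficulty))  # clamp to [0, 1]

outcomes = deque(maxlen=WINDOW)
difficulty = 0.5
for result in [1, 1, 1, 1, 1, 1, 1, 1, 1, 1]:   # a winning streak
    outcomes.append(result)
    difficulty = adjust_difficulty(difficulty, outcomes)
```

      <p>After the winning streak fills the observation window, the rule raises the difficulty by one step; a losing streak would symmetrically lower it.</p>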
      <p>Given these premises, affective computing and gaming will be considered in a first phase
to evaluate the players’ responses to the different UIs proposed in the following experimental
protocol, trying to assess users’ levels of immersiveness and their perceived emotions. In a
second phase, a model will be devised to dynamically change the game UI according to the
real-time perceived immersion by the players, exploiting the outcome of the previous phase.</p>
    </sec>
    <sec id="sec-3">
      <title>3. Experimental settings</title>
      <p>This Section provides the information related to the experimental settings of the proposed
experimental protocol, which is divided into three main phases (detailed in Section 4). Therefore,
a brief description of the questionnaires, physiological sensors, and gaming devices used
during the research will be presented.
3https://www.beforeyoureyesgame.com/ accessed September 20, 2023.</p>
      <sec id="sec-3-1">
        <title>3.1. Questionnaires</title>
        <p>The experimental design includes the implementation of a preliminary questionnaire, utilised
during Phase 0 (Section 4.1), and a pre-test and an IEQ questionnaire, employed during Phases
1 (Section 4.2) and 2 (Section 4.3).</p>
        <p>The initial survey analyses responses from expert gamers4, focusing on their preferences
for shooter games. The questionnaire prompts users to identify the elements that require
modification, relocation, or exclusion, with the aim of enhancing the gaming experience.</p>
        <p>
          The questionnaire comprises two parts. The first part is a demographic section covering the
participant’s age, gender, and gaming behaviour. The second part presents various sections of
the game frame and the complete game frame of the case study, i.e., Fortnite. Participants are
requested to comment on potential improvements to individual sections that could enhance
their overall gaming experience. Figure 1 depicts an example of the game frame proposed to the
participant and its accompanying question: “Focusing on this game frame, do you think that it
is possible to modify, move, or delete some of its elements to improve your gaming experience?”.
Notice that the questionnaire is intended for Italian speakers only, and its purpose is purely
to gather feedback from expert users, which will inform adjustments to the HUD in line with
findings from Nielsen’s [
          <xref ref-type="bibr" rid="ref19">19</xref>
          ] and Federoff’s [
          <xref ref-type="bibr" rid="ref20">20</xref>
          ] heuristics.
        </p>
        <p>In Phases 1 and 2, two other questionnaires are employed. The first one is a pre-test
questionnaire, which collects some demographics on the subjects and their gaming expertise.</p>
        <p>
          The second questionnaire, i.e., IEQ, surveys the subject’s video game experience during
4Expert gamers are people who play the shooter game category at least half an hour a week.
the experimental phases. The IEQ, which was also used in the gaming sector by Iacovides
et al. [
          <xref ref-type="bibr" rid="ref8">8</xref>
          ], is a questionnaire that reveals the general experience of immersivity [
          <xref ref-type="bibr" rid="ref4">4</xref>
          ]. The IEQ has
five dimensions, which are analysed through the different proposed questions. In particular, these
dimensions reflect different aspects of an immersive experience, i.e., (i) cognitive involvement,
(ii) emotional involvement, (iii) real-world dissociation, (iv) challenge, and (v) control. These
five interrelated components are evoked by 31 questions, each rated on a 5-point scale.
        </p>
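        <p>As a sketch of how such responses could be aggregated, the snippet below averages item ratings per dimension and sums the dimension means into an overall score. The item-to-dimension mapping shown is purely illustrative and does not reproduce Jennett et al.'s actual assignment of the 31 items.</p>

```python
# Sketch of IEQ-style scoring: items are averaged per dimension and the
# dimension means are summed into an overall immersion score. The
# mapping of items to dimensions below is a hypothetical toy example.

DIMENSIONS = ["cognitive involvement", "emotional involvement",
              "real-world dissociation", "challenge", "control"]

def score_ieq(responses, mapping):
    """responses: {item_id: rating on a 1-5 scale};
    mapping: {item_id: dimension name}."""
    per_dim = {d: [] for d in DIMENSIONS}
    for item, rating in responses.items():
        assert 1 <= rating <= 5, "IEQ uses a 5-point scale"
        per_dim[mapping[item]].append(rating)
    means = {d: sum(v) / len(v) for d, v in per_dim.items() if v}
    return means, sum(means.values())

# Toy example: two items per dimension (hypothetical mapping).
mapping = {i: DIMENSIONS[(i - 1) // 2] for i in range(1, 11)}
responses = {i: 4 for i in range(1, 11)}
means, overall = score_ieq(responses, mapping)
```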
      </sec>
      <sec id="sec-3-2">
        <title>3.2. Face tracking and behaviour analysis</title>
        <p>
          A GoPro HERO10 Black5 is used to collect video recordings (5.3K60 + 4K120 video resolution)
to be exploited for face tracking and behaviour analysis. In fact, previous research [21] has
demonstrated its ability to provide data for detecting the participant’s facial behaviour. This
feature permits facial tracking and analysis of the user’s visual movements to compensate for
the lack of an eye-tracking device [
          <xref ref-type="bibr" rid="ref4">4</xref>
          ].
        </p>
      </sec>
      <sec id="sec-3-3">
        <title>3.3. Physiological devices</title>
        <p>This Section provides a brief overview of the technical characteristics of the selected
physiological devices. Figure 2 depicts the EEG (Section 3.3.1), the GSR/PPG (Section 3.3.2), and the
respiration (Section 3.3.3) devices.
5GoPro Inc. (https://gopro.com/en/us/shop/cameras/hero10-black/CHDHX-101-master.html accessed September 19,
2023).</p>
        <sec id="sec-3-3-1">
          <title>3.3.1. Electroencephalographic device</title>
          <p>The Unicorn Brain Interface6 (Unicorn) is used to record the EEG signals. This interface comprises
the acquisition software, the EEG cap, a default eight-electrode set (Fz, C{3, z, 4}, Pz, PO{7, 8},
and Oz plus ground and reference sensors placed on the subjects’ mastoids), and a Bluetooth
dongle. This wearable device has the advantage of not relying on wired connections and being
characterised by hybrid sensing technologies, i.e., the electrodes can be used with a dry or wet
configuration.</p>
          <p>Notice that all the available electrodes will be employed to capture the neural responses elicited
by the game events, considering that Bian et al. suggest analysing only the alpha and beta
frequency bands, without specifying electrode positioning, and that other works [22] show
that electrodes from different brain areas are selected when emotion recognition is involved. In
particular, Gannouni et al. [22] found that each emotion is coupled with a specific frequency
band and a specific electrode set covering the whole brain surface.</p>
          <p>Finally, notice that the Unicorn analog-to-digital converter has a resolution of 24 bits and a
sampling frequency of 250 Hz. The Unicorn also features a 3-axis accelerometer and gyroscope.
The accelerometer and gyroscope ranges are ±8 g and ±1000 °/s in the x, y, and z directions.
The bandwidth is 44.8 Hz for the accelerometer and 41 Hz for the gyroscope.</p>
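          <p>Since the analyses target the alpha and beta bands, a minimal, dependency-free sketch of band-power extraction from a 250 Hz EEG epoch could look as follows. The 8-13 Hz and 13-30 Hz band edges are conventional choices assumed here for illustration, not taken from the protocol, and a real pipeline would use an FFT rather than this naive DFT.</p>

```python
import math

def band_power(x, fs, f_lo, f_hi):
    """Power of signal x within [f_lo, f_hi) Hz via a naive DFT.
    Adequate for short offline epochs; real pipelines would use an FFT."""
    n = len(x)
    total = 0.0
    for k in range(1, n // 2):              # positive frequencies only
        if f_lo <= k * fs / n < f_hi:
            re = sum(x[i] * math.cos(2 * math.pi * k * i / n) for i in range(n))
            im = sum(x[i] * math.sin(2 * math.pi * k * i / n) for i in range(n))
            total += (re * re + im * im) / (n * n)
    return total

FS = 250                                    # Unicorn sampling rate (Hz)
# 2 s synthetic epoch: a pure 10 Hz tone, which lies in the alpha band.
epoch = [math.sin(2 * math.pi * 10 * i / FS) for i in range(2 * FS)]

alpha = band_power(epoch, FS, 8, 13)        # alpha band: 8-13 Hz
beta = band_power(epoch, FS, 13, 30)        # beta band: 13-30 Hz
```

          <p>On the synthetic 10 Hz tone, virtually all power falls in the alpha band, as expected.</p>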
        </sec>
        <sec id="sec-3-3-2">
          <title>3.3.2. Galvanic skin response and photoplethysmogram</title>
          <p>GSR and PPG signals are acquired with the Shimmer3 GSR+ Unit7 (Shimmer GSR). This device
comprises two electrodes to monitor skin conductivity and an optical pulse sensor to be placed
on the earlobe. Moreover, it integrates a 3-axis accelerometer, gyroscope, magnetometer,
and altimeter. Both PPG and GSR data are acquired using a sampling frequency of 128 Hz.</p>
          <p>As for the Unicorn, the Shimmer GSR is a wearable and wireless device based on Bluetooth
technology. The device measurement range is 10 kΩ-4.7 MΩ (0.2-100 µS) ± 10%, and
22 kΩ-680 kΩ (1.5-45 µS) ± 3%, while the frequency range is 15.9 Hz. In the proposed experiment, the device
will be placed on the participant’s left hand with the two GSR electrodes attached respectively
to the index and pinkie fingers to reduce possible artefacts due to the use of the joypad.</p>
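          <p>A common preprocessing step for such GSR data is separating the slow tonic level from the fast phasic responses. The sketch below assumes a simple moving-average baseline; the window length and the synthetic trace are illustrative assumptions, and practical pipelines use dedicated decomposition methods.</p>

```python
# Minimal sketch of splitting a GSR trace into a slow tonic level and a
# fast phasic component via a centred moving average (an assumed,
# simplified baseline model).

def tonic_phasic(gsr, fs, window_s=4.0):
    """Split a GSR trace (microsiemens) sampled at fs Hz."""
    w = max(1, int(window_s * fs))
    tonic = []
    for i in range(len(gsr)):
        lo, hi = max(0, i - w // 2), min(len(gsr), i + w // 2 + 1)
        tonic.append(sum(gsr[lo:hi]) / (hi - lo))   # centred moving mean
    phasic = [g - t for g, t in zip(gsr, tonic)]
    return tonic, phasic

FS = 128                                            # Shimmer GSR sampling rate (Hz)
trace = [2.0] * (10 * FS)                           # 10 s of flat 2 uS baseline
trace[5 * FS] += 1.0                                # a brief synthetic response spike
tonic, phasic = tonic_phasic(trace, FS)
```

          <p>The spike survives almost entirely in the phasic component, while the tonic estimate stays near the baseline level.</p>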
        </sec>
        <sec id="sec-3-3-3">
          <title>3.3.3. Respiration device</title>
          <p>The Shimmer3 EMG Unit8 (Shimmer EMG) records respiration through demodulation of ECG
data. The device consists of two pairs of electrodes placed on the midclavicular lines to measure
the respiration rate as a changing impedance across the chest [23]. In addition, an electrode is
placed on the right side of the sternum as a reference. A sampling frequency of 128 Hz is used
to acquire the signals.</p>
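          <p>To illustrate how a respiration rate could be derived from such a chest-impedance trace, the toy sketch below counts upward zero-crossings of the mean-removed signal. The synthetic 0.25 Hz waveform (15 breaths per minute) and the small phase offset are assumptions made only for the example.</p>

```python
import math

def breaths_per_minute(x, fs):
    """Estimate respiration rate by counting upward zero-crossings of
    the mean-removed trace (one crossing per breathing cycle)."""
    mean = sum(x) / len(x)
    c = [v - mean for v in x]
    crossings = sum(1 for a, b in zip(c, c[1:]) if a <= 0 < b)
    return crossings / (len(x) / fs / 60.0)   # cycles per minute

FS = 128                                      # Shimmer EMG sampling rate (Hz)
# 60 s synthetic impedance trace breathing at 0.25 Hz (15 breaths/min);
# the 0.1 rad phase offset keeps samples off the exact zeros.
trace = [math.sin(2 * math.pi * 0.25 * i / FS + 0.1) for i in range(60 * FS)]
rate = breaths_per_minute(trace, FS)
```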
          <p>Notice that Shimmer3 EMG is also a non-invasive device that allows monitoring and collecting
data through Bluetooth wireless technology.
6g.tec medical engineering GmbH (https://www.unicorn-bi.com/ accessed August 16, 2023).
7Shimmer (https://shimmersensing.com/product/shimmer3-gsr-unit/ accessed August 16, 2023).
8Shimmer (https://shimmersensing.com/product/shimmer3-emg-unit/ accessed August 16, 2023).</p>
        </sec>
      </sec>
      <sec id="sec-3-4">
        <title>3.4. Gaming device and selected game</title>
        <p>PlayStation®59 has been selected as the gaming device with its DualSense™ wireless controller.</p>
        <p>The selected TPS game is Fortnite. The authors of this paper selected this game for two
main reasons. Firstly, Fortnite is a famous and appreciated game, having more than 350 million
registered users by the end of 202210. Secondly, the HUD has a very important role in the
game. In fact, Celia Hodent and her team devised the HUD to be informative and non-intrusive,
considering the whole user experience. According to Hodent11, information is always available
to the players without taxing them with learning and memorising everything, thus avoiding
excessive cognitive load.</p>
      </sec>
    </sec>
    <sec id="sec-4">
      <title>4. Experimental Protocol</title>
      <p>The experimental protocol is divided into three main phases: interface definition, assessment
of immersiveness with expert players, and assessment of usability with non-expert players.</p>
      <p>In fact, Phase 0 is devoted to the definition of a HUD (from now on, the proposed HUD) in between
the standard Fortnite UI and the one that does not report any information on the screen, so as to
enhance the immersion of the player while avoiding the removal of information useful to complete
the game session.</p>
      <p>
        Before briefly introducing the other phases, notice that the three UIs that will be used in the
experiment can be summarised as follows:
• The UI with Fortnite standard HUD;
• The UI without any information, thus without the HUD;
• The UI created using gamers’ opinions and the heuristics present in the literature [
        <xref ref-type="bibr" rid="ref19 ref20">19, 20</xref>
        ],
called proposed HUD.
      </p>
      <p>
        Notice that the first two interfaces are considered following the example of Iacovides et al. [
        <xref ref-type="bibr" rid="ref8">8</xref>
        ].
In fact, Iacovides et al. compare these two types of interfaces and find that expert users were
more immersed when using the UI without a HUD, and thus without information that would
distract them. This observation did not apply to non-expert players, who did not obtain any
increase in immersion probably due to their inexperience in the target game, i.e., Battlefield 3 .
      </p>
      <p>Proceeding with the subsequent phases, Phase 1 is intended for expert Fortnite players, i.e.,
Fortnite users that have played the game for at least two years and one hour per week. This
phase will assess the effectiveness of the proposed HUD resulting from Phase 0, comparing it to
the case where the participants play the game using the non-diegetic Fortnite HUD. Instead,
Phase 2 will evaluate the usability of the proposed HUD by non-expert Fortnite players, i.e.,
participants who know how to use a PlayStation console but have never played Fortnite. This
is to understand how expertise influences the definition of the HUD and how the HUD adapts
to different users. Moreover, immersiveness will also be assessed for non-expert players.
9Sony Interactive Entertainment (https://www.playstation.com/en-gb/ps5/?smcid=pdc%3Aen-gb%3Aprimary%
20nav%3Amsg-hardware%3Aps5 accessed September 19, 2023).
10everyeye.it (https://www.everyeye.it/notizie/giocatori-fortnite-totale-risposta-612779.html accessed September 21,
2023).
11Celia Hodent (https://celiahodent.com/understanding-the-success-of-fortnite-ux/ accessed September 21, 2023).
</p>
      <p>[Table 1, phases and participants] Phase 0: interface definition. Phase 1: immersiveness
assessment with expert participants, i.e., Fortnite expert players aged 18-35 y.o., with a minimum
of 2 years of experience in Fortnite, playing it at least 1 hour per week. Phase 2: usability and
immersiveness assessment with non-expert participants, i.e., Fortnite non-expert players aged
18-35 y.o., using a PlayStation console with a joypad.</p>
      <p>Physiological responses will be collected to evaluate the best interface type for immersiveness
optimisation considering the users’ experience while playing the game.</p>
      <p>Table 1 summarises the characteristics of each phase and provides information on the
participants’ inclusion criteria, the questionnaires that will be provided, and the physiological data
that will be recorded.</p>
      <p>As a final remark, notice that players are left free to choose which hero to play during the
experiment, as outfits do not guarantee competitive advantages within the game [24].</p>
      <sec id="sec-4-1">
        <title>4.1. Phase 0: interface definition</title>
        <p>
          The aim of Phase 0 is to formulate a revised Fortnite interface with HUD based on the responses
obtained from the preliminary questionnaire detailed in Section 3.1. The modifications will
thus respond to the preferences of experienced users and the requirements imposed by Nielsen
[
          <xref ref-type="bibr" rid="ref19">19</xref>
          ] and Federoff’s [
          <xref ref-type="bibr" rid="ref20">20</xref>
          ] heuristics. The proposed HUD will later be presented to expert and
non-expert players in Phases 1 and 2, respectively.
        </p>
        <sec id="sec-4-1-1">
          <title>4.1.1. Inclusion and exclusion criteria</title>
          <p>Only expert players will be involved in this definition phase. In this phase, an expert is defined
as a user who plays shooter games for at least 30 minutes per week. Notice that in this phase,
the user is not necessarily an expert in Fortnite.</p>
          <p>[Table 1, continued: questionnaires and recorded data] Phase 0: the participant plays a
TPS for at least 30 min per week; preliminary questionnaire plus Nielsen and Federoff’s
heuristics; no physiological data. Phases 1 and 2: pre-test questionnaire and IEQ; EEG, GSR,
PPG, respiration, and facial expression.</p>
        </sec>
        <sec id="sec-4-1-2">
          <title>4.1.2. Materials and methods</title>
          <p>
            The materials and methods used in this phase are:
• The 10 usability heuristics of Nielsen [
            <xref ref-type="bibr" rid="ref19">19</xref>
            ];
• The heuristics for the gaming sector identified and analysed by Federoff [
            <xref ref-type="bibr" rid="ref20">20</xref>
            ];
• A preliminary questionnaire for the expert gamers12.
          </p>
          <p>Unreal Engine®13 will be used to create the proposed HUD. Notice that this tool is the same
one developed and used by Epic Games, the developer of Fortnite.</p>
        </sec>
      </sec>
      <sec id="sec-4-2">
        <title>4.2. Phase 1: immersiveness with expert players</title>
        <p>The main aim of Phase 1 is to evaluate the immersiveness of the proposed HUD obtained in Phase 0
with respect to the diegetic HUD version of the case study.</p>
        <p>In fact, expert Fortnite players will have to play a game session with the UI (i) without HUD
and (ii) with the proposed HUD.</p>
        <p>The experiment is conducted in a within-subject condition through two gaming sessions
carried out in randomised order by the expert players.</p>
        <p>
          So, it is a (2x1) experiment with statistical power 0.8 (the same as Iacovides et al. [
          <xref ref-type="bibr" rid="ref8">8</xref>
          ]), and the number of
expert players needed is 34.
        </p>
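        <p>This sample size can be sanity-checked with a standard back-of-envelope computation for a within-subject (paired) comparison. The two-sided α = 0.05 and medium effect size d = 0.5 used below are assumptions made for illustration, since the protocol only states the power (0.8) and the resulting n = 34.</p>

```python
import math
from statistics import NormalDist

# Back-of-envelope sample-size check for a paired comparison with power
# 0.8. Alpha = 0.05 (two-sided) and effect size d = 0.5 are assumed for
# illustration; only the power and n = 34 appear in the protocol.

def paired_n(d, power=0.8, alpha=0.05):
    z = NormalDist()
    z_a = z.inv_cdf(1 - alpha / 2)          # about 1.96 for alpha = 0.05
    z_b = z.inv_cdf(power)                  # about 0.84 for power = 0.8
    # Normal approximation plus the usual small-sample correction term.
    return math.ceil(((z_a + z_b) / d) ** 2 + z_a ** 2 / 2)

n = paired_n(0.5)
```

        <p>Under these assumptions the formula yields n = 34, matching the figure stated in the protocol.</p>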
        <sec id="sec-4-2-1">
          <title>4.2.1. Inclusion and exclusion criteria</title>
          <p>Expert users aged between 18 and 35 who have played Fortnite for at least two years and play
it an hour per week will be involved in Phase 1. Expert users who have played shooter games
but not Fortnite will be excluded.</p>
        </sec>
        <sec id="sec-4-2-2">
          <title>4.2.2. Materials and methods</title>
          <p>In Phase 1, a pre-test questionnaire14 and an IEQ15 after each game session will be presented
to the participants.</p>
          <p>
            As for Iacovides et al. [
            <xref ref-type="bibr" rid="ref8">8</xref>
            ], participant expertise and the number of interfaces (two) will be
considered to understand the immersion level of the expert during the game sessions.
          </p>
          <p>
            Face tracking and player behaviour will be derived from the videos recorded by means of the GoPro
HERO10 Black. Moreover, the following devices will be used to exploit different physiological
indicators, as suggested by Bian et al.’s framework [
            <xref ref-type="bibr" rid="ref11">11</xref>
            ]: Unicorn, Shimmer GSR, Shimmer EMG.
Section 3.3 reports more details on the devices.
12Available at https://docs.google.com/forms/d/e/1FAIpQLSfvCgCW9XxWfKekU5WOlxmAUwVEPqt-oPBktEdfolcWVNdG4g/
viewform?usp=sf_link
13Epic Games Inc. (https://www.unrealengine.com/en-US accessed September 21, 2023).
14Available at https://docs.google.com/forms/d/e/1FAIpQLSeKRO65yJVSSzps5XrWL9PXB2YDJqg-9ZmuXR7UKzpc3P_
4ng/viewform?usp=sf_link
15provided by Jennet et al. [
            <xref ref-type="bibr" rid="ref4">4</xref>
            ] at https://docs.google.com/forms/d/e/1FAIpQLSeRMgdk4_
          </p>
          <p>IMxZ-LFcrj9le8bjRaclUHq1G80Y1t_OWoHKGmmw/viewform?usp=sf_link.</p>
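<p>As an illustration of how such recordings might be reduced to simple indicators, a skin-conductance trace can yield a tonic level and a count of phasic responses; the sample values and the 0.05 uS rise threshold below are illustrative assumptions, not protocol parameters.</p>

```python
# Hypothetical sketch: extracting two basic arousal indicators from a
# galvanic skin response (GSR) trace, such as one recorded by a Shimmer
# GSR unit. Signal values and the rise threshold are mock assumptions.
def gsr_features(samples, threshold=0.05):
    mean_level = sum(samples) / len(samples)  # tonic skin-conductance level
    # count phasic responses: sample-to-sample rises above the threshold
    peaks = sum(1 for a, b in zip(samples, samples[1:]) if b - a > threshold)
    return mean_level, peaks

trace = [1.20, 1.21, 1.30, 1.28, 1.27, 1.40, 1.39]  # microsiemens, mock data
level, scrs = gsr_features(trace)
print(round(level, 2), scrs)
```

In practice such features would be computed per game session and compared across the two HUD conditions.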
        </sec>
      </sec>
      <sec id="sec-4-3">
        <title>4.3. Phase 2: usability with non-expert players</title>
        <p>The main aim of Phase 2 is to evaluate the usability of the proposed HUD, based on the feedback
of expert shooter gamers collected in Phase 0, with respect to the classic interface of the case
study.</p>
        <p>To this end, non-expert Fortnite players will play a game session with the UI (i) with the
standard HUD and (ii) with the proposed HUD.</p>
        <p>The experiment follows a within-subject design, with the two gaming sessions carried out by
the non-expert players in randomized order.</p>
        <p>
          It is therefore a (2x1) experiment with a statistical power of 0.8 (as in Iacovides et al. [
          <xref ref-type="bibr" rid="ref8">8</xref>
          ]), and the number of
non-expert players needed is 34.
        </p>
        <p>Besides assessing the usability of the proposed HUD, the player level of immersion will also be
evaluated.</p>
        <sec id="sec-4-3-1">
          <title>4.3.1. Inclusion and exclusion criteria</title>
          <p>Non-expert users aged between 18 and 35 who have never played Fortnite will participate in this
experiment phase. These non-expert users are selected among players who know how to play
with a PlayStation console.</p>
        </sec>
        <sec id="sec-4-3-2">
          <title>4.3.2. Materials and methods</title>
          <p>As in Phase 1, the pre-test questionnaire and the IEQ will be presented to the participants. Moreover,
participant expertise and the number of interfaces (two) will be considered to understand the
usability of the proposed HUD as well as the immersion level of the non-expert players during
the game sessions.</p>
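<p>To illustrate how the IEQ responses might be scored, Likert items can be summed after flipping reverse-keyed items; the item count, the 5-point scale, and the choice of reverse-keyed items below are illustrative assumptions, not the published IEQ scoring key.</p>

```python
# Hypothetical sketch: scoring a Likert questionnaire response as the sum
# of its items, flipping reverse-keyed items on a 5-point scale.
# Which items are reverse-keyed is an assumption for illustration.
def ieq_score(answers, reversed_items=(2, 5)):
    total = 0
    for idx, value in enumerate(answers, start=1):
        # a reverse-keyed answer of 1 counts as 5, 2 as 4, and so on
        total += (6 - value) if idx in reversed_items else value
    return total

print(ieq_score([4, 2, 5, 3, 1]))
```

The same scoring routine could be applied to both the expert (Phase 1) and non-expert (Phase 2) responses so that immersion totals are directly comparable.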
          <p>The same devices reported for Phase 1 will be used: GoPro HERO10 Black for face tracking and
behaviour analysis, and Unicorn, Shimmer GSR, and Shimmer EMG for physiological responses
recording.</p>
        </sec>
      </sec>
    </sec>
    <sec id="sec-5">
      <title>5. Conclusions</title>
      <p>This paper presented a brief overview of the experimental protocol intended to assess the immersiveness
of expert and non-expert participants while playing a third-person shooter.</p>
      <p>
        The experiment will be carried out by following works from the literature [
        <xref ref-type="bibr" rid="ref8 ref11">8, 11</xref>
        ] that successfully
assessed players’ immersion levels considering different HUDs and physiological responses.
Moreover, a HUD for the Fortnite UI will be proposed, based on expert shooter players’ opinions,
to enhance the gamers’ immersiveness.
      </p>
      <p>Affective computing and gaming will be exploited to clearly understand users’ experience,
behaviour, and levels of immersion by leveraging the information captured by physiological
sensors.</p>
      <p>In future works, these AI applications will be further introduced to provide a dynamic
mechanism that can change the HUD in real time depending on the individual player’s responses.</p>
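<p>Such a real-time mechanism could start from a simple rule-based mapping; the variant names and thresholds below are illustrative assumptions, since the protocol does not yet define an adaptation policy.</p>

```python
# Hypothetical sketch: a rule-based adapter that picks a HUD variant from
# a normalized physiological arousal estimate in [0, 1]. Variant names
# and thresholds are illustrative assumptions, not protocol values.
def choose_hud(arousal):
    """Map a normalized arousal score to a HUD variant."""
    if arousal >= 0.7:
        return "no_HUD"        # high immersion: hide the HUD entirely
    if arousal >= 0.3:
        return "proposed_HUD"  # moderate engagement: reduced HUD
    return "full_HUD"          # low engagement: show all information

print([choose_hud(a) for a in (0.1, 0.5, 0.9)])
```

A learned policy could later replace the fixed thresholds once the sensor data from Phases 1 and 2 is available.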
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>
          [1]
          <string-name>
            <given-names>E.</given-names>
            <surname>Brown</surname>
          </string-name>
          ,
          <string-name>
            <given-names>P.</given-names>
            <surname>Cairns</surname>
          </string-name>
          ,
          <article-title>A grounded investigation of game immersion</article-title>
          ,
          <source>Extended Abstracts of Human Factors in Computing Systems</source>
          (
          <year>2004</year>
          ).
        </mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>
          [2]
          <string-name>
            <given-names>M.</given-names>
            <surname>Csikszentmihalyi</surname>
          </string-name>
          ,
          <source>Beyond Boredom and Anxiety</source>
          , San Francisco, CA, US: Jossey-Bass
          (
          <year>1975</year>
          ).
        </mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>
          [3]
          <string-name>
            <given-names>M.</given-names>
            <surname>Csikszentmihalyi</surname>
          </string-name>
          ,
          <source>Flow: The Psychology of Optimal Experience</source>
          ,
          <year>1990</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>
          [4]
          <string-name>
            <given-names>C.</given-names>
            <surname>Jennett</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A. L.</given-names>
            <surname>Cox</surname>
          </string-name>
          ,
          <article-title>Measuring and defining the experience of immersion in games</article-title>
          ,
          <source>International Journal of Human-Computer Studies</source>
          <volume>66</volume>
          (
          <year>2008</year>
          )
          <fpage>641</fpage>
          -
          <lpage>661</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>
          [5]
          <string-name>
            <given-names>A. C. M.</given-names>
            <surname>Taylor</surname>
          </string-name>
          ,
          <article-title>Augmenting the hud: A mixed methods analysis on the impact of extending the game ui beyond the screen</article-title>
          (
          <year>2017</year>
          ).
        </mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>
          [6]
          <string-name>
            <given-names>T.</given-names>
            <surname>Sanders</surname>
          </string-name>
          ,
          <string-name>
            <given-names>P.</given-names>
            <surname>Cairns</surname>
          </string-name>
          ,
          <article-title>Time perception, immersion and music in videogames</article-title>
          ,
          <source>Proceedings of HCI 2010 (HCI)</source>
          (
          <year>2010</year>
          ).
        </mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>
          [7]
          <string-name>
            <given-names>G.</given-names>
            <surname>Christou</surname>
          </string-name>
          ,
          <article-title>The interplay between immersion and appeal in video games</article-title>
          ,
          <source>Computers in Human Behavior</source>
          <volume>32</volume>
          (
          <year>2014</year>
          )
          <fpage>92</fpage>
          -
          <lpage>100</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>
          [8]
          <string-name>
            <given-names>I.</given-names>
            <surname>Iacovides</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Cox</surname>
          </string-name>
          ,
          <string-name>
            <given-names>R.</given-names>
            <surname>Kennedy</surname>
          </string-name>
          ,
          <string-name>
            <given-names>P.</given-names>
            <surname>Cairns</surname>
          </string-name>
          ,
          <string-name>
            <given-names>C.</given-names>
            <surname>Jennett</surname>
          </string-name>
          ,
          <article-title>Removing the hud: the impact of non-diegetic game elements and expertise on player involvement</article-title>
          ,
          <source>in: Proceedings of the 2015 Annual Symposium on Computer-Human Interaction in Play</source>
          ,
          <year>2015</year>
          , pp.
          <fpage>13</fpage>
          -
          <lpage>22</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref9">
        <mixed-citation>
          [9]
          <string-name>
            <given-names>W. S.</given-names>
            <surname>Lages</surname>
          </string-name>
          ,
          <article-title>Opportunities and challenges in immersive entertainment</article-title>
          ,
          <source>Anais Estendidos do XX Simpósio Brasileiro de Jogos e Entretenimento Digital</source>
          (
          <year>2021</year>
          )
          <fpage>1037</fpage>
          -
          <lpage>1040</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref10">
        <mixed-citation>
          [10]
          <string-name>
            <given-names>A.</given-names>
            <surname>Hughes</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>Jorda</surname>
          </string-name>
          ,
          <article-title>Applications of biological and physiological signals in commercial video gaming and game research: a review</article-title>
          ,
          <source>Frontiers in Computer Science</source>
          <volume>3</volume>
          (
          <year>2021</year>
          )
          <fpage>557608</fpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref11">
        <mixed-citation>
          [11]
          <string-name>
            <given-names>Y.</given-names>
            <surname>Bian</surname>
          </string-name>
          ,
          <string-name>
            <given-names>C.</given-names>
            <surname>Yang</surname>
          </string-name>
          ,
          <string-name>
            <given-names>F.</given-names>
            <surname>Gao</surname>
          </string-name>
          ,
          <string-name>
            <given-names>H.</given-names>
            <surname>Li</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>Zhou</surname>
          </string-name>
          ,
          <string-name>
            <given-names>H.</given-names>
            <surname>Li</surname>
          </string-name>
          ,
          <string-name>
            <given-names>X.</given-names>
            <surname>Sun</surname>
          </string-name>
          ,
          <string-name>
            <given-names>X.</given-names>
            <surname>Meng</surname>
          </string-name>
          ,
          <article-title>A framework for physiological indicators of flow in vr games: construction and preliminary evaluation</article-title>
          ,
          <source>Personal and Ubiquitous Computing</source>
          <volume>20</volume>
          (
          <year>2016</year>
          )
          <fpage>821</fpage>
          -
          <lpage>832</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref12">
        <mixed-citation>
          [12]
          <string-name>
            <given-names>T.</given-names>
            <surname>Tozman</surname>
          </string-name>
          ,
          <string-name>
            <given-names>E. S.</given-names>
            <surname>Magdas</surname>
          </string-name>
          ,
          <string-name>
            <given-names>H. G.</given-names>
            <surname>MacDougall</surname>
          </string-name>
          ,
          <string-name>
            <given-names>R.</given-names>
            <surname>Vollmeyer</surname>
          </string-name>
          ,
          <article-title>Understanding the psychophysiology of flow: A driving simulator experiment to investigate the relationship between flow and heart rate variability</article-title>
          ,
          <source>Computers in Human Behavior</source>
          <volume>52</volume>
          (
          <year>2015</year>
          )
          <fpage>408</fpage>
          -
          <lpage>418</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref13">
        <mixed-citation>
          [13]
          <string-name>
            <given-names>D.</given-names>
            <surname>Setiono</surname>
          </string-name>
          ,
          <string-name>
            <given-names>D.</given-names>
            <surname>Saputra</surname>
          </string-name>
          ,
          <string-name>
            <given-names>K.</given-names>
            <surname>Putra</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J. V.</given-names>
            <surname>Moniaga</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Chowanda</surname>
          </string-name>
          ,
          <article-title>Enhancing player experience in game with affective computing</article-title>
          ,
          <source>Procedia Computer Science</source>
          <volume>179</volume>
          (
          <year>2021</year>
          )
          <fpage>781</fpage>
          -
          <lpage>788</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref14">
        <mixed-citation>
          [14]
          <string-name>
            <given-names>R. W.</given-names>
            <surname>Picard</surname>
          </string-name>
          ,
          <article-title>Affective computing for HCI</article-title>
          ,
          <source>in: HCI (1)</source>
          , Citeseer,
          <year>1999</year>
          , pp.
          <fpage>829</fpage>
          -
          <lpage>833</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref15">
        <mixed-citation>
          [15]
          <string-name>
            <given-names>J.</given-names>
            <surname>Marín-Morales</surname>
          </string-name>
          ,
          <string-name>
            <given-names>C.</given-names>
            <surname>Llinares</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.</given-names>
            <surname>Guixeres</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Alcañiz</surname>
          </string-name>
          ,
          <article-title>Emotion recognition in immersive virtual reality: From statistics to affective computing</article-title>
          ,
          <source>Sensors</source>
          <volume>20</volume>
          (
          <year>2020</year>
          )
          <fpage>5163</fpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref16">
        <mixed-citation>
          [16]
          <string-name>
            <given-names>J.</given-names>
            <surname>Tao</surname>
          </string-name>
          ,
          <string-name>
            <given-names>T.</given-names>
            <surname>Tan</surname>
          </string-name>
          ,
          <string-name>
            <given-names>R.</given-names>
            <surname>Picard</surname>
          </string-name>
          ,
          <article-title>Affective computing and intelligent interaction</article-title>
          , volume
          <volume>3784</volume>
          , Springer,
          <year>2006</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref17">
        <mixed-citation>
          [17]
          <string-name>
            <given-names>I.</given-names>
            <surname>Kotsia</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>Zafeiriou</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>Fotopoulos</surname>
          </string-name>
          ,
          <article-title>Affective gaming: A comprehensive survey</article-title>
          ,
          <source>in: Proceedings of the IEEE conference on computer vision and pattern recognition workshops</source>
          ,
          <year>2013</year>
          , pp.
          <fpage>663</fpage>
          -
          <lpage>670</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref18">
        <mixed-citation>
          [18]
          <string-name>
            <given-names>J.</given-names>
            <surname>Moon</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Y.</given-names>
            <surname>Choi</surname>
          </string-name>
          ,
          <string-name>
            <given-names>T.</given-names>
            <surname>Park</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.</given-names>
            <surname>Choi</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.-H.</given-names>
            <surname>Hong</surname>
          </string-name>
          ,
          <string-name>
            <given-names>K.-J.</given-names>
            <surname>Kim</surname>
          </string-name>
          ,
          <article-title>Diversifying dynamic difficulty adjustment agent by integrating player state models into monte-carlo tree search</article-title>
          ,
          <source>Expert Systems with Applications</source>
          <volume>205</volume>
          (
          <year>2022</year>
          )
          <fpage>117677</fpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref19">
        <mixed-citation>
          [19]
          <string-name>
            <surname>Nielsen Norman Group</surname>
          </string-name>
          ,
          <article-title>10 usability heuristics for user interface design</article-title>
          (
          <year>2020</year>
          ).
        </mixed-citation>
      </ref>
      <ref id="ref20">
        <mixed-citation>
          [20]
          <string-name>
            <given-names>M. A.</given-names>
            <surname>Federoff</surname>
          </string-name>
          ,
          <article-title>Heuristics and usability guidelines for the creation and evaluation of fun in video games</article-title>
          (
          <year>2002</year>
          ).
        </mixed-citation>
      </ref>
      <ref id="ref21">
        <mixed-citation>
          [21]
          <string-name>
            <given-names>J.</given-names>
            <surname>Wang</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Y.</given-names>
            <surname>Cheng</surname>
          </string-name>
          ,
          <string-name>
            <given-names>R. S.</given-names>
            <surname>Feris</surname>
          </string-name>
          ,
          <article-title>Walk and learn: Facial attribute representation learning from egocentric video and contextual data</article-title>
          ,
          <source>in: Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR)</source>
          ,
          <year>2016</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref22">
        <mixed-citation>
          [22]
          <string-name>
            <given-names>S.</given-names>
            <surname>Gannouni</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Aledaily</surname>
          </string-name>
          ,
          <string-name>
            <given-names>K.</given-names>
            <surname>Belwafi</surname>
          </string-name>
          ,
          <string-name>
            <given-names>H.</given-names>
            <surname>Aboalsamh</surname>
          </string-name>
          ,
          <article-title>Emotion detection using electroencephalography signals and a zero-time windowing-based epoch estimation and relevant electrode identification</article-title>
          ,
          <source>Scientific Reports</source>
          <volume>11</volume>
          (
          <year>2021</year>
          )
          <fpage>7071</fpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref23">
        <mixed-citation>
          [23]
          <string-name>
            <given-names>C.</given-names>
            <surname>Redmond</surname>
          </string-name>
          ,
          <article-title>Trans-thoracic impedance measurements in patient monitoring</article-title>
          ,
          <source>EDN Network</source>
          (
          <year>2013</year>
          ).
        </mixed-citation>
      </ref>
      <ref id="ref24">
        <mixed-citation>
          [24]
          <string-name>
            <given-names>T.</given-names>
            <surname>Schöber</surname>
          </string-name>
          ,
          <string-name>
            <given-names>G.</given-names>
            <surname>Stadtmann</surname>
          </string-name>
          ,
          <article-title>Fortnite: The business model pattern behind the scene</article-title>
          ,
          <source>European University Viadrina Frankfurt (Oder) Department of Business Administration and Economics Discussion Paper</source>
          (
          <year>2020</year>
          ).
        </mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>