=Paper= {{Paper |id=Vol-2246/GHItaly18_paper_06 |storemode=property |title=EmoBrain: Playing with Emotions in the Target |pdfUrl=https://ceur-ws.org/Vol-2246/GHItaly18_paper_06.pdf |volume=Vol-2246 |authors=Vincenzo Liberti,Valeria Carofiglio,Berardina Nadja De Carolis,Fabio Abbattista |dblpUrl=https://dblp.org/rec/conf/avi/LibertiCCA18 }} ==EmoBrain: Playing with Emotions in the Target== https://ceur-ws.org/Vol-2246/GHItaly18_paper_06.pdf
                  EmoBrain: Playing with Emotions in the Target
                   Vincenzo Liberti                                    Valeria Carofiglio            Berardina De Carolis
                Department of Computer                              Department of Computer          Department of Computer
                         Science                                             Science                        Science
                       Bari, Italy                                          Bari, Italy                    Bari, Italy
                vi.liberti@outlook.com                             valeria.carofiglio@uniba.it      berardina.decarolis@uni
                                                                                                              ba.it
                                                                  Fabio Abbattista
                                                           Department of Computer Science
                                                                      Bari, Italy
                                                              fabio.abbattista@uniba.it


ABSTRACT                                                                         decision-making [1]. Being aware of self emotional state
Sensing and understanding human emotional behaviour                              and start working on its regulation is important to achieve a
seems to be essential to keep engaging interactions                              better wellness level, since emotion regulation can help to
sustained over longer periods. The general purpose of this                       mitigate emotion related biases in our everyday tasks.
work is to implement a platform able to recognize and                            Recently serious games have been used as a mean to learn
employ human emotions in an interactive game: The                                how to control and regulate emotions. Many serious games
EmoBrain Interface (EI). EI interaction is a cycle of                            developed in this field use biofeedback information to
stimulus and feedbacks where the user receives a visual                          display player’s emotional state and help them to train in
input and completes tasks, just controlling his emotional                        order to improve his own wellness.
state by activating self-training strategies. EI has been used
in the context of a serious game: Quiet Bowman (QB). QB                          Our research work is placed in this context. We carried on
allows the users: (i) to experience emotional behaviours and                     previous research [2,3,4] and developed a serious game,
(ii) to explore the game dynamics related to emotions in                         called “Quiet Bowman” (QB). QB allows the users: (i) to
order to manage their own emotional state. The outcomes                          experience emotional behaviors; and (ii) to explore the
seem to be encouraging. Users appreciated the game: they                         game dynamics related to emotions to manage their own
felt involved and committed to achieve the required goals.                       emotional state. This is related to the implementation of a
This encourages us to make new experiments to improve                            platform able to recognize and use user’s emotions as input
the accuracy of our classifiers and therefore the impact of                      to the interactive game: The EmoBrain Interface (EI). The
the serious game on the autogenous training of the user.                         EI implements a cycle of stimulus and feedback where the
                                                                                 user receives a visual output and completes tasks, just
Author Keywords                                                                  controlling his emotional state, by activating autogenous
HCI; Emotions; Game; Brain Computer Interface;                                   training strategies.
ACM Classification Keywords
H.5.2. User Interfaces. I.2.1 Applications and Expert Systems
                                                                                 Although there are many dimensions associated with
                                                                                 emotions, according to Picard [5] the two most commonly
                                                                                 used dimensions of emotion are valence, and arousal.
INTRODUCTION                                                                     Picard also notes that the valence and arousal dimensions
Emotions are part of our everyday living and influence                           are critical in games applications [5]. The recognition of
many human processes such as cognition, perception, and                          user’s emotional state has to be performed in an implicit
everyday tasks such as learning, communication and                               and transparent way, so as to be non-invasive and more
                                                                                 effective. Moreover, traditional methods, such as self-report
                                                                                 or interviews, are only partially useful (and concern the
                                                                                 preliminary research activities only), because they are based
                                                                                 on sampling techniques or simply on the a-posteriori user’s
                                                                                 perception of the game environment.

GHItaly18: 2nd Workshop on Games-Human Interaction, May 29th, 2018,
                                                                                 We decided to couple the Emotional Brain Computer
Castiglione della Pescaia, Grosseto (Italy)                                      Interfaces (EBCI) [6,7] to traditional methods.
Copyright © 2018 for the individual papers by the papers' authors. Copying       An EBCI is a particular kind of a Brain Computer Interface
permitted for private and academic purposes. This volume is published and
copyrighted by its editors.                                                      (BCI). A BCI [8] is a direct communication pathway
between the brain and an external device. By means of an         At the beginning of the interaction, in order to alter his
electroencephalogram (EEG), it records human brain               emotional state, an exogenous stimulus (i.e. photos or other
activity in the form of electrical potentials (EPs), through     multimedia content) is send to the user. The EI detects such
multiple electrodes that are placed on the scalp. EPs are        emotional alterations and sends feedback messages to the
processed to obtain features that can be grouped into a          user. The feedback received from the user will, in turn, be
feature vector: Depending on the brain activity, distinctive     used by him to initiate an appropriate endogenous
known patterns in the EEG appear. These are automatically        stimulation strategy, to achieve the emotional goal.
recognized by the BCI and associated with a given action         User Application: Quiet Bowman (QB)
on the BCI application. The outcome of this action can be        In order to let EI platform handle user emotions in the
perceived by the user in terms of application feedback. In       scope of an interactive game, we implemented QB. As
this case, his brain activity is consequently modulated. The     stated, QB is a simplified version of a serious game for
kind of EBCI mostly used in this work is the reactive EBCI       autogenous training. It is an archery game in which the
[9]. A reactive EBCI can send stimuli and extract                player plays with the goal of achieving an emotional state
information from user’s brain elaboration. So, if during a       represented in the centre of the target (emotional goal).
session game, the player has been scared or disgusted, the       Four emotions could be the emotional goal: Calm,
EI will recognize a medium-high value for arousal with a         Happiness, Anger or Sadness. We chose these emotions
strongly negative valence. Otherwise, will be recognised         because their components (valence and arousal) better
positive valence and high arousal, for happiness.                match the emotional chose values: high, medium and low
In the scope of our work, the feedback about the progress of     for arousal; positive, neutral and negative for valence.
the game is a key tool: it allows the players to monitor their   Indeed, calm, for example, has neutral values for valence
performance, but it also generates new stimuli caused by         and slightly low for arousal. On contrary, anger has high
the achievement of the goal (feedback loop).                     arousal and a low valence. In addition, these four emotions
                                                                 are commonly and clearly identified by users.
The paper is organized as follows: in Section 2 we provide
a description of the architecture of the system. In Section 3    The only controller in the game is user’s affective state: at
we illustrate the experimental study we conducted to train       each step, the user emotion is recognized by the EI and
and test our system. In Section 4 the results of the study are   represented as an arrow on the target. Each step is
discussed. Conclusions and future work directions are            automatically triggered by a timer. To win the game, the
reported in Section 5.                                           player must be able to drive his emotional state in order to
                                                                 achieve the emotional goal, by activating any endogenous
OVERVIEW OF THE ARCHITECTURE
                                                                 stimulation strategy.
The EI is a distributed platform to recognize and employ
human emotions to drive an interactive system (Fig. 1). The      A screenshot of the game can be seen in Fig. 2. Seven shots
EI includes an input device to record user’s EEG. The brain      were fired. The seventh (t7) hit the target. The messages
activity (EEG signal) is then transmitted to the BCI that        exchange among EI, QB and the user takes the name of
analyses and processes it for to recognize user’s emotion        feedback loop. So, for example, let the user play whit the
following a visual stimulation. The EI also includes QB, a       emotion of Calm. At the beginning of the interaction, in
serious game for autogenic training, in which the user is a      order to alter his emotional state, a picture is presented to
player that shoots arrows in order to hit the target             the user. The EI detects such emotional alterations and
(emotional goal) and receive feedbacks, according to his         sends a feedback messages to the user (an arrow in the
emotional state.                                                 target: t1, in Fig.2).




       Figure 1. The EmoBrain Interface Architecture.                              Figure 2. QB Screenshot.
The feedbacks received from the user will, in turn, be used             itself, which would certainly affect the progress of the
by him to initiate an appropriate endogenous stimulation                game itself.
strategy, to reach a state of quiet. In doing so, however, he      b.   User Active participation: As an archer, the user
will send new data to the EI and a new arrow will be fired              determines the direction of the arrows, by controlling
from the QB. The display of the latter arrow (or the sound,             his emotions.
if set, for this event) on the target constitutes a new stimulus   Brain Computer Interface and Classification
(exogenous) for the player, who realizes that he is able to        The BCI is realized by BCI2000 [11]. BCI2000 is widely
control the bow working of his emotional state. This               used in medical field and research. The main module is the
induces him to elaborate endogenous stimuli which                  Operator which provides a user graphical interface and
generate the feedback chain.                                       allows to define parameters for each experimental session,
At the beginning of the interaction, game parameters can be        to start and stop recording sessions. Under the Operator
set. Among the other, a target emotional state (emotional          module, the Data Source Module configures the
goal) is selected. According the Russell’s Circumplex              EmotivTMEpoc Headset’s sensors (information channel)
Model of Emotions [10], in the scope of our work, emotions         [12] used for brain activity recordings. It converts brain
are represented as a combination of valence and arousal,           activity into a long byte string to obtain a numeric value
both ranging from low (-1) to neutral (0) to high (1) values:      form each sensor receiving 4 of 14 channels at 128 Hz with
A 3x3 grid is thus obtained in which only the combinations         a SampleBlockSize at 32; the SourceChGain, set from
(0, -1), (1,1), (-1,1) and (-1, -1) represent a emotional goals,   manufacturer at 0.003 µV, converts the transmitted values
respectively calm, happiness, anger and sadness (Fig. 3.a).        to the amplifier from analogic to digital for each channel.
According to the selected emotional goal, QB draws a               The Data Source Module also removes most of the EEG
target (Fig 3.b).                                                  noise applying a low-pass filter at 30 Hz and a high-pass
                                                                   filter at 0.1 Hz; then the results are stored in a .dat file
                                                                   easily convertible into .csv extension for the offline
                                                                   analysis. Each file is divided into a header with operative
                                                                   information and a payload section that contains the raw
                                                                   signals.
                                                                   Subsequently data are sent to the Signal Processing Module
                                                                   that uses a filter-chain to analyse and process signals
                                                                   (Spatial and Temporal Filter) converting it in the result
                                                                   output, by using machine learning algorithms. The Signal
                                                                   Processing Module used, includes a Common Average
         a.                                        b.              Reference [13] several used to identify small signal sources
              Figure 3. Target implementation                      in very noisy recordings and a Fast Fourier Transform Filter
                                                                   to extract relevant signal’s features.
Cartesian axes projected on the target, highlighting the
                                                                   Two Support Vector Machine classifiers have been
values 1, -1 and 0 used to identify the coordinates in which
                                                                   implemented by LibSVM [14]: one for the valence and one
to shoot the arrows. It should be noticed that such a
                                                                   for the arousal. Although the classifiers were distinct, the
structured Cartesian pattern places the centre of the target
                                                                   prediction schemes model and the functions were similar:
no longer in the coordinates (0,0), but in one of the four
                                                                   particularly the classifiers were multiclass and able to
aforementioned pairs, depending on the target emotion of
                                                                   recognize between three output classes (one for each of
the game. As a consequence, the centre of the target
                                                                   three values for valence and arousal): 1, 0, -1. With a
represents one of the possible emotions (Calm, happiness,
                                                                   precision of around 0.68 and recall of 0.77, classifiers sent
anger or sadness). For this reason, QB performs a last
                                                                   messages to QB that converts these values in arrow's
coordinate conversion based on the total pixels, to always
                                                                   coordinates. To train the classifiers we recorded about
get the image centered. The player has 8 chances out of 9
                                                                   128000 examples (128 features for each one), equally
to make mistakes.
                                                                   divided for each of the 3 classes both for valence and for
Two characteristics of QB make it an application                   arousal. The magnitude of each stimulation frequency has
particularly useful for our purposes:                              been used for classification. The dataset of signals recorded
                                                                   was scaled and normalized by tools provided by LibSVM.
a.   Interaction level: The metaphor behind the game gives
                                                                   The software also includes algorithms to perform features
     the user a clear understanding of game rules, as well
                                                                   extraction and selection (through cross validation). The
     as a clear interpretation of the effect of every move he
                                                                   classifiers use a kernel type rbf with gamma 1 for the 3
     makes and of how far he is from achieving his goal
                                                                   labels 3, 2, 1 (then converted in 1, 0, -1).
     (winning the game). This allows to reduce the noise in
     terms of reduction of emotions due to the interaction         The start-preference panel of QB is not only used to
                                                                   configure the game parameters, but also to perform the
classifier training phase. It’s possible, indeed, to turn off the    user play to the Calm (low arousal and neutral valence),
gaming interaction and use only the stimuli selection                then the initial stimulus will be a picture from IAPS that
implementation. After an emotion is chosen, QB loads the             elicit at least high arousal and neutral/negative valence (i.e.
opportune initial emotional stimulus for the user. Then, we          the Anger).
can record brain activity while QB offers stimuli to the user
                                                                     The platform and the game ran on a PC connected to a
according to the chosen time and repetition settings.
                                                                     15.6-inch screen, placed at a distance of about forty
EXPERIMENTAL STUDY                                                   centimetres from the user's gaze. The PC was equipped with
Approach                                                             a 2.40 Ghz processor, 4 GB of RAM and with Windows 10
Each experimental session opens with a phase of data                 64-bit. To avoid source of distraction, one user at time
collection on the user (pre-test) and ends with an interview         experimented the EI interface. The data acquisition ran on
to the user (post-test). The aim of our analysis is to evaluate      the     same     PC      and      exchanged    the     raw
the use of the interactive application in inducing chosen            electroencephalography (EEG) data to the platform.
emotional states. In particular, we want to evaluate the
quality of the interaction in terms of (i) effectiveness of
process and (ii) usability of the application, perceived by          Participants and Data Gathering
users. The first evaluation reflects an objective quality of         Ten participants (5 female and 5 male), aged between 20
the software in terms of objectives achieved, depending on           and 60 (µ= 38.4; σ=15.28) participated in the experiment.
the heterogeneity of the users; the second, on the other             Before the experiment, each participant has been asked to
hand, is a subjective measure, linked to each user and               sign an informed consent and, subsequently, to answer a
influenced by aspects such as satisfaction about the use of          preliminary questionnaire in order to set an initial profile of
the platform, about convenience of the devices and about             the user. All participants had normal or corrected-to-normal
utility. How quickly the user can change his emotional state         vision and described themselves as daily computer users.
with respect to the ongoing feedback cycles will be an               Nobody had experience with EEG or BCIs.
evaluation metrics.                                                  Prior to the experiment, 14 electrodes were placed
Experimental Setup                                                   according to the international 10- 20 system [16]. For the
The experiment held in this study consisted of single                EEG data acquisition, the EmotivTM Epoc headset was
sessions (see Fig. 4). Each session was divided into two             used. According to the literature [9], during each session
trials, a familiarity trial and the game trial. In the familiarity   EEG data was used from four frontal electrodes (AF3, AF4,
trial participants could get used to drive their emotional           F3, F4). The EEG data was handled as in stated in section
state by selecting endogenous stimulation strategies, that           Brain Computer Interface.
shoot arrows in the target, this to avoid confounding                In order to check the interaction quality in terms of the
variables such as learning a strategy to play the game. Each         users’    satisfaction   and   the    perceived-from-user
game trial consisted of four repetitions of sequences (one           effectiveness in winning the game, at the end of the
for each emotion among Calm, Happiness, Anger, Sadness)              interaction each user answered a self-assessment
starting with a relaxing time (ten seconds) followed by a            questionnaire. It contains 9 questions and is designed to
visual stimulation (five seconds) from The International             measure different factors linked to immersion (cognitive
Affective Picture System (IAPS) [15], followed by a                  involvement, emotional involvement, challenge and
preparation time (two seconds) and lasted in 45 seconds.             control).
Among repetitions, participants were given a break of five
minutes.                                                             Furthermore, some game statistics - such as for each
                                                                     emotional goal, the average number of arrows shot in the
IAPS is a database of pictures designed to provide a                 center of the target, mediated on all users, or the valence
standardized set of pictures for studying emotions that has          and the arousal trend, mediated on all the repetition of all
been widely used in psychological research. It is the                the game sessions - were collected while participants played
essential property of the IAPS that the stimulus set is              the game.
accompanied by a detailed list of average ratings of the
emotions elicited by each picture, in term of valence and            The valence and arousal trend would be a good indication
arousal. This shall enable we to previously select stimuli           of how quickly users can change their emotional state with
eliciting a specific range of emotions: picture were chosen          respect to the ongoing feedback cycles.
with high arousal and negative valence to elicit in the user         After the experiments, scores were obtained for the
anger-disgust, high arousal and positive valence to elicit in        immersion factors and the game statistics.
the user happiness-joy, low arousal and neutral valence to
elicit in the user calm-stillness, low arousal and negative
valence to elicit sadness in the user, so that it could be           RESULTS
offered to the player, an initial stimulus opposed to the            Based on the post-questionnaire, the autogenous training
emotion that is the emotional goal of the game. So far, if the       game was highly appreciated in terms of emotional
involvement, control, cognitive involvement and challenge.        approach the emotions of the opposite quadrant (according
The scores, expressed on a scale of values from 1 to 5,           to Russell's model of representation of emotions), that is a
averaged over participants, are shown in Table1.                  rapprochement with the emotional goals, on average; an
                                                                  average positive correlation of both valence and arousal
                                                                  over the time indicate a rapprochement with the emotional
                                µ                                 goal of happiness. Sadness is an exception: although the
                 Emotional      4.7                               arousal tends to decay on average, the valence tends to
                                                                  remain positive. This may be due to the fact that when
                 Control        4.5
                                                                  playing with sadness, the user is called to activate an
                 Cognitive      4.7                               endogenous strategy in order to move from an emotion with
                                                                  a very positive valence (happiness), to an emotion with a
                 Challenge      4.9
                                                                  very negative valence (sadness), this in a playful context. In
Table 1. Average results about several factors linked to          addition, it is known that the decay of emotions depends on
immersion. On a scale of values from 1 to 5, where 1 indicates    the emotions itself, but also on the context in which
very negative and very positive                                   emotion appears [5]: in a playful context, as in our case,
                                                                  happiness could be a slow-decay-emotion and therefore,
On average, users claim to have had fun and fully                 when sadness appears it is plausible that happiness is not
understand how to interact with the platform. These results       yet completely decayed, causing an overlapping of the two
are opposed to the initial skepticism about the possibility of    emotions (as in “odi et amo” - microwave metaphor [5]), as
playing a game only by governing their emotional flow.            well as noise in the detection of emotions.
From the analysis of the pre-test questionnaires, these initial
impressions are more evident in users with little digital         (V/t)/(A/t)   Calm             Happiness      Anger          Sadness
experience. This could be linked to the classical concept of      User1         -0.144 / 0       0.433 / -0.392 0.204 / 0.405 0.632 / -0.392
controller, commonly understood as a tangible hardware
device (mouse, keyboard ...). About the EMOTIV device             User2         0 / 0.200        0 / 0.882      -0.790 / 0     0.433 / -0.784
there are differences in terms of comfort: on a scale of          User3         0/0              0.474 / -0.685 0.288 / 0      0.866 / -0.002
values from 1 to 5 where 1 indicates extreme discomfort
and with 5 maximum comfort, the device obtains a rating of        User4         -0.47 / -0.385   -0.316 / 0.204 -0.686 / 0     0/0
4.5. All users have assigned values 4 or 5. The 75% of 4          User5         0.144 / -0.200   -0.408 / 0     0.144 / -0.204 0.433 / -0.203
were from women with long hair. Finally, only one user
complained of an excessive pressure on the head during the        User6         0.358 / 0.243    0 / -0.500     0.408 / 0      -0.560 / 0.002
use of the device, for prolonged periods.                         User7         -0.790 / 0.300   0.72 / 0.490   0/0            0 / 0.036
During the experiments, game statistics were collected: The       User8         0.351 / -0.153   0.002 / 0,211 0 / 0.35        -0.433 / -0.511
72% of the arrows were shot in the center of the target or in
                                                                  User9         0 / -0.245       -0.351 / 0.057 -0.003 / -0.11 -0.158 / 0.319
a close range. Anger is the emotions for which it is easier to
activate endogenous strategies (85%); Calm follows (60%).         User10        0.103 / -0.002   -0.206 / 0     -0.31 / -0.888 -0.491 / -0.142
Activating endogenous strategies for happiness or
                                                                  Average       0.045/-0.024     0.034/0.07     -0.074/0.035   0.072/-0.168
depression seems to be a difficult task. This is probably due
to the fact that these two emotions correspond to a mood
more than to an-event-related emotion. Furthermore,               Table 2. Pearson correlation coefficient, with respect to
playing with anger in the target does not seem to strain the      detected values of valence and arousal, over time.
user, regardless of which emotions he has played                  A final consideration: some correlation coefficients in
previously, during the same experiment: users needed less         Table 2 have value zero. Even these cases can be
time to focus on the target.                                      considered positive since they occurred into two different
Finally, given an emotion, in order to evaluate if and how        conditions: (i) when a player has shot an arrow that almost
quickly users changed their emotional state with respect to       immediately hit the center of the target, (ii) when a large
the ongoing feedback cycles, we calculated Pearson                number of arrows shot in the center of the target, when the
correlation coefficient, with respect to detected values of       remaining ones were too far.
valence and arousal, over time. Table 2 shows the Pearson
coefficients, for each emotional goal, for each user.             CONCLUSIONS
According to our experimental setup, as the initial stimulus      In this paper we presented a serious game, called “Quiet
will be opposed to the emotion that is the emotional goal of      Bowman”, in which we employ human emotions for two
the game, if users play with anger (resp. calm) in the target,    main purposes: (i) to experience emotional behaviors, and
an average negative correlation (resp. positive) of valence       (ii) to explore the game dynamics related to emotions in
over the time and an average positive correlation (resp.          order to learn how to manage one own emotional state. In
negative) of arousal over the time indicate a tendency to
particular, the game uses a platform, the EmoBrain            7. Bos, D., “EEG-based emotion recognition – the
Interface, to recognize emotions from EEG signals that are       influence of visual and auditory stimuli”, Emotion,
used as input for the interacting with the game. The EI          57(7):1798-806, 2006.
interaction is a cycle of stimulus and feedbacks where the    8. Van Gerven, M., et Al.. The brain–computer interface
user receives a visual input and completes tasks, just           cycle. Journal of neural engineering 6 (4), 041001
controlling his emotional state, by activating endogenous
                                                              9. Zander, T.O., Kothe,C., Jatzev, S., and Gaertner, M.,
training strategies. The EI platform is able to recognize
                                                                 Enhancing human computer interaction with input from
emotions according to the Russell’s Circumplex Model of
                                                                 active and passive brain-computer interfaces. In Brain-
Emotions using two classifiers one for the valence and the
                                                                 Computer Interfaces - Ap Applying our Minds to
other for the arousal dimension.
                                                                 Human-Computer Interaction, pp. 181-199, 2010.
Results of the study presented in the paper show that the
autogenous training game was highly appreciated in terms      10. J. Posner, J. A. Russell, and B. S. Peterson, “The
of emotional involvement, control, cognitive involvement          circumplex model of affect: an integrative approach to
and challenge. Moreover they show that is possible to play        affective neuroscience, cognitive development, and
a game only by governing one own emotional flow and               psychopathology”, Development and psychopathology,
encourage us to make new experiments in order to improve          17(3):715-734, 2005.
the accuracy of our classifiers and therefore the impact of   11. Schalk, G., Mellinger, J. A practical guide to brain–
the serious game on the autogenous training of the user.          computer interfacing with BCI2000. Springer, London.
                                                                  (2010)
ACKNOWLEDGMENTS                                               12. https://www.emotiv.com/epoc/
We thank all the volunteers who participated in the study     13. Cooper, R., Osselton, J., Shaw, J.: EEG technology.
presented in this paper.                                          Butterworths (1969)
REFERENCES                                                    14. LIBSVM:     A     Library  for    Support   Vector
1. Norman, D.: Emotion & design: attractive things work           Machines ACM Trans. Intell. Syst. Technol., Vol. 2,
   better. Interactions 9(4), 36–42 (2002)                        No. 3. (May 2011), doi:10.1145/1961189.1961199 by
2. Carofiglio, V., Abbattista, F. Understanding, Modeling         Chih C. Chang, Chih J. Lin
   and Exploiting User's Emotions for Brain-Driven            15. Lang, P.J., Bradley, M.M., & Cuthbert, B.N. (2008).
   Interface Design Application to an Adaptive-3D-                International affective picture System (IAPS): Affective
   Virtual-Environment. INTERNATIONAL JOURNAL                     ratings of pictures and instruction manual. Technical
   OF                               PEOPLE-ORIENTED               Report A-8. University of Florida, Gainesville, FL.
   PROGRAMMING.Vol.3,pp.1-21.            ISSN:2156-1796.
                                                              16. Reilly, E.L. EEG Recording and Operation of the
   2014.
                                                                  Apparatus.      In:   Electroencephalography:      Basic
3. Carofiglio, V. Abbattista,F. A Rough BCI-based                 Principles, Clinical Applications and Related Fields, pp.
   Assessment of User's Emotions for Interface                    139–160. Lippincott Williams & Wilkins (1999)
   Adaptation: Application to a 3D-Virtual- Environment
   Exploration Task. Proceedings of the First International
   Workshop on Intelligent User Interfaces: Artificial
   Intelligence meets Human Computer Interaction
   (AI*HCI 2013) A workshop of the XIII International
   Conference of the Italian Association for Artificial
   Intelligence (AI*IA 2013). 2013.
4. Carofiglio, V. Abbattista, F. BCI-Based User-Centered
   Design for Emotionally-Driven User Experience. Cases
   on Usability Engineering: Design and Development of
   Digital Products. pp.299--320. doi: 10.4018/978-1-
   4666-4046-7; ISBN:9781466640467; Miguel A. Garcia-
   Ruiz (Pubs.), University of York, UK, 2013.
5. Picard, R.W., Affective computing. 1995
6. Garcia-Molina, G., Tsoneva,T., Nijholt, A., Emotional
   brain-computer interfaces, International Journal of
   Autonomous and Adaptive Communications Systems,
   v.6         n.1,          p.9-25,           December
   2013 [doi>10.1504/IJAACS.2013.05068]