                                Daniel Schiffner (Hrsg.): Proceedings of DeLFI Workshops 2018
      co-located with 16th e-Learning Conference of the German Computer Society (DeLFI 2018)
                                                        Frankfurt, Germany, September 10, 2018

Detecting Academic Emotions from Learners’ Skin Conductance and Heart Rate: A Data-driven Approach using Fuzzy Logic

Fadi Moukayed1, Haeseon Yun2, Tom Bisson3, Albrecht Fortenbacher4



Abstract: In this feasibility study, we designed and conducted an Emotional Picture Experiment (EPE) based on the International Affective Picture System (IAPS). We presented 96 affective visual stimuli to 27 participants and asked them to rate their emotions in terms of valence and arousal while recording their EDA and ECG sensor data. To the recorded sensor data, we adapted and applied the fuzzy logic model presented by Mandryk and Atkins [MA07]. To optimize this fuzzy logic approach, i.e. to find better membership functions and rule sets, we used a genetic algorithm. The results show that the valence and arousal values derived from the sensor data are close to the IAPS ratings of the affective pictures, which suggests that our fuzzy logic system could be used to assess and detect emotions in a learning context.
Keywords: academic emotions; arousal; valence; affective pictures; fuzzy logic; genetic algorithm; learning indicators



1     Introduction

Learners experience positive emotional states (happiness, satisfaction) as well as negative emotional states (boredom, frustration) [Gr16]. Despite a broad body of previous research on emotion, research on emotion in a learning context is still at an early stage. In our study, we aim to investigate the relationship between sensor data and academic emotional states. Firstly, we adopted a two-dimensional emotion model to conceptualize academic emotions [Yu17]. Secondly, we designed and conducted an emotional picture experiment based on the International Affective Picture System (IAPS) while collecting physiological sensor data using ECG (electrocardiogram) and EDA (electrodermal activity) sensors. Thirdly, we applied the fuzzy logic approach used by Mandryk and Atkins [MA07] to the accumulated sensor data. Based on first results, we optimized our approach using genetic optimization. We conclude the paper by reflecting on the analysis of our experiment data and discussing future work on emotion assessment to support learners in the form of a context-aware learning-support system.

1 HTW Berlin, Wilhelminenhofstraße 75a, 12459 Berlin, Germany, fadi.moukayed@student.htw-berlin.de
2 HTW Berlin, Wilhelminenhofstraße 75a, 12459 Berlin, Germany, haeseon.yun@htw-berlin.de
3 HTW Berlin, Wilhelminenhofstraße 75a, 12459 Berlin, Germany, tom.bisson@htw-berlin.de
4 HTW Berlin, Wilhelminenhofstraße 75a, 12459 Berlin, Germany, albrecht.fortenbacher@htw-berlin.de

2     Emotional Picture Experiment (EPE)

2.1    Academic Emotions and Sensor Data

Learners shift between positive and negative emotional states to varying degrees, and negative conditions such as boredom undermine learning achievement and the learning experience [Gr16]. In fact, learners’ knowledge attainment is linked to the emotions they face in an academic situation [IYF10]. Academic emotions, defined as emotions that are relevant in a learning context [Wo09], are thus highly relevant to successful learning. During learning, learners experience various emotions, yet they rarely experience the whole spectrum of emotions. For example, boredom, frustration, excitement and satisfaction are emotions that learners typically experience, whereas disgust, fear or sadness are rarely considered in a learning context.
To model emotion, one study utilized Ekman’s facial expression data to investigate emotions occurring during a learning situation [Ar09], while others adopted a two-dimensional model of emotion [LB07a] [Wu94]. Our classification of academic emotion [YFP17] builds on a two-dimensional model of emotion consisting of arousal and valence: high valence and high arousal (HVHA) refers to an excited and joyful state, high valence and low arousal (HVLA) to a concentrated and satisfied state, low valence and high arousal (LVHA) to a frustrated and angry state, and low valence and low arousal (LVLA) to a bored and tired state.
Methods such as human observation, self-report and hardware sensors combined with mathematical classification have been used to detect emotions [Dr08]. Regarding sensor data, EDA, respiratory rate, EMG (electromyogram) and ECG have been widely used to investigate the relation between physiological responses and emotion [Kr10] [Co02]. Values derived from ECG, such as heart rate (HR), heart rate variability (HRV) and heart rate acceleration, have been studied in relation to emotion. For example, heart rate variability was found to be linked to positive versus negative emotions [Re15], and heart rate acceleration is positively related to intense negative emotions such as fear [VCL86]. Additionally, EDA was found to be linked to emotional intensity, i.e. arousal level [LO86] [CM15].
Based on previous work relating the two-dimensional emotion model to academic emotion [Yu17], we selected visual stimuli established in emotion research to explore physiological signals during emotional picture viewing.


2.2    International Affective Picture System (IAPS)

The International Affective Picture System (IAPS) contains 1182 standardized emotional
stimuli based on a two-dimensional emotional model with the dimensions valence and arousal [LB07b]. Valence indicates the positive-to-negative nature of affect, whereas arousal represents its intensity [Fr86]. For instance, joy and anger are classified as having a similarly high degree of arousal, one with positive valence (joy) and the other with negative valence (anger). On the other hand, both the satisfied and the joyful state lie on the same positive side of the valence axis, although their arousal levels differ. To assign specific ratings to each picture, the original IAPS experiments used a graphical representation of valence and arousal (the Self-Assessment Manikin).
Based on large samples covering a broad range of age groups and both genders, each picture is provided with mean valence and arousal values and the respective standard deviations. This large library of pictorial stimuli serves as a set of controlled independent variables for exploring emotional responses. Physiological responses such as facial EMG, skin conductance, heart rate, brain waves, blood oxygen level and respiratory changes have been investigated in IAPS picture experiments to find relationships between physiological responses and picture ratings [La95]. In fact, skin conductance has been found to be closely related to arousal level, and cardiac acceleration/deceleration has been found to respond to pictures with different degrees of valence [VCL86] [Zu13].


2.3    EPE Design and Results

For this study, we focused on investigating emotion and therefore adopted ECG and EDA sensors to record participants’ physiological signals. For the experiment, we adapted the IAPS experiment setting to stimulate four categories of academic emotions based on the two-dimensional emotion model.
IAPS reference ratings of valence and arousal were used to assign appropriate pictures to the respective emotion categories. To better induce specific emotions, 96 out of the 1182 IAPS pictures were selected, with 24 pictures in each category (HVHA, HVLA, LVHA and LVLA). In the picture selection process, we considered gender effects, ethics, the magnitude of the ratings and their standard deviation. To minimize gender effects, only pictures with no statistically significant difference in valence and arousal ratings between genders were retained, using an independent t-test; in addition, the mean valence rating difference between female and male participants had to be less than 1, and the mean arousal rating difference less than 0.8. This resulted in 735 candidate pictures. To select pictures targeted at the specific categories, pictures containing explicit violence or sexually explicit content were excluded, and rating values higher than 6 and lower than 4 were used as thresholds for the high and low categories, respectively. Furthermore, the standard deviation of the ratings was kept as low as possible (less than 2.5); a sketch of this filtering procedure is given at the end of this subsection. The 96 pictures selected for the EPE were displayed in random order (5 seconds preparation, 6 seconds picture viewing, 10 seconds rating).
Some pictures depicted explicit content, which required consent from the participants. Additionally, the consent form included a description of the experiment and of the collection of physiological data for research purposes. When the experiment was finished, we showed a short video to restore the participants’ emotional state.
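
To make the selection procedure concrete, the following Python sketch shows how such a filter could be implemented. It assumes a hypothetical table of IAPS norms with per-gender means, standard deviations and sample sizes; all column names, the 0.05 significance level and the Welch variant of the t-test are illustrative assumptions, not part of the IAPS distribution or of our exact pipeline.

    # Illustrative pre-selection of IAPS pictures: exclude pictures whose valence or
    # arousal ratings differ significantly between genders, then keep low-variance
    # pictures and map them to the four quadrants. Column names are assumptions.
    import pandas as pd
    from scipy.stats import ttest_ind_from_stats

    def preselect_pictures(norms: pd.DataFrame, n_female: int, n_male: int,
                           alpha: float = 0.05) -> pd.DataFrame:
        """Keep pictures with no significant gender difference and low rating variance."""
        keep = []
        for _, row in norms.iterrows():
            # Welch t-tests computed from per-gender summary statistics (mean, SD, n).
            _, p_val = ttest_ind_from_stats(row["val_mean_f"], row["val_sd_f"], n_female,
                                            row["val_mean_m"], row["val_sd_m"], n_male,
                                            equal_var=False)
            _, p_aro = ttest_ind_from_stats(row["aro_mean_f"], row["aro_sd_f"], n_female,
                                            row["aro_mean_m"], row["aro_sd_m"], n_male,
                                            equal_var=False)
            keep.append(
                p_val > alpha and p_aro > alpha                        # no significant gender difference
                and abs(row["val_mean_f"] - row["val_mean_m"]) < 1.0   # mean valence difference < 1
                and abs(row["aro_mean_f"] - row["aro_mean_m"]) < 0.8   # mean arousal difference < 0.8
                and row["val_sd"] < 2.5 and row["aro_sd"] < 2.5        # keep rating SD below 2.5
            )
        return norms[pd.Series(keep, index=norms.index)]

    def assign_quadrant(val_mean: float, aro_mean: float) -> str:
        """Map overall ratings to HVHA/HVLA/LVHA/LVLA using the 6/4 thresholds."""
        if val_mean > 6 and aro_mean > 6:
            return "HVHA"
        if val_mean > 6 and aro_mean < 4:
            return "HVLA"
        if val_mean < 4 and aro_mean > 6:
            return "LVHA"
        if val_mean < 4 and aro_mean < 4:
            return "LVLA"
        return "unassigned"

From the pre-selected candidates, the final 24 pictures per quadrant were then chosen using the rating thresholds described above.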


2.4    EPE Sensor Data for Emotion Detection

During our EPE, we recorded the participants’ EDA and ECG sensor data with the Bitalino (R)evolution Plugged Kit (http://bitalino.com.en), which comes with a 10-bit analog-to-digital converter
(ADC). Each participant had 3-lead pre-gelled electrodes on their chest for ECG signal
detection and 2-lead pre-gelled electrodes on the intermediate phalanges of the index and
middle finger of the non-dominant hand (mostly the left hand) for the EDA signal. Communication with the sensor device took place via Bluetooth. The signals were recorded with a sampling rate of 1000 Hz using the manufacturer’s software OpenSignals for persistence and visual verification. Raw ADC values provided by the Bitalino sensors were converted to millivolts (ECG) and microsiemens (EDA). From the EDA signal, a slowly
changing component (SCL, Skin Conductance Level) and a fast changing component
(SCR, Skin Conductance Response) can be derived. While SCL reflects overall changes
in sympathetic arousal, SCR occurs as an immediate consequence of stimuli [Ch81].
Regarding emotions, a depressed state can be associated with low SCLs together with
reduced nonspecific SCRs [Ch81].
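
The decomposition can be illustrated with a short sketch: low-pass filtering the conductance signal yields the slowly changing tonic level, and the residual carries the phasic responses. The filter order and the 0.05 Hz cut-off below are illustrative assumptions, not values taken from our processing pipeline.

    # Illustrative tonic/phasic decomposition of an EDA signal (in microsiemens):
    # a low-pass filter yields the slowly changing SCL, the residual is the SCR component.
    import numpy as np
    from scipy.signal import butter, filtfilt

    def split_eda(eda_us: np.ndarray, fs: float = 1000.0, cutoff_hz: float = 0.05):
        """Return (scl, scr) for an EDA signal sampled at fs Hz."""
        b, a = butter(N=2, Wn=cutoff_hz / (fs / 2.0), btype="low")
        scl = filtfilt(b, a, eda_us)   # tonic skin conductance level
        scr = eda_us - scl             # phasic skin conductance responses
        return scl, scr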
Besides HR (Heart Rate), HRV (Heart Rate Variability) is used as an indicator of the state of a person’s autonomic nervous system [Ca96], and it has been identified as a possible indicator of emotions [Gr15]. HRV can be analyzed in the frequency or in the time domain. For HRV analysis in the time domain, it is claimed that only measurements obtained from recordings of the same length (short or long) should be compared [Ca96]. There are also reports on ultra-short-term analysis (less than 5 minutes), such as the root mean square of successive differences (RMSSD), for which a 10-second window shows a significant correlation to the respective 5-minute recording [Th03]. Frequency-domain analysis of HRV concerns the balance between sympathetic and vagal activity [HG17]. While HR was computed from the ECG signal using the Hamilton method [Ha02], we derived SDNN (the standard deviation of RR intervals) and SD2 (the standard deviation of RR interval differences) over a window of 16 seconds (viewing and rating), referring to [Pe17], where SDNN/SD2 windows of 30 seconds may be used for stress detection.
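
As a small illustration, these features can be computed from the R-peak times of one 16-second window as follows. R-peak detection itself (e.g. with Hamilton’s detector) is assumed to have been done already, and the feature definitions follow the descriptions above.

    # Time-domain features for one analysis window, computed from R-peak times in seconds.
    import numpy as np

    def hrv_features(r_peaks_s: np.ndarray) -> dict:
        """Compute HR, SDNN and SD2 (as defined above) from detected R-peaks."""
        rr = np.diff(r_peaks_s) * 1000.0        # RR intervals in milliseconds
        hr = 60000.0 / np.mean(rr)              # mean heart rate in beats per minute
        sdnn = np.std(rr, ddof=1)               # standard deviation of RR intervals
        sd2 = np.std(np.diff(rr), ddof=1)       # standard deviation of RR interval differences
        return {"hr": hr, "sdnn": sdnn, "sd2": sd2}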


3      Deriving Emotions from Sensor Data via Fuzzy Logic

To assess learners’ emotions from sensor data, we have to deal with partial, imprecise
knowledge and imprecise data (Setz et al. claim that “EDA peak height and the instanta-
neous peak rate carry information about the stress level of a person” [Se10], and a va-
lence of 6 in Lang’s emotional model [La95] describes a rather happy, content person).
In a human-like manner, fuzzy models allow expert knowledge to be expressed as sets of simple rules
(e.g. IF eda IS high THEN arousal IS high). Fuzzy membership functions allow for par-
tial membership of input and output variables around the boundary of classes (e.g. low,
midlow, midhigh, high).
We started out with a fuzzy logic model for assessing emotions as presented by Mandryk
and Atkins [MA07]. In a gaming environment, they used EDA and HR signals for arous-
al, and for valence HR and EMG signals, the latter indicating smiling or frowning activities. In our experiment, we are restricted to EDA and ECG signals, so we opted for using
SDNN and SD2 to derive valence values. Additionally, we used a genetic algorithm to
improve the arousal model, and to find a viable model for valence. This goes back to
work by Koza in 1990 [Ko90]. Even though the initial membership boundaries are uncertain, the heuristic approach offers usable results at relatively low computational cost. Additionally, as emotional states are imprecise and difficult to express numerically, some misclassifications can easily be tolerated [Za96].


3.1    Fuzzy Logic to Assess Emotions

Through fuzzy logic, we can assess valence and arousal values, which serve as indicators
for emotions. Fuzzy logic provides a way to approximate and quantify granular input
expressed as linguistic variables [Za94]. In our fuzzy model, input variables consist of
EDA, HR, SD2 and SDNN, whereas arousal and valence serve as output variables.
Membership functions map a crisp input value to a degree of membership in each fuzzy set. The fuzzy logic system weights each input signal, defines the overlap between the levels of input, and determines an output response. A set of IF/THEN rules uses the input membership values as weighting factors to determine their influence on the fuzzy solution sets. Once the output functions are inferred, scaled, and combined, they are defuzzified into a solution variable (a scalar output) [Co92].
To assess arousal, we started with a fuzzy logic system as proposed by [MA07], with
EDA and HR as input variables, triangular membership functions and with a set of eight
rules. In the case of valence, there is no clear evidence in the literature on how input variables derived from the ECG signal (e.g. HR or SDNN) correlate with valence, or with any form of emotion [Re15]. The input variables SD2 and SDNN seemed to fit our study design best. Therefore, we started out with a fuzzy logic model with SD2 and SDNN as input variables, together with a random ruleset.
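
For illustration, the sketch below shows the structure of such a Mamdani-style inference with triangular membership functions, min/max inference and centroid defuzzification. The membership boundaries and the two rules are placeholders chosen for readability; they are neither the eight rules of [MA07] nor our optimized parameters.

    # Minimal Mamdani-style sketch: triangular membership functions, min/max inference
    # and centroid defuzzification. All boundaries and both rules are placeholders.
    import numpy as np

    def trimf(x, a, b, c):
        """Triangular membership function with feet a, c and peak b (requires a < b < c)."""
        return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

    def assess_arousal(eda_norm: float, hr_norm: float) -> float:
        """Derive an arousal value on a 1..9 scale from normalized EDA and HR inputs in [0, 1]."""
        universe = np.linspace(1.0, 9.0, 200)                  # output universe (IAPS-style scale)

        # Fuzzification: membership degrees of the crisp inputs.
        eda_low, eda_high = trimf(eda_norm, -1.0, 0.0, 0.6), trimf(eda_norm, 0.4, 1.0, 2.0)
        hr_low, hr_high = trimf(hr_norm, -1.0, 0.0, 0.6), trimf(hr_norm, 0.4, 1.0, 2.0)

        # Output fuzzy sets over the arousal universe.
        arousal_low = trimf(universe, 0.0, 1.0, 5.0)
        arousal_high = trimf(universe, 5.0, 9.0, 10.0)

        # Rule evaluation: AND via min, implication via clipping, aggregation via max.
        fire_high = min(eda_high, hr_high)   # IF eda IS high AND hr IS high THEN arousal IS high
        fire_low = min(eda_low, hr_low)      # IF eda IS low AND hr IS low THEN arousal IS low
        aggregated = np.maximum(np.minimum(arousal_high, fire_high),
                                np.minimum(arousal_low, fire_low))

        # Centroid defuzzification; fall back to the scale midpoint if no rule fires.
        if aggregated.sum() == 0.0:
            return 5.0
        return float(np.sum(universe * aggregated) / np.sum(aggregated))

In the full arousal model, the eight rules of [MA07] and four (EDA) respectively three (HR) membership functions take the place of these placeholders, and the genetic algorithm described next tunes the triangle parameters.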


3.2    Improving the Fuzzy Model via Genetic Optimization

Genetic optimization is a form of probabilistic reasoning that can be employed in combination with fuzzy logic and provides advantages over other optimization methods [Za96]. Both membership functions and rulesets can be parameters of genetic
optimization. The fitness function used in our optimization is “closeness” of the obtained
values for each picture to IAPS ratings. This can be expressed as the mean squared error
(MSE) over all EPE pictures, which in the case of valence is defined by

\[ \mathrm{MSE}_{\mathrm{valence}} = \frac{1}{N \cdot P} \sum_{i=1}^{N} \sum_{j=1}^{P} \left( v_{i,j} - v'_i \right)^2 \]

where N is the number of pictures, P the number of participants, v_{i,j} is the valence value derived for picture i and participant j, and v'_i is the IAPS valence rating for picture i.
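
Written directly from this definition, the fitness evaluation is a short function; the array shapes below are assumptions made for the sketch.

    # Sketch of the fitness function: mean squared error between derived valence values
    # (one per picture and participant) and the IAPS reference ratings.
    import numpy as np

    def mse_fitness(derived: np.ndarray, iaps_reference: np.ndarray) -> float:
        """derived has shape (n_pictures, n_participants); iaps_reference has shape (n_pictures,)."""
        return float(np.mean((derived - iaps_reference[:, None]) ** 2))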

As a first step, the genetic algorithm generates an initial population. In each optimization
step, new individuals are created using cross-over and mutation, and added to the popu-
lation. The size of the population remains constant, so after each step only the fittest individuals survive. The uniform crossover and mutation rates (20% and 1%, respectively) were kept low, conforming to common recommendations [HR07]. The genetic algorithm
was configured to terminate when MSE remains constant for 500 iterations, which indi-
cates that no further improvement can be achieved.
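
The following sketch shows this loop in compact form. The encoding of an individual as a flat parameter vector in [0, 1], the population size and the evaluate() callback (which would build the fuzzy system from the parameters and return its MSE) are assumptions made for the illustration, not our exact implementation.

    # Illustrative elitist genetic-algorithm loop. Individuals are flat parameter vectors
    # (membership-function boundaries and, for valence, rule selections); evaluate()
    # is assumed to decode an individual into a fuzzy system and return its MSE.
    import numpy as np

    rng = np.random.default_rng(0)

    def run_ga(evaluate, n_params, pop_size=50, crossover_rate=0.2,
               mutation_rate=0.01, patience=500):
        """Minimize evaluate(individual); stop after `patience` steps without improvement."""
        pop = rng.random((pop_size, n_params))                 # random initial population in [0, 1]
        fitness = np.array([evaluate(ind) for ind in pop])
        best, stale = fitness.min(), 0

        while stale < patience:
            # Uniform crossover of two randomly chosen parents produces one child.
            p1, p2 = pop[rng.integers(pop_size)], pop[rng.integers(pop_size)]
            child = np.where(rng.random(n_params) < crossover_rate, p2, p1)
            # Mutation: re-randomize a small fraction of genes.
            mutate = rng.random(n_params) < mutation_rate
            child[mutate] = rng.random(mutate.sum())

            # Add the child, then keep only the fittest individuals (constant population size).
            pop = np.vstack([pop, child])
            fitness = np.append(fitness, evaluate(child))
            survivors = np.argsort(fitness)[:pop_size]
            pop, fitness = pop[survivors], fitness[survivors]

            stale = stale + 1 if fitness.min() >= best else 0
            best = min(best, fitness.min())

        return pop[np.argmin(fitness)], best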
For arousal, we took the rules from [MA07] and only optimized the membership functions. There are four membership functions for EDA and three for HR, yielding 21 parameters for optimization. In the case of valence assessment using the input variables SD2 and SDNN, we started with a generic (chaotic) population, including random membership functions and rule sets. Out of a set of 32 possible (even contradicting) rules, each individual of the initial population obtains a random subset of these rules. Assuming four triangle-shaped membership functions per input variable, this yields 24 membership function parameters. Together, this gives 56 parameters for optimization.


3.3    First Results

The arousal inference system optimized against IAPS reference arousal values yielded a
best MSE value of 2.39, after a total of 1706 iterations. To verify the result, we repeated
the optimization with a modified fitness function, which uses mean arousal values pro-
vided by participants in our experiment instead of IAPS reference values. In this case, it
took more time for the optimization to terminate (2958 iterations), yielding about the
same MSE value for arousal (2.40). While the genetically optimized HR membership function boundaries of both systems were almost identical, a significant difference was observed in the shape of the EDA membership functions.
The differences between the two systems are a consequence of the deviation between the IAPS reference values and the participant-provided ratings. Although we expected the participant-provided ratings to reflect the emotional states more accurately, this was not the case, and the system derived from the IAPS reference values had a lower MSE than the one derived from the participant-provided values. The EDA membership functions varied strongly between the two systems, as EDA is the attribute that reflects arousal more strongly and is hence more affected by the change in ratings.
We also optimized the fuzzy model for valence, using both rulesets and membership
functions (SDNN, SD2) as parameters. While we could use a “valid” ruleset in the case
of arousal, we had to start optimization of the valence model with generic (chaotic) rules
for SDNN and SD2. Optimization terminated after 1795 iterations and yielded an MSE
of 3.587, which is much worse than the MSE for arousal. The final ruleset for valence contained 7 rules.

4    Discussion and Outlook

The work presented in this paper was a feasibility study relating sensor data (EDA, HR, SD2 and SDNN) to academic emotions. Linguistic terms for recognizing emotions by human judgement can easily be expressed in a fuzzy logic system, which makes it easier to interpret our results as indicators for academic emotions. However, the study focused on exploring the practicality of fuzzy methods for investigating physiological data as indicators of academic emotion. Conceptually, we related academic emotion to a two-dimensional emotion model, yet a future study should aim at detecting emotions in a learning-relevant context.
Relating sensor data to emotions is a challenging task. Especially for valence it is diffi-
cult to relate features such as HRV (time or frequency domain) to values which exactly
indicate emotions. In this case, genetic optimization proved to be a valuable tool to ob-
tain expert rules. Although an MSE of 3.5 (with possible values ranging from 1 to 9) cannot be regarded as an ideal result, it is a first step towards gaining “expert knowledge” about rules that transform SDNN and SD2 into valence values. One important result is that two of the rules obtained through optimization, namely IF sdnn IS low THEN valence IS high and IF sdnn IS midlow THEN valence IS high, confirm previous findings in which an inverse correlation between SDNN and valence was reported [Re15].
Our next step will be to investigate classification accuracy, i.e. how well the picture category (based on IAPS ratings) is detected, as a fitness function for the genetic algorithm. This could yield better indicators for academic emotions, or for specific learning situations. To do this, our future investigation should take place in a learning context that activates the emotions occurring during learning.
As there are many features that can be derived from our sensor data, e.g. SCR pulses from EDA or HRV from ECG, we will use classifiers such as decision trees to obtain a good feature selection. This could result in a better understanding of the derived data and in better predictors for valence and arousal, and it could also help to optimize the promising fuzzy logic approach.
Future studies should further investigate means of providing learning support for students in the form of Intelligent Tutoring Systems (ITSs) [Ba10]. As current studies on learning support systems have revealed the need for context-aware systems using physiological data [Du07] [Gi13], a system that uses sensor data and provides context-aware learning support, including emotional support, can help learners persist in their learning or break out of negative states [DG12]. Therefore, our future work includes refining emotion detection along with research on appropriate pedagogical interventions to help learners regulate negative states.

5    Acknowledgement

This work has been funded by the BMBF project LISA (16SV7534K).


Bibliography
[Ar09] Arroyo, Ivon; Cooper, David G; Burleson, Winslow; Woolf, Beverly Park; Muldner,
       Kasia; Christopherson, Robert: Emotion sensors go to school. In: AIED. Jgg. 200, S. 17–
       24, 2009.
[Ba10]   Baker, Ryan SJd; D’Mello, Sidney K; Rodrigo, Ma Mercedes T; Graesser, Arthur C:
         Better to be frustrated than bored: The incidence, persistence, and impact of learners’
         cognitive–affective states during interactions with three different computer-based learn-
         ing environments. International Journal of Human-Computer Studies, 68(4):223–241,
         2010.
[Ca96]   Camm, A John; Malik, Marek; Bigger, JT; Breithardt, Günter; Cerutti, Sergio; Cohen,
         Richard J; Coumel, Philippe; Fallen, Ernest L; Kennedy, Harold L; Kleiger, RE et al.:
         Heart rate variability. Standards of measurement, physiological interpretation, and clini-
         cal use. European heart journal, 17(3):354–381, 1996.
[Ch81]   Christie, Margaret J: Electrodermal activity in the 1980s: a review. Journal of the Royal
         Society of Medicine, 74(8):616–622, 1981.
[CM15] Chanel, Guillaume; Mühl, Christian: Connecting brains and bodies: applying physiologi-
       cal computing to support social interaction. Interacting with Computers, 27(5):534–550,
       2015.
[Co92]   Cox, Earl: Fuzzy fundamentals. IEEE spectrum, 29(10):58–61, 1992.
[Co02]   Conati, Cristina: Probabilistic assessment of user’s emotions in educational games.
         Applied artificial intelligence, 16(7-8):555–575, 2002.
[DG12] D’Mello, Sidney; Graesser, Art: Dynamics of affective states during complex learning.
       Learning and Instruction, 22(2):145–157, 2012.
[Dr08]   Dragon, Toby; Arroyo, Ivon; Woolf, Beverly P; Burleson, Winslow; El Kaliouby, Rana;
         Eydgahi, Hoda: Viewing student affect and learning through classroom observation and
         physical sensors. In: International Conference on Intelligent Tutoring Systems. Springer,
         S. 29–39, 2008.
[Du07]   Duckworth, Angela L; Peterson, Christopher; Matthews, Michael D; Kelly, Dennis R:
         Grit: perseverance and passion for long-term goals. Journal of personality and social
         psychology, 92(6):1087, 2007.
[Fr86]   Frijda, Nico H: The emotions. Cambridge University Press, 1986.
[Gi13]   Girard, Sylvie; Chavez-Echeagaray, Maria Elena; Gonzalez-Sanchez, Javier; Hidalgo-
         Pontet, Yoalli; Zhang, Lishan; Burleson, Winslow; VanLehn, Kurt: Defining the behav-
         ior of an affective learning companion in the affective meta-tutor project. In: Internation-
         al Conference on Artificial Intelligence in Education. Springer, S. 21–30, 2013.
[Gr15]   Gruber, June; Mennin, Douglas S; Fields, Adam; Purcell, Amanda; Murray, Greg: Heart
         rate variability as a potential indicator of positive valence system disturbance: a proof of
         concept investigation. International Journal of Psychophysiology, 98(2):240–248, 2015.
[Gr16]   Grawemeyer, Beate; Mavrikis, Manolis; Holmes, Wayne; Gutierrez-Santos, Sergio;
         Wiedmann, Michael; Rummel, Nikol: Affecting off-task behaviour: how affect-aware
         feedback can improve student learning. In: Proceedings of the Sixth International Con-
         ference on Learning Analytics & Knowledge. ACM, S. 104–113, 2016.
[Ha02]   Hamilton, Pat: Open source ECG analysis. In: Computers in Cardiology, 2002. IEEE, S.
         101–104, 2002.
[HG17] Heathers, James; Goodwin, Matthew: LF/HF HRV: The ‘Life After Death’ of a Refuted
       Theory. 2017.
[HR07] Haupt, Sue Ellen; Haupt, Randy L: Genetic algorithms and their applications in
       environmental sciences. Advanced Methods for Decision Making and Risk Management
       in Sustainability Science. Nova Science Publishers, New York, S. 183–196, 2007.
[IYF10] Immordino-Yang, Mary Helen; Faeth, Matthias: The role of emotion and skilled intui-
        tion in learning. Mind, brain, and education: Neuroscience implications for the class-
        room, S. 69–83, 2010.
[Ko90]   Koza, John R: Genetic programming: A paradigm for genetically breeding populations
         of computer programs to solve problems, Jgg. 34. Stanford University, Department of
         Computer Science Stanford, CA, 1990.
[Kr10]   Kreibig, Sylvia D: Autonomic nervous system activity in emotion: A review. Biological
         psychology, 84(3):394–421, 2010.
[La95]   Lang, Peter J: The emotion probe: Studies of motivation and attention. American psy-
         chologist, 50(5):372, 1995.
[LB07a] Lang, P; Bradley, Margaret M: The International Affective Picture System (IAPS) in the
        study of emotion and attention. Handbook of emotion elicitation and assessment, 29,
        2007.
[LB07b] Lang, P; Bradley, Margaret M: The International Affective Picture System (IAPS) in the
        study of emotion and attention. Handbook of emotion elicitation and assessment, 29,
        2007.
[LO86] Lanzetta, John T; Orr, Scott P: Excitatory strength of expressive faces: Effects of happy
       and fear expressions and context on the extinction of a conditioned fear response. Journal
       of Personality and Social Psychology, 50(1):190, 1986.
[MA07] Mandryk, Regan L; Atkins, M Stella: A fuzzy physiological approach for continuously
       modeling emotion during interaction with play technologies. International journal of
       human-computer studies, 65(4):329–347, 2007.
[Pe17]   Pereira, Tânia; Almeida, Pedro R; Cunha, João PS; Aguiar, Ana: Heart rate variability
         metrics for fine-grained stress level assessment. Computer methods and programs in bi-
         omedicine, 148:71–80, 2017.
[Re15]   Rezaei, Shahab; Moharreri, Sadaf; Dabanloo, Nader Jafarnia; Parvaneh, Saman: Evaluat-
         ing valence level of pictures stimuli in heart rate variability response. In: Computing in
         Cardiology Conference (CinC), 2015. IEEE, S. 1057–1060, 2015.
[Se10]   Setz, Cornelia; Arnrich, Bert; Schumm, Johannes; La Marca, Roberto; Tröster, Gerhard;
         Ehlert, Ulrike: Discriminating stress from cognitive load using a wearable EDA device.
         IEEE Transactions on information technology in biomedicine, 14(2):410–417, 2010.
[Th03]   Thong, Tran; Li, Kehai; McNames, James; Aboy, Mateo; Goldstein, Brahm: Accuracy of
         ultra-short heart rate variability measures. In: Engineering in Medicine and Biology So-
         ciety, 2003. Proceedings of the 25th Annual International Conference of the IEEE. Jgg.
         3. IEEE, S. 2424–2427, 2003.
[VCL86] Vrana, Scott R; Cuthbert, Bruce N; Lang, Peter J: Fear imagery and text processing.
        Psychophysiology, 23(3):247–253, 1986.
[Wo09] Woolf, Beverly; Dragon, Toby; Arroyo, Ivon; Cooper, David; Burleson, Winslow;
       Muldner, Kasia: Recognizing and responding to student affect. In: International Confer-
       ence on Human-Computer Interaction. Springer, S. 713–722, 2009.
[Wu94] Wundt, Wilhelm: "Lectures on Human and Animal Psychology". Translated by JE
       Creighton and EB Titchener. 1894.
[YFP17] Yun, Haeseon; Fortenbacher, Albrecht; Pinkwart, Niels: Improving a Mobile Learning
        Companion for Self-regulated Learning using Sensors. In: Proceedings of the 9th International
        Conference on Computer Supported Education - Volume 1: CSEDU.
        INSTICC, SciTePress, S. 531–536, 2017.
[Yu17] Yun, Haeseon; Fortenbacher, Albrecht; Pinkwart, Niels; Bisson, Tom; Moukayed, Fadi:
       A Pilot Study of Emotion Detection using Sensors in a Learning Context: Towards an
       Affective Learning Companion. In: DeLFI/GMW Workshops. 2017.
[Za94]   Zadeh, Lotfi A: Fuzzy logic, neural networks, and soft computing. Communications of
         the ACM, 37(3):77–84, 1994.
[Za96]   Zadeh, Lotfi A: Fuzzy logic = computing with words. IEEE transactions on fuzzy sys-
         tems, 4(2):103–111, 1996.
[Zu13]   Zuidhof, H.J.: Emotional Arousal Detection. Masterarbeit, University of Groningen,
         Groningen, The Netherlands, 2013.