<?xml version="1.0" encoding="UTF-8"?>
<TEI xml:space="preserve" xmlns="http://www.tei-c.org/ns/1.0" 
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" 
xsi:schemaLocation="http://www.tei-c.org/ns/1.0 https://raw.githubusercontent.com/kermitt2/grobid/master/grobid-home/schemas/xsd/Grobid.xsd"
 xmlns:xlink="http://www.w3.org/1999/xlink">
	<teiHeader xml:lang="en">
		<fileDesc>
			<titleStmt>
				<title level="a" type="main">A Pilot Study of Emotion Detection using Sensors in a Learning Context: Towards an Affective Learning Companion</title>
			</titleStmt>
			<publicationStmt>
				<publisher/>
				<availability status="unknown"><licence/></availability>
			</publicationStmt>
			<sourceDesc>
				<biblStruct>
					<analytic>
						<author>
							<persName><forename type="first">Haeseon</forename><surname>Yun</surname></persName>
							<email>yun@htw-berlin.de</email>
							<affiliation key="aff0">
								<orgName type="institution">HTW Berlin</orgName>
								<address>
									<addrLine>Campus Wilhelminenhofstraße 75A</addrLine>
									<postCode>12459</postCode>
									<settlement>Berlin</settlement>
								</address>
							</affiliation>
						</author>
						<author>
							<persName><forename type="first">Albrecht</forename><surname>Fortenbacher</surname></persName>
							<affiliation key="aff1">
								<orgName type="institution">HTW Berlin</orgName>
								<address>
									<addrLine>Campus Wilhelminenhofstraße 75A</addrLine>
									<postCode>12459</postCode>
									<settlement>Berlin</settlement>
								</address>
							</affiliation>
						</author>
						<author>
							<persName><forename type="first">Niels</forename><surname>Pinkwart</surname></persName>
							<email>niels.pinkwart@hu-berlin.de</email>
							<affiliation key="aff2">
								<orgName type="department">Institut für Informatik</orgName>
								<orgName type="institution">HU Berlin</orgName>
								<address>
									<addrLine>Rudower Chaussee 25</addrLine>
									<postCode>12489</postCode>
									<settlement>Berlin</settlement>
								</address>
							</affiliation>
						</author>
						<author>
							<persName><forename type="first">Tom</forename><surname>Bisson</surname></persName>
							<email>tom.bisson@student.htw-berlin.de</email>
							<affiliation key="aff3">
								<orgName type="institution">HTW Berlin</orgName>
								<address>
									<addrLine>Campus Wilhelminenhofstraße 75A</addrLine>
									<postCode>12459</postCode>
									<settlement>Berlin</settlement>
								</address>
							</affiliation>
						</author>
						<author>
							<persName><forename type="first">Fadi</forename><surname>Moukayed</surname></persName>
							<email>fadi.moukayed@student.htw-berlin.de</email>
							<affiliation key="aff4">
								<orgName type="institution">HTW Berlin</orgName>
								<address>
									<addrLine>Campus Wilhelminenhofstraße 75A</addrLine>
									<postCode>12459</postCode>
									<settlement>Berlin</settlement>
								</address>
							</affiliation>
						</author>
						<title level="a" type="main">A Pilot Study of Emotion Detection using Sensors in a Learning Context: Towards an Affective Learning Companion</title>
					</analytic>
					<monogr>
						<imprint>
							<date/>
						</imprint>
					</monogr>
					<idno type="MD5">465AA8A5EDF625CC1CA2EBD3A05600E5</idno>
				</biblStruct>
			</sourceDesc>
		</fileDesc>
		<encodingDesc>
			<appInfo>
				<application version="0.7.2" ident="GROBID" when="2023-03-24T09:23+0000">
					<desc>GROBID - A machine learning software for extracting information from scholarly documents</desc>
					<ref target="https://github.com/kermitt2/grobid"/>
				</application>
			</appInfo>
		</encodingDesc>
		<profileDesc>
			<textClass>
				<keywords>
					<term>sensor based learning</term>
					<term>learning companion</term>
					<term>learning analytics</term>
					<term>adaptive learning</term>
					<term>emotion detection</term>
					<term>IAPS</term>
				</keywords>
			</textClass>
			<abstract>
<div xmlns="http://www.tei-c.org/ns/1.0"><p>Emotions facilitate knowledge attainment and also influence learners' current behavior and future choices. Sensors that detect physiological signals have been studied in relation to emotions; in particular, electrodermal activity (EDA) and heart rate variability (HRV) have been adopted for emotion detection. In this pilot study, we presented visual emotional stimuli to six participants and collected their ratings of each picture. Their EDA and HRV values were recorded and examined for relations between the stimulated emotion, the self-assessment and the physiological signals. The study explored EDA and HRV signal changes in response to the visual stimuli, and some changes in EDA were observed when joyful or satisfying pictures were presented. However, several limitations need to be overcome before clearer interpretations can be made. A future study on providing awareness to learners through a learning companion is suggested.</p></div>
			</abstract>
		</profileDesc>
	</teiHeader>
	<text xml:lang="en">
		<body>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="1">Introduction</head><p>Learning Analytics for Sensor-Based Adaptive Learning (LISA) is a research project aimed at improving learner support through the use of sensor data. Specifically, it brings together user-centric learning analytics, the analysis of sensor data indicating the emotional state of a learner, and adaptive feedback as learning support, in order to provide solutions for sensor-based adaptive learning. The developed solutions will furthermore be integrated into learning environments and products.</p><p>In the pilot study presented in this paper, we explored emotion detection using sensors and interpreted the data obtained from participants. We first discuss the role of emotions in a learning context, classify emotions into four academic emotions, collect physiological data using sensors, and interpret the empirical data in relation to emotions in a learning context.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="2">Detection and Interpretation of Emotions in a Learning Context</head><p>Emotion plays a navigating role in cognitive knowledge attainment, and it affects the outcomes of learners' behaviors and even their future choices of action <ref type="bibr" target="#b13">[IF10]</ref>. For successful learning, it is important to provide an environment where learners feel emotionally supported, so that both emotional and cognitive learning can take place <ref type="bibr" target="#b10">[GS08]</ref>. One way to provide an emotionally supportive environment is to make learners aware of their own emotional state, so that this information can encourage them to modify it <ref type="bibr" target="#b4">[Bu06]</ref>. For instance, if a learner is in a negative emotional state such as frustration or boredom, knowing about this state can help the learner decide to break out of it.</p><p>Research on autonomic responses in emotion examines various physiological signals, such as heart rate, heart rate variability, respiratory rate and electrodermal activity, to detect positive (happiness, contentment, joy, peacefulness and calmness) and negative (anger, disgust, fear, sadness, surprise, depression, boredom and embarrassment) emotions <ref type="bibr" target="#b15">[Kr10]</ref> [CM15] <ref type="bibr" target="#b9">[Fa09]</ref>. However, the outcomes of these studies on general emotions may not be applicable in a learning context, since the emotional states of interest there differ from general emotions <ref type="bibr" target="#b21">[Wo09]</ref>. For instance, intrinsic motivation and the state of flow <ref type="bibr" target="#b7">[Cs90]</ref> are emotional states that are positively related to successful learning, yet classifying these states on the basis of previous emotion studies is not simple. Considering emotion in a learning context, Pekrun and colleagues <ref type="bibr" target="#b20">[Pe02]</ref> focused on academic emotion, which comprises the positive emotions enjoyment, hope, pride and relief and the negative emotions anger, anxiety, shame, hopelessness and boredom.</p><p>As extensive research on sensors detecting physiological changes related to emotion has been conducted in the field of autonomic responses, and the emotional states relevant to a learning context have been investigated in education and educational psychology, the commonly used dimensions of emotion <ref type="bibr" target="#b21">[Wo09]</ref> were selected. Specifically, four emotions (joyful, satisfied, angry and bored) related to academic emotions were used for this study, as follows:</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head>•</head><p>Positive valence and high arousal: excited and joyful. Positive valence and low arousal: satisfied. Negative valence and high arousal: angry. Negative valence and low arousal: bored.</p><p>Valence refers to a range from happiness to unhappiness, whereas arousal pertains to a range from excitement to calmness. For instance, when a person is surprised by a nice gift, he or she can be excited (high arousal) and happy (positive valence), whereas when one is scared by someone, one can still be excited (high arousal) but angry or unhappy (negative valence).</p><p>To detect emotions in an experimental setting, emotional stimuli are presented to learners, and self-reports on these stimuli are used to relate them to specific emotions. For instance, Lang and colleagues <ref type="bibr" target="#b16">[LBC08]</ref> presented a set of emotional pictures as stimuli and asked participants to indicate their levels of emotion.</p><p>Additionally, hardware sensors are adopted to associate physiological changes during the presence of an emotional stimulus with a specific emotion. For example, a camera was used to detect head position and movement in relation to emotion <ref type="bibr" target="#b21">[Wo09]</ref>, and various researchers have advocated the feasibility of emotion detection using hardware sensors, including cameras and physiological sensors such as EEG, EMG, electrodermal activity sensors and heart rate sensors [MA07] [BCF08] [BCF07] [Bu06] [CM09] [Ar09] [AFR11] [DG10]. Specifically, heart rate variability has been shown to indicate when a given task induces stress <ref type="bibr" target="#b22">[YH08]</ref>, and skin conductance and electrodermal activity describe changes in emotion <ref type="bibr">[Kr10] [Mc12]</ref>.</p><p>Even though there have been many attempts in the aforementioned studies to detect emotion using sensors, relating a specific emotion to sensor data in a learning context is still at an explorative stage. Therefore, we conducted a pilot study to investigate sensor data in relation to emotion, specifically academic emotion.</p></div>
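The four-quadrant scheme above can be sketched as a small helper. This is an illustrative reconstruction, assuming the placement of the four academic emotions on the valence/arousal plane that section 3.2 uses for classification; the function and its name are ours, not part of the study's software.

```python
def academic_emotion(valence_positive: bool, arousal_high: bool) -> str:
    """Map a valence/arousal quadrant to one of the four academic
    emotions used in the study: joyful, satisfied, angry or bored.
    (Illustrative sketch; quadrant assignment inferred from the paper.)"""
    if valence_positive:
        return "joyful" if arousal_high else "satisfied"
    return "angry" if arousal_high else "bored"
```

For example, the scared, unhappy reaction described in the text (high arousal, negative valence) maps to angry, while the gift example (high arousal, positive valence) maps to joyful.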
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="3">A pilot study</head></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="3.1">Materials and Methods</head><p>As wearable sensors can measure physiological changes that can in turn elaborate the emotional state, we replicated the experiment of <ref type="bibr" target="#b16">[LBC08]</ref>, additionally using physiological measurements of EDA and ECG. The purpose of the pilot study was to investigate physiological signals on the basis of this established experiment and to derive improvements for a future study integrating sensors in an emotional-stimuli experiment setting.</p><p>In this pilot study, we used 58 IAPS (International Affective Picture System) pictures, and each participant wore ECG and EDA sensors. Six subjects participated in the pilot study; they were selected based on their willingness and availability. Participants were informed that the purpose of the study was to relate physiological signals to emotional pictures. Physiological signals were measured using pre-gelled electrodes (ECG on the left part of the chest or collarbone, EDA on two fingers), and the signals were verified visually using a real-time online visualization tool before the experiment was conducted. As in the original study, all participants were instructed to look at each picture and rate it on the Self-Assessment Manikin (SAM) <ref type="bibr" target="#b16">[LBC08]</ref>, which covers three dimensions of emotion. To clarify the three dimensions, example slides were provided, as shown in figure <ref type="figure" target="#fig_0">1</ref>. Once a participant understood what to do, he or she began the experiment, which took around 30 minutes.</p><p>The experiment consisted of 58 sessions. Each session consisted of a prompt ("Get ready for the next slide"), one of the 58 emotional pictures and a SAM rating scale for the self-report. The participant was prompted to prepare for the next picture for a duration of five seconds. The picture was then displayed for six seconds, followed by a screen with the 9-point SAM rating scales. The rating was given by clicking on radio buttons associated with the three SAM scales for valence, arousal and dominance. A countdown timer at the bottom of the rating page informed the participant of the remaining time and encouraged him or her to make a selection.</p></div>
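The session structure just described (a five-second prompt, a six-second stimulus, then the SAM rating screen) can be sketched as a schedule generator. The function, its name and the example picture IDs are illustrative placeholders, not the presentation software actually used in the study.

```python
PROMPT_S = 5   # "Get ready for the next slide" display time in seconds
PICTURE_S = 6  # duration of the emotional stimulus in seconds

def session_schedule(picture_ids):
    """Build the ordered (phase, picture_id, duration_s) steps of the
    experiment; the SAM rating step is participant-paced, so its
    duration is None. Illustrative sketch of the described procedure."""
    steps = []
    for pid in picture_ids:
        steps.append(("prompt", pid, PROMPT_S))
        steps.append(("picture", pid, PICTURE_S))
        steps.append(("rating", pid, None))  # SAM: valence, arousal, dominance
    return steps
```

For the full experiment, such a generator would be called with the 58 IAPS picture IDs, yielding the 58 prompt/picture/rating sessions in order.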
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="3.2">Results</head><p>Three physiological signals (ECG, EDA and HRV) along with the self-report response values were collected from the six participants. The analysis of the collected data took two parallel approaches. First, the participants' self-report responses to the presented pictures were classified into the four dimensions of emotion (joyful, satisfied, angry and bored) and compared with the values from the original study we replicated. We were able to obtain all six participants' responses on their perceptions of the presented pictures.</p><p>To classify the pictures into the four emotions, the self-report values given by the subjects were taken into consideration. On the 9-point scale, a self-report value higher than 6 for a given picture was considered high, and a value lower than 4 was considered low. When the value fell between 4 and 6, the picture was marked with multiple emotions. For instance, when a picture was rated 8 in arousal and 2 in valence, it was classified as angry, and when a picture was perceived as 4 in arousal and 8 in valence, it was marked as satisfied/joyful, based on the dimensions described in section 2. The overall results show some similar evaluations of the pictures between the original group and our experiment subjects, especially for pictures classified in the angry and joyful dimensions. For example, picture 3100 (burned victim) was reported to stimulate anger by 5 out of 6 participants. Picture 5450 (liftoff) was perceived as a joyful picture by 4 out of 6 participants, which corresponds to the normative study result. For satisfied and bored, however, only one participant indicated boredom for one picture (3230: dying man), which coincided with the study we replicated. Even though similar evaluations of pictures in the satisfied or bored dimensions were observed, the agreement between our subjects and the group from the original study was more obvious for pictures with high arousal (angry or joyful).</p><p>Due to technical problems, the physiological data (HRV and EDA) were successfully retrieved for only 3 of the 6 subjects; the analysis of HRV and EDA therefore focuses on these 3 data sets. For the EDA measured during the experiment, we found individual differences, as shown in figure <ref type="figure" target="#fig_2">3</ref>. The three recorded signals showed that individuals started at different baselines (9-13 microsiemens) and that the range of change differed from person to person. This difference could be due to the random order in which the pictures were presented, to individual differences in perceiving emotional pictures, or to individual differences in EDA; skin conductance is known to differ widely between subjects <ref type="bibr" target="#b3">[Br16]</ref>. Overall, we observed some high peaks in the EDA value when angry or joyful pictures were shown to the participants, as also reported in a previous study <ref type="bibr" target="#b19">[Mc01]</ref>. For instance, subject A rated picture 3100 (burned victim) as inducing anger, and the signal showed a slight peak from 5 to 5.2 microsiemens. Picture 8620 (circus horse) was presented afterwards and was marked as joyful, and the EDA signal shows an upward trend to 5.5 microsiemens. When picture 2070 (baby) was shown to subject A, the subject had a dynamic change in EDA (6.4 to 7 microsiemens), and this picture was reported as joyful. The EDA data of subject B showed its highest peak, 11.2 microsiemens, for picture 7250 (birthday cake), which was perceived as joyful. For subject C, low EDA values were recorded when pictures 2280 (neutral boy), 5480 (fireworks), 7510 (skyscraper), 2780 (actor makeup) and 1930 (shark) were presented; these pictures were considered low or undefined in arousal level (satisfied or bored) by the subject. Similar to subjects A and B, the EDA value of subject C was high when positive emotion was induced. For instance, the highest EDA was observed when picture 1920 (dolphins) was presented, and when pictures 5910 (fireworks), 2070 (baby), 7330 (ice cream), 5020 (flowers) and 7410 (M&amp;M) were presented, subject C had a high EDA signal and the pictures were marked as joyful and satisfied.</p><p>The heart rate variability (HRV) recorded in this experiment was analyzed to check whether irregularity in the heartbeat was observed. According to <ref type="bibr" target="#b12">[HH79]</ref>, HRV is suppressed (the heartbeat becomes regular) when a task is demanding, which can also be interpreted as stressful.</p><p>The HRV signals collected from the subjects had different ranges. For instance, the HRV of subjects A and C ranged from 520 to 850 milliseconds, whereas subject B's range was from 550 to 1100 milliseconds, as shown in figure 2. Subject A showed the highest HRV values for pictures 1710 (3 puppies), 5950 (lightning) and 2070 (baby), all of which were marked as joyful. The lowest HRV was recorded for 7010 (basket) and 7380 (roach pizza), which were rated as angry by the participant. Subject B also had a high peak, close to 1100 milliseconds, when viewing 1710 (3 puppies), which the subject marked as a joyful picture. The lowest peak was observed for 2130 (angry woman), which was marked as neutral but somewhat aroused, tending toward angry. Subject C showed high peaks for 7400 (candy), 7100 (fire hydrant) and 7510 (skyscraper) and low peaks for 7030 (iron), 7380 (roach pizza) and 7010 (basket). High HRV values seem to correlate with positive emotion at rather low arousal, which can be defined as joyful, whereas low HRV corresponds to bored or angry. However, anger and boredom could not be clearly distinguished from the data.</p></div>
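The rating-based classification used in section 3.2 (scores above 6 counted as high, below 4 as low, 4-6 ambiguous, with ambiguous scores yielding multiple candidate emotions) can be sketched as follows. This is a minimal reading of the stated rule, not the exact analysis script used in the study.

```python
def level(score):
    """Bin a 9-point SAM score: above 6 is high, below 4 is low,
    anything from 4 to 6 is ambiguous (rule from section 3.2)."""
    if score > 6:
        return "high"
    if score < 4:
        return "low"
    return "ambiguous"

# Quadrant names for (valence level, arousal level) combinations.
NAME = {("high", "high"): "joyful", ("high", "low"): "satisfied",
        ("low", "high"): "angry", ("low", "low"): "bored"}

def classify(valence, arousal):
    """Classify one picture rating into academic emotions; an ambiguous
    score expands into both of its candidate levels, so the result is a
    set of one to four emotions."""
    expand = lambda lv: ["high", "low"] if lv == "ambiguous" else [lv]
    return {NAME[(v, a)]
            for v in expand(level(valence))
            for a in expand(level(arousal))}
```

With the examples from the text, `classify(2, 8)` (valence 2, arousal 8) gives `{"angry"}`, and `classify(8, 4)` gives `{"joyful", "satisfied"}`, matching the satisfied/joyful marking.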
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="4">Discussion &amp; Outlook</head><p>Awareness of one's cognitive and emotional state (self-awareness) is the first step towards improving the learning process (self-control), and a lack of self-awareness contributes to learning deficiencies <ref type="bibr" target="#b25">[Zi02]</ref>. As emotion plays an important role in learning, our pilot study strove to integrate objective means of detecting emotions using physiological sensors and to relate them to previous findings, in order to further investigate ways of providing awareness back to learners. Specifically, this paper focused on measuring two physiological signals (EDA and HRV) during an emotional-picture experiment and aimed to take a step closer to the detection and interpretation of physiological data on emotion. A clear relation between physiological data and emotions could not be established due to the small sample size, yet changes in the measured signals, such as the EDA data, were observed when joyful and satisfying pictures were presented. Furthermore, from the HRV values we observed a distinction between positive and negative emotions, even though distinguishing within positive emotions (joyful or satisfied) and within negative emotions (angry or bored) was not possible at this stage. The limitations of the study, namely the small sample size and some technical problems, resulted in a small data set for viable data analysis. As a next step, further improvements in hardware and software for accurate detection should be realized, along with an increase in sample size.</p><p>In addition to emotion detection and interpretation using sensors, as a next step in the LISA project we are investigating the ideal way to provide awareness of one's academic emotion, together with recommendations. Based on users' input (a focus group and a workshop), a heuristic evaluation of the prototypes and consideration of a suitable pedagogical approach, we have suggested further steps in <ref type="bibr" target="#b24">[Yu17]</ref>. That study also explained the pedagogical approach, a learning companion, for designing a device that provides awareness to learners in a friendly and intuitive way.</p><p>The learning companion investigated in <ref type="bibr" target="#b24">[Yu17]</ref>, in connection with this pilot study, implies the need for further systematic consideration of four aspects: 1) user experience, 2) pedagogical/instructional support, 3) technical realization and 4) data privacy. The user experience should focus on how to design a learning companion that is trusted and respected by its user <ref type="bibr" target="#b14">[JL16]</ref>. The pedagogical/instructional support will focus on emotional support, which plays a guiding role in knowledge attainment <ref type="bibr" target="#b13">[IF10]</ref>, creativity and problem solving <ref type="bibr" target="#b0">[Ah13]</ref>. The technical realization should consider the feasibility of the concept with respect to finances and hardware and software development, and data privacy will follow the general rule of entrusting all data control to the user, without transferring data to any other medium or cloud without the user's knowledge.</p><p>Research conducted in the LISA project requires a comprehensive investigation not only of the detection and interpretation of sensor data for emotion but also of the means to provide information back to learners in a meaningful, intuitive way. The findings in this paper concern the former part of the overall project and present initial work on emotion detection and interpretation using a sensor device. As work in progress, the findings are promising, since the physiological sensor data were able to distinguish between negative and positive emotions even with the discussed limitations.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="5">Acknowledgement</head><p>This work has been funded by the BMBF project LISA (16SV7534K). </p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head>Bibliography</head></div><figure xmlns="http://www.tei-c.org/ns/1.0" xml:id="fig_0"><head>Fig. 1 :</head><label>1</label><figDesc>Fig. 1: One session with a prompt, a visual stimulus and a rating. The hardware used for the experiment was the Bitalino Plugged Kit with various pluggable sensors (EMG, EDA, ECG and EEG), a status LED and a Bluetooth communication channel. For the purpose of our experiment, only the ECG and EDA sensors were utilized, and the kit was enclosed in the 3D-printed casing, also manufactured by Bitalino, as shown in figure 2. The digitized signal data had a resolution of 10 bits and was transmitted to the receiver via Bluetooth at a sampling rate of 1000 Hz.</figDesc><graphic coords="4,226.68,497.63,141.84,106.80" type="bitmap" /></figure>
<figure xmlns="http://www.tei-c.org/ns/1.0" xml:id="fig_1"><head>Fig. 2 :</head><label>2</label><figDesc>Fig. 2: Bitalino Kit with ECG and EDA sensors plugged in</figDesc></figure>
<figure xmlns="http://www.tei-c.org/ns/1.0" xml:id="fig_2"><head>Fig. 3 :</head><label>3</label><figDesc>Fig. 3: OpenSignals recording session (left), ECG/EDA/HRV visualization utility (right)</figDesc><graphic coords="5,162.35,310.04,126.06,96.20" type="bitmap" /></figure>
<figure xmlns="http://www.tei-c.org/ns/1.0" xml:id="fig_3"><head>Fig. 4 :</head><label>4</label><figDesc>Fig.4: EDA graphs of 3 participants (left: subject A, middle: subject B, right: subject C)</figDesc><graphic coords="7,119.05,290.54,357.25,81.90" type="bitmap" /></figure>
<figure xmlns="http://www.tei-c.org/ns/1.0" xml:id="fig_4"><head>Fig. 5 :</head><label>5</label><figDesc>Fig. 5: Heart Rate Variability graphs of 3 participants (left: subject A, middle: subject B, right: subject C)</figDesc><graphic coords="8,119.05,191.70,357.25,98.30" type="bitmap" /></figure>
<figure xmlns="http://www.tei-c.org/ns/1.0" type="table" xml:id="tab_2"><head></head><label></label><figDesc>[Ar09] Arroyo, I., Cooper, D. G., Burleson, W., Woolf, B. P., Muldner, K., &amp; Christopherson, R.: Emotion Sensors Go To School. Artificial Intelligence in Education: Building Learning Systems That Care: From Knowledge Representation To Affective Modelling, 200, pp. 17-24, 2009. [AFR11] Azcarraga, J., Francis, J., &amp; Robert, I.: Predicting Academic Emotion based on Brainwaves Signals and Mouse Click Behavior. 2011.</figDesc><table /></figure>
		</body>
		<back>
			<div type="references">

				<listBibl>

<biblStruct xml:id="b0">
	<analytic>
		<title level="a" type="main">Emotions, self-regulated learning, and achievement in mathematics: A growth curve analysis</title>
		<author>
			<persName><forename type="first">W</forename><surname>Ahmed</surname></persName>
		</author>
		<author>
			<persName><forename type="first">G</forename><surname>Van Der Werf</surname></persName>
		</author>
		<author>
			<persName><forename type="first">H</forename><surname>Kuyper</surname></persName>
		</author>
		<author>
			<persName><forename type="first">A</forename><surname>Minnaert</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="j">Journal of Educational Psychology</title>
		<imprint>
			<biblScope unit="volume">105</biblScope>
			<biblScope unit="issue">1</biblScope>
			<biblScope unit="page" from="150" to="161" />
			<date type="published" when="2013">2013</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b1">
	<analytic>
		<title level="a" type="main">Towards Selection of Tutorial Actions Using Emotional Physiological Data</title>
		<author>
			<persName><forename type="first">K</forename><surname>Benadada</surname></persName>
		</author>
		<author>
			<persName><forename type="first">S</forename><surname>Chaffar</surname></persName>
		</author>
		<author>
			<persName><forename type="first">C</forename><surname>Frasson</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="m">Wec Its</title>
				<imprint>
			<date type="published" when="2008">2008</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b2">
	<analytic>
		<title level="a" type="main">Towards advanced learner modeling: Discussions on quasi real-time adaptation with physiological data</title>
		<author>
			<persName><forename type="first">E</forename><surname>Blanchard</surname></persName>
		</author>
		<author>
			<persName><forename type="first">P</forename><surname>Chalfoun</surname></persName>
		</author>
		<author>
			<persName><forename type="first">C</forename><surname>Frasson</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="m">Proceedings -The 7th IEEE International Conference on Advanced Learning Technologies</title>
				<meeting>-The 7th IEEE International Conference on Advanced Learning Technologies<address><addrLine>Niigata, Japan</addrLine></address></meeting>
		<imprint>
			<date type="published" when="2007">2007</date>
			<biblScope unit="page" from="809" to="813" />
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b3">
	<analytic>
		<title level="a" type="main">The impact of watching educational video clips on analogue patients&apos; physiological arousal and information recall</title>
		<author>
			<persName><forename type="first">I</forename><forename type="middle">R</forename><surname>Van Bruinessen</surname></persName>
		</author>
		<author>
			<persName><forename type="first">I</forename><forename type="middle">T A</forename><surname>Van Den Ende</surname></persName>
		</author>
		<author>
			<persName><forename type="first">L</forename><forename type="middle">N C</forename><surname>Visser</surname></persName>
		</author>
		<author>
			<persName><forename type="first">S</forename><surname>Van Dulmen</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="j">Patient education and counseling</title>
		<imprint>
			<biblScope unit="volume">99</biblScope>
			<biblScope unit="issue">2</biblScope>
			<biblScope unit="page" from="243" to="249" />
			<date type="published" when="2016">2016</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b4">
	<analytic>
		<title level="a" type="main">Affective Learning Companions : strategies for empathetic agents with real-time multimodal affective sensing to foster meta-cognitive and meta-affective approaches to learning, motivation and perseverance</title>
		<author>
			<persName><forename type="first">W</forename><surname>Burleson</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="j">Diss. Massachusetts Institute of Technology</title>
		<imprint>
			<biblScope unit="page">159</biblScope>
			<date type="published" when="2006">2006</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b5">
	<analytic>
		<title level="a" type="main">Connecting brains and bodies: Applying physiological computing to support social interaction</title>
		<author>
			<persName><forename type="first">G</forename><surname>Chanel</surname></persName>
		</author>
		<author>
			<persName><forename type="first">C</forename><surname>Mühl</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="j">Interacting with Computers</title>
		<imprint>
			<biblScope unit="volume">27</biblScope>
			<biblScope unit="issue">5</biblScope>
			<biblScope unit="page" from="534" to="550" />
			<date type="published" when="2015">2015</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b6">
	<analytic>
		<title level="a" type="main">Empirically building and evaluating a probabilistic model of user affect</title>
		<author>
			<persName><forename type="first">C</forename><surname>Conati</surname></persName>
		</author>
		<author>
			<persName><forename type="first">H</forename><surname>Maclaren</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="j">User Modeling and User-Adapted Interaction</title>
		<imprint>
			<biblScope unit="volume">19</biblScope>
			<biblScope unit="issue">3</biblScope>
			<biblScope unit="page" from="267" to="303" />
			<date type="published" when="2009">2009</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b7">
	<monogr>
		<title level="m" type="main">Flow: The Psychology of Optimal Experience</title>
		<author>
			<persName><forename type="first">M</forename><surname>Csikszentmihalyi</surname></persName>
		</author>
		<imprint>
			<date type="published" when="1990">1990</date>
			<publisher>Harper Perennial</publisher>
			<pubPlace>New York</pubPlace>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b8">
	<analytic>
		<title level="a" type="main">Multimodal semi-automated affect detection from conversational cues, gross body language, and facial features</title>
		<author>
			<persName><forename type="first">S</forename><forename type="middle">K</forename><surname>D'Mello</surname></persName>
		</author>
		<author>
			<persName><forename type="first">A</forename><surname>Graesser</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="j">User Modeling and User-Adapted Interaction</title>
		<imprint>
			<biblScope unit="volume">20</biblScope>
			<biblScope unit="issue">2</biblScope>
			<biblScope unit="page" from="147" to="187" />
			<date type="published" when="2010">2010</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b9">
	<analytic>
		<title level="a" type="main">Fundamentals of physiological computing</title>
		<author>
			<persName><forename type="first">S</forename><forename type="middle">H</forename><surname>Fairclough</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="j">Interacting with Computers</title>
		<imprint>
			<biblScope unit="volume">21</biblScope>
			<biblScope unit="issue">1-2</biblScope>
			<biblScope unit="page" from="133" to="145" />
			<date type="published" when="2009">2009</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b10">
	<analytic>
		<title level="a" type="main">Brains and the dynamics of &quot;wants&quot; and &quot;cans&quot; in learning</title>
		<author>
			<persName><forename type="first">P</forename><surname>van Geert</surname></persName>
		</author>
		<author>
			<persName><forename type="first">H</forename><surname>Steenbeek</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="j">Mind, Brain, and Education</title>
		<imprint>
			<biblScope unit="volume">2</biblScope>
			<biblScope unit="issue">2</biblScope>
			<biblScope unit="page" from="62" to="66" />
			<date type="published" when="2008">2008</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b11">
	<analytic>
		<title level="a" type="main">Open source ECG analysis</title>
		<author>
			<persName><forename type="first">P</forename><surname>Hamilton</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="j">Computers in Cardiology</title>
		<imprint>
			<biblScope unit="page" from="101" to="104" />
			<date type="published" when="2002">2002</date>
			<publisher>IEEE</publisher>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b12">
	<analytic>
		<title level="a" type="main">Heart-rate correlates of childhood activities: play, exploration, problem-solving and day-dreaming</title>
		<author>
			<persName><forename type="first">M</forename><surname>Hughes</surname></persName>
		</author>
		<author>
			<persName><forename type="first">C</forename><surname>Hutt</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="j">Biological Psychology</title>
		<imprint>
			<biblScope unit="volume">8</biblScope>
			<biblScope unit="page" from="253" to="263" />
			<date type="published" when="1979">1979</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b13">
	<analytic>
		<title level="a" type="main">The role of emotion and skilled intuition in learning</title>
		<author>
			<persName><forename type="first">M</forename><forename type="middle">H</forename><surname>Immordino-Yang</surname></persName>
		</author>
		<author>
			<persName><forename type="first">M</forename><surname>Faeth</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="m">Mind, brain, and education: Neuroscience implications for the classroom</title>
		<imprint>
			<date type="published" when="2010">2010</date>
			<biblScope unit="page" from="69" to="83" />
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b14">
	<analytic>
		<title level="a" type="main">Face-to-Face Interaction with Pedagogical Agents, Twenty Years Later</title>
		<author>
			<persName><forename type="first">W</forename><forename type="middle">L</forename><surname>Johnson</surname></persName>
		</author>
		<author>
			<persName><forename type="first">J</forename><forename type="middle">C</forename><surname>Lester</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="j">International Journal of Artificial Intelligence in Education</title>
		<imprint>
			<biblScope unit="volume">26</biblScope>
			<biblScope unit="page" from="25" to="36" />
			<date type="published" when="2016">2016</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b15">
	<analytic>
		<title level="a" type="main">Autonomic nervous system activity in emotion: A review</title>
		<author>
			<persName><forename type="first">S</forename><forename type="middle">D</forename><surname>Kreibig</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="j">Biological Psychology</title>
		<imprint>
			<biblScope unit="volume">84</biblScope>
			<biblScope unit="issue">3</biblScope>
			<biblScope unit="page" from="394" to="421" />
			<date type="published" when="2010">2010</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b16">
	<monogr>
		<author>
			<persName><forename type="first">P</forename><forename type="middle">J</forename><surname>Lang</surname></persName>
		</author>
		<author>
			<persName><forename type="first">M</forename><forename type="middle">M</forename><surname>Bradley</surname></persName>
		</author>
		<author>
			<persName><forename type="first">B</forename><forename type="middle">N</forename><surname>Cuthbert</surname></persName>
		</author>
		<idno>A-8</idno>
		<title level="m">International affective picture system (IAPS): Affective ratings of pictures and instruction manual</title>
				<imprint>
			<date type="published" when="2008">2008</date>
		</imprint>
	</monogr>
	<note type="report_type">Technical Report</note>
</biblStruct>

<biblStruct xml:id="b17">
	<analytic>
		<title level="a" type="main">A fuzzy physiological approach for continuously modeling emotion during interaction with play technologies</title>
		<author>
			<persName><forename type="first">R</forename><forename type="middle">L</forename><surname>Mandryk</surname></persName>
		</author>
		<author>
			<persName><forename type="first">M</forename><forename type="middle">S</forename><surname>Atkins</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="j">International Journal of Human-Computer Studies</title>
		<imprint>
			<biblScope unit="volume">65</biblScope>
			<biblScope unit="page" from="329" to="347" />
			<date type="published" when="2007">2007</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b18">
	<analytic>
		<title level="a" type="main">AffectAura: An Intelligent System for Emotional Memory</title>
		<author>
			<persName><forename type="first">D</forename><surname>McDuff</surname></persName>
		</author>
		<author>
			<persName><forename type="first">A</forename><surname>Karlson</surname></persName>
		</author>
		<author>
			<persName><forename type="first">A</forename><surname>Kapoor</surname></persName>
		</author>
		<author>
			<persName><forename type="first">A</forename><surname>Roseway</surname></persName>
		</author>
		<author>
			<persName><forename type="first">M</forename><surname>Czerwinski</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="m">Proceedings of the 2012 ACM Annual Conference on Human Factors in Computing Systems - CHI &apos;12</title>
		<meeting>the 2012 ACM Annual Conference on Human Factors in Computing Systems - CHI &apos;12</meeting>
		<imprint>
			<date type="published" when="2012">2012</date>
			<biblScope unit="page">849</biblScope>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b19">
	<analytic>
		<title level="a" type="main">Emotional reactions in children: verbal, physiological, and behavioral responses to affective pictures</title>
		<author>
			<persName><forename type="first">M</forename><forename type="middle">H</forename><surname>McManis</surname></persName>
		</author>
		<author>
			<persName><forename type="first">M</forename><forename type="middle">M</forename><surname>Bradley</surname></persName>
		</author>
		<author>
			<persName><forename type="first">W</forename><forename type="middle">K</forename><surname>Berg</surname></persName>
		</author>
		<author>
			<persName><forename type="first">B</forename><forename type="middle">N</forename><surname>Cuthbert</surname></persName>
		</author>
		<author>
			<persName><forename type="first">P</forename><forename type="middle">J</forename><surname>Lang</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="j">Psychophysiology</title>
		<imprint>
			<biblScope unit="volume">38</biblScope>
			<biblScope unit="issue">2</biblScope>
			<biblScope unit="page" from="222" to="231" />
			<date type="published" when="2001">2001</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b20">
	<analytic>
		<title level="a" type="main">Academic Emotions in Students&apos; Self-Regulated Learning and Achievement: A Program of Qualitative and Quantitative Research</title>
		<author>
			<persName><forename type="first">R</forename><surname>Pekrun</surname></persName>
		</author>
		<author>
			<persName><forename type="first">T</forename><surname>Goetz</surname></persName>
		</author>
		<author>
			<persName><forename type="first">W</forename><surname>Titz</surname></persName>
		</author>
		<author>
			<persName><forename type="first">R</forename><forename type="middle">P</forename><surname>Perry</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="j">Educational Psychologist</title>
		<imprint>
			<biblScope unit="volume">37</biblScope>
			<biblScope unit="issue">2</biblScope>
			<biblScope unit="page" from="91" to="105" />
			<date type="published" when="2002">2002</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b21">
	<analytic>
		<title level="a" type="main">Affect-aware tutors: recognising and responding to student affect</title>
		<author>
			<persName><forename type="first">B</forename><surname>Woolf</surname></persName>
		</author>
		<author>
			<persName><forename type="first">W</forename><surname>Burleson</surname></persName>
		</author>
		<author>
			<persName><forename type="first">I</forename><surname>Arroyo</surname></persName>
		</author>
		<author>
			<persName><forename type="first">T</forename><surname>Dragon</surname></persName>
		</author>
		<author>
			<persName><forename type="first">D</forename><surname>Cooper</surname></persName>
		</author>
		<author>
			<persName><forename type="first">R</forename><surname>Picard</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="j">International Journal of Learning Technology</title>
		<imprint>
			<biblScope unit="volume">4</biblScope>
			<biblScope unit="issue">3/4</biblScope>
			<biblScope unit="page" from="129" to="164" />
			<date type="published" when="2009">2009</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b22">
	<analytic>
		<title level="a" type="main">Entertainment capture through heart rate activity in physical interactive playgrounds</title>
		<author>
			<persName><forename type="first">G</forename><forename type="middle">N</forename><surname>Yannakakis</surname></persName>
		</author>
		<author>
			<persName><forename type="first">J</forename><surname>Hallam</surname></persName>
		</author>
		<author>
			<persName><forename type="first">H</forename><forename type="middle">H</forename><surname>Lund</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="j">User Modeling and User-Adapted Interaction</title>
		<imprint>
			<biblScope unit="volume">18</biblScope>
			<biblScope unit="issue">1</biblScope>
			<biblScope unit="page" from="207" to="243" />
			<date type="published" when="2008">2008</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b23">
	<analytic>
		<title level="a" type="main">Improving a Mobile Learning Companion for Self-regulated Learning using Sensors</title>
		<author>
			<persName><forename type="first">H</forename><surname>Yun</surname></persName>
		</author>
		<author>
			<persName><forename type="first">A</forename><surname>Fortenbacher</surname></persName>
		</author>
		<author>
			<persName><forename type="first">N</forename><surname>Pinkwart</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="m">Proceedings of the 9th International Conference on Computer Supported Education</title>
				<meeting>the 9th International Conference on Computer Supported Education</meeting>
		<imprint>
			<publisher>CSEDU</publisher>
			<date type="published" when="2017">2017</date>
			<biblScope unit="volume">1</biblScope>
			<biblScope unit="page" from="531" to="536" />
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b24">
	<analytic>
		<title level="a" type="main">User-Centric Approach to the Design of a Mobile Learning Companion</title>
		<author>
			<persName><forename type="first">H</forename><surname>Yun</surname></persName>
		</author>
		<author>
			<persName><forename type="first">J</forename><forename type="middle">H</forename><surname>Israel</surname></persName>
		</author>
		<author>
			<persName><forename type="first">A</forename><surname>Fortenbacher</surname></persName>
		</author>
		<author>
			<persName><forename type="first">H</forename><surname>Rott</surname></persName>
		</author>
		<author>
			<persName><forename type="first">D</forename><surname>Metzler</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="m">Fachgruppe Be-greifbare Interaktion Workshop</title>
				<imprint>
			<publisher>Mensch und Computer</publisher>
			<date type="published" when="2017">2017</date>
		</imprint>
	</monogr>
	<note>In press</note>
</biblStruct>

<biblStruct xml:id="b25">
	<analytic>
		<title level="a" type="main">Becoming a Self-Regulated Learner: An Overview</title>
		<author>
			<persName><forename type="first">B</forename><forename type="middle">J</forename><surname>Zimmerman</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="j">Theory Into Practice</title>
		<imprint>
			<biblScope unit="volume">41</biblScope>
			<biblScope unit="issue">2</biblScope>
			<biblScope unit="page" from="64" to="70" />
			<date type="published" when="2002">2002</date>
		</imprint>
	</monogr>
</biblStruct>

				</listBibl>
			</div>
		</back>
	</text>
</TEI>
