Affective Computing Experiments in Virtual Reality with Wearable Sensors. Methodological Considerations and Preliminary Results

Grzegorz J. Nalepa 1,2, Jan Argasiński 2, Krzysztof Kutt 1, Paweł Węgrzyn 2, Szymon Bobek 1, and Mateusz Z. Łępicki 1

1 AGH University of Science and Technology, Al. Mickiewicza 30, 30-059 Krakow, Poland
{gjn,kkutt,sbobek,mzl}@agh.edu.pl
2 Jagiellonian University, Ul. Gołębia 24, 31-007 Krakow, Poland
{grzegorz.j.nalepa,jan.argasinski,pawel.wegrzyn}@uj.edu.pl

Keywords: affective computing, virtual reality, affect metrics, mobile devices.

1 Introduction

Affective computing (AfC) is a novel paradigm originally proposed in 1997 by Rosalind Picard from the MIT Media Lab in her paramount book [12]. It builds on the results of biomedical engineering and psychology and aims at allowing computer systems to detect, use, and express emotions [4]. While at first sight it may look general from the computer science point of view, it is in fact a constructive and practical approach oriented mainly at improving human-like decision support as well as human-computer interaction. AfC is a field of study concerned with the design and description of systems that are able to collect, interpret, and process (ultimately, simulate) emotional states (affects). We assume that emotions are physical and cognitive [12, p. 21] and as such can be studied in an interdisciplinary manner by computer science, biomedical engineering and psychology. For affective computing there are two crucial elements to be considered: the modes of data collection and the ways of interpreting the collected data in correlation with affective states corresponding to emotions. The first is carried out by selecting methods for detecting information about emotions, which means using various sensors that capture data about human physical states and behaviors.
Today the most often harvested and processed information concerns: speech (prosody: pitch variables, speech rate), body gestures and poses (3D mapping, motion capture techniques), facial expressions (visual analysis and electromyography), and physiological monitoring (blood pressure, blood volume pulse, galvanic skin response). In our work we plan to use a range of wearable physiological sensors, namely the Empatica E4. It is an advanced sensor based on the technologies previously developed in the Affective Computing division of the MIT Media Lab 3. Moreover, it has been used in a number of works [5,6,9,10,13,16,17].

3 See http://affect.media.mit.edu/projectpages/iCalm/iCalm-2-Q.html

The second crucial element of the affective computing paradigm relies on the application of selected algorithms to the acquired data in order to develop models of interpretation for affective states. State-of-the-art methodologies assume the use of the full range of available methods of data classification and interpretation.

In this paper we discuss selected important challenges in designing experiments that lead to the collection of data and information on the affective states of participants. We aim at acquiring data that would be the basis for formulating and evaluating computational methods for the detection, identification, and interpretation of such affective states, and ultimately human emotions.

This work is the result of cooperation between two research teams: one from AGH UST led by Grzegorz J. Nalepa, and the second from the Department of Games Technology led by Paweł Węgrzyn. Early ideas were originally presented in the project proposal entitled Knowledge-based models for affective context-aware systems (KAFXS) written by Grzegorz J. Nalepa in cooperation with Jan K. Argasiński and Piotr Augustyniak. They were later developed and focused on the development of the practical experiments presented in this paper.
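To make the second element more concrete, the following sketch illustrates one possible shape of such a pipeline: extracting simple features from windows of heart rate and GSR samples and applying a toy classification rule. The feature set, thresholds and labels are assumptions for the sake of illustration, not the method used in this work.

```python
# Illustrative sketch only: the paper does not prescribe a concrete
# classification pipeline. The window features, thresholds and labels
# below are assumptions chosen for the example.

def extract_features(hr, gsr):
    """Compute simple per-window features from heart rate (bpm)
    and galvanic skin response (microsiemens) samples."""
    mean_hr = sum(hr) / len(hr)
    mean_gsr = sum(gsr) / len(gsr)
    # crude phasic-activity estimate: count upward GSR deflections
    gsr_peaks = sum(1 for a, b in zip(gsr, gsr[1:]) if b - a > 0.05)
    return {"mean_hr": mean_hr, "mean_gsr": mean_gsr, "gsr_peaks": gsr_peaks}

def toy_arousal_label(features, hr_rest=70.0):
    """Toy rule mapping window features to a coarse arousal label."""
    if features["mean_hr"] > hr_rest + 10 and features["gsr_peaks"] >= 3:
        return "high-arousal"
    return "low-arousal"

calm = extract_features([68, 70, 69, 71], [1.0, 1.01, 1.0, 1.02])
scare = extract_features([88, 92, 95, 97], [1.0, 1.2, 1.1, 1.3, 1.2, 1.4])
print(toy_arousal_label(calm))   # low-arousal
print(toy_arousal_label(scare))  # high-arousal
```

In practice such hand-written rules would be replaced by trained classifiers operating on the full range of available signals, as discussed above.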
2 Challenges in Designing Practical AfC Experiments

To design and conduct practical AfC experiments that could deliver valuable affective data, a number of challenges have to be addressed.

Defining emotion is a difficult problem. Without a consensual conceptualization and operationalization of exactly what phenomenon is to be studied, progress in theory and research is hard to achieve and fruitless debates are likely to proliferate. Modern theories of emotion have their origin in the 19th century, when William James theorized about affects in terms of reactions to stimuli. He was a precursor of appraisal theory, which is among the most popular in the community of computational emotion modeling [1], [8], [15]. One of the most popular appraisal theories is OCC [11], which categorizes emotions on the basis of the appraisal of pleasure/displeasure (valence) and intensity (arousal). These are quantifiable values that can be measured and processed to ascribe different kinds of emotions.

First of all, in our work we initially assume that emotions are the results of non-cognitive processes, as James proposed. More specifically, we aim at building on the somatic feedback theory of emotions from Jesse Prinz [14], which is based on the James-Lange theory. It is assumed that embodied appraisals are manifested by the body and can be detected and measured. In fact, Prinz proposes a concept of core relational themes that could possibly be identified as patterns in data using data mining. Therefore, we assume that we need to measure a number of bodily signals to detect and identify emotions.

We began our work with an in-depth analysis of selected works in experimental psychology and biological psychology. A survey paper [7] provided us with a pool of papers referring to the activity of the autonomic nervous system (ANS) which, in line with our methodological assumptions, should be viewed as a major component of the emotion response.
The paper provides a review of 134 publications that report experimental investigations of emotional effects on peripheral physiological responding in healthy individuals. The results suggest important ANS response specificity in emotion when considering subtypes of distinct emotions. Moreover, some terminological assumptions are given. From our perspective, and considering our needs, the review turned out to be mostly inconclusive and provided little support in designing our experiments.

Some of the main challenges are related to:

1. the use of discrete (e.g. Ekman's faces) vs. continuous (e.g. Russell's Circumplex) models of emotions,
2. the distinction between basic emotions and high/social emotions: naming emotions involves a lot of conceptualization and high-level semantics, which needs to be minimized,
3. the experimental assumptions differ importantly, thus the comparability of results is questionable,
4. the individual and cultural differences of participants, which need to be considered in some cases,
5. how to evoke emotions in an experimental setup, and what the optimal stimuli are,
6. which bodily signals should be used, and how and what to measure (e.g. only HR and GSR values),
7. the role of user questionnaires; in fact, people need semantics (names of emotions), not numbers (e.g. Valence/Arousal or HR values).

There are also some other practical challenges that we will need to address in the near future. They include (but are not limited to):

1. the quality of data from mobile devices: are the results from bands reliable (is the raw signal available?), and there is a need for cross-validation,
2. the volatile hardware market: devices change or get discontinued (e.g. Microsoft Band 2),
3. a reliable hardware and software setup,
4. data synchronization across several devices.

Finally, an important challenge remains: a synthetic way of reporting the measure of emotion. There are a number of approaches to do it. Some of them simply assume reporting Valence/Arousal values.
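The semantics problem noted above can be illustrated by the simplest reporting approach: collapsing numeric Valence/Arousal readings into coarse quadrant labels of Russell's Circumplex. This is a hedged sketch; the quadrant names and the [-1, 1] scaling are assumptions for the example, not the metric proposed in this work.

```python
# Hedged illustration: mapping numeric Valence/Arousal values onto
# coarse quadrant labels of Russell's Circumplex model. The label
# names and the [-1, 1] range are assumptions for this sketch.

def circumplex_quadrant(valence, arousal):
    """Return a coarse emotion-quadrant label for (valence, arousal),
    both expected in [-1, 1]."""
    if valence >= 0 and arousal >= 0:
        return "excited/happy"
    if valence < 0 and arousal >= 0:
        return "distressed/angry"
    if valence < 0:
        return "depressed/sad"
    return "relaxed/calm"

print(circumplex_quadrant(0.7, 0.5))   # excited/happy
print(circumplex_quadrant(-0.6, 0.8))  # distressed/angry
```

Such a mapping gives names rather than numbers, but only at a very coarse granularity, which motivates the richer metrics discussed next.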
We are dedicated to delivering more synthetic methods that would combine data measurements with user reports. Therefore, we are working on certain affective metrics that would include both the measurements of bodily responses and the report and stimulus evaluation by participants. Some other related works include the so-called Emotional Index [18].

3 Methods and Tools Used

The first aim of the cooperation of our two research groups is to provide:

1. an integrated sensory framework which will use wearable physiological and biomedical hardware sensors for the detection of user affects,
2. computational models for affect identification and interpretation.

To deliver solutions to the first objective we need to use small mobile devices to acquire sensory data. In our current experiments we selected 3 such devices.

Empatica E4. An advanced sensory wristband based on the technologies previously developed in the Affective Computing division of the MIT Media Lab. Blood volume pulse and galvanic skin response sensors, as well as an infrared thermopile and an accelerometer, are on board. It has already been used in a number of works [5,6,9,10,13,16,17].

Microsoft Band 2. A band developed mainly for tracking fitness goals. It is equipped with optical heart rate and skin temperature sensors, an accelerometer, and a galvanic skin response (GSR) sensor, available through a well-documented Software Development Kit (SDK).

e-Health Sensor Platform. An open medical monitoring platform supervised by Cooking Hacks. It is a shield for Arduino/Raspberry Pi and a set of sensors that can be plugged in: pulse, oxygen in blood (SpO2), airflow (breathing), body temperature, electrocardiogram (ECG), glucometer, galvanic skin response (GSR), blood pressure (sphygmomanometer), patient position (accelerometer) and muscle/electromyography (EMG) sensors. Thanks to being built on Arduino, this solution can be combined with various devices and installations.

We are using different data acquisition methods for these three devices.
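As an illustration of how readings from an Arduino-based shield such as the e-Health platform can be ingested over a serial link, the sketch below parses timestamped samples from text lines. The line format "HR:72,GSR:1.23,TEMP:36.5" is hypothetical (the actual format depends on the Arduino sketch used), and reading from a real port would use a library such as pySerial instead of the sample string shown here.

```python
# Sketch of the serial-port acquisition step for an Arduino-based
# sensor shield. The "KEY:VALUE,KEY:VALUE" line format is a
# hypothetical example, not the platform's actual output format.
import time

def parse_sample(line):
    """Parse one comma-separated KEY:VALUE line into a timestamped dict."""
    sample = {"t": time.time()}  # local receive timestamp
    for field in line.strip().split(","):
        key, value = field.split(":")
        sample[key.lower()] = float(value)
    return sample

s = parse_sample("HR:72,GSR:1.23,TEMP:36.5")
print(s["hr"], s["gsr"], s["temp"])  # 72.0 1.23 36.5
```

Attaching a local timestamp at receive time, as above, is also the basis for the cross-device synchronization mentioned among the practical challenges.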
Basically, for the E4 we use the mobile applications delivered by Empatica (however, on a desktop system), for the Microsoft Band 2 we developed our own mobile app, and for e-Health we use a basic setup for serial port data acquisition.

Furthermore, sensor recordings are combined with user surveys (see Figure 1) prepared to determine which emotions were experienced during each stage of a prepared experiment. In the initial phase of our research we prepared and conducted several in-lab experiments to verify our assumptions. The studies were conducted with the use of Virtual Reality (VR) via an Oculus headset to provide emotional stimuli for a more immersive user experience. Three experiments were conducted to collect data for preliminary analysis as well as for the comparison of the signals collected by different devices.

[Figure 1: the survey form asks "How do you feel while watching the movie?" on two scales (unpleasantly / neutral / pleasantly and boredom / neutral / fascination) and "What do you feel?" with a choice of: Sadness, Joy, Rage, Anxiety, Surprise, Fear, Irritation, Shame, Contempt, Guilt, Disgust, Pleasure, Desperation, Pride, or NONE OF THE ABOVE.]

Fig. 1. Survey presented for each stage of the experiment (English adaptation for the paper; during the experiments the Polish version was used). In addition, the participant was able to describe the emotions in his own words below.

4 Preliminary Experiments

All of the experiments were prepared and conducted by the authors in the Department of Games Technology at Jagiellonian University in July, September, and October 2016.

Experiment 1: Secondary school students. The first experiment was conducted in July as a part of a holiday camp for secondary school students. 7 pupils (all male) were connected to three devices (Empatica E4, Microsoft Band 2, e-Health Sensor Platform) during the test procedure. Signals measured by all devices: Heart Rate (HR), Galvanic Skin Response (GSR), Skin Temperature.
The experiment consisted of 4 stages that took place in a Virtual Reality (VR) environment: (a) an introductory movie without interaction, (b) three tasks with an architecture tool (changing colors or moving furniture around), (c) a jump scare, (d) a movie without interaction to calm down the participants.

Experiment 2: The Scientists' Night. The second study was conducted during the Scientists' Night at Jagiellonian University in September. More than 100 participants took part in a between-subject procedure: each participant was connected to only one of the three devices (Empatica E4, Microsoft Band 2, e-Health Sensor Platform) during the experiment. This plan was adopted due to the nature of the event: a lot of people and chaos did not give the possibility of conducting a within-subject study as before. Signals measured by all devices: HR, GSR. During the experiment, 6 participants were at the scene: three were watching the movie in VR and another three were dealing with the paperwork (the agreement to participate in the study before the movie, the survey after the movie). In order to simplify and shorten the procedure, only one movie, approx. 5 minutes long, was presented.

Experiment 3: The Scientists' Night follow-up study. The third study was performed with the same procedure and tools as during the Scientists' Night, but carried out in a calm, slow and accurate manner. The planned group is 30 persons (10 persons × 3 devices).

The data from these experiments is currently being analyzed. The results will be presented during the workshop.

5 Future Works

During the AfCAI workshop in Murcia we presented the experiments in more detail and shared the experience we gained.
Our next steps for future work include the cross-validation and evaluation of mobile band data, the evaluation of the restriction to HR and GSR signals, provisioning several configurations of the emotion-evoking setup, the combination of VR experiments with odor stimuli, the delivery of several verified data sets for further analysis, and finally positioning and constraining the research within specific applications and projects.

Furthermore, we aim at developing a data acquisition layer integrated with our recent solutions [2,3] in the area of mobile context-aware systems. We proposed methods for improving knowledge management for imperfect or incomplete context that allow for modeling the dynamics of uncertainty and provide efficient reasoning under incomplete or missing data, as well as new modeling and context processing methods that improve the system's capability for self-adaptation in dynamic mobile environments. The rapidly changing and uncertain nature of the physiological context, and the need to explain the system behavior to the user, make the deliverables of our previous research fit the AfCAI motivations perfectly.

Acknowledgments

The authors would like to thank the participants of the Cognitive Science Seminar for Philosophy PhD Students in spring 2016. Moreover, special thanks go to Dr Michał Klincewicz for his remarks on methods in experimental psychology. Finally, we would like to thank our students who helped us with the experiments and data treatment, especially the AfC group in the KECS course in 2016.

References

1. Arnold, M.B.: Emotion and personality. Columbia University Press (1960)
2. Bobek, S., Nalepa, G.J.: Uncertainty handling in rule-based mobile context-aware systems. Pervasive and Mobile Computing (2016), http://www.sciencedirect.com/science/article/pii/S1574119216302115
3. Bobek, S., Nalepa, G.J.: Uncertain context data management in dynamic mobile environments.
Future Generation Computer Systems 66, 110–124 (2017), http://www.sciencedirect.com/science/article/pii/S0167739X1630187X
4. Calvo, R.A., D'Mello, S.K., Gratch, J., Kappas, A. (eds.): The Oxford Handbook of Affective Computing. Oxford Library of Psychology, Oxford University Press (2015)
5. Doty, T.J., Kellihan, B., Jung, T.P., Zao, J.K., Litvan, I.: The wearable multimodal monitoring system: A platform to study falls and near-falls in the real-world. In: International Conference on Human Aspects of IT for the Aged Population. pp. 412–422. Springer (2015)
6. Hernandez, J., McDuff, D.J., Picard, R.W.: Bioinsights: Extracting personal data from still wearable motion sensors. In: 2015 IEEE 12th International Conference on Wearable and Implantable Body Sensor Networks (BSN). pp. 1–6. IEEE (2015)
7. Kreibig, S.D.: Autonomic nervous system activity in emotion: A review. Biological Psychology 84(3), 394–421 (2010), http://www.sciencedirect.com/science/article/pii/S0301051110000827 (The biopsychology of emotion: Current theoretical and empirical perspectives)
8. Lazarus, R.S.: Psychological stress and the coping process. McGraw-Hill (1966)
9. Muaremi, A., Arnrich, B., Tröster, G.: Towards measuring stress with smartphones and wearable devices during workday and sleep. BioNanoScience 3(2), 172–183 (2013)
10. Müller, S.C., Fritz, T.: Stuck and frustrated or in flow and happy: Sensing developers' emotions and progress. In: Proceedings of the 37th International Conference on Software Engineering, Volume 1. pp. 688–699. IEEE Press (2015)
11. Ortony, A., Clore, G., Collins, A.: The cognitive structure of emotions. Cambridge University Press (1988)
12. Picard, R.W.: Affective Computing. MIT Press (1997)
13. Picard, R.W.: Recognizing stress, engagement, and positive emotion. In: Proceedings of the 20th International Conference on Intelligent User Interfaces. pp. 3–4. ACM (2015)
14. Prinz, J.J.: Gut Reactions. A Perceptual Theory of Emotion. Oxford University Press (2006)
15.
Scherer, K.R.: Appraisal theory. In: Dalgleish, T., Power, M.J. (eds.) Handbook of cognition and emotion. pp. 637–663. John Wiley and Sons (1999)
16. Sano, A., Picard, R.W.: Stress recognition using wearable sensors and mobile phones. In: Affective Computing and Intelligent Interaction (ACII), 2013 Humaine Association Conference on. pp. 671–676. IEEE (2013)
17. Song, M., DiPaola, S.: Exploring different ways of navigating emotionally-responsive artwork in immersive virtual environments. In: Proceedings of the Conference on Electronic Visualisation and the Arts. pp. 232–239. British Computer Society (2015)
18. Vecchiato, G., Cherubino, P., Maglione, A.G., Ezquierro, M.T.H., Marinozzi, F., Bini, F., Trettel, A., Babiloni, F.: How to measure cerebral correlates of emotions in marketing relevant tasks. Cognitive Computation 6(4), 856–871 (2014), http://dx.doi.org/10.1007/s12559-014-9304-x