Human-Robot Interactions Using Affective Computing
Anthony Tzes
Electrical Engineering & Center for Artificial Intelligence and Robotics, New York University Abu Dhabi, A1-193, P.O. Box 129188, UAE


Abstract

Affective human-robot interaction (HRI) is quite complex, since the robot interacts not only with the human but also with the environment. Providing robots with emotional intelligence is critical in this field, but achieving public acceptance of, and trust in, robots is a further challenge. Robots should infer and interpret human emotions and behave in a trusted way that ensures safety. Since affective HRI aims at developing systems that use emotions, it requires knowledge from fields such as computer science, psychology, and cognitive science. An affective autonomous robot interacts with humans using affective technologies to detect emotions. Although a typical robot platform embeds several capabilities, such as perception, decision-making, and action, it remains quite difficult to detect human emotions and to behave in a reassuring manner.

Keywords

Human-robot interaction, affective computing, wireless sensor networks, EEG.



CIKM'22: 31st ACM International Conference on Information and Knowledge Management, October 17–21, 2022, Atlanta, GA (companion volume).
anthony.tzes@nyu.edu (A. Tzes); ORCID: 0000-0003-3709-2810 (A. Tzes)
© 2022 Copyright for this paper by A. Tzes. Use permitted under Creative Commons License Attribution 4.0 International (CC BY 4.0). CEUR Workshop Proceedings (CEUR-WS.org, ISSN 1613-0073)

1. Introduction

The impact of affective computing and robots can be examined in the context of a smart house application. In a smart house [1] there is interaction with a wide variety of systems, ranging from smart devices to robotic mechanisms. Such interactions have altered the objective of the house itself as a prime place to relax and unwind. Adding several smart devices to our environment without any synchronization between them, or without planning their integrated use, can have a negative impact, manifested mainly as anxiety, stress, and even insecurity. On the other hand, a properly scheduled and coordinated environment, or, equivalently, a smart house ecosystem, can significantly reduce stress and in general contribute to a higher quality of life. This happens only when the individual smart devices of a house ecosystem work seamlessly and in coordination in the background, taking into consideration the house occupants and not vice versa.

Among the major indicators of the well-being of the human occupants of a smart house is calmness, defined as the state of mind having low arousal and valence [2]. Since calmness implies relatively low brain activity, it can be clearly identified using EEG [3, 4, 5] or through measurements related to ego-sensor data (smartwatch [6], smartphone [7]). A smart house seeks to provide an environment that increases calmness [8] by sensing several related intrinsic parameters (temperature [9], illumination [10], sound [11], etc.) and providing the necessary outputs (heating, ventilation and air conditioning; light on/off state; loudspeaker music; etc.).

When it comes to affective computing considerations, the principal concern is designing and building systems and environments where the HRI is smooth and human-centered [12]. This includes building machines that can sense and react to human emotions, but also machines that are reassuring, trusted, and considered safe by the public.

2. HRI: The case of a smart house

Our work presents the creation of an integrated environment that provides the foundation for a Smart house computing experimental platform. The experimental study enhances the frequent operations encountered in a smart house by monitoring its state using a wireless sensor network [13] and mobile robots [14]. This work describes the developed HRI testbed shown in Figure 1, indicating the following technologies that have been integrated into the Smart house platform:

     • A Media Server attached to a dedicated computer (Intel i7-NUC).
     • A supervising Data server (Intel i7-NUC) running Ubuntu 16.04, which infers the human's calm state based on a 10-second sliding window of EEG readings.
     • A portable EEG device: the human brain activity is measured using an inexpensive yet reliable portable EEG device. In this study the users' brain activity is used to validate the effect of various stimuli in a smart home towards the achieved calmness. An Emotiv EPOC+ EEG device [15], which transmits brain signals to a computer via Bluetooth, is used. It measures the brain waves of the human wearing the device and transmits the user's emotional state.








Figure 1: Human-robot interaction testbed.


     • A suite of sensors that monitor the environment's status (sound, CO, humidity, temperature, etc.); these sensors are wirelessly connected to the supervising Data server.

The utilized sensors include:

     • A smartwatch (Samsung Galaxy Watch Active 2) running Tizen OS, which measures the heart rate of its user every 10 sec.
     • An attention-inference device in the form of an Android application running on a smartphone, which detects the human's motion [1-bit word] and the call's state (Idle, Calling, Ringing) [2-bit word] every 5 seconds.
     • A smart house monitoring device (Libelium Waspmote with plug-and-play sensors) measuring: i) carbon monoxide (every 60 sec), ii) temperature (every 5 sec), iii) atmospheric pressure (every 5 sec), iv) humidity (every 5 sec), v) illuminance (every 5 sec), and vi) luminosity (every 5 sec).
     • A sound sensor (microphone) connected to an Odroid XU-4 embedded microcontroller, which monitors the power spectrum of the surrounding sound (over a 10 sec sliding window) and wirelessly transmits its normalized values [0 (noiseless) up to 1 (loud)] to the server.
     • A spherical camera (Ricoh Theta V) that streams video at 4K resolution to the data server; this camera is mounted on the mobile robot and monitors the surrounding space.

Moreover, a Google hub device acts as a data query and actuation server and sends event-like (on/off) commands to: a) a heat-adjustment device (air cooler) for regulating the temperature, b) smart power outlets that connect WiFi RGB light bulbs and other devices that affect the surrounding illuminance, and c) a Bluetooth-enabled loudspeaker device for playing streaming audio.

Finally, a mobile ground robot (Robotis' TurtleBot 3 [16]) is controlled by an Intel i7-NUC with considerable number-crunching capabilities. This computer is connected to the OpenCR (Cortex-M7) board and runs ROS [17]. The robot is equipped with a 360° line LiDAR that detects obstacles anywhere within 12–350 cm at a 1° angular resolution. This 2D LiDAR is used for Hector SLAM [18] and obstacle avoidance. The mobile robot should not attract additional attention while navigating its path within the smart house. For this reason, the robot should stay outside the Field of View of the humans, which is monitored by an IMU mounted alongside the EEG device.

3. Affective computing for robot applications

Humans living in an environment can perform perceptual, spatial, motor, and cognitive activities.
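The Data server's calm-state inference described in Section 2 operates on a 10-second sliding window of EEG readings, with calmness associated with low arousal and valence. A minimal sketch of such a window-based classifier is given below; the sampling rate, the frequency bands, and the beta/alpha threshold heuristic are illustrative assumptions, not parameters of the actual system.

```python
import numpy as np

FS = 128       # assumed EEG sampling rate (Hz)
WINDOW_S = 10  # 10-second sliding window, as used by the Data server

def band_power(window, fs, lo, hi):
    """Mean spectral power of a 1-D EEG window in the [lo, hi] Hz band."""
    freqs = np.fft.rfftfreq(len(window), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(window)) ** 2
    mask = (freqs >= lo) & (freqs <= hi)
    return psd[mask].mean()

def is_calm(window, fs=FS, threshold=1.0):
    """Label a window as calm when the beta/alpha power ratio is low.

    A low beta/alpha ratio is a common low-arousal heuristic; the
    threshold here is illustrative, not the system's tuned value.
    """
    alpha = band_power(window, fs, 8.0, 12.0)    # alpha band: 8-12 Hz
    beta = band_power(window, fs, 13.0, 30.0)    # beta band: 13-30 Hz
    return beta / (alpha + 1e-12) < threshold

# Sliding-window evaluation over a signal buffer (synthetic data here),
# re-evaluating the calm state once per second:
buffer = np.random.randn(60 * FS)
hop = FS
states = [is_calm(buffer[i:i + WINDOW_S * FS])
          for i in range(0, len(buffer) - WINDOW_S * FS + 1, hop)]
```

In the testbed, each window's label would then drive the actuation decisions (lighting, audio, temperature) rather than being collected into a list as in this sketch.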



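The Field-of-View avoidance described in Section 2, where the robot stays out of the human's view using the head orientation reported by the IMU mounted with the EEG device, can be approximated by a simple horizontal view-cone test. The cone half-angle and the planar world-frame geometry below are illustrative assumptions, not values from the testbed.

```python
import math

def in_field_of_view(human_xy, head_yaw, robot_xy, half_angle_deg=60.0):
    """Return True when the robot lies inside the human's horizontal view cone.

    head_yaw is the gaze direction (radians, world frame) taken from the
    headset IMU; the 120-degree total cone is an assumed value.
    """
    dx = robot_xy[0] - human_xy[0]
    dy = robot_xy[1] - human_xy[1]
    bearing = math.atan2(dy, dx)  # direction from human to robot
    # Wrap the angular difference into (-pi, pi] before comparing:
    diff = (bearing - head_yaw + math.pi) % (2 * math.pi) - math.pi
    return abs(diff) <= math.radians(half_angle_deg)

# The navigation layer would veto or re-plan waypoints for which this
# test returns True, e.g. a robot 2 m ahead and slightly to the side:
robot_visible = in_field_of_view((0.0, 0.0), 0.0, (2.0, 0.5))
```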
In real life, these activities are interleaved, creating complex real-life situations. We generated several scenarios that consist of different combinations of such activities, executed the scenarios in our smart house platform prototype, and checked the human reaction using the EEG. Our initial results show that the user's emotions (calmness) are strongly influenced by the scheduling of the activities. More experiments need to be conducted to examine how user behaviour is influenced in different situations, such as the simultaneous processing of cues and situations with low or high arousal.

Several research directions can be followed based on the above platform. An interesting problem to examine is the use of an AI-based scheduler trained to the needs of the user. The problem of smart home scheduling has been examined mainly in the context of controlling appliances for efficient energy consumption [19].

Social robots, shown in Figure 2, have been used for a variety of applications. In [20], the major fields of application for social robotics, which include companionship, healthcare, and education, are investigated. Furthermore, the incorporation of social attributes into the HRI and the social effects of these robots are highlighted.

Figure 2: Commercial Social Robots.

For example, in the education field social robots have been introduced for children's education. In [21], social robots introduce a new perspective in understanding children's learning. Robots are equipped with several sensors, and analysis of the data collected during their interaction with children can provide insights into the learning process. An interesting result on HRI in the case of children is presented in [22], where the authors use a NAO humanoid robot as a handwriting partner to teach children how to write.

In some cases the results of using social robots are not so encouraging. Such a case can be seen in [23], where the authors examined the literature on using social robots for mental-health interventions, i.e. for improving depression, and concluded that the research results have low internal and external validity. HRIs in social robotics can be remote or proximate. The problem of proximate interactions affects the Traits, Attitudes, Moods and Emotions (TAME) of humans. Examples of proximate activities between humans and robots can be as simple as the handover of an item or as complicated as a joint surgery. Human expectations and the building of trust when considering robot errors are of paramount importance, as explained in [24].

In our ongoing research, we are interested in proximate HRI [25], where humans interact with colocated robots. This interaction affects sociability because of the robot's functionality. Proximate HRI includes the social, emotive, and cognitive capabilities of this interaction.

The robot's architecture is modified to account for the underlying affective models. Herein, the TAME framework [26, 27] is adopted to facilitate the overall HRI. Self-assessments, psychometric tests, and ongoing studies involving the Negative Attitudes toward Robots Scale (NARS) will be used to evaluate the HRI. Figure 3 indicates a mobile robot in our smart house that moves away from the human's Field of View in order not to affect NARS.

Figure 3: Robot's maneuver to decrease NARS.

4. Conclusions

It is evident that in the field of HRI there is a challenge that needs to be addressed: how to add characteristics and emotional intelligence to machines and environments so that the interactions with humans are intuitive, smooth, natural, and trusted. This paper presented the development of a platform that encompasses several application fields and identifies future research issues related to machines, emotional intelligence, and trust.






References

 [1] A. Tsoukalas, P. S. Annor, E. Kafeza, A. Tzes, IoT enhancements for an in-house calm computing environment, in: 2022 8th International Conference on Automation, Robotics and Applications (ICARA), IEEE, 2022, pp. 239–242.
 [2] S. M. Alarcao, M. J. Fonseca, Emotions recognition using EEG signals: A survey, IEEE Transactions on Affective Computing 10 (2017) 374–393.
 [3] H. Becker, J. Fleureau, P. Guillotel, F. Wendling, I. Merlet, L. Albera, Emotion recognition based on high-resolution EEG recordings and reconstructed brain sources, IEEE Transactions on Affective Computing 11 (2017) 244–257.
 [4] R. N. Khushaba, L. Greenacre, S. Kodagoda, J. Louviere, S. Burke, G. Dissanayake, Choice modeling and the brain: A study on the electroencephalogram (EEG) of preferences, Expert Systems with Applications 39 (2012) 12378–12388.
 [5] S. Grissmann, M. Spüler, J. Faller, T. Krumpe, T. O. Zander, A. Kelava, C. Scharinger, P. Gerjets, Context sensitivity of EEG-based workload classification under different affective valence, IEEE Transactions on Affective Computing 11 (2017) 327–334.
 [6] J. C. Quiroz, E. Geangu, M. H. Yong, Emotion recognition using smart watch sensor data: Mixed-design study, JMIR Mental Health 5 (2018).
 [7] G. Meinlschmidt, J.-H. Lee, E. Stalujanis, A. Belardi, M. Oh, E. K. Jung, H.-C. Kim, J. Alfano, S.-S. Yoo, M. Tegethoff, Smartphone-based psychotherapeutic micro-interventions to improve mood in a real-world setting, Frontiers in Psychology 7 (2016).
 [8] C. Kapelonis, A calm house is a smart house, Research report, MIT Media Lab, 2018.
 [9] F. Barbosa Escobar, C. Velasco, K. Motoki, D. V. Byrne, Q. J. Wang, The temperature of emotions, PLoS One 16 (2021).
[10] R. Kaplan, S. Kaplan, The Experience of Nature: A Psychological Perspective, Cambridge University Press, 1989.
[11] J. J. Alvarsson, S. Wiens, M. E. Nilsson, Stress recovery during exposure to nature sound and environmental noise, International Journal of Environmental Research and Public Health 7 (2010) 1036–1046.
[12] M. Spezialetti, G. Placidi, S. Rossi, Emotion recognition for human-robot interaction: Recent advances and future perspectives, Frontiers in Robotics and AI 7 (2020) 532279.
[13] H. Ghayvat, S. Mukhopadhyay, X. Gui, N. Suryadevara, WSN- and IoT-based smart homes and their extension to smart buildings, Sensors 15 (2015) 10350–10379.
[14] S. M. Nguyen, C. Lohr, P. Tanguy, Y. Chen, Plug and play your robot into your smart home: Illustration of a new framework, KI-Künstliche Intelligenz 31 (2017) 283–289.
[15] M. Strmiska, Z. Koudelkova, Analysis of performance metrics using Emotiv EPOC+, in: MATEC Web of Conferences, EDP Sciences, 2018.
[16] P. M. de Assis Brasil, F. U. Pereira, M. A. d. S. L. Cuadros, A. R. Cukla, D. F. T. Gamarra, Dijkstra and A∗ algorithms for global trajectory planning in the TurtleBot 3 mobile robot, in: International Conference on Intelligent Systems Design and Applications, Springer, 2020, pp. 346–356.
[17] A. Koubâa, et al., Robot Operating System (ROS), Springer, 2017.
[18] W. Weichen, B. Shirinzadeh, M. Ghafarian, S. Esakkiappan, T. Shen, Hector SLAM with ICP trajectory matching, in: IEEE/ASME AIM, IEEE, 2020, pp. 1971–1976.
[19] K. Salameh, M. Awad, A. Makarfi, A.-H. Jallad, R. Chbeir, Demand side management for smart houses: A survey, Sustainability 13 (2021).
[20] A. Washburn, A. Adeleye, T. An, L. D. Riek, Robot errors in proximate HRI: How functionality framing affects perceived reliability and trust, ACM Transactions on Human-Robot Interaction 9 (2020).
[21] W. Johal, Research trends in social robots for learning, Current Robotics Reports 1 (2020) 75–83.
[22] D. Hood, S. Lemaignan, P. Dillenbourg, When children teach a robot to write: An autonomous teachable humanoid which uses simulated handwriting, in: ACM/IEEE HRI, ACM, New York, NY, USA, 2015, pp. 83–90.
[23] B. S. de Araujo, M. Fantinato, S. M. Peres, R. C. de Melo, S. S. T. Batistoni, M. Cachioni, P. C. Hung, Effects of social robots on depressive symptoms in older adults: A scoping review, Library Hi Tech (2021).
[24] A. Lambert, N. Norouzi, G. Bruder, G. Welch, A systematic review of ten years of research on human interaction with social robots, International Journal of Human–Computer Interaction 36 (2020) 1804–1817.
[25] J. M. Beer, K. R. Liles, X. Wu, S. Pakala, Affective human–robot interaction, in: Emotions and Affect in Human Factors and Human-Computer Interaction, Elsevier, 2017, pp. 359–381.
[26] R. C. Arkin, L. Moshkina, Affect in human-robot interaction, Technical Report, Georgia Inst. of Tech., Atlanta, 2014.
[27] L. Moshkina, S. Park, R. C. Arkin, J. K. Lee, H. Jung, TAME: Time-varying affective response for humanoid robots, International Journal of Social Robotics 3 (2011) 207–221.