<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta />
    <article-meta>
      <title-group>
        <article-title>Human-Robot Interactions Using Affective Computing</article-title>
      </title-group>
      <contrib-group>
        <aff id="aff0">
          <label>0</label>
          <institution>Electrical Engineering &amp; Center for Artificial Intelligence and Robotics, New York University Abu Dhabi</institution>
          ,
          <addr-line>A1-193, P.O. Box 129188, UAE</addr-line>
          ,
          <country country="AE">United Arab Emirates</country>
        </aff>
      </contrib-group>
      <pub-date>
        <year>2022</year>
      </pub-date>
      <fpage>17</fpage>
      <lpage>21</lpage>
      <abstract>
        <p>Affective human robot interaction (HRI) is quite complex since the robot interacts not only with the human but also with the environment. Providing robots with emotional intelligence is critical in this field, but achieving public acceptance of and trust in robots is another challenge. Robots should infer and interpret human emotions and behave in a trusted way, ensuring safety. Since affective HRI aims at the development of systems that use emotions, it requires knowledge from fields like computer science, psychology, and cognitive science. An affective autonomous robot interacts with humans using affective technologies to detect emotions. Despite the fact that a typical robot platform has several embedded attributes, like perception, decisions, and actions, it is quite difficult to detect human emotions as well as to behave in a reassuring manner.</p>
      </abstract>
      <kwd-group>
        <kwd>Human-robot interaction</kwd>
        <kwd>affective computing</kwd>
        <kwd>wireless sensor networks</kwd>
        <kwd>EEG</kwd>
      </kwd-group>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>1. Introduction</title>
      <p>The impact of affective computing and robots can be
examined in the context of a smart house application. In a
smart house [1] there is interaction with a wide variety of
devices, ranging from smart appliances to robotic mechanisms. Such interactions
have altered the objective of the house itself as a prime
place to relax and unwind. Adding several smart devices
inside our environment without any synchronization
between them or planning regarding their integrated use
can have a negative impact, manifested mainly as anxiety,
stress and even insecurity. On the other hand, a properly
scheduled and coordinated environment or, equivalently,
a smart house ecosystem can significantly reduce stress
and in general contribute to a higher quality of life. This
happens only when the individual smart devices of a
house ecosystem are working seamlessly and coordinated
in the background taking into consideration the house
occupants and not vice versa.</p>
      <sec id="sec-1-1">
        <p>
          Among the major indicators of the well-being of the human
occupants of a smart house is calmness, defined
as the state of mind having low arousal and valence [2].
Since calmness implies relatively low brain activity, it
can be clearly identified using EEG [ 3, 4, 5] or through
measurements related to ego-sensor data (smartwatch
[6], smartphone [
          <xref ref-type="bibr" rid="ref30">7</xref>
          ]). A smart house seeks to provide
an environment for increasing the calmness [8] by
sensing several related intrinsic parameters (temperature [9],
illumination [10], sound [
          <xref ref-type="bibr" rid="ref11">11</xref>
          ], et al.) and providing the
necessary outputs (heating, ventilation and air
conditioning, light on/off state, loudspeaker music, et al.).
        </p>
        <p>CEUR Workshop Proceedings (CEUR-WS.org), http://ceur-ws.org, ISSN 1613-0073</p>
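Calmness, as defined above, corresponds to low arousal and can be identified from EEG. One plausible proxy (a minimal sketch, not the paper's actual algorithm) is the ratio of alpha-band power to beta-band power; the sampling rate and band limits below are illustrative assumptions:

```python
import numpy as np

def band_power(signal, fs, low, high):
    """Mean power of `signal` in the [low, high) Hz band via a periodogram."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    mask = (freqs >= low) & (freqs < high)
    return psd[mask].mean()

def calmness_index(eeg, fs=128):
    """Hypothetical calmness proxy: alpha (8-13 Hz) power relative to
    beta (13-30 Hz) power; relatively more alpha suggests lower arousal."""
    alpha = band_power(eeg, fs, 8.0, 13.0)
    beta = band_power(eeg, fs, 13.0, 30.0)
    return alpha / (alpha + beta)  # in (0, 1); closer to 1 = calmer
```

A signal dominated by alpha activity scores near 1, while one dominated by beta activity scores near 0.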
        <p>When it comes to affective computing considerations,
the principal concern is designing and building
systems and environments where the HRI is smooth and
human centered [12]. This includes building machines
that can sense and react to human emotions, but also
machines that are reassuring, trusted and considered safe by the public.</p>
      </sec>
    </sec>
    <sec id="sec-2">
      <title>2. HRI: The case of a smart house</title>
      <p>Our work presents the creation of an integrated
environment that provides the foundation for a Smart house
computing experimental platform. The experimental study
enhances the frequent operations encountered in a smart
house by monitoring its state using a wireless sensor
network [13] and mobile robots [14]. This work describes
the developed HRI testbed, shown in Figure 1, indicating
the following technologies that have been integrated into
the Smart house platform:
• A Media Server attached to a dedicated computer
(Intel i7-NUC).</p>
      <p>• A supervising Data server (Intel i7-NUC) running
Ubuntu 16.04, which infers the human’s calm state
based on a 10 second sliding window of EEG
readings.
• The human brain activity is measured using an
inexpensive yet reliable portable EEG device. In
this study the users’ brain activity is used to
validate the effect of various stimuli in a smart
home towards the achieved calmness. An
Emotiv EPOC+ EEG device [15] that transmits brain
signals using Bluetooth to a computer is used. It
can measure the brain waves of a human wearing
the device and can transmit the user’s
emotion state.</p>
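The 10 second sliding-window inference above can be sketched as a fixed-length buffer over per-sample scores; the score scale, threshold, and class names here are illustrative assumptions, not the testbed's actual code:

```python
from collections import deque

class SlidingWindowCalmDetector:
    """Minimal sketch (not the authors' implementation): keep the last
    `window_sec` seconds of a per-sample calmness score and flag the
    user as calm when the window average exceeds a threshold."""

    def __init__(self, fs=128, window_sec=10, threshold=0.6):
        self.window = deque(maxlen=fs * window_sec)
        self.threshold = threshold

    def push(self, score):
        """Add one per-sample score in [0, 1]; return True/False once the
        window is full, or None while it is still filling."""
        self.window.append(score)
        if len(self.window) < self.window.maxlen:
            return None
        return sum(self.window) / len(self.window) >= self.threshold
```

Because the deque is bounded, each new sample automatically evicts the oldest one, so the decision always reflects the most recent window.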
      <p>• A suite of sensors that monitor the environment’s
status (sound, CO, humidity, temperature, etc.);
these sensors are wirelessly connected to the
supervising Data server.</p>
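One way such wireless sensor reports could reach the Data server is as small JSON messages; the field names and helpers below are hypothetical, for illustration only, not the testbed's actual protocol:

```python
import json
import time

def sensor_message(sensor_id, kind, value, unit):
    """Serialize one environment reading as a JSON string.
    Field names are illustrative assumptions."""
    return json.dumps({
        "sensor_id": sensor_id,
        "kind": kind,          # e.g. "temperature", "co", "humidity"
        "value": value,
        "unit": unit,
        "timestamp": time.time(),
    })

def decode(msg):
    """Server side: decode a message and dispatch on its kind."""
    reading = json.loads(msg)
    return reading["kind"], reading["value"]
```

The server can then route each decoded reading to the appropriate handler (e.g. calm-state inference or actuation rules) based on `kind`.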
      <sec id="sec-2-1">
        <title>The utilized sensors include:</title>
        <p>
          • A smartwatch (Samsung Galaxy Watch Active 2)
running Tizen OS, which measures the heart rate
of its user every 10 sec.
• An attention inference device in the form of an
Android application running on a smartphone
that detects the human’s motion [1 bit word] and
the call’s state (Idle, Calling, Ringing) [2 bit word],
every 5 seconds.
• A smart house monitoring device (Libelium
Waspmote and plug and play sensors) measuring: i)
carbon monoxide (every 60 sec), ii) temperature
(every 5 sec), iii) atmospheric pressure (every 5
sec), iv) humidity (every 5 sec), v) illuminance
(every 5 sec), and vi) luminosity (every 5 sec).
• A sound sensor (microphone) connected to an
Odroid XU-4 embedded microcontroller that
monitors the power spectrum of the surrounding
sound (over a 10 sec sliding window) and
wirelessly transmits its normalized values [0 (noiseless)
up to 1 (loud)] to the server.
• A spherical camera (Ricoh Theta V) that streams
video at 4K resolution to the data server; this
camera is mounted on the mobile robot and monitors
the surrounding space.
        </p>
        <p>
          Moreover, a Google hub device acts as a data query
and actuation server and sends event-like (on/off)
commands to: a) a heat adjustment device (air cooler) for
regulating the temperature, b) smart power outlets that
connect WiFi RGB-light bulbs and other devices that
affect the surrounding illuminance, and c) a
Bluetooth-enabled loudspeaker device for playing streaming audio.
Finally, a mobile ground robot (Robotis’ Turtlebot
3 [
          <xref ref-type="bibr" rid="ref25">16</xref>
          ]) is controlled by an Intel i7 NUC with
considerable number crunching capabilities. This computer is
connected to the OpenCR (Cortex-M7) board and runs
ROS [17]. This robot is equipped with a 360∘ line LiDAR
that detects obstacles anywhere within 12-350 cm with a
1∘ angular resolution. This 2D-LiDAR is used for Hector
SLAM [18] and obstacle avoidance. The mobile robot
should not create additional attention while navigating
its path within the smart house. For this reason, the robot
should not be in the Field of View of the humans, which
is monitored by an IMU placed along the EEG-device.
        </p>
      </sec>
      <sec id="sec-3">
        <title>3. Affective computing for robot applications</title>
        <p>
          Humans living in an environment can perform
perceptual, spatial, motor, and cognitive activities. In real life
these activities are interleaved, creating complex real life
situations. We generated several scenarios that consist
of different combinations of such activities, executed the
scenarios in our smart house platform prototype and
checked the human reaction using the EEG. Our
initial results show that the user’s emotions (calmness) are
strongly influenced by the scheduling of the activities.
        </p>
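The scenario generation described above, combinations of perceptual, spatial, motor, and cognitive activities, can be sketched as a Cartesian product; the concrete activity labels are illustrative assumptions, not the experiment's actual conditions:

```python
from itertools import product

# One list of options per activity category; labels are hypothetical.
ACTIVITIES = {
    "perceptual": ["music_on", "music_off"],
    "spatial":    ["robot_visible", "robot_hidden"],
    "motor":      ["walking", "sitting"],
    "cognitive":  ["phone_call", "idle"],
}

def generate_scenarios(activities=ACTIVITIES):
    """Enumerate every scenario: one option chosen per category."""
    keys = list(activities)
    return [dict(zip(keys, combo))
            for combo in product(*(activities[k] for k in keys))]

scenarios = generate_scenarios()
# 2 * 2 * 2 * 2 = 16 candidate scenarios to execute on the testbed
```

Each resulting dictionary describes one scenario to run on the platform while recording the EEG response.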
        <p>More experiments need to be conducted to examine how
user behaviour is influenced in different situations, like
simultaneous processing of cues, or situations with low
arousal and high arousal, etc.</p>
        <p>Several research directions can be followed based on
the above platform. An interesting problem to examine is
the use of an AI-based scheduler trained to the needs of the
user. The problem of smart home scheduling has been
examined mainly in the context of controlling appliances
for efficient energy consumption [19].</p>
        <p>Figure 2: Commercial Social Robots.</p>
        <p>Social robots, shown in Figure 2, have been used for a
variety of applications. In [20], the major fields of
application for social robotics, which include companionship,
healthcare and education, are investigated. Furthermore, the
incorporation of social attributes into the HRI and the
social effects of these robots are highlighted.</p>
        <p>For example, in the education field social robots have
been introduced for children’s education. In [21] social
robots introduce a new perspective in understanding
children’s learning. Robots are equipped with several
sensors, and analysis of the data collected during their
interaction with children can provide insights on the
learning process. An interesting result on HRI in the case
of children is presented in [22], where the authors use a
NAO humanoid robot as a handwriting partner to teach
children how to write.</p>
        <p>In some cases the results of the use of social robots are
not so encouraging. Such a case can be seen in [23], where the
authors examined the literature on using social robots
for mental health interventions, i.e. for improving
depression, and concluded that the research results have low
internal and external validity. HRIs in social robotics can
be remote or proximate. The problem of proximate
interactions affects the Traits, Attitudes, Moods and Emotions
(TAME) of humans. Examples of proximate activities
between humans and robots can be as simple as the
handover of an item or as complicated as a joint surgery.
Human expectations and the building of trust when considering
robot errors are of paramount importance, as explained in
[24].</p>
        <p>In our ongoing research, we are interested in proximate
HRI [25], where humans interact with colocated
robots. This interaction affects sociability because of
the robot’s functionality. Proximate HRI includes the social,
emotive, and cognitive capabilities of this interaction.
The robot’s architecture is modified to account for the
underlying affective models. In here, the TAME
framework [26, 27] is adopted to facilitate the overall HRI. Self
assessments, psychometric tests, and ongoing studies
involving the Negative Attitudes toward Robots Scale
(NARS) will be used to evaluate the HRI. Figure 3
indicates a mobile robot in our smart house that moves away
from the human’s Field of View in order not to affect
the NARS.</p>
        <p>Figure 3: Robot’s maneuver to decrease NARS.</p>
        <sec id="sec-4">
          <title>4. Conclusions</title>
          <p>It is evident that in the field of HRI, there is a challenge
that needs to be addressed on how to add
characteristics and emotional intelligence to machines and
environments so that the interactions with humans are
intuitive, smooth, natural and trusted. This paper
presented the development of a platform that encompasses
several application fields and identifies future research
issues related to machines, emotional intelligence and
trust.</p>
        </sec>
      </sec>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>
          <article-title>of a new framework</article-title>
          ,
          <source>KI-Künstliche Intelligenz 31</source>
        </mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>
          (
          <year>2017</year>
          )
          <fpage>283</fpage>
          -
          <lpage>289</lpage>
          . [1]
          <string-name>
            <given-names>A.</given-names>
            <surname>Tsoukalas</surname>
          </string-name>
          ,
          <string-name>
            <given-names>P. S.</given-names>
            <surname>Annor</surname>
          </string-name>
          ,
          <string-name>
            <given-names>E.</given-names>
            <surname>Kafeza</surname>
          </string-name>
          ,
          <string-name>
            <surname>A</surname>
          </string-name>
          . Tzes, IoT [15]
          <string-name>
            <given-names>M.</given-names>
            <surname>Strmiska</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Z.</given-names>
            <surname>Koudelkova</surname>
          </string-name>
          , Analysis of perfor-
        </mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>
          <string-name>
            <surname>Environment</surname>
          </string-name>
          , in: 2022 8th International Confer- Web of Conferences,
          <source>EDP Sciences</source>
          ,
          <year>2018</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>
          <source>ence on Automation, Robotics and Applications</source>
          [16]
          <string-name>
            <surname>P. M. de Assis Brasil</surname>
            ,
            <given-names>F. U.</given-names>
          </string-name>
          <string-name>
            <surname>Pereira</surname>
            ,
            <given-names>M. A. d. S. L.</given-names>
          </string-name>
        </mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>
          <source>(ICARA)</source>
          , IEEE,
          <year>2022</year>
          , pp.
          <fpage>239</fpage>
          -
          <lpage>242</lpage>
          . Cuadros,
          <string-name>
            <given-names>A. R.</given-names>
            <surname>Cukla</surname>
          </string-name>
          ,
          <string-name>
            <given-names>D. F. T.</given-names>
            <surname>Gamarra</surname>
          </string-name>
          , Dijkstra [2]
          <string-name>
            <given-names>S. M.</given-names>
            <surname>Alarcao</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M. J.</given-names>
            <surname>Fonseca</surname>
          </string-name>
          ,
          <article-title>Emotions recognition and A∗ algorithms for global trajectory planning</article-title>
        </mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>
          <article-title>using EEG signals: A survey, IEEE Transactions on in the TurtleBot 3 mobile robot</article-title>
          , in: International
        </mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>
          <source>Affective Computing</source>
          <volume>10</volume>
          (
          <year>2017</year>
          )
          <fpage>374</fpage>
          -
          <lpage>393</lpage>
          .
          <source>Conference on Intelligent Systems Design and Ap</source>
          [3]
          <string-name>
            <given-names>H.</given-names>
            <surname>Becker</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.</given-names>
            <surname>Fleureau</surname>
          </string-name>
          ,
          <string-name>
            <given-names>P.</given-names>
            <surname>Guillotel</surname>
          </string-name>
          ,
          <string-name>
            <given-names>F.</given-names>
            <surname>Wendling</surname>
          </string-name>
          , plications, Springer,
          <year>2020</year>
          , pp.
          <fpage>346</fpage>
          -
          <lpage>356</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>
          <string-name>
            <given-names>I.</given-names>
            <surname>Merlet</surname>
          </string-name>
          , L. Albera,
          <source>Emotion recognition based on</source>
          [17]
          <string-name>
            <given-names>A.</given-names>
            <surname>Koubâa</surname>
          </string-name>
          , et al.,
          <source>Robot Operating System (ROS)</source>
          .,
        </mixed-citation>
      </ref>
      <ref id="ref9">
        <mixed-citation>
          <article-title>high-resolution EEG recordings</article-title>
          and reconstructed Springer,
          <year>2017</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref10">
        <mixed-citation>
          <article-title>brain sources</article-title>
          ,
          <source>IEEE Transactions on Affective Com</source>
          <volume>-</volume>
          [18]
          <string-name>
            <given-names>W.</given-names>
            <surname>Weichen</surname>
          </string-name>
          ,
          <string-name>
            <given-names>B.</given-names>
            <surname>Shirinzadeh</surname>
          </string-name>
          , M. Ghafarian,
        </mixed-citation>
      </ref>
      <ref id="ref11">
        <mixed-citation>
          <source>puting 11</source>
          (
          <year>2017</year>
          )
          <fpage>244</fpage>
          -
          <lpage>257</lpage>
          . S. Esakkiappan,
          <string-name>
            <given-names>T.</given-names>
            <surname>Shen</surname>
          </string-name>
          ,
          <source>Hector SLAM with ICP</source>
          [4]
          <string-name>
            <given-names>R. N.</given-names>
            <surname>Khushaba</surname>
          </string-name>
          ,
          <string-name>
            <given-names>L.</given-names>
            <surname>Greenacre</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>Kodagoda</surname>
          </string-name>
          , J. Lou- trajectory matching, in: IEEE/ASME AIM, IEEE,
        </mixed-citation>
      </ref>
      <ref id="ref12">
        <mixed-citation>
          <string-name>
            <surname>viere</surname>
          </string-name>
          , S. Burke, G. Dissanayake, Choice modeling
          <year>2020</year>
          , pp.
          <fpage>1971</fpage>
          -
          <lpage>1976</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref13">
        <mixed-citation>
          <article-title>and the brain: A study on the electroencephalo-</article-title>
          [19]
          <string-name>
            <given-names>K.</given-names>
            <surname>Salameh</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Awad</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Makarfi</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.-H.</given-names>
            <surname>Jallad</surname>
          </string-name>
          ,
        </mixed-citation>
      </ref>
      <ref id="ref14">
        <mixed-citation>
          <source>Applications</source>
          <volume>39</volume>
          (
          <year>2012</year>
          )
          <fpage>12378</fpage>
          -
          <lpage>12388</lpage>
          . houses: A survey,
          <source>Sustainability</source>
          <volume>13</volume>
          (
          <year>2021</year>
          ). [5]
          <string-name>
            <given-names>S.</given-names>
            <surname>Grissmann</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Spüler</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.</given-names>
            <surname>Faller</surname>
          </string-name>
          ,
          <string-name>
            <given-names>T.</given-names>
            <surname>Krumpe</surname>
          </string-name>
          , T. O. [20]
          <string-name>
            <given-names>A.</given-names>
            <surname>Washburn</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Adeleye</surname>
          </string-name>
          ,
          <string-name>
            <given-names>T.</given-names>
            <surname>An</surname>
          </string-name>
          ,
          <string-name>
            <given-names>L. D.</given-names>
            <surname>Riek</surname>
          </string-name>
          , Robot
        </mixed-citation>
      </ref>
      <ref id="ref15">
        <mixed-citation>
          <article-title>under different affective valence</article-title>
          ,
          <source>IEEE Transactions Transactions on human-robot interaction 9</source>
          (
          <year>2020</year>
          ).
        </mixed-citation>
      </ref>
      <ref id="ref16">
        <mixed-citation>
          <source>on Affective Computing</source>
          <volume>11</volume>
          (
          <year>2017</year>
          )
          <fpage>327</fpage>
          -
          <lpage>334</lpage>
          . [21]
          <string-name>
            <given-names>W.</given-names>
            <surname>Johal</surname>
          </string-name>
          , Research trends in social robots for learn[6]
          <string-name>
            <given-names>J. C.</given-names>
            <surname>Quiroz</surname>
          </string-name>
          , E. Geangu,
          <string-name>
            <given-names>M. H.</given-names>
            <surname>Yong</surname>
          </string-name>
          , Emotion recog- ing,
          <source>Current Robotics Reports</source>
          <volume>1</volume>
          (
          <year>2020</year>
          )
          <fpage>75</fpage>
          -
          <lpage>83</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref17">
        <mixed-citation>
          <article-title>nition using smart watch sensor data: Mixed-</article-title>
          design [22]
          <string-name>
            <given-names>D.</given-names>
            <surname>Hood</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>Lemaignan</surname>
          </string-name>
          ,
          <string-name>
            <given-names>P.</given-names>
            <surname>Dillenbourg</surname>
          </string-name>
          , When chil-
        </mixed-citation>
      </ref>
      <ref id="ref18">
        <mixed-citation>
          <string-name>
            <surname>study</surname>
          </string-name>
          ,
          <source>JMIR Mental health 5</source>
          (
          <year>2018</year>
          ).
          <article-title>dren teach a robot to write: An autonomous teach</article-title>
          [7]
          <string-name>
            <given-names>G.</given-names>
            <surname>Meinlschmidt</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.-H.</given-names>
            <surname>Lee</surname>
          </string-name>
          ,
          <string-name>
            <given-names>E.</given-names>
            <surname>Stalujanis</surname>
          </string-name>
          ,
          <string-name>
            <surname>A.</surname>
          </string-name>
          <article-title>Belardi, able humanoid which uses simulated handwriting,</article-title>
        </mixed-citation>
      </ref>
      <ref id="ref19">
        <mixed-citation>
          <string-name>
            <given-names>M.</given-names>
            <surname>Oh</surname>
          </string-name>
          ,
          <string-name>
            <given-names>E. K.</given-names>
            <surname>Jung</surname>
          </string-name>
          , H.-C. Kim,
          <string-name>
            <given-names>J.</given-names>
            <surname>Alfano</surname>
          </string-name>
          , S.-S. Yoo, in: ACM/IEEE HRI, ACM, New York, NY, USA,
          <year>2015</year>
          ,
        </mixed-citation>
      </ref>
      <ref id="ref20">
        <mixed-citation>
          <string-name>
            <given-names>M.</given-names>
            <surname>Tegethoff</surname>
          </string-name>
          , Smartphone-based psychotherapeu- p.
          <fpage>83</fpage>
          -
          <lpage>90</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref21">
        <mixed-citation>
          <article-title>tic micro-interventions to improve mood in a real-</article-title>
          [23]
          <string-name>
            <surname>B. S. de Araujo</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          <string-name>
            <surname>Fantinato</surname>
            ,
            <given-names>S. M.</given-names>
          </string-name>
          <string-name>
            <surname>Peres</surname>
          </string-name>
          , R. C.
        </mixed-citation>
      </ref>
      <ref id="ref22">
        <mixed-citation>
          <source>world setting, Frontiers in psychology 7</source>
          (
          <year>2016</year>
          ). de Melo,
          <string-name>
            <given-names>S. S. T.</given-names>
            <surname>Batistoni</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Cachioni</surname>
          </string-name>
          ,
          <string-name>
            <given-names>P. C.</given-names>
            <surname>Hung</surname>
          </string-name>
          , [8]
          <string-name>
            <given-names>C.</given-names>
            <surname>Kapelonis</surname>
          </string-name>
          ,
          <article-title>A calm house is a smart house, Re- Effects of social robots on depressive symptoms in</article-title>
        </mixed-citation>
      </ref>
      <ref id="ref23">
        <mixed-citation>
          <source>search report</source>
          , MIT Media Lab,
          <year>2018</year>
          .
          <article-title>older adults: A scoping review</article-title>
          ,
          <source>Library Hi Tech</source>
          [9]
          <string-name>
            <given-names>F.</given-names>
            <surname>Barbosa Escobar</surname>
          </string-name>
          ,
          <string-name>
            <given-names>C.</given-names>
            <surname>Velasco</surname>
          </string-name>
          ,
          <string-name>
            <given-names>K.</given-names>
            <surname>Motoki</surname>
          </string-name>
          ,
          <string-name>
            <surname>D. V.</surname>
          </string-name>
          (
          <year>2021</year>
          ).
        </mixed-citation>
      </ref>
      <ref id="ref24">
        <mixed-citation>
          <string-name>
            <surname>Byrne</surname>
            ,
            <given-names>Q. J.</given-names>
          </string-name>
          <string-name>
            <surname>Wang</surname>
            , The temperature of emotions, [24]
            <given-names>A.</given-names>
          </string-name>
          <string-name>
            <surname>Lambert</surname>
            ,
            <given-names>N.</given-names>
          </string-name>
          <string-name>
            <surname>Norouzi</surname>
            ,
            <given-names>G.</given-names>
            Bruder, G.
          </string-name>
          <string-name>
            <surname>Welch</surname>
          </string-name>
          , A
        </mixed-citation>
      </ref>
      <ref id="ref25">
        <mixed-citation>
          <source>PLoS one 16</source>
          (
          <year>2021</year>
          ).
          <article-title>systematic review of ten years of research on hu</article-title>
          [10]
          <string-name>
            <given-names>R.</given-names>
            <surname>Kaplan</surname>
          </string-name>
          ,
          <string-name>
            <surname>S. Kaplan,</surname>
          </string-name>
          <article-title>The experience of nature: A man interaction with social robots</article-title>
          , International
        </mixed-citation>
      </ref>
      <ref id="ref26">
        <mixed-citation>
          <source>psychological perspective</source>
          , Cambridge University Journal of Human-Computer Interaction
          <volume>36</volume>
          (
          <year>2020</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref27">
        <mixed-citation>
          Press,
          <year>1989</year>
          . 1804-
          <fpage>1817</fpage>
          . [11]
          <string-name>
            <given-names>J. J.</given-names>
            <surname>Alvarsson</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>Wiens</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M. E.</given-names>
            <surname>Nilsson</surname>
          </string-name>
          , Stress recov- [25]
          <string-name>
            <surname>J. M. Beer</surname>
            ,
            <given-names>K. R.</given-names>
          </string-name>
          <string-name>
            <surname>Liles</surname>
            ,
            <given-names>X.</given-names>
          </string-name>
          <string-name>
            <surname>Wu</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          <string-name>
            <surname>Pakala</surname>
          </string-name>
          , Affective
        </mixed-citation>
      </ref>
      <ref id="ref28">
        <mixed-citation>
          <source>tal research and public health 7</source>
          (
          <year>2010</year>
          )
          <fpage>1036</fpage>
          -
          <lpage>1046</lpage>
          . Elsevier,
          <year>2017</year>
          , pp.
          <fpage>359</fpage>
          -
          <lpage>381</lpage>
          . [12]
          <string-name>
            <given-names>M.</given-names>
            <surname>Spezialetti</surname>
          </string-name>
          , G. Placidi,
          <string-name>
            <given-names>S.</given-names>
            <surname>Rossi</surname>
          </string-name>
          , Emotion recogni- [26]
          <string-name>
            <given-names>R. C.</given-names>
            <surname>Arkin</surname>
          </string-name>
          , L. Moshkina,
          <article-title>Affect in human-robot</article-title>
        </mixed-citation>
      </ref>
      <ref id="ref29">
        <mixed-citation>
          and future perspectives,
          <source>Frontiers in Robotics and Atlanta</source>
          ,
          <year>2014</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref30">
        <mixed-citation>
          <issue>AI 7</issue>
          (
          <year>2020</year>
          )
          <fpage>532279</fpage>
          . [27]
          <string-name>
            <given-names>L.</given-names>
            <surname>Moshkina</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>Park</surname>
          </string-name>
          ,
          <string-name>
            <given-names>R. C.</given-names>
            <surname>Arkin</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J. K.</given-names>
            <surname>Lee</surname>
          </string-name>
          ,
          <string-name>
            <given-names>H.</given-names>
            <surname>Jung</surname>
          </string-name>
          , [13]
          <string-name>
            <given-names>H.</given-names>
            <surname>Ghayvat</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>Mukhopadhyay</surname>
          </string-name>
          ,
          <string-name>
            <given-names>X.</given-names>
            <surname>Gui</surname>
          </string-name>
          ,
          <string-name>
            <surname>N.</surname>
          </string-name>
          <article-title>Suryade- Tame: Time-varying afective response for hu-</article-title>
        </mixed-citation>
      </ref>
      <ref id="ref31">
        <mixed-citation>
          <article-title>extension to smart buildings</article-title>
          ,
          <source>Sensors</source>
          <volume>15</volume>
          (
          <year>2015</year>
          )
          <article-title>Robotics 3 (</article-title>
          <year>2011</year>
          )
          <fpage>207</fpage>
          -
          <lpage>221</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref32">
        <mixed-citation>
          10350-
          <fpage>10379</fpage>
          . [14]
          <string-name>
            <given-names>S. M.</given-names>
            <surname>Nguyen</surname>
          </string-name>
          ,
          <string-name>
            <given-names>C.</given-names>
            <surname>Lohr</surname>
          </string-name>
          ,
          <string-name>
            <given-names>P.</given-names>
            <surname>Tanguy</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Y.</given-names>
            <surname>Chen</surname>
          </string-name>
          , Plug and
        </mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>