<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta />
    <article-meta>
      <title-group>
<article-title>Social Assistive Robots in Elderly Care: Exploring the Role of Empathy</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Paolo Buono</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Giovanna Castellano</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Berardina De Carolis</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Nicola Macchiarulo</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <aff id="aff0">
          <label>0</label>
          <institution>Dipartimento di Informatica, Università degli Studi di Bari Aldo Moro</institution>
          ,
          <addr-line>Bari</addr-line>
          ,
          <country country="IT">Italy</country>
        </aff>
      </contrib-group>
      <abstract>
        <p>The COVID-19 emergency has shown that elderly people living in Assisted Living Houses (ALHs) have been highly exposed to the virus. Besides health problems, during the social distancing restrictions the elderly were also strongly affected by loneliness due to a lack of contact with their loved ones. Innovative solutions for ALHs based on Social Assistive Robotics can reduce the risk of infection and, at the same time, improve the quality of life of elderly people. In this work, after a brief overview of the Pepper4Elderly project, we focus on the role of empathy and affective behaviors in human-robot interaction when the robot is used as a caring agent to assist and entertain the elderly guests of ALHs.</p>
      </abstract>
      <kwd-group>
        <kwd>Social Assistive Robots</kwd>
        <kwd>Assisted Living Houses</kwd>
        <kwd>Pepper</kwd>
        <kwd>Empathic Behaviour Model</kwd>
      </kwd-group>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>1. Introduction</title>
      <p>The general goal of the Pepper4Elderly project is to employ the Pepper robot as a
natural interface to health-care services, with the added value of establishing a social relation
with the user.</p>
      <p>Indeed, in assistive environments, social robots are being used for care services, since their
physical embodiment, their combination of verbal and nonverbal cues, and the possibility
of interacting with them naturally increase people’s engagement with and trust in them [<xref ref-type="bibr" rid="ref1 ref2">1, 2</xref>].</p>
      <p>Recent research demonstrated that robots can positively shape human-to-human
communication, extending social communication with the introduction of artificial agents and thus making
possible hybrid systems composed of humans and virtual agents [<xref ref-type="bibr" rid="ref3">3</xref>].</p>
      <p>
        However, in order to increase the acceptance of such a technology, the robot, besides
providing a service-oriented response to the user’s needs, has to take into account the establishment
of social relationships. Psychologists indicate that affective behaviors, and empathy in particular,
have a beneficial effect on attitudes and relationships [<xref ref-type="bibr" rid="ref4">4</xref>]. Empathy has been shown to play
a key role in patient-centered care, because it implies the understanding of the other’s inner
affective state [
        <xref ref-type="bibr" rid="ref5 ref6">5, 6</xref>
        ]. Previous works showed that empathic agents and robots are perceived as
more caring, likeable, and trustworthy than agents without empathic capabilities [<xref ref-type="bibr" rid="ref7 ref8">7, 8</xref>, 9, 10].
      </p>
      <p>Taking these findings into account, this paper focuses on a specific socio-affective layer that
enables Pepper to recognize and monitor the emotions and mood of users, in order to trigger the
most appropriate empathic coping strategies.</p>
    </sec>
    <sec id="sec-2">
      <title>2. Pepper4Elderly</title>
      <p>The problem of taking care of the elderly is becoming extremely relevant, because significant
demographic and social changes have affected our society in the last decades. The COVID-19
emergency emphasized issues regarding both the safety of older people living in caring houses
and their loneliness due to isolation.</p>
      <p>In this perspective, the use of technologies may improve the quality of life of elderly people
living in ALHs by providing cognitive and physical support and easy access to environment
services [11, 12, 13, 14].</p>
      <p>Pepper1 is a social robot that has the characteristics to intervene effectively as a caring
assistant in ALHs: it interacts through speech, gestures, colors and sounds, and has a tablet
on its chest that can be used for telepresence activities. Pepper can move autonomously after
scanning the environment and can support the social-health operators in carrying out their
tasks, reducing the frequency with which they must come into close contact with patients.</p>
      <p>Following the human-in-the-loop paradigm, we propose a solution in which human operators
may be involved in the care process without requiring their physical presence. Operators can
use the robot as an interface to patients and can provide useful feedback to adapt the robot’s
behavior. In addition, the robot will be endowed with autonomous behavior aiming at detecting
and monitoring the states of the elders and, at the same time, interacting with them to execute
exercises or to remind therapies and planned actions.</p>
      <p>To increase acceptability, usability and user experience, the robot will be equipped with
behavioral models that make the interaction plausible and engaging.</p>
      <p>1 SoftBank Robotics</p>
      <p>Computer Vision solutions will be used for the analysis of the facial expressions of elderly
people, in particular those related to affective states, and a model for classifying emotions from
speech prosody will be integrated to address the task of multimodal emotion recognition.
According to the recognized affective state and to the context of the elderly person, Pepper will
reason and act empathically. To this aim, it will be endowed with a computational model of
empathy [10]. Such a model distinguishes between cognitive empathy (i.e., understanding how
another feels) and affective empathy (i.e., an active emotional reaction to another’s affective
state), in order to provide a complete definition of empathic behavior.</p>
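<p>As a minimal sketch of the multimodal recognition step described above, the snippet below combines per-emotion scores from the face and voice channels with a weighted late fusion. The label set, weights, and function names are illustrative assumptions of ours, not the project's actual implementation.</p>

```python
# Hypothetical late fusion of facial-expression and speech-prosody emotion
# scores. Labels and weights are illustrative assumptions, not project data.

EMOTIONS = ["happiness", "sadness", "anger", "neutral"]

def fuse(face_scores, prosody_scores, w_face=0.6, w_prosody=0.4):
    """Weighted late fusion of two per-emotion probability dicts."""
    fused = {e: w_face * face_scores.get(e, 0.0)
                + w_prosody * prosody_scores.get(e, 0.0)
             for e in EMOTIONS}
    total = sum(fused.values()) or 1.0
    return {e: s / total for e, s in fused.items()}  # renormalize

def recognize(face_scores, prosody_scores):
    """Return the emotion with the highest fused score."""
    fused = fuse(face_scores, prosody_scores)
    return max(fused, key=fused.get)

face = {"sadness": 0.7, "neutral": 0.2, "happiness": 0.1}
voice = {"sadness": 0.5, "neutral": 0.4, "anger": 0.1}
print(recognize(face, voice))  # sadness dominates both modalities
```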
      <p>
        In order to model the cognitive aspect of empathy, Pepper will not only recognize an affective
state, but also understand what caused it. To this aim, the robot will reason on the situation
and, according to an extended Belief-Desire-Intention (BDI) architecture [15] which takes
affective factors into account, it will decide which empathic goal to achieve by executing the
most appropriate plan of actions. The reasoning will be modeled using consolidated formalisms,
such as Dynamic Belief Networks [16] and Fuzzy Logic [17], that are suitable for simulating
human reasoning by dealing with the uncertainty typical of natural situations that gradually evolve
over time [18]. In this way Pepper will be able to simulate both components of empathy,
since a social emotion is triggered in the robot as a consequence of its perception of the user’s
state. The resulting prototype will be tested both by psychology experts and through a user study
with elderly guests of an ALH.
      </p>
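<p>To illustrate how a fuzzy formalism handles the graded, uncertain readings mentioned above, here is a toy fuzzy classification of a valence measurement. The membership functions, labels, and valence scale are our own illustrative assumptions, not the project's actual model.</p>

```python
# Toy fuzzy appraisal of the user's affective state. Membership functions
# and the [-1, 1] valence scale are illustrative assumptions of ours.

def tri(x, a, b, c):
    """Triangular membership function: 0 outside [a, c], peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_emotions(valence):
    """Graded membership of a valence reading in overlapping emotion sets."""
    return {
        "sad":     tri(valence, -1.2, -0.8, 0.0),
        "neutral": tri(valence, -0.5,  0.0, 0.5),
        "happy":   tri(valence,  0.0,  0.8, 1.2),
    }

memberships = fuzzy_emotions(-0.4)
print(max(memberships, key=memberships.get))  # a mildly negative reading -> sad
```

Unlike a crisp classifier, the reading belongs to "sad" and "neutral" simultaneously with different degrees, which suits gradually evolving affective states.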
      <p>In particular, the experimental study will aim at testing two conditions: Empathic Robot
(ER) vs. Non-Empathic Robot (NER), in which Pepper acts only as an interface toward services.
The study will be conducted in an ALH with elderly people. To measure results, specific
questionnaires will be developed in cooperation with psychologists who are experts in this
field. In addition, we will collect and analyse behavioral data so as to relate the seniors’ reactions to
the robot’s behavior.</p>
    </sec>
    <sec id="sec-3">
      <title>3. Modeling Empathic Behavior</title>
      <p>
        In the context of the Pepper4Elderly project, we plan to use the Pepper robot as a natural
interface towards environment services and, at the same time, as an embodied companion. Several
studies confirmed that elderly users like to interact with social robots and to establish a social
relation with them [19, 20, 21]. Developing the social component of the interaction requires the
development of user models that involve reasoning on both cognitive and affective components
of the user’s state of mind, as in the case of the simulation of empathic behavior. As a
baseline for developing such a behavior, we look at available definitions of empathy.
Empathy is seen as the ability to perceive, understand and experience what others are feeling,
and to communicate such an understanding to them [22]. Baron-Cohen distinguishes between
cognitive and affective empathy. Cognitive empathy refers to the understanding of how another
feels, while affective empathy represents an active emotional reaction to another’s affective
state [23]. In the field of HRI, researchers have demonstrated the benefits of empathy in robot
behavior design [<xref ref-type="bibr" rid="ref7">7</xref>]. Many of them address only the affective dimension of empathy [24].
However, the cognitive component seems to be relevant to attribute an empathic behavior to a
robot.
      </p>
      <p>According to the results of a previous study [10], the perception of empathy increases when
the robot shows that it understands the reason for the user’s state. Therefore, it is important to
endow the robot with the capability of recognizing the emotional state of the user as precisely
as possible, because a wrong recognition may compromise empathy. On the other hand, it is
also important that the generated behaviors are accurately designed. To this aim, we defined
the architecture of the robot’s reasoning (Figure 1) by including:
• the recognition of the user’s affective state starting from their behavior;
• the feeling generated by this situation in the robot’s mind, by endowing the robot with
beliefs about its own emotions as a consequence of what has been recognized;
• the triggering of an empathic goal, if necessary;
• the planning and execution of the behavior that is most appropriate to the user’s state.</p>
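<p>The four steps above can be sketched as a linear perceive-feel-trigger-plan loop. All function bodies below are toy stand-ins of ours; the real components would be the affective user model, the robot's affective model, the goal-triggering module, and the planner described in the text.</p>

```python
# Toy sketch of the four-step reasoning architecture (illustrative only).

def recognize_affect(observation):
    """Step 1: infer the user's affective state from observed behavior."""
    return observation.get("expression", "neutral")

def robot_feeling(user_state):
    """Step 2: the feeling this situation generates in the robot's mind."""
    return {"sadness": "concern", "happiness": "joy"}.get(user_state, "calm")

def trigger_goal(feeling):
    """Step 3: trigger an empathic goal only if necessary."""
    return "console" if feeling == "concern" else None

def plan_behavior(goal):
    """Step 4: plan the behavior most appropriate to the user's state."""
    plans = {"console": ["soft voice", "comforting gesture", "supportive words"]}
    return plans.get(goal, ["continue current activity"])

user_state = recognize_affect({"expression": "sadness"})
goal = trigger_goal(robot_feeling(user_state))
print(plan_behavior(goal)[0])  # soft voice
```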
      <p>
        The Affective User Modeling component is dedicated to the inference of a particular state
of mind of the user, starting from the analysis of the combination of facial expressions with
speech prosody. Typically, the accuracy of Facial Expression Recognition (FER) systems
is affected by many factors, among which is age [25, 26], because of age-related changes in
the face. Recently, Deep Learning (DL) algorithms, like Convolutional Neural Networks, which
allow automated feature learning, have been successfully employed in several CV applications,
achieving remarkable results on various tasks [27]. However, DL algorithms used in FER
systems have been experimentally validated mostly on young faces, since the most commonly used
training datasets contain few examples of older faces. For this reason, in order to train
the FER module of Pepper4Elderly, we plan to create a new dataset by enriching the FACES
dataset [28] with new older faces taken from videos.
      </p>
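<p>A quick way to see the imbalance that motivates enriching the training set is to measure the age-group distribution of the data. The labels and counts below are made-up examples for illustration, not statistics of FACES or any real dataset.</p>

```python
# Measuring age-group balance in a hypothetical FER training set.
from collections import Counter

def age_distribution(samples):
    """Fraction of samples per age group."""
    counts = Counter(s["age_group"] for s in samples)
    total = sum(counts.values())
    return {group: n / total for group, n in counts.items()}

# Made-up example: 80 young faces vs. 20 older faces.
dataset = [{"age_group": "young"}] * 80 + [{"age_group": "old"}] * 20
dist = age_distribution(dataset)
print(dist["old"])  # 0.2 -- older faces are under-represented
```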
      <p>The robot’s affective reasoner will implement a computational model of emotion triggering
based on an extension of the model proposed in [10], in which we model the robot’s empathic
feelings as a DBN. According to the robot’s beliefs about the situation, an empathic goal may be
triggered by the Empathic Goal Triggering module. This phase will be based on an extension
of the BDI model that, besides rational beliefs, also includes emotions, thus becoming an EBDI
model [15]. Then, the selection of the plan to be executed by the robot will also be driven by the
emotions recognized in the user and triggered in the robot’s affective model. The selected plan
will then be executed by the robot by generating the most appropriate combination of verbal
and non-verbal communicative actions, in combination with service execution. According to
the BDI approach, this cycle includes both deliberative and reactive reasoning, thus allowing
the generation of robot behaviors appropriate to the situation.</p>
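<p>A DBN update can be illustrated, at its simplest, as one step of Bayesian belief revision over the robot's empathic feeling given the observed user emotion. The feeling labels and probability values below are illustrative assumptions of ours, standing in for the full DBN-based model.</p>

```python
# Minimal single-step Bayesian update, a stand-in for the DBN of empathic
# feelings. All labels and probabilities are illustrative assumptions.

def update(prior, likelihood, observation):
    """Posterior P(feeling | observation) via Bayes' rule, normalized."""
    post = {f: prior[f] * likelihood[f].get(observation, 0.0) for f in prior}
    z = sum(post.values()) or 1.0
    return {f: p / z for f, p in post.items()}

prior = {"concern": 0.5, "joy": 0.5}               # robot's prior feeling
likelihood = {                                     # P(user emotion | feeling)
    "concern": {"sadness": 0.8, "happiness": 0.1},
    "joy":     {"sadness": 0.2, "happiness": 0.9},
}
posterior = update(prior, likelihood, "sadness")
print(round(posterior["concern"], 2))  # 0.8
```

Iterating this update over time, with the posterior of one step feeding the prior of the next, is the essence of what the dynamic network does.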
    </sec>
    <sec id="sec-4">
      <title>4. Preliminary evaluation</title>
      <p>We are currently working on the definition of the goals that the robot has to pursue in the application
scenario and, consequently, of the behaviors that Pepper should use to interact empathically with
elderly users to reach these goals. These behaviors will be designed with experts in the field
using an approach based on PERSONAs and scenario definition.</p>
      <p>Examples of goals and associated behaviors that we are currently considering include the following
empathic goals:
• console
• encourage and motivate
• congratulate
• play
• calm down</p>
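<p>A possible encoding of the goals listed above is a table mapping each empathic goal to a candidate behavior plan. The communicative acts below are placeholders of ours for illustration; the actual plans will be designed with the domain experts.</p>

```python
# Hypothetical mapping from empathic goals to candidate behavior plans.
BEHAVIOR_PLANS = {
    "console":                ["speak softly", "lean forward", "offer comforting words"],
    "encourage and motivate": ["upbeat voice", "thumbs-up gesture", "suggest a next step"],
    "congratulate":           ["cheerful voice", "applause gesture", "praise the result"],
    "play":                   ["propose a quiz", "play a favorite song"],
    "calm down":              ["slow speech", "guide a breathing exercise"],
}

def select_plan(goal):
    """Fall back to plain service interaction when no empathic goal is active."""
    return BEHAVIOR_PLANS.get(goal, ["continue service interaction"])

print(select_plan("console")[0])  # speak softly
```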
      <p>A team of psychologists will be asked to attend sessions simulating the scenarios
between an actor, playing the role of the elderly user, and the robot. For each scenario the experts
will evaluate the behavior plan in terms of the communicative acts and the verbal or non-verbal signs
used for each communicative act. The results of this preliminary formative evaluation step will
provide useful feedback to refine the underlying model and the robot’s behaviors.</p>
    </sec>
    <sec id="sec-5">
      <title>5. Conclusions and Future Work Directions</title>
      <p>Social robots should take into account affective factors when interacting with elderly users,
especially in caring contexts such as ALHs. In the context of the Pepper4Elderly project we aim at
testing whether endowing the robot with empathic behaviors helps in establishing long-term social
relationships, reinforcing trust and confidence. To do so, we are designing and implementing a
general architecture based on an extension of the BDI model that takes affective
factors into account. In particular, the Pepper robot will recognize the emotional state of the user by analyzing
communicative signals extracted from speech, facial expressions, gestures, and posture, in order
to trigger its own affective state accordingly. The Pepper robot will reason on rational and
emotional beliefs to take a decision by activating and pursuing goals through the execution
of suitable behaviors. The effectiveness of the proposed approach will be initially tested in a
formative phase with domain experts and then with elderly people in an ALH. The goal is to
refine the model and the robot behaviors.
[9] L. Charrier, A. Galdeano, A. Cordier, M. Lefort, Empathy display influence on human-robot
interactions: a pilot study, 2018.
[10] B. D. Carolis, S. Ferilli, G. Palestra, Simulating empathic behavior in a social
assistive robot, Multimedia Tools Appl. 76 (2017) 5073–5094. URL: https://doi.org/10.1007/
s11042-016-3797-0. doi:10.1007/s11042-016-3797-0.
[11] H. Bui, N. Y. Chong, An integrated approach to human-robot-smart environment
interaction interface for ambient assisted living, in: 2018 IEEE Workshop on Advanced Robotics
and its Social Impacts (ARSO), 2018, pp. 32–37.
[12] C. D. Napoli, S. Rossi, A layered architecture for socially assistive robotics as a service, in:
2019 IEEE International Conference on Systems, Man and Cybernetics (SMC), 2019, pp.
352–357.
[13] N. Casiddu, A. Cesta, G. Cortellessa, A. Orlandini, C. Porfirione, A. Divano, E. Micheli,
M. Zallio, Robot Interface Design: The Giraff Telepresence Robot for Social Interaction,
volume 11, 2015, pp. 499–509. doi:10.1007/978-3-319-18374-9_46.
[14] M. Manca, F. Paternò, C. Santoro, E. Zedda, C. Braschi, R. Franco, A. Sale, The impact of
serious games with humanoid robots on mild cognitive impairment older adults, International
Journal of Human-Computer Studies (2020).
[15] A. S. Rao, M. P. Georgeff, BDI agents: From theory to practice, in: Proceedings of the First</p>
      <p>International Conference on Multi-Agent Systems (ICMAS-95), 1995, pp. 312–319.
[16] A. E. Nicholson, J. M. Brady, Dynamic belief networks for discrete monitoring, IEEE</p>
      <p>Transactions on Systems, Man, and Cybernetics 24 (1994) 1593–1610.
[17] C. Freksa, Fuzzy logic: An interface between logic and human reasoning, IEEE Expert 9
(1994) 20–21.
[18] B. D. Carolis, N. Novielli, Recognizing signals of social attitude in interacting with ambient
conversational systems, Multimodal User Interfaces 8 (2014) 43–60.
[19] J. Broekens, M. Heerink, H. Rosendal, Assistive social robots in elderly care: A review,</p>
      <p>Gerontechnology 8 (2009) 94–103. doi:10.4017/gt.2009.08.02.002.00.
[20] S. Bahadori, A. Cesta, G. Grisetti, L. Iocchi, R. Leone, D. Nardi, A. Oddi, F. Pecora, R. Rasconi,
Robocare: Pervasive intelligence for the domestic care of the elderly, in: AI*IA Magazine
Special Issue, 2003.
[21] N. Chen, J. Song, B. Li, Providing aging adults social robots’ companionship in home-based
elder care, Journal of Healthcare Engineering 2019 (2019) 1–7. doi:10.1155/2019/2726837.
[22] R. Picard, Toward machines with emotional intelligence, 2004, pp. 29–30. doi:10.1093/
acprof:oso/9780195181890.003.0016.
[23] S. Baron-Cohen, The Science of Evil: On Empathy and the Origins of Cruelty, Basic Books,</p>
      <p>NY, 2011.
[24] A. Paiva, I. Leite, H. Boukricha, I. Wachsmuth, Empathy in virtual agents and robots: A
survey, ACM Trans. Interact. Intell. Syst. 7 (2017). URL: https://doi.org/10.1145/2912150.
doi:10.1145/2912150.
[25] G. Guo, R. Guo, X. Li, Facial expression recognition influenced by human aging, IEEE</p>
      <p>Transactions on Affective Computing 4 (2013) 291–298.
[26] S. Wang, S. Wu, Z. Gao, Q. Ji, Facial expression recognition through modeling age-related
spatial patterns, Multimedia Tools and Applications 75 (2015) 3937–3954.
[27] D. Yu, L. Deng, Deep learning and its applications to signal and information processing
[exploratory dsp], IEEE Signal Processing Magazine 28 (2011) 145–154.
[28] N. Ebner, M. Riediger, U. Lindenberger, Faces—a database of facial expressions in young,
middle-aged, and older women and men: Development and validation, Behavior research
methods 42 (2010) 351–62. doi:10.3758/BRM.42.1.351.</p>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>
          [1]
          <string-name>
            <given-names>A. X.</given-names>
            <surname>Li</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Florendo</surname>
          </string-name>
          ,
          <string-name>
            <given-names>L. E.</given-names>
            <surname>Miller</surname>
          </string-name>
          ,
          <string-name>
            <given-names>H.</given-names>
            <surname>Ishiguro</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A. P.</given-names>
            <surname>Saygin</surname>
          </string-name>
          ,
          <article-title>Robot form and motion influences social attention</article-title>
          ,
          <source>in: Proc. Int. Conf. on Human-Robot Interaction, HRI '15</source>
          ,
          Association for Computing Machinery, New York, NY, USA,
          <year>2015</year>
          , p.
          <fpage>43</fpage>
          -
          <lpage>50</lpage>
          . URL: https://doi.org/10.1145/2696454.2696478. doi:10.1145/2696454.2696478.
        </mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>
          [2]
          <string-name>
            <given-names>S. M.</given-names>
            <surname>Anzalone</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>Boucenna</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>Ivaldi</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Chetouani</surname>
          </string-name>
          ,
          <article-title>Evaluating the engagement with social robots</article-title>
          ,
          <source>International Journal of Social Robotics</source>
          <volume>7</volume>
          (
          <year>2015</year>
          )
          <fpage>465</fpage>
          -
          <lpage>478</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>
          [3]
          <string-name>
            <given-names>M. L.</given-names>
            <surname>Traeger</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S. Strohkorb</given-names>
            <surname>Sebo</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Jung</surname>
          </string-name>
          ,
          <string-name>
            <given-names>B.</given-names>
            <surname>Scassellati</surname>
          </string-name>
          ,
          <string-name>
            <given-names>N. A.</given-names>
            <surname>Christakis</surname>
          </string-name>
          ,
          <article-title>Vulnerable robots positively shape human conversational dynamics in a human-robot team</article-title>
          ,
          <source>Proc. of the National Academy of Sciences</source>
          <volume>117</volume>
          (
          <year>2020</year>
          )
          <fpage>6370</fpage>
          -
          <lpage>6375</lpage>
          . URL: https://www.pnas.org/content/117/12/6370. doi:10.1073/pnas.1910402117.
        </mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>
          [4]
          <string-name>
            <given-names>J.</given-names>
            <surname>Decety</surname>
          </string-name>
          ,
          <string-name>
            <given-names>I.</given-names>
            <surname>Bartal</surname>
          </string-name>
          ,
          <string-name>
            <given-names>F.</given-names>
            <surname>Uzefovsky</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Knafo-Noam</surname>
          </string-name>
          ,
          <article-title>Empathy as a driver of prosocial behaviour: Highly conserved neurobehavioural mechanisms across species</article-title>
          ,
          <source>Philosophical Transactions of the Royal Society B: Biological Sciences</source>
          <volume>371</volume>
          (
          <year>2016</year>
          )
          <fpage>20150077</fpage>
          . doi:10.1098/rstb.2015.0077.
        </mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>
          [5]
          <string-name>
            <given-names>F.</given-names>
            <surname>de Vignemont</surname>
          </string-name>
          , T. Singer,
          <article-title>The empathic brain: how, when</article-title>
          and why?,
          <source>Trends in Cognitive Sciences</source>
          <volume>10</volume>
          (
          <year>2006</year>
          )
          <fpage>435</fpage>
          -
          <lpage>441</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>
          [6]
          <string-name>
            <given-names>C.</given-names>
            <surname>Anderson</surname>
          </string-name>
          ,
          <string-name>
            <given-names>D.</given-names>
            <surname>Keltner</surname>
          </string-name>
          ,
          <article-title>The role of empathy in the formation and maintenance of social bonds</article-title>
          ,
          <source>Behavioral and Brain Sciences</source>
          <volume>25</volume>
          (
          <year>2002</year>
          )
          <fpage>21</fpage>
          -
          <lpage>22</lpage>
          .
          doi:10.1017/S0140525X02230010.
        </mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>
          [7]
          <string-name>
            <given-names>I.</given-names>
            <surname>Leite</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Pereira</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>Mascarenhas</surname>
          </string-name>
          ,
          <string-name>
            <given-names>C.</given-names>
            <surname>Martinho</surname>
          </string-name>
          ,
          <string-name>
            <given-names>R.</given-names>
            <surname>Prada</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Paiva</surname>
          </string-name>
          ,
          <article-title>The influence of empathy in human-robot relations</article-title>
          ,
          <source>International Journal of Human-Computer Studies</source>
          <volume>71</volume>
          (
          <year>2012</year>
          ). doi:10.1016/j.ijhcs.2012.09.005.
        </mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>
          [8]
          <string-name>
            <given-names>A.</given-names>
            <surname>Paiva</surname>
          </string-name>
          , I. Leite,
          <string-name>
            <given-names>H.</given-names>
            <surname>Boukricha</surname>
          </string-name>
          ,
          <string-name>
            <given-names>I.</given-names>
            <surname>Wachsmuth</surname>
          </string-name>
          ,
          <article-title>Empathy in virtual agents and robots: A survey</article-title>
          ,
          <source>ACM Trans. Interact. Intell. Syst</source>
          .
          <volume>7</volume>
          (
          <year>2017</year>
          ). URL: https://doi.org/10.1145/2912150. doi:10.1145/2912150.
        </mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>