=Paper=
{{Paper
|id=Vol-2702/sample-3col
|storemode=property
|title=Social Assistive Robots in Elderly Care: Exploring the role of Empathy
|pdfUrl=https://ceur-ws.org/Vol-2702/EMPATHY_2020_paper_3.pdf
|volume=Vol-2702
|authors=Paolo Buono,Giovanna Castellano,Berardina Decarolis,Nicola Macchiarulo
|dblpUrl=https://dblp.org/rec/conf/avi/BuonoCCM20
}}
==Social Assistive Robots in Elderly Care: Exploring the role of Empathy==
Paolo Buono, Giovanna Castellano, Berardina Decarolis and Nicola Macchiarulo
Dipartimento di Informatica, Università degli Studi di Bari Aldo Moro, Bari, Italy
Abstract
The COVID-19 emergency has shown that elderly people living in Assisted Living Houses (ALHs) have
been highly exposed to the virus. Besides health problems, during the social distancing restrictions,
the elderly were also strongly affected by loneliness due to a lack of contact with their loved ones.
Innovative solutions for ALHs based on Social Assistive Robotics can reduce the risk of infection and, at the same time, improve the quality of life of elderly people. In this work, after a brief overview of the Pepper4Elderly project, we focus on the role of empathy and affective behaviors in human-robot interaction when the robot is used as a caring agent to assist and entertain the elderly guests of ALHs.
Keywords
Social Assistive Robots, Assisted Living Houses, Pepper, Empathic Behaviour Model
1. Introduction
The COVID-19 emergency has shown that elderly people living in Assisted Living Houses (ALHs) have been highly exposed to the virus. Social Assistive Robots (SARs), thanks to Affective Computing (AC), Computer Vision (CV) and Human-Robot Interaction (HRI) technologies, can support seniors in ALHs in their daily tasks with socially acceptable behaviors. Moreover, with the use of SARs, the workload of ALH staff can be reduced and the safety levels of both operators and elderly people can be increased.
To this aim, the Pepper4Elderly project proposes a solution based on the use of the Pepper
robot acting as a natural and intelligent interface to services specifically designed for assisting
elderly people in ALHs. Pepper4Elderly addresses the ALHs need for innovative solutions
to face COVID-19 emergency management by: i) minimizing the transmission of the virus
and, therefore, mortality; ii) ensuring the protection of guests and health professionals; iii)
maintaining communication between guests and their relatives; iv) offering entertainment and
company to ALH guests and monitoring their mood.
EMPATHY: Empowering People in Dealing with Internet of Things Ecosystems. Workshop co-located with AVI 2020,
Island of Ischia, Italy
email: paolo.buono@uniba.it (P. Buono); giovanna.castellano@uniba.it (G. Castellano);
berardina.decarolis@uniba.it (B. Decarolis); nicola.macchiarulo@uniba.it (N. Macchiarulo)
url: http://ivu.di.uniba.it/people/buono.htm (P. Buono);
https://sites.google.com/site/cilabuniba/people/giovanna-castellano (G. Castellano); http://www.di.uniba.it/~nadja
(B. Decarolis); https://www.researchgate.net/profile/Nicola_Macchiarulo (N. Macchiarulo)
orcid: 0000-0002-1421-3686 (P. Buono); 0000-0002-6489-8628 (G. Castellano); 0000-0002-2689-137X (B. Decarolis);
0000-0002-3754-9991 (N. Macchiarulo)
© 2020 Copyright for this paper by its authors. Use permitted under Creative Commons License Attribution 4.0 International (CC BY 4.0).
CEUR Workshop Proceedings (CEUR-WS.org), ISSN 1613-0073
The general goal of the Pepper4Elderly project is thus to employ the Pepper robot as a natural interface to health-care services, with the added value of establishing a social relation with the user.
Indeed, in assistive environments, social robots are being used for care services since their physical embodiment, their combination of verbal and nonverbal cues, and the possibility of interacting with them naturally increase people's engagement with, and trust in, them [1, 2].
Recent research demonstrated that robots can positively shape human-to-human communication, extending social communication with the introduction of artificial agents and thus making possible hybrid systems composed of humans and virtual agents [3].
However, to increase the acceptance of such a technology, the robot has to go beyond a purely service-oriented response to the user's needs and take into account the establishment of social relationships. Psychologists indicate that affective behaviors, and empathy in particular, have beneficial effects on attitudes and relationships [4]. Empathy has been shown to play a key role in patient-centered care, because it implies understanding the other's inner affective state [5, 6]. Previous works showed that empathic agents and robots are perceived as more caring, likeable, and trustworthy than agents without empathic capabilities [7, 8, 9, 10].
Taking these findings into account, this paper focuses on a specific socio-affective layer that enables Pepper to recognize and monitor the emotions and mood of users, in order to trigger the most appropriate empathic coping strategies.
2. Pepper4Elderly
The problem of taking care of the elderly has become extremely relevant, because significant demographic and social changes have affected our society in the last decades. The COVID-19 emergency emphasized issues regarding both the safety of older people living in caring houses and their loneliness due to isolation.
In this perspective, the use of technologies may improve the quality of life of elderly people
living in AHLs by providing cognitive and physical support, and easy access to the environment
services [11, 12, 13, 14].
Pepper1 is a social robot with the characteristics needed to intervene effectively as a caring assistant in ALHs: it interacts through speech, gestures, colors and sounds, and has a tablet on its chest that can be used for telepresence activities. Pepper can move autonomously after scanning the environment and can support social-health operators in carrying out their tasks, reducing the frequency with which they must come into close contact with patients.
Following the human-in-the-loop paradigm, we propose a solution in which human operators
may be involved in the care process without requiring their physical presence. Operators can
use the robot as an interface to patients and can provide useful feedback to adapt the robot’s
behavior. In addition, the robot will be endowed with autonomous behavior aiming at detecting
and monitoring the states of the elders and, at the same time, interacting with them to execute
exercises or to remind therapies and planned actions.
To increase acceptability, usability and user experience, the robot will be equipped with
behavioral models that make the interaction plausible and engaging.
1 SoftBank Robotics
Computer Vision solutions will be used to analyze the facial expressions of elderly people, in particular those related to affective states, and a model for classifying emotions from speech prosody will be integrated to address the task of multimodal emotion recognition. According to the recognized affective state and to the context of the elderly person, Pepper will reason and act empathically. To this aim, it will be endowed with a computational model of empathy [10]. Such a model distinguishes between cognitive empathy (i.e., understanding how another feels) and affective empathy (i.e., an active emotional reaction to another's affective state), in order to provide a complete definition of empathic behavior.
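As an illustration only, multimodal recognition of this kind is often implemented as a late fusion of the per-modality classifier outputs. The following sketch assumes a hypothetical emotion label set and fusion weights; it is not the project's actual configuration:

```python
# Hedged sketch: weighted late fusion of face and speech-prosody
# emotion probabilities. Labels and weights are illustrative only.

EMOTIONS = ["happiness", "sadness", "anger", "neutral"]

def fuse(face_probs, speech_probs, w_face=0.6, w_speech=0.4):
    """Combine the two per-modality distributions into one."""
    fused = {e: w_face * face_probs[e] + w_speech * speech_probs[e]
             for e in EMOTIONS}
    total = sum(fused.values())
    return {e: p / total for e, p in fused.items()}

def recognize(face_probs, speech_probs):
    """Return the most probable emotion after fusion."""
    fused = fuse(face_probs, speech_probs)
    return max(fused, key=fused.get)

face = {"happiness": 0.1, "sadness": 0.6, "anger": 0.1, "neutral": 0.2}
speech = {"happiness": 0.2, "sadness": 0.5, "anger": 0.2, "neutral": 0.1}
print(recognize(face, speech))  # → sadness
```

A confidence-weighted or learned fusion could replace the fixed weights, but the shape of the computation stays the same: one distribution per modality, one combined decision.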
To model the cognitive aspect of empathy, Pepper will not only recognize an affective state, but also understand what caused it. To this aim, the robot will reason about the situation and, according to an extended Belief-Desire-Intention (BDI) architecture [15] that takes affective factors into account, it will decide which empathic goal to achieve by executing the most appropriate plan of actions. The reasoning will be modeled using consolidated formalisms, such as Dynamic Belief Networks [16] and Fuzzy Logic [17], that are suitable for simulating human reasoning under the uncertainty typical of natural situations that gradually evolve over time [18]. In this way Pepper will be able to simulate both components of empathy, since a social emotion is triggered in the robot as a consequence of its perception of the user's state. The resulting prototype will be tested both by psychology experts and through a user study with elderly guests of an ALH.
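To make the temporal reasoning concrete, here is a toy discrete Bayes filter in the spirit of a Dynamic Belief Network: the belief over a two-state user condition is predicted forward and then updated with a noisy observation at each step. All state names and probability tables below are invented for illustration; they are not the project's model:

```python
# Toy dynamic-belief filter over the user's affective state.
# States, transition model, and sensor model are illustrative only.

STATES = ["calm", "distressed"]

# P(next state | current state): affective states evolve gradually.
TRANSITION = {"calm": {"calm": 0.8, "distressed": 0.2},
              "distressed": {"calm": 0.3, "distressed": 0.7}}

# P(observation | state): the emotion recognizer is noisy.
OBSERVATION = {"calm": {"neutral_face": 0.7, "sad_face": 0.3},
               "distressed": {"neutral_face": 0.2, "sad_face": 0.8}}

def filter_belief(belief, observation):
    """One predict-then-update step of a discrete Bayes filter."""
    predicted = {s: sum(belief[p] * TRANSITION[p][s] for p in STATES)
                 for s in STATES}
    updated = {s: OBSERVATION[s][observation] * predicted[s] for s in STATES}
    norm = sum(updated.values())
    return {s: v / norm for s, v in updated.items()}

belief = {"calm": 0.5, "distressed": 0.5}
for obs in ["sad_face", "sad_face", "neutral_face"]:
    belief = filter_belief(belief, obs)
print(max(belief, key=belief.get))  # → calm
```

Because the update is probabilistic, a single misread frame shifts the belief only gradually, which is exactly the robustness to noisy perception that motivates a DBN over a per-frame classifier.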
In particular, the experimental study will aim at testing two conditions: Empathic Robot
(ER) vs. Non-Empathic Robot (NER) in which Pepper acts only as an interface toward services.
The study will be conducted in an ALH with elderly people. To measure results, specific questionnaires will be developed in cooperation with psychologists who are experts in this field. In addition, we will collect and analyse behavioral data so as to relate seniors' reactions to the robot's behavior.
3. Modeling Empathic Behavior
In the context of the Pepper4Elderly project, we plan to use the Pepper robot as a natural interface towards environment services and, at the same time, as an embodied companion. Several studies confirmed that elderly users like to interact with social robots and to establish a social relation with them [19, 20, 21]. Developing the social component of the interaction requires user models that involve reasoning on both cognitive and affective components of the user's state of mind, as in the simulation of empathic behavior. As a baseline for developing such a behavior, we look at available definitions of empathy.
Empathy is seen as the ability to perceive, understand and experience what others are feeling,
and to communicate such an understanding to them [22]. Baron-Cohen distinguishes between
cognitive and affective empathy. Cognitive empathy refers to the understanding of how another
feels, while affective empathy represents an active emotional reaction to another’s affective
state [23]. In the field of HRI, researchers have demonstrated the benefits of empathy in robot
behavior design [7]. Many of them address only the affective dimension of empathy [24]. However, the cognitive component also seems essential for attributing an empathic behavior to a robot.
Figure 1: The proposed empathic behavior model.
According to the results of a previous study [10], the perception of empathy increases when the robot shows that it understands the reason for the user's state. Therefore, it is important to endow the robot with the capability of recognizing the emotional state of the user as precisely as possible, because a wrong recognition may compromise empathy. On the other hand, it is
also important that the generated behaviors are accurately designed. To this aim, we defined
the architecture of the robot’s reasoning (Figure 1) by including:
• the recognition of the user's affective state starting from their behavior;
• the feeling generated by this situation in the robot’s mind by endowing the robot with
beliefs about its own emotions as a consequence of what has been recognized;
• the triggering of an empathic goal if necessary;
• the planning and execution of the behavior that is most appropriate to the user’s state.
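The four steps above can be sketched as a single perceive-feel-deliberate-act cycle. Every state name, goal, and plan in this sketch is a hypothetical placeholder, not the project's actual rule base:

```python
# Schematic empathic behavior cycle following the four steps above.
# The goal/plan table is a hypothetical placeholder.

EMPATHIC_PLANS = {
    "sadness": ("console", ["say comforting phrase", "lower speech pitch"]),
    "fear": ("calm down", ["speak slowly", "suggest breathing exercise"]),
}

def empathic_cycle(recognized_state):
    """Map a recognized user state to (robot emotion, goal, plan)."""
    # Step 1: the user's affective state arrives from the recognizers.
    # Step 2: form a belief about the robot's own triggered emotion.
    robot_emotion = ("concern" if recognized_state in EMPATHIC_PLANS
                     else "neutral")
    # Step 3: trigger an empathic goal only when one applies.
    if recognized_state not in EMPATHIC_PLANS:
        return robot_emotion, None, []
    goal, plan = EMPATHIC_PLANS[recognized_state]
    # Step 4: return the plan to be executed as behavior.
    return robot_emotion, goal, plan

emotion, goal, plan = empathic_cycle("sadness")
print(goal)  # → console
```

In the actual architecture this table lookup is replaced by probabilistic and BDI-style reasoning, but the flow of information through the four stages is the same.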
The Affective User Modeling component is dedicated to inferring a particular state of mind of the user from the combined analysis of facial expressions and speech prosody. Typically, the accuracy of Facial Expression Recognition (FER) systems is affected by many factors, among which age [25, 26], because of age-related changes in the face. Recently, Deep Learning (DL) algorithms, like Convolutional Neural Networks, which enable automated feature learning, have been successfully employed in several CV applications, achieving strong results on various tasks [27]. However, the DL algorithms used in FER systems have been experimentally validated mostly on young faces, since the most commonly used training datasets contain few examples of older faces. For this reason, in order to train the FER module of Pepper4Elderly, we plan to create a new dataset by enriching the FACES dataset [28] with new older faces taken from videos.
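As background, the core operation these convolutional layers perform is a 2-D convolution that slides a kernel over the image and learns its weights from data. A minimal pure-Python version, with an illustrative hand-set edge kernel instead of learned weights, looks like this:

```python
# Toy 2-D convolution (valid mode), the core operation of the CNN
# layers used in FER systems. Sizes and kernel values are illustrative.

def conv2d(image, kernel):
    """Valid cross-correlation of a 2-D image with a 2-D kernel."""
    kh, kw = len(kernel), len(kernel[0])
    out_h = len(image) - kh + 1
    out_w = len(image[0]) - kw + 1
    out = [[0.0] * out_w for _ in range(out_h)]
    for i in range(out_h):
        for j in range(out_w):
            out[i][j] = sum(image[i + a][j + b] * kernel[a][b]
                            for a in range(kh) for b in range(kw))
    return out

# A vertical-edge kernel responds where intensity changes horizontally,
# e.g. at facial contours and wrinkles relevant to expression analysis.
vertical_edge = [[1, 0, -1],
                 [1, 0, -1],
                 [1, 0, -1]]

# A tiny grayscale patch: bright on the left, dark on the right.
patch = [[10, 10, 0, 0],
         [10, 10, 0, 0],
         [10, 10, 0, 0],
         [10, 10, 0, 0]]

print(conv2d(patch, vertical_edge))
```

A trained CNN stacks many such kernels, learned rather than hand-set, and it is exactly these learned features that risk being biased toward young faces when older faces are rare in the training data.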
The robot's affective reasoner will implement a computational model of emotion triggering based on an extension of the model proposed in [10], in which we model the robot's empathic feelings as a DBN. According to the robot's beliefs about the situation, an empathic goal may be triggered by the Empathic Goal Triggering module. This phase will be based on an extension of the BDI model that, besides rational beliefs, also includes emotions, thus becoming an EBDI model [15]. Then, the selection of the plan to be executed by the robot will be driven both by the recognized emotions of the user and by those triggered in the robot's affective model. The selected plan will then be executed by the robot by generating the most appropriate combination of verbal and non-verbal communicative actions, in combination with service execution. According to the BDI approach, this cycle includes both deliberative and reactive reasoning, thus allowing the generation of robot behaviors appropriate to the situation.
4. Preliminary evaluation
We are currently working on the definition of the goals that the robot has to pursue in the application scenario and, consequently, of the behaviors that Pepper should use to interact empathically with elderly users to reach these goals. These behaviors will be designed with experts in the field using an approach based on PERSONAs and scenario definition. Examples of empathic goals and associated behaviors that we are currently considering are the following:
• console
• encourage and motivate
• congratulate
• play
• calm down
A team of psychologists will be asked to attend sessions simulating the scenarios between an actor, playing the role of the elderly user, and the robot. For each scenario the experts will evaluate the behavior plan in terms of communicative acts and the verbal or non-verbal signs used for each communicative act. The results of this preliminary formative evaluation step will provide useful feedback to refine the underlying model and the robot's behaviors.
5. Conclusions and Future Work Directions
Social robots should take affective factors into account when interacting with elderly users, especially in caring contexts such as ALHs. In the context of the Pepper4Elderly project, we aim at testing whether endowing the robot with empathic behaviors helps in establishing long-term social relationships, reinforcing trust and confidence. To do so, we are designing and implementing a general architecture based on an extension of the BDI model that takes affective factors into account. In particular, the Pepper robot will recognize the emotional state of the user by analyzing communicative signals extracted from speech, facial expressions, gestures, and posture, in order to trigger its own affective state accordingly. The Pepper robot will reason on rational and emotional beliefs to take decisions by activating and pursuing goals through the execution of suitable behaviors. The effectiveness of the proposed approach will be initially tested in a formative phase with domain experts and then with elderly people in an ALH. The goal is to refine the model and the robot's behaviors.
References
[1] A. X. Li, M. Florendo, L. E. Miller, H. Ishiguro, A. P. Saygin, Robot form and motion influences social attention, in: Proc. Int. Conf. on Human-Robot Interaction, HRI '15, Association for Computing Machinery, New York, NY, USA, 2015, pp. 43–50. doi:10.1145/2696454.2696478.
[2] S. M. Anzalone, S. Boucenna, S. Ivaldi, M. Chetouani, Evaluating the engagement with social robots, International Journal of Social Robotics 7 (2015) 465–478.
[3] M. L. Traeger, S. Strohkorb Sebo, M. Jung, B. Scassellati, N. A. Christakis, Vulnerable robots positively shape human conversational dynamics in a human–robot team, Proc. of the National Academy of Sciences 117 (2020) 6370–6375. doi:10.1073/pnas.1910402117.
[4] J. Decety, I. Bartal, F. Uzefovsky, A. Knafo-Noam, Empathy as a driver of prosocial behaviour: Highly conserved neurobehavioural mechanisms across species, Philosophical Transactions of the Royal Society B: Biological Sciences 371 (2016) 20150077. doi:10.1098/rstb.2015.0077.
[5] F. de Vignemont, T. Singer, The empathic brain: how, when and why?, Trends in Cognitive Sciences 10 (2006) 435–441.
[6] C. Anderson, D. Keltner, The role of empathy in the formation and maintenance of social bonds, Behavioral and Brain Sciences 25 (2002) 21–22. doi:10.1017/S0140525X02230010.
[7] I. Leite, A. Pereira, S. Mascarenhas, C. Martinho, R. Prada, A. Paiva, The influence of empathy in human–robot relations, International Journal of Human-Computer Studies 71 (2012). doi:10.1016/j.ijhcs.2012.09.005.
[8] A. Paiva, I. Leite, H. Boukricha, I. Wachsmuth, Empathy in virtual agents and robots: A survey, ACM Trans. Interact. Intell. Syst. 7 (2017). doi:10.1145/2912150.
[9] L. Charrier, A. Galdeano, A. Cordier, M. Lefort, Empathy display influence on human-robot interactions: a pilot study, 2018.
[10] B. De Carolis, S. Ferilli, G. Palestra, Simulating empathic behavior in a social assistive robot, Multimedia Tools Appl. 76 (2017) 5073–5094. doi:10.1007/s11042-016-3797-0.
[11] H. Bui, N. Y. Chong, An integrated approach to human-robot-smart environment interaction interface for ambient assisted living, in: 2018 IEEE Workshop on Advanced Robotics and its Social Impacts (ARSO), 2018, pp. 32–37.
[12] C. D. Napoli, S. Rossi, A layered architecture for socially assistive robotics as a service, in: 2019 IEEE International Conference on Systems, Man and Cybernetics (SMC), 2019, pp. 352–357.
[13] N. Casiddu, A. Cesta, G. Cortellessa, A. Orlandini, C. Porfirione, A. Divano, E. Micheli, M. Zallio, Robot Interface Design: The Giraff Telepresence Robot for Social Interaction, volume 11, 2015, pp. 499–509. doi:10.1007/978-3-319-18374-9_46.
[14] M. Manca, F. Paternò, C. Santoro, E. Zedda, C. Braschi, R. Franco, A. Sale, The impact of serious games with humanoid robots on mild cognitive impairment older adults, International Journal of Human-Computer Studies (2020).
[15] A. S. Rao, M. P. Georgeff, BDI agents: From theory to practice, in: Proceedings of the First International Conference on Multi-Agent Systems (ICMAS-95), 1995, pp. 312–319.
[16] A. E. Nicholson, J. M. Brady, Dynamic belief networks for discrete monitoring, IEEE Transactions on Systems, Man, and Cybernetics 24 (1994) 1593–1610.
[17] C. Freksa, Fuzzy logic: An interface between logic and human reasoning, IEEE Expert 9 (1994) 20–21.
[18] B. De Carolis, N. Novielli, Recognizing signals of social attitude in interacting with ambient conversational systems, Multimodal User Interfaces 8 (2014) 43–60.
[19] J. Broekens, M. Heerink, H. Rosendal, Assistive social robots in elderly care: A review, Gerontechnology 8 (2009) 94–103. doi:10.4017/gt.2009.08.02.002.00.
[20] S. Bahadori, A. Cesta, G. Grisetti, L. Iocchi, R. Leone, D. Nardi, A. Oddi, F. Pecora, R. Rasconi, Robocare: Pervasive intelligence for the domestic care of the elderly, in: AI*IA Magazine Special Issue, 2003.
[21] N. Chen, J. Song, B. Li, Providing aging adults social robots' companionship in home-based elder care, Journal of Healthcare Engineering 2019 (2019) 1–7. doi:10.1155/2019/2726837.
[22] R. Picard, Toward machines with emotional intelligence, 2004, pp. 29–30. doi:10.1093/acprof:oso/9780195181890.003.0016.
[23] S. Baron-Cohen, The Science of Evil: On Empathy and the Origins of Cruelty, Basic Books, NY, 2011.
[24] A. Paiva, I. Leite, H. Boukricha, I. Wachsmuth, Empathy in virtual agents and robots: A survey, ACM Trans. Interact. Intell. Syst. 7 (2017). doi:10.1145/2912150.
[25] G. Guo, R. Guo, X. Li, Facial expression recognition influenced by human aging, IEEE Transactions on Affective Computing 4 (2013) 291–298.
[26] S. Wang, S. Wu, Z. Gao, Q. Ji, Facial expression recognition through modeling age-related spatial patterns, Multimedia Tools and Applications 75 (2015) 3937–3954.
[27] D. Yu, L. Deng, Deep learning and its applications to signal and information processing [exploratory DSP], IEEE Signal Processing Magazine 28 (2011) 145–154.
[28] N. Ebner, M. Riediger, U. Lindenberger, FACES—a database of facial expressions in young, middle-aged, and older women and men: Development and validation, Behavior Research Methods 42 (2010) 351–362. doi:10.3758/BRM.42.1.351.