<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta />
    <article-meta>
      <title-group>
        <article-title>Toward the Integration of Perception and Knowledge Reasoning: An Adaptive Rehabilitation Scenario</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Alessandro Umbrico</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Alessandra Sorrentino</string-name>
          <xref ref-type="aff" rid="aff1">1</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Filippo Cavallo</string-name>
          <xref ref-type="aff" rid="aff1">1</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Laura Fiorini</string-name>
          <xref ref-type="aff" rid="aff1">1</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Andrea Orlandini</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Amedeo Cesta</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <aff id="aff0">
          <label>0</label>
          <institution>CNR - Istituto di Scienze e Tecnologie della Cognizione</institution>
          ,
          <addr-line>Roma</addr-line>
          ,
          <country country="IT">Italia</country>
        </aff>
        <aff id="aff1">
          <label>1</label>
          <institution>Scuola Superiore Sant'Anna</institution>
          ,
          <addr-line>Pisa</addr-line>
          ,
          <country country="IT">Italia</country>
        </aff>
      </contrib-group>
      <abstract>
        <p>Social Robotics is a research field aiming at designing robots able to interact with people in a natural manner. Within the domain of Socially Assistive Robotics, the capability of adapting and personalizing the behaviors and assistive services of robots according to the specific assistive context and needs of a person is crucial to improve the efficacy of user support and hence acceptance. The authors rely on some recent results concerning the realization of a cognitive control approach for assistive robots supporting the synthesis of personalized and flexible assistive behaviors. This paper takes into account a general rehabilitation scenario and presents some initial steps toward the integration of perception, knowledge representation and planning capabilities to pursue flexibility, adaptation and personalization of assistive robot behaviors.</p>
      </abstract>
      <kwd-group>
        <kwd>Socially Assistive Robotics</kwd>
        <kwd>Knowledge Representation and Reasoning</kwd>
        <kwd>Perception and Machine Learning</kwd>
        <kwd>Artificial Intelligence</kwd>
      </kwd-group>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>Introduction</title>
      <p>
        The research field of Socially Assistive Robotics (SAR) aims at designing robots
capable of assisting fragile users, supporting their daily living activities while also relying
on social interaction features [
        <xref ref-type="bibr" rid="ref11">11</xref>
        ]. Robots of this kind are meant to provide people with
continuous support and assistance, possibly facing a significant number of
heterogeneous tasks [
        <xref ref-type="bibr" rid="ref16">16</xref>
        ] such as reminding a person of dietary restrictions and medical
appointments or monitoring his/her heart rate or sleep quality. To
this purpose, adaptivity constitutes a key capability. Personalization and
adaptation features in robotic architectures are strongly required to effectively address
the specific needs of a person as well as to achieve a good level of acceptance
[
        <xref ref-type="bibr" rid="ref17 ref19">17, 19</xref>
        ]. To this aim, a key point is to see Socially Assistive Robots (SAR) as
complex systems (SAR systems) merging requirements that come from end-users
(or patients), secondary users (e.g., caregivers or health-care professionals) and
the social robot itself.
Copyright © 2020 for this paper by its authors. Use permitted under Creative
Commons License Attribution 4.0 International (CC BY 4.0).
      </p>
      <p>
        End-users have specific health-related needs determining the type of
assistance they need. Secondary users see in a SAR system a means to improve their
quality of work and facilitate communication with patients. Previous experiences
in domestic assistance scenarios, e.g., [
        <xref ref-type="bibr" rid="ref10 ref6 ref7">6, 10, 7</xref>
        ], have clearly pointed out
the role of SAR systems as technological means capable of performing assistive
functionalities that can support the daily living of an assisted person as well as
support the proactive intervention of external/third persons, e.g., caregivers
or health-care professionals. Finally, a social robot itself has specific capabilities,
e.g., autonomous navigation, object manipulation, multi-modal interaction
and so on, that may determine the set of assistive services the resulting SAR
system can actually support.
      </p>
      <p>
        In this context, a SAR control system should integrate a large variety of
(possibly conflicting) requirements. To satisfy such requirements in a flexible and
adaptable way, SAR systems should encapsulate a number of cognitive capabilities to
autonomously: (i) reason about these requirements; (ii) find out the most suitable
set of assistive services needed in a specific context; and (iii) configure and adapt
these services in order to achieve the desired assistive objectives. To endow an
assistive robot with such cognitive capabilities, we have recently started a research
initiative called KOaLa (Knowledge-based cOntinuous Loop) aimed at developing
a novel cognitive control architecture for assistive robots [
        <xref ref-type="bibr" rid="ref2 ref22 ref3">22, 2, 3</xref>
        ]. The developed
prototype has been evaluated in simulated domestic assistance scenarios,
showing the desired level of proactivity, personalization and adaptation of
robot behaviors [
        <xref ref-type="bibr" rid="ref21">21</xref>
        ].
      </p>
      <p>A recently started research project called SI-Robotics (SocIal ROBOTics for
active and healthy ageing) presents a number of interesting assistive challenges
that represent a good opportunity to enhance and evaluate the capabilities of
KOaLa in realistic assistive scenarios. Taking into account the challenges raised
by a generic rehabilitation scenario where an assistive robot should support
both therapists and patients, the contributions of the paper consist of: (i) a
general presentation of SI-Robotics and the considered applicative scenarios; (ii)
a discussion of Human-Robot Interaction issues related to adaptive SAR systems;
(iii) a high-level cognitive control architecture for SAR extending KOaLa with
the integration of brain-inspired perception and emotion recognition capabilities.</p>
    </sec>
    <sec id="sec-2">
      <title>The SI-Robotics Project</title>
      <p>SI-Robotics is an Italian research project whose aim is to design and develop
novel solutions of collaborative assistive robotics capable of supporting humans
in healthcare scenarios and interacting with them in a socially acceptable way.
The scienti c objective of the project is to investigate and develop advanced
software and robotic solutions for assisting seniors in a variety of situations that
range from daily-home living support to continuous monitoring of health-related
conditions, possibly facilitating the early detection of cognitive decline, e.g.,
early dementia or mild cognitive impairment.</p>
      <p>[Figure 1: conceptual view of SI-Robotics. Services: residential scenarios
(teleservice, health monitoring, coaching, cognitive stimulation) and hospital
scenarios (welcoming and orientation, therapy and rehabilitation support,
patient monitoring). AI technologies: knowledge representation and reasoning,
ontology, common-sense reasoning, natural language processing, sensor data
fusion, decision making, planning and execution, learning and adaptation.]</p>
      <p>SI-Robotics identifies the integration of core technologies like Robotics,
Internet of Things (IoT) and Artificial Intelligence (AI) as the key enabling feature
to realize an innovative SAR system capable of synthesizing flexible assistive
behaviors tailored to the specific needs of primary users.</p>
      <p>Figure 1 shows a conceptual view of SI-Robotics. Central to the project is the
development of novel sensorized robotic platforms. On the left side of the figure,
there are a number of heterogeneous assistive services and scenarios considered
within the project. Such services are supported by integrating a number of
advanced AI technologies on a novel modular robotic platform.
Specifically, two types of scenarios are considered in SI-Robotics: (i) residential
scenarios and (ii) hospital scenarios.</p>
      <p>Residential scenarios concern assistive services where the assistance is
usually carried out in restricted environments like social houses or the house of the
patient. The assistive services are mainly targeted at supporting the daily home
living of a person over a long temporal horizon. The envisaged services in this
context are: (i) Teleservice, or teleassistance, where the robot acts as a
communication channel allowing external persons, e.g., doctors or relatives, to contact
and talk to the target senior; (ii) Health monitoring, where the robot, through a
number of physiological and environmental sensing devices, continuously
monitors the activities and health parameters of the target senior and proactively
triggers alerts and/or notifications when some "not regular" event happens; (iii)
Coaching, where the robot is meant to support the target senior in continuing
his/her rehabilitation therapy when he/she is back from hospital; (iv)
Cognitive stimulation, where the robot continuously interacts with the target person
and constantly stimulates and evaluates his/her cognitive capabilities through a
number of dedicated games properly integrated into the system (gamification).</p>
      <p>Hospital scenarios concern assistive services where the assistance is carried
out in a public environment (i.e., a hospital) and where the robot could interact
simultaneously with several target seniors. However, unlike the other scenarios,
the assistance and the interactions needed to realize the services span over a
reduced temporal horizon and concern: (i) Welcoming and orientation, where
the robot is placed at the entrance of the hospital and is in charge of realizing
social functionalities to interact with different people, providing useful
information; (ii) Rehabilitation support, where the robot is meant to support a therapist
in performing rehabilitation tasks with one or more seniors; (iii) Patient
monitoring, where the robot is in charge of constantly monitoring the health
conditions of bedridden patients, autonomously identifying critical conditions and
promptly notifying/alerting the medical staff.</p>
    </sec>
    </sec>
    <sec id="sec-3">
      <title>Adaptive Assistance through Human-Robot Interaction</title>
      <p>
        The scenarios and services of SI-Robotics are well suited to evaluate the holistic
approach pursued within KOaLa [
        <xref ref-type="bibr" rid="ref22">22</xref>
        ] and to enhance the developed cognitive
capabilities [
        <xref ref-type="bibr" rid="ref21">21</xref>
        ]. Figure 2 gives a general view of the proposed holistic approach.
      </p>
      <p>[Figure 2: the multi-perspective approach. Environment perspective: IoT
devices and smart environments. Autonomy perspective: skills, navigation and
communication. Interaction perspective: user categories. Personalization
perspective: user needs and features.]</p>
      <p>The left side of Figure 2 shows the technological features of a SAR system. Here, we
have two perspectives that characterize the capabilities of the system. The
environment perspective concerns perception capabilities, taking into account
IoT devices, e.g., environmental or physiological sensors, that allow the
system to gather information about the environment and the state of the assisted
person. The autonomy perspective concerns the skills, the operations and the
autonomy levels of the assistive robot, determining the possible interactions with
the environment.</p>
      <p>The right side of Figure 2 instead characterizes the behavioral features of a
SAR system. The related two perspectives determine the set of assistive services
needed for the considered users as well as the "shape" of the robot behaviors
that carry out such services. The interaction perspective concerns the definition
of the different types of user that interact with the system. The personalization
perspective concerns the identification of the health-related and cognitive features
that affect the modalities of interaction between the robot and the person who
receives the assistance. It determines the parameters/constraints to consider in
order to realize effective assistive behaviors.</p>
      <p>Given this multi-perspective approach, and considering again the assistive
services of SI-Robotics, a particularly interesting one is the Rehabilitation
Support service for hospital scenarios. Such a service requires a robot to support both a
patient and a therapist during the execution of a general rehabilitation therapy. This
service presents a number of interesting aspects with respect to behavior
adaptation and human-robot interaction. First of all, the robot should be capable of
interacting with two different types of user (i.e., the patient and the therapist),
providing them with different information and functionalities. Then, the robot
should be capable of facing a variety of situations in terms of types of
rehabilitation procedure and health conditions of the assisted person to monitor. Namely,
the robot should be capable of supporting several rehabilitation therapies and
monitoring different technical and physiological parameters to assess the correct
execution of planned exercises.</p>
      <p>To achieve the desired quality of assistance, we conceive a three-step
procedure consisting of: (i) a configuration and training step, allowing the robot
to interact with the therapist and learn the exercises composing the rehabilitation
procedure, the technical parameters/features and the quality metrics to
monitor for exercise assessment; (ii) a profiling step, allowing the robot to interact
with the patient and learn his/her health-related needs and cognitive
capabilities, which represent additional information to consider during the execution of the
exercises; (iii) a monitoring and control step, allowing the robot to show the
patient the exercises he/she must perform, to monitor the parameters/features
configured for the assessment and to intervene when necessary.</p>
      <sec id="sec-3-1">
        <title>Rehabilitation Configuration and Training</title>
        <p>The role of the robot is to support the therapist in the administration of
rehabilitation exercises. The therapist trains the robot in profiling the user (and
his/her rehabilitation) and in providing data about correct executions of the
rehabilitation therapy.</p>
        <p>During the first step of the configuration, the therapist inserts generic data
about the user (i.e., name, age, gender, nationality) as well as clinical data related
to his/her health condition. As a second step, the list of exercises and the parameters of
interest are configured by the therapist himself/herself. These data are important to
determine the kind of interaction the user is more prone to and to assess the
rehabilitation procedure. In the training phase, the robot learns the features
to monitor during the execution in order to evaluate observed performances.
During this phase, the robot takes a passive role in the rehabilitation procedure:
it stands close to the therapist and collects data.</p>
        <p>We can suppose that the robot is endowed with an internal representation of
rehabilitation exercises and the associated physiological and physical parameters
that are considered for performance assessment. The therapist configures the
robot by selecting the exercises the robot is going to "learn" and the parameters
to observe for the assessment. During the training phase, the robot monitors the
exercises through the selected parameters. Then, the therapist enters his/her own
evaluation at the end of each exercise. In this way, the robot internally builds a
"training set" enabling the autonomous evaluation of the (known) exercises.</p>
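        <p>The training loop just described can be sketched as a simple supervised
data-collection procedure. The following Python sketch is purely illustrative, not
the project's implementation: the exercise name, the feature names and the
nearest-centroid evaluator are assumptions made for the example.</p>

```python
import statistics

# Illustrative sketch of the configuration-and-training step: the therapist
# selects the parameters to observe, labels each monitored execution, and
# the robot builds a "training set" it can later evaluate against.

class ExerciseTrainer:
    """Collects therapist-labelled observations of a configured exercise."""

    def __init__(self, exercise, features):
        self.exercise = exercise          # e.g. "arm-raise" (hypothetical name)
        self.features = features          # parameters selected by the therapist
        self.samples = []                 # (feature_vector, therapist_label)

    def record(self, observation, therapist_label):
        """Store one monitored execution with the therapist's evaluation."""
        vector = [observation[f] for f in self.features]
        self.samples.append((vector, therapist_label))

    def centroid(self, label):
        """Mean feature vector of all samples sharing a label."""
        vectors = [v for v, l in self.samples if l == label]
        return [statistics.mean(col) for col in zip(*vectors)]

    def evaluate(self, observation):
        """Label a new execution by its nearest labelled centroid."""
        vector = [observation[f] for f in self.features]
        labels = {l for _, l in self.samples}

        def dist(c):
            return sum((a - b) ** 2 for a, b in zip(vector, c))

        return min(labels, key=lambda l: dist(self.centroid(l)))

trainer = ExerciseTrainer("arm-raise", ["elbow_angle", "heart_rate"])
trainer.record({"elbow_angle": 170, "heart_rate": 85}, "good")
trainer.record({"elbow_angle": 95, "heart_rate": 90}, "bad")
print(trainer.evaluate({"elbow_angle": 160, "heart_rate": 88}))  # good
```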
      </sec>
      <sec id="sec-3-2">
        <title>Patient Profiling</title>
        <p>To assess the pro le of the patient, the assistive robot collects information about
the health-related needs of the patient who performs the therapy. It is important
to know physical as well as cognitive capabilities and features of a patient in
order to properly interpret observed behaviors. The importance of collecting these
kinds of information is twofold. From one side, it allows the robot to update
the information of the patient and make it accessible to the therapist(re ning
phase). Furthermore, it is important to automatically customize the
rehabilitation exercises the patient needs to perform. For example, patients a ected by
short-term memory loss can be constantly reminded by the robot about the
exercises they are going to perform, explain the steps that compose the rehabilitation
procedure and how they should be executed.</p>
        <p>On the other side, this information is crucial to allow the robot to interact
with the patient in an e ective way and to explain decisions and possible changes
in the execution or in the structure of the proposed therapy. For example, the
robot can prefer a text-based interaction modality if the patient is a ected by
hearing impairments or a voice-based interaction modality if the patient is
affected by eyesight impairments.</p>
        <p>More in general, knowing health-related needs of a patient allows a robot to
justify exercises with respect to the health conditions of the patient but also to
recognize hazardous or critical situations and ask the intervention of a therapist.
3.3</p>
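        <p>The modality-selection rule given as an example above can be written as a
small function. The profile keys are hypothetical and only illustrate the idea;
the actual patient model of the project may differ.</p>

```python
# Illustrative rule: prefer text for hearing-impaired users, voice for
# eyesight-impaired users, and fall back to multi-modal interaction.
# Profile keys are invented for the example.

def choose_modality(profile):
    """Select an interaction modality from a patient profile."""
    if profile.get("hearing_impairment"):
        return "text"
    if profile.get("eyesight_impairment"):
        return "voice"
    return "multimodal"

assert choose_modality({"hearing_impairment": True}) == "text"
assert choose_modality({"eyesight_impairment": True}) == "voice"
```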
      </sec>
      <sec id="sec-3-3">
        <title>Rehabilitation Execution and Control</title>
        <p>The tasks performed by the robot in this phase require the ability to monitor
the performance and the engagement of the patient who is executing the exercise.
The robot needs to make the user feel safe and comfortable while exercising. According to
the outcomes of the monitoring activities, the robot may decide to interact with the
patient in different ways during the execution of the exercise. For example, the
robot can encourage the patient if an exercise has been performed well or,
vice versa, can interrupt and explain the exercise again if the patient performs
poorly. The following subsections detail the monitoring procedures.
Performance Monitoring. From a technical point of view, the robot monitors
the patient's performance by collecting data from visual sensors (i.e., cameras)
and, if requested by the therapist, wearable sensors (i.e., inertial sensors). The
data are then displayed on the screen to provide feedback to the patient. At the
same time, the online analysis of physiological data helps the robot assess
the quality of the performance. Based on this analysis, the robot can identify three
scenarios, as shown in Figure 3:
1. Good performance: the patient is correctly performing the rehabilitation
exercise and the values of the physiological parameters are uncritical (Figure
3(a));
2. Bad performance: the patient is not correctly performing the rehabilitation
exercise (Figure 3(b));
3. Alert performance: the patient is correctly exercising but the physiological
parameters have reached critical values (Figure 3(c)).</p>
        <p>[Figure 3: the robot's on-screen feedback (heart rate 80.0, respiratory
rate 12) in scenarios (a), (b) and (c).]</p>
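        <p>The three-way assessment above can be sketched as a small classifier. The
parameter names and threshold ranges below are illustrative assumptions, not
clinically validated values or the project's actual thresholds.</p>

```python
# Sketch of the three-scenario assessment: combine exercise correctness
# with physiological readings. Ranges are invented for illustration.

CRITICAL_RANGES = {"heart_rate": (50, 120), "respiratory_rate": (10, 25)}

def assess(exercise_correct, physiological):
    """Map execution correctness and physiological readings onto the
    three scenarios of Figure 3: 'good', 'bad' or 'alert'."""
    critical = any(
        not (lo <= physiological[p] <= hi)
        for p, (lo, hi) in CRITICAL_RANGES.items()
    )
    if exercise_correct and not critical:
        return "good"          # Figure 3(a)
    if not exercise_correct:
        return "bad"           # Figure 3(b)
    return "alert"             # Figure 3(c): correct but critical values

print(assess(True, {"heart_rate": 80, "respiratory_rate": 12}))   # good
print(assess(True, {"heart_rate": 140, "respiratory_rate": 12}))  # alert
```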
        <p>Engagement Monitoring. Engagement monitoring allows the robot to
modify its interaction plan based on the feelings expressed by the patient during
the exercise. Based on the scenarios described in the previous subsection, if the
robot notices that the patient is correctly exercising, it will keep motivating
him/her, like a personal coach. Otherwise, if the robot recognizes that the patient
is incorrectly performing the rehabilitation task, it will slow down the exercise
and suggest the correct execution. In case the user is not motivated in
performing the rehabilitation, the robot will try to get his/her attention in order to
increase the engagement. When the alert-performance scenario is detected, the robot
will stop the rehabilitation session and comfort the patient with some routine
questions.</p>
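        <p>The interaction policy just described can be summarized as a decision rule
over the performance scenario and the detected engagement level. The action
names below are illustrative labels, not the project's actual behavior set.</p>

```python
# Sketch of the engagement-monitoring policy: the robot's next interaction
# depends on the performance scenario and the engagement level.

def next_action(performance, engagement):
    """performance: 'good' | 'bad' | 'alert'; engagement: 'low' | 'high'."""
    if performance == "alert":
        # Stop the session and comfort the patient with routine questions.
        return "stop_session_and_comfort"
    if performance == "bad":
        # Slow down the exercise and demonstrate the correct execution.
        return "slow_down_and_demonstrate"
    if engagement == "low":
        # Try to regain the patient's attention.
        return "stimulate_attention"
    # Correct execution and engaged patient: keep motivating, like a coach.
    return "keep_motivating"
```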
      </sec>
    </sec>
    <sec id="sec-4">
      <title>AI-based Cognitive Control</title>
      <p>To realize the desired assistive services, it is necessary to design and develop an
advanced intelligent control system capable of implementing and integrating the
numerous cognitive capabilities needed to successfully achieve the desired
objectives. Starting from the cognitive architecture defined within KOaLa, advanced
perception capabilities are needed to gather and process data from different
sensing devices (e.g., video-cameras, environmental sensors and physiological
sensors). Representation and abstraction capabilities are needed to integrate
and contextualize sensory information in order to recognize different "assistive
contexts" and situations, allowing the robot to incrementally build a kind of
consciousness. Based on this knowledge, decision-making and acting
capabilities are needed to decide which high-level action to perform and synthesize an
interaction plan to proactively support an end-user.</p>
      <p>
        Taking inspiration from research in cognitive architecture [
        <xref ref-type="bibr" rid="ref1 ref14 ref15">15, 14, 1</xref>
        ], we
propose the integration of three AI-based processes implementing these
capabilities. Specifically, we propose an extension to KOaLa (Knowledge-based
cOntinuous Loop), a knowledge-based framework for cognitive control [
        <xref ref-type="bibr" rid="ref3 ref4">4, 3</xref>
        ], in order to integrate additional sensory information and enhance the
reasoning and acting capabilities of the approach.
      </p>
      <p>Figure 4 shows a schematic representation of a cognitive architecture for
adaptive rehabilitation. The cognitive control approach relies on three main
modules: (i) a perception module, in charge of realizing the raw-data processing
mechanisms needed to extract useful information from sensory inputs; (ii) a
knowledge module, in charge of encoding information about the therapy and
rehabilitation exercises, metrics for performance evaluation and the patient's state in
terms of health-related needs and emotions; (iii) a planning and acting module,
in charge of making decisions about how to support the rehabilitation
therapy, selecting the exercises to perform and the interactions needed to correct or
support the rehabilitation.</p>
      <p>[Figure 4: architecture overview. Perception: engagement monitoring,
performance monitoring, therapy measures. Knowledge: KB, exercise monitoring,
physiological state, emotional state, exercise support and adaptation. Planning
&amp; Acting: goal reasoning, therapy synthesis, therapy execution.]</p>
      <p>
        The perception module allows the system to assess the emotional and engagement state of the
user, while he/she is performing the exercise. In detail, the perceptual system
aims at converting raw data coming from the sensory equipment into behavioural
patterns. The perceptual system presented in this work resembles the abstraction
process occurring in the human brain. The capability of the brain to process
stimuli from the environment is mimicked by the interconnection of three modules,
denoted as: thalamus, sensory cortex and associative cortex. The functionality
of each module shows analogies with the abilities of the corresponding
neural structure in human beings. Namely, the thalamus module is responsible for
gathering sensory data. The sensory cortex is composed of multiple modules, one
for each sensory modality, which extract the features of interest used by the
associative cortex to assess the multi-modal pattern describing the human's
behaviour. The flow of information characterizing the overall system recalls the
inherent human ability of automatically assessing the mental state, feelings and
other personal traits, as described by the Theory of Mind [
        <xref ref-type="bibr" rid="ref12">12</xref>
        ]. The final output
of this system is the assessment of specific emotional and engagement states. Among the
emotional states described by [
        <xref ref-type="bibr" rid="ref9">9</xref>
        ], joy, anger, fear, sadness and surprise are of
interest. The engagement state of the user is adopted to assess additional
feelings, like boredom (low level of engagement) and excitement (high level of
engagement).
      </p>
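      <p>The brain-inspired pipeline described above can be sketched as three
cooperating components. The module interfaces, sensor names and the toy fusion
rule below are assumptions for illustration; the actual perceptual system relies
on learned models rather than hand-written rules.</p>

```python
# Minimal sketch of the thalamus / sensory cortex / associative cortex
# pipeline: gather raw data, extract per-modality features, fuse them
# into a behavioural pattern.

class Thalamus:
    """Gathers raw data from every registered sensor, keyed by modality."""
    def __init__(self, sensors):
        self.sensors = sensors            # {modality: callable returning raw data}

    def gather(self):
        return {m: read() for m, read in self.sensors.items()}

class SensoryCortex:
    """One feature extractor per sensory modality."""
    def __init__(self, extractors):
        self.extractors = extractors      # {modality: raw -> feature dict}

    def extract(self, raw):
        return {m: self.extractors[m](raw[m]) for m in raw}

class AssociativeCortex:
    """Fuses per-modality features into a multi-modal behavioural pattern."""
    def assess(self, features):
        merged = {}
        for modality_features in features.values():
            merged.update(modality_features)
        # Toy fusion rule (assumption): high engagement needs both a
        # visible smile and an elevated heart rate.
        engaged = merged.get("smiling") and merged.get("heart_rate", 0) > 75
        return {"engagement": "high" if engaged else "low"}

thalamus = Thalamus({"camera": lambda: "frame", "wearable": lambda: 82})
cortex = SensoryCortex({
    "camera": lambda frame: {"smiling": True},   # stand-in for a vision model
    "wearable": lambda bpm: {"heart_rate": bpm},
})
pattern = AssociativeCortex().assess(cortex.extract(thalamus.gather()))
print(pattern)  # {'engagement': 'high'}
```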
      <sec id="sec-4-1">
        <title>Ontology-based Representation</title>
        <p>To support the desired level of interaction and adaptability, the robot should
exchange information with users, understand information and
instructions from them, and correctly interpret "signals" from the environment. It
is therefore necessary to endow an assistive robot with some sort of knowledge
in order to deal with information about rehabilitation exercises and the
parameters considered for the assessment. To characterize such knowledge, we follow
an ontology-based approach to define a clear semantics for the general concepts
and properties the robot deals with in the considered rehabilitation scenarios.</p>
        <p>
          To this aim, we will extend the KOaLa ontology, previously designed for
domestic assistance [
          <xref ref-type="bibr" rid="ref3">3</xref>
          ], by introducing concepts and properties needed to manage
the needed information. The KOaLa ontology relies on the DOLCE foundational
ontology [
          <xref ref-type="bibr" rid="ref13">13</xref>
          ] and the SSN ontology [
          <xref ref-type="bibr" rid="ref5">5</xref>
          ]. Concepts and properties related to
rehabilitation exercises are modeled as DOLCE:Process and characterized in terms
of their effects on some physical/physiological parameters of a person. In this
regard, the KOaLa ontology integrates a representation of the ICF [
          <xref ref-type="bibr" rid="ref18">18</xref>
          ]
classification proposed by the WHO (World Health Organization) to generally describe
the health-related conditions of persons. This knowledge is crucial in this context
to build profiles of end-users, but also to link rehabilitation exercises to the
health-related parameters to monitor for the assessment. This knowledge is also crucial
for supporting explainability [
          <xref ref-type="bibr" rid="ref8">8</xref>
          ]. Using the ICF taxonomy, the robot can indeed
explain and motivate the exercises, as well as positive or negative evaluations, to
end-users by taking into account their health-related needs.
        </p>
        <p>The KOaLa ontology is also extended with a representation of the
emotional states of a person that can be recognized by the integrated perception
capabilities. This knowledge allows the robot to maintain an internal representation
of the mental state of the assisted person and contextualize observations
accordingly.</p>
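        <p>To illustrate the kind of knowledge involved, the following sketch uses a
deliberately simple, dictionary-based stand-in for the ontological model:
exercises modelled as processes whose effects are linked to ICF-style health
parameters. All class, property and individual names are invented for the
example; the real KOaLa ontology builds on DOLCE and SSN and uses proper
ontology machinery rather than Python dictionaries.</p>

```python
# Toy knowledge base linking exercises (processes) to the health-related
# parameters they affect, and patient profiles to ICF-style impairments.
# Names are illustrative, not the actual KOaLa/DOLCE/SSN vocabulary.

KB = {
    "ArmRaise": {
        "type": "RehabilitationExercise",   # modelled as a DOLCE-style process
        "affects": ["JointMobility", "HeartRate"],
        "suitable_for": ["ReducedJointMobility"],
    },
}

PATIENTS = {
    "patient-1": {"impairments": ["ReducedJointMobility"], "emotion": "joy"},
}

def parameters_to_monitor(exercise):
    """Health-related parameters linked to an exercise for assessment."""
    return KB[exercise]["affects"]

def explain_exercise(exercise, patient):
    """Justify an exercise against the patient's ICF-style profile,
    supporting the explainability discussed in the text."""
    reasons = [
        imp for imp in PATIENTS[patient]["impairments"]
        if imp in KB[exercise]["suitable_for"]
    ]
    return f"{exercise} addresses: {', '.join(reasons) or 'general fitness'}"

print(parameters_to_monitor("ArmRaise"))   # ['JointMobility', 'HeartRate']
print(explain_exercise("ArmRaise", "patient-1"))
```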
      </sec>
      <sec id="sec-4-2">
        <title>Holistic Reasoning for Robot Behavior Synthesis</title>
        <p>
          The ontology defines a clear semantics for the heterogeneous concepts and
properties the robot must deal with in different assistive scenarios. This semantics
guides the knowledge reasoning processes that link the perception module to the
planning and acting module of Figure 4. KOaLa reasoning processes interpret and
contextualize perception information according to the semantics defined by the
ontology [
          <xref ref-type="bibr" rid="ref2 ref20">2, 20</xref>
          ]. Such processes continuously refine an internal Knowledge Base
(KB) which characterizes the assistive scenario with respect to different
abstraction levels and perspectives.
        </p>
      </sec>
    </sec>
    <sec id="sec-5">
      <title>Conclusions and Future Works</title>
      <p>This paper presents some initial design efforts aimed at realizing a novel cognitive
control system for SAR systems within the Italian research project SI-Robotics.
The paper focuses on a specific assistive scenario for rehabilitation where an
assistive robot is supposed to support therapists as well as patients during the
execution of some exercises. The contribution of the paper consists in proposing
an initial integrated view of perception, knowledge representation and acting
capabilities. Perception capabilities should extract information useful for the
assessment of an exercise (correctness and engagement). This information should
then be modeled by a dedicated ontology and integrated into a knowledge-based
control architecture (KOaLa) to dynamically adapt the behavior of an assistive
robot to the specific health-related needs and state of a person. Next steps will
push toward the concrete integration and development of a prototype that can be
evaluated in real rehabilitation scenarios.</p>
    </sec>
    <sec id="sec-6">
      <title>Acknowledgements</title>
      <p>Research supported by the "SocIal ROBOTics for active and healthy ageing"
(SI-ROBOTICS) project funded by the Italian "Ministero dell'Istruzione, dell'
Università e della Ricerca" under the framework "PON 676 - Ricerca e Innovazione
2014-2020", Grant Agreement ARS01 01120.</p>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>
          1.
          <string-name>
            <surname>Anderson</surname>
            ,
            <given-names>J.R.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Matessa</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Lebiere</surname>
            ,
            <given-names>C.</given-names>
          </string-name>
          :
          ACT-R
          :
          <article-title>A theory of higher level cognition and its relation to visual attention</article-title>
          .
          <source>Hum.-Comput. Interact</source>
          .
          <volume>12</volume>
          (
          <issue>4</issue>
          ),
          <volume>439</volume>
          –462 (Dec
          <year>1997</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>
          2.
          <string-name>
            <surname>Cesta</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Cortellessa</surname>
            ,
            <given-names>G.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Orlandini</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Umbrico</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          :
          <article-title>A Cognitive Loop for Assistive Robots - Connecting Reasoning on Sensed Data to Acting</article-title>
          .
          <source>In: RO-MAN. The 27th IEEE International Symposium on Robot and Human Interactive Communication</source>
          . pp.
          <fpage>826</fpage>
          –
          <lpage>831</lpage>
          (
          <year>2018</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>
          3.
          <string-name>
            <surname>Cesta</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Cortellessa</surname>
            ,
            <given-names>G.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Orlandini</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Sorrentino</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Umbrico</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          :
          <article-title>A Semantic Representation of Sensor Data to Promote Proactivity in Home Assistive Robotics</article-title>
          . In:
          <string-name>
            <surname>Arai</surname>
            ,
            <given-names>K.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Kapoor</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Bhatia</surname>
            ,
            <given-names>R.</given-names>
          </string-name>
          (eds.)
          <source>Intelligent Systems and Applications</source>
          , pp.
          <fpage>750</fpage>
          –
          <lpage>769</lpage>
          . Springer International Publishing, Cham
          (
          <year>2019</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>
          4.
          <string-name>
            <surname>Cesta</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Cortellessa</surname>
            ,
            <given-names>G.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Orlandini</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Umbrico</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          :
          <article-title>Will Robin ever help "Nonna Lea" using Artificial Intelligence</article-title>
          ? In:
          <string-name>
            <surname>Leone</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Caroppo</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Rescio</surname>
            ,
            <given-names>G.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Diraco</surname>
            ,
            <given-names>G.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Siciliano</surname>
            ,
            <given-names>P.</given-names>
          </string-name>
          (eds.)
          <source>Ambient Assisted Living</source>
          . pp.
          <fpage>181</fpage>
          –
          <lpage>191</lpage>
          . Springer International Publishing, Cham
          (
          <year>2019</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>
          5.
          <string-name>
            <surname>Compton</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Barnaghi</surname>
            ,
            <given-names>P.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Bermudez</surname>
            ,
            <given-names>L.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>García-Castro</surname>
            ,
            <given-names>R.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Corcho</surname>
            ,
            <given-names>O.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Cox</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Graybeal</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Hauswirth</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Henson</surname>
            ,
            <given-names>C.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Herzog</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Huang</surname>
            ,
            <given-names>V.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Janowicz</surname>
            ,
            <given-names>K.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Kelsey</surname>
            ,
            <given-names>W.D.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Phuoc</surname>
            ,
            <given-names>D.L.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Lefort</surname>
            ,
            <given-names>L.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Leggieri</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Neuhaus</surname>
            ,
            <given-names>H.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Nikolov</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Page</surname>
            ,
            <given-names>K.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Passant</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Sheth</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Taylor</surname>
            ,
            <given-names>K.</given-names>
          </string-name>
          :
          <article-title>The SSN ontology of the W3C semantic sensor network incubator group</article-title>
          .
          <source>Web Semantics: Science, Services and Agents on the World Wide Web</source>
          <volume>17</volume>
          (Supplement C),
          <fpage>25</fpage>
          –
          <lpage>32</lpage>
          (
          <year>2012</year>
          ), http://www.sciencedirect.com/science/article/pii/S1570826812000571
        </mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>
          6.
          <string-name>
            <surname>Coradeschi</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Cesta</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Cortellessa</surname>
            ,
            <given-names>G.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Coraci</surname>
            ,
            <given-names>L.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Gonzalez</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Karlsson</surname>
            ,
            <given-names>L.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Furfari</surname>
            ,
            <given-names>F.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Loutfi</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Orlandini</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Palumbo</surname>
            ,
            <given-names>F.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Pecora</surname>
            ,
            <given-names>F.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>von Rump</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Stimec</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Ullberg</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Östlund</surname>
            ,
            <given-names>B.</given-names>
          </string-name>
          :
          <article-title>GiraffPlus: Combining social interaction and long term monitoring for promoting independent living</article-title>
          .
          <source>In: The 6th International Conference on Human System Interactions (HSI)</source>
          . pp.
          <fpage>578</fpage>
          –
          <lpage>585</lpage>
          (
          <year>2013</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>
          7.
          <string-name>
            <surname>Cortellessa</surname>
            ,
            <given-names>G.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Fracasso</surname>
            ,
            <given-names>F.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Sorrentino</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Orlandini</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Bernardi</surname>
            ,
            <given-names>G.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Coraci</surname>
            ,
            <given-names>L.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>De Benedictis</surname>
            ,
            <given-names>R.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Cesta</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          :
          <article-title>ROBIN, a Telepresence Robot to Support Older Users Monitoring and Social Inclusion: Development and Evaluation</article-title>
          .
          <source>Telemedicine and e-Health</source>
          <volume>24</volume>
          (
          <issue>2</issue>
          ),
          <fpage>145</fpage>
          –
          <lpage>154</lpage>
          (
          <year>2018</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>
          8.
          <string-name>
            <surname>Došilović</surname>
            ,
            <given-names>F.K.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Brčić</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Hlupić</surname>
            ,
            <given-names>N.</given-names>
          </string-name>
          :
          <article-title>Explainable artificial intelligence: A survey</article-title>
          .
          <source>In: 2018 41st International Convention on Information and Communication Technology, Electronics and Microelectronics (MIPRO)</source>
          . pp.
          <fpage>0210</fpage>
          –
          <lpage>0215</lpage>
          (May
          <year>2018</year>
          ). https://doi.org/10.23919/MIPRO.2018.8400040
        </mixed-citation>
      </ref>
      <ref id="ref9">
        <mixed-citation>
          9.
          <string-name>
            <surname>Ekman</surname>
            ,
            <given-names>P.</given-names>
          </string-name>
          :
          <article-title>An argument for basic emotions</article-title>
          .
          <source>Cognition &amp; Emotion</source>
          <volume>6</volume>
          (
          <issue>3-4</issue>
          ),
          <fpage>169</fpage>
          –
          <lpage>200</lpage>
          (
          <year>1992</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref10">
        <mixed-citation>
          10.
          <string-name>
            <surname>Esposito</surname>
            ,
            <given-names>R.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Fiorini</surname>
            ,
            <given-names>L.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Limosani</surname>
            ,
            <given-names>R.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Bonaccorsi</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Manzi</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Cavallo</surname>
            ,
            <given-names>F.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Dario</surname>
            ,
            <given-names>P.</given-names>
          </string-name>
          :
          <article-title>Supporting active and healthy aging with advanced robotics integrated in smart environment</article-title>
          . In:
          <article-title>Optimizing assistive technologies for aging populations</article-title>
          , pp.
          <fpage>46</fpage>
          –
          <lpage>77</lpage>
          . IGI Global
          (
          <year>2016</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref11">
        <mixed-citation>
          11.
          <string-name>
            <surname>Feil-Seifer</surname>
            ,
            <given-names>D.J.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Mataric</surname>
            ,
            <given-names>M.J.</given-names>
          </string-name>
          :
          <article-title>Defining socially assistive robotics</article-title>
          .
          <source>9th International Conference on Rehabilitation Robotics, 2005. ICORR 2005</source>
          . pp.
          <fpage>465</fpage>
          –
          <lpage>468</lpage>
          (
          <year>2005</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref12">
        <mixed-citation>
          12.
          <string-name>
            <surname>Frith</surname>
            ,
            <given-names>U.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Frith</surname>
            ,
            <given-names>C.D.</given-names>
          </string-name>
          :
          <article-title>Development and neurophysiology of mentalizing</article-title>
          .
          <source>Philosophical Transactions of the Royal Society of London. Series B: Biological Sciences</source>
          <volume>358</volume>
          (
          <issue>1431</issue>
          ),
          <fpage>459</fpage>
          –
          <lpage>473</lpage>
          (
          <year>2003</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref13">
        <mixed-citation>
          13.
          <string-name>
            <surname>Gangemi</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Guarino</surname>
            ,
            <given-names>N.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Masolo</surname>
            ,
            <given-names>C.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Oltramari</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Schneider</surname>
            ,
            <given-names>L.</given-names>
          </string-name>
          :
          <article-title>Sweetening ontologies with DOLCE</article-title>
          . In:
          <string-name>
            <surname>Gomez-Perez</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Benjamins</surname>
            ,
            <given-names>V.R.</given-names>
          </string-name>
          (eds.)
          <source>Knowledge Engineering and Knowledge Management: Ontologies and the Semantic Web</source>
          . pp.
          <fpage>166</fpage>
          –
          <lpage>181</lpage>
          . Springer Berlin Heidelberg, Berlin, Heidelberg (
          <year>2002</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref14">
        <mixed-citation>
          14.
          <string-name>
            <surname>Laird</surname>
            ,
            <given-names>J.E.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Newell</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Rosenbloom</surname>
            ,
            <given-names>P.S.</given-names>
          </string-name>
          :
          <article-title>Soar: An architecture for general intelligence</article-title>
          .
          <source>Artificial Intelligence</source>
          <volume>33</volume>
          (
          <issue>1</issue>
          ),
          <fpage>1</fpage>
          –
          <lpage>64</lpage>
          (
          <year>1987</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref15">
        <mixed-citation>
          15.
          <string-name>
            <surname>Langley</surname>
            ,
            <given-names>P.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Laird</surname>
            ,
            <given-names>J.E.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Rogers</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          :
          <article-title>Cognitive architectures: Research issues and challenges</article-title>
          .
          <source>Cognitive Systems Research</source>
          <volume>10</volume>
          (
          <issue>2</issue>
          ),
          <fpage>141</fpage>
          –
          <lpage>160</lpage>
          (
          <year>2009</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref16">
        <mixed-citation>
          16.
          <string-name>
            <surname>Mataric</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Tapus</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Feil-Seifer</surname>
            ,
            <given-names>D.</given-names>
          </string-name>
          :
          <article-title>Personalized socially assistive robotics</article-title>
          .
          <source>In: Workshop on Intelligent Systems for Assisted Cognition</source>
          (Oct
          <year>2007</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref17">
        <mixed-citation>
          17.
          <string-name>
            <surname>Moro</surname>
            ,
            <given-names>C.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Nejat</surname>
            ,
            <given-names>G.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Mihailidis</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          :
          <article-title>Learning and personalizing socially assistive robot behaviors to aid with activities of daily living</article-title>
          .
          <source>ACM Trans. Hum.-Robot Interact.</source>
          <volume>7</volume>
          (
          <issue>2</issue>
          ),
          <fpage>15:1</fpage>
          –
          <lpage>15:25</lpage>
          (
          <year>2018</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref18">
        <mixed-citation>
          18.
          <string-name>
            <surname>World Health Organization</surname>
          </string-name>
          :
          <article-title>International classification of functioning, disability and health: ICF</article-title>
          . Geneva: World Health Organization (
          <year>2001</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref19">
        <mixed-citation>
          19.
          <string-name>
            <surname>Rossi</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Ferland</surname>
            ,
            <given-names>F.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Tapus</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          :
          <article-title>User profiling and behavioral adaptation for HRI: A survey</article-title>
          .
          <source>Pattern Recognition Letters</source>
          <volume>99</volume>
          ,
          <fpage>3</fpage>
          –
          <lpage>12</lpage>
          (
          <year>2017</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref20">
        <mixed-citation>
          20.
          <string-name>
            <surname>Umbrico</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Cesta</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Cortellessa</surname>
            ,
            <given-names>G.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Orlandini</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          :
          <article-title>A goal triggering mechanism for continuous human-robot interaction</article-title>
          . In:
          <string-name>
            <surname>Ghidini</surname>
            ,
            <given-names>C.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Magnini</surname>
            ,
            <given-names>B.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Passerini</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Traverso</surname>
            ,
            <given-names>P.</given-names>
          </string-name>
          (eds.)
          <source>AI*IA 2018 – Advances in Artificial Intelligence</source>
          . pp.
          <fpage>460</fpage>
          –
          <lpage>473</lpage>
          . Springer International Publishing, Cham
          (
          <year>2018</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref21">
        <mixed-citation>
          21.
          <string-name>
            <surname>Umbrico</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Cesta</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Cortellessa</surname>
            ,
            <given-names>G.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Orlandini</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          :
          <article-title>A holistic approach to behavior adaptation for socially assistive robots</article-title>
          .
          <source>International Journal of Social Robotics</source>
          (
          <year>2020</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref22">
        <mixed-citation>
          22.
          <string-name>
            <surname>Umbrico</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Cortellessa</surname>
            ,
            <given-names>G.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Orlandini</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Cesta</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          :
          <article-title>Toward intelligent continuous assistance</article-title>
          .
          <source>Journal of Ambient Intelligence and Humanized Computing</source>
          (
          <year>2020</year>
          )
        </mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>