<?xml version="1.0" encoding="UTF-8"?>
<TEI xml:space="preserve" xmlns="http://www.tei-c.org/ns/1.0" 
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" 
xsi:schemaLocation="http://www.tei-c.org/ns/1.0 https://raw.githubusercontent.com/kermitt2/grobid/master/grobid-home/schemas/xsd/Grobid.xsd"
 xmlns:xlink="http://www.w3.org/1999/xlink">
	<teiHeader xml:lang="en">
		<fileDesc>
			<titleStmt>
				<title level="a" type="main">Toward the Integration of Perception and Knowledge Reasoning: An Adaptive Rehabilitation Scenario</title>
			</titleStmt>
			<publicationStmt>
				<publisher/>
				<availability status="unknown"><licence/></availability>
			</publicationStmt>
			<sourceDesc>
				<biblStruct>
					<analytic>
						<author>
							<persName><forename type="first">Alessandro</forename><surname>Umbrico</surname></persName>
							<affiliation key="aff1">
								<orgName type="department">CNR -Istituto di Scienze e Tecnologie della Cognizione</orgName>
								<address>
									<settlement>Roma</settlement>
									<country key="IT">Italia</country>
								</address>
							</affiliation>
						</author>
						<author>
							<persName><forename type="first">Alessandra</forename><surname>Sorrentino</surname></persName>
							<affiliation key="aff0">
								<orgName type="institution">Scuola Superiore Sant&apos;Anna</orgName>
								<address>
									<settlement>Pisa</settlement>
									<country key="IT">Italia</country>
								</address>
							</affiliation>
						</author>
						<author>
							<persName><forename type="first">Filippo</forename><surname>Cavallo</surname></persName>
							<affiliation key="aff0">
								<orgName type="institution">Scuola Superiore Sant&apos;Anna</orgName>
								<address>
									<settlement>Pisa</settlement>
									<country key="IT">Italia</country>
								</address>
							</affiliation>
						</author>
						<author>
							<persName><forename type="first">Laura</forename><surname>Fiorini</surname></persName>
							<affiliation key="aff0">
								<orgName type="institution">Scuola Superiore Sant&apos;Anna</orgName>
								<address>
									<settlement>Pisa</settlement>
									<country key="IT">Italia</country>
								</address>
							</affiliation>
						</author>
						<author>
							<persName><forename type="first">Andrea</forename><surname>Orlandini</surname></persName>
							<affiliation key="aff1">
								<orgName type="department">CNR -Istituto di Scienze e Tecnologie della Cognizione</orgName>
								<address>
									<settlement>Roma</settlement>
									<country key="IT">Italia</country>
								</address>
							</affiliation>
						</author>
						<author>
							<persName><forename type="first">Amedeo</forename><surname>Cesta</surname></persName>
							<affiliation key="aff1">
								<orgName type="department">CNR -Istituto di Scienze e Tecnologie della Cognizione</orgName>
								<address>
									<settlement>Roma</settlement>
									<country key="IT">Italia</country>
								</address>
							</affiliation>
						</author>
						<title level="a" type="main">Toward the Integration of Perception and Knowledge Reasoning: An Adaptive Rehabilitation Scenario</title>
					</analytic>
					<monogr>
						<imprint>
							<date/>
						</imprint>
					</monogr>
					<idno type="MD5">1600B461D12F67250174B6FEA26F3EC1</idno>
				</biblStruct>
			</sourceDesc>
		</fileDesc>
		<encodingDesc>
			<appInfo>
				<application version="0.7.2" ident="GROBID" when="2023-03-25T05:16+0000">
					<desc>GROBID - A machine learning software for extracting information from scholarly documents</desc>
					<ref target="https://github.com/kermitt2/grobid"/>
				</application>
			</appInfo>
		</encodingDesc>
		<profileDesc>
			<textClass>
				<keywords>
					<term>Socially Assistive Robotics</term>
					<term>Knowledge Representation and Reasoning</term>
					<term>Perception and Machine Learning</term>
					<term>Artificial Intelligence</term>
				</keywords>
			</textClass>
			<abstract>
<div xmlns="http://www.tei-c.org/ns/1.0"><p>Social Robotics is a research field aiming at designing robots able to interact with people in a natural manner. Within the domain of Socially Assistive Robotics, the capability of adapting and personalizing the behaviors and assistive services of robots, according to the specific assistive context and needs of a person, is crucial to improve the efficacy of user support and hence acceptance. The authors rely on some recent results concerning the realization of a cognitive control approach for assistive robots supporting the synthesis of personalized and flexible assistive behaviors. This paper takes into account a general rehabilitation scenario and presents some initial steps toward the integration of perception, knowledge representation and planning capabilities to pursue flexibility, adaptation and personalization of assistive robot behaviors.</p></div>
			</abstract>
		</profileDesc>
	</teiHeader>
	<text xml:lang="en">
		<body>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="1">Introduction</head><p>The research field of Socially Assistive Robotics (SAR) aims at designing robots capable of assisting fragile users, supporting their daily living activities while also relying on Social Interaction features <ref type="bibr" target="#b10">[11]</ref>. Such robots are meant to provide people with continuous support and assistance, possibly facing a significant number of heterogeneous tasks <ref type="bibr" target="#b15">[16]</ref> such as reminding users of dietary restrictions and medical appointments or monitoring their heart rate or sleep quality. To this purpose, adaptivity constitutes a key capability. Personalization and adaptation features in robotic architectures are strongly required to effectively address the specific needs of a person as well as to achieve a good level of acceptance <ref type="bibr" target="#b16">[17,</ref><ref type="bibr" target="#b18">19]</ref>. To this aim, a key point is to see Socially Assistive Robots (SAR) as complex systems (SAR systems) merging requirements that come from end-users (or patients), secondary users (e.g., caregivers or health-care professionals) and the social robot itself.</p><p>End-users have specific health-related needs determining the type of assistance they need. Secondary users see in a SAR system a means to improve their quality of work and facilitate communication with patients. Previous experiences in domestic assistance scenarios, e.g., <ref type="bibr" target="#b5">[6,</ref><ref type="bibr" target="#b9">10,</ref><ref type="bibr" target="#b6">7]</ref>, have clearly pointed out the role of SAR systems as technological means capable of performing assistive functionalities that can support the daily living of an assisted person as well as the proactive intervention of external/third persons, e.g., caregivers or health-care professionals. 
Finally, a social robot itself has specific capabilities, e.g., autonomous navigation, object manipulation, multi-modal interaction and so on, that may determine the set of assistive services the resulting SAR system can actually support.</p><p>In this context, a SAR control system should integrate a large variety of (possibly conflicting) requirements. To satisfy such requirements in a flexible and adaptable way, SAR systems should encapsulate a number of cognitive capabilities to autonomously: i) reason about these requirements; ii) find the most suitable set of assistive services needed in a specific context; and iii) configure and adapt these services in order to achieve the desired assistive objectives. To endow an assistive robot with such cognitive capabilities, we have recently started a research initiative called KOaLa (Knowledge-based cOntinuous Loop) aimed at developing a novel cognitive control architecture for assistive robots <ref type="bibr" target="#b21">[22,</ref><ref type="bibr" target="#b1">2,</ref><ref type="bibr" target="#b2">3]</ref>. The developed prototype has been evaluated in simulated domestic assistance scenarios, showing the desired level of proactivity, personalization and adaptation of robot behaviors <ref type="bibr" target="#b20">[21]</ref>.</p><p>A recently started research project, called SI-Robotics (SocIal ROBOTics for active and healthy ageing), presents a number of interesting assistive challenges that represent a good opportunity to enhance and evaluate the capabilities of KOaLa in realistic assistive scenarios. 
Taking into account the challenges raised by a generic rehabilitation scenario where an assistive robot should support both therapists and patients, the contributions of the paper consist of: (i) a general presentation of SI-Robotics and the considered applicative scenarios; (ii) a discussion of Human-Robot Interaction issues related to adaptive SAR systems; (iii) a high-level cognitive control architecture for SAR extending KOaLa with the integration of brain-inspired perception and emotion recognition capabilities.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="2">The SI-Robotics Project</head><p>SI-Robotics is an Italian research project whose aim is to design and develop novel collaborative assistive robotics solutions capable of supporting humans in healthcare scenarios and interacting with them in a socially acceptable way. The scientific objective of the project is to investigate and develop advanced software and robotic solutions for assisting seniors in a variety of situations that range from daily-home living support to continuous monitoring of health-related conditions, possibly facilitating an early detection of cognitive decline such as early dementia or mild cognitive impairment. SI-Robotics identifies the integration of core technologies like Robotics, Internet of Things (IoT) and Artificial Intelligence (AI) as the key enabling feature to realize an innovative SAR system capable of synthesizing flexible assistive behaviors tailored to the specific needs of primary users. Specifically, two types of scenarios are considered in SI-Robotics: (i) residential scenarios and (ii) hospital scenarios.</p><p>Residential scenarios concern assistive services where the assistance is usually carried out in restricted environments such as social housing or the house of the patient. The assistive services are mainly targeted at supporting the daily home living of a person over a long temporal horizon. The envisaged services in this context are: (i) Teleservice or teleassistance where the robot acts as a communication channel allowing external persons, e.g., doctors or relatives, to contact and talk to the target senior; (ii) Health monitoring where the robot, through a number of physiological and environmental sensing devices, continuously monitors the activities and health parameters of the target senior and proactively triggers alerts and/or notifications when some "not regular" event happens; (iii) Coaching where the robot is meant to support the target senior in continuing his/her rehabilitation therapy when he/she is back from hospital; 
(iv) Cognitive stimulation where the robot continuously interacts with the target person and constantly stimulates and evaluates his/her cognitive capabilities through a number of dedicated games properly integrated into the system (gamification).</p><p>Hospital scenarios concern assistive services where the assistance is carried out in a public environment (i.e., a hospital) and where the robot could interact simultaneously with several target seniors. However, unlike the other scenarios, the assistance and the interactions needed to realize the services span a reduced temporal horizon and concern: (i) Welcoming and orientation where the robot is placed at the entrance of the hospital and is in charge of realizing social functionalities to interact with different people, providing useful information; (ii) Rehabilitation Support where the robot is meant to support a therapist in performing the rehabilitation tasks with one or more seniors; (iii) Patient Monitoring where the robot is in charge of constantly monitoring the health conditions of bedridden patients, autonomously identifying critical conditions and promptly notifying/alerting the medical staff.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="3">Adaptive Assistance through Human-Robot Interaction</head><p>The scenarios and services of SI-Robotics are well suited to evaluate the holistic approach pursued within KOaLa <ref type="bibr" target="#b21">[22]</ref> and to enhance the developed cognitive capabilities <ref type="bibr" target="#b20">[21]</ref>. Figure <ref type="figure" target="#fig_3">2</ref> gives a general view of the proposed holistic approach. The left side shows the technological features of a SAR system. Here, we have two perspectives that characterize the capabilities of the system. The environment perspective concerns perception capabilities, taking into account the IoT devices, e.g., environmental or physiological sensors, that allow the system to gather information about the environment and the state of the assisted person. The autonomy perspective concerns the skills, the operations and the autonomy levels of the assistive robot, determining the possible interactions with the environment.</p><p>The right side of Figure <ref type="figure" target="#fig_3">2</ref> instead characterizes the behavioral features of a SAR system. The related two perspectives determine the set of assistive services needed for the considered users as well as the "shape" of the robot behaviors that carry out such services. The interaction perspective concerns the definition of the different types of user that interact with the system. The personalization perspective concerns the identification of the health-related and cognitive features that affect the modalities of interaction between the robot and the person who receives the assistance. It determines the parameters/constraints to consider in order to realize effective assistive behaviors.</p><p>Given this multi-perspective approach and considering again the assistive services of SI-Robotics, a particularly interesting one is the Rehabilitation Support for hospital scenarios. 
Such a service requires the robot to support both a patient and a therapist during the execution of a generic rehabilitation procedure. This service presents a number of interesting aspects with respect to behavior adaptation and human-robot interaction. First of all, the robot should be capable of interacting with two different types of user (i.e., the patient and the therapist), providing them with different information and functionalities. Then, the robot should be capable of facing a variety of situations in terms of types of rehabilitation procedure and health conditions of the assisted person to monitor. Namely, the robot should be capable of supporting several rehabilitation therapies and monitoring different technical and physiological parameters to assess the correct execution of planned exercises.</p><p>To achieve the desired quality level of assistance we conceive a three-step procedure consisting of: (i) a configuration and training step allowing the robot to interact with the therapist and learn the exercises composing the rehabilitation procedure, the technical parameters/features, and the quality metrics to monitor for exercise assessment; (ii) a profiling step allowing the robot to interact with the patient and learn his/her health-related needs and cognitive capabilities, representing additional information to consider during the execution of the exercises; (iii) a monitoring and control step allowing the robot to show the patient the exercises he/she must perform, to monitor the parameters/features configured for the assessment and to intervene when necessary.</p></div>
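As an illustration, the three-step procedure described above can be sketched as follows. All class, parameter and threshold names are hypothetical assumptions for illustration, not part of the actual SI-Robotics implementation:

```python
from dataclasses import dataclass, field

@dataclass
class RehabSession:
    # Exercise name -> list of monitored parameters, each with an admissible range.
    exercises: dict = field(default_factory=dict)
    patient_profile: dict = field(default_factory=dict)

    def configure(self, exercise, parameters):
        # Step (i): the therapist selects exercises and assessment parameters.
        self.exercises[exercise] = parameters

    def profile(self, needs):
        # Step (ii): health-related needs and cognitive capabilities of the patient.
        self.patient_profile.update(needs)

    def monitor(self, exercise, observation):
        # Step (iii): compare observed values against the configured ranges and
        # decide whether the robot should intervene.
        for param in self.exercises.get(exercise, []):
            low, high = param["range"]
            value = observation.get(param["name"])
            if value is None or not (low <= value <= high):
                return "intervene"
        return "ok"

session = RehabSession()
session.configure("arm-raise", [{"name": "heart_rate", "range": (50, 120)}])
session.profile({"short_term_memory_loss": False})
print(session.monitor("arm-raise", {"heart_rate": 90}))   # ok
print(session.monitor("arm-raise", {"heart_rate": 140}))  # intervene
```

The point of the sketch is that the monitoring step only consumes what the two preceding steps produced: the therapist's configuration and the patient's profile.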
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="3.1">Rehabilitation Configuration and Training</head><p>The role of the robot is to support the therapist in the administration of rehabilitation exercises. The therapist trains the robot in profiling the user (and his/her rehabilitation) and in providing data about correct executions of the rehabilitation therapy.</p><p>During the first step of the configuration, the therapist inserts generic data of the user (i.e., name, age, gender, nationality) as well as clinical data related to his/her health condition. As a second step, the list of exercises and the parameters of interest are configured by the therapist himself. These data are important to determine the kind of interaction the user is more prone to and to assess the rehabilitation procedure. In the training phase, the robot learns the features to monitor during the execution in order to evaluate observed performances. During this phase, the robot plays a passive role in the rehabilitation procedure: it stands close to the therapist and collects data.</p><p>We can suppose that the robot is endowed with an internal representation of rehabilitation exercises and the associated physiological and physical parameters that are considered for performance assessment. The therapist configures the robot by selecting the exercises the robot is going to "learn" and the parameters to observe for the assessment. During the training phase, the robot monitors the exercises through the selected parameters. Then, the therapist enters his/her own evaluation at the end of each exercise. In this way, the robot internally builds a "training set" enabling the autonomous evaluation of the (known) exercises.</p></div>
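A minimal sketch of how such a "training set" could be built and used: each observed execution yields a feature vector paired with the therapist's evaluation, and new executions are later scored by the nearest class centroid. The feature values, labels and the nearest-centroid scheme are illustrative assumptions, not the project's actual learning method:

```python
from collections import defaultdict

training_set = []  # (feature_vector, therapist_label) pairs

def record(features, therapist_label):
    # During training: store the observed features with the therapist's evaluation.
    training_set.append((features, therapist_label))

def centroids():
    # Mean feature vector per evaluation label.
    groups = defaultdict(list)
    for feats, label in training_set:
        groups[label].append(feats)
    return {label: [sum(col) / len(col) for col in zip(*vecs)]
            for label, vecs in groups.items()}

def evaluate(features):
    # Autonomous evaluation: label of the closest centroid (squared distance).
    cents = centroids()
    return min(cents, key=lambda lbl: sum((a - b) ** 2
                                          for a, b in zip(features, cents[lbl])))

record([0.9, 0.8], "good")
record([0.2, 0.3], "poor")
print(evaluate([0.85, 0.75]))  # good
```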
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="3.2">Patient Profiling</head><p>To assess the profile of the patient, the assistive robot collects information about the health-related needs of the patient who performs the therapy. It is important to know the physical as well as cognitive capabilities and features of a patient in order to properly interpret observed behaviors. The importance of collecting these kinds of information is twofold. On the one hand, it allows the robot to update the information about the patient and make it accessible to the therapist (refining phase). Furthermore, it is important to automatically customize the rehabilitation exercises the patient needs to perform. For example, the robot can constantly remind patients affected by short-term memory loss of the exercises they are going to perform, explaining the steps that compose the rehabilitation procedure and how they should be executed.</p><p>On the other hand, this information is crucial to allow the robot to interact with the patient in an effective way and to explain decisions and possible changes in the execution or in the structure of the proposed therapy. For example, the robot can prefer a text-based interaction modality if the patient is affected by hearing impairments or a voice-based interaction modality if the patient is affected by eyesight impairments.</p><p>More generally, knowing the health-related needs of a patient allows a robot to justify exercises with respect to the health conditions of the patient but also to recognize hazardous or critical situations and request the intervention of a therapist.</p></div>
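The profile-driven customization described above can be sketched as a couple of rules. The profile keys and returned modality names are hypothetical, chosen only to mirror the examples in the text:

```python
def interaction_modality(profile):
    # Select the interaction modality from the patient's impairments,
    # as in the examples above.
    if profile.get("hearing_impairment"):
        return "text"      # prefer text-based interaction
    if profile.get("eyesight_impairment"):
        return "voice"     # prefer voice-based interaction
    return "multimodal"

def needs_exercise_reminders(profile):
    # Patients with short-term memory loss are constantly reminded of
    # the exercises they are going to perform.
    return bool(profile.get("short_term_memory_loss"))

print(interaction_modality({"hearing_impairment": True}))  # text
```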
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="3.3">Rehabilitation Execution and Control</head><p>The tasks performed by the robot in this phase require the ability to monitor the performance and the engagement of the patient who is executing the exercise. The robot also needs to make the user feel safe and comfortable while exercising. According to the outcomes of the monitoring activities, the robot may decide to interact with the patient in different ways during the execution of the exercise. For example, the robot can encourage the patient if an exercise has been performed well or, vice versa, can interrupt and explain the exercise again if the patient performs poorly. The following subsections detail the monitoring procedures.</p><p>Performance Monitoring From a technical point of view, the robot monitors the patient's performance by collecting data from visual sensors (i.e., cameras)  </p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head>Engagement Monitoring</head><p>Engagement monitoring allows the robot to modify its interaction plan based on the feelings expressed by the patient during the exercise. Based on the scenario described in the previous subsection, if the robot notices that the patient is exercising correctly, it will keep motivating him/her, like a personal coach. Otherwise, if the robot recognizes that the patient is performing the rehabilitation task incorrectly, it will slow down the exercise and suggest the correct execution. In case the user is not motivated to perform the rehabilitation, the robot will try to get his/her attention in order to increase engagement. When an alert performance scenario is detected, the robot will stop the rehabilitation session and comfort him/her with some routine questions.</p></div>
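The adaptation rules of Section 3.3 can be summarized as a mapping from monitoring outcomes to robot actions. The outcome labels and action names below are illustrative assumptions, not the actual control vocabulary:

```python
def select_robot_action(performance, engagement):
    # Combine performance and engagement monitoring outcomes into an action.
    if performance == "alert":
        # Stop the session and comfort the patient with routine questions.
        return "stop_and_comfort"
    if performance == "incorrect":
        # Slow down the exercise and suggest the correct execution.
        return "slow_down_and_demonstrate"
    if engagement == "low":
        # Try to get the patient's attention to increase engagement.
        return "attract_attention"
    # Correct execution: keep motivating, like a personal coach.
    return "encourage"

print(select_robot_action("correct", "high"))  # encourage
```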
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="4">AI-based Cognitive Control</head><p>To realize the desired assistive services it is necessary to design and develop an advanced intelligent control system capable of implementing and integrating the numerous cognitive capabilities needed to successfully achieve the desired objectives. Starting from the cognitive architecture defined within KOaLa, advanced perception capabilities are needed to gather and process data from different sensing devices (e.g., video-cameras, environmental sensors and physiological sensors). Representation and abstraction capabilities are needed to integrate and contextualize sensory information in order to recognize different "assistive contexts" and situations, allowing the robot to incrementally build a kind of consciousness. According to this knowledge, decision-making and acting capabilities are needed to decide which high-level actions to perform and to synthesize an interaction plan to proactively support an end-user.</p><p>Taking inspiration from research in cognitive architectures <ref type="bibr" target="#b14">[15,</ref><ref type="bibr" target="#b13">14,</ref><ref type="bibr" target="#b0">1]</ref>, we here propose the integration of three AI-based processes implementing the cited capabilities. Specifically, we propose an extension of the knowledge-based framework for cognitive control called KOaLa <ref type="bibr" target="#b3">[4,</ref><ref type="bibr" target="#b2">3]</ref> (Knowledge-based cOntinuous Loop) in order to integrate additional sensory information and enhance the reasoning and acting capabilities of the approach.</p><p>Figure <ref type="figure" target="#fig_5">4</ref> proposes a schematic representation of a cognitive architecture for adaptive rehabilitation. 
The cognitive control approach relies on three main modules: (i) a perception module in charge of realizing the raw-data processing mechanisms needed to extract useful information from sensory inputs; (ii) a knowledge module in charge of encoding information about the therapy and rehabilitation exercises, the metrics for performance evaluation and the patient's state in terms of health-related needs and emotions; (iii) a planning and acting module in charge of making decisions about how to support the rehabilitation therapy, selecting the exercises to perform and the interactions needed to correct or support the rehabilitation.</p><p>The perception module allows the system to assess the emotional and engagement state of the user while he/she is performing the exercise. In detail, the perceptual system aims at converting raw data coming from the sensory equipment into behavioural patterns. The perceptual system presented in this work resembles the abstraction process occurring in the human brain. The capability of the brain to process stimuli from the environment is mimicked by the interconnection of three modules, denoted as thalamus, sensory cortex and associative cortex. The functionality of each module shows analogies with the abilities of the corresponding neural structures of human beings. Namely, the thalamus module is the one responsible for gathering sensory data. The sensory cortex is composed of multiple modules, one for each sensory modality, which extract the features of interest used by the associative cortex to assess the multi-modal pattern describing the human's behaviour. The flow of information characterizing the overall system recalls the inherent human ability of automatically assessing mental states, feelings and other personal traits, as described by the Theory of Mind <ref type="bibr" target="#b11">[12]</ref>. The final output of this system is the assessment of specific emotional and engagement states. 
Among the emotional states described by <ref type="bibr" target="#b8">[9]</ref>, joy, anger, fear, sadness and surprise are of interest. The engagement state of the user is adopted to assess additional feelings, such as boredom (low level of engagement) and excitement (high level of engagement).</p></div>
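The thalamus / sensory cortex / associative cortex pipeline described above can be sketched as three composed stages. The modalities, feature extractors and fusion thresholds are purely illustrative assumptions:

```python
def thalamus(streams):
    # Gather raw sensory data from every available modality.
    return {modality: read() for modality, read in streams.items()}

def sensory_cortex(raw):
    # One feature extractor per sensory modality (illustrative features).
    extractors = {
        "camera": lambda frame: {"smile": frame.get("smile", 0.0)},
        "physio": lambda sample: {"arousal": sample.get("heart_rate", 60) / 60.0},
    }
    return {m: extractors[m](data) for m, data in raw.items() if m in extractors}

def associative_cortex(features):
    # Fuse multi-modal features into an emotional/engagement assessment.
    smile = features.get("camera", {}).get("smile", 0.0)
    arousal = features.get("physio", {}).get("arousal", 1.0)
    emotion = "joy" if smile > 0.5 else "sadness"
    engagement = "high" if arousal > 1.2 else "low"
    return {"emotion": emotion, "engagement": engagement}

streams = {"camera": lambda: {"smile": 0.8}, "physio": lambda: {"heart_rate": 90}}
print(associative_cortex(sensory_cortex(thalamus(streams))))
```

The design point mirrored here is that the thalamus is modality-agnostic, while feature extraction is modality-specific and fusion happens only in the associative stage.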
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="4.2">Ontology-based Representation</head><p>To support the desired level of interaction and adaptability, the robot should exchange information with users, understand information or instructions from them, and correctly interpret "signals" from the environment. It is therefore necessary to endow an assistive robot with some sort of knowledge in order to deal with information about rehabilitation exercises and the parameters considered for the assessment. To characterize such knowledge, we follow an ontology-based approach to define a clear semantics of the general concepts and properties the robot deals with in the considered rehabilitation scenarios.</p><p>To this aim, we will extend the KOaLa ontology, previously designed for domestic assistance <ref type="bibr" target="#b2">[3]</ref>, by introducing the concepts and properties needed to manage the required information. The KOaLa ontology relies on the DOLCE foundational ontology <ref type="bibr" target="#b12">[13]</ref> and the SSN ontology <ref type="bibr" target="#b4">[5]</ref>. Concepts and properties related to rehabilitation exercises are modeled as DOLCE:Process and characterized in terms of their effects on some physical/physiological parameters of a person. In this regard, the KOaLa ontology integrates a representation of the ICF <ref type="bibr" target="#b17">[18]</ref> classification proposed by the WHO (World Health Organization) to generally describe the health-related conditions of persons. This knowledge is crucial in this context to build profiles of end-users but also to link rehabilitation exercises to the health-related parameters to monitor for the assessment. Also, this knowledge is crucial for supporting explainability <ref type="bibr" target="#b7">[8]</ref>. 
Using the ICF taxonomy, the robot can indeed explain and motivate the exercises, as well as positive or negative evaluations, to end-users by taking into account their health-related needs.</p><p>The KOaLa ontology is also extended with a representation of the emotional states of a person that can be recognized by the integrated perception capabilities. This knowledge allows the robot to maintain an internal representation of the mental state of the assisted person and contextualize the observations accordingly.</p></div>
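A minimal triple-based sketch of the kind of knowledge the extended KOaLa ontology is meant to encode: a rehabilitation exercise modeled as a process and linked to the ICF parameters it affects. The exercise name and property names are illustrative assumptions and not the actual ontology vocabulary (the ICF codes b730 "muscle power functions" and b410 "heart functions" are real ICF classes):

```python
# A tiny in-memory triple store standing in for the actual OWL ontology.
kb = {
    ("ArmRaise", "rdf:type", "dolce:Process"),
    ("ArmRaise", "affectsParameter", "icf:b730"),  # ICF: muscle power functions
    ("ArmRaise", "affectsParameter", "icf:b410"),  # ICF: heart functions
    ("icf:b730", "rdf:type", "icf:BodyFunction"),
}

def parameters_of(exercise):
    # The health-related parameters to monitor for a given exercise, derived
    # from the KB; this link is also the hook for explaining *why* an exercise
    # is proposed with respect to the patient's health conditions.
    return sorted(o for s, p, o in kb
                  if s == exercise and p == "affectsParameter")

print(parameters_of("ArmRaise"))  # ['icf:b410', 'icf:b730']
```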
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="4.3">Holistic Reasoning for Robot Behavior Synthesis</head><p>The ontology defines a clear semantics of the heterogeneous concepts and properties the robot must deal with in different assistive scenarios. This semantics guides the knowledge reasoning processes that link the perception module to the planning and acting module of Figure <ref type="figure" target="#fig_5">4</ref>. KOaLa reasoning processes interpret and contextualize perception information according to the semantics defined by the ontology <ref type="bibr" target="#b1">[2,</ref><ref type="bibr" target="#b19">20]</ref>. Such processes continuously refine an internal Knowledge Base (KB) which characterizes the assistive scenario with respect to different abstraction levels and perspectives.</p></div>
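The continuous refinement described above amounts to a sense-reason-act loop connecting the three modules. The sketch below is schematic, with placeholder callbacks rather than the actual KOaLa reasoners:

```python
def cognitive_loop(perceive, interpret, plan, act, kb):
    # One iteration of the continuous loop: perception output refines the
    # Knowledge Base, which in turn drives planning and acting.
    observation = perceive()          # perception module
    kb = interpret(kb, observation)   # knowledge module refines the KB
    act(plan(kb))                     # planning and acting module
    return kb

actions = []
kb = cognitive_loop(
    perceive=lambda: {"engagement": "low"},
    interpret=lambda kb, obs: {**kb, **obs},
    plan=lambda kb: ("attract_attention"
                     if kb.get("engagement") == "low" else "encourage"),
    act=actions.append,
    kb={},
)
print(kb, actions)  # {'engagement': 'low'} ['attract_attention']
```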
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="5">Conclusions and Future Works</head><p>This paper presents some initial design efforts aimed at realizing a novel cognitive control system for SAR systems within the Italian research project SI-Robotics. The paper focuses on a specific assistive scenario for rehabilitation where an assistive robot is supposed to support therapists as well as patients during the execution of some exercises. The contribution of the paper consists in proposing an initial integrated view of perception, knowledge representation and acting capabilities. Perception capabilities should extract information useful for the assessment of an exercise (correctness and engagement). This information should then be modeled by a dedicated ontology and integrated into a knowledge-based control architecture (KOaLa) to dynamically adapt the behavior of an assistive robot to the specific health-related needs and state of a person. Next steps will push toward the concrete integration and development of a prototype that can be evaluated in real rehabilitation scenarios.</p></div><figure xmlns="http://www.tei-c.org/ns/1.0" xml:id="fig_0"><head>•</head><label></label><figDesc>Knowledge Representation and Reasoning • Ontology • Common Sense Reasoning • Natural Language Processing • Sensor Data Fusion • Decision Making • Planning and Execution • Learning and Adaptation</figDesc></figure>
<figure xmlns="http://www.tei-c.org/ns/1.0" xml:id="fig_1"><head>Fig. 1 .</head><label>1</label><figDesc>Fig. 1. Conceptual view of the Italian research project SI-Robotics</figDesc></figure>
<figure xmlns="http://www.tei-c.org/ns/1.0" xml:id="fig_2"><head>Figure 1</head><label>1</label><figDesc>Figure 1 shows a conceptual view of SI-Robotics. Central to the project is the development of novel sensorized robotic platforms. On the left side of the figure, there are a number of heterogeneous assistive services and scenarios considered within the project. Such services are supported by integrating a number of advanced AI technologies on a novel modular robotic platform.</figDesc></figure>
<figure xmlns="http://www.tei-c.org/ns/1.0" xml:id="fig_3"><head>Fig. 2 .</head><label>2</label><figDesc>Fig. 2. Perspectives of Human-Robot interaction</figDesc><graphic coords="4,152.06,410.06,89.61,78.24" type="bitmap" /></figure>
<figure xmlns="http://www.tei-c.org/ns/1.0" xml:id="fig_4"><head>Fig. 3 .</head><label>3</label><figDesc>Fig. 3. Rehabilitation execution under robot control</figDesc><graphic coords="7,185.33,115.83,67.69,67.69" type="bitmap" /></figure>
<figure xmlns="http://www.tei-c.org/ns/1.0" xml:id="fig_5"><head>Fig. 4 .</head><label>4</label><figDesc>Fig. 4. Cognitive approach to adaptive rehabilitation</figDesc></figure>
<figure xmlns="http://www.tei-c.org/ns/1.0" type="table" xml:id="tab_1"><head>Components of the cognitive approach: Perception, Knowledge (KB), Planning &amp; Acting; Therapy Measures, Physiological State, Emotional State, Goal Reasoning, Therapy Synthesis, Therapy Execution, Performance Monitoring, Engagement Monitoring, Exercise Support and Adaptation, Exercise Monitoring</head><label></label><figDesc></figDesc><table /></figure>
		</body>
		<back>

			<div type="acknowledgement">
<div xmlns="http://www.tei-c.org/ns/1.0"><head>Acknowledgements</head><p>Research supported by the "SocIal ROBOTics for active and healthy ageing" (SI-ROBOTICS) project funded by the Italian "Ministero dell'Istruzione, dell'Università e della Ricerca" under the framework "PON Ricerca e Innovazione 2014-2020", Grant Agreement ARS01 01120.</p></div>
			</div>

			<div type="references">

				<listBibl>

<biblStruct xml:id="b0">
	<analytic>
		<title level="a" type="main">ACT-R: A theory of higher level cognition and its relation to visual attention</title>
		<author>
			<persName><forename type="first">J</forename><forename type="middle">R</forename><surname>Anderson</surname></persName>
		</author>
		<author>
			<persName><forename type="first">M</forename><surname>Matessa</surname></persName>
		</author>
		<author>
			<persName><forename type="first">C</forename><surname>Lebiere</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="j">Hum.-Comput. Interact</title>
		<imprint>
			<biblScope unit="volume">12</biblScope>
			<biblScope unit="issue">4</biblScope>
			<biblScope unit="page" from="439" to="462" />
			<date type="published" when="1997-12">Dec 1997</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b1">
	<analytic>
		<title level="a" type="main">A Cognitive Loop for Assistive Robots -Connecting Reasoning on Sensed Data to Acting</title>
		<author>
			<persName><forename type="first">A</forename><surname>Cesta</surname></persName>
		</author>
		<author>
			<persName><forename type="first">G</forename><surname>Cortellessa</surname></persName>
		</author>
		<author>
			<persName><forename type="first">A</forename><surname>Orlandini</surname></persName>
		</author>
		<author>
			<persName><forename type="first">A</forename><surname>Umbrico</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="m">The 27th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN)</title>
		<imprint>
			<date type="published" when="2018">2018</date>
			<biblScope unit="page" from="826" to="831" />
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b2">
	<analytic>
		<title level="a" type="main">A Semantic Representation of Sensor Data to Promote Proactivity in Home Assistive Robotics</title>
		<author>
			<persName><forename type="first">A</forename><surname>Cesta</surname></persName>
		</author>
		<author>
			<persName><forename type="first">G</forename><surname>Cortellessa</surname></persName>
		</author>
		<author>
			<persName><forename type="first">A</forename><surname>Orlandini</surname></persName>
		</author>
		<author>
			<persName><forename type="first">A</forename><surname>Sorrentino</surname></persName>
		</author>
		<author>
			<persName><forename type="first">A</forename><surname>Umbrico</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="m">Intelligent Systems and Applications</title>
				<editor>
			<persName><forename type="first">K</forename><surname>Arai</surname></persName>
		</editor>
		<editor>
			<persName><forename type="first">S</forename><surname>Kapoor</surname></persName>
		</editor>
		<editor>
			<persName><forename type="first">R</forename><surname>Bhatia</surname></persName>
		</editor>
		<meeting><address><addrLine>Cham</addrLine></address></meeting>
		<imprint>
			<publisher>Springer International Publishing</publisher>
			<date type="published" when="2019">2019</date>
			<biblScope unit="page" from="750" to="769" />
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b3">
	<analytic>
		<title level="a" type="main">Will Robin ever help &quot;Nonna Lea&quot; using artificial intelligence?</title>
		<author>
			<persName><forename type="first">A</forename><surname>Cesta</surname></persName>
		</author>
		<author>
			<persName><forename type="first">G</forename><surname>Cortellessa</surname></persName>
		</author>
		<author>
			<persName><forename type="first">A</forename><surname>Orlandini</surname></persName>
		</author>
		<author>
			<persName><forename type="first">A</forename><surname>Umbrico</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="m">Ambient Assisted Living</title>
				<editor>
			<persName><forename type="first">A</forename><surname>Leone</surname></persName>
		</editor>
		<editor>
			<persName><forename type="first">A</forename><surname>Caroppo</surname></persName>
		</editor>
		<editor>
			<persName><forename type="first">G</forename><surname>Rescio</surname></persName>
		</editor>
		<editor>
			<persName><forename type="first">G</forename><surname>Diraco</surname></persName>
		</editor>
		<editor>
			<persName><forename type="first">P</forename><surname>Siciliano</surname></persName>
		</editor>
		<meeting><address><addrLine>Cham</addrLine></address></meeting>
		<imprint>
			<publisher>Springer International Publishing</publisher>
			<date type="published" when="2019">2019</date>
			<biblScope unit="page" from="181" to="191" />
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b4">
	<analytic>
		<title level="a" type="main">The SSN ontology of the W3C semantic sensor network incubator group</title>
		<author>
			<persName><forename type="first">M</forename><surname>Compton</surname></persName>
		</author>
		<author>
			<persName><forename type="first">P</forename><surname>Barnaghi</surname></persName>
		</author>
		<author>
			<persName><forename type="first">L</forename><surname>Bermudez</surname></persName>
		</author>
		<author>
			<persName><forename type="first">R</forename><surname>García-Castro</surname></persName>
		</author>
		<author>
			<persName><forename type="first">O</forename><surname>Corcho</surname></persName>
		</author>
		<author>
			<persName><forename type="first">S</forename><surname>Cox</surname></persName>
		</author>
		<author>
			<persName><forename type="first">J</forename><surname>Graybeal</surname></persName>
		</author>
		<author>
			<persName><forename type="first">M</forename><surname>Hauswirth</surname></persName>
		</author>
		<author>
			<persName><forename type="first">C</forename><surname>Henson</surname></persName>
		</author>
		<author>
			<persName><forename type="first">A</forename><surname>Herzog</surname></persName>
		</author>
		<author>
			<persName><forename type="first">V</forename><surname>Huang</surname></persName>
		</author>
		<author>
			<persName><forename type="first">K</forename><surname>Janowicz</surname></persName>
		</author>
		<author>
			<persName><forename type="first">W</forename><forename type="middle">D</forename><surname>Kelsey</surname></persName>
		</author>
		<author>
			<persName><forename type="first">D</forename><forename type="middle">L</forename><surname>Phuoc</surname></persName>
		</author>
		<author>
			<persName><forename type="first">L</forename><surname>Lefort</surname></persName>
		</author>
		<author>
			<persName><forename type="first">M</forename><surname>Leggieri</surname></persName>
		</author>
		<author>
			<persName><forename type="first">H</forename><surname>Neuhaus</surname></persName>
		</author>
		<author>
			<persName><forename type="first">A</forename><surname>Nikolov</surname></persName>
		</author>
		<author>
			<persName><forename type="first">K</forename><surname>Page</surname></persName>
		</author>
		<author>
			<persName><forename type="first">A</forename><surname>Passant</surname></persName>
		</author>
		<author>
			<persName><forename type="first">A</forename><surname>Sheth</surname></persName>
		</author>
		<author>
			<persName><forename type="first">K</forename><surname>Taylor</surname></persName>
		</author>
		<ptr target="http://www.sciencedirect.com/science/article/pii/S1570826812000571" />
	</analytic>
	<monogr>
		<title level="j">Web Semantics: Science, Services and Agents on the World Wide Web</title>
		<imprint>
			<biblScope unit="volume">17</biblScope>
			<biblScope unit="page" from="25" to="32" />
			<date type="published" when="2012">2012</date>
		</imprint>
	</monogr>
	<note>Supplement C</note>
</biblStruct>

<biblStruct xml:id="b5">
	<analytic>
		<title level="a" type="main">GiraffPlus: Combining social interaction and long term monitoring for promoting independent living</title>
		<author>
			<persName><forename type="first">S</forename><surname>Coradeschi</surname></persName>
		</author>
		<author>
			<persName><forename type="first">A</forename><surname>Cesta</surname></persName>
		</author>
		<author>
			<persName><forename type="first">G</forename><surname>Cortellessa</surname></persName>
		</author>
		<author>
			<persName><forename type="first">L</forename><surname>Coraci</surname></persName>
		</author>
		<author>
			<persName><forename type="first">J</forename><surname>Gonzalez</surname></persName>
		</author>
		<author>
			<persName><forename type="first">L</forename><surname>Karlsson</surname></persName>
		</author>
		<author>
			<persName><forename type="first">F</forename><surname>Furfari</surname></persName>
		</author>
		<author>
			<persName><forename type="first">A</forename><surname>Loutfi</surname></persName>
		</author>
		<author>
			<persName><forename type="first">A</forename><surname>Orlandini</surname></persName>
		</author>
		<author>
			<persName><forename type="first">F</forename><surname>Palumbo</surname></persName>
		</author>
		<author>
			<persName><forename type="first">F</forename><surname>Pecora</surname></persName>
		</author>
		<author>
			<persName><forename type="first">S</forename><surname>Von Rump</surname></persName>
		</author>
		<author>
			<persName><forename type="first">A</forename><surname>Štimec</surname></persName>
		</author>
		<author>
			<persName><forename type="first">J</forename><surname>Ullberg</surname></persName>
		</author>
		<author>
			<persName><forename type="first">B</forename><surname>Ötslund</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="m">The 6th International Conference on Human System Interactions (HSI)</title>
				<imprint>
			<date type="published" when="2013">2013</date>
			<biblScope unit="page" from="578" to="585" />
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b6">
	<analytic>
		<title level="a" type="main">ROBIN, a Telepresence Robot to Support Older Users Monitoring and Social Inclusion: Development and Evaluation</title>
		<author>
			<persName><forename type="first">G</forename><surname>Cortellessa</surname></persName>
		</author>
		<author>
			<persName><forename type="first">F</forename><surname>Fracasso</surname></persName>
		</author>
		<author>
			<persName><forename type="first">A</forename><surname>Sorrentino</surname></persName>
		</author>
		<author>
			<persName><forename type="first">A</forename><surname>Orlandini</surname></persName>
		</author>
		<author>
			<persName><forename type="first">G</forename><surname>Bernardi</surname></persName>
		</author>
		<author>
			<persName><forename type="first">L</forename><surname>Coraci</surname></persName>
		</author>
		<author>
			<persName><forename type="first">R</forename><surname>De Benedictis</surname></persName>
		</author>
		<author>
			<persName><forename type="first">A</forename><surname>Cesta</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="j">Telemedicine and e-Health</title>
		<imprint>
			<biblScope unit="volume">24</biblScope>
			<biblScope unit="issue">2</biblScope>
			<biblScope unit="page" from="145" to="154" />
			<date type="published" when="2018">2018</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b7">
	<analytic>
		<title level="a" type="main">Explainable artificial intelligence: A survey</title>
		<author>
			<persName><forename type="first">F</forename><forename type="middle">K</forename><surname>Došilović</surname></persName>
		</author>
		<author>
			<persName><forename type="first">M</forename><surname>Brčić</surname></persName>
		</author>
		<author>
			<persName><forename type="first">N</forename><surname>Hlupić</surname></persName>
		</author>
		<idno type="DOI">10.23919/MIPRO.2018.8400040</idno>
		<ptr target="https://doi.org/10.23919/MIPRO.2018.8400040" />
	</analytic>
	<monogr>
		<title level="m">2018 41st International Convention on Information and Communication Technology, Electronics and Microelectronics (MIPRO)</title>
				<imprint>
			<date type="published" when="2018-05">May 2018</date>
			<biblScope unit="page" from="0210" to="0215" />
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b8">
	<analytic>
		<title level="a" type="main">An argument for basic emotions</title>
		<author>
			<persName><forename type="first">P</forename><surname>Ekman</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="j">Cognition &amp; emotion</title>
		<imprint>
			<biblScope unit="volume">6</biblScope>
			<biblScope unit="issue">3-4</biblScope>
			<biblScope unit="page" from="169" to="200" />
			<date type="published" when="1992">1992</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b9">
	<analytic>
		<title level="a" type="main">Supporting active and healthy aging with advanced robotics integrated in smart environment</title>
		<author>
			<persName><forename type="first">R</forename><surname>Esposito</surname></persName>
		</author>
		<author>
			<persName><forename type="first">L</forename><surname>Fiorini</surname></persName>
		</author>
		<author>
			<persName><forename type="first">R</forename><surname>Limosani</surname></persName>
		</author>
		<author>
			<persName><forename type="first">M</forename><surname>Bonaccorsi</surname></persName>
		</author>
		<author>
			<persName><forename type="first">A</forename><surname>Manzi</surname></persName>
		</author>
		<author>
			<persName><forename type="first">F</forename><surname>Cavallo</surname></persName>
		</author>
		<author>
			<persName><forename type="first">P</forename><surname>Dario</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="m">Optimizing assistive technologies for aging populations</title>
				<imprint>
			<publisher>IGI Global</publisher>
			<date type="published" when="2016">2016</date>
			<biblScope unit="page" from="46" to="77" />
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b10">
	<analytic>
		<title level="a" type="main">Defining socially assistive robotics</title>
		<author>
			<persName><forename type="first">D</forename><forename type="middle">J</forename><surname>Feil-Seifer</surname></persName>
		</author>
		<author>
			<persName><forename type="first">M</forename><forename type="middle">J</forename><surname>Matarić</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="m">9th International Conference on Rehabilitation Robotics</title>
				<imprint>
			<date type="published" when="2005">2005</date>
			<biblScope unit="page" from="465" to="468" />
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b11">
	<analytic>
		<title level="a" type="main">Development and neurophysiology of mentalizing</title>
		<author>
			<persName><forename type="first">U</forename><surname>Frith</surname></persName>
		</author>
		<author>
			<persName><forename type="first">C</forename><forename type="middle">D</forename><surname>Frith</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="j">Philosophical Transactions of the Royal Society of London. Series B: Biological Sciences</title>
		<imprint>
			<biblScope unit="volume">358</biblScope>
			<biblScope unit="issue">1431</biblScope>
			<biblScope unit="page" from="459" to="473" />
			<date type="published" when="2003">2003</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b12">
	<analytic>
		<title level="a" type="main">Sweetening ontologies with DOLCE</title>
		<author>
			<persName><forename type="first">A</forename><surname>Gangemi</surname></persName>
		</author>
		<author>
			<persName><forename type="first">N</forename><surname>Guarino</surname></persName>
		</author>
		<author>
			<persName><forename type="first">C</forename><surname>Masolo</surname></persName>
		</author>
		<author>
			<persName><forename type="first">A</forename><surname>Oltramari</surname></persName>
		</author>
		<author>
			<persName><forename type="first">L</forename><surname>Schneider</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="m">Knowledge Engineering and Knowledge Management: Ontologies and the Semantic Web</title>
				<editor>
			<persName><forename type="first">A</forename><surname>Gómez-Pérez</surname></persName>
		</editor>
		<editor>
			<persName><forename type="first">V</forename><forename type="middle">R</forename><surname>Benjamins</surname></persName>
		</editor>
		<meeting><address><addrLine>Berlin, Heidelberg</addrLine></address></meeting>
		<imprint>
			<publisher>Springer</publisher>
			<date type="published" when="2002">2002</date>
			<biblScope unit="page" from="166" to="181" />
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b13">
	<analytic>
		<title level="a" type="main">Soar: An architecture for general intelligence</title>
		<author>
			<persName><forename type="first">J</forename><forename type="middle">E</forename><surname>Laird</surname></persName>
		</author>
		<author>
			<persName><forename type="first">A</forename><surname>Newell</surname></persName>
		</author>
		<author>
			<persName><forename type="first">P</forename><forename type="middle">S</forename><surname>Rosenbloom</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="j">Artificial Intelligence</title>
		<imprint>
			<biblScope unit="volume">33</biblScope>
			<biblScope unit="issue">1</biblScope>
			<biblScope unit="page" from="1" to="64" />
			<date type="published" when="1987">1987</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b14">
	<analytic>
		<title level="a" type="main">Cognitive architectures: Research issues and challenges</title>
		<author>
			<persName><forename type="first">P</forename><surname>Langley</surname></persName>
		</author>
		<author>
			<persName><forename type="first">J</forename><forename type="middle">E</forename><surname>Laird</surname></persName>
		</author>
		<author>
			<persName><forename type="first">S</forename><surname>Rogers</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="j">Cognitive Systems Research</title>
		<imprint>
			<biblScope unit="volume">10</biblScope>
			<biblScope unit="issue">2</biblScope>
			<biblScope unit="page" from="141" to="160" />
			<date type="published" when="2009">2009</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b15">
	<analytic>
		<title level="a" type="main">Personalized socially assistive robotics</title>
		<author>
			<persName><forename type="first">M</forename><surname>Mataric</surname></persName>
		</author>
		<author>
			<persName><forename type="first">A</forename><surname>Tapus</surname></persName>
		</author>
		<author>
			<persName><forename type="first">D</forename><surname>Feil-Seifer</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="m">Workshop on Intelligent Systems for Assisted Cognition</title>
				<imprint>
			<date type="published" when="2007">2007</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b16">
	<analytic>
		<title level="a" type="main">Learning and personalizing socially assistive robot behaviors to aid with activities of daily living</title>
		<author>
			<persName><forename type="first">C</forename><surname>Moro</surname></persName>
		</author>
		<author>
			<persName><forename type="first">G</forename><surname>Nejat</surname></persName>
		</author>
		<author>
			<persName><forename type="first">A</forename><surname>Mihailidis</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="j">ACM Trans. Hum.-Robot Interact</title>
		<imprint>
			<biblScope unit="volume">7</biblScope>
			<biblScope unit="issue">2</biblScope>
			<biblScope unit="page">25</biblScope>
			<date type="published" when="2018">2018</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b17">
	<monogr>
		<author>
			<orgName type="institution">World Health Organization</orgName>
		</author>
		<title level="m">International classification of functioning, disability and health: ICF</title>
				<meeting><address><addrLine>Geneva</addrLine></address></meeting>
		<imprint>
			<publisher>World Health Organization</publisher>
			<date type="published" when="2001">2001</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b18">
	<analytic>
		<title level="a" type="main">User profiling and behavioral adaptation for HRI: A survey</title>
		<author>
			<persName><forename type="first">S</forename><surname>Rossi</surname></persName>
		</author>
		<author>
			<persName><forename type="first">F</forename><surname>Ferland</surname></persName>
		</author>
		<author>
			<persName><forename type="first">A</forename><surname>Tapus</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="j">Pattern Recognition Letters</title>
		<imprint>
			<biblScope unit="volume">99</biblScope>
			<biblScope unit="page" from="3" to="12" />
			<date type="published" when="2017">2017</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b19">
	<analytic>
		<title level="a" type="main">A goal triggering mechanism for continuous human-robot interaction</title>
		<author>
			<persName><forename type="first">A</forename><surname>Umbrico</surname></persName>
		</author>
		<author>
			<persName><forename type="first">A</forename><surname>Cesta</surname></persName>
		</author>
		<author>
			<persName><forename type="first">G</forename><surname>Cortellessa</surname></persName>
		</author>
		<author>
			<persName><forename type="first">A</forename><surname>Orlandini</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="m">AI*IA 2018 - Advances in Artificial Intelligence</title>
				<editor>
			<persName><forename type="first">C</forename><surname>Ghidini</surname></persName>
		</editor>
		<editor>
			<persName><forename type="first">B</forename><surname>Magnini</surname></persName>
		</editor>
		<editor>
			<persName><forename type="first">A</forename><surname>Passerini</surname></persName>
		</editor>
		<editor>
			<persName><forename type="first">P</forename><surname>Traverso</surname></persName>
		</editor>
		<meeting><address><addrLine>Cham</addrLine></address></meeting>
		<imprint>
			<publisher>Springer International Publishing</publisher>
			<date type="published" when="2018">2018</date>
			<biblScope unit="page" from="460" to="473" />
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b20">
	<analytic>
		<title level="a" type="main">A holistic approach to behavior adaptation for socially assistive robots</title>
		<author>
			<persName><forename type="first">A</forename><surname>Umbrico</surname></persName>
		</author>
		<author>
			<persName><forename type="first">A</forename><surname>Cesta</surname></persName>
		</author>
		<author>
			<persName><forename type="first">G</forename><surname>Cortellessa</surname></persName>
		</author>
		<author>
			<persName><forename type="first">A</forename><surname>Orlandini</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="j">International Journal of Social Robotics</title>
		<imprint>
			<date type="published" when="2020">2020</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b21">
	<analytic>
		<title level="a" type="main">Toward intelligent continuous assistance</title>
		<author>
			<persName><forename type="first">A</forename><surname>Umbrico</surname></persName>
		</author>
		<author>
			<persName><forename type="first">G</forename><surname>Cortellessa</surname></persName>
		</author>
		<author>
			<persName><forename type="first">A</forename><surname>Orlandini</surname></persName>
		</author>
		<author>
			<persName><forename type="first">A</forename><surname>Cesta</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="j">Journal of Ambient Intelligence and Humanized Computing</title>
		<imprint>
			<date type="published" when="2020">2020</date>
		</imprint>
	</monogr>
</biblStruct>

				</listBibl>
			</div>
		</back>
	</text>
</TEI>
