Monitoring People that Need Assistance through a Sensor-based System: Evaluation and First Results

Xavier Rafael-Palou, Eloisa Vargiu, Felip Miralles
Barcelona Digital Technology Center, {xrafael, evargiu, fmiralles}@bdigital.org

Abstract. Decline in daily functioning usually involves the reduction of and discontinuity in daily routines, entailing a considerable decrease in quality of life (QoL). This is especially relevant for people that need assistance, such as elderly or disabled people, and may also hide pathological (e.g., Alzheimer's) and/or mental (e.g., depression or melancholia) conditions. Thus, there is a need for intelligent systems able to monitor users' activities to detect emergencies, recognize activities, send notifications, and provide a summary of all the relevant information. In this paper, we present a sensor-based telemonitoring system that addresses all these issues. Its goal is twofold: (i) helping and supporting people (e.g., elderly or disabled) at home; and (ii) giving feedback to therapists, caregivers, and relatives about the evolution of the status, behavior, and habits of each monitored user. Some features of the system have been evaluated with two healthy users in Barcelona, and the results show good performance. Finally, the system has been adopted and installed in several end-users' homes under the umbrella of the projects SAAPHO and BackHome.

1 Introduction

In the literature, various studies and systems aimed at detecting and overcoming the worsening of daily activities have been proposed. Several methods are limited to measuring daily functioning using self-reports, such as the modified Katz ADL scale [13], or a more objective measurement method such as the Assessment of Motor and Process Skills [5]. Recently, solutions have been proposed to unobtrusively monitor the activities of people that need assistance. In particular, sensor-based approaches are normally used [7].
They rely on a combination of sensors, each one devoted to monitoring a specific status, a specific activity, or activities related to a specific location. Binary sensors are currently the most widely adopted [11], even if they are prone to noise and errors [10]. Once all of the data have been collected, intelligent solutions are required that incrementally and continuously analyze the data on behalf of all the involved actors (i.e., therapists, caregivers, relatives, and the end-users themselves). Moreover, it is then necessary to identify whether the person needs some form of assistance because an unusual activity has been recognized. This requires the adoption of machine learning solutions that take into account the environment, the performed activity, and/or some physiological data [3]. Furthermore, once the data have been analyzed, the system has to react and perform some actions accordingly. On the one hand, the user needs to be kept informed about emergencies as soon as they happen, and s/he has to be in contact with therapists and caregivers to change habits and/or to perform some therapy. On the other hand, monitoring systems are very important from the perspective of therapists, caregivers, and relatives. In fact, those systems allow them to become aware of the user's context by acquiring heterogeneous data coming from sensors and other sources.

AI-AM/NetMed 2015: 4th International Workshop on Artificial Intelligence and Assistive Medicine

In this paper, we present a sensor-based telemonitoring system aimed at detecting emergencies, recognizing activities, and sending notifications, as well as collecting current and past information in a summary. The goal of the proposed solution is twofold. On the one hand, it is aimed at helping and supporting people (e.g., elderly or disabled) at home. On the other hand, it is devoted to constantly giving feedback to therapists, caregivers, and relatives about the evolution of the status, behavior, and habits of each monitored user.
The rest of the paper is organized as follows. Section 2 presents the architecture of the sensor-based solution as well as the intelligent monitoring system. Section 3 describes the evaluation performed to test the availability and reliability of the proposed solution, as well as the results coming from the adoption of the system in two real scenarios. In Section 4, we conclude with the main results of this work, pointing out its future directions.

2 The Proposed Solution

2.1 The Sensor-based System

Advanced telemonitoring systems entail the composition and orchestration of heterogeneous distributed technologies and services. In Figure 1, we sketch the high-level architecture of the proposed system. As shown, its main components are: home, healthcare center, middleware, and intelligent monitoring system. The sensor-based system is able to monitor indoor activities by relying on a set of home automation sensors, and outdoor activities by using an activity tracker, namely Moves¹. Moreover, through environmental sensors, the system is able to detect emergency situations.

At home, a set of sensors is installed. In particular, we use presence sensors (i.e., Everspring SP103) to identify the room where the user is located (one sensor for each monitored room); a door sensor (i.e., Vision ZD 2012) to detect when the user enters or exits the premises; electrical power meters and switches to monitor leisure activities (e.g., television and PC); and pressure mats (i.e., bed and seat sensors) to measure the time spent in bed (or in a wheelchair). The system also comprises a network of environmental sensors that measures and monitors environmental variables such as temperature, but also potentially dangerous events such as gas leaks, fire, CO leaks, and the presence of intruders. All the adopted sensors are wireless Z-Wave devices. They send the retrieved data to a collector (based on a Raspberry Pi).
The Raspberry Pi collects all the retrieved data and securely redirects them to the cloud, where they are stored, processed, mined, and analyzed. We also use the user's smartphone as a sensor by relying on Moves, a smartphone app able to recognize physical activities and movements by transportation. The user interacts with the overall system through a suitable interface aware of end-user needs and preferences.

The middleware, which acts as a SaaS, is composed of a secure communication and authentication module; an API module that enables the collector to transmit all the data from the sensors and makes them available to the activity monitoring module; and further utilities such as load balancing and concurrency.

¹ http://www.moves-app.com/

Fig. 1. Main components of the sensor-based system focused on the intelligent monitoring.

In order to cope with the data necessities of the actors of the system (i.e., therapists, caregivers, relatives, and the end-users themselves), an Intelligent Monitoring (IM) system has been designed. The healthcare center receives notifications, summaries, statistics, and general information belonging to the users through a web application.

2.2 Intelligent Monitoring

IM aims to continuously analyze and mine the data along four dimensions: detection of emergencies, activity recognition, event notifications, and summary extraction.
In order to cope with these objectives, the IM is composed of the following modules (see Figure 2): PP, the pre-processing module, which encodes the data for the analysis; ED, the emergency detection module, which notifies, for instance, in case of smoke or a gas leak; AR, the activity recognition module, which identifies the location, position, activity status, and sleeping status of the user; EN, the event notification module, which informs when a new event has been detected; and SC, the summary computation module, which computes summaries from the data.

Pre-processing. IM continuously and concurrently listens for new data. The goal of PP is to pre-process the data, iteratively sending a chunk c to ED according to a sliding-window approach. Starting from the overall data stream, the system sequentially considers the range of time |ti − ti+1| between a sensor measure si at time ti and the subsequent measure si+1 at time ti+1. Thus, the output of PP is a window c from ts to ta, where ts is the starting time of a given period (e.g., 8:00 a.m.) and ta is the actual time. Each chunk is composed of a sequence of sensor measures s, where s is a triple <ID, v, t>, i.e., the sensor ID, its value, and the time at which a change in the sensor status is measured. Figure 3 shows an example of a chunk composed of four sensor measures.

Fig. 2. The flow of data and interactions among the modules in the proposed approach.

Fig. 3. Example of a chunk composed of four sensor measures.

Emergency Detection. The ED module aims to detect and inform about emergency situations for the end-users, as well as critical failures of the sensor-based system. Regarding the critical situations for the end-users, simple rules are defined and implemented to raise an emergency when specific values appear in c (e.g., the gas or smoke sensor IDs).
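The rule-based side of ED can be sketched as follows. This is a minimal sketch: the sensor IDs, trigger values, and rules are illustrative assumptions, not the deployed configuration. The function consumes a chunk of <ID, v, t> triples and returns the emergency pairs together with the filtered chunk that ED forwards to AR.

```python
from typing import Callable, Dict, List, Tuple

# A sensor measure is a triple (ID, value, time), as produced by PP.
Measure = Tuple[str, float, str]

# Illustrative rule set: sensor ID -> (predicate on the value, emergency label).
# Real IDs and thresholds are deployment-specific.
EMERGENCY_RULES: Dict[str, Tuple[Callable[[float], bool], str]] = {
    "smoke-01": (lambda v: v == 1.0, "smoke"),
    "gas-01":   (lambda v: v == 1.0, "gas leak"),
    "temp-01":  (lambda v: v > 60.0, "fire"),
}

def detect_emergencies(chunk: List[Measure]):
    """Scan a chunk c and return (emergencies, filtered_chunk).

    Emergencies are pairs (measure, label); the filtered chunk,
    with the critical measures removed, is what gets sent to AR.
    """
    emergencies, filtered = [], []
    for measure in chunk:
        sensor_id, value, _ = measure
        rule = EMERGENCY_RULES.get(sensor_id)
        if rule is not None and rule[0](value):
            emergencies.append((measure, rule[1]))
        else:
            filtered.append(measure)
    return emergencies, filtered
```

Keeping the rules in a plain dictionary mirrors the paper's design: the emergency logic stays declarative and can be extended per deployment without touching the scanning loop.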
Regarding the system failures, ED is able to detect when the end-user's home is disconnected from the middleware, as well as a malfunctioning sensor (e.g., low battery). The former is implemented by a keepalive mechanism in the Raspberry Pi: if no signals are received from the Raspberry Pi after a given threshold, an emergency is raised. The latter is implemented by fitting a multivariate Gaussian distribution to the sensor measurements in c: if the corresponding total number of measures is greater than a given threshold, an emergency is raised. Each emergency is a pair <si, lεi> composed of the sensor measure si and the corresponding label lεi that indicates the emergency (e.g., fire, smoke). Once ED finishes the analysis of c, the list of emergencies ε is sent to the middleware, whereas c, filtered of the critical situations, is sent to AR.

Activity Recognition. In the current implementation, the system is able to recognize whether the user is at home or away and whether s/he is alone; the room in which the user is (no-room in case s/he is away, transition in case s/he is moving from one room to another); the activity status (i.e., active or inactive); and the sleeping status (i.e., awake or asleep). To recognize whether the user is at home or away and whether s/he is alone, we implemented a solution based on machine learning techniques [9]. The adopted solution is a hierarchical classifier composed of two levels: the upper one is aimed at recognizing whether the user is at home or not, whereas the lower one is aimed at recognizing whether the user is really alone or has received some visits. The goal of the classifier at the upper level is to improve the performance of the door sensor. In fact, it may happen that the sensor registers a status change (from closed to open) even if the door has not been opened. This implies that AR may register that the user is away while, in the meantime, activities are detected at the user's home.
Conversely, AR may register that the user is at home while no activities are detected at the user's home. Thus, we first revise the data gathered by AR searching for anomalies, i.e.: (1) the user is away and some events are detected at home; and (2) the user is at home and no events are detected. Then, we validate those data by relying on Moves, installed and running on the user's smartphone. Using Moves as an "oracle", we build a dataset in which each entry is labeled depending on whether the door sensor was right (label "1") or wrong (label "0"). The goal of the classifier at the lower level is to identify whether the user is alone or not. The input data of this classifier are those that have been filtered by the upper level, being recognized as positives. To build this classifier, we rely on the novelty detection approach [6], used when the data have few positive cases (i.e., anomalies) compared with the negatives (i.e., regular cases), that is, in the case of skewed data.

To measure the activity status, we rely on the home automation sensors. By default, we consider the status of the user to be "active" when s/he is away (the corresponding positions are saved as "no-room"). When the user is at home, instead, AR recognizes her/him as "inactive" if a sensor measures at time ti that the user is in a given room r, the following sensor measure at time ti+1 places the user in the same room, and ti+1 − ti is greater than a given threshold θ. Otherwise, the system classifies the user as "active". Finally, sleeping is currently detected by relying on the presence sensor located in the bedroom and the pressure mat located below the mattress. In particular, we consider the presence of the user in that room and the absence of detected movements (i.e., the activity status is "inactive"), together with pressure on the mattress.
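The two threshold rules just described can be sketched as follows. This is a minimal sketch under stated assumptions: the function and label names are ours, timestamps are expressed in seconds, and θ is deployment-specific.

```python
def activity_status(room_i, t_i, room_next, t_next, theta):
    """AR's inactivity rule: two consecutive measures in the same room
    more than theta seconds apart mark the user as inactive."""
    if room_i == room_next and (t_next - t_i) > theta:
        return "inactive"
    return "active"

def sleeping_status(position, activity, mat_pressed):
    """AR's sleep rule: presence in the bedroom, no detected movement,
    and pressure on the mat below the mattress."""
    if position == "bedroom" and activity == "inactive" and mat_pressed:
        return "asleep"
    return "awake"
```

For example, two presence readings in the kitchen one hour apart with θ = 1800 s yield "inactive", whereas a room change within the threshold yields "active".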
Thus, the output of AR is a triple <ts, te, l>, where ts and te are the times at which the activity started and finished, respectively, and l is a list of four labels that indicate: the localization (i.e., home, away, or visits), the position (i.e., the room, no-room, or transition), the activity status (i.e., active or inactive), and the sleeping status (i.e., awake or asleep). To give an example, let us consider Figure 4, where the same chunk of Figure 3 has been processed by AR.

Fig. 4. Example of a chunk after the AR processing.

Event Notification. By relying on a set of simple rules, EN is able to detect events to be notified. Each event is defined by a pair <ti, l> corresponding to the time ti at which the event happens, together with a label l that indicates the kind of event. In particular, we are interested in detecting the following kinds of events: leaving the home, coming back home, receiving a visit, remaining alone after a visit, going to the bathroom, going out of the bathroom, going to sleep, and waking up. Following the example in Figure 3, an event is the pair <2014-02-24 10:31:55, going to the bathroom>.

Summary Computation. Once all the activities and events have been classified, measures aimed at representing the summary of the user's monitoring during a given period are computed. In particular, two kinds of summary are provided: historical and actual. As for the historical summary, we decided to have a list of the activities performed during (i) the morning (i.e., from 8 a.m. to 8 p.m.), (ii) the night (i.e., from 8 p.m. to 8 a.m.), (iii) the whole day, (iv) the week (from Monday morning to Sunday night), as well as (v) the month.
In particular, we monitor: sleeping time; time spent outdoors; time spent indoors; time spent performing indoor activities; time spent performing outdoor activities; the number of times the user has been in each room; and the number of times the user leaves the house. As for the actual summary, we are interested in monitoring: the room in which the user is; whether the user is at home or not; the number of times s/he leaves the home; sleeping time; activity time; and the number of visits per room. As a final remark, let us note that all emergencies, activities, notifications, and summaries are stored in a database so as to be available to all the involved actors.

3 Evaluation and First Results

The proposed solution has been developed according to a user-centered design approach in order to collect requirements and feedback from all the actors. For evaluation purposes, the system has been installed in the homes of two healthy users in Barcelona. Moreover, the system has been used in the SAAPHO² project to monitor elderly people and in the BackHome³ project to monitor disabled people.

3.1 Evaluation

Before installing the system at real end-users' homes, its evaluation was undertaken by a control group of healthy users. Two healthy users participated in the study as a control group (1 female; mean age 32.5 years). The evaluation was performed from November 2nd, 2014 to December 21st, 2014, for a total of 34 days of testing. The testing activity focused on evaluating some of the features of AR and EN. In particular, we evaluated: the performance of the hierarchical approach (AR), as well as the ability to recognize the events of leaving the home and receiving visits (EN); and the ability to recognize the sleeping activity (AR). As for the evaluation of the hierarchical approach, we first trained both classifiers. To measure the performance, we compared the overall results with those obtained by using a rule-based approach at each level of the hierarchy.
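The lower level of the trained hierarchy relies on novelty detection over skewed data, as described in Section 2.2. As a minimal, illustrative sketch of that idea, the following fits a per-feature Gaussian model to "regular" (user alone at home) data only and flags low-likelihood feature vectors as novelties (e.g., visits). The features, model form, and threshold are assumptions for illustration, not the actual implementation of [9].

```python
import math

class GaussianNoveltyDetector:
    """Fit a Gaussian with independent features on regular data only;
    samples with low log-likelihood are flagged as novelties."""

    def fit(self, samples):
        n = len(samples)
        dims = len(samples[0])
        self.mu = [sum(s[d] for s in samples) / n for d in range(dims)]
        # Floor the variance to avoid division by zero on constant features.
        self.var = [
            max(sum((s[d] - self.mu[d]) ** 2 for s in samples) / n, 1e-9)
            for d in range(dims)
        ]
        return self

    def log_likelihood(self, x):
        """Log-density of x under the fitted diagonal Gaussian."""
        return sum(
            -0.5 * math.log(2 * math.pi * v) - (xi - m) ** 2 / (2 * v)
            for xi, m, v in zip(x, self.mu, self.var)
        )

    def is_novel(self, x, threshold):
        return self.log_likelihood(x) < threshold
```

Because only negative (regular) examples are needed for fitting, this style of classifier suits the skewed class distribution mentioned above: visits are rare, so few or no positive examples are available at training time.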
Results are shown in Table 1 and point out that the proposed approach outperforms the rule-based one with a significant improvement. The interested reader may refer to [9] for details about the adopted rules.

To evaluate the ability of EN to correctly detect notifications about the number of times the user leaves the home and the number of received visits, we daily asked the users to answer the questions: "How many times did you go out from home?" and "How many times did you receive visits at your home?". Then, we compared the answers given by the users with the number of detections by EN. Figure 5 sketches the results, whereas, in Table 2, the first two columns show the cosine similarity and the accuracy for each of the considered notifications. As can be noted, the system is able to recognize quite well the number of times the user leaves the home, as well as the number of visits that s/he receives, thanks to the proposed hierarchical approach.

² http://www.saapho-aal.eu/
³ http://www.backhome-fp7.eu/backhome/

Table 1. Results of the overall hierarchical approach with respect to the rule-based one.

            Rule-based  Hierarchical  Improv.
Accuracy    0.80        0.95          15%
Precision   0.68        0.94          26%
Recall      0.71        0.91          20%
F1          0.69        0.92          23%

Fig. 5. Comparisons between the results given by EN and those coming from the questionnaire in detecting the number of times that the user leaves the home (on the left) and the number of received visits (on the right).

Table 2. Cosine similarity and accuracy calculated between pipeline outputs and answers from the daily user questionnaire.

                   Leavings  Visits  Sleeping
Cosine similarity  0.9689    0.6172  0.8888
Accuracy           0.8823    0.8823  0.8529

As for the evaluation of the ability of AR to recognize the sleeping activity, we daily asked the users to answer the questions "What time did you stand up from bed?" and "What time did you go to sleep?".
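The two figures of merit reported in Table 2 can be computed over per-day count vectors (detected vs. self-reported). A minimal sketch follows; the accuracy definition used here (the fraction of days with exact agreement) is our assumption, since the paper does not spell it out.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two daily-count vectors,
    e.g., detected vs. self-reported home leavings."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def agreement_accuracy(detected, reported):
    """Fraction of days on which the detected count matches the report
    exactly (an assumed definition, for illustration)."""
    hits = sum(1 for d, r in zip(detected, reported) if d == r)
    return hits / len(detected)
```

Cosine similarity rewards vectors with the same shape even when individual days disagree, which is why it can stay high (e.g., for leavings) while exact-match accuracy is lower.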
Then, we compared the answers given by the users with the sleeping time calculated by AR. Figure 6 sketches the results, whereas, in Table 2, the last column shows the cosine similarity and the accuracy. Let us note that we do not rely on further information, such as luminosity, to understand whether the user is in bed doing something (e.g., reading) or really sleeping, and that in the questionnaire the user reports the time at which s/he turns off the light; for this reason, we apply a bias of 5400 seconds before considering the user awake.

Fig. 6. Comparisons between the results given by the AR system and those coming from the questionnaire regarding the sleeping activity.

3.2 First Results

SAAPHO was a European R&D project aimed at integrating health, social, and security services seamlessly in the same architecture [1]. The main objective of the sensor-based system was to monitor the health parameters of elderly people and warn them in time, in order to increase their personal independence. In the final SAAPHO pilot, the trial was composed of 6 participants from Spain (N=3) and Slovenia (N=3). They were invited to use the system at their own homes for 2 months. The mean age of the 3 participants in Spain was 69.3 years (SD: 9.9; range 66-72), whereas the mean age of the 3 participants in Slovenia was 65.7 years (SD: 9.9; range 60-74). Regarding gender, in both Spain and Slovenia 66.6% of the participants were women. Moreover, 100% of the participants had experience using computers, 66.6% had experience using a tablet PC, and all of them had Internet at home. ED has been used in SAAPHO to detect fire and smoke, and AR and EN to detect events such as inactivity and toileting⁴. This information has then been sent to the end-user through a suitable interface on a smart portable device.
In particular, it is possible to check in real time the status of environmental sensors (e.g., temperature, humidity), to trigger an alarm (e.g., when smoke or a gas leakage is detected), to view the list of recent home notification events (e.g., a sharp increase/decrease in temperature, a prolonged lack of movement), and to configure the home sensors. In [8], some of the main qualitative results of the SAAPHO final prototype after one month of testing have been presented. The evaluation was performed following a systematic approach. Positive impressions were collected from the participants using SAAPHO in real settings; the system was very well accepted among the participants in both countries: it was considered easy to use, most of the offered services were deemed extremely useful, and it responded to users' needs.

BackHome is a European R&D project that focuses on restoring independence to people affected by motor impairment due to acquired brain injury or disease, with the overall aim of preventing exclusion [4]. In BackHome, the information gathered by the sensor-based system is used to provide context-awareness by relying on ambient intelligence [2]. AR is currently used in BackHome to study habits and to automatically assess the QoL of people [12]. Figure 7 shows an example of user habits recognized by AR. The BackHome system is currently running in three end-users' homes in Belfast.

4 Conclusions and Future Work

In this paper, we presented a sensor-based system aimed at detecting emergencies, recognizing activities, and sending notifications, as well as collecting the information in a summary. The goal of the implemented system is to help and support people that need assistance and to constantly give feedback to therapists, caregivers, and relatives about the evolution of the status, behavior, and habits of the corresponding user.
The system has been evaluated with two healthy users to assess whether the user is at home or away; the number of times the user leaves the home; whether s/he receives visits, as well as the number of received visits; and the sleeping activity. Results show that the system performs well in all of these tasks. Moreover, under the umbrella of SAAPHO and BackHome, the system has been installed and tested in 6 homes of elderly people and in 3 homes of disabled people, respectively. As a final remark, let us note that the intelligent monitoring system could be extended by adding new functionalities to the modules, depending on the requirements of the corresponding use case(s).

As for future work, we are currently improving the sleeping activity recognition by relying also on a sensor that measures luminosity in addition to presence. Moreover, we are setting up new tests to evaluate the ability of the system to recognize when the user is active or inactive, relying also on the information coming from Moves. We are also planning to recognize more activities, such as cooking and eating. Finally, in order to assess quality of life, we are interested in measuring sleep quality and in performing some studies on the (virtual or physical) social interactions of the users.

⁴ SC has not been implemented due to the self-managing purpose of the project, i.e., no caregivers were involved.

Fig. 7. User habits recognized by AR in BackHome.

Acknowledgments. The research leading to these results has received funding from the European Community's Seventh Framework Programme FP7/2007-2013, BackHome project, grant agreement n. 288566, and from AAL (Ambient Assisted Living) Call 3, SAAPHO project, grant agreement n. 2010-3-035.

References
1. Ahmed, M.U., Espinosa, J.R., Reissner, A., Domingo, À., Banaee, H., Loutfi, A., Rafael-Palou, X.: Self-serve ICT-based health monitoring to support active ageing. In: 8th International Conference on Health Informatics (HEALTHINF), 12 Jan 2015, Lisbon, Portugal (2015)
2. Casals, E., Cordero, J.A., Dauwalder, S., Fernández, J.M., Solà, M., Vargiu, E., Miralles, F.: Ambient intelligence by ATML: Rules in BackHome. In: Emerging Ideas on Information Filtering and Retrieval. DART 2013: Revised and Invited Papers; C. Lai, A. Giuliani and G. Semeraro (eds.) (2014)
3. Cook, D.J., Augusto, J.C., Jakkula, V.R.: Ambient intelligence: Technologies, applications, and opportunities (2007)
4. Daly, J., Armstrong, E., Miralles, F., Vargiu, E., Müller-Putz, G., Hintermüller, C., Guger, C., Kuebler, A., Martin, S.: BackHome: Brain-neural-computer interfaces on track to home. In: RAatE 2012 - Recent Advances in Assistive Technology & Engineering (2012)
5. Fisher, A.G., Jones, K.B.: Assessment of Motor and Process Skills. Three Star Press, Fort Collins, CO (1999)
6. Markou, M., Singh, S.: Novelty detection: a review - part 1: statistical approaches. Signal Processing 83(12), 2481-2497 (2003)
7. Pol, M.C., Poerbodipoero, S., Robben, S., Daams, J., Hartingsveldt, M., Vos, R., Rooij, S.E., Kröse, B., Buurman, B.M.: Sensor monitoring to measure and support daily functioning for independently living older people: A systematic review and road map for further development. Journal of the American Geriatrics Society 61(12), 2219-2227 (2013)
8. Rafael-Palou, X., Serra, G., Miralles, F.: SAAPHO: A system to enhance active ageing through safety, participation and health services. In: Broader, Bigger, Better - AAL Solutions for Europe. Proceedings of the AAL Forum (2014)
9. Rafael-Palou, X., Vargiu, E., Serra, G., Miralles, F.: Improving activity monitoring through a hierarchical approach.
In: The International Conference on Information and Communication Technologies for Ageing Well and e-Health (ICT 4 Ageing Well) (2015)
10. Ranganathan, A., Al-Muhtadi, J., Campbell, R.H.: Reasoning about uncertain contexts in pervasive computing environments. IEEE Pervasive Computing 3(2), 62-70 (2004)
11. Tapia, E.M., Intille, S.S., Larson, K.: Activity recognition in the home using simple and ubiquitous sensors. Springer (2004)
12. Vargiu, E., Fernández, J.M., Miralles, F.: Context-aware based quality of life telemonitoring. In: Distributed Systems and Applications of Information Filtering and Retrieval. DART 2012: Revised and Invited Papers; C. Lai, A. Giuliani and G. Semeraro (eds.) (2014)
13. Weinberger, M., Samsa, G.P., Schmader, K., Greenberg, S.M., Carr, D., Wildman, D.: Comparing proxy and patients' perceptions of patients' functional status: results from an outpatient geriatric clinic. Journal of the American Geriatrics Society (1992)