<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta />
    <article-meta>
      <title-group>
        <article-title>A Multi-modal Sensing Framework for Human Activity Recognition</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Barbara Bruno</string-name>
          <email>barbara.bruno@unige.it</email>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Jasmin Grosinger</string-name>
          <email>jasmin.grosinger@aass.oru.se</email>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Fulvio Mastrogiovanni</string-name>
          <email>fulvio.mastrogiovanni@unige.it</email>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Federico Pecora</string-name>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Alessandro Saffiotti</string-name>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Subhash Sathyakeerthy</string-name>
          <email>subhash.sathyakeerthy@aass.oru.se</email>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Antonio Sgorbissa</string-name>
          <email>antonio.sgorbissa@unige.it</email>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <aff id="aff0">
          <label>0</label>
          <institution>University of Genova, Dept. DIBRIS</institution>
          ,
          <addr-line>Via Opera Pia 13, 16145 Genova</addr-line>
          ,
          <country country="IT">Italy</country>
        </aff>
      </contrib-group>
      <abstract>
        <p>Robots for the elderly are a particular category of home assistive robots, helping people in the execution of daily life tasks to extend their independent life. Such robots should be able to determine the level of independence of the user and track its evolution over time, so as to adapt the assistance to the person's capabilities and needs. We present a heterogeneous information management framework which allows for the description of a wide variety of human activities in terms of multi-modal environmental and wearable sensing data, and which provides accurate knowledge about the user's activity to any assistive robot.</p>
      </abstract>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>Introduction</title>
      <p>
        Home assistive robotics addresses the design of robots to be deployed in domestic
environments to assist the residents in the execution of daily life tasks. Robots
for the elderly are a particular category of home assistive robots, which rely on
social interaction with the person and aim at extending elderly people's independent
life [
        <xref ref-type="bibr" rid="ref1">1</xref>
        ]. To properly and effectively perform their assistive duties, robots for the
elderly should be context-aware, i.e., able to assess the status of the environment
they are in; user-aware, i.e., able to assess the status of the person they are
working for; and also ageing-aware, i.e., able to perform a long-term analysis of
the person's cognitive and physical evolution, to adapt to their current capabilities.
      </p>
      <p>
        Human Activity Recognition (HAR) systems for elderly care identify, among
all the actions executed by a person during a day, specific activities of interest, the
Activities of Daily Living (ADL), which require the use of different cognitive and
physical abilities and are used by gerontologists to estimate the level of autonomy
of a person [
        <xref ref-type="bibr" rid="ref2">2</xref>
        ]. ADL cover a wide variety
of human activities: consequently, a number of sensing strategies have been
developed for their automatic recognition. ADL occurring at home, in particular,
are usually monitored with smart environments [
        <xref ref-type="bibr" rid="ref3">3</xref>
        ] and wearable sensing systems
[
        <xref ref-type="bibr" rid="ref4">4</xref>
        ]. Unfortunately, wearable sensing systems are prone to ambiguity, while smart
environments may reach erroneous conclusions due to incomplete information.
      </p>
      <p>We address the problem of endowing robots for the elderly with the ability
to monitor the Activities of Daily Living, by designing a HAR system which
allows for a seamless integration with the robot planning system. We propose the
integration of multiple sensing strategies in a single framework, to compensate for
the weaknesses of each modality and increase the recognition reliability.</p>
      <p>The abstract is organized as follows. Section 2 details the system architecture.
Preliminary experimental results are analysed in Section 3. Conclusions follow.
</p>
    </sec>
    <sec id="sec-2">
      <title>System Architecture</title>
      <p>We set up a test bed in an apartment located in the city of Örebro (SWE), in
the elderly care facility Ängen. The apartment, shown in Figure 1, is composed
of a fully furnished living-room, bathroom, bedroom and kitchen.</p>
      <p>We propose the multi-modal monitoring system with the architecture shown
in Figure 1 for the reliable detection of the activities: transferring (denoting the
motions of sitting down, standing up, lying down, getting up); feeding (eating,
drinking); food preparation; indoor transportation (climbing stairs, descending
stairs, walking). The system makes use of: (i) a wrist-placed inertial sensor; (ii)
a waist-placed inertial sensor; (iii) a network of Passive Infra-Red sensors; (iv)
RFID tags, pressure sensors and switches; and (v) a temporal reasoner.</p>
      <p>
        The data extracted by the wrist sensor are used to detect occurrences of
gestures [
        <xref ref-type="bibr" rid="ref5 ref6">5, 6</xref>
        ], such as walking, picking up or sitting. The data provided by the
waist sensor, instead, are used to estimate the person's posture on the basis of the
angle between the torso and the direction of gravity. The combined analysis of wrist
and waist acceleration data also allows for detecting falls with high accuracy [
        <xref ref-type="bibr" rid="ref7">7</xref>
        ].
Person localization is achieved via a network of Passive Infra-Red (PIR) sensors.
We identified three categories of objects to monitor: cutlery and dishes, which are
assumed to be in use when located on the kitchen table and which we detect via an
RFID network; furniture, such as chairs, armchairs and the bed, for which pressure
sensors detect whether, and which one, is in use; and household appliances, such as the
fridge and the oven, whose usage can be inferred by checking the status of their
doors with switches. All elements in the architecture are envisioned as Physically
Embedded Intelligent Systems (PEIS) [
        <xref ref-type="bibr" rid="ref8">8</xref>
        ], i.e., devices incorporating
computational, communication, sensing and/or actuating resources, connected with each
other by a uniform communication model. The analysis systems (focusing on
object usage, user localization and user posture &amp; gestures, respectively) share
information with each other and with a reasoning system which is responsible
for recognizing all occurrences of the activities of interest. The adopted
temporal reasoner uses and extends Allen's interval algebra to model the activities
as sets of temporal constraints [
        <xref ref-type="bibr" rid="ref9">9</xref>
        ].
      </p>
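      <p>As an illustration of the posture estimation described above: when the person is nearly still, the waist accelerometer mostly measures gravity, so the torso tilt can be derived from the acceleration components. The following Python sketch is our own illustration, not the system's implementation; the axis convention and the 45-degree threshold are assumptions.</p>

```python
import math

def torso_tilt_deg(ax, ay, az):
    """Angle (degrees) between the torso axis and the direction of gravity.

    Assumes the person is nearly still, so the measured acceleration is
    dominated by gravity, and that the sensor's y axis points along the
    torso when the person stands upright (illustrative convention).
    """
    g = math.sqrt(ax * ax + ay * ay + az * az)
    if g == 0.0:
        raise ValueError("zero acceleration vector")
    # Clamp to guard against floating-point drift outside [-1, 1].
    return math.degrees(math.acos(max(-1.0, min(1.0, ay / g))))

def posture(ax, ay, az, threshold_deg=45.0):
    """Coarse posture label from the tilt angle (threshold is an assumption)."""
    return "Standing" if torso_tilt_deg(ax, ay, az) < threshold_deg else "Lying"

# Upright: gravity along the torso axis, small tilt.
print(posture(0.0, 9.81, 0.0))   # Standing
# Lying down: gravity orthogonal to the torso axis, tilt near 90 degrees.
print(posture(0.0, 0.0, 9.81))   # Lying
```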
    </sec>
    <sec id="sec-3">
      <title>Experimental Evaluation</title>
      <p>Listing 1.1 reports the models of sitting down and standing up. The field Head defines the
entity it refers to and the name of the model, separated by a ::. As an example,
Head Human::SitDown() indicates that, whenever the reported constraints are
satisfied, the reasoner should infer that the activity of sitting down has been executed
by the person. The field RequiredState defines the sensor values which
correspond to the execution of the motion. The field Constraint defines the temporal
relation between each sensor value of interest and the activity.</p>
      <p>Listing 1.1. DDL models for the activities sit down and stand up.
(SimpleOperator
 (Head Human::SitDown())
 (RequiredState req1 Gesture::Sit())
 (RequiredState req2 Posture::Sitting())
 (RequiredState req3 Chair::On())
 (Constraint OverlappedBy(Head,req1))
 (Constraint During(Head,req2))
 (Constraint EndEnd(Head,req3)))
(SimpleOperator
 (Head Human::SitDown())
 (RequiredState req1 Gesture::Sit())
 (RequiredState req2 Posture::Sitting())
 (RequiredState req3 Armchair::On())
 (Constraint OverlappedBy(Head,req1))
 (Constraint During(Head,req2))
 (Constraint EndEnd(Head,req3)))
(SimpleOperator
 (Head Human::StandUp())
 (RequiredState req1 Gesture::Stand())
 (RequiredState req2 Posture::Standing())
 (RequiredState req3 Human::SitDown())
 (Constraint MetByOrOverlappedBy(Head,req1))
 (Constraint Starts(Head,req2))
 (Constraint MetByOrAfter(Head,req3)))</p>
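      <p>To make the Constraint fields concrete, the Allen-style relations appearing in Listing 1.1 (OverlappedBy, During, EndEnd) can be checked on simple (start, end) interval pairs. This toy Python checker is our illustration of the relations' semantics, not the reasoner of [9]; the interval values are made up.</p>

```python
# Toy checkers for three Allen-style interval relations, with intervals
# represented as (start, end) pairs of numbers.

def during(a, b):
    """a During b: a lies strictly inside b."""
    return b[0] < a[0] and a[1] < b[1]

def overlapped_by(a, b):
    """a OverlappedBy b: b starts before a and ends inside a."""
    return b[0] < a[0] < b[1] < a[1]

def end_end(a, b):
    """a EndEnd b: the two intervals end at the same instant."""
    return a[1] == b[1]

# A hypothesized Human::SitDown() interval against simulated sensor values:
head = (10, 20)            # Human::SitDown()
gesture_sit = (8, 15)      # Gesture::Sit()
posture_sitting = (5, 30)  # Posture::Sitting()
chair_on = (12, 20)        # Chair::On()

print(overlapped_by(head, gesture_sit))  # True
print(during(head, posture_sitting))     # True
print(end_end(head, chair_on))           # True
```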
      <p>We performed preliminary tests by defining sequences of sensor values and
analysing the reasoner inferences they trigger. In Figure 2, the timeline of the context
variable Human is computed by the reasoner and lists all corresponding recognized
activities, as indicated by the Head fields. The other timelines report the sensor
values (i.e., gesture, posture, location and object sensors). At each time instant,
the reasoner samples the sensors, keeping track of all modelled activities which
are consistent with the sensor readings up to that instant (i.e., those that could
be the one currently being executed). As time passes, the number of possible
activities progressively shrinks, until it converges to the one effectively performed,
if it is among the modelled ones, or to none. All sensor or context variable
statuses supporting an inferred activity are marked with a blue filling.</p>
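      <p>The progressive pruning of candidate activities can be sketched as follows. This minimal Python fragment is our own illustration under simplified assumptions (activity models reduced to sets of admissible sensor values, no temporal constraints), not the constraint-based reasoner itself.</p>

```python
# Minimal sketch of candidate pruning: at every time instant, keep only
# the modelled activities whose required sensor values are consistent
# with the reading observed at that instant.
# The models and sensor values below are illustrative, not the system's.

MODELS = {
    "SitDown": {"gesture": {"sit"}, "posture": {"sitting"}},
    "StandUp": {"gesture": {"stand"}, "posture": {"standing"}},
    "Walk": {"gesture": {"walk"}, "posture": {"standing"}},
}

def recognize(readings):
    """readings: one {sensor: value} snapshot per time instant."""
    candidates = set(MODELS)
    for snapshot in readings:
        candidates = {
            name for name in candidates
            if all(snapshot.get(sensor) in allowed
                   for sensor, allowed in MODELS[name].items())
        }
    return candidates

stream = [
    {"gesture": "walk", "posture": "standing"},
    {"gesture": "walk", "posture": "standing"},
]
print(recognize(stream))  # {'Walk'}
```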
      <p>Figure 2 reports the simulated sensor readings related to a person who drops
a heavy bag on the kitchen chair, then walks to the living-room and sits on
the armchair. Although the chair pressure sensor is activated by the bag (for
t = [16; 35]), the wearable gesture and posture sensors do not signal any sitting
motion, thus preventing the reasoner from making an erroneous inference.
Later on, when the person sits on the armchair, environmental and wearable
sensors agree in indicating that the person sat down, triggering the
correct recognition of the sitting motion. The example also highlights one
advantage of overloading rules: since the two modelled sitting actions
(i.e., sitting down on the kitchen chair and sitting down on the armchair) are
both defined as SitDown, it is possible to define a single model for the standing-up
motion, constrained by the previous occurrence of any sitting action.</p>
      <p>In this abstract, we introduced the idea of a multi-modal monitoring system
which combines information retrieved via different monitoring approaches, and
we showed, in simulation, that the integration of wearable and environmental
information is beneficial for the purposes of human activity monitoring. Future
work will focus on the set-up of a test bed apartment in an elderly care facility in
Örebro, Sweden, to test the performance of the system under realistic conditions.</p>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>
          1.
          <string-name>
            <surname>Johnson</surname>
            ,
            <given-names>D.O.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Cuijpers</surname>
            ,
            <given-names>R.H.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Juola</surname>
            ,
            <given-names>J.F.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Torta</surname>
            ,
            <given-names>E.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Simonov</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Frisiello</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Bazzani</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Yan</surname>
            ,
            <given-names>W.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Weber</surname>
            ,
            <given-names>C.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Wermter</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Meins</surname>
            ,
            <given-names>N.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Oberzaucher</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Panek</surname>
            ,
            <given-names>P.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Edelmayer</surname>
            ,
            <given-names>G.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Mayer</surname>
            ,
            <given-names>P.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Beck</surname>
            ,
            <given-names>C.</given-names>
          </string-name>
          :
          <article-title>Socially Assistive Robots: A Comprehensive Approach to Extending Independent Living</article-title>
          .
          <source>International Journal of Social Robotics</source>
          <volume>6</volume>
          ,
          <issue>2</issue>
          ,
          <fpage>195</fpage>
          -
          <lpage>211</lpage>
          (
          <year>2014</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>
          2.
          <string-name>
            <surname>Katz</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Chinn</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Cordrey</surname>
            ,
            <given-names>L.</given-names>
          </string-name>
          :
          <article-title>Multidisciplinary studies of illness in aged persons: a new classification of functional status in activities of daily living</article-title>
          .
          <source>J. Chron. Dis.</source>
          ,
          <volume>9</volume>
          ,
          <issue>1</issue>
          ,
          <fpage>55</fpage>
          -
          <lpage>62</lpage>
          (
          <year>1959</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>
          3.
          <string-name>
            <surname>Alam</surname>
            ,
            <given-names>M.R.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Reaz</surname>
            ,
            <given-names>M.B.I.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Ali</surname>
            <given-names>M.A.M.</given-names>
          </string-name>
          :
          <article-title>A Review of Smart Homes - Past, Present, and Future</article-title>
          .
          <source>IEEE Trans on Systems, Man, and Cybernetics, Part C: Applications and Reviews</source>
          ,
          <volume>42</volume>
          ,
          <issue>6</issue>
          ,
          <fpage>1190</fpage>
          -
          <lpage>1203</lpage>
          (
          <year>2012</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>
          4.
          <string-name>
            <surname>Lara</surname>
            ,
            <given-names>O.D.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Labrador</surname>
            ,
            <given-names>M.A.</given-names>
          </string-name>
          :
          <article-title>A Survey on Human Activity Recognition using Wearable Sensors</article-title>
          .
          <source>IEEE Communications Surveys and Tutorials</source>
          ,
          <volume>15</volume>
          ,
          <issue>3</issue>
          ,
          <fpage>1192</fpage>
          -
          <lpage>1209</lpage>
          (
          <year>2013</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>
          5.
          <string-name>
            <surname>Bruno</surname>
            ,
            <given-names>B.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Mastrogiovanni</surname>
            ,
            <given-names>F.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Sgorbissa</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Vernazza</surname>
            ,
            <given-names>T.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Zaccaria</surname>
            ,
            <given-names>R.</given-names>
          </string-name>
          :
          <article-title>Human motion modelling and recognition: A computational approach</article-title>
          .
          <source>In: IEEE Int Conf on Automation Science and Engineering (CASE)</source>
          ,
          <fpage>156</fpage>
          -
          <lpage>161</lpage>
          (
          <year>2012</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>
          6.
          <string-name>
            <surname>Bruno</surname>
            ,
            <given-names>B.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Mastrogiovanni</surname>
            ,
            <given-names>F.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Sgorbissa</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Vernazza</surname>
            ,
            <given-names>T.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Zaccaria</surname>
            ,
            <given-names>R.</given-names>
          </string-name>
          :
          <article-title>Analysis of human behavior recognition algorithms based on acceleration data</article-title>
          .
          <source>In: IEEE Int Conf on Robotics and Automation (ICRA)</source>
          ,
          <fpage>1602</fpage>
          -
          <lpage>1607</lpage>
          (
          <year>2013</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>
          7.
          <string-name>
            <surname>Kangas</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Konttila</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Lindgren</surname>
            ,
            <given-names>P.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Winblad</surname>
            ,
            <given-names>I.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Jamsa</surname>
            ,
            <given-names>T.</given-names>
          </string-name>
          :
          <article-title>Comparison of low-complexity fall detection algorithms for body attached accelerometers</article-title>
          .
          <source>Gait &amp; Posture</source>
          <volume>28</volume>
          ,
          <fpage>285</fpage>
          -
          <lpage>291</lpage>
          (
          <year>2008</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>
          8.
          <string-name>
            <surname>Saffiotti</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Broxvall</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Gritti</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>LeBlanc</surname>
            ,
            <given-names>K.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Lundh</surname>
            ,
            <given-names>R.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Rashid</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Seo</surname>
            ,
            <given-names>B.S.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Cho</surname>
            ,
            <given-names>Y.J.</given-names>
          </string-name>
          :
          <article-title>The PEIS-ecology project: vision and results</article-title>
          .
          <source>In: IEEE/RSJ Int Conf on Intelligent Robots and Systems (IROS)</source>
          ,
          <fpage>2329</fpage>
          -
          <lpage>2335</lpage>
          (
          <year>2008</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref9">
        <mixed-citation>
          9.
          <string-name>
            <surname>Pecora</surname>
            ,
            <given-names>F.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Cirillo</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Dell'Osa</surname>
            ,
            <given-names>F.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Ullberg</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Saffiotti</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          :
          <article-title>A constraint-based approach for proactive, context-aware human support</article-title>
          .
          <source>Journal of Ambient Intelligence and Smart Environments</source>
          <volume>4</volume>
          ,
          <fpage>347</fpage>
          -
          <lpage>367</lpage>
          (
          <year>2012</year>
          )
        </mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>