<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta>
      <journal-title-group>
        <journal-title>August</journal-title>
      </journal-title-group>
    </journal-meta>
    <article-meta>
      <title-group>
        <article-title>Behavior Classification for Bed Monitoring Using Short-Term 60 GHz-Band FMCW Radar Images</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Shuhei Hashimoto</string-name>
          <xref ref-type="aff" rid="aff1">1</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Xiangbo Kong</string-name>
          <xref ref-type="aff" rid="aff1">1</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Kazuhide Kamiya</string-name>
          <xref ref-type="aff" rid="aff1">1</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Kenshi Saho</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <aff id="aff0">
          <label>0</label>
          <institution>Department of Electronic and Computer Engineering, Ritsumeikan University</institution>
          ,
          <addr-line>1-1-1 Noji-Higashi, Kusatsu, Shiga</addr-line>
          ,
          <country country="JP">Japan</country>
        </aff>
        <aff id="aff1">
          <label>1</label>
          <institution>Department of Intelligent Robotics, Toyama Prefectural University</institution>
          ,
          <addr-line>5180 Kurokawa, Imizu, Toyama</addr-line>
          ,
          <country country="JP">Japan</country>
        </aff>
      </contrib-group>
      <pub-date>
        <year>2024</year>
      </pub-date>
      <volume>1</volume>
      <fpage>9</fpage>
      <lpage>22</lpage>
      <abstract>
        <p>This paper presents a practical 60 GHz-band radar-based method for recognizing behaviors in bed monitoring systems. Time-range images, derived from short-term data, were produced by processing frequency-modulated continuous-wave (FMCW) radar signals. Using the generated images, behaviors such as leaving the bed, lying on the bed, and sitting on the bed were classified with an accuracy of 96% using a convolutional neural network based on the relatively lightweight MobileNet model. Furthermore, six representative behaviors in bed monitoring were classified with an accuracy of 83.1%. These findings indicate that 60 GHz-band millimeter-wave radar holds promise as a non-invasive bed monitoring tool.</p>
      </abstract>
      <kwd-group>
        <kwd>Millimeter-wave radar</kwd>
        <kwd>60 GHz FMCW radar</kwd>
        <kwd>Human activity recognition</kwd>
      </kwd-group>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>1. Introduction</title>
      <p>
        In the elderly care sector, the shortage of caregivers relative to the number of individuals requiring
assistance has increased the burden on each caregiver. Consequently, round-the-clock
monitoring in nursing homes to prevent falls or wandering is required, which further contributes to
caregiver workload. Previous research has focused on sensor-based monitoring systems to
track in-bed behaviors, with early detection of bed-leaving being crucial for identifying
abnormal situations. These systems have employed various devices such as cameras, pressure
sensors, wearable sensors, and infrared cameras [
        <xref ref-type="bibr" rid="ref1 ref2">1, 2</xref>
        ]. While pressure sensors are easy to install
and readily detect bed-leaving, their accuracy for behavior classification is insufficient.
Wearable sensors require caregiver assistance for attachment and removal, adding to the
workload. Camera systems, including infrared cameras, offer non-contact monitoring but raise
privacy concerns and pose installation challenges within private rooms.
      </p>
      <p>
        To tackle the above problems, recent research has explored the integration of radar technology
into monitoring systems designed for the elderly [
        <xref ref-type="bibr" rid="ref3 ref4">3, 4</xref>
        ]. Radar enables non-contact
measurement of distance and velocity information, and can operate effectively in low-light
conditions. It does not raise privacy concerns because it captures no visual data. Recently,
millimeter-wave radar techniques have achieved accurate human pose detection [
        <xref ref-type="bibr" rid="ref5 ref6">5, 6</xref>
        ]. However,
such techniques require long-term data and computationally heavy processing.
      </p>
      <p>
        Thus, we developed a bed-leaving detection method using short-term radar data [
        <xref ref-type="bibr" rid="ref7">7</xref>
        ]. We used
millimeter-wave radar images, which contain range and velocity information of human
body parts, to classify human behaviors related to bed-leaving and bed-lying. By applying
frequency-modulated continuous-wave (FMCW) radar signal processing, time-range
(time-distance) and time-velocity images were generated as the input of machine-learning-based
behavior classification models. Accurate classification was demonstrated using images
corresponding to short-term data, as required for real-time monitoring systems in nursing homes. As a
result, the convolutional neural network (CNN) utilizing time-range images accurately classified
bed-leaving and other behaviors, including sitting on the bed, with over 90% accuracy. However,
that study performed only binary classification, i.e., a simple two-class classification of
bed-leaving versus bed-lying; results are lacking for developing a monitoring system that
covers a variety of behaviors.
      </p>
      <p>In this paper, we report experimental results for multinomial classification of human
behaviors for bed monitoring. We assume six classes of behaviors related to bed-leaving and
bed-lying activities. Classification accuracies of practical CNN models, with the
short-term time-range images as input, were evaluated on data from 10 participants.</p>
    </sec>
    <sec id="sec-2">
      <title>2. Radar Experimental Setup</title>
      <p>Figure 1 shows the experimental setup. A 60 GHz FMCW radar was installed at a height of 40
cm above the surface of the bed. The radar bandwidth was B = 6.8 GHz, the range resolution
was ∆R = c/2B = 2.4 cm (where c is the speed of light), and the frame rate was 80 Hz. The radar
beamwidth in a plane parallel to the bed surface was 120°.</p>
      <p>The participants were ten young adults. Instructions were given before measurement, and
each behavior was measured for 60 s at a frame rate of 80 Hz. Our
experiments aim to classify representative behaviors on and around the bed via unconstrained
radar measurements. The assumed behaviors were "bed-leaving", "bed-lying", "sitting
square", "long sitting", "standing outside", and "lying outside". Fig. 2 shows the experimental
scenes for some classes. Each behavior is defined as follows.</p>
      <p>• bed-leaving: A participant is not in the radar measurement area.
• bed-lying: A participant is lying on the bed. This includes the participant’s slight motions
in the bed.
• sitting square: A participant is sitting on the edge of the bed.
• long sitting: A participant is sitting with extended legs on the bed.
• standing outside: A participant is standing by the bed.
• lysing outside: A participant is lying on the floor by the bed. This class assumes abnormal
situation.</p>
    </sec>
    <sec id="sec-3">
      <title>3. Generation of Time-Range Images</title>
      <p>
        To classify the behaviors, we calculate the time-range distributions from the radar received
signals similar to our previous work [
        <xref ref-type="bibr" rid="ref7">7</xref>
        ]. We obtain time-range distribution images by Fourier
transforming the received signal corresponding to each pulse transmission. Every 0.5 s,
a 224 × 224 PNG image was generated with time on the horizontal axis and range on the
vertical axis, whose color corresponds to the received power.
      </p>
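      <p>As a minimal illustration of this pipeline, the following sketch turns a block of FMCW beat signals into a time-range power map via a per-frame range FFT. Only the 80 Hz frame rate and the 0.5 s window come from the text; the data layout and sample count are assumptions.</p>

```python
import numpy as np

# Frame rate and window length are from the text; the number of
# fast-time samples per frame is a stand-in value.
FRAME_RATE_HZ = 80
WINDOW_S = 0.5
N_SAMPLES = 256

def time_range_map(beat_signals: np.ndarray) -> np.ndarray:
    """Convert (n_frames, n_samples) FMCW beat signals into a
    time-range power map: one range FFT per frame, then transpose so
    time runs horizontally and range vertically, as in the paper."""
    # Range FFT over fast time; keep only the positive-frequency bins.
    spectra = np.fft.fft(beat_signals, axis=1)[:, : beat_signals.shape[1] // 2]
    power_db = 20 * np.log10(np.abs(spectra) + 1e-12)
    return power_db.T  # rows = range bins, columns = time (frames)

n_frames = int(FRAME_RATE_HZ * WINDOW_S)      # 40 frames per 0.5 s image
block = np.random.randn(n_frames, N_SAMPLES)  # stand-in for received data
img = time_range_map(block)
print(img.shape)  # (128, 40)
```

      <p>In practice the resulting map would be color-mapped and rendered to a 224 × 224 PNG before being fed to the classifier.</p>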
      <p>Fig. 3 shows examples of time-range images for the six classes. On the vertical
axis of these images, lower positions correspond to farther ranges. There are no significant components
in the "bed-leaving" image because the participant is not in the measurement area. The
"lying outside" image is similar to "bed-leaving" except for slight components at a relatively
far range. Similar significant components can be confirmed in the "bed-lying", "sitting square",
and "long sitting" images. In the "standing outside" images, the significant components appear
at a farther range than in "bed-lying". We used these images as CNN inputs to
classify the six classes based on the slight differences described above.</p>
    </sec>
    <sec id="sec-4">
      <title>4. Results and Discussion for Behavior Classification using CNN</title>
      <p>In this section, the classification results for representative behaviors related to bed-leaving
and bed-lying movements are presented. For each classification, the data to evaluate classification
performance were obtained from 10 participants, and the mean classification accuracy under
leave-one-subject-out cross-validation was evaluated. We generated about 100 time-range images
for each behavior across all participants.</p>
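      <p>The leave-one-subject-out protocol described above can be sketched as follows; the evaluator callback is a hypothetical stand-in for training a CNN on the training folds and scoring it on the held-out subject.</p>

```python
from typing import Callable, Dict, List

def leave_one_subject_out(
    data_by_subject: Dict[str, list],
    evaluate: Callable[[list, list], float],
) -> float:
    """Hold out each subject once, evaluate on that subject's data,
    and return the mean accuracy over all folds."""
    accuracies: List[float] = []
    for held_out, test_set in data_by_subject.items():
        # Training data is everything from the remaining subjects.
        train_set = [x for subj, xs in data_by_subject.items()
                     if subj != held_out for x in xs]
        accuracies.append(evaluate(train_set, test_set))
    return sum(accuracies) / len(accuracies)

# Toy run: ten subjects, a dummy evaluator reporting 0.9 on every fold.
dummy_data = {f"subject{i}": [i] for i in range(10)}
mean_acc = leave_one_subject_out(dummy_data, lambda train, test: 0.9)
```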
      <p>The behavior corresponding to each image is classified using the CNN. For the CNN
architecture, we used ResNet-18 [8] because it has been demonstrated to be efficient for radar-based
human activity recognition problems [9]. In addition, we also
used MobileNetV3 [10] because the ResNet model size is relatively large, and a smaller model
size is required for practical use based on edge computing. For both models, we performed
training for 50 epochs with a batch size of 64. The learning rate was 0.01 and was decreased
by multiplying it by 0.5 every ten epochs. These hyperparameters were empirically optimized.</p>
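      <p>The stated learning-rate schedule (0.01, halved every ten epochs) corresponds to a standard step decay, sketched here in plain Python; the helper name is ours, not from the paper.</p>

```python
def step_decay_lr(epoch: int, base_lr: float = 0.01,
                  gamma: float = 0.5, step: int = 10) -> float:
    """Learning rate at a given epoch under the paper's recipe:
    start at 0.01 and multiply by 0.5 every ten epochs."""
    return base_lr * gamma ** (epoch // step)

# Schedule over the 50 training epochs: epochs 0-9 use 0.01,
# epochs 10-19 use 0.005, ..., epochs 40-49 use 0.000625.
schedule = [step_decay_lr(e) for e in range(50)]
```

      <p>The same schedule can be reproduced in common deep learning frameworks with a step-decay scheduler configured with step size 10 and decay factor 0.5.</p>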
      <sec id="sec-4-1">
        <title>4.1. Classification of Bed-Leaving and Other Behaviors</title>
        <p>Because detecting bed-leaving is the most important function of daily bed monitoring
systems, this subsection considers the classification of bed-leaving versus the other five behaviors.
Table 1 shows the results for ResNet, with a mean classification accuracy of 97.8%. Sufficient
accuracy was achieved, and the rate of misclassifying bed-leaving as other behaviors
is low. This result is considered safe in terms of developing a monitoring system. Table 2
shows the results for MobileNet, with a mean classification accuracy of 95.9%. Although the
accuracy of MobileNet is lower than that of ResNet, sufficiently accurate detection of bed-leaving
is achieved even with MobileNet's smaller model size.</p>
      </sec>
      <sec id="sec-4-2">
        <title>4.2. Multinomial Classification of Behaviors</title>
        <p>We classify six behaviors to develop more useful multinomial behavior classification functions
for bed monitoring systems. Table 3 shows the results for ResNet, with a mean
classification accuracy of 89.7%. As indicated in this table, the classes "sitting square" and "long sitting"
were misclassified because the features of these classes were not sufficiently expressed in the
time-range image. In addition, "lying outside" was also largely misclassified because part of
the participant was outside the measurement area; consequently, its time-range images contained less
information. However, the important classes "bed-leaving" and "bed-lying" were accurately
classified. Thus, we achieve behavior classification with moderate accuracy even using the
images generated from short-term data of 0.5 s.</p>
        <p>Table 4 shows the results for MobileNet, with a mean classification accuracy of 68.5%.
Because MobileNet is a simpler CNN model, its classification accuracy is lower than that of ResNet.
However, the model sizes of ResNet and MobileNet in this study were 18.0 and 0.2 MB,
respectively. Thus, improving MobileNet's performance is an important future task for
developing practical bed monitoring systems with simple implementations. Multiple
images corresponding to multiple receiving antennas could be used to improve the classification
performance.</p>
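        <p>One hedged sketch of that multi-antenna idea: per-antenna time-range images can be stacked along the channel axis so the CNN sees all receivers at once. The antenna count and shapes below are illustrative, not taken from the paper.</p>

```python
import numpy as np

def stack_antenna_images(images) -> np.ndarray:
    """Stack per-antenna time-range images of shape (H, W) into a
    single (n_antennas, H, W) array, i.e. one CNN input with one
    channel per receiving antenna."""
    return np.stack(images, axis=0)

# Illustrative: three receive antennas, each yielding a 224 x 224 image.
per_antenna = [np.zeros((224, 224), dtype=np.float32) for _ in range(3)]
multi_channel_input = stack_antenna_images(per_antenna)
print(multi_channel_input.shape)  # (3, 224, 224)
```

        <p>The first convolutional layer of the network would then be configured for three input channels instead of one.</p>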
      </sec>
    </sec>
    <sec id="sec-5">
      <title>5. Conclusion</title>
      <p>To develop practical remote bed monitoring systems for nursing care, this study aimed to classify
various behaviors related to bed-leaving and bed-lying using short-term data generated from the
received signals of a 60 GHz millimeter-wave FMCW radar. The CNN, utilizing time-range images
generated from radar data at 0.5 s intervals, achieved accurate classification of bed-leaving
and other representative behaviors (bed-lying, sitting square, long sitting, standing outside the
bed, and lying outside the bed). Even when we used the simple MobileNet, the detection
rate of bed-leaving was 95.9%. In addition, we classified the above six types of behaviors with
89.7% accuracy using ResNet. However, the classification accuracy using MobileNet
was 68.5%, although it has the merit of a sufficiently smaller model size. Thus, to develop practical
bed monitoring systems, future investigation into improving the classification accuracy using
multiple images from multiple receivers is required.</p>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>
          [1]
          <string-name>
            <given-names>L.-J.</given-names>
            <surname>Kau</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.-Y.</given-names>
            <surname>Wang</surname>
          </string-name>
          ,
          <string-name>
            <given-names>H.</given-names>
            <surname>Zhou</surname>
          </string-name>
          ,
          <article-title>Pressure-sensor-based sleep status and quality evaluation system</article-title>
          ,
          <source>IEEE Sensors Journal</source>
          <volume>23</volume>
          (
          <year>2023</year>
          )
          <fpage>9739</fpage>
          -
          <lpage>9754</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>
          [2]
          <string-name>
            <given-names>C.-J.</given-names>
            <surname>Lin</surname>
          </string-name>
          ,
          <string-name>
            <given-names>C.-H.</given-names>
            <surname>Shih</surname>
          </string-name>
          , T.-S. Wei, P.-T. Liu, C.-Y. Shih,
          <article-title>Local object tracking using infrared array for bed-exit behavior recognition</article-title>
          ,
          <source>Sensors &amp; Materials</source>
          <volume>34</volume>
          (
          <year>2022</year>
          ).
        </mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>
          [3]
          <string-name>
            <given-names>D.</given-names>
            <surname>Xu</surname>
          </string-name>
          ,
          <string-name>
            <given-names>X.</given-names>
            <surname>Qi</surname>
          </string-name>
          ,
          <string-name>
            <given-names>C.</given-names>
            <surname>Li</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Z.</given-names>
            <surname>Sheng</surname>
          </string-name>
          ,
          <string-name>
            <given-names>H.</given-names>
            <surname>Huang</surname>
          </string-name>
          ,
          <article-title>Wise information technology of med: human pose recognition in elderly care</article-title>
          ,
          <source>Sensors</source>
          <volume>21</volume>
          (
          <year>2021</year>
          )
          <fpage>7130</fpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>
          [4]
          <string-name>
            <given-names>F.-K.</given-names>
            <surname>Chen</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Y.-K.</given-names>
            <surname>Wang</surname>
          </string-name>
          ,
          <string-name>
            <given-names>H.-P.</given-names>
            <surname>Lin</surname>
          </string-name>
          ,
          <string-name>
            <given-names>C.-Y.</given-names>
            <surname>Chen</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.-M.</given-names>
            <surname>Yeh</surname>
          </string-name>
          , C.-Y. Wang,
          <article-title>Detecting anomalies of daily living of the elderly using radar and self-comparison method</article-title>
          ,
          <source>in: 2022 18th IEEE/ASME International Conference on Mechatronic and Embedded Systems and Applications (MESA)</source>
          , IEEE,
          <year>2022</year>
          , pp.
          <fpage>1</fpage>
          -
          <lpage>6</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>
          [5]
          <string-name>
            <given-names>A.</given-names>
            <surname>Sengupta</surname>
          </string-name>
          ,
          <string-name>
            <given-names>F.</given-names>
            <surname>Jin</surname>
          </string-name>
          ,
          <string-name>
            <given-names>R.</given-names>
            <surname>Zhang</surname>
          </string-name>
          , S. Cao,
          <article-title>mm-Pose: Real-time human skeletal posture estimation using mmwave radars and cnns</article-title>
          ,
          <source>IEEE Sensors Journal</source>
          <volume>20</volume>
          (
          <year>2020</year>
          )
          <fpage>10032</fpage>
          -
          <lpage>10044</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>
          [6]
          <string-name>
            <given-names>X.</given-names>
            <surname>Zhou</surname>
          </string-name>
          ,
          <string-name>
            <given-names>T.</given-names>
            <surname>Jin</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Y.</given-names>
            <surname>Dai</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Y.</given-names>
            <surname>Song</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Z.</given-names>
            <surname>Qiu</surname>
          </string-name>
          ,
          <article-title>Md-pose: Human pose estimation for single-channel UWB radar</article-title>
          ,
          <source>IEEE Transactions on Biometrics, Behavior, and Identity Science</source>
          <volume>5</volume>
          (
          <year>2023</year>
          )
          <fpage>449</fpage>
          -
          <lpage>463</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>
          [7]
          <string-name>
            <given-names>S.</given-names>
            <surname>Hashimoto</surname>
          </string-name>
          ,
          <string-name>
            <given-names>X.</given-names>
            <surname>Kong</surname>
          </string-name>
          ,
          <string-name>
            <given-names>K.</given-names>
            <surname>Manabe</surname>
          </string-name>
          ,
          <string-name>
            <given-names>H.</given-names>
            <surname>Minematsu</surname>
          </string-name>
          ,
          <string-name>
            <given-names>K.</given-names>
            <surname>Saho</surname>
          </string-name>
          , Classification of behaviors related to bed-leaving and bed-lying using millimeter-wave FMCW radar, in: ATAIT, 2023, pp. 105–114.
        </mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>[8] K. He, X. Zhang, S. Ren, J. Sun, Deep residual learning for image recognition, in: Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, 2016, pp. 770–778.</mixed-citation>
      </ref>
      <ref id="ref9">
        <mixed-citation>[9] S. Z. Gurbuz, M. G. Amin, Radar-based human-motion recognition with deep learning: Promising applications for indoor monitoring, IEEE Signal Processing Magazine 36 (2019) 16–28.</mixed-citation>
      </ref>
      <ref id="ref10">
        <mixed-citation>[10] A. G. Howard, et al., MobileNets: Efficient convolutional neural networks for mobile vision applications, arXiv preprint arXiv:1704.04861 (2017).</mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>