<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta />
    <article-meta>
      <title-group>
        <article-title>Verification of Pulse Rate Estimation Accuracy for Human-recorder</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Daiki Nomura</string-name>
          <xref ref-type="aff" rid="aff2">2</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Tsubasa Kinoshita</string-name>
          <xref ref-type="aff" rid="aff1">1</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Takeshi Kumaki</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Kyosuke Kageyama</string-name>
          <xref ref-type="aff" rid="aff1">1</xref>
        </contrib>
        <aff id="aff0">
          <label>0</label>
          <institution>Dept. of Electrical, Electronic and Communication Engineering, Kindai University</institution>
          ,
          <addr-line>3-4-1 Kowakae, Higashi-Osaka, Osaka</addr-line>
          ,
          <country country="JP">Japan</country>
        </aff>
        <aff id="aff1">
          <label>1</label>
          <institution>Dept. of Electronic and Computer Engineering, Ritsumeikan University</institution>
          ,
          <addr-line>1-1-1 Noji-Higashi, Kusatsu, Shiga</addr-line>
          ,
          <country country="JP">Japan</country>
        </aff>
        <aff id="aff2">
          <label>2</label>
          <institution>Graduate School of Science and Engineering, Kindai University</institution>
          ,
          <addr-line>3-4-1 Kowakae, Higashi-Osaka, Osaka</addr-line>
          ,
          <country country="JP">Japan</country>
        </aff>
      </contrib-group>
      <fpage>142</fpage>
      <lpage>149</lpage>
      <abstract>
<p>Rescue workers at disaster sites and construction site operators routinely perform demanding tasks and become fatigued as a result. This fatigue causes discomfort and a lack of motivation at work, which affects both work efficiency and safety. It is therefore necessary to constantly monitor their physical condition. Several existing technologies detect fatigue using face images or wearable devices. However, these systems have several issues: recording the face risks invading the subject's privacy, and constantly wearing a wearable device is inconvenient. In recent years, head-mounted cameras attached to helmets have become widespread. A human-recorder has therefore been proposed to obtain the user's biometric information and detect fatigue from the head-mounted camera. The human-recorder extracts head sway from a video recorded by the head-mounted camera and analyzes its time change. In this paper, the estimated pulse rate is calculated using the human-recorder to confirm that the pulse rate can be estimated in real time. Furthermore, the effect of the camera frame rate on the accuracy of pulse rate estimation is verified. The results verify that the pulse rate can be estimated using the human-recorder, and confirm that pulse rate estimation is possible regardless of the camera frame rate.</p>
      </abstract>
      <kwd-group>
<kwd>Human-recorder</kwd>
        <kwd>Pulse rate estimation</kwd>
        <kwd>Optical flow</kwd>
        <kwd>Frequency analysis</kwd>
      </kwd-group>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>1. Introduction</title>
<p>sites. Also, it can be used for hands-free operation. Therefore, a human-recorder has been proposed
to obtain the user’s biometric information for fatigue detection using the head-mounted camera
[5, 6, 7]. The human-recorder extracts head sway from a video recorded by the head-mounted
camera and analyzes the time change of the head sway. In this paper, the estimated pulse rate is
calculated using the human-recorder to confirm that the pulse rate can be estimated in real time.
Furthermore, the effect of the camera frame rate on the accuracy of pulse rate estimation is
verified. This analysis is important because it supports more practical, real-time use at
disaster sites and construction sites.</p>
    </sec>
    <sec id="sec-2">
      <title>2. Human-recorder for pulse estimation</title>
      <p>(1) The feature point P1(x1, y1) is determined from the first frame of the recorded video. The
feature coordinate point Pn(xn, yn) in each frame of the recorded video is then calculated using
optical flow.
(2) The time change of the x-coordinate of the feature coordinate point is extracted, and
frequency analysis is applied. The user’s pulse rate is estimated from the result of the frequency analysis.
      </p>
      <p>[Figure 1: Overview of the human-recorder: (1) feature point tracking of Pn(xn, yn) in the video from the head-mounted camera, and (2) frequency analysis of the time change, yielding a power spectrum over frequency.]</p>
      <sec id="sec-2-1">
        <title>2.1. The feature coordinate point calculation using optical flow</title>
        <p>The feature coordinate point is calculated using optical flow to extract the user’s head sway (Fig.
2 (b)). The first feature point P1(x1, y1) is determined from the first frame of the recorded video.
The feature point Pn(xn, yn) in each frame is calculated by tracking P1(x1, y1)
using optical flow, and the time changes of the feature coordinate point in the x-coordinate (Fig. 2
(c)) and y-coordinate (Fig. 2 (d)) are extracted. The x-coordinate time change is analyzed in this
paper.</p>
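<p>As a rough, library-free sketch of the single-point tracking that optical flow performs here (the paper does not specify its implementation; the window size and the synthetic test are illustrative assumptions), one Lucas-Kanade step estimates the displacement of a feature point between two grayscale frames:</p>
<preformat>
```python
import numpy as np

def lk_translation(I0, I1, x, y, half=7):
    """One Lucas-Kanade step: estimate the displacement (dx, dy) of the
    point (x, y) between grayscale frames I0 and I1, using image gradients
    in a (2*half+1) x (2*half+1) window around the point."""
    win0 = I0[y - half:y + half + 1, x - half:x + half + 1].astype(float)
    win1 = I1[y - half:y + half + 1, x - half:x + half + 1].astype(float)
    Iy, Ix = np.gradient(win0)   # spatial gradients (row = y, col = x)
    It = win1 - win0             # temporal gradient between the frames
    # Solve the over-determined system [Ix Iy] d = -It in least squares.
    A = np.stack([Ix.ravel(), Iy.ravel()], axis=1)
    b = -It.ravel()
    (dx, dy), *_ = np.linalg.lstsq(A, b, rcond=None)
    return dx, dy
```
</preformat>
<p>In practice a library routine such as OpenCV’s pyramidal Lucas-Kanade tracker would be applied per frame to P1(x1, y1), accumulating the x- and y-coordinate time series.</p>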
      </sec>
      <sec id="sec-2-2">
        <title>2.2. Pulse rate estimation using frequency analysis</title>
        <p>The time change of the x-coordinate is transformed by FFT, and a band-pass filter is applied to the resulting power spectrum. The peak frequency of the filtered spectrum corresponds to the estimated pulse rate. [Figure: power spectrum before and after the band-pass filter, with the peak marked; (c) pulse analysis with the band-pass filter applied. Figure: pulse sensor prototype with battery and case.]</p>
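<p>The FFT, band-pass, and peak-picking steps can be sketched as follows, assuming a simple spectral mask in place of whatever band-pass filter the authors use; the band edges (0.75–3.0 Hz, roughly 45–180 bpm) are illustrative assumptions, not values from the paper:</p>
<preformat>
```python
import numpy as np

def estimate_pulse_rate(x, fps, lo_hz=0.75, hi_hz=3.0):
    """Estimate pulse rate (bpm) from the x-coordinate time series:
    FFT to a power spectrum, keep only the plausible pulse band,
    and read off the peak frequency."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()                         # remove the DC component
    power = np.abs(np.fft.rfft(x)) ** 2      # power spectrum
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fps)
    band = (freqs >= lo_hz) & (freqs <= hi_hz)   # crude band-pass mask
    peak_hz = freqs[band][np.argmax(power[band])]
    return peak_hz * 60.0                    # Hz -> beats per minute
```
</preformat>
<p>For example, a 1.2 Hz sway component sampled at 50 fps for 120 seconds yields an estimate of 72 bpm, while a slower 0.2 Hz component (e.g. breathing) falls outside the band and is ignored.</p>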
      </sec>
    </sec>
    <sec id="sec-3">
      <title>3. Experiment procedure</title>
      <sec id="sec-3-1">
        <title>3.1. Pulse rate estimation using human-recorder</title>
        <p>In this experiment, a video is recorded by the human-recorder and analyzed to confirm that
real-time pulse rate estimation is possible. Figure 4 shows the overview of pulse rate estimation
using the human-recorder. In this prototype, a Raspberry Pi Zero 2 W controls a Camera Module 3
at 50 fps. Three participants wear the prototype and record a video for 120 seconds in a
sitting position. Then, the estimated pulse rate is calculated by analyzing the video recorded
by the human-recorder. Additionally, the actual pulse rate of each participant is measured
using a pulse sensor during recording, for comparison with the estimated pulse rate.</p>
        <p>[Figure 4: (a) Recording situation; (b) Human-recorder.]</p>
      </sec>
      <sec id="sec-3-2">
        <title>3.2. Verification pulse rate estimation accuracy by fps</title>
        <p>In this experiment, the effect of the camera frame rate on the accuracy of pulse rate estimation
is verified by setting the fps to 30, 40, 50, and 60 in the same situation as in Section 3.1. The Root
Mean Square Error (RMSE) is used to evaluate estimation accuracy; it is the square root of the
mean of the squared differences between the estimated and actual values. RMSE, defined by
Equation (3), provides a single value that summarizes how close the estimated values are to the
true values; smaller RMSE values indicate higher accuracy.</p>
        <p>RMSE = √((1/n) Σᵢ₌₁ⁿ (yᵢ − ŷᵢ)²)  (3)</p>
        <p>where n is the number of measurements, yᵢ is the actual pulse rate, and ŷᵢ is the estimated pulse rate.</p>
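<p>Equation (3) can be computed directly; a minimal sketch (variable names are illustrative):</p>
<preformat>
```python
import numpy as np

def rmse(actual, estimated):
    """Equation (3): root mean square error between the actual pulse
    rates y_i and the estimated pulse rates y-hat_i."""
    a = np.asarray(actual, dtype=float)
    e = np.asarray(estimated, dtype=float)
    return float(np.sqrt(np.mean((a - e) ** 2)))
```
</preformat>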
      </sec>
    </sec>
    <sec id="sec-4">
      <title>4. Experiment result</title>
      <sec id="sec-4-1">
        <title>4.1. Result of pulse rate estimation using human-recorder</title>
      </sec>
      <sec id="sec-4-2">
        <title>4.2. Verification of pulse rate estimation by fps</title>
      </sec>
      <sec id="sec-4-3">
        <title>4.3. Conclusion</title>
        <p>In this paper, the estimated pulse rate was calculated using the human-recorder to confirm that
the pulse rate can be estimated in real time. Furthermore, the effect of the camera’s frame rate on
the accuracy of pulse rate estimation was evaluated. The results demonstrated that the pulse
rate measured by the human-recorder was in close agreement with the actual pulse rate; in
other words, the human-recorder enables pulse rate estimation. Based on the RMSE results, it
can also be confirmed that the difference in fps has little effect on the accuracy of pulse rate
estimation. However, it was observed that the number of reliable feature coordinate points
decreases as the fps decreases, which may negatively affect estimation accuracy. To address this
issue, future work will focus on developing methods to interpolate missing or sparse feature
point data, especially under low-fps conditions, in order to maintain stable performance.</p>
      </sec>
    </sec>
    <sec id="sec-5">
      <title>Declaration on Generative AI</title>
      <p>The author(s) have not employed any Generative AI tools.</p>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>
          [1] https://j-fatigue.jp/wp-content/uploads/2024/02/guideline.pdf (Accessed 1 June 2025.)
        </mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>
          [2]
          <string-name>
            <given-names>Iori</given-names>
            <surname>Suzuki</surname>
          </string-name>
          and Fumiaki Satou:
          <source>Stress Estimation Based on Heart Rate Variability Detected by Wearable Devices</source>
          ,
          <source>2020 Information Processing Society of Japan</source>
          , Vol. 2020-CSEC-88, No.
          <volume>30</volume>
          , Mar.
          <year>2020</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>
          [3]
          <string-name>
            <given-names>Takuro</given-names>
            <surname>Nonomura</surname>
          </string-name>
          ,
          <article-title>Tsuyoshi Hayakawa and Hiroshi Yamamoto: Research and Development of a Fatigue Estimation System Using Multidimensional Analysis of Webcam Images, The Institute of Electronics, Information and Communication Engineers</article-title>
          ,
          <source>ISS-SP-030</source>
          , p.
          <fpage>166</fpage>
          ,
          Mar.
          <year>2021</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>
          [4] https://www.fdma.go.jp/singi_kento/kento/items/post-51/03/shiryou1.pdf (Accessed 1 June 2025.)
        </mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>
          [5]
          <string-name>
            <given-names>Daiki</given-names>
            <surname>Nomura</surname>
          </string-name>
          , Masato Inoue, Takehiro Nakano, Yoshihiko Hibino, Takeshi Kumaki, and Kyosuke Kageyama, “
          <article-title>Feature Extraction of Head Movements using the Fatigue Detection Human-recorder,”</article-title>
          <source>Journal of Signal Processing</source>
          , vol.
          <volume>29</volume>
          , no.
          <issue>4</issue>
          , pp.
          <fpage>103</fpage>
          -
          <lpage>106</lpage>
          ,
          <year>2025</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>
          [6]
          <string-name>
            <given-names>Daiki</given-names>
            <surname>Nomura</surname>
          </string-name>
          , Masato Inoue, Takehiro Nakano, Yoshihiko Hibino, Takeshi Kumaki, and Kyosuke Kageyama, “
          <article-title>Study of Human Recorder for Fatigue Detection with Head-mounted Camera,”</article-title>
          <source>2024 Annual Conference on Electronics, Information and Systems</source>
          ,
          <source>The Institute of Electrical Engineers of Japan</source>
          , pp.
          <fpage>825</fpage>
          -
          <lpage>828</lpage>
          , Sep.,
          <year>2024</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>
          [7]
          <string-name>
            <given-names>Daiki</given-names>
            <surname>Nomura</surname>
          </string-name>
          , Masato Inoue, Yoshihiko Hibino, Takeshi Kumaki, and Kyosuke Kageyama, “
          <article-title>Verification of Fatigue Detection Human-recorder Using Depth Estimation Method,”</article-title>
          Vol. 2025-EMB-68, No. 30, pp. 1-6, Mar.
          <year>2025</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>[8] https://www.raspberrypi.com/documentation/computers/raspberry-pi.html (Accessed 1 June 2025.)</mixed-citation>
      </ref>
      <ref id="ref9">
        <mixed-citation>[9] https://www.raspberrypi.com/products/camera-module-3/ (Accessed 1 June 2025.)</mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>