<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta />
    <article-meta>
      <title-group>
        <article-title>Hybrid Loss for Robust FMCW Radar-based Heartbeat Sensing</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Ying Wang</string-name>
          <email>ying.wang@nuist.edu.cn</email>
          <xref ref-type="aff" rid="aff0">0</xref>
          <xref ref-type="aff" rid="aff1">1</xref>
          <xref ref-type="aff" rid="aff2">2</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Zhaodong Sun</string-name>
          <email>zhaodong.sun@nuist.edu.cn</email>
          <xref ref-type="aff" rid="aff0">0</xref>
          <xref ref-type="aff" rid="aff1">1</xref>
          <xref ref-type="aff" rid="aff2">2</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Xu Cheng</string-name>
          <email>xcheng@nuist.edu.cn</email>
          <xref ref-type="aff" rid="aff0">0</xref>
          <xref ref-type="aff" rid="aff1">1</xref>
          <xref ref-type="aff" rid="aff2">2</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Zuxian He</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
          <xref ref-type="aff" rid="aff1">1</xref>
          <xref ref-type="aff" rid="aff2">2</xref>
        </contrib>
        <kwd-group>
          <kwd>FMCW Radar</kwd>
          <kwd>Remote Heart Rate Sensing</kwd>
          <kwd>Deep Learning</kwd>
          <kwd>Vital Signs</kwd>
        </kwd-group>
        <aff id="aff0">
          <label>0</label>
          <addr-line>210004</addr-line>
          ,
          <country country="CN">China</country>
        </aff>
        <aff id="aff1">
          <label>1</label>
          <institution>School of Computer Science, Nanjing University of Information Science and Technology</institution>
          ,
          <addr-line>219 Ningliu Road, Nanjing, Jiangsu</addr-line>
        </aff>
        <aff id="aff2">
          <label>2</label>
          <institution>The 4th Vision-based Remote Physiological Signal Sensing (RePSS) Challenge &amp; Workshop</institution>
        </aff>
      </contrib-group>
      <pub-date>
        <year>2025</year>
      </pub-date>
      <abstract>
        <p>Remote physiological sensing using Frequency Modulated Continuous Wave (FMCW) radar has emerged as a promising alternative to contact-based methods due to its non-intrusive nature and privacy preservation. However, existing signal-processing and CNN-based approaches suffer from phase wrapping ambiguities, noise sensitivity, and limited ability to capture long-range dependencies in heartbeat dynamics. In this work, we propose a novel CNN-Transformer framework for supervised radar-based heartbeat measurement. The CNN component extracts local temporal features, while the Transformer encoder models long-range dependencies critical for periodic cardiac motion. To further enhance performance, we design a hybrid loss function that integrates Negative Pearson Loss, Signal-to-Noise Ratio (SNR) Loss, and Sparsity Loss, effectively balancing temporal fidelity, noise robustness, and physiologically meaningful frequency representation. We additionally introduce RadHR, a new FMCW radar dataset with recordings from 50 participants, providing a high-quality benchmark for non-contact heartbeat estimation. Extensive experiments on both the public EquiPleth dataset and RadHR demonstrate that our method consistently outperforms existing baselines, achieving state-of-the-art accuracy and robustness under realistic conditions.</p>
      </abstract>
    </article-meta>
  </front>
  <body>
    <sec id="sec-2">
      <title>1. Introduction</title>
      <p>CEUR Workshop Proceedings (ISSN 1613-0073)</p>
      <p>
        Beyond reviews, numerous radar architectures and signal processing pipelines have been explored
for real-world vital sign detection. Impulse-radio ultra-wideband (IR-UWB) radar, owing to its fine
temporal resolution, has been widely applied to monitor heartbeat and respiration. Early systems
demonstrated non-contact heart rate monitoring using IR-UWB signals under controlled conditions
[
        <xref ref-type="bibr" rid="ref5">5</xref>
        ], while subsequent preclinical studies validated simultaneous monitoring of respiration and carotid
pulsation, paving the way for clinical applicability [
        <xref ref-type="bibr" rid="ref6">6</xref>
        ]. IR-UWB radar has also been deployed in
challenging enclosed environments, such as inside vehicles, to detect and localize passengers while
extracting vital signs through non-line-of-sight measurements [
        <xref ref-type="bibr" rid="ref7">7</xref>
        ].
      </p>
      <p>
        In addition to IR-UWB methods, self-calibrating radar systems have been proposed to improve stability
and adaptability across different users and scenarios. These approaches automatically adjust system
parameters to mitigate the effects of channel variation and hardware non-idealities, thereby enhancing
robustness for long-term monitoring [
        <xref ref-type="bibr" rid="ref8">8</xref>
        ]. Meanwhile, mm-wave FMCW radars have been validated for
remote monitoring of human vital signs, showing strong resilience to environmental interference and
enabling compact, low-power implementations suitable for pervasive healthcare systems [
        <xref ref-type="bibr" rid="ref9">9</xref>
        ].
      </p>
      <p>
        Another emerging application domain is radar-based vital sign monitoring in automotive
environments. Detecting passenger presence and health conditions inside vehicles is particularly challenging
due to vibrations and motion artifacts. Recent studies have conducted both theoretical investigations
[
        <xref ref-type="bibr" rid="ref10">10</xref>
        ] and practical experiments [
        <xref ref-type="bibr" rid="ref11">11</xref>
        ], demonstrating the feasibility of extracting respiration and heartbeat
information even in the presence of strong vehicle vibrations. These findings extend the applicability
of radar sensing from controlled laboratory settings to highly dynamic real-world conditions.
      </p>
      <p>
        Despite these advancements, conventional radar-based heartbeat sensing approaches typically rely on
extracting and unwrapping signal phases to recover heartbeat dynamics [
        <xref ref-type="bibr" rid="ref12 ref13 ref9">9, 12, 13</xref>
        ]. However, such
methods remain highly susceptible to motion artifacts, multipath interference, and low signal-to-noise ratio
(SNR) conditions. Phase wrapping ambiguities and noise sensitivity often lead to significant performance
degradation, particularly in realistic environments where subjects are not perfectly stationary.
      </p>
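      <p>The wrapping behavior described above can be reproduced in a few lines of NumPy (the wavelength and chest-drift values below are invented for illustration, not taken from the paper):</p>
```python
import numpy as np

# A target drifting 4 mm toward the radar: the IF phase 4*pi*d/lambda grows
# by 4*pi rad, so np.angle alone returns values wrapped into [-pi, pi).
lam = 0.004                                 # ~4 mm wavelength (77 GHz band, illustrative)
d = 0.5 + np.linspace(0.0, 0.004, 50)       # made-up chest displacement in metres
true_phase = 4 * np.pi * d / lam
iq = np.exp(1j * true_phase)                # ideal IF samples at one range bin
wrapped = np.angle(iq)                      # ambiguous: jumps of ~2*pi appear
unwrapped = np.unwrap(wrapped)              # conventional phase-recovery step
drift = (unwrapped[-1] - unwrapped[0]) * lam / (4 * np.pi)  # back to metres
```
      <p>In practice, motion artifacts and noise corrupt the wrapped phase before unwrapping, which is exactly the failure mode that motivates the learning-based approach below.</p>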
      <p>
        To overcome these limitations, recent advances in supervised deep learning have demonstrated the
ability to learn complex spatiotemporal representations directly from radar signals [
        <xref ref-type="bibr" rid="ref14 ref15 ref16">14, 15, 16</xref>
        ]. By
bypassing explicit phase unwrapping, these methods achieve greater robustness under noise and motion.
Nevertheless, most existing deep learning frameworks are dominated by convolutional neural networks
(CNNs), which are effective at capturing local temporal features but struggle to model long-range
dependencies that are critical for representing periodic heartbeat dynamics.
      </p>
      <p>In this work, we propose a supervised FMCW radar-based heartbeat measurement framework that
combines CNNs with Transformers. Our main contributions are summarized as follows:
• We design a novel framework for radar heartbeat sensing, which integrates 1D CNN layers for local
feature extraction with Transformer encoders for modeling long-range temporal dependencies,
addressing the limitations of CNN-only baselines.
• We collected a new radar dataset (RadHR) containing recordings from 50 individuals as a heartbeat
sensing benchmark; the dataset will be made public upon request.
• The proposed model demonstrates superior resilience against motion artifacts, multipath
interference, and low-SNR conditions, which commonly degrade the performance of traditional
signal-processing and CNN-based methods.
• We conduct extensive experiments on FMCW radar data, showing that our approach consistently
outperforms state-of-the-art CNN-based baselines in heartbeat sensing accuracy and robustness.</p>
    </sec>
    <sec id="sec-3">
      <title>2. Methodology</title>
      <sec id="sec-3-1">
        <title>2.1. Preliminaries</title>
        <p>A range matrix is obtained from FMCW radar raw data to facilitate subsequent analysis and processing.
Specifically, the procedure of constructing a range matrix is as follows. In each chirp loop, the FMCW
radar transmits a chirp signal s_T(t) and simultaneously receives the corresponding reflected chirp signal
s_R(t). Both s_T(t) and s_R(t) are linear frequency modulation signals, commonly referred to as chirp signals.
In particular, the received signal s_R(t) is mixed with the in-phase and quadrature (IQ) components of the
transmitted signal, denoted as s_I(t) and s_Q(t), to produce the complex intermediate frequency (IF) signal
m(t) ∈ ℂ^N, which can be expressed as:</p>
        <p>m(t) ∝ LPF[s_I(t) ⋅ s_R(t)] + j LPF[s_Q(t) ⋅ s_R(t)] ∝ exp(j(2π f t + φ)), f = 2Sd/c, φ = 4πd/λ,</p>
        <p>where LPF denotes the low-pass filter, S is the frequency slope of the FMCW signal, d is the distance,
c is the speed of light, and λ is the wavelength associated with the FMCW starting frequency. The
frequency f of the IF signal m(t) corresponds to the frequency difference between the transmitted signal
s_T(t) and the received signal s_R(t). Consequently, f is directly proportional to the signal round-trip time
and the distance d between the radar and the target. Likewise, the phase φ is also proportional to the
distance, but it is inherently wrapped within the interval [−π, π].</p>
        <p>To capture heartbeat dynamics continuously, the radar sequentially transmits M chirps
[s_T1(t), s_T2(t), …, s_TM(t)] and receives the corresponding reflected signals [s_R1(t), s_R2(t), …, s_RM(t)].
Accordingly, M IF signals [m_1(t), m_2(t), …, m_M(t)] are obtained, where each m_i(t) ∈ ℂ^N. Since the frequency of
each IF signal m_i(t) is a function of the target distance, a fast Fourier transform (FFT) is applied to each
m_i(t) to generate the corresponding range profile R_i[k]. Finally, by concatenating all M range profiles,
the range matrix is constructed as:</p>
        <p>R = [R_1[k], R_2[k], …, R_M[k]] ∈ ℝ^{M×N},</p>
        <p>where M represents the number of chirps and N denotes the number of range bins. This range matrix
serves as the foundation for subsequent feature extraction and heartbeat estimation in our framework.</p>
        <p>(Figure: overall framework. The range matrix (Distance × Time) produced by the range FFT is processed by a CNN encoder and a Transformer encoder/decoder with multi-head attention (Q, K, V), Add &amp; Norm, and Feed Forward blocks, and decoded into the predicted PPG signal.)</p>
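        <p>The construction above can be sketched directly in NumPy (the chirp count, samples per chirp, and the simulated single-target signal are made-up values, not the paper's radar configuration):</p>
```python
import numpy as np

def range_matrix(iq_chirps):
    """Build a range matrix from complex IF samples.

    iq_chirps: (M, N) array with M chirps and N IQ samples per chirp.
    Returns the (M, N) matrix of range profiles, one FFT per chirp.
    """
    # Range FFT along the fast-time (per-chirp sample) axis.
    return np.fft.fft(iq_chirps, axis=1)

# Illustrative use: 8 chirps of a static target whose beat frequency
# falls exactly in range bin 5.
M, N, k = 8, 64, 5
n = np.arange(N)
chirps = np.tile(np.exp(2j * np.pi * k * n / N), (M, 1))
R = range_matrix(chirps)
peak_bin = int(np.abs(R[0]).argmax())   # strongest range bin, here 5
```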
      </sec>
      <sec id="sec-3-2">
        <title>2.2. CNN-Transformer Module</title>
        <p>
          As described in Section 2.1, the raw FMCW radar signals are converted into a range matrix R ∈ ℝ^{M×N},
where M is the number of chirps and N is the number of range bins. This range matrix captures both
the distance-related amplitude and phase variations, serving as the input for subsequent heartbeat
estimation. Before feeding it into the neural network, we take a window of the range matrix around
the central range bin b_c to obtain the windowed heartbeat matrix X_w = R(⋅, b_c ± Δ) ∈ ℝ^{M×(2Δ+1)} as the
input, following the previous work [
          <xref ref-type="bibr" rid="ref14">14</xref>
          ].
        </p>
        <p>The first stage of our model is a 1D CNN-based feature extractor, which operates along the temporal
dimension of the range matrix. Specifically, for each range bin b ∈ [b_c − Δ, b_c + Δ], the corresponding
temporal sequence X_w[:, b] is processed by stacked convolutional layers with small kernel sizes. These
layers aim to capture local temporal patterns corresponding to heartbeat-induced chest movements.
Formally, the CNN feature extraction can be expressed as:</p>
        <p>F_CNN = CNN(X_w), F_CNN ∈ ℝ^{T×C},</p>
        <p>where C is the number of feature channels output by the CNN. We adopt ReLU activations, batch
normalization, and dropout to improve training stability and prevent overfitting.</p>
        <p>While CNNs effectively capture local patterns, heartbeat signals exhibit long-range temporal
dependencies that CNNs alone may fail to model. To address this, we incorporate a Transformer encoder after
the CNN stage. The Transformer employs self-attention mechanisms to model interactions between
distant time steps, allowing the network to capture the periodicity and subtle dynamics of cardiac
motion. Given the CNN features F_CNN ∈ ℝ^{T×C}, the Transformer computes:</p>
        <p>F_Trans = Transformer(F_CNN), F_Trans ∈ ℝ^{T×C},</p>
        <p>where F_Trans encodes both local and global temporal information. We adopt multi-head attention to
allow the model to focus on multiple temporal patterns simultaneously, followed by a feed-forward
network with residual connections and layer normalization.</p>
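        <p>At the level of tensor shapes, the two stages can be sketched as follows (a minimal NumPy illustration with random weights and invented sizes; the actual model uses trained kernels, multi-head attention, residual connections, and layer normalization rather than this single random-weight head):</p>
```python
import numpy as np

rng = np.random.default_rng(0)

def cnn_features(x, kernels):
    """Local feature extractor: x is a (T,) slow-time sequence for one range
    bin, kernels is (C, k). Returns (T - k + 1, C) features via valid 1D
    correlation followed by ReLU."""
    feats = np.stack([np.correlate(x, w, mode="valid") for w in kernels], axis=1)
    return np.maximum(feats, 0.0)   # ReLU

def self_attention(F, Wq, Wk, Wv):
    """Single-head scaled dot-product attention over a (T, C) sequence,
    mixing information across distant time steps."""
    Q, K, V = F @ Wq, F @ Wk, F @ Wv
    scores = Q @ K.T / np.sqrt(Q.shape[1])
    A = np.exp(scores - scores.max(axis=1, keepdims=True))
    A = A / A.sum(axis=1, keepdims=True)   # softmax over time
    return A @ V

# Invented sizes: 200 slow-time samples, 4 channels, kernel length 9.
x = rng.standard_normal(200)
F = cnn_features(x, rng.standard_normal((4, 9)))                 # (192, 4)
Z = self_attention(F, *(rng.standard_normal((4, 4)) for _ in range(3)))
```
      <p>Each row of the attention map mixes every time step with every other, which is how the periodic structure of cardiac motion is captured beyond the CNN's receptive field.</p>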
      </sec>
    </sec>
    <sec id="sec-4">
      <title>3. Losses</title>
      <p>
        In this work, we design a composite loss function that combines Negative Pearson Loss, Signal-to-Noise
Ratio (SNR) Loss [
        <xref ref-type="bibr" rid="ref14">14</xref>
        ], and a Sparsity Loss [17] to optimize the supervised heartbeat measurement task
using FMCW radar. This hybrid design allows the model to achieve accurate estimation, suppress noise,
and encourage physiologically meaningful spectral representations.
      </p>
      <sec id="sec-4-1">
        <title>3.1. Negative Pearson Loss</title>
        <p>The Negative Pearson Loss evaluates the linear correlation between the predicted signal and the ground
truth. The Pearson correlation coefficient ranges from −1 to 1, with higher values indicating stronger
correlations. To maximize similarity, we minimize the negative Pearson coefficient:</p>
        <p>L_Pearson = −ρ(y, ŷ),</p>
        <p>where ŷ and y denote the predicted and reference heartbeat signals, respectively. This loss function
encourages the model to preserve temporal waveform consistency with the ground truth.</p>
      </sec>
      <sec id="sec-4-2">
        <title>3.2. Signal-to-Noise Ratio (SNR) Loss</title>
        <p>To enhance robustness against noise and motion artifacts, we employ an SNR-based loss that emphasizes
spectral energy concentration around the true heartbeat frequency. Specifically, the loss is defined as:</p>
        <p>L_SNR(y, ŷ) = − ∫_{f₀−w}^{f₀+w} |Ŷ(f)|² df / ( ∫_{−∞}^{f₀−w} |Ŷ(f)|² df + ∫_{f₀+w}^{+∞} |Ŷ(f)|² df ), f₀ = argmax_f Y(f),</p>
        <p>where Y(f) and Ŷ(f) are the respective Fourier transforms of y and ŷ and w is the chosen window size.</p>
      </sec>
      <sec id="sec-4-3">
        <title>3.3. Sparsity Loss</title>
        <p>We integrate the Sparsity Loss with the Negative Pearson Loss and the SNR Loss, motivated by the fact that in
FMCW radar-based heartbeat sensing the heartbeat frequency typically manifests as the dominant
spectral peak within a physiologically plausible range (e.g., 45–250 bpm). The Sparsity Loss penalizes
predictions that fail to concentrate energy near the main peak within this frequency band:</p>
        <p>L_Sparsity = ( Σ_{f=f_min}^{f*−Δ} |Ŷ(f)|² + Σ_{f=f*+Δ}^{f_max} |Ŷ(f)|² ) / Σ_{f=f_min}^{f_max} |Ŷ(f)|²,</p>
        <p>where f* = argmax(Ŷ) and Δ are the frequencies of the spectral peak and the padding around the peak,
respectively, and f_min and f_max bound the plausible heart-rate band. For all experiments, Δ = 6 beats per minute [18].</p>
      </sec>
      <sec id="sec-4-4">
        <title>3.4. Overall Loss</title>
        <p>The final loss function is a weighted combination of the three terms:</p>
        <p>L = λ₁ L_Pearson + L_SNR + λ₂ L_Sparsity,</p>
        <p>where λ₁ and λ₂ are hyperparameters balancing the contribution of each term.</p>
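        <p>A discrete-frequency sketch of the three terms and their combination follows (NumPy; the window size, band limits, weights, and test signals are illustrative placeholders, and sums over FFT bins stand in for the integrals above):</p>
```python
import numpy as np

def neg_pearson(y, y_hat):
    """Negative Pearson correlation between prediction and reference."""
    yc, pc = y - y.mean(), y_hat - y_hat.mean()
    return -np.dot(yc, pc) / (np.linalg.norm(yc) * np.linalg.norm(pc))

def snr_loss(y, y_hat, w=3):
    """Negative ratio of predicted power near the ground-truth spectral
    peak to the predicted power everywhere else (w FFT bins of padding)."""
    Y = np.abs(np.fft.rfft(y)) ** 2
    Yh = np.abs(np.fft.rfft(y_hat)) ** 2
    f0 = int(Y.argmax())
    in_band = Yh[max(f0 - w, 0):f0 + w + 1].sum()
    return -in_band / (Yh.sum() - in_band + 1e-8)

def sparsity_loss(y_hat, lo=2, hi=20, delta=1):
    """Predicted power outside a +/-delta window around the predicted peak,
    normalized by total power inside the plausible heart-rate band."""
    band = np.abs(np.fft.rfft(y_hat))[lo:hi + 1] ** 2
    p = int(band.argmax())
    keep = band[max(p - delta, 0):p + delta + 1].sum()
    return (band.sum() - keep) / (band.sum() + 1e-8)

def hybrid_loss(y, y_hat, lam1=1.0, lam2=1.0):
    return lam1 * neg_pearson(y, y_hat) + snr_loss(y, y_hat) + lam2 * sparsity_loss(y_hat)

# A clean prediction scores far lower than random noise (made-up signals).
t = np.linspace(0.0, 10.0, 300, endpoint=False)
clean = np.sin(2 * np.pi * 1.2 * t)   # 1.2 Hz, i.e. 72 bpm
noisy = np.random.default_rng(0).standard_normal(300)
```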
      </sec>
    </sec>
    <sec id="sec-5">
      <title>4. Experiments</title>
      <sec id="sec-5-1">
        <title>4.1. Datasets and Experimental Setup</title>
        <p>
          4.1.1. EquiPleth Dataset. The EquiPleth radar dataset [
          <xref ref-type="bibr" rid="ref14">14</xref>
          ] comprises 550 paired facial video and FMCW radar recordings collected
from 91 participants. Skin tones are classified using the Fitzpatrick scale [19], with 28, 49, and 14 subjects
representing light, medium, and dark skin tones, respectively, for fairness evaluation. Each participant
contributed six 30-second recordings. Additional details are provided in the supplementary materials.
        </p>
        <p>
          4.1.2. RadHR. Our self-collected radar heartbeat dataset (RadHR) consists of recordings from 50 participants, each
measured in a stationary condition to minimize motion artifacts. For every subject, FMCW radar signals
were continuously collected for 30 seconds at a sampling rate of 120 frames per second (fps) and
subsequently converted into range matrices following the standard FMCW signal processing pipeline.
This dataset provides high-temporal-resolution radar measurements of subtle chest wall movements,
serving as a reliable benchmark for supervised heartbeat estimation research.
        </p>
        <p>
          4.1.3. Experimental Setup. Following prior work [
          <xref ref-type="bibr" rid="ref14">14</xref>
          ], we use 10-second windows for training and heart rate evaluation. For
the EquiPleth and RadHR datasets, we use the same training protocol as [
          <xref ref-type="bibr" rid="ref14">14</xref>
          ]. The model is optimized
using the AdamW algorithm with a learning rate of 1 × 10⁻⁴ for 200 epochs, and the best-performing
checkpoint is selected based on validation set performance. For evaluation, we follow prior work
and report mean absolute error (MAE), root mean squared error (RMSE), and the Pearson correlation
coefficient (r) as the primary metrics for heart rate estimation.
        </p>
        <p>4.2. Comparison with State-of-the-Art Methods</p>
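        <p>The three reported metrics can be computed per evaluation window as follows (a NumPy sketch; the heart-rate values are invented purely to show the call):</p>
```python
import numpy as np

def hr_metrics(hr_true, hr_pred):
    """MAE and RMSE in bpm, plus the Pearson correlation coefficient r,
    computed over per-window heart-rate estimates."""
    hr_true = np.asarray(hr_true, dtype=float)
    hr_pred = np.asarray(hr_pred, dtype=float)
    err = hr_pred - hr_true
    mae = np.abs(err).mean()
    rmse = np.sqrt((err ** 2).mean())
    r = np.corrcoef(hr_true, hr_pred)[0, 1]
    return mae, rmse, r

# Invented reference/estimated HR values for four 10-second windows.
mae, rmse, r = hr_metrics([72, 75, 80, 78], [73, 74, 82, 77])
```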
        <p>On the EquiPleth dataset, our approach achieves an MAE of 1.82, RMSE of 5.39, and correlation
r = 0.89, surpassing previous methods and demonstrating robust performance. Similarly, on the RadHR
dataset, our method achieves an MAE of 2.11, RMSE of 2.73, and correlation r = 0.92, outperforming
all baselines by a clear margin. Notably, compared with the FFT-based Radar method, our approach
reduces the RMSE by more than 85% on RadHR, highlighting the effectiveness of combining CNN and
Transformer architectures with our tailored loss design.</p>
      </sec>
      <sec id="sec-5-2">
        <title>4.3. Ablation Study</title>
        <p>To investigate the contribution of each component in the overall loss function, we conduct an ablation
study on the EquiPleth dataset. The results are summarized in Table 2.</p>
        <p>When only the Pearson Loss is used, the model achieves a relatively high MAE (8.52) and RMSE
(14.12), with a poor correlation coefficient (r = 0.33). This indicates that although Pearson Loss enforces
correlation, it alone is insufficient for stable reconstruction.</p>
        <p>Introducing the SNR Loss significantly improves performance, reducing the MAE to 1.92 and RMSE to
5.33, while the correlation r increases to 0.89. This suggests that the SNR constraint effectively enhances
the signal fidelity by improving the signal-to-noise ratio.</p>
        <p>Finally, when all three losses (Pearson Loss, SNR Loss, and Sparsity Loss) are combined, the model
achieves the best performance, with the lowest MAE (1.82), competitive RMSE (5.39), and the highest
correlation (r = 0.89). The improvement demonstrates that the Sparsity Loss further regularizes the
prediction, helping the model suppress redundant information and capture more discriminative features.</p>
      </sec>
    </sec>
    <sec id="sec-6">
      <title>5. Conclusion</title>
      <p>In this paper, we presented a CNN-Transformer framework for FMCW radar-based heartbeat
estimation, coupled with a novel hybrid loss design. By leveraging CNNs for local feature extraction and
Transformers for long-range dependency modeling, our approach effectively captures both fine-grained
and global temporal patterns of cardiac dynamics. The integration of Pearson Loss, SNR Loss, and
Sparsity Loss further enhances robustness by encouraging waveform fidelity, noise suppression, and
physiologically consistent spectral concentration. To support the community, we introduced RadHR, a
new radar heartbeat dataset comprising recordings from 50 subjects under stationary conditions.
Experimental results on both RadHR and the EquiPleth dataset demonstrated that our method outperforms
conventional signal-processing and deep learning baselines in terms of MAE, RMSE, and correlation
coefficient.</p>
    </sec>
    <sec id="sec-7">
      <title>Acknowledgments</title>
      <p>This work was supported by the National Natural Science Foundation of China (Grant No. 62572249),
the Natural Science Foundation of Jiangsu Province (Grant No. BK20250742), the Startup Foundation
for Introducing Talent of NUIST (Grant No. 1083142501006), and the Postgraduate Research &amp; Practice
Innovation Program of Jiangsu Province (Grant No. SJCX25_0518).</p>
    </sec>
    <sec id="sec-8">
      <title>Declaration on Generative AI</title>
      <p>During the preparation of this work, the authors used ChatGPT in order to: grammar and spelling
check, paraphrase, and reword. After using this tool/service, the authors reviewed and edited the
content as needed and take full responsibility for the publication’s content.</p>
      <p>[14] Blending camera and 77 GHz radar sensing for equitable, robust plethysmography, ACM Trans.
Graph. 41 (2022) 36–1.
[15] Q. Hu, Q. Zhang, H. Lu, S. Wu, Y. Zhou, Q. Huang, H. Chen, Y.-C. Chen, N. Zhao, Contactless
arterial blood pressure waveform monitoring with mmWave radar, Proceedings of the ACM on
Interactive, Mobile, Wearable and Ubiquitous Technologies 8 (2024) 1–29.
[16] Z. Wu, Y. Xie, B. Zhao, J. He, F. Luo, N. Deng, Z. Yu, CardiacMamba: A multimodal RGB-RF fusion
framework with state space models for remote physiological measurement, IEEE Transactions on
Instrumentation and Measurement (2025).
[17] J. Speth, N. Vance, P. Flynn, A. Czajka, Non-contrastive unsupervised learning of physiological
signals from video, in: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern
Recognition, 2023, pp. 14464–14474.
[18] E. M. Nowara, D. McDuff, A. Veeraraghavan, Systematic analysis of video-based pulse measurement
from compressed videos, Biomedical Optics Express 12 (2020) 494–508.
[19] S. Sachdeva, Fitzpatrick skin typing: Applications in dermatology, Indian Journal of Dermatology,
Venereology and Leprology 75 (2009) 93.</p>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>
          [1]
          <string-name>
            <given-names>A. D.</given-names>
            <surname>Droitcour</surname>
          </string-name>
          ,
          <article-title>Non-contact measurement of heart and respiration rates with a single-chip microwave doppler radar</article-title>
          , Stanford University,
          <year>2006</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>
          [2]
          <string-name>
            <given-names>C.</given-names>
            <surname>Li</surname>
          </string-name>
          ,
          <string-name>
            <given-names>V. M.</given-names>
            <surname>Lubecke</surname>
          </string-name>
          ,
          <string-name>
            <given-names>O.</given-names>
            <surname>Boric-Lubecke</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.</given-names>
            <surname>Lin</surname>
          </string-name>
          ,
          <article-title>A review on recent advances in doppler radar sensors for noncontact healthcare monitoring</article-title>
          ,
          <source>IEEE Transactions on microwave theory and techniques 61</source>
          (
          <year>2013</year>
          )
          <fpage>2046</fpage>
          -
          <lpage>2060</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>
          [3]
          <string-name>
            <given-names>N. Van Thi</given-names>
            <surname>Phuoc</surname>
          </string-name>
          ,
          <string-name>
            <given-names>L.</given-names>
            <surname>Tang</surname>
          </string-name>
          ,
          <string-name>
            <given-names>V.</given-names>
            <surname>Demir</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>Hasan</surname>
          </string-name>
          ,
          <string-name>
            <given-names>N. Duc</given-names>
            <surname>Minh</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>Mukhopadhyay</surname>
          </string-name>
          ,
          <article-title>Review-microwave radar sensing systems for search and rescue purposes</article-title>
          ,
          <source>Sensors</source>
          <volume>19</volume>
          (
          <year>2019</year>
          )
          <fpage>2879</fpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>
          [4]
          <string-name>
            <given-names>E.</given-names>
            <surname>Cardillo</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Caddemi</surname>
          </string-name>
          ,
          <article-title>A review on biomedical mimo radars for vital sign detection and human localization</article-title>
          ,
          <source>Electronics</source>
          <volume>9</volume>
          (
          <year>2020</year>
          )
          <fpage>1497</fpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>
          [5]
          <string-name>
            <given-names>Y.</given-names>
            <surname>Lee</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.-Y.</given-names>
            <surname>Park</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Y.-W.</given-names>
            <surname>Choi</surname>
          </string-name>
          ,
          <string-name>
            <given-names>H.-K.</given-names>
            <surname>Park</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.-H.</given-names>
            <surname>Cho</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S. H.</given-names>
            <surname>Cho</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Y.-H.</given-names>
            <surname>Lim</surname>
          </string-name>
          ,
          <article-title>A novel non-contact heart rate monitor using impulse-radio ultra-wideband (ir-uwb) radar technology</article-title>
          ,
          <source>Scientific reports 8</source>
          (
          <year>2018</year>
          )
          <fpage>13053</fpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>
          [6]
          <string-name>
            <given-names>J.-Y.</given-names>
            <surname>Park</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Y.</given-names>
            <surname>Lee</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Y.-W.</given-names>
            <surname>Choi</surname>
          </string-name>
          ,
          <string-name>
            <given-names>R.</given-names>
            <surname>Heo</surname>
          </string-name>
          ,
          <string-name>
            <given-names>H.-K.</given-names>
            <surname>Park</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.-H.</given-names>
            <surname>Cho</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S. H.</given-names>
            <surname>Cho</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Y.-H.</given-names>
            <surname>Lim</surname>
          </string-name>
          ,
          <article-title>Preclinical evaluation of a noncontact simultaneous monitoring method for respiration and carotid pulsation using impulse-radio ultra-wideband radar</article-title>
          ,
          <source>Scientific reports 9</source>
          (
          <year>2019</year>
          )
          <fpage>11892</fpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>
          [7]
          <string-name>
            <given-names>S.</given-names>
            <surname>Lim</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>Lee</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.</given-names>
            <surname>Jung</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.-C.</given-names>
            <surname>Kim</surname>
          </string-name>
          ,
          <article-title>Detection and localization of people inside vehicle using impulse radio ultra-wideband radar sensor</article-title>
          ,
          <source>IEEE Sensors Journal</source>
          <volume>20</volume>
          (
          <year>2019</year>
          )
          <fpage>3892</fpage>
          -
          <lpage>3901</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>
          [8]
          <string-name>
            <given-names>M.-C.</given-names>
            <surname>Huang</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J. J.</given-names>
            <surname>Liu</surname>
          </string-name>
          ,
          <string-name>
            <given-names>W.</given-names>
            <surname>Xu</surname>
          </string-name>
          ,
          <string-name>
            <given-names>C.</given-names>
            <surname>Gu</surname>
          </string-name>
          ,
          <string-name>
            <given-names>C.</given-names>
            <surname>Li</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Sarrafzadeh</surname>
          </string-name>
          ,
          <article-title>A self-calibrating radar sensor system for measuring vital signs</article-title>
          ,
          <source>IEEE transactions on biomedical circuits and systems 10</source>
          (
          <year>2015</year>
          )
          <fpage>352</fpage>
          -
          <lpage>363</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref9">
        <mixed-citation>
          [9]
          <string-name>
            <given-names>M.</given-names>
            <surname>Alizadeh</surname>
          </string-name>
          ,
          <string-name>
            <given-names>G.</given-names>
            <surname>Shaker</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J. C. M. D.</given-names>
            <surname>Almeida</surname>
          </string-name>
          ,
          <string-name>
            <given-names>P. P.</given-names>
            <surname>Morita</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>Safavi-Naeini</surname>
          </string-name>
          ,
          <article-title>Remote monitoring of human vital signs using mm-wave FMCW radar</article-title>
          ,
          <source>IEEE Access</source>
          <volume>7</volume>
          (
          <year>2019</year>
          )
          <fpage>54958</fpage>
          -
          <lpage>54968</lpage>
          . doi:10.1109/ACCESS.2019.2912956.
        </mixed-citation>
      </ref>
      <ref id="ref10">
        <mixed-citation>
          [10]
          <string-name>
            <given-names>S. D.</given-names>
            <surname>Da Cruz</surname>
          </string-name>
          ,
          <string-name>
            <given-names>H.-P.</given-names>
            <surname>Beise</surname>
          </string-name>
          ,
          <string-name>
            <given-names>U.</given-names>
            <surname>Schröder</surname>
          </string-name>
          ,
          <string-name>
            <given-names>U.</given-names>
            <surname>Karahasanovic</surname>
          </string-name>
          ,
          <article-title>A theoretical investigation of the detection of vital signs in presence of car vibrations and radar-based passenger classification</article-title>
          ,
          <source>IEEE Transactions on Vehicular Technology</source>
          <volume>68</volume>
          (
          <year>2019</year>
          )
          <fpage>3374</fpage>
          -
          <lpage>3385</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref11">
        <mixed-citation>
          [11]
          <string-name>
            <given-names>S. D.</given-names>
            <surname>Da Cruz</surname>
          </string-name>
          ,
          <string-name>
            <given-names>H.-P.</given-names>
            <surname>Beise</surname>
          </string-name>
          ,
          <string-name>
            <given-names>U.</given-names>
            <surname>Schröder</surname>
          </string-name>
          ,
          <string-name>
            <given-names>U.</given-names>
            <surname>Karahasanovic</surname>
          </string-name>
          ,
          <article-title>Detection of vital signs in presence of car vibrations and radar-based passenger classification</article-title>
          ,
          in:
          <source>2018 19th International Radar Symposium (IRS)</source>
          , IEEE,
          <year>2018</year>
          , pp.
          <fpage>1</fpage>
          -
          <lpage>10</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref12">
        <mixed-citation>
          [12]
          <string-name>
            <given-names>J.</given-names>
            <surname>Tu</surname>
          </string-name>
          ,
          <string-name>
            <given-names>T.</given-names>
            <surname>Hwang</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.</given-names>
            <surname>Lin</surname>
          </string-name>
          ,
          <article-title>Respiration rate measurement under 1-d body motion using single continuous-wave doppler radar vital sign detection system</article-title>
          ,
          <source>IEEE Transactions on Microwave Theory and Techniques</source>
          <volume>64</volume>
          (
          <year>2016</year>
          )
          <fpage>1937</fpage>
          -
          <lpage>1946</lpage>
          . doi:10.1109/TMTT.2016.2560159.
        </mixed-citation>
      </ref>
      <ref id="ref13">
        <mixed-citation>
          [13]
          <string-name>
            <given-names>M.</given-names>
            <surname>Mercuri</surname>
          </string-name>
          ,
          <string-name>
            <given-names>I. R.</given-names>
            <surname>Lorato</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Y.-H.</given-names>
            <surname>Liu</surname>
          </string-name>
          ,
          <string-name>
            <given-names>F.</given-names>
            <surname>Wieringa</surname>
          </string-name>
          ,
          <string-name>
            <given-names>C. V.</given-names>
            <surname>Hoof</surname>
          </string-name>
          ,
          <string-name>
            <given-names>T.</given-names>
            <surname>Torfs</surname>
          </string-name>
          ,
          <article-title>Vital-sign monitoring and spatial tracking of multiple people using a contactless radar-based sensor</article-title>
          ,
          <source>Nature Electronics</source>
          <volume>2</volume>
          (
          <year>2019</year>
          )
          <fpage>252</fpage>
          -
          <lpage>262</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref14">
        <mixed-citation>
          [14]
          <string-name>
            <given-names>A.</given-names>
            <surname>Vilesov</surname>
          </string-name>
          ,
          <string-name>
            <given-names>P.</given-names>
            <surname>Chari</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Armouti</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A. B.</given-names>
            <surname>Harish</surname>
          </string-name>
          ,
          <string-name>
            <given-names>K.</given-names>
            <surname>Kulkarni</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Deoghare</surname>
          </string-name>
          ,
          <string-name>
            <given-names>L.</given-names>
            <surname>Jalilian</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Kadambi</surname>
          </string-name>
          ,
        </mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>