<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta />
    <article-meta>
      <title-group>
        <article-title>Visual and Radar Sensor Fusion for Perimeter Protection and Homeland Security on Edge</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Danny Buchman</string-name>
          <email>danny.buchman@seraphimas-hls.com</email>
          <xref ref-type="aff" rid="aff0">0</xref>
          <xref ref-type="aff" rid="aff4">4</xref>
          <xref ref-type="aff" rid="aff5">5</xref>
          <xref ref-type="aff" rid="aff6">6</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Michail Drozdov</string-name>
          <email>michail.drozdov@seraphimas-hls.com</email>
          <xref ref-type="aff" rid="aff0">0</xref>
          <xref ref-type="aff" rid="aff3">3</xref>
          <xref ref-type="aff" rid="aff4">4</xref>
          <xref ref-type="aff" rid="aff5">5</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Aušra Mackutė-Varoneckienė</string-name>
          <email>ausra.mackute-varoneckiene@bpti.lt</email>
          <xref ref-type="aff" rid="aff0">0</xref>
          <xref ref-type="aff" rid="aff1">1</xref>
          <xref ref-type="aff" rid="aff2">2</xref>
          <xref ref-type="aff" rid="aff4">4</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Tomas Krilavičius</string-name>
          <email>tomas.krilavicius@bpti.lt</email>
          <xref ref-type="aff" rid="aff0">0</xref>
          <xref ref-type="aff" rid="aff1">1</xref>
          <xref ref-type="aff" rid="aff2">2</xref>
          <xref ref-type="aff" rid="aff4">4</xref>
        </contrib>
        <aff id="aff0">
          <label>0</label>
          <institution>1. Both radar and camera are acquiring data in Polar coordinates. 2. While full 3D Cartesian representation can be reconstructed from the radar data, it is not true for camera without several assumptions on the geometry of setup</institution>
        </aff>
        <aff id="aff1">
          <label>1</label>
          <institution>Baltic Institute of Advanced Technology</institution>
          ,
          <addr-line>Vilnius</addr-line>
          ,
          <country country="LT">Lithuania</country>
        </aff>
        <aff id="aff2">
          <label>2</label>
          <institution>Department of Applied Informatics, Vytautas Magnus University</institution>
          ,
          <addr-line>Kaunas</addr-line>
          ,
          <country country="LT">Lithuania</country>
        </aff>
        <aff id="aff3">
          <label>3</label>
          <institution>Geozondas Ltd.</institution>
          ,
          <addr-line>Vilnius</addr-line>
          ,
          <country country="LT">Lithuania</country>
        </aff>
        <aff id="aff4">
          <label>4</label>
          <institution>IVUS 2020: Information Society and University Studies</institution>
          ,
          <addr-line>23</addr-line>
        </aff>
        <aff id="aff5">
          <label>4</label>
          <institution>JVC Sonderus</institution>
          ,
          <addr-line>Vilnius</addr-line>
          ,
          <country country="LT">Lithuania</country>
        </aff>
        <aff id="aff6">
          <label>5</label>
          <institution>Seraphim Optronics Ltd.</institution>
          ,
          <addr-line>Yokne'am Illit</addr-line>
          ,
          <country country="IL">Israel</country>
        </aff>
      </contrib-group>
      <fpage>92</fpage>
      <lpage>105</lpage>
      <abstract>
        <p>Today, in border and perimeter protection, it is very common to use radar technology and pan-tilt (PT) cameras to achieve terrain dominance. The common solution uses both sources: while most of the threats are detected by radar, the camera is used for inspection of motion detected by the radar. This solution is very dependent on radar performance and is not effective in scenarios where the radar is not capable of monitoring the movement of all targets. Inputs from camera and radar are used in close integration to increase detection probability and reduce false alarms. In this work two alternative methods of radar and visual data fusion are proposed, data structures and processing algorithms are defined, and results of experimental validation for both proposed methods are shown.</p>
      </abstract>
      <kwd-group>
        <kwd>Sensor fusion</kwd>
        <kwd>Radar</kwd>
        <kwd>Video motion detection</kwd>
        <kwd>Perimeter protection</kwd>
        <kwd>Kalman filter</kwd>
      </kwd-group>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>1. Introduction</title>
      <p>Sensor fusion is a large research topic. Its goal is to combine multiple data sources to receive joined data, which allows improving processes or calculations compared to single-source data usage.</p>
      <p>
        A tracking solution using radar as the only source of data suffers from unreliable detection or even absence of detection when dealing with mostly tangential trajectories of observed objects. An attempt is made to lessen this problem by adding a camera as a second source of data and combining radar tracking with video motion detection (VMD) while keeping a common target state for detections from both sources. It is also expected that fusion can add the benefit of a reduced false detection rate, since validation of tracks can be more reliable using two sources of information (a redundant fusion scheme [
        <xref ref-type="bibr" rid="ref1">1</xref>
        ]).
      </p>
      <p>This work focuses on research related to the practical application of fusion between radar and video. Two main methods of fusion, namely data fusion and tracks fusion, are defined in this context, with the first covering such important parts of the system as data association and state updates, and the second being a more modular and distributed alternative.</p>
      <p>
        Methods based on Kalman family filters [
        <xref ref-type="bibr" rid="ref2 ref3 ref4">2, 3, 4</xref>
        ] are common when dealing with data-level fusion, because they enable a process model that is independent from the observation structure [
        <xref ref-type="bibr" rid="ref5">5</xref>
        ] while working with uncertain data. In the case of several sensors, such filters allow incorporating new data into the model as the data becomes available [
        <xref ref-type="bibr" rid="ref3">3</xref>
        ], [
        <xref ref-type="bibr" rid="ref6">6</xref>
        ].
      </p>
      <p>Due to the properties of the Kalman filter (KF), it is required that the state update of the described dynamic process be linear. It is common practice to use Cartesian coordinates to describe the object state when dealing with mostly linear movement. When trying to fuse camera and radar data, two issues are quite apparent:
1. Both radar and camera are acquiring data in Polar coordinates.
2. While a full 3D Cartesian representation can be reconstructed from the radar data, it is not true for the camera without several assumptions on the geometry of the setup.</p>
      <p>
        The first problem can be solved in several ways. The usual practice is to keep the target state in Cartesian coordinates [
        <xref ref-type="bibr" rid="ref2">2</xref>
        ], [
        <xref ref-type="bibr" rid="ref7">7</xref>
        ] while measuring in Polar and transforming the data before the update (converted measurements [
        <xref ref-type="bibr" rid="ref8">8</xref>
        ], [
        <xref ref-type="bibr" rid="ref9">9</xref>
        ], [
        <xref ref-type="bibr" rid="ref10">10</xref>
        ]). The covariance matrix, which is used in update and estimation, gets biased if transformed directly. There exist many solutions for the linearization of the space near the estimated point to get proper values for the covariance matrix [
        <xref ref-type="bibr" rid="ref3">3</xref>
        ], [
        <xref ref-type="bibr" rid="ref11">11</xref>
        ] (Extended Kalman Filter and Unscented Kalman Filter, to name a few). After the transformation the state can be updated by the normal Kalman filter formulas.
      </p>
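      <p>As an illustration of the converted-measurements idea, the following minimal Python sketch (ours, not from the original implementation; all names are illustrative) converts a polar radar measurement to Cartesian coordinates and propagates the measurement covariance through the Jacobian of the transform, the first-order linearization underlying the EKF-style solutions cited above.</p>
      <preformat>
# Sketch: polar-to-Cartesian converted measurement with linearized covariance.
# Illustrative only; variable names are ours, not from the paper.
import numpy as np

def convert_polar_measurement(r, theta, sigma_r, sigma_theta):
    """Convert (range, azimuth) to (x, z) and linearize the covariance.

    The polar measurement covariance is mapped through the Jacobian J of
    the transform, R_cart = J @ R_polar @ J.T, the first-order (EKF-like)
    approximation discussed in the text; it gets biased for large
    angular noise, which motivates the unscented alternatives.
    """
    x = r * np.cos(theta)
    z = r * np.sin(theta)
    # Jacobian of [x, z] with respect to [r, theta]
    J = np.array([[np.cos(theta), -r * np.sin(theta)],
                  [np.sin(theta),  r * np.cos(theta)]])
    R_polar = np.diag([sigma_r**2, sigma_theta**2])
    R_cart = J @ R_polar @ J.T
    return np.array([x, z]), R_cart
      </preformat>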
      <p>
        The second issue is not addressed by these solutions: if the Kalman filter were used while keeping the target state in Cartesian coordinates, the camera would change from a very precise sensor (az and el angles) to a very imprecise one (x, y, z), as the distance to the object is used in the Polar to Cartesian transform for any direction and is not directly measured by the camera. There are numerous different approaches to get at least some estimation of distance from direct camera measurements:
1. Use radar detections as a base and map camera detections to radar, improving angular resolution [
        <xref ref-type="bibr" rid="ref12">12</xref>
        ], [
        <xref ref-type="bibr" rid="ref13">13</xref>
        ].
2. Use homography (a geometrical relation between two images of the same planar surface, described by the transformation matrix) estimation methods for camera calibration in the lab.
3. Use a corner reflector or another strongly reflective object to precisely map radar and camera detections into 3D [
        <xref ref-type="bibr" rid="ref14">14</xref>
        ], [
        <xref ref-type="bibr" rid="ref15">15</xref>
        ].
4. Use many assumptions on the positioning of detections relative to the optical axis (the ground is a straight plane, camera position and orientation are known, targets are always on the ground, etc.) [16], [17].
5. Use machine learning (ML) techniques, in the cases when targets are specified (e.g. detecting the image size of cars the physical size of which is known) [
        <xref ref-type="bibr" rid="ref13">13, 18</xref>
        ].
6. Use the movement of the camera and additional features (lane lines) to acquire distance [19], [20], [21].
      </p>
      <p>
        For scenarios arising in perimeter protection or homeland security, method 6 is not applicable or too costly in terms of performance. Usually, there are no predefined markers, and the system is stationary (no translational movement). All other approaches can be explored. The ideal solution, however, would be to use the camera only in its strongest domain to augment information received by radar, instead of increasing inaccuracies in one dimension while decreasing them in others. One such potential solution is to keep the state of the Kalman filter in Polar coordinates. This is not unknown in the literature, as bearings-only tracking often uses Modified Polar Coordinates (MPC) [
        <xref ref-type="bibr" rid="ref8">8</xref>
        ], [22] or Modified Spherical Coordinates (MSC) [23].
      </p>
      <p>The following issues are explored before finalizing the data model for fusion:
1. The impact of the origin of the coordinates - whether to use Cartesian or Polar coordinates in the linear Kalman filter, for the different movement patterns.
2. Improvement (if any) of the accuracy of state estimation in Polar coordinates, if the camera is also used in Kalman filter updates.
3. Detection of distance to objects from camera measurements and the effect of the addition of this result to the Kalman filter measurement model.</p>
      <p>As a consequence, some of the following sections
(section II, section V and section VI) are split into two
main parts - preliminary experiments (simulation) and
methods validation. It should be clear, however, that
chronologically all preliminary tests were performed
and results were analyzed before finalizing fusion
models and implementing fusion methods.</p>
    </sec>
    <sec id="sec-data">
      <title>2. Data Description</title>
      <p>Coordinates used in this research are defined as shown in Fig. 1.</p>
      <sec id="sec-data-1">
        <title>2.1. Simulation Data Description</title>
        <p>In this part of the research, the definition of custom trajectories for any number of targets, as well as radar and video motion detection (VMD) noise models, were implemented. The following simulation parameters can be defined for radar:
1. Range noise standard deviation
2. Velocity noise standard deviation
3. Detection angle noise standard deviation
4. False detection rate (per simulation area element)
5. Detection view angle
6. Update rate</p>
        <p>The following simulation parameters can be defined for the camera:
1. VMD detection box noise (in pixels)
2. Camera resolution
3. Angular field of view
4. VMD false positive rate and area
5. Update rate</p>
        <p>The Kalman filter state is defined by $[x\ \dot{x}\ \ddot{x}\ z\ \dot{z}\ \ddot{z}]^{T}$ in case of Cartesian coordinates and by $[r\ \dot{r}\ \theta\ \dot{\theta}\ \ddot{\theta}]^{T}$ in Polar coordinates, where
1. $\dot{x}$ and $\dot{z}$ are velocities in x and z dimensions respectively,
2. $\ddot{x}$ and $\ddot{z}$ are accelerations,
3. $\dot{r}$ - range change rate,
4. $\theta$ - azimuth angle,
5. $\dot{\theta}$ and $\ddot{\theta}$ - the rate of change for azimuth and the rate of change of $\dot{\theta}$ (angle acceleration).</p>
        <p>Measurements of radar and camera are simulated using a Gaussian noise model. For the camera, different accuracies of VMD were tested, ranging from 1 pixel to 5 pixels.</p>
        <p>Measurements are filtered and the mean square error (MSE) is calculated for comparison of estimated positions to actual trajectories.</p>
        <p>In Fig. 2 the test trajectories Trajectory 1 (T1), Trajectory 2 (T2) and Trajectory 3 (T3), which were used for evaluation (camera and radar are at the point (0,0) in an x-z Cartesian coordinate system), are presented.</p>
      </sec>
      <sec id="sec-data-2">
        <title>2.2. Data for Evaluation of Fusion Methods</title>
        <p>Simulation of detections for VMD and radar and subsequent registration without knowledge of the ground truth or noise model is performed.</p>
        <p>In case of tracks fusion, inputs for the fusion module are:
1. list of VMD outputs as bounding boxes $[x\ y\ w\ h]$ and the ID of the VMD track,
2. list of radar tracker outputs as $[r\ \dot{r}\ \theta]$ and the ID of the radar track.</p>
      <sec id="sec-1-3">
        <title>In case of data fusion inputs are</title>
        <p>1. list of bounding boxes [   ℎ ] received directly
from the "blob" stage of VMD pipeline
2. list of radar targets [    ]</p>
        <p>The main diference between sources of data for two
approaches is, that in case of data fusion there are many
false detections, which are not filtered by VMD or radar
tracker methods respectively.</p>
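        <p>For concreteness, the two input streams could be represented as plain records along the lines of the following sketch; the field names are our assumption, the paper only fixes the bounding box layout $[x\ y\ w\ h]$ and the presence of track IDs.</p>
        <preformat>
# Sketch of the fusion-module input records; field names are illustrative.
from dataclasses import dataclass
from typing import Optional

@dataclass
class VmdDetection:
    x: float                        # bounding box left, pixels
    y: float                        # bounding box top, pixels
    w: float                        # bounding box width, pixels
    h: float                        # bounding box height, pixels
    track_id: Optional[int] = None  # set for tracks fusion, None for raw blobs

@dataclass
class RadarDetection:
    r: float                        # range, m
    r_dot: float                    # Doppler (range rate), m/s
    theta: float                    # azimuth, rad
    track_id: Optional[int] = None  # set only for radar tracker outputs
        </preformat>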
      </sec>
    </sec>
    <sec id="sec-2">
      <title>3. Data Fusion Methods</title>
      <sec id="sec-2-1">
        <title>1. centralized architecture,</title>
        <p>2. decentralized architecture,
3. distributed architecture,
4. hierarchical architecture.</p>
        <p>Centralized architecture (data from all sources is
processed in single module) is expected to be
theoretically optimal in case of proper synchronization of data
sources and suficient bandwidth for data transfer. It
can sufer, however, from lack of distribution of
bandwidth and processing in case these resources are
limited for given task. Alternatives, solving this issue,
] raaretedreacwendtraatalizineddaiferrcehnitteoctrudreer
(afnudsioconmnpoodseitsioinnc)oarnpdodistributed architecture (fusion nodes receive single
sensor data and provide features to be fused). In our view,
decentralized architecture would introduce
unnecessary complexity and implementation dificulty, so
distributed architecture is considered as only another
option to centralized architecture.</p>
      </sec>
    </sec>
    <sec id="sec-3">
      <title>4. Proposed Fusion Methods</title>
      <p>Based on the discussion in the previous section and the
results of preliminary evaluations (described in section
VI), two main fusion approaches are established:</p>
      <sec id="sec-3-1">
        <title>1. data fusion</title>
        <p>
          2. tracks fusion
Overview of diferent sensor fusion classifications can Data fusion is a centralized mixed input (DAI-FEO
be found in [
          <xref ref-type="bibr" rid="ref1">1</xref>
          ] for a further reading. Here we employ and FEI-FEO) method, accepting raw outputs of radar
terminology, used by Castanedo in that paper. and intermediate results of VMD (bounding boxes of
        </p>
        <p>Possible schemes of the sensor fusion in a given ap- detected blobs).
plication are limited to redundant schemes (with other Tracks fusion is distributed FEI-FEO method,
acceptpossibilities being complementary and cooperative) bas- ing features (tracks) from VMD and radar tracker
moded on the relations of sensors used in the system. Both ules.
types of sensors are measuring the same state of ob- Both should solve the problem defined, but the pros
jects at the same time in the observed area. and cons of approaches are diferent. Fusion of tracks</p>
        <p>
          Based on the idea of modularity of the system, it is is performed after data from both sources is processed
clear, that radar/camera fusion should not be a decision- and there is track detected on both sets of data. Then
making module. While detections are performed by both tracks are matched (track to track association) to
the fusion module, the final decision is afected by ad- better reflect the behavior of the object being tracked.
ditional rule-based filters and the recognition module. In the case of one of the sources not returning track
The goal of the fusion module is to minimize the false while other is returning track, diferent policies can
alarm rate (FAR) of the system and provide inputs for be used to favor reduced false detection rate over
exdecision-making modules. It can be said then, that fu- tended tracking duration or vice versa. Comparing
sion module approaches can be limited to two (data in with data fusion, tracks fusion is easier to debug and
- feature out (DAI-FEO), f eature in - feature out (FEI- tune, since separate modules can be tested and tuned
FEO)), contrary to approaches, which provide data or faster due to overall reduced complexity. Data fusion
decisions as outputs (data in - data out (DAI-DAO), works on a lower level than tracks fusion. It has the
f eature in - decision out (FEI-DEO), decision in - de- benefit of incorporating updates from both sources of
cision out (DEI-DEO)). data into common state update converging to a true
Four types of fusion architectures are defined in [
          <xref ref-type="bibr" rid="ref1">1</xref>
          ]:
state of the object being tracked faster. The role of
policies in the case of one of the sources missing
detections is reduced since both sources of raw data can
be treated similarly (both update Kalman filter state).
        </p>
        <p>There is no need to tune policies for diferent cases
(no recent radar detection, no recent camera detection,
higher than an average mismatch between sources, etc.),
it is handled by common update. The downside of data
fusion compared to track fusion is longer debugging
and tuning. If conditions of experiments/use cases or
equipment changes, that could add delivery time
overhead.</p>
        <sec id="sec-3-1-1">
          <title>4.1. Data Fusion</title>
          <p>The general strategy for data-level fusion can be summarized as follows:
1. Use radar detections as a base for track validation.
2. Any VMD detection can create a persistent track, but it will be validated without radar data only after a relatively long track age is reached.
3. Any track with both recent VMD and radar detections has a higher probability to be validated as a real track; as a consequence, it is validated at a lower track age.</p>
          <p>Such choices are proposed due to the relative ease of radar data validation - if the movement of a potential target in the area under test contradicts the Doppler velocity reported by radar, such a target can be quickly invalidated. There is no similar process for VMD.</p>
          <p>The full track state representation consists of the following parts:
1. Track state vector at time $k$ (based on definitions in Section II), a 6x1 vector
$x_{k} = [x\ \dot{x}\ \ddot{x}\ z\ \dot{z}\ \ddot{z}]^{T}, \quad (1)$
or its counterpart in Polar coordinates. In case of the constant velocity model, a 4x1 vector:
$x_{k} = [x\ \dot{x}\ z\ \dot{z}]^{T}. \quad (2)$
2. The track covariance matrix $P_{k}$: a 6x6 matrix or a 4x4 matrix (if accelerations are removed), describing the amount of variance in the data and interstate dependencies (covariance).
3. The track age. Time elapsed from the moment of track creation by VMD or radar. It is updated if either a VMD or radar detection can be associated with the current state vector.
4. The track innovation error. A value which is compared to predefined threshold values for track validation and removal. The track innovation error (squared Mahalanobis distance) is calculated from the state measurement residual (innovation) and the residual (innovation) covariance. It is one of the main criteria for positive detection.
5. Track update timestamp for VMD.
6. Track update timestamp for radar.
7. Object size. It can be calculated if both VMD and radar detected an object.
8. Object visual distance state: a 2x1 vector (distance, rate of distance). It is estimated from the camera and can be used if the positioning of the device is known. It is highly unstable due to partial obstructions, and because of that it is separated from the track state vector. It can be used by higher-level decision-making modules.
9. Object visual distance covariance matrix: a 2x2 matrix.
10. The total duration of detection for VMD. It is one of the main criteria for positive detection.
11. The total duration of detection for radar. It is one of the main criteria for positive detection.</p>
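          <p>A compact sketch of the full track record enumerated above (ours; the paper defines the fields but not a concrete layout) could look as follows.</p>
          <preformat>
# Sketch of the full track record (items 1-11 above); illustrative only.
from dataclasses import dataclass, field
from typing import Optional
import numpy as np

@dataclass
class Track:
    x: np.ndarray                    # state vector, 6x1 or 4x1 (item 1)
    P: np.ndarray                    # state covariance, 6x6 or 4x4 (item 2)
    age: float = 0.0                 # track age, s (item 3)
    innovation_error: float = 0.0    # squared Mahalanobis distance (item 4)
    t_vmd: Optional[float] = None    # last VMD update timestamp (item 5)
    t_radar: Optional[float] = None  # last radar update timestamp (item 6)
    size: Optional[float] = None     # object size, if both sensors detect (item 7)
    dist: np.ndarray = field(default_factory=lambda: np.zeros(2))  # visual distance state (item 8)
    dist_P: np.ndarray = field(default_factory=lambda: np.eye(2))  # its covariance (item 9)
    dur_vmd: float = 0.0             # total VMD detection duration (item 10)
    dur_radar: float = 0.0           # total radar detection duration (item 11)
          </preformat>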
          <p>The fusion scheme consists of data association, state update, and management of tracks. Data association (point to track) in the first prototype is implemented as simple Nearest Neighbour (NN) estimation in state space, favoring simplicity and speed. A Joint Probability Data Association (JPDA) based association method [24] is used as an alternative in later versions. In NN based data association each measurement is compared against the estimated state $\hat{x}_{k|k-1}$ at time $k$ by calculating a Mahalanobis distance based metric (discussed further). Prefiltering of potential associations can be performed using some simple heuristics like 2D distance.</p>
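          <p>The 2D-distance prefilter mentioned above could be as simple as the following sketch (the gate value and all names are ours).</p>
          <preformat>
# Sketch: cheap 2D-distance gate run before the Mahalanobis-based association.
import numpy as np

def prefilter_candidates(track_xy, measurements_xy, gate_m=30.0):
    """Keep only measurements within gate_m metres of the predicted track
    position; the expensive Mahalanobis metric is evaluated on survivors."""
    d = np.linalg.norm(measurements_xy - track_xy, axis=1)
    return np.nonzero(d &lt; gate_m)[0]
          </preformat>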
          <p>Linear KF is used for state estimation and update.</p>
          <p>Estimation step:
$\hat{x}_{k|k-1} = A\,\hat{x}_{k-1|k-1}, \quad (3)$
where $A$ is the process matrix, defined for a state with accelerations as
$A = \begin{bmatrix} 1 &amp; \Delta t &amp; \frac{\Delta t^{2}}{2} &amp; 0 &amp; 0 &amp; 0 \\ 0 &amp; 1 &amp; \Delta t &amp; 0 &amp; 0 &amp; 0 \\ 0 &amp; 0 &amp; 1 &amp; 0 &amp; 0 &amp; 0 \\ 0 &amp; 0 &amp; 0 &amp; 1 &amp; \Delta t &amp; \frac{\Delta t^{2}}{2} \\ 0 &amp; 0 &amp; 0 &amp; 0 &amp; 1 &amp; \Delta t \\ 0 &amp; 0 &amp; 0 &amp; 0 &amp; 0 &amp; 1 \end{bmatrix}, \quad (4)$
and for a state without acceleration as
$A = \begin{bmatrix} 1 &amp; \Delta t &amp; 0 &amp; 0 \\ 0 &amp; 1 &amp; 0 &amp; 0 \\ 0 &amp; 0 &amp; 1 &amp; \Delta t \\ 0 &amp; 0 &amp; 0 &amp; 1 \end{bmatrix}. \quad (5)$
The prior covariance is estimated as
$P_{k|k-1} = A\,P_{k-1|k-1}\,A^{T} + Q, \quad (6)$
here $Q$ is the process noise covariance matrix.</p>
          <p>The measurement update uses $H$, the measurement matrix, and $z_{k}$, the measurement vector of size $m \times 1$. The size of the measurement vector and the measurement matrix $H$ depend on the type of sensor and on the coordinate systems used for the measurements and the state. $m$ can be understood as the number of parameters measured by a specific sensor and $H$ as a mapping between the state and the measured parameters.</p>
          <p>If the transformation from sensor coordinates to state coordinates is linear, it can be performed by $H$ directly. In our experiment of using the Polar representation of the target state space, the data vector of radar and the measurement matrix are
$z_{k} = [r\ \dot{r}\ \theta]^{T}, \qquad H = \begin{bmatrix} 1 &amp; 0 &amp; 0 &amp; 0 \\ 0 &amp; 1 &amp; 0 &amp; 0 \\ 0 &amp; 0 &amp; 1 &amp; 0 \end{bmatrix}, \quad (7)$
which simplifies calculations. In the case of a camera (VMD) update without knowledge of the geometrical setup:
$z_{k} = [x_{px}], \quad (8)$
here $x_{px}$ is just the x position of the center of the bounding box in screen coordinates. The measurement matrix
$H = \begin{bmatrix} 0 &amp; 0 &amp; \frac{W}{\mathrm{HFOV}} &amp; 0 \end{bmatrix} \quad (9)$
can also be calculated by just using the $\theta$ element of the state, and the state vector needs to be augmented with an additional element 1 as the last row to allow the matrix multiplication in (7). In the above formula, $W$ is the width of the video in pixels and HFOV is the horizontal field of view.</p>
          <p>Here $R$ is an $m \times m$ diagonal measurement error matrix. It is defined based on the parameters of the sensors used. Error ranges from the datasheet of the sensor are a good starting point. The Kalman gain is
$K = P_{k|k-1} H^{T} S^{-1},$
here the inverse of the innovation covariance $S = H P_{k|k-1} H^{T} + R$ is taken. The innovation error, which is not used directly in the Kalman filter update but is the main criterion for data association, is
$\epsilon = y^{T} S^{-1} y, \quad (16)$
here $y = z_{k} - H \hat{x}_{k|k-1}$ is the innovation (measurement residual).</p>
          <p>Lastly, if it is found that there is a best match between a track-measurement pair, the Kalman filter update step is finished by calculating the posterior state and covariance:
$\hat{x}_{k|k} = \hat{x}_{k|k-1} + K y, \quad (17)$
$P_{k|k} = (I - K H) P_{k|k-1}, \quad (18)$
here $I$ is the identity matrix. No measurement is assigned to a matching pair if all $\epsilon$ are larger than some predefined threshold $\epsilon_{thresh}$. In such a case, the new track state for the track without measurement is:
$\hat{x}_{k|k} = \hat{x}_{k|k-1}, \quad (19)$
$P_{k|k} = P_{k|k-1}. \quad (20)$</p>
          <p>Track merging is performed if (similarly to (16)):
$\epsilon_{m} = (x_{1} - x_{2})^{T} R_{match}^{-1} (x_{1} - x_{2}) &lt; \epsilon_{thresh}, \quad (21)$
where $x_{1}$ and $x_{2}$ are the states of the two tracks to be compared, $R_{match}$ is a diagonal matrix which can be treated as holding typical variances of the track state elements (with large values of $R_{match}$ more tracks will be matched), and $\epsilon_{thresh}$ is a threshold for matching.</p>
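          <p>Equations (3), (6) and (16)-(18) assemble into the usual linear KF cycle. A minimal sketch (ours) follows; the innovation error of (16) is returned alongside the posterior so that the association step can use it.</p>
          <preformat>
# Sketch: linear KF predict/update implementing (3), (6) and (16)-(18).
import numpy as np

def kf_predict(x, P, A, Q):
    x_prior = A @ x                      # (3) prior state
    P_prior = A @ P @ A.T + Q            # (6) prior covariance
    return x_prior, P_prior

def kf_update(x_prior, P_prior, z, H, R):
    y = z - H @ x_prior                  # innovation (residual)
    S = H @ P_prior @ H.T + R            # innovation covariance
    S_inv = np.linalg.inv(S)
    eps = float(y @ S_inv @ y)           # innovation error, (16)
    K = P_prior @ H.T @ S_inv            # Kalman gain
    x_post = x_prior + K @ y             # (17) posterior state
    I = np.eye(P_prior.shape[0])
    P_post = (I - K @ H) @ P_prior       # (18) posterior covariance
    return x_post, P_post, eps
          </preformat>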
          <p>With the main parts of the data association and track update discussed, a global view of data fusion can be defined (see Algorithm 1).</p>
          <p>During the step of managing unmatched tracks, the innovation error of the track (normally calculated as (16)) is increased. In the current implementation the following empirically found formula is used:
$\epsilon = \epsilon \cdot \max\left(1 + \epsilon/3,\ 1 + 6/\sqrt{t_{tr}}\right), \quad (22)$
here $t_{tr}$ is the age of the track. The idea is to increase $\epsilon$ for new tracks faster than for older tracks, as the track age usually shows how reliable the current track is.</p>
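          <p>The inflation rule of (22) as a short sketch (assuming, as the reconstructed formula does, that the square root applies to the track age):</p>
          <preformat>
# Sketch: inflate innovation error for unmatched tracks, eq. (22).
import math

def inflate_innovation(eps, track_age):
    # track_age must be positive; young tracks are penalized faster.
    return eps * max(1 + eps / 3, 1 + 6 / math.sqrt(track_age))
          </preformat>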
          <p>The age of a track is increased if there is a VMD or radar detection. If the detection was from a previous measurement (contrary to measurements which came from both sources on the same processing step), it is increased by the time difference between the current and previous measurements. If a track is picked up after an absence of matching measurements, the delta time is added based on the type of update (frame update time for VMD or frame update time for radar). Tracks without current matching measurements do not update the age value.</p>
          <p>For a VMD-only state (no recent radar detection), previous angular movement is used along with size to create view space gating for measurement-to-tracks matching. It is part of the measurement prefiltering mentioned earlier. If the angular velocity isn't initialized (the track age is low), the maximum possible rate of movement based on a typical size and velocity ratio is used to create the view space gate.</p>
          <p>For a track having a recent radar detection or an overall high radar detection duration, the exact estimated position is calculated, and new detections of VMD and radar are projected on a common space to use spatial gating.</p>
          <p>Tracks are created/initialized with every moving object detection. For radar detections the movement condition is non-zero Doppler velocity by default, or it can be set as one of the many algorithm parameters. All VMD detections are treated as moving by definition. After creation the track is in a non-validated state. Tracks are considered to be validated after a predefined age, depending on the following properties:
1. Existence or absence of recent VMD/radar detections, with radar detections (without VMD detection) having a higher impact on the validation threshold than the other way around.
2. Track innovation error.
3. Total previous duration of detections for radar and VMD.
4. Trajectory type - mostly tangential movement with radar only detections should be verified for a longer duration.
5. Velocity thresholds.</p>
          <p>Tracks are removed from the potential tracks list if:
1. The track innovation error grows too large. It is calculated based on new measurements and is also incremented after the absence of radar detection (22).
2. The track was created by a VMD detection and the visual-only innovation error grows too large. It is calculated from angular measurements only and updated similarly to (22).</p>
          <preformat>
/* merging of existing tracks */
for each track t in memory do
    for each track t2 in memory except t do
        apply (21);
        if (21) condition met then
            merge t and t2;
            remove t2 from memory;
        end
    end
end
/* updating of tracks by measurement */
if have a new measurement of any source then
    for each track t in memory do
        estimate prior state (3) and covariance (6);
        prefilter list of possible measurements;
        for each measurement with eps small enough do
            if measurement was already used then
                check for more precise updates in earlier tracks;
                if no better found then
                    update Kalman state (17), (18);
                    push updated state to stack of potential updates;
                    mark measurement as used;
                end
            else
                update Kalman state (17), (18);
                push updated state to stack of potential updates;
                mark measurement as used;
            end
        end
    end
    for each track t in memory do
        apply the best update from the updated states stack;
    end
end
/* manage unmatched tracks */
for each unmatched track do
    increase track innovation error;
    if innovation error reaches threshold then
        remove the track from memory;
    end
end
/* create potential tracks */
for each unmatched measurement do
    if it satisfies movement conditions then
        create a track with an initial state and initial covariance;
    end
end

Algorithm 1: Data fusion algorithm
          </preformat>
        </sec>
        <sec id="sec-3-1-2">
          <title>4.2. Distance Estimation from a Camera</title>
          <p>Distance from camera measurements is calculated based on the assumptions that:
1. the target is not blocked by some other objects,
2. the height and elevation of the camera are known,
3. the detection is of ground-based targets.</p>
          <p>The main features used for the calculations are presented in Fig. 3. Algorithm:
1. Calculate the angle to the base of the target based on the bottom edge of the VMD detection and knowledge of the VFOV of the camera as
$\alpha = \frac{h_{d}\,\mathrm{VFOV}}{V_{res}}, \quad (23)$
where $h_{d}$ is the projection of the position where the target touches the ground on the camera view, and $V_{res}$ is the vertical resolution (number of pixels along the vertical axis of the image).
2. Subtract this angle from the angle $\beta$ formed by the straight up direction and the camera "looking" direction:
$\alpha' = \beta - \alpha. \quad (24)$
3. Calculate the distance, by knowing one side of the triangle (the height of the camera) and the angle between this side and the hypotenuse:
$d = h_{cam} \tan \alpha'. \quad (25)$</p>
          <p>There are other ways to estimate the distance to the target from the camera only. For example, if the target is identified, the relative size of the target on the image could signal distance. That requires, however, a feedback loop between the fusion module and the recognition module.</p>
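          <p>A direct transcription of steps (23)-(25) into Python (ours; argument names are illustrative):</p>
          <preformat>
# Sketch: camera-only distance estimate implementing (23)-(25).
import math

def distance_from_camera(h_d_px, v_res_px, vfov_rad, beta_rad, cam_height_m):
    """h_d_px: image row where the target touches the ground (bbox bottom);
    beta_rad: angle between the straight-up direction and the optical axis."""
    alpha = h_d_px * vfov_rad / v_res_px         # (23) angle to target base
    alpha_prime = beta_rad - alpha               # (24) angle from vertical
    return cam_height_m * math.tan(alpha_prime)  # (25) ground distance
          </preformat>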
        </sec>
        <sec id="sec-3-1-3">
          <title>4.3. Tracks Fusion</title>
          <p>A general strategy for tracks fusion can be summarized as follows:
1. Both sources can create fusion tracks with the other component not present, until a match is found at a later stage.
2. Input tracks are not verified against the Kalman state estimate (no data association), because this step is already done in the tracker module.
3. VMD and radar tracker tracks can be merged if the merging requirements are met. From that moment the fusion track has both components.
4. The track can be split if it is detected that the visual and radar data diverge too much. Track splitting/merging is performed at each data update step.</p>
          <p>Additionally to the track structure discussed in the previous section, the following fields are defined for the tracking state:
1. Fused track ID (different from the components').
2. visualSeparated - a boolean value which shows that the track recently had a visual component but lost it due to divergence of visual and radar data.
3. VMD track ID. With new data updates, the degree of matching between fused track components is first checked for the stored best previous matches.
4. Radar tracker track ID. Same as above.</p>
          <p>The track fusion algorithm is overviewed in Algorithm 2. The mismatch calculation, used in the algorithm description on many occasions, can be explained by looking at Fig. 4. First, measurement timestamps are collected for all entries of both types of tracks. Then estimations for matching are calculated by interpolation (or extrapolation, if on the edges). The average azimuth mismatch is used as the matching parameter. An early exit of the matching function is possible if the mismatch grows to a predefined value.</p>
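          <p>A sketch of the mismatch computation (ours): both tracks are reduced to (timestamp, azimuth) pairs and aligned on the radar timestamps; np.interp clamps at the edges, which is a simplification of the extrapolation mentioned above.</p>
          <preformat>
# Sketch: average azimuth mismatch between a radar track and a VMD track.
import numpy as np

def azimuth_mismatch(radar_entries, vmd_entries, early_exit=None):
    """Entries are (timestamp, azimuth) pairs; timestamps must be increasing."""
    t_r, az_r = map(np.asarray, zip(*radar_entries))
    t_v, az_v = map(np.asarray, zip(*vmd_entries))
    az_v_on_radar_times = np.interp(t_r, t_v, az_v)
    diffs = np.abs(az_r - az_v_on_radar_times)
    if early_exit is not None and diffs.max() > early_exit:
        return np.inf   # early exit when the mismatch is clearly too large
    return float(diffs.mean())
          </preformat>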
          <p>The age of a track is updated as in data fusion, with each new radar tracker output considered as a new radar measurement with a time step equal to the radar update duration. The track time out is increased if no measurements were added to the track. This step is the same for the component tracks and the fused track. The time out for deletion, mentioned in Algorithm 2, is calculated based on the current number of tracks. It is defined as 3 s if the number of tracks is less than $N_{max}$, the maximum number of tracks. If, on the other hand, the number of tracks is higher, the allowed time out reduces:
$t_{timeout} = 3\,\frac{2 N_{max} - N_{cur}}{N_{max}}. \quad (26)$
This assures that all tracks are cleared if $N_{cur}$ reaches $2 N_{max}$. If the number of tracks for some reason grows beyond $2 N_{max}$, all tracks are cleared.</p>
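          <p>Equation (26) and the clearing rule translate directly (sketch, ours):</p>
          <preformat>
# Sketch: load-dependent track timeout, eq. (26).
def track_timeout(n_cur, n_max, base_s=3.0):
    if n_cur > 2 * n_max:
        return 0.0   # clear all tracks
    return base_s * (2 * n_max - n_cur) / n_max
          </preformat>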
          <p>Tracks are created/initialized with every moving object detection from radar and every VMD detection. After creation, a VMD track is in a non-validated state, but a track created from radar tracker data directly is in a validated state. A fused track having both components can be split if the VMD measurements diverge from the tracker output too much. visualSeparated is set to true in this case. It is done to prevent the reacquisition of the same VMD track with a high mismatch factor. The VMD-only part of such a split inherits the range data and all data which is relevant to the durations of detections and the age of the track. It can be matched again later, after visualSeparated expires. The split tracks get invalidated for some short duration (less than a second) by setting their innovation error parameter to some high value and gradually reducing it after new measurements.</p>
          <preformat>
/* updating tracks structures */
if have a new radar tracker frame then
    for each track in frame do
        if the ID of the track can be found in already existing then
            append a new measurement;
        else
            create a new list of track entries with the new ID;
        end
    end
end
if have a new frame of VMD tracks then
    same as for radar tracks;
end
/* matching of tracks */
for each fused track do
    calculate mismatch of radar and video;
    for all non-fused tracks of both types do
        calculate mismatch with the appropriate (radar vs. VMD) track;
        if a better match found then
            assign the new component to fused;
            release the previous component as non-fused;
        end
    end
end
/* generation of tracks */
for each combination of radar and VMD track do
    calculate mismatch;
    if mismatch small enough then
        create a new fused track with matched components;
    end
end
/* destruction of tracks */
calculate time out of track for deletion;
delete all tracks with higher time out than allowed;
/* tracks state update */
for each fused track do
    if any of the track components received updates then
        do a full Kalman filter update;
    else
        update the state as an estimation only (17), (18);
    end
end

Algorithm 2: The track fusion algorithm
          </preformat>
        </sec>
    </sec>
    <sec id="sec-4">
      <title>5. Experimental Setup</title>
      <p>The area selected for the experiments is shown in Fig. 5. The selected area allows testing the performance of tracking at a big enough distance (more than 100 m), with parts of the trajectories being almost exactly tangential while moving around the edge of the stadium. A small amount of moving background (cars, people, trees) allows more control over the experiments.</p>
      <p>Three experiments are performed:
1. A person moving clockwise around the edge of the stadium without stopping.
2. A person moving counter-clockwise around the edge of the stadium, stopping, then proceeding.
3. A person moving clockwise around the edge of the stadium, then changing the moving pattern from mostly tangential to radial.</p>
      <p>An example frame of video with a detection displayed is presented in Fig. 6. Data were acquired using the NXP iMX6 SoM based embedded system with a quad-core 1.2 GHz processor. Video recording and radar data acquisition were performed simultaneously. Time synchronization was assured by knowing the starting time of the video and the frame rate and storing radar raw data or radar tracks with exact timestamps. Although the discussed algorithms were not running at the time of these measurements, close to real-time performance was achieved later using the same hardware.</p>
      <p>The experimental investigation was subdivided into two stages. First, preliminary experiments were performed to select the state model and update strategy for fused tracks. During the second stage, the tracks fusion and the data fusion approaches were validated.</p>
    </sec>
    <sec id="sec-5">
      <title>6. Results of Experimental Investigation</title>
      <sec id="sec-5-1">
        <title>6.1. Comparison of Cartesian</title>
      </sec>
      <sec id="sec-5-2">
        <title>Coordinates and Polar Coordinates for KF State Representation</title>
        <p>The impact of the selected system of coordinates on performance is presented in panels (a)-(c) of Fig. 7. KF state representation by Cartesian or Polar coordinates produces very close results, and a visual separation of the results is hardly noticeable. In Table 1, the performances of the models in which Polar or Cartesian coordinates are used for the KF state representation are presented. Since the results are very similar, it can be concluded that there is no significant difference between the KF state representations. To obtain the error for each model, 10 simulations were performed for each, and the mean MSE was calculated.</p>
      </sec>
      <sec id="sec-5-3">
        <title>6.2. Accuracy of Filtering with Camera</title>
      </sec>
      <sec id="sec-5-4">
        <title>Data added</title>
        <p>The resulting performances of the models with/without adding the camera to the state update, represented through MSE, are shown in Table 2. It can be observed that the KF state updated using camera output represents the ground truth more accurately than one updated by radar data only. As before, the mean MSE is calculated by running the simulation 10 times for each trajectory.</p>
        <p>Measured error / KF error / Polar fusion error (MSE) per trajectory:
T1: 2.4647 / 0.87358 / 0.72844
T2: 2.2787 / 0.58846 / 0.51873
T3: 2.7926 / 0.63428 / 0.58261</p>
      </sec>
      <sec id="sec-5-5">
        <title>6.3. Accuracy of Filtering with Distance</title>
      </sec>
      <sec id="sec-5-6">
        <title>Calculation from Camera Data</title>
        <p>Results of the performance evaluation of the fusion and fusion-with-distance models using MSE statistics are presented in Table 3. Both the best minimum values and the mean values of MSE show that the fusion with distance evaluation model outperforms the model without distance evaluation.</p>
        <p>An example of a typical simulation run with the different models evaluated is shown in Fig. 8.</p>
      </sec>
      <sec id="sec-5-7">
        <title>6.4. Fusion Methods Evaluation</title>
        <p>The main metrics for the evaluation of the two fusion approaches were Object count accuracy (OCA) and FAR. OCA is defined as
$\mathrm{OCA}(G_{f}, D_{f}) = \frac{\min(N_{G}, N_{D})}{(N_{G} + N_{D})/2}, \quad (27)$
where $G_{f}$ and $D_{f}$ are the sets of ground truth points and detected points in measurement frame $f$ respectively, and $N_{G}$ and $N_{D}$ are the quantities of ground truth and detected instances respectively. The overall OCA is defined as the average OCA over all frames of measurements. Any false track appearing in a frame makes the given frame a false positive.</p>
        <p>(Table 3: Fusion error and fusion with distance estimation error comparison using MSE, reporting MIN and MAX values of the Measured error, KF error, Fusion error and Fusion with DIST error for trajectories T1-T3.)</p>
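        <p>Equation (27) and the frame averaging, as a sketch (ours; the handling of empty frames is our assumption):</p>
        <preformat>
# Sketch: Object count accuracy (OCA) per frame, eq. (27), averaged over frames.
def oca_frame(n_truth, n_detected):
    if n_truth + n_detected == 0:
        return 1.0   # empty frame counted as perfect (our assumption)
    return min(n_truth, n_detected) / ((n_truth + n_detected) / 2)

def oca_overall(counts):
    """counts: iterable of (n_truth, n_detected) pairs, one per frame."""
    vals = [oca_frame(g, d) for g, d in counts]
    return sum(vals) / len(vals)
        </preformat>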
        <p>The focus of the experimental investigation was on the elimination of false detections and the reduction of the missed detection rate. Progress towards both goals can be successfully monitored using the selected metrics [25, 26].</p>
        <p>Rather conservative policies for tracks validation were chosen for both versions of fusion to highlight the possibility to avoid false alarms while still keeping a high enough detection rate (indirectly shown by OCA) for all practical purposes. Evaluation results are presented in Table 4. The best results are obtained by Data fusion, and close to the best results are obtained by Tracks fusion.</p>
      </sec>
    </sec>
    <sec id="sec-conclusions">
      <title>7. Conclusions</title>
      <p>1. It was observed experimentally that radar-only tracking suffers from many missed detections if the target trajectory is close to tangential.
2. While the radar-only tracker performs without false alarms during the first two sequences, it is demonstrated with the third sequence that target direction changes can cause false tracks to appear.
3. The two issues mentioned above can be solved with either of the two radar and camera fusion approaches, as seen from the evaluation results. OCA increased drastically in both cases compared to radar-only tracking.
4. Data fusion offers slightly better performance, reflected by higher OCA values. In practice, it means faster track validation and more robust tracking with missed detections from either VMD or radar.
5. The addition of distance measurements from the camera did not prove to be a stable method for tracks matching or state updates in practice. Although simulation was suggesting an accuracy improvement, real measurements were highly unstable while using this approach.</p>
    </sec>
    <sec id="sec-6">
      <title>8. Acknowledgments</title>
      <p>This research is partially funded by the Lithuanian Research Council, Project Nr. 09.3.3-ESFA-V-711-01-0001, and partially funded by the Lithuanian Business Support Agency, Project Nr. 01.2.1-LVPA-T-848-01-0002.</p>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>
          [1]
          <string-name>
            <given-names>F.</given-names>
            <surname>Castanedo</surname>
          </string-name>
          ,
          <article-title>A review of data fusion techniques</article-title>
          ,
          <source>The Scientific World Journal</source>
          <year>2013</year>
          (
          <year>2013</year>
          ).
        </mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>
          [2]
          <string-name>
            <given-names>C.</given-names>
            <surname>Napoli</surname>
          </string-name>
          , E. Tramontana,
          <string-name>
            <given-names>M.</given-names>
            <surname>Wozniak</surname>
          </string-name>
          , En- tomi,
          <article-title>Obstacle detection using millimeter-wave hancing environmental surveillance against or- radar and its visualization on image sequence, ganised crime with radial basis neural networks</article-title>
          ,
          <source>in: Proceedings of the 17th International Conferin: 2015 IEEE Symposium Series on Computa- ence on Pattern Recognition</source>
          ,
          <year>2004</year>
          .
          <source>ICPR</source>
          <year>2004</year>
          ., tional Intelligence, IEEE,
          <year>2015</year>
          , pp.
          <fpage>1476</fpage>
          -
          <lpage>1483</lpage>
          . volume 3, IEEE,
          <year>2004</year>
          , pp.
          <fpage>342</fpage>
          -
          <lpage>345</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>
          [3]
          <string-name>
            <given-names>Y. B.</given-names>
            <surname>Shalom</surname>
          </string-name>
          , Multitarget-multisensor tracking: [16]
          <string-name>
            <given-names>G. P.</given-names>
            <surname>Stein</surname>
          </string-name>
          ,
          <string-name>
            <given-names>O.</given-names>
            <surname>Mano</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Shashua</surname>
          </string-name>
          ,
          <article-title>Vision-based advanced applications, Artech House, Boston, acc with a single camera: bounds on range and MA (</article-title>
          <year>1990</year>
          ).
          <article-title>range rate accuracy</article-title>
          , in: IEEE IV2003 Intelli-
        </mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>
          [4]
          <string-name>
            <given-names>Y.</given-names>
            <surname>Bar-Shalom</surname>
          </string-name>
          ,
          <string-name>
            <given-names>X.-R.</given-names>
            <surname>Li</surname>
          </string-name>
          ,
          <article-title>Multitarget-multisensor gent Vehicles Symposium</article-title>
          .
          <source>Proceedings (Cat. No. tracking: principles and techniques</source>
          , volume
          <volume>19</volume>
          ,
          <year>03TH8683</year>
          ), IEEE,
          <year>2003</year>
          , pp.
          <fpage>120</fpage>
          -
          <lpage>125</lpage>
          . YBs Storrs, CT,
          <year>1995</year>
          . [17]
          <string-name>
            <given-names>F.</given-names>
            <surname>Liu</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.</given-names>
            <surname>Sparbert</surname>
          </string-name>
          ,
          <string-name>
            <given-names>C.</given-names>
            <surname>Stiller</surname>
          </string-name>
          , Immpda vehicle
        </mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>
          [5]
          <string-name>
            <given-names>D.</given-names>
            <surname>Willner</surname>
          </string-name>
          ,
          <string-name>
            <given-names>C.</given-names>
            <surname>Chang</surname>
          </string-name>
          ,
          <string-name>
            <given-names>K.</given-names>
            <surname>Dunn</surname>
          </string-name>
          ,
          <article-title>Kalman filter tracking system using asynchronous sensor fualgorithms for a multi-sensor system, in: 1976 sion of radar and vision</article-title>
          , in: 2008
          <source>IEEE Intelligent IEEE Conference on Decision and Control in- Vehicles Symposium</source>
          , IEEE,
          <year>2008</year>
          , pp.
          <fpage>168</fpage>
          -
          <lpage>173</lpage>
          .
          <source>cluding the 15th Symposium on Adaptive Pro</source>
          <volume>-</volume>
          [18]
          <string-name>
            <given-names>F.</given-names>
            <surname>Beritelli</surname>
          </string-name>
          , G. Capizzi,
          <string-name>
            <given-names>G. L.</given-names>
            <surname>Sciuto</surname>
          </string-name>
          , C. Napoli, cesses, IEEE,
          <year>1976</year>
          , pp.
          <fpage>570</fpage>
          -
          <lpage>574</lpage>
          . F. Scaglione,
          <article-title>Rainfall estimation based on the in-</article-title>
        </mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>
          [6]
          <string-name>
            <given-names>N.</given-names>
            <surname>Kaempchen</surname>
          </string-name>
          ,
          <string-name>
            <given-names>K.</given-names>
            <surname>Dietmayer</surname>
          </string-name>
          ,
          <article-title>Data synchroniza- tensity of the received signal in a lte/4g mobile tion strategies for multi-sensor fusion, in: Pro- terminal by using a probabilistic neural network</article-title>
          ,
          <source>ceedings of the IEEE Conference on Intelligent IEEE Access</source>
          <volume>6</volume>
          (
          <year>2018</year>
          )
          <fpage>30865</fpage>
          -
          <lpage>30873</lpage>
          .
          <source>Transportation Systems</source>
          , volume
          <volume>85</volume>
          ,
          <year>2003</year>
          , pp.
          <fpage>1</fpage>
          -
          <lpage>[</lpage>
          19]
          <string-name>
            <given-names>A.</given-names>
            <surname>Sole</surname>
          </string-name>
          , et al.,
          <article-title>Solid or not solid: Vision for radar 9. target validation</article-title>
          , in: IEEE Intelligent Vehicles
        </mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>
          [7]
          <string-name>
            <given-names>E. A.</given-names>
            <surname>Wan</surname>
          </string-name>
          ,
          <string-name>
            <given-names>R. Van Der</given-names>
            <surname>Merwe</surname>
          </string-name>
          , S. Haykin,
          <source>The un- Symposium</source>
          ,
          <year>2004</year>
          , IEEE,
          <year>2004</year>
          , pp.
          <fpage>819</fpage>
          -
          <lpage>824</lpage>
          .
          <article-title>scented kalman filter, Kalman filtering</article-title>
          and neu- [20]
          <string-name>
            <given-names>C.</given-names>
            <surname>Kreucher</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>Lakshmanan</surname>
          </string-name>
          ,
          <string-name>
            <given-names>K.</given-names>
            <surname>Kluge</surname>
          </string-name>
          ,
          <article-title>A driver ral networks 5 (</article-title>
          <year>2001</year>
          )
          <fpage>221</fpage>
          -
          <lpage>280</lpage>
          .
          <article-title>warning system based on the lois lane detec-</article-title>
        </mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>
          [8]
          <string-name>
            <given-names>D.</given-names>
            <surname>Laneuville</surname>
          </string-name>
          ,
          <string-name>
            <given-names>C.</given-names>
            <surname>Jaufret</surname>
          </string-name>
          ,
          <article-title>Recursive bearings- tion algorithm, in: Proceedings of IEEE interonly tma via unscented kalman filter: Cartesian national conference on intelligent vehicles, volvs. modified polar coordinates</article-title>
          ,
          <source>in: 2008 IEEE ume 1</source>
          , Stuttgart, Germany,
          <year>1998</year>
          , pp.
          <fpage>17</fpage>
          -
          <lpage>22</lpage>
          . Aerospace Conference, IEEE,
          <year>2008</year>
          , pp.
          <fpage>1</fpage>
          -
          <lpage>11</lpage>
          . [21]
          <string-name>
            <given-names>R.</given-names>
            <surname>Deriche</surname>
          </string-name>
          ,
          <string-name>
            <given-names>O.</given-names>
            <surname>Faugeras</surname>
          </string-name>
          , Tracking line segments,
        </mixed-citation>
      </ref>
      <ref id="ref9">
        <mixed-citation>
          [9]
          <string-name>
            <given-names>J.</given-names>
            <surname>Lian-Meng</surname>
          </string-name>
          ,
          <string-name>
            <given-names>P.</given-names>
            <surname>Quan</surname>
          </string-name>
          ,
          <string-name>
            <given-names>F.</given-names>
            <surname>Xiao-Xue</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Y.</given-names>
            <surname>Feng</surname>
          </string-name>
          ,
          <article-title>A Image and vision computing 8 (</article-title>
          <year>1990</year>
          )
          <fpage>261</fpage>
          -
          <lpage>270</lpage>
          .
          <article-title>robust converted measurement kalman filter for</article-title>
          [22]
          <string-name>
            <given-names>S. D.</given-names>
            <surname>Gupta</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J. Y.</given-names>
            <surname>Yu</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Mallick</surname>
          </string-name>
          ,
          <string-name>
            <surname>M.</surname>
          </string-name>
          <article-title>Coates, target tracking</article-title>
          ,
          <source>in: Proceedings of the 31st</source>
          Chi
          <string-name>
            <surname>- M. Morelande</surname>
          </string-name>
          ,
          <article-title>Comparison of angle-only filternese Control Conference</article-title>
          , IEEE,
          <year>2012</year>
          , pp.
          <fpage>3754</fpage>
          -
          <article-title>ing algorithms in 3d using ekf, ukf, pf, pf, and 3758</article-title>
          . ensemble kf, in: 2015 18th International Confer-
        </mixed-citation>
      </ref>
      <ref id="ref10">
        <mixed-citation>
          [10]
          <string-name>
            <given-names>S. V.</given-names>
            <surname>Bordonaro</surname>
          </string-name>
          ,
          <article-title>Converted measurement track- ence on Information Fusion (Fusion)</article-title>
          , IEEE,
          <year>2015</year>
          ,
          <article-title>ers for systems with nonlinear measurement pp</article-title>
          .
          <fpage>1649</fpage>
          -
          <lpage>1656</lpage>
          . functions,
          <source>Ph.D. thesis</source>
          , The school of the thesis, [23]
          <string-name>
            <given-names>D. V.</given-names>
            <surname>Stallard</surname>
          </string-name>
          , Angle-only
          <source>tracking filter in modDoctoral Dissertation</source>
          ,
          <year>2015</year>
          .
          <article-title>ified spherical coordinates</article-title>
          ,
          <source>Journal of Guidance</source>
          ,
        </mixed-citation>
      </ref>
      <ref id="ref11">
        <mixed-citation>
          [11]
          <string-name>
            <given-names>S.</given-names>
            <surname>Blackman</surname>
          </string-name>
          ,
          <string-name>
            <given-names>R.</given-names>
            <surname>Popoli</surname>
          </string-name>
          ,
          <source>Design and analysis of Control, and Dynamics</source>
          <volume>14</volume>
          (
          <year>1991</year>
          )
          <fpage>694</fpage>
          -
          <lpage>696</lpage>
          .
          <article-title>modern tracking systems(book), Norwood</article-title>
          , MA: [24]
          <string-name>
            <given-names>T.</given-names>
            <surname>Fortmann</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Y.</given-names>
            <surname>Bar-Shalom</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Schefe</surname>
          </string-name>
          , Sonar Artech House,
          <year>1999</year>
          . (
          <year>1999</year>
          ).
          <article-title>tracking of multiple targets using joint proba-</article-title>
        </mixed-citation>
      </ref>
      <ref id="ref12">
        <mixed-citation>
          [12]
          <string-name>
            <given-names>D. Y.</given-names>
            <surname>Kim</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Jeon</surname>
          </string-name>
          ,
          <article-title>Data fusion of radar and im- bilistic data association, IEEE journal of Oceanic age measurements for multi-object tracking via Engineering 8 (</article-title>
          <year>1983</year>
          )
          <fpage>173</fpage>
          -
          <lpage>184</lpage>
          . kalman filtering,
          <source>Information Sciences 278</source>
          (
          <year>2014</year>
          ) [25]
          <string-name>
            <given-names>F.</given-names>
            <surname>Beritelli</surname>
          </string-name>
          , G. Capizzi,
          <string-name>
            <given-names>G. Lo</given-names>
            <surname>Sciuto</surname>
          </string-name>
          , C. Napoli,
          <volume>641</volume>
          -
          <fpage>652</fpage>
          . M.
          <article-title>Woźniak, A novel training method to preserve</article-title>
        </mixed-citation>
      </ref>
      <ref id="ref13">
        <mixed-citation>
          [13]
          <string-name>
            <given-names>X.</given-names>
            <surname>Wu</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.</given-names>
            <surname>Ren</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Y.</given-names>
            <surname>Wu</surname>
          </string-name>
          ,
          <string-name>
            <surname>J. Shao,</surname>
          </string-name>
          <article-title>Study on target generalization of rbpnn classifiers applied to ecg tracking based on vision and radar sensor fusion, signals diagnosis</article-title>
          ,
          <source>Neural Networks</source>
          <volume>108</volume>
          (
          <year>2018</year>
          ) in: WCX World Congress Experience. SAE Inter-
          <volume>331</volume>
          -
          <fpage>338</fpage>
          . national,
          <year>2018</year>
          . [26]
          <string-name>
            <given-names>G.</given-names>
            <surname>Capizzi</surname>
          </string-name>
          ,
          <string-name>
            <given-names>G. Lo</given-names>
            <surname>Sciuto</surname>
          </string-name>
          ,
          <string-name>
            <given-names>C.</given-names>
            <surname>Napoli</surname>
          </string-name>
          ,
          <string-name>
            <given-names>D.</given-names>
            <surname>Polap</surname>
          </string-name>
          ,
        </mixed-citation>
      </ref>
      <ref id="ref14">
        <mixed-citation>
          [14]
          <string-name>
            <given-names>G.</given-names>
            <surname>Alessandretti</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Broggi</surname>
          </string-name>
          ,
          <string-name>
            <given-names>P.</given-names>
            <surname>Cerri</surname>
          </string-name>
          , Vehicle and
          <string-name>
            <given-names>M.</given-names>
            <surname>Wozniak</surname>
          </string-name>
          ,
          <article-title>Small lung nodules detection based guard rail detection using radar and vision data on fuzzy-logic and probabilistic neural network fusion, IEEE transactions on intelligent trans- with bioinspired reinforcement learning</article-title>
          ,
          <source>IEEE portation systems 8</source>
          (
          <year>2007</year>
          )
          <fpage>95</fpage>
          -
          <lpage>105</lpage>
          .
          <source>Transactions on Fuzzy Systems</source>
          <volume>28</volume>
          (
          <year>2020</year>
          )
          <fpage>1178</fpage>
          -
        </mixed-citation>
      </ref>
      <ref id="ref15">
        <mixed-citation>
          [15]
          <string-name>
            <given-names>S.</given-names>
            <surname>Sugimoto</surname>
          </string-name>
          ,
          <string-name>
            <given-names>H.</given-names>
            <surname>Tateda</surname>
          </string-name>
          ,
          <string-name>
            <given-names>H.</given-names>
            <surname>Takahashi</surname>
          </string-name>
          , M. Oku-
          <volume>1189</volume>
          .
        </mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>