Drone Positioning From Combined mmWave Radar
and Depth Camera Data
José A. Paredes1 , Miles Hansard2 , Khalid Z. Rajab2,3 , Fernando J. Álvarez1 and
Teodoro Aguilera1
1 Sensory System Research Group (GISS), University of Extremadura, Av. de Elvas s/n, 06006 Badajoz, Spain.
2 Centre for Advanced Robotics (ARQ), School of EECS, Queen Mary University of London, Mile End Road, London E1 4NS, UK.
3 NodeNs Medical Ltd., 10 Bloomsbury Way, London WC1A 2SL, UK.


Abstract
Accurate positioning is needed for many drone manoeuvres, such as landing or remote manipulation. This work uses a millimeter-wave radar device to produce range-angle and range-velocity heatmaps. Additional depth data is obtained, while the drone is flying, from a time of flight camera. The radar and depth data are subsequently combined, using Gaussian process regression. This approach increases the accuracy of the proposed localization system, while preserving the advantages of the radar sensor in poor visibility conditions. An experimental evaluation of the system is performed, in a typical flight scenario.

Keywords
mmWave Radar, ToF Camera, local positioning system, UAV




1. Introduction
The development of reliable positioning systems for UAVs (unmanned aerial vehicles) has
become essential, owing to the widespread use of these platforms. Recent examples include
surveillance [1], logistics [2], audiovisual [3] and military [4] tasks. These applications require
a positioning system of some kind, which provides input for both detection and navigation
purposes. These systems need to provide accurate coordinates, especially for certain
manoeuvres, such as landing or grasping.
   This work describes a precise 3D drone positioning system, based on a mmWave (millimeter-
wave) radar and a ToF (time of flight) camera. The main idea is to detect and localise the drone
with the ToF camera, and then to use this data to calibrate the radar. The resulting system

IPIN’21: Eleventh International Conference on Indoor Positioning and Indoor Navigation, November 29 – December 11,
2021, Lloret de Mar, Spain
" japaredesm@unex.es (J. A. Paredes); miles.hansard@qmul.ac.uk (M. Hansard); khalid@nodens.eu (K. Z. Rajab);
fafranco@unex.es (F. J. Álvarez); teoaguibe@unex.es (T. Aguilera)
~ https://giss.unex.es/testimonial/jose-antonio-paredes-moreno-en/ (J. A. Paredes);
http://www.eecs.qmul.ac.uk/~milesh/ (M. Hansard); https://nodens.eu/ (K. Z. Rajab);
https://giss.unex.es/testimonial/fernando-javier-alvarez-franco-en/ (F. J. Álvarez);
https://giss.unex.es/testimonial/teodoro-aguilera-benitez-en/ (T. Aguilera)
 0000-0002-0412-0179 (J. A. Paredes); 0000-0002-7610-1452 (F. J. Álvarez); 0000-0001-9436-5999 (T. Aguilera)
© 2021 Copyright for this paper by its authors. Use permitted under Creative Commons License Attribution 4.0 International (CC BY 4.0).
CEUR Workshop Proceedings (CEUR-WS.org), ISSN 1613-0073, http://ceur-ws.org
benefits from the advantages of the mmWave and ToF devices, and covers the whole workspace
of the drone.
   The paper is organised as follows: the main related works are presented in Section 2; the
proposed algorithm is developed in Section 3; the experimental setup is described in Section 4,
along with the main results; finally, the conclusions and future work are presented in Section 5.


2. Related Works
Radar systems have been widely used in the last decades to detect targets. Depending on the
wavelength and beam spread, a radar can return multiple readings from the same transmission,
leading to a spatial representation of the environment. By capturing the reflected signal, a radar
system can determine the range, velocity, and AoA (angle of arrival).
   Radar systems based on mmWave technology have become increasingly popular for spatial
sensing tasks. These include detecting and tracking humans, in order to locate and protect
them in potentially dangerous environments. For instance, the Doppler spectrum is analysed
in [5] to distinguish the movement of people’s limbs when walking and thus locate them. In
particular, this system is able to differentiate the movement of an adult from that of a child.
A fusion algorithm combining mmWave radar and RGB (red/green/blue) cameras is presented in [6],
where the image ROIs (regions of interest) are dynamically adapted according to the distance.
Some research aims to estimate position using only a mmWave device, such as [7]. Here, the
authors apply a human gait recognition technique (the device used operates in the 77–81 GHz
band). After an initial setup stage, a spatial resolution of 4.4 cm and a maximum unambiguous
distance of 5 m are achieved. In terms of velocity, the maximum radial velocity is 2 m/s, with a
resolution of 0.26 m/s. Overall, the algorithm achieves a recognition accuracy of 89% over
12 people, with a positioning error of 0.16 m.
   A positioning and tracking system based on 60 GHz signals is analysed in [8]. This design
manages to filter out multiple reflections and diffuse scattering components, so that the achieved
accuracy is relatively high. Its working area extends from 0.46 m to 5.55 m in the longitudinal
direction, and from 1.91 m on the left to 3.04 m on the right. The target's position is obtained
by calculating the local centroid of the associated point cloud. Overall, the system provides
planar positioning with a 99% confidence level and an error of around 30–40 cm.
   There are also works whose target is a drone. For example, two approaches to perimeter
surveillance are presented in [9]. This system covers distances from tens to hundreds of metres,
and the authors show that velocity can be estimated, by analysing small Doppler shifts. Also,
2D detection and tracking for UAVs is explored in [10], where a comparison with a lidar system
is made. The authors can detect drones at distances up to 25m, although if the environment
contains clutter, detection is lost at 5 m. Moreover, when the UAV must be detected at low flight
angles, the multipath effect (due to reflections from ground clutter) can mask direct reflections
from the target. Other works, such as [11, 12], focus on the pursuit of one drone by another. In this
scenario, a 2D radar mounted on the tracking drone is used to detect the target. The missing
axis information is obtained via geometric calculations, by estimating the position of the first
drone.
Figure 1: System overview. The ToF camera provides the drone coordinates, from a limited FoV, which
are used to calibrate the mmWave radar. The latter has full coverage of the scene.


3. Proposed LPS
The main objective of this work is to implement an LPS (local positioning system) to
localise a drone flying in a cluttered scene. We use a mmWave radar, which has the following
advantages. Firstly, it works in the absence of visible light, and performs well under challenging
environmental conditions (e.g. fog, smoke, rain). Secondly, these devices effectively preserve
privacy, as they do not provide recognisable images, but only a map of reflected energy.
   This LPS is designed to work in relatively small spaces, and is focused on manoeuvres that
require high accuracy, such as landing or taking off. To this end, the mmWave radar is
calibrated using data from a ToF camera. The main idea is to use the high accuracy
of the camera, which has a limited maximum range and FoV (field of view), to calibrate the data
from the radar. The main steps of the algorithm are detailed in the following sections.

3.1. Depth camera target detection
To obtain accurate 3D coordinates for calibration, the algorithm in [13] is adapted to detect
the flying drone in a scene where the ToF camera is mounted together with the radar, as indicated in
Figure 1. The camera provides the data for the red shaded area, which is used as input
to a supervised learning algorithm (Section 3.3) so that the radar can operate over the whole scene
(blue shaded area).
   In [13], the authors model the disturbance that a quadcopter causes in an azimuthal depth
map, and apply a 2D matched filter, known as the "Gnome Hat", to detect it. This wavelet allows the
system to quickly detect all drones flying in the scene. Moreover, the filter is dynamically adapted
to the portion of the image that the drone occupies, taking into account how this portion varies
with flying altitude, which is also determined theoretically. Here, the filter has been adapted to
detect drones in a tilted position, taking into account the size of the drone used and the image
produced (see Section 4 for more details on this device).
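   As a rough illustration of this detection step, the sketch below correlates a depth map with a
2D template and takes the strongest response. This is only a minimal sketch in Python; the actual
"Gnome Hat" wavelet, its altitude-dependent scaling and the tilt adaptation are those defined in [13],
so the template used here is a hypothetical placeholder.

    import numpy as np
    from scipy.signal import fftconvolve

    def detect_drone(depth_map, template):
        """Correlate a ToF depth map with a drone template and return the peak.

        depth_map : 2D array of depth values (metres) from the ToF camera.
        template  : 2D matched-filter kernel; here a hypothetical stand-in for
                    the altitude-adapted "Gnome Hat" wavelet of [13].
        """
        d = depth_map - np.nanmean(depth_map)      # zero-mean the depth map
        t = template - np.mean(template)           # zero-mean the kernel
        # Cross-correlation, implemented as convolution with the flipped kernel.
        response = fftconvolve(np.nan_to_num(d), t[::-1, ::-1], mode="same")
        row, col = np.unravel_index(np.argmax(response), response.shape)
        return row, col, depth_map[row, col]       # pixel position and range (m)

    # Purely illustrative template: a compact region standing out from the background.
    template = -np.ones((9, 15))
    template[2:7, 3:12] = 1.0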

3.2. Radar target detection
Once the radar data have been acquired, the detection must be strengthened, in order to suppress
the effects of clutter in the scene, which can obscure the drone. This is done in three
stages:
   1. By fitting a simple aluminium retro-reflector to the drone, thereby increasing the target
      RCS (radar cross-section).
   2. By subtracting the static background signal, once the data have been acquired. This
      technique can be applied to velocity as well as position maps (see the sketch after this list).
   3. By using the micro-Doppler effect. When an object undergoes mechanical vibration or
      rotation, a frequency modulation is induced in the reflected wave, generating sidelobes
      around the base frequency; these allow us to identify the range at which the drone is
      flying.
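
A minimal sketch of the background-subtraction stage, assuming the heatmaps are available as
NumPy arrays and that a short recording of the empty scene has been captured beforehand; the
array names and shapes are illustrative only.

    import numpy as np

    def static_background(empty_frames):
        """Estimate the static clutter map from frames recorded with no drone present.

        empty_frames : array of shape (n_frames, ...) holding range-azimuth,
                       range-elevation or range-velocity heatmaps of the empty scene.
        """
        return np.median(empty_frames, axis=0)

    def subtract_background(frame, background):
        """Remove static clutter from a heatmap and clip negative residuals."""
        return np.clip(frame - background, 0.0, None)

    # Usage (shapes illustrative): estimate clutter once, then clean every frame.
    # bg = static_background(empty_recording)      # e.g. (100, 256, 64)
    # clean = subtract_background(live_frame, bg)  # e.g. (256, 64)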

3.3. Localization algorithm
The fundamental concept in radar systems is the emission of a signal and its reception after
reflecting from the surrounding objects. Specifically, the device chosen in this work is based on
a MIMO (multiple input multiple output) approach, which uses digital beamforming to optimise
the angular estimates.
   Briefly, two consecutive spectral analysis operations are required to obtain information
from the radar data. One determines the range and the velocity, and the other extracts angular
information (azimuth and elevation, if the antenna layout allows it). Here, the first analysis is
performed with an FFT (fast Fourier transform), while MUSIC (multiple signal classification) is
applied for the extraction of angles. For more in-depth radar basics, see [14].
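
For concreteness, the following sketch illustrates the first spectral analysis (range and velocity)
on a single virtual antenna, assuming the raw samples are arranged as a chirps-by-samples matrix;
the angular analysis with MUSIC operates on the virtual-antenna dimension and is not shown here.

    import numpy as np

    def range_doppler_map(cube):
        """First spectral analysis: range FFT over fast time, Doppler FFT over chirps.

        cube : complex array of shape (n_chirps, n_samples) for one virtual antenna
               (a sketch; the device streams one such matrix per antenna per frame).
        """
        n_chirps, n_samples = cube.shape
        win_r = np.hanning(n_samples)              # windows reduce spectral sidelobes
        win_d = np.hanning(n_chirps)
        rng = np.fft.fft(cube * win_r, axis=1)     # range FFT along fast-time samples
        # Doppler FFT along chirps; fftshift centres zero velocity.
        rd = np.fft.fftshift(np.fft.fft(rng * win_d[:, None], axis=0), axes=0)
        return np.abs(rd).T                        # rows: range bins, columns: velocity bins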
   The data read from the device can be summarised as follows. Let P ≡ 𝑃(𝑟, 𝜃, 𝜑) be the 3D
range-azimuth-elevation array obtained after both spectral analyses over space, as explained
in Section 3.2, and PD ≡ 𝑃𝐷(𝑟, 𝑣) the 2D range-velocity array after one spectral analysis over
time (MIMO strategy [15, 16]). Both arrays have previously gone through the background
subtraction stage. The following representations can then be constructed:

    • A range-azimuth heatmap:

        𝑃𝐴(𝑟, 𝜃) = max_𝜑 𝑃(𝑟, 𝜃, 𝜑)                                              (1)

    • A range-elevation heatmap:

        𝑃𝐸(𝑟, 𝜑) = max_𝜃 𝑃(𝑟, 𝜃, 𝜑)                                              (2)

    • A range profile:

        𝑃𝐷(𝑟) = mean_𝑣 𝑃𝐷(𝑟, 𝑣)                                                  (3)

Note that the last equation averages over velocity. In this way, the sidelobes mentioned above
(Section 3.2) reinforce the main peak of this profile, i.e., the detection is strengthened. The
range coordinate can then be taken as:

        𝑟+ = arg max_𝑟 𝑃𝐷(𝑟)                                                     (4)

In the same way, considering the coordinate 𝑟+ , the angular coordinates can be extracted:

        𝜃+ = arg max_𝜃 𝑃𝐴(𝑟+, 𝜃)                                                 (5)

        𝜑+ = arg max_𝜑 𝑃𝐸(𝑟+, 𝜑)                                                 (6)
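
A compact sketch of equations (1)–(6), assuming the arrays P and P_D are available after
background subtraction, together with the physical grids (bin values) for each axis:

    import numpy as np

    def extract_coordinates(P, P_D, r_axis, az_axis, el_axis):
        """Direct-maximum estimates [r+, theta+, phi+] from the radar arrays.

        P      : 3D array indexed as P[r, theta, phi]  (range-azimuth-elevation)
        P_D    : 2D array indexed as P_D[r, v]         (range-velocity)
        *_axis : 1D grids giving the physical value of each bin.
        """
        P_A = P.max(axis=2)        # eq. (1): range-azimuth heatmap
        P_E = P.max(axis=1)        # eq. (2): range-elevation heatmap
        P_r = P_D.mean(axis=1)     # eq. (3): range profile, averaged over velocity

        i_r  = int(np.argmax(P_r))            # eq. (4)
        i_az = int(np.argmax(P_A[i_r]))       # eq. (5)
        i_el = int(np.argmax(P_E[i_r]))       # eq. (6)
        return r_axis[i_r], az_axis[i_az], el_axis[i_el]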

Alternatively, we can use a more sophisticated method, to estimate the 3D position from the
radar data arrays. In particular, we use a GPR (Gaussian process regression) over the angular
profiles 𝑃𝐴 (𝑟+ , 𝜃) and 𝑃𝐸 (𝑟+ , 𝜑) to try to improve the global accuracy. The hypothesis is that
the GPR estimates [𝜃⋆ , 𝜑⋆ ] will be better than the direct maximum estimates [𝜃+ , 𝜑+ ] in (5)
and (6):

                                      𝜃⋆ ← GPR𝜃 (𝑟+ , 𝛼, 𝜖)                                    (7)
                                     𝜑⋆ ← GPR𝜑 (𝑟+ , 𝛼, 𝜖)                                     (8)

where 𝛼 and 𝜖 are the coefficients representing the above profiles in a B-spline basis. This
is done to reduce the size of the GPR input. In addition, this procedure will absorb the lack of
calibration between the ToF camera and the radar. The same procedure will be assessed for
range:
                                    𝑟⋆ ← GPR𝑟 (𝑟+ , 𝛼, 𝜖)                                 (9)
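
The regression step can be sketched as follows, assuming the angular profiles are compressed into a
few B-spline coefficients (the 𝛼 and 𝜖 of equations (7)–(9)) and that scikit-learn's
GaussianProcessRegressor is used; the kernel, the number of basis functions and the helper names
are illustrative choices, not prescribed here.

    import numpy as np
    from scipy.interpolate import make_lsq_spline
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, WhiteKernel

    def spline_coeffs(grid, profile, n_coef=8, k=3):
        """Represent a 1D angular profile by a small set of B-spline coefficients."""
        # Clamped knot vector with uniformly spaced interior knots.
        t_inner = np.linspace(grid[0], grid[-1], n_coef - k + 1)[1:-1]
        t = np.r_[[grid[0]] * (k + 1), t_inner, [grid[-1]] * (k + 1)]
        return make_lsq_spline(grid, profile, t, k=k).c

    def build_features(r_plus, az_profile, el_profile, az_grid, el_grid):
        """Concatenate [r+, alpha, epsilon] into one GPR input vector."""
        alpha = spline_coeffs(az_grid, az_profile)   # coefficients of P_A(r+, theta)
        eps   = spline_coeffs(el_grid, el_profile)   # coefficients of P_E(r+, phi)
        return np.r_[r_plus, alpha, eps]

    # One GPR per output coordinate, trained against the ToF-camera ground truth.
    # X: rows of build_features(...); y: camera azimuth (or elevation / range).
    gpr = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(), normalize_y=True)
    # gpr.fit(X_train, y_train); theta_star = gpr.predict(X_test)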


4. Experimental setup and results
This section presents the experimental setup as well as the main results to assess the proposed
LPS. There are a range of devices that could address the problem described here. We have
implemented the system as affordably as possible, using two off-the-shelf devices: a PMD
Picoflexx [17] ToF camera, and a Texas Instruments IWR6843ISK-ODS mmWave radar [18].
These devices are mounted as shown in Figure 2, and slightly tilted, with respect to the ground,
for better radar performance.
   The target is a Parrot Mambo Minidrone [19], a small UAV with a wingspan of 13cm. The
small size of this drone is deliberately challenging for any localisation system. As commented
in Section 3.2, an aluminium retroreflector is attached to the drone, in order to increase the RCS,
as shown in Figure 3. This piece has been wrapped in white paper to avoid the ToF camera
becoming saturated due to the high reflectivity of the aluminium. Note that the radar signal
will not be affected by this covering.
Figure 2: Experimental setup. The ToF camera has been attached to the mmWave radar, in order to
calibrate the system.


   Firstly, the background subtraction stage is tested. Figure 4 depicts raw heatmaps (top),
compared to the results after background subtraction (bottom). It can be seen that this is an
effective way to remove clutter.
   The detection can fail even after background subtraction, for example due to uncontrolled
multipath effects in these environments. The micro-Doppler effect is used to avoid this,
as illustrated in Figure 5. If we relied only on the range-azimuth heatmap in Figure 5a, the
detection would be erroneous. However, when the sidelobes generated by the propellers
in the range-velocity heatmap (Figure 5b) are integrated, the detection is corrected, as indicated
by the red line in Figure 5c.
   Finally, a simultaneous recording from the camera and the radar is taken while the drone is flying.
The positioning results are depicted in Figure 6 for range, azimuth and elevation separately.




Figure 3: Parrot Mambo Minidrone, with a trihedral retroreflector to increase mmWave detectability.
Here, the points represent the values obtained by extracting the maxima of all radar profiles
[𝑟+, 𝜃+, 𝜑+], plotted against the values obtained with the camera. It can be seen that the range
maxima are close to the camera estimates (the 45° line). The angular estimates are more scattered,
and a systematic pattern of errors can be seen. In order to address this, a GPR has been performed
on these measurements, as represented by the blue line and the shaded area. This line reveals
the slight bias due to the mis-calibration between the two devices, which will be absorbed
by the final GPR.
   Next, the data are randomly divided into 70% training and 30% test data, represented respec-
tively as red and blue points in the last figure. New GPRs, whose inputs are those proposed in
equations (7) and (8), are implemented. For range, it has been found experimentally that the results
are slightly better when employing the following regression, instead of equation (9):

                                                       𝑟⋆ ← GPR𝑟 (𝑟+ )                                               (10)
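
A sketch of this training and evaluation step, assuming the per-frame inputs and the camera ground
truth have already been collected as arrays; the 70/30 random split follows the description above,
and the RMSE is the metric reported in Table 1. The GPR kernel is an illustrative choice.

    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, WhiteKernel
    from sklearn.metrics import mean_squared_error

    def evaluate_gpr(X, y):
        """Fit a GPR on a random 70/30 split and report the RMSE on held-out data.

        X : input features per frame (e.g. r+ alone for range, or [r+, alpha, eps]).
        y : corresponding camera ground truth (range, azimuth or elevation).
        """
        X_tr, X_te, y_tr, y_te = train_test_split(X, y, train_size=0.7, random_state=0)
        gpr = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(), normalize_y=True)
        gpr.fit(X_tr, y_tr)
        rmse = np.sqrt(mean_squared_error(y_te, gpr.predict(X_te)))
        return gpr, rmse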

The final results are shown in Figure 7. The 𝑥-axis represents the coordinates extracted from
the camera [𝑅, Θ, Φ], while the y-axis corresponds to the estimates from the GPR. It can be
observed that the scattering has been reduced, i.e., the RMSE (root-mean-square error) has


Figure 4: Background subtraction example for azimuth (a,c) and elevation (b,d) heatmaps. The magenta
cross indicates the true position of the target. Environmental clutter produces false peaks in (a,b),
which dominate the target. After background subtraction in (c,d), the dominant peak is close to the
true position (estimated by the camera system). The residual biases are modelled by GPR.
Figure 5: Example use of micro-Doppler information to assist target detection. The peak of the range
profile extracted from the range-azimuth heatmap (a), around 3 m, does not correspond to the true
target range. However, the peak of the range-velocity data (b) is located at the true range of the target (c).


Table 1
Summary of RMSEs for the proposed LPS.
                     Direct Maxima    Final GPR
     Range (cm)          19.40          14.74
     Azimuth (°)          7.75           7.48
     Elevation (°)        8.62           4.11


improved, as shown in Table 1.


5. Conclusions and future work
This paper has presented an LPS based on a mmWave radar sensor, which has been dynamically
calibrated with respect to a ToF camera. In this way, the advantages of each device are combined,
while their drawbacks are compensated for.
   The main proposal lies in implementing a GPR over the radar data, whose inputs are concatenated
Figure 6: Positioning results 𝑟+, 𝜃+ and 𝜑+ taken directly from the maxima of the respective profiles.
The regression line through all the points (blue line) reveals a slight bias, probably due to the
mis-calibration between the devices. The point colours represent training (red) and test (blue) data
for the final GPR shown in Figure 7.


versions of range, azimuth and elevation data, since a combination of these defines every point
in space. Precise ground truth has been provided by the ToF camera. The system has been
experimentally tested, and it has been demonstrated that the experimental RMSE decreases for
both range and angular estimates. Future work will investigate different beamforming techniques
for processing the radar data, as well as alternative supervised learning algorithms for the
calibration process.


Acknowledgments
This work has been partially supported by the UK EPSRC National Centre for Nuclear Robotics
(NCNR) EP/R02572X/1; by the Spanish Government and the European Regional Development
Fund (ERDF) through Project MICROCEBUS under Grant RTI2018-095168-B-C54; and by the
Regional Government of Extremadura and ERDF - ESF, under Project GR18038 and through the
Figure 7: Results for the final GPR, where the inputs have been taken as [𝑟+ , 𝛼, 𝜖]. A narrowing in the
dispersion of the data can be observed, which leads to a decrease in the RMSE.


Pre-Doctoral Scholarship under Grant 45/2016 Exp. PD16030.


References
 [1] S. K. Boddhu, M. McCartney, O. Ceccopieri, R. L. Williams, A collaborative smartphone
     sensing platform for detecting and tracking hostile drones, in: T. Pham, M. A. Kolodny,
     K. L. Priddy (Eds.), SPIE Defense, Security, and Sensing, Baltimore, Maryland, USA, 2013,
     p. 874211. doi:10.1117/12.2014530.
 [2] R. Kellermann, T. Biehle, L. Fischer, Drones for parcel and passenger transportation: A
     literature review, Transportation Research Interdisciplinary Perspectives 4 (2020) 100088.
     doi:10.1016/j.trip.2019.100088.
 [3] J. Harvard, M. Hyvönen, I. Wadbring, Journalism from Above: Drones and the Media
     in Critical Perspective, Media and Communication 8 (2020) 60–63. doi:10.17645/mac.
     v8i3.3442.
 [4] T. de Swarte, O. Boufous, P. Escalle, Artificial intelligence, ethics and human values: The
     cases of military drones and companion robots, Artificial Life and Robotics 24 (2019)
     291–296. doi:10.1007/s10015-019-00525-1.
 [5] Y. Balal, N. Balal, Y. Richter, Y. Pinhasi, Time-frequency spectral signature of limb move-
     ments and height estimation using micro-doppler millimeter-wave radar, Sensors 20 (2020)
     4660. doi:10.3390/s20174660.
 [6] X.-p. Guo, J.-s. Du, J. Gao, W. Wang, Pedestrian detection based on fusion of millimeter
     wave radar and vision, in: 2018 International Conference on Artificial Intelligence and
     Pattern Recognition, AIPR 2018, ACM, New York, USA, 2018, pp. 38–42. doi:10.1145/
     3268866.3268868.
 [7] P. Zhao, C. X. Lu, J. Wang, C. Chen, W. Wang, N. Trigoni, A. Markham, mID: Tracking and
     identifying people with millimeter wave radar, in: 2019 15th International Conference on
     Distributed Computing in Sensor Systems (DCOSS), IEEE, Santorini Island, Greece, 2019,
     pp. 33–40. doi:10.1109/DCOSS.2019.00028.
 [8] A. Antonucci, M. Corra, A. Ferrari, D. Fontanelli, E. Fusari, D. Macii, L. Palopoli, Per-
     formance Analysis of a 60-GHz Radar for Indoor Positioning and Tracking, in: 2019
     International Conference on Indoor Positioning and Indoor Navigation (IPIN), IEEE, Pisa,
     Italy, 2019, pp. 1–7. doi:10.1109/IPIN.2019.8911764.
 [9] M. Caris, S. Stanko, W. Johannes, S. Sieger, N. Pohl, Detection and tracking of Micro
     Aerial Vehicles with millimeter wave radar, in: Proceedings of the 2016 European Radar
     Conference (EuRAD), London, UK, 2016, pp. 406–408.
[10] M. U. de Haag, C. G. Bartone, M. S. Braasch, Flight-test evaluation of small form-factor
     LiDAR and radar sensors for sUAS detect-and-avoid applications, in: 2016 IEEE/AIAA
     35th Digital Avionics Systems Conference (DASC), Sacramento, California, USA, 2016, pp.
     1–11. doi:10.1109/DASC.2016.7778108.
[11] S. Dogru, R. Baptista, L. Marques, Tracking drones with drones using millimeter wave
     radar, in: Robot 2019: Fourth Iberian Robotics Conference, volume 1093, Oporto, Portugal,
     2019, pp. 392–402. doi:10.1007/978-3-030-36150-1_32.
[12] S. Dogru, L. Marques, Pursuing drones with drones using millimeter wave radar, IEEE
     Robotics and Automation Letters 5 (2020) 4156–4163. doi:10.1109/LRA.2020.2990605.
[13] J. A. Paredes, F. J. Álvarez, T. Aguilera, F. J. Aranda, Precise drone location and tracking by
     adaptive matched filtering from a top-view ToF camera, Expert Systems with Applications
     141 (2020) 112989. doi:10.1016/j.eswa.2019.112989.
[14] M. A. Richards, Fundamentals of Radar Signal Processing, McGraw Hill Professional, 2005.
[15] E. Fishler, A. Haimovich, R. Blum, D. Chizhik, L. Cimini, R. Valenzuela, MIMO radar: An
     idea whose time has come, in: Proceedings of the 2004 IEEE Radar Conference (IEEE Cat.
     No.04CH37509), 2004, pp. 71–78. doi:10.1109/NRC.2004.1316398.
[16] J. Li, P. Stoica, MIMO Radar with Colocated Antennas, IEEE Signal Processing Magazine
     24 (2007) 106–114. doi:10.1109/MSP.2007.904812.
[17] PMD, Flexx | picofamily, 2021. URL: https://pmdtec.com/picofamily/flexx/.
[18] T. Instruments, IWR6843ISK-ODS IWR6843 intelligent mmWave overhead detection sensor
     (ODS) antenna plug-in module, 2020. URL: http://www.ti.com/tool/IWR6843ISK-ODS.
[19] Parrot, Parrot Bebop 2 Power - Pack FPV, 2017. URL: https://www.parrot.com/es/drones/
     parrot-bebop-2-power-pack-fpv.