<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta />
    <article-meta>
      <title-group>
        <article-title>On the Localization of Sensors using a Drone with UWB Antennas</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Francesco Betti Sorbelli</string-name>
          <email>francesco.bettisorbelli@unifi.it</email>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Cristina M. Pinotti</string-name>
          <email>cristina.pinotti@unipg.it</email>
          <xref ref-type="aff" rid="aff1">1</xref>
        </contrib>
        <aff id="aff0">
          <label>0</label>
          <institution>Dept. of Computer Science and Math., University of Florence</institution>
          ,
          <country country="IT">Italy</country>
        </aff>
        <aff id="aff1">
          <label>1</label>
          <institution>Dept. of Computer Science and Math., University of Perugia</institution>
          ,
          <country country="IT">Italy</country>
        </aff>
      </contrib-group>
      <abstract>
        <p>In recent years, mega-wildfires have become routine news. For firefighters, it is crucial to rely on systems able to prevent and detect those disastrous events. Wireless Sensor Networks (WSNs) have become an important tool for tasks such as environmental and forest monitoring. An important requirement for WSNs is the ability to precisely localize their sensors, so as to augment the collected data with their position. In this paper, we survey three previously presented algorithms, called Short, Dir, and Omni, for localizing terrestrial sensors using a drone. We compare their performance under a variety of different conditions: we evaluate the path length, we compare the localization performance in terms of precision and error, and we analyze the impact of using multilateration instead of trilateration during each localization. Our study supports the importance of geometry in selecting the waypoints from which a sensor is measured. The common belief that multilateration leads to a more precise localization is not completely correct. Indeed, we have shown by simulations that, when the waypoints follow a well-pondered geometry, multilateration is not useful, whereas multilateration can improve the precision for algorithms that select the waypoints according to a best-effort policy.</p>
      </abstract>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>Introduction</title>
      <p>It is widely known that, during the dry season, the risk of forest fires is very high. Every year, such situations cause huge damage in terms of money and even human lives. Nowadays, we are used to watching on television firefighters trying to extinguish fires, the majority of which are caused fraudulently by humans. Therefore, it is appropriate to build robust firefighting solutions in order to prevent and detect such emergencies. As an example, the Mediterranean basin, which is a very large area, is continuously affected by fires in the summer. Hence, it is necessary to cover and precisely monitor these potentially hazardous areas with sensors.</p>
      <p>
        In such a scenario, a Wireless Sensor Network (WSN) can be built inside a forest for monitoring tasks. Deploying sensors close to dangerous places is essential in order to detect potential clues, for example, unexpected temperature increases in a very dry zone. Once the data detected by the terrestrial sensors are delivered to the master gateway node, proper and adequate actions are taken. To build the WSN, sensors can be randomly deployed in the target area, and a mobile vehicle can be used as the contact point between the WSN and the external world. However, the random deployment leaves the sensors initially unaware of their location. Localization is one of the most important tasks in a WSN, since it provides fundamental support for many location-aware protocols and applications. Due to limitations in form factor, cost per unit, and energy budget, individual sensors are not expected to be GPS-enabled. Many existing localization algorithms in the literature require a large number of fixed anchor points, i.e., sensors whose positions are known a-priori [
        <xref ref-type="bibr" rid="ref2">1</xref>
        ]. The number of anchor points and the cost of their deployment grow with the size of the deployment area. Moreover, the anchor points must be deployed in advance, making the use of anchor sensors unsuitable for emergency situations.
      </p>
      <p>
        In order to decrease the setup costs of WSNs, we propose to replace fixed anchor sensors with a single mobile anchor, such as an Unmanned Aerial Vehicle (UAV) [
        <xref ref-type="bibr" rid="ref3 ref4">2, 3</xref>
        ]. Recently, UAVs, also known as drones, have received increasing attention from the research and industry communities [
        <xref ref-type="bibr" rid="ref5 ref6">4, 5</xref>
        ]. Depending on the technique adopted to estimate the sensors' positions, localization algorithms can be categorized as range-free or range-based. In the former case, the position estimation is performed without using any type of ranging measurement. In the latter, the estimations are instead inferred by exploiting several properties of the communication signals, such as the Received Signal Strength Indicator (RSSI) or the Time of Arrival (ToA).
      </p>
      <p>
        In this paper, we survey three previously presented algorithms: Short [
        <xref ref-type="bibr" rid="ref7">6</xref>
        ], Dir [
        <xref ref-type="bibr" rid="ref4">3</xref>
        ], and Omni [
        <xref ref-type="bibr" rid="ref3">2</xref>
        ]. All three algorithms are range-based and use a drone as a mobile anchor, which replaces the fixed anchors. The original contribution of this work is a comparative study, via simulation, of the properties of these three localization algorithms. Moreover, we extend the three algorithms by using multilateration instead of trilateration to localize the sensors. Our somewhat counterintuitive finding is that multilateration does not always improve the precision of the localization, especially when the trilateration has selected measurements that reflect a peculiar geometry.
      </p>
      <p>The three algorithms make different assumptions on the hardware available at the drone for communicating with the terrestrial sensors. Dir assumes that the drone is equipped with directional antennas, which can adjust their beamwidth according to the required precision. Short and Omni assume only a regular omnidirectional antenna. All the solutions adopt the Impulse-Radio Ultra-Wide-Band (IR-UWB) technology for the antennas. Dir and Omni are based on a static path for the drone over the deployment area, and guarantee the localization precision required by the final user, while Short produces a very short static path but cannot guarantee any a-priori localization precision. In other words, Dir and Omni prefer precision to path length, whereas Short sacrifices precision for being able to accomplish an entire mission without recharging the battery.</p>
      <p>It is very important to stress that all three algorithms locate all the nodes, independently of their number and their actual position. While traversing the path, the drone measures the distances to the sensors at special positions, called waypoints: each waypoint acts as a fixed anchor node. Once a sensor has received three measurements, it can calculate its position by applying the trilateration procedure. Moreover, in this paper, to localize themselves, the sensors can perform either a multilateration using six different measurements or the usual trilateration procedure.</p>
      <p>The main contribution of this work is an extensive comparison of the three algorithms that studies, in particular, whether using a greater number of measurements for each sensor makes it possible to increase the localization precision. Our findings contradict the common belief that multilateration always helps.</p>
      <p>The rest of the paper is organized as follows. Sec. 2 reviews the current literature on localization algorithms using drones. Specifically, in Sec. 2.1 we survey Short, in Sec. 2.2 we survey Dir, while in Sec. 2.3 we survey Omni. The experimental results are discussed in Sec. 3, while Sec. 4 concludes the paper.</p>
    </sec>
    <sec id="sec-2">
      <title>Recent Localization Algorithms</title>
      <p>We consider a network of n sensors deployed in a rectangular area Q of size Qx × Qy. From now on, "sensor P" is short form for "the sensor that resides at point P in Q". Since, in principle, any point P in Q is a candidate to contain a sensor, with a little abuse of notation, we denote by P indistinctly the point P or the sensor P. Moreover, we use the notation PP′ to denote the ground distance between two waypoints/sensors P and P′.</p>
      <p>
        Our mobile anchor is a drone. It flies at a fixed altitude h [
        <xref ref-type="bibr" rid="ref8">7</xref>
        ]. We assume that the drone measures its altitude with negligible error, for example, by using the Differential GPS technique [
        <xref ref-type="bibr" rid="ref9">8</xref>
        ]. To range the sensors on the ground and to achieve high precision, we adopt the two-way ranging application of the IR-UWB technology, which guarantees a very high measurement precision of the order of 10 cm [
        <xref ref-type="bibr" rid="ref10">9</xref>
        ], but other technologies could be used as well. For example, WiFi guarantees an instrumental precision of 7-10 m, and Bluetooth of 15 m [
        <xref ref-type="bibr" rid="ref11">10</xref>
        ]. In recent years, autonomous vehicles have been used for patrolling areas. Since in this work we assume that the autonomous vehicle is a drone, the trajectory will be traversed at a certain altitude h, and the projection of the trajectory on the ground will be denoted as the 2D movement trajectory.
      </p>
      <p>Assuming that the path is used for localizing objects, the drone takes measurements at predefined points along its trajectory. The projection on the ground of a measurement point will be called a waypoint. Fig. 1 depicts five 2D movement trajectories: (a) Scan, (b) Double Scan, (c) Hilbert, (d) S-Curves, and (e) Lmat.</p>
      <p>
        In [
        <xref ref-type="bibr" rid="ref12">11</xref>
        ], three different 2D movement trajectories have been studied, i.e., Scan, Double Scan, and Hilbert. The simplest algorithm is Scan (see Fig. 1(a)), in which the drone follows a path formed by vertical straight lines interconnected by horizontal lines. Essentially, the drone sweeps the area along the y-axis, and the waypoints are regularly distributed along the path. The main drawback is that Scan provides a large number of collinear anchor points. In order to resolve the collinearity problem, Double Scan has been proposed (see Fig. 1(b)), in which the mobile anchor node sweeps the sensing area along both the x-axis and the y-axis. However, in this way the path length is doubled compared with Scan. A level-n Hilbert curve (see Fig. 1(c) with n = 3) divides the area into 4^n square cells and connects the centers of those cells using 4^n line segments, each of length equal to the side of a square cell. Hilbert provides more non-collinear anchor points, but the path length can become very long as the parameter n increases. Note that all the above three schemes are based on straight lines. In order to heavily reduce collinearity during localization, the S-Curves static path (see Fig. 1(d)) was introduced in [
        <xref ref-type="bibr" rid="ref13">12</xref>
        ]. S-Curves is similar to Scan except that it uses curves rather than straight lines. Even though the collinearity problem is almost resolved, the main problem is that it does not properly cover the four corners of the square sensing area. The authors in [
        <xref ref-type="bibr" rid="ref14">13</xref>
        ] developed Lmat (see Fig. 1(e)): the main idea is to plan a series of mobile anchor points in such a way as to form regular equilateral triangles, avoiding collinearity problems. Each node inside a triangle is localized by a trilateration procedure using the three vertices.
      </p>
      <p>It is worth noting that Dir and Omni adopt a Scan 2D trajectory. Short, instead, starts with a trajectory similar to Lmat, which is then modified by solving an instance of the Traveling Salesman Problem (TSP).</p>
      <sec id="sec-2-1">
        <title>Measurement Definitions</title>
        <p>In all the three algorithms compared in this work, i.e., Short, Dir, and Omni, during the mission the drone
takes measurements at pre-established points of the path called waypoints.</p>
        <p>We assume that the drone, using the IR-UWB antenna, measures its distance from the sensors by the round-trip time of messages exchanged with them. To take a measurement, the drone acts as follows. At each waypoint wi, the drone sends a beacon signal with a unique identifier, which depends on the coordinates (xwi, ywi) of the waypoint and the current timestamp. Each sensor on the ground that can hear a beacon replies to the drone with an ack message that contains its ID, the current timestamp, and the identifier of the received beacon. The drone computes the distance between itself and the sensor using the round-trip time of the beacon message, and then sends a message with the computed distance to the sensor. Once a sensor has collected three measurements, it can locally apply the trilateration algorithm (Eq. (1)) to discover its position P. Hence, given three ground measures, the estimated position of P is the point (xP, yP) that minimizes the sum of the squared residuals, that is:
min Σ_{i=1,2,3} εi²
s.t. √((xwi − xP)² + (ywi − yP)²) + εi = wiP, for i = 1, 2, 3 (1)</p>
        <p>The multilateration (in our case, six-lateration) procedure can be considered the generalization of trilateration, as reported in Eq. (2), in which six measurements are considered:
min Σ_{i=1,...,6} εi²
s.t. √((xwi − xP)² + (ywi − yP)²) + εi = wiP, for i = 1, ..., 6 (2)
In other words, slant distances, when projected on the ground, cannot be larger than dmax. This dmax constraint holds for all the three surveyed algorithms.</p>
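        <p>Both the trilateration and the six-lateration above are instances of the same least-squares problem. The following sketch (our own Python implementation, not the paper's MATLAB code; the function name is ours) solves the standard linearization of the circle equations in closed form, for any number n ≥ 3 of waypoints:</p>

```python
def lateration(waypoints, distances):
    """Least-squares position estimate from n >= 3 waypoints and their
    measured ground distances; covers both trilateration (n = 3) and
    six-lateration (n = 6).

    Subtracting the first circle equation from the others yields a
    linear system A p = b in the unknown position p = (x, y); we solve
    its 2x2 normal equations (A^T A) p = A^T b directly.
    """
    (x1, y1), d1 = waypoints[0], distances[0]
    rows, rhs = [], []
    for (xi, yi), di in zip(waypoints[1:], distances[1:]):
        rows.append((2 * (x1 - xi), 2 * (y1 - yi)))
        rhs.append(di**2 - d1**2 - xi**2 + x1**2 - yi**2 + y1**2)
    s11 = sum(a * a for a, _ in rows)
    s12 = sum(a * b for a, b in rows)
    s22 = sum(b * b for _, b in rows)
    t1 = sum(a * r for (a, _), r in zip(rows, rhs))
    t2 = sum(b * r for (_, b), r in zip(rows, rhs))
    det = s11 * s22 - s12 * s12  # zero iff all waypoints are collinear
    if abs(det) < 1e-9:
        raise ValueError("collinear waypoints: position is ambiguous")
    return ((t1 * s22 - t2 * s12) / det, (s11 * t2 - s12 * t1) / det)
```

        <p>With exact distances, both n = 3 and n = 6 recover the true position; with noisy distances, the extra rows of the six-lateration simply enter the same normal equations.</p>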
      </sec>
      <sec id="sec-2-2">
        <title>Ground and Trilateration Precision</title>
        <p>
          Slant distances are affected by errors that depend on the adopted technology [
          <xref ref-type="bibr" rid="ref11">10</xref>
          ], i.e., IR-UWB in our case. The slant precision εs denotes the maximum error in the measurements. The trilateration algorithm works on the ground distances, thus the ground error ed(P) for a point P on the ground is [
          <xref ref-type="bibr" rid="ref11">10</xref>
          ]:
ed(P) = es · 1/cos(α) (3)
      = es · √(1 + h²/d²) (4)
where α is the angle of incidence of the slant distance to the ground, d is the distance on the ground between the drone and the sensor, h is the altitude, and es is the actual measured slant distance error (see Fig. 2).
        </p>
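        <p>To make the behavior of the ground-error formula ed(P) = es·√(1 + h²/d²) concrete, the following sketch (our own; the sample numbers are illustrative, with es = 0.10 m chosen to match the 10 cm IR-UWB precision mentioned above) shows how the error inflates as the ground distance shrinks relative to the altitude:</p>

```python
import math

def ground_error(es, h, d):
    """Ground error ed(P) = es * sqrt(1 + h^2 / d^2): the slant error es
    projected on the ground at altitude h and ground distance d."""
    return es * math.sqrt(1.0 + (h / d) ** 2)

# Illustrative values (not from the paper): es = 0.10 m, h = 20 m.
e_far = ground_error(0.10, 20.0, 20.0)   # d = h: error ~0.141 m
e_near = ground_error(0.10, 20.0, 10.0)  # halving d inflates it to ~0.224 m
```

        <p>This is exactly why the precise algorithms impose the minimum ground distance dmin: the error is maximized at the smallest allowed d.</p>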
      </sec>
      <sec id="sec-2-3">
        <title>Distance Definitions</title>
        <p>The drone and the sensor have communication ranges, rdrone and rsensor, respectively. Since a message can be exchanged between the drone and the sensor only if they can hear each other, throughout this paper the communication range will be r = min{rdrone, rsensor}.</p>
        <p>The drone measures the (3D) slant distance s, which is defined as the line-of-sight distance between itself and the measured sensor. Clearly, s ≤ r. By a simple geometric argument, since we assume negligible error in altitude, it is easy to see that any measured ground distance d satisfies d = √(s² − h²).</p>
        <p>
          From Eq. (4), the ground error increases when the ground distance decreases. Since the ground error is maximum when the distance on the ground is minimum, having fixed dmin as the minimum allowed ground distance, the maximum possible ground error is [
          <xref ref-type="bibr" rid="ref11">10</xref>
          ]:
εd = max_{P∈Q} ed(P) = es √(1 + h²/dmin²) (5)
        </p>
        <p>However, only Dir and Omni are interested in bounding the localization error, so the dmin constraint is required only for them.</p>
        <p>
          Moreover, to minimize the localization error eL(P) due to the trilateration, the three waypoints from which a sensor is measured must be collinear neither among themselves nor with the sensor [
          <xref ref-type="bibr" rid="ref11">10</xref>
          ]. In fact, if the waypoints are collinear among themselves, the trilateration algorithm cannot distinguish between the real position P of the sensor and the mirror image P′ of P. With regard to the collinearity with the sensor, as proved in [
          <xref ref-type="bibr" rid="ref11">10</xref>
          ], the localization precision εL is expressed as:
εL = max_{P∈Q} eL(P) = εd / sin(αmin/2) (6)
where αmin is the minimum angle obtained during the localization. The best case occurs when αmin = π/3 [
          <xref ref-type="bibr" rid="ref11">10</xref>
          ]. If the waypoints are collinear with the sensor, αmin tends to 0, the localization error becomes very large, and thus the estimated position becomes very imprecise.
        </p>
        <p>For the same reason mentioned above, the αmin constraint is satisfied only for Dir and Omni, since they need to bound the localization error. The αmin constraint is not actually satisfied in Short, because the geometry built by triangles does not allow bounding the minimum angle αmin. For example, if a generic point P resides at the midpoint of a triangle's side, dmin is respected but αmin = 0.</p>
        <p>To summarize, the aim of the precise Dir and Omni algorithms is to obtain an L-precise localization for each point, where the maximum localization error L is determined as a function of dmin and αmin. In conclusion, the sensor P in Q is L-precisely localized, where L is given by Eq. (6), if the drone chooses three ranging waypoints w1(P), w2(P), and w3(P) for P such that they satisfy the following constraints: 1. dmax, which controls the reachability of each point P in Q: wi(P)P ≤ dmax for i = 1, 2, 3; 2. dmin, which controls the ground precision εd for P in Q: wi(P)P ≥ dmin for i = 1, 2, 3; 3. αmin, which controls the collinearity of the drone with P: α(P) ≥ αmin; 4. non-collinearity, which controls the collinearity among waypoints: w1(P), w2(P), and w3(P) cannot belong to the same straight line.</p>
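        <p>The four constraints above can be checked numerically. The following sketch (our own helper; the function name and the numeric tolerance are ours, not from the paper) tests a candidate waypoint triple for a point P:</p>

```python
import math

def satisfies_constraints(P, ws, dmax, dmin, alpha_min):
    """Check the four waypoint constraints for point P and waypoints ws."""
    dists = [math.dist(P, w) for w in ws]
    if any(d > dmax for d in dists):      # 1. reachability (dmax)
        return False
    if any(d < dmin for d in dists):      # 2. ground precision (dmin)
        return False
    # 3. minimum angle at P: the smallest angular gap between the three
    #    directions towards the waypoints must be at least alpha_min.
    angs = sorted(math.atan2(w[1] - P[1], w[0] - P[0]) for w in ws)
    gaps = [angs[1] - angs[0], angs[2] - angs[1],
            2 * math.pi - (angs[2] - angs[0])]
    if min(gaps) < alpha_min:
        return False
    # 4. non-collinearity of the waypoints (cross product of two edges)
    (x1, y1), (x2, y2), (x3, y3) = ws
    return abs((x2 - x1) * (y3 - y1) - (y2 - y1) * (x3 - x1)) > 1e-9
```

        <p>For instance, three waypoints 120° apart around P pass all checks, while three waypoints on one straight line through P fail the angle check.</p>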
        <p>As regards Short, it only respects the reachability and non-collinearity constraints.</p>
        <p>We finally summarize the constraints for each algorithm in Tab. 1.</p>
        <p>Short assumes the drone to be equipped with a regular omnidirectional antenna. The aim of this algorithm is to find a static path as short as possible for localizing all the sensors, but no a-priori bound on the localization error is guaranteed. Short operates in two phases: waypoint grid construction and waypoint ordering.</p>
        <p>As depicted in Fig. 3, in the first phase, Short builds a set of waypoints forming a grid of isosceles triangles (dashed lines) which covers the whole deployment area Q.</p>
        <p>The waypoints reside at the vertices of each triangle (gray dots). The triangles have the same base Tx and the same height Ty. Isosceles triangles are more flexible than equilateral ones, because they can be resized along both dimensions. This allows us to cover the whole deployment area as tightly as possible, without wasting path length for uselessly covering external areas. The base and the height of the triangles must be as long as possible, without violating the maximal triangle side dmax. Given a base Tx, the longest feasible height is Tymax = √(dmax² − (Tx/2)²). Short computes the base and the height of the triangles as follows:
Tx = Qx / ⌈Qx/dmax⌉, Ty = Qy / ⌈Qy/Tymax⌉</p>
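        <p>Our reading of the sizing rule above can be sketched as follows (the ceiling-based formulas are our reconstruction of the garbled original, so treat them as an interpretation rather than the paper's exact code):</p>

```python
import math

def triangle_grid(Qx, Qy, dmax):
    """Base Tx and height Ty of Short's isosceles-triangle grid.

    Tymax = sqrt(dmax^2 - (Tx/2)^2) is the tallest height for base Tx
    such that no triangle side exceeds dmax; the ceilings then shrink
    Tx and Ty just enough to tile Qx and Qy exactly."""
    Tx = Qx / math.ceil(Qx / dmax)
    Tymax = math.sqrt(dmax**2 - (Tx / 2) ** 2)
    Ty = Qy / math.ceil(Qy / Tymax)
    return Tx, Ty
```

        <p>For a 1000 m × 800 m area with dmax = 300 m this yields Tx = 250 m and Ty ≈ 266.7 m, and every triangle side stays within dmax.</p>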
        <p>Finally, in the waypoint ordering phase, Short executes a TSP solver to connect all the waypoints in order to drastically reduce the path length. The TSP solver takes as input all the waypoints' locations and outputs the static path S. An example of S is depicted as a dashed line in Fig. 4, where the gray dots represent the waypoints, while the black dots are the sensors inside Q.</p>
        <p>
          In our solution, the directional antennas cover six different sectors, as illustrated in Fig. 5. The drone can send the beacons in all six orientations at the same time using an array of directional antennas [
          <xref ref-type="bibr" rid="ref15">14</xref>
          ]. Note that the sensors on the ground are equipped with omnidirectional antennas. We express each orientation as a pair: type (up, dw, hor) and polarity (+, −). Each sensor saves in its local register R the first beacon that it receives for each orientation. The matching between the register positions and the orientations is also reported.
        </p>
        <p>The static path D is depicted as a dashed line in Fig. 6. D is uniquely determined by posing Fx = dmin/2, Fy = 0, F = (−Fx, Fy), and E = (xE, yE), where xE = Qx + Fx and yE can be either 0 or Qy depending on the parity of the number of vertical scans. The waypoints reside only on the vertical scans. The distance between two consecutive waypoints is denoted by Iw.</p>
        <p>Dir assumes the drone to be equipped with directional antennas; in particular, the drone carries an array of six directional antennas. Each antenna is assumed to transmit the beacon in a circular sector, centered at the antenna position, of radius r, beamwidth 2θ, where θ = arctan((Iw/2)/dmin), and orientation φ.</p>
        <sec id="sec-2-3-1">
          <title>D of Dir (the segment EF is not sketched)</title>
          <p>From each vertical scan, the points of Q that we can precisely measure using any type of orientation are those at distance at least dmin/2 and at most (dmax − 2Iw)/2 from the scan. In fact, although the antennas of type hor can precisely measure the points at distance from dmin up to dmax from the scan, the most stringent limits for the precise measurements come from the antennas of type up and dw. Therefore, any two consecutive vertical scans are fixed at distance no greater than (dmax − dmin − 2Iw)/2. Having fixed the first and the last scan respectively at −Fx and Qx + Fx, the length of H is fixed in order to evenly distribute Qx + 2Fx in stripes of width as close as possible to the maximum value. In this way, the whole area is covered without wasting path length. Finally, observe that the left and right stripes of Q of width dmin/2 adjacent to each vertical scan are not measured by that scan, in order to satisfy the dmin constraint. The left and right stripes adjacent to a scan are then measured by the scan that precedes it and the scan that follows it.</p>
          <p>During the flight, at each waypoint, the drone sends the beacon message along the six different orientations of beamwidth: up+, up−, dw+, dw−, hor+, and hor−. When a sensor receives the drone's message from orientation φ, for φ = dπ/3 and d = 0, ..., 5, it first checks whether the d-th location of its register R is empty or not. In the first case, the sensor is receiving the beacon from orientation d for the first time. In the second case, since the sensor has already heard that beacon from orientation d, the message is ignored. When the sensor receives one orientation for the first time, it replies to the drone with an ack message. The drone infers the distance from the round-trip time of the message, and sends the measure to the sensor. The sensor stores the measure in the local register R. The trilateration is performed by the sensor when it has received three measurements.</p>
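          <p>The sensor-side bookkeeping just described can be sketched as follows (our own illustration; the class and method names are ours, not from the paper):</p>

```python
import math

class SensorRegister:
    """Keep only the first measurement heard per orientation d = 0..5,
    where the drone beacons along orientations phi = d * pi / 3."""

    def __init__(self):
        self.R = [None] * 6  # one slot per orientation

    def on_beacon(self, phi, measure):
        """Store `measure` if orientation phi is heard for the first time.
        Returns True when the sensor should reply with an ack."""
        d = round(phi / (math.pi / 3)) % 6
        if self.R[d] is None:
            self.R[d] = measure
            return True
        return False  # duplicate orientation: message ignored

    def ready(self):
        # Trilateration needs three measurements.
        return sum(m is not None for m in self.R) >= 3
```

          <p>A repeated beacon from the same orientation is silently dropped, which is why each waypoint contributes at most one measurement per orientation.</p>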
          <p>
            As explained previously, the localization error depends on the position of the three waypoints from which the point is ranged. From Eq. (6), the error is minimum when αmin = π/3 for each sensor. In our localization technique, the minimum angle at P is π/3 if and only if the sensor P resides at the intersection of the orientations of the three sectors centered at the waypoints w1(P), w2(P), and w3(P). Although it is not possible to achieve the minimum angle π/3 for every point P, we can prove that the algorithm Dir provides the bound reported in Eq. (7) [
            <xref ref-type="bibr" rid="ref4">3</xref>
            ].
          </p>
          <p>
            Considering Eq. (7), we are now in a position to rewrite Eq. (6) as a function that depends on h, dmin, and Iw. Inverting this expression, we can ensure the precision required by the final user. In conclusion, according to [
            <xref ref-type="bibr" rid="ref4">3</xref>
            ], the localization precision can be expressed as in Eq. (8).
          </p>
        </sec>
      </sec>
      <sec id="sec-2-4">
        <title>The OMNI algorithm</title>
        <p>Now we present Omni, which does not require specialized hardware such as directional antennas. Instead, Omni relies on a standard omnidirectional antenna.</p>
        <p>The omnidirectional antenna sends beacons with the drone's position isotropically. Since the sensors cannot distinguish from which direction they receive the beacon, it is difficult to select the waypoints that guarantee the desired geometry. For this reason, Omni adopts a two-phase approach. In the first phase, the sensors obtain a rough estimation of their position. In the second phase, they use this estimation to pick among all the available waypoints the best ones, called precise waypoints, which equally divide the turn angle around the sensor itself, to perform a trilateration and obtain a precise localization.</p>
        <p>The static path O is depicted in Fig. 7. The drone's mission starts at F = (−Fx − H, Fy), where Fx = dmin/2, Fy = √3(dmax − Iw)/2, and finishes at E = (xE, yE), where xE = Qx + Fx and yE can be either 0 or Qy. The inter-scan distance is set to H = (dmax − dmin − 2Iw)/2. O consists of two vertical scans outside Q that are used to measure the sensors in the leftmost stripe of Q close to the vertical border, while the last vertical scan is at distance no larger than (dmax − 2Iw)/2 from the rightmost border.</p>
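        <p>The path parameters of this section can be computed directly. The sketch below uses our own naming and reflects our reading of the (partly garbled) formulas, so it is an interpretation rather than the paper's exact definitions:</p>

```python
import math

def omni_path_params(dmax, dmin, Iw):
    """Omni's static-path parameters: inter-scan distance H, horizontal
    margin Fx, and vertical overshoot Fy of each scan beyond Q."""
    H = (dmax - dmin - 2 * Iw) / 2
    Fx = dmin / 2
    Fy = math.sqrt(3) * (dmax - Iw) / 2
    return H, Fx, Fy
```

        <p>Note how Fy grows with dmax, which is why each vertical scan in Omni extends well beyond the deployment area (total scan length 2Fy + Qy, as discussed in the experimental section).</p>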
        <p>The behavior of the drone under Omni is the same as during algorithm Dir. However, the behavior of the sensors is quite different. According to Omni, until the first trilateration, the sensor stores all the measurements that it receives which are above the dmin ground distance. As soon as the sensor has collected three distance measurements which belong to two different scans and are greater than the minimum ground distance, the sensor performs the first trilateration to compute a rough estimation P̂ = (xP̂, yP̂) of its position.</p>
        <sec id="sec-2-4-1">
          <title>O of Omni (the segment EF is not sketched)</title>
          <p>Omni logically tessellates the area by diamonds (see Fig. 8). Then, from P̂, the sensor locates the closest vertical scan on its left. Next, conceptually, the sensor derives the six rays that equally divide the turn angle around its current position. The intersections of such rays with the scan on the left of P̂ are then the most desirable positions with regard to the waypoint geometry (i.e., the αmin constraint) from which to take the final measurements. However, such points may not coincide with waypoints, because the drone takes measurements at regular distances, and may also not satisfy the dmin constraint. So, to find the three precise waypoints obeying all the constraints, the sensor locally computes the three precise waypoints w1(P̂), w2(P̂), w3(P̂) on the scans on its left. Having selected w1(P̂), w2(P̂), w3(P̂), the sensor continues to listen to the drone until it has collected all three precise waypoints, discarding the measurements which are useless. Finally, the sensor computes the second, precise trilateration using the precise waypoints, and the localization process is finished.</p>
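          <p>The geometric step above, intersecting the six equally spaced rays from P̂ with the vertical scan and snapping to the actual waypoint grid, can be sketched as follows (our own illustration under the stated assumptions; function names are ours):</p>

```python
import math

def candidate_intersections(p_hat, xs):
    """Intersections with the vertical scan x = xs of the rays leaving
    p_hat at angles k * pi / 3 (only rays actually pointing at the scan)."""
    x0, y0 = p_hat
    points = []
    for k in range(6):
        angle = k * math.pi / 3
        dx = math.cos(angle)
        if (xs - x0) * dx <= 0:  # ray points away from (or along) the scan
            continue
        t = (xs - x0) / dx       # parameter where the ray meets x = xs
        points.append((xs, y0 + t * math.sin(angle)))
    return points

def snap_to_waypoints(points, y_first, Iw):
    """Snap ideal points to the actual waypoints, spaced Iw along the scan."""
    return [(x, y_first + round((y - y_first) / Iw) * Iw) for x, y in points]
```

          <p>For a sensor to the right of the scan, exactly three of the six rays hit it, matching the three precise waypoints the sensor waits for; snapping accounts for the drone measuring only at regular waypoint positions.</p>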
          <p>
            The localization error strictly depends on the position of the three precise waypoints w1, w2, and w3 from which the point P is ranged. It has been proved in [
            <xref ref-type="bibr" rid="ref3">2</xref>
            ] that Omni provides the bound reported in Eq. (9).
          </p>
          <p>
            Considering Eq. (9), and from Eq. (6) rewritten in terms of h, dmin, and Iw, the localization precision in Omni can be expressed as in Eq. (10) [
            <xref ref-type="bibr" rid="ref3">2</xref>
            ].
          </p>
        </sec>
      </sec>
    </sec>
    <sec id="sec-3">
      <title>Experimental Evaluation</title>
      <p>We have implemented the Short, Dir, and Omni localization algorithms in the MATLAB programming language in order to compare their performance under a variety of different conditions.</p>
      <p>Let us first observe that Omni and Dir belong to the class of precise algorithms, because they aim to guarantee the user-defined localization precision. Instead, Short belongs to the class of best-effort algorithms, which cannot ensure any a-priori guaranteed localization precision L, but return a very short static path. Consequently, we cannot use L as the main metric to compare the performance of the three algorithms. Namely, Short does not respect dmin and αmin, and we cannot bound its localization error a-priori.</p>
      <p>Considering the two different classes to which the three algorithms belong, in order to compare them, we can only evaluate the experimental position error, expressed as the distance PP′ between the original position P and the estimated position P′. For each drone mission, we record the worst experimental error, and then we compute the error bound as the average value of the worst bound on the position error over all the missions. We also compute the 95%-confidence interval of the data reported in the experiments. Moreover, we evaluate the ability of Dir and Omni to guarantee the a-priori localization precision L given in input to the algorithm. We compare and evaluate the three algorithms under the following metrics:</p>
      <sec id="sec-3-1">
        <title>Metrics</title>
        <p>1. we evaluate the path length, 2. we compare localization performance in terms of precision and error, and 3. we analyze the difference of using multilateration instead of trilateration.</p>
        <p>The goal of the experiment shown in Fig. 9 is to compare the lengths of the static paths S, D, and O
of the Short, Dir, and Omni algorithms, having fixed Iw = 5 m and L = 0.30 m for Dir and Omni. Dir and
Omni have roughly the same number of vertical scans, but each vertical scan in Omni goes well beyond
the deployment area. Precisely, in Omni each vertical scan has length 2Fy + Qy, where Fy = (dmax − Iw)√3/2,
whereas Fy = 0 for Dir. Fig. 9 shows that D is shorter than O. The increase in the length of the static
path of Omni is the price paid for not using specialized hardware. Note that the path length in Omni is not
monotonically decreasing because we use a fixed inter-scan value without resizing it as we do in Dir. Obviously,
Short is the shortest static path regardless of the length of the communication radius r, but one has to remember
that Short requires solving an instance of TSP off-line before the drone can start its mission. Dir is twice as
long as Short. Short is the shortest path, but it respects neither the dmin nor the αmin constraint, and thus its
precision is not guaranteed (see Fig. 10 for the analysis of the precision of Short).</p>
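To make the scan-length comparison concrete, the formula above can be evaluated numerically. This is a minimal sketch: the deployment-area height Qy and the parameter values below are illustrative assumptions, not the settings of the experiments.

```python
import math

def scan_length(q_y, f_y):
    # A vertical scan covers the deployment-area height q_y plus an
    # extension f_y beyond each of its two ends: total 2*f_y + q_y.
    return 2 * f_y + q_y

# Hypothetical parameters (meters): max ground distance, inter-scan
# distance, and deployment-area height.
d_max, i_w, q_y = 150.0, 5.0, 1000.0

f_y_omni = (d_max - i_w) * math.sqrt(3) / 2  # Omni: scans extend past the area
f_y_dir = 0.0                                # Dir: scans stay inside the area

print("Omni scan:", scan_length(q_y, f_y_omni))
print("Dir scan: ", scan_length(q_y, f_y_dir))
```

The extra 2Fy per scan is what makes O longer than D even when both paths contain the same number of scans.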
        <p>[Fig. 9: length of the static paths of Short, Dir, and Omni.]</p>
        <p>Note that dmax, and thus the communication range r, is the parameter that most influences the length of
the path in Dir and Omni. For this reason, we omit the value of dmin adopted in Fig. 9 for Dir
and Omni.</p>
        <p>In the following, we compare in Fig. 10 the localization performance in terms of user-defined precision and
error.</p>
        <p>In the first experiment, shown in Fig. 10(a), we compare the user-defined precision L for the precise algorithms
Dir and Omni. Specifically, we want to experimentally verify that our bound holds, and thus that εL &lt; L. In Dir
and Omni we set Iw = {5, 10} m and L = 0.30 m. Given the localization precision L, we select dmin for Dir and
Omni by applying Eq. (8) and Eq. (10), respectively, and compare it with the results from the experiments. It is
worth noting that the error bound in Fig. 10(a) is always smaller than the user-defined localization precision
L = 0.30 m. The Omni algorithm is slightly more precise than Dir because Omni uses values of dmin larger
than those of Dir, the angle αmin is larger in Omni than in Dir, and Omni uses two trilaterations.</p>
        <p>In the second experiment, reported in Fig. 10(b), we compare the localization error among all the three
algorithms, considering also the path length. Dir and Omni run using the values of dmin and Iw that guarantee
a maximum error of L = 1 m. For Short, instead, we just use, for each sensor, the three waypoints that form
the vertices of the triangle that best circumscribes the sensor itself. Although in Short the three waypoints
surround the point to be localized, neither the minimum ground distance dmin nor the minimum angle αmin
constraint is respected in a predefined way. Thus, the error for Short increases in Fig. 10(b). In fact, even if the
sensor is inside the triangle formed by the three waypoints, it is sufficient that it is close to one of the vertices to
break the dmin constraint. Hence, it is easy to see from Eq. (8) that, when the ground distance tends to 0, the
error blows up. Moreover, the reported error and the confidence interval of Short are higher when n is large than
when n is small, because the probability of encountering sensors in the worst position increases. Conversely, Dir
and Omni have a good level of precision since all four constraints (dmax, dmin, αmin, and non-linearity) are
satisfied and, as a consequence of the setting of the parameters Iw and dmin, they guarantee an a-priori
localization error of L = 1 m. It is important to note that the path length of Short is 3.2 km, which is less than
half the path length of Dir (6 km) and Omni (8.2 km).</p>
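The blow-up of the error for small ground distances can be seen with a back-of-the-envelope sketch: a fixed slant-range error inflates the recovered ground distance roughly by the factor r/d, which diverges as d tends to 0. The altitude matches the h = 30 m used in the experiments, while the 10 cm ranging error is an illustrative assumption, not the bound of Eq. (8).

```python
import math

h = 30.0      # drone altitude (m), as in the experiments
sigma = 0.10  # assumed slant-range measurement error (m)

def ground_error(d, eps=sigma):
    # Slant range from the drone to a sensor at ground distance d,
    # perturbed by a ranging error eps.
    r = math.sqrt(d * d + h * h) + eps
    # Ground distance recovered from the noisy slant range.
    d_est = math.sqrt(max(r * r - h * h, 0.0))
    return abs(d_est - d)

for d in (50.0, 10.0, 1.0):
    print(f"ground distance {d:5.1f} m -> error {ground_error(d):.2f} m")
```

The same 10 cm ranging error costs about 12 cm of ground error at d = 50 m but well over a meter at d = 1 m, which is why a waypoint almost on top of a sensor vertex ruins the estimate.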
        <p>In the third experiment, depicted in Fig. 10(c), we have relaxed the dmin constraint, accepting any
measurement, i.e., dmin = 0. With this experiment we want to evaluate whether, by paying a higher localization
error, we can considerably shorten the path of the precise algorithms. First, note that the dmin constraint cannot
be relaxed for Dir, since αmin = arctan((Iw/2)/dmin): hence, in Dir, dmin must be positive. In Omni we can set
dmin = 0. The error increases, but it remains smaller than that of Short. Nonetheless, the path length of Omni
is still double that of Short. In conclusion, Omni and Dir have been mainly designed to be precise, and reducing
the precision does not reduce the path length. So, precision cannot be traded for path length in Omni and Dir.</p>
        <p>Finally, we compare our algorithms when multilateration is used instead of trilateration. A crucial finding of
our experiments is that multilateration helps when the algorithm is best-effort, but not when the algorithm is
precise.</p>
        <p>[Fig. 11: max PP′ vs. number of nodes (100–600), with h = 30 and r = 150; panels: (a) Short vs Short6, (b) Dir vs Dir6, (c) Omni vs Omni6.]</p>
        <p>In the experiment shown in Fig. 11 we analyze whether an optimal geometry of three waypoints is
better than six generic waypoints. Specifically, Short6, Dir6, and Omni6 are, respectively, the modified
versions of Short, Dir, and Omni, which localize each sensor exploiting the first (in order of time) six
non-collinear waypoints that the sensor hears from the drone. Fig. 11(b) and Fig. 11(c) show that a good geometry is
better than a blind multilateration: it is not sufficient to increase the number of waypoints to obtain a smaller
error. Namely, since Dir6 and Omni6 choose the first three waypoints from a vertical scan and the other three
from the subsequent one, αmin can be very small. Although Omni6 (see Fig. 11(c)) is less precise than Omni, the
localization degrades less than between Dir and Dir6 (see Fig. 11(b)). We believe this is because Omni and
Omni6, thanks to the regular omnidirectional antenna, can take measurements at distances greater than those
taken in Dir6 with directional antennas. On the other hand, Fig. 11(a) highlights that using the first six
waypoints in Short6 improves the precision. Although no geometry has been imposed in Short6, the experimental
error is smaller than that of the original algorithm Short, which aims to build a weak geometry formed by
isosceles triangles. Short conforms to the common belief: more waypoints help to improve the localization.
Therefore, multilateration helps precision for the best-effort algorithm Short.
</p>
        <p>[Fig. 12: max PP′ vs. number of nodes, with h = 30 and r = 150; panels: Short vs Short6OPT, Dir vs Dir6OPT, Omni vs Omni6OPT.]</p>
        <p>In the experiment shown in Fig. 12 we study whether multilateration improves the localization when the
geometry constraints are respected also by the six measurements. Specifically, let Short6OPT be the modified
version of Short which considers the three vertices of the triangle that best circumscribes the sensor, plus the
three midpoints of the triangle's sides. Dir6OPT is the modified version of Dir which localizes each sensor
exploiting all six different possible orientations that the sensor can hear. Moreover, Omni6OPT is the modified
version of Omni which uses, for each sensor, the optimal triple of waypoints plus another triple, a clone of the
first one. Fig. 12(b) and Fig. 12(c) show that the geometry of the waypoints, for any algorithm belonging to the
precise class, is essential to localize the sensors with a small error. In practice, for Dir and Omni, a non-blind
multilateration performs almost the same as the simple trilateration. Since the precision is almost the same for
trilateration and multilateration when both satisfy the geometry constraints, there is no reason to pay the extra
computational cost. Even in this experiment, in Fig. 12(a), Short behaves differently from Dir and Omni: with
Short6OPT the precision deteriorates. This shows that it is difficult to define a good geometry for an algorithm
without rigid constraints on precision.</p>
        <p>In summary, we believe that it is difficult to achieve a small localization error without a careful design of the
localization algorithm. On the other hand, when we have a well-defined geometry that guarantees precision,
trilateration is good enough.</p>
      </sec>
    </sec>
    <sec id="sec-4">
      <title>Conclusion and Future Work</title>
      <p>In this paper, we surveyed three localization algorithms that replace multiple fixed anchor sensors with a flying
drone. We investigated the impact of multilateration for the three different algorithms. For Dir and Omni,
which are precise algorithms that measure the sensors obeying a rigid geometry, a multilateration that just
increases the number of waypoints does not improve the precision of the localization, and a multilateration
that increases the number of waypoints while respecting the geometry improves the localization only in a
negligible way. So we can conclude that, when the waypoints for each sensor follow a well-pondered geometry,
multilateration is not useful. In contrast, the precision of the Short algorithm, which follows a best-effort policy
in selecting the waypoints for each sensor, benefits from taking more measurements. For the best-effort
algorithms, the design of sophisticated multilateration techniques that preserve the geometry requires further
investigation.</p>
      <p>In our future work, in addition to looking for shorter drone paths that preserve precision, we intend to
analyze the energy consumption of the localization algorithms on both the drone side and the sensor side, also
considering the number of waypoints. We also wish to test our localization algorithms in a test-bed with a real
drone.</p>
    </sec>
  </body>
  <back>
  </back>
</article>