Development of Relative Positioning Algorithms for Agricultural Drone Swarms in GPS-Challenged Environments
https://ceur-ws.org/Vol-3909/Paper_25.pdf
                                Development of relative positioning algorithms for
                                agricultural drone swarms in GPS-challenged
                                environments
                                Maksym Ogurtsov1,*, Vyacheslav Korolyov1, Oleksandr Khodzinsky1 and Oleh
                                Rybalchenko1
                                1
                                 V.M. Glushkov Institute of Cybernetics of the National Academy of Sciences of Ukraine, Ac. Hlushkova str. 40, building 1,
                                apt. 801, 03187 Kyiv, Ukraine



                                                Abstract
                                                In recent years, the use of agricultural drones has increased significantly for tasks such as field irrigation, pest
                                                detection, and crop health monitoring. To improve operational efficiency and speed across extensive
                                                agricultural regions, the deployment of drone swarms is becoming increasingly prevalent. However, in
                                                remote regions where Global Positioning System (GPS) signals may be inaccurate or unavailable and real-
                                                time kinematic services are expensive, the need arises for effective methods to determine the precise relative
                                                positions of drones within a swarm.
                                                The purpose of this work is to develop relative positioning algorithms for agricultural drone swarms in
                                                GPS-challenged environments. These algorithms should enable the determination of the relative positions
                                                of drones and the overall configuration of the swarm even in the complete absence of GPS signals.
                                                As a result of this study, an algorithm for determining the relative position of swarm elements in the
                                                absence of global positioning signals and the corresponding mathematical apparatus was developed. It takes
                                                into account possible inaccuracy of distance measurement.
                                                Various configurations of possible relative positions of swarm elements, their impact on determining the
                                                relative position, and possible solutions to problems in case of an unsuccessful initial arrangement were
                                                analyzed. The obtained results were successfully confirmed by practical calculations.

                                                Keywords
                                                UAV, drone, swarm, swarm control, local positioning, GPS, geometry



                                1. Introduction
                                In recent years, the use of agricultural drones has increased significantly for tasks such as field irrigation,
                                pest detection, and crop health monitoring [1]. To enhance the efficiency and speed of operations
                                over large agricultural areas, drone swarms are increasingly being used [2]. However, in remote
                                regions where Global Positioning System (GPS) signals may be inaccurate or unavailable, the need
                                arises for effective methods to determine the precise relative positions of drones within a swarm to
                                prevent collisions and ensure high-precision task execution [3].
                                    Solutions in use today typically rely on expensive networks of fixed stations
                                (anchors) with verified positions [4].




                                Information Technology and Implementation (IT&I-2024), November 20-21, 2024, Kyiv, Ukraine
                                 * Corresponding author.
                                 † These authors contributed equally.
                                   maksymogurtsov@gmail.com (M. Ogurtsov); korolev.academ@gmail.com (V. Korolyov); okhodz@gmail.com (O.
                                Khodzinsky); rv.oleg.ua@gmail.com (O. Rybalchenko)
                                    0000-0002-6167-5111 (M. Ogurtsov); 0000-0003-1143-5846 (V. Korolyov); 0000-0003-4574-3628 (O. Khodzinsky); 0000-
                                0002-5716-030X (O. Rybalchenko)
                                           Β© 2024 Copyright for this paper by its authors. Use permitted under Creative Commons License Attribution 4.0 International (CC BY 4.0).



CEUR Workshop Proceedings (ceur-ws.org), ISSN 1613-0073
2. Goal

The purpose of this work is to develop relative positioning algorithms for agricultural drone swarms
in GPS-challenged environments. These algorithms should enable the determination of the relative
positions of drones and the overall configuration of the swarm even in the complete absence of GPS
signals, or with GPS signals available only to the operator. The algorithms should also work when
the distances between drones are large.

3. Analysis of previous studies
This problem has been a persistent focus of scientific research, as evidenced by the large body of
published research on the topic. We will now examine some of the most significant and pertinent
studies.
   Work [5] provides a taxonomy of drone positioning systems. The taxonomy categorizes drone
positioning systems into two major methods: vision-based and non-vision-based. The taxonomy
further divides each method into several sub-methods based on the equipment and calculation
method. The taxonomy also provides the advantages and disadvantages of each method.
   In [6], the effect of multiple sensors on 3D indoor position accuracy is investigated using
the flexible Online Asynchronous State Estimation sensor fusion platform. However, this method is
not suitable for agricultural drones, as they may operate over long distances from each other and in
adverse weather conditions.
   Work [4] explored the approach of replacing all the fixed anchors with a single drone that flies
through a sequence of waypoints. At each waypoint, the drone acts as an anchor and securely
determines the positions. However, this approach has proven slow and ineffective.
   Work [7] introduced several localization techniques that are independent of GPS, each with its
advantages and disadvantages. Unfortunately, these disadvantages also make such techniques
inapplicable for our case.
   Work [8] proposes using a gray-scale low-resolution camera and a novel vision-based ultra-low-power
System-on-Chip together with a fully convolutional neural network; however, for our task it faces the
same disadvantages as the approach offered in [6].
   In [9], topology perception and relative positioning algorithms were offered. However, agricultural
drones mostly work at low altitudes over fields, which are large, similar-looking spaces, making this
method inapplicable.

enough results to apply to the current task.


positions of drones and the overall configuration of the swarm even in the complete absence of GPS
signals.

4. Main part
We begin the analysis of the proposed algorithm by describing the fundamental approach used for
determining the direction and distance from each drone to every other drone. To determine the
distance between a pair of drones, it is proposed to use either RSSI signal measurements [11] or the
built-in distance measurement tools that are available on many models [12]. However, in any case,
it is necessary to consider the presence of errors in signal measurement.




4.1. Application of radio modules for mutual positioning of drones
The primary purpose of classical radio modules is to complement Internet of Things networks,
enhancing the capabilities of narrow artificial intelligence in the sensor networks of smart homes,
offices, and household devices equipped with radio beacons, such as electronic key fobs, smart
vacuum cleaners, and remotely configured refrigerators and climate control equipment. Such radio
modules were developed to improve positioning accuracy in indoor environments, transmit sensor
data over distances from a few centimeters to 300-600 meters, and interact with the radio modules
in smartphones and tablets. This approach effectively addresses challenges such as multipath
reflections from walls, interference, and dead zones. By measuring the transmission time of a packet,
a radio module can achieve positioning accuracy within a margin of one centimeter.
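    The packet-transmission-time principle above can be sketched numerically. This is a minimal illustration, not a vendor API: the function name, the symmetric two-way exchange, and the calibrated responder delay are all assumptions for the sketch.

```python
# Sketch of two-way (round-trip) time-of-flight ranging.
C = 299_792_458.0  # speed of light in vacuum, m/s

def tof_distance(t_round_s: float, t_reply_s: float = 0.0) -> float:
    """Distance from a two-way ranging exchange.

    t_round_s: measured time between sending a packet and receiving the reply.
    t_reply_s: known processing delay inside the responder (assumed calibrated).
    The signal covers the distance twice, hence the division by 2.
    """
    return C * (t_round_s - t_reply_s) / 2.0

# A 1 ns timing error maps to roughly 15 cm of distance error, so
# centimeter-level accuracy requires sub-nanosecond timestamping.
print(tof_distance(2.0 * 100.0 / C))  # a 100 m target
```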
    Over the past eight years, numerous scientific studies and practical implementations have
explored the use of such technologies for positioning small mobile robotic systems, such as wheeled
robots and quadcopters. These systems can be organized into swarms, utilizing network-based
technologies for coordinated operation [13], [14]. One of the tasks to be solved for controlling a
swarm of drones [15] is to obtain an estimate of the coordinates of the swarm members in local and
global coordinate systems [16].
    The evaluation of local coordinates for a swarm element relative to others is achieved through
direction-finding techniques, which rely on measuring the time of flight of a data packet transmitted
between the swarm element and other network members (drones) and back. Radio modules equipped
with omnidirectional antennas, characterized by a toroidal radiation pattern, are employed to
calculate the distance between two drones within the swarm. Subsequently, a mathematical
algorithm, potentially coupled with a calibration maneuver of the drone, is used to resolve any spatial
uncertainty in positioning.
    Adding a directional antenna to ready-made drone swarm radio modules is difficult, as it would
require embedding into the highly integrated system-on-chip of the radio module. Some
manufacturers provide the ability to connect a directional antenna, but the size and weight
characteristics of such modules increase by 2-4 times.

4.2. Algorithm for calculating the distance from a drone to the position zone of
        another drone and the viewing angle from the drone to this zone
To accurately determine the directional angle of each drone relative to another, measuring only the
distance between them is insufficient. Consequently, the deployment of two drones is required to
measure the distance to a third drone, thereby facilitating the calculation of their respective angular
positions. This concept will be elaborated upon in the following sections.
    Figure 1 schematically depicts two drones, A and B, and a drone T (target), whose distance and
angle must be determined.
    Two circles are drawn centered at point A and two circles are centered at point B. These circles
represent the distance from drones A and B to T plus or minus the measurement error. Due to the
measurement error, T is not located at a specific point but rather in a certain zone formed by the
intersection of these circles.
    The distance between drones A and B is determined using the same methods as the distance to
the drone T. Their relative positions can be refined using data acquired from cameras installed on
these drones.
    We are interested in the relationships between the various distances and angles shown in Figure
1. Let us illustrate these distances and angles in more detail in Figure 2.




Figure 1: Schematic depiction of the calculated distances from two drones to a third




Figure 2: Distance and angle notation
    We introduce a coordinate system and the following notation:
    β€’ x-axis: horizontal, aligned with the baseline AB;
    β€’ y-axis: vertical, passing through the midpoint of AB toward T;
    β€’ A, B: the drones used for determining the position of drone T;
    β€’ r: the measured distance from drones A and B to T;
    β€’ Ξ΅: the measurement error of the distance r from a drone to T;
    β€’ b: half-length of the baseline, i.e. the distance from the center of the segment AB to either
          end of this segment;
    β€’ C: intersection point of the circle centered at point B with radius r βˆ’ Ξ΅ and the circle centered
          at point A with radius r + Ξ΅;
    β€’ G: point of intersection of the two circles each having a radius of r βˆ’ Ξ΅;
    β€’ F: point of intersection of the two circles each having a radius of r + Ξ΅;
    β€’ E: orthogonal projection of point C onto the x-axis;
    β€’ Ξ³: angle of view from the midpoint of the segment AB at the locus of possible positions of T;
    β€’ h1: distance from the center of the segment AB to point G;
    β€’ h2: distance from the center of the segment AB to point F.
4.2.1. Relationship between r, b, Ξ΅ and Ξ³
Firstly, we will show that the relationship between r, b, Ξ΅ and Ξ³ can be expressed by the following
formulas:
\[ \mathrm{tg}\frac{\gamma}{2} = \frac{r\varepsilon}{\sqrt{(b^2 - \varepsilon^2)(r^2 - b^2)}} \tag{1} \]
\[ r^2 = b^2\left[1 + \frac{\varepsilon^2}{(b^2 - \varepsilon^2)\,\mathrm{tg}^2\frac{\gamma}{2} - \varepsilon^2}\right] \tag{2} \]
   For relatively large values of r, the term \(b^2\) under the radical can be neglected relative to \(r^2\),
and formula (1) for the angle simplifies and becomes independent of r:
\[ \mathrm{tg}\frac{\gamma}{2} = \frac{r\varepsilon}{\sqrt{(b^2 - \varepsilon^2)(r^2 - b^2)}} \approx \frac{r\varepsilon}{\sqrt{(b^2 - \varepsilon^2)\,r^2}} = \frac{\varepsilon}{\sqrt{b^2 - \varepsilon^2}} \tag{3} \]
\[ \mathrm{tg}\frac{\gamma}{2} \approx \frac{\varepsilon}{\sqrt{b^2 - \varepsilon^2}} \tag{4} \]
   Formula (3) explains why calculations yield similar values of the angle Ξ³ for various large r.
   Formula (4) also explains the similar values of b for large r and the independence of b from r:
\[ \mathrm{tg}\frac{\gamma}{2} \approx \frac{\varepsilon}{\sqrt{b^2 - \varepsilon^2}} \;\Rightarrow\; b^2 \approx \frac{\varepsilon^2}{\mathrm{tg}^2\frac{\gamma}{2}} + \varepsilon^2 \tag{5} \]
\[ b^2 \approx \frac{\varepsilon^2}{\mathrm{tg}^2\frac{\gamma}{2}} + \varepsilon^2 \tag{6} \]
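Formulas (1) and (4) are easy to check numerically. The following sketch uses illustrative values b = 10 m and Ξ΅ = 0.5 m (not taken from the paper's experiments) to show that the exact half-angle tangent converges to the r-independent approximation as r grows:

```python
import math

def tg_half_gamma(r, b, eps):
    """Exact half-angle tangent, formula (1)."""
    return r * eps / math.sqrt((b**2 - eps**2) * (r**2 - b**2))

def tg_half_gamma_far(b, eps):
    """Large-r approximation, formula (4): independent of r."""
    return eps / math.sqrt(b**2 - eps**2)

# For growing r the exact value approaches the approximation.
for r in (50.0, 100.0, 1000.0):
    print(r, tg_half_gamma(r, 10.0, 0.5), tg_half_gamma_far(10.0, 0.5))
```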
4.2.2. Formulas for h1 and h2
\[ h_1 = G_y = \sqrt{(r - \varepsilon)^2 - b^2} \tag{7} \]
\[ h_2 = F_y = \sqrt{(r + \varepsilon)^2 - b^2} \tag{8} \]
   When r in (7)-(8) is big enough: \( h_2 - h_1 \approx 2\varepsilon \).
   Next, we will demonstrate how the presented relationships between the parameters r, b, Ξ΅ and Ξ³
were obtained, and we will derive the formulas for h1 and h2.
4.2.3. Derivation of the relationship for r, b, Ξ΅ and Ξ³
The x and y coordinates of point C.
   Since C is the intersection point of the circle centered at B with radius r βˆ’ Ξ΅ and the circle centered
at A with radius r + Ξ΅, its coordinates can be found by solving the corresponding system of equations:
\[ \begin{cases} B:\;(x-b)^2 + y^2 = (r-\varepsilon)^2 \\ A:\;(x+b)^2 + y^2 = (r+\varepsilon)^2 \end{cases} \;\Rightarrow\; \begin{cases} x^2 - 2xb + b^2 + y^2 = r^2 - 2r\varepsilon + \varepsilon^2 \\ x^2 + 2xb + b^2 + y^2 = r^2 + 2r\varepsilon + \varepsilon^2 \end{cases} \tag{9} \]
   Subtracting the first equation from the second in (9):
\[ 4xb = 4r\varepsilon \;\Rightarrow\; x = \frac{r\varepsilon}{b} \tag{10} \]
   Substituting x into equation B: \((x-b)^2 + y^2 = (r-\varepsilon)^2\), we can solve (10) for y:
\[ y^2 = (r-\varepsilon)^2 - (x-b)^2 \;\Rightarrow\; y = \sqrt{(r-\varepsilon)^2 - (x-b)^2} = \sqrt{(r-\varepsilon)^2 - \left(\frac{r\varepsilon}{b} - b\right)^2} \tag{11} \]
   We chose the positive value of the root for y in (11) since point C is located above the horizontal
axis.
   Therefore,
\[ C_x = \frac{r\varepsilon}{b}, \qquad C_y = \sqrt{(r-\varepsilon)^2 - \left(\frac{r\varepsilon}{b} - b\right)^2} \tag{12} \]
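As a quick sanity check of (12), one can verify that the computed point C indeed lies on both circles. The values r = 30, b = 10, Ξ΅ = 0.5 below are illustrative only:

```python
import math

def c_point(r, b, eps):
    """Coordinates of point C from formula (12), with A = (-b, 0), B = (b, 0)."""
    cx = r * eps / b
    cy = math.sqrt((r - eps)**2 - (cx - b)**2)
    return cx, cy

cx, cy = c_point(30.0, 10.0, 0.5)
# C must lie at distance r - eps from B and at distance r + eps from A.
print(math.hypot(cx - 10.0, cy), math.hypot(cx + 10.0, cy))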
   The viewing angle from the AB center point to the target area T:
\[ \mathrm{tg}\frac{\gamma}{2} = \frac{C_x}{C_y} = \frac{r\varepsilon}{b\sqrt{(r-\varepsilon)^2 - \left(\frac{r\varepsilon}{b} - b\right)^2}} = \tag{13} \]
   We will move b in (13) under the radical sign:
\[ = \frac{r\varepsilon}{\sqrt{b^2(r-\varepsilon)^2 - (r\varepsilon - b^2)^2}} = \tag{14} \]
   We will expand the brackets in (14) and collect like terms:
\[ = \frac{r\varepsilon}{\sqrt{(r^2 b^2 - 2r\varepsilon b^2 + \varepsilon^2 b^2) - (r^2\varepsilon^2 - 2r\varepsilon b^2 + b^4)}} = \frac{r\varepsilon}{\sqrt{r^2 b^2 + \varepsilon^2 b^2 - r^2\varepsilon^2 - b^4}} = \frac{r\varepsilon}{\sqrt{b^2(r^2 - b^2) - \varepsilon^2(r^2 - b^2)}} = \frac{r\varepsilon}{\sqrt{(b^2 - \varepsilon^2)(r^2 - b^2)}} \tag{15} \]
   Therefore,
\[ \mathrm{tg}\frac{\gamma}{2} = \frac{r\varepsilon}{\sqrt{(b^2 - \varepsilon^2)(r^2 - b^2)}} \tag{16} \]
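The closed form (16) can be cross-checked against the direct ratio Cx / Cy from (12); with exact arithmetic the two agree identically. The values below are illustrative:

```python
import math

def tg_half_gamma(r, b, eps):
    """Closed form (16)."""
    return r * eps / math.sqrt((b**2 - eps**2) * (r**2 - b**2))

def tg_half_gamma_via_c(r, b, eps):
    """The ratio Cx / Cy using the coordinates from (12)."""
    cx = r * eps / b
    cy = math.sqrt((r - eps)**2 - (cx - b)**2)
    return cx / cy

print(tg_half_gamma(30.0, 10.0, 0.5), tg_half_gamma_via_c(30.0, 10.0, 0.5))
```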
4.2.4. Measured distance between A and T
Let us solve equation (16) in terms of r:
\[ \mathrm{tg}\frac{\gamma}{2} = \frac{r\varepsilon}{\sqrt{(b^2-\varepsilon^2)(r^2-b^2)}} \;\Rightarrow\; (b^2-\varepsilon^2)(r^2-b^2)\,\mathrm{tg}^2\frac{\gamma}{2} = r^2\varepsilon^2 \;\Rightarrow\; \tag{17} \]
   Let us distribute the term outside the second parentheses in (17):
\[ \Rightarrow\; (b^2-\varepsilon^2)\,r^2\,\mathrm{tg}^2\frac{\gamma}{2} - (b^2-\varepsilon^2)\,b^2\,\mathrm{tg}^2\frac{\gamma}{2} = r^2\varepsilon^2 \;\Rightarrow\; \tag{18} \]
   Let us factor out \(r^2\) in (18):
\[ \Rightarrow\; r^2\left[(b^2-\varepsilon^2)\,\mathrm{tg}^2\frac{\gamma}{2} - \varepsilon^2\right] = (b^2-\varepsilon^2)\,b^2\,\mathrm{tg}^2\frac{\gamma}{2} \;\Rightarrow\; \tag{19} \]
\[ \Rightarrow\; r^2 = \frac{b^2(b^2-\varepsilon^2)\,\mathrm{tg}^2\frac{\gamma}{2}}{(b^2-\varepsilon^2)\,\mathrm{tg}^2\frac{\gamma}{2} - \varepsilon^2} \tag{20} \]
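Formula (20) can be verified by a round trip: compute tg(Ξ³/2) from r via (1), then recover r from (20). The values b = 10, Ξ΅ = 0.5, r = 30 are illustrative:

```python
import math

def r_from_angle(b, eps, tg_half):
    """Formula (20): recover r from b, eps and tg(gamma/2)."""
    t2 = tg_half**2
    r2 = b**2 * (b**2 - eps**2) * t2 / ((b**2 - eps**2) * t2 - eps**2)
    return math.sqrt(r2)

b, eps, r = 10.0, 0.5, 30.0
t = r * eps / math.sqrt((b**2 - eps**2) * (r**2 - b**2))  # formula (1)
print(r_from_angle(b, eps, t))  # recovers r = 30
```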
4.2.5. Measured half base length b (the distance from the center of the segment AB to point A)
Let us solve equation (16) in terms of b, as illustrated by Figure 3. We will square both sides of the
equation:
\[ \mathrm{tg}\frac{\gamma}{2} = \frac{r\varepsilon}{\sqrt{(b^2-\varepsilon^2)(r^2-b^2)}} \;\Rightarrow\; \mathrm{tg}^2\frac{\gamma}{2} = \frac{r^2\varepsilon^2}{(b^2-\varepsilon^2)(r^2-b^2)} \;\Rightarrow\; \tag{21} \]
   Let us expand the parentheses in (21):
\[ \Rightarrow\; \mathrm{tg}^2\frac{\gamma}{2} = \frac{r^2\varepsilon^2}{r^2 b^2 + \varepsilon^2 b^2 - r^2\varepsilon^2 - b^4} \;\Rightarrow\; \tag{22} \]
\[ \Rightarrow\; \mathrm{tg}^2\frac{\gamma}{2}\,\left(r^2 b^2 + \varepsilon^2 b^2 - r^2\varepsilon^2 - b^4\right) = r^2\varepsilon^2 \tag{23} \]
   By collecting in (23) the coefficients of \(b^2\) and \(b^4\), we obtain a quadratic equation in \(b^2\):
\[ (b^2)^2 - (r^2+\varepsilon^2)\,b^2 + r^2\varepsilon^2\left(1 + \frac{1}{\mathrm{tg}^2\frac{\gamma}{2}}\right) = 0 \;\Rightarrow\; \tag{24} \]
\[ b^2 = \frac{r^2+\varepsilon^2 \pm \sqrt{(r^2+\varepsilon^2)^2 - 4r^2\varepsilon^2\left(1 + \frac{1}{\mathrm{tg}^2\frac{\gamma}{2}}\right)}}{2} \;\Rightarrow\; \tag{25} \]
Figure 3: Illustration of the derivation of formulas for h1 and h2
We will express the tangent in (25) as sine over cosine:
\[ b^2 = \frac{r^2+\varepsilon^2 \pm \sqrt{(r^2+\varepsilon^2)^2 - 4r^2\varepsilon^2\left(1 + \frac{\cos^2\frac{\gamma}{2}}{\sin^2\frac{\gamma}{2}}\right)}}{2} \;\Rightarrow\; \tag{26} \]
\[ b^2 = \frac{r^2+\varepsilon^2 \pm \sqrt{(r^2+\varepsilon^2)^2 - 4r^2\varepsilon^2\left(\frac{\sin^2\frac{\gamma}{2} + \cos^2\frac{\gamma}{2}}{\sin^2\frac{\gamma}{2}}\right)}}{2} \;\Rightarrow\; \tag{27} \]
\[ b^2 = \frac{r^2+\varepsilon^2 \pm \sqrt{(r^2+\varepsilon^2)^2 - \frac{4r^2\varepsilon^2}{\sin^2\frac{\gamma}{2}}}}{2} \tag{28} \]
We take only the negative sign in front of the radical, since the half-baseline b is small compared to
the distance r; the positive sign corresponds to the extraneous root \(r^2 + \varepsilon^2 - b^2\) of the
quadratic (24):
\[ b^2 = \frac{r^2+\varepsilon^2 - \sqrt{(r^2+\varepsilon^2)^2 - \frac{4r^2\varepsilon^2}{\sin^2\frac{\gamma}{2}}}}{2} \tag{29} \]
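Formula (29) admits the same round-trip check as (20): from known r, Ξ΅ and the viewing angle Ξ³, the half-baseline b is recovered. The values r = 30, b = 10, Ξ΅ = 0.5 are illustrative; the negative sign before the radical selects the smaller root:

```python
import math

def b_squared(r, eps, gamma):
    """Formula (29): recover b^2 from r, eps and the viewing angle gamma."""
    s2 = math.sin(gamma / 2.0)**2
    disc = (r**2 + eps**2)**2 - 4.0 * r**2 * eps**2 / s2
    return (r**2 + eps**2 - math.sqrt(disc)) / 2.0

r, b, eps = 30.0, 10.0, 0.5
t = r * eps / math.sqrt((b**2 - eps**2) * (r**2 - b**2))  # formula (1)
gamma = 2.0 * math.atan(t)
print(b_squared(r, eps, gamma))  # recovers b^2 = 100
```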
4.2.6. Derivation of formulas for h1 and h2
Given that G is the intersection point of the two circles with radius r βˆ’ Ξ΅ and F is the intersection
point of the two circles with radius r + Ξ΅, from Figure 3 and the Pythagorean theorem:
\[ h_1 = G_y = \sqrt{(r-\varepsilon)^2 - b^2} \tag{30} \]
\[ h_2 = F_y = \sqrt{(r+\varepsilon)^2 - b^2} \tag{31} \]
   When r in (30)-(31) is big:
\[ h_1 = \sqrt{(r-\varepsilon)^2 - b^2} \approx \sqrt{(r-\varepsilon)^2} = r - \varepsilon \tag{32} \]
\[ h_2 = \sqrt{(r+\varepsilon)^2 - b^2} \approx r + \varepsilon \tag{33} \]
   Therefore,
\[ h_2 - h_1 \approx 2\varepsilon \tag{34} \]
   So, we can determine the relative positions of the swarm elements with high precision.
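Relations (30)-(34) are easy to confirm numerically. With illustrative values b = 10 m and Ξ΅ = 0.5 m, the width of the position zone along the y-axis approaches 2Ξ΅ as r grows:

```python
import math

def h_bounds(r, b, eps):
    """Formulas (30)-(31): near (h1) and far (h2) edges of the position zone."""
    h1 = math.sqrt((r - eps)**2 - b**2)
    h2 = math.sqrt((r + eps)**2 - b**2)
    return h1, h2

for r in (50.0, 200.0, 1000.0):
    h1, h2 = h_bounds(r, 10.0, 0.5)
    print(r, h2 - h1)  # approaches 2 * eps = 1.0
```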

4.3. The Problem of Poor Angles Between Drones
The schemes for determining the relative positions of drones in a swarm, considered in the previous
sections, are well-suited for the case of a successful relative arrangement of three drones used to
determine each other's position (Figure 4).




Figure 4: An example of good relative positioning of drones for determining their mutual position
   However, other scenarios are possible, including the worst-case scenario where all three drones
are located on the same straight line (Figure 5). The problem of overly acute or obtuse angles can be
solved by hardware means, by temporarily rotating one or two drones by 90 degrees if the algorithm
described in the previous sections detects that the current angle between the drones produces large
measurement errors.




Figure 5: An example of poor relative positioning of drones for determining their mutual position
   Options for adjusting sensor positions. The conditions for improving direction finding in a
drone can be created by rotating one drone relative to another (Figure 6).




Figure 6: Possible approaches to enhance the accuracy of determining a drone's direction
4.4. Practical Testing
The derived formulas were validated experimentally by creating several test datasets for various
relative positions of three drones and different distances between them. The obtained results
confirmed the operability of the created mathematical model, allowing the determination of the
relative positions of the drones in a swarm (Figure 7).




Figure 7: An example of the practical validation of the derived formulas for determining the relative
position of swarm elements

5. Conclusions
Autonomous swarm elements enable precise, large-scale monitoring and management of agricultural
tasks, optimizing resource use and reducing labor requirements. As a result of this study, the
algorithm for determining the relative position of swarm elements in the absence of GPS signals and
the corresponding mathematical apparatus was developed. It takes into account possible inaccuracy
of distance measurement.
    Various configurations of potential relative positions of swarm elements, their influence on the
accuracy of relative position determination, and solutions to issues arising from suboptimal initial
configurations were analyzed. The results were validated through practical computational models,
confirming the operability of the created mathematical model with a measurement error of 1 percent
or less.
    Future research will focus on the practical application and empirical evaluation of these findings
within operational drone swarms. Special attention will be directed towards enhancing the relative
positioning accuracy of closely spaced drones using visual techniques, particularly by integrating
advanced computer vision algorithms.




Declaration on Generative AI
The authors have not employed any Generative AI tools.

References
[1] M. R. Dileep, A. V. Navaneeth, et al. A study and analysis on various types of agricultural drones
     and its applications, in: Proc. of the 2020 fifth Intern. Conf. on research in computational
     intelligence and comm. networks, IEEE, 2020, pp. 181-185.
     doi:10.1109/ICRCICN50933.2020.9296195
[2] D. Albani, J. Ijsselmuiden, et al. Monitoring and mapping with robot swarms for agricultural
     applications, in: Proceedings of the 2017 14th IEEE International Conference on Advanced Video
     and Signal Based Surveillance, IEEE, 2017, pp. 1-6. doi:10.1109/AVSS.2017.8078478
[3] S. Carberry, Base Defense: Evolving Threats Require New Approaches to Defending
     Installations,       National      Defense,       107(834),     2023,      pp.      22-24.      URL:
     https://www.nationaldefensemagazine.org/articles/2023/4/21/evolving-threats-require-new-
     approaches-to-defending-installations
[4] P. Perazzo, F. B. Sorbelli, et al., Drone path planning for secure positioning and secure position
     verification, IEEE Trans. on Mobile Comp. 16, no. 9 (2016): 2478-2493.
     doi:10.1109/TMC.2016.2627552
[5] F. Jametoni and D. E. Saputra, A Study on Autonomous Drone Positioning Method, in:
     Proceedings of the 2021 6 Intern. Conf. on Informatics and Computing (ICIC), IEEE, 2021, pp. 1-
     5. doi:10.1109/ICIC54025.2021.9632926
[6] J. Gerwen, K. Geebelen, et al. Indoor drone positioning: Accuracy and cost trade-off for sensor
     fusion. IEEE Transactions on Vehicular Technology, Vol. 71, no. 1 (2021): 961-974.
     doi:10.1109/TVT.2021.3129917
[7] J.-H.Kang, K.-J. Park, et al., Analysis of localization for drone-fleet, in: Proc. of the 2015 Intern.
     Conf. on Inform. and Comm. Techn. Convergence (ICTC), IEEE, 2015, pp. 533-538.
     doi:10.1109/ICTC.2015.7354604
[8] L. Crupi, A. Giusti, et al. High-throughput visual nano-drone to nano-drone relative localization
     using onboard fully convolutional networks. arXiv preprint arXiv:2402.13756 (2024).
[9] Di, Chengliang, and Xiaozhou Guo. Topology Perception and Relative Positioning of UAV
     Swarm Formation Based on Low-Rank Optimization. Aerospace 11, no. 6 (2024): 466.
     doi:10.3390/aerospace11060466
[10] M. Ogurtsov, V. Korolyov, O. Khodzinskyi, Improving the Productivity of UAV Operations
     Based on Granular Computing and Fuzzy Sets, in: Proc. of the 2021 IEEE 6th Intern. Conf. on
     Actual Problems of Unmanned Aerial Vehicles Development (APUAVD) 19-21 Oct. 2021, 2021,
     pp. 33-36. doi:10.1109/APUAVD53804.2021.9615419.
[11]
     Measurements for RSSI, SNR and SF Performance Parameters in an Indoor LoRaWAN Network.
     Wireless Personal Comm. 134, no. 1 (2024): 339-360. doi:10.1007/s11277-024-10911-z
[12]
     When Flying in a Group." Aerospace 11, no. 4 (2024): 312. doi:10.3390/aerospace11040312
[13] J Lee, C. Y, Cooperative drone positioning measuring in internet-of-drones, in: 2020 IEEE 17th
     Ann. Consumer Comm. & Networking Conference (CCNC), IEEE, 2020, pp. 1-3.
[14] P. V. Klaine, J. P. Nadas, R. D. et al.. Distributed drone base station positioning for emergency
     cellular networks using reinforcement learning. Cogn. computation, 10(5) (2018): 790-804.
[15] Wu Chen, Jiayi Zhu, et al. A fast coordination approach for large-scale drone swarm. Journal of
     Network and Computer Applications 221 (2024): 103769.
[16] Yu. Wang, He Li, Zechen Tang et al.. "DeepH-2: Enhancing deep-learning electronic structure
     via an equivariant local-coordinate transformer." arXiv preprint arXiv:2401.17015 (2024).