<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta />
    <article-meta>
      <title-group>
        <article-title>Method of local UAV navigation using neural networks</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Olga Pavlova</string-name>
          <email>pavlovao@khmnu.edu.ua</email>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Dmytro Denysiuk</string-name>
          <email>denysiuk@khmnu.edu.ua</email>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Andrii Harmatiuk</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Andrii Kuzmin</string-name>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Houda EL Bouhissi</string-name>
          <xref ref-type="aff" rid="aff1">1</xref>
        </contrib>
        <aff id="aff0">
          <label>0</label>
          <institution>GMHost Company</institution>
          ,
          <addr-line>Prybuzka str., 70, Khmelnytskyi, 29007</addr-line>
          ,
          <country country="UA">Ukraine</country>
        </aff>
        <aff id="aff1">
          <label>1</label>
          <institution>LIMED Laboratory, Faculty of Exact Sciences, University of Bejaia</institution>
          ,
          <addr-line>06000, Bejaia</addr-line>
          ,
          <country country="DZ">Algeria</country>
        </aff>
      </contrib-group>
      <abstract>
        <p>This paper reviews algorithms for UAV navigation and stabilization. The advantages and disadvantages of known algorithms are identified, and a method for local UAV navigation using convolutional neural networks (CNN) and recurrent neural networks (RNN) is proposed. A deep CNN in combination with a recurrent network (LSTM) is used to estimate the distance to objects and form a map of the environment. The input data consists of a sequence of video frames processed by the CNN to extract features, which are then passed to the LSTM to calculate spatial changes. The study found that the proposed local navigation method based on CNN-LSTM-SLAM neural networks provides significantly higher accuracy of drone positioning in space than traditional methods. In particular, the mean absolute error (MAE) for this method was 0.15 m, which is significantly less than that of optical flow (0.32 m) and the IMU method (0.45 m). This demonstrates the ability of the neural network approach to more accurately predict drone movements.</p>
      </abstract>
      <kwd-group>
        <kwd>UAV</kwd>
        <kwd>navigation</kwd>
        <kwd>neural networks</kwd>
        <kwd>CNN</kwd>
        <kwd>LSTM</kwd>
        <kwd>SLAM</kwd>
      </kwd-group>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>1. Introduction</title>
      <p>Unmanned aerial vehicles (UAVs), known as drones, have become an integral part of the modern
world, finding application in various spheres of human activity. Their popularity is due to their
versatility, accessibility and ability to perform tasks that were previously unattainable or required
significant resources.</p>
      <p>One of the most common areas of use of drones is aerial photography and videography. Due to
the ability to climb to considerable heights and maneuver in confined spaces, drones allow you to
obtain unique footage used in the cinematography, journalism and advertising industries.</p>
      <p>In addition, they are actively used in agriculture to monitor crops, detect pests and optimize
irrigation and fertilizer processes, which increases the efficiency of agricultural production.</p>
      <p>In the military sphere, drones have made a real revolution, changing the tactics of warfare. They
are used for reconnaissance, adjusting artillery fire, delivering cargo and even as strike weapons. In
particular, during the war in Ukraine, drones became an important tool for gaining an asymmetric
advantage over the enemy, allowing for effective targeting and minimizing risks to personnel.</p>
      <p>However, to ensure the safe and efficient operation of drones, it is necessary to implement reliable
stabilization mechanisms. Flight stability is critical for performing precise maneuvers, obtaining
high-quality images and preventing emergency situations. Without proper stabilization, the drone
can become uncontrollable, which will lead to potentially dangerous consequences, especially in
urban environments or during critical missions.</p>
      <p>Drone stabilization ensures their ability to withstand external influences, such as wind gusts,
turbulence or sudden changes in load. This is especially important when performing tasks in extreme
conditions or when operating at high altitudes. Reliable stabilization systems allow the drone to
maintain a given trajectory and orientation, which is necessary for the accurate execution of missions
and ensuring the safety of both the operator and others around it.</p>
      <p>In addition, modern drones often rely on global navigation satellite systems (GNSS), such as GPS,
to determine their location and navigation. However, in conditions where the GNSS signal may be
unavailable or intentionally suppressed, there is a need to develop autonomous navigation
algorithms that do not depend on external signal sources. Such algorithms will allow drones to
effectively perform their tasks even in difficult conditions, ensuring high accuracy and reliability of
navigation.</p>
      <p>Drones open up wide opportunities in various industries, but their effective and safe use is
impossible without the implementation of modern stabilization mechanisms and autonomous
navigation systems. The development and improvement of such technologies is a key direction in
the development of unmanned aerial vehicles, which will contribute to the expansion of their
application and increase the reliability of performing the tasks assigned. Therefore, the use of neural
networks to improve local UAV navigation is a relevant and important task today.</p>
    </sec>
    <sec id="sec-2">
      <title>2. Overview of existing drone navigation and stabilization algorithms</title>
      <p>In modern unmanned aerial vehicles (UAVs), navigation and stabilization are interrelated processes
that ensure accuracy, safety, and flight efficiency. These functions are implemented through the
integration of various algorithms, sensor systems, and control technologies that work together to
maintain the stability and controllability of the drone under various external influences.</p>
      <sec id="sec-2-1">
        <title>2.1. Navigation algorithms</title>
        <p>Traditionally, drones have used GPS to determine their location. However, in environments with
limited or no GPS signal, alternative methods are used. One key approach is simultaneous
localization and mapping (SLAM) algorithms, which allow a drone to build a map of an unknown
environment and determine its position relative to that map using data from cameras, lidar, and
inertial measurement units. This is especially important for navigating indoors or in urban
environments where GPS signals may be unavailable or distorted.</p>
        <p>
          Thus, in [1] an end-to-end UAV simulation platform for SLAM, navigation research, and
applications was introduced, including the detailed simulator setup and an out-of-box localization,
mapping, and navigation system. In [
          <xref ref-type="bibr" rid="ref2">2</xref>
          ] the authors propose a novel and complete framework to
realize the autonomous landing of UAVs in unknown indoor scenes based on visual SLAM, semantic
segmentation, terrain estimation, and a decision-making model. The paper [3] describes an
application of the Cartographer graph SLAM stack as a pose sensor in a UAV feedback control loop,
with certain application-specific changes in the SLAM stack such as smoothing of the optimized
pose. The article [
          <xref ref-type="bibr" rid="ref3">4</xref>
          ] presents a survey of simultaneous localization and mapping (SLAM) and data
fusion techniques for object detection and environmental scene perception in unmanned aerial
vehicles (UAVs).
        </p>
        <p>Another approach is to use star navigation, where the drone navigates by the location of stars in
the night sky. This method is particularly useful in cases where GPS is unavailable or jammed.
Scientists have developed algorithms that can determine the location of the drone from a series of
images of the night sky with an accuracy of up to four kilometers, even in difficult conditions.</p>
        <p>A further class of methods comprises path-planning algorithms, which calculate the shortest path
to the target while avoiding obstacles. This ensures efficient and safe drone navigation in complex
environments. For example, an extension of the A* algorithm, known as Cell A*, allows for stable
route planning with less computational overhead, which is important for long-duration missions.</p>
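        <p>The path-planning idea above can be illustrated with a minimal grid-based A* sketch in Python. This is a generic textbook A*, not the Cell A* extension or the improved variants of [5-7]; the 4-connected grid, unit step cost and Manhattan heuristic are illustrative assumptions:</p>

```python
import heapq

def a_star(grid, start, goal):
    """Shortest 4-connected path on a 0/1 occupancy grid (1 = obstacle).

    Returns the path as a list of (row, col) cells, or None if no path exists.
    """
    rows, cols = len(grid), len(grid[0])

    def h(p):  # Manhattan distance: admissible heuristic on a 4-connected grid
        return abs(p[0] - goal[0]) + abs(p[1] - goal[1])

    # Priority queue of (f = g + h, g, node, path-so-far)
    open_set = [(h(start), 0, start, [start])]
    best_g = {start: 0}

    while open_set:
        _, g, node, path = heapq.heappop(open_set)
        if node == goal:
            return path
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            r, c = node[0] + dr, node[1] + dc
            if 0 <= r < rows and 0 <= c < cols and grid[r][c] == 0:
                ng = g + 1  # unit cost per move
                if ng < best_g.get((r, c), float("inf")):
                    best_g[(r, c)] = ng
                    heapq.heappush(open_set, (ng + h((r, c)), ng, (r, c), path + [(r, c)]))
    return None  # goal unreachable
```

<p>On a 3x3 grid with a wall across the middle, the planner routes around the obstacle; Cell A*-style extensions reduce the number of expanded nodes, but the expansion logic is the same.</p>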
        <p>
          In [
          <xref ref-type="bibr" rid="ref4">5</xref>
          ] the evaluation function is revised using dynamic weighting; the azimuth is used to change the
search neighborhood and the search method is adjusted adaptively to different map areas;
then, considering the actual size of the UAV, safety radii are set for the UAV and for
obstacles. The study [
          <xref ref-type="bibr" rid="ref5">6</xref>
          ] introduces an improved algorithm for three-dimensional path planning in
obstacle-rich environments, such as urban and industrial areas. The proposed approach integrates
the A* search algorithm with a customized heuristic function which incorporates local obstacle
density. The study [
          <xref ref-type="bibr" rid="ref6">7</xref>
          ] proposes an efficient algorithm to detect air pollution in urban areas using
UAVs. An improved A-star algorithm that can efficiently perform searches based on a probabilistic
search model using a UAV is designed.
        </p>
        <p>
          In cases where it is necessary to coordinate the movement of several drones at the same time,
swarm intelligence and reinforcement learning algorithms are used. These methods allow a swarm
of drones to effectively explore unknown territories, avoiding obstacles and coordinating their
actions to achieve a common goal. In particular, Q-Learning algorithms help each drone in the swarm
make optimal decisions based on its own experience and information from other drones.
Thus, in [
          <xref ref-type="bibr" rid="ref7">8</xref>
          ] the authors propose an adaptive conversion speed Q-Learning algorithm (ACSQL).
Performing UAV missions autonomously is divided into two stages: rescue mission search stage and
optimal path search stage. In [
          <xref ref-type="bibr" rid="ref8">9</xref>
          ] the authors develop a DRL framework for UAV autonomous
navigation in a high dynamic and complex environment. The authors of [
          <xref ref-type="bibr" rid="ref9">10</xref>
          ] propose Q-learning
algorithm to efficiently plan the path of UAVs in environments containing both static and dynamic
obstacles. The study [
          <xref ref-type="bibr" rid="ref10">11</xref>
          ] proposes a new system that employs Q-Learning and ANNs with two dense
layers to control UAV swarms in maps with obstacles.
        </p>
      </sec>
      <sec id="sec-2-2">
        <title>2.2. Stabilization algorithms</title>
        <p>One of the key components of mechanical stabilization is gyrostabilized camera gimbals. These
gimbals use data from gyroscopes to compensate for unwanted movements and vibrations, providing
a stable image during flight. There are two- and three-axis gimbals, which allow you to compensate
for the movements of the drone along the corresponding axes, ensuring smooth video recording and
accurate navigation. Studies show that the combination of mechanical stabilization with digital
signal processing allows you to achieve high image quality even in conditions of significant
vibrations and external disturbances. At the software level, PID
(proportional-integral-derivative) controllers are widely used, which process data from inertial sensors to maintain a stable
position of the drone. However, in complex and noisy environments, traditional PID controllers may
not be effective enough. In such cases, advanced controllers such as
Proportional-Integral-Derivative-Accelerated (PIDA) with genetic filters are implemented. These algorithms
improve the stability and accuracy of the drone flight, effectively compensating for the influence of
external disturbances and noise.</p>
        <p>
          Our previous study [
          <xref ref-type="bibr" rid="ref11">12</xref>
          ] was aimed at FPV drone stabilization on an automatically determined
target and its further observation. The study [
          <xref ref-type="bibr" rid="ref12">13</xref>
          ] proposes video repeater design concept for UAV
control.
        </p>
        <p>Current research is aimed at implementing deep learning algorithms for automatic adjustment of
stabilization parameters. In particular, the use of deep learning methods with reinforcement, such as
Proximal Policy Optimization (PPO), allows the drone to adaptively adjust its control parameters in
real time, ensuring optimal stability and maneuverability even in dynamically changing conditions.</p>
        <p>Thus, despite the significant development of navigation and stabilization algorithms, there is a
need to create new methods that will ensure the autonomous operation of unmanned aerial vehicles
without dependence on external signals, such as GPS. This study is aimed at developing a method
for drone navigation and stabilization that will use a camera to determine the position in space and
adjust the flight. Modern approaches, such as computer vision, SLAM algorithms and swarm
intelligence, demonstrate effectiveness in complex environments, but they have limitations in
processing speed, energy consumption and adaptability to unpredictable conditions. The proposed
method will allow the drone to autonomously navigate in space, which is critically important for
operation in environments with obstacles or under the influence of electronic warfare. The use of
real-time image analysis algorithms will ensure stable maintenance of the device in a given position
and accurate navigation, which opens up new opportunities for its application in military, rescue
and research missions. Further development in this area should be aimed at creating localized
algorithms capable of operating effectively in GPS-failure environments, ensuring the stability and
accuracy of drone flight in critical conditions.</p>
      </sec>
    </sec>
    <sec id="sec-3">
      <title>3. Local navigation method</title>
      <p>Local navigation of an unmanned aerial vehicle (UAV) is a complex task that requires the integration
of computer vision, sensor analysis, and machine learning methods. The absence of GPS or other
global positioning systems makes it difficult to determine the location of the drone, which requires
the use of autonomous navigation methods. The main idea of the method is to use convolutional
neural networks (CNN) and recurrent neural networks (RNN) to analyze the video stream and
calculate the relative position of the drone in space. The neural network allows you to obtain the
necessary characteristics of the environment, which are critically important for building a flight
trajectory.</p>
      <p>Let I_t(x, y) be the image obtained from the drone camera at time t, and p_t = (x_t, y_t, z_t) its
coordinates in space. The task of local navigation is to determine the trajectory p_{t+1} that ensures
flight stability and obstacle avoidance. This uses a comprehensive approach that includes image
segmentation, object detection, and scene depth analysis.</p>
      <sec id="sec-3-1">
        <title>3.1. Model architecture</title>
        <p>A deep convolutional neural network (CNN) combined with a recurrent network (LSTM) is used to
estimate the distance to objects and form a map of the environment. The input data consists of a
sequence of video frames, processed by the CNN to extract features, and then passed to the LSTM to
calculate spatial changes.</p>
        <p>Formally, each frame I_t is transformed into a feature vector f_t by the convolutional network
(Formula 1):</p>
        <p>f_t = CNN(I_t), (1)</p>
        <p>where f_t is the feature vector obtained after processing the image I_t with a convolutional neural
network that extracts structural and textural information about the environment.</p>
        <p>The recurrent layer uses these features to predict position changes (Formula 2):</p>
        <p>Δp_t = LSTM(f_t, f_{t−1}, …, f_{t−k}), (2)</p>
        <p>where Δp_t is the vector of change in the drone's position, estimated based on the analysis of the
previous features f_t, f_{t−1}, …, f_{t−k}, using the long-term memory mechanism to detect patterns in
the motion.</p>
        <p>The new position vector is calculated by Formula 3:</p>
        <p>p̂_{t+1} = p_t + Δp_t, (3)</p>
        <p>where p̂_{t+1} are the predicted coordinates of the drone at time t + 1, determined based on the
current position p_t and the displacement Δp_t.</p>
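        <p>The data flow of Formulas 1-3 can be sketched in Python. Since the exact CNN and LSTM architectures are not published here, both networks are replaced by simple placeholder functions; only the pipeline frame → features → displacement → position is meant to be faithful:</p>

```python
def extract_features(frame):
    """Placeholder for the CNN of Formula 1, f_t = CNN(I_t).
    Here: mean intensity per row, standing in for learned features."""
    return [sum(row) / len(row) for row in frame]

def estimate_displacement(feature_history):
    """Placeholder for the LSTM of Formula 2: maps the last feature vectors
    to a displacement estimate Δp_t. Here: the change in mean feature value
    between the two most recent frames, standing in for a recurrent model."""
    if len(feature_history) < 2:
        return (0.0, 0.0, 0.0)
    prev, curr = feature_history[-2], feature_history[-1]
    shift = sum(curr) / len(curr) - sum(prev) / len(prev)
    return (shift, 0.0, 0.0)

def navigate(frames, p0):
    """Formula 3: p_{t+1} = p_t + Δp_t, accumulated over a frame sequence."""
    position, history = p0, []
    for frame in frames:
        history.append(extract_features(frame))
        dp = estimate_displacement(history)
        position = tuple(p + d for p, d in zip(position, dp))
    return position
```

<p>In the actual method, the two placeholder functions would be trained networks; the accumulation step in navigate is exactly the position update of Formula 3.</p>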
      </sec>
      <sec id="sec-3-2">
        <title>3.2. Route correction</title>
        <p>The resulting coordinates are used to update the local map and avoid obstacles. This is done using
the simultaneous localization and mapping (SLAM) method, where each frame is compared with the
previous ones, and key points of the image are stored in the local map (Formula 4):</p>
        <p>M_t = SLAM(M_{t−1}, f_t), (4)</p>
        <p>where M_t is the current map of the environment containing information about the location of
objects relative to the drone.</p>
        <p>Route correction is performed using gradient descent, which minimizes the error between the
desired and actual trajectory (Formula 5):</p>
        <p>p_{t+1} = p_t − η∇E(p_t), (5)</p>
        <p>where η is the learning rate and E is the loss function that determines the deviation from the desired
trajectory.</p>
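        <p>The gradient-descent correction of Formula 5 can be sketched as follows, assuming an illustrative quadratic error E(p) = ‖p − p_desired‖² with its analytic gradient ∇E = 2(p − p_desired); the actual error function and learning rate of the method are not specified here:</p>

```python
def correct_route(p, p_desired, eta=0.1, steps=50):
    """Iterates p ← p − η∇E(p) for the illustrative error
    E(p) = ‖p − p_desired‖², whose gradient is 2(p − p_desired).
    eta (the learning rate) and the step count are assumed values."""
    for _ in range(steps):
        grad = [2 * (pi - di) for pi, di in zip(p, p_desired)]
        p = [pi - eta * gi for pi, gi in zip(p, grad)]
    return p
```

<p>With this quadratic error each coordinate contracts toward the desired trajectory point by a factor (1 − 2η) per step, so the corrected position converges geometrically.</p>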
      </sec>
      <sec id="sec-3-3">
        <title>3.3. Neural network training algorithm</title>
        <p>The neural network is trained in two stages: pre-training in safe conditions on a simulator and
further refinement of the model in real operating conditions. First, a simulator is used to generate a
data set that includes various flight conditions, obstacles and variable environmental parameters.
The network is trained to minimize the error between the predicted and actual coordinates
(Formula 6):</p>
        <p>L = (1/N) Σ_{i=1}^{N} ‖p_i − p̂_i‖², (6)</p>
        <p>where L is the loss function that measures the difference between the true position of the drone p_i
and the predicted one p̂_i.</p>
        <p>Optimization is carried out using the gradient descent method (Formula 7):</p>
        <p>θ_{t+1} = θ_t − η∇L(θ_t), (7)</p>
        <p>where θ denotes the model parameters and η is the learning rate. After completing simulation
training, the model is transferred to real conditions, where it adapts based on data obtained from the
real environment.</p>
        <p>Auxiliary sensors are used to improve local navigation accuracy. The barometer estimates the
altitude h_t, which is included in the drone state vector (Formula 8):</p>
        <p>s_t = (p_t, h_t, φ_t, f_t), (8)</p>
        <p>where φ_t is the orientation of the drone, obtained from the compass, and f_t is the feature vector
from the camera. Taking these parameters into account increases the resistance to external
conditions.</p>
        <p>Based on the obtained coordinates and the landmark map, the drone velocity vector is
determined, which is controlled via a PID controller (Formula 9):</p>
        <p>v_{t+1} = K_p(p_{t+1} − p_t) + K_d(ṗ_{t+1} − ṗ_t) + K_i Σ_{i=1}^{t}(p_i − p_{i−1}), (9)</p>
        <p>where K_p, K_d, K_i are the PID controller coefficients that control the proportional, differential and
integral contributions to the drone's speed according to the position error. This controller makes it
possible to maintain stable movement even in difficult conditions.</p>
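        <p>Formula 9 corresponds to a discrete per-axis PID law. A minimal sketch, taking the position error as input, with illustrative gains (the actual coefficients K_p, K_d, K_i of the method are not reported here):</p>

```python
class PID:
    """Discrete PID controller in the spirit of Formula 9:
    v = Kp·e_t + Kd·(e_t − e_{t−1}) + Ki·Σ e_i, applied to one axis.
    The default gains are assumed values for illustration only."""

    def __init__(self, kp=1.0, kd=0.1, ki=0.01):
        self.kp, self.kd, self.ki = kp, kd, ki
        self.prev_error = 0.0   # e_{t−1}, for the differential term
        self.integral = 0.0     # running sum Σ e_i, for the integral term

    def update(self, target, actual):
        """One control step: returns the commanded velocity component."""
        error = target - actual
        self.integral += error
        derivative = error - self.prev_error
        self.prev_error = error
        return self.kp * error + self.kd * derivative + self.ki * self.integral
```

<p>One controller instance is kept per axis of the position vector; as the drone approaches the target the proportional and differential terms decay, while the integral term removes any steady-state offset.</p>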
        <p>The proposed method combines computer vision, deep neural networks and sensor data to
provide autonomous local drone navigation. The use of CNN and LSTM allows you to predict the
change in position in space, and the integration of additional sensors increases the accuracy and
reliability of the system. The PID controller ensures stability of movement based on the calculated
coordinates. This approach can be used for autonomous systems operating in GPS-unavailable
conditions, ensuring high navigation efficiency even in dynamic scenarios.</p>
      </sec>
    </sec>
    <sec id="sec-4">
      <title>4. Experiments and Results</title>
      <p>To evaluate the effectiveness of the proposed method, an experimental study was conducted,
including a comparison with other local navigation methods. The main goal of the experiments was
to determine the navigation accuracy, processing speed, and resistance to interference in different
environments.</p>
      <sec id="sec-4-1">
        <title>4.1. Conditions for conducting the experiment</title>
        <p>The study was conducted in a test environment that included simulation of real-world flight
conditions using a simulator and physical experiments with a real drone. Three main scenarios were
identified:</p>
        <p>Stable environment: minimal distractions, well-lit room.</p>
        <p>Changing environment: presence of dynamic objects (moving obstacles).</p>
        <p>Low light: testing operation in conditions of limited visibility.</p>
        <p>Each navigation method was tested in all three environments to evaluate its performance. Three
methods were selected for comparison: the proposed CNN-LSTM-SLAM, an optical flow-based
method, and a method using only IMU data.</p>
      </sec>
      <sec id="sec-4-2">
        <title>4.2. Evaluation metrics</title>
        <p>The results were analyzed according to the following indicators, which allow us to evaluate the
accuracy, adaptability and efficiency of the proposed method.</p>
        <p>The first criterion was the mean absolute error (MAE) in determining the location of the drone.
It was calculated by the Formula 10.</p>
        <p>MAE =

1
∑

 =1
|  −   ∗|,
(10)
where   is an actual drone position,   ∗ is the predicted position,  is the number of test points.</p>
        <p>The second indicator was the resistance to environmental changes, which was assessed by the
percentage of correctly adjusted trajectories when new objects appeared.</p>
        <p>The last criterion was the processing time required to analyze one frame and calculate the
correction commands.</p>
      </sec>
      <sec id="sec-4-3">
        <title>4.3. Experiment results</title>
        <p>As a result of the experiments conducted, Table 1 was obtained.</p>
        <table-wrap id="tbl1">
          <label>Table 1</label>
          <caption>
            <p>Comparison of local navigation methods</p>
          </caption>
          <table>
            <thead>
              <tr><th>Method</th><th>MAE, m</th><th>Noise immunity, %</th><th>Processing time, ms</th></tr>
            </thead>
            <tbody>
              <tr><td>CNN-LSTM-SLAM (proposed)</td><td>0.15</td><td>94</td><td>35</td></tr>
              <tr><td>Optical flow</td><td>0.32</td><td>68</td><td>20</td></tr>
              <tr><td>IMU</td><td>0.45</td><td>40</td><td>10</td></tr>
            </tbody>
          </table>
        </table-wrap>
        <p>The table shows that the proposed CNN-LSTM-SLAM method provides significantly better
positioning accuracy (MAE = 0.15 m) compared to optical flow (MAE = 0.32 m) and IMU method
(MAE = 0.45 m). Although the CNN-LSTM-SLAM method has a slightly longer processing time (35
ms), this is compensated by high noise immunity (94%), which is superior to traditional methods.
Additionally, testing was carried out in real conditions with strong wind and precipitation. However,
the results of this experiment were not taken into account in the overall analysis due to the
impossibility of repeating the weather conditions for each method participating in the experiment.
Despite this, observations showed that the proposed method demonstrates higher resistance to
external factors compared to traditional navigation methods.</p>
      </sec>
    </sec>
    <sec id="sec-5">
      <title>5. Conclusions</title>
      <p>As a result of the study, it was found that the proposed method of local navigation based on neural
networks CNN-LSTM-SLAM provides a significantly higher accuracy of drone positioning in space
than traditional methods. In particular, the mean absolute error (MAE) for this method was 0.15 m,
which is significantly less than that of optical flow (0.32 m) and the IMU method (0.45 m). This indicates
the ability of the neural network approach to more accurately predict the movement of the drone.</p>
      <p>In addition, the proposed method demonstrates high resistance to interference (94%), which is
almost twice as high as the similar indicator for the IMU method (40%) and significantly better than
in the case of optical flow (68%). This confirms the effectiveness of using CNN-LSTM in complex
conditions with dynamic objects.</p>
      <p>The only drawback of the method is a slightly longer frame processing time (35 ms), compared to
other methods, such as IMU (10 ms) and optical flow (20 ms). However, this difference is justified in
view of the obtained accuracy and stability.</p>
      <p>Thus, the results of experimental studies confirmed the effectiveness of the proposed approach
for autonomous drone navigation in GPS-unavailable conditions. It is a promising solution for
application in complex dynamic environments, such as urban areas, forests or search and rescue
operations.</p>
      <p>Further research will be aimed at improving the data processing speed and reducing the
computational complexity of the method. In particular, a promising direction of development is the
optimization of the neural network architecture for operation on limited computing resources, which
will allow implementing the system on less powerful drones. The possibility of integrating additional
sensors, such as lidars and radar systems, to improve navigation accuracy in difficult weather
conditions and in the absence of visual information will also be investigated.</p>
    </sec>
    <sec id="sec-6">
      <title>Declaration on Generative AI</title>
      <p>The authors have not employed any Generative AI tools.</p>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>
          [1]
          <string-name>
            <surname>Chen</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Zhou</surname>
            ,
            <given-names>W.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Yang</surname>
            ,
            <given-names>A. S.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Chen</surname>
            ,
            <given-names>H.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Li</surname>
            ,
            <given-names>B.</given-names>
          </string-name>
          , &amp;
          <string-name>
            <surname>Wen</surname>
            ,
            <given-names>C. Y.</given-names>
          </string-name>
          (
          <year>2022</year>
          ).
          <article-title>An end-to-end UAV simulation platform for visual SLAM and navigation</article-title>
          .
          <source>Aerospace</source>
          ,
          <volume>9</volume>
          (
          <issue>2</issue>
          ),
          <fpage>48</fpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>
          [2]
          <string-name>
            <surname>Yang</surname>
            ,
            <given-names>L.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Ye</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          , Zhang,
          <string-name>
            <given-names>Y.</given-names>
            ,
            <surname>Wang</surname>
          </string-name>
          ,
          <string-name>
            <given-names>L.</given-names>
            , &amp;
            <surname>Qiu</surname>
          </string-name>
          ,
          <string-name>
            <surname>C.</surname>
          </string-name>
          (
          <year>2024</year>
          ).
          <article-title>A semantic SLAM-based method for navigation and landing of UAVs in indoor environments</article-title>
          .
          <source>Knowledge-Based Systems</source>
          ,
          <volume>293</volume>
          ,
          <fpage>111693</fpage>
          .
          <article-title>Flying with cartographer: Adapting the cartographer 3D graph SLAM stack for UAV navigation</article-title>
          .
          <article-title>In 2021 Aerial Robotic Systems Physically Interacting with the Environment (AIRPHARO) (pp</article-title>
          .
          <fpage>1</fpage>
          -
          <lpage>7</lpage>
          ).
        </mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>
          [4]
          <string-name>
            <surname>Gupta</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          , &amp;
          <string-name>
            <surname>Fernando</surname>
            ,
            <given-names>X.</given-names>
          </string-name>
          (
          <year>2022</year>
          ).
          <article-title>Simultaneous localization and mapping (slam) and data fusion in unmanned aerial vehicles: Recent advances and challenges</article-title>
          .
          <source>Drones</source>
          ,
          <volume>6</volume>
          (
          <issue>4</issue>
          ),
          <fpage>85</fpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>
          [5]
          <string-name>
            <surname>Li</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Xiong</surname>
            ,
            <given-names>X.</given-names>
          </string-name>
          , &amp;
          <string-name>
            <surname>Yang</surname>
            ,
            <given-names>Y.</given-names>
          </string-name>
          (
          <year>2023</year>
          ).
          <article-title>A Method of UAV Navigation Planning Based on ROS and Improved A-star Algorithm</article-title>
          .
          <source>In 2023 CAA Symposium on Fault Detection, Supervision and Safety for Technical Processes (SAFEPROCESS)</source>
          (pp.
          <fpage>1</fpage>
          -
          <lpage>5</lpage>
          ).
        </mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>
          [6]
          <string-name>
            <surname>Yoo</surname>
            ,
            <given-names>Y. D.</given-names>
          </string-name>
          , &amp;
          <string-name>
            <surname>Jung-Ho</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          (
          <year>2025</year>
          ).
          <article-title>Study on A-Star Algorithm-Based 3D Path Optimization Method Considering Density of Obstacles</article-title>
          .
          <source>Aerospace</source>
          ,
          <volume>12</volume>
          (
          <issue>2</issue>
          ),
          <fpage>85</fpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>
          [7]
          <string-name>
            <surname>Ha</surname>
            ,
            <given-names>I. K.</given-names>
          </string-name>
          (
          <year>2024</year>
          ).
          <article-title>Improved a-star search algorithm for probabilistic air pollution detection using UAVs</article-title>
          .
          <source>Sensors</source>
          ,
          <volume>24</volume>
          (
          <issue>4</issue>
          ),
          <fpage>1141</fpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>
          [8]
          <string-name>
            <surname>Wu</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Sun</surname>
            ,
            <given-names>Y. N.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Li</surname>
            ,
            <given-names>D.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Shi</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Li</surname>
            ,
            <given-names>X.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Gao</surname>
            ,
            <given-names>L.</given-names>
          </string-name>
          , ... &amp;
          <string-name>
            <surname>Wu</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          (
          <year>2023</year>
          ).
          <article-title>An adaptive conversion speed Q-learning algorithm for search and rescue UAV path planning in unknown environments</article-title>
          .
          <source>IEEE Transactions on Vehicular Technology</source>
          ,
          <volume>72</volume>
          (
          <issue>12</issue>
          ),
          <fpage>15391</fpage>
          -
          <lpage>15404</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>
          [9]
          <string-name>
            <surname>Guo</surname>
            ,
            <given-names>T.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Jiang</surname>
            ,
            <given-names>N.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Li</surname>
            ,
            <given-names>B.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Zhu</surname>
            ,
            <given-names>X.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Wang</surname>
            ,
            <given-names>Y.</given-names>
          </string-name>
          , &amp;
          <string-name>
            <surname>Du</surname>
            ,
            <given-names>W.</given-names>
          </string-name>
          (
          <year>2021</year>
          ).
          <article-title>UAV navigation in high dynamic environments: A deep reinforcement learning approach</article-title>
          .
          <source>Chinese Journal of Aeronautics</source>
          ,
          <volume>34</volume>
          (
          <issue>2</issue>
          ),
          <fpage>479</fpage>
          -
          <lpage>489</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref9">
        <mixed-citation>
          [10]
          <string-name>
            <surname>Sonny</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Yeduri</surname>
            ,
            <given-names>S. R.</given-names>
          </string-name>
          , &amp;
          <string-name>
            <surname>Cenkeramaddi</surname>
            ,
            <given-names>L. R.</given-names>
          </string-name>
          (
          <year>2023</year>
          ).
          <article-title>Q-learning-based unmanned aerial vehicle path planning with dynamic obstacle avoidance</article-title>
          .
          <source>Applied Soft Computing</source>
          ,
          <volume>147</volume>
          ,
          <fpage>110773</fpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref10">
        <mixed-citation>
          [11]
          <string-name>
            <surname>Puente-Castro</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Rivero</surname>
            ,
            <given-names>D.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Pedrosa</surname>
            ,
            <given-names>E.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Pereira</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Lau</surname>
            ,
            <given-names>N.</given-names>
          </string-name>
          , &amp;
          <string-name>
            <surname>Fernandez-Blanco</surname>
            ,
            <given-names>E.</given-names>
          </string-name>
          (
          <year>2024</year>
          ).
          <article-title>Q-learning based system for path planning with unmanned aerial vehicles swarms in obstacle environments</article-title>
          .
          <source>Expert Systems with Applications</source>
          ,
          <volume>235</volume>
          ,
          <fpage>121240</fpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref11">
        <mixed-citation>
          [12]
          <string-name>
            <surname>Halytskyi</surname>
            ,
            <given-names>O.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Denysiuk</surname>
            ,
            <given-names>D.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Kozhemiako</surname>
            ,
            <given-names>Y.</given-names>
          </string-name>
          , &amp;
          <string-name>
            <surname>Kvassay</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          (
          <year>2025</year>
          ).
          <article-title>Method of FPV drone stabilization on an automatically determined target and its further observation</article-title>
          .
          <source>Computer Systems and Information Technologies</source>
          , (
          <issue>1</issue>
          ),
          <fpage>36</fpage>
          -
          <lpage>41</lpage>
          . https://doi.org/10.31891/csit-2025-1-4.
        </mixed-citation>
      </ref>
      <ref id="ref12">
        <mixed-citation>
          [13]
          <string-name>
            <surname>Pavlova</surname>
            ,
            <given-names>O.</given-names>
          </string-name>
          , &amp;
          <string-name>
            <surname>Halytskyi</surname>
            ,
            <given-names>O.</given-names>
          </string-name>
          (
          <year>2024</year>
          ).
          <article-title>Video repeater design concept for UAV control</article-title>
          .
          <source>Computer Systems and Information Technologies</source>
          , (
          <issue>1</issue>
          ),
          <fpage>33</fpage>
          -
          <lpage>38</lpage>
          . https://doi.org/10.31891/csit-2024-1-4.
        </mixed-citation>
      </ref>
      <ref id="ref13">
        <mixed-citation>
          [14]
          <string-name>
            <surname>Arshad</surname>
            ,
            <given-names>M. A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Khan</surname>
            ,
            <given-names>S. H.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Qamar</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Khan</surname>
            ,
            <given-names>M. W.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Murtza</surname>
            ,
            <given-names>I.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Gwak</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          , &amp;
          <string-name>
            <surname>Khan</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          (
          <year>2022</year>
          ).
          <article-title>Drone navigation using region and edge exploitation-based deep CNN</article-title>
          .
          <source>IEEE Access</source>
          ,
          <volume>10</volume>
          ,
          <fpage>95441</fpage>
          -
          <lpage>95450</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref14">
        <mixed-citation>
          [15]
          <string-name>
            <surname>Rezwan</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          , &amp;
          <string-name>
            <surname>Choi</surname>
            ,
            <given-names>W.</given-names>
          </string-name>
          (
          <year>2022</year>
          ).
          <article-title>Artificial intelligence approaches for UAV navigation: Recent advances and future challenges</article-title>
          .
          <source>IEEE Access</source>
          ,
          <volume>10</volume>
          ,
          <fpage>26320</fpage>
          -
          <lpage>26339</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref15">
        <mixed-citation>
          [16]
          <string-name>
            <surname>Amer</surname>
            ,
            <given-names>K.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Samy</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Shaker</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          , &amp;
          <string-name>
            <surname>ElHelw</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          (
          <year>2021</year>
          , January).
          <article-title>Deep convolutional neural network based autonomous drone navigation</article-title>
          .
          <source>In Thirteenth International Conference on Machine Vision</source>
          (Vol.
          <volume>11605</volume>
          , pp.
          <fpage>16</fpage>
          -
          <lpage>24</lpage>
          ).
        </mixed-citation>
      </ref>
      <ref id="ref16">
        <mixed-citation>
          [17]
          <string-name>
            <surname>AlMahamid</surname>
            ,
            <given-names>F.</given-names>
          </string-name>
          , &amp;
          <string-name>
            <surname>Grolinger</surname>
            ,
            <given-names>K.</given-names>
          </string-name>
          (
          <year>2022</year>
          ).
          <article-title>Autonomous unmanned aerial vehicle navigation using reinforcement learning: A systematic review</article-title>
          .
          <source>Engineering Applications of Artificial Intelligence</source>
          ,
          <volume>115</volume>
          ,
          <fpage>105321</fpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref17">
        <mixed-citation>
          [18]
          <string-name>
            <surname>Deraz</surname>
            ,
            <given-names>A. A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Badawy</surname>
            ,
            <given-names>O.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Elhosseini</surname>
            ,
            <given-names>M. A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Mostafa</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Ali</surname>
            ,
            <given-names>H. A.</given-names>
          </string-name>
          , &amp;
          <string-name>
            <surname>El-Desouky</surname>
            ,
            <given-names>A. I.</given-names>
          </string-name>
          (
          <year>2023</year>
          ).
          <article-title>Deep learning based on LSTM model for enhanced visual odometry navigation system</article-title>
          .
          <source>Ain Shams Engineering Journal</source>
          ,
          <volume>14</volume>
          (
          <issue>8</issue>
          ),
          <fpage>102050</fpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref18">
        <mixed-citation>
          [19]
          <string-name>
            <surname>Wang</surname>
            ,
            <given-names>Z.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Shen</surname>
            ,
            <given-names>X.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Li</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Li</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Xueyong</surname>
            ,
            <given-names>W.</given-names>
          </string-name>
          , &amp;
          <string-name>
            <surname>Yang</surname>
            ,
            <given-names>Y.</given-names>
          </string-name>
          (
          <year>2025</year>
          ).
          <article-title>Enhancing Integrated Navigation with a Self-Attention LSTM Hybrid Network for UAVs in GNSS-Denied Environments</article-title>
          .
          <source>Drones</source>
          ,
          <volume>9</volume>
          (
          <issue>4</issue>
          ),
          <fpage>279</fpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref19">
        <mixed-citation>
          [20]
          <string-name>
            <surname>Santos</surname>
            ,
            <given-names>R. S.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Matos-Carvalho</surname>
            ,
            <given-names>J. P.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Tomic</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Beko</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          , &amp;
          <string-name>
            <surname>Correia</surname>
            ,
            <given-names>S. D.</given-names>
          </string-name>
          (
          <year>2023</year>
          , March).
          <article-title>A hybrid LSTM-based neural network for satellite-less UAV navigation</article-title>
          .
          <source>In 2023 6th Conference on Cloud and Internet of Things (CIoT)</source>
          (pp.
          <fpage>91</fpage>
          -
          <lpage>97</lpage>
          ).
        </mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>