=Paper=
{{Paper
|id=Vol-3925/paper08
|storemode=property
|title=Estimation of the aircraft's position based on optical channel data
|pdfUrl=https://ceur-ws.org/Vol-3925/paper08.pdf
|volume=Vol-3925
|authors=Pylyp Prystavka,Olha Cholyshkina
|dblpUrl=https://dblp.org/rec/conf/cmigin/PrystavkaC24
}}
==Estimation of the aircraft's position based on optical channel data==
Estimation of the aircraft's position based on optical channel data
Pylyp Prystavka1,∗,† and Olha Cholyshkina2,†
1 National Aviation University, Liubomyra Huzara Ave. 1, Kyiv, 03058, Ukraine
2 Interregional Academy of Personnel Management, Frometivska Str., 2, Kyiv, 03039, Ukraine
Abstract
The paper presents mathematical support for solving the problem of aircraft navigation using an optical channel. The stages of the problem are formalized, the main notations are introduced, and a formal procedure for obtaining the coordinates of special points of a digital image from aircraft cameras is presented. It is proposed to determine the coordinates of the aircraft based on a method that uses an estimate of the density function of the distribution of the coordinates of special points of the digital image. A method for verifying the reliability of the aircraft position estimate based on consistency measures is formulated. The results of testing the proposed algorithm are given, and conclusions are formulated.
Keywords
autonomous flight control, UAV navigation, optical channel data, aircraft position estimation, image processing, GPS-denied environment, optical sensors
1. Introduction
The relevance of this topic is determined by the increasing significance of unmanned aerial vehicles
(UAVs) for various applications, including surveillance, reconnaissance, monitoring, and data
collection. With the advancement of UAV technologies, the importance of accurate and reliable
methods for assessing the vehicle's location becomes critical to ensure the effectiveness and safety
of their operations [1]. Precise assessment of UAV location is crucial for the successful execution of
missions, navigation, and control. This is especially important in scenarios where optical channels,
such as cameras or sensors, play a key role in determining the aircraft's location. Location assessment
methods based on optical channels are valuable for applications such as aerial imaging, search and
rescue operations, environmental monitoring, and precision agriculture. Furthermore, the ability to
assess the location of the UAV based on optical data is significant in urban environments, where GPS
signals may be delayed or less reliable [2, 3]. Optical channels provide an alternative means of
obtaining spatial information and enhancing UAV autonomy. Research in this field addresses
challenges related to the development of reliable algorithms, image processing methods, and
mathematical models that can accurately assess the UAV's location using optical data. The topic is
relevant not only for the advancement of UAV capabilities but also for addressing safety
issues, regulatory requirements, and the integration of UAVs into various industries. Overall,
assessing the location of unmanned aerial vehicles based on optical channels is a pertinent and
evolving research area with broad practical implications.
CH&CMiGIN’24: Third International Conference on Cyber Hygiene & Conflict Management in Global Information Networks,
January 24–27, 2024, Kyiv, Ukraine
∗ Corresponding author.
† These authors contributed equally.
chindakor37@gmail.com (P. Prystavka); greenhelga5@gmail.com (O. Cholyshkina)
0000-0002-0360-2459 (P. Prystavka); 0000-0002-0681-0413 (O. Cholyshkina)
© 2025 Copyright for this paper by its authors. Use permitted under Creative Commons License Attribution 4.0 International (CC BY 4.0).
2. Review of existing solutions and literature sources
Today, navigation systems for aircraft are the object of many studies. Both scientific institutions and
commercial business organizations around the world are actively engaged in this issue. Special
attention is paid to aircraft navigation in the absence of GPS signals [4].
This paper [5] presents a multi-tier UAV localization module that leverages GNSS, inertial, and
visual-depth data. The module is designed to enhance the localization accuracy and reliability of
UAVs in challenging environments.
The study proposes a novel integration scheme combining GNSS, INS, and LiDAR data for UAV-
based navigation, particularly in areas where GNSS signals are obstructed or unreliable [6]. The
integration aims to ensure continuous and accurate navigation. This research [7] introduces a low-
cost solution for UAV navigation in GPS-denied environments. The solution focuses on the use of
alternative sensors and methods to maintain accurate positioning and navigation without GPS
signals. The article [8] discusses a multilevel architecture for autonomous UAVs, aiming to improve
their operational capabilities and autonomy. The architecture addresses various challenges in UAV
operation, including decision-making and environmental interaction.
It is proposed to compensate for the work of satellite navigation due to the introduction of
additional optical navigation sensors. Sensors are integrated using the Kalman filter and its
modifications, for example [9, 10].
Much attention is paid to the application of AI in navigation systems: in [11] for visual odometry, and in [12] for the optimization of sensor data integration. The method of determining the presence of GPS navigation obstacles is described in [13].
This research [11] focuses on a deep learning-based LSTM model to enhance visual odometry
navigation systems. The model aims to increase the accuracy and robustness of UAV navigation
using advanced machine learning techniques.
The article [12] discusses adaptive step size learning with applications to velocity-aided inertial
navigation systems. The approach aims to optimize the performance of inertial navigation systems
through adaptive learning techniques.
This study [13] introduces a deep-ensemble-learning-based GPS spoofing detection method for
cellular-connected UAVs. The method aims to enhance the security and reliability of UAV navigation
by detecting and mitigating GPS spoofing attacks.
The works of Ukrainian authors [14–18] and the authors' own earlier contributions [2, 19–25] are also worthy of attention.
3. The method of determining the position of the aircraft
3.1. Preliminary indications and assumptions
It is assumed that the flight mission can be performed in the area where relevant data has been
previously collected. This requirement is put forward due to the fact that a set of descriptors of
special landmark points, together with their associated coordinates, must be stored on board the
UAV during the flight. So, the first step should be to conduct aerial photography and link the images
to the digital map.
Linking pictures to a digital map is a separate task [1]; in the further explanation we will assume that such an operation has been performed, i.e., coordinates are matched to each pixel of the digital picture.
The second stage is the direct selection of special points and the determination of their descriptors; it is described in detail in [2, 19, 20]. Given the potentially large number of features (landmarks) in the aerial survey data, special attention should be paid to reducing their number by downscaling the original image with smoothing.
Thus, pre-flight preparation should ensure the availability of a text (or special format) file with descriptors of special points and their corresponding coordinates on the on-board computer during the flight mission:
$Orient = \left\{ \left\{ m^{(n)}_{i,j}, \theta^{(n)}_{i,j};\ i=\overline{-3,3},\ j=\overline{-3,3} \right\}, x_n, y_n;\ n=\overline{1,N} \right\}$,
where $m^{(n)}_{i,j}$ is the magnitude; $\theta^{(n)}_{i,j}$ is the angle; $N$ is the number of special points.
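As an illustration only, the stored Orient data can be thought of as a simple record structure. The following minimal Python sketch assumes a 7×7 grid of magnitudes and angles per special point and a hypothetical plain-text storage layout; the field and function names are not taken from the paper.

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class OrientRecord:
    """One entry of the on-board Orient file: a 7x7 descriptor plus map coordinates."""
    magnitude: np.ndarray  # shape (7, 7), values m[i, j] for i, j = -3..3
    angle: np.ndarray      # shape (7, 7), values theta[i, j] for i, j = -3..3
    x: float               # map coordinate of the special point
    y: float               # map coordinate of the special point

def load_orient(path: str) -> list[OrientRecord]:
    """Hypothetical loader: each text line stores 49 magnitudes, 49 angles, x, y."""
    records = []
    for line in open(path, encoding="utf-8"):
        values = np.array(line.split(), dtype=float)
        records.append(OrientRecord(values[:49].reshape(7, 7),
                                     values[49:98].reshape(7, 7),
                                     float(values[98]), float(values[99])))
    return records
```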
The third stage is the flight mission itself. If it is necessary to position the aircraft, the on-board computer processes the picture received from the target camera in order to find features and compare their descriptors with the available descriptors of the Orient array. The purpose of the comparison is to form a new array containing the list of coordinates corresponding to the matching descriptors:
$\left\{ x_k, y_k;\ k=\overline{1,K} \right\}$,
where $K$ is the number of singular points in the image or, if some distance-metric threshold is introduced, the actual number of descriptor matches.
In the following, we will analyze this set of coordinates as a sample of realizations of some random variable. The purpose of such an analysis is to establish the most likely location of the aircraft, depending on the probability distribution of the data.
3.2. Selection of special points for positioning the aircraft
Let's consider the approach to the selection of special points on the digital image from the camera of
the target load of the UAV directly during the flight mission. After detecting singular points using
the operators discussed in [2, 19, 20], the task of finding singular points that are common to the
reference and test images arises.
Let
$Orient = \left\{ \left\{ m^{(l)}_{i,j}, \theta^{(l)}_{i,j};\ i=\overline{-3,3},\ j=\overline{-3,3} \right\}, x_l, y_l;\ l=\overline{1,L} \right\}$
be a set of descriptors of control singular points with their corresponding coordinates, and
$B = \left\{ \left\{ m^{(k)}_{i,j}, \theta^{(k)}_{i,j};\ i=\overline{-3,3},\ j=\overline{-3,3} \right\}, i\_pix_k, j\_pix_k;\ k=\overline{1,K} \right\}$
a set of descriptors of special points of the test image with the corresponding pixel indices of the digital image,
$i\_pix_k \in [0;\ Height-1]$, $j\_pix_k \in [0;\ Width-1]$,
where $Height$ and $Width$ are the height and width of the picture from the aircraft camera.
If the values of the magnitudes and angles are normalized, then a metric based on the Euclidean distance can be used as a measure of the distance between the descriptors:
$\rho_{k,l} = \sqrt{\rho^{m}_{k,l} + \rho^{\theta}_{k,l}}$,
where
$\rho^{m}_{k,l} = \sum_{i=-3}^{3}\sum_{j=-3}^{3}\left( m^{(k)}_{i,j} - m^{(l)}_{i,j} \right)^2$, $\quad \rho^{\theta}_{k,l} = \sum_{i=-3}^{3}\sum_{j=-3}^{3}\left( \theta^{(k)}_{i,j} - \theta^{(l)}_{i,j} \right)^2$, $\quad k=\overline{1,K}$, $l=\overline{1,L}$.
For each k-th point of the test picture, the nearest l-th point of the Orient array can be found from the condition
$l^{*} = \arg\min_{l=\overline{1,L}} \rho_{k,l}$, (1)
thereby defining a two-dimensional array of coordinates $\{x_k, y_k;\ k=\overline{1,K}\}$ corresponding to the positions of the special points $(i\_pix_k, j\_pix_k)$, $k=\overline{1,K}$, on the current digital picture from the UAV cameras. Therefore, the array
$Coord = \left\{ i\_pix_k, j\_pix_k, x_k, y_k;\ k=\overline{1,K} \right\}$, (2)
can be introduced, in which the indices of special points on the digital terrain image are matched with coordinates from the Orient array.
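A minimal sketch of this matching step, assuming the descriptors are kept as NumPy arrays; the function name, array shapes, and the optional distance threshold are illustrative assumptions rather than the authors' implementation.

```python
import numpy as np

def match_descriptors(test_desc, test_pix, ref_desc, ref_xy, rho_max=None):
    """Match each test descriptor to the nearest reference descriptor (Euclidean metric).

    test_desc: (K, 7, 7, 2) magnitudes/angles of test-image special points
    test_pix:  (K, 2) pixel indices (i_pix, j_pix) of those points
    ref_desc:  (L, 7, 7, 2) magnitudes/angles of the Orient array
    ref_xy:    (L, 2) map coordinates of the Orient points
    Returns a Coord-like array (2): rows of (i_pix, j_pix, x, y).
    """
    coord = []
    for k in range(test_desc.shape[0]):
        # rho_{k,l} = sqrt(rho_m + rho_theta) over the 7x7 neighbourhood
        diff = test_desc[k] - ref_desc                  # (L, 7, 7, 2)
        rho = np.sqrt((diff ** 2).sum(axis=(1, 2, 3)))  # (L,)
        l_star = int(np.argmin(rho))                    # condition (1)
        if rho_max is not None and rho[l_star] > rho_max:
            continue  # discard weak matches if a threshold is used
        coord.append([*test_pix[k], *ref_xy[l_star]])
    return np.array(coord)
```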
3.3. Preliminary determination of the coordinates of the aircraft
Let the coordinate array Coord (2) be obtained as a result of comparing the descriptors of special points on the picture with the data of the Orient array. It is necessary to determine the point $(\hat{x}, \hat{y})$ that contains the most likely coordinates near which the aircraft is located.
There is always a possibility that the obtained points (2), to which coordinates of the Orient array are assigned, do not actually belong to the image of the landmark object in the test picture but are only coincidentally similar to such special points. If the two-dimensional density of the distribution [2] of the coordinates of the Coord array is estimated non-parametrically, then their compact arrangement will determine the coordinates of a separate sought landmark, provided it is represented by a compact set of detector points. In other words, we are talking about searching for areas on the $x0y$ plane where such a density estimate has a local maximum, or one global maximum if there is only one landmark in the field of vision of the aircraft's target load cameras.
As an estimate of the two-dimensional distribution density over the Coord array, we can choose the construction and analysis of a histogram of relative frequencies to localize rectangular areas of the probable location of landmarks; within the specified areas, we can then determine a typical value among $x_k$ and $y_k$, $k=\overline{1,K}$, for example, the central point of the corresponding rectangle.
To construct a two-dimensional histogram of the relative frequencies of the coordinate distribution, the following is proposed. Let some (in general, arbitrary) steps $h_x > 0$, $h_y > 0$ be given. Let us define the partition $D_{i,j}$ of the area of realization of the coordinates $(x_k, y_k)$, $k=\overline{1,K}$, whose upper left corners are determined by the points $(x_{\min} + i h_x,\ y_{\min} + j h_y)$, $i=\overline{0, N_x-1}$, $j=\overline{0, N_y-1}$, where $N_x$, $N_y$ are the numbers of areas in the respective directions:
$N_x = \left\lceil \frac{x_{\max} - x_{\min}}{h_x} \right\rceil$, $\quad N_y = \left\lceil \frac{y_{\max} - y_{\min}}{h_y} \right\rceil$.
The relative frequencies of the distribution, which determine the empirical probability of the appearance of a special point from the Coord array in a specific local partition area $D_{i,j}$, are obtained as follows:
$w_{i,j} = \frac{1}{K} \sum_{k=1}^{K} \delta_k$, $\quad i=\overline{0, N_x-1}$, $j=\overline{0, N_y-1}$,
where
$\delta_k = \begin{cases} 1, & \left\lfloor \frac{x_k - x_{\min}}{h_x} \right\rfloor = i \ \text{and}\ \left\lfloor \frac{y_k - y_{\min}}{h_y} \right\rfloor = j, \\ 0, & \text{otherwise}. \end{cases}$
The partition area $D_{i^*,j^*}$ for which the condition $w_{i^*,j^*} = \max_{i,j} w_{i,j}$ holds will contain the geometric location of the landmark object, and the central point of this area can be taken as the position of the aircraft:
$(\hat{x}, \hat{y}) = \left( x_{\min} + (i^* + 0.5)\, h_x,\ \ y_{\min} + (j^* + 0.5)\, h_y \right)$.
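A sketch of the relative-frequency histogram and the most-likely-cell estimate; np.histogram2d is used here as a convenient stand-in for the partition described above, and the bin widths $h_x$, $h_y$ are assumed to be supplied by the user.

```python
import numpy as np

def position_from_histogram(xy, h_x, h_y):
    """Estimate (x_hat, y_hat) as the centre of the most populated partition cell.

    xy: (K, 2) coordinates x_k, y_k taken from the Coord array.
    """
    x, y = xy[:, 0], xy[:, 1]
    nx = int(np.ceil((x.max() - x.min()) / h_x)) or 1
    ny = int(np.ceil((y.max() - y.min()) / h_y)) or 1
    # relative frequencies w_{i,j}
    w, x_edges, y_edges = np.histogram2d(x, y, bins=[nx, ny])
    w /= len(x)
    i_star, j_star = np.unravel_index(np.argmax(w), w.shape)
    # centre of the winning cell D_{i*, j*}
    x_hat = 0.5 * (x_edges[i_star] + x_edges[i_star + 1])
    y_hat = 0.5 * (y_edges[j_star] + y_edges[j_star + 1])
    return x_hat, y_hat, w.max()
```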
If we assume that the search object on the test image can be located within several areas, then for their localization it is enough to select the areas where the condition
$w_{i,j} \ge w_{threshold}$ (3)
is fulfilled, where $w_{threshold}$ is some threshold value.
Then the position of the aircraft can be chosen as the weighted average of the coordinates of the centers of the rectangles where condition (3) is fulfilled, i.e. $(\hat{x}, \hat{y}) = (\bar{x}, \bar{y})$, where
$\bar{x} = \frac{\sum_{i=0}^{N_x-1}\sum_{j=0}^{N_y-1} x^{c}_{i,j}\, w_{i,j}\, \delta_{i,j}}{\sum_{i=0}^{N_x-1}\sum_{j=0}^{N_y-1} w_{i,j}\, \delta_{i,j}}$, $\quad \bar{y} = \frac{\sum_{i=0}^{N_x-1}\sum_{j=0}^{N_y-1} y^{c}_{i,j}\, w_{i,j}\, \delta_{i,j}}{\sum_{i=0}^{N_x-1}\sum_{j=0}^{N_y-1} w_{i,j}\, \delta_{i,j}}$, (4)
$x^{c}_{i,j}$, $y^{c}_{i,j}$ are the coordinates of the center of the partition area $D_{i,j}$, and
$\delta_{i,j} = \begin{cases} 1, & w_{i,j} \ge w_{threshold}, \\ 0, & w_{i,j} < w_{threshold}. \end{cases}$
The values $w_{i,j}$, $i=\overline{0,N_x-1}$, $j=\overline{0,N_y-1}$, provide the empirical probability of the appearance of landmark detectors in the partition areas $D_{i,j}$; so, if one such region is chosen (the most likely one), then the probability that the found region contains the object is
$P = \max_{i,j} w_{i,j}$,
and if the object occupies several areas, then
$P = \sum_{i=0}^{N_x-1}\sum_{j=0}^{N_y-1} w_{i,j}\, \delta_{i,j}$.
Note that in the latter case the estimate of the location of the aircraft based on the weighted averages (4) is robust, because improbable coordinates are simply excluded from consideration.
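A sketch of the thresholded weighted-average estimate (3)–(4) over the same relative-frequency histogram; the bin widths and the value of $w_{threshold}$ are user-chosen assumptions, as in the text.

```python
import numpy as np

def weighted_position(xy, h_x, h_y, w_threshold):
    """Weighted average of the centres of cells whose relative frequency passes (3)."""
    x, y = xy[:, 0], xy[:, 1]
    nx = int(np.ceil((x.max() - x.min()) / h_x)) or 1
    ny = int(np.ceil((y.max() - y.min()) / h_y)) or 1
    w, xe, ye = np.histogram2d(x, y, bins=[nx, ny])
    w /= len(x)                            # relative frequencies w_{i,j}
    xc = 0.5 * (xe[:-1] + xe[1:])          # cell-centre x coordinates
    yc = 0.5 * (ye[:-1] + ye[1:])          # cell-centre y coordinates
    delta = (w >= w_threshold)             # indicator delta_{i,j} from (3)
    weight = w * delta
    denom = weight.sum()
    if denom == 0:
        return None                        # no cell passes the threshold
    x_bar = (weight * xc[:, None]).sum() / denom   # weighted average (4)
    y_bar = (weight * yc[None, :]).sum() / denom
    p = denom                              # empirical probability that the regions contain the object
    return x_bar, y_bar, p
```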
3.4. Verification of the reliability of the determined position of the aircraft
We formalize the criterion for verifying that the obtained coordinates of the features of the current picture really correspond, in most cases, to the area where the aircraft is located.
For example, if a complete inconsistency is observed for the components of the array (2), then this is evidence that the determined coordinates of the features $(x_k, y_k)$ do not change regularly with the change in the positions $(i\_pix_k, j\_pix_k)$, $k=\overline{1,K}$, of the special points in the picture.
As an estimate of the degree of consistency, it is possible to propose the use of the Spearman or Kendall rank correlation coefficients between the array vectors
$\{x_k, i\_pix_k;\ k=\overline{1,K}\}$, $\{x_k, j\_pix_k;\ k=\overline{1,K}\}$
and
$\{y_k, i\_pix_k;\ k=\overline{1,K}\}$, $\{y_k, j\_pix_k;\ k=\overline{1,K}\}$. (5)
Let the corresponding rank arrays $\{R_k, S_k;\ k=\overline{1,K}\}$ be obtained for any of the specified arrays (5), where $R_k$, $S_k$ are ranks, i.e. the serial numbers of the variants in the variation series by $x_k$ or $y_k$, $k=\overline{1,K}$, and by $i\_pix_k$ or $j\_pix_k$, $k=\overline{1,K}$, respectively.
The value of Spearman's rank correlation coefficient $\hat{\rho}_S$ is calculated according to the formula
$\hat{\rho}_S = 1 - \frac{6}{K(K^2-1)} \sum_{k=1}^{K} d_k^2$, where $d_k = R_k - S_k$.
The significance of $\hat{\rho}_S$ is determined on the basis of the hypothesis $H_0: \rho_S = 0$, for the verification of which the statistic
$t = \frac{\hat{\rho}_S \sqrt{K-2}}{\sqrt{1-\hat{\rho}_S^2}}$
is used; it has a t-distribution with $\nu = K-2$ degrees of freedom. If
$|t| \le t_{1-\alpha/2,\nu}$, (6)
where $t_{1-\alpha/2,\nu}$ is the quantile of the t-distribution and $\alpha$ is the probability of an error of the first kind (the level of significance), then the hypothesis $H_0$ is accepted: there is no consistency between the determined coordinates of the special points and their location on the digital image, and the obtained coordinates of the location of the UAV should raise doubts.
If, on the contrary, condition (6) is violated for each of the arrays (5), the position of the aircraft $(\hat{x}, \hat{y})$ can be considered statistically justified.
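A sketch of this consistency check, following the Spearman formula and the t-based test (6) above; scipy is used only for ranking and the t-quantile, and the significance level is an assumed default.

```python
import numpy as np
from scipy.stats import rankdata, t as t_dist

def coordinates_consistent(coord_values, pixel_values, alpha=0.05):
    """Return True if the rank correlation between map coordinates and pixel
    positions is statistically significant, i.e. condition (6) is violated."""
    K = len(coord_values)
    R = rankdata(coord_values)
    S = rankdata(pixel_values)
    d = R - S
    rho_s = 1.0 - 6.0 * np.sum(d ** 2) / (K * (K ** 2 - 1))
    if abs(rho_s) >= 1.0:
        return True  # perfect rank agreement: consistency is evident
    t_stat = rho_s * np.sqrt(K - 2) / np.sqrt(1.0 - rho_s ** 2)
    t_crit = t_dist.ppf(1.0 - alpha / 2.0, df=K - 2)
    # |t| <= t_crit means H0 (no consistency) is accepted -> the position is doubtful
    return abs(t_stat) > t_crit
```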
For a more reliable assessment of the consistency of the coordinates of the selected features with
their positions on the digital image, it is advisable to check the corresponding hypotheses for all data
arrays (5), but only for those that fall into the partition areas for which condition (3) is fulfilled. In
this case, it should be expected that random "coincidences" of descriptors will be characteristic of
points that are in areas with a relatively low level of relative frequencies. In further research, it is
worth paying attention to the possibility of evaluating the consistency of elements of arrays (5)
separately for each of the elementary areas of division and, perhaps, taking into account for further
use only those points where such a correlation occurs.
4. Experimental studies
4.1. Algorithm of determining the displacement of the aircraft based on the
analysis of two consecutive images from the target load cameras of the
unmanned aircraft during movement
Let the video from the observation camera, whose field of view is directed vertically downwards, be subject to analysis. For two consecutively received frames, taken at an interval of, for example, 5 seconds, it is necessary to calculate the displacement, namely the real distance covered in the interval between the frames and the direction of movement.
For more accurate calculations, a check of the aircraft's inclination angles (roll and pitch) can be added, and only those frames that were recorded when the aircraft was level should be used. If a frame was captured at excessively large inclination angles, which occur during turns or are caused by weather factors, the displacement calculation can be suspended until the aircraft levels out. At a sufficiently high altitude, or at a low speed of movement, the interval between receiving frames for processing can be increased.
A special factor that can affect the quality of determining the distance traveled by an aircraft is
the case when the image is subjected to various deformations such as fish-eye or distortion.
Distortion mainly occurs when using zoom lenses, and the higher the zoom ratio, the more pronounced it is. In addition, the level of distortion may vary depending on the distance to the object: in some cases a close object may be distorted, while a distant one remains free of deformations.
Thus, only the central area of the image, which is approximately 50-70% of the area of the entire
image, can be recommended to be used to determine special points. The specified approach should
contribute to reducing the calculation error caused by deformations.
Before starting processing, it is also desirable to scale down the received image, which helps to reduce the load on the image-processing microcontroller. For example, reducing the image to 600x480 can be recommended.
Taking into account the comments made about the image received from the cameras of the target
load of the aircraft, it is possible to proceed to its processing in order to determine the displacement.
Let's consider the algorithm (Figure 1) for calculating the displacement of the aircraft in more
detail with a description of each step. Let B1 and B2 be a set of singular point descriptors and
corresponding pixel indices of two consecutive images
$B_1 = \left\{ \left\{ m^{1}_{i,j}, \theta^{1}_{i,j};\ i=\overline{-3,3},\ j=\overline{-3,3} \right\}, i\_pix^{1}_k, j\_pix^{1}_k;\ k=\overline{1,K_1} \right\}$,
$B_2 = \left\{ \left\{ m^{2}_{i,j}, \theta^{2}_{i,j};\ i=\overline{-3,3},\ j=\overline{-3,3} \right\}, i\_pix^{2}_k, j\_pix^{2}_k;\ k=\overline{1,K_2} \right\}$,
and let array $B_1$ be received earlier than $B_2$.
Algorithm. Determining the displacement of the aircraft based on the analysis of two consecutive
images from the target load cameras of the unmanned aircraft during movement.
Step 1. The first step is to get a zoomed image from the camera.
Step 2. We define arrays of special points on the image and their descriptors.
Step 3. We check whether there are points and descriptors for comparison.
Step 4. If there are no points for comparison (possible only when processing the first frame),
remember the points found (see Step 2) and their descriptors as points for comparison. We get a new
image and go to point 1.
Step 5. If there are special points for comparison, we compare their descriptors in order to detect
points that are present in the images.
Step 6. We subtract the pixel indices of pairs of points that coincide on images, thus obtaining a
set of displacements along the width and height of the image for each of the key points.
Step 7. We average the values obtained in Step 6, thereby obtaining an estimate of the average displacement of the local features:
$\Delta i\_pix = \frac{1}{T}\sum_{t=1}^{T}\left( i\_pix^{1}_{t} - i\_pix^{2}_{t} \right)$, $\quad \Delta j\_pix = \frac{1}{T}\sum_{t=1}^{T}\left( j\_pix^{1}_{t} - j\_pix^{2}_{t} \right)$,
where $T$ is the number of coincidences of singular points in the two images.
Step 8. Based on the relation between the current flight altitude, the camera's angular field of view, and the image resolution, we calculate the ground distance corresponding to 1 pixel of the image (to reduce the number of calculations, we check for a change in altitude, and if the altitude has not changed or has changed insignificantly, we keep the previous value of the distance).
Step 9. Multiplying the values $\Delta i\_pix$ and $\Delta j\_pix$ (see Step 7) by the distance per pixel (see Step 8), we obtain an estimate of the average displacement of the camera relative to the earth's surface for the time that has passed between the moments of receiving the frames.
A conclusion about the direction of movement of the unmanned aerial vehicle can be made based on the signs of the obtained estimates $\Delta i\_pix$ and $\Delta j\_pix$.
Step 10. We save the found (see Step 2) points and their descriptors as points for comparison. Let's
go to Step 1.
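A compact sketch of Steps 6–9 of the algorithm, assuming the two sets of matched pixel indices are already available from the descriptor comparison of Step 5; the ground distance per pixel (metres_per_pixel) is supplied externally, e.g. from the altitude and camera parameters of Step 8, and the heading angle is an illustrative addition (the paper infers direction from the signs of the estimates).

```python
import numpy as np

def estimate_displacement(pix1, pix2, metres_per_pixel):
    """Average displacement of matched points between two consecutive frames.

    pix1, pix2: (T, 2) pixel indices (i_pix, j_pix) of the T matched special points
                in the earlier and the later frame, respectively.
    """
    shifts = pix1 - pix2                    # Step 6: per-point displacement in pixels
    di_pix, dj_pix = shifts.mean(axis=0)    # Step 7: average pixel displacement
    di_m = di_pix * metres_per_pixel        # Step 9: ground displacement along image rows
    dj_m = dj_pix * metres_per_pixel        # Step 9: ground displacement along image columns
    heading = np.degrees(np.arctan2(dj_m, di_m))  # illustrative direction of movement
    return di_m, dj_m, heading
```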
Figure 1: Block diagram of the displacement calculation algorithm.
Thus, using the given algorithm, it is possible not only to estimate the distance the aircraft has moved,
but also to obtain the direction of such movement. Therefore, by fixing these indicators during the
flight task, it is possible to obtain an estimate of the location with reference to the coordinates.
However, the latter is possible only after the orientation of the initial image has been performed. In
addition, the given algorithm can acquire refinements for the case of performing various types of
maneuvers by an aircraft, so practical studies of this issue go beyond the scope of the purely
theoretical material of the current section.
4.2. Simulation testing of the technology of navigation of an aircraft using an
optical channel
4.2.1. Flight simulation on test images
The essence of the simulation is to replace the input video stream: instead of a real flight, test images
are used with dimensions that significantly exceed the resolution of the video camera.
Accordingly, the real flight is replaced by a sliding window (simulation of the field of vision of
the UAV camera [1]) according to the specified test picture, and the movement of the UAV is replaced
by the movement of the sliding window in the corresponding direction. Figure 2 shows an example
of a test picture used for simulation testing. In order to test the proposed approach, appropriate
software was developed [18]. The main window is on Figure 3.
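A minimal sketch of the simulation idea: frames of the virtual camera are cut out of the large test image by a sliding window, and UAV motion is emulated by moving that window. The window size, step, and function names are assumptions for illustration, not the parameters of the developed software [18].

```python
import numpy as np

def camera_frame(test_image, cx, cy, width=640, height=480):
    """Return the sliding-window crop of the large test image that emulates
    the field of view of the UAV camera centred at pixel (cx, cy)."""
    h, w = test_image.shape[:2]
    x0 = int(np.clip(cx - width // 2, 0, w - width))
    y0 = int(np.clip(cy - height // 2, 0, h - height))
    return test_image[y0:y0 + height, x0:x0 + width]

def simulate_flight(test_image, path):
    """Emulate UAV movement: yield one 'camera' frame per waypoint of the path."""
    for cx, cy in path:
        yield camera_frame(test_image, cx, cy)
```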
Figure 2: An example of an aerial photograph of an area that was used for testing. The size of the photograph is 8176×6132 pixels.
Figure 3: The main window.
In the selected zone 1 of Figure 3 there are parameters for the filtering method of special points.
To set the signal loss point, press "Set GPS signal loss point", and to set the base - "Set base". After
that, we get the result, which can be seen in Figure 4.
After that, if the user presses the "Next step" button, the process of positioning the drone and returning it to the base, as described above, is started. Namely, the drone first searches for its position (Figure 5), after which it flies straight to the base (Figure 6).
Figure 4: Setting the base and point of loss of the satellite signal.
Figure 5: Simulating the flight of the UAV along the Archimedean spiral to find the position.
Figure 6: Simulation of the UAV flight to the base.
4.2.2. Results of simulation testing
The results of the first test can be seen in Figures 5 and 6. They show that the drone will return to the base if there is enough information on special points from the map and a sufficiently clear picture from the drone itself. It can be seen that, due to the small size of the UAV camera frame, it took a long time to find the position, as the drone flew in a spiral for a long time.
When the size of the image from the camera increases, the speed of finding increases, and the
possibility of correcting the location increases, as can be seen in zone 1 of Figure 7.
Figure 7: Testing in the case of a UAV flight to the base with an enlarged image from the camera.
When the drone flies for a long time without determining the position during the flight, the
position search takes place again (Figure 8). It can be seen that the developed algorithm works
correctly, but position search is required.
Figure 8: Testing in the case of a long journey without determining the position during the flight.
Testing was also conducted on different images with different image quality (Figure 9). It has
been confirmed that the method works on different images, and although it depends on the quality,
it can handle low-quality images.
Figure 9: Testing on lower quality images.
Thus, as a result of performing a simulation test of the method proposed in this section, the
expected results were obtained, which confirms its correctness.
5. Conclusions
Mathematical support for the navigation of an unmanned aircraft along an optical channel must take
into account the objectively existing limitations associated with the hardware capabilities of the on-
board computer and a number of assumptions. The formal stages of determining the position of the
aircraft and the conditions for their implementation have been formulated.
The procedure for determining the coordinates of special points during a flight mission and
preparing data for further analysis in order to determine the position of the aircraft is presented in
detail.
For the first time, a method of preliminary determination of the coordinates of the aircraft based
on the non-parametric estimation of the two-dimensional density function of the distribution of the
coordinates of special points is proposed.
A method of verifying the reliability of the previously determined position of the aircraft based
on the evaluation of the consistency coefficients of the determined coordinates of special points of
the image and their actual position on the image is proposed.
Approaches to solving the problem of estimating the location of an unmanned aircraft in cases
where it is not possible to reliably determine the coordinates based on the proposed method are
defined. In particular, the procedure for recording changes in the position of the aircraft based on
the analysis of consecutive frames of aerial photography has been formalized.
The software was developed to perform a simulation test of the proposed aircraft navigation
technology using an optical channel. A description of the structure of the developed software is
given.
Simulation testing was performed using the developed software. Correctness of operation of the
proposed technology on different images and in different situations has been confirmed.
Taking into account the confirmed correctness of the proposed method and the proven
technology of the automatic control of the aircraft, further research should be directed to their
unification and the creation of a suitable unmanned aircraft with the possibility of autonomous
navigation along the optical channel.
Declaration on Generative AI
The author(s) have not employed any Generative AI tools.
References
[1] I. Yurchuk, O. Piskunov, P. Prystavka, Information technology of the aerial photo materials
spatial overlay on the raster maps. In: T. Shmelova, Y. Sikirda, N. Rizun, D. Kucherov (Eds.),
Cases on Modern Computer Systems in Aviation, IGI Global, Pennsylvania, 2019, pp.191–201.
doi: 10.4018/978-1-5225-7588-7.ch007.
[2] P.O. Prystavka, O.G. Cholyshkina, Polynomial Splines in Alternative Navigation Problems
Based on Aerial Survey Data: Monograph, Interregional Academy of Personnel Management,
Kyiv, 2022.
[3] P.O. Prystavka, Polynomial Splines in Data Processing, DNU, Dnipro, 2004.
[4] E. Gallo, A. Barrientos, Long-distance GNSS-denied visual inertial navigation for autonomous
fixed-wing unmanned air vehicles: SO(3) manifold filter based on virtual vision sensor,
Aerospace, 10(8):708 (2023). doi: 10.3390/aerospace10080708.
[5] A. Antonopoulos, M.G. Lagoudakis, P. Partsinevelos, A ROS multi-tier UAV localization module
based on GNSS, inertial and visual-depth data, Drones 6(6):135 (2022). doi:
10.3390/drones6060135.
[6] A. Elamin, N. Abdelaziz, A. El-Rabbany, A GNSS/INS/LiDAR integration scheme for UAV-based
navigation in GNSS-challenging environments, Sensors 22(24):9908 (2022). doi:
10.3390/s22249908.
[7] S. Ashraf, P. Aggarwal, P. Damacharla, H. Wang, A.Y. Javaid, V. Devabhaktuni, A low-cost
solution for unmanned aerial vehicle navigation in a global positioning system-denied
environment, International Journal of Distributed Sensor Networks 14(6) (2018). doi:
10.1177/1550147718781750.
[8] L. Bigazzi, M. Basso, E. Boni, G. Innocenti, M. Pieraccini, A multilevel architecture for
autonomous UAVs, Drones 5(3):55 (2021). doi: 10.3390/drones5030055.
[9] Y. D. Lee, L. W. Kim, H. K. Lee, A tightly-coupled compressed-state constraint Kalman filter for integrated visual-inertial-Global Navigation Satellite System navigation in GNSS-degraded environments, IET Radar, Sonar & Navigation 16(8) (2022) 1344–1363. doi: 10.1049/rsn2.12265.
[10] Y. Yang, X. Liu, W. Zhang, X. Liu, Y. Guo, A nonlinear double model for multisensor-integrated
navigation using the federated EKF algorithm for small UAVs, Sensors 20(10):2974 (2020). doi:
10.3390/s20102974.
[11] A. A. Deraz, O. Badawy, M. A. Elhosseini, M. Mostafa, H. A. Ali, A.I. Desouky, Deep learning
based on LSTM model for enhanced visual odometry navigation system, Ain Shams Engineering
Journal 14(8) (2023). doi: 10.1016/j.asej.2022.102050.
[12] B. Or, I. Klein, Adaptive step size learning with applications to velocity aided inertial navigation
system, IEEE Access 10 (2022) 85818–85830. doi: 10.1109/ACCESS.2022.3198672.
[13] Y. Dang, C. Benzaïd, B. Yang, T. Taleb, Y. Shen, Deep-ensemble-learning-based GPS spoofing
detection for cellular-connected UAVs, IEEE Internet of Things Journal 9(24) (2022) 25068–
25085. doi: 10.1109/JIOT.2022.3195320.
[14] A. Iatsyshyn, et al., Application of open and specialized geoinformation systems for computer
modelling studying by students and PhD students, CEUR Workshop Proceedings 2732 (2020)
893–908. URL: https://ceur-ws.org/Vol-2732/20200893.pdf.
[15] T. Hubanova, R. Shchokin, O. Hubanov, V. Antonov, P. Slobodianiuk, S. Podolyaka, Information
technologies in improving crime prevention mechanisms in the border regions of southern
Ukraine, Journal of Information Technology Management 13 (2021) 75-90.
[16] V. Kortunov, I. Dybska, G. Proskura, A. Kravchuk, Integrated mini INS based on MEMS sensors
for UAV control, IEEE Aerospace and Electronic Systems Magazine 24 (1) (2009) 41–43. doi:
10.1109/MAES.2009.4772754.
[17] F. M. Zakharin, S. A. Ponomarenko, Unmanned Aerial Vehicle integrated navigation complex
with adaptive tuning, in: Proceedings of 4th International Conference Actual Problems of
Unmanned Aerial Vehicles Developments (APUAVD), IEEE, Kyiv, Ukraine, 2017, pp. 23–26, doi:
10.1109/APUAVD.2017.8308768.
[18] S.O. Ponomarenko, F.M. Zakharin, Features of modeling the process of complex processing of
navigation information in on-board complexes of unmanned aerial vehicles, in: Proceedings of
XIX International Conference Dynamical System Modeling And Stability Investigation, Kyiv,
Ukraine, 2019, pp. 189–192.
[19] P. Prystavka, O. Cholyshkina, T. Sorokopud, Experimental study of distributions differential invariants based on spline image models, CEUR Workshop Proceedings 3530 (2022) 163–172.
URL: https://ceur-ws.org/Vol-3530/paper16.pdf.
[20] P. Prystavka, O. Cholyshkina, Comparative analysis of differential invariants based on the spline
model for various image distortion, Advanced Information System 4(4) (2020) 70–76.
[21] P. Prystavka, A.V. Chirkov, V.I. Sorokopud, D.V. Zdota, Simulation testing of the information
technology of aircraft navigation by optical channel, Science-Based Technologies 46(3) (2020)
370–377. doi: 10.18372/2310-5461.47.14935.
[22] Y. Averyanova, et al., UAS cyber security hazards analysis and approach to qualitative
assessment, In: S. Shukla, A. Unal, J. Varghese Kureethara, D.K. Mishra, D.S. Han (Eds.), Data
science and security, volume 290 of Lecture Notes in Networks and Systems, Springer,
Singapore, 2021, pp. 258–265. doi: 10.1007/978-981-16-4486-3_28.
[23] N. S. Kuzmenko, I. V. Ostroumov, K. Marais, An accuracy and availability estimation of aircraft
positioning by navigational aids, in: Proceedings of 5th International Conference on Methods
and Systems of Navigation and Motion Control (MSNMC), IEEE, Kiev, Ukraine, 2018, pp. 36–40.
doi: 10.1109/MSNMC.2018.8576276.
[24] V. Kharchenko, I. Chyrka, Detection of airplanes on the ground using YOLO neural network,
in: Proceedings of 17th International Conference on Mathematical Methods in Electromagnetic
Theory (MMET), IEEE, Kyiv, Ukraine, 2018, pp. 294–297. doi: 10.1109/MMET.2018.8460392.
[25] R. S. Odarchenko, S. O. Gnatyuk, T. O. Zhmurko, O. P. Tkalich, Improved method of routing in
UAV network, in: Proceedings of International Conference Actual Problems of Unmanned
Aerial Vehicles Developments (APUAVD), IEEE, Kyiv, Ukraine, 2015, pp. 294–297. doi:
10.1109/APUAVD.2015.7346624.