<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta>
    </journal-meta>
    <article-meta>
      <contrib-group>
        <aff id="aff0">
          <label>0</label>
          <institution>Interregional Academy of Personnel Management</institution>
          ,
          <addr-line>Frometivska str. 2, 03039, Kyiv</addr-line>
          ,
          <country country="UA">Ukraine</country>
        </aff>
        <aff id="aff2">
          <label>2</label>
          <institution>Plant Breeding and Genetics Institute - National Center of Seed and Cultivar Investigation</institution>
          ,
          <addr-line>Ovidiopolska Road 3, 65036, Odesa</addr-line>
          ,
          <country country="UA">Ukraine</country>
        </aff>
        <aff id="aff3">
          <label>3</label>
          <institution>State University Kyiv Aviation Institute</institution>
          ,
          <addr-line>Lyubomyr Huzar Avenue 1, 03058, Kyiv</addr-line>
          ,
          <country country="UA">Ukraine</country>
        </aff>
      </contrib-group>
      <volume>000</volume>
      <fpage>0</fpage>
      <lpage>0002</lpage>
      <abstract>
        <p>In this paper, we developed an ML-based algorithm for the classification of plant state and decision-making about effective responses. The algorithm aims to detect plants that are suffering from a lack of nutrition or are damaged by diseases. The algorithm also provides UAV flight mapping to match the field points with classes of plant states. Real pictures of Triticum aestivum under normal conditions, under a lack of nutrition, and damaged by diseases were used as the training samples for automatic visual diagnostics of plant state. Cyclic training was performed in three key scenarios, and the accuracy of the simulation experiments was compared. The results of the analysis demonstrated that the long-term experiment with 1000 epochs provides the highest accuracy. Accuracy rises most rapidly within about epochs 20-200; after the 200th epoch, the accuracy curve plateaus at around 60-62% with fluctuations of ±1%. The results of the research also demonstrate the potential of integrating modern technologies, including UAVs, artificial intelligence (AI), and machine learning, into the agricultural sector, showing the advantages of the considered technologies for automatic monitoring and diagnostics of plants and a quick response with treatment actions.</p>
      </abstract>
      <kwd-group>
        <kwd>machine learning</kwd>
        <kwd>object classification</kwd>
        <kwd>UAV</kwd>
        <kwd>crop monitoring</kwd>
        <kwd>UAV-based mapping</kwd>
        <kwd>agriculture</kwd>
        <kwd>wheat</kwd>
        <kwd>plant</kwd>
        <kwd>Triticum aestivum</kwd>
        <kwd>air navigation</kwd>
      </kwd-group>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>1. Introduction</title>
      <p>
        Crop production is a key component of agriculture in Ukraine, and grain production is a branch
of high importance within it. Winter bread wheat (Triticum aestivum) ranks
first among grain crops because of its productivity value and crop capacity [
        <xref ref-type="bibr" rid="ref1">1</xref>
        ]. The crop area
of winter bread wheat varies from 6.4 to 7.3 million hectares and accounts for two-fifths
of the total crop area.
      </p>
      <p>
        Nowadays, under the conditions of intensive agricultural production, the factors that most significantly
restrict the increase in yield capacity and gross crop yield are plant diseases caused by various
pathogens [
        <xref ref-type="bibr" rid="ref2 ref3">2,3</xref>
        ]. The share of plants affected by diseases varies from 15% to 70%, depending on pathogen
virulence and the conditions of the year. This is equivalent to the cost of grain from an area of 1 million
hectares. The volume of affected plants can be even larger during an epiphytotic period.
      </p>
      <p>
        The negative influence of the environment reduces the effectiveness of mineral nutrition [
        <xref ref-type="bibr" rid="ref4">4</xref>
        ].
Such influences include high or low temperatures, soil acidity or alkalinity, an imbalance in nutrition,
moisture deficit, overwetting, diseases, damage to plants, pesticides, and many others. A decrease
in the effectiveness of mineral nutrition leads to a deterioration in
the availability of nutrients in the soil for plants,
in the actively absorbing surface and the capacity of the root system to absorb nutrients,
and in chloroplast activity.
      </p>
      <p>The lack of each of the nutrient elements affects the appearance of the plants.</p>
      <p>Sometimes it can be quite difficult to determine which element is in short supply.
Chemical analysis of the soil or cell juice is a rather expensive procedure. Moreover, finding a
high-quality specialized laboratory can also be difficult, and the analysis takes more than one day.
Therefore, the method of visual diagnostics remains relevant. For a correct visual diagnosis, it is
necessary to inspect the plants in the field and decide on the cause of the visible
damage to the plant. The causes can be insects, diseases, mechanical influence, including hail or
machinery injury, and weather phenomena such as drought or cold. Most of the damage caused
by these factors is characterized by massiveness, whereas plants that suffer from a deficit
of nutrients are, as a rule, located in groups, as "foci" or "spots" in the field.</p>
      <p>A deficit of nutrients most often reveals itself through characteristic symptoms, although
exceptions occur. Visually, it can be seen not only as symptoms characteristic of a certain type of starvation,
for example, necrosis on the leaf or a change in the coloration of certain organs, but also as a change in
the general plant appearance. Early detection and a timely response to the problem can save the
crop.</p>
      <p>The reviewed problems motivated us to consider modern technical
solutions for obtaining reliable and timely information about the plant state, as well as for organizing
an operative, relevant response.</p>
      <p>
        Unmanned aerial vehicles (UAVs) have demonstrated a promising ability to perform many
tasks in agriculture [
        <xref ref-type="bibr" rid="ref5 ref6">5,6</xref>
        ]. The advantages of UAVs in agriculture stem from their
low cost and high efficiency, their ability to carry a variety of sensors and instruments, their
reliability, their ability to perform tasks operatively and in regions that are difficult to access, their accuracy,
their environmental friendliness, and the time they save [
        <xref ref-type="bibr" rid="ref7">7</xref>
        ]. This is not a complete list of advantages.
The integration of UAV-based technologies with artificial intelligence (AI) and machine learning
opens even wider possibilities for finding solutions for precise and smart agriculture.
      </p>
    </sec>
    <sec id="sec-2">
      <title>2. Motivation</title>
      <p>Farmers' demand for timely information about the state of plants, needed for a relevant
response to maintain the required plant conditions or to save the crop, calls for reliable and
cost-effective solutions for plant parameter monitoring. The suitability of UAVs for performing different
agricultural missions inspires us to consider using UAVs as a multifunctional tool
for plant monitoring and immediate relevant response. In this paper, we develop an algorithm
based on machine learning (ML) for automatic monitoring of the condition of crops to determine
critical areas and give a quick response to the identified problems.</p>
    </sec>
    <sec id="sec-3">
      <title>3. The latest research analysis</title>
      <p>
        Climate change and population growth are the most evident challenges to modern
agriculture that require searching for and studying novel solutions for the sustainable development
of this sector. Several reviews study the application of UAVs for agricultural
purposes. In the paper [
        <xref ref-type="bibr" rid="ref8">8</xref>
        ], the UAV application for monitoring parameters of forage crops is
reviewed. A review of different types of UAVs and their applications is made in the paper [
        <xref ref-type="bibr" rid="ref9">9</xref>
        ]. UAV-based remote sensing technologies in combination with machine learning are presented in
the paper [
        <xref ref-type="bibr" rid="ref10">10</xref>
        ]. That paper also considers the advantages of AI algorithms for the estimation of
plant needs and for making production forecasts. The integration of UAVs, AI, and the Internet of
Things (IoT) for precision agriculture is discussed in [
        <xref ref-type="bibr" rid="ref11">11</xref>
        ]. In [
        <xref ref-type="bibr" rid="ref12">12</xref>
        ] the decision-support system for
agriculture works planning using UAVs as a multifunctional tool is presented and discussed.
Paper [13] compares research devoted to the integration of UAVs and AI. That
paper also lists problems for future work, divided into categories
connected with the physical part of unmanned aircraft systems (UAS), that is, the UAV itself; for example,
battery life must be sufficient to complete the mission successfully. Another class of issues for future
research is connected with the cyber domain and includes the adaptation of
learning algorithms to randomly varying environments, data fusion from different sensors and
processing algorithms, the development of energy-efficient AI models, benchmark datasets,
standardization, and others. Paper [14] provides a summary of plant diseases and techniques of
pest monitoring with deep learning (DL) for classification and prediction. The present limitations of
large language models are outlined in that paper as an insufficiency of data for correct
and reliable outputs, and suggestions on the combined use of AI models are made in the
paper as well. The latest research analysis proves that utilizing new technologies for
the automation and productivity enhancement of modern and future agriculture is a topical task [15].
That, in turn, requires further study of the potential of integrating UAV-based technologies
with artificial intelligence (AI) and machine learning to develop smart agriculture.
      </p>
    </sec>
    <sec id="sec-4">
      <title>4. Learning algorithm of plant state classification and UAV flight routing</title>
      <p>To automatically monitor the condition of plants and crops using an unmanned aerial vehicle,
an intelligent data processing pipeline was built. The process begins with the continuous collection
of high-quality RGB images from a camera mounted on the UAV. The end of the process is the
construction of a detailed map of the plant condition and the formation of a flight route for the
targeted application of fertilizers or protective agents.</p>
      <p>During preparation, the footage is automatically converted to a common color space. The images
are then scaled to a size of 50×50 pixels, and each image is flattened into a vector
of 7,500 elements (50 × 50 pixels × 3 color channels). These vectors form the input features for the Random Forest
classifier [16], which is trained in a loop, with the accuracy rate recorded on a control sample after each
iteration.</p>
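<p>A minimal sketch of this preprocessing and training step is shown below. The paper names a Random Forest classifier and the 7,500-element feature vectors; scikit-learn's implementation, the synthetic tiles standing in for UAV footage, and all parameter values are assumptions for illustration:</p>

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Hypothetical stand-in for the UAV footage: 100 synthetic 50x50 RGB tiles.
rng = np.random.default_rng(0)
images = rng.random((100, 50, 50, 3))

# Flatten each tile into a 7,500-element feature vector (50 * 50 * 3).
X = images.reshape(len(images), -1)
assert X.shape[1] == 7500

# Conditional markup used at the preliminary stage:
# 0 = healthy, 1 = needs fertilizer, 2 = needs chemical treatment.
y = rng.integers(0, 3, size=len(images))

# Single 80/20 split performed once before the training cycle.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42)

# Fit the Random Forest and record accuracy on the held-out control sample.
clf = RandomForestClassifier(n_estimators=100, random_state=42)
clf.fit(X_train, y_train)
accuracy = clf.score(X_test, y_test)
```

With random labels the measured accuracy is only a pipeline check; real footage and expert markup would replace the synthetic arrays.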
      <p>The order of the automatic monitoring and flight route formation can be represented as the
algorithm in Figure 1.</p>
      <p>The algorithm can be represented as the following steps:
1. Start.
2. Formation of a learning base. For this purpose, pictures of fields covered with plants
are uploaded. The pictures include samples of plants with typical reactions (features) to
diseases and samples of plants' reactions to a nutrition deficit.
3. Preliminary processing of the uploaded materials.
4. Programming of the flight route, after which the UAV makes a flight and records the
coordinates and pictures. This forms the database.
5. Processing of the images.
6. Training of the Random Forest model based on the results of the initial data
processing.
7. Mapping of the areas of plants under different states as high-quality RGB images.
The color encodes information about the plant's state.
8. Storage of the data taken from the UAV in Excel.
9. Formation of the accuracy diagram as the final step of the algorithm.</p>
      <p>At the preliminary stage, class labels are assigned randomly to test the pipeline's performance.
The labels are 0 for healthy plants, 1 for plants that need fertilizer, and 2 for plants that need
chemical treatment. This can be seen in detail in Figure 2. In the future, we plan to replace this
conditional markup with expert or semi-automatic markup. The data is split into training (80%) and
test (20%) samples once, before the training cycle.</p>
      <p>Figure 2 shows the 10×10 grid that represents the field; the horizontal and vertical axes
are the normalized field dimensions. The color scale on the right represents the color
indications of the plant state. The value 0, shown in green, corresponds to good plant
condition; in this case, no action is required. The value 1, shown in yellow,
corresponds to plants with a lack of nutrition; in this case, fertilization is
required. The value 2, shown in red, marks plants damaged by disease; in this case,
chemical treatment is required.</p>
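<p>The mapping from class codes to field actions can be sketched as follows. The 10×10 matrix here is randomly generated for illustration only; in the pipeline it would come from the classifier's predictions:</p>

```python
import numpy as np

# Hypothetical 10x10 field-state matrix using the class codes from Figure 2:
# 0 = healthy (green), 1 = lacks nutrition (yellow), 2 = diseased (red).
rng = np.random.default_rng(1)
state = rng.integers(0, 3, size=(10, 10))

# Map each class code to the response described in the text.
actions = {0: "no action", 1: "apply fertilizer", 2: "apply chemical treatment"}

# List only the cells that require intervention (state other than 0).
treatment_plan = {(int(r), int(c)): actions[int(state[r, c])]
                  for r, c in zip(*np.nonzero(state))}
```

The resulting dictionary pairs normalized field coordinates with the required treatment, which is exactly the information the flight-route planner needs.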
      <p>The flowchart in Figure 1 consists of the following blocks: Start; Formation of learning base;
Preliminary processing; Data recording (point coordinates) to the database; Image data processing;
Random Forest model training; Drone flight map construction; Storage of the data taken from the
UAV to Excel; Accuracy diagram formation; End.</p>
    </sec>
    <sec id="sec-5">
      <title>5. Simulation and analysis</title>
      <p>Cyclic training was performed in three key scenarios: at 10, 100, and 1000 epochs. During the first
phase (epochs 1-3), the pipeline's classification accuracy fluctuated in the range of
40-65%. This can be seen in Figure 3, which indicates that the number of iterations was insufficient to
extract stable features. By the fifth iteration of the 10-epoch scenario, the accuracy leveled off
around 40%, while in the scenarios with 100 and 1000 epochs, a gradual increase to 60-65% was
observed.</p>
      <p>The image of the learning testing scenario for 100 epochs is shown in Figure 4.</p>
      <p>From Figure 4, it can be seen that in the scenario with 100 epochs, the accuracy graph
shows initial instability (accuracy values fluctuate between 45% and 60% within the first 10-15
iterations), after which a smooth leveling off occurs: by the 30th epoch, the average accuracy
rises to ≈53%, and in the range of 30-100 epochs, it stays within 53-57% with a gradually
decreasing fluctuation amplitude. This indicates that the model reaches its maximum ability to
generalize features with this number of training cycles and that a further increase in epochs does
not yield a noticeable increase in quality.</p>
      <p>The Image of the learning testing scenario for 1000 epochs is shown in Figure 5.</p>
      <p>As can be seen from Figure 5, in the long-term experiment with 1000 epochs, three
qualitatively different intervals of accuracy development are visible. In the initial period (the first
20 epochs), the accuracy rises from ≈40% to ≈45% with high variability. Then, within about epochs 20-200,
there is a more rapid, nearly linear increase, up to ≈65% average accuracy, with a gradual
decrease in the amplitude of fluctuations. After the 200th epoch, the accuracy curve reaches a
plateau: the values remain around 60-62% with fluctuations of ±1%, which indicates that the
information in the initial sample has been exhausted and the training has begun to saturate.
Further training increases the time and computational costs without any significant performance
improvement.</p>
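<p>The saturation criterion described above can be checked numerically. The accuracy history below is synthetic, shaped only to mimic the reported rise-then-plateau behavior; the window size and gain threshold are illustrative choices, not values from the experiment:</p>

```python
import numpy as np

# Synthetic per-epoch accuracy history mimicking the 1000-epoch run:
# a noisy rise over the first ~200 epochs, then a plateau near 61 %.
rng = np.random.default_rng(3)
epochs = np.arange(1, 1001)
trend = np.minimum(0.40 + 0.21 * epochs / 200.0, 0.61)
accuracy = trend + rng.normal(0.0, 0.01, size=epochs.size)

# Saturation check: if the mean accuracy of the last window barely exceeds
# the mean of the window before it, further training adds little.
window, threshold = 100, 0.01
recent = accuracy[-window:].mean()
previous = accuracy[-2 * window:-window].mean()
gain = recent - previous
saturated = bool(threshold > gain)
```

A monitor like this could stop the training loop automatically instead of running a fixed 1000 epochs.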
      <p>After each epoch, a 10×10 field state matrix is automatically generated, where each element
contains the predicted class of the plot. At the beginning of training (5th epoch), the green class
('healthy') occupied about 40% of the cells, the yellow class ('need fertilizer') 35%, and the red
class ('need chemistry') 25%. By the 50th epoch, the proportion of green cells increased to ≈60%,
while yellow cells decreased to ≈25% and red cells to ≈15%. In the final maps, after 1000 epochs, the ratio
stabilized at around 65% / 25% / 10%. Each of these matrices was exported to an Excel file named
status_map_epoch_&lt;k&gt;.xlsx, which allowed for a detailed analysis of the spatial distribution of
states at different points in the training.</p>
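<p>A sketch of this per-epoch map export is given below. The matrix is randomly generated as a stand-in for the classifier's predictions, and CSV is used as a dependency-free stand-in for the status_map_epoch_&lt;k&gt;.xlsx files named in the text:</p>

```python
import csv
import numpy as np

# Hypothetical predicted 10x10 field-state matrix for one epoch
# (0 = healthy, 1 = need fertilizer, 2 = need chemistry).
rng = np.random.default_rng(7)
status_map = rng.integers(0, 3, size=(10, 10))

# Share of cells per class: the green/yellow/red ratio reported in the text.
counts = np.bincount(status_map.ravel(), minlength=3)
proportions = counts / status_map.size

# Export the matrix for later spatial analysis; in the paper this is an
# Excel file, here plain CSV so the sketch needs no extra libraries.
epoch = 5
with open(f"status_map_epoch_{epoch}.csv", "w", newline="") as f:
    csv.writer(f).writerows(status_map.tolist())
```

Accumulating the `proportions` vector over epochs reproduces the 40/35/25 to 65/25/10 trajectory described above.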
      <p>To plan the flight, we used the snake algorithm [17], which generates a sequence of cell
coordinates in such a way as to minimize empty movements. Algorithms for the automatic scaling of
geodata to minimize empty movements can also be found in [18,19,20]. The route was
superimposed on a false-color visualization of the state matrix so that critical areas (red) were
placed in the central part of the path, which ensured priority processing. The route visualization
showed uniform coverage of the entire area with a minimum trajectory length and a quick response
to problem areas.</p>
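<p>A minimal sketch of a boustrophedon ("snake") sweep, assumed here from the description: the row direction alternates so that every pair of consecutive cells is adjacent and no empty traversals occur (the prioritization of critical areas mentioned above is not shown):</p>

```python
def snake_path(rows, cols):
    """Boustrophedon ('snake') sweep over a rows x cols grid.

    Every pair of consecutive cells is adjacent, so the UAV makes no
    empty moves between covered cells.
    """
    path = []
    for r in range(rows):
        # Even rows sweep left to right, odd rows right to left.
        order = range(cols) if r % 2 == 0 else range(cols - 1, -1, -1)
        for c in order:
            path.append((r, c))
    return path

# Route over the 10x10 field map used in the paper.
route = snake_path(10, 10)
```

For a 10×10 field this yields 100 waypoints covering every cell exactly once with a minimum-length trajectory.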
    </sec>
    <sec id="sec-6">
      <title>6. Conclusions</title>
      <p>In this paper, we developed an ML-based algorithm for the classification of plant state and
decision-making about effective responses. The algorithm aims to detect plants that are suffering from a lack of
nutrition or are damaged by diseases. The algorithm also provides UAV flight mapping in order to
match the field points with classes of plant states.</p>
      <p>To train the model, we used real pictures of a field of Triticum aestivum. The visual
diagnostics were made using a UAV with a camera mounted on it. Pictures of Triticum
aestivum that contain characteristic features of the symptoms of damage by disease or of suffering
from a lack of nutrition were used as well. These symptoms are mostly revealed as a change in the
coloration of certain organs and as a change in the general plant appearance. These features allow
objects to be classified into the green class as 'healthy', the yellow class as 'need fertilizer', and the red
class as 'need chemistry'.</p>
      <p>Cyclic training was performed in three key scenarios: at 10, 100, and 1000 epochs. The results of
the analysis demonstrated that the long-term experiment with 1000 epochs provides the highest
accuracy. Accuracy rises most rapidly within about epochs 20-200; after the 200th epoch, the accuracy
curve plateaus at around 60-62% with fluctuations of ±1%. This indicates that the
information in the initial sample has been exhausted and the training has begun to saturate.
Further training increases the time and computational costs without any significant performance
improvement.</p>
      <p>The results of the research also demonstrate the potential of integrating modern
technologies, including UAVs, artificial intelligence (AI), and machine learning, into the
agricultural sector, showing the advantages of the considered technologies for
automatic monitoring and diagnostics of plants and a quick response with treatment actions.</p>
    </sec>
    <sec id="sec-7">
      <title>Acknowledgements</title>
      <p>PSR 24.01.01.06.F Monitoring and features of pathogenesis of pathogens of rust and sooty diseases,
powdery mildew, pyrenophorosis, fusarium head blight of wheat (species, races, strains) in the
conditions of southern Ukraine, 0121U107896, 2021-2025</p>
    </sec>
    <sec id="sec-8">
      <title>Declaration on Generative AI</title>
      <p>During the preparation of this work, the authors used Grammarly for grammar and
spelling checking. After using this tool, the authors reviewed and edited the content as needed and
take full responsibility for the publication's content.</p>
    </sec>
    <sec id="sec-9">
      <title>References</title>
      <p>[13] Y. Averyanova, Y. Znakovska, Decision-Making Automation for UAS Operators using Operative Meteorological Information, CEUR Workshop Proceedings 3468 (2023) 139–149.</p>
      <p>[14] J. Agrawal, M. Y. Arafat, Transforming Farming: A Review of AI-Powered UAV Technologies in Precision Agriculture, Drones 8(11) (2024) 664. doi:10.3390/drones8110664.</p>
      <p>[15] Hongyan Zhu, Chengzhi Lin, Gengqi Liu, Dani Wang, Shuai Qin, Anjie Li, Jun-Li Xu, Yong He, Intelligent agriculture: deep learning in UAV-based remote sensing imagery for crop diseases and pests detection, Frontiers in Plant Science 15 (2024). URL: https://www.frontiersin.org/journals/plant-science/articles/10.3389/fpls.2024.1435016.</p>
      <p>[16] C. Goi, V. Proskurnina, M. Kovalenko, K. Mamonov, S. Haidenko, Prospects and Challenges of the Impact of Artificial Intelligence and Machine Learning on Social and Economic Progress, Pakistan Journal of Life and Social Sciences 22(2) (2024) 6592–6601.</p>
      <p>[17] What is random forest?, IBM. URL: https://www.ibm.com/think/topics/random-forest.</p>
      <p>[18] W. Bong, C.C. Liew, H.Y. Lam, Ground-glass opacity nodules detection and segmentation using the snake model, in: Xin-She Yang, João Paulo Papa (Eds.), Bio-Inspired Computation and Applications in Image Processing, Academic Press, 2016, 87–104. doi:10.1016/B978-0-12-804536-7.00005-3.</p>
      <p>[19] I. Ostroumov, et al., Effective Trajectory Data Storage for Tracking Applications, Proc. 2024 IEEE 17th International Conference on Advanced Trends in Radioelectronics, Telecommunications and Computer Engineering (TCSET 2024), 2024, 276–280.</p>
      <p>[20] I. Ostroumov, et al., Performance Analysis of Compact Position Report for Geodata Storing and Transfering, Proc. 2023 IEEE 13th International Conference on Electronics and Information Technologies (ELIT 2023), 2023, 227–231.</p>
      <p>[21] M. Ivanytskyi, Y. Znakovska, Y. Averyanova, UAS Flight Trajectory Optimization Algorithm Based on Operative Meteorological Information, CEUR Workshop Proceedings 3426 (2023) 287–297.</p>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>[1] Areas, gross harvests and productivity of agricultural crops, State Statistics Service of Ukraine, 2018. URL: http://www.ukrstat.gov.ua/metaopus/2018/2_03_07_03_2018 [in Ukrainian].</mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>[2] V. Lipchuk, D. Malakhovskyi, Structural changes in grain production: regional aspect, Agrarian Economy 9(3-4) (2016) 53–60 [in Ukrainian].</mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>[3] H. S. Jackson, Aecial stage of the orange leaf rust of wheat, Puccinia triticina Eriks, Journal of Agricultural Research 22 (1921) 151–172.</mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>[4] Resistance of soft winter wheat varieties against pathogens of major leaf-stem diseases in the South of Ukraine, 2025. URL: https://www.researchgate.net/publication/382672360_Resistance_of_soft_winter_wheat_varieties_against_pathogens_of_major_leaf-stem_diseases_in_the_South_of_Ukraine.</mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>[5] N. Sauliak, V. Traskovetska, O. Vasyliev, M. Bushulian, V. Rudenko, V. Tsapenko, Resistance of soft winter wheat varieties against pathogens of major leaf-stem diseases in the South of Ukraine, Plant Varieties Studying and Protection 20 (2024) 84–89. doi:10.21498/2518-1017.20.2.2024.304104.</mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>[6] J. del Cerro, C. Cruz Ulloa, A. Barrientos, J. de León Rivas, Unmanned Aerial Vehicles in Agriculture: A Survey, Agronomy 11(2) (2021) 203. doi:10.3390/agronomy11020203.</mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>[7] R. Pungavi, C. Praveenkumar, Unmanned Aerial Vehicles (UAV) for Smart Agriculture, in: Advances in Smart Grid and Renewable Energy, Springer, 2024. doi:10.1007/978-981-97-0341-8_13.</mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>[8] D. Kushwaha, P.K. Sahoo, N. Pradhan, K. Kumar, A. Singh, S. Krishnan, Benefits and Challenges in the Adoption of Agriculture Drones in India, 2023.</mixed-citation>
      </ref>
      <ref id="ref9">
        <mixed-citation>[9] W. M. dos Santos, L. D. C. dos Santos Martins, A. C. Bezerra, L. S. B. de Souza, A. M. de R. F. Jardim, M. V. da Silva, C. A. A. de Souza, T. G. F. da Silva, Use of Unmanned Aerial Vehicles for Monitoring Pastures and Forages in Agricultural Sciences: A Systematic Review, Drones 8(10) (2024) 585. doi:10.3390/drones8100585.</mixed-citation>
      </ref>
      <ref id="ref10">
        <mixed-citation>[10] Jeongeun Kim, Seungwon Kim, Chanyoung Ju, Hyoung Son, Unmanned Aerial Vehicles in Agriculture: A Review of Perspective of Platform, Control, and Applications, IEEE Access (2019) 1–1. doi:10.1109/ACCESS.2019.2932119.</mixed-citation>
      </ref>
      <ref id="ref11">
        <mixed-citation>[11] S.C. Kakarla, L. Costa, Y. Ampatzidis, Z. Zhang, Applications of UAVs and Machine Learning in Agriculture, in: Z. Zhang, H. Liu, C. Yang, Y. Ampatzidis, J. Zhou, Y. Jiang (Eds.), Unmanned Aerial Systems in Precision Agriculture, Smart Agriculture, vol. 2, Springer, Singapore, 2022. doi:10.1007/978-981-19-2027-1_1.</mixed-citation>
      </ref>
      <ref id="ref12">
        <mixed-citation>[12] A. Bin Rashid, A.K. Kausik, A. Khandoker, S.N. Siddque, Integration of Artificial Intelligence and IoT with UAVs for Precision Agriculture, Hybrid Advances 10 (2025) 100458.</mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>