<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta />
    <article-meta>
      <title-group>
        <article-title>Apple Fruits Monitoring Diseases on the Tree Crown Using a Convolutional Neural Network</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Nikolay Kiktev</string-name>
          <email>nkiktev@ukr.net</email>
          <xref ref-type="aff" rid="aff1">1</xref>
          <xref ref-type="aff" rid="aff2">2</xref>
          <xref ref-type="aff" rid="aff3">3</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Alexey Kutyrev</string-name>
          <email>alexeykutyrev@gmail.com</email>
          <xref ref-type="aff" rid="aff0">0</xref>
          <xref ref-type="aff" rid="aff2">2</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Pavel Mazurchuk</string-name>
          <email>eetc-1402_mazurchuk@ukr.net</email>
          <xref ref-type="aff" rid="aff1">1</xref>
          <xref ref-type="aff" rid="aff2">2</xref>
        </contrib>
        <aff id="aff0">
          <label>0</label>
          <institution>Federal Scientific Agroengineering Center VIM, Department of Technologies and Machines for Horticulture</institution>
        </aff>
        <aff id="aff1">
          <label>1</label>
          <institution>National University of Life and Environmental Sciences of Ukraine, Department of Automation and Robotic Systems</institution>
        </aff>
        <aff id="aff2">
          <label>2</label>
          <addr-line>03041</addr-line>
        </aff>
        <aff id="aff3">
          <label>3</label>
          <institution>Taras Shevchenko National University of Kyiv, Department of Intelligent Technologies</institution>
          ,
          <addr-line>01601</addr-line>
        </aff>
      </contrib-group>
      <fpage>78</fpage>
      <lpage>88</lpage>
      <abstract>
        <p>The article presents a monitoring system with a mobile application based on a neural network, which makes it possible to identify apple fruits on the tree crown, count their number, determine diseases and ripening rates of the fruits, and estimate the crop volume per hectare. The monitoring system consists of a photo (image) collection unit, which includes a client software tool (mobile application, digital camera), a received-image processing unit, consisting of a database and a neural network, and a received-data analysis unit. A neural network based on the VGG-16 and SSD architectures has been developed to identify apple fruits on the tree crown. The classes selected for training the neural network were healthy apple fruits (red and green) and fruits affected by diseases (scab, powdery mildew, fruit rot, mechanical damage). The software runs on the Ubuntu operating system, and the mobile application on the Android operating system. Experiments carried out on an industrial apple orchard plantation showed that the accuracy of estimating the total number of fruits on a tree crown compared to the true value was 94.72%, and the accuracy of counting the number of infected fruits was 90.44%. The average pattern recognition time does not exceed 0.6 seconds per image, the average time to segment an apple fruit from the background does not exceed 0.8 seconds per image, and the average time to analyze one image and obtain a recognition result does not exceed 1.5 seconds, provided the technical requirements for the server and the images are met.</p>
      </abstract>
      <kwd-group>
        <kwd>digital monitoring</kwd>
        <kwd>identification of apple fruits</kwd>
        <kwd>mobile application</kwd>
        <kwd>neural network</kwd>
        <kwd>disease</kwd>
      </kwd-group>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>1. Introduction</title>
      <p>
        Currently, there are no methods for accurately assessing the magnitude of the potential yield of
horticultural crops. Agronomists, when conducting ground inspections of plantings to search for foci
of diseases for crop planning, are forced to rely on a limited set of data obtained by the visual method,
through systematic counts carried out at stationary sites and during route surveys, or using laboratory
tests [
        <xref ref-type="bibr" rid="ref1 ref2">1, 2</xref>
        ]. The most common laboratory methods for detecting horticultural diseases are
visualization methods such as fluorescence spectroscopy, visible and infrared spectroscopy, and
hyperspectral imaging [
        <xref ref-type="bibr" rid="ref3">3</xref>
        ]. The main problem with visual assessment of diseases is that the assessor performs a
subjective task, influenced by psychological and cognitive factors, which can lead to low
recognition rates [
        <xref ref-type="bibr" rid="ref4 ref5">4, 5</xref>
        ]. The industrial horticulture industry needs to develop an automated monitoring
system that will predict the yield of an orchard with an accuracy of at least 80%. There are various
methods for identifying diseases of horticultural crops [
        <xref ref-type="bibr" rid="ref6 ref7 ref8">6-8</xref>
        ]. Digital cameras and decision support
systems (“expert systems”) are used [
        <xref ref-type="bibr" rid="ref10 ref11 ref12 ref13 ref9">9-13</xref>
        ]. However, for an “expert system”, the accuracy of pest and disease identification mainly
depends on the experience of the expert (agronomist). Currently,
these tasks in different countries of the world are helped by software tools and decision support
systems, such as Precision Gardening, Digital Garden [
        <xref ref-type="bibr" rid="ref14 ref15">14,15</xref>
        ], Agro Intelligent Vim [
        <xref ref-type="bibr" rid="ref16 ref17">16,17</xref>
        ], which
provide prompt processing of information flows from sensors and weather modules in real time,
reflecting the characteristics of growth and condition of plants in the critical phases of their
development. These software tools are based on the use of global navigation satellite systems
(GNSS), geographic information systems (GIS),
systems for remote monitoring of the state of plantations and crops using UAVs, and various IT
systems [
        <xref ref-type="bibr" rid="ref18 ref19 ref20">18-20</xref>
        ]. Existing approaches have proven themselves well for the cultivation of field crops to
take into account the numerical indicators of the quality and quantity of vegetation on the field plot,
various vegetation indices (NDVI, RVI, IPVI, WDVI) [
        <xref ref-type="bibr" rid="ref21 ref22">21, 22</xref>
        ], but for industrial horticulture, more
accurate monitoring is required using high-resolution cameras to analyze each fruit and subsequent
yield forecast [
        <xref ref-type="bibr" rid="ref23">23</xref>
        ]. A relevant modern tool used in decision-making on the management of
vegetation processes is machine learning. Its application is important because of the complexity
of analyzing fields and predicting a possible harvest, since many factors influence the
overall results. Expert technologies are also successfully used to analyze the information obtained by
monitoring the state of the environment [
        <xref ref-type="bibr" rid="ref24 ref25">24, 25</xref>
        ].
      </p>
      <p>The purpose of the research is to improve the quality of identification of apple fruits on the
tree crown by developing a monitoring system with a mobile application based on a neural network,
which makes it possible to detect and count apple fruits, determine their diseases and ripening rates,
and estimate the yield per hectare.</p>
    </sec>
    <sec id="sec-2">
      <title>2. Theoretical Aspects of the Research</title>
      <p>A multi-scale network architecture for image diagnostics has been developed to identify apple
fruits on the tree crown. The developed software and hardware complex consists of a photo (image)
collection unit, which includes a client software tool (mobile application, digital camera), a received
image processing unit, which includes a database and a neural network, which are installed on the
server, and a received data analysis unit.</p>
      <p>The developed neural network based on the architecture of VGG-16 (Very Deep Convolutional
Networks) and SSD (Single Shot MultiBox Detector) recognizes and segments the fruits of an apple
tree on the crown of a tree, evaluates the object presented as a photograph, classifies it and gives the
result in the form of a probability in percent that it belongs to the fruit of an apple tree. The neural
network discretizes the output space of bounding boxes into a set of default boxes with different aspect
ratios and scales for each feature map location. During prediction, the network generates a
presence score for each object category in each default box and adjusts the box
to better match the shape of the object.</p>
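      <p>The default box mechanism described above can be sketched as follows. The grid size, scale and aspect ratios here are illustrative assumptions for demonstration, not the exact values used by the network in the article.</p>

```python
import math

def default_boxes(grid_size, scale, aspect_ratios):
    """Generate SSD-style default boxes (cx, cy, w, h) in [0, 1] relative
    coordinates, centered on each cell of a grid_size x grid_size feature map."""
    boxes = []
    for i in range(grid_size):
        for j in range(grid_size):
            cx = (j + 0.5) / grid_size  # box center, relative to image width
            cy = (i + 0.5) / grid_size  # box center, relative to image height
            for ar in aspect_ratios:
                w = scale * math.sqrt(ar)  # wider boxes for ar > 1
                h = scale / math.sqrt(ar)  # taller boxes for ar < 1
                boxes.append((cx, cy, w, h))
    return boxes

# A 4x4 feature map with three aspect ratios yields 4 * 4 * 3 = 48 default boxes.
boxes = default_boxes(grid_size=4, scale=0.3, aspect_ratios=[0.5, 1.0, 2.0])
print(len(boxes))  # 48
```

      <p>During prediction, the network scores each of these boxes per class and regresses offsets from each box to the object, which is what makes one-shot detection possible.</p>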
      <p>The neural network can be used in two ways: through a REST web API service (the remote
procedure call is an HTTP GET or POST request, with the necessary data passed as request
parameters or in the request body) and locally on the computer by specifying files for recognition. To
use the neural network locally, the images must be prepared in a local folder. The full addresses (paths)
of all images in the local folder must be placed in the file "test.txt", with the path to each image on
a separate line. As a result of recognition, photographs with selected regions are displayed on the
screen wherever the neural network considers that there is an object belonging to one of the
previously trained classes. As input, the neural network receives an array of
bytes of the image to be recognized. At the first stage, the received image is preprocessed:
it is converted to RGB format and resized to the size required by the network. Then
the array of bytes is converted into matrices of the required dimensionality,
which, in turn, are passed to the neural network. As output, the neural network generates a
matrix whose cells contain information about the recognized objects and
their belonging to a particular class.</p>
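      <p>The preparation of "test.txt" for local recognition can be sketched as follows; the folder layout and accepted image extensions are illustrative assumptions.</p>

```python
import os

def write_test_txt(image_dir, list_path="test.txt",
                   extensions=(".png", ".jpg", ".jpeg")):
    """Collect the absolute paths of all images in image_dir and write them to
    list_path, one path per line, as the local recognition mode expects."""
    paths = sorted(
        os.path.abspath(os.path.join(image_dir, name))
        for name in os.listdir(image_dir)
        if name.lower().endswith(extensions)
    )
    with open(list_path, "w", encoding="utf-8") as f:
        f.write("\n".join(paths))
    return paths
```

      <p>The recognizer then reads this file line by line, loading and preprocessing each listed image in turn.</p>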
      <p>To train the neural network, photographs were collected using a Sony Alpha ILCE-7M4 camera
at shooting distances of 0.2 m, 0.5 m and 1.0 m, from angles that overlap each
other. The following varieties of columnar apple trees were used: President, Currency, Chervonets,
Lukomor, Malyukha, Senator, Triumph. The classes chosen were healthy apple fruits (red and green) and
apple fruits affected by diseases (scab, powdery mildew, fruit rot, mechanical damage). More than
10,000 photographs of the given classes of apples were taken. To prepare a sample for training, the
obtained photographs were labeled. The open source online utility VGG Image Annotator was used
for markup (Fig. 1).</p>
      <p>Using the VGG Image Annotator for data markup made it possible to outline the
necessary objects (apple fruits) in the image with a bounding frame, assign a class to each bounding box,
and save the selection results in a JSON file.</p>
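      <p>A VIA-style JSON export like the one described can be parsed into training boxes roughly as follows; the attribute key holding the class label ("class" here) is an assumption that depends on how the annotation project was configured.</p>

```python
import json

def parse_via_annotations(json_text):
    """Parse a VGG Image Annotator (VIA 2.x style) export into a mapping of
    filename -> list of (class_label, x, y, width, height) bounding boxes."""
    data = json.loads(json_text)
    boxes = {}
    for entry in data.values():
        rects = []
        for region in entry.get("regions", []):
            shape = region["shape_attributes"]
            if shape.get("name") != "rect":  # keep only rectangular boxes
                continue
            label = region["region_attributes"].get("class", "unknown")
            rects.append((label, shape["x"], shape["y"],
                          shape["width"], shape["height"]))
        boxes[entry["filename"]] = rects
    return boxes

# Minimal example with one image and one "scab" box (illustrative data).
sample = json.dumps({
    "img001.jpg123456": {
        "filename": "img001.jpg",
        "regions": [{
            "shape_attributes": {"name": "rect", "x": 10, "y": 20,
                                 "width": 40, "height": 30},
            "region_attributes": {"class": "scab"},
        }],
    }
})
print(parse_via_annotations(sample))
```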
      <p>The server-side software receives requests for image evaluation from the client
software (mobile application), transfers the image to the deep learning neural
network on the server, receives a response from it, and returns the response to the client software.</p>
      <p>The database of the monitoring system stores structured information about all field measurements
made, the results of calculating the number of apple fruits on the studied rows of plantations, ensures
the integrity, completeness and reuse of data, and ease of updating. The number of tables in the
database is 20, the maximum table size is 32 TB, the maximum record size is 1.6 TB, the maximum
field size is 1 GB, and the maximum number of fields in a record is 250.</p>
      <p>The software runs and functions on the Ubuntu operating system, a mobile application on the
Android operating system. Software and mobile application have the ability to work based on
incoming photos (images) online, as well as using previously captured photo material. The Android
application works on the basis of a neural network installed on a remote server to identify apple fruits
on the tree crown and search for diseases on the fruit, saving the information received in the
database. It is possible to discretely count the number of fruits on a tree crown with the ability to
determine the yield of one tree, a selected row of plantations or the entire field. The information
obtained as a result of fruit identification and counting is sent online via GPRS mobile
communication and upon request of the API for Python, stored and processed using software on the
server for further forecasting and making agro-technological decisions.</p>
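      <p>The transfer of an image to the server-side API mentioned above might look like the following sketch; the endpoint URL, header, and JSON response format are purely illustrative assumptions, since the article does not specify the API contract.</p>

```python
import json
import urllib.request

def build_recognition_request(image_bytes, server_url):
    """Build a POST request carrying the raw image bytes in the body, matching
    the REST web API mode in which data is passed in the request body."""
    return urllib.request.Request(
        server_url,
        data=image_bytes,
        headers={"Content-Type": "application/octet-stream"},
        method="POST",
    )

def send_image_for_recognition(image_bytes, server_url):
    """Send the image to the (hypothetical) recognition endpoint and return
    the parsed JSON result with detected fruit classes and boxes."""
    request = build_recognition_request(image_bytes, server_url)
    with urllib.request.urlopen(request, timeout=30) as response:
        return json.loads(response.read().decode("utf-8"))
```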
      <p>When the presented image is classified as a known biological culture, the deep learning
segmentation convolutional neural network can segment the area of the culture's presence with an
accuracy of at least 85%, while the image segmentation speed does not exceed 0.8 seconds per image.</p>
      <p>In addition, the network combines predictions from multiple feature maps at different resolutions
to naturally process features of different sizes. A distinctive feature of the chosen SSD architecture is
the ability to recognize objects in one run using a given grid of windows (default box) on the image
pyramid. Image processing speed can reach up to 59 FPS (frames per second).
The following Python libraries were used for the operation of the neural network:
tensorflow-gpu, NumPy, OpenCV. Nvidia libraries: CUDA, CUDA Toolkit, cuDNN. Nvidia driver
version: 515.65.01, CUDA version: 11.7.</p>
      <p>As a result of the research, the stages of image processing using a neural network have been
developed:</p>
      <p>1. Low-level processing - the first level of image processing, which
includes receiving an image from a mobile phone camera and converting it to digital form
(preprocessing), bringing images to a single format, removing noise and distortion, and correcting color
levels.</p>
      <p>2. Intermediate-level processing - the level of processing at which the
object is selected in the image. The accuracy of the subsequent determination of diseases depends on the
accuracy of the intermediate processing. Segmentation and selection of objects can be done in three
different ways: threshold segmentation, edge-based segmentation, and region-based segmentation.
The result of intermediate processing is the selection of the fruit in the image.</p>
      <p>3. High-level processing - the last level of processing, in which the
recognition of infected fruits and their classification takes place. High-level processing includes the
use of deep learning algorithms and statistical methods (Fig. 2).
In each of the three types of processing, there is continuous interaction with the knowledge base,
which stores the materials (model coefficients, examples for training and classification) needed to
perform accurate classification. To improve the accuracy of fruit identification, the knowledge base
must be replenished with materials and the model coefficients periodically recalculated.</p>
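      <p>As one illustration of the intermediate level, the simplest of the three listed approaches, threshold segmentation, can be sketched in pure Python. A real pipeline would typically use OpenCV on actual image arrays, so the toy grayscale grid below is only an assumption for demonstration.</p>

```python
def threshold_segment(gray, threshold):
    """Binarize a 2D grayscale image (list of rows of 0-255 values): pixels
    brighter than the threshold become foreground (1), the rest background (0)."""
    return [[1 if px > threshold else 0 for px in row] for row in gray]

def foreground_area(mask):
    """Count foreground pixels - a crude proxy for the fruit area in the image."""
    return sum(sum(row) for row in mask)

# Toy 4x4 grayscale patch: a bright "fruit" blob on a dark background.
gray = [
    [ 10,  20,  15,  12],
    [ 18, 200, 210,  14],
    [ 16, 205, 198,  11],
    [ 12,  17,  13,  10],
]
mask = threshold_segment(gray, threshold=128)
print(foreground_area(mask))  # 4
```

      <p>Edge-based and region-based segmentation differ only in how the foreground mask is derived; the result passed to high-level processing is the same kind of per-pixel mask.</p>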
      <p>The input data necessary to perform the specified functions of the complex and the output
information obtained as a result of the implementation of the complex of its functions are determined.
The input data includes an image of the research object in PNG or JPEG format, distance values to the
research object (measured by a digital camera or manually entered by the operator), geographic
coordinates of the place where the image was obtained (latitude, longitude, altitude), information
characterizing the optical system with which the image is obtained (real and imaginary size of the
camera matrix, focal length and equivalent focal length, image resolution of the object, vertical and
horizontal, digital zoom), information personalizing the research object (season, field, variety,
vegetation phase). The web application consists of a database, calculation models for
determining type and maturity, a web interface part, and a main part that combines what the
user sees, handles user interaction, and outputs model results. The interaction between the pages of the
software module is presented in the diagram (Fig. 3).
The output information includes: the result of checking whether the object belongs to the fruit of
the apple tree (yes / no); the rate of ripening of the apple fruits; the result of counting the
number of identified healthy fruits and fruits affected by diseases (yield per hectare); the result of the
study of the object (selection of the area occupied by the object in the image, a mask image file in PNG
format); the area occupied by the object in the image, calculated from the characteristics of
the optical system, the distance to the object and the mask image; and a log with systematized information
about the received data, personalizing the object of study, with the image files and mask of
the object of study saved in the server file system and database, with the ability to view the history of
study results in the software and to share access to the history of the results using
authentication. The mean absolute percentage error in measuring the number of apples on the
crown of a tree (comparing the counted number with the true value measured by the visual method) is
determined by formula (1):</p>
      <p>MAPE = (100% / N) · Σ |N_v − N_s| / N_v, (1)
where N_v − the actual number of fruits measured by the visual method, pcs; N_s − the number of
fruits identified by the monitoring system, pcs; N − the total number of apples over the threefold
repetition of the experiment, pcs.</p>
      <p>The number of fruits identified by the monitoring system is determined by formula (2):</p>
      <p>N_s = N_w + N_e, (2)
where N_w − the number of fruits identified on the tree by the monitoring system on the western
side of the row, pcs; N_e − the number of fruits identified on the tree by the monitoring system on the
eastern side of the row, pcs.</p>
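      <p>A minimal sketch of the error metrics in formulas (1) and (2), assuming the mean absolute percentage error is taken over per-tree counts (the exact aggregation and symbols are not fully recoverable from the text, and the counts below are illustrative, not the article's data):</p>

```python
def identified_total(west_count, east_count):
    """Formula (2): fruits identified on a tree is the sum of the counts
    obtained from the western and eastern sides of the row."""
    return west_count + east_count

def mape(visual_counts, system_counts):
    """Formula (1): mean absolute percentage error of the system counts
    against the visually measured (true) counts, in percent."""
    assert len(visual_counts) == len(system_counts)
    errors = [abs(v - s) / v for v, s in zip(visual_counts, system_counts)]
    return 100.0 * sum(errors) / len(errors)

# Illustrative per-tree counts for three trees.
visual = [46, 53, 48]
system = [identified_total(21, 23), identified_total(25, 26),
          identified_total(22, 24)]
print(round(mape(visual, system), 2))
```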
    </sec>
    <sec id="sec-3">
      <title>3. Results and discussion</title>
      <p>Testing of the monitoring system with a mobile application based on a neural
network, and assessment of the accuracy of counting the number of apple fruits on a tree crown, were carried
out on a 7-year-old industrial apple orchard plantation. The scheme of the experiment is
shown in Figure 4.</p>
      <p>To assess the accuracy of the developed monitoring system, 5 trees in a row (trees No. 12-16) of
the orchard of the Currency variety were used. Using a mobile phone with an application installed on
it, based on a neural network on a remote server, the total number of apple fruits on the trees and the
number of infected fruits on the western and eastern sides of the row were determined in real time. To
avoid errors in estimating the number of fruits due to their mutual overlapping with leaves and
branches, the study considered only fruits that were visually visible with at least 20% of their
actual size exposed. The mobile application marks already recognized fruits, which eliminates duplication
when counting apples using a neural network (Fig. 5). Matrix barcodes (Quick Response code, optical
labels) were used to store and process the obtained data, containing the necessary information about
the object to which they are attached (Fig. 6). QR codes are read by digital devices and store data as a
series of pixels in a square grid. After scanning the QR code with the camera, the available data is
displayed on the screen of the mobile phone. The obtained data are stored in the developed database
(Table 1).</p>
      <p>The service allows real-time processing of photos taken from a smartphone (Fig. 5), as well as
static processing of individual photos on a PC after collecting data from any device with a digital
camera (Fig. 7). The results of the statistical evaluation of the obtained research results are shown in
Table 2. The experiment was carried out in triplicate, the maximum and minimum number of
identified fruits on the trees was determined, as well as the percentage deviation of the number of
visible fruits to the number of identified fruits using a monitoring system. Based on the results of
processing the obtained experimental data, graphs were constructed for assessing the accuracy of
identifying apple fruits (Fig. 8). The percentage of fruit identification is calculated as the ratio of
fruits recognized by the monitoring system to the total number of fruits visible on the tree.</p>
      <p>Figure 6 - Plot of an industrial plantation of an apple orchard with installed QR codes for storing
and processing data</p>
      <p>The mean absolute percentage error in counting the total number of fruits was 5.59%, and the mean
absolute percentage error in counting the number of infected fruits was 11.5% compared to the true
value (measured visually). The accuracy of assessing the total number of fruits on the tree crown
compared to the true value was 94.72%, and the accuracy of counting the number of infected fruits was
90.44%. Measurements of the speed of identification of apple fruits on trees were carried out (Fig. 9).
It has been established that the average pattern recognition time does not exceed 0.6 seconds per
image, the average time to segment an apple fruit from the background does not exceed 0.8
seconds per image, and the average time to analyze one image and obtain a recognition result does
not exceed 1.5 seconds, provided the technical requirements for the server and the
requirements for images are met.</p>
      <p>Figure 7 - An example of the operation of the software service with the number of recognized apples, including
healthy and diseased ones</p>
    </sec>
    <sec id="sec-4">
      <title>4. Discussion</title>
      <p>
        The risk of this development can be calculated using the methodology for assessing the risks of
innovative projects based on fuzzy modeling (Yu. Samokhvalov, 2020) [
        <xref ref-type="bibr" rid="ref26">26</xref>
        ]. In modern conditions of
horticulture, the environmental factor is essential, the solution of the problem of long-term planning
according to the Leontiev-Ford ecological and economic model, taking into account the magnitude of
environmental costs, was proposed by the authors H. Hnatiienko et al. in [
        <xref ref-type="bibr" rid="ref27 ref28">27,28</xref>
        ].
      </p>
    </sec>
    <sec id="sec-5">
      <title>5. Conclusions</title>
      <p>The developed monitoring system based on a neural network makes it possible to carry out digital
monitoring both from photographic materials and from a video stream online. It operates stably in the
conditions of industrial garden plantations, regardless of the size and interference of the foliage,
determines the color of the surface of the fruit, identifies the presence of diseases and fruit defects
with a probability of at least 90% as a result of incremental expansion of the dataset during the
operation of the complex and the gradual evolution of the solution by training the network in the
process of working on new data.</p>
      <p>[Table 2: for each tree, the average number of identified fruits and of identified infected fruits
over three repetitions, pcs; the dispersion in the general population of identified fruits, pcs²; the
standard deviation of identified fruits, pcs; the numbers of fruits and of infected fruits found on a tree
by the visual method, pcs; the percentage deviations of visible to identified fruits and of visible
infected to identified infected fruits, %; and the absolute percentage errors in identifying the total
number of fruits and the number of infected fruits, %.]</p>
      <p>As a result of the research, it was found that for counting the number of apple fruits in industrial
garden plantations the most suitable neural network is a convolutional deep learning network, since its use
makes it possible to recognize the contour of fruits and foci of diseases on them with high accuracy.</p>
      <p>
        The monitoring system can process at least 200 requests simultaneously and
gives a result in the form of a percentage probability that the identified object belongs to an apple
fruit; it makes it possible to identify apple fruits on the tree crown, count their number, and determine diseases,
ripening rates of apple fruits and crop volume per hectare. The developed neural network will
extend the functionality of the monitoring system beyond monitoring the yield of fruit crops
to robotic harvesting of fruits [
        <xref ref-type="bibr" rid="ref16 ref17 ref29">16, 17, 29</xref>
        ] with the determination of the coordinates of each fruit
or its part and the return of the coordinates of the center of the fruit and its contour indicating the
areas of defects or diseases [
        <xref ref-type="bibr" rid="ref30">30</xref>
        ] on the fruit to the controller of the manipulator device.
      </p>
    </sec>
    <sec id="sec-6">
      <title>6. References</title>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>
          [1]
          <string-name>
            <surname>Wulfsohn</surname>
            <given-names>D.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Zamora</surname>
            <given-names>F.A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Téllez</surname>
            <given-names>C.P.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Lagos</surname>
            <given-names>I.Z.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>García-Fiñana</surname>
            <given-names>M</given-names>
          </string-name>
          .
          <article-title>Multilevel systematic sampling to estimate total fruit number for yield forecasts</article-title>
          .
          <source>Precis. Agric</source>
          .
          <year>2012</year>
          .
          <volume>13</volume>
          .
          <fpage>256</fpage>
          -
          <lpage>275</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>
          [2]
          <string-name>
            <surname>Font</surname>
            <given-names>D.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Tresanchez</surname>
            <given-names>M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Martínez</surname>
            <given-names>D.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Moreno</surname>
            <given-names>J.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Clotet</surname>
            <given-names>E.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Palacín</surname>
            <given-names>J.</given-names>
          </string-name>
          .
          <article-title>Vineyard Yield Estimation Based on the Analysis of High Resolution Images Obtained with Artificial Illumination atNight</article-title>
          .
          <source>Sensors</source>
          .
          <year>2015</year>
          .
          <volume>15</volume>
          (
          <issue>4</issue>
          ):
          <fpage>8284</fpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>
          [3]
          <string-name>
            <surname>Zou</surname>
            <given-names>X.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Zhao</surname>
            <given-names>J.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Li</surname>
            <given-names>Y.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Mel</surname>
            <given-names>H.</given-names>
          </string-name>
          <article-title>In-line detection of apple defects using three color cameras system</article-title>
          .
          <source>Comput. Electron. Agric</source>
          .
          <year>2010</year>
          .
          <volume>70</volume>
          ,
          <fpage>129</fpage>
          -
          <lpage>134</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>
          [4]
          <string-name>
            <surname>Lak</surname>
            <given-names>M.B.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Minaei</surname>
            <given-names>S.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Amiriparian</surname>
            <given-names>J.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Beheshti</surname>
            <given-names>B</given-names>
          </string-name>
          .
          <article-title>Apple fruit recognition under natural luminance using machine vision</article-title>
          .
          <source>Advance Journal of Food Science and Technology</source>
          .
          <year>2010</year>
          .
          <volume>2</volume>
          (
          <issue>6</issue>
          ):
          <fpage>325</fpage>
          -
          <lpage>327</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>
          [5]
          <string-name>
            <surname>Garcia</surname>
            <given-names>J.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Barbedo</surname>
            <given-names>A</given-names>
          </string-name>
          .
          <article-title>A review on the main challenges in automatic plant disease identification based on visible range images</article-title>
          .
          <source>Biosystems Engineering</source>
          .
          <year>2016</year>
          .
          <volume>144</volume>
          :
          <fpage>52</fpage>
          -
          <lpage>60</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>
          [6]
          <string-name>
            <surname>Ji</surname>
            <given-names>W.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Zhao</surname>
            <given-names>D.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Cheng</surname>
            <given-names>F.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Xu</surname>
            <given-names>B.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Zhang</surname>
            <given-names>Y.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Wang</surname>
            <given-names>J.</given-names>
          </string-name>
          .
          <article-title>Automatic recognition vision system guided for apple harvesting robot</article-title>
          .
          <source>Computers and Electrical Engineering</source>
          .
          <year>2012</year>
          .
          <volume>38</volume>
          (
          <issue>5</issue>
          ):
          <fpage>1186</fpage>
          -
          <lpage>1195</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>
          [7]
          <string-name>
            <surname>Jimenez</surname>
            <given-names>A. R.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Ceres</surname>
            <given-names>R.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Pons</surname>
            <given-names>J. L.</given-names>
          </string-name>
          .
          <article-title>A survey of computer vision methods for locating fruit on trees</article-title>
          .
          <source>Transactions of the ASAE-American Society of Agricultural Engineers</source>
          .
          <year>2000</year>
          .
          <volume>43</volume>
          (
          <issue>6</issue>
          ):
          <fpage>1911</fpage>
          -
          <lpage>1920</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>
          [8]
          <string-name>
            <surname>Kapach</surname>
            <given-names>K.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Barnea</surname>
            <given-names>E.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Mairon</surname>
            <given-names>R.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Edan</surname>
            <given-names>Y.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Ben-Shahar</surname>
            <given-names>O.</given-names>
          </string-name>
          .
          <article-title>Computer Vision for Fruit Harvesting Robots - State of the Art and Challenges Ahead</article-title>
          .
          <source>International Journal of Computational Vision and Robotics</source>
          .
          <year>2012</year>
          .
          <volume>3</volume>
          (
          <issue>1</issue>
          -2):
          <fpage>4</fpage>
          -
          <lpage>34</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref9">
        <mixed-citation>
          [9]
          <string-name>
            <surname>Kim</surname>
            <given-names>D.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Choi</surname>
            <given-names>H.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Choi</surname>
            <given-names>J.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Yoo</surname>
            <given-names>S. J.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Han</surname>
            <given-names>D.</given-names>
          </string-name>
          .
          <article-title>A Novel Red Apple Detection Algorithm Based on AdaBoost Learning</article-title>
          .
          <source>IEIE Transactions on Smart Processing &amp; Computing</source>
          .
          <year>2015</year>
          .
          <volume>4</volume>
          (
          <issue>4</issue>
          ):
          <fpage>265</fpage>
          -
          <lpage>271</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref10">
        <mixed-citation>
          [10]
          <string-name>
            <surname>Fountas</surname>
            <given-names>S.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Sorensen</surname>
            <given-names>C.G.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Tsiropoulos</surname>
            <given-names>Z.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Cavalaris</surname>
            <given-names>C.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Liakos</surname>
            <given-names>V.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Gemtos</surname>
            <given-names>T.</given-names>
          </string-name>
          .
          <article-title>Farm machinery management information system</article-title>
          .
          <source>Computers and Electronics in Agriculture</source>
          .
          <year>2015</year>
          . Vol.
          <volume>110</volume>
          .
          <fpage>131</fpage>
          -
          <lpage>138</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref11">
        <mixed-citation>
          [11]
          <string-name>
            <surname>Kaloxylos</surname>
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Groumas</surname>
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Sarris</surname>
            <given-names>V.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Katsikas</surname>
            <given-names>L.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Magdalinos</surname>
            <given-names>P.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Antoniou</surname>
            <given-names>E.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Politopoulou</surname>
            <given-names>Z.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Wolfert</surname>
            <given-names>S.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Brewster</surname>
            <given-names>C.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Eigenmann</surname>
            <given-names>R.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Maestre Terol</surname>
            <given-names>C.</given-names>
          </string-name>
          .
          <article-title>A cloud-based farm management system: architecture and implementation</article-title>
          .
          <source>Computers and Electronics in Agriculture</source>
          .
          <year>2014</year>
          . Vol.
          <volume>100</volume>
          .
          <fpage>168</fpage>
          -
          <lpage>179</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref12">
        <mixed-citation>
          [12]
          <string-name>
            <surname>Anderson</surname>
            <given-names>N.T.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Walsh</surname>
            <given-names>K.B.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Wulfsohn</surname>
            <given-names>D.</given-names>
          </string-name>
          .
          <article-title>Technologies for Forecasting Tree Fruit Load and Harvest Timing - From Ground, Sky and Time</article-title>
          .
          <source>Agronomy</source>
          .
          <year>2021</year>
          .
          <volume>11</volume>
          ,
          <fpage>1409</fpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref13">
        <mixed-citation>
          [13]
          <string-name>
            <surname>Zhao</surname>
            <given-names>P.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Liu</surname>
            <given-names>G.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Li</surname>
            <given-names>M.Z.</given-names>
          </string-name>
          <article-title>Management information system for apple diseases and insect pests based on GIS</article-title>
          .
          <source>Trans. Chin. Soc. Agric. Eng</source>
          .
          <year>2006</year>
          .
          <volume>22</volume>
          .
          <fpage>150</fpage>
          -
          <lpage>154</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref14">
        <mixed-citation>
          [14]
          <string-name>
            <surname>Qu</surname>
            <given-names>D.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Hu</surname>
            <given-names>Y.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Zhang</surname>
            <given-names>Y.</given-names>
          </string-name>
          <article-title>The Investigation of the Obstacle Avoidance for Mobile Robot Based on the Multi Sensor Information Fusion technology</article-title>
          .
          <source>International Journal of Manufacturing</source>
          ,
          <year>2013</year>
          ,
          <volume>1</volume>
          , pp.
          <fpage>366</fpage>
          -
          <lpage>370</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref15">
        <mixed-citation>
          [15]
          <string-name>
            <surname>Sgorbissa</surname>
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Zaccaria</surname>
            <given-names>R.</given-names>
          </string-name>
          .
          <article-title>Planning and obstacle avoidance in mobile robotics</article-title>
          .
          <source>Robotics and Autonomous Systems</source>
          ,
          <year>2012</year>
          ,
          <volume>60</volume>
          , pp.
          <fpage>628</fpage>
          -
          <lpage>638</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref16">
        <mixed-citation>
          [16]
          <string-name>
            <surname>Smirnov</surname>
            ,
            <given-names>I.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Kutyrev</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Kiktev</surname>
            ,
            <given-names>N.</given-names>
          </string-name>
          (
          <year>2021</year>
          ).
          <article-title>Neural network for identifying apple fruits on the crown of a tree</article-title>
          .
          <source>E3S Web of Conferences. International Scientific Forum on Computer and Energy Sciences (WFCES 2021)</source>
          ,
          <volume>01021</volume>
          .
        </mixed-citation>
      </ref>
      <ref id="ref17">
        <mixed-citation>
          [17]
          <string-name>
            <surname>Kutyrev</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Kiktev</surname>
            ,
            <given-names>N.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Kalivoshko</surname>
            ,
            <given-names>O.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Rakhmedov</surname>
            ,
            <given-names>R.</given-names>
          </string-name>
          <article-title>Recognition and Classification Apple Fruits Based on a Convolutional Neural Network Model</article-title>
          .
          <source>Selected Papers of the IX International Scientific Conference "Information Technology and Implementation" (IT&amp;I-2022). Conference Proceedings, Kyiv, Ukraine. CEUR Workshop Proceedings</source>
          ,
          <year>2022</year>
          ,
          <volume>3347</volume>
          , pp.
          <fpage>90</fpage>
          -
          <lpage>101</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref18">
        <mixed-citation>
          [18]
          <string-name>
            <surname>Kaivosoja</surname>
            <given-names>J.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Jackenkroll</surname>
            <given-names>M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Linkolehto</surname>
            <given-names>R.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Weis</surname>
            <given-names>M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Gerhards</surname>
            <given-names>R.</given-names>
          </string-name>
          .
          <article-title>Automatic control of farming operations based on spatial web services</article-title>
          .
          <source>Computers and Electronics in Agriculture</source>
          .
          <year>2014</year>
          . Vol.
          <volume>100</volume>
          .
          <fpage>110</fpage>
          -
          <lpage>115</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref19">
        <mixed-citation>
          [19]
          <string-name>
            <surname>Paraforos</surname>
            <given-names>D. S.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Vassiliadis</surname>
            <given-names>V.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Kortenbruck</surname>
            <given-names>D.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Stamkopoulos</surname>
            <given-names>K.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Ziogas</surname>
            <given-names>V.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Sapounas</surname>
            <given-names>A.A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Griepentrog</surname>
            <given-names>H.W.</given-names>
          </string-name>
          <article-title>Multi-level automation of farm management information systems</article-title>
          .
          <source>Computers and Electronics in Agriculture</source>
          .
          <year>2017</year>
          . Vol.
          <volume>142</volume>
          .
          <fpage>504</fpage>
          -
          <lpage>514</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref20">
        <mixed-citation>
          [20]
          <string-name>
            <surname>Ampatzidis</surname>
            <given-names>Y.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Tan</surname>
            <given-names>L.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Haley</surname>
            <given-names>R.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Whiting</surname>
            <given-names>M.D.</given-names>
          </string-name>
          .
          <article-title>Cloud-based harvest management information system for hand-harvested specialty crops</article-title>
          .
          <source>Computers and Electronics in Agriculture</source>
          .
          <year>2016</year>
          . Vol.
          <volume>122</volume>
          .
          <fpage>161</fpage>
          -
          <lpage>167</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref21">
        <mixed-citation>
          [21]
          <string-name>
            <surname>Miranda</surname>
            <given-names>C.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Santesteban</surname>
            ,
            <given-names>L.G.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Urrestarazu</surname>
            <given-names>J.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Loidi</surname>
            <given-names>M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Royo</surname>
            <given-names>J.B.</given-names>
          </string-name>
          <article-title>Sampling stratification using aerial imagery to estimate fruit load in peach tree orchards</article-title>
          .
          <source>Agriculture</source>
          .
          <year>2018</year>
          .
          <volume>8</volume>
          . 78.
        </mixed-citation>
      </ref>
      <ref id="ref22">
        <mixed-citation>
          [22]
          <string-name>
            <surname>Linker</surname>
            <given-names>R.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Cohen</surname>
            <given-names>O.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Naor</surname>
            <given-names>A.</given-names>
          </string-name>
          .
          <article-title>Determination of the number of green apples in RGB images recorded in orchards</article-title>
          .
          <source>Computers and Electronics in Agriculture</source>
          .
          <year>2012</year>
          . vol.
          <volume>81</volume>
          , pp.
          <fpage>45</fpage>
          -
          <lpage>57</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref23">
        <mixed-citation>
          [23]
          <string-name>
            <surname>Tan</surname>
            <given-names>W.X.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Zhao</surname>
            <given-names>C.J.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Wu</surname>
            <given-names>H.R.</given-names>
          </string-name>
          <article-title>CNN intelligent early warning for apple skin lesion image acquired by infrared video sensors</article-title>
          .
          <source>High Technol. Lett</source>
          .
          <year>2016</year>
          .
          <volume>22</volume>
          .
          <fpage>67</fpage>
          -
          <lpage>74</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref24">
        <mixed-citation>
          [24]
          <string-name>
            <surname>Giang</surname>
            <given-names>T.T.H.</given-names>
          </string-name>
          ;
          <string-name>
            <surname>Ryoo</surname>
            <given-names>Y.-J.</given-names>
          </string-name>
          .
          <article-title>Pruning Points Detection of Sweet Pepper Plants Using 3D Point Clouds and Semantic Segmentation Neural Network</article-title>
          .
          <source>Sensors</source>
          <year>2023</year>
          ,
          <volume>23</volume>
          , 4040. https://doi.org/10.3390/s23084040
        </mixed-citation>
      </ref>
      <ref id="ref25">
        <mixed-citation>
          [25]
          <string-name>
            <surname>Tang</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          ;
          <string-name>
            <surname>Jiang</surname>
            ,
            <given-names>F.</given-names>
          </string-name>
          ;
          <string-name>
            <surname>Long</surname>
            ,
            <given-names>Y.</given-names>
          </string-name>
          ;
          <string-name>
            <surname>Fu</surname>
            ,
            <given-names>L.</given-names>
          </string-name>
          ;
          <string-name>
            <surname>Sun</surname>
            ,
            <given-names>H.</given-names>
          </string-name>
          <article-title>Identification of the Yield of Camellia oleifera Based on Color Space by the Optimized Mean Shift Clustering Algorithm Using Terrestrial Laser Scanning</article-title>
          .
          <source>Remote Sens</source>
          .
          <year>2022</year>
          ,
          <volume>14</volume>
          , 642. https://doi.org/10.3390/rs14030642
        </mixed-citation>
      </ref>
      <ref id="ref26">
        <mixed-citation>
          [26]
          <string-name>
            <surname>Samokhvalov</surname>
            <given-names>Y.</given-names>
          </string-name>
          (
          <year>2021</year>
          ).
          <article-title>Risk Assessment of Innovative Projects Based on Fuzzy Modeling</article-title>
          . In:
          <string-name>
            <surname>Babichev</surname>
            <given-names>S.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Lytvynenko</surname>
            <given-names>V.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Wójcik</surname>
            <given-names>W.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Vyshemyrskaya</surname>
            <given-names>S.</given-names>
          </string-name>
          <source>(eds) Lecture Notes in Computational Intelligence and Decision Making. ISDMCI 2020. Advances in Intelligent Systems and Computing</source>
          , vol
          <volume>1246</volume>
          . Springer, Cham. https://doi.org/10.1007/978-3-030-54215-3_17
        </mixed-citation>
      </ref>
      <ref id="ref27">
        <mixed-citation>
          [27]
          <string-name>
            <surname>Hnatiienko</surname>
            ,
            <given-names>H.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Domrachev</surname>
            ,
            <given-names>V.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Saiko</surname>
            ,
            <given-names>V.</given-names>
          </string-name>
          <article-title>Monitoring the condition of agricultural crops based on the use of clustering methods</article-title>
          .
          <source>15th International Conference Monitoring of Geological Processes and Ecological Condition of the Environment (Monitoring 2021)</source>
          ,
          <year>2021</year>
          . https://doi.org/10.3997/2214-4609.20215K2049
        </mixed-citation>
      </ref>
      <ref id="ref28">
        <mixed-citation>
          [28]
          <string-name>
            <surname>Hnatiienko</surname>
            ,
            <given-names>H.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Kudin</surname>
            ,
            <given-names>V.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Onyshchenko</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Snytyuk</surname>
            ,
            <given-names>V.</given-names>
          </string-name>
          and
          <string-name>
            <surname>Kruhlov</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          <article-title>Greenhouse Gas Emission Determination Based on the Pseudo-Base Matrix Method for Environmental Pollution Quotas Between Countries Allocation Problem</article-title>
          . In:
          <source>2020 IEEE 2nd International Conference on System Analysis &amp; Intelligent Computing (SAIC)</source>
          , Kyiv, Ukraine,
          <year>2020</year>
          , pp.
          <fpage>150</fpage>
          -
          <lpage>157</lpage>
          , doi: 10.1109/SAIC51296.2020.9239125.
        </mixed-citation>
      </ref>
      <ref id="ref29">
        <mixed-citation>
          [29]
          <string-name>
            <surname>Kiktev</surname>
            ,
            <given-names>N.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Didyk</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Antonevych</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          (
          <year>2020</year>
          ).
          <article-title>Simulation of Multi-Agent Architectures for Fruit and Berry Picking Robot in Active-HDL</article-title>
          .
          <source>2020 IEEE International Conference on Problems of Infocommunications. Science and Technology (PIC S&amp;T)</source>
          , Kharkiv, Ukraine,
          <year>2020</year>
          ,
          <fpage>635</fpage>
          -
          <lpage>640</lpage>
          , doi: 10.1109/PICST51311.2020.9467936
          .
        </mixed-citation>
      </ref>
      <ref id="ref30">
        <mixed-citation>
          [30]
          <string-name>
            <surname>Mohanty</surname>
            <given-names>S.P.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Hughes</surname>
            <given-names>D.P.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Salathé</surname>
            <given-names>M.</given-names>
          </string-name>
          <article-title>Using deep learning for image-based plant disease detection</article-title>
          .
          <source>Frontiers in Plant Science</source>
          .
          <year>2016</year>
          .
          <volume>7</volume>
          :
          <fpage>1419</fpage>
          .
        </mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>