      Efficiency Requirements for Robotic Fruit Crop Harvesting *

                 Anna A. Kuznetsova (ORCID 0000-0001-5934-2361)
          Financial University under the Government of the Russian Federation,
                                  Moscow, Russia
                               AnAKuznetsova@fa.ru


        Abstract. The paper aims to determine the most critical indicators of the
        efficiency of fruit harvesting robots, on the basis of which gardeners can
        make informed decisions about the appropriateness of using such robots.
        The author analyzes the differences between fruit harvesting robots and
        robots that are successfully used in other industries, and reviews the
        indicators that developers of prototype fruit-picking robots use to
        evaluate their effectiveness. Based on this analysis, the author identifies
        quality metrics crucial for decisions on the introduction of fruit
        harvesting robots. The analysis of 32 papers devoted to fruit harvesting
        robots revealed that, owing to the use of convolutional neural networks in
        machine vision systems, fruit detection speed has increased significantly.
        This indicates that the introduction of robotic harvesting technology in
        gardening is inevitable. However, developers of fruit harvesting robots
        also need to evaluate the undetected fruits rate, the rate of objects
        mistaken for fruits, the average fruit detection time, the average fruit
        picking time, the share of successfully picked fruits among those
        detected, the damaged fruits rate, the lost fruits rate, and the unpicked
        fruits rate.

        Keywords: Fruit harvesting robot · Machine vision · Quality metrics


1       Introduction

Horticulture is one of the least automated agricultural sectors to date. In particular,
most fruit crops are harvested manually, with seasonal workers engaged in heavy
physical labor. The quality of harvesting by seasonal workers is low; in particular,
up to 50% of the fruit remains unpicked.
    The use of fruit harvesting robots will increase both the acreage of orchards and
the efficiency of horticultural enterprises by increasing labor productivity in
harvesting and reducing crop shortages.
    Fruit picking robots have been under development since the 1970s, yet robot
productivity has not increased over the past thirty years. (Bac, van Henten, Hemming
& Edan, 2014) analyzed 50 prototypes of fruit harvesting robots. The average fruit
detection rate was 85%, and the average fruit picking rate was 75% of the detected

* Copyright © 2021 for this paper by its authors. Use permitted under Creative Commons
  License Attribution 4.0 International (CC BY 4.0).
fruits or 63.75% of the total fruits on the trees. At the same time, robots spend, on
average, 33 seconds to pick one fruit.
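The 63.75% figure quoted above is simply the product of the average detection and picking rates reported by (Bac et al., 2014); a minimal arithmetic sketch:

```python
# Overall harvest rate = share of fruits detected x share of detected fruits picked.
# Rates are the averages over 50 prototypes cited from (Bac et al., 2014).
detection_rate = 0.85   # average fruit detection rate
picking_rate = 0.75     # average picking rate among detected fruits
overall = detection_rate * picking_rate
# overall = 0.6375, i.e., 63.75% of all fruits on the trees
```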
     According to (Alpha Brown, 2017), 27% of 1,300 farmers surveyed would like
to buy harvesting robots. However, existing fruit harvesting robots cost hundreds
of thousands of euros, which prevents them from paying off in practical work and
keeps farmers from seriously considering their purchase.
     Therefore, despite the many prototypes of fruit harvesting robots developed
and the potential willingness of farmers to purchase them, not a single
horticultural farm uses them yet because of their high cost and low efficiency.
     One of the factors hindering the robotic technology introduction for fruit
harvesting is the insufficient attention of prototype developers of such robots to the
analysis of their efficiency.
     The paper aims to determine the most critical indicators of the efficiency of
robots for collecting fruits, based on which horticulturists can make informed
decisions about the feasibility of using such robots.
     The paper analyzes the differences between fruit-picking robots and robots
that have long been successfully used in industry and other agricultural sectors. It also
analyzes the indicators used to evaluate the efficiency of fruit harvesting by such
robot prototype developers. Furthermore, based on this analysis, the author
identifies quality metrics crucial for making decisions on the introduction of fruit
harvesting robots.


2      Materials and Methods

2.1    Fundamental Features of Fruit Harvesting Robots
Robots are effectively used for operations that require reduced labor or workloads
and are best suited for applications that require repeatable accuracy and high
performance in homogeneous environments (Holland & Nof, 1999). In horticulture,
there is a need to reduce manual labor with repeatable accuracy of fruit-picking
operations, but it is almost impossible to ensure uniformity of conditions.
    For quite some time now, robots have been used in industry and some
agricultural sectors, such as animal agriculture, because much can be standardized
in these areas: the work area can be kept clean and other conditions can be brought
close to ideal.
    The grain harvesting process can also be standardized. It was standardization
that allowed humanity to switch from wheat harvesting with a sickle to the use of
human-driven and autonomous combines.
    In gardening, such ideal standard conditions cannot be created because of the
variable environment: changing light, blowing wind, rain leaving drops on fruits
and leaves, branches and leaves overlapping fruits, etc. At the same time, a robot
that requires someone to first walk through the garden and cut off all the leaves
overlapping the apples will not be in demand.
    Moreover, unlike tomatoes, lemons, kiwis, and some other fruits, apples all
differ from one another in shape and color.
    Fruits are susceptible to environmental and physical conditions, such as
temperature, humidity, carbon dioxide content, acidity, pressure, friction, and
shock. Fruit production requires accurate and often complex picking operations to
ensure sufficient quality. That is why apples are still not harvested by machines like
wheat combines: robots for harvesting fruits are much more complicated than
harvesting machines for grains.
    Unlike industrial robots, which deal with relatively simple, clearly defined
repetitive tasks in stable reproducible (not changing day by day) conditions,
gardening requires technologies for working with unstructured objects (fruits) in
complex, highly variable environments (gardens).
    It is a significant problem for commercialization. A robot must be able to move
in a volatile environment, and there are many situations in which a robot may fail
due to unexpected events. Therefore, all existing fruit-picking robot prototypes are
structurally complex and very expensive.


2.2      Approaches to Fruit Harvesting Robots Efficiency Assessment
Confusion matrices for classifying pixels into those belonging to the fruit and
those belonging to the background were long used to evaluate the efficiency of
fruit harvesting robots (Fig. 1).

                                                   Actual
                                       Pixel belongs    Pixel belongs
                                         to fruit       to background
 Predicted   Pixel assigned to fruit       TP_P             FP_P
             Pixel assigned to             FN_P             TN_P
             background

Fig. 1. Confusion matrix for pixel classification in fruit harvesting robots. Source: (Fawcett,
2006).

    The following notation is used:
• TP_P (True Positive) – the number of pixels in the image correctly recognized as
  belonging to the fruit;
• TN_P (True Negative) – the number of pixels in the image correctly recognized
  as belonging to the background;
• FP_P (False Positive) – the number of errors of the first kind, i.e., background
  pixels mistakenly attributed by the machine vision system to a fruit;
• FN_P (False Negative) – the number of errors of the second kind, i.e., pixels that
  actually belong to the fruit but are mistakenly classified by the machine vision
  system as background.
    Based on the confusion matrix for pixel classification, many authors calculated
the following quality characteristics of machine vision systems:
• Accuracy_P = (TP_P + TN_P) / (TP_P + FP_P + TN_P + FN_P) – the share of
  pixels in the image correctly recognized by the machine vision system (i.e.,
  correctly assigned to the fruit or to the background);
• Precision_P = TP_P / (TP_P + FP_P) – the share of pixels actually belonging to
  fruits among all pixels assigned by the machine vision system to fruits;
• Recall_P = TP_P / (TP_P + FN_P) – the share of pixels correctly assigned by the
  machine vision system to fruits, among all pixels truly belonging to fruits;
• F1_P = 2 · Precision_P · Recall_P / (Precision_P + Recall_P) – the harmonic
  mean of precision and recall.
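As a minimal sketch of how these pixel-level metrics are computed, the following toy example counts the four confusion-matrix cells from two binary masks (the masks are invented for illustration, not output of any cited vision system):

```python
# Pixel-level quality metrics from two binary masks
# (1 = fruit pixel, 0 = background pixel).

def pixel_metrics(truth, pred):
    tp = sum(t == 1 and p == 1 for t, p in zip(truth, pred))  # TP_P
    tn = sum(t == 0 and p == 0 for t, p in zip(truth, pred))  # TN_P
    fp = sum(t == 0 and p == 1 for t, p in zip(truth, pred))  # FP_P
    fn = sum(t == 1 and p == 0 for t, p in zip(truth, pred))  # FN_P
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    return accuracy, precision, recall, f1

# Flattened 1-D stand-in for an image: 6 fruit pixels, 6 background pixels.
truth = [1, 1, 1, 1, 1, 1, 0, 0, 0, 0, 0, 0]
pred  = [1, 1, 1, 1, 0, 0, 1, 0, 0, 0, 0, 0]
acc, prec, rec, f1 = pixel_metrics(truth, pred)
# acc = (4 + 5) / 12 = 0.75, prec = 4/5 = 0.8, rec = 4/6
```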

    Tables 1 and 2 summarize the quality metrics calculated by the developers of
known fruit harvesting robots.
    From a practical point of view, such indicators only indirectly determine the
quality of the robotic harvesting system, since the robot collects fruits, not pixels.
    With the spread of convolutional neural networks for assessing the quality of
fruit detection systems, the IoU (Intersection over Union) metric has become
popular.
    In Fig. 2, the navy rectangular frame is drawn around the ground truth fruit,
and the red frame results from applying the fruit detection algorithm of the
machine vision system.
    A fruit detection system is considered to work satisfactorily if

    IoU = ( Σ_{all objects} Area of Intersection ) / ( Σ_{all objects} Area of Union ) > 0.5.
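For a single pair of axis-aligned bounding boxes like those in Fig. 2, the intersection and union areas behind the IoU metric can be sketched as follows (the box coordinates are invented for illustration):

```python
# IoU for two axis-aligned boxes given as (x1, y1, x2, y2) corner coordinates.

def iou(box_a, box_b):
    ax1, ay1, ax2, ay2 = box_a
    bx1, by1, bx2, by2 = box_b
    # Overlap rectangle; width/height clamp to 0 when the boxes do not intersect.
    iw = max(0, min(ax2, bx2) - max(ax1, bx1))
    ih = max(0, min(ay2, by2) - max(ay1, by1))
    inter = iw * ih
    union = (ax2 - ax1) * (ay2 - ay1) + (bx2 - bx1) * (by2 - by1) - inter
    return inter / union

truth_box = (0, 0, 10, 10)   # ground truth fruit frame
detected  = (5, 5, 15, 15)   # frame produced by the detector
score = iou(truth_box, detected)
# inter = 5*5 = 25, union = 100 + 100 - 25 = 175, IoU = 25/175 < 0.5
```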

    However, in practice, this indicator, like the quality indicators calculated from
the pixel classification confusion matrix, is only an indirect indicator of the
quality of the fruit-picking system.
    Therefore, current works have gradually begun to use quality metrics based on
the fruit detection confusion matrix (Fig. 3).
    The notation in the fruit detection confusion matrix has the following meanings:
• TP_F (True Positive) – the number of fruits correctly detected by the machine
  vision system;
• FP_F (False Positive) – the number of errors of the first kind, i.e., background
  objects in the image mistakenly accepted by the machine vision system as fruits;
• FN_F (False Negative) – the number of errors of the second kind, i.e., fruits not
  detected by the machine vision system.
    From TP_F, FP_F, and FN_F, the following quality metrics of a fruit detection
system can be calculated:
• Precision_F = TP_F / (TP_F + FP_F) – the share of actual fruits among all the
  objects that the machine vision system identified as fruits;
• Recall_F = TP_F / (TP_F + FN_F) – the share of fruits detected by the machine
  vision system;
• F1_F = 2 · Precision_F · Recall_F / (Precision_F + Recall_F) – the harmonic
  mean of precision and recall.
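Given the three detection counts, these fruit-level metrics reduce to a few lines; the counts below are hypothetical, chosen only to illustrate the formulas:

```python
# Fruit-level detection metrics from confusion-matrix counts.

def detection_metrics(tp, fp, fn):
    precision = tp / (tp + fp)                            # Precision_F
    recall = tp / (tp + fn)                               # Recall_F
    f1 = 2 * precision * recall / (precision + recall)    # F1_F
    return precision, recall, f1

# Hypothetical run: 90 fruits detected correctly, 10 background objects
# mistaken for fruits, 10 fruits missed.
prec_f, rec_f, f1_f = detection_metrics(tp=90, fp=10, fn=10)
# prec_f = 0.9, rec_f = 0.9, f1_f = 0.9
```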

  Table 1. Fruit detection quality metrics in harvesting robot prototypes (before CNNs).

Source                                         | Fruit          | N   | Accuracy_P | Recall_P | Precision_F | Recall_F | t, s
(Sites & Delwiche, 1988)                       | Apple, peach   |     | 0.90       |          |             |          |
(Plebe & Grasso, 2001)                         | Orange         | 673 |            |          | 0.15        | 0.87     | 7.1
(Zhao, Tow & Katupitiya, 2005)                 | Apple          | 20  |            |          |             | 0.90     |
(Mao, Ji, Zhan, Zhang & Hu, 2009)              | Apple          |     | 0.90       |          |             |          |
(Hannan, Burks & Bulanon, 2009)                | Orange         | 82  |            |          |             | 0.90     |
(Bulanon, Burks & Alchanatis, 2009)            | Citrus         |     |            | 0.74     |             |          |
(Seng & Mirisaee, 2009)                        | Various fruits | 14  | 0.90       |          |             |          |
(Bulanon & Kataoka, 2010)                      | Citrus         | 22  |            |          |             | 0.89     | 7.1
(Wachs, Stern, Burks & Alchanatis, 2010)       | Apple          | 180 |            | 0.74     |             |          |
(Kurtulmus, Lee & Vardar, 2011)                | Citrus         | 64  |            |          |             | 0.75     |
(Arefi, Motlagh, Mollazade & Teimourlou, 2011) | Tomato         | 110 | 0.96       |          |             |          |
(Patel, Jain & Joshi, 2011)                    | Apple          |     |            | 0.90     |             |          |
(Linker, Cohen & Naor, 2012)                   | Apple          | 9   | 0.85       |          |             |          |
(Ji et al., 2012)                              | Apple          | 22  |            |          |             | 0.89     |
(Zhan, He & Shi, 2013)                         | Kiwi           | 215 |            |          | 0.93        | 0.97     |
(Wei et al., 2014)                             | Apple          | 80  |            |          |             | 0.95     |
(Lu, Sang & Hu, 2014)                          | Citrus         | 20  | 0.87       |          |             |          |
(Zhao, Gong, Huang & Liu, 2016)                | Tomato         | 171 |            |          | 0.84        | 0.97     |
(Tao & Zhou, 2017)                             | Apple          | 59  |            |          | 0.95        | 0.90     |
Source: Compiled by the author.
    Table 2. Fruit detection quality metrics in harvesting robot prototypes (CNN models).

Source                                                 | Fruit                | N      | Accuracy_P | IoU  | Precision_F | Recall_F | F1   | t, s
(Sa et al., 2016)                                      | Various fruits       | 118    |            |      | 0.81        | 0.84     | 0.90 | 0.40
(Liu et al., 2020)                                     | Kiwi                 | 2,518  |            |      | 0.90        | 0.91     |      | 0.13
(Bargoti & Underwood, 2017)                            | Apple, mango, almond | 488    |            |      | 0.96        | 0.86     | 0.90 |
(Mureşan & Oltean, 2018)                               | Various fruits       | 15,563 | 0.96       |      |             |          |      |
(Gan, Lee, Alchanatis, Ehsani & Schueller, 2018)       | Citrus               | 50     |            |      |             | 0.96     | 0.90 |
(Williams, Jones, Nejati, Seabright & MacDonald, 2018) | Kiwi                 | 1,456  |            |      |             | 0.76     |      |
(Peebles, Lim, Duke & McGuinness, 2019)                | Asparagus            | 74     |            |      |             |          | 0.73 |
(Yu, Zhang, Yang & Zhang, 2019)                        | Strawberry           | 100    |            | 0.90 |             | 0.96     | 0.95 |
(Jia, Tian, Luo, Zhang & Zheng, 2020)                  | Apple                | 120    |            |      |             | 0.97     | 0.96 |
(Gené-Mola et al., 2020)                               | Apple                | 1,021  |            |      |             |          | 0.87 |
(Tian, Yang, Wang, Li & Liang, 2019)                   | Apple                | 480    |            | 0.90 |             |          | 0.81 | 0.30
(Kang & Chen, 2020)                                    | Apple                | 560    |            | 0.87 | 0.87        | 0.88     | 0.87 | 0.70
(Wan & Goudos, 2020)                                   | Orange, apple, mango | 490    |            |      |             | 0.90     |      | 0.58
Source: Compiled by the author.


3        Results
From a practical point of view, the following indicators are the essential metrics
of machine vision system quality for assessing fruit harvesting robots:

    False Negative Rate_F = FNR_F = 1 − Recall_F = FN_F / (TP_F + FN_F) – the
share of undetected fruits;

    False Positive Rate_F = FPR_F = 1 − Precision_F = FP_F / (TP_F + FP_F) – the
share of objects mistaken for fruits, which affects the harvesting speed.
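These two rates follow directly from the detection counts as the complements of recall and precision; a short sketch with hypothetical counts:

```python
# Error rates of a fruit detection system from confusion-matrix counts.

def error_rates(tp, fp, fn):
    fnr = fn / (tp + fn)  # FNR_F: share of undetected fruits, 1 - Recall_F
    fpr = fp / (tp + fp)  # FPR_F: share of objects mistaken for fruits, 1 - Precision_F
    return fnr, fpr

# Hypothetical run: 90 true detections, 10 false alarms, 10 missed fruits.
fnr, fpr = error_rates(tp=90, fp=10, fn=10)
# fnr = 0.1, fpr = 0.1
```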
    To make a decision on purchasing a fruit harvesting robot, it is also necessary
to understand which part of the detected fruits the robot can actually pick. In the
process of robot creation, it is essential to assess the share of successfully
harvested fruits among those detected. In addition, essential characteristics of the
robot are the average fruit detection time (presented in seconds in column t of
Tables 1 and 2), the average fruit picking time, the share of damaged fruits, the
share of lost fruits, and the share of unpicked fruits.

Fig. 2. Intersection over Union for fruit detection: (a) ground truth fruit bounding box and
detected fruit bounding box; (b) intersection; (c) union. Source: Compiled by the author.

                                        Actual
                                   Fruit    Background
            Detected fruits         TP_F       FP_F
            Undetected fruits       FN_F        –

Fig. 3. Confusion matrix for fruit detection in harvesting robots. Source: (Fawcett, 2006).


4      Discussion

As can be seen from Tables 1 and 2, fewer than half of the robot developers
estimate the share of fruits not detected by the robot and the share of objects
mistakenly considered fruits.
   Only a small minority of developers provide data on the average fruit detection
time, and almost none report the average fruit picking time or the shares of
successfully picked fruits among those detected, damaged fruits, lost fruits, and
unpicked fruits.
    Of all the papers examined, only (Williams et al., 2018) noted that the robot
could detect 76% of the kiwifruits, while the manipulator could reach 55% of
them. In field trials, the robot harvested an orchard containing 1,456 kiwifruits.
As a result, 50.9% of the fruits were harvested, 24.6% were lost during the
harvesting process, and 24.5% remained on the trees. Picking one fruit took, on
average, about five seconds, with the neural networks taking most of that time.
Nevertheless, it is currently one of the fastest harvesting robots.


5      Conclusion

The development of fruit-picking robots will replace heavy manual labor in
horticulture, increase the area of orchards, reduce costs, and reduce crop
shortages.
    The analysis shows that fruit detection speed has increased significantly with
the adoption of convolutional neural networks in the machine vision systems of
fruit harvesting robots. This indicates that robotic harvesting technology will be
introduced into horticulture in the near future. Nevertheless, robots must become
much cheaper before gardeners start considering a switch to robotic technology,
and gardeners must receive a clear justification of the robots' efficiency.
    While the first problem will be solved by technological development itself,
solving the second requires developers to pay more attention to evaluating the
effectiveness of robots and analyzing the quality metrics noted in this paper.


References
Alpha Brown. (2017). Agricultural robotic harvesting solutions. Retrieved from
   https://www.alphabrown.com/product-page/robotic-harvesting-u-s-market-study
Arefi, A., Motlagh, A. M., Mollazade, K., & Teimourlou, R. F. (2011). Recognition and
   localization of ripen tomato based on machine vision. Australian Journal of Crop
   Science, 5(10), 1144-1149.
Bac, C. W., van Henten, E. J., Hemming, J., & Edan, Y. (2014). Harvesting robots for high-
   value crops: State-of-the-art review and challenges ahead. Journal of Field Robotics,
   31(6), 888-911. DOI: 10.1002/rob.21525
Bargoti, S., & Underwood, J. (2017). Deep fruit detection in orchards. In Proceedings from
   ICRA 2017: The 2017 IEEE International Conference on Robotics and Automation (pp.
   3626-3633). Singapore, Singapore: Institute of Electrical and Electronics Engineers.
   DOI: 10.1109/ICRA.2017.7989417
Bulanon, D. M., & Kataoka, T. (2010). A fruit detection system and an end effector for
   robotic harvesting of Fuji apples. Agricultural Engineering International: CIGR Journal,
   12(1), 203-210.
Bulanon, D. M., Burks, T. F., & Alchanatis, V. (2009). Image fusion of visible and thermal
   images for fruit detection. Biosystems Engineering, 103(1), 12-22.
Fawcett, T. (2006). An introduction to ROC analysis. Pattern Recognition Letters, 27(8),
   861-874. DOI: 10.1016/j.patrec.2005.10.010
Gan, H., Lee, W. S., Alchanatis, V., Ehsani, R., & Schueller, J. K. (2018). Immature green
     citrus fruit detection using color and thermal images. Computers and Electronics in
     Agriculture, 152, 117-125. DOI: 10.1016/j.compag.2018.07.011
Gené-Mola, J., Gregorio, E., Cheein, F. A., Guevara, J., Llorens, J., Sanz-Cortiella, R., …
     Rosell-Polo, J. R. (2020). Fruit detection, yield prediction and canopy geometric
     characterization using LiDAR with forced air flow. Computers and Electronics in
     Agriculture, 168, 105-121. DOI: 10.1016/j.compag.2019.105121
Hannan, M. W., Burks, T. F., & Bulanon, D. M. (2009). A machine vision algorithm
     combining adaptive segmentation and shape analysis for orange fruit detection.
     Agricultural Engineering International: CIGR Journal, 11, 1-17.
Holland, S. W., & Nof, S. Y. (1999). Emerging trends and industry needs. In S. Y. Nof (Ed.),
     Handbook of Industrial Robotics (pp. 31-40). New York, NY: Wiley.
Ji, W., Zhao, D., Cheng, F. Y., Xu, B., Zhang, Y., & Wang, J. (2012). Automatic recognition
     vision system guided for apple harvesting robot. Computers and Electrical Engineering,
     38(5), 1186-1195. DOI: 10.1016/j.compeleceng.2011.11.005
Jia, W., Tian, Y., Luo, R., Zhang, Zh., & Zheng, Y. (2020). Detection and segmentation of
     overlapped fruits based on optimized mask R-CNN application in apple harvesting robot.
     Computers       and     Electronics     in    Agriculture,    172,    105380.       DOI:
     10.1016/j.compag.2020.105380
Kang, H., & Chen, C. (2020). Fruit detection, segmentation and 3D visualization of
     environments in apple orchards. Computers and Electronics in Agriculture, 171, 105302.
     DOI: 10.1016/j.compag.2020.105302
Kurtulmus, F., Lee, W. S., & Vardar, A. (2011). Green citrus detection using ‘eigenfruit’,
     color and circular Gabor texture features under natural outdoor conditions. Computers
     and Electronics in Agriculture, 78(2), 140-149.
Linker, R., Cohen, O., & Naor, A. (2012). Determination of the number of green apples in
     RGB images recorded in orchards. Computers and Electronics in Agriculture, 81(1), 45-
     57. DOI: 10.1016/j.compag.2011.11.007
Liu, Z., Wu, J., Fu, L., Majeed, Y., Feng, Y., Li, R., … Cui, Y. (2020). Improved kiwifruit
     detection using pre-trained VGG16 with RGB and NIR information fusion. IEEE Access,
     8, 2327-2336. DOI: 10.1109/ACCESS.2019.2962513
Lu, J., Sang, N., & Hu, Y. (2014). Detecting citrus fruits with highlight on tree based on
     fusion of multi-map. Optik, 125(8), 1903-1907.
Mao, W. H., Ji, B. P., Zhan, J. C., Zhang, X. C., & Hu, X. A. (2009). Apple location method
     for the apple harvesting robot. In Proceedings from CIPE 2009: The 2nd International
     Congress on Image and Signal Processing (pp. 1-5). Tianjin, China: Institute of Electrical
     and Electronics Engineers. DOI: 10.1109/CISP.2009.5305224
Mureşan, H. & Oltean, M. (2018). Fruit recognition from images using deep learning. Acta
     Universitatis Sapientiae. Informatica, 10(1), 26-42. DOI: 10.2478/ausi-2018-0002
Patel, H. N., Jain, R. K., & Joshi, M. V. (2011). Fruit detection using improved multiple
     features based algorithm. International Journal of Computer Applications, 13(2), 1-5.
Peebles, M., Lim, S. H., Duke, M., & McGuinness, B. (2019). Investigation of optimal
     network architecture for asparagus spear detection in robotic harvesting. IFAC
     PapersOnLine, 52(30), 283-287. DOI: 10.1016/j.ifacol.2019.12.535
Plebe, A., & Grasso, G. (2001). Localization of spherical fruits for robotic harvesting.
     Machine Vision and Applications, 13(2), 70-79. DOI: 10.1007/PL00013271
Sa, I., Ge, Z., Dayoub, F., Upcroft, B., Perez, T., & McCool, C. (2016). DeepFruits: A fruit
     detection system using deep neural networks. Sensors, 16(8), 1222-1244. DOI:
     10.3390/s16081222
Seng, W. C., & Mirisaee, S. H. (2009). A new method for fruits recognition system. In
     Proceedings from ICEEI 2009: The 2009 International Conference on Electrical
    Engineering and Informatics (Vol. 1, pp. 130-134). Selangor, Malaysia: Institute of
    Electrical and Electronics Engineers. DOI: 10.1109/ICEEI.2009.5254804
Sites, P. W., & Delwiche, M. J. (1988). Computer vision to locate fruit on a tree.
     Transactions of the American Society of Agricultural Engineers, 31(1), 257-263.
Tao, Y., & Zhou, J. (2017). Automatic apple recognition based on the fusion of color and 3D
    feature for robotic fruit picking. Computers and Electronics in Agriculture, 142(A), 388–
    396. DOI: 10.1016/j.compag.2017.09.019
Tian, Y., Yang, G., Wang, Zh., Li, E., & Liang, Z. (2019). Detection of apple lesions in
    orchards based on deep learning methods of CycleGAN and YOLO-V3-Dense. Journal
    of Sensors, Special Issue, Sensors in Precision Agriculture for the Monitoring of Plant
    Development and Improvement of Food Production, 2019, 1-14. DOI:
    10.1155/2019/7630926
Wachs, J. P., Stern, H. I., Burks, T., & Alchanatis, V. (2010). Low and high-level visual
    feature-based apple detection from multi-modal images. Precision Agriculture, 11, 717-
    735. DOI: 10.1007/s11119-010-9198-x
Wan, Sh., & Goudos, S. (2020). Faster R-CNN for multi-class fruit detection using a robotic
    vision system. Computer Networks, 168, 107036. DOI: 10.1016/j.comnet.2019.107036
Wei, X., Jia, K., Lan, J., Li, Y., Zeng, Y., & Wang, C. (2014). Automatic method of fruit
    object extraction under complex agricultural background for vision system of fruit
    picking robot. Optik, 125(12), 5684-5689. DOI: 10.1016/j.ijleo.2014.07.001
Williams, H. A. M., Jones, M. H., Nejati, M., Seabright, M. J., & MacDonald, B. A. (2019).
    Robotic kiwifruit harvesting using machine vision, convolutional neural networks, and
    robotic       arms.       Biosystems     Engineering,       181,      140-156.       DOI:
    10.1016/j.biosystemseng.2019.03.007
Yu, Y., Zhang, K., Yang, L., & Zhang, D. (2019). Fruit detection for strawberry harvesting
    robot in non-structural environment based on Mask-RCNN. Computers and Electronics
    in Agriculture, 163, 104846. DOI:10.1016/j.compag.2019.06.001
Zhan, W. T., He, D. J., & Shi, S. L. (2013). Recognition of kiwifruit in field based on
    Adaboost algorithm. Transactions of the Chinese Society of Agricultural Engineering,
    29(23), 140-146.
Zhao, J., Tow, J., & Katupitiya, J. (2005). On-tree fruit recognition using texture properties
    and color data. In Proceedings from ICIRS 2005: IEEE/RSJ International Conference on
    Intelligent Robots and Systems (pp. 263-268). Edmonton, Canada: Institute of Electrical
    and Electronics Engineers. DOI: 10.1109/IROS.2005.1545592
Zhao, Y. S., Gong, L., Huang, Y. X., & Liu, C. L. (2016). Detecting tomatoes in greenhouse
    scenes by combining AdaBoost classifier and colour analysis. Biosystems Engineering,
    148(8), 127-137. DOI: 10.1016/j.biosystemseng.2016.05.001