=Paper= {{Paper |id=Vol-2030/HAICTA_2017_paper83 |storemode=property |title=A Windows Phone Application for Plant Disease Diagnosis |pdfUrl=https://ceur-ws.org/Vol-2030/HAICTA_2017_paper83.pdf |volume=Vol-2030 |authors=Nikos Petrellis |dblpUrl=https://dblp.org/rec/conf/haicta/Petrellis17 }} ==A Windows Phone Application for Plant Disease Diagnosis== https://ceur-ws.org/Vol-2030/HAICTA_2017_paper83.pdf
      A Windows Phone Application for Plant Disease
                      Diagnosis

                                       Nikos Petrellis

  Department of Computer Science and Engineering, Technological Educational Institute of
                  Thessaly, Larissa, Greece, e-mail: npetrellis@teilar.gr




       Abstract. Although professional agriculture engineers are responsible for the
       recognition of plant diseases, intelligent systems can help diagnose them at an
       early stage. Using such systems, low cost continuous plant monitoring can be
       applied. Some expert systems based on user descriptions and image comparison
       have been proposed in the literature for this purpose. The symptoms of a disease
       include lesions or spots on various parts of a plant. The color, area and number
       of these spots can determine to a great extent the disease that has afflicted a
       plant. Higher cost molecular analyses and tests can follow if necessary. In this
       paper, a Windows Phone application capable of measuring plant lesion features
       is described. The accuracy of the plant disease recognition is higher than 90%
       according to the experimental results obtained using grape diseases as a case
       study.


       Keywords: plant disease, lesions, image processing, agricultural production.




1 Introduction

   Plant diseases can dramatically increase the cost of agricultural production if they
are not detected and treated in their early stages. Plants have to be monitored
continuously in order to detect the first symptoms of a disease before it spreads to the
whole crop. Professional agriculture engineers may not be available to continuously
monitor a crop, especially if the crop is small or medium sized and the cost of such a
process is high, or if the crop resides in a distant rural region. Remote monitoring
through machine vision can offer an alternative. For example, the user can send
photos to professional agriculturists and ask for opinions based on the visible
symptoms. Several additional tests may have to be performed in order to confirm
whether a plant is affected by a specific disease.
   Plant disease diagnosis can be based on several symptoms that are described in
detail in (Riley et al., 2002). The symptoms can be grouped as follows: a)
underdevelopment of tissues or organs, b) overdevelopment of tissues or organs, and
c) necrosis or death of plant parts and alteration of normal appearance. The
progression of the symptoms can vary significantly and is associated with problems
caused by biotic agents. The symptoms can be classified as primary and secondary.
For example, the




initial stages of a disease can include decayed roots of a tree, while the secondary
symptoms may include the toppling over of the tree. The invaders of the later disease
stages may also obscure the original disease symptoms, misleading the diagnosis.
Improper herbicide usage can cause symptoms similar to the spots caused by an
infectious agent. The sudden appearance of the symptoms and the absence of
progression can be an indication that the cause is herbicide usage. Moreover, new
leaves are generally free of symptoms. More than one pathogen can often infect a
plant, and the symptoms in this case may be significantly different from the
symptoms shown by each of the pathogens when they act separately.
   The symptoms of a pathogen are often expressed as fungal or bacterial leaf spots.
Vein banding, mosaic and ringspot can also appear. The leaves can be distorted or a
powdery mildew can appear. Spore structures may be present. The needles may drop
in conifer trees. The plants can be injured by chemical spray, air pollution or soil/air
chemicals. Cankers can appear on the branches of a tree. The fruits can exhibit
decay, rot or discoloration. Abnormal wilts or dying branches can appear. In most of
the cases listed above, an image processing technique could be used to locate the
lesions and quantify them by estimating the number of spots, their area, their color,
etc.
   Several image processing techniques and molecular tests are reviewed in
(Sankaran et al., 2010). The sensitivity of molecular tests depends on the minimum
detectable amount of microorganism. ELISA is a popular molecular diagnostic
technique based on the use of a microbial protein associated with the plant disease.
The antibodies are produced by an animal after this protein is injected into it.
Another popular technique, based on DNA analysis, is PCR (Schaad et al., 2002).
The author of this paper has recently been involved in the development of portable
low cost equipment capable of performing molecular tests (Georgakopoulou et al.,
2016).
   Observation (non-destructive) techniques like spectroscopy and image processing
can be used for plant disease diagnosis based on the symptoms. Spectroscopic
techniques can identify water stress levels and nutrient deficiency, measure fruit
quality after the harvest, etc. Infrared spectroscopy is described, for example, in
(Purcell et al., 2009). Reviews of image processing techniques in visible light for
plant disease detection can be found in (Kulkarni et al., 2012). In that paper, image
segmentation takes place in the CIE L*a*b color scale and a Gabor filter is then used
to generate the input of a neural network that achieves disease recognition with 91%
accuracy. The shape of a plant part, its texture, fractal dimensions, lacunarity,
dispersion, grey levels, grey histogram discrimination and Fourier descriptors have
also been taken into account in disease diagnosis.
   A quantification technique for fruit traits is presented in (Mix et al., 2003). An
expert system based either on graphical representation or on a step-by-step
descriptive method is discussed in (Abu-Naser et al., 2008). The step-by-step
description is based on a questionnaire answered by the user. In the graphical
representation, the user manually compares photos stored in a database to find the
disease that matches his case. A customized solution for corn diseases has been
presented in (Lai, 2010). Most of the image processing and spectroscopic techniques
require the analysis to be performed by high cost equipment or computationally
intensive software packages.
   In this paper, a mobile application is presented that is based on a low complexity
image processing technique that isolates the normal leaf area from the lesions.




The described image processing technique can be used in the framework of a
scalable system that can operate in standalone mode as a single smart phone
application, or in cooperation with remote clouds or databases, or with portable
molecular analysis equipment like the one described in (Georgakopoulou et al.,
2016).
   In the present version, focus is given to the lesions that can appear on various parts
of a plant, such as the leaves or the fruit. The lesions will also be called “spots” in
this paper. The developed application counts the number of these spots in the part of
the plant that appears in the photo, measures their area and grey level, and extracts
color histograms. The Red-Green-Blue (RGB) features of the spot colors, or their
CIE L*a*b scale equivalents, can be extracted since a map of the spot positions is
available. This allows the support of other referenced techniques like the one
presented in (Kulkarni et al., 2012).
   In real time operation, the producer would use a smart phone with the application
installed in order to take pictures of the affected plant parts. Then, the application
would ask him to provide some additional information that is useful for a more
reliable diagnosis. The application in its current form has been tested for a number of
plant diseases and can be easily extended to several other cases.
   In this paper we use the proposed application to measure the features of the spots
that appear on pear, orange, tangerine tree or grapevine leaves. The application can
successfully discriminate between pear tree diseases like Venturia pirina, Septoria
pyricola, Erwinia amylovora, etc., and orange/tangerine tree diseases like the fungus
Capnodium oleae. Grape diseases like Powdery or Downy Mildew, Esca, etc. are
used in this paper to extract experimental results. The experimental results also show
that the measurement of the number of spots, their grey level and their area in any
plant part photo can be achieved with higher than 90% accuracy.
   The image processing technique used for the characterization of the spots is
described in Section 2. The implementation of the Windows Phone application is
presented in Section 3 and experimental results are discussed in Section 4.




Fig. 1. Measurement of the leaf and spot size (Petrellis, 2015).




2 Plant disease recognition method

   The developed application operates on a smart phone that has to be equipped with
a camera. The mobile phone needs to be capable of connecting to the Internet only if
a richer plant disease database or cloud has to be accessed in order to support a
diversity of plants. This would also be useful if the photos taken by the phone
camera have to be accessed by an agriculture engineer to approve the final decision
of the expert system. Otherwise, the application can be used offline, which is
especially useful in distant fields where the 3G/4G coverage may be weak. In this
case, the application installed on the smart phone can support only the specific
plants that the producer is interested in. The producer can take photos of plant parts
with lesions, such as leaves and fruits, and then run the plant disease recognition
application on the captured images. The application asks for some additional
information such as the kind of plant, the part that has been photographed (e.g.,
upper or lower leaf surface), the approximate distance from which the photo was
taken, and an estimation of how many leaves have been affected on a single plant or
how many plants of the field have been affected. The producer can also be asked
about the time the symptoms appeared. Environmental conditions (history of
temperatures, moisture, etc.) can be retrieved by using the GPS locator of the smart
phone.
   The image processing algorithm incorporated in the application extracts the lesion
features, such as the number of spots, their grey level and their area, using an
algorithm that has already been described in (Petrellis, 2015). The present version is
extended to also extract the color features of the normal leaf parts and of the leaf
spots. These results can be exploited by the decision module of the application, in
conjunction with the information given by the user, in order to draw a conclusion on
the condition of the plant. In future versions the decision module can be an advanced
neural network. Here, the recognized disease is the one that gets the highest score in
a grading system that will be defined below. In a more advanced setup, the mobile
phone can cooperate with a molecular analysis module like the biosensor readout
circuit described in (Georgakopoulou et al., 2016) that has been developed for the
Corallia/LabOnChip project. The communication with this module can be performed
in a wired or wireless manner (USB, Wi-Fi, Bluetooth, etc.). In this paper, we focus
only on the stand-alone smart phone application.
   If the absolute spot dimensions have to be taken into consideration, the following
estimation method has to be followed. The spot detection algorithm assumes that the
photo of the plant part of interest has been captured from a known distance D (see
Fig. 1), e.g., one palm between the camera and the leaf. However, if the relative spot
area is important instead of the absolute spot area, then the distance D and the
following estimations are not needed. If the camera angle φ is also known, then the
half leaf length L is estimated by:
                                  L = D·tan(φ).                                       (1)
   The length L corresponds to P pixels in the captured photo. The ratio S = L/P can
be used to estimate other distances or areas from numbers of pixels. For example, if
the length of a spot along an axis is L’ and corresponds to P’ pixels, it can be
estimated as:
                                  L’ = L·(P’/P).                                      (2)
Moreover, if two points do not reside on the same vertical or horizontal axis,
equation (2) can also be used to estimate their distance, if P’ is the number of pixels
on the shortest diagonal path between them.
   If the dimensions of the covered plane area are Lx, Ly in the horizontal and
vertical axis respectively, and they correspond to Px and Py pixels, then each pixel
occupies an area Ap that is estimated as follows:
                                  Ap = (1/S²)·(Lx·Ly)/(Px·Py).                        (3)
    Using equation (3), a spot of any shape that consists of P pixels will correspond to
an area that is equal to P times the area of a single pixel Ap.
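The estimations of equations (1)-(3) can be sketched as follows. This is a minimal illustration in Python rather than the application's Windows Phone code; the function names are chosen for this sketch, and the per-pixel area is taken directly as (Lx/Px)·(Ly/Py), i.e., the physical area one pixel covers, which multiplied by a spot's pixel count gives the spot area:

```python
import math

def half_leaf_length(D, phi):
    """Eq. (1): half leaf length L for camera distance D and camera angle phi (radians)."""
    return D * math.tan(phi)

def length_from_pixels(L, P, P_prime):
    """Eq. (2): real length L' of a feature spanning P' pixels,
    given that the known length L spans P pixels."""
    return L * P_prime / P

def pixel_area(Lx, Ly, Px, Py):
    """Physical area covered by a single pixel when a plane of size
    Lx x Ly corresponds to Px x Py pixels."""
    return (Lx / Px) * (Ly / Py)

# Hypothetical example: a spot of 120 pixels on an 8 cm x 6 cm plane
# captured as an 800 x 600 pixel image.
spot_area = 120 * pixel_area(8.0, 6.0, 800, 600)   # area in cm^2
```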
    In order to apply the spot recognition algorithm, the plant part should initially be
separated from its background. This task could be implemented by a complicated
segmentation procedure or a dedicated image processing library that would consume
valuable smart phone resources. In the present version, it is assumed that the
background is much brighter than the plant color. For this reason, a leaf, for
example, is placed on a white sheet of paper as its background before it is captured
by the smart phone camera. In this way, no background separation algorithm is
needed at all. Moreover, it is also assumed that the photos have been captured under
a canopy or during a cloudy day so that shadows are not important. The pixels with a
grey level higher than a configurable threshold BG are assumed to belong to the
background. More sophisticated algorithms for the separation of the background and
the leaf shadows will be implemented in future versions of the application.
    The three color components of an image (Red, Green, Blue) can easily be handled
to extract detailed image features, but they are initially ignored and the image is
handled in grey scale. The captured image is converted into a grey image, separating
the background, the normal leaf surface and the spots. The pixels belonging to the
leaf are used to estimate an average grey level Ag. Then, the matrix of the image is
scanned to check if there are leaf pixels i with a grey level Gi that fulfil the following
condition:
                                  |Gi − Ag| > Th.                                     (4)
   If the difference between the grey level of a leaf pixel and the average Ag is higher
than the threshold Th, the specific pixel is assumed to belong to a spot. A matrix
BGW1 is constructed with the same dimensions as the original image. Each cell in
BGW1 can be one of the following three grey levels: white for background (255),
grey for normal leaf surface (e.g., 120) and black for spot (0) as shown in Fig. 2. The
BGW1 is scanned again to group neighboring pixels belonging to the same spot. The
resulting matrix BGW2 has an integer number in each one of its cells. This number is
the identity of the spot that it belongs to. If a position in BGW2 is 0, then the
corresponding pixel does not belong to a spot.
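The construction of BGW1 can be sketched as follows. This is an illustrative Python/NumPy fragment, not the application's actual code; the grey levels 255/120/0 and the thresholds BG and Th follow the description above, and the default values are hypothetical:

```python
import numpy as np

def build_bgw1(grey, BG=200, Th=60):
    """Build the BGW1 matrix from a grey-scale image (2-D uint8 array):
    255 for background, 120 for normal leaf surface, 0 for spot pixels."""
    background = grey > BG                        # bright pixels belong to the background
    leaf = ~background
    Ag = grey[leaf].astype(float).mean()          # average grey level of the leaf
    spot = leaf & (np.abs(grey.astype(float) - Ag) > Th)   # condition (4)
    bgw1 = np.full(grey.shape, 120, dtype=np.uint8)
    bgw1[background] = 255
    bgw1[spot] = 0
    return bgw1
```

With the white-paper background assumed by the paper, the single BG threshold separates the leaf, and condition (4) then separates the spots from the normal leaf surface.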




Fig. 2. Original photograph (a), visualization of BGW1 matrix (b). Color histogram generated
from the spot pixels (c).

   The following algorithm is used to construct the matrix BGW2: a) the rows are
scanned from left to right and neighboring pixels are assigned the same identity, b) if
the previous pixel on the left does not belong to a spot, the neighboring pixels in the
row above are checked and, if one or more of them has been assigned an identity
different than 0, this identity is also used for the current pixel, c) the scanning of the
BGW2 matrix is repeated, merging spot identities, until no more changes are
performed. All of the desired features of the spots are made available through the
BGW2 matrix: the maximum spot identity is the number of spots. The area covered
by the spots is estimated from the non-zero cells of BGW2 and equation (3). The
fraction of the plant part that is occupied by spots is estimated as the number of non-
zero cells divided by the total number of cells (excluding the background). The
average grey level of each spot can be estimated from the non-zero cells. The
coordinates of each spot and its dimensions can be estimated from the BGW2 matrix
cells with the same identity. A filtering step can also be applied, discarding spots that
consist of very few pixels (fewer than a threshold MinArea), because they are either
noise or too small to be considered.




   Using either the BGW1 or the BGW2 matrix, the spot pixels of the original image
can be isolated. The Red-Green-Blue (RGB) values of each spot pixel are used to
construct three histograms, each one with 256 positions. These histograms are
displayed by the application as shown in Fig. 2c. Each position of the Red, Green or
Blue histogram corresponds to the number of pixels that have exactly that color
level. For example, if position 100 of the Red histogram is equal to 150, this means
that 150 spot pixels have a Red level equal to 100. As can be seen from Fig. 2c, each
color histogram of the spots consists of a single lobe. The starting point (s), the
ending point (e) and the peak position (p) of each lobe are checked to see if they
reside within predefined limits: (smin, smax), (emin, emax) and (pmin, pmax) respectively.
These limits have been extracted statistically, by observing a number of photographs
of plant parts infected by the same disease, when the disease parameters are defined.
A disease matching score Gr is extracted:
                      Gr = Σs Xs·Ws + Σe Xe·We + Σp Xp·Wp                             (5)


   where Ws, We, Wp are the weights of each condition (if a condition should not be
taken into consideration, the corresponding weight can be 0) and Xs, Xe, Xp are
binary values indicating whether a parameter s, e or p is within the predefined limits.
For example, Xs is defined as:

                       Xs = 1 if smin ≤ s ≤ smax, and 0 otherwise.                    (6)

   Similar conditions and score weights can be derived from the color histograms of
the healthy part of the plant, as well as from global parameters like the number of
spots, their area, the average grey level of all spot pixels, the average grey level of all
the pixels that belong to the healthy plant part, the compliance of weather conditions,
etc. The disease whose parameters (smin, smax), (emin, emax), (pmin, pmax) lead to the
highest score Gr is selected as the one that matches the symptoms of the plant.
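The grading of equations (5) and (6) can be sketched as follows. This is an illustrative Python fragment; the limit and weight values below are hypothetical, and in the application the sum runs over the lobes of all three R, G, B histograms as well as the other global parameters:

```python
def lobe_features(hist):
    """Start s, end e and peak position p of the single lobe of a 256-bin histogram."""
    nonzero = [i for i, v in enumerate(hist) if v > 0]
    peak = max(range(len(hist)), key=lambda i: hist[i])
    return nonzero[0], nonzero[-1], peak

def disease_score(features, limits, weights):
    """Eq. (5): Gr = sum of W * X over the checked parameters, where each X is 1
    if the parameter lies inside its (min, max) limits (eq. (6)) and 0 otherwise."""
    return sum(w * (1 if lo <= v <= hi else 0)
               for v, (lo, hi), w in zip(features, limits, weights))

# Hypothetical histogram with a lobe from position 100 to 149, peaked at 120
hist = [0] * 256
for i in range(100, 150):
    hist[i] = 1
hist[120] = 5
s, e, p = lobe_features(hist)
Gr = disease_score((s, e, p),
                   limits=[(90, 110), (140, 160), (115, 125)],
                   weights=(1.0, 1.0, 2.0))
# the disease whose limits yield the highest Gr is selected
```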


3     Smart Phone Application

  The smart phone application described in this paper was developed for the
Windows Phone platform using Visual Studio 2015 and Silverlight. In our future
work, it will be ported to the Universal Windows Platform in order to support more
Microsoft Windows platforms. It will also be ported to Xamarin in order to support
smart phone platforms other than Microsoft Windows, such as Android. The initial
screen of the developed application is shown in Fig. 3a, where the supported plant
parts are displayed according to the plant disease database used. The user selects the
plant or tree that he is interested in. The selection can be implemented by Combo
boxes if the number of supported plants is small; otherwise a List Box or Long List
Box can be used. Then, the user selects the photo that displays the affected plant
part. The




photo is displayed on the same page, as shown in Fig. 3b, which has been captured
with the application running on a real Lumia 535 device. BGW1 images and
histograms like the ones shown in Fig. 2 are also displayed on this page. The next
page (Fig. 3c) asks the user to specify the part of the plant that the selected photo
displays.




Fig. 3. Pages of the developed application. Initial page (a) and initial page after the photo
selection (b). Selection of the plant part the photo displays (c). Additional information entered
by the user (d). Geolocation for retrieving statistical and weather information about the area (e)
and the results page (f).


   In the next page (Fig. 3d), the user is asked to give additional information and to
calibrate the image-processing algorithm. Some general questions that would make
the disease recognition easier concern the number of leaves and the number of trees
that are affected, how long ago the producer noticed that the symptoms appeared, the
age of the plants, etc. Calibration concerns the parameters MinArea, BG and Th
described in the previous section. Additional information like




current date and the geographical position of the plant can be defined in the next
page (Fig. 3e). The local weather statistics (moisture, temperatures, etc.) can be
retrieved automatically if the specific date and the geographical position of the plant
are known.

   The last page (Fig. 3f) shows the results of the analysis described in the previous
section. The analysis results include the number of spots, their relative area and their
grey level. The diagnosed disease of the plant is also listed. Moreover, advice can be
given that includes practical actions, like cutting off the sick branches and leaves and
burning them, or prevention actions that can restrict the spreading of the disease.
The application could even suggest appropriate herbicides, fungicides, etc., for the
treatment of the disease, if it is certified by the appropriate authority.



4 Experimental Results




Fig. 4. Original image of a pear tree leaf (a), and the isolation of the spots (in white color) with
Th=60 (b), Th=80 (c) and Th=40 (d).

   In this section, the ability of the described application to accurately estimate the
spot number and area is investigated. Some cases where the application fails to
isolate the lesions are also discussed. Finally, experimental results from the use of the
proposed application in recognizing grape diseases are presented.
   The spot recognition method described in Section 2 has been applied to several
photos of leaves from pear and citrus trees, and grapevines. The number of spots and
the area they occupy on the leaf are significant inputs for the decision module of the
mobile disease recognition application described in the previous section. The
accuracy in estimating the features of these spots depends heavily on the value of the
parameter Th. For example, in Fig. 4b, Th=60 and, using this value, the number of
estimated spots is 49. The number of real spots in the photo of Fig. 4a is 45, or 50 if
some considerably smaller ones are also taken into account. The real area of the
spots is 14% of the leaf. The spot area may be a more credible measure than the
number of spots, since the algorithm may split some spots or may consider multiple
spots as one, depending on the value selected for Th. The estimated spot area in Fig.
4b is 12.34%. The MinArea parameter for this case was set equal to 4 pixels, i.e.,
spots consisting of fewer than 4 pixels are not considered. In order to understand the




importance of the appropriate Th value selection, two additional images are included
in Fig. 4. Fig. 4c shows the spot recognition result for Th=80. The number of spots
and their area in this case are 51 and 7.6% respectively. Although the number of
spots recognized is quite close to the actual number, the estimated area is far from
the real spot area. Fig. 4d shows the spot recognition result for Th=40. The number
of spots and their area in this case are 77 and 20% respectively. The visualization of
BGW1 matrices, like the one displayed in Fig. 2b, can be used to calibrate the spot
recognition algorithm by selecting an appropriate value for Th in real time.




Fig. 5. A grape leaf in inverted grey scale (a), spot isolation with Th=80 (b) and Th=75 (c).

   Spot recognition will be difficult in some cases even if the Th parameter is well
tuned. Such a case is demonstrated in Fig. 5, where a vine leaf is displayed. The
problems in this case are the following: a) the background is not bright enough and
has some spots that are confused with the leaf spots (difficulty in selecting an
appropriate BG value), b) the color of the spots is the same as that of the leaf veins
and c) the area of each spot is surrounded by a halo brighter than the leaf color and
an internal spot that is darker. As can be seen in Fig. 5, the current algorithm cannot
be applied in this case, since the spots are not recognized for any Th or MinArea
value. Supporting these cases will require a more sophisticated, but still simple,
algorithm. Current development focuses on also isolating the halo of the spots and
their perimeters.

Table 1. Average error in the estimated number of spots and their area for Pear and Citrus
tree leaves.

 Tree       Part                          Th     Error in the estimated   Error in the estimated
                                                 number of spots          spot area
 Pear       Lower surface of the leaf     60     7%                       11%
 Citrus     Upper surface of the leaf     110    7%                       12%

   The spot recognition method was tested on spots like the ones shown in Fig. 4, for
several Pear, Orange and Tangerine tree leaves that had been affected by diseases
like the fungus Capnodium oleae, Erwinia amylovora, etc. The accuracy in the
estimation of the number of spots and their area is demonstrated in Table 1. In all
cases MinArea=4, which seems to be a proper global value for the spot dimensions
in these specific diseases. The errors could have been further reduced by choosing
more accurate Th values, but the values have been selected as multiples of 10, since
it would be difficult for the user to perform a finer tuning in real time.
   The developed application was also evaluated using grape leaves infected by the
following diseases: Downy Mildew, Powdery Mildew, Esca and Phomopsis. The
grading method defined by equations (5) and (6) was used, taking into consideration
the starting/ending points and peaks of the color histogram lobes, the number of
spots, their area, their grey levels and the compliance with weather data. A training
set of 20 leaf photographs was used to define the limits. After using this small
number of training samples to define the disease recognition rules, the application
was tested using a benchmark of 100 photographs. These photos were classified into
diseases by the application, based on the highest grade that they received for each
disease. The success rate of the disease classification is shown in Table 2.
   These experimental results show that the proposed low complexity disease
recognition method achieves an accuracy that is comparable to, or better than, state-
of-the-art disease classification methods. For example, in (Kulkarni et al., 2012) the
success rate of the disease recognition is 91%.

Table 2. Grape disease classification results.

                    Grape Disease                  Successful Classification

                    Downy Mildew                   90%
                    Powdery Mildew                 98%
                    Phomopsis                      88%
                    Esca                           98%



5 Conclusions

   An image processing technique that can be easily implemented on smart phones,
capable of recognizing plant lesion features and diseases, has been presented. The
number of spots on plant leaves and their area were estimated with an accuracy
higher than 90%. The disease recognition was tested on grape leaves, with a
classification accuracy between 88% and 98%, depending on the disease.
   In our future work, more diseases will be tested and the application will be ported
to different mobile phone platforms.

Acknowledgments. This work is protected by the provisional patent 1008484,
published by the Greek Patent Office, May 12, 2015.




References

1.  Abu-Naser, S.S., Kashkash, K.A. and Fayyad, M. (2008) Developing an Expert
    System for Plant Disease Diagnosis. Asian Network for Scientific Information,
    Journal of Artificial Intelligence, 1(2), p. 78-85.
2.  Georgakopoulou, K., Spathis, C., Petrellis, N. and Birbas, A. (2016) A
    Capacitive to Digital Converter with Automatic Range Adaptation. IEEE
    Transactions on Instrumentation and Measurement, 65(2), p. 336-345.
3.  Kulkarni, A. and Patil, A. (2012) Applying image processing technique to detect
    plant diseases. International Journal of Modern Engineering Research, 2(5), p.
    3361-3364.
4.  Lai, J.C., Ming, B., Li, S.K., Wang, K.R., Xie, R.Z. and Gao, S.J. (2010) An
    Image-Based Diagnostic Expert System for Corn Diseases. Agricultural
    Sciences in China, 9(8), p. 1221-1229.
5.  Mix, C., Picó, F.X. and Ouborg, N.J. (2003) A Comparison of Stereomicroscope
    and Image Analysis for Quantifying Fruit Traits. Seed Technology, 25(1).
6.  Petrellis, N. (2015) Plant Disease Diagnosis Based on Image Processing,
    Appropriate for Mobile Phone Implementation. Proceedings of the 7th HAICTA
    2015 Conference, Kavala, Greece, Sep. 17-20, 2015, p. 238-246.
7.  Purcell, D.E., O'Shea, M.G., Johnson, R.A. and Kokot, S. (2009) Near infrared
    spectroscopy for the prediction of disease rating for Fiji leaf gall in sugarcane
    clones. Applied Spectroscopy, 63(4), p. 450-457.
8.  Riley, M.B., Williamson, M.R. and Maloy, O. (2002) Plant disease diagnosis.
    The Plant Health Instructor. doi: 10.1094/PHI-I-2002-1021-01
9.  Sankaran, S., Mishra, A., Ehsani, R. and Davis, C. (2010) A review of advanced
    techniques for detecting plant diseases. Computers and Electronics in
    Agriculture, 72.
10. Schaad, N.W. and Frederick, R.D. (2002) Real-time PCR and its application for
    rapid plant disease diagnostics. Canadian Journal of Plant Pathology, 24(3), p.
    250-258.



