<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta>
      <journal-title-group>
        <journal-title>Glasgow, UK, April</journal-title>
      </journal-title-group>
    </journal-meta>
    <article-meta>
      <title-group>
        <article-title>Multiple leaflets-based identification approach for compound leaf species</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Olfa Mzoughi</string-name>
          <email>olfa.mzoughi@inria.fr</email>
          <xref ref-type="aff" rid="aff2">2</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Itheri Yahiaoui</string-name>
          <email>itheri.yahiaoui@inria.fr</email>
          <xref ref-type="aff" rid="aff1">1</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Nozha Boujemaa</string-name>
          <email>nozha.boujemaa@inria.fr</email>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Ezzeddine Zagrouba</string-name>
          <email>ezzeddine.zagrouba@fsm.rnu.tn</email>
          <xref ref-type="aff" rid="aff3">3</xref>
        </contrib>
        <aff id="aff0">
          <label>0</label>
          <institution>1-INRIA Saclay France</institution>
        </aff>
        <aff id="aff1">
          <label>1</label>
          <institution>1-Inria Rocquencourt France, 2-CReSTIC Université de</institution>
          ,
          <addr-line>Reims</addr-line>
          ,
          <country country="FR">FRANCE</country>
        </aff>
        <aff id="aff2">
          <label>2</label>
          <institution>1-Inria Saclay France</institution>
          ,
          <addr-line>2-SIIVA/RIADI Institut Supérieur d'Informatique</addr-line>
          ,
          <institution>Université de Tunis El Manar</institution>
          ,
          <country country="TN">Tunisia</country>
        </aff>
        <aff id="aff3">
          <label>3</label>
          <institution>1-SIIVA/RIADI Institut Supérieur d'Informatique, Université de Tunis El Manar</institution>
          ,
          <country country="TN">Tunisia</country>
        </aff>
      </contrib-group>
      <pub-date>
        <year>2014</year>
      </pub-date>
      <volume>1</volume>
      <issue>2014</issue>
      <fpage>53</fpage>
      <lpage>60</lpage>
      <abstract>
        <p>Leaves of plants can be classified as either simple or compound according to their shape. Compound leaves can be seen as a collection of simple leaf-like structures called leaflets. However, most computer vision-based approaches describe these two leaf categories in the same way. In this paper, we propose a new description and identification method for compound leaves that takes into account the particular arrangement of their shapes, specifically their division into leaflets. We propose a new multiple leaflets-based identification approach, motivated by the fact that some compound leaf species exhibit variability in the number, size and even shape of their leaflets; a local description based on a small number of leaflets may therefore provide greater accuracy. In our approach, we limit ourselves to three leaflets that are automatically extracted from the image based on geometric assumptions inspired by botany. We then construct and evaluate our identification scheme using classical texture descriptors for the local description of leaflets and state-of-the-art fusion algorithms to combine the responses obtained from each leaflet query. Experiments carried out on the compound leaves of the Pl@ntLeaves scan database show an improvement in classification results with respect to the entire-image query.</p>
      </abstract>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>1. INTRODUCTION</title>
      <p>
        New interdisciplinary technologies that integrate computer vision in botanical research are being developed in response to ecological challenges such as global climate change, rapid urban development, destruction of habitats, overexploitation of natural resources, food insecurity, and biodiversity crises. In particular, computer vision studies are increasingly focusing on accurate, complete and user-friendly systems for the taxonomic identification of plant species (i.e., intended for a wide range of people, not only experts). A number of systems have already been built, for instance Leafsnap in America [
        <xref ref-type="bibr" rid="ref15 ref19">15, 19</xref>
        ], CLOVER [
        <xref ref-type="bibr" rid="ref25">25</xref>
        ] in Asia, and Pl@ntNet [
        <xref ref-type="bibr" rid="ref4">4</xref>
        ], ReVes [
        <xref ref-type="bibr" rid="ref5">5</xref>
        ] and ENVIROFI [
        <xref ref-type="bibr" rid="ref1">1</xref>
        ] in Europe; most of these systems use leaves to identify plant species. In fact, unlike other organs such as flowers, fruits or seeds, leaves are generally easy to collect (available throughout the year) and to scan or photograph (they have an approximately two-dimensional shape). Moreover, they often hold discriminative information that is useful for characterizing plant species.
      </p>
      <p>
        Existing leaf-based plant identification approaches differ in several aspects. One is the type of feature used. Fundamental features are shape [
        <xref ref-type="bibr" rid="ref22 ref29 ref30">22, 30, 29</xref>
        ] and texture [
        <xref ref-type="bibr" rid="ref3 ref8 ref9">9, 3, 8</xref>
        ], which describe respectively the leaf margins and the vein pattern, the main key indicators of leaf species. Another aspect concerns the way the leaf is viewed: using generic or domain-specific representations. Generic approaches use common computer vision representations such as the Shape Context, the Curvature Scale Space, the Multi-Scale Fractal representation, and the Fourier and Wavelet Transforms [
        <xref ref-type="bibr" rid="ref8">8</xref>
        ]. These methods have the advantage of being simple and fast. However, they are not always sufficient to provide accurate identifications, mainly due to the high inter-class and low intra-class similarity that occur for some species in terms of certain characteristics. For example, in the case of Acer negundo (see Figure 1), the use of contour descriptors may induce errors, since some specimens have lobed and/or serrated margins while others have entire margins. For that reason, there has been a recent trend toward using domain-specific or botanical knowledge, particularly about the leaf architecture, in order to enrich the leaf image representation [
        <xref ref-type="bibr" rid="ref12 ref23 ref26 ref7">7, 12, 23, 26</xref>
        ]. In fact, the leaf architecture, built and extensively used by botanists, refers to the description and categorization of leaves according to the properties of their structure. This includes several foliar characters that describe, hierarchically, the form and the placement of the different elements constituting the leaf structure, such as the venation pattern (see 1st row of Figure 2) [
        <xref ref-type="bibr" rid="ref17 ref31">31, 17</xref>
        ], marginal configurations (see 2nd row of Figure 2) [
        <xref ref-type="bibr" rid="ref6">6</xref>
        ], and shapes of leaf parts (see 3rd row of Figure 2) [
        <xref ref-type="bibr" rid="ref14">14</xref>
        ]. So far, the use of this information has remained limited to some simple characters, such as the laminar form described by the ratio of laminar width to height, or the apical and basal forms expressed respectively by the apex and base angles [
        <xref ref-type="bibr" rid="ref16">16</xref>
        ]. In this paper, we are interested in one of the most important characters, which has been less exploited: the leaf arrangement (or type). In fact, leaves of trees are grouped into two basic classes: simple and compound leaves (see Figure 3).
      </p>
      <p>
        In current leaf image retrieval approaches, simple and compound leaves are described similarly, using global descriptors computed on the whole leaf image [
        <xref ref-type="bibr" rid="ref22 ref29 ref3 ref30 ref9">22, 30, 29, 9, 3</xref>
        ]. However, compound leaves can be seen as a subdivision of simple leaves: each leaflet has a blade-like structure. For that reason, a local description whose regions of interest are leaflets may provide greater accuracy, not only in the case of occlusion or partial damage, but more specifically when dealing with partial intra-species non-similarity. For instance, the Gleditsia triacanthos species may have different types of leaflets (simple and pinnate) within the same leaf (see Figure 4). Also, the Fraxinus angustifolia and Fraxinus ornus species may have a variable number of leaflets (see Figure 5). Furthermore, the Vitex agnus-castus species holds leaflets of different sizes (see Figure 6).
      </p>
      <p>In these cases, the whole leaf images are clearly totally different. However, the similarity can be revealed by comparing leaflets separately. Based on this assertion, in this paper we propose an image retrieval system for compound leaves based on the combination of the response lists derived from each leaflet sub-image query. This involves the following steps:</p>
      <p>First, we automatically detect at most three representative leaflets of the image using geometric properties related to their contours.</p>
      <p>Then, we consider the leaflet sub-images as multiple views of the same compound leaf image. In other words, we replace the entire image by its three leaflets in the identification scheme. We index each leaflet sub-image separately. The list of responses includes all the remaining leaflet sub-images other than the two leaflets obtained from the same entire image as the query.</p>
      <p>Finally, we combine, a posteriori, the ranking lists obtained from the leaflet queries of the same original compound leaf, in order to find the overall responses for the whole image. Different state-of-the-art fusion methods are tested and evaluated with respect to the entire compound leaf image query.</p>
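      <p>The three steps above can be sketched as follows. This is only an illustration of the retrieval flow, not the system's actual code; the callables extract_leaflets, describe, search and fuse are hypothetical placeholders.</p>

```python
# Sketch of the leaflets-based retrieval pipeline described above.
# All helper callables are hypothetical placeholders, not the paper's API.

def identify(query_image, index, extract_leaflets, describe, search, fuse, k=3):
    """Replace the whole-leaf query by up to k leaflet queries, retrieve a
    ranked response list per leaflet, then fuse the lists a posteriori."""
    leaflets = extract_leaflets(query_image)[:k]     # at most three leaflets
    ranked_lists = [search(index, describe(lf)) for lf in leaflets]
    return fuse(ranked_lists)                        # single overall ranking
```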
      <p>This paper is organized as follows. First, we briefly describe previous work on parts-based plant identification, specifically work that deals with compound leaves. Next, we describe the steps of our leaflets-based retrieval scheme. Experiments and evaluations are presented in the final section.</p>
    </sec>
    <sec id="sec-2">
      <title>2. RELATED WORK</title>
      <p>
        The elementary analysis and description of leaves, or plants in general, based on their parts is traditionally performed by botanists (mainly using qualitative features) in order to identify species. Some recent computer vision approaches have used this assumption to enhance plant retrieval results. For instance, the authors of [
        <xref ref-type="bibr" rid="ref12">12</xref>
        ] combined different views of plant organs (such as flowers, bark and leaves) using a late fusion process. Analogously, the authors of [
        <xref ref-type="bibr" rid="ref23">23</xref>
        ] applied the same principle (late fusion) to the parts of simple leaves. They follow the Manual of Leaf Architecture [
        <xref ref-type="bibr" rid="ref14">14</xref>
        ] for the part definitions (which divide simple leaves into three parts: the apical, basal and margin parts) and automatically detect them based on semantic geometric features.
      </p>
      <p>
        Compound leaf identification based on parts (leaflets) has also been discussed in two previous studies. In the first one [
        <xref ref-type="bibr" rid="ref7">7</xref>
        ], the authors propose a two-stage compound leaf shape model. The first stage makes the assumption that compound leaves are reflectively symmetric and that their leaflets have the same size and orientation. In this phase, the leaflets are assimilated to uniform circles arranged, pairwise, on either side of the main axis, and defined by their positions, their radius and their distance from the main axis. In the second stage, a joint polygonal model is used to estimate the shape of the leaflets (by estimating the length, width, bilateral width, and base and apex angles of each leaflet). For each stage, an energy function based on a color dissimilarity map is minimized. This method has the advantage of accomplishing both leaf segmentation and recognition with the same model. However, it is limited by its high computational cost and by the user intervention needed to initialize the model's parameters. Moreover, the model's assumptions are so strict that they do not correspond to the reality of the processed data (i.e., they are not valid for all types of compound leaves). In fact, leaves are not always axially symmetric, even for pinnately compound leaves (see Figure 3). Furthermore, leaflet size may vary (see Figure 6). The second study that has dealt with compound leaves is presented in [
        <xref ref-type="bibr" rid="ref2">2</xref>
        ]. It looks for the top three leaflets, obtained using the following two hypotheses: (1) they are located on either side of the main axis (estimated by a polynomial of order 4); (2) they have the most elliptic shapes (defined by the ratio between the area of the shape and its minimum enclosing ellipse). The final identification stage is based on only one leaflet, selected from the three candidates because its similarity distance, computed with the complex network shape descriptor, is the lowest of the three.
      </p>
      <p>
        The approach presented in this paper has similarities with the two studies mentioned above, related to plant organs [
        <xref ref-type="bibr" rid="ref12">12</xref>
        ] and simple leaf parts [
        <xref ref-type="bibr" rid="ref23">23</xref>
        ]: we aim to identify compound leaves based on a late fusion of the responses to their part (leaflet) queries. On the other hand, the proposed method presents several differences from the two studies on leaflet-based identification [
        <xref ref-type="bibr" rid="ref2 ref7">2, 7</xref>
        ] described previously. First, we propose to decompose both pinnately and palmately compound leaf shapes, even when the leaflets are not similar or symmetric. Second, unlike [
        <xref ref-type="bibr" rid="ref2">2</xref>
        ], we use more than one leaflet in order to cope with the intra-variation of leaflet shapes. Third, we choose to retrieve each leaflet separately and to combine the ranking lists obtained, a posteriori, unlike the work presented in [
        <xref ref-type="bibr" rid="ref7">7</xref>
        ], in which an early fusion is used instead. In fact, early fusion, which consists in concatenating the leaflet representations into a single one, requires an appropriate algorithm for matching leaflet representations, since leaflets are not selected in the same order from one image to another.
      </p>
    </sec>
    <sec id="sec-3">
      <title>3. AUTOMATIC LEAFLET EXTRACTION</title>
      <p>
        Parts-based shape decomposition is generally important for shape representation and recognition. Several studies have dealt with this problem; they are mainly based on the perceptual rule of decomposing shapes into regions separated by important concavities [
        <xref ref-type="bibr" rid="ref20 ref28">28, 20</xref>
        ]. The definition of the best cut that joins minima points is still a challenging issue. Most approaches are based on a recursive procedure or use optimization criteria, which is time-consuming. Domain-specific knowledge related to shapes may be useful to simplify the decomposition process. In our case, we are dealing with the shapes of compound leaves (either pinnate or palmate). They are, by definition, fully subdivided into leaflets, arranged on either side of the rachis (main stalk) in pinnately compound leaves and centred around the base point (the point that joins the blade to the petiole) in palmately compound leaves (see Figure 3) [
        <xref ref-type="bibr" rid="ref14">14</xref>
        ]. From that, we can deduce that the generic perceptual rule can be applied to the shapes of compound leaves. In fact, leaflet shapes may be seen as regions separated by extrema points with deep concavities. In order to determine the points that delimit leaflets, we base our solution on the two following botanical assumptions:
      </p>
      <p>
        In compound leaves, concave points may correspond, besides leaflet endpoints, to other irregularities such as teeth, lobes, petiole bending, or even points derived from the aliasing effect. These points should be discarded (only points corresponding to leaflet and rachis terminals should be kept). In order to do so, we first apply a smoothing to the leaf shape. We then reject all concave points that are aligned with their two neighbouring inflexion points (see the circled green points in Figure 7). The remaining concave points (see Figure 8) are used to determine the leaflets. Notice that inflexion points and concave points are defined respectively as the zero-crossings and the local maxima with negative value of the curvature function of a contour. In practice, we compute the curvature function as presented in [
        <xref ref-type="bibr" rid="ref21">21</xref>
        ].
      </p>
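      <p>As an illustration of these definitions, the curvature of a closed contour and its concave extrema can be computed with finite differences. This is only a sketch (centred differences on a uniformly sampled contour, counter-clockwise orientation assumed), not the exact curvature computation of [21]:</p>

```python
import numpy as np

def curvature(x, y):
    """Discrete curvature of a closed contour sampled at points (x, y),
    using centred differences with periodic wrap-around."""
    dx = (np.roll(x, -1) - np.roll(x, 1)) / 2.0
    dy = (np.roll(y, -1) - np.roll(y, 1)) / 2.0
    ddx = np.roll(x, -1) - 2 * x + np.roll(x, 1)
    ddy = np.roll(y, -1) - 2 * y + np.roll(y, 1)
    return (dx * ddy - dy * ddx) / (dx * dx + dy * dy) ** 1.5

def concave_points(x, y):
    """Indices where the curvature is negative and locally extremal, i.e.
    the deep concavities that delimit leaflets (with a counter-clockwise
    contour, convex regions have positive curvature)."""
    k = curvature(x, y)
    prev, nxt = np.roll(k, 1), np.roll(k, -1)
    return np.where((k < 0) & (k <= prev) & (k <= nxt))[0]
```

On a convex contour such as a circle, no concave point is returned; on a lobed contour, one point per notch between the lobes is detected.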
      <p>
        In practice, we select at most three leaflets in order to reach a compromise between information richness, derived from leaflet variations within the same species (see Figures 4 and 6), and the leaflet overlapping problem, which may induce false detections if the number of selected leaflets is high (see Figure 10). In fact, since the processed data (the Pl@ntLeaves Scan dataset) generally present only partial leaflet overlapping, the first leaflets are often approximately complete (see Figure 10). This simple hypothesis about the number of selected leaflets ensures a low computational cost compared to sophisticated methods such as active polygonal models [
        <xref ref-type="bibr" rid="ref7">7</xref>
        ], which also fail to distinguish and delimit overlapped leaflets. Furthermore, trifoliate leaves (see Figure 3) have only three leaflets (see the 1st image, 1st column at left in Figure 9). Finally, we judge the relevance of the three selected leaflets according to their size. This is mainly important in the case of approximately totally overlapped leaflets (for example, in the 2nd column of Figure 10, only one leaflet is selected).
      </p>
    </sec>
    <sec id="sec-4">
      <title>4. LEAFLETS TEXTURE DESCRIPTION</title>
      <p>We evaluate our multiple leaflets-based identification approach by testing the texture descriptors described below. The local description of leaflet texture allows us to outline the vein networks, which are an important attribute in leaf identification.</p>
      <p>
        The Fourier histogram (Fourier), proposed in [
        <xref ref-type="bibr" rid="ref11">11</xref>
        ], describes the distribution of the spectral power density within the complex frequency plane. This is expressed using two types of histogram defined according to two partitions of the Fourier plane: the first is a disk partition used to differentiate between low, middle and high frequencies, while the second is a partition according to the different directions of the spectrum. The Edge Orientation Histogram (EOH) [
        <xref ref-type="bibr" rid="ref10">10</xref>
        ] computes the distribution of edge directions. In a leaf, the edges are composed of two parts, the interior and the exterior contours, which correspond respectively to the vein networks and the margins.
      </p>
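      <p>An EOH-style descriptor can be sketched as follows. This is a simplified illustration: the number of bins, the magnitude weighting and the edge threshold mag_thresh are our assumptions, not the parameters of [10].</p>

```python
import numpy as np

def edge_orientation_histogram(img, n_bins=8, mag_thresh=0.1):
    """Sketch of an Edge Orientation Histogram: quantize the gradient
    orientations of edge pixels into n_bins and accumulate them weighted
    by gradient magnitude. mag_thresh (a fraction of the maximum
    magnitude) selects which pixels count as edges."""
    gy, gx = np.gradient(img.astype(float))          # row then column gradient
    mag = np.hypot(gx, gy)
    ang = np.mod(np.arctan2(gy, gx), np.pi)          # orientation in [0, pi)
    edges = mag > mag_thresh * mag.max()
    bins = np.minimum((ang[edges] / np.pi * n_bins).astype(int), n_bins - 1)
    hist = np.bincount(bins, weights=mag[edges], minlength=n_bins)
    return hist / (hist.sum() + 1e-12)               # normalized distribution
```

On an image containing only a vertical step edge, all the mass falls into the horizontal-gradient bin; a horizontal step edge fills the orthogonal bin.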
      <p>
        The Local Edge Orientation Histogram (LEOH) [
        <xref ref-type="bibr" rid="ref23">23</xref>
        ]: here, instead of accumulating occurrences of gradient orientations in n bins as in the EOH, the LEOH encodes the relative frequency distribution of groups of gradient points contained within a sliding window (blob). All the local distributions are combined into a single global histogram.
      </p>
      <p>
        The Hough histogram (Hough) [
        <xref ref-type="bibr" rid="ref11">11</xref>
        ] is a 2D histogram based on the Hough transform, which gives the overall behaviour of the pixels in the image along straight lines. Each pixel is represented by the orientation of its gradient and by the projection of its position vector onto its tangent vector (i.e., the vector orthogonal to the gradient).
      </p>
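      <p>A Hough-histogram-style descriptor can be sketched as a 2D histogram over (gradient orientation, projection of the pixel position onto the tangent direction). The bin counts and the magnitude threshold below are illustrative assumptions, not the settings of [11]:</p>

```python
import numpy as np

def hough_histogram(img, n_theta=8, n_rho=8, mag_thresh=0.1):
    """Sketch of a Hough histogram: each edge pixel votes in a 2D histogram
    indexed by its gradient orientation theta and by rho, the projection of
    its position onto the tangent direction (orthogonal to the gradient)."""
    gy, gx = np.gradient(img.astype(float))
    mag = np.hypot(gx, gy)
    ys, xs = np.nonzero(mag > mag_thresh * mag.max())  # edge pixel coordinates
    theta = np.arctan2(gy[ys, xs], gx[ys, xs])         # gradient orientation
    rho = -xs * np.sin(theta) + ys * np.cos(theta)     # position . tangent
    hist, _, _ = np.histogram2d(theta, rho, bins=(n_theta, n_rho))
    return hist / (hist.sum() + 1e-12)                 # normalized 2D histogram
```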
    </sec>
    <sec id="sec-5">
      <title>5. MULTIPLE LEAFLETS-BASED QUERIES FUSION</title>
      <p>The proposed multiple leaflets-based fusion method consists in constructing a single overall ranked list by merging, a posteriori, the different lists obtained for the different leaflet sub-image queries.</p>
      <p>Let Q be the whole compound leaf image query, and Qi the leaflet-based queries, where 1 ≤ i ≤ 3. For the query Q, the visual index is composed of all the remaining images of the database, noted Rn (where 1 ≤ n ≤ N and N is the number of images in the dataset). In the same way, for each leaflet query Qi, the returned images are denoted by Rin and may belong to the set of all the remaining leaflet sub-images, except the other two leaflet sub-images obtained from the same entire image as the leaflet associated with the query Qi. For each image, all the leaflet sub-image queries Qi are indexed separately. In order to show the effectiveness of the leaflets-based fusion, we test three fusion methods:</p>
      <sec id="sec-6-1">
        <title>The leave out method (LO) [18]</title>
        <p>Responses are inserted into the final ranking list circularly from the different leaflet query lists. The best position of an image among the returned lists is kept.</p>
      </sec>
      <sec id="sec-6-2">
        <title>The inverse rank position method (IRP) [18]</title>
        <p>IRP(Q, Rn) = 1 / ( Σ(i=1..3) 1 / rRin(Qi) )</p>
        <p>where rRin(Qi) is the rank of the partial response Rin for the partial query Qi. The final list is obtained by sorting the IRP values in increasing order.</p>
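        <p>A minimal sketch of this fusion rule follows; the handling of responses that are absent from some of the lists (they simply contribute no term to the sum) is our assumption for this illustration:</p>

```python
def irp_fuse(ranked_lists):
    """Inverse Rank Position fusion: score each response R by
    IRP = 1 / sum_i (1 / r_i), where r_i is its rank (1-based) in the
    list returned for leaflet query i, and sort by increasing IRP."""
    inv_rank_sums = {}
    for ranking in ranked_lists:
        for pos, resp in enumerate(ranking, start=1):
            inv_rank_sums[resp] = inv_rank_sums.get(resp, 0.0) + 1.0 / pos
    # Smaller IRP means the response ranked well across the leaflet queries.
    return sorted(inv_rank_sums, key=lambda r: 1.0 / inv_rank_sums[r])
```

For example, a response ranked first by two of the three leaflet queries overtakes one ranked first by only one of them.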
      </sec>
      <sec id="sec-6-3">
        <title>The increasing distance method (DistInc) [27]</title>
        <p>The ascending order of the scores, which in our case correspond to the similarity distance values, defines the order of the final list. In fact, we concatenate the ranking lists obtained by the three leaflet queries into a single one, then sort the resulting list in ascending order of similarity distance to obtain the overall ranking list.</p>
        <p>Finally, once the overall ranking list is obtained after the late fusion of the leaflet queries, we apply the k-NN classifier in order to determine the identity of the query image.</p>
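        <p>The final k-NN decision can be sketched as a majority vote over the k first fused responses; the labels mapping and the tie-breaking rule (earliest occurrence wins) are assumptions of this illustration:</p>

```python
from collections import Counter

def knn_label(ranked_responses, labels, k=5):
    """k-NN decision on the fused ranking: the predicted species is the
    most frequent label among the k first responses (ties broken by the
    earliest occurrence in the ranking)."""
    top = [labels[r] for r in ranked_responses[:k]]
    return Counter(top).most_common(1)[0][0]
```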
      </sec>
    </sec>
    <sec id="sec-7">
      <title>6. EXPERIMENTAL RESULTS</title>
      <p>
        Experiments were carried out on a subset comprising the compound leaves of the Pl@ntLeaves Scan pictures dataset [
        <xref ref-type="bibr" rid="ref13">13</xref>
        ]. This subset contains 595 images belonging to 16 plant species characterised mainly by high intra-species variability (mainly in terms of leaflet number and size; see Figures 5 and 6). The dataset categorisation (into simple and compound) is performed automatically based on the approach proposed in [
        <xref ref-type="bibr" rid="ref24">24</xref>
        ]. We evaluate the effectiveness of our approach using the correct classification rate metric, obtained by the k-NN classifier for different values of k (k ∈ {5, 10, 20, 25, 30}). This metric is appropriate for the context of plant species identification because it reflects user satisfaction, which is achieved when the correct species is the most present among the k first responses.
      </p>
      <p>Recall that the principle of our leaflets-based identification scheme is that each leaflet represents a different view of the entire image (i.e., the entire image is replaced by these views in the identification scheme). For that reason, we can show the robustness of our approach by comparing its classification results with those of the classical retrieval scheme based on the global representation of the entire image. We perform several test configurations using four texture descriptors, Hough, Fourier, LEOH and EOH (see Section 4), in the description phase, and three fusion algorithms, IRP, LO and DistInc (see Section 5), in the leaflets-based query fusion phase. Figure 11 presents the results for these different configurations. We can see an enhancement in the classification rates, obtained by all descriptors and fusion algorithms tested and for the different values of k, with respect to the classical retrieval scheme for compound leaves. This proves the effectiveness of our leaflets-based scheme for enhanced compound leaf identification. Also, the fusion algorithms IRP and LO both achieve the best classification rate values. Figure 12 presents the 6 top images returned for the LEOH descriptor, for both the classical retrieval scheme with the entire image (top) and our leaflets-based approach (bottom) using the IRP fusion algorithm, for a specimen that belongs to the species Fraxinus angustifolia. This species was particularly chosen because it holds variability in leaflet number (see some examples in Figure 5). We can see that all the responses returned for the entire image are wrong (they do not belong to the right species). It seems that the LEOH descriptor provided these responses because of the high global similarity in terms of macro-texture. Nevertheless, when we use our leaflets-based approach, we obtain 5 accurate images out of the top 6, even though the three last ones have a different leaflet number with respect to the query image, which illustrates well the efficiency of our strategy.</p>
    </sec>
    <sec id="sec-8">
      <title>7. CONCLUSION</title>
      <p>In this paper, we have proposed a new multiple leaflets-based identification approach dedicated to compound leaves. We construct our approach in three phases. (1) The first is leaflet extraction. This step is established using simple geometric parameters defined from botanical observations. We fix the number of leaflets to three in order to reach a compromise between information richness, derived from leaflet variations within the same species, and false leaflet detections induced by the leaflet overlapping problem. Our leaflet extraction method has the advantage of being fast and effective for different types of compound leaves, unlike previous methods. (2) The second step is the local description of the leaflets. In this step, we test four classical texture descriptors (Hough, Fourier, LEOH, EOH). (3) The third step is the late fusion of the ranking lists obtained by each leaflet query. We test three state-of-the-art fusion algorithms: IRP, LO and DistInc. Experiments were performed on the compound leaves of the Pl@ntLeaves Scan pictures dataset for the different configurations of descriptors and fusion algorithms. They have shown an improvement in the classification rates, for different values of k in the k-NN classifier, with respect to the classical retrieval scheme based on the entire image. Our ongoing work aims at constructing and evaluating a global parts-based leaf identification, defined depending on the leaf type: simple or compound.</p>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>
          [1]
          <string-name>
            <surname>Enviro</surname>
          </string-name>
          . http://www.envirofi.eu/.
        </mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>
          [2]
          <string-name>
            <given-names>A.</given-names>
            <surname>Arora</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Gupta</surname>
          </string-name>
          ,
          <string-name>
            <given-names>N.</given-names>
            <surname>Bagmar</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>Mishra</surname>
          </string-name>
          ,
          and
          <string-name>
            <given-names>A.</given-names>
            <surname>Bhattacharya</surname>
          </string-name>
          .
          <article-title>A plant identification system using shape and morphological features on segmented leaflets: Team IITK</article-title>
          , clef
          <year>2012</year>
          . In CLEF (Online Working Notes/Labs/Workshop),
          <year>2012</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>
          [3]
          <string-name>
            <given-names>A. R.</given-names>
            <surname>Backes</surname>
          </string-name>
          and
          <string-name>
            <given-names>O. M.</given-names>
            <surname>Bruno</surname>
          </string-name>
          .
          <article-title>Plant leaf identification using color and multi-scale fractal dimension</article-title>
          .
          <source>Computer Science</source>
          ,
          <volume>6134</volume>
          :
          <fpage>463</fpage>
          -
          <lpage>470</lpage>
          ,
          <year>2010</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>
          [4]
          <string-name>
            <given-names>D.</given-names>
            <surname>Barthelemy</surname>
          </string-name>
          .
          <article-title>The pl@ntnet project: A computational plant identification and collaborative information system</article-title>
          .
          <source>Technical report, XIII World Forestry Congress</source>
          ,
          <year>2009</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>
          [5]
          <string-name>
            <given-names>G.</given-names>
            <surname>Cerutti</surname>
          </string-name>
          ,
          <string-name>
            <given-names>V.</given-names>
            <surname>Antoine</surname>
          </string-name>
          ,
          <string-name>
            <given-names>L.</given-names>
            <surname>Tougne</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.</given-names>
            <surname>Mille</surname>
          </string-name>
          ,
          <string-name>
            <given-names>L.</given-names>
            <surname>Valet</surname>
          </string-name>
          ,
          <string-name>
            <given-names>D.</given-names>
            <surname>Coquin</surname>
          </string-name>
          ,
          and
          <string-name>
            <given-names>A.</given-names>
            <surname>Vacavant</surname>
          </string-name>
          .
          <article-title>ReVes participation - tree species classification using random forests and botanical features</article-title>
          .
          <source>In Conference and Labs of the Evaluation Forum</source>
          ,
          <year>2012</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>
          [6]
          <string-name>
            <given-names>G.</given-names>
            <surname>Cerutti</surname>
          </string-name>
          ,
          <string-name>
            <given-names>L.</given-names>
            <surname>Tougne</surname>
          </string-name>
          ,
          <string-name>
            <given-names>D.</given-names>
            <surname>Coquin</surname>
          </string-name>
          ,
          <article-title>and</article-title>
          <string-name>
            <given-names>A.</given-names>
            <surname>Vacavant</surname>
          </string-name>
          .
          <article-title>Curvature-Scale-based Contour Understanding for Leaf Margin Shape Recognition and Species Identification</article-title>
          .
          <source>In VISAPP</source>
          ,
          <year>2013</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>
          [7]
          <string-name>
            <given-names>G.</given-names>
            <surname>Cerutti</surname>
          </string-name>
          ,
          <string-name>
            <given-names>L.</given-names>
            <surname>Tougne</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.</given-names>
            <surname>Mille</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Vacavant</surname>
          </string-name>
          , and
          <string-name>
            <given-names>D.</given-names>
            <surname>Coquin</surname>
          </string-name>
          .
          <article-title>A model-based approach for compound leaves understanding and identification</article-title>
          .
          <source>In IEEE International Conference on Image Processing</source>
          ,
          <year>2013</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>
          [8]
          <string-name>
            <given-names>J. S.</given-names>
            <surname>Cope</surname>
          </string-name>
          ,
          <string-name>
            <given-names>D.</given-names>
            <surname>Corney</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J. Y.</given-names>
            <surname>Clark</surname>
          </string-name>
          ,
          <string-name>
            <given-names>P.</given-names>
            <surname>Remagnino</surname>
          </string-name>
          , and
          <string-name>
            <given-names>P.</given-names>
            <surname>Wilkin</surname>
          </string-name>
          .
          <article-title>Plant species identification using digital morphometrics: A review</article-title>
          .
          <source>Expert Syst. Appl.</source>
          , pages
          <fpage>7562</fpage>
          –
          <lpage>7573</lpage>
          ,
          <year>2012</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref9">
        <mixed-citation>
          [9]
          <string-name>
            <given-names>J. S.</given-names>
            <surname>Cope</surname>
          </string-name>
          ,
          <string-name>
            <given-names>P.</given-names>
            <surname>Remagnino</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>Barman</surname>
          </string-name>
          , and
          <string-name>
            <given-names>P.</given-names>
            <surname>Wilkin</surname>
          </string-name>
          .
          <article-title>Plant texture classification using Gabor co-occurrences</article-title>
          .
          <source>In Proceedings of the 6th international conference on Advances in visual computing - Volume Part II</source>
          , pages
          <fpage>669</fpage>
          –
          <lpage>677</lpage>
          ,
          <year>2010</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref10">
        <mixed-citation>
          [10]
          <string-name>
            <given-names>N.</given-names>
            <surname>Dalal</surname>
          </string-name>
          and
          <string-name>
            <given-names>B.</given-names>
            <surname>Triggs</surname>
          </string-name>
          .
          <article-title>Histograms of oriented gradients for human detection</article-title>
          .
          <source>In Computer Vision and Pattern Recognition, 2005 (CVPR 2005), IEEE Computer Society Conference on</source>
          ,
          <year>2005</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref11">
        <mixed-citation>
          [11]
          <string-name>
            <given-names>M.</given-names>
            <surname>Ferecatu</surname>
          </string-name>
          .
          <article-title>Image retrieval with active relevance feedback using both visual and keyword-based descriptors</article-title>
          .
          <source>PhD thesis</source>
          , Université de Versailles Saint-Quentin-en-Yvelines
          ,
          <year>2005</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref12">
        <mixed-citation>
          [12]
          <string-name>
            <given-names>H.</given-names>
            <surname>Goëau</surname>
          </string-name>
          ,
          <string-name>
            <given-names>P.</given-names>
            <surname>Bonnet</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.</given-names>
            <surname>Barbe</surname>
          </string-name>
          ,
          <string-name>
            <given-names>V.</given-names>
            <surname>Bakic</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Joly</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.-F.</given-names>
            <surname>Molino</surname>
          </string-name>
          ,
          <string-name>
            <given-names>D.</given-names>
            <surname>Barthelemy</surname>
          </string-name>
          , and
          <string-name>
            <given-names>N.</given-names>
            <surname>Boujemaa</surname>
          </string-name>
          .
          <article-title>Multi-organ plant identification</article-title>
          .
          <source>In Proceedings of the 1st ACM International Workshop on Multimedia Analysis for Ecological Data, MAED '12</source>
          ,
          <year>2012</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref13">
        <mixed-citation>
          [13]
          <string-name>
            <given-names>H.</given-names>
            <surname>Goëau</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Joly</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>Selmi</surname>
          </string-name>
          ,
          <string-name>
            <given-names>P.</given-names>
            <surname>Bonnet</surname>
          </string-name>
          ,
          <string-name>
            <given-names>E.</given-names>
            <surname>Mouysset</surname>
          </string-name>
          , and
          <string-name>
            <given-names>L.</given-names>
            <surname>Joyeux</surname>
          </string-name>
          .
          <article-title>Visual-based plant species identification from crowdsourced data</article-title>
          .
          <source>In ACM Multimedia</source>
          ,
          <year>2011</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref14">
        <mixed-citation>
          [14]
          Leaf Architecture Working Group
          .
          <article-title>Manual of Leaf Architecture</article-title>
          , 65 p. Department of Paleobiology, Smithsonian Institution, Cornell University Press,
          <fpage>1999</fpage>
          -
          <lpage>2000</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref15">
        <mixed-citation>
          [15]
          <string-name>
            <given-names>G.</given-names>
            <surname>Agarwal</surname>
          </string-name>
          ,
          <string-name>
            <given-names>H.</given-names>
            <surname>Ling</surname>
          </string-name>
          ,
          <string-name>
            <given-names>D.</given-names>
            <surname>Jacobs</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>Shirdhonkar</surname>
          </string-name>
          ,
          <string-name>
            <given-names>W. J.</given-names>
            <surname>Kress</surname>
          </string-name>
          ,
          <string-name>
            <given-names>R.</given-names>
            <surname>Russell</surname>
          </string-name>
          ,
          <string-name>
            <given-names>P.</given-names>
            <surname>Belhumeur</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Dixit</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>Feiner</surname>
          </string-name>
          ,
          <string-name>
            <given-names>D.</given-names>
            <surname>Mahajan</surname>
          </string-name>
          ,
          <string-name>
            <given-names>K.</given-names>
            <surname>Sunkavalli</surname>
          </string-name>
          ,
          <string-name>
            <given-names>R.</given-names>
            <surname>Ramamoorthi</surname>
          </string-name>
          , and
          <string-name>
            <given-names>S.</given-names>
            <surname>White</surname>
          </string-name>
          .
          <article-title>First steps toward an electronic field guide for plants</article-title>
          .
          <source>Taxon</source>
          ,
          <volume>55</volume>
          :
          <fpage>597</fpage>
          –
          <lpage>610</lpage>
          ,
          <year>2006</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref16">
        <mixed-citation>
          [16]
          <string-name>
            <given-names>D. J.</given-names>
            <surname>Hearn</surname>
          </string-name>
          .
          <article-title>Shape analysis for the automated identification of plants from images of leaves</article-title>
          .
          <source>Taxon</source>
          , pages
          <fpage>934</fpage>
          –
          <lpage>954</lpage>
          ,
          <year>2009</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref17">
        <mixed-citation>
          [17]
          <string-name>
            <given-names>J.-K.</given-names>
            <surname>Park</surname>
          </string-name>
          ,
          <string-name>
            <given-names>E.</given-names>
            <surname>Hwang</surname>
          </string-name>
          , and
          <string-name>
            <given-names>Y.</given-names>
            <surname>Nam</surname>
          </string-name>
          .
          <article-title>Utilizing venation features for efficient leaf image retrieval</article-title>
          .
          <source>Journal of Systems and Software</source>
          ,
          <volume>81</volume>
          :
          <fpage>71</fpage>
          –
          <lpage>82</lpage>
          ,
          <year>2008</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref18">
        <mixed-citation>
          [18]
          <string-name>
            <given-names>M.</given-names>
            <surname>Jović</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Y.</given-names>
            <surname>Hatakeyama</surname>
          </string-name>
          ,
          <string-name>
            <given-names>F.</given-names>
            <surname>Dong</surname>
          </string-name>
          , and
          <string-name>
            <given-names>K.</given-names>
            <surname>Hirota</surname>
          </string-name>
          .
          <article-title>Image retrieval based on similarity score fusion from feature similarity ranking lists</article-title>
          .
          <source>In Fuzzy Systems and Knowledge Discovery</source>
          , volume
          <volume>4223</volume>
          of Lecture Notes in Computer Science, pages
          <fpage>461</fpage>
          –
          <lpage>470</lpage>
          . Springer Berlin Heidelberg,
          <year>2006</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref19">
        <mixed-citation>
          [19]
          <string-name>
            <given-names>N.</given-names>
            <surname>Kumar</surname>
          </string-name>
          ,
          <string-name>
            <given-names>P.</given-names>
            <surname>Belhumeur</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Biswas</surname>
          </string-name>
          ,
          <string-name>
            <given-names>D.</given-names>
            <surname>Jacobs</surname>
          </string-name>
          ,
          <string-name>
            <given-names>W.</given-names>
            <surname>Kress</surname>
          </string-name>
          ,
          <string-name>
            <given-names>I.</given-names>
            <surname>Lopez</surname>
          </string-name>
          , and
          <string-name>
            <given-names>J.</given-names>
            <surname>Soares</surname>
          </string-name>
          .
          <article-title>Leafsnap: A computer vision system for automatic plant species identification</article-title>
          .
          <source>In Computer Vision – ECCV 2012, Lecture Notes in Computer Science</source>
          , pages
          <fpage>502</fpage>
          –
          <lpage>516</lpage>
          .
          <year>2012</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref20">
        <mixed-citation>
          [20]
          <string-name>
            <given-names>H.</given-names>
            <surname>Liu</surname>
          </string-name>
          ,
          <string-name>
            <given-names>W.</given-names>
            <surname>Liu</surname>
          </string-name>
          , and
          <string-name>
            <given-names>L.</given-names>
            <surname>Latecki</surname>
          </string-name>
          .
          <article-title>Convex shape decomposition</article-title>
          .
          <source>In Computer Vision and Pattern Recognition (CVPR), 2010 IEEE Conference on</source>
          , pages
          <fpage>97</fpage>
          –
          <lpage>104</lpage>
          ,
          <year>2010</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref21">
        <mixed-citation>
          [21]
          <string-name>
            <given-names>F.</given-names>
            <surname>Mokhtarian</surname>
          </string-name>
          and
          <string-name>
            <given-names>A.</given-names>
            <surname>Mackworth</surname>
          </string-name>
          .
          <article-title>Scale-based description and recognition of planar curves and two-dimensional shapes</article-title>
          .
          <source>IEEE Transactions on Pattern Analysis and Machine Intelligence</source>
          , PAMI-
          <volume>8</volume>
          :
          <fpage>34</fpage>
          –
          <lpage>43</lpage>
          ,
          <year>1986</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref22">
        <mixed-citation>
          [22]
          <string-name>
            <given-names>F.</given-names>
            <surname>Mokhtarian</surname>
          </string-name>
          and
          <string-name>
            <given-names>A. K.</given-names>
            <surname>Mackworth</surname>
          </string-name>
          .
          <article-title>A theory of multiscale, curvature-based shape representation for planar curves</article-title>
          .
          <source>IEEE Transactions on Pattern Analysis and Machine Intelligence</source>
          ,
          <volume>14</volume>
          :
          <fpage>789</fpage>
          –
          <lpage>805</lpage>
          ,
          <year>1992</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref23">
        <mixed-citation>
          [23]
          <string-name>
            <given-names>O.</given-names>
            <surname>Mzoughi</surname>
          </string-name>
          ,
          <string-name>
            <given-names>I.</given-names>
            <surname>Yahiaoui</surname>
          </string-name>
          ,
          <string-name>
            <given-names>N.</given-names>
            <surname>Boujemaa</surname>
          </string-name>
          , and
          <string-name>
            <given-names>E.</given-names>
            <surname>Zagrouba</surname>
          </string-name>
          .
          <article-title>Advanced tree species identification using multiple leaf parts image queries</article-title>
          .
          <source>In IEEE International Conference on Image Processing (ICIP)</source>
          ,
          <year>2013</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref24">
        <mixed-citation>
          [24]
          <string-name>
            <given-names>O.</given-names>
            <surname>Mzoughi</surname>
          </string-name>
          ,
          <string-name>
            <given-names>I.</given-names>
            <surname>Yahiaoui</surname>
          </string-name>
          ,
          <string-name>
            <given-names>N.</given-names>
            <surname>Boujemaa</surname>
          </string-name>
          , and
          <string-name>
            <given-names>E.</given-names>
            <surname>Zagrouba</surname>
          </string-name>
          .
          <article-title>Automated semantic leaf image categorization by geometric analysis</article-title>
          .
          <source>In ICME</source>
          ,
          <year>2013</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref25">
        <mixed-citation>
          [25]
          <string-name>
            <given-names>Y.</given-names>
            <surname>Nam</surname>
          </string-name>
          ,
          <string-name>
            <given-names>E.</given-names>
            <surname>Hwang</surname>
          </string-name>
          , and
          <string-name>
            <given-names>D.</given-names>
            <surname>Kim</surname>
          </string-name>
          .
          <article-title>Clover: A mobile content-based leaf image retrieval system</article-title>
          .
          <source>In Digital Libraries: Implementing Strategies and Sharing Experiences, Lecture Notes in Computer Science</source>
          , pages
          <fpage>139</fpage>
          –
          <lpage>148</lpage>
          .
          <year>2005</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref26">
        <mixed-citation>
          [26]
          <string-name>
            <given-names>A. R.</given-names>
            <surname>Sfar</surname>
          </string-name>
          ,
          <string-name>
            <given-names>N.</given-names>
            <surname>Boujemaa</surname>
          </string-name>
          , and
          <string-name>
            <given-names>D.</given-names>
            <surname>Geman</surname>
          </string-name>
          .
          <article-title>Vantage feature frames for fine-grained categorization</article-title>
          .
          <source>In CVPR. IEEE</source>
          ,
          <year>2013</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref27">
        <mixed-citation>
          [27]
          <string-name>
            <given-names>R.</given-names>
            <surname>Snelick</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Indovina</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.</given-names>
            <surname>Yen</surname>
          </string-name>
          , and
          <string-name>
            <given-names>A.</given-names>
            <surname>Mink</surname>
          </string-name>
          .
          <article-title>Multimodal biometrics: Issues in design and testing</article-title>
          .
          <source>In Proceedings of the 5th International Conference on Multimodal Interfaces</source>
          ,
          <source>ICMI '03</source>
          ,
          <year>2003</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref28">
        <mixed-citation>
          [28]
          <string-name>
            <given-names>L.</given-names>
            <surname>Wan</surname>
          </string-name>
          .
          <article-title>Parts-based 2d shape decomposition by convex hull</article-title>
          .
          <source>In Shape Modeling and Applications, 2009 (SMI 2009), IEEE International Conference on</source>
          , pages
          <fpage>89</fpage>
          –
          <lpage>95</lpage>
          ,
          <year>2009</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref29">
        <mixed-citation>
          [29]
          <string-name>
            <given-names>Z.</given-names>
            <surname>Wang</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Z.</given-names>
            <surname>Chi</surname>
          </string-name>
          , and
          <string-name>
            <given-names>D.</given-names>
            <surname>Feng</surname>
          </string-name>
          .
          <article-title>Shape based leaf image retrieval</article-title>
          .
          <source>VISP</source>
          , pages
          <fpage>34</fpage>
          –
          <lpage>43</lpage>
          ,
          <year>2003</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref30">
        <mixed-citation>
          [30]
          <string-name>
            <given-names>X. F.</given-names>
            <surname>Wang</surname>
          </string-name>
          ,
          <string-name>
            <given-names>D. S.</given-names>
            <surname>Huang</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J. X.</given-names>
            <surname>Du</surname>
          </string-name>
          ,
          <string-name>
            <given-names>H.</given-names>
            <surname>Xu</surname>
          </string-name>
          , and
          <string-name>
            <given-names>L.</given-names>
            <surname>Heutte</surname>
          </string-name>
          .
          <article-title>Classification of plant leaf images with complicated background</article-title>
          .
          <source>Applied Mathematics and Computation</source>
          , pages
          <fpage>916</fpage>
          –
          <lpage>926</lpage>
          ,
          <year>2008</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref31">
        <mixed-citation>
          [31]
          <string-name>
            <given-names>N.</given-names>
            <surname>Yunyoung</surname>
          </string-name>
          ,
          <string-name>
            <given-names>H.</given-names>
            <surname>Eenjun</surname>
          </string-name>
          , and
          <string-name>
            <given-names>K.</given-names>
            <surname>Dongyoon</surname>
          </string-name>
          .
          <article-title>A similarity-based leaf image retrieval scheme: Joining shape and venation features</article-title>
          .
          <source>Computer Vision and Image Understanding</source>
          ,
          <volume>110</volume>
          :
          <fpage>245</fpage>
          –
          <lpage>259</lpage>
          ,
          <year>2008</year>
          .
        </mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>