<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta />
    <article-meta>
      <title-group>
        <article-title>Deep Convolutional Neural Network for Pollen Grains Classification</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Hanane Menad</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Farah Ben-naoum</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Abdelmalek Amine</string-name>
          <xref ref-type="aff" rid="aff1">1</xref>
        </contrib>
        <aff id="aff0">
          <label>0</label>
          <institution>EEDIS Laboratory, University of Djillali Liabes</institution>
          ,
          <addr-line>Sidi Bel Abbes</addr-line>
          ,
          <country country="DZ">Algeria</country>
        </aff>
        <aff id="aff1">
          <label>1</label>
          <institution>GeCoDe Laboratory, University of Dr Moulay Tahar</institution>
          ,
          <addr-line>Saida</addr-line>
          ,
          <country country="DZ">Algeria</country>
        </aff>
      </contrib-group>
      <abstract>
        <p>Beekeeping is the art of cultivating bees with the aim of drawing the maximum performance from this industry at the minimum expenditure. The marketed apiculture products are honey, wax, pollen, propolis and royal jelly. This complementary activity contributes to the development of livestock and to the protection of the environment. This paper presents the application of a deep convolutional neural network to pollen grain recognition based on the classification of their images. The network contains 8 hidden layers: the first 5 are convolutional layers responsible for image representation, and the next 3 are fully connected layers for image classification. The obtained results prove the efficiency of the proposed approach for pollen grain recognition.</p>
      </abstract>
      <kwd-group>
        <kwd>Melissopalynology</kwd>
        <kwd>Honey Pollen Classification</kwd>
        <kwd>Deep learning</kwd>
        <kwd>convolutional neural network</kwd>
      </kwd-group>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>Introduction</title>
      <p>
        Honey is a nutritious food with economic importance for many countries
worldwide. Nowadays, increasing attention is dedicated to determining the
geographical and botanical origin of honey, in order to define the character that
differentiates honeys of different sources with a standard of quality and
authenticity that is competitive in the market [
        <xref ref-type="bibr" rid="ref1">1</xref>
        ].
      </p>
      <p>
        Melissopalynology, or honey palynology, covers all knowledge of the
relations of any order that exist between the bee and the plant. As such, it integrates
ecological, ethological and physiological research, because it allows the use of the
pollen grain as a biological marker in the vast context of plant-bee relations.
Pollen analysis of honey is used to differentiate the floral sources used by bees,
the harvest period and the geo-climatic conditions of the regions concerned [
        <xref ref-type="bibr" rid="ref2">2</xref>
        ],
palaeoclimatic reconstruction [
        <xref ref-type="bibr" rid="ref3">3</xref>
        ], and it has also been used in various medical
fields, among them allergenic processes [
        <xref ref-type="bibr" rid="ref4">4</xref>
        ], etc.
      </p>
      <p>
        Computationally motivated biology is a field that consists of studying
biology in order to model biological systems using computer science. To this
end, researchers study the behaviour of a biological system and then create an
artificial model of it to facilitate the task for human beings. It is typically the
simulation of a natural phenomenon [
        <xref ref-type="bibr" rid="ref5">5</xref>
        ].
      </p>
      <p>
        Like all machine learning techniques, deep learning aims to make a system
solve problems without the implementer calculating all the parameters needed to
solve them. The goal is to train a variable-parameter algorithm (a "black
box") to make a correct decision about a given task. Learning is done by
optimizing the variable parameters so as to improve the decisions made. Deep learning
techniques have made great progress in many areas, including image recognition [
        <xref ref-type="bibr" rid="ref6">6</xref>
        ].
      </p>
      <p>Building an efficient automatic classification system for pollen identification
has become a challenge that requires powerful techniques. This paper presents the
application of a convolutional neural network to pollen grain classification. The
paper is organised as follows: Section 2 cites some works on the
classification of honey pollen, Section 3 describes the
implementation and the results obtained during the experimentation, and finally Section 4
discusses the major conclusions.</p>
    </sec>
    <sec id="sec-2">
      <title>Classification of honey pollen</title>
      <p>
        Nowadays, automatic classification for pollen identification has become a highly
active research field. [
        <xref ref-type="bibr" rid="ref7">7</xref>
        ] is one of the most recent works; its authors propose
an approach based on feature extraction from images and classification of these
features using an ensemble classifier built on four different techniques. Another
work, presented in [
        <xref ref-type="bibr" rid="ref8">8</xref>
        ], is based on a multilayer perceptron neural network for the
classification of pollen species, for which the authors claim an accuracy
of 100%. [
        <xref ref-type="bibr" rid="ref9">9</xref>
        ] is also a work in which the authors used a neural network to classify two
sets of microscopic images. In [
        <xref ref-type="bibr" rid="ref2">2</xref>
        ], the authors used Support Vector Machines to
classify melissopalynological data in order to recognise the origin of the pollen: they identified
the marker species that represent the area using a z-scores algorithm, predicted
the area of origin with the SVM, and finally applied a statistical
analysis of the marker species. The images were collected in Italy and all samples
belonged to chestnut honey; the results showed a high discrimination accuracy for
these samples. The authors of [
        <xref ref-type="bibr" rid="ref10">10</xref>
        ] captured pollen grains
from all sides, represented them in 3D space to extract gray-scale
vectors from each side, and finally classified these vectors using an SVM; the
accuracy obtained was about 92%. In [
        <xref ref-type="bibr" rid="ref3">3</xref>
        ], the authors extracted texture in order to detect
poles and furrows for pollen grain recognition, represented each grain by 732
variables, and applied an MLP neural network to choose the important variables. The
classification was done using a statistical method, with leave-one-out evaluation.
To achieve the goal of pollen recognition, the authors of [
        <xref ref-type="bibr" rid="ref11">11</xref>
        ] proposed an automatic
method based on a deep learning framework; the result achieved about a 94%
classification rate on a dataset of 30 pollen types. Also, in [
        <xref ref-type="bibr" rid="ref12">12</xref>
        ] the authors present a state
of the art of deep learning methods applied to the POLLEN23E dataset; the results
show a high efficiency, especially when a hybrid approach combining
transfer learning and feature extraction is used.
      </p>
    </sec>
    <sec id="sec-3">
      <title>The proposed approach and results</title>
      <p>The year 2006 marked the beginning of deep learning, which has since emerged
as a new area of machine learning research. Since then, researchers have focused on
developing deep learning based techniques that impact signal and information
processing, especially image processing, which has received the largest part of deep learning
development.</p>
      <p>
        Deep Convolutional Activation Feature for Generic Visual Recognition (DeCAF)
is a Python framework developed by Donahue et al. in [
        <xref ref-type="bibr" rid="ref13">13</xref>
        ], where the authors
adapted the deep convolutional neural network approach proposed by Krizhevsky
et al. in [
        <xref ref-type="bibr" rid="ref14">14</xref>
        ] to run on the CPU only rather than on a GPU. The approach
is divided into two parts. The first part consists of feature extraction from the images
using convolutional computations with ReLU non-linearities, followed by dimensionality
reduction using a pooling technique; this part yields a vector of 2048 elements
representing each image. The second part is a set of 3 fully connected layers for image
classification.
      </p>
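      <p>As an illustration of the architecture just described, the following is a minimal sketch in Keras-style Python: five convolutional layers with ReLU non-linearities and pooling for feature extraction, a 2048-element vector per image, and three fully connected layers for classification. The input size, filter counts and hidden-layer widths are illustrative assumptions, not values taken from the paper or from DeCAF.</p>
      <preformat>
# Hypothetical sketch of the described two-part network (not the authors' exact model):
# part 1: 5 ReLU convolutional layers with pooling -> 2048-element feature vector,
# part 2: 3 fully connected layers for 23-class pollen classification.
import tensorflow as tf
from tensorflow.keras import layers, models

def build_pollen_cnn(num_classes=23):
    return models.Sequential([
        layers.Input(shape=(224, 224, 3)),               # assumed input size
        layers.Conv2D(64, 3, activation="relu", padding="same"),
        layers.MaxPooling2D(),
        layers.Conv2D(128, 3, activation="relu", padding="same"),
        layers.MaxPooling2D(),
        layers.Conv2D(256, 3, activation="relu", padding="same"),
        layers.MaxPooling2D(),
        layers.Conv2D(512, 3, activation="relu", padding="same"),
        layers.MaxPooling2D(),
        layers.Conv2D(2048, 3, activation="relu", padding="same"),
        layers.GlobalAveragePooling2D(),                 # 2048-element image vector
        layers.Dense(1024, activation="relu"),           # fully connected layer 1
        layers.Dense(512, activation="relu"),            # fully connected layer 2
        layers.Dense(num_classes, activation="softmax"), # fully connected layer 3
    ])

model = build_pollen_cnn()
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
      </preformat>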
      <sec id="sec-3-1">
        <title>Pollen23E dataset</title>
        <p>
          Pollen23E is a set of 805 images divided into 23 species (classes). Each class
comprises 35 images captured by a digital Bresser LCD microscope at 40x
magnification from different angles. The obtained images were then transferred
to a laptop and segmented using the CorelDRAW software [
          <xref ref-type="bibr" rid="ref15">15</xref>
          ].
        </p>
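        <p>As an illustration only, a directory-per-species image set of this kind could be loaded and split in Python as sketched below; the folder name, image size and 80/20 split are assumptions, not details reported for Pollen23E.</p>
        <preformat>
# Hypothetical loading of the 805 Pollen23E images, assuming one folder per
# species (23 folders); directory name, split ratio and image size are assumptions.
import tensorflow as tf

train_ds = tf.keras.utils.image_dataset_from_directory(
    "pollen23e/", validation_split=0.2, subset="training",
    seed=42, image_size=(224, 224), batch_size=32)
val_ds = tf.keras.utils.image_dataset_from_directory(
    "pollen23e/", validation_split=0.2, subset="validation",
    seed=42, image_size=(224, 224), batch_size=32)

print(len(train_ds.class_names))  # expected: 23 species
        </preformat>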
      </sec>
      <sec id="sec-3-2">
        <title>Obtained results</title>
        <p>In the following, we discuss the obtained results, relying on the best
training accuracy, best validation accuracy, best cross entropy and test accuracy
to evaluate the effect of the number of iterations on pollen grain classification:</p>
        <p>As seen in Table 1, the number of iterations affected the obtained results: when we
add more iterations, the neural network builds better models. This is clear in terms of the
first three measures:
- In terms of training accuracy: this measure is the accuracy of applying the model to the
training data, and it is used to evaluate the model during the backpropagation steps in
order to improve it. Table 1 cites the best training accuracy obtained; as seen, the model
is near perfect, since it correctly classified 99% to 100% of the training data. Figure 1
shows the detailed updates of the training accuracy during each iteration, where (a), (b),
(c), (d), (e), (f), (g) and (h) present 100, 200, 500, 1000, 1500, 2000, 2500 and 4000
iterations respectively. For all cases, we see in Figure 1 how the training accuracy begins
with small values (about 5%), then improves iteration after iteration to reach 100%.
- In terms of cross entropy: when we use the cross entropy loss while training neural
networks, we calculate this score every time we compute the gradients for the weights of
the network, and the objective is to minimize it (a short numeric illustration is given at
the end of this section). As seen in Table 1, when we added more iterations, the neural
network reduced the cross entropy, which means we obtained better models. Figure 2 shows
the detailed updates of the cross entropy during each iteration, where (a), (b), (c), (d),
(e), (f), (g) and (h) present 100, 200, 500, 1000, 1500, 2000, 2500 and 4000 iterations
respectively.</p>
        <p>Fig. 2. Cross entropy updates during each step.</p>
        <p>For all cases, we see in Figure 2 how the entropy begins with high values (about 5),
then improves iteration after iteration and converges towards 0. Examining the figures in
detail, we see that the more iterations we added, the lower the cross entropy became, which
means we reduced the information loss by improving the model.
- In terms of validation accuracy: this measure is the accuracy of applying the model to the
validation data; like the training accuracy, it is also used to evaluate the model during
the backpropagation steps in order to improve it. Table 1 cites the best validation accuracy
obtained; as seen, the model performs very well, since it correctly classified 95% to 100%
of the validation data. Figure 3 shows the detailed updates of the validation accuracy
during each iteration, where (a), (b), (c), (d), (e), (f), (g) and (h) present 100, 200,
500, 1000, 1500, 2000, 2500 and 4000 iterations respectively. For all cases, we see in
Figure 3 how the validation accuracy begins with small values (about 5%), then improves
iteration after iteration to reach 100%. Also, since the validation accuracy is lower than
the training accuracy, we verified that our model does not suffer from an underfitting
problem.
- In terms of test accuracy: this measure is the accuracy of applying the final model to the
test data, and it is used to evaluate the prediction of new images. Figure 4 shows the
comparison of the test accuracy according to the number of iterations. In our case
(Figure 4), the model recognized 76.6% of the test images when we built a model in 100
iterations; this improved to 85.1% of the pollen grains when we increased the number of
iterations to 200, and then remained fixed even though the model kept improving on the
previous measures.</p>
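        <p>The short numeric sketch below, in Python, illustrates the cross entropy measure referred to above: for a single image it is the negative logarithm of the probability assigned to the true class, so it approaches 0 as the model becomes more confident in the correct class. The probability values are invented for illustration.</p>
        <preformat>
import numpy as np

# Cross entropy for one image: -log of the probability given to the true class.
def cross_entropy(probs, true_class):
    return -np.log(probs[true_class])

# Invented probability vectors over 3 classes; the true class is index 1.
print(cross_entropy(np.array([0.05, 0.90, 0.05]), 1))  # ~0.105: confident, low loss
print(cross_entropy(np.array([0.30, 0.40, 0.30]), 1))  # ~0.916: uncertain, higher loss
        </preformat>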
      </sec>
      <sec id="sec-3-3">
        <title>Comparative study</title>
        <p>
          To better validate our study, we compared our results with other results
from the literature. [
          <xref ref-type="bibr" rid="ref12">12</xref>
          ] is a work that applied deep learning to pollen classification
using the same Pollen23E dataset. Sevillano and Aznarte implemented three different setups:
the first setup (TL) consists of using a deep learning neural network for feature
extraction and classification of the images; the second setup (FE+LD) consists
of using a deep learning neural network for feature extraction and discriminant
learning for classification; while the third setup (TL+FE+LD) is a hybridization
of both setups, where in the classification phase they used both the neural network and
discriminant learning for pollen recognition. Figure 5 shows the comparison
made:
        </p>
        <p>
          As seen, after reimplementing approaches proposed by [
          <xref ref-type="bibr" rid="ref12">12</xref>
          ] to compare them
under the same conditions as our proposed approach, our results were in
general better than the results reported in [
          <xref ref-type="bibr" rid="ref12">12</xref>
          ]. The problem in the work of Sevillano and Aznarte was
information loss, since they applied many filters for either feature extraction
or classification, which caused a lot of information to be lost during these processes.
In contrast, our work used only a pooling technique for dimensionality
reduction, and consequently we minimized the information loss.
        </p>
        <p>Another comparison was made with the same work, since it uses deep
neural networks as ours does; we compared the time taken by
each approach under the same conditions (number of iterations). Figure 6 shows the
comparison:</p>
        <p>As seen in Figure 6, we reduced the training time, and the difference in time
grew as we increased the number of iterations for each
approach: we gained from 0.47 up to 1.38 minutes when we used 100
iterations, while the use of 4000 iterations showed that we can gain between 1.23
and 2.17 minutes. These values were obtained when we trained the neural
networks on 790 images, so we believe that our proposed approach will be even more
effective for big-data image analysis in terms of training time.</p>
      </sec>
    </sec>
    <sec id="sec-4">
      <title>Conclusion</title>
      <p>In this work, we applied a deep convolutional neural network to pollen identification
from images. The approach is divided into two processes. The first
process is feature extraction using convolutional computations with ReLU non-linearities,
carried out by 5 layers; the dimensionality of the
obtained features is then reduced using a pooling method to obtain vectors of
2048 elements, in which each vector represents an image in the dataset. These
vectors are then used as input to 3 fully connected layers for classification.
The evaluation was based on four measures: training accuracy, validation
accuracy, cross entropy and test accuracy. As seen in the paper, the obtained
values prove that deep convolutional neural networks can be used as
a good solution to automate pollen grain classification. We also saw that
the proposed approach avoids underfitting, as shown by the validation
accuracy being lower than the training accuracy.</p>
      <p>These results motivate future work: we plan to develop further
deep learning approaches and to combine them with metaheuristics.</p>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>
          1.
          <string-name>
            <surname>Alissandrakis</surname>
            ,
            <given-names>E.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Daferera</surname>
            ,
            <given-names>D.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Tarantilis</surname>
            ,
            <given-names>P. A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Polissiou</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          , &amp;
          <string-name>
            <surname>Harizanis</surname>
            ,
            <given-names>P. C.</given-names>
          </string-name>
          (
          <year>2003</year>
          ).
          <article-title>Ultrasound-assisted extraction of volatile compounds from citrus flowers and citrus honey</article-title>
          .
          <source>Food chemistry</source>
          ,
          <volume>82</volume>
          (
          <issue>4</issue>
          ),
          <fpage>575</fpage>
          -
          <lpage>582</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>
          2.
          <string-name>
            <surname>Aronne</surname>
            ,
            <given-names>G.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>De Micco</surname>
            ,
            <given-names>V.</given-names>
          </string-name>
          , &amp;
          <string-name>
            <surname>Guarracino</surname>
            ,
            <given-names>M. R.</given-names>
          </string-name>
          (
          <year>2012</year>
          ).
          <article-title>Application of Support Vector Machines to Melissopalynological Data for Honey Classification</article-title>
          .
          <source>In New Technologies for Constructing Complex Agricultural and Environmental Systems</source>
          (pp.
          <fpage>144</fpage>
          -
          <lpage>153</lpage>
          ).
          <source>IGI Global.</source>
        </mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>
          3.
          <string-name>
            <surname>Li</surname>
            ,
            <given-names>P.</given-names>
          </string-name>
          , &amp;
          <string-name>
            <surname>Flenley</surname>
            ,
            <given-names>J. R.</given-names>
          </string-name>
          (
          <year>1999</year>
          ).
          <article-title>Pollen texture identification using neural networks</article-title>
          .
          <source>Grana</source>
          ,
          <volume>38</volume>
          (
          <issue>1</issue>
          ),
          <fpage>59</fpage>
          -
          <lpage>64</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>
          4.
          <string-name>
            <surname>Corbi</surname>
            ,
            <given-names>A. L.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Cortes</surname>
            ,
            <given-names>C.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Bousquet</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Basomba</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Cistero</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Garcia-Selles</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          , ... &amp;
          <string-name>
            <surname>Carreira</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          (
          <year>1985</year>
          ).
          <article-title>Allergenic cross-reactivity among pollens of Urticaceae</article-title>
          .
          <source>International Archives of Allergy and Immunology</source>
          ,
          <volume>77</volume>
          (
          <issue>4</issue>
          ),
          <fpage>377</fpage>
          -
          <lpage>383</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>
          5.
          <string-name>
            <surname>Devillers</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Morlot</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Pham-Delegue</surname>
            ,
            <given-names>M. H.</given-names>
          </string-name>
          , &amp;
          <string-name>
            <surname>Dore</surname>
            ,
            <given-names>J. C.</given-names>
          </string-name>
          (
          <year>2004</year>
          ).
          <article-title>Classification of monofloral honeys based on their quality control data</article-title>
          .
          <source>Food Chemistry</source>
          ,
          <volume>86</volume>
          (
          <issue>2</issue>
          ),
          <fpage>305</fpage>
          -
          <lpage>312</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>
          6.
          <string-name>
            <surname>Cho</surname>
            ,
            <given-names>K.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Van Merriënboer</surname>
            ,
            <given-names>B.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Gulcehre</surname>
            ,
            <given-names>C.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Bahdanau</surname>
            ,
            <given-names>D.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Bougares</surname>
            ,
            <given-names>F.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Schwenk</surname>
            ,
            <given-names>H.</given-names>
          </string-name>
          , &amp;
          <string-name>
            <surname>Bengio</surname>
            ,
            <given-names>Y.</given-names>
          </string-name>
          (
          <year>2014</year>
          ).
          <article-title>Learning phrase representations using RNN encoder-decoder for statistical machine translation</article-title>
          .
          <source>arXiv preprint arXiv:1406.1078</source>
          .
        </mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>
          7.
          <string-name>
            <surname>Arias</surname>
            ,
            <given-names>D. G.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Cirne</surname>
            ,
            <given-names>M. V. M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Chire</surname>
            ,
            <given-names>J. E.</given-names>
          </string-name>
          , &amp;
          <string-name>
            <surname>Pedrini</surname>
            ,
            <given-names>H.</given-names>
          </string-name>
          (
          <year>2017</year>
          , December).
          <article-title>Classification of Pollen Grain Images Based on an Ensemble of Classifiers</article-title>
          .
          <source>In Machine Learning and Applications (ICMLA)</source>
          ,
          <year>2017</year>
          16th IEEE International Conference on (pp.
          <fpage>234</fpage>
          -
          <lpage>240</lpage>
          ). IEEE.
        </mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>
          8.
          <string-name>
            <surname>Li</surname>
            ,
            <given-names>P.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Treloar</surname>
            ,
            <given-names>W. J.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Flenley</surname>
            ,
            <given-names>J. R.</given-names>
          </string-name>
          , &amp;
          <string-name>
            <surname>Empson</surname>
            ,
            <given-names>L.</given-names>
          </string-name>
          (
          <year>2004</year>
          ).
          <article-title>Towards automation of palynology 2: the use of texture measures and neural network analysis for automated identification of optical images of pollen grains</article-title>
          .
          <source>Journal of Quaternary Science: Published for the Quaternary Research Association</source>
          ,
          <volume>19</volume>
          (
          <issue>8</issue>
          ),
          <fpage>755</fpage>
          -
          <lpage>762</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref9">
        <mixed-citation>
          9.
          <string-name>
            <surname>France</surname>
            ,
            <given-names>I.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Duller</surname>
            ,
            <given-names>A. W. G.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Duller</surname>
            ,
            <given-names>G. A. T.</given-names>
          </string-name>
          , &amp;
          <string-name>
            <surname>Lamb</surname>
            ,
            <given-names>H. F.</given-names>
          </string-name>
          (
          <year>2000</year>
          ).
          <article-title>A new approach to automated pollen analysis</article-title>
          .
          <source>Quaternary Science Reviews</source>
          ,
          <volume>19</volume>
          (
          <issue>6</issue>
          ),
          <fpage>537</fpage>
          -
          <lpage>546</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref10">
        <mixed-citation>
          10.
          <string-name>
            <surname>Ronneberger</surname>
            ,
            <given-names>O.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Burkhardt</surname>
            ,
            <given-names>H.</given-names>
          </string-name>
          , &amp;
          <string-name>
            <surname>Schultz</surname>
            ,
            <given-names>E.</given-names>
          </string-name>
          (
          <year>2002</year>
          ).
          <article-title>General-purpose object recognition in 3D volume data sets using gray-scale invariants: classification of airborne pollen grains recorded with a confocal laser scanning microscope</article-title>
          .
          <source>In Pattern Recognition</source>
          ,
          <year>2002</year>
          .
          <source>Proceedings. 16th International Conference on (Vol. 2</source>
          , pp.
          <fpage>290</fpage>
          -
          <lpage>295</lpage>
          ). IEEE.
        </mixed-citation>
      </ref>
      <ref id="ref11">
        <mixed-citation>
          11.
          <string-name>
            <surname>Daood</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Ribeiro</surname>
            ,
            <given-names>E.</given-names>
          </string-name>
          , &amp;
          <string-name>
            <surname>Bush</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          (
          <year>2016</year>
          , December).
          <article-title>Pollen Grain Recognition Using Deep Learning</article-title>
          .
          <source>In International Symposium on Visual Computing</source>
          (pp.
          <fpage>321</fpage>
          -
          <lpage>330</lpage>
          ). Springer, Cham.
        </mixed-citation>
      </ref>
      <ref id="ref12">
        <mixed-citation>
          12.
          <string-name>
            <surname>Sevillano</surname>
            ,
            <given-names>V.</given-names>
          </string-name>
          , &amp;
          <string-name>
            <surname>Aznarte</surname>
            ,
            <given-names>J. L.</given-names>
          </string-name>
          (
          <year>2018</year>
          ).
          <article-title>Improving classification of pollen grain images of the POLEN23E dataset through three different applications of deep learning convolutional neural networks</article-title>
          .
          <source>PloS one</source>
          ,
          <volume>13</volume>
          (
          <issue>9</issue>
          ),
          <year>e0201807</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref13">
        <mixed-citation>
          13.
          <string-name>
            <surname>Donahue</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Jia</surname>
            ,
            <given-names>Y.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Vinyals</surname>
            ,
            <given-names>O.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Hoffman</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Zhang</surname>
            ,
            <given-names>N.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Tzeng</surname>
            ,
            <given-names>E.</given-names>
          </string-name>
          , and
          <string-name>
            <surname>Darrell</surname>
            ,
            <given-names>T.</given-names>
          </string-name>
          (
          <year>2014</year>
          , January).
          <article-title>DeCAF: A deep convolutional activation feature for generic visual recognition</article-title>
          .
          <source>In International conference on machine learning</source>
          (pp.
          <fpage>647</fpage>
          -
          <lpage>655</lpage>
          ).
        </mixed-citation>
      </ref>
      <ref id="ref14">
        <mixed-citation>
          14.
          <string-name>
            <surname>Krizhevsky</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Sutskever</surname>
            ,
            <given-names>I.</given-names>
          </string-name>
          , and
          <string-name>
            <surname>Hinton</surname>
            ,
            <given-names>G. E.</given-names>
          </string-name>
          (
          <year>2012</year>
          ).
          <article-title>ImageNet classification with deep convolutional neural networks</article-title>
          .
          <source>In Advances in neural information processing systems</source>
          (pp.
          <fpage>1097</fpage>
          -
          <lpage>1105</lpage>
          ).
        </mixed-citation>
      </ref>
      <ref id="ref15">
        <mixed-citation>
          15.
          <string-name>
            <surname>Gonçalves</surname>
            ,
            <given-names>A. B.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Souza</surname>
            ,
            <given-names>J. S.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>da Silva</surname>
            ,
            <given-names>G. G.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Cereda</surname>
            ,
            <given-names>M. P.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Pott</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Naka</surname>
            ,
            <given-names>M. H.</given-names>
          </string-name>
          , and
          <string-name>
            <surname>Pistori</surname>
            ,
            <given-names>H.</given-names>
          </string-name>
          (
          <year>2016</year>
          ).
          <article-title>Feature Extraction and Machine Learning for the Classification of Brazilian Savannah Pollen Grains</article-title>
          .
          <source>PloS one</source>
          ,
          <volume>11</volume>
          (
          <issue>6</issue>
          ),
          <year>e0157044</year>
          .
        </mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>