<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta />
    <article-meta>
      <title-group>
        <article-title>Classification of Dead Trees in Urban Parks Using Aerial Imagery and Convolutional Neural Networks</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>William Orozco-González</string-name>
          <email>williamorozco10791@gmail.com</email>
          <xref ref-type="aff" rid="aff3">3</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Juan C. Valdiviezo-Navarro</string-name>
          <xref ref-type="aff" rid="aff1">1</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Mauricio G. Orozco-del-Castillo</string-name>
          <email>mauricio.orozco@itmerida.edu.mx</email>
          <xref ref-type="aff" rid="aff3">3</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Paola Andrea Mejía-Zuluaga</string-name>
          <xref ref-type="aff" rid="aff2">2</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>León Dozal</string-name>
          <email>ldozal@centrogeo.edu.mx</email>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <aff id="aff0">
          <label>0</label>
          <institution>CONAHCYT - CentroGeo</institution>
          ,
          <addr-line>Aguascalientes</addr-line>
          ,
          <country country="MX">México</country>
        </aff>
        <aff id="aff1">
          <label>1</label>
          <institution>CONAHCYT - CentroGeo</institution>
          ,
          <addr-line>Yucatán</addr-line>
          ,
          <country country="MX">México</country>
        </aff>
        <aff id="aff2">
          <label>2</label>
          <institution>Centro de Investigación en Ciencias de Información Geoespacial (CentroGeo)</institution>
          ,
          <addr-line>Ciudad de México</addr-line>
          ,
          <country country="MX">México</country>
        </aff>
        <aff id="aff3">
          <label>3</label>
          <institution>Tecnológico Nacional de México/IT de Mérida</institution>
          ,
          <addr-line>Mérida, Yucatán</addr-line>
          ,
          <country country="MX">México</country>
        </aff>
      </contrib-group>
      <fpage>121</fpage>
      <lpage>130</lpage>
      <abstract>
        <p>In Mexico City, urban parks are facing a significant challenge due to a growing infestation of a hemi-parasitic plant known as mistletoe, which is detrimental to trees. Identifying trees killed by mistletoe is crucial for estimating infestation levels and determining the urgency of phytosanitary interventions. Traditional methods for monitoring tree health are manual, costly, time-consuming, and must be carried out by forestry specialists. This research introduces an approach using Convolutional Neural Networks (CNNs) to classify dead trees based on aerial imagery acquired over an urban park in Mexico City. For this purpose, we collected 460 image sets and employed two CNN models, ResNet-34 and DenseNet-121. Our findings indicate that these models effectively discern unique vegetation patterns that could be associated with mistletoe infestation, and they achieve high classification accuracy at a reduced computational cost.</p>
      </abstract>
      <kwd-group>
        <kwd>Dead tree classification</kwd>
        <kwd>CNN models</kwd>
        <kwd>UAV imagery</kwd>
        <kwd>ResNet-34</kwd>
        <kwd>DenseNet-121</kwd>
        <kwd>Forest monitoring</kwd>
      </kwd-group>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>1. Introduction</title>
      <p>
        Urban parks, also known as urban green spaces, are vital to cities for offering recreational areas,
mitigating stress, reducing air pollution, and providing natural beauty along with other environmental
services [
        <xref ref-type="bibr" rid="ref1 ref2">1, 2</xref>
        ]. These spaces also serve as sanctuaries for various animal and vegetation species, thereby
enhancing the quality of life for city residents [
        <xref ref-type="bibr" rid="ref3">3</xref>
        ]. Over the past decade, the utilization of Unmanned
Aerial Vehicles (UAVs) for imagery has gained popularity in monitoring forests and urban parks [
        <xref ref-type="bibr" rid="ref4">4</xref>
        ].
This technology has been instrumental in areas such as forest insect and pest control, including bark
beetles, and disease monitoring, like pine wilt disease [
        <xref ref-type="bibr" rid="ref5">5</xref>
        ]. A distinctive advantage of UAVs over other
remote sensing technologies is the high spatial resolution of the images they capture, which can be on
the order of centimeters. This allows for the detailed identification of pest infestations and parasitic
species, among other concerns [
        <xref ref-type="bibr" rid="ref6">6</xref>
        ]. Given the detailed nature of the data collected, the development of
efficient, accurate, and flexible methodologies for analyzing these results is essential [
        <xref ref-type="bibr" rid="ref7">7</xref>
        ].
      </p>
      <p>
        In Mexico City, urban parks are being increasingly plagued by the rapid spread of a hemi-parasitic
plant known as mistletoe, posing significant challenges to tree health. Mistletoe seeds are dispersed
through a combination of their inherent ballistic propulsion mechanism and transportation by birds
and other fauna [
        <xref ref-type="bibr" rid="ref8 ref9">8, 9</xref>
        ]. Upon successful establishment on a host tree, mistletoe integrates into the tree’s
structure, consuming vital nutrients and resources. Without timely and effective management, this
parasitic relationship can lead to the host tree’s decline and eventual death. Therefore, there is an urgent
need for strategic control measures to mitigate the infestation and preserve the ecological balance of
urban green spaces [
        <xref ref-type="bibr" rid="ref10 ref11">10, 11</xref>
        ], including Mexico City.
      </p>
      <p>
        Although mistletoe species exhibit distinctive characteristics with respect to the host tree, their
visual identification often requires expertise, typically from a forestry specialist [
        <xref ref-type="bibr" rid="ref12 ref13">12, 13</xref>
        ]. Consequently,
effectively controlling new foci of infestation remains a challenging task. Moreover, identifying trees
that have died due to mistletoe infestation can serve as a means to gauge the extent of infestation within a
designated area, where a substantial number of dead trees signals the urgency of phytosanitary
intervention. Despite the existence of various manual methodologies for the surveillance of individual trees,
the integration of UAV imagery alongside classification algorithms naturally emerges as a promising
approach for tackling this issue [
        <xref ref-type="bibr" rid="ref14 ref15">14, 15</xref>
        ].
      </p>
      <p>
        In the current literature on tree classification based on remote sensing data, Support Vector
Machines and Random Forests have emerged as the most frequently used machine learning
techniques. However, a significant challenge associated with these techniques is the selection of image
features that enhance classification accuracy [
        <xref ref-type="bibr" rid="ref16">16</xref>
        ]. In contrast, deep learning strategies, capable of
autonomously learning from data, obviate the need for manual feature engineering [
        <xref ref-type="bibr" rid="ref17">17</xref>
        ]. Consequently,
widely used deep learning techniques, such as Convolutional Neural Networks (CNNs), have become
high-performance tools suitable for various remote sensing applications [
        <xref ref-type="bibr" rid="ref18">18</xref>
        ]. Despite these advances,
few studies focus on automatic methodologies for dead tree inventories, crucial for effective forest
management and ecological monitoring. Notably, the studies by [
        <xref ref-type="bibr" rid="ref19">19</xref>
        ] and [
        <xref ref-type="bibr" rid="ref20">20</xref>
          ] offer innovative approaches
for utilizing remote sensing data to detect and classify dead trees, emphasizing the need for expanded
research in this critical area.
      </p>
      <p>
        In this work, we propose the use of CNNs for the classification of trees killed by mistletoe, utilizing
multispectral aerial imagery from an urban park in Mexico City. Specifically, we employed two CNN
models, ResNet-34 and DenseNet-121, which have been previously recognized for their ability to identify
unique vegetation patterns, significantly enhancing the classification accuracy of images [
        <xref ref-type="bibr" rid="ref21">21</xref>
        ]. This
investigation marks an initial endeavor to assess mistletoe infestation levels in Mexico’s urban parks.
The creation of automated classification tools for identifying mistletoe species and dead trees holds the
potential to significantly streamline the extensive work typically conducted by forestry experts.
      </p>
      <p>The structure of this document is outlined as follows. Section 2 details the specifications of the
imagery utilized in compiling our dataset and provides an overview of the CNNs employed in our study.
Section 3 elaborates on the methodology for training the models and the experiments conducted to
evaluate their performance. Section 4 delivers the key findings, highlighting the performance metrics
achieved by both models. Finally, Section 5 offers a discussion on the implications of our research
findings and draws conclusions.</p>
    </sec>
    <sec id="sec-2">
      <title>2. Materials and Methods</title>
      <sec id="sec-2-1">
        <title>2.1. Image dataset</title>
        <p>
          Our study area encompasses an urban green space in Mexico City, known as Ramón López Velarde
Garden, located at coordinates [19.4096°, -99.1563°]. This park hosts a variety of tree species, e.g.,
Casuarina sp., Eucalyptus sp., Fraxinus sp., Cupressus sp., Ligustrum sp., and Grevillea sp. [
          <xref ref-type="bibr" rid="ref22">22</xref>
          ]. The
mistletoe species Struthanthus interruptus is prevalent among many of these trees, significantly altering
the natural landscape. Consequently, the mistletoe infestation has led to the demise of several trees, as
illustrated in Figure 1-(a).
        </p>
        <p>
          To monitor mistletoe propagation, multispectral images were collected on September 22, 2022, as
described in [
          <xref ref-type="bibr" rid="ref23">23</xref>
          ]. For this purpose, a UAV equipped with a camera comprising five CMOS sensors was employed,
covering the spectral regions of Blue (450nm ± 16nm), Green (560nm ± 16nm), Red (650nm ± 16nm),
Red-edge (730nm ± 16nm), and Near-infrared (840nm ± 26nm). In total, 460 images with dimensions
of 1600 × 1300 pixels were acquired in both orthogonal and oblique modes over the study area. Of
these, only 111 images revealed the presence of dead trees.
        </p>
        <p>
          As discussed in [
          <xref ref-type="bibr" rid="ref23">23</xref>
          ], the image sets were co-registered prior to analysis to ensure that every pixel
corresponds to the same geographical coordinate across the five bands. Later, binary masks delineating
the presence of three classes —Dead Trees (DT), Green Vegetation (GV), and Man-made Structures
(MS)— were created based on a manual segmentation process from the RGB versions of the original
images; color images were uploaded and annotated within the CVAT application (www.cvat.ai). The
binary masks were then employed for the subsequent training and validation of our models. Figure
1-(b) shows a representative example of the segmented areas for the classes mentioned.
        </p>
        <p>Figure 1. (a) RGB color image; (b) segmented image.</p>
      </sec>
      <sec id="sec-2-2">
        <title>2.2. Convolutional neural network models</title>
        <p>
          In similar studies oriented towards the classification of tree species, convolutional models such as
Residual Networks (ResNet) and Densely Connected Networks (DenseNet) have shown promising
results based on remote sensing data [
          <xref ref-type="bibr" rid="ref17 ref24 ref25">17, 24, 25</xref>
          ]. Given that both models are pre-trained, they are
capable of reducing computational costs while extracting distinctive vegetation patterns from the
collected images. These models are usually trained on 3-channel images, which could be a limitation,
especially when multispectral sets are available [
          <xref ref-type="bibr" rid="ref26">26</xref>
          ].
        </p>
        <p>
          The unique structure of ResNet’s residual blocks is particularly beneficial for object classification,
enabling the efficient training of deep networks without incurring performance loss as their depth
increases [
          <xref ref-type="bibr" rid="ref27">27</xref>
          ]. In addition to consecutive convolutions and activations within each block, ResNet
features skip connections that utilize identity or convolutional shortcuts to mitigate the vanishing
gradient problem. This design not only facilitates the construction of deeper networks, but also reduces
the complexity by minimizing the number of max-pooling layers required, typically incorporating only
one [
          <xref ref-type="bibr" rid="ref26">26</xref>
          ]. Figure 2 illustrates the architecture of ResNet-34, showcasing its overall structure, which
includes 32 convolutional layers with 3 × 3 filters and ReLU activation functions, an average pooling
layer, and a fully connected layer. Observe that a shortcut connection is inserted around every pair of 3 × 3
filters.
        </p>
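        <p>For illustration, the following sketch outlines a residual block of the kind described above in PyTorch. It is a minimal example for exposition, assuming an identity shortcut and equal input and output channels, and is not the exact ResNet-34 implementation used in this study.</p>
        <preformat>
import torch
import torch.nn as nn

class BasicBlock(nn.Module):
    """Minimal ResNet-style residual block: two 3x3 convolutions with
    batch normalization and an identity (skip) connection."""
    def __init__(self, channels):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, kernel_size=3, padding=1, bias=False)
        self.bn1 = nn.BatchNorm2d(channels)
        self.conv2 = nn.Conv2d(channels, channels, kernel_size=3, padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(channels)
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x):
        identity = x                       # the shortcut keeps the input unchanged
        out = self.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        out = out + identity               # residual addition mitigates vanishing gradients
        return self.relu(out)
        </preformat>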
        <p>On the other hand, unlike traditional convolutional networks, DenseNet introduces a dense
connectivity structure among layers; that is, each layer is directly connected to all subsequent layers. This
approach ensures efficient information flow and mitigates data loss, a common issue with residual
connections. Additionally, DenseNet controls feature growth through a bottleneck strategy and employs
compression to maintain an optimal number of parameters. This architecture has demonstrated strong
performance across various datasets, attributed to its efective management of information throughout
the network.</p>
        <p>As illustrated in Figure 3, DenseNet features densely interconnected blocks where each layer is directly
connected to every subsequent one. Within each block, the fundamental layers include convolutions,
batch normalization, and ReLU activation functions. The architecture incorporates transition layers
that perform dimensionality reduction, including both spatial and feature-channel compression, to
efficiently manage the flow of information through the network. Additionally, DenseNet generates
compact models that are parameter-efficient and straightforward to train through feature reuse. By
concatenating feature maps from different layers, the network enriches the feature set available to each
layer, enhancing its representational power. This ability to leverage a rich feature set from across the
network could potentially improve the classification of dead trees by providing a more diverse range of
information for decision-making.</p>
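        <p>To make the dense connectivity concrete, the following toy PyTorch sketch shows how each layer in a dense block concatenates the feature maps of all preceding layers before producing its own. It is an illustrative example only, not the DenseNet-121 implementation employed here, and the growth rate and number of layers are arbitrary.</p>
        <preformat>
import torch
import torch.nn as nn

class TinyDenseBlock(nn.Module):
    """Toy dense block: every layer receives the concatenation of all
    previous feature maps and contributes 'growth' new channels."""
    def __init__(self, in_channels, growth=12, num_layers=3):
        super().__init__()
        self.layers = nn.ModuleList()
        channels = in_channels
        for _ in range(num_layers):
            self.layers.append(nn.Sequential(
                nn.BatchNorm2d(channels),
                nn.ReLU(inplace=True),
                nn.Conv2d(channels, growth, kernel_size=3, padding=1, bias=False),
            ))
            channels += growth             # each layer widens the input of the next one

    def forward(self, x):
        features = [x]
        for layer in self.layers:
            out = layer(torch.cat(features, dim=1))   # reuse all earlier feature maps
            features.append(out)
        return torch.cat(features, dim=1)
        </preformat>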
      </sec>
      <sec id="sec-2-3">
        <title>2.3. Hardware and Software</title>
        <p>The experiments described in this manuscript were executed on a workstation with the following
characteristics: Intel(R) Xeon(R) CPU E5-2620 v2 at 2.10 GHz, 128 GB RAM, and Windows 11 (64-bit) as the
operating system.</p>
        <p>Manual segmentation and labeling of aerial images were conducted on the annotation platform
known as CVAT (http://www.cvat.ai). The image preprocessing steps and classification routines were
developed using Python 3.8 and implemented in the PyTorch framework, version 2.2.1.</p>
      </sec>
    </sec>
    <sec id="sec-3">
      <title>3. Experiments</title>
      <p>Given that only 111 images were labeled with the presence of dead trees, a data augmentation process
was applied to the RGB versions of these images. This process included image transformations, such as
rotations by 90°, 180°, and 270°, as well as horizontal and vertical reflections. As a result, a total of 666
images were generated for our experiments.</p>
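      <p>The rotations and reflections described above can be reproduced with a few array operations. The sketch below is a simplified example that assumes each labeled image is loaded as a NumPy array; it is not necessarily the exact augmentation pipeline used in this study.</p>
      <preformat>
import numpy as np

def augment(image):
    """Return the original image plus five augmented variants:
    rotations by 90, 180 and 270 degrees, and horizontal/vertical reflections."""
    variants = [image]
    for k in (1, 2, 3):                        # 90, 180 and 270 degree rotations
        variants.append(np.rot90(image, k=k, axes=(0, 1)))
    variants.append(np.flip(image, axis=1))    # horizontal reflection
    variants.append(np.flip(image, axis=0))    # vertical reflection
    return variants

# 111 labeled images x 6 variants = 666 images, matching the augmented set size.
      </preformat>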
      <p>For the training and validation of our models, each image in the augmented set was divided into
16 × 16 tiles to enhance and increase the variability of the input data. We then carefully selected tiles
that contained at least 90% of color and texture information of the original images. This selection
process yielded 43,844 tiles for the Dead Tree class, 237,689 for Green Vegetation, and 111,110 for
Man-made Structures. It is noteworthy that these last two classes together account for nearly
90% of the total data in our dataset.</p>
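      <p>A possible implementation of the tiling and selection step is sketched below. For illustration we assume that a tile is assigned to a class when at least 90% of its pixels fall inside the corresponding binary mask; the exact selection criterion may differ from the one applied by the authors.</p>
      <preformat>
import numpy as np

def extract_tiles(image, class_mask, tile=16, coverage=0.9):
    """Split an image into non-overlapping tile x tile patches and keep those
    whose pixels are covered by the class mask above the 'coverage' threshold.
    'class_mask' is a binary array of the same height and width as 'image'."""
    tiles = []
    h, w = class_mask.shape
    for r in range(0, h - tile + 1, tile):
        for c in range(0, w - tile + 1, tile):
            window = class_mask[r:r + tile, c:c + tile]
            if window.mean() >= coverage:      # assumed 90% coverage rule
                tiles.append(image[r:r + tile, c:c + tile])
    return tiles
      </preformat>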
      <p>ResNet-34 and DenseNet-121 models were adapted to suit our specific application, with modifications
primarily to their output layers to classify the target classes, while leveraging the pre-trained weights
of the other layers. We conducted an initial experiment using a subset of 5000 tiles per class to assess
the models’ overall performance. This assessment utilized a holdout method, allocating 80% of the
samples for training and 20% for validation, and a 5-fold cross-validation process. Table 1 shows the
performance metrics from this preliminary training, including Precision, Accuracy, F1-score and Recall.</p>
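      <p>The adaptation of the pre-trained networks can be carried out by replacing only their final classification layers, as sketched below with the torchvision model constructors. This is a plausible configuration under a recent torchvision release, not necessarily the authors' exact setup; tiles are assumed to be resized to the networks' expected input size before being fed to the models.</p>
      <preformat>
import torch
import torch.nn as nn
from torchvision import models

NUM_CLASSES = 3  # Dead Trees, Green Vegetation, Man-made Structures

# ResNet-34: replace the fully connected output layer, keep pre-trained weights.
resnet = models.resnet34(weights=models.ResNet34_Weights.IMAGENET1K_V1)
resnet.fc = nn.Linear(resnet.fc.in_features, NUM_CLASSES)

# DenseNet-121: replace the classifier layer, keep pre-trained weights.
densenet = models.densenet121(weights=models.DenseNet121_Weights.IMAGENET1K_V1)
densenet.classifier = nn.Linear(densenet.classifier.in_features, NUM_CLASSES)

# 80/20 holdout split of a tile dataset (assuming a PyTorch Dataset 'tile_dataset'):
# train_set, val_set = torch.utils.data.random_split(tile_dataset, [0.8, 0.2])
      </preformat>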
      <p>From the previous results, the overall accuracy with 5-fold cross-validation was slightly higher than
that obtained with the holdout method. For instance, ResNet-34 achieved an accuracy of 0.95 with the
former compared to approximately 0.9 with the latter validation method. However, the main drawback
of 5-fold cross-validation lies in its high computational cost. Therefore, throughout this research,
we opted for the holdout validation method. For both models, we used the cross-entropy loss as our loss
function, set the batch size to 64, and employed the Adam optimizer. Table 2 details these and additional
CNN training parameters.</p>
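      <p>With the settings listed above (cross-entropy loss, Adam optimizer, batch size of 64), a training epoch could be organized as in the following sketch. The data loader, device, and learning rate are illustrative assumptions rather than values reported by the authors.</p>
      <preformat>
import torch
import torch.nn as nn
from torch.utils.data import DataLoader

def train_one_epoch(model, loader, criterion, optimizer, device):
    """Run one training epoch and return the mean loss per sample."""
    model.train()
    running_loss, n = 0.0, 0
    for tiles, labels in loader:
        tiles, labels = tiles.to(device), labels.to(device)
        optimizer.zero_grad()
        loss = criterion(model(tiles), labels)
        loss.backward()
        optimizer.step()
        running_loss += loss.item() * tiles.size(0)
        n += tiles.size(0)
    return running_loss / n

# Assumed usage with the reported settings (the learning rate is an assumption):
# loader = DataLoader(train_set, batch_size=64, shuffle=True)
# criterion = nn.CrossEntropyLoss()
# optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
# for epoch in range(100):
#     epoch_loss = train_one_epoch(model, loader, criterion, optimizer, "cuda")
      </preformat>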
    </sec>
    <sec id="sec-4">
      <title>4. Results</title>
      <p>ResNet-34 and DenseNet-121 were trained on 105,225 tiles, constituting 80% of the entire dataset. The
remaining 26,307 tiles, which make up approximately 20% of the dataset, were allocated for validation.
The models were configured according to the parameters outlined in Table 2 and underwent a training
regimen spanning 100 epochs. The training and validation performances, as depicted in Figure 4-(a)
for ResNet-34 and Figure 4-(b) for DenseNet-121, reveal that the training loss trends closely align with
those of the validation loss across epochs. Similarly, the training and validation accuracy trends show
parallelism. This consistency between training and validation phases indicates an absence of overfitting
and underscores the models’ capability to accurately classify the input images.</p>
      <p>Figure 4. Performance graphs obtained from training: (a) ResNet-34; (b) DenseNet-121.</p>
      <p>Table 3 presents the confusion matrices for both ResNet-34 and DenseNet-121, based on a validation
set consisting of 26,307 tiles. In particular, for ResNet-34 the overall accuracy reached 96.68%, with
the DT class achieving an accuracy of 95.80%, primarily due to confusions with the MS class (about
3.4%). The accuracies for the GV and MS classes were 98.89% and 95.33%, respectively. Comparatively,
DenseNet-121 showed an improvement in classification accuracy across all classes. Specifically, the DT
class had 97.89% of its samples correctly classified, the MS class reached an accuracy of around 96.00%,
and the GV class achieved an accuracy of 98.72%. Therefore, the overall performance of DenseNet-121 was
approximately 97.53%.</p>
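      <p>The per-class accuracies reported above can be derived from the confusion matrix of the validation tiles. A minimal sketch using scikit-learn is given below, assuming arrays of true and predicted class indices; here the per-class accuracy is taken as the diagonal of the confusion matrix divided by the row sums.</p>
      <preformat>
from sklearn.metrics import confusion_matrix

def per_class_accuracy(y_true, y_pred, labels=(0, 1, 2)):
    """Return the confusion matrix, the per-class accuracy (diagonal over
    row sums), and the overall accuracy for the given class labels."""
    cm = confusion_matrix(y_true, y_pred, labels=list(labels))
    per_class = cm.diagonal() / cm.sum(axis=1)
    overall = cm.diagonal().sum() / cm.sum()
    return cm, per_class, overall
      </preformat>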
      <p>Additionally, Table 4 presents the performance metrics for ResNet-34 and DenseNet-121, obtained
using a holdout validation on the entire dataset. Both models exhibit nearly uniform values across all
metrics: 0.96 for ResNet-34 and 0.97 for DenseNet-121. These results indicate that DenseNet-121 holds a
slight but consistent advantage over ResNet-34 on this classification task.</p>
      <p>
        It is interesting to note that the classification process conducted by the trained models can be
visualized as heat maps, aimed at demonstrating how the models identify specific classes of interest,
such as dead trees. In order to generate a heat map, every pixel of an input image is processed through
the softmax function for the DT class, which then assigns it a probability value in the range [0, 1]. This
process results in the creation of a map, whose red tones represent maximum probabilities, while purple
ones correspond to low values. Examples of heat maps produced for the DT class are displayed in Figure
5 for ResNet-34 and DenseNet-121, respectively.
      </p>
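      <p>A heat map of this kind can be approximated by sliding a window over the input image and recording the softmax probability of the DT class at each position. The sketch below is an illustrative tile-wise version of that idea; the upsampling of each tile to 224 x 224 pixels and the DT class index are assumptions about the preprocessing rather than reported details.</p>
      <preformat>
import torch
import torch.nn.functional as F

def dt_heatmap(model, image, tile=16, dt_index=0, device="cuda"):
    """Coarse probability map for the Dead Tree (DT) class: each tile of the
    input image (a C x H x W float tensor) is scored with the model's
    softmax output for that class."""
    model.to(device).eval()
    _, h, w = image.shape
    heat = torch.zeros(h // tile, w // tile)
    with torch.no_grad():
        for i in range(h // tile):
            for j in range(w // tile):
                patch = image[:, i * tile:(i + 1) * tile, j * tile:(j + 1) * tile]
                patch = F.interpolate(patch.unsqueeze(0), size=224,
                                      mode="bilinear", align_corners=False)
                probs = F.softmax(model(patch.to(device)), dim=1)
                heat[i, j] = probs[0, dt_index]
    return heat  # values in [0, 1]: red tones for high, purple for low probabilities
      </preformat>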
    </sec>
    <sec id="sec-5">
      <title>5. Discussion and Conclusion</title>
      <p>This study addresses the growing concern of mistletoe infestation within urban green spaces, particularly
focusing on an urban park in Mexico City. The detrimental impact of mistletoe on tree health and,
consequently, on the ecological balance and aesthetic value of urban parks calls for innovative
monitoring and management strategies. By taking advantage of the high spatial resolution capabilities
of aerial imagery, we relied on the pattern recognition potential of CNNs, specifically ResNet-34 and
DenseNet-121, for the automated classification of dead trees affected by mistletoe. To compensate for
the limited availability of labeled images, we applied a data augmentation process, which was followed
by the segmentation of these images into tiles to enhance the variability and richness of the input data
for model training. Through this approach, we aimed to develop a robust, efficient, and scalable solution
for the classification task.</p>
      <p>The challenge posed by the limited quantity of labeled images depicting dead trees required the
implementation of a data augmentation strategy to enhance our dataset. Hence, by applying rotations
and reflections to the original images, we expanded our dataset sixfold, ensuring a more
comprehensive representation of the classes of interest. Further refinement was achieved by segmenting these
augmented images into 16 × 16 tiles, a process that not only amplified the diversity of the training
data, but also allowed for a more granular analysis of the imagery. The selection process of those tiles
preserving at least 90% of color and texture information of the original images was instrumental in
curating a dataset that closely mimics the variability encountered in natural settings and allows more
efficient training of the CNN models.</p>
      <p>In adapting the ResNet-34 and DenseNet-121 models to our specific classification task, we tailored
the output layers to distinguish between Dead Trees, Green Vegetation, and Man-made Structures,
while retaining the pre-trained weights of other layers to capitalize on the models’ inherent strengths.
This strategic use of transfer learning facilitated an efficient training process, enabling the models
to adeptly identify distinctive vegetation patterns. The initial experiment, which utilized a balanced
subset of 5000 tiles per class, revealed the capabilities of both models. DenseNet-121, in particular,
demonstrated a slight edge over ResNet-34, achieving higher performance metrics across Precision,
Accuracy, Recall, and F1-score. For the task described in this work, DenseNet-121 emerged as a better
option than ResNet-34 in terms of the ability to manage and leverage feature information, likely due to
its densely connected architecture.</p>
      <p>The evaluation of model performance employed two validation methods: holdout and 5-fold
cross-validation. Despite the slightly superior accuracy achieved by 5-fold cross-validation, the significant
computational resources it demanded led us to predominantly utilize the holdout method for our
experiments. This pragmatic approach balanced efficiency with efficacy, ensuring robust model assessment
without unduly taxing computational resources. The efficiency of the models, particularly DenseNet-121,
is evident from the resulting performance metrics, and the closely aligned training and validation trends
also point to the models' robust generalization capabilities.</p>
      <p>The utility of the trained models extends beyond numerical metrics to offer intuitive visual insights
through the generation of heat maps; these maps serve as a visual proxy, showing the models’
interpretative process by highlighting areas within the input images deemed significant for classification,
a fundamental aim in explainable artificial intelligence (XAI). Particularly for the DT class, the heat
maps effectively demarcate regions identified by the models as indicative of dead vegetation, with
varying intensities reflecting the confidence levels of the models. By translating the models’ complex
decision-making into a comprehensible format, these heat maps not only validate the accuracy of the
models, but also provide a valuable tool for forestry specialists. Such visual aids facilitate a deeper
understanding of mistletoe spread patterns and can significantly augment traditional surveillance
methods, offering a novel lens through which to assess and strategize phytosanitary interventions in
urban green spaces.</p>
      <p>This research demonstrates the effective application of CNNs, specifically ResNet-34 and
DenseNet-121, in classifying dead trees within urban parks using UAV-derived imagery. Through data augmentation
and tile-based analysis, we have established a robust framework that not only enhances the dataset’s
diversity, but also refines the training process of the models. The superior performance of DenseNet-121
suggests the potential of densely connected architectures in handling complex classification tasks. The
complementary use of heat maps for visual interpretation further accentuates the practical utility of
these models in urban forestry management. Future research could explore the integration of additional
spectral bands to leverage the full potential of multispectral imagery, the inclusion of larger and more
varied datasets to improve model robustness, and the application of these methodologies to other urban
ecological environments.</p>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>
          [1]
          <string-name>
            <given-names>V.</given-names>
            <surname>Mehta</surname>
          </string-name>
          ,
          <string-name>
            <given-names>B.</given-names>
            <surname>Mahato</surname>
          </string-name>
          ,
          <article-title>Designing urban parks for inclusion, equity, and diversity</article-title>
          ,
          <source>Journal of Urbanism: International Research on Placemaking and Urban Sustainability</source>
          <volume>14</volume>
          (
          <year>2020</year>
          )
          <fpage>457</fpage>
          -
          <lpage>489</lpage>
          . doi:
          <volume>10</volume>
          .1080/17549175.
          <year>2020</year>
          .
          <volume>1816563</volume>
          .
        </mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>
          [2]
          <string-name>
            <given-names>J.</given-names>
            <surname>Fernandez</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Y.</given-names>
            <surname>Song</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Padua</surname>
          </string-name>
          ,
          <string-name>
            <given-names>P.-C.</given-names>
            <surname>Liu</surname>
          </string-name>
          ,
          <article-title>A framework for urban parks</article-title>
          ,
          <source>Landscape Journal</source>
          <volume>41</volume>
          (
          <year>2022</year>
          )
          <fpage>15</fpage>
          -
          <lpage>29</lpage>
          . doi:
          <volume>10</volume>
          .3368/lj.41.1.15.
        </mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>
          [3]
          <string-name>
            <given-names>V.</given-names>
            <surname>Kasyanov</surname>
          </string-name>
          ,
          <string-name>
            <given-names>R.</given-names>
            <surname>Silin</surname>
          </string-name>
          ,
          <article-title>Method for multi-criteria evaluation of urban parks</article-title>
          ,
          <source>IOP Conference Series: Materials Science and Engineering</source>
          <volume>687</volume>
          (
          <year>2019</year>
          ). doi:
          <volume>10</volume>
          .1088/
          <fpage>1757</fpage>
          -899X/687/5/055040.
        </mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>
          [4]
          <string-name>
            <given-names>Z.</given-names>
            <surname>Xu</surname>
          </string-name>
          ,
          <string-name>
            <given-names>X.</given-names>
            <surname>Gao</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Z.</given-names>
            <surname>Wang</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.</given-names>
            <surname>Fan</surname>
          </string-name>
          ,
          <article-title>Big data-based evaluation of urban parks: A chinese case study</article-title>
          ,
          <source>Sustainability</source>
          (
          <year>2019</year>
          ). doi:
          <volume>10</volume>
          .3390/SU11072125.
        </mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>
          [5]
          <string-name>
            <given-names>K.</given-names>
            <surname>Pristouris</surname>
          </string-name>
          ,
          <string-name>
            <given-names>H.</given-names>
            <surname>Nakos</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Y.</given-names>
            <surname>Stavrakas</surname>
          </string-name>
          ,
          <string-name>
            <given-names>K. I.</given-names>
            <surname>Kotsopoulos</surname>
          </string-name>
          ,
          <string-name>
            <given-names>T.</given-names>
            <surname>Alexandridis</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M. S.</given-names>
            <surname>Barda</surname>
          </string-name>
          ,
          <string-name>
            <given-names>K. P.</given-names>
            <surname>Ferentinos</surname>
          </string-name>
          ,
          <article-title>An integrated system for urban parks touring and management</article-title>
          ,
          <source>Urban Science</source>
          (
          <year>2021</year>
          ). doi:
          <volume>10</volume>
          . 3390/urbansci5040091.
        </mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>
          [6]
          <string-name>
            <given-names>J.</given-names>
            <surname>Zhu</surname>
          </string-name>
          ,
          <string-name>
            <given-names>H.</given-names>
            <surname>Lu</surname>
          </string-name>
          ,
          <string-name>
            <given-names>T.</given-names>
            <surname>Zheng</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Y.</given-names>
            <surname>Rong</surname>
          </string-name>
          ,
          <string-name>
            <given-names>C.</given-names>
            <surname>Wang</surname>
          </string-name>
          ,
          <string-name>
            <given-names>W.</given-names>
            <surname>Zhang</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Y.</given-names>
            <surname>Yan</surname>
          </string-name>
          ,
          <string-name>
            <given-names>L.</given-names>
            <surname>Tang</surname>
          </string-name>
          ,
          <article-title>Vitality of urban parks and its influencing factors from the perspective of recreational service supply, demand, and spatial links</article-title>
          ,
          <source>International Journal of Environmental Research and Public Health</source>
          <volume>17</volume>
          (
          <year>2020</year>
          ). doi:
          <volume>10</volume>
          .3390/ijerph17051615.
        </mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>
          [7]
          <string-name>
            <given-names>G. D.</given-names>
            <surname>Moore</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.</given-names>
            <surname>Hopkins</surname>
          </string-name>
          ,
          <article-title>Urban parks and protected areas: on the front lines of a pandemic</article-title>
          (
          <year>2021</year>
          )
          <fpage>73</fpage>
          -
          <lpage>84</lpage>
          . doi:
          <volume>10</volume>
          .2305/IUCN.CH.
          <year>2021</year>
          .PARKS-27-SIGM.EN.
        </mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>
          [8]
          <string-name>
            <given-names>L.</given-names>
            <surname>Skrypnik</surname>
          </string-name>
          ,
          <string-name>
            <given-names>P.</given-names>
            <surname>Maslennikov</surname>
          </string-name>
          ,
          <string-name>
            <given-names>P.</given-names>
            <surname>Feduraev</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Pungin</surname>
          </string-name>
          ,
          <string-name>
            <given-names>N.</given-names>
            <surname>Belov</surname>
          </string-name>
          ,
          <article-title>Ecological and landscape factors affecting the spread of european mistletoe (viscum album l.) in urban areas (a case study of the kaliningrad city, russia)</article-title>
          ,
          <source>Plants</source>
          <volume>9</volume>
          (
          <year>2020</year>
          ). doi:
          <volume>10</volume>
          .3390/plants9030394.
        </mixed-citation>
      </ref>
      <ref id="ref9">
        <mixed-citation>
          [9]
          <string-name>
            <given-names>D.</given-names>
            <surname>Alvarado-Rosales</surname>
          </string-name>
          ,
          <string-name>
            <given-names>L. D. L.</given-names>
            <surname>Saavedra-Romero</surname>
          </string-name>
          ,
          <article-title>Tree damage and mistletoe impact on urban green areas</article-title>
          ,
          <source>Revista Árvore</source>
          (
          <year>2021</year>
          ). doi:
          <volume>10</volume>
          .1590/1806-908820210000030.
        </mixed-citation>
      </ref>
      <ref id="ref10">
        <mixed-citation>
          [10]
          <string-name>
            <given-names>L.</given-names>
            <surname>Skrypnik</surname>
          </string-name>
          ,
          <string-name>
            <given-names>P.</given-names>
            <surname>Maslennikov</surname>
          </string-name>
          ,
          <string-name>
            <given-names>P.</given-names>
            <surname>Feduraev</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Pungin</surname>
          </string-name>
          ,
          <string-name>
            <given-names>N.</given-names>
            <surname>Belov</surname>
          </string-name>
          ,
          <article-title>Specific features of the response of the antioxidant system of urban trees to mistletoe infection</article-title>
          ,
          <source>E3S Web of Conferences</source>
          (
          <year>2021</year>
          ). doi:
          <volume>10</volume>
          .1051/e3sconf/202129102013.
        </mixed-citation>
      </ref>
      <ref id="ref11">
        <mixed-citation>
          [11]
          <string-name>
            <given-names>E.</given-names>
            <surname>Sreenivasan</surname>
          </string-name>
          ,
          <article-title>Occurrence of mistletoe (loranthus spp.) infestation on garden croton (codiaeum variegatum) and other host trees</article-title>
          (
          <year>2020</year>
          ). doi:
          <volume>10</volume>
          .33564/ijeast.
          <year>2020</year>
          .
          <year>v04i10</year>
          .
          <fpage>010</fpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref12">
        <mixed-citation>
          [12]
          <string-name>
            <given-names>M.</given-names>
            <surname>Miraki</surname>
          </string-name>
          ,
          <string-name>
            <given-names>H.</given-names>
            <surname>Sohrabi</surname>
          </string-name>
          ,
          <string-name>
            <given-names>P.</given-names>
            <surname>Fatehi</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Kneubuehler</surname>
          </string-name>
          ,
          <article-title>Detection of mistletoe infected trees using uav high spatial resolution images</article-title>
          ,
          <source>Journal of Plant Diseases and Protection</source>
          <volume>128</volume>
          (
          <year>2021</year>
          )
          <fpage>1679</fpage>
          -
          <lpage>1689</lpage>
          . doi:
          <volume>10</volume>
          .1007/s41348-021-00502-6.
        </mixed-citation>
      </ref>
      <ref id="ref13">
        <mixed-citation>
          [13]
          <string-name>
            <given-names>P. A.</given-names>
            <surname>Mejia-Zuluaga</surname>
          </string-name>
          ,
          <string-name>
            <given-names>L.</given-names>
            <surname>Dozal</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J. C.</given-names>
            <surname>Valdiviezo-N.</surname>
          </string-name>
          ,
          <article-title>Genetic programming approach for the detection of mistletoe based on uav multispectral imagery in the conservation area of mexico city</article-title>
          ,
          <source>Remote Sensing</source>
          <volume>14</volume>
          (
          <year>2022</year>
          ). doi:
          <volume>10</volume>
          .3390/rs14030801.
        </mixed-citation>
      </ref>
      <ref id="ref14">
        <mixed-citation>
          [14]
          <string-name>
            <given-names>A.</given-names>
            <surname>Abdollahnejad</surname>
          </string-name>
          ,
          <string-name>
            <given-names>D.</given-names>
            <surname>Panagiotidis</surname>
          </string-name>
          ,
          <article-title>Tree species classification and health status assessment for a mixed broadleaf-conifer forest with uas multispectral imaging</article-title>
          ,
          <source>Remote. Sens</source>
          .
          <volume>12</volume>
          (
          <year>2020</year>
          )
          <article-title>3722</article-title>
          . doi:
          <volume>10</volume>
          .3390/rs12223722.
        </mixed-citation>
      </ref>
      <ref id="ref15">
        <mixed-citation>
          [15]
          <string-name>
            <given-names>A.</given-names>
            <surname>Safonova</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Y.</given-names>
            <surname>Hamad</surname>
          </string-name>
          ,
          <string-name>
            <given-names>E.</given-names>
            <surname>Dmitriev</surname>
          </string-name>
          ,
          <string-name>
            <given-names>G.</given-names>
            <surname>Georgiev</surname>
          </string-name>
          ,
          <string-name>
            <given-names>V.</given-names>
            <surname>Trenkin</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Georgieva</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>Dimitrov</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Iliev</surname>
          </string-name>
          ,
          <article-title>Individual tree crown delineation for the species classification and assessment of vital status of forest stands from uav images</article-title>
          ,
          <source>Drones</source>
          (
          <year>2021</year>
          ). doi:
          <volume>10</volume>
          .3390/drones5030077.
        </mixed-citation>
      </ref>
      <ref id="ref16">
        <mixed-citation>
          [16]
          <string-name>
            <given-names>P. A.</given-names>
            <surname>Mejia-Zuluaga</surname>
          </string-name>
          ,
          <string-name>
            <given-names>L. F.</given-names>
            <surname>Dozal-García</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J. C.</given-names>
            <surname>Valdiviezo-Navarro</surname>
          </string-name>
          ,
          <article-title>Detection of phoradendron velutinum implementing genetic programming in multispectral aerial images in mexico city</article-title>
          ,
          <source>Lecture Notes in Geoinformation and Cartography</source>
          (
          <year>2022</year>
          )
          <fpage>109</fpage>
          -
          <lpage>129</lpage>
          . doi:
          <volume>10</volume>
          .1007/978-3-
          <fpage>030</fpage>
          -98096-
          <issue>2</issue>
          _
          <fpage>9</fpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref17">
        <mixed-citation>
          [17]
          <string-name>
            <given-names>L.</given-names>
            <surname>Hui</surname>
          </string-name>
          ,
          <string-name>
            <given-names>H.</given-names>
            <surname>Baoxin</surname>
          </string-name>
          ,
          <string-name>
            <given-names>L.</given-names>
            <surname>Qian</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.</given-names>
            <surname>Linhai</surname>
          </string-name>
          ,
          <article-title>Cnn-based individual tree species classification using high-resolution satellite imagery and airborne lidar data</article-title>
          ,
          <source>Forests</source>
          <volume>12</volume>
          (
          <year>2021</year>
          ).
        </mixed-citation>
      </ref>
      <ref id="ref18">
        <mixed-citation>
          [18]
          <string-name>
            <given-names>X. X.</given-names>
            <surname>Zhu</surname>
          </string-name>
          ,
          <string-name>
            <given-names>D.</given-names>
            <surname>Tuia</surname>
          </string-name>
          ,
          <string-name>
            <given-names>L.</given-names>
            <surname>Mou</surname>
          </string-name>
          ,
          <string-name>
            <given-names>G.-S.</given-names>
            <surname>Xia</surname>
          </string-name>
          ,
          <string-name>
            <given-names>L.</given-names>
            <surname>Zhang</surname>
          </string-name>
          ,
          <string-name>
            <given-names>F.</given-names>
            <surname>Xu</surname>
          </string-name>
          ,
          <string-name>
            <given-names>F.</given-names>
            <surname>Fraundorfer</surname>
          </string-name>
          ,
          <article-title>Deep learning in remote sensing: a comprehensive review and list of resources</article-title>
          ,
          <source>IEEE Geoscience and Remote Sensing Magazine</source>
          <volume>5</volume>
          (
          <year>2017</year>
          )
          <fpage>8</fpage>
          -
          <lpage>36</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref19">
        <mixed-citation>
          [19]
          <string-name>
            <given-names>V.</given-names>
            <surname>Mosin</surname>
          </string-name>
          ,
          <string-name>
            <given-names>R.</given-names>
            <surname>Aguilar</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Platonov</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Vasiliev</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Kedrov</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Ivanov</surname>
          </string-name>
          ,
          <article-title>Remote sensing and machine learning for tree detection and classification in forestry applications</article-title>
          ,
          <source>Proc. SPIE 11155</source>
          ,
          <article-title>Image and Signal Processing for Remote Sensing XXV,</article-title>
          <year>111550F</year>
          (
          <year>2019</year>
          ). doi:
          <volume>10</volume>
          .1117/12.2531820.
        </mixed-citation>
      </ref>
      <ref id="ref20">
        <mixed-citation>
          [20]
          <string-name>
            <given-names>M.-C.</given-names>
            <surname>Jutras-Perreault</surname>
          </string-name>
          ,
          <string-name>
            <given-names>T.</given-names>
            <surname>Gobakken</surname>
          </string-name>
          ,
          <string-name>
            <given-names>E.</given-names>
            <surname>Naesset</surname>
          </string-name>
          ,
          <string-name>
            <given-names>H.</given-names>
            <surname>Ørka</surname>
          </string-name>
          ,
          <article-title>Comparison of different remotely sensed data sources for detection of presence of standing dead trees using a tree-based approach</article-title>
          ,
          <source>Remote Sensing</source>
          <volume>15</volume>
          (
          <year>2023</year>
          )
          <article-title>2223</article-title>
          . doi:
          <volume>10</volume>
          .3390/rs15092223.
        </mixed-citation>
      </ref>
      <ref id="ref21">
        <mixed-citation>
          [21]
          <string-name>
            <given-names>K. P.</given-names>
            <surname>Ferentinos</surname>
          </string-name>
          ,
          <article-title>Deep learning models for plant disease detection and diagnosis</article-title>
          ,
          <source>Computers and Electronics in Agriculture</source>
          <volume>145</volume>
          (
          <year>2018</year>
          )
          <fpage>311</fpage>
          -
          <lpage>318</lpage>
          . doi:
          <volume>10</volume>
          .1016/j.compag.
          <year>2018</year>
          .
          <volume>01</volume>
          .009.
        </mixed-citation>
      </ref>
      <ref id="ref22">
        <mixed-citation>
          [22]
          <string-name>
            <given-names>L.</given-names>
            <surname>Muñoz-Gutiérrez</surname>
          </string-name>
          ,
          <string-name>
            <given-names>R.</given-names>
            <surname>Pérez-Miranda</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.</given-names>
            <surname>Reséndiz-Martínez</surname>
          </string-name>
          ,
          <string-name>
            <given-names>R.</given-names>
            <surname>Reyes-Robles</surname>
          </string-name>
          ,
          <article-title>Caracterización de árboles de riesgo en el parque nacional viveros de coyoacán, ciudad de méxico</article-title>
          ,
          <source>Revista Mexicana de Ciencias Forestales</source>
          <volume>13</volume>
          (
          <year>2022</year>
          ).
        </mixed-citation>
      </ref>
      <ref id="ref23">
        <mixed-citation>
          [23]
          <string-name>
            <given-names>P. A.</given-names>
            <surname>Mejia-Zuluaga</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J. C.</given-names>
            <surname>Valdiviezo-N.</surname>
          </string-name>
          ,
          <string-name>
            <given-names>L.</given-names>
            <surname>Dozal</surname>
          </string-name>
          ,
          <article-title>Texture descriptors and machine learning algorithms for mistletoe detection in urban forests using multispectral imagery</article-title>
          ,
          <source>Remote Sensing for Agriculture, Ecosystems, and Hydrology XXV</source>
          <volume>12727</volume>
          (
          <year>2023</year>
          ). doi:
          <volume>10</volume>
          .1117/12.2684136.
        </mixed-citation>
      </ref>
      <ref id="ref24">
        <mixed-citation>
          [24]
          <string-name>
            <given-names>S.</given-names>
            <surname>Natesan</surname>
          </string-name>
          ,
          <string-name>
            <given-names>C.</given-names>
            <surname>Armenakis</surname>
          </string-name>
          ,
          <string-name>
            <given-names>U.</given-names>
            <surname>Vepakomma</surname>
          </string-name>
          ,
          <article-title>Resnet-based tree species classification using uav images</article-title>
          , in:
          <source>The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences, volume XLII-2/W13</source>
          ,
          <year>2019</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref25">
        <mixed-citation>
          [25]
          <string-name>
            <given-names>G.</given-names>
            <surname>Huang</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Z.</given-names>
            <surname>Liu</surname>
          </string-name>
          ,
          <string-name>
            <given-names>G.</given-names>
            <surname>Pleiss</surname>
          </string-name>
          ,
          <string-name>
            <given-names>L.</given-names>
            <surname>Maaten</surname>
          </string-name>
          ,
          <string-name>
            <given-names>K.</given-names>
            <surname>Weinberger</surname>
          </string-name>
          ,
          <article-title>Convolutional networks with dense connectivity</article-title>
          ,
          <source>IEEE Transactions on Pattern Analysis and Machine Intelligence</source>
          (
          <year>2019</year>
          )
          <fpage>8704</fpage>
          -
          <lpage>8716</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref26">
        <mixed-citation>
          [26]
          <string-name>
            <given-names>T.</given-names>
            <surname>Kattenborn</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.</given-names>
            <surname>Leitloff</surname>
          </string-name>
          ,
          <string-name>
            <given-names>F.</given-names>
            <surname>Schiefer</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>Hinz</surname>
          </string-name>
          ,
          <article-title>Review on convolutional neural networks (cnn) in vegetation remote sensing</article-title>
          ,
          <source>ISPRS Journal of Photogrammetry and Remote Sensing</source>
          <volume>173</volume>
          (
          <year>2021</year>
          )
          <fpage>24</fpage>
          -
          <lpage>49</lpage>
          . doi:
          <volume>10</volume>
          .1016/j.isprsjprs.
          <year>2020</year>
          .
          <volume>12</volume>
          .010.
        </mixed-citation>
      </ref>
      <ref id="ref27">
        <mixed-citation>
          [27]
          <string-name>
            <given-names>A.</given-names>
            <surname>Dimou</surname>
          </string-name>
          ,
          <string-name>
            <given-names>D.</given-names>
            <surname>Ataloglou</surname>
          </string-name>
          ,
          <string-name>
            <given-names>K.</given-names>
            <surname>Dimitropoulos</surname>
          </string-name>
          ,
          <string-name>
            <given-names>F.</given-names>
            <surname>Álvarez</surname>
          </string-name>
          ,
          <string-name>
            <given-names>P.</given-names>
            <surname>Daras</surname>
          </string-name>
          ,
          <article-title>Lds-inspired residual networks</article-title>
          ,
          <source>IEEE Transactions on Circuits and Systems for Video Technology</source>
          <volume>29</volume>
          (
          <year>2019</year>
          )
          <fpage>2363</fpage>
          -
          <lpage>2375</lpage>
          . doi:
          <volume>10</volume>
          . 1109/TCSVT.
          <year>2018</year>
          .
          <volume>2869680</volume>
          .
        </mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>