<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta>
      <journal-title-group>
        <journal-title>Machine learning to classify animal species
in camera trap images: applications in ecology. Methods in Ecology and Evolution</journal-title>
      </journal-title-group>
    </journal-meta>
    <article-meta>
      <title-group>
        <article-title>A Literature Review on Real-Time Image Classification for Dragonfly Species Using TensorFlow.js and Biodiversity Monitoring.</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Orion Lici</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Inva Bilo</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
          <xref ref-type="aff" rid="aff1">1</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Karafil Kareci</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Ana Ktona</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <aff id="aff0">
          <label>0</label>
          <institution>Department of Informatics, Faculty of Natural Sciences, University of Tirana</institution>
          ,
          <addr-line>Tirana</addr-line>
          ,
          <country country="AL">Albania</country>
        </aff>
        <aff id="aff1">
          <label>1</label>
          <institution>Department of Informatics, Faculty of Natural Sciences, University of Gjirokastra</institution>
          ,
          <addr-line>Tirana</addr-line>
          ,
          <country country="AL">Albania</country>
        </aff>
      </contrib-group>
      <pub-date>
        <year>2019</year>
      </pub-date>
      <volume>10</volume>
      <issue>4</issue>
      <fpage>309</fpage>
      <lpage>321</lpage>
      <abstract>
        <p>This paper explores TensorFlow.js, a JavaScript library for running machine learning in the browser. The main goal is to compare and classify different species of dragonfly based on their visual characteristics using machine learning models, and this paper prepares the ground for that work. The study investigates the effectiveness of TensorFlow.js for analyzing and classifying images in real time and shows the potential of Artificial Intelligence to assist ecological studies, biodiversity conservation, and entomological research. Based on that classification, images can be stored in a back-end application via a store-confirmation button in the front-end application. The system should use a camera to capture images, classify them with a convolutional neural network (CNN) model, and store the classified images. The performance of the system will be evaluated in terms of accuracy, speed, and scalability.</p>
      </abstract>
      <kwd-group>
        <kwd>TensorFlow.js</kwd>
        <kwd>machine learning</kwd>
        <kwd>browser-based</kwd>
        <kwd>biodiversity conservation</kwd>
        <kwd>entomological research</kwd>
        <kwd>dataset diversity</kwd>
        <kwd>convolutional neural network</kwd>
      </kwd-group>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>Introduction</title>
      <sec id="sec-1-1">
        <p>
          The rapid advancement of AI has opened the door for more efficient and automated processes in various fields such as medicine, education, and tourism [
          <xref ref-type="bibr" rid="ref10">10</xref>
          ], [
          <xref ref-type="bibr" rid="ref11">11</xref>
          ], [
          <xref ref-type="bibr" rid="ref7">7</xref>
          ], [
          <xref ref-type="bibr" rid="ref9">9</xref>
          ], [
          <xref ref-type="bibr" rid="ref8">8</xref>
          ], as well as in biodiversity work, from environmental monitoring to wildlife protection [
          <xref ref-type="bibr" rid="ref12">12</xref>
          ], [
          <xref ref-type="bibr" rid="ref13">13</xref>
          ], [
          <xref ref-type="bibr" rid="ref16">16</xref>
          ], [
          <xref ref-type="bibr" rid="ref22">24</xref>
          ]. One such application is the real-time classification and detection of different insect species, such as dragonflies, using AI image-recognition technologies. Dragonflies, with their diversity and ecological significance, serve as an ideal subject for studying the potential of real-time image classification in biodiversity research.
        </p>
      </sec>
      <sec id="sec-1-2">
        <p>This literature review explores the different uses of TensorFlow.js, a powerful JavaScript library, for creating real-time image classification and object detection systems. The focus is on comparing various dragonfly species by leveraging pre-trained models and custom datasets. TensorFlow.js offers a huge opportunity to run machine learning models directly in the browser, enabling the development of lightweight, real-time solutions without extensive server-side infrastructure, and offering better data privacy because the raw data are processed in the client’s browser.</p>
        <p>The main goal of this literature review is to contribute to the growing field of real-time wildlife monitoring through innovative AI techniques and to provide a clear path for implementing similar solutions in various ecological fields.</p>
        <p>This paper is organized as follows: the “Background” section introduces different AI technologies; we then compare them according to our needs and discuss the limitations of each; finally, we describe the current role of AI and possible improvements by looking at similar systems.</p>
      </sec>
    </sec>
    <sec id="sec-2">
      <title>1. Background</title>
      <p>
        Convolutional Neural Networks (CNNs) are a special class of deep neural networks created to handle grid
data structures, such as images, with remarkable success. One of the first works in CNNs dates to 1989. It
was the development of the LeNet-5 architecture, which paved the way for modern image classification
tasks [
        <xref ref-type="bibr" rid="ref23">25</xref>
        ].
      </p>
      <p>
        CNNs are applied in many domains, from medical image analysis [20] to autonomous driving systems [
        <xref ref-type="bibr" rid="ref20">22</xref>
        ]. CNNs have also been applied to classify and detect various animal species from images [
        <xref ref-type="bibr" rid="ref19">21</xref>
        ]. Their flexibility in recognizing diverse patterns has made CNNs the most widely used architecture for image classification tasks in both academic research and practical applications.
      </p>
      <p>TensorFlow.js is a JavaScript library developed by Google. It can be used for both training and deploying machine learning models in the browser, giving users access to machine learning features without the need for server-side infrastructure. This is very useful for real-time applications, such as the live monitoring of species, where fast data processing is a key requirement.</p>
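      <p>In such a browser-based pipeline, a classifier typically returns a vector of per-class probabilities that must be mapped back to species labels before results are shown to the user. A minimal, framework-free sketch of that post-processing step (the species names and scores here are illustrative, not taken from a real model):</p>

```javascript
// Map a classifier's probability vector to the top-k species labels.
// labels[i] names the class behind scores[i]; both arrays are
// hypothetical stand-ins for a real model's output layer.
function topK(labels, scores, k = 3) {
  return labels
    .map((label, i) => ({ label, score: scores[i] }))
    .sort((a, b) => b.score - a.score)
    .slice(0, k);
}

// Example: scores as a CNN's softmax output might look for one image.
const labels = ["Anax imperator", "Libellula depressa", "Sympetrum striolatum"];
const scores = [0.07, 0.81, 0.12];
const best = topK(labels, scores, 2);
// best[0].label is "Libellula depressa"
```

      <p>In a real application, the score vector would come from the model’s output tensor, read back into an ordinary JavaScript array before this step.</p>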
      <p>TensorFlow.js makes it possible to deploy pre-trained models as well as to train new models directly within the browser environment. In this way, TensorFlow.js significantly reduces latency and improves the user experience. It supports a variety of tasks, among them image classification, object detection, and natural language processing, making it one of the best tools for many types of application. One of the main advantages of using TensorFlow.js is that it can be integrated into new and existing web-based applications with minimal effort: web-based applications can easily incorporate real-time image classification and object detection functionalities without the need for an external server or expensive infrastructure. This yields lightweight applications for real-time environmental monitoring and wildlife tracking, as in the dragonfly species classification explored in this paper.</p>
      <p>
        TensorFlow.js also allows us to use models trained on other platforms for object detection tasks, e.g., MobileNet and COCO-SSD [
        <xref ref-type="bibr" rid="ref17">17</xref>
        ]. These models, trained on TensorFlow or other platforms, can be improved with the dragonfly image dataset through the technique of transfer learning [
        <xref ref-type="bibr" rid="ref24">26</xref>
        ], [
        <xref ref-type="bibr" rid="ref6">6</xref>
        ]. Moreover, the ability to train models directly within the browser gives the AI developer unique opportunities to interact with and customize the application.
      </p>
    </sec>
    <sec id="sec-3">
      <title>2. Comparison between TensorFlow.js and other Machine Learning Frameworks for Web-Based Applications</title>
      <p>TensorFlow.js is one of the most powerful frameworks for web-based applications but, as in every field, there are competitors, each with its own distinctive features. Below we summarize the differences among them, as well as their advantages and drawbacks.</p>
      <p>
        TensorFlow.js vs. TensorFlow (Python-based) - TensorFlow (the Python-based library) is practically the predecessor of TensorFlow.js [
        <xref ref-type="bibr" rid="ref1">1</xref>
        ]. It offers a comprehensive ecosystem with many tools, libraries, and pre-trained models. All these features make it one of the best-suited frameworks for complex machine learning tasks. TensorFlow offers a lot of flexibility and power, easing the training of large models and the handling of large datasets. However, it is a server-side framework: it requires sending information from the client to the server, processing it on the server side, and sending the result back to the client. As a result, it requires considerable bandwidth and exposes the end user to security concerns. On the other hand, TensorFlow.js’s ability to run directly in the browser without server-side processing provides enhanced security and requires significantly fewer network and server-side resources. This is perfect for applications that require real-time inference without reliance on a network connection.
      </p>
      <p>
        TensorFlow.js vs. PyTorch (Python-based) - PyTorch is another powerful deep learning framework. Its main strengths are its dynamic computational graph and its ease of use. While it offers competitive tools for training models, it is still a server-side framework [
        <xref ref-type="bibr" rid="ref14">14</xref>
        ]. This constraint forces developers to use a web-based framework such as ONNX, or simply to use PyTorch as a backend, thus incurring the same drawbacks as TensorFlow (Python-based). On the other hand, TensorFlow.js natively supports real-time deployment within the browser, which makes it more efficient for web-based applications that require low-latency predictions.
      </p>
      <p>
        TensorFlow.js vs. Keras.js - Another front-end library in this field is Keras.js [
        <xref ref-type="bibr" rid="ref18">18</xref>
        ]. It offers a lightweight solution for inference. The main drawback of this framework is that it cannot train models itself; instead, it relies on models pre-trained with Keras (Python-based). Meanwhile, TensorFlow.js is more versatile, since it supports both model training and inference. Furthermore, it can also use models pre-trained with TensorFlow (Python-based). This extended capability makes it far more competitive than Keras.js.
      </p>
      <p>
        TensorFlow.js vs. ONNX.js - ONNX.js is a JavaScript library for running ONNX models in browsers and on Node.js [
        <xref ref-type="bibr" rid="ref4">4</xref>
        ], [
        <xref ref-type="bibr" rid="ref5">5</xref>
        ]. It has adopted WebAssembly and WebGL technologies to provide an optimized ONNX model inference runtime for both CPUs and GPUs. With ONNX.js, web developers can run pre-trained ONNX models directly in browsers, with benefits including reduced server-client communication, protection of user privacy, and an install-free, cross-platform in-browser ML experience. The main drawback compared with TensorFlow.js is that it cannot train new models on the web. In addition, the lack of active maintenance is a downside, because this library is in the process of being replaced by the “ONNX Runtime Web” library.
      </p>
    </sec>
    <sec id="sec-5">
      <title>3. Existing Methods for Species Identification and Their Limitations</title>
      <p>Species identification is a challenging task. It is crucial in ecological studies, especially for biodiversity monitoring, conservation efforts, and environmental protection. Manual methods for species identification are derived from field-based knowledge: they require experts or taxonomists to visually inspect different species and classify them based on physical characteristics, behavior, or other distinguishing features. Manual identification relies on one of the following techniques.</p>
      <p>Field Observations - This method requires the expert to observe species in the wild, and assumes that experts have sharp knowledge of the visual characteristics, behavior, and ecology of the species concerned. In some situations, for specific species, it can be effective, but it is time-consuming. It is also limited by the physical location of the habitat and its level of accessibility. Because of these limitations, the study by McClinton et al. (2022) uses both field observations (on-the-ground data) and remote sensing tools to identify the most significant threats to critically endangered or rare plant species in Nevada.</p>
      <p>
        Morphological Identification - This technique focuses on physical characteristics, including (but not limited to) size, color, and shape. [
        <xref ref-type="bibr" rid="ref21">23</xref>
        ] developed a convolutional neural network method in which morphological and molecular data for species identification are integrated into the morphology-molecule network (MMNet). Morphological identification is extremely difficult, especially for species that mimic others, for species at different life stages, and, as has been shown, for closely related species. All of these complications can make this methodology infeasible in the field.
      </p>
      <p>
        Taxonomic Keys - These are dichotomous keys that allow the user to determine the identity of items
using a sequence of alternative choices [
        <xref ref-type="bibr" rid="ref2">2</xref>
        ]. Dichotomous keys always give two mutually exclusive choices
in parallel statements. The user makes a choice about a particular characteristic of an organism and is led
to a new branch or couplet of the key. This technique is complex, especially for non-experts and
field amateurs.
      </p>
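      <p>Because a dichotomous key is a sequence of binary choices, it maps naturally onto a small tree data structure, which is also a useful way to picture the decision process an automated classifier replaces. The sketch below uses invented, drastically simplified couplets purely for illustration, not a real key:</p>

```javascript
// A couplet is either a leaf ({ species }) or a question with two
// branches. These couplets are invented for illustration only.
const key = {
  question: "Are the wings held outstretched at rest?",
  yes: {
    question: "Is the abdomen broad and flattened?",
    yes: { species: "Libellula depressa" },
    no: { species: "Sympetrum striolatum" },
  },
  no: { species: "Lestes sponsa" },
};

// Walk the key: each answer (true/false) selects a branch until a
// leaf naming a species is reached.
function identify(node, answers) {
  while (!node.species) {
    node = answers.shift() ? node.yes : node.no;
  }
  return node.species;
}

const result = identify(key, [true, false]);
// result is "Sympetrum striolatum"
```

      <p>Each answer narrows the search by one couplet, mirroring how a user works through a printed key, and also why the process is slow and error-prone for non-experts.</p>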
      <p>Despite constant improvements that are leading to better and faster classification, these techniques have proven to be prone to errors and are overall slow. Another huge limitation is the level of expertise required for the task. All these limitations result in a lack of scalability and efficiency.</p>
    </sec>
    <sec id="sec-6">
      <title>4. Role of Machine Learning in Biodiversity Monitoring</title>
      <p>
        In recent years, machine learning (ML) has dramatically improved species identification, offering a highly performant process [
        <xref ref-type="bibr" rid="ref2">2</xref>
        ]. Another benefit is that it scales very easily. The latest improvements in ML also offer a higher level of accuracy than traditional methods, and have enabled large-scale biodiversity monitoring by significantly reducing costs. Applying machine learning to biodiversity monitoring involves various techniques and areas, including image recognition and classification, audio analysis, and environmental data processing.
      </p>
    </sec>
    <sec id="sec-7">
      <title>4.1 Advantages of ML for Biodiversity Monitoring</title>
      <p>Image Recognition and Classification - One of the most useful applications of machine learning is the use of Convolutional Neural Networks (CNNs) for image-based identification, a capability that also applies to species identification. The application of CNNs in combination with camera traps, drones, or smartphones has been very useful for image-based species identification. Based on these techniques, many models have been designed, such as ResNet, Inception, and MobileNet. They offer automatic species identification in large datasets that would be impossible to process and analyze manually.</p>
      <p>
        Object Detection - Image classification is very useful, but it is not enough. Object detection models such as YOLO (You Only Look Once) and Faster R-CNN go beyond image classification: they can distinguish multiple species in a single image and precisely locate each of them within it. This data can be used for further biological analysis of animal behavior, population density per square meter, or biodiversity distribution [
        <xref ref-type="bibr" rid="ref15">15</xref>
        ].
      </p>
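      <p>Models of this kind emit a list of candidate boxes with confidence scores, and downstream ecological analysis usually keeps only confident, non-overlapping detections. A small, framework-free sketch of that filtering step (the bbox/class/score shape follows the convention used by common detectors; the thresholds and example detections are illustrative):</p>

```javascript
// Boxes are [x, y, width, height]; each detection follows the common
// { bbox, class, score } shape emitted by off-the-shelf detectors.
function iou(a, b) {
  const x1 = Math.max(a[0], b[0]);
  const y1 = Math.max(a[1], b[1]);
  const x2 = Math.min(a[0] + a[2], b[0] + b[2]);
  const y2 = Math.min(a[1] + a[3], b[1] + b[3]);
  const inter = Math.max(0, x2 - x1) * Math.max(0, y2 - y1);
  const union = a[2] * a[3] + b[2] * b[3] - inter;
  return union === 0 ? 0 : inter / union;
}

// Keep detections scoring at least minScore, greedily dropping any
// box that overlaps an already-kept, higher-scoring box too much.
function filterDetections(dets, minScore = 0.5, maxIou = 0.5) {
  const kept = [];
  for (const d of [...dets].sort((a, b) => b.score - a.score)) {
    if (d.score >= minScore) {
      if (kept.every((k) => maxIou >= iou(k.bbox, d.bbox))) {
        kept.push(d);
      }
    }
  }
  return kept;
}

const detections = [
  { bbox: [10, 10, 50, 50], class: "dragonfly", score: 0.9 },
  { bbox: [12, 12, 50, 50], class: "dragonfly", score: 0.6 }, // near-duplicate
  { bbox: [200, 40, 30, 30], class: "dragonfly", score: 0.3 }, // low confidence
];
const confident = filterDetections(detections);
// confident contains only the 0.9-score detection
```

      <p>The intersection-over-union test is the standard overlap measure behind non-maximum suppression, the step that collapses near-duplicate boxes around the same animal before any density or distribution analysis.</p>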
      <p>
        Environmental Data Integration - Machine learning can be used to predict species distribution and monitor biodiversity over time based on environmental changes. These environmental changes can include (but are not limited to) climate change, soil pollution, and vegetation type. Techniques such as Random Forests and Gradient Boosting are used to develop models that predict species frequency or presence based on environmental factors [
        <xref ref-type="bibr" rid="ref3">3</xref>
        ].
      </p>
      <p>By using a combination of machine learning and existing ecological data, we could achieve some
advantages as set out below:</p>
      <p>Scalability - Machine learning models do not have the limitations faced by human experts. They can process huge datasets and examine millions of images or other electronic records far faster and without fatigue. This makes it possible to monitor large areas containing many species.</p>
      <p>Automation - Machine learning algorithms can offer real-time species identification, which enables automatic selection: using camera traps or live audio recordings, we can instantly gather results from the field about the species we are interested in.</p>
      <p>Accessibility - By simplifying the selection process through machine learning, we can also involve non-expert but enthusiastic naturalists. This is important when many citizens are invited to contribute to monitoring their local environment.</p>
      <p>Improved Accuracy - Machine learning models, when properly trained and carefully tuned on real-life datasets, especially considerably large ones, can offer astonishing levels of accuracy in species identification, sometimes even surpassing field experts.</p>
      <p>That being said, while machine learning in biodiversity can be a very powerful tool, there are also some
limitations which include:</p>
      <p>Data Quality and Quantity - Machine learning models must be provided with large, high-quality datasets for training. In some cases, such datasets may be difficult to obtain, which can jeopardize the creation of models for new or rare species [19].</p>
      <p>Model Generalization - A model trained on data from a specific environment may not be useful for new environments or for similar species beyond its particular dataset. This limitation requires models to be frequently re-tuned with new data.</p>
      <p>Bias in Training Data - In some cases, the training data is biased with respect to habitat or environmental conditions. Machine learning models then learn those biases, which can lead to incorrect classifications for the species concerned.</p>
      <p>One well-known tool for developing, training, and deploying machine learning models in web browsers is TensorFlow.js. It is built by the same team that previously built the TensorFlow (Python-based) library, which is why they share so many features. However, they have their differences too. TensorFlow supports a number of programming languages, including Python and Java. TensorFlow.js is built only on JavaScript, so it can be used widely via the web browser. This makes it a perfect framework for real-time machine learning applications and enables us to empower environmental monitoring and ecological research.</p>
      <p>Real-Time Inference - The capability to operate on the front-end, offering real-time inference without the need for back-end server support, makes it one of the best tools in this field. Running machine learning models directly on the front-end eliminates the latency incurred by back-end-based competitors, offering immediate feedback from machine learning predictions. It is this feature that gives TensorFlow.js a huge advantage in ecological monitoring, where data captured by cameras can be processed in real time.</p>
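      <p>One practical consequence of real-time, in-browser inference is that a camera delivers frames faster than a model can classify them, so live pipelines usually skip frames rather than queue them. A framework-free sketch of that pacing logic (the 200 ms interval is an arbitrary example, not a recommendation):</p>

```javascript
// Gate inference so that at most one frame is classified every
// intervalMs milliseconds; other frames are skipped, not queued.
function makeFrameGate(intervalMs) {
  let last = -Infinity;
  return (timestampMs) => {
    if (timestampMs - last >= intervalMs) {
      last = timestampMs; // run the model on this frame
      return true;
    }
    return false; // skip this frame
  };
}

// Example: a 60 fps camera (a frame every ~16 ms) gated to ~5 inferences/s.
const gate = makeFrameGate(200);
const frames = [0, 16, 32, 200, 216, 400];
const ran = frames.filter(gate);
// ran is [0, 200, 400]
```

      <p>In a browser application, the gate would be consulted inside the per-frame callback, and only gated frames would be handed to the model, keeping the interface responsive.</p>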
    </sec>
    <sec id="sec-8">
      <title>4.2 TensorFlow.js Approach</title>
      <p>Browser-Based Deployment - The main advantage of TensorFlow.js is that it can run machine learning models directly in the web browser or on Node.js. This makes it usable on almost all mobile devices with their own operating system, such as smartphones and other portable devices. It eliminates the need for cutting-edge back-end infrastructure such as the cloud and makes it a perfect tool for running machine learning models on devices with limited resources. TensorFlow.js enables lightweight web applications with real-time processing, accessible on any device with a web browser. This factor is the main reason for its huge expansion in ecological monitoring projects.</p>
      <p>No Server-Side Dependency - TensorFlow.js runs on the client side, so it does not require a cutting-edge back-end environment; it simply needs an ordinary back-end to host it as a normal web application. All calculations are made on the client side, making it in practice a real-time, low-latency application. This is extremely beneficial in scenarios where data is generated and manipulated in the field, which perfectly matches our case. It also keeps maintenance and operational costs extremely low, because many small devices can be used to minimize server bottlenecks.</p>
      <p>Ease of Integration with Web Technologies - TensorFlow.js is, in practice, a JavaScript library, so it integrates seamlessly with other JavaScript libraries, HTML5, and CSS3. This makes it possible for developers to easily incorporate machine learning models into web-based interfaces and to create intuitive applications to interact with. For example, in ecological studies where data captured in the field must be processed locally and the results sent back to the server, TensorFlow.js offers a platform for developing real-time applications.</p>
      <p>Support for Pre-Trained Models - TensorFlow.js is created by the same team that created TensorFlow (Python-based), and models created with TensorFlow are also compatible with TensorFlow.js. Therefore, pre-trained models (such as MobileNet, COCO-SSD, or PoseNet) that are already optimized can easily be used in TensorFlow.js by simply importing them into the web browser.</p>
      <p>Cross-Platform Compatibility - A wide range of platforms support TensorFlow.js, from desktop browsers to smartphones, allowing such applications to be used across diverse environments. This feature is crucial for ecological research, as it makes it possible to use the system both in field settings and under laboratory conditions.</p>
    </sec>
    <sec id="sec-9">
      <title>5. Applicability of TensorFlow.js with Biodiversity Monitoring</title>
      <p>Artificial intelligence applications, especially machine learning frameworks built on TensorFlow.js for real-time ecological research, are becoming the standard nowadays. Next, we take a look at some interesting uses of machine learning for monitoring biodiversity and identifying species.</p>
      <p>Camera Trap Species Identification - One notable work in this field is that of [19], who applied deep learning algorithms to camera trap images. The purpose was to identify wildlife species in real time, much the same idea that we aim to achieve in our work. Their system automatically classifies animals directly from camera traps, eliminating the need for expert classification. TensorFlow.js can be used in such scenarios to power a browser-based system with real-time identification from camera trap images. This speeds up the process and makes research faster and more economically feasible than traditional methods.</p>
      <p>Insect Species Detection - When it comes to insect species detection, the identification process is far more difficult due to the insects’ small size and similar appearance across species. A combination of machine learning models and high-resolution images can achieve this goal. TensorFlow.js is a very powerful tool for implementing this kind of real-time system in mobile applications or web platforms, giving researchers the ability to collect and analyze data in the field and thus contributing to a more accurate assessment of biodiversity.</p>
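      <p>Once such a field system classifies individual images, the per-image results are usually aggregated before being stored or sent back to a server. A minimal sketch of that aggregation step (the observation shape and the confidence threshold mirror a typical classifier output but are illustrative assumptions):</p>

```javascript
// Aggregate a session's classified observations into per-species
// counts, the kind of summary a field app might send to a server.
// Each observation mirrors a classifier's { label, score } output.
function tallySpecies(observations, minScore = 0.5) {
  const counts = {};
  for (const obs of observations) {
    if (obs.score >= minScore) {
      counts[obs.label] = (counts[obs.label] ?? 0) + 1;
    }
  }
  return counts;
}

const session = [
  { label: "Anax imperator", score: 0.92 },
  { label: "Anax imperator", score: 0.71 },
  { label: "Libellula depressa", score: 0.44 }, // below threshold, ignored
  { label: "Libellula depressa", score: 0.88 },
];
const counts = tallySpecies(session);
// counts: { "Anax imperator": 2, "Libellula depressa": 1 }
```

      <p>A front-end confirmation step, like the store-confirmation button described in the abstract, could then let the user review such counts before they are persisted.</p>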
    </sec>
    <sec id="sec-10">
      <title>6. Conclusion</title>
      <p>TensorFlow.js is a powerful tool for creating real-time applications for ecological research. It is especially valuable in species identification and biodiversity monitoring. Its ability to power in-browser machine learning, without the need for cutting-edge server-side infrastructure, makes it a very popular tool for real-time applications. The case studies presented here clearly show the power of machine learning in ecological research, especially for automated species identification in real time. As more and more ecological research projects use web-based technologies in combination with machine learning, an exponential growth of TensorFlow.js usage can be expected, making the entire process more accessible, scalable, and efficient. Additionally, the veteran tool in this field, TensorFlow (Python-based), remains a consolidated solution: since it runs on the server side, it can be used for more complex tasks such as high-resolution images with complicated features. The accuracy of TensorFlow.js and TensorFlow (Python-based) is identical, as both have been built by the same team; that being said, the Python-based version can run faster because it is generally backed by more powerful hardware infrastructure, while the advantage of TensorFlow.js is its low latency. As system availability improves and models become more and more accurate, this technology is expected to play a pivotal role in global conservation efforts and ecological research.</p>
    </sec>
    <sec id="sec-11">
      <title>Declaration on Generative AI</title>
      <p>During the preparation of this work, the author(s) used X-GPT-4 and Gramby in order to: Grammar and
spelling check. Further, the author(s) used X-AI-IMG for figures 3 and 4 in order to: Generate images.
After using these tool(s)/service(s), the author(s) reviewed and edited the content as needed and take(s)
full responsibility for the publication’s content.</p>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>
          [1]
          <string-name>
            <surname>Abadi</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Agarwal</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Barham</surname>
            ,
            <given-names>P.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Brevdo</surname>
            ,
            <given-names>E.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Chen</surname>
            ,
            <given-names>Z.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Citro</surname>
            ,
            <given-names>C.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Corrado</surname>
            ,
            <given-names>G.S.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Davis</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Dean</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Devin</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Ghemawat</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Goodfellow</surname>
            ,
            <given-names>I.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Harp</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Irving</surname>
            ,
            <given-names>G.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Isard</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Jia</surname>
            ,
            <given-names>Y.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Jozefowicz</surname>
            ,
            <given-names>R.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Kaiser</surname>
            ,
            <given-names>L.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Kudlur</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Levenberg</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Mane</surname>
            ,
            <given-names>D.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Monga</surname>
            ,
            <given-names>R.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Moore</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Murray</surname>
            ,
            <given-names>D.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Olah</surname>
            ,
            <given-names>C.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Schuster</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Shlens</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Steiner</surname>
            ,
            <given-names>B.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Sutskever</surname>
            ,
            <given-names>I.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Talwar</surname>
            ,
            <given-names>K.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Tucker</surname>
            ,
            <given-names>P.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Vanhoucke</surname>
            ,
            <given-names>V.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Vasudevan</surname>
            ,
            <given-names>V.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Viegas</surname>
            ,
            <given-names>F.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Vinyals</surname>
            ,
            <given-names>O.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Warden</surname>
            ,
            <given-names>P.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Wattenberg</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Wicke</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Yu</surname>
            ,
            <given-names>Y.</given-names>
          </string-name>
          , &amp;
          <string-name>
            <surname>Zheng</surname>
            ,
            <given-names>X.</given-names>
          </string-name>
          (
          <year>2016</year>
          ).
<article-title>TensorFlow: Large-scale machine learning on heterogeneous distributed systems</article-title>
          .
          <source>Proceedings of the 12th USENIX Symposium on Operating Systems Design and Implementation (OSDI), November 2-4, 2016, Savannah, GA, USA</source>
          . arXiv preprint arXiv:1603.04467.
        </mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>
          [2]
          <string-name>
            <surname>Dalton</surname>
            ,
            <given-names>D.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Berger</surname>
            ,
            <given-names>V.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Kirchmeir</surname>
            ,
            <given-names>H.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Adams</surname>
            ,
            <given-names>V.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Botha</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Halloy</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Hart</surname>
            ,
            <given-names>R.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Švara</surname>
            ,
            <given-names>V.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Torres Ribeiro</surname>
            ,
            <given-names>K.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Chaudhary</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          , &amp;
          <string-name>
            <surname>Jungmeier</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          (
          <year>2024</year>
          ).
          <article-title>A framework for monitoring biodiversity in protected areas and other effective area-based conservation measures: Concepts, methods and technologies</article-title>
          .
          <source>IUCN WCPA Technical Report Series No. 7</source>
          . Gland, Switzerland: IUCN.
        </mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>
          [3]
          <string-name>
            <surname>Elith</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Phillips</surname>
            ,
            <given-names>S.J.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Hastie</surname>
            ,
            <given-names>T.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Dudík</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Chee</surname>
            ,
            <given-names>Y.E.</given-names>
          </string-name>
          , &amp;
          <string-name>
            <surname>Yates</surname>
            ,
            <given-names>C.J.</given-names>
          </string-name>
          (
          <year>2011</year>
          ).
          <article-title>A statistical explanation of MaxEnt for ecologists</article-title>
          .
          <source>Diversity and Distributions</source>
          ,
          <volume>17</volume>
          :
          <fpage>43</fpage>
          -
          <lpage>57</lpage>
          . https://doi.org/10.1111/j.1472-4642.2010.00725.x
        </mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>
          [4]
          <string-name>
            <surname>Goh</surname>
            ,
            <given-names>H.A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Ho</surname>
            ,
            <given-names>C.K.</given-names>
          </string-name>
          , &amp;
          <string-name>
            <surname>Abas</surname>
            ,
            <given-names>F.S.</given-names>
          </string-name>
          (
          <year>2023</year>
          ).
          <article-title>Front-end deep learning web apps development and deployment: a review</article-title>
          .
          <source>Applied Intelligence</source>
          . Dordrecht, Netherlands,
          <volume>53</volume>
          (
          <issue>12</issue>
          ),
          <fpage>15923</fpage>
          -
          <lpage>15945</lpage>
          . https://doi.org/10.1007/s10489-022-04278-6
        </mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>
          [5]
          <string-name>
            <surname>He</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Zhou</surname>
            ,
            <given-names>C.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Ma</surname>
            ,
            <given-names>X.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Berg-Kirkpatrick</surname>
            ,
            <given-names>T.</given-names>
          </string-name>
          , &amp;
          <string-name>
            <surname>Neubig</surname>
            ,
            <given-names>G.</given-names>
          </string-name>
          (
          <year>2021</year>
          ).
          <article-title>Towards a Unified View of Parameter-Efficient Transfer Learning</article-title>
          . arXiv preprint arXiv:2110.04366.
        </mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>
          [6]
          <string-name>
            <surname>Hosna</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Merry</surname>
            ,
            <given-names>E.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Gyalmo</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Alom</surname>
            ,
            <given-names>Z.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Aung</surname>
            ,
            <given-names>Z.</given-names>
          </string-name>
          , &amp;
          <string-name>
            <surname>Azim</surname>
            ,
            <given-names>M.A.</given-names>
          </string-name>
          (
          <year>2022</year>
          ).
          <article-title>Transfer learning: a friendly introduction</article-title>
          .
          <source>Journal of Big Data</source>
          ,
          <volume>9</volume>
          ,
          <fpage>102</fpage>
          . https://doi.org/10.1186/s40537-022-00652-w
        </mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>
          [7]
          <string-name>
            <surname>Kika</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Dashi</surname>
            ,
            <given-names>F.</given-names>
          </string-name>
          , &amp;
          <string-name>
            <surname>Alimehmeti</surname>
            ,
            <given-names>R.</given-names>
          </string-name>
          (
          <year>2023</year>
          ).
          <article-title>Graph Convolutional Network Approach for Predicting the Severity of Carpal Tunnel Syndrome</article-title>
          .
          <source>Transactions on Engineering and Computing Sciences</source>
          ,
          <volume>11</volume>
          (
          <issue>5</issue>
          ),
          <fpage>38</fpage>
          -
          <lpage>44</lpage>
          . https://doi.org/10.14738/tecs.115.15441
        </mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>
          [8]
          <string-name>
            <surname>Koroveshi</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          &amp;
          <string-name>
            <surname>Ktona</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          (
          <year>2021</year>
          ).
          <article-title>Training an intelligent tutoring system using reinforcement learning</article-title>
          .
          <source>International Journal of Computer Science and Information Security</source>
          ,
          <volume>19</volume>
          (
          <issue>3</issue>
          ),
          <fpage>10</fpage>
          -
          <lpage>18</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref9">
        <mixed-citation>
          [9]
          <string-name>
            <surname>Ktona</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Mitre</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Shehu</surname>
            ,
            <given-names>D.</given-names>
          </string-name>
          , &amp;
          <string-name>
            <surname>Xhaja</surname>
            ,
            <given-names>D.</given-names>
          </string-name>
          (
          <year>2022</year>
          ).
          <article-title>Support allergic patients, using models found by machine learning algorithms, to improve their quality of life</article-title>
          .
          <source>International Journal of Intelligent Systems and Applications Engineering</source>
          ,
          <volume>10</volume>
          (
          <issue>4</issue>
          ),
          <fpage>512</fpage>
          -
          <lpage>517</lpage>
          . https://www.scopus.com/sourceid/21101021990
        </mixed-citation>
      </ref>
      <ref id="ref10">
        <mixed-citation>
          [10]
          <string-name>
            <surname>Ktona</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Muça</surname>
            ,
            <given-names>E.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Çollaku</surname>
            ,
            <given-names>D.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Shahini</surname>
            ,
            <given-names>I.</given-names>
          </string-name>
          , &amp;
          <string-name>
            <surname>Boboli</surname>
            ,
            <given-names>I.</given-names>
          </string-name>
          (
          <year>2023</year>
          ).
          <article-title>Support the creation of appropriate tourism offers by finding a model, using machine learning algorithms, to forecast spending by tourists</article-title>
          .
          <source>International Journal of Technology Marketing (IJTMKT)</source>
          ,
          <volume>17</volume>
          (
          <issue>1</issue>
          ),
          <fpage>1</fpage>
          -
          <lpage>10</lpage>
          . https://doi.org/10.1504/IJTMKT.2022.10048142
        </mixed-citation>
      </ref>
      <ref id="ref11">
        <mixed-citation>
          [11]
          <string-name>
            <surname>Liu</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Yan</surname>
            ,
            <given-names>H.-F.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Newmaster</surname>
            ,
            <given-names>S.G.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Pei</surname>
            ,
            <given-names>N.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Ragupathy</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Ge</surname>
            ,
            <given-names>X.J.</given-names>
          </string-name>
          &amp;
          <string-name>
            <surname>Lowe</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          (
          <year>2015</year>
          ).
          <article-title>The use of DNA barcoding as a tool for the conservation biogeography of subtropical forests in China</article-title>
          .
          <source>Diversity and Distributions</source>
          ,
          <volume>21</volume>
          (
          <issue>2</issue>
          ),
          <fpage>188</fpage>
          -
          <lpage>199</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref12">
        <mixed-citation>
          [12]
          <string-name>
            <surname>McClinton</surname>
            ,
            <given-names>J.D.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Kulpa</surname>
            ,
            <given-names>S.M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Grames</surname>
            ,
            <given-names>E.M.</given-names>
          </string-name>
          , &amp;
          <string-name>
            <surname>Leger</surname>
            ,
            <given-names>E.A.</given-names>
          </string-name>
          (
          <year>2022</year>
          ).
          <article-title>Field observations and remote assessment identify climate change, recreation, invasive species, and livestock as top threats to critically imperiled rare plants in Nevada</article-title>
          .
          <source>Frontiers in Conservation Science</source>
          ,
          <volume>3</volume>
          . https://doi.org/10.3389/fcosc.2022.1070490
        </mixed-citation>
      </ref>
      <ref id="ref13">
        <mixed-citation>
          [13]
          <string-name>
            <surname>Olawade</surname>
            ,
            <given-names>D.B</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Wada</surname>
            ,
            <given-names>O.Z</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Ige</surname>
            ,
            <given-names>A.O.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Egbewole</surname>
            ,
            <given-names>B.I.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Olojo</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          , &amp;
          <string-name>
            <surname>Oladapo</surname>
            ,
            <given-names>B.</given-names>
          </string-name>
          (
          <year>2024</year>
          ).
          <article-title>Artificial Intelligence in Environmental Monitoring: Advancements, Challenges, and Future Directions</article-title>
          .
          <source>Hygiene and Environmental Health Advances</source>
          ,
          <volume>12</volume>
          ,
          <fpage>100114</fpage>
          . https://doi.org/10.1016/j.heha.2024.100114
        </mixed-citation>
      </ref>
      <ref id="ref14">
        <mixed-citation>
          [14]
          <string-name>
            <surname>Paszke</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Gross</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          , &amp;
          <string-name>
            <surname>Chintala</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          (
          <year>2019</year>
          ).
          <article-title>PyTorch: An Imperative Style, High-Performance Deep Learning Library</article-title>
          .
          <source>Advances in Neural Information Processing Systems (NeurIPS)</source>
          .
        </mixed-citation>
      </ref>
      <ref id="ref15">
        <mixed-citation>
          [15]
          <string-name>
            <surname>Redmon</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Divvala</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Girshick</surname>
            ,
            <given-names>R.</given-names>
          </string-name>
          , &amp;
          <string-name>
            <surname>Farhadi</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          (
          <year>2016</year>
          ).
          <article-title>You Only Look Once: Unified, Real-Time Object Detection</article-title>
          .
          <source>2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR)</source>
          , Las Vegas, NV, USA,
          <fpage>779</fpage>
          -
          <lpage>788</lpage>
          . https://doi.org/10.1109/CVPR.2016.91
        </mixed-citation>
      </ref>
      <ref id="ref16">
        <mixed-citation>
          [16]
          <string-name>
            <surname>Reynolds</surname>
            ,
            <given-names>S.A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Beery</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Burgess</surname>
            ,
            <given-names>N.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Burgman</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Butchart</surname>
            ,
            <given-names>S.H.M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Cooke</surname>
            ,
            <given-names>S.J.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Coomes</surname>
            ,
            <given-names>D.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Danielsen</surname>
            ,
            <given-names>F.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Di Minin</surname>
            ,
            <given-names>E.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Durán</surname>
            ,
            <given-names>A.P.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Gassert</surname>
            ,
            <given-names>F.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Hinsley</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Jaffer</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Jones</surname>
            ,
            <given-names>J.P.G.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Li</surname>
            ,
            <given-names>B.V.</given-names>
          </string-name>
          ,
          <string-name>
            <given-names>Mac</given-names>
            <surname>Aodha</surname>
          </string-name>
          ,
          <string-name>
            <given-names>O.</given-names>
            ,
            <surname>Madhavapeddy</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            ,
            <surname>O'Donnell</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.A.L.</given-names>
            ,
            <surname>Oxbury</surname>
          </string-name>
          ,
          <string-name>
            <given-names>W.M.</given-names>
            ,
            <surname>Peck</surname>
          </string-name>
          ,
          <string-name>
            <given-names>L.</given-names>
            ,
            <surname>Pettorelli</surname>
          </string-name>
          ,
          <string-name>
            <given-names>N.</given-names>
            ,
            <surname>Rodríguez</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.P.</given-names>
            ,
            <surname>Shuckburgh</surname>
          </string-name>
          ,
          <string-name>
            <given-names>E.</given-names>
            ,
            <surname>Strassburg</surname>
          </string-name>
          ,
          <string-name>
            <given-names>B.</given-names>
            ,
            <surname>Yamashita</surname>
          </string-name>
          ,
          <string-name>
            <given-names>H.</given-names>
            ,
            <surname>Miao</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Z.</given-names>
            , &amp;
            <surname>Sutherland</surname>
          </string-name>
          ,
          <string-name>
            <surname>W.J.</surname>
          </string-name>
          (
          <year>2025</year>
          ).
          <article-title>The potential for AI to revolutionize conservation: A horizon scan</article-title>
          .
          <source>Trends in Ecology &amp; Evolution</source>
          ,
          <volume>40</volume>
          (
          <issue>2</issue>
          ),
          <fpage>191</fpage>
          -
          <lpage>207</lpage>
          . https://doi.org/10.1016/j.tree.2024.11.013
        </mixed-citation>
      </ref>
      <ref id="ref17">
        <mixed-citation>
          [17]
          <string-name>
            <surname>Rivera</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          (
          <year>2020</year>
          ).
          <source>Practical TensorFlow.js: Deep Learning in Web App Development</source>
          . https://doi.org/10.1007/978-1-4842-6273-3
        </mixed-citation>
      </ref>
      <ref id="ref18">
        <mixed-citation>
          [18]
          <string-name>
            <surname>Smilkov</surname>
            ,
            <given-names>D.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Thorat</surname>
            ,
            <given-names>N.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Assogba</surname>
            ,
            <given-names>Y.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Nicholson</surname>
            ,
            <given-names>C.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Kreeger</surname>
            ,
            <given-names>N.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Yu</surname>
            ,
            <given-names>P.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Zhang</surname>
            ,
            <given-names>K.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Cai</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          , &amp;
          <string-name>
            <surname>Nielsen</surname>
            ,
            <given-names>E.</given-names>
          </string-name>
          (
          <year>2019</year>
          ).
          <article-title>TensorFlow.js: Machine Learning for the Web and Beyond</article-title>
          .
          <source>Proceedings of Machine Learning and Systems (SysML)</source>
          .
        </mixed-citation>
      </ref>
      <ref id="ref19">
        <mixed-citation>
          [21]
          <string-name>
            <surname>Tannous</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Stefanini</surname>
            ,
            <given-names>C.</given-names>
          </string-name>
          , &amp;
          <string-name>
            <surname>Romano</surname>
            ,
            <given-names>D.</given-names>
          </string-name>
          (
          <year>2023</year>
          ).
          <article-title>A Deep-Learning-Based Detection Approach for the Identification of Insect Species of Economic Importance</article-title>
          .
          <source>Insects</source>
          ,
          <volume>14</volume>
          (
          <issue>2</issue>
          ):
          <fpage>148</fpage>
          . https://doi.org/10.3390/insects14020148
        </mixed-citation>
      </ref>
      <ref id="ref20">
        <mixed-citation>
          [22]
          <string-name>
            <surname>Tran</surname>
            ,
            <given-names>D.A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Fischer</surname>
            ,
            <given-names>P.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Smajic</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          , &amp;
          <string-name>
            <surname>So</surname>
            ,
            <given-names>Y.</given-names>
          </string-name>
          (
          <year>2021</year>
          ).
          <article-title>Real-time Object Detection for Autonomous Driving using Deep Learning</article-title>
          .
        </mixed-citation>
      </ref>
      <ref id="ref21">
        <mixed-citation>
          [23]
          <string-name>
            <surname>Yang</surname>
            ,
            <given-names>B.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Zhang</surname>
            ,
            <given-names>Z.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Yang</surname>
            ,
            <given-names>C-Q.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Wang</surname>
            ,
            <given-names>Y.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Orr</surname>
            ,
            <given-names>M.C.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Wang</surname>
            ,
            <given-names>H.</given-names>
          </string-name>
          , &amp;
          <string-name>
            <surname>Zhang</surname>
            ,
            <given-names>A.B.</given-names>
          </string-name>
          (
          <year>2022</year>
          ).
          <article-title>Identification of Species by Combining Molecular and Morphological Data Using Convolutional Neural Networks</article-title>
          .
          <source>Systematic Biology</source>
          ,
          <volume>71</volume>
          (
          <issue>3</issue>
          ), May,
          <fpage>690</fpage>
          -
          <lpage>705</lpage>
          . https://doi.org/10.1093/sysbio/syab076
        </mixed-citation>
      </ref>
      <ref id="ref22">
        <mixed-citation>
          [24]
          <string-name>
            <surname>Xhina</surname>
            ,
            <given-names>E.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Bilo</surname>
            ,
            <given-names>I.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Ktona</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Paparisto</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Liçi</surname>
            ,
            <given-names>O.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Qirinxhi</surname>
            ,
            <given-names>X.</given-names>
          </string-name>
          , &amp;
          <string-name>
            <surname>Haxhiu</surname>
            ,
            <given-names>D.</given-names>
          </string-name>
          (
          <year>2023</year>
          ).
          <article-title>Machine learning methods for understanding biodiversity and its conservation</article-title>
          .
          <source>International Journal of Ecosystems and Ecology Science (IJEES)</source>
          ,
          <volume>13</volume>
          (
          <issue>3</issue>
          ),
          <fpage>39</fpage>
          -
          <lpage>44</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref23">
        <mixed-citation>
          [25]
          <string-name>
            <surname>LeCun</surname>
            ,
            <given-names>Y.</given-names>
          </string-name>
          (
          <year>1989</year>
          ).
          <article-title>Generalization and network design strategies</article-title>
          .
          <source>Technical Report CRG-TR-89-4</source>
          , Department of Computer Science, University of Toronto.
        </mixed-citation>
      </ref>
      <ref id="ref24">
        <mixed-citation>
          [26]
          <string-name>
            <surname>Zhuang</surname>
            ,
            <given-names>F.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Qi</surname>
            ,
            <given-names>Z.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Duan</surname>
            ,
            <given-names>K.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Xi</surname>
            ,
            <given-names>D.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Zhu</surname>
            ,
            <given-names>Y.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Zhu</surname>
            ,
            <given-names>H.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Xiong</surname>
            ,
            <given-names>H.</given-names>
          </string-name>
          , &amp;
          <string-name>
            <surname>He</surname>
            ,
            <given-names>Q.</given-names>
          </string-name>
          (
          <year>2020</year>
          ).
          <article-title>A Comprehensive Survey on Transfer Learning</article-title>
          .
          <source>Proceedings of the IEEE</source>
          ,
          <fpage>1</fpage>
          -
          <lpage>34</lpage>
          , https://doi.org/10.1109/JPROC.2020.3004555
        </mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>