<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta />
    <article-meta>
      <title-group>
        <article-title>Computer vision system for fire detection and report using UAVs</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Pablo Chamoso</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Alfonso González-Briones</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Fernando De La Prieta</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Juan M. Corchado</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <aff id="aff0">
          <label>0</label>
          <institution>BISITE Digital Innovation Hub, University of Salamanca.</institution>
          <addr-line>Calle Espejo 2, 37007 Salamanca</addr-line>,
          <country country="ES">Spain</country>
        </aff>
      </contrib-group>
      <abstract>
        <p>Continuous technological progress has led to great changes in our society. UAVs (Unmanned Aerial Vehicles), commonly known as drones, are one of the most significant technological advancements of the last decade. UAVs offer a wide variety of new possibilities and have become a tool that is used on an everyday basis. They can be used in fire control due to their ability to manoeuvre rapidly and their wide range of operation. This article presents a review of the main uses of UAVs in combating fire. Special emphasis is placed on fire detection techniques based on computer vision and infrared imaging, as well as on the hardware systems that drones must incorporate to perform this task. The article also presents a simple proposal for fire detection and alerting using UAVs and computer vision.</p>
      </abstract>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>Introduction</title>
      <p>Aerial surveillance has become the main way to detect and monitor fires. UAVs and drones are a low-cost option for monitoring,
detecting and even fighting forest fires.</p>
      <p>
        UAVs are not new; they have existed for decades, but only recently have they become
popular. As technology has developed, modern UAVs have become more advanced and the range of their
possible applications has grown, which poses new legal and regulatory questions [
        <xref ref-type="bibr" rid="ref9">9</xref>
        ]. Initially, they were
designed for military purposes: to transport balloon bombs and as a form of training for anti-aircraft weapons
during World War II. Today, their use is becoming more frequent and they can carry out a wider range of tasks
in both the military and professional sectors. These unmanned aerial vehicles allow the integration of remote
sensing techniques that can also meet the requirements of spatial, spectral and temporal resolution [
        <xref ref-type="bibr" rid="ref7">7</xref>
        ], [
        <xref ref-type="bibr" rid="ref8">8</xref>
        ], [
        <xref ref-type="bibr" rid="ref9">9</xref>
        ]
serving as a tool for the management of the collected data. Unmanned aerial vehicles allow for the execution of
long-term, monotonous and repeated tasks that go beyond human capabilities. This has led to increased global
attention to the applications of UAVs in forest fires in recent years [
        <xref ref-type="bibr" rid="ref10">10</xref>
        ], [
        <xref ref-type="bibr" rid="ref11">11</xref>
        ], [
        <xref ref-type="bibr" rid="ref12">12</xref>
        ], [
        <xref ref-type="bibr" rid="ref13">13</xref>
        ], [
        <xref ref-type="bibr" rid="ref14">14</xref>
        ] or [
        <xref ref-type="bibr" rid="ref15">15</xref>
        ].
      </p>
      <p>
        UAVs have been widely used in forestry, agriculture and livestock farming. For example, they have been used to
scan large areas in livestock management systems [
        <xref ref-type="bibr" rid="ref16">16</xref>
        ]. Counting and monitoring of animal species can be performed
with video recordings taken by UAVs. Moreover, the system keeps track of the number of detected animals by
analyzing the images taken with the UAV’s cameras. Another work using UAVs presents a system capable of
detecting ground vehicles through aerial images taken by a UAV in real time. In addition, the system offers
the possibility of guiding the UAV autonomously to keep track of a vehicle that has been detected previously
[
        <xref ref-type="bibr" rid="ref17">17</xref>
        ]. Other investigations make use of Multi-Agent Systems (MAS) to
coordinate tasks among UAVs; one such work presents a case study in which these capabilities are used for the
detection of oil spills [
        <xref ref-type="bibr" rid="ref18">18</xref>
        ].
      </p>
      <p>
        In [
        <xref ref-type="bibr" rid="ref19">19</xref>
        ], an octocopter is presented that captures photographs and images in multiple formats and carries
sensors and scientific-technical measuring equipment. It is a collapsible, lightweight, multi-rotor, vertical take-off
UAV made of highly resistant aerospace materials. It includes a
communication centre and a base station, all of which are transportable, lightweight and compact.
      </p>
      <p>The objective of this paper is to review the state of the art in the use of UAVs for fire detection in large
forested areas, with a specific focus on airborne fire detection techniques.</p>
      <p>The rest of the article is structured as follows: Section 2 describes diverse state-of-the-art proposals in the
area of UAVs, fire detection and computer vision technologies. Section 3 provides a full description of the
system proposed in this work, including the functionality of each of its components. Section 4 details the case
study and discusses the results obtained from this work. Finally, Section 5 outlines the conclusions drawn from
this research.</p>
    </sec>
    <sec id="sec-2">
      <title>State of the art of related techniques</title>
      <p>This section studies the main techniques to be incorporated into a UAV for the detection and communication of
fire alerts to the people responsible for the forest area, allowing them to take the corresponding actions.</p>
      <sec id="sec-2-1">
        <title>Computer Vision in UAVs</title>
        <p>
          One of the main problems in the application of computer vision techniques is that most of these
techniques employ classifiers that must be trained, such as Eigenfaces, Fisherfaces or LBP [
          <xref ref-type="bibr" rid="ref20">20</xref>
          ]. These classifiers need a large number of forest fire images for
correct classification, so researchers often have to download images from Internet search engines or rely on
existing collections of fire images [21]. This makes it very difficult to test and improve the
proposed algorithms. Another possibility is to use infrared images, which are easier to process than visible
images because the intensity of the fire pixels is much greater than that of the other pixels [
          <xref ref-type="bibr" rid="ref21">23</xref>
          ]. Detecting
a fire zone in an infrared image amounts to finding the threshold that separates the pixels belonging to the fire
from those of the background. There are several algorithms for this task that can be applied to the detection of
fire pixels [22], [
          <xref ref-type="bibr" rid="ref22">24</xref>
          ], [
          <xref ref-type="bibr" rid="ref23">25</xref>
          ], [
          <xref ref-type="bibr" rid="ref24">26</xref>
          ]. However, this technique also has a number of limitations. One of them
is that areas near the fire, such as hot gases, can produce a difference between the fire areas that appear in the
visible domain and those in the infrared domain. A paper illustrating this deficiency shows that the near-infrared
domain produces forest fire areas that are very similar to those obtained in the visible domain [
          <xref ref-type="bibr" rid="ref25">27</xref>
          ]. Considering
that fire pixels are easier to detect in infrared images but that visible images remain the reference, new algorithms
for fire pixel detection based on image fusion could be developed [
          <xref ref-type="bibr" rid="ref26">28</xref>
          ].
        </p>
        <p>
          Once a UAV has detected fire in a recorded video, the notification platform is alerted. One of the most
widely used technologies for this purpose is XBee. XBee modules are small electronic chips capable of
communicating wirelessly with each other; they are integrated solutions that provide a wireless medium for
interconnection and communication between devices. These modules use the IEEE 802.15.4 network protocol to
create point-to-multipoint or peer-to-peer networks, and they were designed for applications that require high
data traffic, low latency and predictable communication timing. XBee itself is a brand owned by Digi, based on
the Zigbee protocol. Another option is the use of WiFi. When using WiFi, the forest area must be equipped with
access points so that there is a local network, enabling control over the area. Knowledge of the black spots in
the whole area is also important so that they can be avoided in communications.
        </p>
        <p>
          For fire identification, it is necessary to combine classifiers and infrared images to minimize the deficiencies
of these two techniques; that is to say, to use classifiers trained with fire images together with infrared images.
For communicating alerts from the UAVs, the use of WiFi is preferable: communications via XBee cannot exceed
100 m, which greatly limits their use. Therefore, the main role of the proposed base station control software is
to offer autonomous control through WiFi communication. This is done by applying an autonomous flight
algorithm designed for this purpose that follows a series of points entered in the software with the help of the
UAV status received via telemetry. In addition, the software allows the configured flights to be viewed, as well
as notifications to be received about the coordinates at which a fire has been detected.
        </p>
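        <p>The infrared thresholding step described above can be sketched as follows. This is a minimal, self-contained illustration that reimplements Otsu's method in NumPy; the choice of Otsu and the synthetic frame in the test values are assumptions, since the cited works do not prescribe a single algorithm:</p>
```python
import numpy as np

def otsu_threshold(img):
    """Return the grey level that best separates an 8-bit image into two
    classes by maximizing the between-class variance (Otsu's method)."""
    hist = np.bincount(img.ravel(), minlength=256).astype(np.float64)
    total = hist.sum()
    sum_all = float(np.dot(np.arange(256), hist))
    w0 = 0.0      # cumulative weight of the background class
    sum0 = 0.0    # cumulative intensity sum of the background class
    best_t, best_var = 0, -1.0
    for t in range(256):
        w0 += hist[t]
        if w0 == 0.0:
            continue
        w1 = total - w0
        if w1 == 0.0:
            break
        sum0 += t * hist[t]
        mu0 = sum0 / w0
        mu1 = (sum_all - sum0) / w1
        var_between = w0 * w1 * (mu0 - mu1) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

def fire_mask(ir_frame):
    """Binary mask of fire pixels: in an infrared frame the fire pixels are
    much brighter than the background, so a global threshold separates them."""
    t = otsu_threshold(ir_frame)
    return (ir_frame > t).astype(np.uint8) * 255
```
        <p>On a frame whose background sits around one grey level and whose fire region is much brighter, the mask isolates exactly the bright region.</p>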
      </sec>
    </sec>
    <sec id="sec-3">
      <title>Proposed system</title>
      <p>The previous section began with a study of the existing remote-control technologies in each of the three parts
into which the proposal can be divided: aircraft, communication and control. The initial objective was to offer
a complete system that would improve these existing technologies in the field of fire detection.</p>
      <p>As far as the aircraft is concerned, two distinct parts influence flight: its frame or chassis and its
electronics. The analysis of these parts leads to the conclusion that the
improvements that can be made with the available resources are minimal, if not nil, for the purposes of this project.
Large international companies with years of experience invest millions of euros
in the development of both chassis and electronic stabilization systems; there are also open-source projects conducted
by independent developers collaborating on do-it-yourself platforms, whose systems are much less
stable than those of the above-mentioned companies. Competing against them would therefore make no sense.</p>
      <p>On the other hand, the communication and control blocks are very similar in all existing projects; they do
not reflect the great advances made in the flight system. All of them use radio station systems for communication,
which are very stable and offer a long communication range but do not allow data transmission in digital
form, so communication is limited to flight orders using the previously explained PPM transmission. This type
of transmission requires a second communication module for telemetry transfer and a third
communication system for the transmission of video in real time.</p>
      <p>As for the control systems used for UAVs, they all rely on radio stations; these are very sophisticated,
which makes them easy to integrate with PPM radio systems, and they can even incorporate telemetry reception modules
and display the data on a digital screen. From our analysis, we propose to design a system capable of simulating the
connection of radio systems in order to control any UAV stabilizer (if possible, the best on the market), capable of
transmitting telemetry digitally together with flight orders and video using only a Wi-Fi connection, and remotely
controlled from the ground with a gamepad connected to a computer instead of radio stations, with the
telemetry information and video transmitted from the multirotor displayed on the computer screen. This Wi-Fi
communication system is easily accessible to consumers, offers high-powered access points that allow for
connections of up to 50 km, and costs much less than long-range radio systems. Although it does not match the
range of the aforementioned radio links, the limited flight time of the UAV due to battery
life (an element common to all systems), which today barely exceeds 30 minutes, makes this range more than
sufficient for the system to be developed.</p>
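      <p>A sketch of the kind of digital alert message such a Wi-Fi link could carry is given below; the JSON layout, host and port are illustrative assumptions, since no wire format is specified in this work:</p>
```python
import json
import socket

def send_fire_alert(lat, lon, host="127.0.0.1", port=5005):
    """Send a hypothetical fire-detection datagram to the base station.

    The UAV reports the coordinates at which fire has been detected; the
    message structure below is an assumption for illustration only.
    """
    msg = json.dumps({"event": "fire", "lat": lat, "lon": lon}).encode("utf-8")
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        sock.sendto(msg, (host, port))
    finally:
        sock.close()
    return msg
```
      <p>UDP is used in this sketch because a lost alert can simply be re-sent on the next processed frame; a production system might prefer an acknowledged protocol.</p>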
      <p>In addition, the control that can currently be carried out with a computer is not comparable to that
of a radio station, so designing control software capable of commanding the UAV and displaying the telemetry it
transmits would substantially improve upon the reviewed systems. Such software could also be
seamlessly integrated with the above-mentioned Wi-Fi communication system, receiving the data
measured by the sensors along with the video and, finally, transmitting the flight orders from the control software to
the UAV.
The hardware required to control any of the existing stabilizers on the market must be capable of transmitting
the information to the stabilizer through a series of outputs (at least 8), as if it were a radio receiver.</p>
      <p>Radio receivers use manufacturer-specific settings to organize the data to be transmitted, but maintain a standard
for the connection, using a three-pin connector for each channel, through which the data associated with
that channel is sent. The only exception is the manufacturer Futaba, which uses a proprietary protocol
(S-BUS) with which a single three-pin connector is able to send information from all radio channels. However,
all controllers are compatible with the above-mentioned standard, while only a few are compatible with Futaba’s
S-BUS protocol.</p>
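      <p>Futaba's S-BUS framing can be illustrated with a short decoding sketch; the layout assumed here (a 25-byte frame: 0x0F header, 22 data bytes packing 16 channels of 11 bits each, LSB-first, then a flags byte and an end byte) follows publicly available reverse-engineering notes, not vendor documentation:</p>
```python
def parse_sbus(frame):
    """Decode one 25-byte S-BUS frame into 16 channel values (0-2047).

    Assumed layout: byte 0 is the 0x0F header, bytes 1..22 carry the
    16 channels packed LSB-first at 11 bits each, byte 23 holds flags
    and byte 24 is the end byte.
    """
    if len(frame) != 25 or frame[0] != 0x0F:
        raise ValueError("not an S-BUS frame")
    channels = []
    acc = 0       # bit accumulator
    nbits = 0     # number of valid bits currently in the accumulator
    for byte in frame[1:23]:
        acc += byte * (2 ** nbits)    # append the byte above the existing bits
        nbits += 8
        while nbits >= 11:
            channels.append(acc % 2048)   # take the lowest 11 bits
            acc //= 2048
            nbits -= 11
    return channels[:16]
```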
      <p>We chose a Raspberry Pi 3 single-board computer as the hardware for this task: it is capable of running a
Linux operating system and allows the APM 2.5 to be connected through one of its USB ports, with a Wi-Fi
adapter connected to another port. These features made us opt for this hardware. The controller therefore consists of
a Raspberry Pi 3 and an APM 2.5 connected via USB.</p>
      <p>This hardware has to interact with, at the very least, the stabilizer of the multirotor. There are many multi-rotor
stabilizers on the market; the DJI Wookong-M was chosen because it provides very precise stabilization
and control.</p>
      <sec id="sec-3-1">
        <title>Hardware Connections</title>
        <p>The LiPo battery connects directly to the ammeter of the APM 2.5, which performs consumption
measurements. The ammeter's data cable is connected to the APM 2.5 itself (the result of the measurement is sent),
while the continuous power supply is connected to the PMU of the DJI Wookong-M, to the multirotor's motor drives and
to the UBEC voltage regulator. The PMU connects to the other components of the DJI Wookong-M in the manner
specified by the manufacturer, DJI Innovations, as shown in Figure 2.</p>
        <p>The drives will connect their data cable to the DJI Wookong-M’s MC inputs in the order specified by DJI
Innovations and its banana connectors to the corresponding motor connectors regardless of the order (a test
without propellers is required to check the direction of rotation before the first flight). The UBEC will be
connected to the Raspberry Pi's GPIO pins (pin 2 for power and pin 6 for ground); it will
also power the flight camera and the high-resolution camera stabilizer. The Raspberry Pi connects through
one of its USB 2.0 ports to the USB Wi-Fi antenna, to establish communication with the access point, and through
the other USB port to the APM 2.5's micro-USB, to exchange data (flight orders in one direction and
sensor information in the other). The flight IP camera will be connected to the Ethernet port of the Raspberry
Pi. The APM 2.5 will connect each of its first 6 outputs (marked as outputs) to the DJI Wookong-M's MC with
3-pin connectors so that the corresponding signal will be sent to each of the DJI’s channels through each of the
outputs. If a camera stabilizer is available, it will connect to outputs 7 (pitch movement) and 8 (roll movement)
of the APM 2.5, Figure 3.</p>
        <p>With these connections and the link types between components shown above, the wiring diagram
of the electronic components that control and monitor the multirotor is as shown in Figure 4.</p>
        <p>The main role of the base station's control software is to offer the functionality needed to perform remotely
controlled flights manually, using the previously mentioned gamepad, or to perform autonomous
control through an autonomous flight algorithm designed for this project. This algorithm
follows a series of points introduced in the software with the help of the UAV status received through telemetry.
The control software offers a configurator for setting the parameters needed to perform a new flight
or to schedule a future flight without having to connect the UAV. In addition, the
configuration can be exported to files that serve as flight profiles, selectable according to the type of flight to
be performed or the specific configuration of the multirotor, as well as other parameters such as the software
language.</p>
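        <p>The waypoint-following core of such an autonomous flight algorithm can be sketched as follows; the haversine distance and the arrival tolerance of 5 m are illustrative assumptions, not the implementation used in this work:</p>
```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two (lat, lon) points."""
    r = 6371000.0  # mean Earth radius in metres
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2.0 * r * math.asin(math.sqrt(a))

def next_waypoint(position, route, reached_m=5.0):
    """Return the waypoint the UAV should head for next.

    route is a mutable list of (lat, lon) pairs; any waypoint already
    within reached_m metres of the telemetry position is dropped.
    """
    while route:
        d = haversine_m(position[0], position[1], route[0][0], route[0][1])
        if d > reached_m:
            break
        route.pop(0)
    return route[0] if route else None
```
        <p>Called on every telemetry update, this advances the UAV through the configured points and returns None once the route is complete.</p>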
        <p>The route programming mode is designed so that whoever programs the routes that will later be travelled
by the UAV can load them onto the UAV at the time of flight and repeat the
same routes without having to plan them again, thus saving time. Scheduled routes are exported
to files with the extension .route and can be loaded from the main configuration, or even with the software started
in flight mode, from where the route can also be configured without having previously programmed
it in route programming mode. Another feature of the control base station software is that it will be
executed from the places where the flights are performed, which in most cases are areas with no
Internet connection (or at least not at an acceptable speed). The software therefore cannot depend on a connection
during the flight to show information about the flight area, so the map of the area must be
obtained and geo-positioned beforehand. The image can be obtained from anywhere as long as it is perpendicular
to the terrain, which allows photographs taken by the UAV to be used, but the coordinates of at least three
of the four corners must be known in order to assign a coordinate (longitude and latitude) to each pixel
in the image. This is called image geolocation, which can be done through the control software when an image is
included for the first time without associated geolocation data, or using the software developed to facilitate the
acquisition of terrain images using Google Maps, accessible online and free of charge, currently published at
http://servidor-online.com/hawk-geoposition/.</p>
        <p>[Figure: (a) view of the route; (b) view of the collected video.]</p>
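        <p>The assignment of a coordinate to each pixel from the known corner coordinates can be sketched by bilinear interpolation; the corner ordering assumed below (A top-left, B top-right, C bottom-left, D bottom-right) is an illustrative convention, not one stated in this work:</p>
```python
def pixel_to_coords(px, py, width, height, corners):
    """Map pixel (px, py) of a width-by-height geopositioned image to a
    (latitude, longitude) pair by bilinear interpolation between the four
    corner coordinates, ordered A top-left, B top-right, C bottom-left,
    D bottom-right (this ordering is an assumed convention)."""
    (lat_a, lon_a), (lat_b, lon_b), (lat_c, lon_c), (lat_d, lon_d) = corners
    fx = px / (width - 1)    # horizontal fraction across the image
    fy = py / (height - 1)   # vertical fraction down the image
    lat_top = lat_a + fx * (lat_b - lat_a)
    lat_bot = lat_c + fx * (lat_d - lat_c)
    lon_top = lon_a + fx * (lon_b - lon_a)
    lon_bot = lon_c + fx * (lon_d - lon_c)
    return (lat_top + fy * (lat_bot - lat_top),
            lon_top + fy * (lon_bot - lon_top))
```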
        <p>The system used for geopositioning stores the corner coordinates in the metadata of the image,
transparently to the user: no external files are used; rather, the coordinates are included in the image
itself as metadata. The metadata is formed from the coordinates of each corner, in the following format:
GEO-Information,LatitudeA:LongitudeA,LatitudeB:LongitudeB,LatitudeC:LongitudeC,LatitudeD:LongitudeD,0.0000000000.
Example: GEO-Information,40.96775528287228385:-5.621737696952938,40.96775528287228385:-5.628174998588699,40.96410968463514:-5.621737696952938,40.96410968463514:-5.628174998588699,0.0000000000</p>
        <p>Another of the main features of the software is the control of the UAV in two ways: manual (with a pilot
on the ground via the gamepad) and autonomous or automatic (without a pilot, with the movement calculated
automatically from a series of route points introduced beforehand).</p>
        <p>An algorithm has been developed to detect fires using the Python scripting language, relying on the OpenCV
and NumPy APIs. The video taken by the UAV camera is processed frame by frame. Each frame is converted to
the HSV colour space, which is very similar to the way humans perceive the environment, more so than even the
RGB space. A mask with upper and lower colour values is then defined and applied to the frame, so that only
the colours in the defined range remain visible. Figure 6 shows how the developed algorithm, deployed on the
Raspberry Pi of the UAV, works.</p>
        <p>This paper has presented an overview of the main computer vision techniques for fire detection in forested
areas. Unlike ground or space systems, the cost of deploying these techniques on UAVs is low, and no humans
are put at risk when performing this activity.</p>
        <p>Since the density of trees in woodland areas impedes the detection of small fires through simple monitoring,
the combination of sensorization and artificial vision techniques in a UAV is the best choice for fire detection in
forests.</p>
        <p>One of the drawbacks encountered in carrying out this work is the possibility that the smoke may block the
images of the fire, although before the smoke becomes dense, the fire should have already been detected. Sunlight,
for example, can cause false positives, so the combination of infrared images and the proposed algorithm on the
collected video can contribute to robust detection of forest fires, including high probability of detection, low false
alarm rates and improved adaptive capabilities in various environmental conditions.</p>
        <p>The use of infrared images, which are easier to analyze, often causes hot gases to be detected as fire as well.
Since these areas resemble fire, they can produce a difference between the fire areas that appear in
the visible domain and those in the infrared domain.</p>
      </sec>
    </sec>
    <sec id="sec-4">
      <title>Acknowledgments</title>
      <p>This work was developed as part of “MOVIURBAN: Máquina social para la gestión sostenible de ciudades
inteligentes: movilidad urbana, datos abiertos, sensores móviles”. ID: SA070U 16. Project co-financed with
Junta de Castilla y León, Consejería de Educación and ERDF funds.</p>
      <p>[21] Forest health, natural resources, fire, trees, wildfire, silviculture photos. http://www.forestryimages.org/
(Accessed: 10 March 2018).</p>
      <p>[22] J. Ramiro Martínez-de Dios, Luis Merino, Aníbal Ollero. Fire detection using autonomous aerial vehicles
with infrared and visual cameras. Proceedings of the 16th IFAC World Congress, 2005.</p>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>
          [1]
          <string-name>WWF España</string-name>
          .
          <source>Incendios Forestales</source>
          .
          <year>2018</year>
          . https://www.wwf.es/nuestrotrabajo/bosques/incendiosforestales/
        </mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>
          [2]
          <string-name>WWF España</string-name>
          .
          <source>Informe Incendios Forestales</source>
          .
          <year>2018</year>
          . https://www.wwf.es/nuestrotrabajo/bosques/incendiosforestales/i
        </mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>
          [3]
          <string-name>
            <surname>Martinez-de Dios</surname>
            ,
            <given-names>J. R.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Arrue</surname>
            ,
            <given-names>B. C.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Ollero</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Merino</surname>
            ,
            <given-names>L.</given-names>
          </string-name>
          , &amp; Gómez-Rodríguez, F.
          <article-title>Computer vision techniques for forest fire perception</article-title>
          .
          <source>Image and vision computing</source>
          ,
          <volume>26</volume>
          (
          <issue>4</issue>
          ),
          <fpage>550</fpage>
          -
          <lpage>562</lpage>
          ,
          <year>2008</year>
        </mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>
          [4]
          <string-name>
            <surname>Vipin</surname>
            ,
            <given-names>V.</given-names>
          </string-name>
          <article-title>Image processing based forest fire detection</article-title>
          .
          <source>International Journal of Emerging Technology and Advanced Engineering</source>
          ,
          <volume>2</volume>
          (
          <issue>2</issue>
          ),
          <fpage>87</fpage>
          -
          <lpage>95</lpage>
          ,
          <year>2012</year>
        </mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>
          [5]
          <string-name>
            <surname>Lin</surname>
            ,
            <given-names>H.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Liu</surname>
            ,
            <given-names>Z.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Zhao</surname>
            ,
            <given-names>T.</given-names>
          </string-name>
          , &amp;
          <string-name>
            <surname>Zhang</surname>
            ,
            <given-names>Y.</given-names>
          </string-name>
          <article-title>Early warning system of forest fire detection based on video technology</article-title>
          .
          <source>In Proceedings of the 9th International Symposium on Linear Drives for Industry Applications</source>
          , Volume
          <volume>3</volume>
          (pp.
          <fpage>751</fpage>
          -
          <lpage>758</lpage>
          ). Springer Berlin Heidelberg. January,
          <year>2014</year>
        </mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>
          [6]
          <string-name>
            <surname>Chisholm</surname>
            ,
            <given-names>R. A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Cui</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Lum</surname>
            ,
            <given-names>S. K.</given-names>
          </string-name>
          , &amp;
          <string-name>
            <surname>Chen</surname>
            ,
            <given-names>B. M.</given-names>
          </string-name>
          <article-title>UAV LiDAR for below-canopy forest surveys</article-title>
          .
          <source>Journal of Unmanned Vehicle Systems</source>
          ,
          <volume>1</volume>
          (
          <issue>01</issue>
          ),
          <fpage>61</fpage>
          -
          <lpage>68</lpage>
          .
          <year>2013</year>
        </mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>
          [7]
          <string-name>
            <surname>Everaerts</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          <article-title>The use of unmanned aerial vehicles (UAVs) for remote sensing and mapping</article-title>
          .
          <source>The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences</source>
          ,
          <volume>37</volume>
          (
          <year>2008</year>
          ),
          <fpage>1187</fpage>
          -
          <lpage>1192</lpage>
          ,
          <year>2008</year>
        </mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>
          [8]
          <string-name>
            <surname>Berni</surname>
            ,
            <given-names>J. A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Zarco-Tejada</surname>
            ,
            <given-names>P. J.</given-names>
          </string-name>
          , Suárez, L., &amp; Fereres, E.
          <article-title>Thermal and narrowband multispectral remote sensing for vegetation monitoring from an unmanned aerial vehicle</article-title>
          .
          <source>IEEE Transactions on Geoscience and Remote Sensing</source>
          ,
          <volume>47</volume>
          (
          <issue>3</issue>
          ),
          <fpage>722</fpage>
          -
          <lpage>738</lpage>
          ,
          <year>2009</year>
        </mixed-citation>
      </ref>
      <ref id="ref9">
        <mixed-citation>
          [9]
          <string-name>
            <surname>Chamoso</surname>
            ,
            <given-names>P.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>González-Briones</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Rivas</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          , Bueno de Mata, F. &amp; Corchado, J. M.
          <article-title>The Use of Drones in Spain: Towards a Platform for Controlling UAVs in Urban Environments</article-title>
          .
          <volume>1416</volume>
          ,
          <issue>18</issue>
          (
          <issue>5</issue>
          ),
          <fpage>2018</fpage>
        </mixed-citation>
      </ref>
      <ref id="ref10">
        <mixed-citation>
          [10]
          <string-name>
            <surname>Ambrosia</surname>
            ,
            <given-names>V. G.</given-names>
          </string-name>
          , &amp;
          <string-name>
            <surname>Zajkowski</surname>
            ,
            <given-names>T.</given-names>
          </string-name>
          <article-title>Selection of appropriate class UAS/sensors to support fire monitoring: experiences in the United States</article-title>
          .
          <source>In Handbook of Unmanned Aerial Vehicles</source>
          (pp.
          <fpage>2723</fpage>
          -
          <lpage>2754</lpage>
          ). Springer Netherlands,
          <year>2015</year>
        </mixed-citation>
      </ref>
      <ref id="ref11">
        <mixed-citation>
          [11]
          <string-name>
            <surname>Merino</surname>
            ,
            <given-names>L.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Martínez-de Dios</surname>
            ,
            <given-names>J. R.</given-names>
          </string-name>
          , &amp;
          <string-name>
            <surname>Ollero</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          <article-title>Cooperative unmanned aerial systems for fire detection, monitoring, and extinguishing</article-title>
          .
          <source>In Handbook of Unmanned Aerial Vehicles</source>
          (pp.
          <fpage>2693</fpage>
          -
          <lpage>2722</lpage>
          ). Springer Netherlands,
          <year>2015</year>
        </mixed-citation>
      </ref>
      <ref id="ref12">
        <mixed-citation>
          [12]
          <string-name>
            <surname>Shahbazi</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Théau</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          , &amp;
          <string-name>
            <surname>Ménard</surname>
            ,
            <given-names>P.</given-names>
          </string-name>
          .
          <article-title>Recent applications of unmanned aerial imagery in natural resource management</article-title>
          .
          <source>GIScience &amp; Remote Sensing</source>
          ,
          <volume>51</volume>
          (
          <issue>4</issue>
          ),
          <fpage>339</fpage>
          -
          <lpage>365</lpage>
          ,
          <year>2014</year>
        </mixed-citation>
      </ref>
      <ref id="ref13">
        <mixed-citation>
          [13]
          <string-name>
            <surname>Sharifi</surname>
            ,
            <given-names>F.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Zhang</surname>
            ,
            <given-names>Y. M.</given-names>
          </string-name>
          , &amp;
          <string-name>
            <surname>Aghdam</surname>
            ,
            <given-names>A. G.</given-names>
          </string-name>
          <article-title>Forest fire detection and monitoring using a network of autonomous vehicles</article-title>
          .
          <source>In The 10th International Conference on Intelligent Unmanned Systems (ICIUS 2014)</source>
          (Vol.
          <volume>29</volume>
          ).
          <year>2014</year>
          , September
        </mixed-citation>
      </ref>
      <ref id="ref14">
        <mixed-citation>
          [14]
          <string-name>
            <surname>Bosch</surname>
            ,
            <given-names>I.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Serrano</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          , &amp;
          <string-name>
            <surname>Vergara</surname>
            ,
            <given-names>L.</given-names>
          </string-name>
          <article-title>Multisensor network system for wildfire detection using infrared image processing</article-title>
          .
          <source>The Scientific World Journal</source>
          ,
          <year>2013</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref15">
        <mixed-citation>
          [15]
          <string-name>
            <surname>Merino</surname>
            ,
            <given-names>L.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Caballero</surname>
            ,
            <given-names>F.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Martínez-de-Dios</surname>
            ,
            <given-names>J. R.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Maza</surname>
            ,
            <given-names>I.</given-names>
          </string-name>
          , &amp;
          <string-name>
            <surname>Ollero</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          <article-title>An unmanned aircraft system for automatic forest fire monitoring and measurement</article-title>
          .
          <source>Journal of Intelligent &amp; Robotic Systems</source>
          ,
          <volume>65</volume>
          (
          <issue>1-4</issue>
          ),
          <fpage>533</fpage>
          -
          <lpage>548</lpage>
          ,
          <year>2012</year>
        </mixed-citation>
      </ref>
      <ref id="ref16">
        <mixed-citation>
          [16]
          <string-name>
            <surname>Chamoso</surname>
            ,
            <given-names>P.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Raveane</surname>
            ,
            <given-names>W.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Parra</surname>
            ,
            <given-names>V.</given-names>
          </string-name>
          , &amp;
          <string-name>
            <surname>González</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          .
          <article-title>UAVs applied to the counting and monitoring of animals</article-title>
          .
          <source>In Ambient Intelligence-Software and Applications</source>
          (pp.
          <fpage>71</fpage>
          -
          <lpage>80</lpage>
          ). Springer, Cham,
          <year>2014</year>
        </mixed-citation>
      </ref>
      <ref id="ref17">
        <mixed-citation>
          [17]
          <string-name>
            <surname>Pérez</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Chamoso</surname>
            ,
            <given-names>P.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Parra</surname>
            ,
            <given-names>V.</given-names>
          </string-name>
          , &amp;
          <string-name>
            <surname>Sánchez</surname>
            ,
            <given-names>A. J.</given-names>
          </string-name>
          <article-title>Ground vehicle detection through aerial images taken by a UAV</article-title>
          .
          <source>In Information Fusion (FUSION)</source>
          , 17th International Conference on (pp.
          <fpage>1</fpage>
          -
          <lpage>6</lpage>
          ). IEEE,
          <year>2014</year>
          , July
        </mixed-citation>
      </ref>
      <ref id="ref18">
        <mixed-citation>
          [18]
          <string-name>
            <surname>Chamoso</surname>
            ,
            <given-names>P.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Pérez</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Rodríguez</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Corchado</surname>
            ,
            <given-names>J. M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Sempere</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Rizo</surname>
            ,
            <given-names>R.</given-names>
          </string-name>
          , ... &amp;
          <string-name>
            <surname>Pujol</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          <article-title>Modeling Oil-Spill Detection with multirotor systems based on multi-agent systems</article-title>
          .
          <source>In Information Fusion (FUSION)</source>
          , 17th International Conference on (pp.
          <fpage>1</fpage>
          -
          <lpage>8</lpage>
          ). IEEE,
          <year>2014</year>
          , July
        </mixed-citation>
      </ref>
      <ref id="ref19">
        <mixed-citation>
          [19]
          <string-name>
            <surname>Bernabéu</surname>
            ,
            <given-names>C.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Corchado Rodríguez</surname>
            ,
            <given-names>J. M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Rodríguez</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          , &amp;
          <string-name>
            <surname>Chamoso</surname>
            ,
            <given-names>P.</given-names>
          </string-name>
          .
          <article-title>Aracnóptero: An Unmanned Aerial VTOL Multi-rotor for Remote Monitoring and Surveillance</article-title>
          ,
          <year>2011</year>
        </mixed-citation>
      </ref>
      <ref id="ref20">
        <mixed-citation>
          [20]
          <string-name>
            <surname>González-Briones</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Villarrubia</surname>
            ,
            <given-names>G.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>De Paz</surname>
            ,
            <given-names>J. F.</given-names>
          </string-name>
          , &amp;
          <string-name>
            <surname>Corchado</surname>
            ,
            <given-names>J. M.</given-names>
          </string-name>
          .
          <article-title>A multi-agent system for the classification of gender and age from images</article-title>
          .
          <source>Computer Vision and Image Understanding</source>
          ,
          <year>2018</year>
        </mixed-citation>
      </ref>
      <ref id="ref21">
        <mixed-citation>
          [23]
          <string-name>
            <given-names>J. R.</given-names>
            <surname>Martínez-de Dios</surname>
          </string-name>
          ,
          <string-name>
            <given-names>L.</given-names>
            <surname>Merino</surname>
          </string-name>
          ,
          <string-name>
            <given-names>F.</given-names>
            <surname>Caballero</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Ollero</surname>
          </string-name>
          .
          <article-title>Automatic forest-fire measuring using ground stations and unmanned aerial systems</article-title>
          .
          <source>Sensors</source>
          ,
          <volume>11</volume>
          (
          <issue>6</issue>
          ), pp.
          <fpage>6328</fpage>
          -
          <lpage>6353</lpage>
          ,
          <year>2011</year>
        </mixed-citation>
      </ref>
      <ref id="ref22">
        <mixed-citation>
          [24]
          <string-name>
            <given-names>Nobuyuki</given-names>
            <surname>Otsu</surname>
          </string-name>
          .
          <article-title>A threshold selection method from gray-level histograms</article-title>
          .
          <source>Automatica</source>
          ,
          <volume>11</volume>
          (285-296), pp.
          <fpage>23</fpage>
          -
          <lpage>27</lpage>
          ,
          <year>1975</year>
        </mixed-citation>
      </ref>
      <ref id="ref23">
        <mixed-citation>
          [25]
          <string-name>
            <given-names>T.W.</given-names>
            <surname>Ridler</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>Calvard</surname>
          </string-name>
          .
          <article-title>Picture thresholding using an iterative selection method</article-title>
          .
          <source>IEEE Trans. Syst. Man Cybern.</source>
          ,
          <volume>8</volume>
          (
          <issue>8</issue>
          ), pp.
          <fpage>630</fpage>
          -
          <lpage>632</lpage>
          ,
          <year>1978</year>
        </mixed-citation>
      </ref>
      <ref id="ref24">
        <mixed-citation>
          [26]
          <string-name>
            <given-names>Josef</given-names>
            <surname>Kittler</surname>
          </string-name>
          ,
          <string-name>
            <given-names>John</given-names>
            <surname>Illingworth</surname>
          </string-name>
          .
          <article-title>Minimum error thresholding</article-title>
          .
          <source>Pattern Recognit</source>
          .,
          <volume>19</volume>
          (
          <issue>1</issue>
          ), pp.
          <fpage>41</fpage>
          -
          <lpage>47</lpage>
          ,
          <year>1986</year>
        </mixed-citation>
      </ref>
      <ref id="ref25">
        <mixed-citation>
          [27]
          <string-name>
            <given-names>L.</given-names>
            <surname>Rossi</surname>
          </string-name>
          ,
          <string-name>
            <given-names>T.</given-names>
            <surname>Toulouse</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Akhloufi</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Pieri</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Y.</given-names>
            <surname>Tison</surname>
          </string-name>
          .
          <article-title>Estimation of spreading fire geometrical characteristics using near infrared stereovision</article-title>
          .
          <source>IS&amp;T/SPIE Electronic Imaging</source>
          , pages
          <fpage>86500A</fpage>
          -
          <lpage>86500A</lpage>
          , March
          <year>2013</year>
        </mixed-citation>
      </ref>
      <ref id="ref26">
        <mixed-citation>
          [28]
          <string-name>
            <given-names>Tom</given-names>
            <surname>Toulouse</surname>
          </string-name>
          .
          <article-title>Estimation par stéréovision multimodale de caractéristiques géométriques d'un feu de végétation en propagation</article-title>
          (Ph.D. thesis).
          <source>University of Corsica</source>
          , November
          <year>2015</year>
        </mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>