<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta />
    <article-meta>
      <title-group>
        <article-title>UAV Teams In Emergency Scenarios: A Summary Of The Work Within The Project PRISMA</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Carmine Recchiuto</string-name>
          <email>carmine.recchiuto@dibris.unige.it</email>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Antonio Sgorbissa</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Francesco Wanderlingh</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Renato Zaccaria</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <aff id="aff0">
          <label>0</label>
          <institution>DIBRIS Department, University of Genova</institution>
          ,
          <addr-line>via all'Opera Pia 13, 16145, Genova</addr-line>
          ,
          <country country="IT">Italy</country>
        </aff>
      </contrib-group>
      <abstract>
        <p>In recent years, autonomous robots, and Unmanned Aerial Vehicles (UAVs) in particular, have become increasingly important in emergency scenarios, where they can anticipate the actions of human operators and support them during rescue operations. In this context, the investigation of strategies for the autonomous control of UAVs, for the development of Human-Swarm Interfaces, and for the coverage of large areas is crucial. All of these aspects have been analyzed within the Italian project PRISMA and are summarized here.</p>
      </abstract>
      <kwd-group>
        <kwd>UAVs</kwd>
        <kwd>monitoring</kwd>
        <kwd>Search&amp;Rescue</kwd>
        <kwd>Human-Swarm Interfaces</kwd>
        <kwd>coverage algorithms</kwd>
        <kwd>virtual reality</kwd>
      </kwd-group>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>Introduction</title>
      <p>The work described in this article has been performed during the PRISMA
project, which focuses on the development and deployment of robots and
autonomous systems able to operate in emergency scenarios, with specific
reference to monitoring, pre-operative management, and real-time intervention. The
work has focused on Unmanned Aerial Vehicles (UAVs), which can monitor
a wide area in a short time, move quickly, and be easily controlled
by human operators. In particular, the following aspects have been analyzed in
detail: techniques for localization and autonomous control, coverage algorithms,
strategies for moving in a structured formation, and the integration of virtual
reality tools for visualization and control.</p>
    </sec>
    <sec id="sec-2">
      <title>Indoor localization and autonomous control</title>
      <p>
        An indoor experimental setup can be extremely useful when dealing with aerial
robots, since it speeds up the development of models and algorithms. In this
context, the main problem is the localization of the robots, since no GPS
signal is available. In the project, the problem was solved using a camera
(MatrixVision mvBlueFox) mounted on board the hexarotor Asctec Firefly [
        <xref ref-type="bibr" rid="ref1">1</xref>
        ] and integrating the
artificial vision library ArUco [
        <xref ref-type="bibr" rid="ref2">2</xref>
        ]. The main functionality of the library is to
recognize up to 1024 different markers, applying Adaptive Thresholding and
Otsu's algorithm [
        <xref ref-type="bibr" rid="ref3">3</xref>
        ]. When a marker is recognized, the relative distance and
orientation of the camera with respect to the marker is returned.
      </p>
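<p>To make this step concrete, the fusion of several per-marker position estimates into a single robust estimate (outlier elimination followed by averaging) can be sketched as follows. This is a minimal illustration, not the project's actual code; the function name and the rejection threshold are assumptions:</p>

```python
import statistics

def fuse_marker_estimates(positions, max_dev=0.3):
    """Fuse per-marker camera-position estimates (metres).

    positions: list of (x, y, z) tuples, one per recognized marker,
               each already transformed into the common map frame.
    max_dev:   discard estimates farther than this from the per-axis
               median (illustrative threshold, in metres).
    Returns the average of the surviving estimates.
    """
    if not positions:
        raise ValueError("no markers detected")
    # Per-axis median as a robust reference point.
    med = tuple(statistics.median(p[i] for p in positions) for i in range(3))
    # Keep estimates close to the median on every axis (outlier rejection).
    inliers = [p for p in positions
               if all(abs(p[i] - med[i]) <= max_dev for i in range(3))]
    if not inliers:  # degenerate case: fall back to the median itself
        return med
    # Average the inliers to smooth out the remaining noise.
    n = len(inliers)
    return tuple(sum(p[i] for p in inliers) / n for i in range(3))
```

<p>For example, three consistent estimates plus one gross outlier yield an average of the three consistent ones.</p>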
      <p>To improve the accuracy of the localization, a wall of 35 markers has
been created (Fig. 1), and a custom algorithm has been developed, based on
the elimination of outliers and the estimation of the average value. The
resulting position estimate is then used for the actual control of the robot. The
UAV computes the error between the target position (a fixed value when
hovering, a series of waypoints in more complex cases) and the current position,
and uses this error as input to three PID controllers, one per axis. The resulting
target accelerations are used to calculate the reference thrust u and the control
angles θ (pitch) and φ (roll), considering the dynamics of the system, the mass
m of the UAV and the yaw angle ψ:</p>
      <p>u = m √(ẍ² + ÿ² + (z̈ + g)²)</p>
      <p>θ = sin⁻¹( m (ẍ sin ψ − ÿ cos ψ) / u )</p>
      <p>φ = tan⁻¹( (ẍ cos ψ + ÿ sin ψ) / (z̈ + g) )</p>
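<p>A minimal sketch of this mapping from target accelerations to thrust and attitude references, directly transcribing the relations above (the function name is hypothetical; angles are in radians):</p>

```python
import math

G = 9.81  # gravitational acceleration [m/s^2]

def acceleration_to_commands(ax, ay, az, yaw, mass):
    """Map target accelerations (the outputs of the three PID
    controllers) to a thrust reference u and to pitch/roll angle
    references, given the UAV mass and the current yaw angle."""
    # Reference thrust: magnitude of the required force vector.
    u = mass * math.sqrt(ax**2 + ay**2 + (az + G)**2)
    # Pitch reference.
    pitch = math.asin(mass * (ax * math.sin(yaw) - ay * math.cos(yaw)) / u)
    # Roll reference (atan2 avoids a division by zero when az = -g).
    roll = math.atan2(ax * math.cos(yaw) + ay * math.sin(yaw), az + G)
    return u, pitch, roll
```

<p>As a sanity check, zero target accelerations (steady hover) give u = m·g and zero pitch and roll.</p>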
      <p>
        The control of the orientation of the multirotor (the yaw angle ψ) has been achieved with a
proportional controller, directly based on the error between the reference angle
and the actual one. The whole control software, composed of the localization
system, the PID controllers and other modules dedicated to the planning of
actions (i.e., taking off, hovering, reaching a waypoint, taking a picture,
landing) and to interfacing with the user, has been implemented on board
the hexarotor Asctec Firefly, within the ETHNOS framework [
        <xref ref-type="bibr" rid="ref4">4</xref>
        ], a programming
environment for the design of real-time control systems.
      </p>
    </sec>
    <sec id="sec-3">
      <title>Coverage algorithms for Search &amp; Rescue</title>
      <p>
        With a similar control approach, but relying mainly on GPS for localization,
a monitoring strategy has been implemented and tested outdoors, using two
multirotors (Asctec Pelican and Asctec Firefly). The main idea was to analyze and
compare the performance of several real-time multi-robot coverage algorithms
(i.e., Node Count, Edge Counting, Learning Real-Time A* and PatrolGRAPH*)
[
        <xref ref-type="bibr" rid="ref5">5</xref>
        ] [
        <xref ref-type="bibr" rid="ref6">6</xref>
        ], aimed at finding a decision procedure allowing a team of robots to
navigate in a workspace modelled as a navigation graph.
      </p>
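<p>As an illustration, the Node Count rule (each robot increments a counter on its current node, then moves to the adjacent node with the lowest counter) can be sketched on a toy navigation graph; function names and the graph itself are illustrative:</p>

```python
def node_count_step(graph, counts, position):
    """One Node Count move: increment the counter of the current node,
    then travel to the adjacent node with the lowest counter."""
    counts[position] += 1
    return min(graph[position], key=lambda n: counts[n])

def cover(graph, starts, steps):
    """Run a team of robots concurrently for a fixed number of steps;
    returns the per-node visit counters."""
    counts = {v: 0 for v in graph}
    robots = list(starts)
    for _ in range(steps):
        for i, pos in enumerate(robots):
            robots[i] = node_count_step(graph, counts, pos)
    return counts

# A 2x2 grid modelled as a navigation graph (adjacency lists).
grid = {0: [1, 2], 1: [0, 3], 2: [0, 3], 3: [1, 2]}
team_counts = cover(grid, starts=[0, 3], steps=10)
```

<p>On this toy graph, two robots visit every node well within ten steps. The other algorithms compared in the project differ essentially in how the counters are updated and how the next node is chosen.</p>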
      <p>The four algorithms were first tested in simulation in order to
compare their performance, considering as indicators the length of the longest
path among all the robots and the overall distance travelled by all robots. The
results confirmed many previous findings in the literature, suggesting in
particular that the Node Count algorithm, despite its simplicity, is the most efficient
one. This becomes more evident as the number of robots and the size
of the grids increase, mainly because of the update rule of other high-performance
algorithms (e.g., LRTA*).</p>
      <p>Finally, the algorithms have been implemented on the two
multirotors, in order to test the whole framework (Fig. 2). A ROS/ETHNOS
interface has been developed to implement the communication between the off-board
controller (executing the algorithms) and the two robots.</p>
    </sec>
    <sec id="sec-4">
      <title>Movement in formation and implementation of a custom Human-Swarm Interface</title>
      <p>Even if many steps have been taken towards the fully autonomous
control of UAVs, a human pilot is usually in charge of controlling the robots.
However, teleoperating UAVs can become a hard task when it is necessary to
deploy a swarm of robots instead of a single unit, in order to increase the
area under observation. In this case, organizing the robots in a structured
formation may reduce the effort required of the operator.</p>
      <p>
        For all these reasons, a custom Human-Swarm Interface (HSI) has been built,
allowing human operators to control a team of multirotors in environments filled
with obstacles. The algorithm is mainly based on the work of Balch and Arkin
[
        <xref ref-type="bibr" rid="ref7">7</xref>
        ], with a unit-center approach and the organization of the whole strategy as a
sum of concurrent behaviours (i.e., avoiding obstacles, avoiding inter-robot
collisions, following user commands, and keeping the formation, in descending priority
order), handling a number of predetermined types of formation, and
receiving user inputs by means of a two-axis joypad (Fig. 3). The HSI has been
tested in a simulated environment, also investigating the effect of different
points of view on user performance, showing a strong relation between
human performance, the type of task, and situational awareness. In particular,
it has been shown that a first-person point of view is suitable for tasks
where a direct view of the environment is sufficient, whereas a more
evident degradation of performance is observed in tasks where a higher level
of situational awareness is necessary.
      </p>
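<p>A minimal sketch of such a behaviour-blending scheme, combining the four concurrent behaviours as a weighted vector sum with weights in descending priority order. The weights and function names are illustrative assumptions; the original motor-schema formulation of Balch and Arkin is more refined:</p>

```python
import math

def unit(v):
    """Normalise a 2-D vector; the zero vector stays zero."""
    n = math.hypot(v[0], v[1])
    return (v[0] / n, v[1] / n) if n > 0 else (0.0, 0.0)

def blend_behaviours(avoid_obstacle, avoid_robots, user_cmd, keep_formation,
                     weights=(4.0, 3.0, 2.0, 1.0)):
    """Combine the four concurrent behaviours as a weighted vector sum.
    Each argument is a 2-D direction suggested by one behaviour; the
    weights encode the descending priority order (illustrative values).
    Returns the commanded direction of motion."""
    vs = [unit(avoid_obstacle), unit(avoid_robots),
          unit(user_cmd), unit(keep_formation)]
    x = sum(w * v[0] for w, v in zip(weights, vs))
    y = sum(w * v[1] for w, v in zip(weights, vs))
    return unit((x, y))
```

<p>With a unit-center approach, the formation-keeping vector of each robot would point towards its assigned slot relative to the team centroid; here it is simply one of the four input directions.</p>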
    </sec>
    <sec id="sec-5">
      <title>Integration of a virtual reality platform</title>
      <p>
        Given the need to ease the control of the robot from the operator's point
of view, the integration of virtual reality tools has also been investigated. In
particular, the Oculus Rift [
        <xref ref-type="bibr" rid="ref8">8</xref>
        ], a virtual reality head-mounted display, has been
used to give inputs to the robot (or to the whole swarm in simulation)
and to visualize the images taken by the on-board cameras (Fig. 4).
      </p>
      <p>More specifically, the inertial sensors embedded in the head-mounted display
periodically measure the yaw orientation of the user's head, which is used
as the reference for the yaw control of the multirotor. The ROS/ETHNOS bridge
has again been used to implement the bidirectional communication (angles
and video streaming).</p>
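<p>This yaw-following loop can be sketched as a proportional controller on the wrapped angular error between the head orientation and the multirotor's yaw; the gain and function names are illustrative:</p>

```python
import math

def wrap_angle(a):
    """Wrap an angle to (-pi, pi], so that the error takes the
    shortest way around the circle."""
    return math.atan2(math.sin(a), math.cos(a))

def yaw_rate_command(head_yaw, uav_yaw, kp=1.0):
    """Proportional yaw control: the head orientation measured by the
    HMD inertial sensors becomes the yaw reference for the multirotor.
    Returns a yaw-rate command in rad/s (illustrative gain)."""
    return kp * wrap_angle(head_yaw - uav_yaw)
```

<p>Wrapping the error matters near ±π: a head at π − 0.1 and a UAV at −π + 0.1 are only 0.2 rad apart, not almost a full turn.</p>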
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>
          1.
          <string-name>
            <surname>Achtelik</surname>
            ,
            <given-names>M. C.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Doth</surname>
            ,
            <given-names>K. M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Gurdan</surname>
            ,
            <given-names>D.</given-names>
          </string-name>
          , and
          <string-name>
            <surname>Stumpf</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          (
          <year>2012</year>
          ).
          <article-title>Design of a Multi Rotor MAV with regard to Efficiency, Dynamics and Redundancy</article-title>
          . In AIAA Guidance, Navigation, and Control Conference (pp.
          <fpage>1</fpage>
          -
          <lpage>17</lpage>
          ).
        </mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>
          2.
          <string-name>
            <surname>Munoz-Salinas</surname>
            ,
            <given-names>R.</given-names>
          </string-name>
          (
          <year>2012</year>
          ).
          <article-title>ARUCO: a minimal library for Augmented Reality applications based on OpenCv</article-title>
          .
        </mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>
          3.
          <string-name>
            <surname>Hao</surname>
            ,
            <given-names>Y. M.</given-names>
          </string-name>
          , and
          <string-name>
            <surname>Zhu</surname>
            ,
            <given-names>F.</given-names>
          </string-name>
          (
          <year>2005</year>
          ).
          <article-title>Fast Algorithm for Two-dimensional Otsu Adaptive Threshold Algorithm [J]</article-title>
          .
          <source>Journal of Image and Graphics</source>
          ,
          <volume>4</volume>
          ,
          <fpage>014</fpage>
        </mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>
          4.
          <string-name>
            <surname>Piaggio</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Sgorbissa</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          , and
          <string-name>
            <surname>Zaccaria</surname>
            ,
            <given-names>R.</given-names>
          </string-name>
          (
          <year>2000</year>
          ).
          <article-title>A programming environment for real-time control of distributed multiple robotic systems</article-title>
          .
          <source>Advanced Robotics</source>
          ,
          <volume>14</volume>
          (
          <issue>1</issue>
          ),
          <fpage>75</fpage>
          -
          <lpage>86</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>
          5.
          <string-name>
            <surname>Koenig</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Szymanski</surname>
            ,
            <given-names>B.</given-names>
          </string-name>
          , and
          <string-name>
            <surname>Liu</surname>
            ,
            <given-names>Y.</given-names>
          </string-name>
          .
          <article-title>Efficient and inefficient ant coverage methods</article-title>
          .
          <source>Annals of Mathematics and Artificial Intelligence</source>
          <volume>31</volume>
          .
          <fpage>1</fpage>
          -
          <lpage>4</lpage>
          (
          <year>2001</year>
          ):
          <fpage>41</fpage>
          -
          <lpage>76</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>
          6.
          <string-name>
            <surname>Baglietto</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Cannata</surname>
            ,
            <given-names>G.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Capezio</surname>
            ,
            <given-names>F.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Grosso</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Sgorbissa</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          , and
          <string-name>
            <surname>Zaccaria</surname>
            ,
            <given-names>R.</given-names>
          </string-name>
          (
          <year>2008</year>
          , July).
          <article-title>PatrolGRAPH: a distributed algorithm for multi-robot patrolling</article-title>
          .
          <source>In IAS10-The 10th International Conference on Intelligent Autonomous Systems</source>
          , Baden Baden, Germany (pp.
          <fpage>415</fpage>
          -
          <lpage>424</lpage>
          ).
        </mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>
          7.
          <string-name>
            <surname>Balch</surname>
            ,
            <given-names>T.</given-names>
          </string-name>
          , and
          <string-name>
            <surname>Arkin</surname>
            ,
            <given-names>R. C.</given-names>
          </string-name>
          (
          <year>1998</year>
          ).
          <article-title>Behavior-based formation control for multirobot teams</article-title>
          .
          <source>Robotics and Automation</source>
          , IEEE Transactions on,
          <volume>14</volume>
          (
          <issue>6</issue>
          ),
          <fpage>926</fpage>
          -
          <lpage>939</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>
          8.
          <string-name>
            <surname>Oculus</surname>
            ,
            <given-names>V. R.</given-names>
          </string-name>
          (
          <year>2015</year>
          ). Oculus Rift. Available from: http://www.oculusvr.com/rift.
        </mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>