                                   Human-Drone Interactions with
                                   Semi-Autonomous Cohorts of
                                   Collaborating Drones
Jane Cleland-Huang
University of Notre Dame
Notre Dame, IN 46556, USA
JaneHuang@nd.edu

Ankit Agrawal
University of Notre Dame
Notre Dame, IN 46556, USA
aagrawa2@nd.edu

Abstract
Research in human-drone interaction has primarily focused on cases in which a person interacts with a single drone as an active controller, a recipient of information, or a social companion, or on cases in which an individual or a team of operators interacts with a swarm of drones as the swarm performs coordinated flight patterns. In this position paper we explore a third scenario, in which multiple humans and drones collaborate in an emergency response setting. We discuss different types of interactions and draw examples from our current DroneResponse project.

                                                                                     Author Keywords
                                                                                     Human-drone collaboration, emergency response

Introduction
Small Unmanned Aerial Systems, which we refer to here as drones, can be effectively deployed to support emergency responders in diverse scenarios such as search-and-rescue, accident surveillance, and flood inspections. Currently, emergency responders tend to operate drones manually or to use off-the-shelf applications that allow them to preprogram sets of waypoints. However, equipping drones with onboard intelligence allows them to perform tasks autonomously and to contribute more fully to the emergency response.

This paper is published under the Creative Commons Attribution 4.0 International (CC-BY 4.0) license. Authors reserve their rights to disseminate the work on their personal and corporate Web sites with the appropriate attribution.
Interdisciplinary Workshop on Human-Drone Interaction (iHDI 2020), CHI ’20 Extended Abstracts, 26 April 2020, Honolulu, HI, USA.
© Creative Commons CC-BY 4.0 License.
In our DroneResponse project we are designing and developing a system to deploy and coordinate the efforts of multiple semi-autonomous drones in emergency situations [1, 3]. Our vision is for humans and drones to work closely together as part of a complex mission – for example, to monitor air quality following a chemical explosion, to perform search and rescue, to deliver medical supplies, or to support firefighters during structural fires. As depicted in Figure 1, there are several facets to human-drone interaction in such scenarios. Humans need to communicate mission goals and directives to groups of drones as well as to individual drones, while drones need to keep humans informed of their current state and progress and, at times, need to seek permission or guidance to perform specific tasks. In addition, both humans and drones need to communicate among themselves (i.e., drone-to-drone and human-to-human) to coordinate their activities.

Figure 1: Humans and drones interact in many different ways.

An Interaction Example
We provide examples of such interactions in the sequence diagram depicted in Figure 2. The Incident Commander first defines a search area and sends a request to the hive controller to start the search (E1). This is an example of human-to-drone (H2D) interaction. The hive controller then creates a search plan and assigns search routes to drones (E2). The coordination between drones represents drone-to-drone (D2D) interaction. In the modeled sequence of actions, Drone_n detects a potential drowning victim. It then notifies the human incident commander and starts streaming annotated video to the ground (E3), thereby illustrating drone-to-human (D2H) communication. The part of the sequence diagram highlighted in yellow provides an example of a more complex bi-directional human-drone conversation in which the drone uses its sensing (image detection) abilities to detect a victim. It then autonomously switches to track-victim mode, raises a victim-found alert, and streams annotated video to the incident commander. Finally, the incident commander uses the information relayed by the drone to confirm the victim sighting and to push information from the drone to a physical rescue team (E4). This final step is an example of human-to-human (H2H) interaction, triggered by the initial D2H exchange. This sequence of events illustrates the complex socio-technical aspects of emergent multi-user, multi-drone interaction spaces.
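This sequence can also be summarized as a typed message flow. The sketch below mirrors Figure 2 using hypothetical Python class and field names; it is an illustration, not DroneResponse code.

    # Illustrative sketch of the E1-E4 interaction sequence shown in Figure 2,
    # expressed as typed events. Class and field names are hypothetical and
    # are not part of the DroneResponse implementation.
    from dataclasses import dataclass

    @dataclass
    class Interaction:
        event: str     # E1..E4, matching the sequence diagram
        kind: str      # H2D, D2D, D2H, or H2H
        sender: str
        receiver: str
        message: str

    SEQUENCE = [
        Interaction("E1", "H2D", "IncidentCommander", "HiveController",
                    "startSearch(GPS_Coords)"),
        Interaction("E2", "D2D", "HiveController", "Drone_n", "assignRoute()"),
        Interaction("E3", "D2H", "Drone_n", "IncidentCommander",
                    "notify(victim_found) + streamImage()"),
        Interaction("E4", "H2H", "IncidentCommander", "Rescuer",
                    "initiateRescue(GPS_coordinates, imagery)"),
    ]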
Drone-to-Human Communication (D2H)
There are numerous challenges that must be addressed in order to achieve efficient human-drone collaboration. In our concurrently published work [1], we have focused on designing a user interface that enables situational awareness (SA) [5] for human first responders. SA involves perception (i.e., recognizing and monitoring), comprehension (i.e., interpreting and synthesizing information), and projection (i.e., understanding the situation and projecting future outcomes) so that a user can make effective and actionable decisions. The key to designing D2H communication is identifying the information needed by different user roles within specific contexts. As an example, a drone may be ascribed the ability to autonomously decide its speed and altitude during a search. If visibility is good, the drone might fly higher and faster in order to cover the search area more quickly while still returning accurate results. On the other hand, if visibility is low, the drone might need to fly lower and slower, and adapt its flight plan to compensate for a reduced field of view. In this scenario, the operator needs visual cues and awareness of why a drone behaves as it does. As an outcome of a four-month co-design process with our local fire department, we identified two design strategies to address this specific scenario. First, we designed our DroneResponse GUI to depict any environmental factors that are likely to impact drone behavior – for example, low visibility, high winds, or prohibited airspace.
Figure 2: A sequence diagram showing the interactions between humans and drones for the scenario in which a drone searches for, and detects, a potential drowning victim. The diagram spans four lifelines (HiveController, Drone_n, Incident Commander, and Rescuer) and the events startSearch (E1), assignRoute (E2), notify/streamImage (E3), and initiateRescue (E4), followed by confirmTrack, missionCompleted, and RTL (return to launch).

Figure 3: Firefighters in South Bend manually operate drones.



Secondly, we made the drones explain themselves upon demand by describing their current strategies and permissions. In the case of searching for a victim in inclement weather, the drone might explain “flying lower than normal at 10 meters due to low visibility” or “searching river banks at a greater distance than normal due to high wind gusts and moving branches.” We report outcomes from our co-design experience, especially with respect to D2H interactions and achieving situational awareness, in our related paper [1].

Figure 4: Notre Dame researchers and firefighters interact with multiple drones using Dronology [4].
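As an illustration of such on-demand explanations, a drone's self-explanation could be packaged as a structured message that the GUI renders as text. This is a sketch with hypothetical field names, not the DroneResponse message format.

    # Sketch of an on-demand self-explanation message; all field names are
    # hypothetical, not the DroneResponse message format.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class BehaviorExplanation:
        drone_id: str        # which drone is explaining itself
        current_mode: str    # e.g., "search" or "track-victim"
        adaptation: str      # what the drone changed
        reason: str          # environmental factor driving the adaptation
        permissions: List[str] = field(default_factory=list)  # autonomy granted

    def render(e: BehaviorExplanation) -> str:
        """Produce the human-readable explanation displayed in the GUI."""
        return f"{e.drone_id} ({e.current_mode}): {e.adaptation} due to {e.reason}"

    # The inclement-weather example from the text:
    print(render(BehaviorExplanation(
        drone_id="Drone_2", current_mode="search",
        adaptation="flying lower than normal at 10 meters",
        reason="low visibility",
        permissions=["adapt-altitude", "adapt-speed"])))
    # -> Drone_2 (search): flying lower than normal at 10 meters due to low visibility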
Human-to-Drone Mission Directives
Achieving effective H2D communication is challenging in systems with multiple drones and complex missions. Several research groups have explored ways to specify drone missions using formal commands, often embedded in domain-specific languages [7]. However, it is infeasible for emergency responders to write such mission specifications under the stressful time constraints of a life-and-death response. A user interface is therefore needed that enables quick mission planning and configuration and that supports high-level directives addressed to the cohort of drones as well as specific directives addressed to individual drones.

Figure 5: Defibrillator delivery by drone requires a remote UI.

Researchers have previously explored diverse solutions for issuing commands to drones, such as the use of gestures and voice commands [6, 10] or airplane-like cockpits for controlling large military-style drones [8]. We have prototyped the use of gestures and voice commands; however, both have shortcomings that inhibit their use in emergency response scenarios. Voice commands, while appealing, are impractical due to the noise inherent to a rescue scene, including sirens, constant radio chatter, and now the additional noise of drone motors. Gestures are similarly impractical: they have been shown to work effectively in controlled, near-distance environments, which is far from the conditions of an emergency response scenario [2].
Furthermore, gestures introduce significant room for error, which is unacceptable in an emergency response environment, where mistakes could cost the lives of both victims and rescuers. Domain experts collaborating in the co-design of DroneResponse soundly ruled out both of these approaches [1]. We have therefore opted to create a GUI-based solution for H2D commands in emergency response missions.

Figure 6: Mission commands can be addressed to specific groups of drones.

In our GUI, which is currently under development, users initially select a mission type from a high-level list of missions, as depicted in Figure 9. They then perform a series of configurations, such as marking a search area. Each predefined mission type has a corresponding underlying mission plan with configuration points. This plan is sufficient for allowing the mission to proceed through a series of predefined stages and tasks (e.g., search, track, return home), as sketched below. However, users will also need to configure or tweak the mission dynamically as it evolves, by providing additional directives.

Figure 7: High-level commands, such as search, deliver, or relay, are domain specific. These ones target river search and rescue.

Figure 9: A user initially selects a predefined mission type, which includes mission objectives and task specifications.
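For illustration, such a staged plan could be modeled as a small state machine; the stages and events below are assumptions based on the search scenario described earlier, not the actual plan format.

    # Illustrative sketch of a predefined mission plan as a small state
    # machine; the stages and events are assumptions based on the river
    # search-and-rescue scenario, not the actual plan format.
    MISSION_PLAN = {
        "search":      {"victim_found": "track", "area_exhausted": "return_home"},
        "track":       {"rescue_confirmed": "return_home", "track_lost": "search"},
        "return_home": {},  # terminal stage
    }

    def next_stage(current: str, event: str) -> str:
        """Advance the mission plan; unrecognized events leave the stage unchanged."""
        return MISSION_PLAN[current].get(event, current)

    assert next_stage("search", "victim_found") == "track"
    assert next_stage("track", "rescue_confirmed") == "return_home"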
Each of these directives must specify who, what, where, and how a task is to be accomplished. ‘Who’ refers to whether the command is addressed to the entire cohort or to an individual drone. In the case of the cohort, the hive coordination layer is empowered to autonomously determine which drones are best fit to respond. ‘Who’ could also be specified with constraints, such as three drones or drones with thermal cameras onboard. Finally, ‘who’ could address a specific drone if the Incident Commander wished to assign a task to that drone. This is riskier, as the selected drone might be unfit for service (e.g., due to low battery or a current critical service). ‘What’ refers to the specific task to be completed – for example, reconnaissance, delivery, or serving as a communication relay if drones are communicating using onboard communication channels such as ad-hoc WiFi. ‘Where’ refers to a region or point of interest defined by GPS coordinates. For example, in the case of reconnaissance, the user might need to direct the drone to a certain part of a wooded river bank where somebody has sighted a piece of clothing, while in the case of establishing a communication relay, the user could either specify coordinates or allow the drone to dynamically position itself so as to optimize communication between all drones. Finally, ‘how’ enables specific directives for how the task is to be completed. In some cases, the drones could be given significant autonomy to complete well-defined tasks, while in other cases more specific guidelines might be required.

Figure 8: The user can mark an area on the map to define either a region or a point of interest to be targeted by the H2D directive.
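A directive covering these four dimensions could be represented along the following lines; the structure and field names are hypothetical, offered only to make the who/what/where/how decomposition concrete.

    # Hypothetical representation of a directive along the four dimensions;
    # the structure and field names are invented for illustration.
    from dataclasses import dataclass
    from typing import List, Optional, Tuple

    @dataclass
    class Directive:
        who: str      # "cohort", a constraint ("3 drones with thermal
                      # cameras"), or a specific drone id
        what: str     # task: e.g., "reconnaissance", "delivery", "comm-relay"
        where: Optional[List[Tuple[float, float]]]  # GPS region or point of
                      # interest; None lets the drone position itself (e.g., a
                      # relay optimizing connectivity between all drones)
        how: List[str]  # task-specific guidelines; empty implies full autonomy

    riverbank_recon = Directive(
        who="3 drones with thermal cameras",
        what="reconnaissance",
        where=[(41.676, -86.252), (41.678, -86.249)],  # wooded river bank segment
        how=["max altitude 15 m", "prioritize thermal imagery"])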
We are currently working closely with several emergency response organizations to better discover their needs and to formally model diverse mission plans.

The GUI therefore provides a human-facing interface to an underlying mission plan specified using a more formal approach such as belief-desire-intention (BDI) [9]. Drones are able to interpret this more formal specification. In our initial prototype, we are experimenting with a flow chart of buttons that enables humans to configure mission directives within a known space of options. These are depicted in Figures 6-8.

Figure 10: The throttle on the physical user interface was incorrectly positioned during a transition from software-controlled flight to manually operated flight, causing the UAV to crash.
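As a rough sketch of the BDI style referenced above, a drone agent might deliberate over beliefs and desires to commit to an intention; the specific beliefs and plans here are invented for exposition and do not reflect our formal model.

    # A rough, invented illustration of BDI-style deliberation for a search
    # drone; it does not reflect the formal model used in DroneResponse.
    beliefs = {"visibility_m": 250, "battery_pct": 78, "victim_detected": False}
    desires = ["locate_victim", "preserve_battery"]

    def deliberate(beliefs: dict, desires: list) -> str:
        """Commit to a single intention based on current beliefs and desires."""
        if beliefs["battery_pct"] < 20:
            return "return_home"          # safety desire overrides the mission
        if beliefs["victim_detected"]:
            return "track_victim"
        if "locate_victim" in desires:
            return ("search_low_and_slow" if beliefs["visibility_m"] < 300
                    else "search_high_and_fast")
        return "loiter"

    assert deliberate(beliefs, desires) == "search_low_and_slow"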
GUI versus Physical Devices
The discussion in this position paper has focused almost entirely on human-drone interfaces based on the use of graphical interfaces; however, drones can also be controlled using physical hand-held devices. In a multi-drone scenario, humans might need to switch between graphical and physical interfaces for several reasons, including taking manual control of a malfunctioning drone or temporarily using manual controls for a specific task that is currently beyond the capabilities of a drone to perform autonomously. Our prior work has shown that misalignment of GUIs and physical controllers can easily lead to accidents [3] (see Figure 10). For example, when control is passed from a computer to a handheld device, the physical switches on the hand-held controller must be set to stable ‘flight-mode’ positions; otherwise accidents, including crash landings, could occur.

Furthermore, when humans take over control of remote drones, it can be exceedingly difficult to determine which direction the drone is facing. Commands are interpreted relative to the drone’s position and orientation, which means that issuing a ‘forward’ command would cause the drone to fly forward; without clear orientation from the remote pilot’s perspective, that could actually be any direction. A simple design solution might be to provide a feature that autonomously reorients the drone with respect to the remote pilot so that physical and GUI controls become aligned relative to the drone’s and pilot’s positions. Given this reorientation, a ‘forward’ command would then consistently send the drone away from the pilot, and a ‘moveRight’ command would make it move to the right. For deployment in emergency situations, more thought should be invested in the use of both graphical and physical interfaces, the interactions between them, and transitions of control across devices and between different operators.
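One plausible realization of this reorientation, sketched below under assumed conventions (compass bearings in degrees, hypothetical function names), is to rotate pilot-relative stick inputs into the drone's body frame using the difference between the pilot-to-drone bearing and the drone's heading.

    # Sketch of the proposed realignment: stick inputs are interpreted
    # relative to the pilot (forward = away from the pilot) and rotated into
    # the drone's body frame. Names and conventions are assumptions.
    import math

    def pilot_to_body_frame(forward: float, right: float,
                            pilot_to_drone_bearing_deg: float,
                            drone_heading_deg: float) -> tuple:
        """Rotate a pilot-relative command into the drone's body frame."""
        delta = math.radians(pilot_to_drone_bearing_deg - drone_heading_deg)
        body_forward = forward * math.cos(delta) - right * math.sin(delta)
        body_right = forward * math.sin(delta) + right * math.cos(delta)
        return body_forward, body_right

    # The drone faces west (heading 270) but sits due north of the pilot
    # (bearing 0): a pilot 'forward' command becomes a body-frame 'right'
    # command, still moving the drone away from the pilot.
    bf, br = pilot_to_body_frame(1.0, 0.0, 0.0, 270.0)
    assert abs(bf) < 1e-9 and abs(br - 1.0) < 1e-9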
Conclusion
This position paper has presented an informal framework for considering human-drone interactions along the dimensions of H2D, D2H, D2D, and H2H communication in multi-user, multi-drone environments where drones are permitted to operate with some degree of autonomy. We have described some of the challenges we are facing in the design of DroneResponse and some initial ideas for addressing those challenges. Our prior work [1] has focused primarily on the D2H challenge of supporting situational awareness, while our ongoing work focuses on providing a meaningful interface for more complex bidirectional human and drone interactions.

Acknowledgements
The work described in this position paper has primarily been funded by the National Science Foundation under grants CNS-1737496, CNS-1931962, CCF-1647342, and CCF-1741781. We also thank the firefighters of South Bend for closely collaborating on the DroneResponse project.

REFERENCES
[1] A. Agrawal, S. Abraham, B. Burger, C. Christine, L. Fraser, J. Hoeksema, S. Hwang, E. Travnik, S. Kumar, W. Scheirer, J. Cleland-Huang, M. Vierhauser, R. Bauer, and S. Cox. 2020. The Next Generation of Human-Drone Partnerships: Co-Designing an Emergency Response System. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems (CHI ’20), Honolulu, HI, USA. DOI: http://dx.doi.org/10.1145/3313831.3376825

[2] Jessica R. Cauchard, Jane L. E, Kevin Y. Zhai, and James A. Landay. 2015. Drone & Me: An Exploration into Natural Human-Drone Interaction. In Proceedings of the 2015 ACM International Joint Conference on Pervasive and Ubiquitous Computing (UbiComp ’15). Association for Computing Machinery, New York, NY, USA, 361–365. DOI: http://dx.doi.org/10.1145/2750858.2805823

[3] Jane Cleland-Huang and Michael Vierhauser. 2018. Discovering, Analyzing, and Managing Safety Stories in Agile Projects. In 26th IEEE International Requirements Engineering Conference (RE 2018), Banff, AB, Canada, August 20-24, 2018. 262–273. DOI: http://dx.doi.org/10.1109/RE.2018.00034

[4] Jane Cleland-Huang, Michael Vierhauser, and Sean Bayley. 2018. Dronology: An Incubator for Cyber-Physical Systems Research. In Proceedings of the 40th International Conference on Software Engineering: New Ideas and Emerging Results (ICSE-NIER 2018), Gothenburg, Sweden, May 27 - June 3, 2018. 109–112. DOI: http://dx.doi.org/10.1145/3183399.3183408

[5] Mica R. Endsley. 2011. Designing for Situation Awareness: An Approach to User-Centered Design, Second Edition (2nd ed.). CRC Press, Inc., Boca Raton, FL, USA.

[6] Markus Funk. 2018. Human-Drone Interaction: Let’s Get Ready for Flying User Interfaces! Interactions 25, 3 (2018), 78–81. DOI: http://dx.doi.org/10.1145/3194317

[7] Sergio García, Patrizio Pelliccione, Claudio Menghi, Thorsten Berger, and Tomas Bures. 2019. High-Level Mission Specification for Multiple Robots. In Proceedings of the 12th ACM SIGPLAN International Conference on Software Language Engineering (SLE 2019). Association for Computing Machinery, New York, NY, USA, 127–140. DOI: http://dx.doi.org/10.1145/3357766.3359535

[8] Alan Hobbs and B. Lyall. 2016. Human Factors Guidelines for Unmanned Aircraft Systems. Ergonomics in Design: The Quarterly of Human Factors Applications 24 (04 2016). DOI: http://dx.doi.org/10.1177/1064804616640632

[9] Anand S. Rao and Michael P. Georgeff. 1995. BDI Agents: From Theory to Practice. In Proceedings of the First International Conference on Multiagent Systems (ICMAS ’95), June 12-14, 1995, San Francisco, California, USA. 312–319.

[10] D. Tezza and M. Andujar. 2019. The State-of-the-Art of Human-Drone Interaction: A Survey. IEEE Access 7 (2019), 167438–167454. DOI: http://dx.doi.org/10.1109/ACCESS.2019.2953900