=Paper= {{Paper |id=Vol-2617/paper3 |storemode=property |title=ProxyDrone: Autonomous Drone Landing on the Human Body |pdfUrl=https://ceur-ws.org/Vol-2617/paper3.pdf |volume=Vol-2617 |authors=Jonas Auda,Martin Weigel,Jessica Cauchard,Stefan Schneegass |dblpUrl=https://dblp.org/rec/conf/chi/AudaWCS20 }} ==ProxyDrone: Autonomous Drone Landing on the Human Body== https://ceur-ws.org/Vol-2617/paper3.pdf
                                   ProxyDrone: Autonomous Drone
                                   Landing on the Human Body

Jonas Auda
paluno, University of Duisburg-Essen
jonas.auda@uni-due.de

Martin Weigel
Honda Research Institute Europe, Offenbach, Germany
martin.weigel@honda-ri.de

Jessica Cauchard
Ben-Gurion University of the Negev
jcauchard@bgu.ac.il

Stefan Schneegass
paluno, University of Duisburg-Essen
stefan.schneegass@uni-due.de

Abstract
Launching drones often requires several steps that the operator needs to complete. Yet, in many scenarios, such as search and rescue, saving time is crucial. For instance, rescue personnel might be occupied with safety-critical tasks while needing to operate drones to get an overview of the environment. We propose the concept of a drone that is located on the human body (e.g., on the back). The drone can take off and land without human intervention. We plan to build a working prototype and investigate drone maneuvers that are suitable for both take-off and landing operations on the human body. We will further investigate the operator's perception and extract task-related design factors. This work will help derive guidelines for implicit human-drone interaction at close proximity.

                                                                                     Author Keywords
                                                                                     Human-Drone Interaction; On-Body; Drones.

CCS Concepts
•Human-centered computing → Human computer interaction (HCI); Haptic devices;

This paper is published under the Creative Commons Attribution 4.0 International (CC-BY 4.0) license. Authors reserve their rights to disseminate the work on their personal and corporate Web sites with the appropriate attribution.
Interdisciplinary Workshop on Human-Drone Interaction (iHDI 2020)
CHI ’20 Extended Abstracts, 26 April 2020, Honolulu, HI, US
© Creative Commons CC-BY 4.0 License.

Introduction
Drones will likely become ubiquitous companions for humans in the near future. They can be used for a range of applications, such as video production and photography, to
guide visually impaired people [4], support artistic performances [11], body movement [15, 16] or sports education [24], and even support search and rescue missions [18]. We expect that this growing range of applications will increase interactions between drones and humans [12]. In addition, drones are now being used as flying interfaces [6, 13] and can serve as haptic proxies to enhance Virtual Reality (VR) experiences [2, 14]. Previous research has investigated different aspects of human-drone interaction, such as input using drones [3, 7], expressive drone flight behavior [9], and the orchestration of drones [17].

The use cases for drones are versatile, yet we find that the interaction is often centered around the human body. This topic was explored in prior work designing a drone user interface projected around the user's body [8]. However, we note that the close proximity of the drone to the user has not yet been investigated in the literature. In prior robotics work, researchers investigated the use of robots on a user's body [10], which inspired our work.

We find examples of body-worn drones such as the Nixie, which is used for photography [1]. This wrist-worn drone can be used on demand to take selfies of its operator. Similar to the Nixie drone, we propose to develop a drone that can land on and take off from the human body. Being in close proximity to the user opens new doors to human-drone interaction through multiple modalities, from vibrations to pulling or bumping into the user.

To enable this close interaction, we explore how a drone can land on and take off from a person's body. We envision several scenarios that would benefit from such functionality. For example, drone operators can seamlessly start and stop operating a drone that might be attached to the back of the operator. In such a situation, rescue personnel can use drones while being engaged in safety-critical tasks.

Such a drone requires two major considerations: 1. the technical capability for the drone to attach and detach itself to the human body, and 2. the user's acceptability of close body interaction. We indeed envision that the design of the drone and its position with regard to the person will affect its acceptability. Prior work highlighted many design factors that influence the perception of a drone user [23, 5], including that current safety mechanisms are perceived negatively in terms of trust.

As such, we plan to prototype different form factors and technical solutions suited for taking off from and landing on the human body. This includes the design and development of bespoke drones with diverse mechanisms (e.g., electric magnets, hooks, and textile solutions) for the drone to attach itself to its operator. We then propose to develop a framework for close human-drone interaction that will enable us to research and identify suitable flight behavior and design factors to accomplish automated landing and take-off procedures on the human body.

Design Space of Body-Worn Drones
In the following, we propose a design space for body-worn drones and discuss each of the identified dimensions.

Body Location
We want to investigate which parts of the body are suitable to serve as a spot for drones to land and take off. This can influence the design and size of the drone, as well as its flight behavior (i.e., take-off and landing procedures). An important question to be considered is how the user perceives the drone while it is approaching various body parts. We consider the following body locations:

Back. The back might offer plenty of space for a drone to land. Larger drones might possibly be deployed on the back of a user. Further, functional garments might provide anchoring on the back. The back might be suitable for drones that take off and land while the operator is engaged in a task. We expect that the posture of the user will be of some importance. For example, when the drone is hanging vertically from the back (e.g., like a fly on a wall), it must carry out a specific maneuver to stabilize itself in the air when it takes off. Such a procedure must be done safely to protect the user.

Shoulder. Like a parrot, a drone could rest on the shoulder of a user. Smaller, lightweight drones might be suitable in this case. Safety measures and specific maneuvers should be investigated due to the proximity to the user's face.

Head. Drones that are deployed on the head might have a special design. The proximity to the face will influence both design and maneuvering operations. Drones in close proximity to the head require a small-size and lightweight design, so that the drone does not get in the way of the human body's sensory systems. Helmets might serve as a ramp for the drone to land and take off.

Arm. Like a falcon, a drone could land on the arm [19, 20]. The falcon metaphor implies certain behaviors, such as flying to a location and coming back to the user. We imagine the user could hold up their arm to indicate to the drone that it can take off or land. On the one hand, triggering such interactions might become intuitive to the user and require little cognitive load. On the other hand, take-off and landing sequences might be difficult if the user's hands are busy (e.g., carrying a device or performing a task). Small and medium-sized drones might be suitable for this body part.

Body Adhesion Method
Since the drone should remain on the human body after landing, we will investigate materials and techniques to attach drones to the body. We are considering different hardware solutions, from electric magnets to velcro tape. Specially designed clothing might provide docking capabilities for easier landing and take-off, although we prefer ad-hoc solutions that do not require the user to wear specific equipment. We will investigate how a drone can rest on a person's body without falling off while he/she is moving. We propose that the drone may use its own force to stabilize itself on the human body.

Level of Automation
Triggering take-off and landing sequences can be done in various ways. It can be automated with no hands used or triggered explicitly by the user (e.g., the drone can be grabbed and put into place). The drone might detect gestures, speech commands, or context to initiate take-off and landing. It will therefore be important to communicate the intent of the drone to the user and vice versa. If a drone approaches the user to land, the user should understand the next steps of the drone's landing process. This can be achieved by wearing smart glasses that display the flight plan or even Augmented Reality (AR) to visualize the planned trajectory of the drone. We expect that lights might be used to communicate intent [22], as planes do. Also, the user might intervene with an autonomously operating drone. Therefore, the drone should provide an intervention interface [21]. Implicit and explicit interactions might vary depending on the use case. However, commands triggered by false positives can lead to dangerous situations. It is therefore very important to use appropriate context-aware controls and triggering mechanisms that can adapt to the situation of the operator (e.g., occupied hands).

Drone Shape and Function
The size of a drone will most likely determine its use cases. A small drone with a camera can be used for scouting and overview, while a larger drone can enable physical interaction with the user and other objects (e.g., carrying a payload). These factors will influence the drone design and determine the interaction space.

Application Scenarios
We outline three different application scenarios in which we envision close-to-body drones to be applicable.

Search and Rescue
Rescue personnel can benefit from drones that take off automatically while they are engaged in primary tasks. The drone could be used as a scout for planning a mission, while critical tasks can be fulfilled without the interruption currently required to start operating the drone. For example, in firefighting missions, the firefighters have to pay attention to their environment and protect their own lives while trying to rescue survivors. A drone could be of great help to sense the surrounding environment, but should do so without interrupting the firefighters or adding to their cognitive load.

On-demand 3rd Eye
Climbers might need an overview of their surroundings while being suspended at great heights. For example, they may want to check for changing weather conditions or map out their climbing path. Getting a drone to take off from one's hand or from the ground while climbing might be complicated, if not dangerous or impossible. We propose that a drone, attached to the back of a climber, could take off and gather information before landing back on its operator. The take-off or landing could be performed without requiring the use of the climber's hands. In addition, the drone could directly support the climber, such as by lifting and securing a carabiner. Such scenarios would increase the safety of the climber, especially when the climber is exhausted or cannot reach the next spot to secure him/herself.

Personal Assistant
Close proximity to the user enables more intimate relationships between drones and humans. We expect such drones will be understood like a pet sitting on its owner's shoulder, rather than as a piece of technology. As such, we envision that the drone could become a personal assistant. The drone can use the operator's body as a base station (e.g., when charging) and take off to perform off-body tasks, such as taking a photo (as in [1]), navigating the user to a destination, and transporting small objects. This exceeds the capabilities of today's body-worn robots [10].

Research Plan
We plan the following steps to build and evaluate our prototype and to extract guidelines for close-proximity human-drone interaction. In the initial step, we will gather literature on drones and on-body interaction to derive a suitable concept. Afterwards, we will implement the system (i.e., the drone and the control application), so that the drone can be directed towards its target automatically. Once in proximity to the target, it should initiate a suitable landing maneuver and attach itself to a person. Once attached, the drone should be able to identify opportune moments to take off based on its role. After the implementation phase, we will evaluate our prototype in a user study and derive guidelines from the study results.

Conclusion
We proposed to investigate close-proximity drones that can land on and take off from the human body. First, we identified requirements to span an initial design space. We then discussed various aspects that must be considered for body-worn drones, including body location, level of automation, and drone shape and function. Finally, we introduced application scenarios and presented a research plan.
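Whether a resting drone stays put on a tilted body surface, or must press itself on with its own thrust, can be estimated with a simple static-friction model. The sketch below is a back-of-the-envelope illustration; the mass, tilt angles, and friction coefficient are assumed example values, not measurements:

```python
import math

def holding_thrust(mass_kg: float, tilt_deg: float, mu: float, g: float = 9.81) -> float:
    """Extra hold-down thrust (N) a resting drone must press into the
    body surface so that static friction cancels the downhill force.

    Returns 0 when friction from the drone's own weight already holds it.
    """
    theta = math.radians(tilt_deg)
    downhill = mass_kg * g * math.sin(theta)       # force pulling the drone off
    friction = mu * mass_kg * g * math.cos(theta)  # friction from weight alone
    if friction >= downhill:
        return 0.0
    # Each newton of added normal force buys mu newtons of friction.
    return (downhill - friction) / mu

# A 0.5 kg drone on a slightly hunched back (30 degrees, mu = 0.6) holds on
# by friction alone; hanging on a vertical back (90 degrees) it must
# actively press itself against the body.
print(holding_thrust(0.5, 30, 0.6))  # 0.0
print(holding_thrust(0.5, 90, 0.6))
```

Such an estimate also suggests why the "fly on a wall" posture is the demanding case: at 90 degrees, gravity contributes no friction at all, and the full holding force must come from the drone or the adhesion mechanism.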
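One way to make such triggers context aware is to gate every detected command by recognizer confidence and by the operator's current situation. The sketch below illustrates the idea; the class, field, and threshold values are hypothetical assumptions, not an existing API:

```python
from dataclasses import dataclass

@dataclass
class OperatorContext:
    hands_free: bool        # e.g., inferred from wearable sensing
    safety_critical: bool   # e.g., operator is mid-rescue-task

def should_take_off(command: str, confidence: float, ctx: OperatorContext) -> bool:
    """Gate a detected take-off command to avoid false-positive launches.

    Commands recognized while the operator's hands are free pass at a
    moderate confidence threshold; with occupied hands the bar is raised,
    and during a hands-busy safety-critical task all triggers are ignored,
    since a surprise launch there would be dangerous.
    """
    if command != "take_off":
        return False
    if ctx.safety_critical and not ctx.hands_free:
        return False  # never surprise an occupied operator
    threshold = 0.8 if ctx.hands_free else 0.95
    return confidence >= threshold

# An operator with occupied hands: only a very confident trigger launches.
ctx = OperatorContext(hands_free=False, safety_critical=False)
print(should_take_off("take_off", 0.90, ctx))  # False
print(should_take_off("take_off", 0.97, ctx))  # True
```

The same gating structure could be applied symmetrically to landing triggers, with the thresholds tuned per use case.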
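The automated approach-and-land step of the research plan can be outlined as a simple control loop: steer towards the target until within a proximity threshold, then hand over to the landing maneuver. The following is a minimal simulated sketch; the step size, landing radius, and state name are placeholder assumptions, not part of an implemented system:

```python
def approach_and_land(drone_pos, target_pos, step=0.5, land_radius=1.0):
    """Move the drone towards the target in fixed-length steps (a stand-in
    for a real position controller); once within land_radius, report that
    the landing maneuver should begin. Positions are (x, y, z) in metres.
    """
    path = [tuple(drone_pos)]
    pos = list(drone_pos)
    while True:
        delta = [t - p for p, t in zip(pos, target_pos)]
        dist = sum(d * d for d in delta) ** 0.5
        if dist <= land_radius:
            return path, "initiate_landing_maneuver"
        scale = min(step, dist) / dist  # never overshoot the target
        pos = [p + d * scale for p, d in zip(pos, delta)]
        path.append(tuple(pos))

# Drone starts 2 m up, person stands 4 m away at shoulder height.
path, state = approach_and_land((0.0, 0.0, 2.0), (4.0, 0.0, 1.5))
print(state, len(path))
```

In a real prototype the proximity check would come from onboard localization, and the returned state would trigger the body-specific landing maneuver and attachment mechanism discussed above.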
REFERENCES
[1] Nixie - Selfie-Drone. https://time.com/3559818/meet-nixie-the-selfie-drone-you-wear-on-your-wrist/. (Accessed: 2020-02-18).
[2] Parastoo Abtahi, Benoit Landry, Jackie (Junrui) Yang, Marco Pavone, Sean Follmer, and James A. Landay. 2019. Beyond The Force: Using Quadcopters to Appropriate Objects and the Environment for Haptics in Virtual Reality. In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems (CHI ’19). Association for Computing Machinery, New York, NY, USA, Article 359, 13 pages. DOI: http://dx.doi.org/10.1145/3290605.3300589
[3] Parastoo Abtahi, David Y. Zhao, Jane L. E., and James A. Landay. 2017. Drone Near Me: Exploring Touch-Based Human-Drone Interaction. Proc. ACM Interact. Mob. Wearable Ubiquitous Technol. 1, 3, Article 34 (Sept. 2017), 8 pages. DOI: http://dx.doi.org/10.1145/3130899
[4] Mauro Avila, Markus Funk, and Niels Henze. 2015. DroneNavigator: Using Drones for Navigating Visually Impaired Persons. In Proceedings of the 17th International ACM SIGACCESS Conference on Computers & Accessibility (ASSETS ’15). Association for Computing Machinery, New York, NY, USA, 327–328. DOI: http://dx.doi.org/10.1145/2700648.2811362
[5] Mehmet Aydin Baytas, Damla Çay, Yuchong Zhang, Mohammad Obaid, Asim Evren Yantaç, and Morten Fjeld. 2019. The Design of Social Drones: A Review of Studies on Autonomous Flyers in Inhabited Environments. In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems (CHI ’19). Association for Computing Machinery, New York, NY, USA, Article 250, 13 pages. DOI: http://dx.doi.org/10.1145/3290605.3300480
[6] Sean Braley, Calvin Rubens, Timothy Merritt, and Roel Vertegaal. 2018. GridDrones: A Self-Levitating Physical Voxel Lattice for Interactive 3D Surface Deformations. In Proceedings of the 31st Annual ACM Symposium on User Interface Software and Technology (UIST ’18). Association for Computing Machinery, New York, NY, USA, 87–98. DOI: http://dx.doi.org/10.1145/3242587.3242658
[7] Jessica R. Cauchard, Jane L. E, Kevin Y. Zhai, and James A. Landay. 2015. Drone & Me: An Exploration into Natural Human-Drone Interaction. In Proceedings of the 2015 ACM International Joint Conference on Pervasive and Ubiquitous Computing (UbiComp ’15). Association for Computing Machinery, New York, NY, USA, 361–365. DOI: http://dx.doi.org/10.1145/2750858.2805823
[8] Jessica R. Cauchard, Alex Tamkin, Cheng Yao Wang, Luke Vink, Michelle Park, Tommy Fang, and James A. Landay. 2019. Drone.io: A Gestural and Visual Interface for Human-Drone Interaction. In 2019 14th ACM/IEEE International Conference on Human-Robot Interaction (HRI). 153–162. DOI: http://dx.doi.org/10.1109/HRI.2019.8673011
[9] Jessica R. Cauchard, Kevin Y. Zhai, Marco Spadafora, and James A. Landay. 2016. Emotion Encoding in Human-Drone Interaction. In The Eleventh ACM/IEEE International Conference on Human Robot Interaction (HRI ’16). IEEE Press, 263–270.
[10] Artem Dementyev, Hsin-Liu (Cindy) Kao, Inrak Choi, Deborah Ajilo, Maggie Xu, Joseph A. Paradiso, Chris Schmandt, and Sean Follmer. 2016. Rovables: Miniature On-Body Robots as Mobile Wearables. In Proceedings of the 29th Annual Symposium on User Interface Software and Technology (UIST ’16). Association for Computing Machinery, New York, NY, USA, 111–120. DOI: http://dx.doi.org/10.1145/2984511.2984531
[11] Sara Eriksson, Åsa Unander-Scharin, Vincent Trichon, Carl Unander-Scharin, Hedvig Kjellström, and Kristina Höök. 2019. Dancing With Drones: Crafting Novel Artistic Expressions Through Intercorporeality. In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems (CHI ’19). Association for Computing Machinery, New York, NY, USA, Article 617, 12 pages. DOI: http://dx.doi.org/10.1145/3290605.3300847
[12] Markus Funk. 2018. Human-Drone Interaction: Let’s Get Ready for Flying User Interfaces! Interactions 25, 3 (April 2018), 78–81. DOI: http://dx.doi.org/10.1145/3194317
[13] Antonio Gomes, Calvin Rubens, Sean Braley, and Roel Vertegaal. 2016. BitDrones: Towards Using 3D Nanocopter Displays as Interactive Self-Levitating Programmable Matter. In Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems (CHI ’16). Association for Computing Machinery, New York, NY, USA, 770–780. DOI: http://dx.doi.org/10.1145/2858036.2858519
[14] Pascal Knierim, Thomas Kosch, Valentin Schwind, Markus Funk, Francisco Kiss, Stefan Schneegass, and Niels Henze. 2017. Tactile Drones - Providing Immersive Tactile Feedback in Virtual Reality through Quadcopters. In Proceedings of the 2017 CHI Conference Extended Abstracts on Human Factors in Computing Systems (CHI EA ’17). Association for Computing Machinery, New York, NY, USA, 433–436. DOI: http://dx.doi.org/10.1145/3027063.3050426
[15] Joseph La Delfa, Mehmet Aydın Baytas, Rakesh Patibanda, Hazel Ngari, and Rohit Ashok Khot. 2020. Drone Chi: Somaesthetic Human-Drone Interaction. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems (CHI ’20). Association for Computing Machinery, New York, NY, USA.
[16] Joseph La Delfa, Mehmet Aydin Baytas, Olivia Wichtowski, Rohit Ashok Khot, and Florian Floyd Mueller. 2019. Are Drones Meditative?. In Extended Abstracts of the 2019 CHI Conference on Human Factors in Computing Systems (CHI EA ’19). Association for Computing Machinery, New York, NY, USA, 4. DOI: http://dx.doi.org/10.1145/3290607.3313274
[17] Matthias Hoppe, Yannick Weiß, Marinus Burger, Thomas Kosch, and Albrecht Schmidt. 2020. Do not Drone Yourself in Work: A Framework to Program Drone Flight Paths. In 2nd International Workshop on Human-Drone Interaction. Hawaii, United States.
[18] Sven Mayer, Lars Lischke, and Paweł W. Wozniak. 2019. Drones for Search and Rescue. In International Workshop on Human-Drone Interaction, CHI ’19 Extended Abstracts (iHDI ’19). Glasgow, Scotland, UK, 6. https://hal.archives-ouvertes.fr/hal-02128385
[19] Wai Shan Ng and Ehud Sharlin. 2011. Collocated interaction with flying robots. In 2011 RO-MAN. 143–149. DOI: http://dx.doi.org/10.1109/ROMAN.2011.6005280
[20] Beat Rossmy and Kai Holländer. 2019. Lure the Drones - Falconry Inspired HDI. In 1st International Workshop on Human-Drone Interaction. Ecole Nationale de l’Aviation Civile [ENAC], Glasgow, United Kingdom. https://hal.archives-ouvertes.fr/hal-02128393
[21] Albrecht Schmidt and Thomas Herrmann. 2017. Intervention User Interfaces: A New Interaction Paradigm for Automated Systems. Interactions 24, 5 (Aug. 2017), 40–45. DOI: http://dx.doi.org/10.1145/3121357
[22] Daniel Szafir, Bilge Mutlu, and Terry Fong. 2015. Communicating Directionality in Flying Robots. In Proceedings of the Tenth Annual ACM/IEEE International Conference on Human-Robot Interaction (HRI ’15). Association for Computing Machinery, New York, NY, USA, 19–26. DOI: http://dx.doi.org/10.1145/2696454.2696475
[23] Anna Wojciechowska, Jeremy Frey, Esther Mandelblum, Yair Amichai-Hamburger, and Jessica R. Cauchard. 2019. Designing Drones: Factors and Characteristics Influencing the Perception of Flying Robots. Proc. ACM Interact. Mob. Wearable Ubiquitous Technol. 3, 3, Article 111 (Sept. 2019), 19 pages. DOI: http://dx.doi.org/10.1145/3351269
[24] Sergej G. Zwaan and Emilia I. Barakova. 2016. Boxing against Drones: Drones in Sports Education. In Proceedings of the 15th International Conference on Interaction Design and Children (IDC ’16). Association for Computing Machinery, New York, NY, USA, 607–612. DOI: http://dx.doi.org/10.1145/2930674.2935991