<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta>
      <journal-title-group>
        <journal-title>Interdisciplinary Workshop on Human-Drone Interaction (iHDI 2020), CHI ’20 Extended Abstracts</journal-title>
      </journal-title-group>
    </journal-meta>
    <article-meta>
      <title-group>
        <article-title>ProxyDrone: Autonomous Drone Landing on the Human Body</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Jonas Auda</string-name>
          <email>jonas.auda@uni-due.de</email>
          <xref ref-type="aff" rid="aff2">2</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Jessica Cauchard</string-name>
          <email>jcauchard@bgu.ac.il</email>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Martin Weigel</string-name>
          <email>martin.weigel@honda-ri.de</email>
          <xref ref-type="aff" rid="aff1">1</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Stefan Schneegass</string-name>
          <email>stefan.schneegass@uni-due.de</email>
          <xref ref-type="aff" rid="aff2">2</xref>
        </contrib>
        <aff id="aff0">
          <label>0</label>
          <institution>Ben-Gurion University of the Negev</institution>
        </aff>
        <aff id="aff1">
          <label>1</label>
          <institution>Honda Research Institute Europe</institution>
          ,
          <addr-line>Offenbach</addr-line>
          ,
          <country country="DE">Germany</country>
        </aff>
        <aff id="aff2">
          <label>2</label>
          <institution>paluno, University of Duisburg-Essen</institution>
        </aff>
        <aff id="aff3">
          <label>3</label>
          <institution>CCS Concepts: Human-centered computing → Haptic devices; Human-centered computing → Human computer interaction (HCI)</institution>
        </aff>
      </contrib-group>
      <pub-date>
        <year>2020</year>
      </pub-date>
      <volume>14</volume>
      <fpage>87</fpage>
      <lpage>98</lpage>
      <abstract>
        <p>Launching drones often requires several steps that the operator needs to complete. Yet, in many scenarios, such as search and rescue, saving time is crucial. For instance, rescue personnel might be occupied with safety-critical tasks while needing to operate drones to get an overview of the environment. We propose the concept of a drone that is located on the human body (e.g., on the back). The drone can take off and land without human intervention. We plan to build a working prototype and investigate drone maneuvers that are suitable for both take-off and landing operations on the human body. We will further investigate the operator's perception and extract task-related design factors. This work will help derive guidelines for implicit human-drone interaction at close proximity.</p>
      </abstract>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>Copyright Notice</title>
      <p>This paper is published under the Creative Commons Attribution 4.0 International
(CC-BY 4.0) license. Authors reserve their rights to disseminate the work on their
personal and corporate Web sites with the appropriate attribution.
Interdisciplinary Workshop on Human-Drone Interaction (iHDI 2020)
CHI ’20 Extended Abstracts, 26 April 2020, Honolulu, HI, US
© Creative Commons CC-BY 4.0 License.</p>
    </sec>
    <sec id="sec-2">
      <title>Introduction</title>
      <p>
        Drones will likely become ubiquitous companions for
humans in the near future. They can be used for a wide range of
applications, such as video production and photography,
guiding visually impaired people [
        <xref ref-type="bibr" rid="ref2">4</xref>
        ], supporting artistic
performances [11], body movement [15, 16], or sports education
[
        <xref ref-type="bibr" rid="ref10">24</xref>
        ], and even search and rescue missions [
        <xref ref-type="bibr" rid="ref4">18</xref>
        ].
We expect that this growing range of applications will
increase interactions between drones and humans [12]. In
addition, drones are now being used as flying interfaces [6,
13] and can serve as haptic proxies to enhance Virtual
Reality (VR) experiences [2, 14]. Previous research has
investigated different aspects of human-drone interaction,
such as input using drones [
        <xref ref-type="bibr" rid="ref1">3, 7</xref>
        ], expressive drone flight
behavior [9], and the orchestration of drones [17].
      </p>
      <p>The use cases for drones are versatile, yet we find that the
interaction is often centered around the human body. This topic
was explored in prior work that designed a drone user interface
projected around the user’s body [8]. However, we note that the
close proximity of the drone to the user has not yet been
investigated in the literature. In prior robotics work, researchers
investigated the use of robots on a user’s body [10], which
inspired our work.</p>
      <p>There are already examples of body-worn drones, such as
the Nixie, a wrist-worn drone that can be used on demand to
take selfies of its operator [1]. Similar to the Nixie, we propose
to develop a drone that can land on and take off from the
human body. Being in close proximity to the user opens new
doors to human-drone interaction through multiple modalities,
from vibrations to pulling or bumping into the user.</p>
      <p>
        To enable this close interaction, we explore how a drone
can land on and take off from a person’s body. We envision
several scenarios that would benefit from such functionality.
For example, drone operators could seamlessly start and
stop operating a drone that is attached to their back. In such
a situation, rescue personnel can use drones while being
engaged in safety-critical tasks.
Such a drone requires two major considerations: (1) the
technical capability for the drone to attach and detach
itself to the human body, and (2) the user’s acceptance of
close body interaction. We envision that the design
of the drone and its position with regard to the person will
affect its acceptability. Prior work highlighted many design
factors that influence how a user perceives a drone [
        <xref ref-type="bibr" rid="ref3 ref9">5,
23</xref>
        ], including that current safety mechanisms are perceived
negatively in terms of trust.
      </p>
      <p>As such, we plan to prototype different form factors and
technical solutions suited for taking off from and landing on
the human body. That includes the design and development
of bespoke drones with diverse attachment mechanisms (e.g.,
electromagnets, hooks, and textile solutions) that allow the
drone to attach itself to its operator. We then propose to
develop a framework for close human-drone interaction that
will enable us to research and identify suitable flight behaviors
and design factors to accomplish automated landing and
take-off procedures on the human body.</p>
    </sec>
    <sec id="sec-3">
      <title>Design Space of Body Worn Drones</title>
      <p>In the following, we propose a design space for body-worn
drones and discuss each of the identified dimensions.</p>
      <sec id="sec-3-1">
        <title>Body Location</title>
        <p>We want to investigate which parts of the body are suitable
to serve as a spot for drones to land on and take off from. This
can influence the design and size of the drone, as well as its
flight behavior (i.e., take-off and landing procedures). An
important question to be considered is how the user
perceives the drone while it is approaching various body parts.
We consider the following body locations:</p>
        <p>Back. The back might offer plenty of space for a drone to
land. Even larger drones might be deployed on the back
of a user, and functional garments might provide anchoring
there. The back might be suitable for drones that take off and
land while the operator is engaged in a task. We expect that
the posture of the user will play an important role. For
example, when the drone is hanging vertically from the back
(e.g., like a fly on a wall), it must carry out a specific maneuver
to stabilize itself in the air when it takes off. Such a procedure
must be done safely to protect the user.</p>
        <p>Shoulder. Like a parrot, a drone could rest on the shoulder
of a user. Smaller, lightweight drones might be suitable in this
case. Safety measures and specific maneuvers should be
investigated due to the proximity to the user’s face.</p>
        <p>Head. Drones that are deployed on the head might require
a special design. The proximity to the face will influence both
design and maneuvering operations. Drones in close proximity
to the head require a small and lightweight design, so that the
drone does not get in the way of the body’s sensory systems.
Helmets might serve as a ramp for the drone to land on and
take off from.</p>
        <p>
          Arm. Like a falcon, a drone could land on the arm [
          <xref ref-type="bibr" rid="ref5 ref6">19, 20</xref>
          ].
The falcon metaphor implies certain behaviors, such as flying
to a location and coming back to the user. We imagine that
the user could hold up their arm to indicate to the drone that
it can take off or land. On the one hand, triggering such
interactions might become intuitive to the user and require
little cognitive load. On the other hand, take-off and landing
sequences might be difficult if the user’s hands are busy
(e.g., carrying a device or performing a task). Small and
medium-sized drones might be suitable for this body part.
        </p>
      </sec>
      <sec id="sec-3-2">
        <title>Body Adhesion Method</title>
        <p>Since the drone should remain on the human body after
landing, we will investigate materials and techniques to attach
drones to the body. We are considering different hardware
solutions, from electromagnets to Velcro tape. Specially
designed clothing might provide docking capabilities for easier
landing and take-off, although we prefer ad-hoc solutions that
do not require the user to wear specific equipment. We will
investigate how a drone can rest on a person’s body without
falling off while the person is moving. We propose that the
drone may use its own thrust to stabilize itself on the human
body.</p>
      </sec>
      <sec id="sec-3-3">
        <title>Level of Automation</title>
        <p>
          Take-off and landing sequences can be triggered in
various ways. They can be fully automated, requiring no hands,
or triggered explicitly by the user (e.g., the drone can be
grabbed and put into place). The drone might detect gestures,
speech commands, or context to initiate take-off and landing.
It will therefore be important to communicate the intent of the
drone to the user and vice versa. If a drone approaches the
user to land, the user should understand the next steps of the
drone’s landing process. This can be achieved by wearing
smart glasses that display the flight plan or even use
Augmented Reality (AR) to visualize the planned trajectory of
the drone. We expect that lights might be used to communicate
intent [
          <xref ref-type="bibr" rid="ref8">22</xref>
          ], as planes do. Also, the user might want to intervene
with an autonomously operating drone. Therefore, the drone
should provide an intervention interface [
          <xref ref-type="bibr" rid="ref7">21</xref>
          ]. Implicit and explicit interactions might vary depending
on the use case. However, commands triggered by false
positives can lead to dangerous situations. It is therefore very
important to use appropriate context-aware controls and
triggering mechanisms that can adapt to the situation of the
operator (e.g., occupied hands).
        </p>
      </sec>
      <sec id="sec-3-4">
        <title>Drone Shape and Function</title>
        <p>The size of a drone will most likely determine its use cases.
A small drone with a camera can be used for scouting and
overview, while a larger drone can enable physical
interaction with the user and other objects (e.g., carrying a
payload). These factors will influence the drone design and
determine the interaction space.</p>
      </sec>
    </sec>
    <sec id="sec-4">
      <title>Application Scenarios</title>
      <p>We outline three different application scenarios in which we
envision close-to-body drones to be applicable.</p>
      <sec id="sec-4-1">
        <title>Search and Rescue</title>
        <p>Rescue personnel engaged in primary tasks can benefit
from drones that take off automatically. The drone could be
used as a scout for planning a mission, while critical tasks can
be fulfilled without the interruption that is currently required to
start operating the drone. For example, in firefighting missions,
the firefighters have to pay attention to their environment and
protect their own lives while trying to rescue survivors. A drone
could be of great help to sense the surrounding environment,
but should do so without interrupting the firefighters or adding
to their cognitive load.</p>
      </sec>
      <sec id="sec-4-2">
        <title>On-demand 3rd Eye</title>
        <p>Climbers might need an overview of their surroundings
while being suspended at great heights. For example, they may
want to check for changing weather conditions or map out their
climbing path. Getting a drone to take off from one’s hand or
from the ground while climbing might be complicated, if not
dangerous or impossible. We propose that a drone, attached to
the back of a climber, could take off and gather information
before landing back on its operator. Take-off and landing could
be performed without requiring the use of the climber’s hands.
In addition, the drone could directly support the climber, for
example by lifting and securing a carabiner. Such scenarios
would increase the safety of the climber, especially when the
climber is exhausted or cannot reach the next spot to secure
themselves.</p>
      </sec>
      <sec id="sec-4-3">
        <title>Personal Assistant</title>
        <p>Close proximity to the user enables more intimate
relationships between drones and humans. We expect such
drones will be perceived like a pet sitting on its owner’s
shoulder rather than as a piece of technology. As such, we
envision that the drone could become a personal assistant.
The drone can use the operator’s body as a base station (e.g.,
for charging) and take off to perform off-body tasks, such as
taking a photo (as in [1]), navigating the user to a destination,
or transporting small objects. This exceeds the capabilities of
today’s body-worn robots [10].</p>
      </sec>
    </sec>
    <sec id="sec-5">
      <title>Research Plan</title>
      <p>We plan the following steps to build and evaluate our
prototype and to extract guidelines for close-proximity
human-drone interaction. In the initial step, we will gather
literature on drones and on-body interaction to derive a
suitable concept. Afterwards, we will implement the system
(i.e., the drone and the control application), such that the
drone is automatically directed towards its target. Once in
proximity to the target, it should initiate a suitable landing
maneuver and attach itself to a person. Once attached, the
drone should be able to identify opportune moments to take
off based on its role. After the implementation phase, we will
evaluate our prototype in a user study and derive guidelines
from the study results.</p>
    </sec>
    <sec id="sec-6">
      <title>Conclusion</title>
      <p>We proposed to investigate close-proximity drones that can
land on and take off from the human body. First, we identified
requirements to span an initial design space. We then
discussed various aspects that must be considered for
body-worn drones, including body location, level of automation,
and drone shape and functionality. Finally, we introduced
application scenarios and presented a research plan.</p>
    </sec>
    <sec id="sec-7">
      <title>REFERENCES</title>
      <p>[1] Nixie - Selfie-Drone. https://time.com/3559818/meet-nixie-the-selfie-drone-you-wear-on-your-wrist/. (Accessed: 2020-02-18).</p>
      <p>[2] Parastoo Abtahi, Benoit Landry, Jackie (Junrui) Yang, Marco Pavone, Sean Follmer, and James A. Landay. 2019. Beyond The Force: Using Quadcopters to Appropriate Objects and the Environment for Haptics in Virtual Reality. In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems (CHI ’19). Association for Computing Machinery, New York, NY, USA, Article 359, 13 pages. DOI: http://dx.doi.org/10.1145/3290605.3300589</p>
      <p>[9] Jessica R. Cauchard, Kevin Y. Zhai, Marco Spadafora, and James A. Landay. 2016. Emotion Encoding in Human-Drone Interaction. In The Eleventh ACM/IEEE International Conference on Human Robot Interaction (HRI ’16). IEEE Press, 263–270.</p>
      <p>[10] Artem Dementyev, Hsin-Liu (Cindy) Kao, Inrak Choi, Deborah Ajilo, Maggie Xu, Joseph A. Paradiso, Chris Schmandt, and Sean Follmer. 2016. Rovables: Miniature On-Body Robots as Mobile Wearables. In Proceedings of the 29th Annual Symposium on User Interface Software and Technology (UIST ’16). Association for Computing Machinery, New York, NY, USA, 111–120. DOI: http://dx.doi.org/10.1145/2984511.2984531</p>
      <p>[11] Sara Eriksson, Åsa Unander-Scharin, Vincent Trichon, Carl Unander-Scharin, Hedvig Kjellström, and Kristina Höök. 2019. Dancing With Drones: Crafting Novel Artistic Expressions Through Intercorporeality. In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems (CHI ’19). Association for Computing Machinery, New York, NY, USA, Article 617, 12 pages. DOI: http://dx.doi.org/10.1145/3290605.3300847</p>
      <p>[12] Markus Funk. 2018. Human-Drone Interaction: Let’s Get Ready for Flying User Interfaces! Interactions 25, 3 (April 2018), 78–81. DOI: http://dx.doi.org/10.1145/3194317</p>
      <p>[13] Antonio Gomes, Calvin Rubens, Sean Braley, and Roel Vertegaal. 2016. BitDrones: Towards Using 3D Nanocopter Displays as Interactive Self-Levitating Programmable Matter. In Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems (CHI ’16). Association for Computing Machinery, New York, NY, USA, 770–780. DOI: http://dx.doi.org/10.1145/2858036.2858519</p>
      <p>[14] Pascal Knierim, Thomas Kosch, Valentin Schwind, Markus Funk, Francisco Kiss, Stefan Schneegass, and Niels Henze. 2017. Tactile Drones - Providing Immersive Tactile Feedback in Virtual Reality through Quadcopters. In Proceedings of the 2017 CHI Conference Extended Abstracts on Human Factors in Computing Systems (CHI EA ’17). Association for Computing Machinery, New York, NY, USA, 433–436. DOI: http://dx.doi.org/10.1145/3027063.3050426</p>
      <p>[15] Joseph La Delfa, Mehmet Aydın Baytas, Rakesh Patibanda, Hazel Ngari, and Rohit Ashok Khot. 2020. Drone Chi: Somaesthetic Human-Drone Interaction. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems (CHI ’20). Association for Computing Machinery, New York, NY, USA.</p>
      <p>[16] Joseph La Delfa, Mehmet Aydin Baytas, Olivia Wichtowski, Rohit Ashok Khot, and Florian Floyd Mueller. 2019. Are Drones Meditative?. In Extended Abstracts of the 2019 CHI Conference on Human Factors in Computing Systems (CHI EA ’19). Association for Computing Machinery, New York, NY, USA, 4. DOI: http://dx.doi.org/10.1145/3290607.3313274</p>
      <p>[17] Matthias Hoppe, Yannick Weiß, Marinus Burger, Albrecht Schmidt, and Thomas Kosch. 2020. Do not Drone Yourself in Work: A Framework to Program Drone Flight Paths. In 2nd International Workshop on Human-Drone Interaction. Hawaii, United States.</p>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>[3] Parastoo Abtahi, David Y. Zhao, Jane L. E., and James A. Landay. 2017. Drone Near Me: Exploring Touch-Based Human-Drone Interaction. Proc. ACM Interact. Mob. Wearable Ubiquitous Technol. 1, 3, Article 34 (Sept. 2017), 8 pages. DOI: http://dx.doi.org/10.1145/3130899</mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>[4] Mauro Avila, Markus Funk, and Niels Henze. 2015. DroneNavigator: Using Drones for Navigating Visually Impaired Persons. In Proceedings of the 17th International ACM SIGACCESS Conference on Computers &amp; Accessibility (ASSETS ’15). Association for Computing Machinery, New York, NY, USA, 327–328. DOI: http://dx.doi.org/10.1145/2700648.2811362</mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>[5] Mehmet Aydin Baytas, Damla Çay, Yuchong Zhang, Mohammad Obaid, Asim Evren Yantaç, and Morten Fjeld. 2019. The Design of Social Drones: A Review of Studies on Autonomous Flyers in Inhabited Environments. In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems (CHI ’19). Association for Computing Machinery, New York, NY, USA, Article 250, 13 pages. DOI: http://dx.doi.org/10.1145/3290605.3300480</mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>[18] Sven Mayer, Lars Lischke, and Paweł W. Wozniak. 2019. Drones for Search and Rescue. In 1st International Workshop on Human-Drone Interaction (iHDI ’19), CHI ’19 Extended Abstracts (2019-05-04). Glasgow, Scotland, UK, 6. https://hal.archives-ouvertes.fr/hal-02128385</mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>[19] Wai Shan Ng and Ehud Sharlin. 2011. Collocated interaction with flying robots. In 2011 RO-MAN. 143–149. DOI: http://dx.doi.org/10.1109/ROMAN.2011.6005280</mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>[20] Beat Rossmy and Kai Holländer. 2019. Lure the Drones - Falconry Inspired HDI. In 1st International Workshop on Human-Drone Interaction. Ecole Nationale de l’Aviation Civile [ENAC], Glasgow, United Kingdom. https://hal.archives-ouvertes.fr/hal-02128393</mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>[21] Albrecht Schmidt and Thomas Herrmann. 2017. Intervention User Interfaces: A New Interaction Paradigm for Automated Systems. Interactions 24, 5 (Aug. 2017), 40–45. DOI: http://dx.doi.org/10.1145/3121357</mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>[22] Daniel Szafir, Bilge Mutlu, and Terry Fong. 2015. Communicating Directionality in Flying Robots. In Proceedings of the Tenth Annual ACM/IEEE International Conference on Human-Robot Interaction (HRI ’15). Association for Computing Machinery, New York, NY, USA, 19–26. DOI: http://dx.doi.org/10.1145/2696454.2696475</mixed-citation>
      </ref>
      <ref id="ref9">
        <mixed-citation>[23] Anna Wojciechowska, Jeremy Frey, Esther Mandelblum, Yair Amichai-Hamburger, and Jessica R. Cauchard. 2019. Designing Drones: Factors and Characteristics Influencing the Perception of Flying Robots. Proc. ACM Interact. Mob. Wearable Ubiquitous Technol. 3, 3, Article 111 (Sept. 2019), 19 pages. DOI: http://dx.doi.org/10.1145/3351269</mixed-citation>
      </ref>
      <ref id="ref10">
        <mixed-citation>[24] Sergej G. Zwaan and Emilia I. Barakova. 2016. Boxing against Drones: Drones in Sports Education. In Proceedings of the 15th International Conference on Interaction Design and Children (IDC ’16). Association for Computing Machinery, New York, NY, USA, 607–612. DOI: http://dx.doi.org/10.1145/2930674.2935991</mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>