=Paper=
{{Paper
|id=Vol-2617/paper9
|storemode=property
|title=Don’t Drone Yourself in Work: Discussing DronOS as a Framework for Human-Drone Interaction
|pdfUrl=https://ceur-ws.org/Vol-2617/paper9.pdf
|volume=Vol-2617
|authors=Matthias Hoppe,Yannick Weiß,Marinus Burger,Thomas Kosch
|dblpUrl=https://dblp.org/rec/conf/chi/HoppeWBK20
}}
==Don’t Drone Yourself in Work: Discussing DronOS as a Framework for Human-Drone Interaction==
Matthias Hoppe, Yannick Weiß, Marinus Burger, Thomas Kosch
LMU Munich, Munich, Germany
{firstname.lastname}@ifi.lmu.de

Abstract
More and more off-the-shelf drones provide frameworks that enable the programming of flight paths. These frameworks offer vendor-dependent programming and communication interfaces intended for flight path definitions. However, they are often limited to outdoor, GPS-based use. A key disadvantage of such solutions is that they are complicated to use and require readjustments when changing the drone model. This is time-consuming since it requires redefining the flight path for the new framework. This workshop paper proposes additional features for DronOS, a community-driven framework that enables model-independent automatisation and programming of drones. We enhanced DronOS with additional functions to account for the specific design constraints of human-drone interaction. This paper provides a starting point for discussing the requirements involved in designing a drone system with other researchers within the human-drone interaction community. We envision DronOS as a community-driven framework that can be applied to generic drone models, hence enabling the automatisation of any commercially available drone. Our goal is to build DronOS as a software tool that can be easily used by researchers and practitioners to prototype novel drone-based systems.

Figure 1: Custom drone built from off-the-shelf components. A mounted HTC Vive tracker enables low-cost tracking. We 3D-printed a rotor case for direct human-drone interaction.

This paper is published under the Creative Commons Attribution 4.0 International (CC-BY 4.0) license. Authors reserve their rights to disseminate the work on their personal and corporate Web sites with the appropriate attribution. Interdisciplinary Workshop on Human-Drone Interaction (iHDI 2020), CHI ’20 Extended Abstracts, 26 April 2020, Honolulu, HI, US. © 2020 Creative Commons CC-BY 4.0 License.

Author Keywords
Drone; Framework; Programming; Automation
CCS Concepts
• Human-centered computing → User interface toolkits;

Introduction
Drones are becoming commonplace in user interaction research [4]. The increasing availability of consumer drones fosters the creation and prototyping of novel drone-based user interfaces. For example, drones can serve as a proxy for haptic feedback in virtual reality [7, 10], as a navigation aid for persons with visual impairments [2, 3], and as an observation unit in rescue operations [12, 13]. Knierim et al. [9] investigated a design space for different drone interaction modalities. Auda et al. [1] investigated how drones can be used as a proxy for user interfaces.

Some of these drones are delivered with vendor-dependent frameworks that enable flight automatisation through defined waypoints. These are usually programmed beforehand using defined targets or dynamic positioning relative to other objects. Most of these software frameworks are complicated or unstable to use and require prior programming knowledge. Furthermore, a plethora of options, such as the velocity of the drone, flight path corrections, or the surroundings, has to be considered by the user. Contemporary frameworks are often limited to a certain communication protocol, can only be applied to a few drone models, and require manual programming of autonomous flight paths. To cope with this, Gomes et al. [5] presented BitDrones, a toolbox that enables the programming of drones, including interaction scenarios. The outlined interaction scenarios were sketched with custom-made drones, which requires users to reproduce these builds. Instead of using customised drones, Kosch et al. [11] investigated the use of a remote controller for drone control. The remote control implements different visualisation modalities that communicate the target of the drone. Furthermore, several gestures were evaluated in the study. The use of an expensive tracking system remains a key limitation of their work.

The implementation of interactive drone flight paths has been the subject of past research. However, the (a) use of proprietary communication protocols, (b) expensive tracking systems, (c) self-built drones, and (d) closed libraries remain a key challenge of previous drone systems [8]. This workshop paper discusses DronOS, a framework that enables the interactive automatisation of drone flight paths using off-the-shelf components (see Figure 1). DronOS has been evaluated with three drone programming modalities in a previous user study [6], finding that users appreciated the usability of the system. Our overall aim is to establish DronOS as a community-driven framework for Human-Drone Interaction (HDI) researchers as well as practitioners. In the following, we elaborate on the basic concept of DronOS, explain the programming modalities, and present future research that we want to discuss with the HDI community.

System Concept
DronOS uses off-the-shelf hardware to implement the basic requirements for drone flight path definitions. DronOS employs HTC’s Lighthouse tracking technology (www.vive.com/eu/accessory/base-station), which is included in the HTC Vive kits and offers a simple calibration procedure. The HTC Lighthouse system uses infrared to locate the position of an HTC Vive Tracker (www.vive.com/eu/vive-tracker). We initially support this tracking system since it offers low-budget tracking in contrast to more expensive professional tracking systems. DronOS uses radio signals to communicate flight directions between a computer and a drone. Presets of PID controllers are available, which can also be set manually by more experienced users. The drone itself is controlled via radio signals from a drone controller.
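To illustrate what the PID presets mentioned above configure, the following is a textbook single-axis PID update; this is a hedged sketch, not the DronOS implementation, and the class name, gains, and values are hypothetical:

```python
# Illustrative textbook PID controller for one control axis (e.g., altitude).
# NOT the DronOS implementation; names and gain values are made up.
class PID:
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = None

    def update(self, setpoint, measured, dt):
        """Return the control output for one time step of length dt."""
        error = setpoint - measured
        self.integral += error * dt
        # No derivative term on the very first step (no previous error yet).
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# A "preset" is then simply a stored gain triple:
hover_preset = PID(kp=1.2, ki=0.1, kd=0.05)
thrust = hover_preset.update(setpoint=1.5, measured=1.4, dt=0.02)
```

Manual tuning, as offered to experienced users, amounts to replacing the preset gain triple with custom values.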
This controller is connected to a computer that transmits the signal to the drone. DronOS supports three programming modes, Unity Scripting, Vive Scripting, and Vive Realtime, to define drone flight paths.

Figure 2: DronOS supports three programming modes: (a) Unity Scripting for the advanced definition of waypoints, (b) Vive Scripting using point and click gestures, and (c) Vive Realtime, where the drone follows the user’s pointing direction.

Unity Scripting provides a user interface where flight paths can be defined using a Unity interface (see Figures 2a and 3a). New waypoints can be set using drag and drop. These are immediately visualised in 3D space and can be modified with advanced parameters.

Vive Scripting uses an HTC Vive controller to define waypoints in a “programming by demonstration” approach (see Figures 2b and 3b). This allows the fast creation of flight paths without the need for graphical scripting or programming.

In Vive Realtime, the drone levitates into the pointing direction of an HTC Vive controller, similar to the work of Kosch et al. [11] (see Figures 2c and 3c). The distance between the controller and the drone can be adjusted.

Figure 3: User interacting with the modes (a) Unity Scripting, (b) Vive Scripting, and (c) Vive Realtime for creating flight paths.

Research Plan
The current version of DronOS fully provides the aforementioned functionalities. DronOS supports all self-built or off-the-shelf drones that run Betaflight. The framework enables users of all experience levels, from novice to expert, to create and redefine flight paths via both scripting and real-time control.

As human-drone interaction has special requirements, such as direct contact with the drone, we support researchers in the field of human-drone interaction by implementing additional functions.
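As an illustration of the waypoint-based definitions the three programming modes produce, a flight path could be modelled as below; this is a hypothetical sketch, not the DronOS API, and the `Waypoint` fields (velocity, hold time) are assumed parameters:

```python
# Hypothetical model of a waypoint-based flight path, as created via
# drag and drop in Unity Scripting. Not the actual DronOS data structure.
from dataclasses import dataclass

@dataclass
class Waypoint:
    x: float              # position in metres, tracked coordinate frame
    y: float
    z: float
    velocity: float = 0.5  # approach speed in m/s (illustrative parameter)
    hold: float = 0.0      # seconds to hover at the waypoint

def path_length(path):
    """Total length of the straight-line path through consecutive waypoints."""
    total = 0.0
    for a, b in zip(path, path[1:]):
        total += ((b.x - a.x) ** 2 + (b.y - a.y) ** 2 + (b.z - a.z) ** 2) ** 0.5
    return total

# An open square at 1 m altitude: three 1 m segments.
square = [Waypoint(0, 0, 1), Waypoint(1, 0, 1), Waypoint(1, 1, 1), Waypoint(0, 1, 1)]
print(path_length(square))
```

Editing a waypoint, as in the Unity interface, then reduces to mutating one `Waypoint` entry and re-transmitting the path.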
Furthermore, DronOS is currently limited to the operation of one drone at a time. Hence, we plan to add functionality for orchestrating multiple drones at the same time. This includes communication between single drones to optimise the flying behaviour, such as avoiding collisions with other drones or users.

As safety is a key requirement when working with drones, we will include no-fly zones. These can be deployed as static areas (e.g., obstacles) where the drone will not be able to move, and as dynamic areas (e.g., 20 cm around a moving user) so that the drone reacts to the movement of the user. A core limitation is the use of an indoor tracking system. We explore alternative tracking modalities to realise omnipresent HDI use cases within the paradigm of ubiquitous computing. This includes the use of GPS and WiFi-based tracking, which obviates the need for stationary tracking systems. Finally, we envision DronOS as a community-driven project. We continue to publish new features of the framework on GitHub (www.github.com/hcum/dronos) to foster research and the implementation of new features within the HDI community.

Outlook
This workshop paper presented DronOS, a generic framework that enables users to define flight paths. We presented the currently available functionalities and operating principles. Beyond the available features, we sketched a research plan with future features that will support researchers as well as practitioners in the development of future human-drone interfaces. We believe that our framework paves the way for the efficient deployment of drone interfaces.

REFERENCES
[1] Jonas Auda, Martin Weigel, Jessica Cauchard, and Stefan Schneegass. 2020. ProxyDrone: Autonomous Drone Landing on the Human Body. In 2nd International Workshop on Human-Drone Interaction. Honolulu, HI, USA.

[2] Mauro Avila, Markus Funk, and Niels Henze. 2015. DroneNavigator: Using Drones for Navigating Visually Impaired Persons. In Proceedings of the 17th International ACM SIGACCESS Conference on Computers & Accessibility (ASSETS ’15). Association for Computing Machinery, New York, NY, USA, 327–328. DOI: http://dx.doi.org/10.1145/2700648.2811362

[3] Mauro Avila Soto, Markus Funk, Matthias Hoppe, Robin Boldt, Katrin Wolf, and Niels Henze. 2017. DroneNavigator: Using Leashed and Free-Floating Quadcopters to Navigate Visually Impaired Travelers. In Proceedings of the 19th International ACM SIGACCESS Conference on Computers and Accessibility (ASSETS ’17). Association for Computing Machinery, New York, NY, USA, 300–304. DOI: http://dx.doi.org/10.1145/3132525.3132556

[4] Markus Funk. 2018. Human-Drone Interaction: Let’s Get Ready for Flying User Interfaces! Interactions 25, 3 (April 2018), 78–81. DOI: http://dx.doi.org/10.1145/3194317

[5] Antonio Gomes, Calvin Rubens, Sean Braley, and Roel Vertegaal. 2016. BitDrones: Towards Using 3D Nanocopter Displays as Interactive Self-Levitating Programmable Matter. In Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems (CHI ’16). Association for Computing Machinery, New York, NY, USA, 770–780. DOI: http://dx.doi.org/10.1145/2858036.2858519

[6] Matthias Hoppe, Marinus Burger, Albrecht Schmidt, and Thomas Kosch. 2019. DronOS: A Flexible
Open-Source Prototyping Framework for Interactive Drone Routines. In Proceedings of the 18th International Conference on Mobile and Ubiquitous Multimedia (MUM ’19). Association for Computing Machinery, New York, NY, USA, Article 15, 7 pages. DOI: http://dx.doi.org/10.1145/3365610.3365642

[7] Matthias Hoppe, Pascal Knierim, Thomas Kosch, Markus Funk, Lauren Futami, Stefan Schneegass, Niels Henze, Albrecht Schmidt, and Tonja Machulla. 2018. VRHapticDrones: Providing Haptics in Virtual Reality through Quadcopters. In Proceedings of the 17th International Conference on Mobile and Ubiquitous Multimedia (MUM 2018). Association for Computing Machinery, New York, NY, USA, 7–18. DOI: http://dx.doi.org/10.1145/3282894.3282898

[8] Matthias Hoppe, Thomas Kosch, Pascal Knierim, Markus Funk, and Albrecht Schmidt. 2019. Are Drones Ready for Takeoff? Reflecting on Challenges and Opportunities in Human-Drone Interfaces. In 1st International Workshop on Human-Drone Interaction. Ecole Nationale de l’Aviation Civile [ENAC], Glasgow, United Kingdom. https://hal.archives-ouvertes.fr/hal-02128388

[9] Pascal Knierim, Thomas Kosch, Alexander Achberger, and Markus Funk. 2018. Flyables: Exploring 3D Interaction Spaces for Levitating Tangibles. In Proceedings of the Twelfth International Conference on Tangible, Embedded, and Embodied Interaction (TEI ’18). Association for Computing Machinery, New York, NY, USA, 329–336. DOI: http://dx.doi.org/10.1145/3173225.3173273

[10] Pascal Knierim, Thomas Kosch, Valentin Schwind, Markus Funk, Francisco Kiss, Stefan Schneegass, and Niels Henze. 2017. Tactile Drones - Providing Immersive Tactile Feedback in Virtual Reality through Quadcopters. In Proceedings of the 2017 CHI Conference Extended Abstracts on Human Factors in Computing Systems (CHI EA ’17). Association for Computing Machinery, New York, NY, USA, 433–436. DOI: http://dx.doi.org/10.1145/3027063.3050426

[11] Thomas Kosch, Markus Funk, Daniel Vietz, Marc Weise, Tamara Müller, and Albrecht Schmidt. 2018. DroneCTRL: A Tangible Remote Input Control for Quadcopters. In The 31st Annual ACM Symposium on User Interface Software and Technology Adjunct Proceedings (UIST ’18 Adjunct). Association for Computing Machinery, New York, NY, USA, 120–122. DOI: http://dx.doi.org/10.1145/3266037.3266121

[12] Sven Mayer, Lars Lischke, and Pawel W. Woźniak. 2019. Drones for Search and Rescue. In 1st International Workshop on Human-Drone Interaction. Ecole Nationale de l’Aviation Civile [ENAC], Glasgow, United Kingdom. https://hal.archives-ouvertes.fr/hal-02128385

[13] R. Tariq, M. Rahim, N. Aslam, N. Bawany, and U. Faseeha. 2018. DronAID: A Smart Human Detection Drone for Rescue. In 2018 15th International Conference on Smart Cities: Improving Quality of Life Using ICT & IoT (HONET-ICT). 33–37. DOI: http://dx.doi.org/10.1109/HONET.2018.8551326