CEUR-WS Vol-2617, paper 8 — https://ceur-ws.org/Vol-2617/paper8.pdf — dblp: https://dblp.org/rec/conf/chi/KaulR20
3D Tactile Obstacle Awareness System for Drones using a Tactile Interface around the Head
Oliver Beren Kaul, Leibniz University Hannover, Hannover, 30167, Germany, kaul@hci.uni-hannover.de
Michael Rohs, Leibniz University Hannover, Hannover, 30167, Germany, rohs@hci.uni-hannover.de

Abstract
We propose a 3D obstacle awareness system for drone pilots, implemented as a tactile user interface around the head. The concept of this system is presented alongside a variety of use cases and recommendations for future work.

Author Keywords
Drones; tactile obstacle awareness; drone navigation; wearables.

CCS Concepts
• Human-centered computing → Haptic devices; Interaction techniques; Ubiquitous and mobile computing systems and tools;

Introduction and Related Work
Drone pilots face obstacle awareness challenges in bad lighting conditions, under distraction, or when flying in any direction that is not in the camera view. Possible obstacles include static and dynamic objects such as other drones, humans, animals, or even brick walls within buildings. We propose a tactile system to indicate obstacles, including their distance from the drone, in the 3D space around the user (see Figure 1).

This paper is published under the Creative Commons Attribution 4.0 International (CC-BY 4.0) license. Authors reserve their rights to disseminate the work on their personal and corporate Web sites with the appropriate attribution.
Interdisciplinary Workshop on Human-Drone Interaction (iHDI 2020)
CHI ’20 Extended Abstracts, 26 April 2020, Honolulu, HI, US
© Creative Commons CC-BY 4.0 License.

Figure 2: HapticHead, a vibrotactile interface around the head [7].

Earlier work and concepts on human-drone interaction were neatly summarized in [4] and explained in further detail by
Baytas et al. [1]. Our obstacle awareness concept presented in this paper extends the idea of augmenting spatial awareness for humans [2] and instead aims to increase the spatial awareness of a human controlling a remote drone. Earlier approaches to this challenge showed promising results for a 2D navigation task using ultrasound sensors attached to a drone and a vibrotactile belt [13]. We aim to extend Spiss et al.’s obstacle awareness system to 3D use cases, which cannot be displayed properly by the tactile belt used in [13].
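Why a belt cannot display 3D obstacle directions can be sketched in a few lines: a belt collapses every direction onto its azimuth, while a head-worn grid with actuator rings at several elevations preserves the vertical component. The actuator counts and ring layout below are hypothetical illustrations, not the actual belt of [13] or the HapticHead layout:

```python
import math

def belt_actuator(azimuth_deg, n_actuators=8):
    """Map an obstacle bearing to the nearest actuator on a waist belt.
    A belt only encodes azimuth; the elevation of the obstacle is lost."""
    step = 360 / n_actuators
    return round((azimuth_deg % 360) / step) % n_actuators

def spherical_actuator(azimuth_deg, elevation_deg, rings):
    """Map azimuth AND elevation to an actuator on a head-worn grid.
    `rings` lists (elevation_deg, n_actuators) per horizontal ring."""
    # Pick the ring whose elevation is closest to the obstacle's elevation,
    # then pick the nearest actuator within that ring as on a belt.
    ring_idx, (ring_elev, n) = min(
        enumerate(rings), key=lambda r: abs(r[1][0] - elevation_deg))
    return ring_idx, belt_actuator(azimuth_deg, n)

# Hypothetical 24-actuator layout: four rings at different elevations.
RINGS = [(-30, 8), (0, 8), (30, 6), (60, 2)]

# An obstacle nearly overhead (elevation 80°) at azimuth 45°:
print(belt_actuator(45))                     # same cue as a level obstacle at 45°
print(spherical_actuator(45, 80, RINGS))     # distinct ring encodes the elevation
```

A level obstacle and one far overhead at the same azimuth produce the same belt cue, but different ring indices on the spherical layout — which is the information the 3D use cases need.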

In our previous work, we presented HapticHead [6, 7, 5], a vibrotactile display around the head consisting of a bathing cap with a chin strap and a total of 24 vibrotactile actuators (see Figure 2). We were able to show that our prototype can be used in 3D guidance and localization scenarios for people with normal vision in both virtual reality (VR) and augmented reality (AR) scenarios. The system can indicate directions all around the user and guide the user to look at a defined point in space with a median deviation of 2.3° from the actual target. This precise guidance capability may also be used to make users aware of obstacles in the space around them. The previous work further included a scenario in which blindfolded users were able to feel the presence of real physical objects in the 3D space around them and subsequently were able to find and touch the objects (see [7] and Figure 3).

Figure 1: Live tactile drone obstacle awareness system using HapticHead, a vibrotactile interface around the head [7]. The user’s drone is currently hovering while another drone is close to crashing into it from behind. The user receives a tactile warning of an obstacle closing in from the top-back-left direction.

Figure 3: Blindfolded participant in a prior experiment, feeling the direction and distance to physical objects [7].

Input system: Suitable 360° obstacle detection for drones
A suitable 360° obstacle detection system for drones is needed as an input for our proposed system. There are a variety of systems and technologies that could serve as an obstacle detection system, such as multiple stereo cameras working together [12, 10], 3D LIDARs [9], or even a system using, e.g., HyperOmni Visions (HOVIs) [11]. These input systems would need to filter and extrapolate static and dynamic obstacles, including their distance and 3D viewing angle from the drone camera perspective. The detected obstacles should further be filtered so that obstacles farther away than a threshold distance are excluded from the results, as these can be deemed harmless at the given moment.

Output system: Indicating obstacles around the drone by HapticHead
In our prior work, we introduced a 3D guidance algorithm for arbitrary actuator configurations such as HapticHead [7]. This guidance algorithm proved to be efficient and fast in guiding study participants to look in the indicated direction in 3D, including elevation. The same algorithm can be used in obstacle awareness scenarios as well. Just like in [7], the distance to obstacles may also be indicated by a vibrotactile pulse pattern and intensity modulation that gets faster and stronger the closer an object is.

The spatial mapping of the vibrotactile feedback is drone-centric: the output occurs relative to the drone that the user is controlling and is mapped in a one-to-one fashion to the HapticHead. The front of the drone is mapped to the front of the head, so obstacles appearing in front of the drone are haptically displayed on the forehead. Obstacles that appear to the right of the drone appear on the right side of HapticHead, and so on. This yields a natural mapping of the drone coordinate system to the head coordinate system. To the user, it feels as if they were flying as a pilot inside the drone, intuitively feeling obstacles along its way.
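The pipeline described above — rotating detected obstacles into the drone's frame, extracting azimuth and elevation, dropping obstacles beyond the harmless-distance threshold, and modulating pulse rate and intensity by distance — can be sketched as follows. All parameter values, function names, the yaw-only rotation (which ignores pitch and roll), and the linear distance-to-intensity ramp are illustrative assumptions, not details of the actual system:

```python
import math

# Hypothetical parameters (not from the paper):
MAX_RANGE_M = 5.0        # obstacles farther than this are deemed harmless
MAX_INDICATED = 3        # indicate at most the closest few obstacles

def to_drone_frame(obstacle_world, drone_pos, drone_yaw_rad):
    """Rotate a world-frame obstacle position into the drone's body frame,
    so that feedback stays drone-centric regardless of heading (yaw only)."""
    dx = obstacle_world[0] - drone_pos[0]
    dy = obstacle_world[1] - drone_pos[1]
    dz = obstacle_world[2] - drone_pos[2]
    c, s = math.cos(-drone_yaw_rad), math.sin(-drone_yaw_rad)
    return (c * dx - s * dy, s * dx + c * dy, dz)

def direction_and_distance(rel):
    """Azimuth/elevation (degrees) and distance of a body-frame vector."""
    x, y, z = rel
    dist = math.sqrt(x * x + y * y + z * z)
    azimuth = math.degrees(math.atan2(y, x))
    elevation = math.degrees(math.asin(z / dist)) if dist > 0 else 0.0
    return azimuth, elevation, dist

def pulse_parameters(dist):
    """Closer obstacles pulse faster and stronger (linear ramp, an assumption)."""
    closeness = max(0.0, min(1.0, 1.0 - dist / MAX_RANGE_M))
    intensity = closeness                # 0..1 vibration amplitude
    pulse_hz = 1.0 + 9.0 * closeness     # slow pulsing far away, rapid when close
    return intensity, pulse_hz

def feedback_cues(obstacles_world, drone_pos, drone_yaw_rad):
    """Full pipeline: filter by range, keep the closest few, emit one cue each."""
    rels = [to_drone_frame(o, drone_pos, drone_yaw_rad) for o in obstacles_world]
    cues = [direction_and_distance(r) for r in rels]
    cues = [c for c in cues if c[2] <= MAX_RANGE_M]   # drop harmless obstacles
    cues.sort(key=lambda c: c[2])                     # closest first
    return [(az, el, *pulse_parameters(d)) for az, el, d in cues[:MAX_INDICATED]]

# An obstacle behind and above the drone yields a rear, upward-pointing cue:
print(feedback_cues([(-1.0, 0.0, 1.0)], (0.0, 0.0, 0.0), 0.0))
```

Each resulting (azimuth, elevation, intensity, pulse rate) cue would then be rendered on the actuators nearest to that direction on the head-worn grid.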
When indicating multiple obstacles at the same time with the proposed tactile interface, a user will likely suffer a loss of localization accuracy. For one, if two obstacles are close together, the user will only be able to perceive one of them, as the vibrotactile pulse pattern would become confusing if two obstacles overlap from the perspective of the drone and are thus mapped to the same actuators on the HapticHead. Arguably, this limitation is no deal breaker, as the user can still feel the distance of the closer of the two (or more) objects.

Furthermore, if more than two obstacles are indicated at the same time, a loss of accuracy is still likely even if they are not mapped to the same actuators. This results from sensory congestion/overload or funneling illusion effects if too many actuators are active at the same time [3, 8].

As a solution to these issues, we suggest indicating only the closest two or three obstacles at the same time and merging obstacles that are close together, indicating only the closer obstacle.

Use Cases
As indicated in the introduction, the proposed system may be used in a variety of use cases related to drone operation and handling. These include:

   1. flying in any direction that is not in the camera view (e.g., side-, back-, down-, or upwards);

   2. operating a drone at night or in bad lighting conditions;

   3. operating a drone around areas with many static or dynamic obstacles such as other drones, humans, animals, plants, or walls within buildings;

   4. operating a drone while being distracted (e.g., by other humans).

In the first three cases, the system would provide tactile guidance to the closest two or three obstacles so that users can intuitively navigate their drone out of a dangerous situation. In the fourth case, the system would provide tactile warnings when an obstacle is close, reminding users to redirect their attention back to the drone.

Another use case would be accessibility: visually impaired drone operators should have a much easier time avoiding obstacles due to the additional tactile feedback channel.

Conclusion and future work
In conclusion, we propose a tactile obstacle awareness system for drone operators, which may be used in a large variety of use cases. Future work may implement the proposed system and test the assumed benefits in a real environment.

REFERENCES
 [1] Mehmet Aydin Baytas, Damla Çay, Yuchong Zhang, Mohammad Obaid, Asim Evren Yantaç, and Morten Fjeld. 2019. The Design of Social Drones. In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems - CHI ’19. ACM Press, New York, NY, USA, 1–13. DOI: http://dx.doi.org/10.1145/3290605.3300480
 [2] Alvaro Cassinelli, Carson Reynolds, and Masatoshi Ishikawa. 2007. Augmenting spatial awareness with haptic radar. Proceedings - International Symposium on Wearable Computers, ISWC (2007), 61–64. DOI: http://dx.doi.org/10.1109/ISWC.2006.286344
 [3] Michal Karol Dobrzynski, Seifeddine Mejri, Steffen Wischmann, and Dario Floreano. 2012. Quantifying Information Transfer Through a Head-Attached Vibrotactile Display: Principles for Design and Control. IEEE Transactions on Biomedical Engineering 59, 7 (Jul 2012), 2011–2018. DOI: http://dx.doi.org/10.1109/TBME.2012.2196433
 [4] Markus Funk. 2018. Human-drone interaction: Let’s get ready for flying user interfaces! Interactions 25, 3 (2018), 78–81. DOI: http://dx.doi.org/10.1145/3194317
 [5] Oliver Beren Kaul, Kevin Meier, and Michael Rohs. 2017. Increasing Presence in Virtual Reality with a Vibrotactile Grid Around the Head. Springer International Publishing, Cham, 289–298. DOI: http://dx.doi.org/10.1007/978-3-319-68059-0_19
 [6] Oliver Beren Kaul and Michael Rohs. 2016. HapticHead: 3D Guidance and Target Acquisition through a Vibrotactile Grid. In Proceedings of the 2016 CHI Conference Extended Abstracts on Human Factors in Computing Systems - CHI EA ’16. ACM Press, New York, NY, USA, 2533–2539. DOI: http://dx.doi.org/10.1145/2851581.2892355
 [7] Oliver Beren Kaul and Michael Rohs. 2017. HapticHead: A Spherical Vibrotactile Grid around the Head for 3D Guidance in Virtual and Augmented Reality. In Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems - CHI ’17. ACM Press, New York, NY, USA, 3729–3740. DOI: http://dx.doi.org/10.1145/3025453.3025684
 [8] Oliver Beren Kaul, Michael Rohs, Benjamin Simon, Kerem Can Demir, and Kamillo Ferry. 2020. Vibrotactile Funneling Illusion and Localization Performance on the Head. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems - CHI ’20 (CHI ’20). ACM, New York, NY, USA, 13. DOI: http://dx.doi.org/10.1145/3313831.3376335
 [9] James R. Kellner, John Armston, Markus Birrer, K. C. Cushman, Laura Duncanson, Christoph Eck, Christoph Falleger, Benedikt Imbach, Kamil Král, Martin Krůček, Jan Trochta, Tomáš Vrška, and Carlo Zgraggen. 2019. New Opportunities for Forest Remote Sensing Through Ultra-High-Density Drone Lidar. Surveys in Geophysics 40, 4 (2019), 959–977. DOI: http://dx.doi.org/10.1007/s10712-019-09529-9
[10] Deukhyeon Kim, Jinwook Choi, Hunjae Yoo, Ukil Yang, and Kwanghoon Sohn. 2015. Rear obstacle detection system with fisheye stereo camera using HCT. Expert Systems with Applications 42, 17-18 (2015), 6295–6305. DOI: http://dx.doi.org/10.1016/j.eswa.2015.04.035
[11] Hiroshi Koyasu, Jun Miura, and Yoshiaki Shirai. 2001. Realtime omnidirectional stereo for obstacle detection and tracking in dynamic environments. IEEE International Conference on Intelligent Robots and Systems 1 (2001), 31–36. DOI: http://dx.doi.org/10.1109/iros.2001.973332
[12] Sergiu Nedevschi, Radu Danescu, Dan Frentiu, Tiberiu Marita, Florin Oniga, Ciprian Pocol, Rolf Schmidt, and Thorsten Graf. 2004. High accuracy stereo vision system for far distance obstacle detection. IEEE Intelligent Vehicles Symposium, Proceedings (2004), 292–297. DOI: http://dx.doi.org/10.1109/ivs.2004.1336397
[13] Stefan Spiss, Yeongmi Kim, Simon Haller, and Matthias Harders. 2018. Comparison of Tactile Signals for Collision Avoidance on Unmanned Aerial Vehicles. In Haptic Interaction, Shoichi Hasegawa, Masashi Konyo, Ki-Uk Kyung, Takuya Nojima, and Hiroyuki Kajimoto (Eds.). Springer Singapore, Singapore, 393–399.