<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta>
      <journal-title-group>
        <journal-title>Interdisciplinary Workshop on Human-Drone Interaction (iHDI 2020), CHI '20 Extended Abstracts</journal-title>
      </journal-title-group>
    </journal-meta>
    <article-meta>
      <title-group>
        <article-title>3D Tactile Obstacle Awareness System for Drones using a Tactile Interface around the Head</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Oliver Beren Kaul</string-name>
          <email>kaul@hci.uni-hannover.de</email>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Michael Rohs</string-name>
          <email>rohs@hci.uni-hannover.de</email>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <aff id="aff0">
          <label>0</label>
          <institution>Leibniz University Hannover</institution>
          ,
          <addr-line>Hannover, 30167</addr-line>
          ,
          <country country="DE">Germany</country>
        </aff>
      </contrib-group>
      <pub-date>
        <year>2020</year>
      </pub-date>
      <abstract>
        <p>We propose a 3D obstacle awareness system for drone pilots, implemented as a tactile user interface around the head. The concept of this system is presented alongside a variety of use cases and recommendations for future work.</p>
      </abstract>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <p>This paper is published under the Creative Commons Attribution 4.0 International
(CC-BY 4.0) license. Authors reserve their rights to disseminate the work on their
personal and corporate Web sites with the appropriate attribution.
Interdisciplinary Workshop on Human-Drone Interaction (iHDI 2020)
CHI ’20 Extended Abstracts, 26 April 2020, Honolulu, HI, US
© Creative Commons CC-BY 4.0 License.</p>
    </sec>
    <sec id="sec-2">
      <title>Author Keywords</title>
      <p>Drones; tactile obstacle awareness; drone navigation;
wearables.</p>
    </sec>
    <sec id="sec-3">
      <title>CCS Concepts</title>
      <p>• Human-centered computing → Haptic devices;
Interaction techniques; Ubiquitous and mobile computing
systems and tools;</p>
    </sec>
    <sec id="sec-4">
      <title>Introduction and Related Work</title>
      <p>Drone pilots face obstacle awareness challenges under
bad lighting conditions, when distracted, or when flying in any
direction outside the camera view. Possible obstacles
are both static and dynamic: other drones,
humans, animals, or even brick walls within buildings. We
propose a tactile system that indicates obstacles, including
their distance from the drone, in the 3D space around the
user (see Figure 1).</p>
      <p>
        Earlier work and concepts on human-drone interaction were
neatly summarized in [
        <xref ref-type="bibr" rid="ref4">4</xref>
        ] and explained in further detail by
Baytas et al. [
        <xref ref-type="bibr" rid="ref1">1</xref>
        ]. Our obstacle awareness concept
presented in this paper extends the idea of augmenting
spatial awareness for humans [
        <xref ref-type="bibr" rid="ref2">2</xref>
        ] and instead aims to increase the
spatial awareness of a human controlling a remote drone.
Earlier approaches to this challenge were able to show
promising results for a 2D navigation task using ultrasound
sensors attached to a drone and a vibrotactile belt [
        <xref ref-type="bibr" rid="ref12">13</xref>
        ]. We
aim to extend Spiss et al.’s obstacle awareness system to
3D use cases, which cannot be displayed properly by the
tactile belt used in [
        <xref ref-type="bibr" rid="ref12">13</xref>
        ].
      </p>
      <p>
        In our previous work, we presented HapticHead [
        <xref ref-type="bibr" rid="ref5 ref6 ref7">6, 7, 5</xref>
        ], a
vibrotactile display around the head consisting of a bathing
cap with a chin strap and a total of 24 vibrotactile actuators
(see Fig. 2). We were able to show that our prototype can
be used in 3D guidance and localization scenarios for
people with normal vision in both virtual (VR) and augmented
reality (AR) scenarios. The system can indicate directions
all around the user and guide the user to look at a defined
point in space with a median deviation of 2.3° to the
actual target. This precise guidance capability may also be
used to make users aware of obstacles in the space around
them. The previous work further included a scenario in
which blindfolded users were able to feel the presence of
real physical objects in the 3D space around them and
subsequently were able to find and touch the objects (see [
        <xref ref-type="bibr" rid="ref7">7</xref>
        ]
and Fig. 3).
      </p>
    </sec>
    <sec id="sec-5">
      <title>Input system: Suitable 360 degree obstacle detection for drones</title>
      <p>
        A suitable 360° obstacle detection system for drones is
needed as an input for our proposed system. There are a
variety of systems and technologies that could serve as an
obstacle detection system, such as multiple stereo cameras
working together [
        <xref ref-type="bibr" rid="ref11 ref9">12, 10</xref>
        ], 3D LIDARs [9], or even a system
using, e.g., HyperOmni Visions (HOVIs) [
        <xref ref-type="bibr" rid="ref10">11</xref>
        ]. These input
systems would need to filter and extrapolate static and
dynamic obstacles, including their distance and 3D viewing
angle from the drone camera perspective. The detected
obstacles should further be filtered to exclude those beyond
a threshold distance, as these can be deemed harmless at
the given moment.
      </p>
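      <p>The filtering step described above can be sketched in a few lines. This is an illustrative sketch, not the paper's implementation: the threshold value, the coordinate convention, and all function names are assumptions.</p>

```python
import math

# Hypothetical sketch of filtering raw detections from a 360-degree sensor.
# Obstacles beyond THRESHOLD_M are dropped, as they can be deemed harmless
# at the given moment. The 5 m value is an assumption for illustration.
THRESHOLD_M = 5.0

def filter_obstacles(detections):
    """Keep only nearby obstacles, with direction in the drone's frame.

    Each detection is (x, y, z) in metres relative to the drone:
    x forward, y right, z up (an assumed convention).
    """
    nearby = []
    for x, y, z in detections:
        dist = math.sqrt(x * x + y * y + z * z)
        if dist > THRESHOLD_M:
            continue  # too far away to matter right now
        azimuth = math.degrees(math.atan2(y, x))     # 0 deg = straight ahead
        elevation = math.degrees(math.asin(z / dist))
        nearby.append({"dist": dist, "azimuth": azimuth,
                       "elevation": elevation})
    return nearby
```

<p>A detection at (3, 0, 0) would be kept with azimuth 0°, while one at (10, 2, 0) would be discarded as beyond the threshold.</p>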
    </sec>
    <sec id="sec-6">
      <title>Output system: Indicating obstacles around the drone by HapticHead</title>
      <p>
        In our prior work, we introduced a 3D guidance algorithm
for arbitrary actuator configurations such as HapticHead
[
        <xref ref-type="bibr" rid="ref7">7</xref>
        ]. This guidance algorithm proved to be quite efficient and
fast in guiding study participants to look in the indicated
direction in 3D, including elevation. The same algorithm
can be used in obstacle awareness scenarios as well. Just
like in [
        <xref ref-type="bibr" rid="ref7">7</xref>
        ], the depth to obstacles may also be indicated by
a vibrotactile pulse pattern and intensity modulation that
get faster and stronger the closer an object is.
      </p>
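      <p>The distance coding described above can be illustrated with a minimal sketch. The constants and the linear mapping are assumptions for illustration, not the authors' implementation, which may use a perceptually tuned curve.</p>

```python
# Map obstacle distance to a vibrotactile pulse rate and intensity, so that
# closer obstacles produce faster, stronger pulses. All constants are
# illustrative assumptions.
MAX_DIST_M = 5.0      # beyond this, the obstacle is not indicated at all
MIN_PULSE_HZ = 1.0    # slow pulsing at the edge of the indication range
MAX_PULSE_HZ = 10.0   # rapid pulsing when the obstacle is very close

def pulse_parameters(dist_m):
    """Return (pulse_rate_hz, intensity in 0..1) for an obstacle distance."""
    closeness = max(0.0, min(1.0, 1.0 - dist_m / MAX_DIST_M))
    rate = MIN_PULSE_HZ + closeness * (MAX_PULSE_HZ - MIN_PULSE_HZ)
    intensity = closeness  # linear; a perceptual curve could be substituted
    return rate, intensity
```

<p>An obstacle at the 5 m edge of the range would pulse slowly at minimal intensity; one directly at the drone would pulse at the maximum rate and intensity.</p>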
      <p>The spatial mapping of the vibrotactile feedback is drone-centric:
the output occurs relative to the drone that the
user is controlling and is mapped one-to-one
to the HapticHead. The front of the drone is mapped to the
front of the head, so obstacles in front of the drone
are haptically displayed on the forehead; obstacles
to the right of the drone appear on the right side of
HapticHead, and so on. This yields a natural mapping of
the drone coordinate system to the head coordinate
system. To users it feels as if they were flying as pilots
inside the drone, intuitively feeling obstacles along its way.
When multiple obstacles are indicated at the same time with
the proposed tactile interface, a user will likely suffer a
loss of localization accuracy. For one, if two obstacles are
close together, the user will only be able to perceive one of
them, as the vibrotactile pulse pattern would become
confusing if two obstacles overlap from the perspective of the
drone and thus allocate the same actuators on the
HapticHead. Arguably, this limitation is no deal breaker, as the
user can still feel the distance of the closer of the two (or
more) objects.</p>
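      <p>The one-to-one drone-to-head mapping amounts to activating the actuator whose direction on the head best matches the obstacle's bearing in the drone frame. The sketch below uses a hypothetical 8-actuator ring as a stand-in for the 24-actuator HapticHead layout; positions and names are assumptions.</p>

```python
import math

# Stand-in actuator layout: an 8-actuator horizontal ring, given as
# (azimuth deg, elevation deg). 0 deg azimuth = forehead / drone front.
ACTUATORS = [(az, 0.0) for az in range(0, 360, 45)]

def angular_distance(a, b):
    """Great-circle angle in degrees between two (azimuth, elevation) pairs."""
    az1, el1, az2, el2 = map(math.radians, (a[0], a[1], b[0], b[1]))
    cos_angle = (math.sin(el1) * math.sin(el2)
                 + math.cos(el1) * math.cos(el2) * math.cos(az1 - az2))
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_angle))))

def nearest_actuator(obstacle_dir):
    """Index of the actuator closest to the obstacle's direction
    (both expressed in the drone's coordinate frame)."""
    return min(range(len(ACTUATORS)),
               key=lambda i: angular_distance(ACTUATORS[i], obstacle_dir))
```

<p>An obstacle straight ahead maps to the frontmost actuator, one at 90° azimuth to the right-hand side of the ring, mirroring the forehead/right-side mapping described above.</p>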
      <p>
        Furthermore, if more than two obstacles are indicated at
the same time, even if they are not allocating the same
actuators, a loss of accuracy is still likely. This results from
sensory congestion/overload or funneling illusion effects if
too many actuators are active at the same time [
        <xref ref-type="bibr" rid="ref3 ref8">3, 8</xref>
        ].
As a solution to these issues, we suggest indicating only
the two or three closest obstacles at the same time and
merging obstacles that are close together, indicating only
the closer obstacle.
      </p>
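      <p>The suggested mitigation, indicating only the closest few obstacles and merging angularly close ones, can be sketched as follows. The merge angle and obstacle count are assumptions, and for brevity the sketch compares azimuth only rather than full 3D directions.</p>

```python
# Keep only the closest few obstacles; when two obstacles are angularly
# close from the drone's perspective (and would thus allocate the same
# actuators), indicate only the nearer one. Thresholds are illustrative.
MERGE_ANGLE_DEG = 30.0
MAX_INDICATED = 3

def select_obstacles(obstacles):
    """obstacles: list of dicts with 'dist' (metres) and 'azimuth' (degrees).
    Returns at most MAX_INDICATED obstacles, nearest first."""
    def diff(a, b):
        # smallest absolute angle between two azimuths, handling wrap-around
        return abs((a - b + 180.0) % 360.0 - 180.0)

    selected = []
    for obs in sorted(obstacles, key=lambda o: o["dist"]):  # nearest first
        if any(diff(obs["azimuth"], s["azimuth"]) < MERGE_ANGLE_DEG
               for s in selected):
            continue  # merged into an already-selected, nearer obstacle
        selected.append(obs)
        if len(selected) == MAX_INDICATED:
            break
    return selected
```

<p>Two obstacles 10° apart would collapse into the nearer one, while a third obstacle off to the side would still be indicated separately.</p>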
    </sec>
    <sec id="sec-7">
      <title>Use Cases</title>
      <p>As indicated in the introduction, the proposed system may
be used in a variety of use cases related to drone operation
and handling. These include:
1. Flying in any direction that is not in the camera view
(e.g., sideways, backwards, downwards, or upwards);
2. operating a drone at night or in bad lighting
conditions;
3. operating a drone around areas with many static or
dynamic obstacles such as other drones, humans,
animals, plants, or walls within buildings;
4. operating a drone while being distracted (e.g., by
other humans).</p>
      <p>In the first three cases, the system would provide tactile
guidance to the two or three closest obstacles, so that the
user can intuitively navigate the drone out of a dangerous
situation. In the fourth case, the system would provide a
tactile warning when an obstacle is close, reminding the
user to redirect their attention back to the drone.</p>
      <p>Another use case would be accessibility: visually impaired
drone operators should have a much easier time avoiding
obstacles due to the additional tactile feedback channel.</p>
    </sec>
    <sec id="sec-8">
      <title>Conclusion and future work</title>
      <p>In conclusion, we propose a tactile obstacle awareness
system for drone operators, which may be used in a wide
variety of use cases. Future work may implement the
proposed system and test the assumed benefits in a real
environment.</p>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>
          [1]
          <string-name>
            <given-names>Mehmet Aydin</given-names>
            <surname>Baytas</surname>
          </string-name>
          , Damla Çay, Yuchong Zhang, Mohammad Obaid, Asim Evren Yantaç, and
          <string-name>
            <given-names>Morten</given-names>
            <surname>Fjeld</surname>
          </string-name>
          .
          <year>2019</year>
          .
          <article-title>The Design of Social Drones</article-title>
          . In
          <source>Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems - CHI '19</source>
          , Vol.
          <volume>4</volume>
          . ACM Press, New York, New York, USA,
          <fpage>1</fpage>
          -
          <lpage>13</lpage>
          . DOI: http://dx.doi.org/10.1145/3290605.3300480
        </mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>
          [2]
          <string-name>
            <given-names>Alvaro</given-names>
            <surname>Cassinelli</surname>
          </string-name>
          , Carson Reynolds, and
          <string-name>
            <given-names>Masatoshi</given-names>
            <surname>Ishikawa</surname>
          </string-name>
          .
          <year>2007</year>
          .
          <article-title>Augmenting spatial awareness with haptic radar</article-title>
          .
          <source>Proceedings - International Symposium on Wearable Computers, ISWC</source>
          (
          <year>2007</year>
          ),
          <fpage>61</fpage>
          -
          <lpage>64</lpage>
          . DOI: http://dx.doi.org/10.1109/ISWC.2006.286344
        </mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>
          [3]
          <string-name>
            <given-names>Michal Karol</given-names>
            <surname>Dobrzynski</surname>
          </string-name>
          , Seifeddine Mejri, Steffen Wischmann
          , and
          <string-name>
            <given-names>Dario</given-names>
            <surname>Floreano</surname>
          </string-name>
          .
          <year>2012</year>
          .
          <article-title>Quantifying Information Transfer Through a Head-Attached Vibrotactile Display: Principles for Design and Control</article-title>
          .
          <source>IEEE Transactions on Biomedical Engineering</source>
          <volume>59</volume>
          ,
          <issue>7</issue>
          (July
          <year>2012</year>
          ),
          <fpage>2011</fpage>
          -
          <lpage>2018</lpage>
          . DOI: http://dx.doi.org/10.1109/TBME.2012.2196433
        </mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>
          [4]
          <string-name>
            <given-names>Markus</given-names>
            <surname>Funk</surname>
          </string-name>
          .
          <year>2018</year>
          .
          <article-title>Human-drone interaction: Let's get ready for flying user interfaces</article-title>
          !
          <source>Interactions</source>
          <volume>25</volume>
          ,
          <issue>3</issue>
          (
          <year>2018</year>
          ),
          <fpage>78</fpage>
          -
          <lpage>81</lpage>
          . DOI: http://dx.doi.org/10.1145/3194317
        </mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>
          [5]
          <string-name>
            <given-names>Oliver Beren</given-names>
            <surname>Kaul</surname>
          </string-name>
          , Kevin Meier, and
          <string-name>
            <given-names>Michael</given-names>
            <surname>Rohs</surname>
          </string-name>
          .
          <year>2017</year>
          .
          <article-title>Increasing Presence in Virtual Reality with a Vibrotactile Grid Around the Head</article-title>
          . Springer International Publishing, Cham,
          <fpage>289</fpage>
          -
          <lpage>298</lpage>
          . DOI: http://dx.doi.org/10.1007/978-3-319-68059-0_19
        </mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>
          [6]
          <string-name>
            <given-names>Oliver Beren</given-names>
            <surname>Kaul</surname>
          </string-name>
          and
          <string-name>
            <given-names>Michael</given-names>
            <surname>Rohs</surname>
          </string-name>
          .
          <year>2016</year>
          .
          <article-title>HapticHead: 3D Guidance and Target Acquisition through a Vibrotactile Grid</article-title>
          . In
          <source>Proceedings of the 2016 CHI Conference Extended Abstracts on Human Factors in Computing Systems - CHI EA '16</source>
          . ACM Press, New York, New York, USA,
          <fpage>2533</fpage>
          -
          <lpage>2539</lpage>
          . DOI: http://dx.doi.org/10.1145/2851581.2892355
        </mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>
          [7]
          <string-name>
            <given-names>Oliver Beren</given-names>
            <surname>Kaul</surname>
          </string-name>
          and
          <string-name>
            <given-names>Michael</given-names>
            <surname>Rohs</surname>
          </string-name>
          .
          <year>2017</year>
          .
          <article-title>HapticHead: A Spherical Vibrotactile Grid around the Head for 3D Guidance in Virtual and Augmented Reality</article-title>
          . In
          <source>Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems - CHI '17</source>
          . ACM Press, New York, New York, USA,
          <fpage>3729</fpage>
          -
          <lpage>3740</lpage>
          . DOI: http://dx.doi.org/10.1145/3025453.3025684
        </mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>
          [8]
          <string-name>
            <given-names>Oliver Beren</given-names>
            <surname>Kaul</surname>
          </string-name>
          , Michael Rohs, Benjamin Simon, Kerem Can Demir, and
          <string-name>
            <given-names>Kamillo</given-names>
            <surname>Ferry</surname>
          </string-name>
          .
          <year>2020</year>
          .
          <article-title>Vibrotactile Funneling Illusion and Localization Performance on the Head</article-title>
          . In
          <source>Proceedings of the SIGCHI Conference on Human Factors in Computing Systems - CHI '20</source>
          . ACM, New York, NY, USA, 13. DOI: http://dx.doi.org/10.1145/3313831.3376335
        </mixed-citation>
      </ref>
      <ref id="ref9">
        <mixed-citation>
          [10]
          <string-name>
            <given-names>Deukhyeon</given-names>
            <surname>Kim</surname>
          </string-name>
          , Jinwook Choi, Hunjae Yoo,
          <string-name>
            <given-names>Ukil</given-names>
            <surname>Yang</surname>
          </string-name>
          , and
          <string-name>
            <given-names>Kwanghoon</given-names>
            <surname>Sohn</surname>
          </string-name>
          .
          <year>2015</year>
          .
          <article-title>Rear obstacle detection system with fisheye stereo camera using HCT</article-title>
          .
          <source>Expert Systems with Applications</source>
          <volume>42</volume>
          ,
          <issue>17-18</issue>
          (
          <year>2015</year>
          ),
          <fpage>6295</fpage>
          -
          <lpage>6305</lpage>
          . DOI: http://dx.doi.org/10.1016/j.eswa.2015.04.035
        </mixed-citation>
      </ref>
      <ref id="ref10">
        <mixed-citation>
          [11]
          <string-name>
            <given-names>Hiroshi</given-names>
            <surname>Koyasu</surname>
          </string-name>
          , Jun Miura, and
          <string-name>
            <given-names>Yoshiaki</given-names>
            <surname>Shirai</surname>
          </string-name>
          .
          <year>2001</year>
          .
          <article-title>Realtime omnidirectional stereo for obstacle detection and tracking in dynamic environments</article-title>
          .
          <source>IEEE International Conference on Intelligent Robots and Systems</source>
          <volume>1</volume>
          (
          <year>2001</year>
          ),
          <fpage>31</fpage>
          -
          <lpage>36</lpage>
          . DOI: http://dx.doi.org/10.1109/iros.2001.973332
        </mixed-citation>
      </ref>
      <ref id="ref11">
        <mixed-citation>
          [12]
          <string-name>
            <given-names>Sergiu</given-names>
            <surname>Nedevschi</surname>
          </string-name>
          , Radu Danescu, Dan Frentiu, Tiberiu Marita, Florin Oniga, Ciprian Pocol, Rolf Schmidt, and
          <string-name>
            <given-names>Thorsten</given-names>
            <surname>Graf</surname>
          </string-name>
          .
          <year>2004</year>
          .
          <article-title>High accuracy stereo vision system for far distance obstacle detection</article-title>
          .
          <source>IEEE Intelligent Vehicles Symposium, Proceedings</source>
          (
          <year>2004</year>
          ),
          <fpage>292</fpage>
          -
          <lpage>297</lpage>
          . DOI: http://dx.doi.org/10.1109/ivs.2004.1336397
        </mixed-citation>
      </ref>
      <ref id="ref12">
        <mixed-citation>
          [13]
          <string-name>
            <given-names>Stefan</given-names>
            <surname>Spiss</surname>
          </string-name>
          , Yeongmi Kim, Simon Haller, and
          <string-name>
            <given-names>Matthias</given-names>
            <surname>Harders</surname>
          </string-name>
          .
          <year>2018</year>
          .
          <article-title>Comparison of Tactile Signals for Collision Avoidance on Unmanned Aerial Vehicles</article-title>
          . In
          <source>Haptic Interaction</source>
          , Shoichi Hasegawa, Masashi Konyo, Ki-Uk Kyung, Takuya Nojima, and Hiroyuki Kajimoto (Eds.). Springer Singapore, Singapore,
          <fpage>393</fpage>
          -
          <lpage>399</lpage>
          .
        </mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>