<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta />
    <article-meta>
      <title-group>
        <article-title>Dronible: Operating drones with Tangible objects</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Jérémie Garcia</string-name>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Nicolas Viot</string-name>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Dong Bach Vo</string-name>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Sylvain Pauchet</string-name>
        </contrib>
        <aff>ENAC - Université de Toulouse, France</aff>
      </contrib-group>
      <abstract>
        <p>The goal of the Dronible workshop is to design and prototype interactions for flying and operating drones using tangible objects. This hands-on workshop starts with a video brainstorming session for rapid, physical exploration of potential interactions. The workshop continues with a digital prototyping phase using small-sized drones, motion sensors and haptic actuators. Finally, a presentation session allows all participants to show off their prototypes and reflect on future Human-Drone Interaction perspectives.</p>
      </abstract>
      <kwd-group>
        <kwd>Human-Drone Interaction</kwd>
        <kwd>Tangible User Interface</kwd>
        <kwd>Haptic Feedback</kwd>
        <kwd>Prototyping</kwd>
      </kwd-group>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>Schedule</title>
      <p>Afternoon:</p>
      <list list-type="bullet">
        <list-item>
          <p>Tutorial on the drones, sensors and actuators available (30 minutes)</p>
        </list-item>
        <list-item>
          <p>Physical and digital prototyping session (120 minutes)</p>
        </list-item>
        <list-item>
          <p>Prototype presentations and discussions (30 minutes)</p>
        </list-item>
      </list>
    </sec>
    <sec id="sec-2">
      <title>Apparatus</title>
      <p>We provided several small-sized programmable drones (Bitcraze Crazyflie). We also provided various physical prototyping materials and sensing modules from Bitalino R-IoT. These modules embed an inertial motion unit that allowed the participants to retrieve acceleration and rotation data, as well as input data from external sensors such as force-sensitive resistors or potentiometers. For the haptic feedback part, we provided DFRobot Bluetooth audio cards (DFR0720) and LRA vibrating motors. Programming sketches were provided in advance via GitHub (<ext-link ext-link-type="uri" xlink:href="https://github.com/jeremie-garcia/dronible">https://github.com/jeremie-garcia/dronible</ext-link>) to facilitate the workshop.</p>
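      <p>As an illustration of how the inertial data from such sensing modules can be turned into a control signal, the following minimal Python sketch reduces a stream of accelerometer samples to a single motion-energy value. It is hypothetical (not the code distributed in the repository) and assumes samples expressed in g units:</p>

```python
import math

def motion_energy(samples, gravity=1.0):
    """Return the mean magnitude of dynamic acceleration.

    samples: iterable of (ax, ay, az) tuples in g units, as an inertial
    motion unit such as the Bitalino R-IoT might report them.
    """
    samples = list(samples)
    if not samples:
        return 0.0
    total = 0.0
    for ax, ay, az in samples:
        magnitude = math.sqrt(ax * ax + ay * ay + az * az)
        # Subtract the static 1 g gravity component to keep only motion.
        total += abs(magnitude - gravity)
    return total / len(samples)

# A sensor at rest reports roughly 1 g of static acceleration,
# so its motion energy is close to zero.
print(motion_energy([(0.0, 0.0, 1.0)] * 10))
```

      <p>A value derived this way can then be smoothed and thresholded before being mapped onto a drone parameter such as altitude or speed.</p>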
      <p>The afternoon prototyping session took place at the ENAC flight hall to ensure participants’ safety
while testing their flying prototypes. The hall also features a recording system to document the results
of the workshop.</p>
    </sec>
    <sec id="sec-3">
      <title>Results</title>
      <p>The workshop started with a brief overview of Human-Drone Interaction research before moving to
the video brainstorming phase.</p>
    </sec>
    <sec id="sec-4">
      <title>Video brainstorming</title>
      <p>
        During this first part, we split into two groups to brainstorm on tangible interactions for operating drones with vibrotactile feedback. We used video brainstorming techniques to enact drone interactions with minimal prototype-design effort [
        <xref ref-type="bibr" rid="ref2">2</xref>
        ]. Each group came up with at least ten ideas involving single or multiple users and various physical devices to operate drones, such as:
        <list list-type="bullet">
          <list-item>
            <p>Performing musical instrument motions while the drones create auditory feedback in flight.</p>
          </list-item>
          <list-item>
            <p>Various games in which a ball is thrown to a player; the drone follows the ball and, after some time, buzzes to indicate who lost.</p>
          </list-item>
          <list-item>
            <p>A fitness assistant that reacts to exercise effort by climbing to a high altitude and slowly returning to the ground during breaks between workouts.</p>
          </list-item>
          <list-item>
            <p>Controlling a fleet of drones with tangible objects placed on a map to specify their target positions and the constraints between the drones.</p>
          </list-item>
        </list>
      </p>
      <p>After a quick demonstration of the available technologies for prototyping, participants voted for their favorite ideas from the video brainstorming sessions. The candidate ideas were:
        <list list-type="bullet">
          <list-item>
            <p>A drone as a dance partner that would react to performed dance moves and be integrated into the choreography. The design envisioned that the dancer would drop a ball attached to her wrist to make the drone take off. Rotation motions would then influence the drone's speed and altitude, or the circular motions it performed near the dancer.</p>
          </list-item>
          <list-item>
            <p>A physical remote controller using pressure and rotation to fly the drone. Two pressure-sensitive areas controlled the drone's lateral direction and its altitude.</p>
          </list-item>
          <list-item>
            <p>A fitness assistant reacting to the user's effort. The user receives vibrotactile feedback that gives rhythm information for the gestures to be repeated. The drone would increase its altitude according to the energy of the motions sensed in the weights. At the end of the workout, the drone would slowly land to indicate the remaining time before the next exercise.</p>
          </list-item>
        </list>
All three groups managed to build physical prototypes, but the digital prototypes using Crazyflies were not very functional due to technical difficulties in operating the drones simultaneously.</p>
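      <p>The altitude behaviour described for the fitness assistant can be sketched as a clamped linear mapping from motion energy to an altitude setpoint. The Python fragment below is a hypothetical illustration (the gain and bounds are assumptions, and actually sending the setpoint to a Crazyflie would additionally require Bitcraze's cflib):</p>

```python
def target_altitude(energy, floor=0.3, ceiling=2.0, gain=1.5):
    """Map a motion-energy reading to an altitude setpoint in metres.

    Hypothetical mapping for the fitness-assistant idea: more vigorous
    motion raises the drone; the result is clamped to [floor, ceiling].
    """
    return min(max(floor + gain * energy, floor), ceiling)

# No motion keeps the drone hovering at the floor altitude;
# vigorous motion saturates at the ceiling.
print(target_altitude(0.0))
print(target_altitude(5.0))
```

      <p>Lowering the setpoint back towards the floor over time would then produce the slow landing used to signal the break between exercises.</p>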
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>
          [1]
          <string-name>
            <surname>Cauchard</surname>
            ,
            <given-names>J. R.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Khamis</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Garcia</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Kljun</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          , &amp;
          <string-name>
            <surname>Brock</surname>
            ,
            <given-names>A. M.</given-names>
          </string-name>
          (
          <year>2021</year>
          ).
          <article-title>Toward a roadmap for human-drone interaction</article-title>
          .
          <source>ACM Interactions</source>
          ,
          <volume>28</volume>
          (
          <issue>2</issue>
          ),
          <fpage>76</fpage>
          -
          <lpage>81</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>
          [2]
          <string-name>
            <surname>Mackay</surname>
            ,
            <given-names>W. E.</given-names>
          </string-name>
          , &amp;
          <string-name>
            <surname>Fayard</surname>
            ,
            <given-names>A. L.</given-names>
          </string-name>
          (
          <year>1999</year>
          , May).
          <article-title>Video brainstorming and prototyping: techniques for participatory design</article-title>
          .
          <source>In ACM/CHI'99 extended abstracts on Human factors in computing systems</source>
          (pp.
          <fpage>118</fpage>
          -
          <lpage>119</lpage>
          ).
        </mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>
          [3]
          <string-name>
            <surname>Tezza</surname>
            ,
            <given-names>D.</given-names>
          </string-name>
          , &amp;
          <string-name>
            <surname>Andujar</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          (
          <year>2019</year>
          ).
          <article-title>The state-of-the-art of human-drone interaction: A survey</article-title>
          .
          <source>IEEE Access</source>
          ,
          <volume>7</volume>
          ,
          <fpage>167438</fpage>
          -
          <lpage>167454</lpage>
          .
        </mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>