<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta />
    <article-meta>
      <title-group>
        <article-title>Actuated Peripherals as Tangibles in Desktop Interaction</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Thomas Pietrzak</string-name>
          <email>thomas.pietrzak@univ-lille1.fr</email>
          <xref ref-type="aff" rid="aff2">2</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Gilles Bailly</string-name>
          <email>gilles.bailly@upmc.fr</email>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Sylvain Malacria</string-name>
          <email>sylvain.malacria@inria.fr</email>
          <xref ref-type="aff" rid="aff1">1</xref>
        </contrib>
        <aff id="aff0">
          <label>0</label>
          <institution>CNRS</institution>
          ,
          <country country="FR">France</country>
        </aff>
        <aff id="aff1">
          <label>1</label>
          <institution>Inria</institution>
          ,
          <country country="FR">France</country>
        </aff>
        <aff id="aff2">
          <label>2</label>
          <institution>University of Lille</institution>
          ,
          <country country="FR">France</country>
        </aff>
      </contrib-group>
      <abstract>
        <p>Tangible user interfaces (TUIs) use everyday physical objects to interact with digital information. After decades of use, computer peripherals have become everyday physical objects themselves. We observed that users manipulate them for purposes other than input and output: for example, users turn their screen to avoid sun reflections, or move their keyboard and mouse because they need space on their desk. In this work we regard computer peripherals as everyday objects and use them as TUIs. This paper presents two levels of tangible interaction with desktop computers. The first is a keyboard with actuated keys: the keys can rise from their initial position, which can be used to represent information or to extend interaction with the keyboard. At the second level, we actuated a mouse, a keyboard and a screen so that they can move around on the desk. We present scenarios showing how this extends interaction with a desktop computer setup.</p>
      </abstract>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>Introduction</title>
      <p>
        In the early days of tangible interaction, Ullmer and Ishii described tangible user interfaces this way: “TUIs will augment the real physical world by coupling digital information to everyday physical objects and environments” [
        <xref ref-type="bibr" rid="ref18">18</xref>
        ]. The idea is to
break the barrier between the physical and the digital world.
With this paradigm, any object can either represent digital
information or be a proxy for manipulating digital information.
Similarly to input and output devices, these objects are instrumented with sensors or actuators that link them to the digital world.
      </p>
      <p>The question of whether a computer mouse is a TUI is interesting. On one hand, it complies with Ullmer and Ishii’s definition: since this definition was introduced, computer peripherals have become everyday objects. On the other hand, the computer mouse was specifically designed for interaction with digital information, and would not exist otherwise. Now, consider a user giving a talk with a slideshow. He holds the mouse in his hand and just uses the buttons to move to the next slide, the same way he would with a remote control. In this case he is not using the mouse as it was designed to be used. Is this specific scenario sufficient to consider the computer mouse a TUI?
Further, imagine a computer mouse actuated so that it can move around on the table. This mouse moves around on the desk to give the user notifications when he is not watching the screen. In this situation, the mouse is clearly not used as it was designed to be used. In this work we explore actuation and motion as ways of using computer peripherals that they were not designed for. We discuss scenarios in which computer peripherals are tangible objects for interaction with digital information.</p>
      <p>RELATED WORK
Below we describe evolutions of the desktop interface, the use of motion as an output modality, and shape changing interfaces.</p>
      <p>Rethinking desktop interaction
The way we interact with computer peripherals has changed little since their invention: from an interaction point of view, mice, keyboards and screens have remained largely the same.</p>
      <p>
        Studies have shown that some of these design choices are questionable. For example, Pietrzak et al. studied the impact of mode delimiters for keyboard shortcuts by replicating the CTRL and SHIFT keys on the thumb buttons of the mouse [
        <xref ref-type="bibr" rid="ref12">12</xref>
        ]. They observed performance for keyboard shortcut entry similar to that of the keyboard, which suggests it is worth revisiting design choices made decades ago.
      </p>
      <p>
        Research has also explored additional dimensions to extend the capabilities of computer peripherals. Rekimoto et al. added capacitive sensing to the keys of a keyboard, enabling the system to sense whether the user is touching a key [
        <xref ref-type="bibr" rid="ref16">16</xref>
        ]. They propose scenarios that use this information to display feedforward, and others that exploit this extended vocabulary to enhance interaction.
      </p>
      <p>
        Beyond rethinking desktop devices, Bi et al. used the desk itself for interaction [
        <xref ref-type="bibr" rid="ref3">3</xref>
        ]. They extend the capabilities of peripherals with interaction on the desk, both for multi-touch input and for a projected display. Conversely, Gervais et al. use everyday objects as viewports that share or extend the screen's real estate [
        <xref ref-type="bibr" rid="ref6">6</xref>
        ]. These systems exploit tangible properties of the desktop environment to extend interaction.
      </p>
      <p>Motion output
Motion is a property of the interaction with an object. It is commonly used as an input dimension, but we are interested in the motion of a physical object as an output modality. Motion as output produces both visual and haptic feedback.
      </p>
      <p>
        Löffler et al. designed insect-like desk companions [
        <xref ref-type="bibr" rid="ref11">11</xref>
        ]. These companions can move around on the desk to deliver notifications to the user through the visual channel. The authors focused on their affective effect on the user. Interestingly, motion-based interfaces can take advantage of both the visual and haptic aspects of movement. Zooids are small robots that cooperate to achieve a particular task [
        <xref ref-type="bibr" rid="ref9">9</xref>
        ]. In some situations they represent points on a graph; in others they move an object on a table.
      </p>
      <p>
        Actuating an object enables dynamic force feedback when the user touches it. For example, Roudaut et al. explored the idea of actuating a layer over a touchscreen to guide the finger touching the device [
        <xref ref-type="bibr" rid="ref17">17</xref>
        ]. This makes it possible to teach users gestures, such as gesture-keyboard symbols. Other studies
use motion to encode information. Either the system controls
the movement [
        <xref ref-type="bibr" rid="ref13 ref5">5, 13</xref>
        ], or only constrains the movements of
the user [
        <xref ref-type="bibr" rid="ref14">14</xref>
        ]. Similarly to Tactons [
        <xref ref-type="bibr" rid="ref4">4</xref>
        ], information is encoded by mapping it to signal parameters such
as amplitude, size or shape. Below, we explain how we use motion to extend interaction with computer peripherals.
      </p>
      <p>
        Shape changing interfaces
Actuating objects also makes it possible to change their shape, and therefore their affordances. KnobSlider is an example of an interactive object that is either a button or a slider, depending on its shape [
        <xref ref-type="bibr" rid="ref7">7</xref>
        ]. This object was specifically designed to behave this way. Conversely, Kim et al. designed an inflatable mouse [
        <xref ref-type="bibr" rid="ref8">8</xref>
        ] which can either give notifications or be used as an elastic input device for continuous rate control.
      </p>
      <p>
        ACTUATED PERIPHERALS
In this project we use both motion as output and shape changing interfaces to redesign computer peripherals. We discuss design rationales at the device level and the desktop level, and envision extending the concept to an entire room or house.
      </p>
      <p>
        Device level
Motion is an essential aspect of interaction with
peripherals. Pointing devices rely on movement measurements.
Keyboards use binary key positions as input data. In the
Métamorphe project [
        <xref ref-type="bibr" rid="ref1">1</xref>
        ], we actuated the keys so that they can be either up or down (Figure 1). Keys can still be pressed whether they are up or down.
      </p>
      <p>This shape changing keyboard has new properties compared
to regular keyboards. When a key is up, the user can push it
in four directions, or even pinch it (Figure 2). With a touch
sensor all around it, the key could be used as an isometric
pointing device such as a trackpoint.</p>
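      <p>As an illustration, the extended vocabulary of a raised key could be read by a small classifier; the sensor values (lateral deflections, vertical travel, squeeze) and gesture names below are illustrative assumptions, not the Métamorphe API.

```python
# Hypothetical classifier for the extended vocabulary of a raised key.
# Sensor readings and gesture names are assumptions for illustration,
# not part of the Métamorphe prototype's actual interface.

def classify(dx: float, dy: float, dz: float, squeeze: float,
             threshold: float = 0.3):
    """Map normalised per-key sensor readings to one gesture, or None."""
    if squeeze > threshold:
        return "pinch"                 # key squeezed between two fingers
    if dz < -threshold:
        return "press"                 # regular downward key press
    if abs(dx) < threshold and abs(dy) < threshold:
        return None                    # no significant lateral push
    if abs(dx) >= abs(dy):
        return "push_east" if dx > 0 else "push_west"
    return "push_north" if dy > 0 else "push_south"
```
</p>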
      <p>Our previous studies showed that raising keys eases eyes-free interaction with the keyboard. Specifically, we observed that users can more easily locate raised keys and the surrounding ones.</p>
      <p>
        The possibilities of such a keyboard go beyond text typing
and keyboard shortcuts. Similarly to Relief [
        <xref ref-type="bibr" rid="ref10">10</xref>
        ], it is a shape
changing device that can be used to display information.
      </p>
      <p>
        Desktop level
We observed people using desktop computers, and identified situations in which they move their peripherals for reasons other than interaction with the computer. For example, we observed people turning their screen to avoid sun reflections.
Other users turned their screen either to show visual content
to somebody, or to show something in the room in a video
conference with the camera affixed to the screen. It is also
frequent to move the mouse and keyboard to make space on
the desk for something else.
      </p>
      <p>
        In the LivingDesktop project [
        <xref ref-type="bibr" rid="ref2">2</xref>
        ], we actuated a mouse,
keyboard and screen (Figure 3):
      </p>
      <p>The mouse can translate in the x-y plane.
The keyboard can rotate, and translate in the x-y plane.</p>
      <p>The screen can rotate, and translate along the x axis.
With these capabilities, devices can move on their own without requiring the user to move them. The interesting question here is the degree of control the user has over his devices. There is a continuum between full control and full automation, on which we identify some notable degrees:</p>
      <p>Telekinesis: the user moves the devices with remote controls.</p>
      <p>Tele-operation: the user suggests movements, the device
decides to which degree it complies.</p>
      <p>Constraint: the user defines the constraints of the devices
movements.</p>
      <p>Insurrection: the user has no influence on the device
movements.</p>
      <p>We implemented a couple of scenarios, which illustrate the
concept.</p>
      <p>Peephole display
Even with a large screen, interactive screen real estate is limited. We propose using the screen as a peephole display onto a larger working space. In this scenario, the screen moves along the x axis and its pixels show the content of the workspace at that location in space. The screen acts like a moving physical window. In this scenario the user controls the screen position.</p>
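      <p>A minimal sketch of the peephole mapping, with assumed screen, workspace and track dimensions: the screen's physical position along its track selects which slice of the larger virtual workspace its pixels display.

```python
# Sketch of the peephole mapping; all dimensions are assumed values.

SCREEN_W = 1920      # pixels shown by the physical screen
WORKSPACE_W = 5760   # width of the larger virtual workspace, in pixels
TRACK_MM = 600.0     # physical travel of the screen along the x axis, in mm

def viewport_origin(screen_x_mm: float) -> int:
    """Map the screen's physical x position to the origin of the
    workspace slice its pixels should display."""
    t = min(max(screen_x_mm / TRACK_MM, 0.0), 1.0)  # normalised travel, 0..1
    return round(t * (WORKSPACE_W - SCREEN_W))
```
</p>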
      <p>Video Conference
When video-conferencing with a desktop computer, the
camera is usually affixed to the screen. The problem arises when the user wants to show something he is manipulating outside the camera range: he has to move the screen while he is manipulating it. In this scenario the screen follows the user so
that he can always see the video conference, and show what
he is doing to his collaborators. The user does not control the
screen position in the strict sense of the term. However he can
activate or deactivate this behavior and still control the screen
position manually or with another interface.</p>
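      <p>The screen-following behavior can be sketched as a simple proportional controller running at each tick; the gain and the per-tick speed limit are assumed values, not those of the LivingDesktop prototype.

```python
# Sketch of the screen-following behavior as a proportional controller;
# gain and speed limit are assumed values chosen for illustration.

def follow_step(screen_x: float, user_x: float,
                gain: float = 0.2, max_step: float = 30.0) -> float:
    """One control tick: move the screen a bounded fraction toward the user."""
    step = gain * (user_x - screen_x)
    step = max(-max_step, min(max_step, step))  # limit speed for comfort
    return screen_x + step
```
</p>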
      <p>Ergonomic coach
Being well seated is essential for healthy office work. It
reduces fatigue and pain. It is, however, difficult to pay attention to our posture all day. In this scenario, devices move away
if we are not seated correctly on the chair. The user has no
control over the devices in this situation.</p>
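      <p>A sketch of the coach logic, assuming a boolean posture sensor: while the posture is bad, each tick nudges the device toward an "away" position; once the posture is correct, it returns home. Positions and step size are illustrative values.

```python
# Sketch of the ergonomic-coach logic; the posture sensor, positions
# and step size are assumptions made for the example (millimetres).

def coach_tick(posture_ok: bool, device_x: float,
               home_x: float = 0.0, away_x: float = 400.0,
               step: float = 10.0) -> float:
    """Nudge the device away while posture is bad, back home otherwise."""
    target = home_x if posture_ok else away_x
    if device_x < target:
        return min(device_x + step, target)
    return max(device_x - step, target)
```
</p>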
      <p>
        Going further
Looking at the office environment, many other objects are involved. They too can be actuated to enable other interactive scenarios. Probst et al. presented a prototype of a chair used for input [
        <xref ref-type="bibr" rid="ref15">15</xref>
        ]. These chairs are not actuated, but equipped with sensors. However, Nissan designed parking chairs1 which can move around. In their scenario the chairs move back under the table to tidy the room, but we can envision other scenarios. Pull-down beds are another example of existing moving objects that would be easy to actuate.
At a larger scale, the concept of moving walls makes it possible to fit many rooms into a small flat2. Each wall carries specific equipment, suitable for a particular room. Thinking bigger still, rotating houses are another example of an actuated environment3. The obvious application is to maintain sunlight at a specific location in the house, but there may be many interesting interactive scenarios to study with such a building.
      </p>
      <p>
        CONCLUSION
Early studies of TUIs considered everyday objects for interaction. Nowadays, computer peripherals have become everyday objects. As such, they too can be considered TUIs, as long as they are not used as the devices they were designed to be. We discussed how actuating computer peripherals enables new interactions. We presented a prototype keyboard whose actuated keys can move up and down, as well as a concept of moving computer peripherals that enables new interactions. We envision that this concept can apply to many other objects in our environment. The question we must always keep in mind is the degree of control we would like to keep over these wandering TUIs.
1https://youtu.be/O1D07dTILH0
2https://vimeo.com/110871691
3https://youtu.be/dIHUwp9x8Fg
      </p>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>
          1.
          <string-name>
            <surname>Bailly</surname>
            ,
            <given-names>G.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Pietrzak</surname>
            ,
            <given-names>T.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Deber</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          , and
          <string-name>
            <surname>Wigdor</surname>
            ,
            <given-names>D.</given-names>
          </string-name>
          <article-title>Métamorphe: Augmenting hotkey usage with actuated keys</article-title>
          .
          <source>In Proc. CHI</source>
          '
          <volume>13</volume>
          (Paris, France, May
          <year>2013</year>
          ),
          <fpage>563</fpage>
          -
          <lpage>572</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>
          2.
          <string-name>
            <surname>Bailly</surname>
            ,
            <given-names>G.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Sahdev</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Malacria</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          , and
          <string-name>
            <surname>Pietrzak</surname>
            ,
            <given-names>T.</given-names>
          </string-name>
          <article-title>LivingDesktop: Augmenting desktop workstation with actuated devices</article-title>
          .
          <source>In Proc. CHI</source>
          '
          <volume>16</volume>
          (San Jose, USA, May
          <year>2016</year>
          ),
          <fpage>5298</fpage>
          -
          <lpage>5310</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>
          3.
          <string-name>
            <surname>Bi</surname>
            ,
            <given-names>X.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Grossman</surname>
            ,
            <given-names>T.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Matejka</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          , and
          <string-name>
            <surname>Fitzmaurice</surname>
            ,
            <given-names>G.</given-names>
          </string-name>
          <article-title>Magic desk: bringing multi-touch surfaces into desktop work</article-title>
          .
          <source>In Proc. CHI</source>
          '
          <volume>11</volume>
          (Vancouver, Canada, May
          <year>2011</year>
          ),
          <fpage>2511</fpage>
          -
          <lpage>2520</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>
          4.
          <string-name>
            <surname>Brewster</surname>
            ,
            <given-names>S. A.</given-names>
          </string-name>
          , and
          <string-name>
            <surname>Brown</surname>
            ,
            <given-names>L. M.</given-names>
          </string-name>
          <article-title>Non-visual information display using tactons</article-title>
          .
          <source>In E.A. CHI</source>
          '
          <volume>04</volume>
          (Vienna, Austria, Apr.
          <year>2004</year>
          ),
          <fpage>787</fpage>
          -
          <lpage>788</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>
          5.
          <string-name>
            <surname>Enriquez</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          , and
          <string-name>
            <surname>MacLean</surname>
            ,
            <given-names>K. E.</given-names>
          </string-name>
          <article-title>The hapticon editor: A tool in support of haptic communication research</article-title>
          .
          <source>In Proc. HAPTICS</source>
          <year>2003</year>
          (Los Angeles, USA, Mar.
          <year>2003</year>
          ),
          <fpage>356</fpage>
          -
          <lpage>362</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>
          6.
          <string-name>
            <surname>Gervais</surname>
            ,
            <given-names>R.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Roo</surname>
            ,
            <given-names>J. S.</given-names>
          </string-name>
          , and
          <string-name>
            <surname>Hachet</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          <article-title>Tangible viewports: Getting out of flatland in desktop environments</article-title>
          .
          <source>In Proc. TEI</source>
          '
          <volume>16</volume>
          (Funchal, Portugal,
          <year>2016</year>
          ),
          <fpage>176</fpage>
          -
          <lpage>184</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>
          7.
          <string-name>
            <surname>Kim</surname>
            ,
            <given-names>H.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Coutrix</surname>
            ,
            <given-names>C.</given-names>
          </string-name>
          , and
          <string-name>
            <surname>Roudaut</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          <article-title>Knobslider: design of a shape-changing device grounded in users' needs</article-title>
          .
          <source>In Proc. IHM</source>
          <year>2016</year>
          (Fribourg, Switzerland, Oct.
          <year>2016</year>
          ),
          <fpage>91</fpage>
          -
          <lpage>102</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>
          8.
          <string-name>
            <surname>Kim</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Kim</surname>
            ,
            <given-names>H.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Lee</surname>
            ,
            <given-names>B.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Nam</surname>
            ,
            <given-names>T.-J.</given-names>
          </string-name>
          , and
          <string-name>
            <surname>Lee</surname>
            ,
            <given-names>W.</given-names>
          </string-name>
          <article-title>Inflatable mouse: volume-adjustable mouse with air-pressure-sensitive input and haptic feedback</article-title>
          .
          <source>In Proc. CHI</source>
          '
          <volume>08</volume>
          (Florence, Italy, Apr.
          <year>2008</year>
          ),
          <fpage>211</fpage>
          -
          <lpage>224</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref9">
        <mixed-citation>
          9.
          <string-name>
            <surname>Le Goc</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Kim</surname>
            ,
            <given-names>L. H.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Parsaei</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Fekete</surname>
            ,
            <given-names>J.-D.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Dragicevic</surname>
            ,
            <given-names>P.</given-names>
          </string-name>
          , and
          <string-name>
            <surname>Follmer</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          <article-title>Zooids: Building blocks for swarm user interfaces</article-title>
          .
          <source>In Proc. UIST</source>
          '
          <volume>16</volume>
          (Tokyo, Japan, Oct.
          <year>2016</year>
          ),
          <fpage>97</fpage>
          -
          <lpage>109</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref10">
        <mixed-citation>
          10.
          <string-name>
            <surname>Leithinger</surname>
            ,
            <given-names>D.</given-names>
          </string-name>
          , and
          <string-name>
            <surname>Ishii</surname>
            ,
            <given-names>H.</given-names>
          </string-name>
          <article-title>Relief: a scalable actuated shape display</article-title>
          .
          <source>In Proc. TEI</source>
          '
          <volume>10</volume>
          (Cambridge, USA,
          <year>2010</year>
          ),
          <fpage>221</fpage>
          -
          <lpage>222</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref11">
        <mixed-citation>
          11.
          <string-name>
            <surname>Löffler</surname>
            ,
            <given-names>D.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Kaul</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          , and
          <string-name>
            <surname>Hurtienne</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          <article-title>Expected behavior and desired appearance of insect-like desk companions</article-title>
          .
          <source>In Proc. TEI</source>
          '
          <volume>17</volume>
          (Yokohama, Japan,
          <year>2017</year>
          ),
          <fpage>289</fpage>
          -
          <lpage>297</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref12">
        <mixed-citation>
          12.
          <string-name>
            <surname>Pietrzak</surname>
            ,
            <given-names>T.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Malacria</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          , and
          <string-name>
            <surname>Bailly</surname>
            ,
            <given-names>G.</given-names>
          </string-name>
          <article-title>CtrlMouse et TouchCtrl : Dupliquer les délimiteurs de mode sur la souris</article-title>
          .
          <source>In Proc. IHM</source>
          <year>2014</year>
          (Lille, France, Oct.
          <year>2014</year>
          ),
          <fpage>38</fpage>
          -
          <lpage>47</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref13">
        <mixed-citation>
          13.
          <string-name>
            <surname>Pietrzak</surname>
            ,
            <given-names>T.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Martin</surname>
            ,
            <given-names>B.</given-names>
          </string-name>
          , and
          <string-name>
            <surname>Pecci</surname>
            ,
            <given-names>I.</given-names>
          </string-name>
          <article-title>Affichage d'informations par des impulsions haptiques</article-title>
          .
          <source>In Proc. IHM</source>
          <year>2005</year>
          (Toulouse, France, Sept.
          <year>2005</year>
          ),
          <fpage>223</fpage>
          -
          <lpage>226</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref14">
        <mixed-citation>
          14.
          <string-name>
            <surname>Pietrzak</surname>
            ,
            <given-names>T.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Martin</surname>
            ,
            <given-names>B.</given-names>
          </string-name>
          , and
          <string-name>
            <surname>Pecci</surname>
            ,
            <given-names>I.</given-names>
          </string-name>
          <article-title>Information display by dragged haptic bumps</article-title>
          .
          <source>In Proc. Enactive/05</source>
          (Genova
          , Italy,
          <year>2005</year>
          ).
        </mixed-citation>
      </ref>
      <ref id="ref15">
        <mixed-citation>
          15.
          <string-name>
            <surname>Probst</surname>
            ,
            <given-names>K.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Lindlbauer</surname>
            ,
            <given-names>D.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Haller</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Schwartz</surname>
            ,
            <given-names>B.</given-names>
          </string-name>
          , and
          <string-name>
            <surname>Schrempf</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          <article-title>A chair as ubiquitous input device: exploring semaphoric chair gestures for focused and peripheral interaction</article-title>
          .
          <source>In Proc. CHI</source>
          '
          <volume>14</volume>
          (Toronto, Canada, May
          <year>2014</year>
          ),
          <fpage>4097</fpage>
          -
          <lpage>4106</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref16">
        <mixed-citation>
          16.
          <string-name>
            <surname>Rekimoto</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Ishizawa</surname>
            ,
            <given-names>T.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Schwesig</surname>
            ,
            <given-names>C.</given-names>
          </string-name>
          , and
          <string-name>
            <surname>Oba</surname>
            ,
            <given-names>H.</given-names>
          </string-name>
          <article-title>PreSense: interaction techniques for finger sensing input devices</article-title>
          .
          <source>In Proc. UIST</source>
          '
          <volume>03</volume>
          (Vancouver, Canada, Nov.
          <year>2003</year>
          ),
          <fpage>203</fpage>
          -
          <lpage>212</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref17">
        <mixed-citation>
          17.
          <string-name>
            <surname>Roudaut</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Rau</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Sterz</surname>
            ,
            <given-names>C.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Plauth</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          , and
          <string-name>
            <surname>Baudisch</surname>
            ,
            <given-names>P.</given-names>
          </string-name>
          <article-title>Gesture output: eyes-free output using a force feedback touch surface</article-title>
          .
          <source>In Proc. CHI</source>
          '
          <volume>13</volume>
          (Paris, France, May
          <year>2013</year>
          ),
          <fpage>2547</fpage>
          -
          <lpage>2556</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref18">
        <mixed-citation>
          18.
          <string-name>
            <surname>Ullmer</surname>
            ,
            <given-names>B.</given-names>
          </string-name>
          , and
          <string-name>
            <surname>Ishii</surname>
            ,
            <given-names>H.</given-names>
          </string-name>
          <article-title>Tangible bits: towards seamless interfaces between people, bits and atoms</article-title>
          .
          <source>In Proc. CHI</source>
          '
          <volume>97</volume>
          (Atlanta, USA,
          <year>1997</year>
          ),
          <fpage>234</fpage>
          -
          <lpage>241</lpage>
          .
        </mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>