<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta />
    <article-meta>
      <title-group>
        <article-title>Developing Multi-touch Software through Creative Destruction</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Ingmar S. Franke, Dietrich Kammer</string-name>
          <email>ingmar.franke@tu-dresden.de</email>
          <xref ref-type="aff" rid="aff1">1</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Simone Happ, Juliane Steinhauf</string-name>
          <email>juliane.steinhauf@t-systems.com</email>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <aff id="aff0">
          <label>0</label>
          <institution>Frank Schönefeld, T-Systems Multimedia Solutions GmbH</institution>
          ,
          <addr-line>Riesaer Straße 5, D-01129 Dresden, +49 351 2820 0</addr-line>
        </aff>
        <aff id="aff1">
          <label>1</label>
          <institution>Rainer Groh, Technische Universität Dresden</institution>
          ,
          <addr-line>Fakultät Informatik, Professur Mediengestaltung, Nöthnitzer Straße 46, D-01187 Dresden, +49 351 463 39261</addr-line>
        </aff>
      </contrib-group>
      <abstract>
        <p>Tangible and gesture-based interfaces are made possible by hardware like the Microsoft® Surface device. Despite the commitment of big business, the killer application is yet to be found. To address this challenge, we investigate the innovation process. Against this background, a workshop was set up and conducted. During brainstorming, focuses were clarified by introducing a triad of innovation, which revealed potentials for entertaining multi-touch applications. We respond to these potentials with the custom-made solutions realized during our workshop.</p>
      </abstract>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>-</title>
      <p>
        Keywords: natural user interfaces, gesture-based interfaces, software design, creativity techniques.
MOTIVATION
Multi-touch technology is becoming available to customers
today. The direct interaction paradigm supported by
multi-touch devices has been the focus of researchers for a long
period of time [
        <xref ref-type="bibr" rid="ref6">6</xref>
        ][
        <xref ref-type="bibr" rid="ref8">8</xref>
        ][
        <xref ref-type="bibr" rid="ref16">16</xref>
        ], and is being carried into the
mainstream by mature products such as the iPhone and iPad from
Apple Inc. [
        <xref ref-type="bibr" rid="ref2">2</xref>
        ] or laptops equipped with N-Trig®
technology, such as the TouchSmart series from Hewlett-Packard [
        <xref ref-type="bibr" rid="ref9">9</xref>
        ] or the Dell Latitude™ XT2 [
        <xref ref-type="bibr" rid="ref4">4</xref>
        ]. The
Microsoft® Surface device demonstrates how the
technology scales to larger setups, which will be available
to consumers in the future. Multi-touch interfaces excel in
user experience: interaction is more entertaining because users
manipulate digital content directly and without encumbrance.
BACKGROUND
In contrast to the mature and robust hardware, software
products for large-scale multi-touch devices are lagging
behind. To overcome this discrepancy, we consider
commoditisation as described by Simon Wardley [
        <xref ref-type="bibr" rid="ref15">15</xref>
        ]. We
seek to adapt his conclusions to multi-touch technology.
Figure 1 shows how novel ideas, generated from
discoveries or inventions, are transformed into common
and ubiquitous appliances that are central to the
development of a service infrastructure. This can be
illustrated by the example of electricity [
        <xref ref-type="bibr" rid="ref15">15</xref>
        ]. From its
inception, there have been innovative inventions and
prototypes. This was followed by custom-made solutions to
power specific appliances. Products like electric generators
or the light bulb were followed by a network of different
services offered by power companies, providing the
appropriate infrastructure to make electricity a ubiquitous
phenomenon.
With multi-touch technology, we are currently seeing a lot
of custom-made solutions. In this contribution, we present
approaches to exploit custom-made solutions to generate
knowledge and reusable components in order to evolve into
the stage of products. In particular, the entertaining qualities
and ease of use of these products are key to their success.
In order to generate knowledge concerning the
development of multi-touch software, it was decided to
build on the key instruments of commoditisation. The goal
was to exploit creative destruction and adaptation of
existing solutions (cp. Figure 1). As commoditisation
shows, developers are under increasing pressure to gain
advantages in the innovation process.
WORKSHOP SETTING
As a consequence of the previous considerations, a
workshop was set up by a group of researchers and
business experts in product development with the following
terms and conditions. Ten students were picked from
applicants with knowledge of software development and fine
arts. None of them had any prior experience with multi-touch
technology. The goal was to support creative
destruction, in contrast to the more technical approach of
professionals in the field of software development.
The initial kick-off meeting included a traditional
brainstorming [
        <xref ref-type="bibr" rid="ref12">12</xref>
        ]. Thoughts were organized with regard
to different focuses in a triad of innovation (cp. Figure 2).
Multi-touch technology presents certain challenges, e.g.
the orientation of text in a bargaining situation. On the
other hand, concrete solutions are made possible by the
properties of multi-touch tables, e.g. collaborative work
with simultaneous interaction on one device. Other ideas
are of a more conceptual nature. This is the case for
interfaces based on tangible objects or 3D modelling
software. Creative destruction was enforced by presenting
the innovative technology to students without going into
technical details and limitations the technology currently
exhibits. As a result, the capabilities were questioned and
sometimes even overestimated, revealing requirements for
future versions of the hardware. For instance, identification
of unique users and mapping of touch interactions to users’
hands were discussed. Some of these modalities already
exist in research setups [
        <xref ref-type="bibr" rid="ref5">5</xref>
        ]. After the initial brainstorming,
the students were acquainted with the concrete features and
capabilities of the multi-touch table. Adaptation of these
properties was crucial to implement the ideas from the
brainstorming (cp. Figure 1).
      </p>
      <p>
        Students were asked to choose their focuses and refine the
scenarios into an application and a working prototype. This
was realized by providing a Microsoft® Surface Device
and workstations running Windows Vista and Visual
Studio with the Surface SDK [
        <xref ref-type="bibr" rid="ref11">11</xref>
        ]. As a result, the
following technical capabilities were addressed:
• Multi-user setup of the table allows collaborative work
• Physical objects can be used for interaction (tagging)
• Recognition of 52 simultaneous touch contacts
• Bluetooth, infrared, and wireless LAN available
      </p>
    </sec>
    <sec id="sec-3">
      <title>-</title>
      <p>Over the course of 14 days, five teams of two students
worked on one project each. Each day commenced with a
status report, enabling the workshop advisers to continuously
evaluate the process of creative destruction and allowing
peer review among the students. The first week was
focused on developing conclusive concepts and getting to
know the technical properties of the Microsoft® Surface
Device. The second week was dedicated to the
implementation of the prototypes. Our initial non-technical
approach enhanced the creative quality of the resulting
applications. In addition, advisers offered consulting to
support the process of creative destruction.</p>
      <p>During the workshop the standard Surface SDK was
adapted to the requirements of each individual approach. At
this point, knowledge about the workflow in software
development for multi-touch technology was generated.
New controls that adapt and extend the functionality
provided by the Surface SDK were introduced.</p>
      <p>In the next section, we present the five applications and
their underlying scenarios that resulted from our workshop
of creative destruction.</p>
      <p>
        CASE STUDIES
This section describes the custom-made solutions designed
and implemented during the workshop (cp. Figure 1).
Different modes of interaction emerged from each
application. A reference to all interaction modes is
provided in Table 1, along with the specific switching
patterns used in each scenario. This is further evidence that
universal concepts like WIMP (Windows, Icons, Menus,
Pointing devices) have to be abandoned when designing
multi-touch interfaces [
        <xref ref-type="bibr" rid="ref3">3</xref>
        ]. Individual solutions adapted to
their specific requirements are advisable.
      </p>
      <sec id="sec-3-7">
        <title>Modes of interaction</title>
        <p>Table 1 lists, for each application, the two modes of interaction and the pattern used to switch between them. Navigatable: navigation vs. modification, switched by a five-finger gesture in 3D content. Tangible Design Helper: presentation vs. edit, switched by putting down the tangible interaction PUK. SurfaceReader: alignment vs. annotations, switched via a menu (tap on icon). MySpace: collaboration vs. private work, switched by a pull-forward gesture in 2D content that brings up the private workspace. TagIt: tagging vs. filter, switched by tangible interaction with content or the application area.</p>
      </sec>
    </sec>
    <sec id="sec-13">
      <title>-</title>
      <sec id="sec-13-1">
        <title>Navigatable</title>
        <p>The Navigatable application was focused on the challenge
of adapting natural human gestures to the multi-touch
scenario. Human gestures are three-dimensional, as is 3D
computer graphics. Multi-touch interfaces therefore introduce
a dimensional discrepancy: data from a two-dimensional
interface has to be interpreted and mapped into
three-dimensional space.
In our approach, users can switch between the navigation
and modification modes by using a five-finger gesture
performed with one hand (Figure 3, top picture). As visual
feedback, the background changes to a darker shade of
grey. Auxiliary objects are used to translate models on each
of the three axes in space. This is done by dragging one of
the coloured manipulation spheres (Figure 3, bottom
picture). Multi-touch is further exploited by dragging two
manipulation spheres simultaneously. Consequently, the
translation axis in space is defined between the two
spheres. Tapping on a manipulation sphere rotates the
selected object or part of the object to the corresponding
section of the model.</p>
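        <p>The mode switch just described can be condensed into a short sketch. The following Python fragment is our own illustrative reconstruction, not code from the workshop or the Surface SDK; the names and the span threshold are assumptions.</p>
        <preformat>
```python
import operator

# Illustrative sketch (not Surface SDK code): toggle between the
# navigation and modification modes when five contacts belonging to
# one hand touch down close together.
MODE_NAVIGATION = "navigation"
MODE_MODIFICATION = "modification"

def contacts_span(points):
    # Diameter of the touch cluster: the largest pairwise distance.
    return max(((ax - bx) ** 2 + (ay - by) ** 2) ** 0.5
               for ax, ay in points for bx, by in points)

def next_mode(mode, points, max_span=200.0):
    # Five simultaneous contacts within a hand-sized span flip the
    # mode; any other contact pattern leaves it unchanged.
    # (max_span is a hypothetical threshold in device pixels.)
    if len(points) == 5 and operator.le(contacts_span(points), max_span):
        return MODE_MODIFICATION if mode == MODE_NAVIGATION else MODE_NAVIGATION
    return mode
```
        </preformat>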
        <p>By putting the whole hand on the surface, every
modification is reversed and the model is presented without
auxiliary objects. Models are loaded into Navigatable by
putting actual file cards on the table. They show a preview
picture of the model on one side and a tag is applied on the
flipside for recognition purposes.</p>
        <p>In navigation mode, the familiar two-finger pinch gesture
is used to zoom in and out. Rotating is performed with a
one-finger swiping motion, and panning of the camera is
realized by a three-finger swipe. It is possible to
select an object in the scene by tapping on it. This will
change the camera focus to the selected object and the pivot
point for rotating is set to the object’s centre.</p>
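        <p>The navigation gestures above amount to a dispatch on contact count plus a distance ratio for the pinch. The sketch below is our own simplification under assumed names, not the application's actual implementation.</p>
        <preformat>
```python
# Two fingers pinch to zoom, one finger swipes to rotate, and three
# fingers pan the camera; the zoom factor is the ratio of the new
# finger distance to the old one.

def navigation_action(contact_count):
    # Map the number of touching fingers to a camera operation.
    return {1: "rotate", 2: "zoom", 3: "pan"}.get(contact_count, "none")

def pinch_zoom_factor(old_pair, new_pair):
    # A ratio above 1 zooms in; below 1 zooms out.
    def dist(pair):
        (ax, ay), (bx, by) = pair
        return ((ax - bx) ** 2 + (ay - by) ** 2) ** 0.5
    return dist(new_pair) / dist(old_pair)
```
        </preformat>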
        <p>Navigatable answers the challenge of using the
two-dimensional interaction surface of a multi-touch table as a
proxy for three-dimensional objects. In this innovative
undertaking, a set of tools and gestures was defined that
can be applied to 3D modelling software in general. The
seemingly complex gesture interaction necessary to
navigate in three-dimensional space is easily learned due to
the entertaining qualities of multi-touch interaction: direct
feedback and unencumbered contact with
the multi-touch surface, which displays digital content.
Tangible Design Helper
Designing a laptop cover in a collaborative, tangible based
manner is possible with our Tangible Design Helper
application. Existing design templates are available and
custom pictures and content can be added.</p>
        <p>Apart from finger-based gestures, the program employs
a tangible input device. The Personal User Key (PUK) is
used to interact with the system and serves as a visible
token that avoids conflicts between users. Only one user is
able to use the PUK at any time, allowing
undisturbed interaction. Removing the PUK from the
interaction surface disables all editing features and switches
into presentation mode. Standard gestures to rotate and to
zoom in and out of the laptop cover are available to the
users surrounding the multi-touch table.</p>
        <p>In interaction mode, elements on the cover can be selected
and resized by all users with finger gestures. The PUK is
managed by one user at a time and offers different tools to
change fonts and colours (Figure 4, bottom picture) and
allows the insertion of new elements (Figure 4, middle picture).
Tangible Design Helper incorporates simultaneous
interaction of multiple users on the laptop cover and the
PUK as a singular element to avoid conflicts. Natural
understanding of the users at the table is exploited to reach
agreements regarding the management of the PUK and its
functionality.</p>
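        <p>The PUK behaviour can be summarized as a small state machine. The following Python sketch is our own illustration of that single-token policy; the class and method names are assumptions, not the workshop code.</p>
        <preformat>
```python
# Editing is enabled only while the tangible Personal User Key (PUK)
# rests on the surface, and only one user may hold it at a time;
# removing it returns the table to presentation mode.

class DesignTable:
    def __init__(self):
        self.mode = "presentation"
        self.puk_owner = None

    def puk_placed(self, user):
        # A second placement is ignored while another user holds the PUK.
        if self.puk_owner is None:
            self.puk_owner = user
            self.mode = "edit"

    def puk_removed(self):
        self.puk_owner = None
        self.mode = "presentation"
```
        </preformat>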
        <p>For instance, this tool can be used in a store to personalize
a product in a comfortable and spontaneous way.
Multi-touch tables are often used in business-to-consumer
settings. Tangible Design Helper demonstrates sales
conversations with actual participation of the customer.
This custom-made solution is also a showcase for point of
sale situations in which the setup of physical objects on the
interaction surface can influence the system state (cp.
Figure 1).
SurfaceReader
The SurfaceReader application enables users to read and
annotate electronic documents in a collaborative manner.
Documents are duplicated for each user and aligned to be
read properly. Annotations are made by switching into
annotation mode (Figure 5, bottom picture). To this end, a
menu is located at the edge of each document. A master
key (red circular button in Figure 5, top picture) can be
dragged onto one of the duplicates. The navigation and
annotations are reflected on each of the other documents.
The SurfaceReader realizes a bargaining table with
electronic documents and can be extended to a distributed
setting. Documents can be shared on several tables via
network connection. Cooperative work as well as teaching
and learning are realized and assisted by the concepts
introduced in this solution.</p>
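        <p>The master-key mechanism is essentially a broadcast from one document copy to its duplicates. The sketch below is a hypothetical reconstruction in Python; the prototype's actual classes are not published.</p>
        <preformat>
```python
# Annotations made on the copy holding the master key are mirrored on
# every duplicate; copies without the key annotate only themselves.

class SharedDocument:
    def __init__(self, readers):
        self.annotations = [[] for _ in range(readers)]
        self.master = 0  # index of the copy the master key was dragged onto

    def annotate(self, copy_index, note):
        if copy_index == self.master:
            # Master copy: broadcast to every duplicate.
            for notes in self.annotations:
                notes.append(note)
        else:
            # Ordinary copy: annotate locally only.
            self.annotations[copy_index].append(note)
```
        </preformat>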
        <p>
          TagIt
The TagIt application supports users in tagging their photo
collections. Tangible objects shaped like stamps are used.
Tagging is facilitated, and users are motivated to actually
organize and classify their photo collection. Collaborative
tagging is possible by employing more than one stamp.
Each stamp is individually associated with a set of tags.
Users can activate their stamp-pad from the side of the
table (on the bottom and right of Figure 6, top picture).
New tags can be defined (Figure 6, bottom picture) and
combined to be associated with a stamp. Besides the
viewing and tagging mode, it is also possible to put a stamp
on the background of the photo application. The filter mode
is activated, and only photos matching the tags of the
particular stamp are displayed. The TagIt application
demonstrates how skills, rules and knowledge of users can
be exploited in natural user interfaces [
          <xref ref-type="bibr" rid="ref13">13</xref>
          ]. The notion of
indexing content by stamping is a cultural tradition that has
been known even before printing was invented by
Gutenberg in the 15th century. The concept has proven to
be stable even in applications like Adobe’s Photoshop®
toolbox. Multi-touch rediscovers the actual manual process.
MySpace
The MySpace application targets collaborative work on
multi-touch tables. Simple whiteboard or brainstorming
activities can be assisted by introducing personalized,
private workspaces and a shared workspace for
collaborative interaction.
        </p>
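        <p>TagIt's two stamp behaviours described above, tagging a photo and filtering the view, can be sketched as set operations. This is our own illustration with assumed data structures, not the workshop code.</p>
        <preformat>
```python
# Stamping a photo adds the stamp's tag set; stamping the background
# filters the collection down to photos carrying every stamp tag.

def stamp_photo(photo_tags, stamp_tags):
    # Tagging mode: the photo keeps its old tags and gains the new ones.
    return sorted(set(photo_tags) | set(stamp_tags))

def filter_photos(photos, stamp_tags):
    # Filter mode: keep only photos whose tags include all stamp tags.
    wanted = set(stamp_tags)
    return [name for name, tags in photos.items() if wanted.issubset(tags)]
```
        </preformat>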
        <p>
          Personalized tags identify users and serve to provide data
and content. Private areas are activated at the edges of the
table by putting down a user tag (Figure 7, top picture).
Private workspaces allow interaction only from the
corresponding location of the user. This is achieved by
matching the finger orientation provided by the Surface
SDK to the correct workspace [
          <xref ref-type="bibr" rid="ref14">14</xref>
          ]. Pictures, text, and other
objects can be arranged in private workspaces. To make
any object publicly available for collaborative
manipulation, it is dragged into the shared workspace with
three fingers. The ring menu (Figure 7, bottom picture) is
used to store objects and to access manipulation tools. For
instance, text labels can be associated with each other by
means of the connector tool. Moreover, this ring menu is
fully customizable and was refactored for reuse in other
applications. If one of the private workspaces is not needed,
it can be minimized with a four-finger pull gesture.
This increases the size of the shared workspace. It also
frees up space for the private workspaces of other users.
Similar to the SurfaceReader application, a distributed
version of the MySpace application is conceivable. Users in
various places can work with several multi-touch tables on
joint activities. To this end, personalized tags could be used
to carry private content and the individual workspace
layout from one table to the other. This supports the
knowledge management of each user over time and
different locations.
        </p>
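        <p>Restricting a private workspace to touches from its owner's side of the table reduces to comparing the reported finger orientation with the heading expected at that edge. The sketch below is our assumption about how such a check could look; the edge headings and tolerance are invented values, not taken from the Surface SDK.</p>
        <preformat>
```python
import operator

# A touch is accepted by a private workspace only if its finger
# orientation roughly matches the direction a user standing at that
# table edge would produce (headings in degrees are illustrative).
EDGE_HEADINGS = {"south": 90.0, "north": 270.0, "west": 0.0, "east": 180.0}

def accepts_touch(edge, finger_orientation_deg, tolerance=60.0):
    # Angular difference wrapped into the range 0..180 degrees.
    diff = abs(finger_orientation_deg - EDGE_HEADINGS[edge]) % 360.0
    diff = min(diff, 360.0 - diff)
    # True when the difference stays within the tolerance cone.
    return operator.le(diff, tolerance)
```
        </preformat>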
        <p>
          CONCLUSIONS AND FUTURE WORK
In this contribution, we presented a working approach to
leveraging the potentials of multi-touch technology. The
workshop setting we described can help to put forth
innovative and sustainable concepts. As a rationale for this
setting, the framework of commoditisation with its key
instruments of creative destruction and adaptation was
exploited (cp. Figure 1). In Figure 2, thoughts and ideas are
arranged according to their focuses on conceptual work,
challenges and solutions. Prior to the definition of
custom-made solutions, this triad revealed potentials of multi-touch
technology. This is one approach to support creative
destruction. Other techniques, such as TRIZ [
          <xref ref-type="bibr" rid="ref1">1</xref>
          ], are
available; however, due to its complexity, TRIZ is not suitable
for the presented workshop setting. Similarly, methods like
Synectics [
          <xref ref-type="bibr" rid="ref7">7</xref>
          ] require more time and training to be effective.
In the future, we seek to further deconstruct existing
technical solutions. In a following workshop, fundamental
capabilities of multi-touch interaction will be investigated.
Further custom-made solutions can be developed and
transformed into stable and mature products.
        </p>
        <p>ACKNOWLEDGMENTS
Thanks are due to the staff of Professur Mediengestaltung
at Technische Universität Dresden for supporting the
practical work of the students. We are indebted to the
students who participated in the workshop for their
inspiring and motivated work: Claudia Zimmer, Kristin
Dietze, Marie Schacht, Tino Winkler, Christian Klauss,
Julian Eberius, Martin Herrmann, Frank Harnisch, Mathias
Müller and Florian Schneider.</p>
        <p>Finally, thanks are due to T-Systems Multimedia Solutions
GmbH for supplying the Microsoft Surface Device and
providing technical assistance during the workshop.</p>
      </sec>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>
          1.
          <string-name>
            <surname>Altshuller</surname>
            ,
            <given-names>G.</given-names>
          </string-name>
          <year>1973</year>
          .
          <article-title>The Innovation Algorithm: TRIZ, Systematic Innovation and Technical Creativity</article-title>
          , Original publication in Russian. English translation by L.
          <string-name>
            <surname>Shulyak</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          <string-name>
            <surname>Rodman</surname>
          </string-name>
          , Technical Innovation Center, Inc., USA,
          <year>1999</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>
          2.
          Apple Inc.
          <article-title>Apple - iPhone - Mobile phone, iPod, and Internet device</article-title>
          . Website. Available at: http://www.apple.com/iphone/. Retrieved 2/24/
          <year>2010</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>
          3. van Dam,
          <string-name>
            <surname>A.</surname>
          </string-name>
          <year>1997</year>
          .
          <article-title>Post-WIMP user interfaces</article-title>
          .
          <source>Commun. ACM 40</source>
          ,
          <issue>2</issue>
          (Feb.
          <year>1997</year>
          ),
          <fpage>63</fpage>
          -
          <lpage>67</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>
          4.
          Dell
          <article-title>Latitude Laptops</article-title>
          . Website. Available at: http://www.dell.com/latitude. Retrieved 2/24/
          <year>2010</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>
          5.
          <string-name>
            <surname>Echtler</surname>
            ,
            <given-names>F.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Huber</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          , and
          <string-name>
            <surname>Klinker</surname>
            ,
            <given-names>G.</given-names>
          </string-name>
          <year>2008</year>
          .
          <article-title>Shadow tracking on multi-touch tables</article-title>
          .
          <source>In Proceedings of the Working Conference on Advanced Visual interfaces (Napoli</source>
          , Italy, May
          <volume>28</volume>
          - 30,
          <year>2008</year>
          ).
          <source>AVI '08. ACM</source>
          , New York, NY,
          <fpage>388</fpage>
          -
          <lpage>391</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>
          6.
          <string-name>
            <surname>Fitzmaurice</surname>
            ,
            <given-names>G. W.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Ishii</surname>
            ,
            <given-names>H.</given-names>
          </string-name>
          , and
          <string-name>
            <surname>Buxton</surname>
            ,
            <given-names>W. A.</given-names>
          </string-name>
          <year>1995</year>
          .
          <article-title>Bricks: laying the foundations for graspable user interfaces</article-title>
          .
          <source>In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems</source>
          (Denver, Colorado, United States, May
          <volume>07</volume>
          - 11,
          <year>1995</year>
          ).
          <string-name>
            <given-names>I. R.</given-names>
            <surname>Katz</surname>
          </string-name>
          ,
          <string-name>
            <given-names>R.</given-names>
            <surname>Mack</surname>
          </string-name>
          ,
          <string-name>
            <given-names>L.</given-names>
            <surname>Marks</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M. B.</given-names>
            <surname>Rosson</surname>
          </string-name>
          , and J. Nielsen, Eds.
          <source>Conference on Human Factors in Computing Systems</source>
          . ACM Press/Addison-Wesley Publishing Co., New York, NY,
          <fpage>442</fpage>
          -
          <lpage>449</lpage>
        </mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>
          7.
          <string-name>
            <surname>Gordon</surname>
            ,
            <given-names>W. J. J.</given-names>
          </string-name>
          <year>1961</year>
          .
          <article-title>Synectics: The development of creative capacity</article-title>
          . Harpercollins College Div.
        </mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>
          8.
          <string-name>
            <surname>Hauptmann</surname>
            ,
            <given-names>A. G.</given-names>
          </string-name>
          <year>1989</year>
          .
          <article-title>Speech and gestures for graphic image manipulation</article-title>
          .
          <source>SIGCHI Bull. 20</source>
          ,
          <issue>SI</issue>
          (Mar.
          <year>1989</year>
          ),
          <fpage>241</fpage>
          -
          <lpage>245</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref9">
        <mixed-citation>
          9.
          Hewlett-Packard Development Company, L.P.
          <article-title>HP TouchSmart PCs</article-title>
          . Website. Available at: http://www.hp.com/unitedstates/campaigns/touchsmart/. Retrieved 2/24/
          <year>2010</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref10">
        <mixed-citation>
          10. Microsoft®. Microsoft® Surface. Website. Available at: http://www.microsoft.com/surface/. Last Checked:
          2/24/
          <year>2010</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref11">
        <mixed-citation>
          11. Microsoft® Developer Center .
          <article-title>Microsoft Surface SDK</article-title>
          . Website. Available at: http://msdn.microsoft.com/enus/library/ee804845.aspx.
          Retrieved 2/24/
          <year>2010</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref12">
        <mixed-citation>
          12.
          <string-name>
            <surname>Osborn</surname>
            ,
            <given-names>A. F.</given-names>
          </string-name>
          <article-title>Applied Imagination</article-title>
          . Scribner, June 1979.
        </mixed-citation>
      </ref>
      <ref id="ref13">
        <mixed-citation>
          13.
          <string-name>
            <surname>Rasmussen</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          <year>1987</year>
          .
          <article-title>Skills, rules, and knowledge; signals, signs, and symbols, and other distinctions in human performance models. In System Design For Human interaction, A</article-title>
          . P. Sage, Ed. IEEE Press, Piscataway, NJ,
          <fpage>291</fpage>
          -
          <lpage>300</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref14">
        <mixed-citation>
          14.
          <string-name>
            <surname>Wang</surname>
            ,
            <given-names>F.</given-names>
          </string-name>
          and
          <string-name>
            <surname>Ren</surname>
            ,
            <given-names>X.</given-names>
          </string-name>
          <year>2009</year>
          .
          <article-title>Empirical evaluation for finger input properties in multi-touch interaction</article-title>
          .
          <source>In Proceedings of the 27th international Conference on Human Factors in Computing Systems</source>
          (Boston, MA, USA, April
          <volume>04</volume>
          -
          <issue>09</issue>
          ,
          <year>2009</year>
          ).
          <source>CHI '09. ACM</source>
          , New York, NY,
          <fpage>1063</fpage>
          -
          <lpage>1072</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref15">
        <mixed-citation>
          15.
          <string-name>
            <surname>Wardley</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          <article-title>Innovation, the Future and Why Nothing is Ever Simple</article-title>
          . Available at http://www.capgemini.com/technologyblog/2008/12/simon_wardley_and_innovation_l.php.
          Retrieved
          <year>2010</year>
          -02-19.
        </mixed-citation>
      </ref>
      <ref id="ref16">
        <mixed-citation>
          16.
          <string-name>
            <surname>Wolf</surname>
            ,
            <given-names>C. G.</given-names>
          </string-name>
          and
          <string-name>
            <surname>Morrel-Samuels</surname>
            ,
            <given-names>P.</given-names>
          </string-name>
          <year>1987</year>
          .
          <article-title>The use of hand-drawn gestures for text editing</article-title>
          .
          <source>Int. J. Man-Mach. Stud</source>
          .
          <volume>27</volume>
          ,
          <issue>1</issue>
          (Jul.
          <year>1987</year>
          ),
          <fpage>91</fpage>
          -
          <lpage>102</lpage>
        </mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>