Describing Movements for Motion Gestures

Bashar Altakrouri
Ambient Computing Group, Institute of Telematics
University of Luebeck, Luebeck, Germany
altakrouri@itm.uni-luebeck.de

Andreas Schrader
Ambient Computing Group, Institute of Telematics
University of Luebeck, Luebeck, Germany
schrader@itm.uni-luebeck.de


ABSTRACT
Gestural interactions continue to proliferate, enabling a wide range of possibilities for interacting with mobile, pervasive, and ubiquitous environments. Motion gestures in particular are receiving increasing attention among researchers, and broad adoption of motion gestures is likewise noticeable at the commercial level. Motion gesture research strives to utilize the potential of the human body for interaction with interactive ecosystems. Despite the innovation and development in this field, we believe that describing motion gestures remains an unsolved challenge for the community, and the effort in this direction is still limited. In our research, we focus on describing human body movements for motion gestures based on movement description languages (particularly Labanotation). In this paper, we argue that without adequate descriptions of gestural interactions, the engineering of interactive systems for large-scale dynamic runtime deployment of existing and future interaction techniques will be greatly challenged.

Author Keywords
Natural User Interfaces (NUI); Gesture Interfaces; Motion Interfaces; HCI modeling; HCI documentation; Description Languages.

ACM Classification Keywords
H.5.m. Information Interfaces and Presentation (e.g., HCI): Miscellaneous; H.5.2 Information Interfaces and Presentation (e.g., HCI): User Interfaces

EGMI 2014, 1st International Workshop on Engineering Gestures for Multimodal Interfaces, June 17 2014, Rome, Italy.
Copyright © 2014 for the individual papers by the papers' authors. Copying permitted only for private and academic purposes. This volume is published and copyrighted by its editors.
http://ceur-ws.org/Vol-1190/.

INTRODUCTION
Human Computer Interaction (HCI) research has continued to flourish, with an expanding world of interconnected devices and technologies driven by rich interaction capabilities. This innovation is fueled by increasing calls for HCI researchers to investigate new interaction possibilities, which has resulted in growing innovation in gestural studies. Gestures in the HCI field are closely related to human gesturing, which is extensively studied in fields such as linguistics, anthropology, cognitive science, and psychology [20]. Principally, gestures describe situations where body movements are used as a means to communicate with either a machine or a human (revised from Mulder's definition of hand gestures [24]).

Gestures come in different forms, such as motion gestures, facial expressions, and bodily expressions [24]. Moreover, they are often discussed, classified, and defined from various viewpoints and perspectives. The major part of human gesture classification research focuses on human discourse [31], but it also extends to the human/device dialog approach [29], input device properties and sensing technology [15], and more. This diversity is reflected in the wide range of gesture manipulation parameters, taxonomies, design spaces, and gesture-to-command mappings; hence, the complexity of tackling the many open questions regarding gestural interaction descriptions and languages inevitably increases. Paradoxically, Scoditti et al. [30] pointed out that whilst sensor-based interaction research often presents highly satisfactory results, it often fails to support designers' decisions and researchers' analyses. Bailly et al. [6] proposed a set of guidelines for gesture-aware remote controllers based on a series of studies and five novel interaction techniques, but the scope of their guidelines remains limited and does not scale to other application domains or interaction techniques. Moreover, researchers have pointed out that gestural research still lacks a well-defined and clear design space for multitouch gestures [31] and motion gestures [29]. Furthermore, bodily presence in HCI remains limited due to the subtlety and complexity of human movement, leaving an open space for further investigation [23].

Principally, gestures are described and disseminated in various forms, including written material, visual cues, animated cues, and formal description models and languages. In their work on formal descriptions for multitouch interactions, Hamon et al. [11] analyzed the expressiveness of various user interface description languages (an extension of [26]); principally, such modeling mainly covers data description, state representation, event representation, timing, concurrent behavior, and dynamic instantiation. Despite the existence of various approaches for describing touch-based interactions, the literature lacks similar coverage for motion gestures. An extensive review of those approaches is beyond the scope of this paper. Herein, we target our effort at describing the movement aspects of motion-based gestures, which we believe is a research direction not yet well exploited by the HCI community.
Gesture description languages are relevant to the correct execution of interactions by end users, the preservation of techniques by designers, the accumulation of knowledge for the community, and the engineering of interactive systems. Moreover, we argue that languages describing the various movement aspects of gestures are a very important resource of context information about the gestures, which can be utilized by interactive systems for various purposes; for instance, filtering and selecting adequate gestural interactions could be based on the user's physical context. Recently, we have proposed a shift towards completely dynamic, on-the-fly ensembles of interaction techniques at runtime [4]. The Interaction Ensembles approach is defined as follows: "Multiple interaction modalities (i.e. interaction plugins) are tailored at runtime to adapt the available interaction resources and possibilities to the user's physical abilities, needs, and context" [4]. Engineering an interactive system of this kind imposes new dissemination (especially interaction description and modeling), deployment, and adaptation requirements and challenges.

In this paper, we discuss the use of movement description languages for describing motion gestures, and we present our approach of choice to tackle this problem.

BACKGROUND AND RELATED WORK
Research on utilizing movements for interaction is spread over a wide research landscape. For instance, computer vision studies different approaches to visually analyze and recognize human motion on multiple levels (i.e. body parts, whole body, and high-level human activities) [22]. Other research projects involve affective computing to study expressive movements, as in the EMOTE model [9] and EyesWeb [8], visual analysis of movements [1], and representation of movements [5]. The literature is also rich with examples of utilizing movements for interaction. Rekimoto [28] presented one of the earliest works on mapping motion (e.g., tilting) to menu navigation, scroll bar interaction, panning, zooming, and manipulation of 3D objects. The research effort on tilting was then continued, especially in the mobile interaction area, by Harrison et al. [12] and Bartlett [7]. Meanwhile, Hinckley et al.'s [16] idea of using tilting to control mobile screen orientation is one of the most widely adopted techniques, implemented in many mobile phones currently on the market.

In their work on movement-based interactions, Loke et al. [22] presented an interesting analysis of the design of movement-based interactions from four different frameworks and perspectives: Suchman's framework, covering the communicative resources for interacting humans and machines; Benford et al.'s framework (based on Expected, Sensed, and Desired movements) for designing sensing-based interactions; Bellotti et al.'s framework (Address, Attention, Action, Alignment, Accident) for sensor-based systems; and Labanotation, one of the most popular systems for analyzing and recording movement. In Benford et al.'s framework, "Expected" movements are the natural movements that users perform, "Sensed" movements are those which can be sensed by an interactive system, and "Desired" movements are those which assemble commands for a particular application. In Bellotti et al.'s framework, "Address" refers to communication with an interactive system, "Attention" indicates whether the system is attending to the user, "Action" defines the interaction goal for the system, "Alignment" refers to monitoring the system response, and finally "Accident" refers to error avoidance and recovery.

The richness of human body movements makes human movement an overwhelming subject for designing and engineering interactions. The hand and its movements, for instance, provide an open list of interaction possibilities. In his work, Mulder [24] listed just a subset of hand movements that reflect interaction possibilities, including: accusation (index pointing); moving objects; touching objects; manipulating objects; waving and saluting; pointing to real and abstract objects; and positioning objects. Moreover, he described and categorized hand movements into goal-directed manipulation, empty-handed gestures, and haptic exploration. This classification reveals the potential of a single part of the human body. The goal-directed manipulation category includes movements for changing position (e.g., lift and move), changing orientation (e.g., revolve, twist), changing shape (e.g., squeeze and pinch), contact with the object (e.g., snatch and clutch), joining objects (e.g., tie and sew), and indirect manipulation (e.g., set and strop). The empty-handed gestures category includes examples such as twiddle and wave. Finally, the haptic exploration category includes touch, stroke, strum, thrum, and twang. In the same work, he also indicated that there are other types of categorization, based on communication aspects for example. Yet, this potential grows greatly when considering the rich nature of natural interaction techniques, as in whole-body interactions and motion-based interactions.

The notion of movement qualities is another well-studied and applied topic in different fields, especially in dance and choreography. Despite the importance of movement for interaction, the HCI field does not yet explore this notion on the same scale. In fact, some argue that the primary foundations of movement qualities are very poorly discussed in the HCI literature [2], despite some recent research contributions such as James et al. (interaction techniques based on dance performance) [18], Moen (applying Laban effort theory to the design of movement-based interaction) [23], Alaoui et al. (movement qualities as interaction modalities) [2], and Hashim et al. (Laban's movement analysis for graceful interaction) [13]. The work discussed in this paper contributes to this area of research.

To the best of our knowledge, universal design guidelines for motion-based interactions are not easily found in the literature. Nonetheless, efforts to investigate and outline such guidelines have recently been reported for specific application domains. For instance, Gerling et al. [10] proposed seven guidelines for whole-body interactions, created based on gaming scenarios and focused on the elderly population.

Principally, one of the foundations of the work presented in this paper is to rely on human body movements as the central focal point in designing, sharing, and executing motion gestures. This position puts human body movement at the core of our approach to describing gestures and of our implementation of what we call movement profiles.
DESCRIBING MOVEMENTS FOR MOTION GESTURES
Loke et al. [21] presented an analysis of people's movements when playing two computer games that utilize players' free body movements as input, sensed by basic computer vision. Their analysis included various ways to describe movement, ranging from the mechanics of the moving body in space and time, to the expressive qualities of movement, the paths of movement, the rhythm and timing, and the moving body involved in acts of perception as part of human action and activity. Kahol et al. [19] proposed an intuitive method for understanding the creation and performance of gestures by modeling gestures as a sequence of activity events in body segments and joints. Once captured, the sequences can be annotated by several different choreographers, based on their own interpretations and styles.

HCI researchers tend to preserve and describe the movement aspects of newly developed gestures using direct personal transmission, written textual records, still visual records (e.g., images, sketches, drawings), and animated visual records (e.g., videos). Nevertheless, these methods suffer from different drawbacks that negatively affect the description quality: textual records are often too ambiguous, inaccurate, or too complex to comprehend; still visual records fail to convey timing and movement dynamics; and animated visual records are greatly affected by the capturing quality.

Previously, in [3], we argued that describing movement as an interaction element for ubiquitous and pervasive environments is a more challenging task because of the heterogeneity of users' needs and abilities, the heterogeneity of environment context, and the availability of media renderers. We also argued that current documentation practices are not fully suitable for motion gestures because of the lack of standardized and agreed-upon description methods: current practices are too static and fixed to a particular media type, which may easily limit the target users of the interaction technique; current methods such as direct personal transmission fail to scale to a massive user population; and current practices fail to clearly reveal the physical abilities required to perform the interactions.

To demonstrate one of the many issues regarding current documentation practices, Figure 1 and Figure 2 show two different drawings of the same interaction technique: a simple arm swiping gesture. This gesture requires the user to position the left arm to the front, parallel to the ground (as a starting position), and move it to the left side to perform a left swipe (for interaction). The two drawings depict the interaction differently, using different drawing styles, angles, and ways of depicting sequencing. Both drawings can easily be interpreted differently by users as well as peer designers, causing great variations in interaction understanding and execution. Moreover, this style of interaction description is not machine readable, hence challenging the design and engineering of interactive systems that utilize gestural interaction techniques.

Figure 1. Designer drawing 1: Documenting an arm swipe interaction by drawing

Figure 2. Designer drawing 2: Documenting an arm swipe interaction by drawing

Formal description models and languages are also used to describe or disseminate developed interactions. In their work, Hamon et al. [11] analyzed the expressiveness of various multitouch user interface description languages. They argued that modeling should include data description, state representation, event representation, timing, concurrent behavior, and dynamic instantiation. Nonetheless, modeling and describing the movement aspects of motion-based gestures, the focus of this paper, is not well investigated.

Proper description of movements in motion gestures should therefore ensure a standardized, machine-readable, and parsable language; the generation of documentation, learning, and presentation material (e.g., visual records and audio records) based on the context of the user and his environment; and methods for observing users' interactions in order to provide suitable feedback and adaptation, clearly depicting the required interaction movements and physical abilities [3].

Labanotation is adopted for our approach due to its flexible expressive power and its holistic ability to capture movements in terms of structural description, analysis of patterns (shapes), and qualities of movement (efforts). Labanotation is a system for analyzing and recording movement, originally devised by Rudolf Laban in the 1920s and further developed by Hutchinson and others at the Dance Notation Bureau [17]. It is used in fields traditionally associated with the physical body, such as dance choreography, physical therapy, drama, early childhood development, and athletics. Additionally, Labanotation fosters great flexibility that empowers designers to describe all or any part of a movement as required. In this paper, we particularly aim at the structural aspects of movement.
Figure 3. Labanotation visual notations (staff)

In its current form, Labanotation is a visual notation system where symbols for body movements are written on a vertical "body" staff. Each element in the notation has a dedicated Labanotation symbol, which is used to present and document various movement qualities. Figure 3 illustrates the Labanotation staff, which is used as the layout for all involved movements. Each column, from the inside out, represents a different body part. Column (1) presents the support (i.e., the distribution of body weight on the ground). Columns (2) to (4) present leg, body, and arm movements, respectively. Column (5) and additional columns can be defined by designers as required. The rightmost column is defined for head movements. The designer is still able to change this order as required by redefining any column except (1) and (2). The staff is split into different sections. The symbols before the double lines, indicated by (6), present the start position. The movement components appear after the position lines, in terms of measures (horizontal lines, as in (8)) and beats (short horizontal lines, as in (7)); the measures and beats define the timing of the movements. The right and left sides of the staff correspond to the two sides of the body.

In Figure 4, a simple 3Gear¹ pinching gesture for the right hand is modeled in Labanotation; its corresponding XML representation is presented in Figure 5. Figure 4 is read as follows: (1) The right arm starts at a 90-degree angle to the rest of the body, pointing forward. (2) The palm of the hand points to the left and should remain so during the interaction. (3) The right hand is naturally curved. (4) The right hand is curved and the fingertips touch each other; this finger position should be held for a short time. (5) The hand returns quickly to the natural curve, with the fingers naturally spread.

¹ http://www.threegear.com, accessed on 03.04.2014

Figure 4. Using Labanotation to document the 3Gear pinch interaction technique (right hand)

The visual notation aims at a human-readable approach for describing and reading movements, but it is not adequately machine readable. Therefore, we have designed a compliant XML scheme that is both machine and human readable.

There have been a few previous research attempts to provide an XML representation of Labanotation, such as MovementXML [14] and LabanXML [25], in the area of dance representation. Nonetheless, those efforts were neither aimed at describing gestural interactions nor widely adopted.

Our scheme allows translating the notation into a machine-readable representation of the motion gesture description. Clearly, the representation illustrated in Figure 4 is not targeted at end users due to its specialized nature. The representation (in its visual and XML forms) provides an exact description of the movement that can only be correctly interpreted by interaction designers and developers, as well as interactive systems. Nonetheless, user-friendly descriptions for end users can be generated automatically from the XML code by interactive systems (a detailed discussion in this direction is beyond the scope of this paper).

Figure 5. Movement profile: 3Gear right pinch interaction technique (excerpt)
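To give a concrete flavor of such a profile, the following is a minimal, hypothetical sketch of an XML movement profile for the right-hand pinch. It anticipates the element names described in the next section (staff, columns, movements, movement, column, preSign, direction); the root element, attribute names, and nesting are our illustrative assumptions, not the published scheme itself.

  <?xml version="1.0" encoding="UTF-8"?>
  <!-- Hypothetical movement profile sketch; element names follow the
       scheme description in this paper, attributes are assumptions. -->
  <movementProfile name="3GearRightPinch">
    <staff>
      <!-- Timing frame: one measure of two beats (illustrative). -->
      <measures count="1" beatsPerMeasure="2"/>
      <!-- Column (4) of the Labanotation staff carries arm movements. -->
      <columns>
        <column index="4" side="right" part="arm"/>
      </columns>
    </staff>
    <movements>
      <!-- (1) Start position: right arm forward, horizontal. -->
      <movement column="4" measure="0" beat="0">
        <preSign part="rightArm"/>
        <direction value="forward" level="middle"/>
      </movement>
      <!-- (4) Fingertips touch; hold briefly. -->
      <movement column="4" measure="1" beat="1" duration="short">
        <preSign part="rightHand"/>
        <space value="narrow"/>
      </movement>
      <!-- (5) Quick return to the natural curve. -->
      <movement column="4" measure="1" beat="2" duration="quick">
        <preSign part="rightHand"/>
        <direction value="backToNormal"/>
      </movement>
    </movements>
  </movementProfile>

Even at this coarse level, such a profile is parsable by an interactive system, and each numbered reading step of Figure 4 maps onto one movement element.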
Generally, increasing the level of description detail results in finer preservation and execution of movement details; however, it inevitably produces a large movement profile that is increasingly complex to read and interpret. On the contrary, reduced detail results in a simple movement description that is easy to read and interpret, at the cost of losing movement details.

LABANOTATION XML SCHEME
Our Labanotation-based approach aims at a robust and standardized description of movements in motion gestures, whereby the transmission and preservation of motion gestures become possible. Nonetheless, modeling Labanotation is challenging due to the extensibility of the notation and the size and variation of its symbols.

In the scope of this work, a subset of Labanotation is considered; nonetheless, the scheme remains extensible. The current scheme mainly targets the following structural elements: direction symbols, pins and contact hooks, space measurement signs, turn symbols, vibration symbols, the body hold sign, the back-to-normal sign, the release-contact sign, path signs, relationship bows, room-direction pins, joint signs, area signs, limb signs, surface signs, a universal object pre-sign, dynamic signs, and accent signs.

Figure 6. Movement profile scheme - movement element (high-level overview)

Figure 6 (left) illustrates an overview of the movement profile XML scheme. The original Labanotation naming is preserved to ensure compatibility and readability of the scheme. As shown in the figure, the staff is defined in terms of timing information (measures and timing) and the body parts involved (by defining the columns), while movement components are defined in the movements element. The movements element contains a collection of elements defining the individual movements, the path, the movement directions, relationships, and phrasing (connecting individual movements together).
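Read as a schema skeleton, the structure just described could be summarized as follows; the element names follow the paper's description of Figure 6 (left), while the cardinalities and empty-element placeholders are our assumptions.

  <!-- Skeleton of the movement profile scheme (illustrative). -->
  <movementProfile>
    <staff>
      <measures/>        <!-- timing information: measures and timing -->
      <columns>
        <column/>        <!-- body parts involved -->
      </columns>
    </staff>
    <movements>
      <movement/>        <!-- one element per individual movement -->
      <path/>            <!-- paths of movement -->
      <relationship/>    <!-- relationships between movements -->
      <phrasing/>        <!-- connects individual movements together -->
    </movements>
  </movementProfile>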
                                                                            thoring and design tools for motion gestures; and better
Figure 6 (right) illustrates a closer view of the movement element, in which a single individual movement is fully described. The information modeled includes placement in the score (defined by the column element), timing information (beats, measures, and execution duration), the body part(s) involved (defined by the preSign), and movement qualities such as direction, space, turn, and vibration. The number of movements modeled and their level of detail depend on the designer; the design should model just enough information for ideal execution of the movement.
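Under the same assumptions as in the sketches above, a single fully described movement element would then carry placement, timing, body part, and quality information, roughly as follows; again, the attribute names are illustrative, not the published scheme.

  <!-- One fully described movement (illustrative). -->
  <!-- column = placement in the score; measure/beat/duration = timing. -->
  <movement column="4" measure="1" beat="2" duration="short">
    <preSign part="rightHand" side="right"/>    <!-- body part(s) involved -->
    <direction value="forward" level="middle"/> <!-- quality: direction -->
    <space value="narrow"/>                     <!-- quality: space -->
    <turn degree="quarter"/>                    <!-- quality: turn -->
    <vibration/>                                <!-- quality: vibration -->
  </movement>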
                                                                          guages for describing the movement aspects of gestures are
DISCUSSION
Describing movements for motion gestures is a challenging process and imposes a number of open issues (only some are discussed in this paper):

• Support of dynamic interactive systems: The lack of adequate interaction documentation and dissemination inevitably challenges the design and engineering of interactive systems. Documentation can be used to extract information about the type of movements involved in the interaction, the body parts involved, adequate interaction execution, etc. The absence of such information necessarily burdens the deployment of interaction techniques in automated interactive systems; in particular, processes such as context acquisition, reasoning, and interaction filtering are greatly hindered. Good record-keeping of motion gestures should guarantee the preservation and transfer of the technique to users and other peer designers without endangering the originality and vital aspects of the technique.

• The tension between formal and empirical movement descriptions: Formal interface description languages support interaction at the development as well as the operation phase, while conventional empirical or semiformal techniques fail to provide adequate and sufficient insight into the interaction (e.g., comparing two design options with respect to the reliability of the human-system cooperation) [26]. Those techniques are more susceptible to losing parts of the movements, overly complicated descriptions, losing timing information, etc. Nonetheless, wide adoption of formalized languages amongst motion interaction designers is challenged by the potential complexity of learning the language and describing movements.

• Meeting future challenges: New interactive systems are targeted to achieve ad-hoc composition of multiple interaction techniques; to de-couple the close binding between devices, interaction techniques, and applications; and to address users' physical needs and preferences [4] [27]. This shift imposes new requirements and challenges the current practices for describing motion gestures. To meet those challenges, gestures should be transparent, reflecting their internal functionality and physical requirements to intelligent interactive systems.

• Limited research effort: We argue that this area of research requires considerable attention from the community, including: a better understanding of gestures and their requirements; guidelines for describing gestures; new authoring and design tools for motion gestures; and a better understanding of users' habits and practices for learning motion gestures.

CONCLUSION
In this paper, we have argued that adequate description of the movements of motion gestures is highly relevant to the correct execution of interactions by end users, the preservation of techniques by designers, the accumulation of knowledge for the community, and, most importantly, the process of designing and engineering interactive systems. Moreover, languages for describing the movement aspects of gestures are a very important resource of context information about the gestures, which can be utilized by interactive systems for interaction filtering, adaptation, and dynamic on-the-fly deployment at runtime. Herein, Labanotation, as a flexible and extensible movement documentation system, is adopted for describing the movement aspects of gestural interactions.

FUTURE WORK
We continue our work on an authoring tool called Interaction Editor [3], which aims to ease the workflow of describing the movement aspects of gestural interactions for gesture developers and designers. Moreover, one of our active areas of research continues to investigate the actual practices applied by the HCI community for describing gestural interactions.
ACKNOWLEDGEMENT
This work was partially supported by the Graduate School for Computing in Medicine and Life Sciences funded by Germany's Excellence Initiative [DFG GSC 235/1] and by vffr (Verein zur Förderung der Rehabilitations-Forschung in Hamburg, Mecklenburg-Vorpommern und Schleswig-Holstein e.V.). We also thank Michal Janiszewski for drawing the design sketches in Figure 1 and Figure 2.

REFERENCES
1. Aggarwal, J. K., and Cai, Q. Human motion analysis: A review. Computer Vision and Image Understanding 73 (1999), 428–440.
2. Alaoui, S. F., Caramiaux, B., Serrano, M., and Bevilacqua, F. Movement qualities as interaction modality. In Proceedings of the Designing Interactive Systems Conference, DIS '12, ACM (Newcastle, UK, 2012), 761–769.
3. Altakrouri, B., Gröschner, J., and Schrader, A. Documenting natural interactions. In CHI '13 Extended Abstracts on Human Factors in Computing Systems, CHI EA '13, ACM (New York, NY, USA, 2013), 1173–1178.
4. Altakrouri, B., and Schrader, A. Towards dynamic natural interaction ensembles. In Fourth International Workshop on Physicality (Physicality 2012), co-located with the British HCI 2012 conference, A. Dix, D. Ramduny-Ellis, and S. Gill, Eds. (Birmingham, UK, Sept. 2012).
5. Badler, N. I., and Smoliar, S. W. Digital representations of human movement. ACM Comput. Surv. 11, 1 (Mar. 1979), 19–38.
6. Bailly, G., Vo, D.-B., Lecolinet, E., and Guiard, Y. Gesture-aware remote controls: guidelines and interaction technique. In Proceedings of the 13th International Conference on Multimodal Interfaces, ICMI '11, ACM (New York, NY, USA, 2011), 263–270.
7. Bartlett, J. F. Rock 'n' scroll is here to stay. IEEE Computer Graphics and Applications (2000), 40–45.
8. Camurri, A., Ricchetti, M., and Trocca, R. EyesWeb: toward gesture and affect recognition in dance/music interactive systems. In Multimedia Computing and Systems, 1999. IEEE International Conference on, vol. 1 (Florence, Italy, Jul 1999), 643–648.
9. Chi, D., Costa, M., Zhao, L., and Badler, N. The EMOTE model for effort and shape. In Proceedings of the 27th Annual Conference on Computer Graphics and Interactive Techniques, SIGGRAPH '00, ACM Press/Addison-Wesley Publishing Co. (New York, NY, USA, 2000), 173–182.
10. Gerling, K., Livingston, I., Nacke, L., and Mandryk, R. Full-body motion-based game interaction for older adults. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, CHI '12, ACM (New York, NY, USA, 2012), 1873–1882.
11. Hamon, A., Palanque, P., Silva, J. L., Deleris, Y., and Barboni, E. Formal description of multi-touch interactions. In Proceedings of the 5th ACM SIGCHI Symposium on Engineering Interactive Computing Systems, EICS '13, ACM (New York, NY, USA, 2013), 207–216.
12. Harrison, B. L., Fishkin, K. P., Gujar, A., Mochon, C., and Want, R. Squeeze me, hold me, tilt me! An exploration of manipulative user interfaces. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, CHI '98, ACM Press/Addison-Wesley Publishing Co. (New York, NY, USA, 1998), 17–24.
13. Hashim, W. N. W., Noor, N. L. M., and Adnan, W. A. W. The design of aesthetic interaction: Towards a graceful interaction framework. In Proceedings of the 2nd International Conference on Interaction Sciences: Information Technology, Culture and Human, ICIS '09, ACM (New York, NY, USA, 2009), 69–75.
14. Hatol, J. MovementXML: A representation of semantics of human movement based on Labanotation. Master's thesis, Simon Fraser University, Burnaby, BC, Canada, 2006.
15. Hinckley, K. Input technologies and techniques. In The Human-Computer Interaction Handbook, J. A. Jacko and A. Sears, Eds. L. Erlbaum Associates Inc., Hillsdale, NJ, USA, 2003, 151–168.
16. Hinckley, K., Pierce, J., Sinclair, M., and Horvitz, E. Sensing techniques for mobile interaction. In Proceedings of the 13th Annual ACM Symposium on User Interface Software and Technology, UIST '00, ACM (New York, NY, USA, 2000), 91–100.
17. Hutchinson, A. Labanotation: The System of Analyzing and Recording Movement, 4th ed. Routledge, New York and London, 2005.
18. James, J., Ingalls, T., Qian, G., Olsen, L., Whiteley, D., Wong, S., and Rikakis, T. Movement-based interactive dance performance. In Proceedings of the 14th Annual ACM International Conference on Multimedia, MULTIMEDIA '06, ACM (New York, NY, USA, 2006), 470–480.
19. Kahol, K., Tripathi, P., and Panchanathan, S. Documenting motion sequences with a personalized annotation system. IEEE MultiMedia 13, 1 (2006), 37–45.
20. Karam, M., and schraefel, m. c. A taxonomy of gestures in human computer interactions. Technical report, University of Southampton, Southampton, United Kingdom, 2005.
21. Loke, L., Larssen, A. T., and Robertson, T. Labanotation for design of movement-based interaction. In Proceedings of the 2nd Australasian Conference on Interactive Entertainment, IE 2005, Creativity & Cognition Studios Press (Sydney, Australia, 2005), 113–120.
22. Loke, L., Larssen, A. T., Robertson, T., and Edwards, J. Understanding movement for interaction design: frameworks and approaches. Personal and Ubiquitous Computing 11, 8 (2006), 691–701.
23. Moen, J. From hand-held to body-worn: Embodied experiences of the design and use of a wearable movement-based interaction concept. In Proceedings of the 1st International Conference on Tangible and Embedded Interaction, TEI '07, ACM (New York, NY, USA, 2007), 251–258.
24. Mulder, A. Hand gestures for HCI. Hand Centered Studies of Human Movement Project (1996), 1–21.
25. Nakamura, M., and Hachimura, K. An XML representation of Labanotation, LabanXML, and its implementation on the notation editor LabanEditor2. Review of the National Center for Digitization (Online Journal) 9 (2006), 47–51.
26. Navarre, D., Palanque, P., Ladry, J.-F., and Barboni, E. ICOs: A model-based user interface description technique dedicated to interactive systems addressing usability, reliability and scalability. ACM Trans. Comput.-Hum. Interact. 16, 4 (Nov. 2009), 18:1–18:56.
27. Pruvost, G., Heinroth, T., Bellik, Y., and Minker, W. User interaction adaptation within ambient environments. In Next Generation Intelligent Environments: Ambient Adaptive Systems. Springer, Boston, USA, 2011, ch. 5, 153–194.
28. Rekimoto, J. Tilting operations for small screen interfaces. In Proceedings of the 9th Annual ACM Symposium on User Interface Software and Technology, UIST '96, ACM (New York, NY, USA, 1996), 167–168.
29. Ruiz, J., Li, Y., and Lank, E. User-defined motion gestures for mobile interaction. In Proceedings of the 2011 Annual Conference on Human Factors in Computing Systems, CHI '11, ACM (New York, NY, USA, 2011), 197–206.
30. Scoditti, A., Blanch, R., and Coutaz, J. A novel taxonomy for gestural interaction techniques based on accelerometers. In Proceedings of the 15th International Conference on Intelligent User Interfaces, IUI '11, ACM (New York, NY, USA, 2011), 63–72.
31. Wobbrock, J. O., Morris, M. R., and Wilson, A. D. User-defined gestures for surface computing. In Proceedings of the 27th International Conference on Human Factors in Computing Systems, CHI '09, ACM (New York, NY, USA, 2009), 1083–1092.