A touchless human-machine interface for the control of an elevator

Luca Montanaro, Vega S.R.L., Fermo, Italy (luca.montanaro@vegalift.it)
Paolo Sernani, Università Politecnica delle Marche, Ancona, Italy (p.sernani@univpm.it)
Davide Calvaresi, Scuola Superiore Sant'Anna, Pisa, Italy (d.calvaresi@sssup.it)
Aldo Franco Dragoni, Università Politecnica delle Marche, Ancona, Italy (a.f.dragoni@univpm.it)




Abstract

We present a touchless interface based on gesture recognition for the control of an elevator. Interacting with the interface, a user can select and confirm the desired floor without touching any physical device. The interface has to guarantee a usability comparable to that of the habitual button panels: the users are supposed to use the elevator without any specific training, and its functionality has to be the same as that of traditional elevators. In addition to describing three possible implementations of the touchless interface, the paper provides two contributions: a comparison of two different technologies used as the interface inputs, and the results of 10 preliminary user tests performed to measure the perceived usability. The results are promising: two implementations out of three got an average score around 84 on the "Usability Metric for User Experience".

1   Introduction

Touchless interfaces are useful in different application domains. An example is Ambient Assisted Living, where voice recognition and gesture recognition are key technologies for assistive environments [Dra13]: beyond their use for activity recognition [CCS+16], touchless interfaces enable users to achieve a natural interaction with the available devices [AJS13]. In addition to assistive support, touchless interaction is useful in environments which require absolute sterility, such as operating rooms [OGS+14]. Touchless interfaces are exploited even in computer games, for pure entertainment as well as for serious purposes [KLJ04, HPKJ12].
   The research described in this paper shares some features with the aforementioned domains: we present a touchless interface based on gesture recognition to control an elevator. The user can select and confirm the desired floor without touching any physical device. As in Ambient Assisted Living, we want to provide a natural interaction with the available devices. Even if applications requiring a sterile environment are not as common as in surgery, our approach could contribute to the hygiene of the elevator and its occupants, especially in places like hospitals. At this point, one could object that the button panels typical of elevators are intuitive and usable as they are. However, a touchless interface to control an elevator also fits entertainment purposes. An example could be a scenic elevator of a skyscraper, where the touchless control could be integrated with augmented reality to let the user browse some information about the view.
   In this paper, we provide two main contributions, based on a set of preliminary user tests performed with a simulated system:

   • we compare two different technologies for the input device to recognize the user's gestures, one based on computer vision and one on electrical near-fields;

   • we propose and evaluate three different implementations of our touchless interface.
   The rest of this paper is organized as follows. Section 2 presents some related works, analyzing a sample of studies where touchless interfaces are at the service of different application domains. Section 3 describes our system, presenting the touchless interface to control an elevator. Section 4 details a preliminary evaluation of our system: after an assessment of two different input devices, we performed user tests to understand the perceived usability of three different implementations of the touchless interface. To conclude, Section 5 discusses the results and introduces future work.
2   Related works

The touchless interface presented in this paper is based on gesture recognition, i.e. the process by which gestures made by a user are the input data to devices and applications [Gee04, Tur14]. The importance of gesture recognition lies in building efficient human-machine interaction [MA07]. For this reason, we based our research on the perceived usability of our system during the touchless interaction.
   As stated in the introduction, touchless interfaces and gesture recognition are common to multiple domains. Assistance, Ambient Assisted Living, and Ambient Intelligence are relevant application domains for gesture recognition. For example, in [OCBM04] the authors presented a solution to perform the interaction with a computer system through head tracking and eye blinking, to replace the use of the mouse. In [BP11] the authors describe a gesture recognition application with the purpose of helping an assisted person with activities of daily living such as interacting with appliances, switching lights on and off, and answering the door. Using our interface, the user executes hand gestures to interact with the surrounding environment, i.e. the elevator: thus, our interface can be considered an Ambient Intelligence application.
   In surgery, gesture recognition and touchless interfaces can be a powerful tool for patient data visualization in operating rooms [DP16, RRA+12, WSE+08], to preserve the complete sterility of such environments. In most applications, an elevator does not need absolute sterility. However, being public surfaces, button panels in elevators can be a source of bacterial colonization [KSR14, RWBG05]. Therefore, a touchless interface is useful in preserving hygiene, especially in public places.
   Gesture recognition is widely used in computer games: the video game market already includes applications which react to users' gestures. However, gesture recognition is used even in serious games, using entertainment to engage the users in the games' serious purposes. Such purposes are manifold: examples are enhancing tourism [BKW08], promoting an active life [GLNM12], and supporting rehabilitation [BMC+09]. Our application could include entertainment, too. The touchless interaction might amaze technology enthusiasts. Moreover, in the future, it might be integrated with augmented reality, to let the user browse information such as the history of the building or some points of interest in scenic elevators.

3   A touchless interface for elevator control

The implemented system is a touchless interface to manage the control display inside an elevator. The only input of the interface is the movement of the user's hand to select the desired floor. Thus, the control of the elevator is based on gesture recognition. In such an environment, the interface needs to be compliant with the following requirements:

   • users, with no distinction of age, education level, habits, and experience, need to be able to control the elevator without specific and deep training;

   • the selection of the floor has to be based only on the user's hand movements, without any physical interface such as a button. Even buttons to turn on the recognition are excluded, since the entire interaction has to be touchless;

   • as in ordinary elevators, users can select multiple floors, and the number of false positives should be zero.

In other words, the touchless interface has to be extremely intuitive, at least as much as the ordinary buttons used to control an elevator.

3.1   System interface

The interface is based on the tracking of the movements of a user's hand on the xy plane parallel to the display placed inside the elevator. We present three different interaction modes with the elevator controls:

   • the first is based on two linear widgets, tracking separately the movements along the y-axis (for the floor selection) and the x-axis (for the floor confirmation);

   • the second is based on a circular movement of the hand (for the selection) and on a waiting time (for the confirmation);

   • the third replicates the interaction with the button panels of habitual elevators, where the user selects a button on the xy plane (for the floor selection) and confirms his selection with his finger's movement along the z-axis.
In all cases, the interaction starts when the distance of the user's hand from the display is under a fixed threshold: thus, the floor selection and confirmation are activated by the evaluation of the position along the z-axis. A normalization is necessary to track the position of the user's hand, since the input devices usually return values in millimeters. By means of empirical tests, the position of the user's hand in the device sensitive area is transformed into a point on the visualization area of the interface. For example, along the x-axis, zero means extreme left and one means extreme right. The circular widget also requires a conversion to polar coordinates: the center of the elevator display is the center of the reference system.
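   As a minimal sketch of this normalization (the calibration bounds, names, and the C++ formulation are ours, for illustration only), the mapping from raw device coordinates to the visualization area could look as follows:

    // Sketch of the coordinate normalization described above. The bounds
    // (in millimeters) are illustrative: in practice they come from the
    // empirical calibration of the device in use.
    #include <algorithm>
    #include <cmath>

    struct NormalizedPoint { double x; double y; };

    // Map a raw device position (mm) into the [0, 1] x [0, 1] visualization
    // area: 0 is the extreme left (or bottom), 1 the extreme right (or top).
    NormalizedPoint normalize(double xMm, double yMm,
                              double xMinMm, double xMaxMm,
                              double yMinMm, double yMaxMm) {
        double x = (xMm - xMinMm) / (xMaxMm - xMinMm);
        double y = (yMm - yMinMm) / (yMaxMm - yMinMm);
        return { std::clamp(x, 0.0, 1.0), std::clamp(y, 0.0, 1.0) };
    }

    // For the circular widget: the polar angle (radians) around the center
    // of the display, taking (0.5, 0.5) as the origin of the reference system.
    double toPolarAngle(const NormalizedPoint& p) {
        return std::atan2(p.y - 0.5, p.x - 0.5);
    }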
   As depicted in Figures 1, 2, and 3, some common features can be found in each widget: the interface always presents to the users the current position of the elevator, a bar to show the queue of the floor calls, and the floor the user is currently selecting.

3.1.1   Linear widgets

With the interaction mode based on linear widgets, the user selects the floor by moving his hand along the y-axis (parallel to the elevator height) and confirms his selection by moving his hand along the x-axis (parallel to the elevator base). The interface shown in Figure 1 gives constant feedback to the user, through a display inside the elevator. While the user moves his hand along the y-axis, a vertical status bar at the center of the screen shows the floor currently being selected; when the user's hand reaches the desired floor, he can confirm the floor by moving his hand to the right, filling the label "confirm".

Figure 1: The interface based on two linear widgets.
   When the user confirms the selected floor, it might happen that he accidentally moves his hand along the y-axis. That gesture would result in an undesired behavior of the interface, hence a frustrating human-machine interaction and, in extreme cases, a wrong action of the elevator, i.e. the selection of the wrong floor. To avoid such issues, we implemented a "debounce" mechanism: the time that the user spends with his hand on a selected floor increases a threshold which indicates the time to be spent pointing at another floor to change the selection. Such threshold further increases when the user is confirming his choice by moving his hand along the x-axis. Thus, while the user is confirming a floor, the interface becomes less sensitive to changes of the floor, reducing the risk of inadvertent choices.
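   A minimal sketch of this adaptive "debounce" follows; the constants and the growth rule are illustrative assumptions, not the values used in our implementation:

    // Adaptive "debounce": the longer the user dwells on a floor (and,
    // even more, while confirming it), the longer a different floor must
    // be pointed at before the selection actually changes.
    #include <chrono>

    class FloorDebouncer {
    public:
        // Returns the floor to show as selected after `elapsed` time.
        int update(int pointedFloor, bool confirming,
                   std::chrono::milliseconds elapsed) {
            using namespace std::chrono;
            if (pointedFloor == selected_) {
                dwell_ += elapsed;                 // dwelling raises the threshold
                pointingOther_ = milliseconds{0};
                return selected_;
            }
            pointingOther_ += elapsed;
            milliseconds threshold = kBase + dwell_ / 4;  // grows with dwell time
            if (confirming) threshold *= 3;               // stiffer while confirming
            if (pointingOther_ >= threshold) {            // intentional change
                selected_ = pointedFloor;
                dwell_ = pointingOther_ = milliseconds{0};
            }
            return selected_;
        }
    private:
        static constexpr std::chrono::milliseconds kBase{150};
        int selected_ = 0;
        std::chrono::milliseconds dwell_{0};
        std::chrono::milliseconds pointingOther_{0};
    };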
3.1.2   Circular widget

With the circular widget on the elevator display, shown in Figure 2, the user selects the desired floor with a clockwise circular movement on the xy plane parallel to the display. The user confirms his choice by keeping the selection for a fixed minimum time.
   To avoid inadvertent changes of the selected floor, the "debounce" mechanism for the circular widget acts by enlarging the movement needed to change the selection, proportionally to the time the user spends on the current selection. Thus, the floor selection is fluid, and during the confirmation larger movements are required to intentionally change the floor.

Figure 2: The interface based on the circular widget.

3.1.3   Button widget

The buttons widget presented in Figure 3 replicates the button panel usually available in elevators: the user has to move his finger near the display on the xy plane to select a floor, and move the finger towards the display beyond a certain threshold to confirm the selection. As in the previous cases, the interaction is based only on the user's gestures and does not require a direct touch by the user.
   The buttons widget does not implement a "debounce" mechanism, since the area involved in the interaction is clearly shown on the elevator display.

Figure 3: The interface based on the button widget.
3.2   Software Structure

The entire software system manages the recognition of the user's gestures, the simulation of the elevator movements (to execute the user tests), the updating of the visual interface, and the audio feedback given to the user during the interactions. The system is implemented as a multithreaded application, based on state transitions. Such structure is the same for each of the presented widgets.
   There are three possible states, i.e. "wait", "select", and "confirm"; Figure 4 depicts the state transitions.

Figure 4: Application states and transitions ("wait", "select", and "confirm").
   In the "wait" state, an activation thread is continuously running, listening for an event to trigger the floor selection. As described above, the floor selection starts when the user's hand is under a threshold distance from the elevator display, along the z-axis. If such an event occurs, the activation thread creates two new threads to monitor the position of the user's hand for the floor selection and confirmation, according to the widget in use. Hence, the application goes to the "select" state. If the user cancels the operation (by moving his hand out of the threshold distance before the confirmation), the application goes back to the "wait" state and the selection and confirmation threads are killed. Otherwise, when the user confirms his selection, the application goes to the "confirm" state: the application adds the selected floor to the service queue, plays a feedback animation on the display and a confirmation sound. Then, the selection and confirmation threads are killed, and the application goes back to the "wait" state.
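   The transition logic can be sketched as follows (a simplified, single-function rendering of the state machine in Figure 4; thread creation and termination are elided, and all names are illustrative):

    // Sketch of the application states and transitions of Figure 4.
    enum class State { Wait, Select, Confirm };

    State step(State s, double zDistance, double zThreshold,
               bool floorConfirmed, bool feedbackDone) {
        switch (s) {
        case State::Wait:
            // Activation: the hand enters the sensitive area along the z-axis.
            return (zDistance < zThreshold) ? State::Select : State::Wait;
        case State::Select:
            if (zDistance >= zThreshold) return State::Wait;  // user cancels
            return floorConfirmed ? State::Confirm : State::Select;
        case State::Confirm:
            // Floor queued, feedback animation and sound played.
            return feedbackDone ? State::Wait : State::Confirm;
        }
        return s;
    }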
   The elevator movements are simulated by means of a dedicated thread, which implements the elevator algorithm (the traditional SCAN algorithm known from disk scheduling [SGG12]).
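   A minimal sketch of such a SCAN scheduler (our illustrative rendering, not the code of the simulator) is the following: requested floors are served in the current travel direction, and the direction is reversed when no requests remain ahead.

    #include <optional>
    #include <set>

    class ScanScheduler {
    public:
        void request(int floor) { pending_.insert(floor); }

        // Next floor to serve starting from `current`, or nothing if idle.
        std::optional<int> next(int current) {
            if (pending_.empty()) return std::nullopt;
            if (goingUp_) {
                auto it = pending_.lower_bound(current);  // nearest request above
                if (it != pending_.end()) return pop(it);
                goingUp_ = false;                         // reverse direction
            }
            auto it = pending_.upper_bound(current);      // nearest request below
            if (it != pending_.begin()) return pop(--it);
            goingUp_ = true;                              // reverse again
            return pop(pending_.begin());
        }
    private:
        int pop(std::set<int>::iterator it) {
            int f = *it;
            pending_.erase(it);
            return f;
        }
        std::set<int> pending_;
        bool goingUp_ = true;
    };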
   The interface is updated by a separate thread which shares global variables with the activation, selection, and confirmation threads, as well as with the simulation thread. Similarly, a dedicated thread manages the application audio, to give feedback to the user: the activation, selection, and confirmation threads put the references to the audio files on a queue, consumed by the audio management thread, according to the producer-consumer pattern.
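   A minimal sketch of such a producer-consumer queue (an illustrative rendering with standard C++ primitives; the actual implementation may differ) could be:

    #include <condition_variable>
    #include <mutex>
    #include <queue>
    #include <string>

    // Producers (activation/selection/confirmation threads) push references
    // to audio files; the audio management thread pops and plays them.
    class AudioQueue {
    public:
        void push(std::string file) {
            {
                std::lock_guard<std::mutex> lock(m_);
                q_.push(std::move(file));
            }
            cv_.notify_one();
        }
        std::string pop() {  // blocks until an item is available
            std::unique_lock<std::mutex> lock(m_);
            cv_.wait(lock, [this] { return !q_.empty(); });
            std::string file = std::move(q_.front());
            q_.pop();
            return file;
        }
    private:
        std::mutex m_;
        std::condition_variable cv_;
        std::queue<std::string> q_;
    };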
   The system is implemented in C++ and the visual interface is built on top of the "Qt" library (https://www.qt.io/).

3.3   Hardware Components

We implemented the system with two different technologies: the "Leap Motion Controller" (https://www.leapmotion.com/) and the Microchip "MGC3130", based on the Microchip "GestIC" technology (http://www.microchip.com/design-centers/touch-input-sensing/gestic-technology/overview). In both cases the interface and the gestures required to control the elevator are the same: the difference lies in the gesture recognition technique. With the "Leap Motion", the recognition is computer vision-based, with algorithms applied to a stereo grayscale image. With the "GestIC", the movements are detected through an electrical near-field generated by the device (the "MGC3130"), and thus no images are required.

4   Evaluation

To evaluate the system, we ran preliminary tests to compare the "Leap Motion" and the Microchip "GestIC" for the purpose of the gesture-based control of an elevator. Then, we ran usability tests with ten users, to evaluate the perceived usability during the interaction with the system interface. Such usability tests are based on the "Usability Metric for User Experience" (UMUX) [Fin10], a Likert scale used for the subjective assessment of an application's perceived usability, based on the following four items:

   1. this system's capabilities meet my requirements;

   2. using this system is a frustrating experience;

   3. this system is easy to use;

   4. I have to spend too much time correcting things with this system.

The user can assign a value from 1 (strongly disagree) to 7 (strongly agree) to each item. The positive items (1 and 3) are scored as [user's value − 1], while the negative items (2 and 4) are scored as [7 − user's value]. The UMUX score of a test is a value between 0 and 100, given by the sum of the four items' scores divided by 24 and multiplied by 100.
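   As a worked example (with made-up answers, for illustration only): a user answering 6, 2, 7, and 3 to items 1-4 would get item scores of (6 − 1) = 5, (7 − 2) = 5, (7 − 1) = 6, and (7 − 3) = 4, yielding a UMUX score of (5 + 5 + 6 + 4) / 24 × 100 ≈ 83.3.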
   We ran the tests in lab settings, and the users had to perform simple tasks, i.e. the selection and the confirmation of some floors with the implemented widgets, presented in a random order. Five out of the ten users who performed the tests are male. Five users are between 20 and 30 years old; one is between 30 and 40, two between 40 and 50, and two between 50 and 60. Four users have a secondary school education level; two users hold a bachelor degree or equivalent; four users have a master degree or equivalent.
   The main threat to the validity of the tests is that the interface is simulated, and thus the users performed the tasks in front of a PC instead of inside a real elevator. Nevertheless, the evaluation is focused on the perceived usability during the interaction with the system interface: the input device and the interaction area would be the same even in a real environment.
4.1   Technology comparison

The different devices used to capture the user's gestures as the system input require different resources to perform the gesture recognition: the "Leap Motion" is more resource-demanding than the "MGC3130". In fact, the "Leap Motion" generates a grayscale stereo image which has to be processed with computer vision algorithms, while the "GestIC" technology returns the coordinates of the hand on the sensitive area thanks to the perturbation of the electrical near-field generated by the device. Although such a feature makes the "MGC3130" more suitable for an embedded context, in our opinion the "Leap Motion" allows a better user experience for our application, since:

   • the "GestIC" allows a maximum interaction range of 15 cm, while the "Leap Motion" has an interaction range of around 60 cm, allowing larger gestures;

   • the sensitive area of the "GestIC" is limited to a 7-inch diagonal. Therefore, the gesture-based interaction can be overlapped with displays of 7 inches at most. To mitigate such an issue, we executed tests trying to separate the interaction area from the display, using a 7-inch area beside a larger monitor. However, such an expedient interferes with the natural usability and the direct feedback available by moving the hand exactly in front of the display.

Hence, we decided to use only the "Leap Motion" for the usability tests described in Subsection 4.2.

4.2   Interface usability

Table 1 presents the results of the UMUX questionnaire filled in by the users. The results are promising and show a high acceptance of the gesture-based interaction: the best user experience is provided by the interfaces based on the linear widgets and the buttons widget, with average scores of 84.27 and 83.40 respectively. However, as highlighted in Figure 5, the linear widgets get more homogeneous results (standard deviation 12.62): one person had a perceived usability between 61 and 70, four people between 71 and 80, one person between 81 and 90, and four people between 91 and 100. The buttons widget gets more distributed scores: in two cases the interface scored lower than 60.
   During the tests with all the widgets, some users felt disoriented at the beginning of the interaction: they did not immediately understand how to control the interface and find the correct distance to interact with it. Such issues were more relevant with the interface based on the circular widget, which got an average score of 70.30 (standard deviation 24.81). As shown in Figure 5, the perceived usability scored less than 50 in two tests with the circular widget. A solution to mitigate such issues could be a short demo running on the interface display with audio instructions.

Table 1: The results on the perceived usability, based on the Usability Metric for User Experience (UMUX) [Fin10].

                      average score   std dev
   Linear widgets         84.27        12.62
   Circular widget        70.30        24.81
   Buttons widget         83.40        19.15
[Bar chart "Number of users per UMUX score range": histogram of UMUX scores in ranges 1-10 through 91-100 for the linear widgets, the circular widget, and the button widget.]

Figure 5: The distribution of the UMUX scores: the linear widgets got the best results, with 9 tests scoring over 71.
5   Conclusions

We presented a touchless interface to control an elevator by means of hand gestures. We described three different modes of interaction based on three implementations of the system interface, i.e. the linear widgets, the circular widget, and the buttons widget. The linear widgets use the vertical movements of the user's hand for the floor selection and the horizontal movements for the confirmation. The circular widget uses a circular movement for the selection and a wait for the confirmation. The buttons widget replicates the conventional button panel of an elevator. The interaction starts when the user puts his hand nearer than a fixed distance from the elevator display. Then, he selects and confirms the desired floor by moving his hand, according to the widget displayed on the elevator monitor, receiving constant feedback.
   We performed preliminary tests with two different input devices:

   • the "Leap Motion Controller", to recognize gestures with computer vision;

   • the Microchip "MGC3130", to perform the recognition with the Microchip "GestIC" electrical near-field technology.

Although the "MGC3130" is suitable for an embedded context (such as the control of an elevator), we decided to use the "Leap Motion" for the usability tests. In fact, the "Leap Motion" has a larger sensitive area, which allows users larger movements, especially in front of displays greater than 7 inches.
   We ran the usability tests with ten users using a simulated interface. The perceived usability is encouraging: the average UMUX score of the linear widgets is 84.27, with a standard deviation of 12.62. Thus, the users accepted the touchless interaction without feeling disoriented: after an initial explanation, they were all able to select and confirm a floor. The buttons widget had similar results. On the contrary, the circular widget had the most problematic usability, scoring 70.30 on the UMUX scale: the users experienced difficulties in immediately understanding how to select a floor.
   The user tests also highlighted:

   • the need for support tools to let the users immediately understand the flow of interaction. For example, a short demo running in a corner of the elevator display could help users in the first impact with the interface;

   • the need for enough room to interact with the interface, to avoid unintended choices or even the impossibility of making the necessary gestures.

The development of support tools to make the interface easier to understand is ongoing work. However, user tests inside a real elevator are the only viable solution to fully validate the idea of a touchless control of an elevator.
References

[AJS13] Dimitra Anastasiou, Cui Jian, and Christoph Stahl. A German-Chinese speech-gesture behavioural corpus of device control in a smart home. In Proceedings of the 6th International Conference on PErvasive Technologies Related to Assistive Environments, PETRA '13, pages 62:1-62:6. ACM, 2013.

[BKW08] R. Ballagas, A. Kuntze, and S.P. Walz. Gaming tourism: Lessons from evaluating REXplorer, a pervasive game for tourists. In Pervasive Computing: 6th International Conference, Pervasive 2008, Sydney, Australia, May 19-22, 2008, Proceedings, pages 244-261. Springer Berlin Heidelberg, 2008.

[BMC+09] J.W. Burke, M.D.J. McNeill, D.K. Charles, P.J. Morrow, J.H. Crosbie, and S.M. McDonough. Optimising engagement for stroke rehabilitation using serious games. The Visual Computer, 25(12):1085-1099, 2009.

[BP11] M. Bhuiyan and R. Picking. A gesture controlled user interface for inclusive design and evaluative study of its usability. Journal of Software Engineering and Applications, 4:513-521, 2011.

[CCS+16] D. Calvaresi, D. Cesarini, P. Sernani, M. Marinoni, A.F. Dragoni, and A. Sturm. Exploring the ambient assisted living domain: a systematic review. Journal of Ambient Intelligence and Humanized Computing, pages 1-19, 2016.

[DP16] L.T. De Paolis. A touchless gestural platform for the interaction with the patients data. In E. Kyriacou, S. Christofides, and C.S. Pattichis, editors, XIV Mediterranean Conference on Medical and Biological Engineering and Computing 2016: MEDICON 2016, pages 874-878. Springer International Publishing, 2016.

[Dra13] A.F. Dragoni. Virtual carer: A first prototype. In Telehealth Networks for Hospital Services: New Methodologies, pages 290-299. IGI Global, 2013.

[Fin10] K. Finstad. The usability metric for user experience. Interacting with Computers, 22(5):323-327, 2010.

[Gee04] D. Geer. Will gesture recognition technology point the way? Computer, 37(10):20-23, 2004.

[GLNM12] Kathrin Gerling, Ian Livingston, Lennart Nacke, and Regan Mandryk. Full-body motion-based game interaction for older adults. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, CHI '12, pages 1873-1882. ACM, 2012.

[HPKJ12] G.F. He, J.W. Park, S.K. Kang, and S.T. Jung. Development of gesture recognition-based serious games. In Proceedings of 2012 IEEE-EMBS International Conference on Biomedical and Health Informatics, pages 922-925, 2012.

[KLJ04] H. Kang, C.W. Lee, and K. Jung. Recognition-based gesture spotting in video games. Pattern Recognition Letters, 25(15):1701-1714, 2004.

[KSR14] C.E. Kandel, A.E. Simor, and D.A. Redelmeier. Elevator buttons as unrecognized sources of bacterial colonization in hospitals. Open Medicine, 8(3):e81-e86, 2014.

[MA07] S. Mitra and T. Acharya. Gesture recognition: A survey. IEEE Transactions on Systems, Man, and Cybernetics, Part C (Applications and Reviews), 37(3):311-324, 2007.

[OCBM04] R. O'Grady, C.J. Cohen, G. Beach, and G. Moody. Navigaze: enabling access to digital media for the profoundly disabled. In Information Theory, 2004. ISIT 2004. Proceedings. International Symposium on, pages 211-216, 2004.

[OGS+14] K. O'Hara, G. Gonzalez, A. Sellen, G. Penney, A. Varnavas, H. Mentis, A. Criminisi, R. Corish, M. Rouncefield, N. Dastur, and T. Carrell. Touchless interaction in surgery. Communications of the ACM, 57(1):70-77, 2014.

[RRA+12] G.C. Ruppert, L.O. Reis, P.H. Amorim, T.F. de Moraes, and J.V. da Silva. Touchless gesture user interface for interactive image visualization in urological surgery. World Journal of Urology, 30(5):687-691, 2012.

[RWBG05] K.A. Reynolds, P.M. Watt, S.A. Boone, and C.P. Gerba. Occurrence of bacteria and biochemical markers on public surfaces. International Journal of Environmental Health Research, 15(3):225-234, 2005.

[SGG12] A. Silberschatz, P.B. Galvin, and G. Gagne. Operating System Concepts, 9th Ed. John Wiley & Sons, 2012.

[Tur14] M. Turk. Gesture recognition. In K. Ikeuchi, editor, Computer Vision: A Reference Guide, pages 346-349. Springer US, 2014.

[WSE+08] J.P. Wachs, H.I. Stern, Y. Edan, M. Gillam, J. Handler, C. Feied, and M. Smith. A gesture-based tool for sterile browsing of radiology images. Journal of the American Medical Informatics Association, 15(3):321-323, 2008.