     Exploiting a touchless interaction to drive a wireless
    mobile robot powered by a real-time operating system

            Davide Calvaresi                   Andrea Vincentini                  Antonio Di Guardo
         Scuola Superiore Sant’Anna          Scuola Superiore Sant’Anna         Scuola Superiore Sant’Anna
           d.calvaresi@sssup.it              a.vincentini@sssup.it               a.diguardo@sssup.it

        Daniel Cesarini                        Paolo Sernani                       Aldo Franco Dragoni
    Scuola Superiore Sant’Anna        Università Politecnica delle Marche     Università Politecnica delle Marche
      d.cesarini@sssup.it                 p.sernani@univpm.it                      a.f.dragon@univpm.it



                     Abstract

    Nowadays, touch-based user interfaces are widely used in consumer electronics. Recent trends confirm the high potential of touchless interface technologies to manage Human-Machine Interaction in scenarios such as healthcare, surveillance, and outdoor activities. Moving from pointers, keyboards, or joysticks to touch-screens already represented a significant challenge. However, a touchless approach also needs to ensure intuitiveness and robustness. This paper describes a framework enabling the wireless control of a mobile robot through a contactless controller. The data provided by a complex sensor, composed of two stereo cameras and three IR sensors, are processed with custom algorithms that recognize the movements of the users' hands. The result is promptly translated into commands for the robot, which runs a real-time operating system. Usability tests confirm the compelling employment of contactless controllers for mobile robots and drones, both in open and closed environments.

1    Introduction

In recent decades, human beings have been living through a technological revolution. The frenetic advent of technological devices is changing people's daily lives. Widespread consumer devices, new-generation sensors, single-board computers, and other components foster the design of many heterogeneous user-centered systems.
   Although there are different kinds of systems with different goals, even within the same domain, they all face a common challenge: interacting with the users [Suc87] while guaranteeing compliance with their needs [CCS+16]. Indeed, all systems involving human interaction provide a way to support a two-sided communication: human to machine (interfaces to manage inputs, commands, etc.) and machine to human (interfaces to perceive the system's feedback) [Min02, WG95]. To be robust, dependable, and usable, interfaces have to be accessible, as intuitive as possible, and able to respond in a time coherent with human requirements (if the interaction is too fast or too slow, the user easily gets lost) [Fis01, Dix09].
   Over the years, to produce more powerful interfaces, programmers tried to couple syntax and semantics, associating the dynamics of a command with the actual output (gesture or movement) [Bro98]. Indeed, around the 1940s, multi-purpose systems only employed panels with buttons (which evolved into the modern keyboard), while systems dedicated to more specific contexts required properly designed interfaces. For example, in the case of moving mobile objects, either directly or indirectly, the joystick (emulating the older cloche employed in aviation) became in the 1980s the default choice for electronic controllers [Wol88, Gra81]. Even though it was widely declared a failure when employed as a pointer [CEB78], it kept evolving (maintaining its original basic features), becoming a standard interface for systems that move or drive objects.
   The advent of touchscreen devices radically changed the human experience when approaching technological devices. However, for specific tasks like moving objects (e.g., gaming), even though the touchscreen revolutionized the way of thinking about "how to perform an action", a simulated joystick still represents the most intuitive means to translate an intention into an action. In line with current technological trends, this paper aims at moving a further step on behalf of the old joystick, bringing it into the touchless era [MBJS97].
   The presented solution is a system, based on multiple devices, which detects hand movements, translates them into joystick-like commands, and finally communicates them to a mobile robot that behaves according to the received instructions. The rest of the paper is structured as follows: Section 2 presents an overview of the framework, its structure, and its main components and functionalities. Section 3 addresses the system's requirements, functionalities, and challenges. Section 4 presents the conducted tests, summarizing their results. Finally, Section 5 concludes the paper, presenting the lessons learned and future work.

2    System overview

This section introduces the framework, presenting an overview of its structure, its main components, and their functionalities.
   The framework is mainly composed of three modules:

   • the User Interaction Module (UIM), Figure 1(X);

   • the Communication System (CS), Figure 1(Y);

   • the Mobile Robot (MR), Figure 1(Z).

   The interactions between the components of the framework are represented by the blue arrows in Figure 1.
   Currently, the technological market offers a broad range of solutions for gathering and processing the hands' position in space and time. Vokorokos et al. [VMC16] analyzed the efficiency of the three most relevant devices on the market: the Leap Motion Controller [Leab], the Microsoft Kinect [Zha12], and the Myo Armband [Nuw13]. Depending on scenarios and settings, those devices show different peculiarities. For example, the Microsoft Kinect can track the entire body motion within a detection range between 0.5 m and 4.5 m [FP11]; the Myo Armband is a wearable bracelet reading the muscles' electric impulses, which guarantees a high level of comfort but limits the amount of information that can be acquired. Compared with the previous ones, the Leap Motion Controller represents a better trade-off between flexibility, time-to-prototype, and precision [MDZ14], and thus it has been chosen for the presented project.
   The applications in the virtual reality domain are increasing remarkably, both in presence [EFAA15] and in absence [PBAR15] of haptic feedback. Such solutions mainly aim at moving and virtualizing real objects or their models, and entire environments, while enabling interactions with them. The Leap Motion Controller is massively employed in those systems, which, like the presented one, provide a visual rather than a haptic feedback [BGG+14, FSLS+16].
   Briefly analyzing the Leap Motion Controller, it is a sensor composed of two stereo CMOS image sensors and three infrared LEDs. Such a sensor, coupled with a set of libraries, besides exporting the image of the area over the sensor, exposes APIs to get the kinematics of the hand and all its joints, thus enabling the hands' acquisition as in Figure 1(a). The Command Generation Module in Figure 1(b), running on a PC, operates on the obtained data to extract the relevant information and decide which commands to send to the mobile robot through the Communication Client in Figure 1(c). The Communication System, running a custom Communication Protocol over an XBee connection [Fal10], Figure 1(d), transfers the commands detected by the User Interaction Module to the Communication Server, Figure 1(e). The Decoding Module, Figure 1(f), elaborates the received information, which is then transferred to a buffer shared by the two Motor Systems, Figure 1(g), in charge of finally performing the robot's motion.
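   As a minimal sketch of the Data acquisition step, the following C++ fragment shows how the palm position of the first tracked hand can be read on each frame. The calls are those of the Leap Motion C++ SDK [Leaa]; the polling loop, its 50 ms pause, and the printed fields are illustrative assumptions and do not reproduce the project's actual code.

#include <chrono>
#include <iostream>
#include <thread>
#include "Leap.h"   // Leap Motion C++ SDK header

int main() {
    Leap::Controller controller;   // connects to the Leap Motion service

    while (true) {
        // Retrieve the most recent tracking frame produced by the device.
        Leap::Frame frame = controller.frame();
        if (!frame.hands().isEmpty()) {
            Leap::Hand hand = frame.hands()[0];
            // Palm position in millimeters, in the sensor's Cartesian
            // reference system (x right, y up, z toward the user).
            Leap::Vector pos = hand.palmPosition();
            std::cout << "x=" << pos.x << " y=" << pos.y
                      << " z=" << pos.z << std::endl;
        }
        // Simplified polling pause, roughly matching the ~20 Hz sampling.
        std::this_thread::sleep_for(std::chrono::milliseconds(50));
    }
    return 0;
}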
2.1    System's timing requirements

Robotic actuation and motion are widely regarded as safety-critical. Thereby, guaranteeing that any task in the system respects its deadline, or at least that the system remains operative during possible deadline misses, is mandatory. To operate correctly, the system has to perform a series of periodic tasks, which can be characterized as follows:

   • ∆t_det ≤ 50 ms is the interval of time between the instant at which the user performs a command and its detection.

   • ∆t_el ≤ 20 ms is the interval of time between the command detection and the message transmission to the robot.

   • ∆t_com is the time required for the communication to take place. It is unpredictable, but it can be minimized by accepting that some data can be lost.

   • ∆t_act ≤ 50 ms is the time between the reception of the message by the robot and the actual motion.

To obtain a smoother speed profile of the robot, a filter operation can be performed, introducing an additional delay ∆t_steady ≤ 150 ms.
[Figure 1: Framework's components. (X) User Interaction Module: (a) Data acquisition, (b) Command Generation, (c) Communication Client; (Y) Communication System: (d) Dependable Communication Protocol; (Z) Mobile Robot: (e) Communication Server, (f) Decoding Module, (g) Motor_Sx and Motor_Dx Systems.]

2.2    System's timing implementation

To meet the timing requirements expressed in Section 2.1, the Robot Module and the User Interaction Module are implemented using real-time policies [But11]. The threads composing the User Interaction Module are POSIX compliant [Har] and have the following periods:

   • 15 ms: thread handling the serial communication (thread tick);

   • 50 ms: thread detecting the hands' motion and the commands, and creating the messages to be sent (thread hand).

   A Rate Monotonic scheduler is employed to schedule the threads composing the Robot Module, which is powered by the embedded real-time OS ARTE [BBP+16]. The thread periods are:

   • 10 ms: thread handling the serial communication;

   • 40 ms: thread decoding new messages and managing the two servo motors.
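   As an illustration of how such a periodic activity can be realized on the UIM side, the following C++ sketch implements a generic periodic thread with POSIX clock primitives. The 50 ms period matches thread hand, but the thread body and helper names are illustrative assumptions, not the project's actual code.

#include <pthread.h>
#include <time.h>

// Generic periodic job: wakes up every 'period_ns' nanoseconds using an
// absolute activation time, so release instants do not drift.
static void* periodic_hand_thread(void* /*arg*/) {
    const long period_ns = 50 * 1000000L;   // 50 ms, as for "thread hand"
    struct timespec next;
    clock_gettime(CLOCK_MONOTONIC, &next);

    for (;;) {
        // --- job body: detect hand motion and enqueue a message (omitted) ---

        // Compute the next absolute activation time.
        next.tv_nsec += period_ns;
        while (next.tv_nsec >= 1000000000L) {
            next.tv_nsec -= 1000000000L;
            next.tv_sec += 1;
        }
        clock_nanosleep(CLOCK_MONOTONIC, TIMER_ABSTIME, &next, nullptr);
    }
    return nullptr;
}

int main() {
    pthread_t tid;
    pthread_create(&tid, nullptr, periodic_hand_thread, nullptr);
    pthread_join(tid, nullptr);
    return 0;
}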
3    System design

The User Interaction Module, the Mobile Robot Module, and the Communication System can be characterized by different requirements, functionalities, and challenges, which are presented in this section.

3.1    User Interaction Module

The User Interaction Module (UIM) is intentionally released without a GUI, since a more natural interaction is obtained by simply having the user actions reflected by the movements of the mobile robot. The UIM is implemented in C++ and is platform independent; the only requirement is the support for the Leap Motion Controller library [Leaa]. Since the User Interaction Module is touchless, the user is supposed to be able to perform at least basic hand movements. The interaction area is about one cubic meter centered above the sensor. The hands are continuously tracked within this area, and when specific paths, positions, or movements recalling the semantics of a joystick are identified, the related commands are communicated to the robot.
   The Data acquisition and Command generation/recognition enable three functional units. The Controller detects the hands' position, movements, speed, and rotation radius, expressed according to the Cartesian axes placed as shown in Figure 2. Custom message structures, defined by the Communication Protocol, encapsulate such parameters. The Communication Protocol, detailed in Section 3.2, manages a circular buffer containing the aforementioned messages, discriminating which ones have to be sent and which have to be deleted because they are out of date.
   Two modalities have been implemented to drive the robot:

   • Normal mode - the robot is driven like a normal 3-wheel car. The commands are: stop, go forward or backward, turn right or left.

   • Rotatory mode - the robot rotates around itself, and the only commands are: rotate or stop.
[Figure 2: Leap Motion's axes.]

Normal Mode

The stop command, i.e., the "rest" state of the joystick, is established with a minimum margin of tolerance of 1 cm around the point (0, 20, 0) cm in the area above the Leap Motion Controller [WBRF13]. If the hand is perceived in such an area (no movements are required), a single message is sent requiring the robot to hold its position.
   The go forward or backward command is determined by the hand's position on the z axis: −z moves forward and +z moves backward. The distance of the hand from the origin of the z axis proportionally defines the speed of the robot.
   The turn left or right command is similar to go forward or backward, but it refers to the x axis: −x turns left and +x turns right. The distance of the hand from the origin of the x axis proportionally defines the rotation radius to be followed by the robot.

Rotatory Mode

Rotating the hand upside down and then back again activates the rotatory mode. The "rotate" command defines the rotation speed through the position of the user's hand along the ±x axis.
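   A minimal sketch of the Normal-mode mapping described above follows. The Command structure, the function name, and the scaling of the speed value are illustrative assumptions; the 1 cm tolerance around (0, 20, 0) cm and the axis conventions reflect the description in this section.

#include <cmath>
#include <cstdint>

// Hypothetical command produced by the Controller (speed and rotation
// radius are later quantized to 6 and 5 bits, see Section 3.2.2).
struct Command {
    bool   stop      = false;
    int8_t direction = +1;      // +1 forward, -1 backward
    double speed     = 0.0;     // proportional to |z|
    double radius    = 0.0;     // proportional to x (sign selects left/right)
};

// Map the palm position (cm, Leap Motion axes) to a joystick-like command.
Command mapHandToCommand(double x, double y, double z) {
    Command cmd;
    // Rest zone: within 1 cm of (0, 20, 0) cm -> hold position.
    if (std::fabs(x) <= 1.0 && std::fabs(y - 20.0) <= 1.0 && std::fabs(z) <= 1.0) {
        cmd.stop = true;
        return cmd;
    }
    // -z moves forward, +z moves backward; speed grows with |z|.
    cmd.direction = (z < 0.0) ? +1 : -1;
    cmd.speed     = std::fabs(z);   // scaling to motor units omitted
    // -x turns left, +x turns right; the rotation radius is proportional
    // to the distance of the hand from the origin of the x axis.
    cmd.radius    = x;              // sign selects the turning side
    return cmd;
}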
3.1.1   Enhancement of usability and fluidity

The information acquired from sensors like the Leap Motion might carry uncertainty or noise. Such noise can generate incorrect or redundant information; thus, undesired behaviors might take place, generating feedback different from the expected one. To avoid this possibility and to ensure a consistent reading, two techniques have been designed and implemented: the first is sampling in frequency, and the second is sampling the interaction area. The sampling frequency is about 20 Hz and, to make sure that the identified command is correct, the user's hand has to be detected in the same position for at least 3 consecutive frames provided by the Leap Motion sensor. Recalling that the interaction area above the Leap Motion sensor is about a cubic meter, it is linearly clustered. Such a choice ensures fewer variations of the hand's position, thus generating fewer messages. The size of the partitions of the area is chosen according to the feedback provided by the motors: passing from one area to an adjacent one results in a change of speed by a factor of 5. Such solutions provide a higher reactivity and a smoother, more usable mobile robot.
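   The two filtering techniques can be sketched as follows. The cluster size and the helper names are illustrative assumptions, while the 3-consecutive-frame rule and the linear clustering of the interaction area reflect the description above.

#include <cmath>

// Quantize a coordinate (cm) into a linear cluster index, so that small
// jitters of the hand inside the same partition do not produce new commands.
int clusterIndex(double coordinate_cm, double cluster_size_cm = 10.0) {
    return static_cast<int>(std::floor(coordinate_cm / cluster_size_cm));
}

// Accept a command only after it has been observed in N consecutive frames.
class CommandDebouncer {
public:
    explicit CommandDebouncer(int required_frames = 3)
        : required_(required_frames) {}

    // Returns true when 'cluster' has been stable for the required frames.
    bool accept(int cluster) {
        if (cluster == last_) {
            ++count_;
        } else {
            last_  = cluster;
            count_ = 1;
        }
        return count_ >= required_;
    }

private:
    int required_;
    int last_  = -1000000;   // sentinel: no cluster seen yet
    int count_ = 0;
};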
3.2    Communication Protocol

The user experience and the actual system usability gain significant benefits from employing a wireless communication (by exchanging messages) between the UIM and the Robot. The proposed protocol aims at being robust.
   Indeed, TCP [KR07] inspired its design (in terms of the ACKnowledgement mechanism and of the timing constraints). The protocol aims at guaranteeing that, if more than one message is queued in the buffer (more than one command is detected in a relatively short period, or communication delays have occurred), only the most recent one is delivered to the robot. These behaviours are modeled by the state machine in Figure 4.

3.2.1   Protocol operating principles

One of the requirements for the custom-designed protocol was the ability to handle undesired re-transmissions, in order to avoid delays and to guarantee the highest possible responsiveness of the system. Thus, during its development, some concepts were taken from the TCP protocol. The mechanism implemented to reduce the communication overload and the number of unneeded computations performed by the robot is shown in Figure 3(b).
   In particular, the developed protocol allows for 16 different message formats. The formats employed in this version of the system are two: one communicating the ACK and one communicating speed and rotation angle. The message structures are detailed in Section 3.2.2 and represented in Figure 6.
   If a message is available, it is sent from the User Interaction Module to the Robot. The Controller keeps generating new messages (related to the identified commands) while it is waiting for an ACK from the Robot.
   If the Controller receives an ACK with the expected Id (equal to the sent one), it means that the sent command has been received by the robot, so the buffer is scanned looking for the next message to be sent, Figure 3(a). On the Robot side, the Communication Server receives the messages, sharing the information with the Decoding Module through a shared buffer, while sending an ACK message (containing the same Id) back to the User Interaction Module.

[Figure 3: Communication protocol: (a) normal scenario, (b) error or delay handling scenario. In (a), each message in the circular buffer (Id = 1, 2, ...) is sent and matched by an ACK with the same Id; in (b), when a message is missing or the Id does not match, the most recent message in the buffer (e.g., Id = 3, then Id = 7) is the one actually sent.]

   If the Id contained in the ACK received by the Communication Client does not match the expected one, or if no ACK is received before a predetermined period of time (named timer), the last message (the most recent) inserted in the controller buffer is sent, removing all the older messages, Figure 3(b). Moreover, an internal timer is set once a message is sent: if such a timer expires before the UIM receives a proof of reception (ACK), the message is sent again.
   Promoting the robustness analysis through formal verification, two virtual state machines are used to model the Communication Client (UIM), Figure 4, and the Communication Server (MR), Figure 5.

[Figure 4: State machine of the UIM's communication behaviours - Simulink model.]

[Figure 5: Communication Server's (Robot Module) state machine - SysML model.]
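   The client-side send/ACK logic can be sketched as follows. The buffer type, the timer value, and the transport stubs are illustrative assumptions; the Id matching and the "send only the most recent message after an error or timeout" policy follow the protocol description above.

#include <cstdint>
#include <deque>

struct Message { uint8_t id; uint8_t payload[2]; };

// Hypothetical transport primitives (an XBee serial link in the real system);
// stubbed out here so the sketch is self-contained.
static bool sendToRobot(const Message&) { return true; }
static bool waitForAck(uint8_t expected_id, int /*timeout_ms*/, uint8_t* received_id) {
    *received_id = expected_id;   // pretend the robot always acknowledges
    return true;
}

// One iteration of the Communication Client loop over the message buffer
// (modeled here with a deque: front = oldest, back = most recent).
void communicationClientStep(std::deque<Message>& buffer) {
    if (buffer.empty()) return;

    Message msg = buffer.front();
    sendToRobot(msg);

    uint8_t   ack_id   = 0;
    const int timer_ms = 100;     // the protocol's "timer": assumed value

    if (waitForAck(msg.id, timer_ms, &ack_id) && ack_id == msg.id) {
        // ACK with matching Id: the command was received, move on.
        buffer.pop_front();
    } else {
        // Missing ACK or Id mismatch: keep only the most recent message,
        // drop all the older ones, and retry with it (Figure 3(b)).
        Message latest = buffer.back();
        buffer.clear();
        buffer.push_back(latest);
    }
}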
3.2.2   Messages structure

The structure of the messages is inspired by the message structure of the MIDI protocol [Mid] (MSB to LSB). As aforementioned, the system uses two formats for the messaging, Figure 6.
   Similarly to the MIDI protocol, the messages are composed of a StatusByte (Figure 6(x)) and DataBytes (Figure 6(y)). The StatusByte (8 bits) contains:

   • Header identifier [1 bit]
     The value identifying the header is 1;

   • Message Id [3 bits]
     Used for the feedback (ACK) mechanism;

   • Type of message [4 bits]
     If the message is an ACK the value is 0b0000; if the message contains data the value is 0b0001.

[Figure 6: Messages architecture and formats. (x) StatusByte: 1 bit Header identifier, 3 bits Id, 4 bits Type of message (0000 = ACK, 0001 = Data). (y) Payload: Speed byte (b) with 1 bit Data identifier, 1 bit Sign for direction, 6 bits Speed absolute value; Rotation radius byte (c) with 1 bit Data identifier, 1 bit Validity, 1 bit Sign for direction, 5 bits Rotation angle absolute value.]

   If the Type of message has value 0b0000, the Robot knows that the message is complete (there are no more bytes to be read). An ACK message is generated by the Robot and contains only the StatusByte. If the value is 0b0001, the Robot knows that the payload contains two more bytes of information. When a message is sent from the UIM to the Robot, 3 bytes are sent: one StatusByte and two DataBytes as payload.
   The Speed format, Figure 6(b), is composed of the following three parts:

   • Data identifier [1 bit]
     The value identifying data is 0;

   • Direction [1 bit]
     Go forward is identified by 0 and go backward by 1;

   • Speed value [6 bits]
     The absolute value of the speed to be approached.

   The Rotation radius format, Figure 6(c), is composed of the following four parts:

   • Data identifier [1 bit]
     The value identifying data is always 0;

   • Validity [1 bit]
     The straight motion (rotation radius = ∞) is identified by 0;

   • Direction [1 bit]
     Turn left is identified by 0 and turn right by 1;

   • Rotation radius [5 bits]
     The absolute value of the rotation angle to be approached.
3.3    Robot Module

The Robot operates on horizontal or reasonably sloping surfaces. The possible Robot movements, described in Section 3.1, are powered by two continuous-speed step motors, controlled by an Arduino board running ARTE and equipped with a Li-Po battery.
   The system powering the Robot is composed of four elements:

   • the Communication Server receives the messages, as presented in Section 3.2;

   • the Decoding Module elaborates the information about speed and rotation radius, encoding the actual commands for the Motor Modules;

   • the two Motor Systems (one for each motor) manage the servo dynamics.

3.3.1   Decoding Module

The Decoding Module elaborates the commands depending on the required behaviour:

   • curving;

   • straight motion;

   • motion around itself.

Normal mode: Curving

Rotating the robot around an axis not centered on itself (i.e., following a curved path) requires different speeds on the two wheels (Figure 7(a)). Given the speed (v) and the rotation radius (r), the wheels' speeds are calculated as shown in Equation 1:

        V_right = v − δ(v, r)
                                          (1)
        V_left  = v + δ(v, r)

Figure 7(b) identifies the differential speed enabling the turning action, named δ(v, r), which is obtained as shown in Equation 2:

        δ(v, r) = C · v / r               (2)

where C is the distance between the wheel and the center of the robot's wheelbase.

Normal mode: Straight motion

The speeds of the two motors are simply set equal to v.
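   Equations 1 and 2 translate directly into the following sketch. The struct and function names are illustrative, and the handling of the straight-motion case (r = ∞, flagged by the Validity bit) is an assumption consistent with the format described in Section 3.2.2.

#include <cmath>

struct WheelSpeeds { double left; double right; };

// C: distance between each wheel and the center of the robot wheelbase (m);
// the numeric value is assumed for illustration only.
constexpr double C = 0.08;

// Differential term of Equation 2: delta(v, r) = C * v / r.
double delta(double v, double r) { return C * v / r; }

// Equation 1: wheel speeds for a curve of radius r at linear speed v.
// 'straight' corresponds to the Validity bit = 0 (rotation radius = infinity).
WheelSpeeds decodeNormalMode(double v, double r, bool straight) {
    if (straight || std::isinf(r)) {
        return {v, v};              // straight motion: both wheels at v
    }
    double d = delta(v, r);
    return {v + d, v - d};          // V_left = v + d, V_right = v - d
}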
[Figure 7: Speeds diagram in the case of curving (Normal mode) with a positive rotation radius: (a) the robot moving at speed v along a curve of radius r; (b) the wheel speeds v + δ and v − δ, with C the distance between each wheel and the center of the wheelbase.]

Rotatory mode: Motion around itself

The velocities of the two motors are opposite and their absolute value is equal to v (the sign depends on the direction of rotation).

3.3.2   Motor system

When the Motor System receives a command, it converts the speed or rotation radius into instructions for the motors. Since it develops a linear profile for the wheels' velocities, particular care is needed to prevent sliding and overturning.
   Figure 8 exemplifies how a speed profile is realized when a new desired speed is requested before the previous one has been reached:

   • at t = 0 a speed request arrives, Figure 8(a). The system provides the motors with a linear speed profile to satisfy the request within a fixed ∆T;

   • at t = 3 (with t < ∆T) a slower speed request arrives, Figure 8(b). The system calculates a new profile starting from the current robot speed;

   • finally, the stop command arrives at t = 6, Figure 8(c). Since no new requests arrive in the period t < 6 + ∆T, the robot is stopped within the expected ∆T, holding that speed (in this case 0) until a new command arrives.

[Figure 8: The speed profile produced by the controller in the case of three different speed requests (v = 10 at t = 0, v = 8 at t = 3, v = 0 at t = 6) arriving before the robot reaches the previously requested values; each request starts a new linear ramp completed within ∆t.]
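   A minimal sketch of such a ramp generator follows. The class name, the discrete-step computation, and the update period are illustrative assumptions, while the behaviour (a new linear profile started from the current speed whenever a new target arrives, completed within a fixed ∆T) follows the description above.

// Linear speed-profile generator: every time a new target speed arrives,
// a fresh ramp is started from the *current* speed and completed in ramp_ms.
class SpeedRamp {
public:
    SpeedRamp(double ramp_ms, double update_period_ms)
        : steps_total_(ramp_ms > update_period_ms
                           ? static_cast<int>(ramp_ms / update_period_ms)
                           : 1) {}

    // Called when a new command is decoded (e.g., by the 40 ms thread).
    void setTarget(double target) {
        target_     = target;
        step_       = (target_ - current_) / steps_total_;  // per-update increment
        steps_left_ = steps_total_;
    }

    // Called once per update period; returns the speed to apply to the motors.
    double update() {
        if (steps_left_ > 0) {
            current_ += step_;
            --steps_left_;
        } else {
            current_ = target_;   // hold the reached speed until a new command
        }
        return current_;
    }

private:
    int    steps_total_;
    int    steps_left_ = 0;
    double current_    = 0.0;
    double target_     = 0.0;
    double step_       = 0.0;
};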
4    Testing and Results

The presented system is composed of several elements. Considering that design and implementation errors can happen in several of those elements, proper testing is needed. As manual testing is a cumbersome procedure, test automation is highly desirable. Thus, we searched for existing test automation frameworks and decided to adopt the CUnit [Ham04] and gcov [Rid04] tools. Furthermore, usability testing is a key aspect when dealing with HMI systems. In the rest of this section we present what has been adopted in this work.

4.1    CUnit tests and gcov

Performance analysis and dynamics verification are two strategic tests. A useful framework to automatically test the code of individual functions is CUnit. Such a framework has been employed to check the coherence of the expected outputs of command detection and of message generation, reception, and transmission, which are implemented as state machines. Code Coverage Analysis [CCA] is crucial to identify and purge dead or obsolete code from the system; moreover, the coverage analysis counts the number of times each program statement is executed. Finally, the profiling tool gcov is used to support code optimization.
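   As an illustration of this kind of automated check, the following sketch registers a CUnit test for a hypothetical packStatusByte() encoder. The function under test, the suite name, and the expected values are assumptions; only the registry/suite/test calls reflect CUnit's actual API [Ham04].

#include <CUnit/Basic.h>
#include <cstdint>

// Hypothetical function under test (see the packing sketch in Section 3.2.2).
static uint8_t packStatusByte(uint8_t id, bool is_ack) {
    return static_cast<uint8_t>((1u << 7) | ((id & 0x7) << 4) | (is_ack ? 0x0 : 0x1));
}

static void test_status_byte(void) {
    // ACK with Id = 3: header bit set, Id in bits 6..4, type 0b0000.
    CU_ASSERT_EQUAL(packStatusByte(3, true), 0xB0);
    // Data message with Id = 1: type 0b0001.
    CU_ASSERT_EQUAL(packStatusByte(1, false), 0x91);
}

int main(void) {
    if (CU_initialize_registry() != CUE_SUCCESS) return CU_get_error();
    CU_pSuite suite = CU_add_suite("protocol_suite", NULL, NULL);
    CU_add_test(suite, "StatusByte packing", test_status_byte);
    CU_basic_set_mode(CU_BRM_VERBOSE);
    CU_basic_run_tests();          // run all registered suites
    CU_cleanup_registry();
    return CU_get_error();
}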
4.2    UIM usability test

The User Interaction Module aims at proposing a set of manageable and effective commands. To test their functionality several approaches are viable; the most effective are the usability test [DR99] and the usability inspection [Nie94]. The first approach is a user-based method that involves the end users in the testing phase, while the second is an expert-based method that involves only experts and context-aware users during the testing phase. Both of them aim at identifying possible usability issues; they mainly differ in the categories of the identified problems. Indeed, although the usability test identifies fewer issue occurrences, these are more general and frequent issues [JMWU91].
   The first usability test conducted concerns the types of commands: different semantics (joystick, steering wheel, etc.) were taken into account. This first experiment, named A/B test, consisting in executing the basic commands such as move forward or backward and turn left or right with different speeds and rotation radii, was useful to gather the testers' opinions about efficacy, accuracy, and personal perception. Indeed, 27 testers out of 30 expressed their preference for the joystick-like interface. The second test, named Hallway testing, involves different groups of testers randomly selected to test the whole system. The test consists of driving the Robot through a series of obstacles. After a few commands, the testers gained confidence with the system and easily accomplished the test. Using a Cartesian system to represent the lessons learned, with the number of attempts on the horizontal axis and the coefficient of experience matured during the tests on the vertical axis, the output is a learning curve which can be approximated by an exponential function. Finally, even though the current system does not provide a GUI (the system's feedback is directly provided by the motion of the robot), all the testers appreciated the system's response, classifying the system as "user-friendly".

5    Conclusion

The presented project aimed at bringing a traditionally physical user interface for human-machine interaction, the joystick, into the touchless era. In the proposed work, a touchless joystick has been realized and employed to drive a wireless mobile robot customized for that purpose. All the system's components have been tested and satisfied the conducted formal verifications. Thanks to the adoption of real-time policies, the timing constraints have been respected throughout all tests. Moreover, according to the testers' feedback, the whole system was easy to use, responsive, and effective. The current version of the system perceives both of the user's hands, but it uses only one of them to get the commands to guide the robot. The next step is introducing the acquisition and elaboration of the second hand, in charge of handling the sensors equipping the robot. An initial idea we are working on is to equip the robot with a mobile camera. Such a camera could be handled (zoom, movements, taking pictures, etc.) by the user with one hand while driving the robot with the other. This new feature involves updates in the data acquisition, the communication protocol, and the actuation module. While the data acquisition and the actuation modules have to be restructured, the communication protocol, thanks to its implementation, only requires to be extended with the definition of the new types of messages. An alternative might be including voice commands to control the camera and empower the interaction with the robot [CSM+16]. Finally, the introduction of a GUI is under evaluation to enrich the user experience when operating the mobile camera.

References

[BBP+16]  Pasquale Buonocunto, Alessandro Biondi, Marco Pagani, Mauro Marinoni, and Giorgio Buttazzo. ARTE: Arduino real-time extension for programming multitasking applications. In Proceedings of the 31st Annual ACM Symposium on Applied Computing, pages 1724–1731. ACM, 2016.

[BGG+14]  D Bassily, C Georgoulas, J Guettler, T Linner, and T Bock. Intuitive and adaptive robotic arm manipulation using the leap motion controller. In ISR/Robotik 2014; 41st International Symposium on Robotics, pages 1–7. VDE, 2014.

[Bro98]   C Marlin Brown. Human-computer interface design guidelines. Intellect Books, 1998.

[But11]   Giorgio Buttazzo. Hard real-time computing systems: predictable scheduling algorithms and applications, volume 24. Springer Science & Business Media, 2011.

[CCA]     Code Coverage Analysis. http://www.bullseye.com/coverage.html.

[CCS+16]  Davide Calvaresi, Daniel Cesarini, Paolo Sernani, Mauro Marinoni, Aldo Franco Dragoni, and Arnon Sturm. Exploring the ambient assisted living domain: a systematic review. Journal of Ambient Intelligence and Humanized Computing, pages 1–19, 2016.

[CEB78]   Stuart K Card, William K English, and Betty J Burr. Evaluation of mouse, rate-controlled isometric joystick, step keys, and text keys for text selection on a CRT. Ergonomics, 21(8):601–613, 1978.
[CSM+16]  Davide Calvaresi, Paolo Sernani, Mauro Marinoni, Andrea Claudi, Alessio Balsini, Aldo F. Dragoni, and Giorgio Buttazzo. A framework based on real-time OS and multi-agents for intelligent autonomous robot competitions. In 2016 11th IEEE Symposium on Industrial Embedded Systems (SIES), pages 1–10, 2016.

[Dix09]   Alan Dix. Human-computer interaction. Springer, 2009.

[DR99]    Joseph S Dumas and Janice Redish. A practical guide to usability testing. Intellect Books, 1999.

[EFAA15]  Emanuele Ruffaldi, Filippo Brizzi, Alessandro Filippeschi, and Carlo Alberto Avizzano. Co-located haptic interaction for virtual USG exploration. In 2015 37th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), pages 1548–1551. IEEE, 2015.

[Fal10]   Robert Faludi. Building wireless sensor networks: with ZigBee, XBee, Arduino, and Processing. O'Reilly Media, Inc., 2010.

[Fis01]   Gerhard Fischer. User modeling in human–computer interaction. User Modeling and User-Adapted Interaction, 11(1-2):65–86, 2001.

[FP11]    Valentino Frati and Domenico Prattichizzo. Using Kinect for hand tracking and rendering in wearable haptics. In World Haptics Conference (WHC), 2011 IEEE, pages 317–321. IEEE, 2011.

[FSLS+16] Ramon A Suárez Fernández, Jose Luis Sanchez-Lopez, Carlos Sampedro, Hriday Bavle, Martin Molina, and Pascual Campoy. Natural user interfaces for human-drone multi-modal interaction. In Unmanned Aircraft Systems (ICUAS), 2016 International Conference on, pages 1013–1022. IEEE, 2016.

[Gra81]   J Martin Graetz. The origin of Spacewar. Creative Computing, 18, 1981.

[Ham04]   Paul Hamill. Unit Test Frameworks: Tools for High-Quality Software Development. O'Reilly Media, Inc., 2004.

[Har]     Michael González Harbour. Programming real-time systems with C/C++ and POSIX.

[JMWU91]  Robin Jeffries, James R Miller, Cathleen Wharton, and Kathy Uyeda. User interface evaluation in the real world: a comparison of four techniques. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, pages 119–124. ACM, 1991.

[KR07]    James F Kurose and Keith W Ross. Computer networking: a top-down approach. Addison Wesley, 2007.

[Leaa]    LeapMotion. Leap Motion documentation. https://developer.leapmotion.com/documentation/cpp/index.html.

[Leab]    LeapMotion. Leap Motion. http://www.leapmotion.com.

[MBJS97]  Mark R Mine, Frederick P Brooks Jr, and Carlo H Sequin. Moving objects in space: exploiting proprioception in virtual-environment interaction. In Proceedings of the 24th Annual Conference on Computer Graphics and Interactive Techniques, pages 19–26. ACM Press/Addison-Wesley Publishing Co., 1997.

[MDZ14]   Giulio Marin, Fabio Dominio, and Pietro Zanuttigh. Hand gesture recognition with Leap Motion and Kinect devices. In 2014 IEEE International Conference on Image Processing (ICIP), pages 1565–1569. IEEE, 2014.

[Mid]     MIDI Association. The official MIDI specification. http://www.midi.org/specifications.

[Min02]   David A Mindell. Between human and machine: feedback, control, and computing before cybernetics. JHU Press, 2002.

[Nie94]   Jakob Nielsen. Usability inspection methods. In Conference Companion on Human Factors in Computing Systems, pages 413–414. ACM, 1994.

[Nuw13]   Rachel Nuwer. Armband adds a twitch to gesture control. New Scientist, 217(2906):21, 2013.

[PBAR15]  Lorenzo Peppoloni, Filippo Brizzi, Carlo Alberto Avizzano, and Emanuele Ruffaldi. Immersive ROS-integrated framework for robot teleoperation. In 3D User Interfaces (3DUI), 2015 IEEE Symposium on, pages 177–178. IEEE, 2015.
[Rid04]   Marty Ridgeway. Using code coverage
          tools in the linux kernel. 2004.

[Suc87]   Lucy A Suchman. Plans and situated
          actions: The problem of human-machine
          communication.    Cambridge university
          press, 1987.
[VMC16]   Liberios Vokorokos, Juraj Mihal’ov, and
          Eva Chovancová. Motion sensors: Ges-
          ticulation efficiency across multiple plat-
          forms. In Intelligent Engineering Systems
          (INES), 2016 IEEE 20th Jubilee Inter-
          national Conference on, pages 293–298.
          IEEE, 2016.

[WBRF13] Frank Weichert, Daniel Bachmann,
         Bartholomäus Rudak, and Denis Fisseler.
         Analysis of the accuracy and robustness
         of the leap motion controller. Sensors,
         13(5):6380–6393, 2013.

[WG95]    Matthias M Wloka and Eliot Greenfield.
          The virtual tricorder: a uniform interface
          for virtual reality. In Proceedings of the
          8th annual ACM symposium on User in-
          terface and software technology, pages 39–
          40. ACM, 1995.
[Wol88]   William Wolf. German guided missiles
          henschel hs 293 and ruhrstahl sd1400 x
          fritz x, 1988.
[Zha12]   Zhengyou Zhang. Microsoft kinect sensor
          and its effect. IEEE multimedia, 19(2):4–
          10, 2012.