=Paper= {{Paper |id=Vol-1380/paper3 |storemode=property |title=A Framework for Rapid Prototyping of Multimodal Interaction Concepts |pdfUrl=https://ceur-ws.org/Vol-1380/paper3.pdf |volume=Vol-1380 |dblpUrl=https://dblp.org/rec/conf/eics/SeigerNKNS15 }} ==A Framework for Rapid Prototyping of Multimodal Interaction Concepts== https://ceur-ws.org/Vol-1380/paper3.pdf
A Framework for Rapid Prototyping of Multimodal Interaction Concepts

Ronny Seiger, Florian Niebling, Mandy Korzetz, Tobias Nicolai, Thomas Schlegel
Technische Universität Dresden, Dresden, Germany
ronny.seiger@tu-dresden.de, florian.niebling@tu-dresden.de, mandy.korzetz@tu-dresden.de, tobias.nicolai@mailbox.tu-dresden.de, thomas.schlegel@tu-dresden.de

ABSTRACT
Ubiquitous systems provide users with various possibilities of interacting with applications and components using different modalities and devices. To offer the most appropriate mode of interaction in a given context, various types of sensors are combined to create interactive applications. Thus, the need for integrated development and evaluation of suitable interaction concepts for ubiquitous systems increases. Creation of prototypes for interactions is a complex and time-consuming part of iterative software engineering processes, currently not well supported by tools as prototypes are considered to be short-living software artifacts. In this paper, we introduce the framework Connect that enables rapid prototyping of interaction concepts with a focus on software engineering aspects. The framework allows the definition and modification of event-to-action mappings for arbitrary interaction devices and applications. By applying Connect, model-based prototypes of multimodal interaction concepts involving multiple devices can be created, evaluated and refined during the entire engineering process.

ACM Classification Keywords
H.5.2 User Interfaces: User-centered design

Author Keywords
interaction framework; interaction concepts; multimodal interaction; rapid prototyping; software engineering

Workshop on Large-scale and model-based Interactive Systems: Approaches and Challenges, June 23 2015, Duisburg, Germany.
Copyright © 2015 for the individual papers by the papers' authors. Copying permitted only for private and academic purposes. This volume is published and copyrighted by its editors.

INTRODUCTION
Ubiquitous systems as coined by Weiser [22] describe user-centered systems at the intersection of mobile and pervasive computing combined with ambient intelligence. In addition to the high degree of embeddedness of pervasive systems, ubiquitous systems are characterized by a high level of mobility, often consisting of a large number of heterogeneous and possibly resource-limited devices, which are loosely coupled into dynamic network infrastructures. The emergence of smart spaces (smart homes, smart factories, smart offices) shows the increasing importance and spread of ubiquitous systems throughout different areas of everyday life.

As user interaction with devices that disappear into the background often cannot be realized using traditional metaphors, new ways of interaction have to be explored during the software engineering process for creating ubiquitous system components. The development and improvement of multimodal interaction concepts (i. e., interactions using one or more input modalities [21]) is thereby not limited to initial prototyping, but equally important during implementation, testing and evaluation stages.

Existing tools for interaction design and rapid prototyping of ubiquitous user interaction can be successfully employed during the initial prototyping phases. In later development phases as well as in iterative engineering processes such as user-centered design (UCD), their applicability is often restricted due to their limited ability to automatically propagate changes in prototypes to subsequent stages of development. The mostly informal descriptions and implementations of interaction concepts and interactive applications limit their extensibility with respect to new interaction devices and modalities. The lack of models and formalism also prevents prototypes for interactions from being used and matured in later stages of the development process, which is why prototypes are usually considered to be short-living software artifacts.

In this paper, we propose a model-driven framework for prototyping of interaction concepts that can be applied throughout the different phases of iterative software engineering processes. The focus of the introduced Connect framework for interaction design is placed on software engineering aspects
and models. It enables the rapid development of prototypes and enhancement at runtime during expert evaluation and user studies. Extensibility concerning new types of interaction devices as well as interactive components is easily achieved by inheritance in the object-oriented framework architecture. As a result of a high-level, model-based design of interaction concepts, modifications to the interactions–even of very early prototypes–can be reused and advanced from beginning to end of the development cycle. The framework supports individualizations concerning different groups of users as well as distinct scenarios by customizing interaction concepts using specialization and configuration. We introduce a tool based on the Connect framework that facilitates the creation and customization of interaction concepts at runtime even by non-programmers. The framework is demonstrated by developing an interaction prototype within a Smart Home–a ubiquitous environment consisting of various devices for multimodal interactions with physical and virtual entities.

RELATED WORK
Prototypes are useful tools for assessing design decisions and concepts in early but also in advanced stages of the software engineering process. Especially in iterative design processes for creating usable systems, future users have to be involved continuously to provide feedback and to improve concepts and software artifacts [9]. According to Weiser, one of the essential research and development methods for interactive ubiquitous systems is the creation of prototypes [23]. The rapid prototyping technique aims at creating prototypes in a time- and resource-efficient way to mature artifacts in agile software engineering processes [15]. The focus of our work is on providing a framework for the rapid development and evaluation of multimodal interactions for ubiquitous systems. Especially within the UCD process, prototypes are needed to evaluate design ideas and improve the usability of interactive systems [17]. Complex interaction scenarios involving multimodal interactions require the use of technically mature prototypes to improve the usability of the system or application [14].

The basis for our prototyping framework is the OpenInterface platform developed by Lawson et al. [13]. The platform allows the integration of arbitrary heterogeneous components as interaction devices and applications. OpenInterface provides a lightweight framework and runtime environment leveraging the prototyping of multimodal interaction concepts using high-level data fusion and pipeline concepts to connect interaction devices with interactive applications. In contrast to other frameworks for prototyping of interactions [7, 4], OpenInterface is based on a platform- and technology-independent description of a component's functionality and interface. Our framework adapts these concepts with a stronger focus on the underlying models and component descriptions in order to facilitate the extension, runtime adaptation and reuse of components and interactions during the iterative stages of UCD. Other existing interaction and prototyping frameworks (e. g., CrossWeaver [20] and STARS [16]) are realized in a more informal way for specific scenarios and therefore lack extensibility of software components, interaction devices and modalities as well as reusability. The OpenInterface workbench [13] provides developers with a comprehensive toolset for configuring components and interactions. Our aim is to provide an easy-to-use tool also enabling non-programmers to rapidly create interaction prototypes.

In [10], Hoste et al. describe the Mudra framework for fusing events from multiple interaction devices in order to leverage multimodal interactions. Mudra provides means for processing low-level sensor events and for inferring high-level semantic interaction events based on a formal declarative language. The framework is primarily focused on combining multiple interaction devices to enable advanced forms of interaction input. It can be used complementarily to the Connect framework as part of defining and integrating low-level and high-level sensor components. However, the application of the formalism and semantic models that Mudra is based on increases the effort and complexity for rapidly prototyping interaction concepts and introduces a number of additional components to the lightweight Connect framework.

In addition to the work of Dey et al. [6], one of the first conceptual frameworks for the rapid prototyping of context-aware applications predominant in ubiquitous systems, the iStuff toolkit [3] and its mobile extension by Ballagas et al. [2] represent further related research our Connect framework is based on. These toolkits offer a system for connecting interactive physical devices and interaction sensors with digital applications and software components. The iStuff toolkit suite supports multiple users, devices and applications as well as interaction modalities. However, due to the limited software models applied within these tools, the set of supported interaction devices is rather static. A model-based approach for dynamically creating multimodal user interfaces composed of several components is described by Feuerstack and Pizzolato [8] as part of their MINT framework. Mappings between user interface elements, user actions and actions to be executed can be formally defined with the help of this framework and used for dynamically generating interactive applications. Both the iStuff and MINT frameworks are intended to be used in the design and development process of user interfaces, whereas our focus lies on the prototyping and evaluation of interaction concepts (i. e., event-to-action mappings) in different stages of the UCD process. However, in order to prototype and develop interactive applications including interaction concepts and user interfaces, the iStuff and Connect frameworks can be used complementarily.

The ProtUbique framework by Keller et al. facilitates the rapid prototyping of interactive ubiquitous systems [11]. It supports an extensible set of interaction channels transmitting low-level interaction events from heterogeneous devices and sensors. These interaction events are unified by the framework and accessible programmatically in order to prototype interactive applications. As ProtUbique offers interfaces to access its interaction channels, it is possible to directly combine the ProtUbique and Connect frameworks. Interaction channels are integrated into Connect in the form of sensors supplying interaction events. Connect can then be used to map these events to actions that will be executed by actuators or applications.
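Connect's central notion of an event-to-action mapping, with an external interaction channel acting as the event source, can be sketched as follows. This is a minimal illustration only; the class, method and event names are hypothetical and do not correspond to the actual APIs of ProtUbique or Connect:

```java
import java.util.HashMap;
import java.util.Map;

// Sketch: low-level events arriving from an interaction channel are
// dispatched to actions via a simple event-to-action mapping table.
public class ChannelMappingSketch {

    static final StringBuilder log = new StringBuilder();

    // Event-to-action mapping: event identifiers to executable actions.
    static final Map<String, Runnable> mapping = new HashMap<>();

    // Called by a channel adapter whenever a low-level event arrives.
    static void onChannelEvent(String eventId) {
        Runnable action = mapping.get(eventId);
        if (action != null) {
            action.run();
        }
    }

    public static void main(String[] args) {
        // Map a hypothetical "swipe-left" event to a light-switch action.
        mapping.put("swipe-left", () -> log.append("light off"));
        onChannelEvent("swipe-left"); // simulated incoming channel event
        System.out.println(log);      // prints "light off"
    }
}
```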
With the emergence of ubiquitous systems, users play a central role in the software development process. Prototypes are well suited for involving users in the design process and for improving concepts and software artifacts based on user feedback. Various frameworks for the prototyping of interactive applications including user interfaces and interaction concepts exist. These frameworks are often focused on the use of prototyping techniques in early development stages and limited in the set of supported software components and interaction modalities. However, agile and iterative software engineering processes are required for developing interactive ubiquitous systems. Therefore, we propose a model-driven framework for the rapid prototyping of multimodal interaction concepts. By applying models for the definition of interactive components and their interrelations, extensibility and reusability of interaction concepts and interactive prototypes in multiple design stages are facilitated. In that way, the usability and user experience of applications and systems for ubiquitous environments can be increased.

INTERACTION FRAMEWORK

Structure
We designed the Connect framework from a software engineering point of view using abstract models and their building blocks as a starting point. The framework adheres to a basic class structure consisting of multiple types of components, which are interconnected with each other. Fig. 1 shows the class diagram using UML notation. A Component is a software entity having a defined interface and known behavior [13]. In analogy to control systems linking physical sensors with actuators, we distinguish between SensorComponents representing entities that are able to produce interaction events and ActuatorComponents able to consume interaction events and trigger subsequent actions. ComplexComponents combine these capabilities. In addition, specializations of complex components are used to enable the logical and temporal combination of sensor and actuator components. Ports describe the components' interfaces in order to define interactions and connections between multiple components. EventPorts define the types of events a sensor component is able to produce and ActionPorts represent the types of actions an actuator component is able to perform. The activation of an event port leads to the activation of the action ports the event port is connected to. A central Manager class handles the instantiation of components and maintains a list of all active component instances, which are accessible from within the scope of the framework.

Figure 1. Class diagram of the Connect framework

Sensor Components
Sensor components represent devices and applications that are able to detect interactions and produce corresponding interaction events. The SensorComponent is an interface sensors have to implement, e. g., by an adapter connecting the sensor device via its API to the Connect framework. An EventPort is a wrapper for every type of event the sensor component is able to trigger. The sensor component maintains a list of all its events and creates corresponding event ports. By implementing the sensor component interface, new types of interaction devices and arbitrary event sources can be integrated into the framework. In order to integrate new types and corresponding instances of sensor components into the framework, an adapter for receiving the sensors' events has to be implemented. On receiving an event from the sensor, the state of the corresponding event port is updated, i. e., the port is activated or deactivated. An event port can be connected to one or more action ports of one or more actuator components. The event port's activation leads to the activation of the connected action ports. As arbitrary event sources are supported, interactive devices independent of modality and number can be combined to be used for multimodal interactions, i. e., using one or more input modalities. Currently, only binary states for events (active/inactive) without additional data payload are supported by the Connect framework.

An example of a locally integrated sensor component is the computer's keyboard. The event ports correspond to the set of individual keys. A smartphone device sending touch events via a dedicated app to an instance of the Connect runtime is an example of a remotely integrated sensor component. The set of touch events provided by the smartphone and supported by the app represents the event ports.

Actuator Components
Actuator components represent devices and applications that are able to actively perform and execute actions. Analogous to a sensor component, the ActuatorComponent is an interface actuators have to implement in order to connect the actuator to the Connect framework. An ActionPort is a wrapper for an action or method the actuator component is able to execute. New types of actuator components can be integrated into the framework as implementations of the interface ActuatorComponent. Adapters for calling the actuator components' particular operations from inside the framework have to be implemented for every type of actuator. An action port can be connected to one or more event ports of one or more sensor components. Upon receiving an activation from an event port connected to an action port, the actuator component activates the action port and executes the corresponding method. Arbitrary local and remote devices and applications can be integrated into the framework as actuator components. Currently, we support the activation of methods without the processing of input or output parameters.

An example of an actuator component is a service robot whose movement functionality is provided in the form of
directed movement actions (e. g., forward, backward, left,
right). For each direction there is a corresponding action port.
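The port model described above can be sketched in Java (the language of the Connect tool). The type names SensorComponent-style sensors, EventPort and ActionPort follow the paper's class diagram, but all method names, the lambda-based action ports and the example keyboard/robot components are illustrative assumptions, not the actual Connect API:

```java
import java.util.ArrayList;
import java.util.List;

// Minimal sketch of Connect's event-to-action port wiring.
public class ConnectSketch {

    // An action port wraps a method an actuator is able to execute.
    interface ActionPort {
        void activate();
    }

    // An event port can be connected to any number of action ports;
    // activating the event port activates every connected action port.
    static class EventPort {
        private final List<ActionPort> connected = new ArrayList<>();
        private boolean active = false;

        void connectTo(ActionPort port) { connected.add(port); }

        void activate() {
            active = true;
            for (ActionPort port : connected) {
                port.activate();
            }
        }

        void deactivate() { active = false; }
        boolean isActive() { return active; }
    }

    // Example sensor: a keyboard exposing one event port per key.
    static class KeyboardSensor {
        final EventPort keyA = new EventPort();
    }

    // Example actuator: a robot counting how often it was driven forward.
    static class RobotActuator {
        int forwardCalls = 0;
        final ActionPort forward = () -> forwardCalls++;
    }

    public static void main(String[] args) {
        KeyboardSensor keyboard = new KeyboardSensor();
        RobotActuator robot = new RobotActuator();

        // Event-to-action mapping: pressing "A" drives the robot forward.
        keyboard.keyA.connectTo(robot.forward);

        keyboard.keyA.activate();   // a sensor adapter reports a key press
        System.out.println(robot.forwardCalls); // prints 1
    }
}
```

Note that, as in the framework itself, the sketch propagates only binary activations: no data payload travels from the event port to the action port.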
Complex Components
Complex components represent devices and applications that combine sensor and actuator functionalities. These entities can contain multiple event and action ports, i. e., they are able to produce events for actuator components and receive events from sensor components. The ComplexComponent class is viewed as an abstract class that has to implement both the SensorComponent and the ActuatorComponent interfaces in order to be integrated into the Connect framework.

An example of a complex component is a smartphone sending touch interaction events and providing executable operations (e. g., for taking pictures or switching on its light).

Logical and Temporal Components
Logical Components are viewed as specializations of complex components. They are used for creating logical connections (AND, OR, NOT, XOR, etc.) between multiple event ports of one or more sensor components. The logical component's action ports are used as input ports for events from the sensor components and its event ports are used as output ports producing events for the activation of subsequent actuator components. By cascading these logical components, complex logical circuits for sensor events triggering actions of actuator components can be created. In addition, we integrate flip-flop and trigger components for saving of and switching between states. That way, it is possible to define more complex interaction concepts involving multiple interaction devices in advanced stages of the engineering process and also to introduce modes of interaction (i. e., state-dependent behavior of the interactive prototypes).

Besides logical components, Temporal Components for describing temporal dependencies between sensor events are supported as extensions of complex components. That way we are able to define the activation of higher-level events, e. g., after a defined number of repeating sensor events or after the appearance of an event within a defined time frame. The functions and algorithms–including additional attributes–that are executed when the logical or temporal component is activated have to be provided for each new type of complex component. As new types of components are introduced into the framework's underlying class model by inheritance, only the base classes' methods have to be overridden to use instances of these new components.

Dynamic Components
Thus far we are able to extend the set of sensor, actuator and complex components by introducing implementations and specializations of the appropriate classes into the model at design time. In order to add new types of components at runtime, we extend the framework by the concept of Dynamic Components. These components are created by Connect's runtime based on a formal model of a component's functionality and ports. Currently, we support the use of a WSDL (Web Services Description Language [5]) document describing the available operations of a service-based actuator component. The WSDL format provides a suitable formalization of an actuator's callable methods and their parameters, which can be parsed in order to automatically create an actuator component implementing the ActuatorComponent interface and the corresponding action ports.

PROTOTYPING MULTIMODAL INTERACTION CONCEPTS

Exemplary Sensors

Brain Computer Interface
In order to show the framework's capability of supporting multimodal interactions and its applicability within the Smart Home scenario, we extended the core sensor component by the Emotiv EPOC (https://emotiv.com/epoc.php) EEG brain computer interface (BCI) acting as a source of interaction events [19]. The BCI used in our setting provides interaction modes enabling the recognition of thoughts (Cognitive), facial expressions (Expressive), emotions (Affective) and head movement (Gyroscope). Each of these modes is introduced as a subclass of the abstract BrainComputerInterface class, which implements the SensorComponent interface (see Fig. 2). Event ports are created for every possible type of sensor event produced by the BCI in each mode (e. g., for blink, wink left, look right, smile, and laugh in the Expressive mode). Upon instantiation of an object of one of the interaction mode classes, a listener for event ports corresponding to the sensor events is initialized.

Figure 2. Extensions of the SensorComponent to support BCI input modes

Tablet
The second exemplary sensor component from the Smart Home domain that we integrated into our test setting is an Android-based tablet device. A dedicated app sends interaction events regarding the pressing of specific buttons and events detected by the tablet's gyroscope sensor to an instance of the Connect framework. In order to support this event source, we introduce the abstract Tablet class implementing the SensorComponent interface. From that class, the Button and Gyroscope modes are derived as subclasses (see Fig. 3). Event ports representing the particular buttons and gyroscope movement directions (i. e., forward, backward, left, right) enable the detection of the corresponding interaction events and connection to other components.

Exemplary Actuators

Service Robot
A TurtleBot 2 (http://www.turtlebot.com/) service robot plays the role of an actuator in the context of our Smart Home scenario [19]. We abstracted its movement functionality into two operational modes extending the abstract ServiceRobot class: Manual Movement
Figure 3. Extensions of the SensorComponent to support tablet input       Figure 5. Complex network of input and output components forming an
modes                                                                     interactive prototype



                                                                          interactions and listeners for new events. The security com-
                                                                          ponent’s second operation resets all ports to the inactive state
                                                                          and re-enables the event listeners to continue with the inter-
                                                                          action. Both operations can be connected to the event ports
                                                                          of an arbitrary–preferably reliable–sensor component.

                                                                          Prototyping Tool
                                                                          We implemented a Java application based on the Connect
                                                                          framework. The tool allows the graphical instantiation of
Figure 4. Extension of the ActuatorComponent to support a service robot actuator

and Automatic Movement (see Fig. 4). The manual movement mode supports the fine-grained control of the robot platform through direct movement commands (i. e., forwards, backwards, to the left, to the right). Using the automatic movement mode, the robot can be sent to specific locations in a room or building. In automatic mode, driving, path planning and obstacle avoidance are handled by the robot itself. The action ports of these two actuator component modes correspond to the available movement directions (manual mode) and to the specific target locations (automatic mode).

Service-based Light Switch
The capability of dynamically adding new components at runtime is an important feature of the Connect framework. As it supports the automated generation of an actuator component from a WSDL document, we implemented a web service for the remote control of a light switch that provides a switch-on and a switch-off operation. Upon parsing the WSDL file and creating action ports for both operations, the Connect runtime acts as a client sending requests to the web service.

The configuration tool supports all known types of sensor, actuator and complex components. The lists of available ports, the individual attributes and the component's graphical representation are coded into the class structure and the component's data model. Instantiated components can be configured using the tool. In addition, it is possible to generate service-based actuator components from a WSDL file. Connections between component ports are created and modified graphically at runtime using drag-and-drop gestures (cf. the pipeline metaphor [13]). That way, circuits for interactions consisting of sensors, actuators, logical components and temporal components can be designed. For certain types of sensor events, sliders are available for setting activation thresholds. As many interaction devices provide sensor data in the form of numerical values, not just Boolean values for the active/inactive states, the definition of activation thresholds increases the accuracy of event detection and activation and supports individual user configurations.

Fig. 5 shows a screenshot of the configuration tool's user interface containing three instances of sensor components (BCI modes), logical components and an actuator component (service robot). The tool's user interface provides visual feedback regarding currently active sensor events, connections and actions as well as numerical values for sensor input.
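The effect of such an activation threshold can be sketched in a few lines. The following Python class is only an illustration of the mechanism; the names are hypothetical and not part of Connect's actual API:

```python
class ThresholdEventPort:
    """Event port that maps a numerical sensor value to a Boolean
    active/inactive state using a configurable activation threshold."""

    def __init__(self, threshold):
        self.threshold = threshold
        self.active = False

    def update(self, value):
        # The port is active only while the numerical sensor value
        # (e.g., a BCI signal level) reaches the configured threshold.
        self.active = value >= self.threshold
        return self.active
```

Adjusting the threshold, e. g., via the tool's sliders, then directly trades sensitivity against robustness for noisy sensor input.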
Security Component
In order to prevent incorrect behavior and actions caused by imprecise interaction devices and unintended user interactions at runtime, a Security Component is introduced into the framework as an implementation of the actuator component. This component provides an operation for deactivating all event and action ports, thereby disabling the current interactions.

Prototype Configurations and User Profiles
The composition of components as well as their interconnections, attributes and port thresholds can be persisted in individual prototype configurations and user profiles based on the class model presented in the previous section. These settings are saved in and loaded from XML-based files. In this way, individual interaction concepts can be created for specific prototypes, component configurations, scenarios and users according to their capabilities. These model-based configurations can then be used as templates for creating new interaction concepts or for refinement at a later stage of the development process [18]. Listing 1 shows an extract of a prototype configuration describing an event port of a sensor component connected to an action port.

By creating user profiles for specific users, user groups and scenarios, the corresponding prototypes can be used for user-centered evaluation methods (e. g., user studies and expert evaluation) during various stages of the software development process. In addition, user profiles facilitate the creation of interactive applications according to a user's individual cognitive capabilities and preferences.
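At runtime, a persisted connection of this kind maps an event port of a sensor component to an action port of an actuator component. The following Python sketch illustrates this event-to-action forwarding; the class and method names are assumptions for illustration, not Connect's actual API:

```python
class ActionPort:
    """Actuator-side port that triggers an operation when invoked."""

    def __init__(self, name, operation):
        self.name = name
        self.operation = operation

    def invoke(self):
        self.operation()


class EventPort:
    """Sensor-side port that forwards its events to connected action ports."""

    def __init__(self, name):
        self.name = name
        self.connections = []

    def connect(self, action_port):
        self.connections.append(action_port)

    def fire(self):
        # Notify every connected action port about the sensor event.
        for port in self.connections:
            port.invoke()


# Wiring corresponding to Listing 1: the "Up" event port of the tablet
# button component drives the TurtleBot's "MoveForward" action port.
log = []
up = EventPort("Up")
move_forward = ActionPort("MoveForward", lambda: log.append("MoveForward"))
up.connect(move_forward)
up.fire()
```

Because connections are plain lists, they can be created, updated and deleted at runtime, which matches the drag-and-drop editing described for the configuration tool.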
Listing 1. Extract from a prototype configuration

  <connection>
    <sensorComponent>
      <class>IO.TabletButtonComponent</class>
      <name>TabletButtonComponent</name>
      <eventPort>
        <class>Core.EventPort</class>
        <name>Up</name>
        <active>false</active>
      </eventPort>
    </sensorComponent>
    <actuatorComponent>
      <name>TurtleBot</name>
      <actionPort>
        <name>MoveForward</name>
      </actionPort>
    </actuatorComponent>
    ...
  </connection>

The extensibility of the framework's underlying models for components and interactions allows for the integration of new interactive devices and applications at later engineering stages. That way, interaction concepts can be developed starting from simple event-to-action mappings and evolved into more complex models and scenarios for interactive ubiquitous systems. Changes within these models can be propagated to the affected software artifacts. By adding a mechanism for model versioning, traceability for the evolution of these interaction models across iterative stages of the design process is achieved, and templates of interaction concepts can be created for reuse in new projects.

Prototyping and Evaluation
With the help of the Connect framework and the configuration tool, interaction concepts describing mappings between interaction devices and active controllable components can be created. Once integrated into the framework, multiple instances of multimodal sensor, actuator and complex components are ready to be used and loosely coupled at runtime for a particular setting. Connections between sensors and actuators and their respective ports are modifiable at runtime (create, update, delete). Compared to hard-wired event-to-action mappings, model-based prototypes of interaction concepts can be created and modified quickly with the help of Connect in order to test and evaluate their usability and suitability for concrete use cases as part of the software engineering process.

DISCUSSION
The introduced Connect framework for the creation and iterative advancement of interaction concepts is built on model-based components and configurations. Our chosen design allows for convenient extension through the inclusion of additional arbitrary interaction components (i. e., sensors and actuators) via concepts of object-orientation. Basic design patterns reduce the effort for developers to extend the set of available interaction devices and applications. At runtime, interaction components can be rapidly combined into pipelines and circuits to form complex interaction concepts. Modifications of interaction concepts as well as persisting and loading of interaction and component configurations are supported by the framework. Compared to related research, this approach allows for a high level of reusability and refinement of interactive prototype applications and configurations in succeeding development stages. With the help of the graphical configuration tool it is possible to rapidly create working prototypes for multimodal interactive applications and use these prototypes for test and evaluation purposes.
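As an illustration of the temporal components mentioned above, the following Python sketch accepts an event only if a second activation arrives within a given time window, e. g., a double-blink from a BCI instead of a single natural blink. It is a simplified illustration with assumed names, not Connect's actual implementation:

```python
class DoubleActivationFilter:
    """Temporal component that triggers only on two activations within
    `window` seconds, suppressing single unintended events (Midas touch)."""

    def __init__(self, window):
        self.window = window
        self.last_event_time = None

    def on_event(self, timestamp):
        # Trigger only if the previous activation occurred no more than
        # `window` seconds ago; otherwise remember this one and wait.
        if (self.last_event_time is not None
                and timestamp - self.last_event_time <= self.window):
            self.last_event_time = None  # reset after a successful trigger
            return True
        self.last_event_time = timestamp
        return False
```

Placed between a blink event port and an action port, such a filter turns an "always on" sensor into a deliberate trigger.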
Connect can be part of various user-centered prototyping and evaluation methods and stages, reusing models that describe components and their interrelations. As the prototyping tool follows known metaphors from the WIMP paradigm and integrates easy-to-understand graphical metaphors (e. g., pipelines and circuits), it is also possible for designers, non-programmers and end users to understand and define interaction concepts for a given scenario. That way, future users can be involved in early stages of the design process, leading to more intuitive interactions and usable applications for ubiquitous systems.

Due to the use of known metaphors, e. g., pipelines and circuits, the creation and configuration of complex interaction concepts can be achieved even by non-programmers such as interaction designers or end users, supporting different phases of iterative software engineering processes like UCD. By providing model-based abstractions for component design and data flow, interaction design may be advanced from simple prototypes during requirement analysis to complex multimodal interaction concepts containing numerous different sensors and actuators in subsequent phases of development. This iterative improvement of existing concepts proved to be especially helpful during usability testing using expert evaluation and end user studies.

Due to the model-driven approach for describing components and their interrelations applied in Connect, it is possible to persist and reload component configurations and their connections. Prototypes of interaction concepts can be repeatedly tested, reused and refined in order to increase the usability of the interactive application that is under development.

On the one hand, the introduction of logical and temporal components into more complex interaction concepts allows for the use of multiple interactive input devices. On the other hand, these components enable interaction modes and the definition of interaction sequences to prevent unintended interaction events. These mechanisms become necessary when using imprecise and "always on" interaction devices (e. g., brain-computer interfaces and eye trackers) to prevent the Midas touch effect. With respect to the BCI, a double-blink within a certain timeframe could, for example, be used to trigger an action instead of a simple "natural" blink. As shown in the prototyping tool, thresholds defining the activation of sensor ports can be defined for sensor events containing Integer or Double values. This mechanism also helps interaction designers with the integration of imprecise devices. Lastly, a security component can be added to the interaction concept and test environment to stop all ongoing interactions in case of any malfunctions. This is especially helpful when interacting with real-world physical devices in ubiquitous systems.

The Connect framework is a prototyping tool for engineering interactive applications. It can be employed during the development process for designing and testing interaction concepts. In combination with other frameworks for the prototyping of user interfaces (e. g., MINT [8], iStuff [2]), interactions (e. g., OpenInterface [13]) and interaction devices (e. g., ProtUbique [11]) as well as for the fusion of high-level sensor events (e. g., Mudra [10]), Connect represents an addition to the engineering toolchain for interactive ubiquitous systems. Due to its extensibility and model-driven development approach, interaction and configuration models created with Connect could be used as input for subsequent tools and phases in the software engineering process.

The current development state of the Connect framework still contains some shortcomings that will be improved in consecutive versions. Until now, user-defined data types beyond simple Boolean values at event and action ports are not supported. To be able to accommodate analogue sensors, at least some form of floating-point data and complex data types will have to be included in the extensible data model. In addition, aggregation of components attached to distributed instances of the framework is not yet possible. With the availability of active components that contain significant processing power themselves, such as smartphones, the integration of preprocessed sensor values can be simplified by combining multiple networked Connect systems.

CONCLUSION & FUTURE WORK
Engineering software for interactive ubiquitous systems requires flexible and iterative development processes. The continuous involvement of future users throughout the entire process is a key aspect of the user-centered design and development methodology for ubiquitous systems. Prototypes are a well-suited tool for the development, test and evaluation of theoretical concepts in almost all stages of the software engineering process. We developed an extensible and easy-to-use framework that supports rapid prototyping, evolution and evaluation of interaction concepts for ubiquitous systems. The Connect framework is based on a modular object-oriented software model, which views interaction devices as sensor components and interactive applications as actuator components. Interaction concepts can be defined, modified and tested at runtime by connecting these components. The framework follows a model-driven software engineering approach enabling the extension and integration of new types of components into interactive prototypes as well as the reuse of component and prototype configurations during development. Related frameworks and concepts generally lack extensibility, flexibility and reusability due to the limited use of models and other formalisms. Therefore, interaction frameworks often support only a static set of interaction devices and applications that can be used for the development of short-lived interaction prototypes at early design stages. With Connect, the set of software components can be easily extended to support new types of devices and applications, which can be combined to create multimodal interactions. Connect's runtime and user-friendly prototyping tool facilitate the use of multiple input and output devices as entities involved in interactions as well as the dynamic reconfiguration of interactions at runtime. The model-based descriptions of components and interactions leverage the reuse and iterative refinement of components, concepts and prototypes for the user-centered software engineering process of ubiquitous systems.

Regarding future work, we will extend the component models in order to be able to process non-Boolean input and output values and states for event and action ports. The use of a formal definition language for describing the interfaces of a sensor component will add the ability to also introduce new types of sensor components to Connect at runtime. By combining the prototyping framework ProtUbique [11] for defining interaction sources and high-level interaction events with service-based communication for sending interaction events to Connect, we will create a flexible toolchain for developing prototypes of interactive multimodal applications. In addition, dynamic component platforms (e. g., OSGi [1]) can be employed to introduce additional runtime flexibility concerning the support of new types of software components. We will also look into distributed communication among several instances of the Connect runtime. An instance of Connect running on one computer could be used as a sensor component within another Connect instance, which allows the preprocessing and derivation of higher-order events on local computers in order to save resources and simplify the modeling of complex interactions. To evaluate the framework's applicability, we will use Connect for the prototyping of interactive software components as part of the engineering process for Smart Home applications [19, 12].

REFERENCES
1. OSGi Alliance. 2003. OSGi service platform, release 3. IOS Press, Inc.
2. Rafael Ballagas, Faraz Memon, Rene Reiners, and Jan Borchers. 2007. iStuff mobile: rapidly prototyping new mobile phone interfaces for ubiquitous computing. In Proceedings of the SIGCHI conference on Human factors in computing systems. ACM, 1107–1116.
3. Rafael Ballagas, Meredith Ringel, Maureen Stone, and Jan Borchers. 2003. iStuff: a physical user interface toolkit for ubiquitous computing environments. In Proceedings of the SIGCHI conference on Human factors in computing systems. ACM, 537–544.
4. Jullien Bouchet and Laurence Nigay. 2004. ICARE: a component-based approach for the design and development of multimodal interfaces. In CHI'04 extended abstracts on Human factors in computing systems. ACM, 1325–1328.
5. Erik Christensen, Francisco Curbera, Greg Meredith, Sanjiva Weerawarana, and others. 2001. Web services description language (WSDL) 1.1. (2001).
6. Anind K Dey, Gregory D Abowd, and Daniel Salber. 2001. A conceptual framework and a toolkit for supporting the rapid prototyping of context-aware applications. Human-computer interaction 16, 2 (2001), 97–166.
7. Pierre Dragicevic and Jean-Daniel Fekete. 2001. Input device selection and interaction configuration with ICON. In People and Computers XV: Interaction without Frontiers. Springer, 543–558.
8. Sebastian Feuerstack and Ednaldo Pizzolato. 2011. Building multimodal interfaces out of executable, model-based interactors and mappings. In Human-Computer Interaction. Design and Development Approaches. Springer, 221–228.
9. John D Gould. 2000. How to design usable systems. Readings in Human-Computer Interaction: Toward the Year 2000 (2000), 93–121.
10. Lode Hoste, Bruno Dumas, and Beat Signer. 2011. Mudra: a unified multimodal interaction framework. In Proceedings of the 13th international conference on multimodal interfaces. ACM, 97–104.
11. Christine Keller, Romina Kühn, Anton Engelbrecht, Mandy Korzetz, and Thomas Schlegel. 2013. A Prototyping and Evaluation Framework for Interactive Ubiquitous Systems. In Distributed, Ambient, and Pervasive Interactions. Springer, 215–224.
12. Suzanne Kieffer, J-YL Lawson, and Benoit Macq. 2009. User-centered design and fast prototyping of an ambient assisted living system for elderly people. In Information Technology: New Generations, 2009. ITNG'09. Sixth International Conference on. IEEE, 1220–1225.
13. Jean-Yves Lionel Lawson, Ahmad-Amr Al-Akkad, Jean Vanderdonckt, and Benoit Macq. 2009. An open source workbench for prototyping multimodal interactions based on off-the-shelf heterogeneous components. In Proceedings of the 1st ACM SIGCHI symposium on Engineering interactive computing systems. 245–254.
14. Linchuan Liu and Peter Khooshabeh. 2003. Paper or interactive?: a study of prototyping techniques for ubiquitous computing environments. In CHI'03 extended abstracts on Human factors in computing systems. ACM, 1030–1031.
15. Luqi. 1989. Software evolution through rapid prototyping. Computer 22, 5 (1989), 13–25.
16. Carsten Magerkurth, Richard Stenzel, Norbert Streitz, and Erich Neuhold. 2003. A multimodal interaction framework for pervasive game applications. In Workshop at Artificial Intelligence in Mobile System (AIMS), Fraunhofer IPSI.
17. Martin Maguire. 2001. Methods to support human-centred design. International journal of human-computer studies 55, 4 (2001), 587–634.
18. Nicolai Marquardt, Robert Diaz-Marino, Sebastian Boring, and Saul Greenberg. 2011. The proximity toolkit: prototyping proxemic interactions in ubiquitous computing ecologies. In Proceedings of the 24th annual ACM symposium on User interface software and technology. ACM, 315–326.
19. Ronny Seiger, Tobias Nicolai, and Thomas Schlegel. 2014. A Framework for Controlling Robots via Brain-Computer Interfaces. In Mensch & Computer 2014 – Workshopband: 14. Fachübergreifende Konferenz für Interaktive und Kooperative Medien – Interaktiv unterwegs - Freiräume gestalten. Walter de Gruyter GmbH & Co KG, 3.
20. Anoop K Sinha and James A Landay. 2003. Capturing user tests in a multimodal, multidevice informal prototyping tool. In Proceedings of the 5th international conference on Multimodal interfaces. ACM, 117–124.
21. Wolfgang Wahlster. 2006. SmartKom: foundations of multimodal dialogue systems. Vol. 12. Springer.
22. Mark Weiser. 1991. The computer for the 21st century. Scientific American 265, 3 (1991), 94–104.
23. Mark Weiser. 1993. Some computer science issues in ubiquitous computing. Commun. ACM 36, 7 (1993), 75–84.