                             The end-user vs. adaptive user interfaces
                                   Veit Schwartze, Frank Trollmann, Sahin Albayrak
                                                       DAI – Labor
                                                   Ernst Reuter Platz 7
                                                  10781 Berlin, Germany
                                            +49 30/314 - 74064, 74048, 74001
                             {Veit.Schwartze, Frank.Trollmann, Sahin.Albayrak}@dai-labor.de


ABSTRACT
In smart environments, applications can support users in their daily life by being ubiquitously available through various interaction devices. Applications deployed in such an environment have to be able to adapt to different context-of-use scenarios in order to remain usable. For this purpose the designer of such an application defines adaptations from her point of view. Because of situations that are unforeseeable at design time, the user sometimes needs to adjust the designer's decisions. For instance, the capabilities and personal preferences of the user cannot be completely foreseen by the designer. The user needs a way to understand and change adaptations defined by the designer and to define new adaptations. This requires the definition of a set of contexts of use and of the adaptations applied to the user interface in each situation. For this reason supportive user interfaces should enable the user to control and evaluate the state of the adaptive application and to understand "What happens and why?"1 In this paper, we describe the requirements and function of a supportive user interface to evaluate and control an adaptive application deployed in a smart environment.

1 Direct manipulation vs. interface agents, Shneiderman, B. & Maes, P. Interactions, ACM, 1997, 4, 42-61

Keywords
Context-aware applications, end-user support, adaptation and situation definition

INTRODUCTION
Applications deployed into smart environments often aim to support users in their everyday life. Such applications must be able to adapt to different context-of-use scenarios to remain usable in every situation. The large set of possible device properties leads to an infinite number of possible situations which cannot be considered completely at design time. For instance, there is a large set of heterogeneous displays for graphical user interfaces, which differ in their aspect ratio, resolution and input possibilities. In addition, each user has different abilities or disabilities as well as a personal taste. Such preferences cannot be predicted or categorized in a reliable way at design time. The ability of the user to distribute user interface elements to different devices also raises the problem of multi-application scenarios.
This raises the need for the user to understand and control adaptations of the application at runtime in order to personalize it to her liking. In the following, we describe the requirements and functions of a supportive user interface that enables the user to evaluate and control user interface adaptations.
The next section describes the problem in more detail by means of an example application. This is followed by the requirements that have to be met by a supportive user interface. The work-in-progress section then gives an overview of the layout and adaptation models, which are needed to generate the position, size and style of each user interface element and to change these layout dimensions for a specific situation. The conclusion summarizes the paper and describes the next steps.

PROBLEM DESCRIPTION
In this section we illustrate the problem space by the example of a cooking assistant. Afterwards we derive problems that have to be solved within the scope of adaptive user interfaces.
The cooking assistant is an application that enables the user to search for recipes and supports her while cooking them. During the cooking process the cooking assistant is able to control the devices in the kitchen. We deployed the cooking assistant into a real kitchen environment as depicted in Figure 1, top-left. The main screen, shown in Figure 1, top-right, guides the user through the cooking steps and provides help if needed. The bottom half of Figure 1 illustrates several spots corresponding to the different working positions and user tasks in the kitchen.
Figure 1: The kitchen with the cooking assistant running on a touch screen (top-left), the main screen of the cooking
assistant (top-right), and the location spots defined by the context model (bottom).

In [4], we define different automatic adaptations that adapt the user interface to specific situations, defined by working steps, to support the user while operating in the kitchen. Two examples are:
    •   Distance-based adaptation: While cleaning dishes, the user wants to learn more about the next step. A video helps to understand what has to be done. Depending on the user's distance to the screen, the layout algorithm increases the size of the video element to improve legibility. In this case the distance of the user to the interaction device is used to calculate the enlargement factor for this element (a sketch of such a calculation follows this list).
    •   Spot-based adaptation: While using the cooking assistant, the user is preparing ingredients, following the cooking advice and controlling the kitchen appliances on a working surface. Because it is difficult to look at the screen from this position, shown in Figure 1, bottom, the important information (the step description and the list of required ingredients) is highlighted.
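As an illustration of the distance-based rule, the sketch below maps the sensed distance between user and screen to an enlargement factor for the video element. It is a minimal sketch only: the thresholds, the linear mapping and the function name are assumptions chosen for illustration and not the calculation used in [4].

    def enlargement_factor(distance_m: float,
                           near: float = 0.5,
                           far: float = 3.0,
                           max_scale: float = 2.5) -> float:
        """Map the user's distance to the screen (metres) to a scale factor
        for the video element: 1.0 when the user is close, growing linearly
        up to max_scale when she is far away. Thresholds are illustrative."""
        if distance_m <= near:
            return 1.0
        if distance_m >= far:
            return max_scale
        return 1.0 + (distance_m - near) / (far - near) * (max_scale - 1.0)

    # A user cleaning dishes roughly 2 m away from the touch screen:
    print(enlargement_factor(2.0))  # -> 1.9, the video element is enlarged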
The described adaptations can improve the interaction with the application, but the user is not able to influence the adaptations or to interfere, which can lead to frustration and to rejection of the application, for instance if the user is concentrating on the ingredients list or the textual step description and the size of these elements is scaled down. This problem space can be divided into the evaluation and the control of the system state and behavior.
Incomprehensible adaptations can confuse the user. The user has little knowledge about the state of the system and its internal representation of the environment, user and platform characteristics. Therefore, it is hard for her to comprehend why a specific adaptation has been applied. It is not only important to know why something happens but also how to influence the behavior of the user interface generation. Environment conditions and user characteristics that are unknown at design time lead to the wish to adjust adaptations at runtime, e.g. the button size, to the preferences, capabilities or role of the actual user. For example, a user with color blindness or a degeneration of the macula2 may wish to adjust the contrast and the font size to improve the visibility and readability of the user interface. In a similar case, left-handed users may wish to adjust the position of interaction elements (e.g. buttons) so their hands do not hide important information during interaction.
Additionally, supportive user interfaces can allow the user to define individual distributions, which leads to free space or to multi-application scenarios. These problems must be solved. The next section defines the requirements of an approach that enables the user to adjust, interfere with or define new adaptations.

2 That means the loss of vision in the center of the visual field (the macula) because of damage to the retina.
REQUIREMENTS
The requirements of a supportive application are derived from the need to evaluate the state of the system and to control the behavior of the adaptation algorithm. They are divided into:
    •   an approach to define the layout of an application and its adaptations to different context-of-use scenarios, and
    •   support for the end-user to change these adaptations to her preferences.
As mentioned above, heterogeneous interaction devices, sensors and appliances make the development of user interfaces for smart environments a challenging and time-consuming task. To reduce the complexity of the problem, user interface developers can utilize models and modeling languages. User interfaces generated from models at design time often fail to provide the required flexibility because decisions made at design time are no longer available at runtime. To handle this issue, the use of user interface models at runtime has been suggested [6].
This approach shifts the focus from design time to runtime and raises the need to support the end-user in the development and personalization of applications. A meta-user interface offers an abstract view of the state of the system and provides an interface to influence its behavior. In [1] the system provides access to the task and the platform model, where the platform model shows the interaction devices currently available in the home. Like the described approach, the supportive user interface should visualize the user, environment and platform information of the running system in a simple way. The situations and the corresponding adaptations (system- and user-initiated) should also be transparent to the user. This means the representation of the adaptation rules must describe in detail why and how the user interface changes and enable the user to interfere. To make the execution of user interface adaptations more comprehensible for the user, feedback should be provided, such as animating user interface changes.
Additionally, the user needs a way to delete or adjust layout adaptation rules and thus change the situation precondition and the adaptation. A preview of the changes avoids wrong decisions. The definition of a new adaptation rule requires the selection of context variables, their accuracy and the range of values which accurately describe the situation. Next, the user defines the adaptation to be executed: first she selects the layout dimension (size, orientation, containment) she wishes to influence, then a specific statement and the changes realized by the layout generation algorithm. Furthermore, some statements need parameters, e.g. a statement that defines the size of a button depending on the width of the finger.
The state of the realization is described in the next section.

WORK IN PROGRESS
In our implementation, the components that realize adaptations of user interfaces and that can be adjusted at runtime are the layout model and the adaptation model, both based on a models@runtime [6] approach to use the same models at design time and at runtime. Additionally, we have taken first steps to expand the approach of a meta-user interface described in [3] to provide a simple way to adapt the layout generation algorithm to the needs of the user.

Layout model
The layout model defines the structure of the user interface and the spatial relationships between user interface elements. It consists of the user interface structure and a set of statements. The user interface structure is a tree-like hierarchy of Containers and UI-Elements. Containers can contain a set of nested containers and nested elements. User interface elements are the visible parts of the user interface structure and present information to the user. The statements describe the size, style and spatial relationships between the user interface elements.
The approach differs from previous approaches in two general aspects. First, we interpret the design models, such as the task tree, the dialog model, the abstract user interface model and the concrete user interface model. From this information we derive the initial structure of the user interface and suggest statements influencing the spatial relationships and size of user interface elements. We therefore propose an interactive, tool-supported process that reduces the amount of information that needs to be specified for the layout. The tool enables designers to comfortably define design model interpretations by specifying statements and subsequently applying them to all screens of the user interface. The layout model editor is described in more detail in [7].
Furthermore, in contrast to other layout generation approaches such as [2], we create the constraint system at runtime. A sub tree of the user interface structure marks the user interface elements that are currently part of the application's visible user interface; the set of statements regarding these nodes is evaluated and creates a constraint system solved by a Cassowary constraint solver. The result of a successful layout calculation is a set of elements, each consisting of a location (an absolute x, y coordinate) and a width and height value.
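To illustrate how statements over the visible part of the user interface structure can become such a constraint system, the sketch below uses kiwisolver, a Python implementation of the Cassowary algorithm. The element names, the single width-sharing statement and the use of Python are assumptions for illustration; the sketch is not our layout engine.

    from kiwisolver import Solver, Variable

    class UIElement:
        """Leaf of the user interface structure with its layout variables."""
        def __init__(self, name: str):
            self.name = name
            self.x = Variable(name + ".x")
            self.width = Variable(name + ".width")

    # Two elements of the currently visible sub tree:
    step = UIElement("step_description")
    ingredients = UIElement("ingredient_list")
    screen_width = 1280

    solver = Solver()
    # Statements evaluated for these nodes become constraints:
    solver.addConstraint(step.x == 0)                               # left aligned
    solver.addConstraint(ingredients.x == step.x + step.width)      # placed to its right
    solver.addConstraint(step.width + ingredients.width == screen_width)
    solver.addConstraint((step.width == 2 * ingredients.width) | "strong")  # size relation
    solver.updateVariables()

    for element in (step, ingredients):
        print(element.name, element.x.value(), element.width.value())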
Adaptation model
The adaptation model describes possible situations and the corresponding adaptations of the layout model of the application. For this purpose, the adaptation model consists of adaptation definitions. Each adaptation definition is a tuple of a situation, describing when the rule should be applied, and an adaptation rule, describing how the layout model is adapted. The adaptation rules may cause changes to the user interface structure and may also add, modify or delete statements.
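As a small illustration of such adaptation definitions, the sketch below pairs a situation predicate with a rule that modifies statements of a (here purely dictionary-based) layout model. All names, the context representation and the highlighting statement are assumptions for illustration, not the actual adaptation model.

    # Hypothetical adaptation definition: (situation, adaptation rule).
    layout_model = {"statements": {"video.scale": 1.0,
                                   "ingredients.highlight": False}}

    def at_working_surface(context: dict) -> bool:
        """Situation: the user stands at the working-surface spot."""
        return context.get("user.spot") == "working_surface"

    def highlight_important(model: dict) -> dict:
        """Adaptation rule: highlight the ingredient list."""
        model["statements"]["ingredients.highlight"] = True
        return model

    adaptation_definitions = [(at_working_surface, highlight_important)]

    def adapt(model: dict, context: dict) -> dict:
        """Apply every adaptation rule whose situation currently holds."""
        for situation, rule in adaptation_definitions:
            if situation(context):
                model = rule(model)
        return model

    print(adapt(layout_model, {"user.spot": "working_surface"}))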
Figure 2: Example graph of layout model adaptations

The center of Figure 2 shows an example of an adaptation graph. Each node defines a state of the layout model and each edge a set of adaptation rules that transform the layout model into a state applicable for a specific situation. A situation is determined by a certain state of the user, device and environment.
Additionally, we have taken first steps to define a supportive user interface.

Supportive user interface
The supportive user interface should provide a way to understand the representation of context information within the system and allow the manipulation of the user interface generation and adaptation algorithm.
To meet the requirements defined above, a supportive user interface should hide the complexity of the interaction space (various sensors gathering information about the environment, heterogeneous interaction devices and user characteristics) from the user. The complexity of situation definition and recognition must also be encapsulated. Accordingly, the situation description and the adaptation definition must be as simple as possible but as complex as necessary. The user must be able to define powerful adaptations but should not be overstrained. One way to achieve this is to derive semantic information from the user interface models to visualize the affected elements on the screen. To preview the user interface changes, the supportive user interface application simulates the layout model changes and visualizes the result of the calculation to the user.
In [5] we use the information derived from the concrete user interface model (e.g. all button elements) and allow the user to define a statement which influences the size of these elements. A screenshot is shown in Figure 3.

Figure 3: Supportive user interface screenshot

The supportive user interface application adds a statement to the layout model and triggers the recalculation mechanism to update the user interface of the application.
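Continuing in the same illustrative style, adding a user-defined size statement and triggering the recalculation might look like the self-contained sketch below; the variable names, the weak designer default and the 80 px minimum are assumptions, not the interface of the system described in [5].

    from kiwisolver import Solver, Variable

    button_width = Variable("button.width")
    solver = Solver()
    solver.addConstraint((button_width == 48) | "weak")   # designer's default size

    def on_user_statement(min_width: float) -> float:
        """Called after the user defined a minimum-size statement in the
        supportive UI: add the statement and recalculate the layout."""
        solver.addConstraint(button_width >= min_width)
        solver.updateVariables()                          # trigger recalculation
        return button_width.value()

    print(on_user_statement(80))  # -> 80.0, the application's UI is updated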
CONCLUSION
In this paper, we have defined the requirements of a SUI to control and evaluate the state of the adaptive application and have shown first steps of the implementation.
In the future, we plan to increase the share of statements that are derived automatically from the user interface models for the layout generation process. Additionally, we will take the domain model objects influenced by the user interface elements into account. The resulting set of statements reduces the number of designer-defined statements. At runtime, the situation recognition and the adaptation algorithm must be evaluated, especially the handling of imperfect (e.g. inaccurate, incomplete or conflicting) context information and the adaptation of the user interface over time.
Last but not least, we have to implement the SUI concepts and validate the acceptance of our approach in user studies. Additionally, because the user does not want to define all adaptations manually, we want to explore the possibilities of machine learning algorithms to reduce and simplify the definition of adaptations.
REFERENCES
1. Joelle Coutaz. Meta-user interfaces for ambient spaces: Can model-driven engineering help? In Margaret H. Burnett, Gregor Engels, Brad A. Myers and Gregg Rothermel, editors, End-User Software Engineering, number 07081 in Dagstuhl Seminar Proceedings. Internationales Begegnungs- und Forschungszentrum für Informatik (IBFI), Schloss Dagstuhl, Germany, 2007.
2. Christof Lutteroth, Robert Strandh, and Gerald Weber. Domain specific high-level constraints for user interface layout. Constraints, 13(3):307-342, 2008.
3. Dirk Roscher, Marco Blumendorf, and Sahin Albayrak. Using meta user interfaces to control multimodal interaction in smart environments. In Gerrit Meixner, Daniel Görlich, K. Breiner, H. Hußmann, A. Pleuß, S. Sauer, and J. Van den Bergh, editors, Proceedings of the IUI'09 Workshop on Model Driven Development of Advanced User Interfaces, volume 439 of CEUR Workshop Proceedings, ISSN 1613-0073. CEUR Workshop Proceedings (Online), 2009.
4. Veit Schwartze, Sebastian Feuerstack, and Sahin Albayrak. Behavior sensitive user interfaces for smart environments. In HCII 2009 - User Modeling, 2009.
5. Veit Schwartze, Marco Blumendorf, and Sahin Albayrak. Adjustable context adaptations for user interfaces at runtime. In Proceedings of the Working Conference on Advanced Visual Interfaces, pages 321-325, 2010.
6. Gordon Blair, Nelly Bencomo, and Robert B. France. Models@run.time. Computer, 42(10):22-27, Oct. 2009.
7. Sebastian Feuerstack, Marco Blumendorf, Veit Schwartze, and Sahin Albayrak. Model-based layout generation. In Paolo Bottoni and Stefano Levialdi, editors, Proceedings of the Working Conference on Advanced Visual Interfaces. ACM, 2008.