=Paper=
{{Paper
|id=Vol-159/paper-5
|storemode=property
|title=Towards Model-driven Engineering of Plastic User Interfaces
|pdfUrl=https://ceur-ws.org/Vol-159/paper5.pdf
|volume=Vol-159
|dblpUrl=https://dblp.org/rec/conf/uml/SottetCF05
}}
==Towards Model-driven Engineering of Plastic User Interfaces==
Jean-Sébastien Sottet, CLIPS-IMAG, University of Grenoble, France (jean-sebastien.sottet@imag.fr)
Gaëlle Calvary, CLIPS-IMAG, University of Grenoble, France (gaelle.calvary@imag.fr)
Jean-Marie Favre, LSR-IMAG, University of Grenoble, France (jean-marie.favre@imag.fr)
ABSTRACT

Developing advanced User Interfaces (UIs) is very challenging, in particular when many variants of the same UI are to be produced for different platforms. The development of plastic user interfaces is even more demanding. In Human Computer Interaction, plasticity denotes the capacity of a UI to withstand variations of the context of use while preserving usability. A context of use is a triplet <user, platform, environment>. Plasticity raises many issues for both the design time and run time of UIs. This paper shows how Model Driven Engineering concepts can be used in this area.

1. INTRODUCTION

The need for models in Human Computer Interaction (HCI) has long been recognized. Nevertheless, fully automatic generation of user interfaces (UIs) rapidly showed its limits [1]. That does not mean that model-based techniques are not valuable, but that they have to be explored further with other objectives. First, traditionally limited to machine- or human-processed forward engineering, they have to be investigated for reverse and cross engineering. Secondly, models should live at run time instead of being limited to design time.

Model Driven Engineering (MDE) [2] advocates the systematic use of "productive" models, that is, models that can be processed by the machine. Full engineering processes are described by explicit networks of models connected by explicit mappings and transformations [3]. This implies the systematic description of explicit metamodels, that is, models of the modeling languages used to describe each model. Actually, most of the time the focus is set on transformation and traceability rather than on runtime productive models.

The core idea of this paper is to investigate Model Driven Engineering for the development and execution of plastic UIs. In HCI, plasticity denotes the capacity of a UI to withstand variations of the context of use while preserving usability. A context of use refers to the triplet <user, platform, environment>. With ubiquitous computing, UIs are no longer confined to a unique desktop; they can be distributed or migrated among a dynamic set of interaction resources. As a result, UIs must be molded dynamically in order to adapt gracefully to the new context of use.

While product line approaches deal with the production of variants of UIs (e.g., a UI specifically crafted for a PC, a PDA, or a phone), plasticity is even more challenging since it copes with changes of the context of use. The UI can fully migrate from one platform to another (e.g., from a PC to a PDA when the battery of the PC gets low); the UI may be redistributed among a set of platforms (e.g., a remote controller migrating to a PDA); or the UI may stay on the current platform but be remolded in order to accommodate variations in the context of use (e.g., the luminosity level). Adaptation should be done in an opportunistic manner to preserve usability.

While code-centric approaches might be suited to the development of simple, single UIs, they simply fail for developing plastic UIs. Many facets of the interactive system, in particular its plasticity domain (i.e., the set of contexts of use it is able to cover), must be modeled explicitly. This is what Model Driven Engineering is suited for.

Figure 1 shows an overview of an MDE framework, based on the European CAMELEON project [5], addressing the development of plastic UIs. A (simplified) version of this framework is briefly described in this paper. Each package on the first line represents a metamodel (M2 level of the meta-pyramid [4]). As the reader can see, many facets have to be made explicit (e.g., the User, the Platform, ...). The models representing a specific interactive system fit in the M1 level. In a given column, all models conform to [4] the same metamodel. For clarity, mappings and transformations are not shown in the figure.

The paper is organized as follows. Section 2 focuses on the right part of Figure 1, that is, the development of "rigid" (i.e., not plastic) UIs. Specific requirements for plasticity are described in Section 3 (left part of Figure 1). Finally, Section 4 concludes the paper.
Figure 1. A Model Driven Engineering Framework for the Development of Plastic UIs. The top row (M2 level) shows the metamodels: the context of use (User, Platform, Environment), Property, and the design facets (Concept, Task, Workspace, Interactor, Program). Below, the M1 models (M1-Usr, M1-Plf, ..., M1-Prg'') are successively refined down to the running program; ad hoc UI programming corresponds to the bottom-right corner. The right part of the figure covers the MDE of rigid UIs (Section 2), the left part the MDE of plastic UIs (Section 3).
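The M2/M1 organization of Figure 1 can be made mechanical: an M1 model conforms to an M2 metamodel when every element it contains instantiates a type declared by that metamodel. The following minimal sketch illustrates this; the class names and the tiny task model are assumptions for the example, not artifacts from the paper.

```python
# Illustrative sketch of the M2/M1 conformance relation of Figure 1.
# Names ("M2-Tsk", the example elements) are assumptions, not taken
# from the paper's actual metamodels.

class Metamodel:                          # M2 level
    def __init__(self, name, types):
        self.name = name
        self.types = set(types)           # element types the language allows

class Model:                              # M1 level
    def __init__(self, name, metamodel, elements):
        self.name = name
        self.metamodel = metamodel
        self.elements = elements          # list of (element_name, type) pairs

    def conforms(self):
        """True iff every element's type is declared in the metamodel."""
        return all(t in self.metamodel.types for _, t in self.elements)

# Example: a task metamodel and a small task model conforming to it.
m2_tsk = Metamodel("M2-Tsk", {"Task", "BinaryOperator", "UnaryOperator"})
m1_tsk = Model("M1-Tsk", m2_tsk,
               [("book flight", "Task"), ("or", "BinaryOperator")])

print(m1_tsk.conforms())  # True
```

A model containing an element whose type the metamodel does not declare would fail this check, which is exactly why a model without an explicit metamodel cannot be interpreted or transformed by a machine.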
2. MDE OF RIGID USER INTERFACES

Before addressing the specific requirements of plasticity, let us see how Model Driven Engineering can be used for the development of classical rigid UIs, as opposed to plastic UIs. This implies the use of (1) a development process (or, better said, a family of development processes), (2) a set of metamodels, and (3) a set of mappings and transformations. A more detailed discussion of this approach can be found in [6].

2.1 Development Process

Current industrial practices are still mostly code-centric (bottom right of Figure 1). Conversely, Model Driven Engineering processes are based on successive refinements of models, with the integration of new information at each step [5]. Teresa [11] and UsiXML [10] exemplify forward engineering processes. They consist in reifying models step by step from an entry point [16] (in most cases, the task model) until reaching the final running program. Typical development processes start from domain models such as user task and concept models. Then, based on this knowledge and on design choices, the UI is designed in terms of workspaces, interactors and, finally, program elements dependent on specific platforms and libraries. In practice, processes are iterative rather than straightforward, and the number of steps may vary. Moreover, as suggested by Figure 1, it is very often necessary to adapt an initial model to the various constraints introduced at each level. For instance, an initial task model (M1-Tsk) might be tuned into another one (M1-Tsk') to take into account constraints related to concrete library toolkits (M1-Tsk'').

2.2 (Meta)Models for Rigid UI Engineering

Metamodels are key to Model Driven Engineering. To be productive, each model must be accompanied by a precise and explicit metamodel (otherwise the model cannot be interpreted and transformed by a machine). The development of classical rigid UIs implies five metamodels [6]. A simplified backbone composed of four metamodels is given in Figure 2. The mappings between metamodels will be explained in the next section.

• Task. A task describes how a user's objective can be reached by following a given procedure. From an MDE point of view, task models are trees made of binary and unary operators.

• Concept. This model describes the domain of discourse. Typically such models can be described using UML class diagrams. They are usually referred to as Domain Models in the Software Engineering and Product Line communities.

• Workspace. Also referred to as the Abstract User Interface [5], this model describes the network of workspaces in which tasks are performed.

• Interactor. Also referred to as the Concrete User Interface in [5], this model is still described in terms of abstract interactors. Various levels of abstraction can be considered here.

• Program. The last "models" are direct abstractions of programs and/or other implementation techniques. Such models are Platform Specific Models (PSMs) in the MDE jargon. They just represent the actual implementation of the UI.

Obviously, all models expressed with these metamodels should be connected together because they just represent different points of view on the same UI. Mappings and transformations are briefly discussed in the next section.

Figure 2. Mappings Between 4 Metamodels for Rigid User Interface Engineering (simplified view)
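To make the backbone concrete, the following sketch shows the four models as linked structures whose mapping links can be navigated. All class and attribute names here are illustrative assumptions; the paper's actual metamodels in Figure 2 are richer.

```python
# Illustrative sketch (not the paper's metamodels) of the four-metamodel
# backbone: Task, Concept, Workspace, Interactor, connected by mappings.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Concept:                 # domain of discourse (UML-class-like)
    name: str

@dataclass
class Task:                    # task trees built from unary/binary operators
    name: str
    operator: Optional[str] = None                    # e.g. "or", "sequence"
    subtasks: List["Task"] = field(default_factory=list)
    concepts: List[Concept] = field(default_factory=list)  # Task -> Concept mapping

@dataclass
class Workspace:               # Abstract UI: where tasks are performed
    name: str
    task: Task                                        # Workspace -> Task mapping
    successors: List["Workspace"] = field(default_factory=list)

@dataclass
class Interactor:              # Concrete UI element rendering a workspace
    kind: str                                         # e.g. "window", "panel"
    workspace: Workspace                              # Interactor -> Workspace mapping

# A tiny M1 model: one task over one concept, one workspace, one interactor.
flight = Concept("Flight")
book = Task("book flight", concepts=[flight])
ws = Workspace("booking", task=book)
ui = Interactor("window", workspace=ws)

# Following the mapping links recovers the other points of view on the same UI.
print(ui.workspace.task.concepts[0].name)   # Flight
```

Navigating these links in the opposite direction is what Section 3.2 exploits at run time to reason at the task level rather than at the implementation level.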
2.3 Mappings and Transformations

With metamodels, mappings and transformations are the cornerstones of MDE. Without them, all the models would be isolated. Conversely, the idea is to incrementally transform abstract models at a high level of abstraction (Platform Independent Models, PIMs in the MDE jargon) into Platform Specific Models. This is obviously an oversimplification. Monolithic code generators are not suited for advanced user interfaces. MDE approaches are based on libraries of small, composable and extensible transformations. The designer selects and, if necessary, tunes the appropriate transformations. If no transformation is available, a new one can be written thanks to transformation languages. It is then added to a library for further use. In this way, expertise can be captured and packaged into transformation libraries.

Quite often, transformation engines are associated with specific modeling environments based on a given set of metamodels for UI development. This is the case, for instance, for TransformiXML [12] in the UsiXML environment. While this kind of approach is worthwhile, it is specific to UI development. It does not cover the whole software engineering process.

The core idea of our approach is to use generic Model Driven Engineering techniques and extensive libraries of metamodels. This approach is being investigated in the Zooomm project [14]. While emerging standards for expressing MDE transformations are under active development (e.g. QVT), we investigate [6] the appropriateness of the ATL generic MDE transformation language for plasticity. The following piece of code is an example of a transformation written in ATL. It describes very simple rules transforming Tasks into Workspaces:

• The rule TaskToSpace creates one workspace w per user task t. The workspace takes the name of the task.

• The rule OrOperatorToSequence applies to the task model. It transforms each OR operator o between two user tasks (o.leftTask and o.rightTask) into two sequence operators (from o.motherTask to o.leftTask and from o.leftTask to o.rightTask).

  module M2TaskToM2Workspace {
    from M1Task : M2Task
    to M1Workspace : M2Workspace

    -- One workspace for each task
    rule TaskToSpace {
      from t : M2Task!Task
      to w : M2Workspace!Space (
        name <- t.name
      )
    }

    -- Each OR operator becomes sequence operators
    rule OrOperatorToSequence {
      from o : M2Task!BinaryOperator (
        o.name = "or"
      )
      to leftSequence : M2Workspace!Sequence (
        origin <- [Task2Space.e]o.motherTask,
        destination <- [Task2Space.e]o.rightTask )
      ...
    }
  }

3. TOWARDS MDE OF PLASTIC UI

As shown on the left of Figure 1, additional models are necessary for the engineering of plastic UIs.

3.1 Additional (Meta)Models for Plastic UI

For plasticity, we need additional metamodels capturing both the context of use and the usability of the interactive system.

• User. This model represents the end-user of the interactive system. It may capture general information (e.g., age, gender) and the skill level of the end-user (e.g., Rasmussen [18]), both in computer science and in the applicative domain.
• Environment. This model represents the physical (e.g., the noise level) and social conditions where the interaction takes place. From an engineering perspective, the environment can be modeled as a graph of contexts and situations [17]. Additionally, the model could contain social rules such as "switch off the volume in a train", etc.

• Platform. This model exhibits the hardware and software platform sustaining the interaction. It is fundamental when considering redistributable UIs. The simple metamodel below describes the hardware core elements of such a system [9]. Platform modeling is an important issue in Model Driven Engineering, and expertise from this domain could be reused.

Figure 3. Simplified hardware platform metamodel for UI

• Usability. While some elements could be modeled in a single package, existing frameworks (e.g., Scapin & Bastien [7], IFIP [8]) show that many elements could be attached to relations between models. For instance, the IFIP observability property may be seen as a link between concepts and concrete UI elements. If a concept is not mapped onto a UI element, then it is not observable.

3.2 Runtime Adaptation

While models are mostly used at design time in MDE and product-line engineering, plasticity may rely on an extensive use of models at run time. Moreover, all abstraction levels and traceability links are required during execution to compute the adaptation. For instance, moving a selection task from a PC to a PDA is easier when reasoning at the task level rather than at the implementation level. The corresponding interactors can simply be found by following the mapping links [9]. Then, for a migration, a new set of program elements should be computed on the target platform, through traceability links. Sometimes adaptation requires knowledge about the heuristics behind former design choices. For instance, some concepts could have been suppressed for targeting small devices. These concepts should be recovered when migrating to a more powerful device.

4. CONCLUSION

The use of models is not new in User Interface development. First attempts were directed towards the fully automatic generation of UI code. Experience has shown that the resulting UIs were usually poor in terms of usability. Moreover, the use of quite monolithic code generators made it impossible to customize the interface when needed, and specific heuristics based on application domains could not be integrated. Finally, existing environments were not designed with interoperability in mind.

The approach presented in this paper is quite different. Instead of focusing on the user interface only, general Model Driven Engineering techniques have been used. The key idea is to merge experiences from both the MDE and the UI development communities. Instead of developing specific model-based tools, such as transformation languages dedicated to UI development, reusing emerging MDE technologies is promising. First versions might not be fully suited to the specificities of UI development, but if this is the case, this would lead to new requirements for MDE.

The framework presented here is used both at design time and at run time. This last point is quite innovative with respect to traditional MDE applications, which consider platform migration as a quite heavy development process.

5. ACKNOWLEDGMENTS

This work has been supported by the SIMILAR European Network of Excellence.

6. REFERENCES

[1] Myers B., Hudson S.E., Pausch R., "Past, Present, and Future of User Interface Software Tools", ACM Transactions on Computer-Human Interaction (TOCHI), Vol. 7, Issue 1, 2000.

[2] Planet MDE, "A Web Portal for the Model Driven Engineering Community", http://planetmde.org

[3] Favre J.M., "Foundations of Model (Driven) (Reverse) Engineering", Dagstuhl Seminar on Language Engineering for Model Driven Development, DROPS, http://drops.dagstuhl.de/portals/04101, 2004.

[4] Favre J.M., "Foundations of the Meta-pyramids: Languages and Metamodels", DROPS, http://drops.dagstuhl.de/portals/04101, 2004.

[5] Calvary G., Coutaz J., Thevenin D., Limbourg Q., Bouillon L., Vanderdonckt J., "A Unifying Reference Framework for Multi-Target User Interfaces", Interacting with Computers, 2003.

[6] Sottet J.S., Calvary G., Favre J.M., "Ingénierie de l'Interaction Homme-Machine Dirigée par les Modèles", IDM05, Paris, 2005.

[7] Scapin D., Bastien C., "Ergonomic Criteria for Evaluating the Ergonomic Quality of Interactive Systems", Behaviour and Information Technology, Vol. 16, 1997.

[8] Abowd G., Coutaz J., Nigay L., "Structuring the Space of Interactive System Properties", Proceedings of the IFIP, 1992.

[9] Demeure A., Calvary G., Sottet J.S., Vanderdonckt J., "A Reference Model for Distributed User Interfaces", TAMODIA 2005.

[10] Limbourg Q., Vanderdonckt J., Michotte B., Bouillon L., Florins M., Trevisan D., "UsiXML: A User Interface Description Language for Context-Sensitive User Interfaces", AVI 2004, Gallipoli, 2004.

[11] Mori G., Paternò F., Santoro C., "Design and Development of Multidevice User Interfaces through Multiple Logical Descriptions", IEEE Transactions on Software Engineering, August 2004.

[12] Limbourg Q., Vanderdonckt J., Michotte B., Bouillon L., López-Jaquero V., "UsiXML: A Language Supporting Multi-Path Development of User Interfaces", 9th IFIP Working Conference on Engineering for Human-Computer Interaction, EHCI 2004.

[13] Nunes N., Cunha J.F., "Toward Flexible Automatic Generation of User Interfaces via UML and XMI", 5th Workshop Iberoamericano de Ingenieria de Requisitos y Ambientes Software, IDEAS 2002.

[14] Zooomm, "The International ZOO Of MetaModels, Schemas and Grammars for Software Engineering", http://zooomm.org

[15] Mori G., Paternò F., Santoro C., "Tool Support for Designing Nomadic Applications", Proceedings of ACM IUI'03, Miami, p. 141, ACM Press.

[16] Limbourg Q., "Multi-Path Development of User Interfaces", PhD thesis, University of Louvain-la-Neuve, Belgium, 2004.

[17] Crowley J., Coutaz J., Rey G., Reignier P., "Perceptual Components for Context-Aware Computing", UbiComp 2002, Göteborg, Sweden, Sept./Oct. 2002.

[18] Rasmussen J., "Skills, Rules, and Knowledge; Signals, Signs, and Symbols; and Other Distinctions in Human Performance Models", IEEE Transactions on Systems, Man, and Cybernetics, SMC-13, 1983, pp. 257-266.