=Paper=
{{Paper
|id=Vol-214/paper-8
|storemode=property
|title=Multi-paradigm modelling and synthesis of user interfaces
|pdfUrl=https://ceur-ws.org/Vol-214/paper8.pdf
|volume=Vol-214
|dblpUrl=https://dblp.org/rec/conf/models/VangheluweD06
}}
==Multi-paradigm modelling and synthesis of user interfaces==
Multi-Paradigm Modelling and Synthesis of User Interfaces
Denis Dubé and Hans Vangheluwe
School of Computer Science
McGill University
Montréal, Québec, Canada
denkkar@gmail.com, hv@cs.mcgill.ca
ABSTRACT
In this article, model-based design and synthesis of reactive user interfaces is presented as a particular application of Computer-Automated Multi-Paradigm Modelling (CAMPaM). Multi-paradigm modelling acknowledges the need to model at different levels of abstraction, using appropriate formalisms. It also gives transformations first-class model status. In the CAMPaM UI development process, a class of user interfaces is modelled. This includes models of the abstract syntax of the user interface, of the concrete visual syntax of the user interface (including layout) and of the semantics of the application (its reactive behaviour). From these models, an interactive modelling environment is synthesized. This environment allows the modeller to experiment (analyze, simulate) with different instances in the modelled class of user interfaces. Once a single element of the set of possible user interfaces is chosen, the final UI application is synthesized. This process will be demonstrated by means of a digital watch application. Code is synthesized for execution within a web browser using an AJAX client-server architecture.
1. INTRODUCTION abstract representation which captures the essence of the
Recently, model-based approaches to complex (software) systems development have gained popularity. Putting models (rather than code) central in the development process does indeed offer many advantages. It raises the level of abstraction; it enables formal analysis (through model checking for example), simulation (for performance analysis), as well as automated, consistent code synthesis for multiple target platforms. Modelling complex systems is a difficult task, as these systems often have components and aspects which cannot be described in a single formalism (such as Class Diagrams, Statecharts, or Petri Nets). User interfaces are a very pertinent example of such complex systems, in particular as there are many facets to their structure and behaviour. Multi-Paradigm Modelling [9] captures the notions that (1) models may have components described in different formalisms, and span different levels of abstraction, and that (2) model transformations are used to map models onto domains and formalisms where certain questions can be easily answered. The sequel demonstrates by means of a digital watch example how CAMPaM principles can be consistently applied to the design and synthesis of User Interfaces.

2. MODELLING LANGUAGES
The two main aspects of any model are its syntax (how it is represented) and its semantics (what it means). The syntax of modelling languages is traditionally partitioned into concrete syntax and abstract syntax. In textual languages for example, the concrete syntax is made up of sequences of characters taken from an alphabet. These characters are typically grouped into words or tokens. Certain sequences of words known as sentences are considered to belong to the language. The (possibly infinite) set of all valid sentences is said to make up the language. Costagliola et al. [2] present a framework of visual language classes in which the analogy between textual and visual characters, words, and sentences becomes apparent. Visual languages are those languages whose concrete syntax is visual (graphical, geometrical, topological, ...) as opposed to textual. For practical reasons, models are often stripped of irrelevant concrete syntax information during syntax checking. This results in an abstract representation which captures the essence of the model. This is called the abstract syntax. Obviously, a single abstract syntax may be represented using multiple concrete syntaxes. In programming language compilers, the abstract syntax of models (due to the nature of programs) is typically represented in Abstract Syntax Trees (ASTs). In the context of general modelling, where models are often graph-like, this representation can be generalized to Abstract Syntax Graphs (ASGs).
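To make the ASG representation concrete, the following is a minimal sketch of a typed abstract syntax graph in Python (the language our code generator targets); the class and attribute names are illustrative only and do not correspond to the AToM3 API.

<pre>
# Minimal, illustrative abstract syntax graph (ASG): typed nodes carrying
# attributes, connected by labelled edges. Names are hypothetical and do
# not correspond to the AToM3 API.
class ASGNode:
    def __init__(self, node_type, **attributes):
        self.node_type = node_type          # e.g. "Watch", "Button"
        self.attributes = dict(attributes)  # e.g. {"isActive": False}

class ASGEdge:
    def __init__(self, label, source, target):
        self.label = label                  # association/role name, e.g. "watch"
        self.source = source
        self.target = target

class ASG:
    def __init__(self):
        self.nodes, self.edges = [], []

    def add_node(self, node):
        self.nodes.append(node)
        return node

    def connect(self, label, source, target):
        self.edges.append(ASGEdge(label, source, target))

# A tiny instance model: a watch with a single button.
asg = ASG()
watch = asg.add_node(ASGNode("Watch"))
button = asg.add_node(ASGNode("Button", type="topright", isActive=False))
asg.connect("watch", button, watch)
</pre>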
Once the syntactic correctness of a model has been established, its meaning must be specified. This meaning must be unique and precise. Meaning can be expressed by specifying a semantic mapping function which maps every model in a language onto an element in a semantic domain. Often meaning is given in an operational way by specifying how to execute models in a language. In our approach, we consider an application with a reactive, visual User Interface to be a single element of a language. The application has concrete visual syntax, abstract syntax (the essential structures) and semantics (the behaviour). The Cameleon framework [1] makes a similar distinction, focused on the user interface domain, between final UI, concrete UI, abstract UI, task and domain. Figure 1 shows concrete and abstract syntax as well as the relation between them explicitly. Semantics is distributed over all of these, as reactive behaviour may need to be specified for both the concrete syntax entities and for the abstract syntax entities.
Figure 1: The Process

The figure shows three levels. The top level will not be used in our digital watch example. It uses Domain-Specific Formalisms (DSF) to optimally model structure and behaviour of the different parts of the application. These models are subsequently transformed into models in more general-purpose formalisms such as Class Diagrams (CD) and Statecharts (SC). The models thus obtained specify not a single application but rather a class of applications. For example, in a class diagram, a multiplicity * may be used to indicate the number of allowed buttons in a digital watch. The models may be used to synthesize a visual and interactive analysis and simulation environment. Using such an environment, the modeller may come up with a refinement of the model, thus specifying a smaller language. The above multiplicity may for example be refined to 4, indicating that the final application must have exactly four buttons. This refined model can then be used to synthesize application code.
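As a purely illustrative sketch of such a refinement, the check below verifies that an instance model respects a button multiplicity refined from * to exactly four; the helper and the instance representation are hypothetical and not part of the actual synthesis machinery.

<pre>
# Hypothetical check of a refined multiplicity constraint: the design model
# allows 0..* Buttons per Watch, the refined model demands exactly 4.
REFINED_BUTTON_MULTIPLICITY = 4

def conforms_to_refinement(watch_instance):
    """True if the instance model has exactly the refined number of buttons."""
    return len(watch_instance.get("buttons", [])) == REFINED_BUTTON_MULTIPLICITY

watch_instance = {"buttons": ["topleft", "topright", "bottomleft", "bottomright"]}
assert conforms_to_refinement(watch_instance)
</pre>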
3. DIGITAL WATCH EXAMPLE
In this section, we use the example of a digital watch application to illustrate the various steps in the process. On the abstract syntax side, the essential parts of structure and behaviour of a digital watch are depicted in Figure 2. Due to space restrictions, the figure does not explicitly show behaviour for all elements of the model. Only for the Button class, Figure 3 shows the associated behaviour in the form of a Statechart. The simple statechart has two states: ButtonOn and ButtonOff. Transitions between these states are triggered by an event or trigger T:, but only if the guard G: is True. When taken, a transition has an action A: as side-effect. Guards and actions can refer to attributes of the Button class (and, at the instance level, object) whose behaviour is described. Note that as this model is on the Abstract Syntax side, it does not contain nor refer to any concrete (visual) information. Rather, a press or release event received (from the concrete syntax) is forwarded to the Watch object of which the button is part. The Watch (known under the role name watch here) will subsequently take appropriate action.

Figure 3: Button Structure and Behaviour
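To make the T:/G:/A: notation of Figure 3 concrete, the following is a hand-written Python sketch of the abstract Button behaviour: two states, with press and release transitions whose guard records the change in the abstract syntax and whose action forwards the event to the Watch. It is an illustrative reading of the figure, not the code emitted by our generator, and it treats the guard expression as a call that reports success.

<pre>
# Illustrative rendering of the Button Statechart of Figure 3: states
# ButtonOff/ButtonOn, transitions triggered by press/release, a guard that
# records the change in the abstract syntax, and an action that forwards
# the event to the Watch (role name "watch"). Not generated code.
class Button:
    def __init__(self, abstract, watch, button_type):
        self.abstract = abstract    # AbstractSyntax facade offering set(id, attr, value)
        self.watch = watch          # the Watch object this button is part of
        self.type = button_type
        self.isActive = False
        self.state = "ButtonOff"    # initial state

    def event(self, trigger):
        if self.state == "ButtonOff" and trigger == "press":        # T: press
            if self.abstract.set(self.type, "isActive", True):      # G: guard
                self.isActive = True
                self.state = "ButtonOn"
                self.watch.event(self.type + "press")               # A: action
        elif self.state == "ButtonOn" and trigger == "release":     # T: release
            if self.abstract.set(self.type, "isActive", False):     # G: guard
                self.isActive = False
                self.state = "ButtonOff"
                self.watch.event(self.type + "release")             # A: action
</pre>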
On the concrete syntax side, a button needs to be visualized. This is done by means of a Button2D visual object which in this case can turn gray or green to indicate the state of the abstract Button. The actual changing of colour is done by calling upon the methods setGray and setGreen of Button2D. In our concrete prototype implementation, we use Scalable Vector Graphics (SVG) to render visual objects. As such, Button2D specializes SVGObject, and setGray and setGreen are actually implemented by means of SVG instructions. Figure 4 shows the structure and behaviour of Button2D.

Figure 4: Button2D Structure and Behaviour
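One way Button2D could specialize SVGObject is sketched below, with setGray and setGreen realized as updates of an SVG fill attribute; the SVGObject stand-in and the attribute names are assumptions for illustration, not the prototype's actual rendering code.

<pre>
# Illustrative Button2D specializing SVGObject; colour changes are realized
# as SVG attribute updates. SVGObject here is a stand-in for the prototype's
# rendering layer, which pushes the change to the browser.
class SVGObject:
    def __init__(self, id, svg_data=None):
        self.id = id
        self.SVGData = dict(svg_data or {})   # e.g. {"fill": "gray"}

    def set(self, attribute, value):
        # In the prototype this becomes an SVG DOM update sent to the
        # browser; here we only record the new attribute value.
        self.SVGData[attribute] = value

class Button2D(SVGObject):
    def __init__(self, id):
        super().__init__(id, {"fill": "gray"})
        self.isActive = False

    def setGray(self):
        self.set("fill", "gray")

    def setGreen(self):
        self.set("fill", "green")
</pre>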
Figure 5 depicts how abstract and concrete models are linked using a ButtonA2CS mediator which keeps both views consistent. Consistency must be guaranteed in both directions as (1) interactively, the concrete syntax may be manipulated, which must be propagated to the abstract level, and (2) changes may occur at the abstract level (such as a time update) which need to be reflected/visualized at the concrete level.

Figure 5: Abstract and Concrete Syntax in Synch
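A mediator in the spirit of ButtonA2CS can be sketched as follows; the two methods correspond to directions (1) and (2) above, and the method names and colour mapping are illustrative assumptions rather than the actual implementation.

<pre>
# Illustrative ButtonA2CS mediator keeping the abstract Button and the
# concrete Button2D consistent in both directions.
class ButtonA2CS:
    def __init__(self, abstract_button, concrete_button):
        self.abstract = abstract_button   # abstract-syntax Button
        self.concrete = concrete_button   # concrete-syntax Button2D

    def to_abstract(self, trigger):
        # (1) interaction on the concrete side (press/release) is
        # propagated to the abstract level.
        self.abstract.event(trigger)

    def to_concrete(self, is_active):
        # (2) a change at the abstract level is reflected visually; which
        # colour encodes which state is a rendering choice.
        if is_active:
            self.concrete.setGreen()
        else:
            self.concrete.setGray()
</pre>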
Figure 2: Digital Watches Abstract Syntax
Some of the classes in the abstract syntax have no concrete representation. Other classes are related to concrete visual representations (vector graphics drawings). Some associations in the abstract syntax are related to connection splines connecting the visual representations of the classes. Another alternative is to relate abstract syntax associations to geometric or topological relations such as insideness or relative positioning on the concrete side.

Once the above models have been built, an appropriate meta-modelling and model transformation tool such as AToM3 [3, 4] can be used to synthesize an interactive modelling and simulation environment as shown in Figure 6. Synthesis is possible thanks to the information available in both the Class Diagrams and (Rhapsody) Statecharts. Note that our current code generator generates Python code, which explains the syntax of guards and actions in the Statecharts.

Figure 6: Simulating language elements in AToM3

As the models up to now still leave a lot of freedom (in multiplicities at the abstract side and in visual layout on the concrete side), the modelling and simulation environment allows the modeller to experiment with various model refinements. A modeller might for example decide to create a watch which only shows time, and has no chrono nor alarm. Referring back to our discussion about modelling languages, the various alternative models are all sentences in the language defined by the design model. As such, the synthesized modelling and simulation environment is a highly domain-specific visual modelling environment (DSVME). It is interesting to note that the modeller refines the design by manipulating concrete visual (instance) objects.
Eventually, after refinements are complete, an actual application can be synthesized as shown in Figure 7. In our prototype we synthesize a real-time, reactive application whose user-interaction part runs in an SVG-rendering capable browser such as Firefox. We use an "Asynchronous JavaScript + CSS + DOM + XMLHttpRequest" (AJAX) [6] framework supporting both push and pull interaction as shown in Figure 8. As XMLHttpRequest can only be initiated from the browser side, we need a means of "pushing" information from the abstract side to the browser. This is particularly necessary for the autonomous, timed digital watch application where time gets updated every second on the abstract side (as specified by the Statechart of the Time class in Figure 2). This "push" is achieved by polling the server (every 50 ms) from the JavaScript JS_Eval_Poll inside the browser.

Figure 7: Synthesized Application in Browser
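On the server side, the push-by-polling scheme of Figure 8 amounts to queueing SVG update requests until the browser's next poll. The sketch below, built on Python's standard http.server module, is a simplified illustration of that idea and not the prototype's actual HTTPServer/RequestMediator/Queue code.

<pre>
# Simplified illustration of "push by polling": abstract-side updates are
# queued as SVG requests and handed to the browser on its next GET.
# This is a sketch, not the prototype's HTTPServer/RequestMediator/Queue code.
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

queued_svg_requests = []          # pending updates for the browser
queue_lock = threading.Lock()

def push_update(svg_request):
    """Called from the abstract side, e.g. every second by the Time Statechart."""
    with queue_lock:
        queued_svg_requests.append(svg_request)

class PollHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # The browser polls; return and clear everything queued so far.
        with queue_lock:
            updates = list(queued_svg_requests)
            queued_svg_requests.clear()
        body = json.dumps(updates).encode("utf-8")
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    push_update({"id": "display", "attribute": "text", "value": "12:00:00"})
    HTTPServer(("localhost", 8000), PollHandler).serve_forever()
</pre>

In the actual framework, JS_Eval_Poll issues the corresponding XMLHttpRequest every 50 ms and applies the returned SVG instructions to the rendered document.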
Figure 8: AJAX Framework
4. RELATED WORK
An example of the use of various formalisms for the specification of context-sensitive interactive applications is given in [12]. Behaviour and structure of the UI are modelled, and XForms and XHTML are then generated for the final application. Myers [10] describes the UI challenges and the difficulty of UI and behaviour separation. This problem is exacerbated when, in addition, the application logic is reactive, due to the complexity of the callback structure. [5] gives a brief high-level overview of a UI synthesizer, whereas [8] discusses abstract UI to concrete UI synthesis on multiple platforms. [11] goes deeper into the issues of having different models at the presentation, dialog, and application levels. The closest to our approach is reactive animation [7], which links application behaviour to rendering and reactivity to interactive input. It uses Flash rather than SVG.

5. REFERENCES
[1] G. Calvary, J. Coutaz, D. Thevenin, Q. Limbourg, L. Bouillon, and J. Vanderdonckt. A unifying reference framework for multi-target user interfaces. Interacting with Computers, 15(3):289–308, 2003.
[2] G. Costagliola, A. D. Lucia, S. Orefice, and G. Polese. A classification framework to support the design of visual languages. J. Vis. Lang. Comput., 13(6):573–600, 2002.
[3] J. de Lara and H. Vangheluwe. AToM3: A tool for multi-formalism and meta-modelling. In FASE '02: Proceedings of the 5th International Conference on Fundamental Approaches to Software Engineering, pages 174–188, London, UK, 2002. Springer-Verlag.
[4] J. de Lara and H. Vangheluwe. Defining visual notations and their manipulation through meta-modelling and graph transformation. J. Vis. Lang. Comput., 15(3-4):309–330, 2004.
[5] J. Falb, R. Popp, T. Rock, H. Jelinek, E. Arnautovic, and H. Kaindl. Using communicative acts in high-level specifications of user interfaces for their automated synthesis. In ASE '05: Proceedings of the 20th IEEE/ACM International Conference on Automated Software Engineering, pages 429–430, New York, NY, USA, 2005. ACM Press.
[6] J. J. Garrett. Ajax: A new approach to web applications. http://www.adaptivepath.com/publications/essays/archives/000385.php, 2005.
[7] D. Harel, S. Efroni, and I. R. Cohen. Reactive animation. In FMCO, pages 136–153, 2002.
[8] G. Mori, F. Paterno, and C. Santoro. Design and development of multidevice user interfaces through multiple logical descriptions. IEEE Trans. Softw. Eng., 30(8):507–520, 2004.
[9] P. J. Mosterman and H. Vangheluwe. Computer Automated Multi-Paradigm Modeling: An Introduction. Simulation: Transactions of the Society for Modeling and Simulation International, 80(9):433–450, September 2004. Special Issue: Grand Challenges for Modeling and Simulation.
[10] B. A. Myers. User interface software tools. ACM Trans. Comput.-Hum. Interact., 2(1):64–103, 1995.
[11] K. Stirewalt and S. Rugaber. Automating UI generation by model composition. In ASE '98: Proceedings of the 13th IEEE International Conference on Automated Software Engineering, page 177, Washington, DC, USA, 1998. IEEE Computer Society.
[12] J. Van den Bergh and K. Coninx. CUP 2.0: High-level modeling of context-sensitive interactive applications. In Proceedings of the ACM/IEEE 9th International Conference on Model Driven Engineering Languages and Systems, LNCS, Genoa, Italy, October 2006. Springer. Accepted.