<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta />
    <article-meta>
      <title-group>
        <article-title>Self-Explanatory User Interfaces by Model-Driven Engineering</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Alfonso García Frey</string-name>
          <email>Alfonso.Garcia-Frey@imag.fr</email>
          <xref ref-type="aff" rid="aff1">1</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Gaëlle Calvary</string-name>
          <email>Gaelle.Calvary@imag.fr</email>
          <xref ref-type="aff" rid="aff1">1</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Sophie Dupuy-Chessa</string-name>
          <email>Sophie.Dupuy@imag.fr</email>
          <xref ref-type="aff" rid="aff1">1</xref>
        </contrib>
        <aff id="aff1">
          <label>1</label>
          <institution>University of Grenoble</institution>
          ,
          <addr-line>CNRS, LIG, 385 avenue de la Bibliothèque, 38400 Saint-Martin-d'Hères</addr-line>
          ,
          <country country="FR">France</country>
        </aff>
      </contrib-group>
      <pub-date>
        <year>2009</year>
      </pub-date>
      <abstract>
        <p>Modern User Interfaces (UIs) must deal with the increasing complexity of applications as well as new features such as the capacity of UIs to be dynamically adapted to the context of use. Complexity does not necessarily imply better quality, so it becomes necessary to help users understand UIs. This paper describes on-going research about Self-Explanatory User Interfaces (SE-UIs) by Model-Driven Engineering (MDE). Self-explanation refers to the capacity of a UI to provide the end-user with information about its rationale (the purpose of the UI), its design rationale (why is the UI structured into this set of workspaces? what is the purpose of this button?), its current state (why is the menu disabled?) as well as the evolution of that state (how can I enable this feature?). Explanations are provided by embedded models. We explore model-driven engineering to understand why and how this approach can successfully overcome shortcomings in UI quality.</p>
      </abstract>
      <kwd-group>
        <kwd>Self-Explanatory User Interfaces</kwd>
        <kwd>UI quality</kwd>
        <kwd>help</kwd>
        <kwd>design rationale</kwd>
        <kwd>model-driven engineering</kwd>
        <kwd>model transformation</kwd>
      </kwd-group>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>INTRODUCTION</title>
      <p>
        Motivation
On the one hand, most software is too hard to use. “Modern
applications such as Microsoft Word have many automatic
features and hidden dependencies that are frequently helpful
but can be mysterious to both novice and expert users” [
        <xref ref-type="bibr" rid="ref15">15</xref>
        ].
Users may require assistance while interacting with a User
Interface (UI). Ideally, the UI should guide the user in
accomplishing a task the application was designed for. The user
can request help about functionality, features, or any
information about the task being performed.
The UI must be able to provide the correct answer, giving
the necessary information to the user in an appropriate
format. This can take place at any time during the interaction
between the user and the UI. However, modern
applications cover only a few of the questions the user may have, or
provide general help instead of a clear and concise answer
to a given question. Furthermore, help is created ad hoc, that
is, it has been generated beforehand and cannot cover
new questions at run-time because they were not anticipated
by the designers. UI design problems are not covered at all,
because the designers are not aware of them.
      </p>
      <p>
        Moreover, the UI must deal with users having different levels
of expertise. Many long-time users never master
common procedures [
        <xref ref-type="bibr" rid="ref6">6</xref>
        ], and in other cases users must work hard
to figure out each feature or screen [
        <xref ref-type="bibr" rid="ref6">6</xref>
        ].
      </p>
      <p>
        The problem is greater for plastic UIs [
        <xref ref-type="bibr" rid="ref19 ref5">5, 19</xref>
        ]. Plastic UIs
demand dynamic adaptation of help systems as well, because
developers can no longer afford to consider all the
different contexts of use one by one, coding every possible ad-hoc
solution by hand. This complicates the prediction of the
result and the final quality, making design choices difficult.
As a result, dynamic solutions are also required for help
systems. These help systems must now be aware of the context
of use (user, platform and environment), the task, and the
structure and presentation of the UI.
      </p>
      <p>
        MDE and MB-UIDE approaches
On the other hand, Model-Driven Engineering (MDE) has existed
for a long time and has only recently been applied to the
engineering of UIs. It consists of describing different features of UIs
(e.g., task, domain, context of use) in models from which a
final UI is produced [
        <xref ref-type="bibr" rid="ref18">18</xref>
        ] according to a forward engineering
process. MDE of UIs is assumed to be superior to
previous Model-Based User Interface Development Environments
(MB-UIDEs) since it makes the UI design knowledge explicit
and external, for instance as model-to-model transformations
and model-to-code compilation rules [
        <xref ref-type="bibr" rid="ref2">2</xref>
        ]. However, neither the UIs
automatically generated by MB-UIDEs nor the final UIs produced by MDE have
sufficient quality, forcing designers to manually tweak the
generated UI code [
        <xref ref-type="bibr" rid="ref2">2</xref>
        ]. Design knowledge cannot always be
explicitly represented in the models, but it has the potential to
help end-users. Some models, for instance the task model,
have this potential explicitly represented, and they can
also contribute to guiding and helping the user.
This research will study how Self-Explanatory User
Interfaces (SE-UIs) can be built using MDE. An SE-UI is a
UI with the capacity to understand its own rationale and,
consequently, the ability to answer questions about
it. We aim to provide a method for creating SE-UIs by
analyzing the relations between the different levels of abstraction
in our MDE-compliant approach for developing UIs, as well
as the different models presented in the UsiXML
specification and their relations. Complementary views of the UI
are also considered in this research.
      </p>
      </p>
      <p>The rest of the paper presents the related work and our
contribution to the field.</p>
      <p>RELATED WORK
The two major areas involved in our self-explanation
approach are MDE and UI quality. The related work in the
next two sections sets the basis for our
contribution.</p>
      <p>
        MDE
The Cameleon Reference Framework [
        <xref ref-type="bibr" rid="ref4">4</xref>
        ] presented an
MDE-compliant approach for developing UIs consisting of four
different levels of abstraction: Task Model, Abstract User
Interface, Concrete User Interface and Final User Interface.
These levels correspond, in terms of MDE, to the
Computing-Independent Model (CIM), Platform-Independent Model
(PIM), Platform-Specific Model (PSM) and the code
level, respectively. Many transformation engines for UI development
have been created for Model-Driven Development (MDD).
Several research efforts have addressed the mapping
problem for supporting MDD of UIs: Teresa [
        <xref ref-type="bibr" rid="ref14">14</xref>
        ], ATL [
        <xref ref-type="bibr" rid="ref10">10</xref>
        ],
oAW [
        <xref ref-type="bibr" rid="ref10">10</xref>
        ] and UsiXML [
        <xref ref-type="bibr" rid="ref17">17</xref>
        ] among others. A comparative
analysis can be found in [
        <xref ref-type="bibr" rid="ref9">9</xref>
        ]. Semantic networks have
also been applied to UIs [
        <xref ref-type="bibr" rid="ref8">8</xref>
        ]. The Meta-UI concept was
first proposed in [
        <xref ref-type="bibr" rid="ref7">7</xref>
        ] and explored in depth in many later
works. In one of them [
        <xref ref-type="bibr" rid="ref16">16</xref>
        ], the concept of Mega-UI is
studied, introducing Extra-UIs, which allow a new degree of control
through views over the (meta-)models. We will focus
on it later, as these views are relevant for the explanation of
the UI and consequently for the end-user’s comprehension.
UI Quality
Help systems have been extensively studied. One of the most
relevant works is the Crystal application framework [
        <xref ref-type="bibr" rid="ref15">15</xref>
        ].
Inspired by the Whyline research [
        <xref ref-type="bibr" rid="ref11">11</xref>
        ], “Crystal” provides an
architecture and interaction techniques that allow
programmers to create applications that let the user ask a wide
variety of questions about why things did and did not happen,
and how to use the related features of the application,
without using natural language [
        <xref ref-type="bibr" rid="ref15">15</xref>
        ]. Even if this approach does
not cover the capacity of adaptation to different contexts of
use, it represents an important improvement in quality for
the end-user in terms of achieved value. Quality can be
improved not only with regard to achieved value, but also from
the perspectives of software features and interaction
experiences [
        <xref ref-type="bibr" rid="ref12">12</xref>
        ]. The integration of Usability Evaluation Methods
(UEMs) [
        <xref ref-type="bibr" rid="ref13">13</xref>
        ] into an MDA process has been proved
feasible in [
        <xref ref-type="bibr" rid="ref1">1</xref>
        ]. In particular, evaluation at the PIM or PSM level
should be done iteratively, applying different UEMs (e.g.,
heuristic evaluation, usability tests, etc.), until the
concerned models reach the required level of
usability. A set of ergonomic criteria for the evaluation of
Human-Computer Interaction (HCI) can be found in [
        <xref ref-type="bibr" rid="ref3">3</xref>
        ].
This research improves the quality of help systems by allowing a
new range of questions. Adaptation to the context of use
is now considered, since SE-UIs understand their own
rationale.
      </p>
      <p>CONTRIBUTION
End-User’s point of view
The goal of this work is to study how SE-UIs can be built by
MDE. One way to explore SE-UIs involves the task
model and its rationale. A task model describes the user’s
task in terms of objectives and procedures. Procedures
recursively decompose tasks into subtasks until one or more
elementary tasks are reached, i.e., tasks that can be
decomposed into physical actions only (“press the button”).
A task model is then well defined by the following terms:</p>
    </sec>
    <sec id="sec-2">
      <title>Nodes Containing abstract tasks</title>
    </sec>
    <sec id="sec-3">
      <title>Leaves Special nodes containing elementary tasks</title>
      <p>Branches Expressing logical and temporal relations
between tasks, subtasks and elementary tasks
The explicit information contained in the branches can
help and guide the end-user, answering questions related to
different aspects of the UI. For instance, regarding the
rationale of the UI, questions like what is the purpose of the
UI? can be successfully answered; likewise, questions such as why
is the UI structured into this set of workspaces? or what is
the purpose of this button? can be explained by understanding
the relations of the design rationale. The current state of the
UI, and consequently the state of the application, can raise
a different kind of question for the end-user, for instance
why is the menu disabled?, as well as questions related to
the overall progress of a task or questions about the
evolution of the current state of the application, for example
how can I enable this feature? Answers to all of them can
be obtained by exploring tasks and subtasks (nodes),
elementary tasks (leaves) and the relations between them (branches) in
the task model.</p>
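      <p>The node/leaf/branch vocabulary above can be sketched as a small data structure. This is a hypothetical illustration for the reader, not the actual UsiXML task-model format; the names and the enabling operator are assumptions.</p>

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Task:
    """A node of the task model: abstract tasks carry subtasks,
    elementary tasks (leaves) decompose into physical actions only."""
    name: str
    description: str = ""
    # Branch: logical/temporal relation over the subtasks
    # (e.g. ">>" for enabling); hypothetical notation.
    operator: Optional[str] = None
    subtasks: List["Task"] = field(default_factory=list)

    def is_elementary(self) -> bool:
        # Leaves are the nodes with no further decomposition.
        return not self.subtasks

# A fragment of the paper's example: "Specify identity" decomposes
# into two elementary subtasks.
specify_identity = Task(
    name="Specify identity",
    description="your personal information",
    operator=">>",
    subtasks=[
        Task("Specify first name", "the first name"),
        Task("Specify last name", "the last name"),
    ],
)
```

      <p>Questions about rationale, state and state evolution would then be answered by walking this tree: nodes for workspaces, leaves for widgets, branches for the logical and temporal relations.</p>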
      <p>
        This work will also study how different views of the model,
centered on extra-UIs, can help the end-user to understand
the UI. An extra-UI [
        <xref ref-type="bibr" rid="ref16">16</xref>
        ] is a UI which represents and gives
control of a UI through a model. It is, in a sense, the UI of
the configuration of a UI. These views can improve the
end-user’s comprehension, as they are relevant for the explanation
of the UI. Extra-UIs provide a new degree of control over the
(meta-)models of the UI; both designer and end-user can see
and understand how tasks are decomposed and how tasks
are represented in a specific UI, in other words, how the UI
mediates the interaction between the application and the
user. Designers can express this interaction in the form
of relations between tasks and elements of the final UI with
the method explained in the next section.
      </p>
      <p>Designer’s point of view
This work will explore a method that provides designers with
a technique to add self-explanation to final UIs, specifying
how end-user tasks are directly related to the final UI level.
The method consists of four steps:
1. Specify the final UI of the model-compliant application
that will be extended with SE functionality.</p>
    </sec>
    <sec id="sec-4">
      <title>2. Define the task model of the application.</title>
      <p>3. Specify the relations between both the task model and the
final UI.
4. A new final SE-UI will be generated from these relations,
adding SE functionality in real-time.</p>
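      <p>The relations of step 3 could be recorded as a simple mapping between task names and final-UI widget identifiers. This is a minimal sketch with hypothetical names; the method itself expresses these relations graphically rather than in code.</p>

```python
from typing import Dict, List, Optional

# Hypothetical record of step 3: each task is linked to the
# final-UI widget ids it is connected to in the editor.
task_to_widgets: Dict[str, List[str]] = {
    "Specify identity": ["lbl_first", "txt_first", "lbl_last", "txt_last"],
    "Specify first name": ["lbl_first", "txt_first"],
    "Specify last name": ["lbl_last", "txt_last"],
}

def widgets_for(task_name: str) -> List[str]:
    """Look up the final-UI elements linked to a task
    (step 4 generates the SE-UI from these relations)."""
    return task_to_widgets.get(task_name, [])

def task_for(widget_id: str) -> Optional[str]:
    """Inverse lookup: which task does a clicked widget belong to?
    Prefer the most specific (smallest) group containing the widget."""
    candidates = [t for t, ws in task_to_widgets.items() if widget_id in ws]
    return min(candidates, key=lambda t: len(task_to_widgets[t]), default=None)
```

      <p>The inverse lookup is what a help mode would need: from a clicked widget back to the task whose rationale explains it.</p>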
      <p>To support this method, we will supply designers with an
editor in which task models and final UIs can be created.
Both will coexist in the same
workspace inside this editor. Once the task model and the
final UI are represented, the designer will draw direct
connections between elements of the task model and elements
of the final UI, linking, for instance, widgets with subtasks,
as we can see in figure 1. Here, the task called Specify
identity is visually connected to a group of widgets containing
two labels and two input fields. Then, the elementary task
Specify first name, which is also a subtask, is connected to a
subgroup of two widgets, one label and one input field.
The purpose of the method is to allow designers to
specify direct relations between tasks and different elements of
the final UI. The main advantage for designers is that
there is no longer any need for a deep comprehension of
all the model-to-model and model-to-code transformations
between the four levels of MDE. A visual
representation gives direct information about these relations, because
connections are explicitly represented in a visual rendering in
which the final UI and the task model levels share the same
workspace.</p>
      <p>
        To allow end-user questions, this study will consider a help
button (figure 2) as a first approach. Other approaches can be
considered as well. By clicking this help button, the
application enters a help mode where the end-user can ask about
different elements of the UI just by clicking on them.
Answers will be generated in real-time in different ways. The
following section illustrates an example of this procedure.
Answering questions
This work will also study how different questions can be
answered. The first approach will associate a description with
each element (tasks, relations, widgets, etc.) of figure 1.
Other approaches, like semantic networks [
        <xref ref-type="bibr" rid="ref8">8</xref>
        ], can be considered
in the future. If the end-user wonders, for instance, Why
is the OK button disabled?, then by clicking on this button in
the special help mode, the system can say that the task is not
completed. In figure 2 the message is dynamically derived
from the relations of figure 1. For an edit box, the
application can say You must fill in + the description of the task, where
your personal information is the description. More
specific information can be generated by exploring the task model.
For instance, we can traverse all the subtasks of the
uncompleted task. In the example above, we can also answer that the
user needs to fill in the first name and the last name, because
both of these subtasks are uncompleted.
      </p>
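      <p>The answer-generation strategy just described, traversing the uncompleted subtasks of the task linked to the clicked widget, can be sketched as follows. The task names, descriptions and completion flags are hypothetical illustrations of the paper's example, not the actual implementation.</p>

```python
from typing import Dict, List

# Hypothetical task tree and completion state for the paper's example.
subtasks: Dict[str, List[str]] = {
    "Specify identity": ["Specify first name", "Specify last name"],
}
descriptions: Dict[str, str] = {
    "Specify identity": "your personal information",
    "Specify first name": "the first name",
    "Specify last name": "the last name",
}
completed: Dict[str, bool] = {
    "Specify first name": False,
    "Specify last name": False,
}

def explain_disabled(task: str) -> str:
    """Answer 'why is the OK button disabled?' by traversing the
    uncompleted subtasks of the task linked to the button."""
    pending = [t for t in subtasks.get(task, [])
               if not completed.get(t, True)]
    if not pending:
        # Elementary task (or no pending leaves): fall back to the
        # generic "You must fill in + description" message.
        return f"You must fill in {descriptions[task]}."
    # More specific answer derived from the uncompleted leaves.
    details = " and ".join(descriptions[t] for t in pending)
    return f"You must fill in {details}."

print(explain_disabled("Specify identity"))
# prints "You must fill in the first name and the last name."
```

      <p>Completing one leaf narrows the answer: once Specify first name is marked completed, the same call would mention only the last name.</p>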
      <p>CONCLUSION
This research takes a significant step forward in the
development of high-quality UIs. It explores MDE of UIs to provide
self-explanation at run-time, analyzing the four levels of the
MDE-compliant approach for developing UIs and the
different models presented in the UsiXML specification and
their relations. Complementary views of the UI are explored
in order to exploit these models, explaining the UI itself and
giving the user a new dimension of control through these views.
This opens up work on end-user programming.
ACKNOWLEDGMENTS
This work is funded by the European ITEA UsiXML project.</p>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>
          1.
          <string-name>
            <given-names>S.</given-names>
            <surname>Abrahão</surname>
          </string-name>
          , E. Iborra, and
          <string-name>
            <given-names>J.</given-names>
            <surname>Vanderdonckt</surname>
          </string-name>
          .
          <article-title>Maturing Usability, chapter Usability Evaluation of User Interfaces Generated with a Model-Driven Architecture Tool</article-title>
          , pages
          <fpage>3</fpage>
          -
          <lpage>32</lpage>
          . Human-Computer Interaction Series. Springer-Verlag,
          <year>2008</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>
          2.
          <string-name>
            <given-names>N.</given-names>
            <surname>Aquino</surname>
          </string-name>
          .
          <article-title>Adding flexibility in the model-driven engineering of user interfaces</article-title>
          .
          <source>In EICS '09: Proceedings of the 1st ACM SIGCHI symposium on Engineering interactive computing systems</source>
          , pages
          <fpage>329</fpage>
          -
          <lpage>332</lpage>
          , New York, NY, USA,
          <year>2009</year>
          . ACM.
        </mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>
          3.
          <string-name>
            <given-names>J. C.</given-names>
            <surname>Bastien</surname>
          </string-name>
          and
          <string-name>
            <given-names>D. L.</given-names>
            <surname>Scapin</surname>
          </string-name>
          .
          <article-title>Ergonomic criteria for the evaluation of human-computer interfaces</article-title>
          .
          <source>RT-0156</source>
          , INRIA, June
          <year>1993</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>
          4.
          <string-name>
            <given-names>G.</given-names>
            <surname>Calvary</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.</given-names>
            <surname>Coutaz</surname>
          </string-name>
          ,
          <string-name>
            <given-names>D.</given-names>
            <surname>Thevenin</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Q.</given-names>
            <surname>Limbourg</surname>
          </string-name>
          ,
          <string-name>
            <given-names>L.</given-names>
            <surname>Bouillon</surname>
          </string-name>
          , and
          <string-name>
            <given-names>J.</given-names>
            <surname>Vanderdonckt</surname>
          </string-name>
          .
          <article-title>A unifying reference framework for multi-target user interfaces</article-title>
          .
          <source>Interacting With</source>
          Computers Vol.
          <volume>15</volume>
          /3, pages
          <fpage>289</fpage>
          -
          <lpage>308</lpage>
          ,
          <year>2003</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>
          5.
          <string-name>
            <given-names>B.</given-names>
            <surname>Collignon</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.</given-names>
            <surname>Vanderdonckt</surname>
          </string-name>
          , and
          <string-name>
            <given-names>G.</given-names>
            <surname>Calvary</surname>
          </string-name>
          .
          <article-title>Model-driven engineering of multi-target plastic user interfaces</article-title>
          .
          <source>In Proc. of 4th International Conference on Autonomic and Autonomous Systems ICAS</source>
          <year>2008</year>
          , pages
          <fpage>7</fpage>
          -
          <lpage>14</lpage>
          ,
          <year>2008</year>
          . D. Greenwood, M. Grottke, H. Lutfiyya, and M. Popescu (eds.), IEEE Computer Society Press, Los Alamitos, Gosier, 16-21 March
          <year>2008</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>
          6. Microsoft Corporation.
          <article-title>Microsoft inductive user interface guidelines</article-title>
          ,
          <year>2001</year>
          . http://msdn.microsoft.com/en-us/library/ms997506.aspx.
        </mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>
          7.
          <string-name>
            <given-names>J.</given-names>
            <surname>Coutaz</surname>
          </string-name>
          .
          <article-title>Meta-user interfaces for ambient spaces</article-title>
          .
          <source>In Tamodia'06</source>
          ,
          <year>2006</year>
          . 8 pages.
        </mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>
          8.
          <string-name>
            <given-names>A.</given-names>
            <surname>Demeure</surname>
          </string-name>
          , G. Calvary,
          <string-name>
            <given-names>J.</given-names>
            <surname>Coutaz</surname>
          </string-name>
          , and
          <string-name>
            <given-names>J.</given-names>
            <surname>Vanderdonckt</surname>
          </string-name>
          .
          <article-title>Towards run time plasticity control based on a semantic network</article-title>
          .
          <source>In Fifth International Workshop on Task Models and Diagrams for UI design (TAMODIA'06)</source>
          , pages
          <fpage>324</fpage>
          -
          <lpage>338</lpage>
          ,
          <year>2006</year>
          . Hasselt, Belgium,
          <source>October 23-24</source>
          ,
          <year>2006</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref9">
        <mixed-citation>
          9. J. González Calleros,
          <string-name>
            <given-names>A.</given-names>
            <surname>Stanciulescu</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.</given-names>
            <surname>Vanderdonckt</surname>
          </string-name>
          ,
          <string-name>
            <surname>D. J.P.</surname>
          </string-name>
          , and
          <string-name>
            <given-names>M.</given-names>
            <surname>Winckler</surname>
          </string-name>
          .
          <article-title>A comparative analysis of transformation engines for user interface development</article-title>
          .
          <source>In Proc. of the 4th International Workshop on Model-Driven Web Engineering (MDWE</source>
          <year>2008</year>
          ), pages
          <fpage>16</fpage>
          -
          <lpage>30</lpage>
          , Toulouse, France,
          <year>2008</year>
          . CEUR Workshop Proceedings.
        </mixed-citation>
      </ref>
      <ref id="ref10">
        <mixed-citation>
          10.
          <string-name>
            <given-names>F.</given-names>
            <surname>Jouault</surname>
          </string-name>
          and
          <string-name>
            <given-names>I.</given-names>
            <surname>Kurtev</surname>
          </string-name>
          .
          <article-title>Transforming models with ATL</article-title>
          .
          <source>In Satellite Events at the MoDELS 2005 Conference</source>
          , volume
          <volume>3844</volume>
          of Lecture Notes in Computer Science, pages
          <fpage>128</fpage>
          -
          <lpage>138</lpage>
          , Berlin,
          <year>2006</year>
          . Springer Verlag.
        </mixed-citation>
      </ref>
      <ref id="ref11">
        <mixed-citation>
          11.
          <string-name>
            <given-names>A. J.</given-names>
            <surname>Ko</surname>
          </string-name>
          and
          <string-name>
            <given-names>B. A.</given-names>
            <surname>Myers</surname>
          </string-name>
          .
          <article-title>Designing the whyline: a debugging interface for asking questions about program behavior</article-title>
          .
          <source>In CHI '04: Proceedings of the SIGCHI conference on Human factors in computing systems</source>
          , pages
          <fpage>151</fpage>
          -
          <lpage>158</lpage>
          , New York, NY, USA,
          <year>2004</year>
          . ACM.
        </mixed-citation>
      </ref>
      <ref id="ref12">
        <mixed-citation>
          12.
          <string-name>
            <given-names>E.</given-names>
            <surname>Lai-Chong Law</surname>
          </string-name>
          ,
          <string-name>
            <given-names>E. T.</given-names>
            <surname>Hvannberg</surname>
          </string-name>
          , and
          <string-name>
            <given-names>G.</given-names>
            <surname>Cockton</surname>
          </string-name>
          .
          <article-title>Maturing Usability: Quality in Software, Interaction and Value</article-title>
          . Human-Computer Interaction Series. Springer-Verlag,
          <year>2008</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref13">
        <mixed-citation>
          13.
          <string-name>
            <given-names>E. L.</given-names>
            <surname>Law</surname>
          </string-name>
          ,
          <string-name>
            <given-names>E. T.</given-names>
            <surname>Hvannberg</surname>
          </string-name>
          , G. Cockton,
          <string-name>
            <given-names>P.</given-names>
            <surname>Palanque</surname>
          </string-name>
          ,
          <string-name>
            <given-names>D.</given-names>
            <surname>Scapin</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Springett</surname>
          </string-name>
          ,
          <string-name>
            <given-names>C.</given-names>
            <surname>Stary</surname>
          </string-name>
          , and
          <string-name>
            <given-names>J.</given-names>
            <surname>Vanderdonckt</surname>
          </string-name>
          .
          <article-title>Towards the maturation of IT usability evaluation (MAUSE)</article-title>
          .
          <source>In Human-Computer Interaction - INTERACT</source>
          <year>2005</year>
          , pages
          <fpage>1134</fpage>
          -
          <lpage>1137</lpage>
          .
          <year>2005</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref14">
        <mixed-citation>
          14.
          <string-name>
            <given-names>G.</given-names>
            <surname>Mori</surname>
          </string-name>
          ,
          <string-name>
            <given-names>F.</given-names>
            <surname>Paternò</surname>
          </string-name>
          , and
          <string-name>
            <given-names>C.</given-names>
            <surname>Santoro</surname>
          </string-name>
          .
          <article-title>Design and development of multidevice user interfaces through multiple logical descriptions</article-title>
          .
          <source>IEEE Trans. Softw</source>
          . Eng.,
          <volume>30</volume>
          (
          <issue>8</issue>
          ):
          <fpage>507</fpage>
          -
          <lpage>520</lpage>
          ,
          <year>2004</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref15">
        <mixed-citation>
          15.
          <string-name>
            <given-names>B. A.</given-names>
            <surname>Myers</surname>
          </string-name>
          ,
          <string-name>
            <given-names>D. A.</given-names>
            <surname>Weitzman</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A. J.</given-names>
            <surname>Ko</surname>
          </string-name>
          , and
          <string-name>
            <given-names>D. H.</given-names>
            <surname>Chau</surname>
          </string-name>
          .
          <article-title>Answering why and why not questions in user interfaces</article-title>
          .
          <source>In CHI '06: Proceedings of the SIGCHI conference on Human Factors in computing systems</source>
          , pages
          <fpage>397</fpage>
          -
          <lpage>406</lpage>
          , New York, NY, USA,
          <year>2006</year>
          . ACM.
        </mixed-citation>
      </ref>
      <ref id="ref16">
        <mixed-citation>
          16.
          <string-name>
            <given-names>J.-S.</given-names>
            <surname>Sottet</surname>
          </string-name>
          , G. Calvary,
          <string-name>
            <given-names>J.-M.</given-names>
            <surname>Favre</surname>
          </string-name>
          , and
          <string-name>
            <given-names>J.</given-names>
            <surname>Coutaz</surname>
          </string-name>
          .
          <article-title>Megamodeling and Metamodel-Driven Engineering for Plastic User Interfaces: Mega-UI</article-title>
          .
          <year>2007</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref17">
        <mixed-citation>
          17.
          <string-name>
            <given-names>J.</given-names>
            <surname>Vanderdonckt</surname>
          </string-name>
          .
          <article-title>A MDA-Compliant environment for developing user interfaces of information systems</article-title>
          .
          <source>In Advanced Information Systems Engineering</source>
          , pages
          <fpage>16</fpage>
          -
          <lpage>31</lpage>
          .
          <year>2005</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref18">
        <mixed-citation>
          18.
          <string-name>
            <given-names>J.</given-names>
            <surname>Vanderdonckt</surname>
          </string-name>
          .
          <article-title>Model-driven engineering of user interfaces: Promises, successes, failures, and challenges</article-title>
          .
          <source>In Proc. of 5th Annual Romanian Conf. on Human-Computer Interaction ROCHI'</source>
          <year>2008</year>
          , (
          <issue>Iasi</issue>
          ,
          <fpage>18</fpage>
          -
          <issue>19</issue>
          <year>September 2008</year>
          ), pp.
          <fpage>1</fpage>
          -
          <lpage>10</lpage>
          .
          Matrix ROM, Bucharest,
          <year>2008</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref19">
        <mixed-citation>
          19.
          <string-name>
            <given-names>J.</given-names>
            <surname>Vanderdonckt</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.</given-names>
            <surname>Coutaz</surname>
          </string-name>
          ,
          <string-name>
            <given-names>G.</given-names>
            <surname>Calvary</surname>
          </string-name>
          , and
          <string-name>
            <given-names>A.</given-names>
            <surname>Stanciulescu</surname>
          </string-name>
          .
          <article-title>Multimodality for Plastic User Interfaces: Models, Methods, and Principles</article-title>
          , chapter
          <volume>4</volume>
          , pages
          <fpage>61</fpage>
          -
          <lpage>84</lpage>
          .
          <year>2008</year>
          . D. Tzovaras (ed.),
          <source>Lecture Notes in Electrical Engineering</source>
          , Springer-Verlag, Berlin,
          <year>2007</year>
          .
        </mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>