Self-Explanatory User Interfaces by Model-Driven Engineering

Alfonso García Frey, Gaëlle Calvary and Sophie Dupuy-Chessa
University of Grenoble, CNRS, LIG
385, avenue de la Bibliothèque, 38400, Saint-Martin d'Hères, France
{Alfonso.Garcia-Frey, Gaelle.Calvary, Sophie.Dupuy}@imag.fr

ABSTRACT
Modern User Interfaces (UIs) must deal with the increasing complexity of applications as well as with new features such as the capacity of UIs to be dynamically adapted to the context of use. This complexity does not necessarily imply better quality. It therefore becomes necessary to help users understand UIs. This paper describes on-going research on Self-Explanatory User Interfaces (SE-UIs) by Model-Driven Engineering (MDE). Self-explanation refers to the capacity of a UI to provide the end-user with information about its rationale (what is the purpose of the UI?), its design rationale (why is the UI structured into this set of workspaces? what is the purpose of this button?), its current state (why is the menu disabled?) as well as the evolution of that state (how can I enable this feature?). Explanations are provided by embedded models. We explore model-driven engineering to understand why and how this approach can successfully overcome shortcomings of UI quality.

Author Keywords
Self-Explanatory User Interfaces, UI quality, help, design rationale, model-driven engineering, model transformation.

ACM Classification Keywords
H.5.2 User Interfaces: Theory and method.

INTRODUCTION

Motivation
On the one hand, most software is too hard to use. "Modern applications such as Microsoft Word have many automatic features and hidden dependencies that are frequently helpful but can be mysterious to both novice and expert users" [15]. Users may require assistance while interacting with a User Interface (UI). Ideally, the UI should guide the user in accomplishing a task the application was designed for. The user can request help about functionality, features, or any information about the process of the task being performed, and the UI must be able to provide the correct answer, giving the necessary information to the user in an appropriate format. This can take place at any time in the interaction between the user and the UI. However, modern applications cover only a few of the questions the user may have, or provide general help instead of a clear and concise answer to a given question. Furthermore, help is created ad hoc: it has been generated beforehand and cannot cover new questions at run-time, because they were not anticipated by the designers. UI design problems are not covered at all, because the designers are not aware of them.

Moreover, the UI must deal with users having different levels of expertise. Many long-time users never master common procedures [6] and, in other cases, users must work hard to figure out each feature or screen [6].

The problem is greater for plastic UIs [5, 19]. Plastic UIs demand dynamic adaptation of help systems too, because developers can no longer afford to consider all the different contexts of use one by one, coding every possible ad-hoc solution by hand. This complicates the prediction of the result and of the final quality, making design choices difficult. As a result, dynamic solutions are also required for help systems. These help systems must now be aware of the context of use (user, platform and environment), the task, and the structure and presentation of the UI.

MDE and MB-UIDE approaches
On the other hand, Model-Driven Engineering (MDE) has existed for a long time and has recently been applied to the engineering of UIs. It consists in describing different features of UIs
(e.g., task, domain, context of use) in models from which a final UI is produced [18] according to a forward engineering process. MDE of UIs is assumed to be superior to the previous Model-Based User Interface Development Environments (MB-UIDEs), since it makes UI design knowledge explicit and external, for instance as model-to-model transformations and model-to-code compilation rules [2]. However, neither the UIs automatically generated by MB-UIDEs nor the final UIs produced by MDE have sufficient quality, forcing designers to manually tweak the generated UI code [2]. Design knowledge cannot always be explicitly represented in the models, but it has the potential to help end-users. Some models, such as the task model, represent this potential explicitly, and they can also contribute to guiding and helping the user.

This research will study how Self-Explanatory User Interfaces (SE-UIs) can be built using MDE. An SE-UI is a UI with the capacity of understanding its own rationale and, consequently, the ability to answer questions about it. We aim to provide a method for creating SE-UIs by analyzing the relations between the different levels of abstraction in our MDE-compliant approach for developing UIs, as well as the different models presented in the UsiXML specification and their relations. Complementary views of the UI are also considered in this research.

The rest of the paper presents the related work and our contribution to the field.

RELATED WORK
The two major areas involved in our self-explanation approach are MDE and UI quality. The related work in the next two sections allows us to set up the basis of our contribution.

MDE
The Cameleon Reference Framework [4] presented an MDE-compliant approach for developing UIs consisting of four levels of abstraction: Task Model, Abstract User Interface, Concrete User Interface and Final User Interface. These levels correspond, in MDE terms, to the Computing-Independent Model (CIM), the Platform-Independent Model (PIM), the Platform-Specific Model (PSM) and the code level, respectively. In Model-Driven Development (MDD), many transformation engines for UI development have been created. Several research efforts have addressed the mapping problem for supporting MDD of UIs: Teresa [14], ATL [10], oAW [10] and UsiXML [17], among others. A comparative analysis can be found in [9]. Semantic networks have also been covered for UIs [8]. The Meta-UI concept was first proposed in [7] and explored in depth in many later works. One of them [16] studies the concept of Mega-UI, introducing Extra-UIs, which allow a new degree of control through views over the (meta-)models. We will focus on it later, as these views are relevant for the explanation of the UI and consequently for the end-user's comprehension.
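To fix intuitions about this four-level chain, the following minimal Python sketch mimics two of the transformation steps on a toy model. It is an assumption-laden illustration, not the API of Teresa, ATL, oAW or UsiXML: the dictionary-based models, function names and widget choices are ours, and the final code-generation step is omitted.

    # Purely illustrative sketch of a forward-engineering chain in the
    # spirit of the four Cameleon levels. All names are assumptions,
    # not part of any cited engine.

    def task_to_abstract_ui(task):
        """Task Model (CIM) -> Abstract UI (PIM): one workspace per task,
        one abstract interactor per subtask."""
        return {"workspace": task["name"],
                "interactors": [{"role": "input", "for": sub}
                                for sub in task.get("subtasks", [])]}

    def abstract_to_concrete_ui(aui):
        """Abstract UI (PIM) -> Concrete UI (PSM): commit to toolkit widgets."""
        return {"window": aui["workspace"],
                "widgets": [{"type": "text_field", "label": i["for"]}
                            for i in aui["interactors"]]}

    task_model = {"name": "Specify identity",
                  "subtasks": ["Specify first name", "Specify last name"]}
    print(abstract_to_concrete_ui(task_to_abstract_ui(task_model)))

The point of the chain is that each level is an explicit model, so the knowledge consumed by each transformation step remains available for explanation at run-time.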
UI Quality
Help systems have been extensively studied. One of the most relevant works is the Crystal application framework [15]. Inspired by the Whyline research [11], "Crystal" provides an architecture and interaction techniques that allow programmers to create applications that let the user ask a wide variety of questions about why things did and did not happen, and how to use the related features of the application, without using natural language [15]. Even if this approach does not cover the capacity of adaptation to different contexts of use, it represents an important improvement in quality for the end-user in terms of achieved value. Quality can be improved not only with regard to achieved value, but also from the perspectives of software features and interaction experiences [12]. The integration of Usability Evaluation Methods (UEMs) [13] into an MDA process has been proved feasible in [1]. In particular, the evaluation at the PIM or PSM level should be done iteratively: different UEMs (e.g., heuristic evaluation, usability tests, etc.) can be applied until the concerned models reach the required level of usability. A set of ergonomic criteria for the evaluation of Human-Computer Interaction (HCI) can be found in [3].

This research improves the quality of help systems by allowing a new range of questions. Adaptation to the context of use is now considered, since SE-UIs understand their own rationale.

CONTRIBUTION
The goal of this work is to study how SE-UIs can be built by MDE.

End-user's point of view
One of the ways to explore SE-UIs involves the task model and its rationale. A task model describes the user's task in terms of objectives and procedures. Procedures recursively decompose tasks into subtasks until one or more elementary tasks are reached, i.e., tasks which can be decomposed into physical actions only ("press the button"). A task model is then well defined by the following terms:

Nodes: containing abstract tasks.

Leaves: special nodes containing elementary tasks.

Branches: expressing logical and temporal relations between tasks, subtasks and elementary tasks.

The explicit information contained in the branches can help and guide the end-user by answering questions related to different aspects of the UI. For instance, regarding the rationale of the UI, questions like what is the purpose of the UI? can be successfully answered; likewise, questions such as why is the UI structured into this set of workspaces? or what is the purpose of this button? can be explained by understanding the relations of the design rationale. The current state of the UI, and consequently the state of the application, can trigger a different kind of question from the end-user, for instance why is the menu disabled?, as well as questions related to the overall progress of a task or to the evolution of the current state of the application, for example how can I enable this feature? Answers to all of them can be obtained by exploring tasks and subtasks (nodes), elementary tasks (leaves) and the relations between them (branches) in the task model.
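As an illustration of this vocabulary, here is a minimal Python sketch of such a task tree. It is an assumption, not a prescribed representation: the relation vocabulary shown ("enabling") and the task names are taken from the example developed later, and real task models are considerably richer.

    # Minimal sketch of the task-model terms above: nodes (abstract
    # tasks), leaves (elementary tasks, i.e., no subtasks) and branches
    # carrying a logical/temporal relation between subtasks.

    from dataclasses import dataclass, field

    @dataclass
    class Task:
        name: str
        relation: str = ""          # how the subtasks relate on the branch
        subtasks: list = field(default_factory=list)

        def is_elementary(self):    # a leaf of the task tree
            return not self.subtasks

    identity = Task("Specify identity", relation="enabling", subtasks=[
        Task("Specify first name"),  # elementary: one physical action
        Task("Specify last name"),
    ])

    # Branch information answers design-rationale questions such as
    # "why is the UI structured into this set of workspaces?":
    for sub in identity.subtasks:
        print(f"'{identity.name}' is decomposed into '{sub.name}' "
              f"(relation: {identity.relation})")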
This work will also study how different views of the models, centered on Extra-UIs, can help the end-user understand the UI. An Extra-UI [16] is a UI which represents, and gives control of, a UI through a model. It is, in a sense, the UI of the configuration of a UI. These views can improve the end-user's comprehension, as they are relevant for the explanation of the UI. Extra-UIs provide a new degree of control over the (meta-)models of the UI; both the designer and the end-user can see and understand how tasks are decomposed and how tasks are represented in a specific UI: in other words, how the UI mediates the interaction between the application and the user. Designers can express this interaction in the form of relations between tasks and elements of the final UI with the method explained in the next section.

Designer's point of view
This work will explore a method that provides designers with a technique to add self-explanation to final UIs by specifying how end-users' tasks are directly related to the final UI level. The method consists of four steps:

1. Specify the final UI of the model-compliant application that will be extended with SE functionality.

2. Define the task model of the application.

3. Specify the relations between the task model and the final UI.

4. Generate a new final SE-UI from these relations, adding SE functionality in real-time.

To support this method, we will supply designers with an editor in which task models and final UIs can be created. Both will coexist at the same time in the same workspace inside this editor. Once the task model and the final UI are represented, the designer draws direct connections between elements of the task model and elements of the final UI, linking, for instance, widgets with subtasks, as we can see in figure 1. Here, the task called Specify identity is visually connected to a group of widgets containing two labels and two input fields. Then, the elementary task Specify first name, which is also a subtask, is connected to a subgroup of two widgets: one label and one input field.

Figure 1. Association between UI and a task model.

The purpose of the method is to allow designers to specify direct relations between tasks and different elements of the final UI. The main advantage for designers is that, from now on, there is no need for a deep comprehension of all the model-to-model and model-to-code transformations between the four levels of MDE. A visual representation gives direct information about these relations, because the connections are explicitly represented in a visual render in which the final UI and the task model levels share the same workspace. A sketch of how such connections could be stored is given below.
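The following Python sketch illustrates one plausible run-time representation of the connections drawn in the editor, as an explicit task-to-widget mapping that the generated SE-UI can consult. All identifiers are invented for the figure 1 example.

    # Illustrative sketch of the designer's connections: each task of the
    # task model maps to the final-UI widgets it is linked to.

    connections = {
        "Specify identity":   ["lbl_first", "txt_first", "lbl_last", "txt_last"],
        "Specify first name": ["lbl_first", "txt_first"],
        "Specify last name":  ["lbl_last", "txt_last"],
    }

    def tasks_for_widget(widget_id):
        """Inverse lookup: which tasks is a clicked widget connected to?"""
        return [task for task, widgets in connections.items()
                if widget_id in widgets]

    print(tasks_for_widget("txt_first"))
    # ['Specify identity', 'Specify first name']

The inverse lookup is what makes the help mode described below possible: a click on any widget leads back to the tasks, and from there to the rationale recorded in the task model.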
To allow end-user questions, this study will consider a help button (figure 2) as a first approach; other approaches can be considered as well. By clicking this help button, the application enters a help mode in which the end-user can ask about different elements of the UI just by clicking on them. Answers will be generated in real-time in different ways. The following section illustrates an example of this procedure.

Answering questions
This work will also study how different questions can be answered. The first approach will associate a description with each element (tasks, relations, widgets, etc.) of figure 1. Other approaches, such as semantic networks [8], can be considered in the future. If the end-user wonders, for instance, why is the OK button disabled?, then by clicking on this button in the special help mode, the system can say that the task is not completed. In figure 2, the message is dynamically derived from the relations of figure 1. For an edit box, the application can say "You must fill in" + the description of the task, where your personal information is the description.

Figure 2. Help message derived from connections in Figure 1.

More specific information can be generated by exploring the task model. For instance, we can traverse all the subtasks of the uncompleted task. In the example above, we can also answer that the user needs to fill in the first name and the last name, because both of these subtasks are uncompleted.
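A minimal Python sketch of this derivation, under the assumption that each task carries a description and a completion flag (both invented here), might look as follows.

    # Illustrative sketch of deriving a figure 2 style answer at run-time
    # by traversing the uncompleted subtasks of the task connected to the
    # clicked widget. Descriptions and completion states are assumptions.

    descriptions = {"Specify identity":   "your personal information",
                    "Specify first name": "the first name",
                    "Specify last name":  "the last name"}
    completed = {"Specify first name": False, "Specify last name": False}

    def explain_disabled(task, subtasks):
        """Answer 'Why is the OK button disabled?' from the task model."""
        message = f"You must fill in {descriptions[task]}."
        pending = [sub for sub in subtasks if not completed.get(sub, False)]
        if pending:
            details = " and ".join(descriptions[sub] for sub in pending)
            message += f" You still need to fill in {details}."
        return message

    print(explain_disabled("Specify identity",
                           ["Specify first name", "Specify last name"]))
    # You must fill in your personal information. You still need to fill
    # in the first name and the last name.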
CONCLUSION
This research takes a significant step forward in the development of high-quality UIs. It explores MDE of UIs to provide self-explanation at run-time, analysing the four levels of the MDE-compliant approach for developing UIs and the different models presented in the UsiXML specification, together with their relations. Complementary views of the UI are explored in order to exploit these models, explaining the UI itself and giving the user a new dimension of control through these views. This opens up work on end-user programming.

ACKNOWLEDGMENTS
This work is funded by the European ITEA UsiXML project.

REFERENCES
1. S. Abrahão, E. Iborra, and J. Vanderdonckt. Usability evaluation of user interfaces generated with a model-driven architecture tool. In Maturing Usability, Human-Computer Interaction Series, pages 3-32. Springer-Verlag, 2008.
2. N. Aquino. Adding flexibility in the model-driven engineering of user interfaces. In EICS '09: Proceedings of the 1st ACM SIGCHI Symposium on Engineering Interactive Computing Systems, pages 329-332, New York, NY, USA, 2009. ACM.
3. J. C. Bastien and D. L. Scapin. Ergonomic criteria for the evaluation of human-computer interfaces. Technical Report RT-0156, INRIA, June 1993.
4. G. Calvary, J. Coutaz, D. Thevenin, Q. Limbourg, L. Bouillon, and J. Vanderdonckt. A unifying reference framework for multi-target user interfaces. Interacting with Computers, 15(3):289-308, 2003.
5. B. Collignon, J. Vanderdonckt, and G. Calvary. Model-driven engineering of multi-target plastic user interfaces. In Proc. of the 4th International Conference on Autonomic and Autonomous Systems (ICAS 2008), pages 7-14, Gosier, 16-21 March 2008. IEEE Computer Society Press, Los Alamitos.
6. Microsoft Corporation. Microsoft inductive user interface guidelines, 2001. http://msdn.microsoft.com/en-us/library/ms997506.aspx.
7. J. Coutaz. Meta-user interfaces for ambient spaces. In Tamodia '06, 2006. 8 pages.
8. A. Demeure, G. Calvary, J. Coutaz, and J. Vanderdonckt. Towards run time plasticity control based on a semantic network. In Fifth International Workshop on Task Models and Diagrams for UI Design (TAMODIA '06), pages 324-338, Hasselt, Belgium, October 23-24, 2006.
9. J. González Calleros, A. Stanciulescu, J. Vanderdonckt, J.-P. Delacre, and M. Winckler. A comparative analysis of transformation engines for user interface development. In Proc. of the 4th International Workshop on Model-Driven Web Engineering (MDWE 2008), pages 16-30, Toulouse, France, 2008. CEUR Workshop Proceedings.
10. F. Jouault and I. Kurtev. Transforming models with ATL. In Satellite Events at the MoDELS 2005 Conference, volume 3844 of Lecture Notes in Computer Science, pages 128-138, Berlin, 2006. Springer-Verlag.
11. A. J. Ko and B. A. Myers. Designing the Whyline: a debugging interface for asking questions about program behavior. In CHI '04: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, pages 151-158, New York, NY, USA, 2004. ACM.
12. E. Lai-Chong Law, E. T. Hvannberg, and G. Cockton. Maturing Usability: Quality in Software, Interaction and Value. Human-Computer Interaction Series. Springer-Verlag, 2008.
13. E. L. Law, E. T. Hvannberg, G. Cockton, P. Palanque, D. Scapin, M. Springett, C. Stary, and J. Vanderdonckt. Towards the maturation of IT usability evaluation (MAUSE). In Human-Computer Interaction - INTERACT 2005, pages 1134-1137, 2005.
14. G. Mori, F. Paternò, and C. Santoro. Design and development of multidevice user interfaces through multiple logical descriptions. IEEE Transactions on Software Engineering, 30(8):507-520, 2004.
15. B. A. Myers, D. A. Weitzman, A. J. Ko, and D. H. Chau. Answering why and why not questions in user interfaces. In CHI '06: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, pages 397-406, New York, NY, USA, 2006. ACM.
16. J.-S. Sottet, G. Calvary, J.-M. Favre, and J. Coutaz. Megamodeling and metamodel-driven engineering for plastic user interfaces: Mega-UI. 2007.
17. J. Vanderdonckt. A MDA-compliant environment for developing user interfaces of information systems. In Advanced Information Systems Engineering, pages 16-31, 2005.
18. J. Vanderdonckt. Model-driven engineering of user interfaces: promises, successes, failures, and challenges. In Proc. of the 5th Annual Romanian Conference on Human-Computer Interaction (ROCHI 2008), Iasi, 18-19 September 2008, pages 1-10. Matrix ROM, Bucharest, 2008.
19. J. Vanderdonckt, J. Coutaz, G. Calvary, and A. Stanciulescu. Multimodality for plastic user interfaces: models, methods, and principles. Chapter 4, pages 61-84 in D. Tzovaras (ed.), Lecture Notes in Electrical Engineering, Springer-Verlag, Berlin, 2007.