Preserving Designer Input on Concrete User Interfaces Using Constraints While Maintaining Adaptive Behavior

Pierre A. Akiki, Arosha K. Bandara, and Yijun Yu
Computing Department, The Open University
Walton Hall, Milton Keynes, United Kingdom
{pierre.akiki, a.k.bandara, y.yu}@open.ac.uk

Figure 1. Adding Constraints on the CUI as Part of the Process of Developing Adaptive Model-Driven UIs (Step 2) and Maintaining These Constraints When the Adaptation Engine Applies the Adaptive Behavior (Step 4)

ABSTRACT
User interface (UI) adaptation is applied when a single UI design might not be adequate for maintaining usability in multiple contexts-of-use, which can vary according to the user, platform, and environment. Fully-automated UI generation techniques have been criticized for not matching the ingenuity of human designers, and manual UI adaptation has also been criticized for being time consuming, especially when the UI must be adapted to a large number of contexts. This paper presents a work-in-progress approach that uses constraints to preserve designer input on concrete user interfaces when adaptive behavior is applied. The constraints can be assigned by the UI designer using our integrated development environment, Cedar Studio.

Author Keywords
Adaptive user interfaces; Designer input; Constraints; Concrete user interfaces; Model-driven engineering

ACM Classification Keywords
D.2.2 [Software Engineering]: Design Tools and Techniques – User interfaces; H.5.2 [Information Interfaces and Presentation]: User Interfaces – User-centered design

General Terms
Design; Human Factors

INTRODUCTION
User interface (UI) adaptation is applied when a single UI design might not be adequate for maintaining usability in multiple contexts-of-use, which can vary according to the user, platform, and environment. UI adaptation is labeled either adaptable, meaning that manual adaptation is required, or adaptive, indicating that adaptation is performed automatically. The literature offers a variety of UI adaptation techniques that adopt manual adaptation (adaptable UIs), such as "two interface design" [14] and "crowdsourced adaptation" [17], or automated adaptation (adaptive UIs), such as Supple [13] and the Personal Universal Controller [18].

Some researchers have criticized fully-mechanized UI construction in favor of applying the intelligence of human designers to achieve higher usability [21]. Adaptive UI behavior is also regarded by some as unpredictable and possibly disorienting for users [11]. Other researchers promote the use of adaptive behavior [5]. The automation provided by adaptive behavior saves development time, thereby reducing the cost of adapting user interfaces to multiple contexts-of-use.

The importance of obtaining a predictable outcome is emphasized due to its impact on the success of UI development techniques [16]. Some fully-automated approaches only allow designer input at a high level of abstraction, thereby decreasing the control and predictability of the outcome. Other approaches support lower-level input, such as control over the concrete widgets; nevertheless, upon applying adaptive behavior, the input made by the human designer is overridden.

In this paper, we present a work-in-progress technique that allows designers to assign UI constraints that are preserved after applying automated adaptive behavior.
The constraints embody the characteristics of the UI that require human ingenuity and are not met by fully-automated techniques.

The model-driven approach to user interface development has been promoted by many research works, such as the well-established CAMELEON reference framework [6]. CAMELEON represents user interfaces on multiple levels of abstraction: (1) Task Models, which can be represented as ConcurTaskTrees [20], and Domain Models, which can be represented as UML class diagrams; (2) the Abstract User Interface (AUI), which represents the UI independently of any modality (e.g., graphical, voice, etc.); (3) the Concrete User Interface (CUI), which represents the UI as concrete widgets (e.g., buttons, labels, etc.); and (4) the Final User Interface (FUI), which is the running UI rendered in a presentation technology. The model-driven approach to UI development can serve as a basis for devising adaptive UIs due to the possibility of applying different types of adaptations on the various levels of abstraction [2]. Of the levels of abstraction presented by CAMELEON, the CUI is given particular attention in this paper since it embodies the designer's ingenuity.

Designer input on the CUI is particularly promoted by the observation that it is better for the designer to manipulate a concrete object rather than its abstraction [9]. Following such recommendations, we can say that the designer should be allowed to create a CUI rather than have it completely generated from an abstract model. Yet even though some approaches offer designers the ability to create CUIs, upon applying adaptive UI behavior the designer's choices are bound to change according to the adaptation particular to a given context-of-use. Nevertheless, in certain cases designers would like to keep some UI characteristics intact. We think this could be achieved by providing non-technical UI designers with a simple technique for assigning constraints on the CUI.
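The four CAMELEON abstraction levels described above can be pictured as a chain of refinements. The following is only a minimal illustrative sketch of that chain, not part of CAMELEON or of our implementation; all class and field names are hypothetical.

```python
# Hypothetical sketch: each CAMELEON level refines the previous one.

class TaskModel:
    """Level 1: tasks, e.g., expressible as a ConcurTaskTree."""
    def __init__(self, tasks):
        self.tasks = tasks

class AbstractUI:
    """Level 2: modality-independent interaction units."""
    def __init__(self, interaction_units):
        self.interaction_units = interaction_units

class ConcreteUI:
    """Level 3: concrete widgets (buttons, labels, etc.)."""
    def __init__(self, widgets):
        self.widgets = widgets

class FinalUI:
    """Level 4: the running UI in a presentation technology."""
    def __init__(self, rendered):
        self.rendered = rendered

# One possible top-down derivation, one widget per task.
task = TaskModel(["enter invoice", "print invoice"])
aui = AbstractUI([f"input: {t}" for t in task.tasks])
cui = ConcreteUI([{"type": "TextBox", "label": u} for u in aui.interaction_units])
fui = FinalUI([f"<input title='{w['label']}'/>" for w in cui.widgets])
```

In such a chain, designer input on the third level (the CUI) is what an adaptation engine risks overwriting, which motivates the constraints proposed in this paper.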
These constraints could be taken into consideration and preserved at a later stage, when the UI is being automatically adapted to a particular context-of-use. The steps illustrated in Figure 1 show where our proposed technique fits in the process of developing adaptive model-driven UIs. The constraints are added by the designer in Step 2, after adjusting the CUI design. Later, in Step 4, when the adaptation engine applies the adaptive behavior, it preserves the designer's constraints.

The remainder of this paper is structured as follows. The next section briefly describes the related work. Then, an example is given to highlight the importance of preserving designer input on the CUI. Later, our approach to applying CUI constraints is described. Finally, the conclusions and future work are given.

RELATED WORK
By observing the literature, we can categorize UI adaptation approaches under the following categories:
• Adaptable UIs allow interested stakeholders to manually adapt the desired characteristics.
• Adaptive UIs automatically react to a change in the context-of-use by changing one or more of their characteristics using a predefined set of adaptation rules.
• Truly Adaptive UIs can automatically react to a change in the context-of-use, but are also capable of reacting to contexts-of-use that were previously unknown.

Adaptable UIs fully support manual designer input, which provides an advantage in terms of applying the knowledge of a human designer, but has a downside in terms of high development time. Both Adaptive and Truly Adaptive UIs provide a higher level of automation through the ability to adapt the UI using generic rules. Even though these rules are meant to produce an optimal UI based on the context-of-use, in some cases the input of the human designer can be essential (e.g., widget size, position, etc.).

Raneburger et al. presented an approach to the automated generation of WIMP-style UIs. They attempt to enhance the quality of the generated UIs by using a graphical tree editor to add hints to the transformations (e.g., the alignment of a widget) [22]. One problem is that UI designers might only work on the CUI level, and the specification of the model transformations would be left to the developers. Also, the authors state that a graphical "what you see is what you get" (WYSIWYG) editor similar to the one presented by the Gummy system [15] would improve their approach.

Supple is primarily capable of automatically generating UIs that are adapted to each user's motor abilities by treating UI generation as an optimization problem [13]. Yet, although the authors mention that Supple is not intended to replace human designers, the system only relies on a high-level model to generate its final UI, thereby preventing designer input from being made on the CUI level.

DynaMo-AID [7] is presented as part of the Dygimes UI creation framework. It incorporates a design process for the development of context-aware UIs that are adaptable at runtime. Like Supple, this system focuses on a high-level UI representation (task models), which is used for automatic generation of the CUI.

MASP [10] provides designers with a graphical design tool to support the creation of layout models, which are later interpreted at runtime to support adaptive UI behavior. Although the tool supports designer input, no mechanism is offered for maintaining this input after the adaptive behavior is applied.

Smart templates have been proposed for improving the automatic generation of ubiquitous remote-control UIs on mobile devices [19]. Although these templates improve the ability to preserve designer input, specifying the various template variations could be time consuming and would be classified under adaptable rather than adaptive behavior.

AN EXAMPLE OF USER INTERFACE CONSTRAINTS
We developed a mechanism called Role-Based User Interface Simplification (RBUIS) [2] for simplifying UIs by minimizing their feature-set and optimizing their layout based on the context-of-use (user, platform, environment). We define a minimal feature-set as the set with the least features required by a user to perform a job. An optimal layout is one that maximizes satisfaction of the constraints imposed by a set of aspects such as computer skills, culture, etc. An optimal layout is obtained by adapting the properties of concrete widgets (e.g., type, grouping, size, location, etc.). In RBUIS, the feature-set is minimized by applying roles to task models, and the layout is optimized by executing adaptive behavior workflows on the CUI. The workflows can embody visual and code-based constructs. RBUIS is based on the CEDAR architecture [1] and uses interpreted runtime models for the adaptation. Nevertheless, the designer can still create an initial fully-featured CUI. The feasibility of adapting a least-constrained UI design was shown in previous research [12].
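The idea of minimizing a feature-set by applying roles to task models can be sketched in a few lines. The sketch below is our own simplified reading of that idea, not the RBUIS implementation; the data structures, role names, and task names are hypothetical.

```python
# Hypothetical sketch of role-based feature-set minimization:
# a task survives only if the given role requires it.

def minimize_feature_set(task_model, role):
    """Return the subset of tasks that the role actually requires."""
    return [task for task in task_model if role in task["roles"]]

invoice_tasks = [
    {"name": "enter customer", "roles": {"clerk", "manager"}},
    {"name": "apply discount", "roles": {"manager"}},
    {"name": "enter items",    "roles": {"clerk", "manager"}},
]

clerk_tasks = minimize_feature_set(invoice_tasks, "clerk")
print([t["name"] for t in clerk_tasks])  # ['enter customer', 'enter items']
```

The widgets bound to the eliminated tasks are the ones whose removal later forces the layouting decisions discussed below.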
RBUIS follows a similar approach by adapting an initial UI that is unconstrained in terms of the feature-set and least constrained in terms of the layout (e.g., the least constrained screen size). Adaptive UI behavior such as removing and adding widgets could leave gaps and deformations in the layout, which are not esthetically desirable and could increase the navigation time according to Fitts's law. A mechanism is needed for maintaining plasticity, denoting the UI's ability to adapt to the context-of-use while preserving its usability [8]. Hence, we can consider layouting as one example of UI constraints that could be influenced by choices made by a human designer rather than merely automated choices.

The example illustrated in Figure 3 is a sales invoice UI, common in enterprise applications such as enterprise resource planning systems. Let us consider that we would like to apply RBUIS to this UI in order to minimize its feature-set for a role that does not require all the initial features. The examples shown in Figure 4 and Figure 5 are two possible layouting alternatives that could be produced after eliminating the undesirable features from the UI. The differences between the two versions are the layouting choices related to group boxes "a" and "b" on one hand, and data grid "c" and text box "d" on the other. In Version 1, shown in Figure 4, the width of group box "b" is increased in order to prevent scrolling, but at the expense of the width of group box "a", whereas in Version 2, shown in Figure 5, the opposite is done. Also, in Version 1 the width of text box "d" is increased at the expense of the height of data grid "c", whereas in Version 2 the opposite choice is made. In both cases there are no absolutely right or wrong choices. Such choices depend on what the human designer thinks is more appropriate. Is giving more room for data entry in the fields of group box "a" and text box "d" more important than showing additional items on the screen in the radio button groups of group box "b" and data grid "c"? When an algorithm makes the choice between Versions 1 and 2 without providing any rationale, critics are going to deem adaptive UIs as being unpredictable. Empowering human designers could strike a balance between automation and human intelligence to increase adaptive UI predictability.

CONCRETE USER INTERFACE CONSTRAINTS
In many cases, UIs are designed by non-technical designers. Also, in another work we have highlighted the possibility of engaging end-users in the UI adaptation process [3]. Therefore, we think that the constraints we are proposing should be kept simple in order to be implementable by non-technical stakeholders. We devised a basic meta-model, illustrated in Figure 2, to reflect such constraints.

Figure 2. Simple CUI Constraints Meta-Model

Since each CUIElement has Properties, Constraints can be attached to these properties in order to reflect designer choices regarding their values. A Constraint simply has a comparison operator (e.g., ">", "<", "=", etc.) and a value for comparison. In order to have a practical approach that promotes easier constraint assignment, a constraint's value need not be exact. It can be absolute or relative, quantitative or qualitative. For example, a constraint on the width of a widget could be "> 100" or it could be "= Large". It is possible to define ranges for such values, or to leave the decision to the adaptation engine to be made according to a given context and UI. Let us consider group boxes "a" and "b" presented in both Figure 4 and Figure 5. If the designer specified that the width of group box "a" should be "Medium" whereas that of group box "b" should be "Large", then the version in Figure 4 would be chosen, and vice-versa. The same could work for data grid "c" and text box "d". The designer also has the ability to allocate each Constraint to a Priority Class in order to indicate which constraint gets eliminated in case a conflict occurs between two or more constraints. If conflicts still exist even with the priority classes, the system will then have to eliminate one at random. A Constraint can be one of two types: Strict or Lenient. For example, a lenient equality constraint indicates that the original value can be changed to close values, whereas a strict one means that the value should remain exactly the same, although it can still be dropped in case of a conflict. The coming section explains how we distinguish explicit and implicit constraints and our proposition for applying them in practice.
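The meta-model just described can be sketched in code. The class names below mirror Figure 2 (CUIElement, Property, Constraint, priority class, strict/lenient); the conflict-resolution function is our own illustrative assumption of how the priority rule and the random fallback could work, not a specification of our engine.

```python
# Sketch of the simple CUI constraints meta-model (Figure 2).
# resolve_conflict() illustrates the priority-class rule: keep the
# higher-priority constraint, and pick at random on a tie (assumption).
import operator
import random

OPS = {">": operator.gt, "<": operator.lt, "=": operator.eq}

class Constraint:
    def __init__(self, op, value, priority_class=0, strict=False):
        self.op, self.value = op, value
        self.priority_class = priority_class  # lower number = higher priority
        self.strict = strict                  # Strict vs. Lenient

    def satisfied_by(self, candidate):
        return OPS[self.op](candidate, self.value)

class Property:
    def __init__(self, name, value, constraints=()):
        self.name, self.value = name, value
        self.constraints = list(constraints)

class CUIElement:
    def __init__(self, name, properties):
        self.name = name
        self.properties = {p.name: p for p in properties}

def resolve_conflict(c1, c2):
    """Keep the constraint in the higher-priority class; choose randomly
    if the designer assigned both the same priority."""
    if c1.priority_class != c2.priority_class:
        return c1 if c1.priority_class < c2.priority_class else c2
    return random.choice([c1, c2])

# A widget whose width must stay above 100, as in the "> 100" example.
note = CUIElement("Note", [Property("Width", 250, [Constraint(">", 100)])])
print(note.properties["Width"].constraints[0].satisfied_by(250))  # True
```

Qualitative values such as "Large" would additionally need a mapping to concrete ranges, which the adaptation engine could choose per context as described above.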
Figure 3. Initial Sales Invoice User Interface
Figure 4. Adapted Sales Invoice User Interface Version 1
Figure 5. Adapted Sales Invoice User Interface Version 2
Figure 6. Assigning Concrete User Interface Constraints in Cedar Studio

APPLYING CONCRETE UI CONSTRAINTS
Cedar Studio is our integrated development environment (IDE) for supporting the development of adaptive UIs based on a model-driven approach [4]. We consider that designer constraints can be explicitly or implicitly specified. Explicit constraints are specified by the designer on the CUI properties, whereas implicit constraints can be deduced from the design made on the canvas itself, such as widget ordering and positioning relative to other widgets.

Explicit Constraints
We extended the CUI designer of Cedar Studio to support the addition of explicit designer constraints. Let us consider a basic example that requires such constraints and propose a technique for applying them in practice. Consider that the "Phone Numbers" grid (Figure 6 – a) should be eliminated for a given context-of-use. The layouting engine will be faced with two choices: either filling the space by increasing the width of the "Note" (Figure 6 – b) or by increasing the height of the "Picture" (Figure 6 – c). If the designer adds a constraint as shown in Figure 6 – d to indicate that the "Note" should have a "Large" width, the system should be able to incorporate this choice in a constraint problem that can be passed to a constraint solver.

Listing 1. Constraint Problem Written in Python on Z3Py

from z3 import Reals, And, Or, solve

# variables to hold the final calculated values of the widget properties
noteWidth, pictureHeight = Reals('noteWidth pictureHeight')
# initial width of the note widget and initial height of the picture widget
initialNoteWidth, initialPictureHeight = 250, 200
# the width and height of the canvas holding the widgets
canvasWidth, canvasHeight = 300, 200

solve(
    # the two possibilities
    Or(And(noteWidth == canvasWidth, pictureHeight == initialPictureHeight),
       And(noteWidth == initialNoteWidth, pictureHeight == canvasHeight)),
    # constraint based on the designer's input
    noteWidth == max(canvasWidth, initialNoteWidth))

The problem shown in Listing 1 is expressed in Python on Z3Py and corresponds to the example demonstrated in Figure 6. It defines two variables, "noteWidth" and "pictureHeight", to hold the calculated values of the widget properties. It takes as input the initial property values ("initialNoteWidth" and "initialPictureHeight") and the width and height of the canvas ("canvasWidth" and "canvasHeight"), which delimit the possible values that these properties can take. The two possibilities at hand are either resizing the width of the "Note" widget to fit the canvas width while keeping the height of the "Picture" widget intact, or vice-versa. Since the designer specified a constraint stating that the "Note" width should be "Large", the problem is supplied with the constraint "noteWidth == max(canvasWidth, initialNoteWidth)" in order to choose the largest possible value. Running the problem on the Z3Py [24] constraint solver yields the following result: "[noteWidth = 300, pictureHeight = 200]". The yielded values can be applied to the relevant CUI element properties to obtain an adapted user interface that preserves designer input.
Figure 7. Implicit Concrete User Interface Constraints – A Relative Positioning Example: (a) Initial User Interface Design, (b) Minimized Feature-Set UI that Hides Widgets, (c) Refitted Layout UI Design

Implicit Constraints
An implicit layouting constraint that we worked on as part of the layouting algorithm supporting RBUIS is related to the relative widget positioning and ordering specified by the designer. Upon eliminating parts of the UI in Figure 7 – a to minimize its feature-set for a particular context-of-use, as shown in Figure 7 – b, this algorithm is responsible for refitting the UI by removing the gaps. The example in Figure 7 – c shows how the widgets are pushed upwards beneath the closest widget. Deducing implicit constraints from the design made on the canvas saves the designer the effort of adding these constraints separately.

The part of our algorithm that pushes the widgets upwards is shown in Algorithm 1. We implemented the implicit constraints as a layouting algorithm due to its simplicity in comparison to having to generate a constraint problem such as the one shown in Listing 1. The implementation excerpt shown in Algorithm 1 splits the CUI controls into ordered lines and moves each widget beneath the one above it from one of the previous lines. Expressing this algorithm as a separate constraint problem for different contexts would have been more difficult than writing one generic solution.

Algorithm 1. UI Refitting Written in C# (Excerpt)

public bool RefitTop(List<ControlInfo> Controls, int StartingTop = 5)
{
    List<List<ControlInfo>> lines = this.GetControlLines(Controls);
    if (lines.Count == 0) { return true; }

    foreach (ControlInfo control in lines[0])
    { control.Top = StartingTop; }

    for (int counter = 1; counter < lines.Count; counter++)
    {
        foreach (ControlInfo control in lines[counter])
        {
            int reverseLineCounter = counter - 1;
            var ctrsAbove = Enumerable.Empty<ControlInfo>();

            while (ctrsAbove.Count() == 0 && reverseLineCounter >= 0)
            {
                ctrsAbove = from l in lines[reverseLineCounter]
                            where (l.Left > control.Left - l.Width &&
                                   l.Left < control.Left + l.Width)
                            orderby l.Height descending
                            select l;
                reverseLineCounter--;
            }

            if (ctrsAbove.Count() > 0)
            {
                ControlInfo ctrAbove = ctrsAbove.First();
                control.Top = ctrAbove.Bottom + widgetMargin;
            }
            else { control.Top = StartingTop; }
        }
    }
    return true;
}
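The refitting idea in Algorithm 1 can be re-expressed compactly in Python. This is our simplified sketch of the same logic, not the C# original: widgets are grouped into ordered lines, and each widget is moved up beneath the tallest horizontally overlapping widget found in the closest previous line.

```python
# Sketch of the UI-refitting idea (assumed margin and starting offset
# mirror the constants in Algorithm 1).
WIDGET_MARGIN = 5
STARTING_TOP = 5

def refit_top(lines):
    """lines: ordered lines of widgets; each widget is a dict with
    'left', 'width', 'top', and 'height' in pixels."""
    for widget in lines[0]:
        widget["top"] = STARTING_TOP
    for i in range(1, len(lines)):
        for widget in lines[i]:
            above = None
            # scan previous lines, nearest first, for a horizontal overlap
            for j in range(i - 1, -1, -1):
                overlapping = [w for w in lines[j]
                               if widget["left"] - w["width"] < w["left"]
                               < widget["left"] + w["width"]]
                if overlapping:
                    above = max(overlapping, key=lambda w: w["height"])
                    break
            if above is not None:
                widget["top"] = above["top"] + above["height"] + WIDGET_MARGIN
            else:
                widget["top"] = STARTING_TOP
    return lines

# Two lines with a large vertical gap (left by removed widgets); the
# second line's widget is pulled up directly beneath the first one.
lines = [
    [{"left": 10, "width": 100, "top": 40, "height": 20}],
    [{"left": 20, "width": 100, "top": 200, "height": 20}],
]
refit_top(lines)
print(lines[1][0]["top"])  # 30
```

As in the C# excerpt, a widget with no overlapping predecessor falls back to the starting offset, so whole columns collapse upward consistently.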
CONCLUSIONS AND FUTURE WORK
This paper presented a work-in-progress technique that allows designers to supply CUI constraints that are maintained after applying automated adaptations. We categorized these constraints as explicit and implicit. Explicit constraints are specified by the designer on the CUI properties, whereas implicit constraints can be deduced from the design made on the canvas, such as widget ordering and positioning. Both types of constraints can be specified using our IDE, Cedar Studio. We proposed the generation of constraint problems that could be solved by constraint solvers to satisfy explicit constraints. On the other hand, we implemented implicit constraints relevant to widget positioning and ordering as a layouting algorithm.

More work is still required to make the proposed technique applicable in practice. A primary point would be devising an algorithm that converts explicit designer constraints into a constraint problem such as the one shown in Listing 1. This algorithm should then be utilized by the adaptation engine in combination with the algorithm for refitting the UI based on implicit constraints, in order to maintain the designer's input upon adapting the user interface. When this part is accomplished, we can comprehensively test both explicit and implicit constraints in a real-life scenario by measuring the extent to which usability is preserved and the efficiency of the technique.

Our solution is intended to allow designers to add any type of constraint that can be applied to the properties of the concrete UI widgets. The incorporation of this solution in a generic IDE like Cedar Studio allows extensions to be made in the future. One possible extension would be supplying UI designers with the ability to automatically check the initial design (implicit constraints) based on general ergonomic rules [23], or to add these rules as explicit constraints. Another possibility is to use such ergonomic rules for prioritizing constraints in order to allow the system to make an informed decision when it faces two conflicting constraints that were assigned the same priority by the human designer.
ACKNOWLEDGMENT
This work is partially funded by ERC Advanced Grant 291652.

REFERENCES
1. Akiki, P.A., Bandara, A.K., and Yu, Y. Using Interpreted Runtime Models for Devising Adaptive User Interfaces of Enterprise Applications. Proceedings of the 14th International Conference on Enterprise Information Systems, SciTePress (2012), 72–77.
2. Akiki, P.A., Bandara, A.K., and Yu, Y. RBUIS: Simplifying Enterprise Application User Interfaces through Engineering Role-Based Adaptive Behavior. Proceedings of the 5th ACM SIGCHI Symposium on Engineering Interactive Computing Systems, ACM (2013), Forthcoming.
3. Akiki, P.A., Bandara, A.K., and Yu, Y. Crowdsourcing User Interface Adaptations for Minimizing the Bloat in Enterprise Applications. Proceedings of the 5th ACM SIGCHI Symposium on Engineering Interactive Computing Systems, ACM (2013), Forthcoming.
4. Akiki, P.A., Bandara, A.K., and Yu, Y. Cedar Studio: An IDE Supporting Adaptive Model-Driven User Interfaces for Enterprise Applications. Proceedings of the 5th ACM SIGCHI Symposium on Engineering Interactive Computing Systems, ACM (2013), Forthcoming.
5. Benyon, D. Adaptive Systems: A Solution to Usability Problems. User Modeling and User-Adapted Interaction 3, 1, Springer (1993), 65–87.
6. Calvary, G., Coutaz, J., Thevenin, D., Limbourg, Q., Bouillon, L., and Vanderdonckt, J. A Unifying Reference Framework for Multi-Target User Interfaces. Interacting with Computers 15, Elsevier (2003), 289–308.
7. Clerckx, T., Luyten, K., and Coninx, K. DynaMo-AID: A Design Process and a Runtime Architecture for Dynamic Model-Based User Interface Development. Proceedings of the 2004 International Conference on Engineering Human Computer Interaction and Interactive Systems, Springer-Verlag (2004), 11–13.
8. Coutaz, J. User Interface Plasticity: Model Driven Engineering to the Limit! Proceedings of the 2nd ACM SIGCHI Symposium on Engineering Interactive Computing Systems, ACM (2010), 1–8.
9. Demeure, A., Meskens, J., Luyten, K., and Coninx, K. Design by Example of Graphical User Interfaces Adapting to Available Screen Size. In V. Lopez-Jaquero, J.P. Molina, F. Montero and J. Vanderdonckt, eds., Computer-Aided Design of User Interfaces VI. Springer-Verlag (2009), 277–282.
10. Feuerstack, S., Blumendorf, M., Schwartze, V., and Albayrak, S. Model-based Layout Generation. Proceedings of the Working Conference on Advanced Visual Interfaces, ACM (2008), 217–224.
11. Findlater, L. and McGrenere, J. A Comparison of Static, Adaptive, and Adaptable Menus. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, ACM (2004), 89–96.
12. Florins, M. and Vanderdonckt, J. Graceful Degradation of User Interfaces as a Design Method for Multiplatform Systems. Proceedings of the 9th International Conference on Intelligent User Interfaces, ACM (2004), 140–147.
13. Gajos, K.Z., Weld, D.S., and Wobbrock, J.O. Automatically Generating Personalized User Interfaces with Supple. Artificial Intelligence 174, 12-13, Elsevier (2010), 910–950.
14. McGrenere, J., Baecker, R.M., and Booth, K.S. An Evaluation of a Multiple Interface Design Solution for Bloated Software. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, ACM (2002), 164–170.
15. Meskens, J., Vermeulen, J., Luyten, K., and Coninx, K. Gummy for Multi-Platform User Interface Designs: Shape Me, Multiply Me, Fix Me, Use Me. Proceedings of the Working Conference on Advanced Visual Interfaces, ACM (2008), 233–240.
16. Myers, B., Hudson, S.E., and Pausch, R. Past, Present, and Future of User Interface Software Tools. ACM Transactions on Computer-Human Interaction 7, 1, ACM (2000), 3–28.
17. Nebeling, M. and Norrie, M.C. Tools and Architectural Support for Crowdsourced Adaptation of Web Interfaces. Proceedings of the 11th International Conference on Web Engineering, Springer-Verlag (2011), 243–257.
18. Nichols, J., Myers, B.A., Higgins, M., et al. Generating Remote Control Interfaces for Complex Appliances. Proceedings of the 15th Annual ACM Symposium on User Interface Software and Technology, ACM (2002), 161–170.
19. Nichols, J., Myers, B.A., and Litwack, K. Improving Automatic Interface Generation with Smart Templates. Proceedings of the 9th International Conference on Intelligent User Interfaces, ACM (2004), 286–288.
20. Paternò, F., Mancini, C., and Meniconi, S. ConcurTaskTrees: A Diagrammatic Notation for Specifying Task Models. Proceedings of the IFIP TC13 International Conference on Human-Computer Interaction, Chapman & Hall, Ltd. (1997), 362–369.
21. Pleuss, A., Botterweck, G., and Dhungana, D. Integrating Automated Product Derivation and Individual User Interface Design. Proceedings of the Fourth International Workshop on Variability Modelling of Software-Intensive Systems, Universität Duisburg-Essen (2010), 69–76.
22. Raneburger, D., Popp, R., and Vanderdonckt, J. An Automated Layout Approach for Model-Driven WIMP-UI Generation. Proceedings of the 4th ACM SIGCHI Symposium on Engineering Interactive Computing Systems, ACM (2012), 91–100.
23. Vanderdonckt, J. and Bodart, F. The "Corpus Ergonomicus": A Comprehensive and Unique Source for Human-Machine Interface. Proceedings of the 1st International Conference on Applied Ergonomics, USA Publishing (1996), 162–169.
24. Microsoft Z3Py. http://rise4fun.com/z3py.