Towards automatically interfacing application services integrated into an automated, model-based user interface generation process

Kai Breiner
Fraunhofer Institute for Experimental Software Engineering (IESE)
67663 Kaiserslautern, Germany
breiner@cs.uni-kl.de

Oliver Maschino
TU Kaiserslautern, Software Engineering Research Group
67663 Kaiserslautern, Germany
maschino@cs.uni-kl.de

Daniel Görlich, Gerrit Meixner
German Research Center for Artificial Intelligence (DFKI)
67663 Kaiserslautern, Germany
{Daniel.Goerlich, Gerrit.Meixner}@dfki.de

CEUR Workshop Proceedings, ceur-ws.org, ISSN 1613-0073

ABSTRACT
The impact of user interface quality on software systems engineering has grown, and its importance will grow further with upcoming paradigms such as Ambient Intelligence or Ubiquitous Computing. These paradigms confront the production industry with a new diversity of usage situations. In previous work, we have shown how a task-oriented, model-based Useware engineering process can be adapted to these future paradigms by extending existing models and by shifting the development/generation of the user interface (UI) from development time to run-time. When separating the UI design from the application engineering process, the problem emerged of how the generated UI should interface with the corresponding service functions. We propose a solution that integrates the respective linkage information into the function model, which is introduced in this paper.

1. INTRODUCTION
Modern industrial production environments are characterized by a heterogeneous set of technical devices, which consequently provide an equally heterogeneous set of interaction devices and concepts [8]. Using modern communication technology, these devices can also be interconnected and can therefore share information about the current state of the whole environment. Although this information is thus available at any time and in any place, this can also be a disadvantage, namely if the information is not presented properly in terms of format and structure [4]. Confronted with this diversity of interaction concepts and information, users will find it more difficult to fulfill their tasks or to react in time in case of an emergency [8]. For this reason, it is important to also consider information about the usage situation (e.g., users' roles, user position, environmental conditions, etc.). Additionally, the use of (independent) information and interaction structures generates new human-machine interaction concepts and Useware engineering methods [5].

The goal of our approach is to support users in performing their tasks as adequately as possible. The basic idea for solving this problem of "explosive diversity" was to combine all the user interfaces of nearby devices into a single holistic interaction device with a homogeneous interaction concept, which results in improved usability as well as in optimized user workflows. The benefits and the sufficiency of such universal remote controls have already been proven in the context of intelligent households [6]. Faced with a similar initial situation, universal controls are also emerging in the production industry [8]. Because intelligent production environments are highly dynamic, providing a simple, static, universal control device was not an option. Analyzing a generic environment will always lead to a formal description (a model), which in the following is the starting point of our UI generation approach. The contribution of this paper is the seamless integration of given functional interfaces into the completely automated generation process, resulting in a fully functional and usable context-sensitive UI. We have developed this process as an extension of the Useware development process and summarize it in the following section.

2. TOWARDS A MODEL-BASED GENERATION PROCESS
Evidently, a modern user interface's level of acceptance is determined by its ease of use. This also applies to entire software products, because for the user, the UI is the product [17]. In order to improve this property, the Useware development process – developed by the Center for Human-Machine-Interaction and successfully applied in numerous joint ventures and industrial projects [7] – has to be adapted systematically.

The starting point of this process is always the systematic analysis of the users and the respective environment, which is the sole guarantee for the efficient use of the final user interface to be developed. Subsequently, the results of this phase are formalized during the structural design and the design phase, resulting in a room-based use model (useML, see Figure 1), as described in [2]. On the basis of this model, it used to be the UI programmers' task to implement the final UI. Automating this step between design and implementation is the research focus of the GaBi project, which aims to achieve this goal by defining a model-based code generation process. One major challenge we explored in our previous work was how to automatically interface the application services while generating the user interface [1]. We propose bridging this gap by including all the necessary information in an extended model, as described below.
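To make the structure of the room-based use model more tangible, the following sketch (Python; all class and instance names are our own illustration, not part of the useML specification) shows the kind of hierarchy such a model captures: an environment containing device compounds, each listing the elementary use objects (user tasks) it offers.

```python
from dataclasses import dataclass, field

# Hypothetical, heavily simplified rendering of a room-based use model:
# an environment groups device compounds, each of which lists the
# elementary use objects (user tasks) it supports.

@dataclass
class ElementaryUseObject:
    name: str  # a user task, e.g. "set pump speed"

@dataclass
class DeviceCompound:
    name: str
    use_objects: list[ElementaryUseObject] = field(default_factory=list)

@dataclass
class UseModel:
    environment: str
    compounds: list[DeviceCompound] = field(default_factory=list)

model = UseModel(
    environment="demo production cell",
    compounds=[DeviceCompound("XY-Pump",
                              [ElementaryUseObject("set pump speed"),
                               ElementaryUseObject("read pressure")])],
)
print(len(model.compounds[0].use_objects))  # 2
```

An actual useML model is an XML document and far richer; the sketch only mirrors the containment hierarchy that matters for the discussion here.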
Figure 1. Integrated room-based use model, containing contextual information about the entire environment as well as all interactional information about the tasks to be performed by users [2].

3. INTERFACING SERVICES
In our scenario, all the devices in an intelligent production environment that are to be monitored and controlled already possess a predefined set of well-defined service interfaces. The type and communication channel of each service interface strongly depend on the brand or manufacturer of the device. It is also conceivable – as implemented in our demonstration environment – that communication with several devices is encapsulated in or delegated by other communication devices (e.g., a Bluetooth DataEagle, a PLC, etc.), which is an attempt to homogenize most of the service interfaces to be used. The UI device through which the user should be able to control all these devices has to access all these service interfaces.

Since the intention was to provide a complete UI generation process without manual intervention, the generator needs to know how the respective service interfaces are accessed and how the information needs to be structured when a certain user task is executed. This means that all the information needed to construct the entire UI must be included in the source model – the room-based use model. Like this model, most current models provide a detailed description of the human tasks that could be performed, but make no statement about the service functions to be executed.

Existing technologies for describing graphical user interfaces include, among others, the XML-based XML User Interface Language [14] and SwixML [15]. These languages describe a user interface, and a specific generator creates the final user interface from this description. This is a semiautomatic process and does not fit our requirements, because with these languages it is only possible to describe the graphical user interface; there is no information about accessing service interfaces. The result of such approaches is (compilable) UI source code containing variation points for manually inserting interfacing code. To meet this challenge, we analyzed the interface definitions and extended the useML model [13] with all the relevant information.

3.1 Challenges
The necessary information is not trivial to provide because of the communication diversity already mentioned. Only because the application domain of our scenario is restricted in terms of possible communication channels as well as other factors such as user roles can the risk of the explosive diversity introduced above be handled.

First, common task and domain models, which are usually employed in model-based user interface development (MBUID) processes, assume that all tasks can be canonically mapped to an obvious domain data operation [1]. Yet, in many real environments, service interfaces as well as their manipulating operations are not equivalent to the users' tasks [1]. Therefore, each task has to be mapped to one operation (or a set of operations), which has to be done manually because in most cases, the underlying semantics are not machine-interpretable.

Second, the basic idea behind MBUID is to separate domain knowledge from design knowledge. Thus, domain experts should be able to create a use model containing all the users' tasks without having any knowledge about the implementation of the final user interface. The generator contributes expert knowledge regarding user interface design and transforms the model into an efficient user interface. Including information about the communication with an application's interface in the model would imply that the domain expert also needs to know some details of the user interface implementation. Hence, the current workflow is visualized in Figure 2, which shows, besides the three major input documents (communication protocol, useML model, and device list), the roles of both the software developer (technical implementation) and the usability engineer (expert knowledge in design aspects). Among other risks (e.g., human errors), it is also possible that applying this process twice to the same documents results in varying output (final user interfaces) in several ways, depending on context factors during the development process such as the expert knowledge involved.

Figure 2. Manual user interface development process on the basis of useML.
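The first challenge – tasks that do not correspond one-to-one to service operations – can be pictured as a manually maintained mapping table. A minimal sketch in Python, with all task and operation names invented for illustration:

```python
# Hypothetical, manually maintained mapping from user tasks to the
# service operations that realize them. One task may require several
# operations, since service interfaces rarely mirror user tasks 1:1.
task_to_operations = {
    "start filling process": ["open_valve", "start_pump"],
    "stop filling process":  ["stop_pump", "close_valve"],
    "read fill level":       ["query_level_sensor"],
}

def operations_for(task: str) -> list[str]:
    # The lookup itself is trivial; building the table is the manual,
    # non-machine-interpretable step described above.
    return task_to_operations[task]

print(operations_for("start filling process"))  # ['open_valve', 'start_pump']
```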
3.2 Idea – Extending useML
The solution we propose to meet these challenges is to completely automate this process. Consequently, the documents describing the functionality and the interfaces of the devices to be controlled need to be formalized in a machine-interpretable way. Additionally, the obstacle of automatically integrating the code fragments necessary to establish communication with the desired devices needs to be overcome. To achieve this goal, we introduce an extension to the useML description; in the following, we call this model the function model. Each device compound (see Figure 1) possesses its own function model, because it is theoretically possible that each compound needs to be addressed in a unique way. In detail, the function model consists of two sections:

• Connection – information about establishing the connection to the device to be controlled
• Data – the basic structure of the content of the communication

One important aspect of this extension is the manner of communication, which is described in the Connection node. This model was elicited from sample projects and implemented using the uniform resource identifier (URI) standard [16]; hence, the general communication information can be stored in a generic way. It contains the following information:

• Scheme – defines the kind of communication and the additional parameters needed
• Host – the host to be addressed according to the scheme
• Data-Reference – reference to the data structure with which the information needs to be encoded
• Priority – used to choose the most adequate scheme if multiple schemes are available

Optionally, it is possible in our case to add parameters that describe the communication in more detail and are specific to our sample environment:

• Device-Number – a unique number of the device to be addressed
• Device-Type – the concrete type of the device (e.g., XY-Pump)
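Because the Connection node builds on the URI standard [16], its content can be processed with ordinary URI tooling. A minimal sketch in Python, assuming a hypothetical Bluetooth scheme name and parameter spellings (none of which are prescribed by the model):

```python
from urllib.parse import urlparse, parse_qs

# Hypothetical connection URI; "btspp" (Bluetooth serial) and the
# query parameter names are illustrative assumptions, not part of
# the function-model specification.
uri = "btspp://dataeagle-01?device-number=12&device-type=XY-Pump&priority=1"

parts = urlparse(uri)
params = parse_qs(parts.query)

print(parts.scheme)                # btspp  (the Scheme entry)
print(parts.netloc)                # dataeagle-01  (the Host entry)
print(params["device-number"][0])  # 12
print(params["device-type"][0])    # XY-Pump
```

Storing the connection information as a URI means a generator can dispatch on the scheme to pick a communication driver without any format-specific parsing code.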
Now that a UI generator knows how to communicate with the production environment, it is necessary to encode the transmitted content as well as its semantics. In order to develop an appropriate model, we analyzed sample device descriptions recorded in spreadsheet files that contain the composition of Bluetooth frames. In regular projects, Useware developers used these documents to choose and adjust the widgets of the UI, but also to code the action events and the communication with the environment using the widespread Profibus protocol. This led to a data model attached to each elementary use object. In accordance with the message-based communication in the demonstration environment, the data model provides information for processing incoming frames and for constructing outgoing frames according to the current user input. A data node in our model contains:

• Position – the starting position within the communication frame
• Length – the number of data blocks used for one data set
• Identifier – the identifier of the data set
• DataFormat – the basis for interpreting the content

There are also optional properties of the data model:

• Unit – a human-understandable identifier of the data content
• ConversionFactor – applied if the content needs to be post-processed
• RangeMin/RangeMax – boundary conditions of the value
• SignificantDigits – the number of significant digits
• StatusMessage – a special device-dependent status message encoded in one bit vector

This data model, in combination with the communication model, forms the integrated function model, which allows deducing from elementary use objects how certain bits of information need to be transferred in order to execute the desired application logic.
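How a data node steers frame processing can be sketched as follows. The field names follow the data model above, but the concrete frame layout, the 16-bit big-endian format, and the conversion factor are invented for illustration (Python):

```python
import struct

# Hypothetical data node for a pressure value: the raw value occupies
# 2 bytes starting at position 4 and must be scaled by 0.5. The layout
# and values are illustrative, not taken from any real device.
data_node = {
    "Position": 4, "Length": 2, "DataFormat": ">H",  # big-endian uint16
    "Unit": "bar", "ConversionFactor": 0.5,
    "RangeMin": 0.0, "RangeMax": 100.0,
}

def decode(frame: bytes, node: dict) -> float:
    # Cut the value out of the incoming frame, interpret it according
    # to DataFormat, post-process it, and enforce the range bounds.
    start, end = node["Position"], node["Position"] + node["Length"]
    (raw,) = struct.unpack(node["DataFormat"], frame[start:end])
    value = raw * node["ConversionFactor"]
    if not node["RangeMin"] <= value <= node["RangeMax"]:
        raise ValueError(f"value {value} {node['Unit']} out of range")
    return value

frame = bytes([0, 0, 0, 0, 0x00, 0x7B, 0, 0])  # raw value 123 at offset 4
print(decode(frame, data_node))  # 61.5
```

Constructing an outgoing frame is the mirror operation: divide the user input by the ConversionFactor and pack it at the same position.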
4. FEASIBILITY STUDY – DEMONSTRATOR
To show the feasibility of our approach and of the function model, we implemented a basic generator that accepts the adjusted useML specification as input. In general, all elementary use objects are mapped to an object that will be displayed on the screen – the interpretation of task constellations towards widget composition is still ongoing work. The device compound structure of the room-based use model is canonically mapped to a generated, simple navigation structure in our sample UI. Thus, a user is able to select a device of the compound and perform the tasks as specified in the useML description. Generating a functional UI was the major purpose of this rudimentary generator. How the generator integrates the communication is elaborated in the following, where the universal control UI is considered a product of the generation process.

Due to the structure of the extended useML, each elementary use object possesses (if necessary) its own function node, which is a link to a certain data entry in the function model of its device compound. Thus, if the user interacts with the user interface (triggering a particular elementary use object), we are able to identify the corresponding function, extract the user data from the user interface, and compose the communication frame with the help of the function model, and vice versa.
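The link from an elementary use object via its function node to an outgoing frame can be sketched like this (Python; all identifiers and the frame layout are invented, mirroring the hypothetical data-node fields used above):

```python
import struct

# Hypothetical function model of one device compound: each data entry
# tells the generator where a value belongs in an outgoing frame.
function_model = {
    "set-pump-speed": {"Position": 2, "Length": 2, "DataFormat": ">H"},
}

# Hypothetical link from an elementary use object to its function node.
use_object = {"name": "set pump speed", "function": "set-pump-speed"}

def compose_frame(use_obj: dict, value: int, frame_len: int = 6) -> bytes:
    # Identify the data entry via the function node and encode the
    # user input at the specified position of an outgoing frame.
    node = function_model[use_obj["function"]]
    frame = bytearray(frame_len)
    start = node["Position"]
    frame[start:start + node["Length"]] = struct.pack(node["DataFormat"], value)
    return bytes(frame)

print(compose_frame(use_object, 1500).hex())  # 000005dc0000
```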
For the purpose of demonstration, we installed the user interface generator on a Paceblade Slimbook P110 Tablet PC [10]. This device possesses a 12.1″ touch screen, which can be used without keyboard or mouse.

We analyzed the communication of our sample environment – the SmartFactoryKL – and filled in the new models, which are now part of the room-based use model. Figure 3 shows the universal remote device (foreground) in action while controlling devices of the SmartFactoryKL (background).

Figure 3. Accessing application services using the Universal Control Device.

5. CONCLUSION AND FUTURE WORK
The result of our study is that it is feasible to provide all the information on a certain industrial production environment that an automatic user interface generator needs to create a functional user interface.

But there are certainly some basic limitations. We extended a given user interaction model with communication information because the application domain was clearly restricted. Thus, we did not have to face the problem of combinatorial explosion (we had a restricted set of device types), which always occurs when there are infinite options for combining devices, communication channels, etc. On the contrary, in our environment, there is a clearly defined and standardized communication protocol and a predefined set of device types.

The advantage of our approach is that it facilitates the development of a completely automated user interface generator on the basis of only a user interaction model, an environmental description, and a description of the manner of communication. Furthermore, this allows developing universal control devices that are able to adapt to changing peripheral constellations and always provide an adequate user interface.

Since we now have an automated development process, it is our vision to improve the usability of the generated user interface by including a pattern repository that the generator can make use of. This will lead to user interfaces providing holistic interaction (look'n'feel) for the control of various devices. Among other effects, we expect to significantly reduce the number of human errors that result from switching between different interaction concepts. Also, the reaction time in critical situations might be reduced, which is vital in production environments.

6. ACKNOWLEDGMENTS
Our work as well as the GaBi project is funded in part by the German Research Foundation (DFG).

7. REFERENCES
[1] Adam, S., Breiner, K., Mukasa, K. and Trapp, M. 2007. Challenges to the Model Driven Generation of User Interfaces at Runtime for Ambient Intelligent Systems. Workshop: Model Driven Software Engineering for Ambient Intelligence Applications, European Conference on Ambient Intelligence, Darmstadt.
[2] Görlich, D. and Breiner, K. 2007. Useware Modelling for Ambient Intelligent Production Environments. Workshop: Model-Driven Development of Advanced User Interfaces, MoDELS 2007, Nashville.
[3] Görlich, D. and Breiner, K. 2007. Intelligent Task-oriented User Interfaces in Production Environments. Workshop: Model-Driven User-Centric Design & Engineering, 10th IFAC/IFIP/IFORS/IEA Symposium on Analysis, Design, and Evaluation of Human-Machine-Systems, Seoul.
[4] Bödcher, A., Mukasa, K. and Zühlke, D. 2005. Capturing Common and Variable Design Aspects for Ubiquitous Computing with MB-UID. In: Proceedings of the Workshop on Model Driven Development of Advanced User Interfaces, Montego Bay, Jamaica.
[5] Zühlke, D. 2004. Useware-Engineering für technische Systeme. Springer, Berlin.
[6] Adam, S., Breiner, K., Mukasa, K. and Trapp, M. 2008. An Apartment-based Metaphor for Intuitive Interaction with Ambient Assisted Living Applications. 22nd European Conference on Human-Computer Interaction (HCI 2008), Liverpool.
[7] Zühlke, D. and Thiels, N. 2008. Useware Engineering: A Methodology for the Development of User-friendly Interfaces. Library Hi Tech, Vol. 26, No. 1.
[8] Hofmann, T. and Holzkämper, P. 2008. NEW HMI – Möglichkeiten und Grenzen abstrakt-geographischer Visualisierung im Bereich der Anlagensteuerung. In: Brau, H., Diefenbach, S., Hassenzahl, M., Koller, F., Peissner, M. and Röse, K. (Eds.), Usability Professionals 2008, pp. 204-208, Fraunhofer IRB Verlag.
[9] Grund, M. 2006. Kommunikationstechnologien in der modernen Prozessleittechnik – Mit praktischer Demonstration der dezentralen Parametrierung von Industriegeräten via Bluetooth. University of Kaiserslautern.
[10] http://www.paceblade.com, last visited 23.09.08.
[11] Bödcher, A. 2007. Methodische Nutzungskontext-Analyse als Grundlage eines strukturierten USEWARE-Engineering-Prozesses. Fortschrittberichte pak, Vol. 14, University of Kaiserslautern.
[12] Maschino, O. 2008. A Strategy for Automated Generation of Graphical User Interfaces based on the Useware Markup Language in the Domain of Intelligent Production Environments. Diploma Thesis, University of Kaiserslautern.
[13] Görlich, D., Thiels, N. and Meixner, G. 2008. Personalized Use Models in Ambient Intelligence Environments. Proc. of the 17th IFAC World Congress, Seoul.
[14] https://developer.mozilla.org/en/XUL, last visited 23.09.08.
[15] http://www.swixml.org, last visited 23.09.08.
[16] Berners-Lee, T., Fielding, R. and Masinter, L. 1998. Uniform Resource Identifiers (URI): Generic Syntax. RFC, RFC Editor.
[17] Trapp, M. 2008. Generating User Interfaces for Ambient Intelligence Systems. PhD Thesis, Software Engineering Research Group, University of Kaiserslautern.