=Paper=
{{Paper
|id=None
|storemode=property
|title=Usability Evaluation of Method Handbook
|pdfUrl=https://ceur-ws.org/Vol-1028/paper-02.pdf
|volume=Vol-1028
|dblpUrl=https://dblp.org/rec/conf/bir/Heinermann13
}}
==Usability Evaluation of Method Handbook==
Laura Heinermann [1], Dirk Stamer [1], Kurt Sandkuhl [1,2]

[1] Rostock University, Institute of Computer Science, Albert-Einstein-Str. 22, 18059 Rostock, Germany
{Laura.Heinermann, Dirk.Stamer, Kurt.Sandkuhl}@uni-rostock.de

[2] School of Engineering at Jönköping University, P.O. Box 1026, 55111 Jönköping, Sweden
Kurt.Sandkuhl@jth.hj.se

Abstract. In enterprise modelling and information systems development, methods contribute to systematic work processes and to improving the quality of modelling results. Information Demand Analysis (IDA) is a method recently developed for the purpose of optimizing the information flow in the field of information logistics. In order to contribute to the improvement of the IDA method, this paper focuses on evaluating the usability of the IDA method handbook. For this purpose, an approach for usability evaluation of the handbook is proposed and applied. The main contributions are (1) an approach for applying the concept of usability when evaluating a method handbook, (2) experiences from using this approach in a real-world case, and (3) recommendations for improving the IDA method handbook with respect to usability.

Keywords: information demand modelling, method usability, information logistics, practice of modelling

1 Introduction

In enterprise modelling and information systems development, methods contribute to systematic work processes and to improving the quality of modelling results [1, 2] by providing procedural guidelines and notations for representing modelling results [3]. Information Demand Analysis (IDA) is a method developed for the purpose of optimizing the information flow in the field of information logistics (see section 2). To support the application of this method, the creators of IDA released a handbook – the “IDA User Guide” – which provides guidelines and instructions for conducting an information demand analysis.
Although the IDA method has already been applied in several organisations, it remains an open question how well its handbook fulfils the intended purposes. By applying the handbook in a real-world situation, we examine its usability from the view of a user and intend to contribute to its improvement. In the following, we use the term “IDA User Guide” interchangeably with “handbook”.

Research on usability evaluation of user-oriented products belongs to the most productive areas of research in the field of information systems. This is hardly surprising, since the growth of technology cannot be separated from the human factor, which demands quality of use. Critical research examines the soundness of the usability concept as well as the suitability of the evaluation methods applied. However, these works deal more with the principles of usability for software products and less with the usability of an analysis method in information logistics, let alone in the area of information demand. The focus of this work is the application of usability to a method handbook. Other approaches in this area, like the SEQUAL approach [4], aim to evaluate the quality of methods as a whole.

Furthermore, we performed a literature analysis regarding IDA. Few results were found in the search for literature on the evaluation of the IDA method. This is understandable, because the method for information demand analysis is still at a preliminary stage. The only work found on this issue is an undergraduate thesis written three years ago by a group of students at Växjö University [5]. Their work focussed on the requirements and not on the usability of the method, and their object of research was a previous version of the method handbook. These factors motivated us to begin initial work on the usability of the handbook for information demand analysis.
The main contributions of this paper are (1) an approach for applying the concept of usability when evaluating a method handbook, (2) experiences from using this approach in a real-world case, and (3) recommendations for improving the IDA method handbook with respect to usability. The remainder of the paper is structured as follows: section 2 introduces the background of this paper from the area of information demand analysis and information logistics; section 3 discusses related work with respect to usability and the approach used for usability evaluation of the handbook; section 4 contains the evaluation of the IDA method handbook; and section 5 draws conclusions.

2 Information Demand Analysis

Accurate and readily available information is essential in decision-making situations, problem solving and knowledge-intensive work. Recent studies show that information overload is perceived as a problem in industrial enterprises [6]. One example of a problem related to information overload is, for different roles, finding the right information needed for a work task [7]. It is expected that an improved information supply would contribute significantly to saving time and most likely to improving productivity. The research field of information logistics addresses the aforementioned challenge in information supply. The main objective of information logistics is improved information provision and information flow. This is based on demands with respect to the content, the time of delivery, the location, the presentation and the quality of information. The scope can be a single person, a target group, a machine/facility or any kind of networked organisation. The research field of information logistics explores, develops, and implements concepts, methods, technologies, and solutions for this purpose.
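As a concrete illustration of the demand dimensions just named (content, time of delivery, location, presentation), a demand-oriented supply decision can be sketched as a simple matching check. This is our own illustrative sketch; the class and function names are assumptions, not part of any information-logistics tool described in the paper:

```python
from dataclasses import dataclass

@dataclass
class Demand:
    """A hypothetical information demand along the dimensions named above."""
    content: str       # what information is needed
    deliver_by: int    # latest acceptable delivery time (e.g. hour of day)
    location: str      # where the information is needed
    presentation: str  # required form of presentation, e.g. "email", "dashboard"

def matches(demand: Demand, content: str, at: int, location: str, presentation: str) -> bool:
    """Demand-oriented supply: deliver only if all dimensions fit the demand."""
    return (content == demand.content
            and at <= demand.deliver_by
            and location == demand.location
            and presentation == demand.presentation)

d = Demand("budget template", deliver_by=9, location="Rostock", presentation="email")
print(matches(d, "budget template", 8, "Rostock", "email"))  # True: all dimensions fit
```

The point of the sketch is only that "improved information provision" is a conjunction of several independent dimensions, each of which can fail separately.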
A core subject of demand-oriented information supply is how to capture the needs and preferences of a user in order to get a fairly complete picture of the demand in question. This requires an understanding of what information demand is and a method for capturing and analysing it. The term information demand (section 2.1) and the method for information demand analysis (section 2.2) are briefly discussed in this section.

2.1 Information Demand

The understanding and definition of the term ‘information demand’ used in this paper is based on an investigation performed from 2005 to 2007 [8]. The main objective of this investigation was to identify the connection between information use and different work-related aspects, such as work processes, resources, and organisational structures, in order to achieve a better understanding of the term. The investigation comprised 27 interviews with individuals from three different organisations, a sample which represented all levels of the investigated organisations, i.e. from top-level management via middle management down to production and administrative personnel. Among the results of the investigation was a definition of information demand as well as a conceptualisation of the term. Information demand is used throughout this paper with the following meaning: “Information demand is the constantly changing need for relevant, current, accurate, reliable, and integrated information to support (business) activities, when ever and where ever it is needed.” [8] Information demand has a strong relation to the context in which such a demand exists [8]. The organisational role having the demand, the task for which the information is demanded, and the setting in which such tasks are performed are important aspects for understanding information demand.
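The aspects named above – the role having the demand, the tasks for which information is demanded, and the setting in which those tasks are performed – can be captured in a minimal data model. The following Python sketch is purely illustrative; the class and field names are our own and not defined by the IDA method:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class InformationDemandContext:
    """Formalised representation of the setting in which an information demand exists."""
    role: str                                                   # organisational role having the demand
    tasks: List[str] = field(default_factory=list)              # work tasks related to that role
    resources: List[str] = field(default_factory=list)          # resources available to the role
    informal_channels: List[str] = field(default_factory=list)  # informal information exchange channels

# Hypothetical example from a proposal-writing setting
ctx = InformationDemandContext(
    role="Head of proposal writing team",
    tasks=["draft proposal", "coordinate budget planning"],
    resources=["funding call text", "budget template"],
    informal_channels=["ad-hoc talks with the financial administrator"],
)
print(len(ctx.tasks))  # 2
```

Keeping tasks, resources and informal channels as explicit lists mirrors the point that information demand is understood relative to its context, not as an isolated attribute of a person.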
Thus, the concept of information demand context has been defined both conceptually and as the core of the method with respect to modelling, evaluating and analysing information demand: “An Information Demand Context is the formalised representation of information about the setting in which information demands exist and comprises the organisational role of the party having the demand, work tasks related, and any resources and informal information exchange channels available, to that role.” [8]

Based on the above results, a method for analysing information demands and capturing information demand context was developed as part of the research project infoFLOW-2 during 2010–2012 [8]. This method is documented in a method handbook, which describes the work procedures to be performed, the notation for documenting information demand contexts, and the concepts and aspects to be taken into account during analysis. The different phases of this method are introduced in the next section.

2.2 Method for Information Demand Analysis

Figure 1 presents an overview of a framework for achieving such an understanding, which is described in this section. Since context is considered central to information demand analysis, method support for modelling such contexts is at the core of the framework, as highlighted in the figure. However, in order to perform any meaningful context modelling, a clear scope is needed. Consequently, the information demand analysis starts with scoping activities.

Figure 1: An overview of the process of analysing information demand

Scoping is the process of defining the area of analysis and is done with the purpose of selecting the parts of an organisation to be subjected to analysis. This phase also includes the identification of the roles (individuals) relevant for the continued information demand analysis.
Scoping also sets the scene for identifying and understanding the organisation’s problems, goals, intentions, and expectations, which motivate it to engage in the information demand analysis.

Information Demand Context Modelling is mainly performed through participative activities such as joint modelling seminars, where the participants themselves are involved in the actual creation of the different models. This process is usually supported and facilitated by a method expert, who may be an internal or an external person. As illustrated in Figure 1, the conceptual focus in this phase is on information demand within a defined scope. The key to context modelling is to identify the interrelationships between roles, tasks, resources and information. No regard is given to the sequence of activities, resource availability, etc.

Information Demand Context Analysis and Evaluation: Once the necessary knowledge about the information demand contexts is obtained, it can be used for a number of different purposes. One purpose is evaluation, where different aspects of information demand can be evaluated in relation to roles, tasks, resources and information. It is also suitable to relate the results from the modelling session to the motivation and purposes expressed during the scoping activities. Focusing on information demand contexts provides only an initial view of information demand, without consideration of such aspects as individual competence, organisational expectations and requirements in terms of goals, processes etc. Depending on the intentions behind the analysis, further activities might be required. The method provides a number of method components supporting such activities. Since the main focus of the method presented here is on information demand, it utilises existing procedures and notations for such additional aspects rather than defining new ones.
Consequently, if the method user wishes to investigate such additional aspects of information demand, he or she can do so by using subsets of other methods, notations and languages.

3 Usability Evaluation Approach

This section describes the approach taken to arrive at a suitable definition of the term usability in the context of the IDA User Guide and outlines the proposed criteria for usability evaluation. In section 3.1, usability in the context of evaluating a handbook is defined. In section 3.2 the given definition is developed into evaluation criteria.

3.1 Approach to Defining Usability

In everyday use, the term usability is often reduced to the ease with which one can use a product. However, the dimensions of ease itself are so diverse that it is quite difficult to measure. Usability practitioners in the area of Human Computer Interaction have dealt with this situation since the early 1970s. During the course of usability research in the field of software engineering, back in 1994, usability practitioners complained that work on usability was too little, too late. Nigel Bevan believed that the reason for this complaint lay in a narrow view of the term usability: usability was often considered only in terms of ease of use, while the ability to fulfil an intended purpose was neglected [9, 10]. Owing to the experiences of fellow usability researchers from the field of information systems, we are aware of these important aspects of usability and therefore take them into consideration in our initial work on evaluating the usability of the IDA User Guide. Recognizing the different contexts of software engineering and information demand analysis, we determine a suitable definition of usability for this work by interpreting usability based on its context of use.
To the best of our knowledge there is no previous research on defining usability for the practical use of a method, which makes this approach necessary. Here we consider a definition of usability from a usability practitioner and examine how usability is interpreted by the International Organization for Standardization. Jakob Nielsen defined usability by associating it with the following five attributes [11]:

• Learnability: The system should be easy to learn so that the user can rapidly start getting some work done with the system.
• Efficiency: The system should be efficient to use, so that once the user has learned the system, a high level of productivity is possible.
• Memorability: The system should be easy to remember, so that the casual user is able to return to the system after some period of not having used it, without having to learn everything all over again.
• Errors: The system should have a low error rate … catastrophic errors must not occur.
• Satisfaction: The system should be pleasant to use, so that users are subjectively satisfied when using it.

The International Organization for Standardization issued ISO 9126 from 2001 to 2003 as a series of standards addressing software quality and replaced it with ISO/IEC 25010 in 2011. To understand the implicit value of usability beyond the applied categorization, we consider both releases of the standard. ISO 9126 constituted a software quality model which divides software quality into six general categories of characteristics: functionality, reliability, usability, efficiency, maintainability and portability. There, the definition of usability is given as “the capability of the software product to be understood, learned and liked by the user, when used under specified conditions” [12].
In the final draft ISO/IEC FDIS 25010, the quality-in-use model comprises five characteristics. The three that are relevant for our purposes are presented here with their definitions:

• Effectiveness: accuracy and completeness with which users achieve specified goals.
• Efficiency: resources expended in relation to the accuracy and completeness with which users achieve goals.
• Satisfaction: degree to which user needs are satisfied when a product or system is used in a specified context of use.

The remaining two characteristics are Freedom from risk and Context coverage. Effectiveness and Efficiency are not further divided into sub-characteristics, as Satisfaction is. However, we believe that among the given sub-characteristics of Satisfaction, Usefulness is the only one applicable to our context. It is interpreted as the “degree to which a user is satisfied with their perceived achievement of pragmatic goals, including the results of use and the consequences of use” [13]. The concept of usability was retained in the development of the standard: the characteristic Usability is categorized as one of the characteristics of the second quality model, the product quality model. It is stated as the “degree to which a product or system can be used by specified users to achieve specified goals with effectiveness, efficiency, and satisfaction in a specified context of use”, with the sub-characteristics appropriateness recognisability, learnability, operability, user error protection and user interface aesthetics [13]. In both releases, usability is associated with learnability, a specified user, a specified goal and specified conditions or a specified context of use. In addition, the presented quality-in-use model reflects a human-factor approach, in the sense that the system should fulfil the needs of the user. Both aspects seem to have remained the backbone of the definition of usability over the past two decades.
These aspects of usability are now assessed with respect to their compliance with the nature of the handbook and the context of information demand analysis to which the handbook refers.

1. Learnability: the system should be easy to learn so that the user can rapidly start getting some work done with the system. As stated in the outline of the handbook, section Background, “a method of information demand analysis therefore has to include all relevant aspects of enterprises, such as work tasks, organisational structures, processes, information sources and information receivers. This handbook describes exactly such a method in terms of content, structure, and use.” [8] Consequently, a holistic understanding of the content, structure, and use of the method is very important. Moreover, learning a concept, which is intangible unlike a visualized program, requires a very clear explanation and an understandable description from the provider of the concept. Taking these matters into consideration, the comprehensibility of the handbook is an essential aspect of its usability. Therefore, the term learnability is replaced with the term comprehensibility, interpreted as follows: the handbook should be easy to understand so that the user can quickly gain an overview of the method and finish the preparation for the analysis in a timely manner.

2. Specified user. In the section Purpose of the Handbook, three different types of readers the handbook is intended for are named. This differentiation is nonetheless not a classification based on competency requirements in the area of information demand analysis; rather, it serves as information about the particular intentions the handbook is able to facilitate. Since the IDA method itself is “user-intensive”, the aspect of user should be adopted in the concept of usability of the handbook.
However, the term “specified” is not appropriate in this case, due to the abovementioned view, and is therefore dropped.

3. Specified goal. Accommodating different types of user requires the handbook to accommodate their different goals, which can be found within the stated purpose of the handbook. The specified goal refers to that of the user in using the handbook according to its stated purpose. Based on this view, the aspect of a specified goal is relevant for the concept of usability of this handbook.

4. Specified context of use. In the section The Purpose of a Method for Information Demand Analysis, the handbook describes the context in which information demand analysis needs to be done. Hereby, the context of use for the handbook is already specified, and a specified context of use is not relevant for the usability of the handbook.

5. Effectiveness, in the sense of accuracy and completeness with which users achieve specified goals. The specific purpose of this handbook is to facilitate the information demand analysis. Consequently, the focus of the handbook is placed on the pragmatic aspects of information demand. Owing to this purpose, the accuracy and completeness of the explanations, directions and references in the handbook is very important. Although somewhat subjective, it is clearly relevant for the evaluation of quality of use.

6. Efficiency, as in resources expended in relation to the accuracy and completeness with which users achieve goals. The amount of resources required for applying the handbook depends on various factors, among them the user profile, the perceived problem, and the context of use. Even if the resources used can be recorded and the procedures repeated, there are still more variable parameters than fixed ones in the application of the handbook. Further specialization in the area of implementation would probably be necessary before integrating this aspect into the usability of the handbook.
Hence, we find this aspect irrelevant for the consideration of the current usability of the handbook.

7. Usefulness, with respect to the degree to which a user is satisfied with their perceived achievement of pragmatic goals, including the results of use and the consequences of use. The practical steps found in the fourth section of the handbook are categorized into prerequisites, activities and expected results. The expectations and pragmatic goals of the user during each of these steps can hereby be evaluated with regard to the usefulness of the prerequisites, activities and expected results of each phase. Therefore, usefulness is a relevant attribute for the concept of usability.

Based on the above analysis, we define the usability of the handbook for Information Demand Analysis as follows: the quality of the handbook to be used with comprehensibility to achieve the specified goal with accuracy, completeness and usefulness for the user.

3.2 Evaluation Criteria

Using the building elements of the definition given above, we propose criteria for evaluating the usability of the handbook.

Attributes. Among the said elements there are four attributes which are directly related to the quality of use of the handbook:

• Comprehensibility: the handbook should be easy to understand so that the user can quickly gain an overview of the method and finish the preparation for the analysis in a timely manner.
• Accuracy: the essential point is communicated in the right section, and each section presents its contents according to the correct priorities.
• Completeness: all the essential features are available, and the pertaining explanatory elements are likewise obtainable.
• Usefulness: the conceptual as well as practical descriptions available in the handbook not only meet the user’s partial expectations but also accommodate the user’s pragmatic goals.
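The four attributes above can be organised into a small evaluation rubric that records qualitative findings per attribute. This is an illustrative sketch under our own assumptions (the paper deliberately defines no measurement scale, so the rubric stores free-text notes only; all names are hypothetical):

```python
from dataclasses import dataclass

# The four attributes from the usability definition derived above
ATTRIBUTES = ("comprehensibility", "accuracy", "completeness", "usefulness")

@dataclass
class Finding:
    attribute: str  # must be one of ATTRIBUTES
    section: str    # handbook section the observation refers to
    note: str       # qualitative observation; no numeric scale is defined

    def __post_init__(self) -> None:
        if self.attribute not in ATTRIBUTES:
            raise ValueError(f"unknown usability attribute: {self.attribute}")

# Hypothetical findings as they might be recorded during an evaluation
findings = [
    Finding("comprehensibility", "Background", "method overview easy to follow"),
    Finding("completeness", "Scoping", "guidance on mission statements lacks detail"),
]
by_attribute = {a: [f for f in findings if f.attribute == a] for a in ATTRIBUTES}
```

Grouping findings by attribute mirrors how section 4 reports problems per phase while the evaluation criteria remain attribute-oriented.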
Selected criteria. Evaluating these attributes requires certain criteria or parameters for the analysis. There are certainly many facets pertaining to these attributes, but for this work on this particular handbook we limit our list of criteria. Based on our understanding of the handbook, we map the abovementioned attributes onto the following aspects:

• Wording. This is one significant factor of comprehensibility in an expository text such as a handbook.
• Contextual and visual illustration of the concept. Like wording, this is an influential factor of comprehensibility.
• The stated heading of each section and the main idea presented in the respective section are coherent with each other. This serves as an aspect for evaluating the accuracy of the handbook.
• No parts of ideas, important notes or descriptions pertaining to the existing ones are missing. This is an aspect of completeness.
• The results of use after each phase meet the user’s expectations and his or her goal. This is an aspect of usefulness.

We have proposed several attributes and selected criteria to evaluate the IDA User Guide. Due to the explorative character of this work, we did not develop a measurement scale; this would be a further step towards applying usability concepts to a handbook.

4 Case Study and Evaluation of IDA User Guide

Most research projects conducted at universities are financed by external funders, who decide whether or not to grant funding based on the competitiveness of the submitted proposal for the respective project. The process of writing this proposal involves a great deal of information to be processed and procedures to be conducted by various areas of responsibility within the university. The information demand within this information-intensive process of proposal writing is to be identified with the method described in the handbook for information demand analysis.
Our customer is the head of the proposal writing team. Interviews were conducted with the head of the proposal writing team, the financial administrator of the institution and the content manager. Furthermore, the role of the method user is defined as the method-applying person, who also conducts the interviews. Problems that arose during the application of the handbook were recorded along with the evaluation of the usability criteria. The following sections are structured according to the process of analysing information demand. Within the sections, the stated activities (proposed by the IDA User Guide), the performed activities and the problems which occurred are listed.

4.1 Scoping Phase

Stated activities
• Understanding and identifying the perceived problem.
• Defining the problematic area and the actors within it.

Activities performed
• Orientation interview with the head of the team about the process of proposal writing, to get a basic understanding of the process.
• Prepared a mission statement with input from the interview.
• Review of the mission statement by the head of the team.

Problems
• Difficulties in describing the mission statement, especially the description of the expected results and effects of the assignment, due to a misunderstanding.

4.2 Information Demand Context Modelling Phase

Stated activities
• Short semi-structured interviews with each of the selected individuals in order to get an understanding of their expectations, goals and problems.
• Modelling seminars to collect the needed information for further work, with the aim of getting the participants to identify the different tasks performed within the scope and how they relate to the information which is needed or used, grouped by role. Typically, this is done by writing down different tasks and information objects on pieces of paper, attaching them to large paper or plastic sheets and then connecting them to each other.
The reasoning and motivation behind this approach is that it allows easy restructuring and changes as the session progresses. An additional benefit of such a non-technical approach is that it is easier to get the participants to contribute to the emerging model than it would be if they were all seated around a table watching the model emerge on the facilitator’s computer screen.

Activities performed
• The first inquiry session was conducted with the head of the team. It lasted 15 minutes longer than the estimated duration of 60 minutes.
• For the session with the financial administrator, a sample model based on the illustrated sample case was prepared with the same modelling tool that would be used for the interview. This sample was shown to the informant at the beginning of the session, to give the informant a better picture of what the result of the session was supposed to be. The sample was in German to accommodate the informant’s preferred language for the session. The duration of this session was exactly as estimated.
• For the session with the content manager, the sample model was again shown to the informant at the beginning of the session. In spite of the use of the German sample, the language of the session was English. This session also finished within the estimated time.

Problems
• During the first session there were problems classifying the statements and responsibilities given by the informants.
• Regarding all three sessions: the first stated purpose was not fulfilled. The method user did not come to understand the character and personality of the individuals, and had difficulties identifying the information within the information situation.

4.3 Analysis and Documentation of Information Demand Phase

Stated activities
• Transcription of the initial models into a well-defined notation.
• During the transcription of the models, correlations between the information flow in the current case and information flows in previous cases can sometimes be identified. If an identified situation within the models is considered similar to already identified and solved situations from other cases, it is reasonable to assume that the solutions applicable in that situation are also applicable in the current one. If no corresponding pattern can be found in the pattern collection, despite the situation having been identified in a number of different cases, it is reasonable to assume that a candidate for a new and valid pattern has been identified.

Activities performed
• The models from the first interview and the modelling sessions still lacked connectors for most of the relationships. These three models were then analysed using the information from the documented interviews.
• One new overall model was constructed out of all three, bilingually.
• To fill the gaps and get confirmation of the developed model, a second meeting with each informant was scheduled, which led to their approval of the model. Furthermore, a similar information demand pattern addressing the process of proposal writing was found in the collection of patterns. Based on this pattern, the approved context graph was then digitalized.

Problems
• During transcription, some flows were unclear because of the different treatment of the types of information. This should be described more clearly in the handbook.

5 Conclusions and Future Work

As shown in section 3, the concept of usability was applied when evaluating a method handbook. The starting points were definitions of usability given by practitioners and by the International Organization for Standardization. We extracted and analysed the relevant terms from these definitions.
As a main contribution, an adapted definition of usability for the Information Demand Analysis User Guide is proposed: the quality of the handbook to be used with comprehensibility to achieve the specified goal with accuracy, completeness and usefulness for the user. Furthermore, we proposed evaluation criteria for the attributes given in the definition. Due to the explorative character of this work, we did not develop a measurement scale; this will be a further step towards applying usability concepts to a handbook.

In section 4 we applied this approach in a real-world case. The quality of use was evaluated by applying the handbook in an information-intensive process, the process of proposal writing at a university. This led to helpful insights and practical guidance for the internal process. Through its application, the information situation within the process of proposal writing became clearer, and suggestions for solving problems related to information demand could be given. By using the definition of usability developed in this work, a continuation of the evaluation process in the future is possible. Although the method handbook meets the usability requirements in most aspects, as another contribution some recommendations for improving the IDA method handbook with respect to usability could be given:

• Important notes on the procedures for asking questions during the interviews.
• Additional explanation of the contextual use of identifying and differentiating.

This work attempted to apply one possible concept – usability – to improve the quality of a method handbook. As a further approach, knowledge from cognitive psychology could be integrated as well. The results of this usability evaluation may also contribute to the acceptance of the Information Demand Analysis method in networked organizations and to the improvement of its quality.

References

1. Brinkkemper, S.: Method engineering: engineering of information systems development methods and tools. Information and Software Technology 38, 275–280 (1996).
2. Mirbel, I., Ralyté, J.: Situational method engineering: combining assembly-based and roadmap-driven approaches. Requirements Engineering 11, 58–78 (2005).
3. Goldkuhl, G., Lind, M., Seigerroth, U.: Method integration: the need for a learning perspective. IEE Proceedings – Software 145, 113–118 (1998).
4. Krogstie, J.: Model-Based Development and Evolution of Information Systems. Springer Verlag (2012).
5. Nyberg, C., Wass, S.: Does IDA meet the requirements? Växjö University (2009).
6. Öhgren, A., Sandkuhl, K.: Information Overload in Industrial Enterprises – Results of an Empirical Investigation. In: Proceedings ECIME 2008, London (2008).
7. Lundqvist, M., Sandkuhl, K., Seigerroth, U.: Modelling Information Demand in an Enterprise Context: Method, Notation, and Lessons Learned. International Journal of Information System Modelling and Design (IJSMD) 2, 75–94 (2011).
8. Lundqvist, M.: Information Demand and Use: Improving Information Flow within Small-scale Business Contexts (2007).
9. Shackel, B., Richardson, S.J.: Human Factors for Informatics Usability. Cambridge University Press (1991).
10. Bevan, N.: Measuring usability as quality of use. Software Quality Journal 4, 115–150 (1995).
11. Nielsen, J.: Usability Engineering. Academic Press (1993).
12. Abran, A., Khelifi, A., Suryn, W.: Usability Meanings and Interpretations in ISO Standards. Software Quality Journal 11, 325–338 (2003).
13. ISO/IEC FDIS 25010: Systems and Software Engineering – Systems and Software Quality Requirements and Evaluation – Systems and Software Quality Models (2010).