Design of Knowledge Analytics Tools for Workplace Learning*

Maria A. Schett (1), Stefan Thalmann (2), and Ronald K. Maier (2)

(1) University of Innsbruck, Faculty of Mathematics, Computer Science & Physics, Department of Computer Science
(2) University of Innsbruck, School of Management, Department of Information Systems, Production & Logistics Management, Information Systems I

mail@maria-a-schett.net, {stefan.thalmann, ronald.maier}@uibk.ac.at

* The research leading to the presented results was partially funded by the European Commission under the 7th Framework Programme (FP7), Integrating Project LEARNING LAYERS (Project no. 318209).

Abstract. The amount of documented organizational knowledge is steadily increasing, as is the amount of knowledge available from external sources. At the same time, the need for innovation at the workplace is growing, which poses the challenge of supporting employees' workplace learning. Knowledge analytics seems to be a promising approach to help employees sift through piles of documents and select knowledge suitable for their learning at the workplace. However, little is known about the requirements for knowledge analytics in general and in the context of workplace learning in particular. Therefore, we developed a knowledge analytics tool and applied it in a case study. Within this case study, we performed seven artifact-driven expert interviews to elicit the requirements for a knowledge analytics tool. Based on our investigation we developed three candidate design patterns: (1) provenance and traceability, (2) human factor and stakeholder rating, and (3) visualization of the proposed solution. Our design patterns can be used to inform the design of knowledge analytics tools, particularly in the context of workplace learning.

Keywords: knowledge analytics, workplace learning, design patterns

1 Introduction

The increasing amount of organizational knowledge and the increasing need for workplace learning pose a difficult challenge to organizations: How can an educational program manager provide suitable knowledge to employees? How can a content developer select those contents in a large digital library that are best suited for adaptation for workplace learning? In this paper we present the findings of a case study conducted to answer these questions. The goal of this paper is to develop candidate design patterns for a knowledge analytics tool used for workplace learning. Our case study was performed in the context of the EU FP7 Learning Layers (LL) project (see learning-layers.eu). We implemented a knowledge analytics tool to recommend contents for adaptive preparation in the context of workplace learning and discussed the recommendations proposed by the tool with seven experts in artifact-driven, semi-structured telephone interviews. Based on these interviews, we developed three candidate design patterns, which capture context, problem, and solution and are intended to inform the design of functionality for knowledge analytics tools.

2 Background

Informal learning is seen as the most important way to acquire and develop skills and competencies within the workplace [1]. Workplace learning is nested in everyday problem-solving situations, where people learn through mistakes and interactions with colleagues as well as by learning from others' personal experiences [3]. Workplace learning is increasingly promoted because of changes in work organizations and the appearance of new types of management [6].
The important interplay of the informal and the social characteristics is further emerging in research on learning in the workplace [8]. Mobile devices enable access to documented knowledge (from inside and outside the organization) as well as interactions with colleagues, and can thus foster workplace learning [19]. Advancing scalable learning solutions could enable workplace learning with such devices from a variety of locations [23]. Scalability particularly depends on user acceptance; hence, the solution needs to be assimilated into the daily learning practices of a critical mass of users [18]. Compared to more traditional learning settings, unstructured, creative, and expertise-driven informal learning cannot be designed with standardized management approaches and cannot easily be supported by IT [13]. Hence, many more contents, prepared for more diverse learner needs, are required, which calls for new ways of IT support. In this regard, knowledge analytics seems promising to cope with the increasing complexity.

Analytics is used in different settings, e.g., business analytics [2], learning analytics [10], or academic analytics [7], and we aim to carry the analogy of analytics over to documented organizational knowledge: knowledge analytics. The term analytics has many facets: it features data-driven decision making [22], using mathematical techniques to analyze data [24] to drive fact-based planning, decisions, execution, management, measurement, and learning [9]. The aim of analytics is to develop actionable insights, which carry the potential for practical action [4]. Analytics can be used to model the past, recommend in the present, and predict, optimize, and simulate the future [5]. Van Barneveld et al. [22] identify three trends for combining analytics with a keyword: (1) topic of interest, e.g., learning analytics if we are interested in learning, (2) intent of the activity, e.g., predictive analytics, and (3) object of analysis, e.g., analytics based on Google data (i.e., Google Analytics).

Following this categorization, we define knowledge analytics from perspective (3), object of analysis: analytics based on knowledge, as opposed to data. First, we define knowledge after Zack [25] as an organized accumulation of data enriched by context, and describe knowledge succinctly as "context and content". Then, we define knowledge analytics as analytics which uses knowledge as input to create value as output. In this paper we describe our knowledge analytics approach applied to a case study within the context of the LL project, following this definition: content/data and context/metadata are used as input for analytics to create value in the form of a proposed solution.

Content/Data. In the LL project case, 58 knowledge elements were created. They are stored as text documents, presentations, spreadsheets, videos, and wiki pages, and their topics cluster around theories of learning and knowledge. The goal was to use these knowledge elements to support workplace learning, taking the preferences of the project members into account.

Context/Metadata. These knowledge elements were rated; more specifically, factors reflecting the benefits of and the efforts associated with the knowledge elements were rated. Ratings of factors reflecting benefits were collected via an online survey of LL project members, the raters.
Ratings of factors reflecting efforts were collected from a technical and a domain authority and from the employees responsible for adapting the knowledge elements.

Analytics. Our analytics approach is the Knowledge Element Preparation model (KEP model) proposed by Thalmann [20]. The KEP model poses a linear 0/1 combinatorial optimization problem. We employed a general-purpose solver to implement the model in the KEP tool: we expressed the KEP model in GMPL, a subset of the modeling language AMPL, whose declarative, algebraic notation is close to the mathematical description of the KEP model, and solved it with the free GNU Linear Programming Kit. With the KEP tool, we computed a (KEP) proposed solution of knowledge elements best suited for preparation with respect to adaptation criteria, based on the collected ratings of factors reflecting benefits and factors reflecting efforts. We selected five adaptation criteria from [21] to make the knowledge elements more accessible to learners in situations of workplace learning: device requirements, didactical approach, language, presentation preferences, and previous knowledge. A sketch illustrating the shape of such a selection model is given after this walkthrough.

Value. The KEP proposed solution creates value in the form of decision support for a content developer or an educational program representative.
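To make the shape of such a model concrete, the following is a minimal, hypothetical sketch. The actual KEP model is specified in [20] and was expressed in GMPL in our implementation; here we re-express a 0/1 selection model of the same general kind in Python using the PuLP library instead. The element names, ratings, and effort budget are invented for illustration; only the overall structure (binary selection variables, a benefit objective, an effort constraint) mirrors the linear 0/1 problem described above.

    # Illustrative sketch only: the actual KEP model [20] was written in GMPL
    # and solved with the GNU Linear Programming Kit; this re-expresses a
    # 0/1 selection model of the same general shape in Python using PuLP.
    # Element names, ratings, and the effort budget are hypothetical.
    from pulp import LpProblem, LpVariable, LpMaximize, LpBinary, lpSum, value

    # Aggregated ratings per knowledge element (hypothetical values).
    benefit = {"ke01": 4.2, "ke02": 3.1, "ke03": 4.8, "ke04": 2.5}
    effort = {"ke01": 3.0, "ke02": 1.5, "ke03": 4.0, "ke04": 1.0}
    effort_budget = 6.0  # capacity available for preparing elements

    model = LpProblem("kep_sketch", LpMaximize)

    # One binary decision variable per knowledge element:
    # 1 = prepare this element for workplace learning, 0 = skip it.
    select = {k: LpVariable(f"select_{k}", cat=LpBinary) for k in benefit}

    # Objective: maximize the total rated benefit of the selected elements.
    model += lpSum(benefit[k] * select[k] for k in benefit)

    # Constraint: total preparation effort must stay within the budget.
    model += lpSum(effort[k] * select[k] for k in effort) <= effort_budget

    model.solve()
    proposed = [k for k in benefit if value(select[k]) == 1]
    print("Proposed solution:", proposed)  # here: ['ke01', 'ke02', 'ke04']

The real model differs in that ratings come from multiple stakeholders and cover five adaptation criteria rather than a single benefit and effort score, but the decision-support output, a selected subset of knowledge elements, has the same form.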
3 Procedure

After we had implemented the KEP model and applied the KEP tool to obtain the solution of knowledge elements to be prepared for workplace learning in the LL context, we conducted artifact-driven interviews, where the vehicles of our interviews were our artifacts: the KEP model, the KEP tool, and the KEP proposed solution. The goal of the interviews was to identify design patterns. We interviewed seven experts from the LL project; their demographics are given in Table 1.

ID    Length  Gender  Role in Project     Work Experience  Country of Origin
Ex01  40:35   M       Senior Researcher   > 10 years       Germany
Ex02  23:40   F       Senior Researcher   > 10 years       United Kingdom
Ex03  34:45   M       Senior Researcher   > 10 years       Luxembourg
Ex04  32:03   M       Workpackage Leader  > 10 years       United Kingdom
Ex05  22:27   F       Senior Researcher   5-10 years       Spain
Ex06  26:58   F       Workpackage Leader  > 10 years       Finland
Ex07  37:26   M       Researcher          5-10 years       Germany

Table 1. Demographics of interview partners.

The interviews were structured by an interview guideline with open-ended questions; the guideline and questions can be found in [12]. The interviews were motivated by the evaluation of the KEP model and the KEP tool and structured in three blocks: (i) evaluation of the factors of the KEP model (based on [16]) and the reasoning behind rating them, (ii) investigation of the KEP proposed solution, and (iii) an outlook with requirements on the graphical user interface (GUI). As we performed the interviews remotely, we used visual aids through screen sharing: slides presenting the KEP approach and a spreadsheet showing the KEP proposed solution. We presented this spreadsheet and collected the experts' opinions. The interviews took between 22 and 41 minutes and were transcribed verbatim and double-checked. We then analyzed the transcripts with a qualitative content analysis after Mayring [14], using deductive codes derived from the research design. The results of the interviews are given in Section 4.

From the results we generated three candidate design patterns. Design patterns communicate high-level, proven solutions to recurring problems and can be seen as artifacts of design science research [17]. On an abstract level, design patterns follow the structure: "for problem P under circumstance C solution S has been known to work" [15]. Design patterns can be valuable for practitioners, as they describe practical and applicable solutions, and for researchers, to synthesize and capture knowledge as well as to provide further research directions. In our work we developed candidate design patterns, which are discovered from experience and knowledge; they are titled candidate because they still need to be validated [17]. We identified the three relevant candidate design patterns by discussing the reflections and suggestions given by the interviewees iteratively in three sessions within the group of co-authors (compare the participatory pattern workshop methodology [15]). The patterns follow the structure of the design patterns of Mor et al. [15] and the LL project [11].

4 Results

We present the results from the artifact-driven interviews structured into three parts: the KEP proposed solution, the rating procedure, and requirements on the graphical user interface.

KEP proposed solution. Four experts (Ex01, Ex02, Ex05, and Ex07) saw the proposed solution as a good way to provide a summary, an overview, and a starting point. However, three experts were not satisfied. Ex03 noted "everything fits to almost everything [..] there is no consistency". Another expert stated that the solution proposed by the KEP tool misses "the core debate from [his] work package perspective, which is a bit of a shame" (Ex04). Also, Ex06 "noticed that there was nothing that was created [by her] work package". To clarify: all accessible knowledge elements were included; however, knowledge elements were perceived as missing because they were not part of the KEP proposed solution or were not recognized. Overall, the interviewees were not fully satisfied with the recommendations of the KEP tool, as they did not understand the logic behind them. The analytics model was too complex, and thus it was not clear to them how their individual input was considered.

Rating procedure. The experts employed a broad range of approaches to provide input for the KEP tool. Experts provided input based on their personal experience, and they expected their input to be reflected somehow in the collective rating procedure. Particularly the collective way of gathering input data for the knowledge analytics approach was considered beneficial. One expert was "happy doing the collaborative rating [and finds] it is important [..] for the project to collect this kind of data" (Ex05). However, two experts also identified challenges of the collective procedure: "you have got people like [A] defending [Topic A], [him] defending [Topic B], [C] defending [Topic C] and [D] defending [Topic D], and that's [..] to mention a few" (Ex04). Also, Ex06 thinks that the tool could work in a context where "there is not so much of this, let's say, social rules on the play". Hence, the interviewees highlighted the need to collect ratings for knowledge analytics from different stakeholders, but also emphasized some challenges.

Graphical user interface. Two experts suggested tagging the knowledge elements with keywords or graphic icons. The interviewees also suggested to "have the link between the knowledge element and the corresponding result" (Ex05). Two experts asked for additional information concerning key measures in the recommendation: "something like a benefit-effort ratio [..] some key measure" (Ex07), and "of course [what would] be interesting here is the benefits [because] what's the link now between the benefits and knowledge element and the adaptation criterion?" (Ex03). Another expert suggested "more aggregate views on the results [and to] slice-and-dice results in a way" (Ex03). He said that the data "feels a bit raw" and suggested some "quality aggregators", some kind of "red-green-orange traffic light thing", which would "give you an idea of whether the outcome of it is clear cut". Hence, our interviewees highlighted the importance of useful and meaningful representations of the results of a knowledge analytics tool. In particular, they considered graphical associations as well as aggregated numbers and figures very important. One way such key measures and quality aggregators could be realized is sketched below.
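To illustrate the kind of "key measure" and "traffic light" aggregation the experts asked for, the following hypothetical sketch (not part of the KEP tool) computes a benefit-effort ratio per knowledge element and maps it to a red/orange/green label indicating how clear cut the recommendation is. The ratings and thresholds are invented for illustration.

    # Hypothetical sketch of the "benefit-effort ratio" and "traffic light"
    # aggregators suggested by the interviewees; not part of the KEP tool.
    # Ratings and thresholds are invented for illustration.

    def benefit_effort_ratio(benefit: float, effort: float) -> float:
        """Key measure: rated benefit per unit of preparation effort."""
        return benefit / effort if effort > 0 else float("inf")

    def traffic_light(ratio: float, green: float = 2.0,
                      orange: float = 1.0) -> str:
        """Quality aggregator: is the recommendation clear cut?"""
        if ratio >= green:
            return "green"   # benefit clearly outweighs effort
        if ratio >= orange:
            return "orange"  # borderline; worth a closer look
        return "red"         # effort likely outweighs benefit

    # Aggregated (benefit, effort) ratings per recommended element.
    ratings = {"ke01": (4.2, 3.0), "ke02": (3.1, 1.5), "ke04": (2.5, 1.0)}
    for element, (benefit, effort) in ratings.items():
        ratio = benefit_effort_ratio(benefit, effort)
        print(f"{element}: ratio={ratio:.2f} -> {traffic_light(ratio)}")
    # ke01: ratio=1.40 -> orange
    # ke02: ratio=2.07 -> green
    # ke04: ratio=2.50 -> green

Such a per-element label could be shown next to each recommended knowledge element in the GUI, giving users the aggregate view the experts requested without exposing the full optimization model.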
5 Candidate Design Patterns

The first candidate design pattern is generalized from the expert statements on the KEP proposed solution (cf. Section 4).

Candidate design pattern (1): "Provenance and Traceability"
Context. The knowledge analytics tool computes a recommendation on the basis of various inputs (i.e., the ratings). Thereby, the complexity of the proposed solution is very high.
Problem. The users do not accept the proposed solution because they do not understand why this recommendation was proposed. They would intuitively select other recommendations, or they do not understand how their own and others' ratings were considered by the knowledge analytics tool.
Solution. The proposed solution and the major reasoning behind it are presented in a suitable way. By showing sub-factors and further details on demand, the analytics approach becomes more transparent. Explanations for the proposed solution can then be presented to the end users, which presumably will increase their acceptance of the proposed solution.

The provision of provenance and traceability is important for knowledge analytics because knowledge is a less straightforward object for analytics than data. As a consequence, the tool's suggestions are more difficult for users of knowledge analytics tools to understand. Therefore, following [25], we have to provide both in knowledge analytics approaches: content and context, where context is built through provenance and traceability.

The second candidate design pattern is based on the experts' reflections on the rating procedure (cf. Section 4).

Candidate design pattern (2): "Human Factor and Stakeholder Rating"
Context. Several users provide ratings, which reflect their individual knowledge and understanding. Therefore, the ratings stem from several different points of view and backgrounds.
Problem. The ratings are crucial for applying the knowledge analytics tool, concretely for computing the proposed solution. Thus, the ratings should not reflect only one individual perspective; rather, they should reflect all relevant ratings.
Solution. The knowledge analytics tool can distinguish user groups and their perspectives on the ratings. It supports a collective approach to rating and the aggregation of ratings, and it supports the building of consensus. If no consensus can be reached, the tool offers to split the ratings into different user groups.

Stakeholder involvement and collective rating are considered crucial for knowledge analytics approaches. As one expert puts it: everyone who has provided a rating has "their own insight knowledge" (Ex06). The integration of this knowledge and the building of shared mental models should be fostered [23]. A sketch of one possible group-aware aggregation follows.
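As one possible reading of this pattern's solution, the following hypothetical sketch aggregates ratings per stakeholder group, checks a simple disagreement threshold, and falls back to per-group ratings when no consensus is reached. The group names, ratings, and threshold are invented; the pattern itself does not prescribe a concrete aggregation rule.

    # Hypothetical sketch of the pattern's solution: aggregate ratings per
    # stakeholder group and keep the groups separate when they disagree.
    # Group names, ratings, and the threshold are invented for illustration.
    from statistics import mean

    def aggregate(ratings_by_group: dict[str, list[float]],
                  max_spread: float = 1.0) -> dict[str, float]:
        """Return one consensus rating, or one rating per group if the
        group means are spread wider than max_spread (no consensus)."""
        group_means = {g: mean(r) for g, r in ratings_by_group.items()}
        spread = max(group_means.values()) - min(group_means.values())
        if spread <= max_spread:  # consensus reached: single shared rating
            return {"all": mean(group_means.values())}
        return group_means        # no consensus: split into user groups

    # Ratings of one knowledge element by three stakeholder groups.
    ratings = {
        "researchers":        [4.0, 4.5, 4.0],
        "content_developers": [2.0, 2.5],
        "domain_experts":     [3.5],
    }
    print(aggregate(ratings))
    # Spread of group means is about 1.92 > 1.0, so no consensus:
    # one rating per group is returned instead of a single shared value.

Keeping the per-group ratings visible, rather than averaging them away, preserves the different stakeholder perspectives that the interviewees considered essential.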
The final candidate design pattern is identified from the statements with respect to the graphical user interface (cf. Section 4).

Candidate design pattern (3): "Visualization of the Proposed Solution"
Context. The proposed solution is presented to the users in a spreadsheet which lists the recommended preparation tasks with the selected knowledge elements.
Problem. The visualization of the proposed solution in the spreadsheet was very data-oriented and simple, rather than designed according to the needs of the users. To them, the proposed solution seems raw and clunky.
Solution. The graphical user interface is directed towards the user. It enables the user to customize the interface and to switch between different views on the proposed solution. Moreover, the user interface enables the user to explore the proposed solution in detail.

Our interviewees demanded functionality that enables them to explore the proposed solution according to their own interests, preferences, and needs.

6 Conclusion

Our presented work is one step towards the goal of supporting educational content developers in selecting knowledge elements from a large digital library. To achieve this goal, we applied our knowledge analytics approach for workplace learning in the case of the EU FP7 Learning Layers project. We leveraged the KEP model and its prototype implementation, the KEP tool, to compute recommendations, which we used to frame seven artifact-driven expert interviews. We analyzed the interviews qualitatively and, based on our findings, developed three candidate design patterns for knowledge analytics: (1) provenance and traceability, (2) human factor and stakeholder rating, and (3) visualization of the proposed solution. The next step is to ground the patterns in theories that explain the effects the solutions are intended to create, and to implement the functionality from our candidate design patterns in the KEP tool in order to validate the patterns.

References

1. Boud, D., Middleton, H.: Learning from others at work: communities of practice and informal learning. Journal of Workplace Learning 15(5), 194-202 (2003)
2. Chen, H., Chiang, R.H., Storey, V.C.: Business intelligence and analytics: From big data to big impact. MIS Quarterly 36(4), 1165-1188 (2012)
3. Collin, K.: Connecting work and learning: design engineers' learning at work. Journal of Workplace Learning 18(7/8), 403-413 (2006)
4. Cooper, A.: What is analytics? Definition and essential characteristics. CETIS Analytics Series 1(5), 1-10 (2012)
5. Davenport, T., Harris, J., Morison, R.: Analytics at Work: Smarter Decisions, Better Results. Harvard Business Press (2010)
6. Garrick, J.: Informal Learning in the Workplace: Unmasking Human Resource Development. Psychology Press (1998)
7. Goldstein, P.J., Katz, R.N.: Academic analytics: The uses of management information and technology in higher education. Tech. rep., EDUCAUSE Center for Analysis and Research (2005)
8. Hart, J.: Social Learning Handbook. Centre for Learning & Performance Technologies (2011)
9. Kiron, D., Shockley, R., Kruschwitz, N., Finch, G., Haydock, M.: Analytics: The widening divide. MIT Sloan Management Review 53(2), 1 (2012)
10. Proceedings of the 1st International Conference on Learning Analytics and Knowledge. ACM (2011)
11. Learning Layers D2.3: Tools for networked scaffolding in Layers ecosystem. Available at learning-layers.eu/deliverables/ (2015)
12. Learning Layers D3.2: Layers tools for creating and maturing instructional material. Available at learning-layers.eu/deliverables/ (2014)
13. Maier, R., Thalmann, S.: Using personas for designing knowledge and learning services: results of an ethnographically informed study. International Journal of Technology Enhanced Learning 2(1-2), 58-74 (2010)
14. Mayring, P.: Qualitative Content Analysis: Theoretical Foundation, Basic Procedures and Software Solution. Klagenfurt (2014)
15. Mor, Y., Mellar, H., Warburton, S., Winters, N.: Practical Design Patterns for Teaching and Learning with Technology, chap. Introduction: Using Design Patterns to Develop and Share Effective Practice, pp. 1-11. Springer (2014)
16. Parboteeah, P., Jackson, T.W.: Expert evaluation study of an autopoietic model of knowledge. Journal of Knowledge Management 15(4), 688-699 (2011)
17. Petter, S., Khazanchi, D., Murphy, J.D.: A design science based evaluation framework for patterns. ACM SIGMIS Database 41(3), 9-26 (2010)
18. Pirkkalainen, H., Thalmann, S., Pawlowski, J., Bick, M., Holtkamp, P., Ha, K.H.: Internationalization processes for open educational resources. In: Workshop on Competencies for the Globalization of Information Systems in Knowledge-Intensive Settings (2010)
19. Schäper, S., Thalmann, S.: Addressing challenges for informal learning in networks of organizations. In: 23rd European Conference on Information Systems (2015)
20. Thalmann, S.: Decision Support Framework for Selecting Techniques to Prepare Knowledge Elements for Adaptive Use. Ph.D. thesis, University of Innsbruck (2012)
21. Thalmann, S.: Adaptation criteria for the personalized delivery of learning materials: A multi-stage empirical investigation. Australasian Journal of Educational Technology 30(1), 45-60 (2014)
22. Van Barneveld, A., Arnold, K.E., Campbell, J.P.: Analytics in higher education: Establishing a common language. EDUCAUSE Learning Initiative 1, 1-11 (2012)
23. Wang, M., Shen, R.: Message design for mobile learning: Learning theories, human cognition and design principles. British Journal of Educational Technology 43(4), 561-575 (2012)
24. Watson, H.J.: Business analytics insight: hype or here to stay? Business Intelligence Journal 16(1), 4-8 (2011)
25. Zack, M.H.: Managing codified knowledge. MIT Sloan Management Review 40(4), 45 (1999)