Learning Scorecard: monitor and foster student learning through gamification

Elsa Cardoso¹, Diogo Santos², Daniela Costa², Filipe Caçador², António Antunes², Rita Ramos²
¹ University Institute of Lisbon (ISCTE-IUL) and INESC-ID, Lisbon, Portugal
elsa.cardoso@iscte.pt
² University Institute of Lisbon (ISCTE-IUL), Lisbon, Portugal
{diogo_leo_santos, dscaa1, filipe_cacador, antonio_lorvao, rita_parada}@iscte.pt

Abstract. This paper presents the Learning Scorecard (LS), a platform that enables students to monitor their learning progress in a Higher Education course during the semester, generating the data that will also support the ongoing supervision of the class performance by the course coordinator. The LS uses gamification techniques to increase student engagement with the course. Business Intelligence best practices are also applied to provide an analytical environment for students and faculty to monitor course performance. This paper describes the initial design of the LS, based on a Balanced Scorecard approach, and the prototype version of the platform, currently in use by graduate and undergraduate students in the fall semester of 2016-2017.

Keywords: Balanced Scorecard. Business Intelligence. Student learning. Gamification.

1 Introduction

A recurrent problem in Higher Education is the lack of information about the progress of student learning in "real" time. Various statistics are calculated by Planning and Institutional Research offices, offering a post-hoc analysis of academic success for each semester (e.g., evaluated, approved, and retention rates). Current pedagogic guidelines also encourage course coordinators to clearly define a set of tasks that students should perform autonomously, in addition to the course classes (e.g., exercises to be solved, basic and complementary bibliography that should be read). However, there is still little institutional support provided to students and faculty regarding the monitoring of the student learning experience and the ongoing completion of autonomous work throughout the semester. On the one hand, students do not know if they are correctly performing the proposed autonomous work, which is supposedly "a route to success in the course". On the other hand, faculty members have no information regarding the real commitment of students to the learning experience, apart from their experience-based perception of the class behavior.

The Learning Scorecard (LS) is a platform that enables students to monitor their learning progress in a course during the semester, generating the data that will also support the ongoing supervision of the class performance by the course coordinator. The LS was initially developed by a group of students in the context of a course on Decision Support Systems (DSS) of the master program in Computer Science Engineering in the 2015-2016 spring semester, at ISCTE – University Institute of Lisbon, a public university in Lisbon, Portugal. The LS is a tool that helps students with the planning and monitoring of their learning experience in a course, using gamification and business intelligence techniques. Gamification was used to foster student interaction and positive competition. The LS was initially designed to support the learning of the Data Warehouse course, which is a core subject of the Computer Science Engineering and Informatics and Management programs. This is a very demanding course in terms of study hours and practical assignments; hence, time management is critical for student success.
Due to its characteristics, this course is a good case study to test the LS functionalities. This paper describes the initial design of the LS, based on a Balanced Scorecard approach, and the prototype version of the platform, currently in use by students in the fall semester of 2016-2017. The LS is presently the research subject of two master dissertations in Computer Science Engineering, and new, improved versions of the platform are scheduled to be released in the next two semesters.

2 Business intelligence in Higher Education

Business intelligence (BI) and analytics techniques are used for data-driven decision making. BI encompasses a "broad category of applications and technologies for gathering, storing, analyzing, sharing and providing access to data to help enterprise users make better business decisions" [1]. The ultimate goal of BI is to measure (i.e., the data-related component) in order to manage, enabling the continuous improvement of an organization or of a specific process. Hence, BI is deeply linked with performance management, requiring a positive and proactive type of management and leadership. An analytical mindset includes the use of data, different types of analysis (e.g., methods, approaches), and systematic reasoning to make decisions [2]. BI and analytics are widely used in the business context, as well as in Higher Education [3,4]. Learning analytics is a relatively recent research area [5], focusing on "the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs" [6].

The Balanced Scorecard (BSC) is a performance management system used to support strategic decisions. Originally developed by [7], the BSC has been successfully applied in many industries, including Higher Education (HE). Most BSC applications in HE are implemented at the institutional level, providing a performance management framework linked to the goals and strategic plans of HE institutions. There are many examples in the literature reporting the institutional use of the BSC in academia, predominantly in the United States [8,9,10,11] and the United Kingdom [12,13], but also in many other countries [14,15,16]. The use of the BSC approach to strategically manage academic programs and to support the learning process is less common. Recently, [17] discussed the design of a BSC to support student success, in particular for accounting students, enabling student engagement with the educational process as well as with the accounting profession. Other relevant examples are: [18], discussing the benefits and potential components of a BSC for an accounting program (US and Canada); [19], describing a case study in which the BSC was applied to the Master of Business, Entrepreneurship and Technology at the University of Waterloo (Canada); and [20], describing the design of strategy maps for program performance measurement in HE. The Learning Scorecard presented in this paper, designed according to the best practices of BI and BSC systems development, aims to measure and manage the performance and quality of the learning process. Since the LS goal is first and foremost to support students in their learning experience, it is also a valid application of learning analytics.

3 Gamification in Higher Education

Gamification is defined as the use of game design elements in non-game contexts [21].
Game elements are artifacts regularly used in game design, such as points, levels, quests or challenges, avatars, social graphs, leaderboards, badges, and rewards. Gamification, albeit a recent trend, has been applied in several non-game contexts, such as productivity, finance, health, sustainability, and also education [21,22]. When using gamification, the designer should keep in mind the following aspects: (1) games are to be experienced voluntarily; (2) games should involve learning or problem solving; and (3) games should have some structure (i.e., game rules), but the gamer should have the freedom to explore and have fun. Barata et al. [23] present an interesting approach to gamification in education, applied to a master course in the Information Systems and Computer Engineering program. In their experiment, they added a set of game elements (e.g., points, levels, leaderboards, challenges and badges) to the traditional course, and compared the impact of the introduced gamification on student performance and satisfaction. After a period of two consecutive years, results were very positive, with increased student performance in terms of class attendance, number of lecture slide downloads, and number of posts on the course's forums [23]. This experiment was inspirational to the design of the Learning Scorecard, given that the institutional context and student profile are similar. That is, both projects were carried out in public universities in Lisbon, Portugal, with students enrolled in similar programs (i.e., Computer Science Engineering master programs; although the LS is also being tested by undergraduate students of Informatics and Management).

4 The Learning Scorecard

Effective time management is pivotal for student academic success. Higher Education students need to reconcile their personal and often professional responsibilities with their academic ambitions. Poor planning of tasks, in terms of effort and scheduling, often results in failing a course or achieving a lower grade than expected. In this context, the use of a strategic management tool – like the Learning Scorecard – customized by the course's faculty can provide valuable support to students. The LS enables the formulation of strategic objectives, performance indicators and targets that provide students with a baseline for the course performance that will yield a successful outcome. The LS also enables students to monitor their performance throughout the semester in comparison with the average performance of the entire class. This increases students' awareness of their learning efforts, for instance, whether they are falling behind the objectives defined by faculty or in line with the average progression of the class. Additionally, students are notified of upcoming deadlines and of any delay relative to the course schedule, in a proactive and motivating way. Gamification techniques were used in the design of the LS, since the tool is used voluntarily by students. Gamification motivates students to achieve the course goals, and provides a healthy competition environment towards the best course performance.

The LS is also a valuable tool for faculty, providing an aggregated view of students' performance and its evolution throughout the semester. The LS enables the course coordinator to identify the pain points of the course experienced by students.
Namely, what are the tasks that generally demand extra effort from students (compared to the faculty planning), and how many students are at risk of failing the course. The analysis of student performance data can be used by the course coordinator to improve the course syllabus, with the goal of improving the teaching quality and the student learning experience.

The specification of the LS included the following functional requirements:

• users need to have a profile and authentication credentials
• the LS platform needs to be integrated with the e-learning system (having access to quiz results, forum participation, downloading of materials, etc.)
• two types of accounts or views: student and course coordinator
• dashboards for monitoring individual student performance (student view)
• dashboards for monitoring class performance (course coordinator view)
• automatic course schedule with deadlines and studying guidelines, customized by the course coordinator
• the LS platform should include game elements (i.e., scores, ranks, quests, leaderboard)

The design of the Learning Scorecard platform also encompasses the following non-functional requirements:

• portability across web browsers (Firefox, Google Chrome, Safari, Edge), including mobile devices
• intuitive and user-friendly interface (input data from students should be kept to a minimum)
• student identification data must be private (ethical requirement), i.e., the course coordinator will only have access to aggregated class data, even in the case of at-risk students.

The last non-functional requirement may be seen as a missed opportunity, since course coordinators would want to know, individually, which students are at risk. However, the LS was mainly designed for students, to support their learning experience in the course. By keeping student identities private, students are more likely to voluntarily use the Learning Scorecard and experience the benefits of the platform, without fearing any potential consequences of faculty scrutiny.
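As an illustration of how this privacy requirement could be honored in the coordinator view, the sketch below (written in JavaScript, the language of the prototype described in Section 5) aggregates student records per class before anything is exposed to faculty. The field names and the at-risk rule are hypothetical assumptions made for the example, not the actual LS implementation.

// Minimal sketch (hypothetical field names): the coordinator view only
// receives per-class aggregates, never individual student identities.
function aggregateByClass(records) {
  // records: [{ studentId, classAcronym, points, completedQuests, totalQuests }]
  const byClass = new Map();
  for (const r of records) {
    const acc = byClass.get(r.classAcronym) || { students: 0, totalPoints: 0, atRisk: 0 };
    acc.students += 1;
    acc.totalPoints += r.points;
    // Illustrative at-risk rule (an assumption): fewer than half of the quests completed.
    if (r.completedQuests / r.totalQuests < 0.5) acc.atRisk += 1;
    byClass.set(r.classAcronym, acc);
  }
  // Only aggregated figures are returned; studentId is deliberately dropped.
  return Array.from(byClass, ([classAcronym, acc]) => ({
    classAcronym,
    students: acc.students,
    averagePoints: acc.totalPoints / acc.students,
    atRiskCount: acc.atRisk,
  }));
}

In this way the coordinator still sees how many students in each class appear to be at risk, while individual identities remain visible only in the student's own view.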
4.1 Strategic design of the Learning Scorecard

The Learning Scorecard was designed according to the BSC methodology proposed by [24]. For the purpose of this paper, we will focus on the design decisions of a few selected steps. In the design of an organizational BSC, the first step usually entails the clarification of the strategy to be executed, including a strategic analysis of the organization, and the definition of mission and vision statements, as well as the organization's values. Since the LS is a thematic scorecard, the strategic analysis of the 'organization' will not be presented. The mission of the Learning Scorecard is "to provide Higher Education students with an analytical environment enabling the monitoring of their performance in a course, contributing to an enhanced student learning experience." Three values underpin the design and implementation of the LS: Pursue Growth and Learning; Enjoying Participation; and Self-discipline (Make it happen). The vision statement was defined as follows: "by the end of the academic year of 2017-2018, the Learning Scorecard application should be used by at least 50% of the enrolled students in the DSS courses at ISCTE-IUL." The vision statement complies with the guidelines proposed by [25], in which three elements must be clearly defined: a niche (enrolled students in the DSS courses at ISCTE-IUL), a stretch goal (used by at least 50% of enrolled students) and a time frame (by the end of the academic year of 2017-2018).

Fig. 1. Student value proposition canvas (customer jobs, pains and gains of the student segment, mapped to the LS products and services, pain relievers and gain creators)

The Business Model Canvas (BMC) is a strategic tool used to describe, in an intuitive and accessible language, the business model of an organization [26], i.e., how an organization intends to create, deliver and capture value. Nine building blocks constitute the BMC: customer segments, value proposition, channels, customer relationships, revenue streams, key resources, key activities, key partnerships, and cost structure. For the purpose of the BSC design, two building blocks are of importance – customer segments and the value proposition. In order to effectively design a value proposition, matching the needs and expectations of the customer segments, [27] proposed a new canvas – the Value Proposition Canvas (VPC). In the LS we have to address the needs and expectations of both students and the course coordinator, hence we need to define two customer segments. Figures 1 and 2 present the LS value proposition canvases for students and the course coordinator, respectively. Customer segments are represented on the right-hand side and the value proposition on the left-hand side of these figures.

Fig. 2. Course coordinator value proposition canvas (customer jobs, pains and gains of the course coordinator segment, mapped to the LS products and services, pain relievers and gain creators)

In the VPC model, customer segments are defined in terms of customer jobs, pains and gains. The student VPC (see Figure 1) will be used as an example to explain the model.
A customer job is related to what the customer is trying to get done; it can be a task, a problem or even a basic need (e.g., time management). The pains are the negative aspects encountered by the customer in his/her current way of dealing with the 'jobs', including negative emotions or hurdles, undesired costs and risks (e.g., loss of time with irrelevant information). The gains reflect the benefits a customer expects or desires to achieve with the product or service; they can be translated into, for instance, a functional utility, positive emotions or cost savings (e.g., perception of the course's pain points). The value proposition block in the VPC is described in terms of three components: (1) products and services (a list of products and services offered and their link to the customers' jobs); (2) pain relievers (to eliminate or reduce the customers' pains); and (3) gain creators (describing the positive outcomes and benefits that products and services deliver to customers). Examples of these components are, respectively, (1) online course schedule, (2) well-structured course contents, and (3) clear learning milestones. The observation and design of the customer segment profile comes first. Then follows the design of the value proposition, addressing the most relevant and critical jobs, pains and gains of the target customers. In this way, the design process of the product/service is grounded in the real needs of customers and there is a concrete mapping to the expected benefits. This process is also useful to determine the differentiating factors of the product/service, which will be useful for the definition of performance indicators in the BSC.

Fig. 3. The Learning Scorecard strategy map

The strategy map is a design tool [8] that helps to tell the story of the strategy, highlighting in a creative way the dependencies, called cause-and-effect relationships, between the strategic objectives. The map, displayed in Figure 3, has three perspectives: Students and Faculty (S&F), Internal Processes (IP), and Learning and Growth (L&G). The financial perspective, the fourth standard perspective in a Balanced Scorecard, is not relevant for the Learning Scorecard, which is focused solely on the student learning experience. Three strategic themes were also defined to frame the definition of strategic objectives: Engagement, Information and Learning. These themes are the main drivers to achieve the vision. That is, if we aim to have the LS platform used by at least 50% of all enrolled students, then it is key to foster student engagement, provide updated performance monitoring information, and help students optimize their learning experience in the course.

The strategy map should be read bottom-up, following the cause-and-effect relationships between objectives. As can be seen in Figure 3, the final strategic objective (the ultimate effect) is to improve the student learning experience, which is the central goal of the Learning Scorecard. The values are included in the strategy map, on the right-hand side of the "mobile phone", a metaphor for the portability of the LS platform, since the development of a mobile application is planned for the near future.

A crucial part of the Balanced Scorecard is the definition of performance indicators to measure the achievement of the intended strategic objectives. In this project, a subset of these performance indicators will also be used to populate the student and course coordinator dashboards provided by the LS platform.
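As an illustration of how such an indicator could be derived from raw platform activity and shaped for a dashboard, the JavaScript sketch below computes a fortnightly series for the "% LS active students" indicator listed in Table 1. The event structure and the split of the 12-week semester into six fortnights are assumptions made for the example; this is not the actual LS code.

// Minimal sketch (hypothetical data structures): percentage of enrolled students
// with at least one LS interaction (login, quest, quiz, forum post) in a period.
function percentActiveStudents(events, enrolledCount, periodStart, periodEnd) {
  // events: [{ studentId, timestamp: Date }]
  const active = new Set();
  for (const e of events) {
    if (e.timestamp >= periodStart && e.timestamp < periodEnd) active.add(e.studentId);
  }
  return (active.size / enrolledCount) * 100;
}

// One data point per fortnight of a 12-week semester, ready to feed a dashboard chart.
function activitySeries(events, enrolledCount, semesterStart) {
  const fortnightMs = 14 * 24 * 60 * 60 * 1000;
  return Array.from({ length: 6 }, (_, i) => {
    const start = new Date(semesterStart.getTime() + i * fortnightMs);
    const end = new Date(start.getTime() + fortnightMs);
    return { label: `Fortnight ${i + 1}`, value: percentActiveStudents(events, enrolledCount, start, end) };
  });
}

The same pattern would apply to the other fortnightly indicators, such as average forum activity or average quest delay.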
The key performance indicators (KPI) of the LS, presented in Table 1, are measured either fortnightly or at the end of the academic semester. As already mentioned, this paper describes the initial version of the LS platform. More functionalities are being developed for the LS platform, integrated in two Computer Science Engineering master dissertations, which will be completed by September 2017. It is therefore foreseeable that new indicators may be calculated, depending on the new source data.

Table 1. Learning Scorecard KPIs

Persp. | Strategic Objective | KPI | Frequency
S&F | Streamline the course | Average forum activity | fortnight
S&F | Motivate students | Average number of points | fortnight
S&F | Ensure class performance monitoring | Number of visualizations of class dashboards in the LS faculty view | fortnight
S&F | Ensure student self-monitoring | Number of visualizations of student dashboards in the LS student view | fortnight
S&F | Study optimization | % completed quests within course milestones | fortnight
S&F | Improve teaching quality | Average final grade | semester
S&F | Decrease student retention | Course retention rate | semester
S&F | Improve student learning experience | Student satisfaction index | semester
IP | Ensure customization | % of used LS input options | semester
IP | Implement gamification techniques | Number of LS game elements | semester
IP | Develop intuitive interfaces | Student usability assessment index | semester
IP | Develop automatic services | % quests without manual user input | semester
IP | Implement monitoring techniques | Number of LS data visualizations | semester
L&G | Promote a culture of engagement | % LS active students | fortnight
L&G | Promote a culture of responsibility | Average quest delay | fortnight
L&G | Improve course information systems | Number of new implemented LS functionalities | semester

5 LS prototype

The LS platform was developed using Node.js. The front end was developed using HTML and CSS. JavaScript, specifically Express.js, was used for the back-end implementation. Several modules were used: Bootstrap, for the platform design; Chart.js, for the implementation of the charts in the LS dashboards; and Passport and Crypto, for the secure authentication of students in the LS. The LS pilot also includes a MySQL database that stores data from the e-learning system and custom data provided by the course coordinator (input format: .csv).

Currently, 110 students are testing the pilot implementation of the Learning Scorecard in the Data Warehouse course; in the 2016-2017 fall semester this course is offered to four different programs. In the LS, students are divided into classes according to their program. A summary of the leaderboard is always visible (in the lower-left corner of Figure 4), presenting not only the top-5 gamers (ordered by points) but also the ranking of classes in terms of the percentage of active students using the LS. In the ranking, classes are identified by their acronym (in Portuguese): MEI, IGE, IGE-PL, and METI. Figure 4 presents the initial page of the LS for a student (i.e., a "gamer"). On this page the student can visualize his/her performance (in points) and receive alerts about upcoming quest deadlines. Since the first LS experiment is ongoing, we opted to display only test data in Figures 4-6 (no real data is shown). Students begin with zero points, and are thus encouraged to learn in order to earn points and increase their game level.
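To make the leaderboard summary concrete, the sketch below shows how the top-5 gamers and the class ranking could be queried from the prototype's MySQL database in Node.js. The mysql2/promise client, the table, and the column names are assumptions for illustration; the paper does not specify the actual LS schema or database driver.

// Minimal sketch (hypothetical schema): leaderboard summary shown on the LS initial page.
const mysql = require('mysql2/promise');

async function leaderboardSummary(pool) {
  // Top-5 students ordered by accumulated points (display_name is a hypothetical column).
  const [topGamers] = await pool.query(
    'SELECT display_name, points FROM student ORDER BY points DESC LIMIT 5'
  );
  // Class ranking by percentage of students actively using the LS (is_active assumed 0/1).
  const [classRanking] = await pool.query(
    'SELECT class_acronym, 100 * SUM(is_active) / COUNT(*) AS pct_active ' +
    'FROM student GROUP BY class_acronym ORDER BY pct_active DESC'
  );
  return { topGamers, classRanking };
}

// Usage (connection settings are placeholders):
// const pool = mysql.createPool({ host: 'localhost', user: 'ls', database: 'ls' });
// leaderboardSummary(pool).then(console.log);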
The planning functionality is developed based on the course syllabus (currently still in Portuguese). Each semester, the course coordinator needs to customize the set of quests as well as the course's milestones and their respective deadlines or due dates. For instance, the following five milestones were defined for the Data Warehouse course in the 2016-2017 fall semester: (1) group and practical assignment theme selection; (2) first group tutorial meeting; (3) second group tutorial meeting; (4) practical assignment submission; and (5) practical assignment discussion. Figure 5 presents the current planning page in the LS platform for the Data Warehouse course. By clicking on each quest in this list, the student can visualize a pop-up interface with a detailed description of the quest, aligned with the information in the syllabus, and the number of points that can be awarded. Quests can be mandatory or optional. It is also possible to customize how many points students lose if they fail to comply with the quest deadline. For mandatory quests, the deadline must be met, otherwise no points are awarded. Quests are related to reading the lecture slides, solving exercises, completing the milestones of the practical assignment, class attendance, quizzes, participating in the course's forum, etc.

Fig. 4. LS student view: quests and points

The performance functionality in the student view includes three standard visualizations: radar chart, percentage chart and progress analysis. Figure 6 presents an example of the performance monitoring visualization currently developed in the student view. This set of charts enables a progress analysis of the student's performance as the semester evolves; Figure 6 displays (test) data for the first two weeks of the semester (which comprises 12 weeks).

Fig. 5. LS student view: planning functionality (in Portuguese)

6 Conclusions and future work

The use of gamification in Higher Education is a recent technique, in which game elements are applied to non-game contexts. In this paper, the design and prototype implementation of the Learning Scorecard were described. Apart from the integration of game elements, such as points, levels, quests and a leaderboard, the LS was also designed using the best practices of business intelligence. A set of functional and non-functional requirements was initially defined, which led to the definition of the strategic management tool – the Balanced Scorecard. The full implementation of the BSC, with dashboards to visualize the KPIs, will be part of the next version of the LS platform, which will have the course coordinator's view fully developed. New versions of the platform are planned for the next two academic semesters, since the LS is the subject of two master dissertations in Computer Science Engineering that are due in September 2017. Currently, the LS platform is being used by 110 students and is already integrated with the e-learning platform Blackboard. Future work entails the identification of study patterns linked to student success, using data mining techniques. The integration with Blackboard will also be further explored, particularly in terms of the collaborative learning aspects already present in the tool. The gamification part will also be extensively developed, specifically in terms of visual effects. The LS aims to be a fun tool that really makes a difference in the way students learn and collaborate with each other. The design and implementation of student satisfaction questionnaires is also planned, to assess student engagement, motivation, and satisfaction with the course and the LS platform.
Fig. 6. LS student view: performance visualization (progress analysis)

References

1. Burton, B., Geishecker, L., Schlegel, K., Hostmann, B., Austin, T., Herschel, G., Soejarto, A., Rayner, N.: Business Intelligence focus shifts from Tactical to Strategic. Gartner Research Note G00139352 (2006)
2. Davenport, T., Harris, J., Morison, R.: Analytics at Work – Smarter Decisions, Better Results. Harvard Business Press (2010)
3. Rajni, J., Malaya, D. B.: Predictive Analytics in a Higher Education Context. IT Professional, 17(4), 24-33. IEEE Computer Society Publishing (2015)
4. van Barneveld, A., Arnold, K. E., Campbell, J. P.: Analytics in Higher Education: Establishing a Common Language. Educause (2012)
5. Clow, D.: An overview of learning analytics. Teaching in Higher Education, 18(6), 683-695 (2013)
6. Long, P., Siemens, G.: Penetrating the fog: Analytics in learning and education. Educause Review, 46(5), 31-40 (2011)
7. Kaplan, R., Norton, D.: The Balanced Scorecard – Translating Strategy into Action. Harvard Business School Press (1996)
8. Kaplan, R., Norton, D.: Having Trouble with Your Strategy? Then Map It. Harvard Business Review, 78, 167-176 (2000)
9. O'Neil Jr., H., Bensimon, E., Diamond, M., Moore, M.: Designing and Implementing an Academic Scorecard. Change, 31, 33-40 (1999)
10. Karathanos, D., Karathanos, P.: Applying the Balanced Scorecard to Education. Journal of Education for Business, 80, 222-230 (2005)
11. Beard, D.: Successful Applications of the Balanced Scorecard in Higher Education. Journal of Education for Business, 84, 275-282 (2009)
12. Thomas, H.: Business school strategy and the metrics for success. Journal of Management Development, 26, 33-42 (2007)
13. Taylor, J., Baines, C.: Performance Management in UK universities: implementing the Balanced Scorecard. Journal of Higher Education Policy and Management, 34(2), 111-124 (2012)
14. Boned, J. L., Bagur, L.: Management Information Systems: The Balanced Scorecard in Spanish Public Universities (2006). Available at SSRN: http://ssrn.com/abstract=1002517
15. Yu, M. L., Hamid, S., Ijab, M. T., Soo, H. P.: The e-balanced scorecard (e-BSC) for measuring academic staff performance excellence. Higher Education, 57, 813-828 (2009)
16. Pietrzak, M., Paliszkiewicz, J., Klepacki, B.: The application of the balanced scorecard (BSC) in the higher education setting of a Polish university. Online Journal of Applied Knowledge Management, 3(1), 151-164 (2015)
17. Fredin, A., Fuchsteiner, P., Portz, K.: Working toward more engaged and successful accounting students: a Balanced Scorecard approach. American Journal of Business Education, 8(1), 49-62 (2015)
18. Chang, O., Chow, C.: The Balanced Scorecard: a Potential Tool for Supporting Change and Continuous Improvement in Accounting Education. Issues in Accounting Education, 14, 395-412 (1999)
19. Scholey, C., Armitage, H.: Hands-on Scorecarding in the Higher Education Sector. Planning for Higher Education, 35, 31-41 (2006)
20. Cardoso, E., Viaene, S., Costa, C. S.: Designing strategy maps for programme performance measurement in higher education. In: 15th Int. Conf. of European University Information Systems (EUNIS 2009), Spain (2009)
21. Deterding, S., Dixon, D., Khaled, R., Nacke, L.: From Game Design Elements to Gamefulness: Defining "Gamification". In: 15th International Academic MindTrek Conference, Finland (2011)
22. Dicheva, D., Dichev, C., Agre, G., Angelova, G.: Gamification in Education: a systematic mapping study. Educational Technology & Society, 18(3), 75-88 (2015)
23. Barata, G., Gama, S., Jorge, J., Gonçalves, D.: Engaging Engineering Students with Gamification. In: 5th Int. Conf. on Games and Virtual Worlds for Serious Applications (VS-Games) (2013)
24. Cardoso, E.: Performance and Quality Management of Higher Education Programmes. PhD thesis, University Institute of Lisbon (ISCTE-IUL) (2011)
25. Kaplan, R., Norton, D.: The Execution Premium – Linking Strategy to Operations for Competitive Advantage. Harvard Business School Press (2008)
26. Osterwalder, A., Pigneur, Y.: Business Model Generation – a Handbook for Visionaries, Game Changers, and Challengers. Wiley (2010)
27. Osterwalder, A., Pigneur, Y., Smith, A., Bernarda, G., Papadakos, P.: Value Proposition Design. Wiley (2014)