A Data-based Guiding Framework for Digital Transformation

Zakaria Maamar1, Saoussen Cheikhrouhou2 and Said Elnaffar3
1 Zayed University, Dubai, UAE
2 University of Sfax, Sfax, Tunisia
3 Canadian University Dubai, Dubai, UAE

Tunisian Algerian Conference on Applied Computing (TACC 2021), December 18–20, 2021, Tabarka, Tunisia
zakaria.maamar@zu.ac.ae (Z. Maamar); saoussen.cheikhrouhou@redcad.org (S. Cheikhrouhou); said.elnaffar@cud.ac.ae (S. Elnaffar)
© 2021 Copyright for this paper by its authors. Use permitted under Creative Commons License Attribution 4.0 International (CC BY 4.0). CEUR Workshop Proceedings (CEUR-WS.org), ISSN 1613-0073.

Abstract
This paper presents a framework for guiding organizations in initiating and sustaining digital transformation initiatives. Digital transformation is a long-term journey that an organization embarks on when it decides to question its practices in light of management, operation, and technology challenges. The guiding framework stresses the importance of data in any digital transformation initiative by suggesting 4 stages referred to as collection, processing, storage, and dissemination. Because digital transformation could impact different areas of an organization, for instance business processes and business models, each stage suggests techniques to expose data. Two case studies are used in the paper to illustrate how the guiding framework is put into action.

Keywords
Data, Digital Transformation, Framework, Guideline

1. Introduction
According to Gartner, “Digital transformation can refer to anything from IT modernization (for example, cloud computing), to digital optimization, to the invention of new digital business models. The term is widely used in public-sector organizations to refer to modest initiatives such as putting services online or legacy modernization. Thus, the term is more like ‘digitization’ than ‘digital business transformation’” (https://tinyurl.com/y8qwtrty). We consider Digital Transformation (DT) as a long-term journey that organizations embark on, should they decide to question their strategic, management, and operational practices in response to factors like the continuous development of Information and Communication Technologies (ICT) and the dynamic nature of the ecosystems in which they reside.
While DT targets 4 main areas, namely business process, business model, domain, and cultural/organizational [1], we argue that data should be a driving factor that would ensure a successful DT. Data reflects how an organization is performing from all perspectives, whether percentage of market share, volume of sales, or amount of liquidity. Thus, prior to embarking on any DT initiative, decision makers should comply with the lifecycle that underpins data existence. Indeed, data that is unavailable, of poor quality, or not secure could make DT deviate from its initial path, leading to terrible consequences on the organization’s growth, for example. In this paper, we suggest a 4-stage data lifecycle that would shed light on the necessary techniques, technologies, and/or practices that would ensure DT success. We refer to these stages as collection, storage, processing, and dissemination; they feed each other through specific communication channels.
Depending on the main area that would be subject to DT, such as business process, the 4 stages would act as a guiding framework for this DT by raising questions related to what data should be collected, processed, stored, and disseminated; how data should be collected, stored, processed, and disseminated; when and where data should be collected, stored, processed, and disseminated; and to whom data should be disseminated. For illustration, data to use during the transformation of business processes should expose the control flows of these processes, while data to use during the transformation of business models should expose the nature of the organization, whether a brick&mortar presence or an online presence. Our contributions are, but are not limited to, (i) 4 stages defining the data lifecycle, (ii) a framework guiding DT based on these stages, and (iii) the application of the framework to 2 DT initiatives.
The rest of this paper is organized as follows. Section 2 is an overview of digital transformation. Section 3 presents the data-based framework in terms of stages and principles. Section 4 concludes the paper and presents some future work.

2. Overview of digital transformation
Digitization to improve performance and increase market share is a hot topic for organizations across the globe. Unfortunately, there are no clear guidelines on how organizations should proceed with their DT initiatives. The same concern is noted when it comes to defining DT. Many definitions exist, ranging from “using technology to radically improve the performance of a company” [2] to “the changes or influences in all aspects of human’s life” [3]. In an attempt to come up with a common definition, Van Veldhoven and Vanthienen extracted key components from existing definitions such as use of digital technologies, new business models, and organizational transformation [4]. They also proposed a conceptual framework that reconciles the various aspects of DT. The benefits of the framework include providing novel insights into the concept of DT and raising intriguing questions about the scope, forces, and impact of the interactions between business, digital technologies, and society. In [5], Solis and Szymanski affirmed that DT is a complex and enriching journey and proposed steps that make organizations move along the road of DT. The authors identified a series of patterns, components, and processes, referred to as Altimeter Group’s “6 Levels of Digital Transformation Maturity”, that provides a blueprint for implementing a DT plan. In [6], Huang et al. dealt with digital systems engineering, which exemplifies the use of DT in the domain of engineering. The authors examined the transition from traditional to digital engineering along with tackling other challenges such as identifying enabling digital technologies and defining related core concepts. Particularly, Huang et al. focused on digitizing artifacts to ensure information sharing across platforms, across life cycles, and across domains. In [7], Verina and Titko completed a survey to shed light on DT’s key elements/components/categories. They targeted business DT with a focus on the human element and, then, evaluated both DT’s success factors, such as internal communication with staff members, and DT’s barriers, such as staff resistance. In [8], Corver et al. proposed a framework to help organizations pursue a digital vision and develop new business models based on digital opportunities. The framework is built upon 4 elements: customer, organisation, product, and processes. Corver et al.
advise organizations to re-think their ways of doing business in the context of DT and to proceed with digitizing customers, operations, products, and services.
Despite the benefits of DT, many obstacles could undermine them, as discussed in [9]. There is a lack of vision about the proper way of undertaking DT, associated with other factors like staff resistance to change, legacy technologies to deal with, and existing regulations to comply with. Many organizations also fail to assess the challenging job of embracing DT; 84% of large organizations have failed to deliver DT’s expected benefits [10]. According to Datta, “digital transformation is not a plug and play solution, especially in a diverse and complex democracy. A purely technological solution is not the answer. The answer is a leveraged buy-in.” [11]. Davenport and Westerman list many examples of high-profile organizations such as General Electric, Nike, Lego, Ford, and Procter & Gamble that “spent millions to develop digital products, infrastructures, and brand accompaniments, and got tremendous media and investor attention, only to encounter significant performance challenges, and often shareholder dissent” [12]. Consequently, some organizations like Nike drastically downsized their digital units while others like General Electric fired their CEOs. To avoid such “unpleasant” experiences, we advocate for a framework that would guide DT by raising questions related to what, how, why, where, and when data need to be collected, processed, stored, and disseminated.

3. Data-based guiding framework
This section consists of 4 parts. The first part defines the stages upon which the data lifecycle is built, while the second and third parts apply this lifecycle to 2 DT initiatives related to business process and business model, respectively. The last part discusses some technologies that could support completing the data lifecycle.

3.1. Defining the data lifecycle
In Fig. 1, plain lines illustrate the data lifecycle’s 4 stages while dotted lines illustrate the interactions that arise during DT between 2 of these stages and the ecosystem. According to Laudon and Laudon [13], an organization’s ecosystem consists of 5 stakeholders that are customers, suppliers, competitors, shareholders, and regulatory authorities. Each stakeholder contributes differently to an organization’s DT strategies when it comes to the products/services that are offered to customers, the goods/services that are requested from suppliers, the actions that are taken to respond to competitors, the strategies that are put in place to satisfy shareholders, and the policies that are defined to comply with regulatory authorities.

Figure 1: Data lifecycle in the context of digital transformation

Collection stage constitutes the organization’s “ears” and “eyes” over the ecosystem in which it operates by capturing all the necessary data that DT initiatives would require. Depending on a DT initiative’s type, the data to collect would be tied to the ecosystem’s stakeholders. For instance, a cultural/organizational-driven DT would require data about employees’ profiles so that an accurate appreciation of how the organization currently deals with, and should deal with, employees’ concerns would be formed.
Storage stage constitutes the organization’s warehouse where data is saved temporarily and/or permanently in compliance with internal and external regulations.
This saving could also be used to safeguard data from unauthorized manipulation that could misguide DT initiatives. Data to store could be either raw or processed. In either case, it is important to know the origin of the data.
Processing stage constitutes the organization’s “brain” by manipulating the data that is collected and/or stored according to the needs and requirements of DT initiatives. Processing data could lead to new insights and could be of different types, such as real-time and batch, depending on the DT initiative’s type. For instance, a business model-driven DT would need to differentiate data that is processed on-premise from data that is processed off-premise, like on the cloud.
Dissemination stage constitutes the organization’s outreach by communicating data to the ecosystem’s stakeholders before, during, and after DT initiatives. The types of these initiatives determine when to communicate data, with whom, how, and where. For instance, a business process-driven DT would focus on the frequency with which the process communicates data along with the means used for communicating.

3.2. Applying the data lifecycle to business process
In this part of the paper, we apply the data lifecycle to a Business Process (BP)-driven DT initiative. A BP is “a set of activities that are performed in coordination in an organizational and technical environment. These activities jointly realize a business goal” [14]. Formally, we define a BP with a 7-tuple ℬ𝒫 = <𝒜, 𝒟𝒜, ℒ, 𝒟, 𝑎0, ℱ, 𝒞> where: 𝒜 is a finite set of activities (𝑎); 𝒟𝒜 is a finite set of data (𝑑𝑎) that activities consume (inputs) and/or produce (outputs); ℒ is a set of labels; 𝒟 ⊆ 𝒜 × ℒ × 𝒜 is the dependency relation, such that each dependency 𝑑 = (𝑎𝑠𝑟, 𝑙, 𝑎𝑡𝑔) consists of a source activity 𝑎𝑠𝑟 ∈ 𝒜, a target activity 𝑎𝑡𝑔 ∈ 𝒜, and a transition label 𝑙 ∈ ℒ; 𝑎0 ∈ 𝒜 is the initial activity; ℱ ⊆ 𝒜 is a finite set of final activities; and 𝒞 is a set of constraints over (some) activities in 𝒜 and/or (some) data in 𝒟𝒜.
To apply the data lifecycle to a BP-driven DT initiative, we action this lifecycle’s 4 stages (an illustrative sketch of how the BP model and the first 2 stage functions could be represented is given after the lists below). During the data collection stage, we assume the existence of a function dataCollectBP() that takes a ℬ𝒫 as an input and produces details about all collected data associated with activities in 𝒜:
- Input/Output data: type (number versus character versus Boolean), mandatory versus optional, and plain text versus encrypted text.
- List of activities receiving input data.
- List of activities sending output data.
- Constraints over data such as temporal (e.g., when to collect data?), location (e.g., where to collect data?), logical (e.g., how to collect data?), and security (e.g., how and where to safeguard the collected data?).
During the data processing stage, we assume the existence of a function dataProcessBP() that takes a ℬ𝒫 as an input and produces details about all activities in 𝒜:
- Type of activity: batch processing versus transaction processing.
- Predecessor activities of the activity along with the dependency types; this detail is set to null for 𝑎0.
- Successor activities of the activity along with the dependency types; this detail is set to null for any activity 𝑎 ∈ ℱ.
- Resource usage to process the activity in terms of CPU.
- Constraints over the activity such as temporal (e.g., when to process the activity?), location (e.g., where to process the activity, like on-premise versus off-premise?), logical (e.g., what does the activity do?), and security (e.g., how to safeguard the processed activity?).
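To make the preceding formalization more concrete, below is a minimal, hypothetical Python sketch of the 7-tuple ℬ𝒫 together with skeletons of dataCollectBP() and dataProcessBP(). All class, field, and function names are our own illustrative assumptions; the framework does not prescribe any particular implementation, and the None placeholders stand for the details enumerated in the lists above.

```python
# Hypothetical sketch of the BP 7-tuple and of the collection/processing stage functions.
# Names and fields are illustrative assumptions, not prescribed by the guiding framework.
from dataclasses import dataclass, field
from typing import Dict, List, Set, Tuple

@dataclass
class BusinessProcess:
    """BP = <A, DA, L, D, a0, F, C> from Section 3.2."""
    activities: Set[str]                                  # A: finite set of activities
    data_items: Set[str]                                  # DA: data consumed/produced by activities
    labels: Set[str]                                      # L: transition labels
    dependencies: Set[Tuple[str, str, str]]               # D, each element being (a_sr, l, a_tg)
    initial: str                                          # a0: initial activity
    finals: Set[str]                                      # F: final activities
    constraints: Dict[str, List[str]] = field(default_factory=dict)  # C, keyed by activity/data name

def data_collect_bp(bp: BusinessProcess) -> Dict[str, dict]:
    """Collection stage: one record per data item (type, sender/receiver activities, constraints)."""
    return {d: {"type": None, "mandatory": None, "encrypted": None,
                "receiving_activities": [], "sending_activities": [],
                "constraints": bp.constraints.get(d, [])}
            for d in bp.data_items}

def data_process_bp(bp: BusinessProcess) -> Dict[str, dict]:
    """Processing stage: predecessors/successors per activity; set to None for a0's
    predecessors and for the successors of any final activity, as stated above."""
    details = {}
    for a in bp.activities:
        predecessors = [(src, lbl) for (src, lbl, tgt) in bp.dependencies if tgt == a]
        successors = [(tgt, lbl) for (src, lbl, tgt) in bp.dependencies if src == a]
        details[a] = {"type": None,
                      "predecessors": None if a == bp.initial else predecessors,
                      "successors": None if a in bp.finals else successors,
                      "cpu": None,
                      "constraints": bp.constraints.get(a, [])}
    return details
```

Such skeletons would be instantiated with the credit-application BP introduced next; the storage and dissemination stage functions would follow the same pattern.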
During the data storage stage, we assume the existence of a function dataStoreBP() that takes a ℬ𝒫 as an input and produces details about all stored data associated with activities in 𝒜:
- Location of stored data, whether on-premise, off-premise, or both.
- Access control for stored data to satisfy security needs.
- Resource usage to store data in terms of HD size.
Finally, during the data dissemination stage, we assume the existence of a function dataDisseminateBP() that takes a ℬ𝒫 as an input and produces details about all disseminated data associated with activities in 𝒜:
- Recipients of disseminated data, who could be persons, organizations, or both.
- Patterns for disseminated data like frequency, fee versus no-fee, and means (online versus offline).
- Resource usage to disseminate data in terms of network bandwidth.
To illustrate the data lifecycle in the context of business processes, our case study targets a small-size financial company that processes credit applications on behalf of a group of commercial banks (adopted from [15]). For the sake of competitiveness, the credit company looks into the feasibility of deploying the credit-application BP on the cloud and, hence, launches a BP-driven DT initiative. Fig. 2 is the Business Process Model and Notation (BPMN) [16] representation of the credit-application BP’s process model, where different activities (e.g., check_for_completeness and request_credit_card) and gateways (e.g., application_completeness and credit_amount) are included. When the company’s clerk receives an online application, she verifies its completeness in compliance with the commercial banks’ regulations. Should the application be incomplete, the clerk asks the applicant for more information. Otherwise, the clerk performs additional checks depending on the amount of credit (e.g., up to $500). Next, the clerk sends the senior manager the application for final verification. Should the application be accepted, the applicant is notified, a credit-card production request is issued, and the process ends. Otherwise, the applicant is notified of the rejection and the process ends, too. In Fig. 2, 𝑅𝑖 are budget- and deadline-related restrictions on some of the credit-application BP’s activities. Along with Fig. 2, Table 1 summarizes these restrictions and presents the resources that some activities require at run-time.

Figure 2: BPMN representation of the credit-application BP (adapted from [15])

Table 1: Some activities in the credit-application BP

  Activity                       Resources to require                Restrictions to satisfy
  a1: check for completeness     Computation (Memory: 6, CPU: 2)     Budget: 20, Deadline: no
  a2: check credit amount        Storage (Capacity: 50)              Budget: 30, Deadline: no
  a3: make decision              Computation (Memory: 32, CPU: 4)    Budget: 55, Deadline: 10mn
  a4: request credit card        Computation (Memory: 64, CPU: 8)    Budget: 60, Deadline: no

For a successful DT, we illustrate below how the guiding framework’s data lifecycle is adopted as per the recommended 4 stages. First, the collection stage calls dataCollectBP(), which returns details about each data item related to the activities in the credit-application BP.
- Input data include customer_name (character, mandatory, and plain text) and amount_of_credit (number, mandatory, and encrypted).
- Output data include credit_application_decision (Boolean: accept/reject, mandatory, and encrypted) and amount_of_credit_approved (number, mandatory, and encrypted).
- Check_for_completeness activity is the receiving activity for customer_name input data.
- Make_decision activity is the sending activity for credit_application_decision output data.
- Constraints over customer_name include temporal (immediate collection) and logical (online collection).
Second, the processing stage calls dataProcessBP(), which returns details about each activity in the credit-application BP. For illustration, we consider the make_decision activity.
- Type is transaction processing.
- Successor of 2 exclusive activities, perform_checks_for_large_amount and perform_checks_for_small_amount.
- Predecessor of 3 activities, notify_rejection_by_email, notify_acceptance_by_email, and request_credit_card.
- Resource usage includes 4 GHz CPU speed.
- Constraints include temporal (at most 2 days from submission) and location (main branch).
Third, the storage stage calls dataStoreBP(), which returns details about how each data item related to the activities in the credit-application BP is stored.
- Location of customer_name is on-premise.
- Access control to customer_name is secured with a password.
- Resource usage includes an HD of 2 GB to store customer_name.
Finally, the dissemination stage calls dataDisseminateBP(), which produces details about each data item associated with the activities in the credit-application BP. For illustration, we consider both the notify_acceptance_by_email and notify_rejection_by_email activities.
- Recipients of disseminated data are applicants, whether persons or organizations.
- Patterns for disseminated data, where the frequency is upon deciding on each credit application, the communication means is email, and the cost is free.
- Resource usage includes a bandwidth of 512 kbps.

3.3. Applying the data lifecycle to business model
By analogy to Section 3.2, we, this time, apply the data lifecycle to a Business Model (BM)-driven DT initiative. Contrary to BPs, there is no consensus on what a BM is, which does not ease identifying a BM’s necessary constructs. To address this lack of consensus, we mention some definitions of BM and, then, suggest some constructs according to what we deem necessary during DT of BMs. According to Geissdoerfer et al., a BM is “a simplified representation of the elements of an organisation and the interaction between these elements for the purpose of its systemic analysis, planning, and communication in face of organizational complexity” [17]. In a 2015 Harvard Business Review article (hbr.org/2015/01/what-is-a-business-model), a BM consists of 2 parts: “Part one includes all the activities associated with making something: designing it, purchasing raw materials, manufacturing, and so on. Part two includes all the activities associated with selling something: finding and reaching customers, transacting a sale, distributing the product, or delivering the service”. Last but not least, Li discusses a BM’s key constructs, namely, value proposition (product offerings of the firm, its market segments, and its model of revenue generation), value architecture (how a firm senses, creates, distributes, and captures value), and functional architecture (core activities of a firm, namely, product innovation and commercialization, infrastructure for production and delivery, and customer relations management) [18]. For the needs of DT, we adopt a restrictive definition of BM that emphasizes the way an organization operates when it provisions services and/or delivers products to customers (persons or other organizations). For service provisioning, this could happen by having an electronic presence (online) and/or a physical presence (brick&mortar).
- For an electronic presence, a BM’s constructs could be the URL of the organization’s Web site, public/unsecure access versus private/secure access to the organization’s Web site, one-time service provisioning versus multiple-times service provisioning, product delivery versus product collection, services/products’ refund terms and conditions, accepted payment modes (in-house versus third party like PayPal; domestic credit card versus international credit card versus bank transfer), technologies supporting the organization’s Web site, social-media presence (yes/no, e.g., Facebook and Twitter), and judicial system in case of conflicts.
- For a physical presence, a BM’s constructs could be location (e.g., city, state, and country), working days and hours, single branch versus multiple branches, local-based branch versus international-based branch, on-site service provisioning versus off-site service provisioning, product delivery versus product collection, services/products’ refund terms and conditions, accepted payment modes (cash versus check versus credit card versus bank transfer), and social-media presence (yes/no, e.g., Facebook and Twitter).
To apply the data lifecycle to a BM-driven DT initiative, we action this lifecycle’s 4 stages over an online organization that allows customers to wire money overseas. During the data collection stage, we call the dataCollectBM() function that takes a ℬℳ as an input and produces the following data-driven details about this organization: URL (www.wiremoney.com), private/secure access (yes, through username/password), multiple-times service provisioning (yes), product delivery (not applicable), services’ refund terms and conditions (not possible once the wiring is complete), accepted payment modes (PayPal-based third party accepting domestic credit cards only), technologies supporting the organization’s Web site (HTML 5.0, Apache 2.4, and JavaScript), social-media presence (yes, through Facebook), and judicial system in case of conflicts (the country where the customer is based). When it comes to the data processing and data storage stages, a BM does not contain details that would “reveal” how the money-wiring service processes or stores data. However, a BM should contain details to support the data dissemination stage. In the case of the money-wiring service, the organization disseminates these details using its Web site and social media.

3.4. Supporting the data lifecycle
The advent of powerful technologies such as machine learning, data science, blockchain, and IoT has reshaped our perception of data. These technologies can help today’s DT adopters tap into hidden patterns and manifest prediction and classification capabilities that humans and classical analytical tools fail to reveal. In the early days, the ultimate goal of DT was mostly digitization (e.g., going from a paper-based application to an online one) or building a solid workflow using e-business processes controlled by a well-defined business logic and guarded by proper authentication and authorization. We may call this the era of ERP-based digital transformation [19]. The obvious byproducts were time saved, accuracy increased, and security tightened. Nowadays, DT is capable of taking business to new frontiers beyond digitization by adopting contemporary technologies. We call this the era of automation, where the machine is capable of reasoning about data and making decisions on behalf of humans.
The data lifecycle’s 4 stages we present in this paper can be augmented with various technologies.
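Before reviewing these technologies stage by stage, the sketch below revisits the money-wiring example of Section 3.3 and shows one hypothetical way of representing its BM and the output of dataCollectBM(). The BusinessModel fields mirror the constructs listed for an electronic presence; all class, field, and function names are our own assumptions rather than a prescribed API.

```python
# Hypothetical sketch of dataCollectBM() for the money-wiring example of Section 3.3.
# Field names mirror the electronic-presence constructs; values come from the case study.
from dataclasses import dataclass, field
from typing import Dict, Optional, Tuple

@dataclass
class BusinessModel:
    url: str
    secure_access: bool                      # private/secure access to the Web site
    multiple_provisioning: bool              # one-time versus multiple-times provisioning
    product_delivery: Optional[bool]         # not applicable for a pure service
    refund_terms: str
    payment_modes: str
    web_technologies: Tuple[str, ...]
    social_media: Dict[str, bool] = field(default_factory=dict)
    judicial_system: str = ""

def data_collect_bm(bm: BusinessModel) -> Dict[str, object]:
    """Collection stage for a BM-driven DT: expose the BM's constructs as data."""
    return {
        "URL": bm.url,
        "private/secure access": bm.secure_access,
        "multiple times service provisioning": bm.multiple_provisioning,
        "product delivery": bm.product_delivery,
        "refund terms and conditions": bm.refund_terms,
        "accepted payment modes": bm.payment_modes,
        "supporting technologies": bm.web_technologies,
        "social-media presence": bm.social_media,
        "judicial system in case of conflicts": bm.judicial_system,
    }

if __name__ == "__main__":
    wire_money = BusinessModel(
        url="www.wiremoney.com",
        secure_access=True,                                  # username/password
        multiple_provisioning=True,
        product_delivery=None,                               # not applicable
        refund_terms="not possible once the wiring is complete",
        payment_modes="PayPal-based third party, domestic credit cards only",
        web_technologies=("HTML 5.0", "Apache 2.4", "JavaScript"),
        social_media={"Facebook": True},
        judicial_system="country where the customer is based",
    )
    for construct, value in data_collect_bm(wire_money).items():
        print(f"{construct}: {value}")
```

As noted in Section 3.3, such a record would mainly feed the dissemination stage, since a BM says little about how data is processed or stored; the technologies discussed next are what would operate on it.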
Initially, and before we embark on the data collection stage, it is useful to determine the DT end-goals that a business owner is aiming for. These goals will specify what data features should be gathered for future processing and analysis. For instance, to offer a customizable ticket fare to an online user, the browser may monitor the time the user spends checking the page and how often she leaves and gets back to the same offer, in order to assess how anxious this buyer might be. Therefore, goal-oriented data collection can be crucial to the success of many artificial intelligence and data science algorithms. Lacking one feature (or dimension) in the data may render such algorithms unusable. On the other hand, we should avert the pitfall of actively soliciting excessive amounts of data (e.g., lengthy surveys) from users, which may add an unnecessary burden.
In the data collection stage, data are usually collected from the organization’s operational databases, internal/external surveys, and daily transactions. Nowadays, however, DT implementors face a plethora of data sources such as social media posts, blogs, IoT sensors, conversations with chat-bots, etc., in addition to the latent data buried in Web sites (e.g., information about the products of competitors and their prices). This demands advanced tools for data extraction, such as data scraping [20], and tools and techniques for data cleansing and preparation.
In the data storage stage, on top of the conventional storage mechanisms represented by files, spreadsheets, database management systems, and even data warehouses, new technologies such as blockchain and the wide spectrum of remote storage like cloud, fog, and edge computing will have a remarkable impact on transforming businesses digitally.
In the data processing stage, businesses look for useful information in the data that helps increase profits, reduce costs, and improve their performance. To that end, data science and a wide range of analytical tools can be used to unveil hidden patterns in the data. Other technologies such as deep learning [21] can help with forecasts, image recognition, smart recommendations, data classification, etc.
Finally, in the data dissemination stage, and on top of the classical means of information dissemination such as reports and dashboards, the results of the data processing can be manifested and consumed by their final stakeholders in different ways, such as personalized recommendations (e.g., movies and ads), targeted offers, smart surveillance, voice recognition, push notifications, demand predictions, sentiment analysis, or even robot and IoT physical actions.

4. Conclusion
This paper presented a guiding framework that aims at assisting organizations in embracing DT in an environment that is becoming more competitive. DT is a long-term journey that an organization embarks on, should it decide to question its practices in light of management, operation, and technology challenges. The guiding framework emphasizes the smooth integration of data into DT by suggesting 4 stages that accompany this integration. These stages are referred to as collection, processing, storage, and dissemination, and were applied to 2 DT initiatives that target organizations’ business processes and business models. Each stage raises a set of questions that shed light on the prominent role of data in ensuring organizations’ competitive advantage.
In terms of future work, we would like to apply the guiding framework to real organizations and examine other aspects of DT like the impact on organizational culture and resistance to change. Another future-work element would be the trustworthiness of data sources. Our guiding framework’s first stage collects data from an ecosystem that most probably hosts both trustworthy and untrustworthy stakeholders.

References
[1] A. Annacone, The 4 Types of Digital Transformation, Technical Report, TechNexus Venture Collaborative, https://www.linkedin.com/pulse/4-types-digital-transformation-andrew-annacone/, 2019.
[2] G. Westerman, C. Calméjane, D. Bonnet, P. Ferraris, A. McAfee, et al., Digital Transformation: A Roadmap for Billion-dollar Organizations, MIT Center for Digital Business and Capgemini Consulting 1 (2011) 1–68.
[3] E. Stolterman, A. Fors, Information Systems Research: Relevant Theory and Informed Practice, volume 143, Springer, Boston, MA, 2004.
[4] Z. Van Veldhoven, J. Vanthienen, Designing a Comprehensive Understanding of Digital Transformation and its Impact, in: Proceedings of the 32nd Bled eConference on Humanizing Technology for a Sustainable Society, Bled, Slovenia, 2019.
[5] B. Solis, J. Szymanski, The Six Stages of Digital Transformation: The Race Against Digital Darwinism, Altimeter, 2016.
[6] J. Huang, A. Gheorghe, H. Handley, P. Pazos, A. Pinto, S. Kovacic, A. Collins, C. Keating, A. Sousa-Poza, G. Rabadi, R. Unal, T. Cotter, R. Landaeta, C. Daniels, Towards Digital Engineering – The Advent of Digital Systems Engineering, International Journal of System of Systems Engineering, arXiv preprint arXiv:2002.11672 (2020).
[7] N. Verina, J. Titko, Digital Transformation: Conceptual Framework, in: Proceedings of the International Scientific Conference on Contemporary Issues in Business, Management and Economics Engineering, 2019.
[8] Q. Corver, G. Elkhuizen, A Framework for Digital Business Transformation, Technical Report, Cognizant, https://www.cognizant.com/InsightsWhitepapers/a-framework-for-digital-business-transformation-codex-1048.pdf, 2014.
[9] M. Fitzgerald, N. Kruschwitz, D. Bonnet, M. Welch, Embracing Digital Technology: A New Strategic Imperative, MIT Sloan Management Review 55 (2014).
[10] B. Rogers, Why 84% of Companies Fail at Digital Transformation, Forbes.com, January 7, 2016.
[11] P. Datta, Digital Transformation of the Italian Public Administration: A Case Study, Communications of the Association for Information Systems (January 2020).
[12] B. Libert, M. Beck, Y. Wind, Questions to Ask before your Next Digital Transformation, Harvard Business Review 60 (2016).
[13] K. Laudon, J. Laudon, Management Information Systems: Managing the Digital Firm, Pearson, 2019.
[14] M. Weske, Business Process Management Architectures, in: Business Process Management, Springer, 2012.
[15] S. Kallel, Z. Maamar, M. Sellami, N. Faci, A. Ben Arab, W. Gaaloul, T. Baker, Restriction-based Fragmentation of Business Processes over the Cloud, Concurrency and Computation: Practice and Experience, Wiley (2019, forthcoming).
[16] Object Management Group (OMG), Business Process Model and Notation, www.omg.org/spec/BPMN/2.0.2.
[17] M. Geissdoerfer, P. Savaget, S. Evans, The Cambridge Business Model Innovation Process, in: Proceedings of the 14th Global Conference on Sustainable Manufacturing (GCSM’2017), 2017.
[18] F. Li, The Digital Transformation of Business Models in the Creative Industries: A Holistic Framework and Emerging Trends, Technovation 92-93 (January 2020).
[19] A. Petra, S. Bettina, G.
Frank, ERP Systems Towards Digital Transformation, Studies in Systems, Decision and Control (2018).
[20] R. Mitchell, Web Scraping with Python: Collecting Data from the Modern Web, O’Reilly Media, 2018.
[21] P. Valter, P. Lindgren, R. Prasad, Advanced Business Model Innovation Supported by Artificial Intelligence and Deep Learning, Wireless Personal Communications 100 (2018).