=Paper=
{{Paper
|id=Vol-3855/aem2
|storemode=property
|title=Enhancing the Assessment of Digital Archive Maturity using Enterprise Architecture
|pdfUrl=https://ceur-ws.org/Vol-3855/aem2.pdf
|volume=Vol-3855
|authors=Miguel Cruz,Sergio Guerreiro,José Borbinha,Diogo Proença
|dblpUrl=https://dblp.org/rec/conf/ifip8-1/Cruz0BP24
}}
==Enhancing the Assessment of Digital Archive Maturity using Enterprise Architecture==
Miguel Cruz1,†, Sérgio Guerreiro1,2,*,†, José Borbinha1,2,† and Diogo Proença1,2,†
1 INESC-ID, Rua Alves Redol 9, 1000-029 Lisbon, Portugal
2 Instituto Superior Técnico, Universidade de Lisboa, Portugal
Abstract
This paper addresses the problem of evaluating the maturity of a digital archive in terms of its digital preservation capabilities. The European eArchiving initiative supports this assessment by providing three key tools: a maturity model (eACMM), a reference architecture (eRA), and a collection of real-world cases based on the eRA framework. To our knowledge, this is the first paper to integrate these artifacts for a comprehensive evaluation. The eACMM is represented by a set of detailed requirements classified by importance, the eRA is represented by an ArchiMate model, and the real-world cases are represented as ArchiMate eRA viewpoints. This paper details the process of assessing the real-world cases against the eACMM and demonstrates the value of such viewpoints. One significant challenge addressed in the literature is the lack of effective view models for identifying gaps in reference architectures. Despite the systematic prescription offered by maturity models, they often lack the supporting mechanisms necessary to fully realize their potential for diagnosing the current stage and identifying the next steps to increase maturity. This paper contributes a solution implemented in an Enterprise Architecture management tool whose output is the automatic creation of Enterprise Architecture blueprints. The results obtained corroborate that our solution readily identifies the eACMM maturity level with respect to eRA concepts, and also enables immediate identification of the actions that should be taken to improve the eACMM level.
Keywords
Blueprint, Digital Archiving, Maturity Model, Reference Architecture, Viewpoint
1. Introduction
This paper addresses how to assess the maturity of a digital archive using the resources of the
eArchiving Capability Maturity Model (eACMM) and the eArchiving Reference Architecture (eRA),
both initially developed by the E-ARK project1 and later adopted by the European Union eArchiving
Initiative2 .
Existing maturity models are beneficial, but often miss the supporting mechanisms to realize their
full potential. This paper seeks to bridge this gap by demonstrating how Enterprise Architecture
(EA) [1, 2] management can enhance the usability and effectiveness of maturity models, fostering better
decision-making and targeted improvements in organizational structures. The goal is to produce visual
aids that can facilitate the maintenance and enhancement of digital archive implementation maturity
by providing blueprints that encapsulate pertinent information about the maturity model. Furthermore,
the paper aims to enable stakeholders to identify and rectify gaps through automatically generated
visual representations, as already attempted by [3]. These blueprints provide a complete overview,
facilitating a more intuitive grasp of the maturity model’s current status and areas for growth.
Companion Proceedings of the 17th IFIP WG 8.1 Working Conference on the Practice of Enterprise Modeling Forum, M4S, FACETE, AEM, Tools and Demos co-located with PoEM 2024, Stockholm, Sweden, December 3-5, 2024
* Corresponding author.
† These authors contributed equally.
$ migueldcostacruz@tecnico.ulisboa.pt (M. Cruz); sergio.guerreiro@tecnico.ulisboa.pt (S. Guerreiro); jlb@tecnico.ulisboa.pt
(J. Borbinha); diogo.proenca@tecnico.ulisboa.pt (D. Proença)
0009-0005-2869-9129 (M. Cruz); 0000-0002-8627-3338 (S. Guerreiro); 0000-0001-5463-8438 (J. Borbinha);
0000-0002-3671-9637 (D. Proença)
© 2024 Copyright for this paper by its authors. Use permitted under Creative Commons License Attribution 4.0 International (CC BY 4.0).
1 https://www.eark-project.com/
2 https://digital-strategy.ec.europa.eu/en/activities/earchiving
CEUR Workshop Proceedings (ceur-ws.org), ISSN 1613-0073
One limitation identified in the literature is the lack of tools currently available for representing maturity gaps. Usually, this is done manually, but automatic approaches are already being suggested as a future solution to obtain better assessments [4]. While maturity models provide valuable insights, their full potential remains unrealized without adequate support mechanisms.
Central to this process is the use of the tool ATLAS3, which supports various features of EA management. Especially relevant to our focus is its capability to generate views providing a visual synthesis of complex data, making it easier to analyze and identify discrepancies within the models. By leveraging ATLAS, the paper demonstrates how a systematic approach can be used to visualize and thereby enhance the maturity models under consideration. A dataset containing real digital archive implementations that used the eRA validates our maturity assessment proposal. It is important to emphasize that this paper is closely aligned with the eArchiving initiative, the eRA, and the eACMM. The eACMM is composed of a set of requirements represented as structured information, while the eRA and the cases were represented in an ArchiMate model as a structured collection of elements, relationships, and viewpoints.
In short, the solution encompasses the following stages, performed in the EAM tool: a) importing data, b) classifying the eACMM in the eRA model, c) computing the eACMM capability level for each element of the real-world case represented by the eRA model, and d) visualizing the eACMM capability levels organized by capability.
The Design Science Research (DSR) method, as outlined by Peffers et al. [5], is employed in this paper due to its alignment with the expected enterprise architecture (EA) deliverables of our work, which are conceptually similar to DSR artifacts. These artifacts represent an organization's structure, components, and interactions, providing a holistic perspective of its processes, information systems, technology, and more. Moreover, DSR is chosen for its suitability for understanding and advancing existing EA models. Its cyclical nature allows for iterative refinement of artifacts based on feedback from experts and practitioners, ensuring that the resulting viewpoints are both theoretically rigorous and practically effective.
The paper is organized as follows. Section 2 presents the background concepts. Then, Section 3
describes and compares the EA tools that are of potential use in our paper. Afterwards, Section 4 details
our solution. Section 5 validates the solution using the available datasets from the E-ARK project. Finally,
Section 6 concludes the paper and identifies future work.
2. Background
The E-ARK project (European Archival Records and Knowledge Preservation) was a multinational initiative aimed at standardizing digital archiving methods across Europe. Launched in 2014 and co-funded by the European Commission, it developed guidelines, open technical products, and an integrated archiving infrastructure to enhance the availability, access, and use of archival data. Its results were adopted by the present eArchiving Initiative, which is influencing the European landscape with regard to digital preservation in the domain of digital archives.
This paper uses the dataset of the "eArchiving Reference Architecture v2.0", which is publicly available at https://kc.dlmforum.eu/earchiving-ra20/, and the maturity model eACMM, both developed by E-ARK. These components already exist and are established in the domain of digital archives and digital preservation. The focus is on enhancing their usability and effectiveness through the development of visual features.
2.1. Maturity Models
Maturity models are widely used and recognized in both business and academia due to their simplicity and effectiveness. They can help stakeholders grasp the present degree of maturity of a certain element in a meaningful way, allowing them to clearly identify strengths and weaknesses that need to be improved, and so prioritize what has to be done to achieve the next level. This may be used to demonstrate the outcomes of that endeavor, allowing stakeholders to determine whether the results justify the work or expenditure.
3 https://linkconsulting.com/what-we-do/products/atlas/
According to Proença and Borbinha [6], a maturity model outlines a progression toward a more structured and systematic approach to business operations. It is a commonly employed method for evaluating specific facets of organizations and businesses, such as business processes. A maturity model represents commonly accepted and validated knowledge; thus, conducting a maturity assessment against an established maturity model helps stakeholders pinpoint strengths as well as areas needing improvement, allowing them to prioritize actions to reach higher maturity levels. These assessments can vary from straightforward self-assessment questionnaires to comprehensive evaluation methods.
Proença and Borbinha [7] discuss how a maturity model can be a useful tool for assessing various domains of concern and charting organized routes to better business practices. Typically, maturity models have various "maturity levels," usually five (but this might vary): Initial, Managed, Defined, Quantitatively Managed, and Optimizing. Maturity models provide businesses with: (1) auditing and benchmarking metrics; (2) progress evaluation against objectives; and (3) insights into strengths, shortcomings, and opportunities, which help in strategic decision-making and project management. Maturity is described as the process of managing and directing an entity's growth, meeting organizational goals, and transitioning from an initial to a more mature condition. Mature companies rely on different indicators than less mature ones. Maturity models guide the progression from a starting condition to a targeted maturity state, offering a framework for evaluation, benchmarking, and continuous improvement.
2.2. eACMM
The eACMM, an E-ARK product, considers five maturity levels, which are, from the lowest to the highest: (1) Initial, (2) Basic, (3) Intermediate, (4) Advanced, and (5) Optimizing, as depicted in Figure 1. It also focuses on six fundamental capabilities for digital preservation: Pre-Ingest, Ingest, Access, Preservation and accessibility planning, Data Management, and Archival Storage.
Figure 1: The eArchiving Capability Maturity Model (eACMM).
This method allows businesses to analyze their information governance practices around digital preservation. The eACMM focuses on the most significant references for digital archiving, particularly those that have been enhanced as part of the former eArchiving Building Block and the present eArchiving Initiative.
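As a quick reference, the five levels and six capabilities can be captured as simple constants. This is a sketch: the identifier names are ours, not part of the eACMM specification.

```python
from enum import IntEnum

class MaturityLevel(IntEnum):
    # The five eACMM maturity levels, ordered from lowest to highest.
    INITIAL = 1
    BASIC = 2
    INTERMEDIATE = 3
    ADVANCED = 4
    OPTIMIZING = 5

# The six fundamental eACMM capabilities for digital preservation.
CAPABILITIES = (
    "Pre-Ingest",
    "Ingest",
    "Access",
    "Preservation and accessibility planning",
    "Data Management",
    "Archival Storage",
)
```

Using an `IntEnum` makes the levels directly comparable, which is convenient when aggregating assessments (e.g., taking the minimum level across elements of a capability).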
2.3. Maturity models and reference architectures
Reference architectures and maturity models are two key ideas in EA [2] that strive to improve organizational efficiency and effectiveness. Reference architectures provide a standardized framework for designing and implementing business processes and IT infrastructure, whereas maturity models examine an organization's processes and structures for sophistication and competence. Understanding the relationship between these two ideas is critical for achieving an organization's strategic objectives and operational excellence.
Pereira and Sousa [8] discuss how reference architectures, such as the Zachman Framework [9],
TOGAF (The Open Group Architecture Framework)4 , and others offer blueprints for the structure
and operation of the IT and business processes in an organization [10]. These references define the
important components and their linkages, ensuring that the organization’s procedures are in accordance
with its strategic goals. Reference architectures thus provide uniform languages and sets of standards
to simplify the understanding of complicated realities, promote consistency, and ensure interoperability
across domains.
Luftman [11] used the Capability Maturity Model Integration (CMMI) and the Business IT Alignment Maturity model to assess the current state of an organization's processes and to provide a road map for continuous improvement. Models like these make it possible to evaluate the effectiveness, efficiency, and agility of business and IT processes, identifying strengths and areas for improvement. They can guide organizations in progressing through various maturity levels, from initial and ad hoc practices to optimized and continuously improving processes.
The symbiotic relationship between reference architectures and maturity models is addressed by Pereira and Sousa [12]. Reference architectures can serve as a foundation upon which maturity models can be applied, as they can provide the structured environment necessary for an effective maturity assessment.
For instance, the use of a comprehensive reference architecture ensures that all components of the
organization that are modeled can be considered during the maturity assessment, thereby providing a
holistic view of the capabilities of the business.
Van Steenbergen et al. [13] discuss how maturity models leverage reference architectures to identify gaps and weaknesses in the current system. By mapping maturity model assessments to the
reference architecture, organizations can pinpoint specific areas that require improvement and develop
targeted strategies to enhance those areas. This alignment ensures that improvements are implemented
in a systematic and cohesive way, consistent with the organization’s overarching strategic direction.
For example, the Business IT Alignment Maturity model may be linked to the architectural domains
of TOGAF (business, data, application, and technology), indicating a clear route to greater levels of
alignment and maturity.
Gonzalez-Perez et al. [14] emphasize that reference models facilitate the implementation of maturity
models by providing predefined templates and best practices. These characteristics are critical for
businesses with lower maturity levels, where procedures may be unclear or poorly defined. Reference
models provide a starting point and a set of standards that may be customized to meet the organization’s
particular needs. This organized strategy reduces the risks associated with ad hoc deployments and
guarantees that gains last in the long run.
According to Proença and Borbinha [15], the EA domain exists to address the challenge of ensuring alignment between IT and business. EA offers advice on harmonizing business and IT, but adopting its approaches can be difficult. Their study seeks to provide a maturity model for EA in companies as a governance tool for analyzing and improving their existing status. Maturity models enable re-engineering linked to the EA life cycle through benchmarking and road map planning. Their model is built on design science research and current literature; it identifies essential success elements while also reviewing previous maturity models. The suggested model, which was tested in a multi-step procedure, fills gaps in existing models by including best practices and key success criteria. EA entails developing models that represent an enterprise's organizational structure, business processes, information systems, and infrastructure in order to manage and maintain alignment over time. Despite existing approaches, companies lack tools for assessing their present EA practices and identifying changes, making it difficult to improve their EA management practices.
In conclusion, the relationship between reference architectures and maturity models in the EA
context is integral to achieving organizational excellence. Reference architectures provide the structural
framework and common language necessary for effective maturity assessments, while maturity models
4 https://www.opengroup.org/togaf
offer a road map for continuous improvement. Together, they enable organizations to systematically
enhance their processes and capabilities, ensuring alignment with strategic objectives and fostering
long-term success [16, 17].
3. Enterprise Architecture tools evaluation
This section reports on EA tools that are available to capture, transform, and generate related data. EA is defined by TOGAF [18] as "the structure of components, their inter-relationships, and the principles and guidelines governing their design and evolution over time". Also, Greefhorst and Proper [19] consider three perspectives for architecture: "regulation-oriented, design-oriented, and knowledge-oriented, where the first corresponds to the prescriptive perspective, the second corresponds to the descriptive perspective, and the third corresponding to the high-level design decisions of the system". EA ways of working are proposed by many standards that are widely used in industry, e.g., TOGAF, DODAF, MODAF, etc., and are recognized as a lingua franca between clients, suppliers, consultants, and researchers. The use of tools aids the creation of EA artifacts, such as catalogs, matrices, and diagrams, avoiding the large amounts of time and effort otherwise needed to create them.
From the data summarized in Table 1, it is observable that ATLAS in comparison to other tools such
as ADOIT5 , Archi6 , ARIS7 , Enterprise Architect8 , LeanIX9 , MEGA10 , and BizzDesign11 , offers features
particularly suited to the maintenance of EA models and the creation of on-the-fly views of the model.
ATLAS was used with a specific focus on digital preservation efforts, such as the E-ARK project, and
is primarily used to generate graphical representations that aid in the analysis and decision-making
processes in complex architectural environments.
ATLAS was chosen for the E-ARK project because of its capabilities to model and manage digital
preservation procedures, and pragmatically, because it was a known tool for the research team. It
assisted with critical activities such as blueprint design and effect analysis, both of which were required
to assess digital archive maturity. However, it is crucial to highlight that the applicability of ATLAS
varies based on the project objectives, and this comparison reflects its specific application within the
context of the E-ARK project.
4. Solution
This section demonstrates the solution, systematically increasing the level of information provided
to ensure a thorough understanding of the concepts presented. This step-by-step approach enhances
clarity and guides the reader through the intricacies of our method, facilitating a detailed exploration of
how our solution addresses the research problem. As an overview of the solution, Figure 2 depicts how
the three steps are executed to generate the EA artifacts.
Prior to commencing the analysis of the solution, it is crucial to provide a comprehensive description of the dataset. The dataset is fundamentally derived from the project's reference architecture, which encompasses 616 elements across various classes. Table 2 provides a detailed breakdown and analysis of these dataset elements.
The initial step is to import the dataset into our ATLAS tool, and then create our metamodel based on the classes in the dataset. It is also possible to add properties to these classes within the dataset. For example, in addition to recording a Business Actor like Joe, the tool allows properties to be added to an object, such as Joe's age. In ATLAS, the information about the Joe object is stored, namely its Business Actor type and that he is, e.g., 40 years old. Following this analogy, the method manipulates and adds properties to
5 https://www.boc-group.com/en/adoit/
6 https://www.archimatetool.com/
7 https://aris.com/aris-enterprise/
8 https://sparxsystems.com/
9 https://www.leanix.net/en/
10 https://www.mega.com/enterprise-architecture-ea-tool
11 https://bizzdesign.com/
Table 1
A comparison of EA tools considering the attributes of: cloud usage, architectural layers, standards, artifacts
produced and meta-model usage.
Tool | Cloud/Non-Cloud | Layers Supported | Standards | Artifacts | Meta-model
ADOIT | Both | Business, Information, Application, Technology | ArchiMate, TOGAF | Process models, ERDs, UML diagrams, Data flow diagrams | Variable
Archi | Both | Business, Information, Application, Technology | ArchiMate | Process models, ERDs, UML diagrams, Reports | Fixed
ARIS | Both | Business, Information, Application, Technology | ArchiMate, TOGAF | Process maps, Swimlanes, Decision trees, Organizational charts | Variable
Enterprise Architect | Both | Business, Information, Application, Technology | ArchiMate, TOGAF | Process models, Data models, UML diagrams, BPMN diagrams | Fixed
LeanIX | Cloud | Technology | — | Resource maps, Cost reports, Value stream maps | Variable
MEGA | Both | Business, Information, Application, Technology | ArchiMate, TOGAF | Process models, Data models, UML diagrams, Organizational charts | Variable
ATLAS | Both | Business, Information, Application, Technology | ArchiMate, TOGAF | Process maps, Swimlanes, Decision trees, Organizational charts | Fixed
BizzDesign | Both | Business, Information, Application | — | Process models, Data models, UML diagrams, BPMN diagrams | Variable
metamodel classes using this analogy, allowing us to create blueprints. In this scenario, the Boolean property "isAssessed" is added to all metamodel classes to indicate which objects in the dataset were or were not covered by the maturity model. A property named "environment" is also included to distinguish which features are part of each case study and of the reference architecture itself. In addition to the classes found in the dataset, an Assessment class is generated, which has attributes such as maturity level, capability level, and capability function. Objects of this class can be associated with objects of any other class in the metamodel, thus providing information on the relationship between the assessment made and the element covered by that assessment.
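This metamodel extension can be mirrored in code as follows. It is a minimal sketch assuming plain record types; ATLAS's internal representation of classes and properties differs.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Assessment:
    # The generated Assessment class: links a maturity judgment to an element.
    maturity_level: int        # 1 (Initial) .. 5 (Optimizing)
    capability_level: int
    capability_function: str   # e.g. "Ingest", "Archival Storage"

@dataclass
class Element:
    # A dataset element (e.g. a Business Actor) with the two added properties.
    name: str
    cls: str                       # metamodel class, e.g. "Business Actor"
    environment: str               # "RefArch" or a case name, e.g. "RODA"
    is_assessed: bool = False      # the added Boolean "isAssessed" property
    assessment: Optional[Assessment] = None

# The "Joe" example from the text: a Business Actor with added properties.
joe = Element(name="Joe", cls="Business Actor", environment="RODA")
joe.is_assessed = True
joe.assessment = Assessment(maturity_level=3, capability_level=2,
                            capability_function="Ingest")
```

The optional `assessment` field plays the role of the association between an Assessment object and the element it covers.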
Our solution is based on the construction of blueprints and can be categorized into two distinct branches. One branch focuses on conducting a compliance analysis between real cases and the eRA, while the other centers on providing an overview of the eACMM. In the demonstration of the first branch, a comparison between the RODA case and the eRA is made, although the solution has the capability to analyze five real cases: RODA, ARS, ESSArch, NAE, and NAF. The solution covers all these cases by means of initial parameters supplied when a blueprint is started: when a blueprint of the first part of the solution is opened, a pop-up window appears, allowing the user to choose an option from the list of real cases mentioned above. For example, for the following demonstrations, the
STEP A: Import data from eRA Archi model, including the RODA, ARS, ESSArch, NAE and NAF implementation cases.
STEP B: Using eACMM as a reference, classify the capability level of each architectural element as a data property.
STEP C: Compute data, using Algorithm 1 and 2, to obtain the visualization of the EA Diagram Artifacts.
Figure 2: An overview of the solution encompassing data ingestion, data classification and data computation
and visualization.
Table 2
Distribution of the number of instances organized by class.
Class | No. Elements
Application Component | 153
Application Interface | 62
Application Function | 46
Application Process | 79
Application Service | 15
Business Function | 48
ArchiMate Model | 9
Business Process | 32
Course of Action | 28
Business Service | 27
Business Object | 7
Business Event | 32
Business Role | 29
Value Stream | 10
Stakeholder | 17
Capability | 4
Grouping | 12
Contract | 1
Principle | 2
Junction | 1
Driver | 1
Goal | 1
TOTAL | 616
RODA parameter has been chosen. This means that the elements to be compared with the reference
architecture will be those present in the RODA architecture.
4.1. Case Studies and Application of the eACMM Model
In this study, various case studies were used to demonstrate the actual implementation of the eACMM maturity model in conjunction with EA models. The case studies chosen include RODA, ARS, ESSArch, NAE, and NAF. These are the same cases used in the reports against which we validate our results.
RODA (Repository of Authentic Digital Objects) is a digital preservation platform that ensures
long-term access to digital content. It covers the entire digital archiving process, including ingest,
archival storage, and access to digital objects.
The ARS (Archival Repository System) focuses on the management of archival records, ensuring that
ingest, archival storage, and preservation processes are compliant with digital preservation standards.
ESSArch is a digital preservation system designed to maintain the integrity and accessibility of
digital archives over time.
The NAE (National Archives of Estonia) focuses on ingesting and preserving large volumes of
government records.
NAF (National Archives of Finland) handles both the preservation and dissemination of government
records and historical data.
4.2. Clarifying the Link between eACMM and the Solution
The purpose of the eACMM is to evaluate the maturity of digital preservation processes. It focuses on
three major aspects of an archive: ingest, archival preservation, and dissemination. These processes are
crucial in ensuring the long-term preservation and accessibility of digital records. The eACMM evaluates
an organization’s capabilities in managing these processes using a set of capability levels, ranging from
basic to optimizing. Each level assesses an organization’s ability to implement and maintain effective
digital preservation workflows.
Our suggested approach makes use of EA models, specifically ArchiMate, to improve these procedures by offering clear, pictorial depictions of the systems in question. These models aid in detecting weaknesses and potential areas for enhancement in the adoption of digital preservation best practices. By mapping digital preservation systems' capabilities to the eACMM, it is guaranteed that organizations can systematically advance their maturity in all important areas. For instance, depicting the data flow through the ingest process, from submission to archival storage, using ArchiMate models emphasizes decision-making points and roles. By evaluating the implemented flow against the eACMM maturity criteria, inefficiencies or potential issues in the preservation workflow can then be identified, and changes consistent with greater capability levels can be suggested.
4.3. Compliance analysis
As previously mentioned, our solution includes a branch dedicated to conducting a compliance analysis between a given real case and the E-ARK reference architecture. This analysis involves assessing both sides to determine their alignment and compatibility. We start with the most basic subset of this solution view: Figure 3 illustrates the business functions present in both the reference architecture and the scenario. The left box displays all business functions within the reference architecture, while the right box shows those present in the scenario. This visual comparison allows for a quick assessment of compliance, identifying which elements align and which do not between the two sets.
Figure 3: Basic subset of the solution.
Figure 4: Subset of the solution after applying the second step.
In the next stage, a box is placed in the middle to indicate which elements have been reviewed and are therefore included in the maturity model. At this point, merely adding a box increases the amount of information provided by the blueprint: the middle box reveals the parts examined by the maturity model, which are the model's major components. Figure 4 illustrates the process.
In the third step of our process, the box housing the assessed elements is organized by capability, and an alternative blueprint version categorized by capability levels is created. To enhance clarity and usability, color coding is incorporated, specifically employing green and red hues within the scenario-elements box. These colors serve a crucial role in swiftly identifying whether an element has undergone evaluation or not: a red indication adjacent to an element signifies its non-assessment status, whereas a green indication denotes that the element has been thoroughly evaluated. This strategic integration of color coding not only streamlines the assessment process but also enhances the overall accessibility and interpretation of the blueprint and its respective components.
Algorithm 1 presents the pseudo-code for the construction process of the final blueprints.
Algorithm 1 Pseudo-code explaining the blueprint organized by the assessments distributed by each
capability.
case ← one of {RODA, ARS, ESSArch, NAE, NAF}
for each object in the dataset do
    if object.environment = RefArch then
        object goes to the RefArch container
    end if
    if object.environment = case then
        object goes to the case container
        if object.isAssessed = true then
            object is shown in green
        end if
    end if
    if object.isAssessed = true then
        object goes to the Assessed container
        if object.Assessment.CapabilityFunction = Pre-Ingest then
            object goes to sub-container Pre-Ingest
        else if object.Assessment.CapabilityFunction = Ingest then
            object goes to sub-container Ingest
        else if object.Assessment.CapabilityFunction = Access then
            object goes to sub-container Access
        else if object.Assessment.CapabilityFunction = Preservation then
            object goes to sub-container Preservation
        else if object.Assessment.CapabilityFunction = Data Management then
            object goes to sub-container Data Management
        else if object.Assessment.CapabilityFunction = Archival Storage then
            object goes to sub-container Archival Storage
        end if
    end if
end for
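Algorithm 1 can be rendered in executable form roughly as follows. This is a sketch, not the ATLAS implementation: the dictionary-based element records and the nested-dict container structure are our assumptions.

```python
CAPABILITY_FUNCTIONS = ("Pre-Ingest", "Ingest", "Access", "Preservation",
                        "Data Management", "Archival Storage")

def build_blueprint(dataset, case):
    """Distribute dataset elements into the blueprint containers of Algorithm 1."""
    blueprint = {
        "RefArch": [],
        case: [],
        # one Assessed sub-container per capability function
        "Assessed": {cf: [] for cf in CAPABILITY_FUNCTIONS},
    }
    for obj in dataset:
        if obj["environment"] == "RefArch":
            blueprint["RefArch"].append(obj["name"])
        if obj["environment"] == case:
            # green = assessed, red = not assessed (the color coding of step 3)
            color = "green" if obj.get("isAssessed") else "red"
            blueprint[case].append((obj["name"], color))
        if obj.get("isAssessed"):
            cf = obj["assessment"]["CapabilityFunction"]
            blueprint["Assessed"][cf].append(obj["name"])
    return blueprint

# Toy dataset: one assessed RODA element and one reference-architecture element.
dataset = [
    {"name": "Ingest SIP", "environment": "RODA", "isAssessed": True,
     "assessment": {"CapabilityFunction": "Ingest"}},
    {"name": "Store AIP", "environment": "RefArch"},
]
bp = build_blueprint(dataset, case="RODA")
```

The returned structure mirrors the three-box layout of the generated blueprints: reference architecture on the left, assessed elements grouped by capability in the middle, and color-coded case elements on the right.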
The pseudo-code shown corresponds to the blueprint version organized by capability levels, but it can easily be adapted to the maturity-level version. For the capability version, the solution checks the value of the "CapabilityFunction" property of the assessment associated with each dataset element. To adapt it to the maturity-level blueprint version, a check of the number written in the "MaturityLevel" property of the assessment associated with an object is required instead; it would read object.Assessment.MaturityLevel.
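The adaptation just described amounts to swapping the grouping key of the assessment property. A hedged sketch, assuming the same dictionary-based element records as before:

```python
def group_assessed(dataset, key):
    # Group assessed elements by any assessment property:
    # key="CapabilityFunction" gives the capability blueprint,
    # key="MaturityLevel" gives the maturity-level variant.
    groups = {}
    for obj in dataset:
        if obj.get("isAssessed"):
            groups.setdefault(obj["assessment"][key], []).append(obj["name"])
    return groups

# Toy dataset with both properties filled in (values are illustrative).
dataset = [
    {"name": "Ingest SIP", "isAssessed": True,
     "assessment": {"CapabilityFunction": "Ingest", "MaturityLevel": 3}},
    {"name": "Access DIP", "isAssessed": True,
     "assessment": {"CapabilityFunction": "Access", "MaturityLevel": 3}},
]
```

With this parameterization, both blueprint versions share one grouping routine and differ only in the property name passed in.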
Figures 5 and 6 present the generated blueprints that include this third step. The leftmost box contains all the elements belonging to the reference architecture. In the middle box, these same elements are evaluated, i.e., they are linked to the maturity model. The right-hand box depicts the elements that are part of the scenario, in this case the RODA scenario. Although only part of the elements present in the reference architecture appear, they correspond, and the small colored boxes show that the elements evaluated in the scenario also conform to the reference model.
In this case, the scenario is in line with the reference architecture. Therefore, the only recommendation to the organization would be to add the elements that the reference architecture box contains and the real case does not. The elements that should be assessed are also in conformity. Figure 7 exemplifies the final solution blueprint covering all the layers that the scenario contains.
Figure 5: Subset of the solution with the assessed box organized by capability and with color appointment.
4.4. Maturity model overview
This section introduces an alternative blueprint emphasizing the maturity context of the eACMM’s
capabilities. Each capability is depicted as a series of nested boxes, illustrating its evolution across the
capability levels. This hierarchical representation offers a detailed view of how each capability
progresses in terms of maturity within the E-ARK framework. Algorithm 2 presents pseudo-code
demonstrating the logic of this part of the implementation. Because the code becomes repetitive, only
the case in which the object is placed in the Pre-Ingest sub-container is exemplified; it generalizes to
Ingest, Access, Preservation, Data Management and Archival Storage.
Figure 8 shows not only one class but as many as our maturity model contains. One business process
and two business functions are observed. It is also relevant how the elements are distributed across
their capability levels. Figure 9 depicts the global result of the blueprint, here intentionally shown
only as a low-detail overview.
5. Validation
This section details the validation process undertaken to assess the effectiveness and reliability of
our proposed solution. Our validation criterion centers on a comparison with the definitive reports 12 13
generated by the E-ARK project team, which are considered authoritative and universally accepted as
correct. The core of our validation hinges on achieving results identical to those outlined in these
benchmark documents. Our approach systematically scrutinizes the outputs generated by our solution
against the outcomes documented in the E-ARK reports. This thorough analysis ensures not only technical accuracy
12
https://www.eark-project.com/resources/project-deliverables/46-d72initassess/eark_d7_2v2.pdf
13
https://www.eark-project.com/resources/project-deliverables/96-d76-1/620998%20E-ARK%20D7.6.pdf_%3b%20filename_
%3dUTF-8%27%27620998%2520E-ARK%2520D7.6.pdf
Figure 6: Subset of the solution with the assessed box organized by maturity level and with color coding.
Algorithm 2 Pseudo-code explaining the construction of the solution blueprint showing the maturity
model.
for each 𝑜𝑏𝑗𝑒𝑐𝑡 in the dataset do
if 𝑜𝑏𝑗𝑒𝑐𝑡.isAssessed = true then
𝑜𝑏𝑗𝑒𝑐𝑡 is selected to be part of the maturity model
if 𝑜𝑏𝑗𝑒𝑐𝑡.Assessment.CapabilityFunction = Pre-Ingest then
𝑜𝑏𝑗𝑒𝑐𝑡 goes to sub-container Pre-Ingest
if 𝑜𝑏𝑗𝑒𝑐𝑡.Assessment.CapabilityLevel = 1 then
𝑜𝑏𝑗𝑒𝑐𝑡 goes to sub-sub-container Capability Level 1
else if 𝑜𝑏𝑗𝑒𝑐𝑡.Assessment.CapabilityLevel = 2 then
𝑜𝑏𝑗𝑒𝑐𝑡 goes to sub-sub-container Capability Level 2
else if 𝑜𝑏𝑗𝑒𝑐𝑡.Assessment.CapabilityLevel = 3 then
𝑜𝑏𝑗𝑒𝑐𝑡 goes to sub-sub-container Capability Level 3
else if 𝑜𝑏𝑗𝑒𝑐𝑡.Assessment.CapabilityLevel = 4 then
𝑜𝑏𝑗𝑒𝑐𝑡 goes to sub-sub-container Capability Level 4
else if 𝑜𝑏𝑗𝑒𝑐𝑡.Assessment.CapabilityLevel = 5 then
𝑜𝑏𝑗𝑒𝑐𝑡 goes to sub-sub-container Capability Level 5
end if
end if
end if
end for
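Assuming the dataset is available as plain dictionaries (a simplification of the tool's actual objects), the nesting logic of Algorithm 2 can be sketched as a runnable function:

```python
# A sketch of Algorithm 2: assessed objects are nested first by capability
# function (sub-container) and then by capability level (sub-sub-container).
# A dict of dicts stands in for the ATLAS containers; the field names follow
# the pseudo-code, not a real tool API.
def build_maturity_blueprint(dataset):
    blueprint = {}
    for obj in dataset:
        if not obj.get("is_assessed"):
            continue  # only assessed objects enter the maturity model
        assessment = obj["assessment"]
        function = assessment["capability_function"]   # e.g. "Pre-Ingest"
        level = "Capability Level %d" % assessment["capability_level"]
        blueprint.setdefault(function, {}).setdefault(level, []).append(obj["name"])
    return blueprint

dataset = [
    {"name": "Receive submission", "is_assessed": True,
     "assessment": {"capability_function": "Pre-Ingest", "capability_level": 2}},
    {"name": "Create SIP", "is_assessed": True,
     "assessment": {"capability_function": "Pre-Ingest", "capability_level": 2}},
    {"name": "Unassessed element", "is_assessed": False},
]
print(build_maturity_blueprint(dataset))
```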
but also alignment with the established standards and methods stipulated by E-ARK. By demonstrating
that our solution consistently produces results congruent with those in the authoritative reports, this
paper underscores its reliability and applicability within the context of compliance analysis for reference
architectures.
Moreover, this validation phase serves as a cornerstone in affirming the credibility of our solution. The
convergence of our findings with those of the E-ARK report validates not only the technical robustness
of our method but also its potential to contribute significantly to advancing the field’s understanding
and implementation of compliance analysis within reference architecture frameworks.
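Conceptually, the comparison against the benchmark reports can be sketched as a field-by-field check of the assessed levels. The capability names and level values below are illustrative, not values taken from the actual E-ARK reports:

```python
# Validation sketch: the levels produced by our solution are compared
# capability by capability against benchmark values transcribed from the
# E-ARK reports. All numbers here are illustrative.
def compare_results(ours, benchmark):
    """Return the capabilities whose assessed level disagrees with the benchmark."""
    return [cap for cap in benchmark if ours.get(cap) != benchmark[cap]]

benchmark = {"Pre-Ingest": 3, "Ingest": 4, "Archival Storage": 3}
ours = {"Pre-Ingest": 3, "Ingest": 4, "Archival Storage": 3}
print(compare_results(ours, benchmark))  # an empty list means full agreement
```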
Figure 7: The excerpt of a blueprint organized by the assessments distributed by maturity level,
spanning the Motivation, Business, and Application layers, each with RefArch, Assessed, and RODA
columns. The blueprint is cropped due to the paper’s limits.
Figure 8: Subset of the solution centered on the maturity context and on the Pre-Ingest OAIS ISO
standard package.
Figure 9: A glimpse view of the complete solution blueprint showing the maturity model for the business process
elements of the RODA implementation case organized by OAIS ISO standard packages. It is possible to identify
the distribution of the business processes through the five capability levels.
6. Conclusion
This paper presents an advancement in the field of EA by addressing the critical need for effective
tools/view models to enhance and maintain maturity models. By developing visual tools using the
ATLAS platform, this research provides a novel approach to synthesizing complex data into compre-
hensible blueprints. These visual representations enable stakeholders to identify and rectify gaps,
fostering a more intuitive understanding of the maturity model’s current state and areas requiring
improvement. The alignment with the E-ARK project underscores the practical applicability of this
research, demonstrating its relevance and potential impact. By integrating these visual tools with the
E-ARK reference architecture and maturity model, the paper addresses the existing gaps between them
and enhances the overall usability and effectiveness of maturity models.
Ultimately, this work aims to empower stakeholders to make more informed decisions and implement
targeted improvements within their organizational structures. The comprehensive approach outlined
in this paper not only maximizes the utility and impact of maturity models but also contributes to the
broader goal of advancing EA practices. This paper represents a pivotal step towards more effective
management and enhancement of maturity models, ensuring they fulfill their potential as valuable
tools for organizational development and success.
In the end, we realized that two tasks could have been done differently, and we now identify them as
future work. First, the assessments could have been imported by interpreting ATLAS Forms, creating a
questionnaire that generates the same data as the manual importation that was performed; by filling in
a questionnaire, the maturity assessment process would resemble the process used to generate the
reports with which we validated our solution. Second, the second part of the solution, which had a
different focus and did not allow the manipulation of colored flags, could have included a compliance
analysis for the selected practical cases.
Acknowledgement
This work was funded by national funds through Fundação para a Ciência e a Tecnologia (FCT) (No:
UIDB/50021/2020 (INESC-ID)), and the INESC-ID project E-ARK CSP (PR09033) and the eArchiving
Common Services Platform - CNECT/LUX/2021/OP/0077 NUMBER – LC-01905904.