=Paper= {{Paper |id=Vol-2017/paper05 |storemode=property |title=Multi-Level Compliance Measurements for Software Process Appraisal |pdfUrl=https://ceur-ws.org/Vol-2017/paper05.pdf |volume=Vol-2017 |authors=Suppasit Roongsangjan,Thanwadee Sunetnanta,Pattanasak Mongkolwat |dblpUrl=https://dblp.org/rec/conf/apsec/RoongsangjanSM17 }} ==Multi-Level Compliance Measurements for Software Process Appraisal== https://ceur-ws.org/Vol-2017/paper05.pdf
            5th International Workshop on Quantitative Approaches to Software Quality (QuASoQ 2017)




                Multi-Level Compliance Measurements
                   for Software Process Appraisal
                       Suppasit Roongsangjan, Thanwadee Sunetnanta, Pattanasak Mongkolwat
                     Faculty of Information and Communication Technology, Mahidol University,
                       999 Phuttamonthon 4 Road, Salaya, Nakhon Pathom 73170, THAILAND
                    suppasit.run@student.mahidol.ac.th, {thanwadee.sun, pattanasak.mon}@mahidol.ac.th


     Abstract— Software process appraisal assesses whether an implemented software process complies with a process reference model. To conduct the appraisal, the appraisal team requests the organization to provide objective evidence reflecting practice implementation. Such evidence is then examined, verified, and validated to generate appraisal results. This evidence collection is done after a process is implemented. To better prepare for software process appraisal, we argue that the compliance of a process can be measured prior to its implementation. In light of that, we propose multi-level compliance measurements to determine process reference model compliance, in terms of Process Model Readiness Score, Process Enactment Score, and Process Implementation Readiness Score. These measurements help provide an insight analysis of where the problems of practice implementation lie, i.e., at process modeling, at process enactment, or at process implementation.

   Keywords—Software Process Appraisal, Software Process Improvement (SPI), Insight Analysis, Compliance Measurement, Process Reference Model (PRM), Process Enactment

                      I.    INTRODUCTION
    In a software development organization, organizational maturity can be measured by an appraisal process [1]. A software process appraisal determines strengths and weaknesses of an implemented software development process against a process reference model (PRM) by an appraisal team [2]. A PRM is a collection of practices. Well-known examples of PRMs are ISO/IEC 12207 Systems and software engineering - Software life cycle processes [3], ISO/IEC 29110 Software engineering - Lifecycle profiles for Very Small Entities (VSEs) [4], and Capability Maturity Model Integration (CMMI) [2].

    Objective evidence is a result of a physical implementation of a process model. It can be output work products and outcomes. During an appraisal process, an appraisal team analyzes appraisal requirements, develops an appraisal plan, and obtains and inventories initial objective evidence [5]. In so doing, the appraisal team members (ATMs) collect output work products and usually conduct interview sessions to obtain affirmations of the outcomes. ATMs use objective evidence to indicate PRM compliance. Typically, such an appraisal process is done after the process implementation has finished. PRM compliance, therefore, is a measure of the implementation of a process model.

    To better prepare for software process appraisal, we argue that the compliance of a process can be measured prior to its implementation. That is, we can check model practice compliance from how the process is defined, i.e., its process model. In light of that, we propose multi-level compliance measurements for software process appraisal. The measurements quantify the compliance in terms of Process Model Readiness Score, Process Enactment Score, and Process Implementation Readiness Score at process modeling, process enactment, and process implementation, respectively.

    In the next section, we describe existing research works related to process compliance measurement and highlight our contribution in comparison with them. In Section III, we explain the proposed measurements. Section IV shows calculations of these measurements and their applications for insight analysis in process design, process implementation, and appraisal contexts. Section V concludes this paper.

              II.   PROCESS COMPLIANCE MEASUREMENTS
                         AND RELATED WORKS
    A recent study of obstacles in software process improvement (SPI) by Khan et al. (2017) [6] argued that the need for process deployment techniques is more crucial than the need for new SPI models. Process deployment is concerned with introducing and supporting a new process model in a working environment [7]. The effectiveness of a deployed process can be measured by using process enactment tools, such as Spider-PE [8], SysProVal [9], and Taba Workstation [10]. Spider-PE is a process enactment tool that shows process adherence to a PRM. SysProVal and Taba Workstation measure team performance by using time and effort. Spider-PI [11] is a process improvement tool that shows strengths, weaknesses, opportunities, and threats of a process model when compared with a PRM.

    Some process enactment tools measure compliance of a process model implementation against the process model. Process enactment deviation represents the difference between the implementation of a process model and the model itself. Full compliance means no deviation. Several types of process enactment deviations are listed in the work




    Copyright © 2017 for this paper by its authors.



of Thompson et al. (2007) [12]. The works of Smatti et al. (2015) [13] and Silva et al. (2011) [14] measured deviation levels by using criteria and predefined rules. They counted the number of unmet criteria as a process enactment deviation measurement. He et al. (2009) [15] used process patterns, such as sequence, parallel, and choice of activities, to measure process enactment deviation. Absent, skipped, or reverse-ordered implemented tasks represent a non-compliant process model. The works of Huo et al. (2006a, b) [16], [17] and Hug et al. (2012) [18] used data mining techniques to find deviated process enactment.

    While process enactment deviation is measured during process implementation, the compliance of a process model implementation against a PRM is measured as part of software process appraisal. Software process appraisal tools usually follow the measurement framework defined by process assessment models, such as ISO/IEC 15504 Information technology - Software process assessment [19], and CMMI [2]. Examples of software process appraisal tools are SEAL Q [20], Generic Software Process Assessment [21], Appraisal Assistant [22], Appraisal Wizard [23], SPiCE 1-2-1 [24], ProEvaluator [25], SelfVation [26], CERTICSys [27], and the Assessment Visualisation Tool (AVT) [28]. The compliance of a process model implementation against a PRM is usually represented in terms of process maturity and capability levels.

    Unlike the existing works that we have reviewed, we propose an additional level of compliance measurement between a process model and a PRM. This measurement represents the similarity of a process model to a PRM in an appraisal context. The work of Gerke et al. (2009) [29] also raised the importance of this measurement. The compliance distance from a process model to a PRM helps determine the SPI effort needed to achieve the maturity levels and capability levels defined by the PRM. Such compliance distance can be measured before we actually implement the process model, and thus encourages us to achieve compliance by the design of the process model. In the next section, we explain how to combine this additional compliance measurement with the existing ones to create our model of multi-level compliance measurements for software process appraisal.

       III. MULTI-LEVEL COMPLIANCE MEASUREMENTS
             FOR SOFTWARE PROCESS APPRAISAL

    Fig. 1 illustrates our model of multi-level compliance measurements for software process appraisal. It continues our work in [30], which focuses on the tasks in a process model. We argue that compliance measurement should not be limited to the time of an appraisal; it should be done from the design of the process model, through process model implementation, to process appraisal. Therefore, we propose three levels of compliance measurements that suit different stages of work towards software process appraisal: Process Model Readiness Score, Process Enactment Score, and Process Implementation Readiness Score. Firstly, the Process Model Readiness Score measures the compliance between a PRM and a process model to reflect compliance by the design of the process model.

                 Figure 1. Multi-Level Compliance Measurements

Secondly, the Process Enactment Score measures the compliance between the deployed process model and the process model itself to reflect compliance by the enactment of the process model. Finally, the Process Implementation Readiness Score measures the compliance between the deployed process model and the PRM to reflect compliance by the implementation of the process model.

    Note that our model does not attempt to measure capability and maturity levels as the ones defined by CMMI [2] and ISO/IEC 15504 [19]. Instead, each level of measurement in our model aims to quantify the degree of effort required to achieve those capability and maturity levels. The following subsections describe how these measurements are calculated.

A. Process Model Readiness Score
    As mentioned earlier, the Process Model Readiness Score represents compliance by design. Typically, a PRM consists of a collection of practices that define good, common activities. These practices are required to be implemented in a process model. To put it differently, a PRM defines the required practices; a process model defines the implemented practices. A process in ISO/IEC 15504 is used for grouping practices of the same activity in the same way as a process area in CMMI does. The implementation of these practices in a process model represents process model readiness for an assessment following the particular PRM.

    This work defines two means to represent this measurement. The c-score is the ratio of the implemented practices of a process model to the required practices of a process area of a PRM. It represents the degree of achievement of process capability. The m-score is the ratio of the implemented practices of a process model to the required practices of a whole PRM. It represents the degree of achievement of process maturity. A process area a has practices in a PRM, or a ⊆ PRM, where PRM represents the set of practices in a PRM. We write cscore_a(p) for the c-score of a process model p for the process area a in the PRM. A pair of vertical bars around a set name denotes the number of elements of that set. We define Implemented_a(p) for the set of practices in a that are implemented in p and Required(a) for the set of practices in a. The cscore_a(p) can be calculated by using the following equation:







             cscore_a(p) = |Implemented_a(p)| / |Required(a)|

    Since a process area is a focused group of practices in a PRM for a certain activity, the c-score is also the measurement for that particular process, such as requirements, planning, or configuration management. It is useful for practitioners to use this score as an improvement indicator for the software development activity concerned.

    We define the m-score for process model readiness for a whole PRM. It is the ratio of the implemented practices of a process model to the required practices of a PRM. We write mscore_p for the m-score of a process model p for a PRM. It can be calculated by using the following equation:

             mscore_p = |Implemented_PRM(p)| / |Required(PRM)|

    This measurement represents the degree of the required practices implemented in a process model. However, the group of all practices in a PRM represents the highest maturity level of that PRM. Maturity level 5 is the highest maturity level of CMMI and ISO/IEC 15504. It means that the m-score calculated against CMMI(5) equals mscore_p for CMMI, and the m-score calculated against ISO/IEC 15504(5) equals mscore_p for ISO/IEC 15504, where CMMI(5) and ISO/IEC 15504(5) represent maturity level 5 of CMMI and of ISO/IEC 15504, respectively.

    In case an organization needs to measure process model readiness against a lesser maturity level of a PRM, the calculation of the m-score must be applied to the smaller set of practices. We define Required(PRM(l)) for the set of practices for maturity level l of a PRM. The m-score of the process model p for maturity level l of a PRM can be calculated by using the following equation:

             mscore_p(l) = |Implemented_PRM(l)(p)| / |Required(PRM(l))|

    A process designer can use this score as an improvement indicator for the maturity level of the selected PRM. For example, an organization must implement every practice in seven process areas and ten generic practices to achieve CMMI maturity level 2. If it aims to achieve CMMI maturity level 3, it must implement the additional practices in eleven more process areas and two additional generic practices.

    Process capability focuses on predictable results in particular process performance objectives [31]. Process maturity measures process controllability in an organization [32]. CMMI uses generic practices to represent the gap between process capability and process maturity. It shows that the implementation of every practice in every process area does not by itself represent the highest maturity level. A generic practice is applied to many process areas. For example, generic practice 2.2 Plan the Process is an activity that must be implemented in every process area, such as a plan for performing the requirements management process, a plan for the risk management process, and a plan for the process and product quality assurance process. These plans must be included in a project plan. Moreover, the project planning process itself must implement this generic practice as an activity to "plan the plan" [2].

    We use the difference between the degree of achievement of process capability and the degree of achievement of process maturity to represent the degree of generic practices implementation in a process model. We focus on the highest maturity level, thus we use mscore_p for the degree of achievement of process maturity. The degree of achievement of process capability is the average of the c-scores of all process areas. We use n to represent the number of process areas when calculating the average c-score. The calculation is shown in the following equation:

             avgc_p = ( Σ_{a ⊆ PRM} cscore_a(p) ) / n

    The c-score is calculated from specific practices, but the m-score includes both specific practices and generic practices. In the same process model, the c-score is always larger than or equal to the m-score. We define gp_p for the degree of generic practices implementation in a process model. This measurement can be calculated by using the following equation:

             gp_p = avgc_p − mscore_p

    A process designer can use this measurement to check the implementation of generic practices in a process model. For example, a process model that fully covers each process area has avgc_p equal to 1.00. If this process model does not implement generic practices at all, we assume that mscore_p equals 0.90. The implementation degree of generic practices of the PRM in the process model p, or gp_p, is 1.00 − 0.90 = 0.10.

    The ultimate goal of an SPI initiative is a matured process, that is, the implementation of the process model with the highest maturity level. Such a model has mscore_p equal to 1.00. In practice, when designing a process model, a process designer would implement more tasks to approach this goal. We write E(mscore_p) to represent this complement. It is the degree of effort required to reach the mature process model. The degree of effort required for maturity level l of the PRM for the process model p can be calculated by using the following equation:

             E(mscore_p(l)) = 1 − mscore_p(l)

    This measurement can be applied to a particular process area to represent the degree of effort required to reach full capability of that process area. We write E(cscore_a(p)) for this effort, and it can be calculated by using the following equation:

             E(cscore_a(p)) = 1 − cscore_a(p)
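    The Process Model Readiness measurements above can be sketched in a few lines of code. The following is a minimal illustration only; the PRM, the two process areas, and all practice identifiers are hypothetical and do not come from this paper:

```python
# Minimal sketch of the c-score, m-score, and gp measurements.
# The PRM, process areas, and practice identifiers below are hypothetical.

def c_score(model: set, area: set) -> float:
    """c-score: implemented practices of a process area / required practices."""
    return len(model & area) / len(area)

def m_score(model: set, prm_practices: set) -> float:
    """m-score: implemented practices / required practices of the whole PRM
    (generic practices included)."""
    return len(model & prm_practices) / len(prm_practices)

# Toy PRM: two process areas of specific practices plus two generic practices.
areas = {
    "REQM": {"REQM SP 1.1", "REQM SP 1.2"},
    "PP":   {"PP SP 1.1", "PP SP 1.2", "PP SP 1.3"},
}
generic = {"GP 2.1", "GP 2.2"}
prm = set().union(*areas.values()) | generic

# A process model implementing every specific practice but no generic practice.
model = set().union(*areas.values())

avgc = sum(c_score(model, a) for a in areas.values()) / len(areas)  # 1.00
m = m_score(model, prm)                                             # 5/7
gp = avgc - m       # degree-of-generic-practice gap, as in gp_p above
effort = 1 - m      # E(mscore): effort remaining toward the mature model
```

    As in the worked example above, a model that is perfect per process area but ignores generic practices shows a non-zero gp, pointing the process designer at the generic practices rather than the specific ones.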







    The ATMs and appraisal participants can benefit from the Process Model Readiness Score. The ATMs can determine the process maturity of a process to be assessed during the readiness review, after they obtain initial objective evidence. They can use this score to support an on-site visiting plan to collect more evidence. They can use the degree of effort required for maturity level l, or E(mscore_p(l)), to represent an implementation gap. This gap can be used to support suggestions about the unimplemented activities. These suggestions represent process improvement opportunities from which the appraisal participants will benefit in the SPI initiative.

    Appraisal participants, process designers in particular, can use this score to evaluate a process model before the deployment process. If they add some tasks that do not change this score, these tasks may focus on a different level of detail or scope. For example, a task to elicit system-level requirements and a task to elicit use-case-level requirements are both implemented to understand requirements. Both tasks are needed, although they do not increase this score.

B. Process Enactment Score
    The Process Enactment Score represents compliance by the enactment of a process model. It measures the completeness of process model implementation. The concept of process enactment concentrates on the enacted tasks and their output work products (WPs). A fully enacted process model means each task is performed and every output WP is created. A deployed task is a task that software development team members performed in their work. The set of deployed tasks is a subset of or equal to the set of tasks in a process model. A created output WP is a WP that exists in a project repository. The set of created output WPs is a subset of or equal to the set of output WPs in a process model. The number of created output WPs of a process model p, or |CreatedOutputWP(p)|, and the number of output WPs of a process model p, or |OutputWP(p)|, are counted on a per-task basis. We write escore_p for the enactment score of process model p. This measurement can be calculated by using the following equation:

        escore_p = (|DeployedTask(p)| + |CreatedOutputWP(p)|) / (|Task(p)| + |OutputWP(p)|)

    This score equals 1.00 for a fully enacted process model. In a defined process, a project manager can use this score to monitor and control how a software development team follows a deployed process model. He or she can use this score to manage the team's commitment to process model compliance. In a managed process, a process model may not be complete or well-defined, or the team may not follow the process model. A project manager can use this score to support how he or she manages the process.

    The ATMs also benefit from the Process Enactment Score. They can use this score as a stopping criterion for the objective evidence collection iteration. If this score does not increase, the team may assume that the process was enacted to that degree. We write E(escore_p) for the effort to achieve the fully enacted process. This effort can be calculated by using the following equation:

        E(escore_p) = 1 − escore_p

    This measurement represents a process enactment gap. This gap may persist even for a fully matured process, because SPI keeps evolving a software development process. Process designers and process implementers can monitor this gap to manage the continuously improving software process.

C. Process Implementation Readiness Score
    The Process Implementation Readiness Score represents compliance by the implementation of a process model against a PRM. It is an overview of the degree of compliance by design and compliance by enactment. The full enactment of a mature process model has the full Process Implementation Readiness Score, i.e., this score equals 1.00. This score is semantically equivalent to the highest maturity level in CMMI and ISO/IEC 15504.

    The score calculation has two parts, the Process Model Readiness Score and the Process Enactment Score. We define pirs_p for the Process Implementation Readiness Score of process model p against a PRM. This measurement can be calculated by using the following equation:

        pirs_p = mscore_p × escore_p

    This measurement can be applied to a particular maturity level or process area. The equations for the Process Implementation Readiness Score for maturity level l and for process area a can be written as follows:

        pirs_p(l) = mscore_p(l) × escore_p ,    pirs_a(p) = cscore_a(p) × escore_p

    The Process Implementation Readiness Score shows the overall compliance degree by combining the Process Model Readiness Score and the Process Enactment Score. This measurement emphasizes the proposed concept of compliance measurement at the process model level in addition to compliance measurement at the process implementation level. On one hand, this work uses the Process Model Readiness Score and the Process Enactment Score to create the Process Implementation Readiness Score. On the other hand, this work digests an appraisal result into a process model level and a process model implementation level.

    These measurements are useful to appraisal participants in appraisal preparation. They can use these measurements to sketch a summary of the SPI effort already spent and still needed to achieve their SPI goal. The separation of concerns between compliance by design and compliance by enactment helps a process designer and a project manager distinguish the improvement effort for a process model from that for the process model implementation. Therefore, the proposed compliance measurements in the process design and process enactment aspects would help to detect SPI problems and speed up the improvement cycle at both design time and enactment time.
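    The enactment and implementation readiness measurements combine as sketched below. This is an illustration only; all task and work-product counts, and the m-score value, are invented for the example and are not taken from this paper:

```python
# Minimal sketch of the Process Enactment Score and the
# Process Implementation Readiness Score. All counts are invented.

def e_score(deployed_tasks: int, created_wps: int,
            total_tasks: int, total_wps: int) -> float:
    """escore = (|DeployedTask| + |CreatedOutputWP|) / (|Task| + |OutputWP|)."""
    return (deployed_tasks + created_wps) / (total_tasks + total_wps)

def pirs(m: float, e: float) -> float:
    """Process Implementation Readiness Score: compliance by design (m-score)
    combined with compliance by enactment (escore)."""
    return m * e

# A model with 6 tasks and 14 output WPs; the team performed 4 tasks
# and created 10 WPs in the project repository.
e = e_score(4, 10, 6, 14)   # (4 + 10) / (6 + 14) = 0.70
gap = 1 - e                 # E(escore): effort toward full enactment
score = pirs(0.90, e)       # with a Process Model Readiness (m-score) of 0.90
```

    Separating the two factors mirrors the separation of concerns described above: a low score traceable to e points the project manager at enactment, while a low score traceable to m points the process designer at the model.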




TABLE I. Practices Implementation of Software Development Tasks and Output Work Products

| Activity | Task Number | Task | Vision | Glossary | System-Wide Requirements | Use Case Model | Work Items List | Iteration Plan | Test Case |
| Initiate Project | 1* | Develop Technical Vision | SP 1.1, SP 1.2 | SP 1.1, SP 1.2 | | | | | |
| Plan and Manage Iteration | 2 | Plan Iteration | | | | | SP 1.2 + | SP 1.2 + | |
| Identify and Refine Requirements | 3 | Identify and Outline Requirements | | SP 1.1, SP 1.2 + | SP 1.1, SP 1.2 + | SP 1.1, SP 1.2 + | SP 1.1, SP 1.2 + | SP 1.1, SP 1.2 + | |
| | 4 | Detail Use-Case Scenarios | | SP 1.1 + | | SP 1.1 + | SP 1.1 + | | |
| | 5 | Detail System-Wide Requirements | | SP 1.1 + | SP 1.1 + | | | | |
| | 6 | Create Test Cases | | | | | | | + |

'+' indicates output work product existence in a project repository.
'*' indicates an unimplemented task in process model enactment process.
                                                                                                                                                                                                                                                                                                                                                             5th International Workshop on Quantitative Approaches to Software Quality (QuASoQ 2017)
            5th International Workshop on Quantitative Approaches to Software Quality (QuASoQ 2017)
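The practice-to-task mapping of Table I can be checked mechanically. The following is a minimal Python sketch (the dictionary layout and variable names are our own illustration, not part of the paper's tooling) that derives which SG 1 specific practices the example process model satisfies by design:

```python
# Table I encoded as: task number -> (task name,
#   REQM SG 1 practices the task indicates,
#   number of output work products).
TASKS = {
    1: ("Develop Technical Vision", {"SP 1.1", "SP 1.2"}, 2),
    2: ("Plan Iteration", {"SP 1.2"}, 2),
    3: ("Identify and Outline Requirements", {"SP 1.1", "SP 1.2"}, 5),
    4: ("Detail Use-Case Scenarios", {"SP 1.1"}, 3),
    5: ("Detail System-Wide Requirements", {"SP 1.1"}, 2),
    6: ("Create Test Cases", set(), 1),
}
SG1_PRACTICES = {"SP 1.1", "SP 1.2", "SP 1.3", "SP 1.4", "SP 1.5"}

# A practice is satisfied by design when at least one task in the
# process model indicates it.
satisfied = set().union(*(practices for _, practices, _ in TASKS.values()))
c_score = len(satisfied & SG1_PRACTICES) / len(SG1_PRACTICES)  # 2/5 = 0.40
unsatisfied = SG1_PRACTICES - satisfied  # SP 1.3, SP 1.4, SP 1.5
```

Only SP 1.1 and SP 1.2 are satisfied, which is the basis of the c-score calculation in Section IV-A and of the gap analysis that identifies SP 1.3 to SP 1.5 as missing.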



  IV.   APPLICATIONS OF THE PROPOSED MEASUREMENTS

    The proposed measurements involve three processes in the SPI cycle: measure, analyze, and change [32]. Process design results in changes to a process model. After the updated process model is implemented, a project manager and a process designer can use the Process Enactment Score to monitor and analyze process model implementation and to update the implemented process model in the next SPI cycle.

    We use the OpenUp process model from the Eclipse Process Framework (EPF) [33] and CMMI [2] to demonstrate how the proposed measurements are used to improve PRM compliance. The OpenUp process model from EPF is a public process model resource that provides the software development task descriptions, purposes, and output work products used in this work. This demonstration concentrates on the Inception phase and the first specific goal (SG 1 Manage Requirements) of the Requirements Management (REQM) process area of CMMI.

    Table I shows some model elements in the OpenUp process model and the related specific practices in CMMI. The table covers the requirements-related activities in the Inception phase of the OpenUp process model: Initiate Project, Plan and Manage Iteration, Identify and Refine Requirements, and Agree on Technical Approach. An activity is broken down into tasks; a task is the process element to which we can assign a unit of work to be performed by roles [34]. The table has six tasks. The first is the Develop Technical Vision task in the Initiate Project activity, which has two output work products, Vision and Glossary. The second to the sixth tasks also have output work products. An implemented task creates its output work products in a project repository. We use an asterisk to indicate an unimplemented task, which does not create its output work products, and a plus symbol to indicate output work product existence in a project repository. These output work products are used to support practice implementation.

    This work uses the task description and purpose to determine whether a task indicates practice implementation. The process of identifying the relationship between software development tasks in a process model and practices in a PRM is described in [30]. Table I shows this relationship by placing practice numbers in the table, relating practice, task, and output work product. For example, the Develop Technical Vision task indicates SP 1.1 and SP 1.2; an ATM will use Vision and Glossary to support the implementation of these two practices.

    Table I also shows example practices in CMMI. SG 1 of the REQM process area of CMMI has five specific practices (SP). The SP 1.1 Understand Requirements practice is satisfied by an implementation of any of the Develop Technical Vision, Identify and Outline Requirements, Detail Use-Case Scenarios, or Detail System-Wide Requirements tasks (tasks 1, 3, 4, and 5). An ATM will find Vision, Glossary, System-Wide Requirements, Use Case, Use-Case Model, and Work Items List to support SP 1.1 practice implementation in this example process model.

    The SP 1.2 Obtain Commitment to Requirements practice is satisfied by an implementation of any of the Develop Technical Vision, Plan Iteration, or Identify and Outline Requirements tasks (tasks 1, 2, and 3). An ATM will find Vision, Glossary, System-Wide Requirements, Use Case, Use-Case Model, Work Items List, and Iteration Plan to support this practice implementation. The SP 1.3 Manage Requirements Changes, SP 1.4 Maintain Bidirectional Traceability of Requirements, and SP 1.5 Ensure Alignment Between Project Work and Requirements practices are not satisfied; thus, they do not appear in the table.

    The following subsections show the calculation of the Process Model Readiness Score, Process Enactment Score, and Process Implementation Readiness Score. The end of this section shows how to use these measurements for insight analysis.

A. Process Model Readiness Score

    The Process Model Readiness Score of the example OpenUp process model against CMMI requires the complete set of software development tasks and CMMI practices to calculate, which requires more space than is available here. Instead, we show the c-score for the REQM process area, which presents the same calculation in a smaller scope. Table I shows two satisfied specific practices, SP 1.1 and SP 1.2, out of the five practices SP 1.1 to SP 1.5 in this process area. Therefore, the process model capability score of the example OpenUp process model for the Requirements Management (REQM) process area of CMMI is calculated as follows:

        c-score(REQM) = 2 / 5 = 0.40

    The gap to reach full capability of this process area, 1 − c-score(REQM), is 1 − 0.40 = 0.60.

B. Process Enactment Score

    There are six tasks in Table I, so the number of tasks in this example is 6. We assume that task 1 is not implemented, which is indicated by the asterisk after its task number in Table I. This makes the number of deployed tasks equal to 6 − 1 = 5.

    Tasks 1 to 6 have 2, 2, 5, 3, 2, and 1 output work products, respectively, so the number of output work products in the example process model is 2 + 2 + 5 + 3 + 2 + 1 = 15. Because task 1 is not implemented, its two output work products are not included in the created output work products.
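The counting in Subsection B and the combination in Subsection C can be sketched in a few lines of Python (an illustrative sketch; the variable names are our assumptions, since the paper does not fix a notation for these quantities):

```python
# Output work products per task, from Table I; task 1 is the only
# unimplemented task (marked '*').
work_products = {1: 2, 2: 2, 3: 5, 4: 3, 5: 2, 6: 1}
unimplemented = {1}

deployed = len(work_products) - len(unimplemented)   # 6 - 1 = 5
total = sum(work_products.values())                  # 15
created = sum(n for task, n in work_products.items()
              if task not in unimplemented)          # 13

e_score = created / total      # 13/15, reported as 0.86 in the paper
c_score = 2 / 5                # Process Model Readiness Score, Subsection A
i_score = c_score * e_score    # the paper computes (0.40)(0.86) = 0.34
gap = 1 - c_score              # 0.60, the gap to full capability
```

Note that 13/15 ≈ 0.867; the paper truncates it to 0.86 before multiplying, which yields the reported value of 0.34 for the Process Implementation Readiness Score.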







    The number of the created output work products is 2 + 5 + 3 + 2 + 1 = 13. The Process Enactment Score for the REQM process area is therefore:

        Process Enactment Score(REQM) = 13 / 15 = 0.86

C. Process Implementation Readiness Score

    For the same reason as in Subsection A, this work shows the calculation of the Process Implementation Readiness Score for the REQM process area only. This measurement multiplies the Process Model Readiness Score from Subsection A by the Process Enactment Score from Subsection B:

        Process Implementation Readiness Score(REQM) = (0.40)(0.86) = 0.34

    The Process Model Readiness Score for the REQM process area shows that the example OpenUp process model does not achieve the capability of the CMMI REQM process area. A process designer can drill down into the relationship between tasks and practices in Table I to identify that SP 1.3, SP 1.4, and SP 1.5 are not satisfied by the tasks in this example OpenUp process model.

    Due to the CMMI maturity level determination rule, every practice in a goal must be satisfied. In this example, SP 1.3 to SP 1.5 are not satisfied; thus the first specific goal of the REQM process area is not reached, and this example OpenUp process model does not achieve CMMI maturity level 2.

    The Process Model Readiness Score shows, at the very beginning of process design, that the c-score of this process model for the REQM process area is less than 1.00. A process designer can use this score to identify that the SPI problem lies in the process model. To achieve full capability of this process area, he or she must add activities that cover all required activity concepts:

•   SP 1.3 requires an activity to manage requirements changes.

•   SP 1.4 requires an activity to trace requirements.

•   SP 1.5 requires an activity to trace requirements and to validate requirements.

    This example also shows that the Process Enactment Score does not reach 1.00, meaning that the process model is not fully enacted: the SPI problem lies in the enactment process. This problem usually occurs in an organization moving at a fast pace toward a high maturity level, where a process designer may deploy a process model that exceeds the capacity for change of the software development team. This measurement would help to adjust the SPI speed, or it would measure the capacity for change of a software development team. For example, when a software development team cannot perform every task and cannot create every output work product in a process model, the Process Enactment Score will fall below 1.00. A process designer and a project manager should then work together to strengthen the commitment to follow the process model.

    The Process Implementation Readiness Score for the REQM process area shows the large gap to achieving the full capability of the REQM process area. A process designer and a project manager must fill this gap by adding activities and by monitoring the process model enactment, respectively.

      V.    CONCLUSION, DISCUSSIONS, AND FUTURE WORKS

    This paper presents multi-level compliance measurements for software process appraisal. These measurements do not replace the measurements of existing appraisal models such as CMMI and ISO/IEC 15504; rather, they can complement the currently available compliance measurements with insight analysis for better appraisal preparation for both ATMs and appraisal participants.

    The measurements consist of the Process Model Readiness Score, the Process Enactment Score, and the Process Implementation Readiness Score, representing compliance by design, compliance by enactment, and compliance by implementation, respectively. They can detect problems in the SPI cycle: process changes at process design time and process measurement at process enactment time. A process designer and a project manager can use the measurements to speed up the SPI cycle toward a matured process, because the measurements quantify the degree of compliance with respect to the maturity and capability of the selected PRM. Detecting SPI problems earlier, before the actual appraisal, could help an organization prepare for the appraisal.

    However, measurements based on a process model cannot detect an alternative practice that is not defined in the process model. This problem can occur in an organization at process maturity level 2, because a process model may not exist, may not be well prepared, or may not be followed by the software development team. Output work products that do not follow a process model seem to be the evidence for such alternative practices; an ATM must affirm whether these practices are managed or not in order to determine practice satisfaction for the practices in maturity level 2. This problem will not occur in an organization at process maturity level 3 or higher, because such organizations have a defined process, and alternative practices have to be defined in the process model.

    As a proof of concept, we are developing an appraisal assistant tool that implements the concept in [30]; it will also implement the measurements proposed in this work. The planned evaluation is to compare the benefits of using and not using this tool in terms of the measurements for insight analysis to support an SPI initiative.

                  ACKNOWLEDGMENT

    We would like to thank Dr. Chayakorn Piyabunditkul, Software Engineering Specialist of the National Science and Technology Development Agency of Thailand (NSTDA)




for kindly giving suggestions. This research project was partially supported by the Faculty of Information and Communication Technology, Mahidol University.

                    REFERENCES

1. ISO, "ISO/IEC/IEEE 24765 Systems and software engineering - Vocabulary," Geneva, CH, Dec. 2010.
2. SEI, "CMMI for Development, Version 1.3 (CMMI-DEV, V1.3): Improving processes for developing better products and services," Nov. 2010.
3. ISO, "ISO/IEC 12207 Systems and software engineering - Software life cycle processes," 2008.
4. ISO, "ISO/IEC 29110 Systems and software engineering - Lifecycle profiles for Very Small Entities (VSEs)," 2016.
5. SCAMPI Upgrade Team, "Standard CMMI Appraisal Method for Process Improvement (SCAMPI) A, Version 1.3: Method Definition Document," Mar. 2011.
6. A. A. Khan, J. Keung, M. Niazi, S. Hussain, and A. Ahmad, "Systematic Literature Review and Empirical Investigation of Barriers to Process Improvement in Global Software Development: Client-Vendor Perspective," Inf. Softw. Technol., 2017.
7. Inform-IT, Foundations of IT Service Management Based on ITIL V3, 1st ed. Van Haren Publishing, 2007.
8. C. Portela and A. Vasconcelos, "Spider-PE: A Set of Support Tools to Software Process Enactment," in ICSEA 2014, Ninth Int. Conf. Softw. Eng. Adv., 2014, pp. 539-544.
9. I. Garcia, C. Pacheco, and J. Calvo-Manzano, "Using a web-based tool to define and implement software process improvement initiatives in a small industrial setting," IET Softw., vol. 4, no. 4, p. 237, 2010.
10. A. I. F. Ferreira et al., "Taba Workstation: Supporting Software Process Improvement Initiatives Based on Software Standards and Maturity Models," in Software Process Improvement, 13th Eur. Conf. EuroSPI 2006, Joensuu, Finland, Oct. 11-13, 2006, Proc., 2006, pp. 207-218.
11. L. P. Mezzomo, S. Ronaldo, A. Marcos, and L. De Vasconcelos, "A Set of Support Tools to Software Process Appraisal and Improvement in Adherence to CMMI-DEV," in ICSEA 2016, Elev. Int. Conf. Softw. Eng. Adv., 2016, pp. 263-271.
12. S. Thompson, T. Torabi, and P. Joshi, "A Framework to Detect Deviations During Process Enactment," in 6th IEEE/ACIS International Conference on Computer and Information Science (ICIS 2007), 2007, pp. 1066-1073.
13. M. Smatti, M. Oussalah, and M. A. Nacer, "A review of detecting and correcting deviations on software processes," in 2015 10th International Joint Conference on Software Technologies (ICSOFT), 2015, vol. 1, pp. 1-11.
14. M. A. A. da Silva, R. Bendraou, J. Robin, and X. Blanc, "Flexible Deviation Handling during Software Process Enactment," in 2011 IEEE 15th International Enterprise Distributed Object Computing Conference Workshops, 2011, pp. 34-41.
15. X. He, J. Guo, Y. Wang, and Y. Guo, "An Automatic Compliance Checking Approach for Software Processes," in 2009 16th Asia-Pacific Software Engineering Conference, 2009, pp. 467-474.
16. M. Huo, H. Zhang, and R. Jeffery, "An Exploratory Study of Process Enactment As Input to Software Process Improvement," in Proceedings of the 2006 International Workshop on Software Quality, 2006, pp. 39-44.
17. M. Huo, H. Zhang, and R. Jeffery, "A Systematic Approach to Process Enactment Analysis as Input to Software Process Improvement or Tailoring," in 2006 13th Asia Pacific Software Engineering Conference (APSEC'06), 2006, pp. 401-410.
18. C. Hug, R. Deneckère, and C. Salinesi, "Map-TBS: Map process enactment traces and analysis," in 2012 Sixth International Conference on Research Challenges in Information Science (RCIS), 2012, pp. 1-6.
19. ISO, "ISO/IEC 15504 Information technology - Process assessment," 2004.
20. R. H. Lok and A. J. Walker, "Automated tool support for an emerging international software process assessment standard," in Proceedings of IEEE International Symposium on Software Engineering Standards, 1997, pp. 25-35.
21. O. R. Yürüm, Ö. Ö. Top, and O. Demirörs, "Assessing Software Processes over a New Generic Software Process Assessment Tool," Coll. Econ. Anal. Ann., 2016.
22. F. Liang, T. Rout, and A. Tuffley, "Appraisal Assistant Beta." [Online]. Available: https://www.sqi.griffith.edu.au/AppraisalAssistant/about.html.
23. Integrated System Diagnostics Incorporated, "Appraisal Wizard." [Online]. Available: http://isd-inc.com/tools.appraisalWizard/.
24. HM&S IT-Consulting GmbH, "SPiCE 1-2-1." [Online]. Available: http://www2.hms.org/cms/en/default.html.
25. J. Moura, C. Xavier, A. Marcos, and L. De Vasconcelos, "ProEvaluator: Uma Ferramenta para Avaliação de Processos de Software" [ProEvaluator: A Tool for Software Process Assessment], Jun. 2008, pp. 201-214.
26. I. Garcia, C. Pacheco, and D. Cruz, "Adopting an RIA-Based Tool for Supporting Assessment, Implementation and Learning in Software Process Improvement under the NMX-I-059/02-NYCE-2005 Standard in Small Software Enterprises," in 2010 Eighth ACIS International Conference on Software Engineering Research, Management and Applications, 2010, pp. 29-35.
27. D. C. Silva, A. Raldi, T. Messias, A. M. Alves, and C. F. Salviano, "A Process Driven Software Platform to Full Support Process Assessment Method," in 2014 40th EUROMICRO Conference on Software Engineering and Advanced Applications, 2014, pp. 135-136.
28. R. Hunter, G. Robinson, and I. Woodman, "Tool support for software process assessment and improvement," Softw. Process Improv. Pract., vol. 3, pp. 213-223, 1997.
29. K. Gerke, J. Cardoso, and A. Claus, "Measuring the Compliance of Processes with Reference Models," in On the Move to Meaningful Internet Systems: OTM 2009, vol. 5870, R. Meersman, T. Dillon, and P. Herrero, Eds. Springer Berlin/Heidelberg, 2009, pp. 76-93.
30. S. Roongsangjan, T. Sunetnanta, and P. Mongkolwat, "Using FCA Implication to Determine the Compliance of Model Practice Implementation for Software Process," in Proceedings of the ICMSS 2017, 2017, pp. 64-70.
31. NIST/SEMATECH, "e-Handbook of Statistical Methods," 2012. [Online]. Available: http://www.itl.nist.gov/div898/handbook/.
32. I. Sommerville, Software Engineering, 9th ed. Harlow, England: Addison-Wesley, 2010.
33. "OpenUp Process in Eclipse Process Framework Project (EPF)." [Online]. Available: http://epf.eclipse.org/wikis/openup/.
34. OMG, "Software & Systems Process Engineering Meta-Model Specification V2.0," Apr. 2008.