<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta>
      <journal-title-group>
        <journal-title>32nd International Workshop on Software Measurement (IWSM) and 17th International Conference on Software Process and Product Measurement (MENSURA)</journal-title>
      </journal-title-group>
    </journal-meta>
    <article-meta>
      <title-group>
        <article-title>A Maturity Model Guidance Approach for Integration Testing of Avionics Software</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Gülsüm Güngör</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
          <xref ref-type="aff" rid="aff1">1</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Ayça Kolukısa Tarhan</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <aff id="aff0">
          <label>0</label>
          <institution>Hacettepe University</institution>
          ,
          <addr-line>Hacettepe Beytepe Kampüsü, Ankara, 06800</addr-line>
          ,
          <country country="TR">Turkey</country>
        </aff>
        <aff id="aff1">
          <label>1</label>
          <institution>Turkish Aerospace Industries (TAI)</institution>
          ,
          <addr-line>Fethiye Mahallesi Havacılık Bulvarı No:17, Ankara, 06980</addr-line>
          ,
          <country country="TR">Turkey</country>
        </aff>
      </contrib-group>
      <pub-date>
        <year>2023</year>
      </pub-date>
      <abstract>
        <p>Safety-critical software failures lead to serious consequences such as loss of life or damage to the environment; therefore, safety-critical software verification requires special attention. Avionics system software is one type of safety-critical software. “DO-178C: Software Considerations in Airborne Systems and Equipment Certification”, which defines processes for airborne systems software development and verification, was released in 2011 by RTCA, Inc. On the other hand, there are well-defined guidelines to improve the verification and validation processes of software system development, specifically for software testing. The TMMI (Test Maturity Model Integration) model was produced by the TMMI Foundation as guidance for organizations to improve their test processes and product quality. Avionics system software has its own safety-related characteristics, and TMMI does not specifically address software testing practices for these characteristics. To fill this gap, we first identify avionics software characteristics from the DO-178C handbook as the base for software testing, and then propose a domain-specific guidance document that employs TMMI practices as complementary to DO-178C activities. The document aims to help test organizations improve their test processes by focusing on airborne software characteristics. The proposed approach targets integration testing of avionics software, since this level of testing is very critical for defect prevention in the safety-critical domain.</p>
      </abstract>
      <kwd-group>
        <kwd>safety-critical</kwd>
        <kwd>avionics software</kwd>
        <kwd>integration testing</kwd>
        <kwd>DO-178C</kwd>
        <kwd>TMMI</kwd>
        <kwd>maturity model</kwd>
        <kwd>test maturity</kwd>
      </kwd-group>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>1. Introduction</title>
      <p>
        Safety-critical system failures lead to serious results such as loss of lives or damage to the
environment. The software used in avionics systems is classified as safety-critical software in
which emerging errors can cause serious consequences. Verification of avionics software is
crucial to prevent these undesired results. The first guide to standardize avionics software
development was published in 1981 with the name “DO-178: Software Considerations in
Airborne Systems and Equipment Certification” [
        <xref ref-type="bibr" rid="ref1">1</xref>
        ]. In 2011, DO-178C version [
        <xref ref-type="bibr" rid="ref2">2</xref>
        ] was released,
which addresses software verification processes with different levels of testing (i.e.,
requirement-based low-level testing, integration testing, and hardware-software integration testing).
      </p>
      <p>
        DO-178C defines integration testing as the testing level that ensures software components interact correctly
and satisfy the software requirements and software architecture [
        <xref ref-type="bibr" rid="ref2">2</xref>
        ]. Defects that can be detected
only at the level of integration testing are critical to avoid serious consequences in avionics
software. Since the DO-178C document heavily focuses on requirement-based integration testing,
complying with DO-178C alone is not sufficient to evaluate and improve integration testing processes.
      </p>
      <p>32nd International Workshop on Software Measurement (IWSM) and the 17th International Conference on Software Process and Product Measurement (MENSURA), September 14–15, 2023, Rome, Italy. Contact: gulsumgng@gmail.com (G. Güngör); atarhan@cs.hacettepe.edu.tr (A. K. Tarhan).</p>
      <p>
        For the purpose of improving testing processes and software quality, various models have
been developed. Test maturity models such as Test Improvement Model (TIM) [
        <xref ref-type="bibr" rid="ref3">3</xref>
        ], Test Process
Improvement Model (TPI) [
        <xref ref-type="bibr" rid="ref4">4</xref>
        ], Test Maturity Model Integration (TMMI) [
        <xref ref-type="bibr" rid="ref5">5</xref>
        ], Unit Test Maturity
Model [
        <xref ref-type="bibr" rid="ref6">6</xref>
        ] and PTMM [
        <xref ref-type="bibr" rid="ref7">7</xref>
        ] are among these models. The mentioned maturity models can be
classified in different groups according to their characteristics. The first group can be defined as
tester (or person) skills centered maturity models such as PTMM [
        <xref ref-type="bibr" rid="ref7">7</xref>
        ]. This type of model focuses
on tester skills to improve testing maturity. The second group includes maturity level-based
models, in which each level has its own goals to be achieved to reach a defined maturity level [
        <xref ref-type="bibr" rid="ref5">5</xref>
        ]. The
third group consists of testing-level-based models that focus specifically on one
testing level (such as unit testing) and offer activities for that level [
        <xref ref-type="bibr" rid="ref6">6</xref>
        ]. Another group
of test maturity models includes continuous models that define key performance areas to
determine maturity levels [
        <xref ref-type="bibr" rid="ref4">4</xref>
        ]. Also, there are some models applicable to automated testing
activities [
        <xref ref-type="bibr" rid="ref8">8</xref>
        ]. None of the maturity models mentioned above focuses on the integration testing level
or on the avionics (safety-critical) software testing domain. Similarly, the well-recognized software
testing standard ISO/IEC 29119 [
        <xref ref-type="bibr" rid="ref9">9</xref>
        ] does not focus on software testing maturity or on avionics
software testing in particular. TMMI does not focus on any particular domain, but it is a
well-designed generic model. Furthermore, the structure of the TMMI document is similar to that of the DO-178C
handbook. To fill this gap, this study aims to offer a guidance document for test
organizations who want to apply TMMI practices for their integration testing processes within
avionics software development.
      </p>
    </sec>
    <sec id="sec-2">
      <title>2. Method</title>
      <p>
        The DO-178C handbook defines software development life-cycle processes starting from
software planning [
        <xref ref-type="bibr" rid="ref2">2</xref>
        ]. In the study [
        <xref ref-type="bibr" rid="ref10">10</xref>
        ] entitled “Evaluation of accomplishment of DO-178C
objectives by CMMI-DEV 1.3”, the intersection of CMMI-DEV (Capability Maturity Model Integration
for Development) [
        <xref ref-type="bibr" rid="ref10">10</xref>
        ] practices and DO-178C activities is defined. Some of the CMMI-DEV
practices are matched with the DO-178C activities [
        <xref ref-type="bibr" rid="ref10">10</xref>
        ]. It is concluded that CMMI-DEV is not
sufficient to cover all the software development activities mentioned in DO-178C and most of the
DO-178C verification activities are out of CMMI-DEV’s scope [
        <xref ref-type="bibr" rid="ref10">10</xref>
        ]. On the other hand, verification
and testing activities are in the scope of TMMI since the testing terminology used in TMMI has
been derived from the ISTQB (International Software Testing Qualifications Board) Standard Glossary of
terms used in Software Testing [
        <xref ref-type="bibr" rid="ref5">5</xref>
        ]. However, TMMI is a generic testing maturity model and not
specific to avionics domain.
      </p>
      <p>
        In this study, efficient application of the TMMI model is aimed at improving the avionics software
verification process, specifically the integration testing process. The DO-178C handbook and the
TMMI model are analyzed to understand the necessity for a maturity model that is specific to
safety-critical software integration testing. The TMMI model, which can be used complementary
to CMMI [
        <xref ref-type="bibr" rid="ref5">5</xref>
        ], is found to be convenient to match its processes and practices with the verification
activities defined in the DO-178C handbook. Also, TMMI is one of the level-based models that is
applicable for all software testing levels, including integration testing, and it covers both manual
regression tests and automated tests [
        <xref ref-type="bibr" rid="ref5">5</xref>
        ].
      </p>
      <p>
        In this context, as the first step, processes in DO-178C are inspected and the activities
mentioned in DO-178C processes are mapped with the TMMI (Release 1.3) practices, in order to
understand the similarity between the two as specific to verification and software testing. After
this, DO-178C avionics software characteristics that are not specifically mentioned in the TMMI
model are identified. As the second step, considering these domain-based characteristics, a guidance
document for test organizations is proposed to effectively apply TMMI model and improve
integration test processes of avionics software. For example, TMMI Level-2 refers to specific goals
such as establishing a test policy (addressing the test goal definition practice) and a test strategy [
        <xref ref-type="bibr" rid="ref5">5</xref>
        ].
Furthermore, in this level, TMMI practices that aim to achieve these goals refer to determining
business objectives, business needs and project needs [
        <xref ref-type="bibr" rid="ref5">5</xref>
        ]. Based on the findings of the mapping
process, characteristics and needs of DO-178C software integration testing are defined within
this guidance document and can be utilized to determine business objectives, business type, or
project needs mentioned in TMMI practices.
      </p>
    </sec>
    <sec id="sec-3">
      <title>3. Comparison and Results</title>
      <sec id="sec-3-1">
        <title>3.1. Comparison between DO-178C and TMMI</title>
        <p>Software development life-cycle processes are covered in subtitles of the DO-178C handbook as
listed below.</p>
        <p>1. Software planning process,  
2. Software development process,  
3. Software verification process,  
4. Software configuration management process,  
5. Software quality assurance process,  
6. Certification liaison process.  </p>
        <p>
          The DO-178C handbook summarizes these software development life-cycle processes within
tables in Annex-A [
          <xref ref-type="bibr" rid="ref2">2</xref>
          ]. Each process involves objectives and related activities to reach the defined
objectives. That is, the tables in Annex-A map activities and objectives for each process [
          <xref ref-type="bibr" rid="ref2">2</xref>
          ].
        </p>
        <p>
          The TMMI model, on the other hand, offers process areas together with their goals and
practices to achieve these goals, for each TMMI level [
          <xref ref-type="bibr" rid="ref5">5</xref>
          ]. The TMMI model contains five maturity
levels, and each level has its own specific practices [
          <xref ref-type="bibr" rid="ref5">5</xref>
          ]. Besides, TMMI defines generic practices
that are common for all process areas [
          <xref ref-type="bibr" rid="ref5">5</xref>
          ].
        </p>
        <p>In the first step of this study, DO-178C process areas are analyzed and each activity mentioned
in the DO-178C sections is compared with the TMMI (Release 1.3) practices, in order to understand the
relation between DO-178C verification activities and TMMI test process maturity practices. Each
DO-178C activity is compared with the TMMI’s specific practices at all maturity levels and with
the generic practices. Figure-1 shows the mapping schema of each DO-178C activities to TMMI
practices. The analyzed DO-178C activities are grouped as “Covered”, “Partially Covered” and
“Not Covered” according to the comparison results with the TMMI practices. DO-178 activity that
is common for at least one TMMI practice is classified as “Covered”. DO-178C activity having
scope that is partially matched with any TMMI practice is classified as “Partially Covered”. If there
is no relevant TMMI practice for the analyzed DO-178C activity, that activity is classified as “Not
Covered”.</p>
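        <p>The grouping rule above can be sketched as a small labeling routine. This is an illustrative sketch only: the topic sets and the keyword-overlap heuristic are hypothetical stand-ins, since the actual DO-178C-to-TMMI comparison in this study was performed by expert review rather than by code.</p>

```python
# Illustrative sketch of the three-way coverage labeling described above. The
# topic sets and the overlap heuristic are hypothetical; the real comparison
# was an expert judgment over DO-178C activities and TMMI practices.

def coverage_label(activity_topics, practice_topic_sets):
    """Label a DO-178C activity against the scopes of a list of TMMI practices."""
    if any(activity_topics.issubset(p) for p in practice_topic_sets):
        return "Covered"            # fully within at least one practice's scope
    if any(activity_topics.intersection(p) for p in practice_topic_sets):
        return "Partially Covered"  # overlaps some practice only in part
    return "Not Covered"            # no relevant TMMI practice found

# Hypothetical topic sets for demonstration only.
tmmi_practices = [
    {"test plan", "test estimate", "schedule"},  # e.g. a Test Planning practice
    {"test policy", "test goals"},
]
print(coverage_label({"test plan", "schedule"}, tmmi_practices))     # Covered
print(coverage_label({"test plan", "stakeholder"}, tmmi_practices))  # Partially Covered
print(coverage_label({"certification liaison"}, tmmi_practices))     # Not Covered
```

        <p>Applying such a rule per DO-178C process yields the per-group counts discussed in the rest of this section.</p>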
        <p>DO-178C activities are analyzed in order, starting from the first process defined in
DO-178C Section-4: Software Planning Process. Table 1 shows a snapshot from the comparison
between the activities of software planning process of DO-178C and the TMMI process area
practices.</p>
        <p>
          Since TMMI focuses on testing and test planning process areas, these are related only to the test
planning activities within the Software Planning Process of DO-178C. Accordingly, the DO-178C
software planning activities are classified as “Covered”, “Partially Covered” and “Not Covered” as
shown in Table 1. The DO-178C software planning activities, which are in the scope of “Test
Planning” process area of TMMI, are classified as “Covered”, and it has been observed that the
number of activities in “Covered” and “Partially Covered” groups corresponds to only half of the
practices in this process area. The complete mapping between the TMMI process area practices
and the DO-178C process activities is available in [
          <xref ref-type="bibr" rid="ref11">11</xref>
          ].
        </p>
        <p>
          For each software development life-cycle process defined in DO-178C sections, a new table
(similar to the one in Table 1) is created per section (or subsection) considering the structure of
Annex-A. It should be recalled that the tables in Annex-A summarize the activities [
          <xref ref-type="bibr" rid="ref2">2</xref>
          ] of the
processes covered in DO-178C. The handbook’s process sections from Section-4 to
Section-9 include “Software Planning Process”, “Software Development Process”, “Software Verification
Process”, “Software Configuration Management Process”, “Software Quality Assurance Process”
and “Certification Liaison Process”, respectively [
          <xref ref-type="bibr" rid="ref2">2</xref>
          ].
        </p>
        <p>
          Section-5 in DO-178C defines the software development process, and activities of this process are
related to software requirements, software design, coding and integration processes but not
testing [
          <xref ref-type="bibr" rid="ref2">2</xref>
          ]. According to the results of the mapping given in [
          <xref ref-type="bibr" rid="ref11">11</xref>
          ], 5.71% of the activities are
considered as “Partially Covered”, whereas the rest of the software development process activities
are not covered by TMMI process area practices.
        </p>
        <p>
          Software verification process in Section-6 is addressed by the following five tables in DO-178C
Annex-A [
          <xref ref-type="bibr" rid="ref2">2</xref>
          ]: 1) Table A-3: Verification of Outputs of Software Requirements Process, 2) Table
A-4: Verification of Outputs of Software Design Process, 3) Table A-5: Verification of Outputs of
Software Coding and Integration Processes, 4) Table A-6: Testing of Outputs of Integration
Process, and 5) Table A-7: Verification of Verification Process Results. Each table corresponds to
sub-sections of Section-6 for software verification process. Some emerging needs and challenges
for the practices are observed during the comparison of this process, as summarized below:
• DO-178C refers to separate sub-sections for low-level and high-level requirements
analyses [
          <xref ref-type="bibr" rid="ref2">2</xref>
          ]; however, the TMMI model does not particularly mention testing practices
of low-level or high-level requirements since it can be applied to improve the practices at any
testing level. Accordingly, the activities in Section 6.3.1: Reviews and Analyses of High-Level
Requirements and 6.3.2: Reviews and Analyses of Low-Level Requirements do not match with
the TMMI process area practices specifically.
• DO-178C Section 6.4 defines software testing process, and different levels of testing
activities, including the ones for integration testing, are described in this process [
          <xref ref-type="bibr" rid="ref2">2</xref>
          ].
Therefore, the 65.21% coverage of the activities mentioned in this process by the TMMI practices is
identified as quite high.
• DO-178C Section 6.5 addresses software verification process traceability [
          <xref ref-type="bibr" rid="ref2">2</xref>
          ], and the
activities are considered “Covered” and “Partially Covered” by the TMMI practices for this
subsection, since DO-178C defines three bi-directional traceability analyses: between
software requirements and test cases, between test cases and test procedures, and between test procedures and test
results.
• DO-178C defines avionics-specific software development characteristics; the “parameter data item”,
a feature that enables changing the behavior of software without modifying its code, is one
of them, as mentioned in Section 6.6 [
          <xref ref-type="bibr" rid="ref2">2</xref>
          ]. Since TMMI is not a domain specific test maturity
model, it does not offer any verification or test maturity practice for parameter data items.
        </p>
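        <p>The three bi-directional traceability analyses listed above can be illustrated with a small gap checker; the identifiers and data layout below are hypothetical, intended only to show what each direction of the check verifies:</p>

```python
# Minimal sketch of a bi-directional traceability check, assuming each trace
# relation is stored as a set of (source_id, target_id) pairs. The identifiers
# and data layout are hypothetical illustrations, not part of DO-178C or TMMI.

def bidirectional_gaps(sources, targets, trace_links):
    """Return source items with no outgoing link and target items with no incoming link."""
    linked_sources = {s for s, _ in trace_links}
    linked_targets = {t for _, t in trace_links}
    return sorted(sources - linked_sources), sorted(targets - linked_targets)

requirements = {"HLR-1", "HLR-2", "HLR-3"}
test_cases   = {"TC-1", "TC-2", "TC-3"}
req_to_tc    = {("HLR-1", "TC-1"), ("HLR-2", "TC-2")}

orphan_reqs, orphan_tcs = bidirectional_gaps(requirements, test_cases, req_to_tc)
print(orphan_reqs)  # requirement(s) without any test case
print(orphan_tcs)   # test case(s) not traceable to a requirement
```

        <p>Running the same check over the test-case-to-procedure and procedure-to-result relations completes the three analyses addressed in DO-178C Section 6.5.</p>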
        <p>
          Section 7 in DO-178C defines software configuration management process [
          <xref ref-type="bibr" rid="ref2">2</xref>
          ] and TMMI
considers configuration management in its general practices by referring to CMMI’s configuration
management process [
          <xref ref-type="bibr" rid="ref5">5</xref>
          ]. The study [
          <xref ref-type="bibr" rid="ref10">10</xref>
          ] provides a coverage rate of 48% between CMMI-DEV and
DO-178C for configuration management activities. In the current study, activities of this process
are compared with TMMI practices, and activities labeled both “Covered” and “Not Covered” are
found. The software quality assurance process is defined in DO-178C Section 8 [
          <xref ref-type="bibr" rid="ref2">2</xref>
          ] and its activities
are also considered in the groups of “Covered” or “Not Covered”. Finally, DO-178C Section 9:
Certification Liaison Process [
          <xref ref-type="bibr" rid="ref2">2</xref>
          ] is out of the TMMI’s scope, and therefore, the activities belonging
to this process are “Not Covered” by any TMMI practice.
        </p>
        <p>
          As a result, it has been observed that the TMMI practices alone are not sufficient to accomplish
the DO-178C verification activities. DO-178C has avionics software development characteristics and
definitions that are not in the scope of TMMI. DO-178C defines avionics-software-specific items
and concepts such as verification of parameter data items, user-modifiable software, deactivated
code, multi-version dissimilar software verification, COTS software, option-selectable software,
and field-loadable software [
          <xref ref-type="bibr" rid="ref2">2</xref>
          ] that are not discussed in TMMI. Therefore, some of the process
activities in DO-178C (regarding verification and testing of avionics software) do not match with
the TMMI practices. For example, low-level requirements, high-level requirements, and their
relation to system requirements are mentioned in detail within DO-178C objectives, but TMMI
process area goals do not refer to this structure which is specific to avionics domain. In addition,
some of the change-related activities (software change, requirement change, new compiler usage,
different loader version, change of application or development environment, etc.) and the
re-execution needs of tests are defined in DO-178C within safety-critical aspects [
          <xref ref-type="bibr" rid="ref2">2</xref>
          ]. Even though
TMMI offers change-related practices [
          <xref ref-type="bibr" rid="ref5">5</xref>
          ], the scope of change should be revised by considering
DO-178C safety-critical avionics software development. Moreover, the DO-178C handbook
includes a subsection for “Robustness Test Cases” that shows software behavior in abnormal
conditions [
          <xref ref-type="bibr" rid="ref2">2</xref>
          ]. Robustness testing is critical to avoid undesired results, but it is not particularly discussed in TMMI
practices.
        </p>
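        <p>As an illustration of the robustness notion mentioned above (defined behavior under abnormal inputs and conditions), a robustness-style test might look like the following sketch; the unit under test and its saturation behavior are entirely hypothetical, not drawn from DO-178C:</p>

```python
# Hypothetical robustness-test sketch: the function and its fallback behavior
# are illustrative only. A robustness test case exercises abnormal inputs and
# checks that the software responds in a defined, safe way instead of failing.

def scale_sensor_reading(raw):
    """Convert a raw sensor value to degrees, saturating out-of-range inputs."""
    if raw != raw:            # NaN guard (NaN is never equal to itself)
        return 0.0            # defined fallback for an invalid reading
    return max(-90.0, min(90.0, raw * 0.1))

# Normal-range input behaves as specified.
assert scale_sensor_reading(450.0) == 45.0
# Abnormal inputs: out-of-range and invalid values must map to defined outputs.
assert scale_sensor_reading(10_000.0) == 90.0     # saturates instead of overflowing
assert scale_sensor_reading(float("nan")) == 0.0  # invalid input handled safely
print("robustness checks passed")
```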
        <p>
          The shortcomings mentioned above should not be considered weaknesses of the TMMI model,
since it is a generic maturity model that offers many practices to improve testing processes and
product quality. Rather, they indicate the need for a testing maturity model specific to the
avionics domain. Finally, on the opposite side of the mapping, some TMMI process areas such as
test training programs, incident management and advanced reviews [
          <xref ref-type="bibr" rid="ref5">5</xref>
          ] are not discussed in
DO-178C processes in detail. Integration testing is one of the critical test levels in the scope of
high-level requirements-based testing in DO-178C to avoid undesired results of safety-critical avionics
software. It is observed that TMMI process area practices can enrich the activities for the integration
testing level defined in DO-178C. Therefore, the results of the bi-directional comparison between
DO-178C activities and TMMI practices have shown that the mutual consideration of these two
resources for developing a domain-specific maturity model guidance approach for integration
testing of avionics software is promising.
        </p>
      </sec>
      <sec id="sec-3-2">
        <title>3.2. Guidance Document for TMMI Applications on Avionics Software Integration Testing</title>
        <p>In the previous section, the contents of the TMMI model and the DO-178C handbook are
compared to understand the requirements of avionics software testing maturity concept.
Avionics software characteristics need more specific testing practices to cover avionics
software item verification activities. As the next step, a guidance document that employs TMMI
practices as complementary to DO-178C activities is developed to effectively improve domain
specific software testing processes, more specifically integration testing activities, within
avionics software development.</p>
        <p>
          After determining domain specific characteristics from DO-178C handbook, each TMMI
practice is reviewed and relevant DO-178C handbook section references are provided for
practices so that they can be implemented considering domain-specific characteristics. A reference from a
TMMI practice to a DO-178C section is called a link. When applying a TMMI practice, links help
to describe and implement the given practice within safety-critical avionics software characteristics.
Table 2 shows a link example from guidance document for the TMMI practice called “Test Policy
and Strategy” [
          <xref ref-type="bibr" rid="ref2 ref5">2,5</xref>
          ].
        </p>
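        <p>The link concept can be pictured as a simple mapping from TMMI practices to DO-178C section numbers. The dictionary layout and helper below are illustrative assumptions; only the 6.1/6.4.3 pairing for the “Define test goals” practice follows an example discussed in this section:</p>

```python
# Sketch of the guidance document's "link" concept: each TMMI practice maps to
# the DO-178C sections that contextualize it for avionics software. The
# dictionary layout and helper are illustrative; only the "Define test goals"
# entry mirrors the example given in the text.

links = {
    "Define test goals":  ["6.1", "6.4.3"],
    "Define test policy": ["6.1"],  # hypothetical entry for illustration
}

def sections_for(practice):
    """Return the DO-178C sections linked to a TMMI practice (empty if none)."""
    return links.get(practice, [])

print(sections_for("Define test goals"))  # the sections a tester should consult
print(sections_for("Test Environment"))   # empty: no domain-specific link defined
```

        <p>A test organization applying a TMMI practice would consult the linked sections to interpret the practice in terms of airborne software characteristics.</p>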
        <p>Links contain DO-178C section references, and each referred section defines activities or
definitions related to airborne software.</p>
        <p>
          The TMMI document defines specific goals and practices starting from TMMI Level-2. At this level,
TMMI refers to the specific goal of “Establish a Test Policy” that contains specific practices called
“Define test goals” and “Define test policy” [
          <xref ref-type="bibr" rid="ref5">5</xref>
          ]. These practices refer to determining business
objectives, business needs and project needs [
          <xref ref-type="bibr" rid="ref5">5</xref>
          ]. Also, other TMMI Level-2 process areas such as
“Test Planning” and the processes at further maturity levels (from level-3 to level-5) describe
practices addressing test goals and strategy that are defined in level-2 practices previously [
          <xref ref-type="bibr" rid="ref5">5</xref>
          ].
The business objectives and needs, in other words test policy and strategy that are defined based
on domain specific characteristics, can help the implementation of the rest of the TMMI processes
considering domain specific characteristics. Therefore, a guidance document that offers links to
airborne specific items of DO-178C sections is proposed to employ TMMI practices for improving
integration testing of avionics software focusing on domain specific needs.
        </p>
        <p>
          The items mentioned in DO-178C Sections 6.1 and 6.4.3 are listed in the following [
          <xref ref-type="bibr" rid="ref2">2</xref>
          ]:
1. Software components must interact correctly with each other.
2. System requirements allocated to software have been developed into high-level
requirements that satisfy those system requirements.
3. Software must satisfy the high-level requirements and respond as expected.
4. Software is robust, responding correctly when abnormal inputs and conditions occur.
5. Software components together satisfy the software requirements and software
architecture.
        </p>
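        <p>Item 1 above (components interacting correctly) is the kind of goal that only integration-level tests can check. A minimal sketch, with entirely hypothetical components and a deliberately simple interface, could look like:</p>

```python
# Minimal integration-test sketch for item 1: two hypothetical components whose
# interaction (unit conversion across the interface) is only visible when they
# are tested together. Both components are invented for illustration.

class AltitudeSensor:
    def read_feet(self):
        return 30_000.0  # stub reading in feet

class AltitudeDisplay:
    def format_meters(self, meters):
        return f"{meters:.0f} m"

def integrate(sensor, display):
    # The glue code must convert feet to meters; omitting this conversion is a
    # defect that unit tests of either component alone would not reveal.
    return display.format_meters(sensor.read_feet() * 0.3048)

assert integrate(AltitudeSensor(), AltitudeDisplay()) == "9144 m"
print("integration check passed")
```

        <p>Unit tests of either component alone would pass regardless of the unit conversion; only the integrated check can expose such an interface defect, illustrating why integration testing is treated as a distinct level.</p>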
        <p>
          The items mentioned above are relevant to the TMMI Level-2 “Define test goals” practice; therefore,
a link is defined for this practice that refers to DO-178C Sections 6.1 and 6.4.3. The items can be used
to guide test organizations on the application of “Define test goals” practice of TMMI considering
them within the business needs and objectives [
          <xref ref-type="bibr" rid="ref5">5</xref>
          ]. The “Define test goals” practice refers to a
sub-practice called “Define test goals traceable to business needs and objectives”, and the verification
of the guidance document items mentioned above can serve as examples of test goals for avionics software
integration tests. The TMMI document refers to these test goals in further maturity level practices.
Therefore, the links offered by this guidance approach for a practice can affect other practices if they
contain the same terms or are referred to by other practices.
        </p>
      </sec>
    </sec>
    <sec id="sec-4">
      <title>4. Challenges</title>
      <p>In the first step of this study, a comparison process was implemented. Since the coverage
decisions in the comparison process are subjective, there is a risk of not obtaining exactly the same
coverage labels for each practice when the process is repeated by other practitioners. Nevertheless,
even if different practitioners execute the same comparison process, they will still see that the
DO-178C activities are not fully covered by TMMI practices, so a different overall outcome is not expected.</p>
    </sec>
    <sec id="sec-5">
      <title>5. Conclusion and Future Work</title>
      <p>This study explains the preliminary steps taken to propose an avionics software testing maturity
model in order to improve the integration testing processes of projects complying with DO-178C
requirements. In this context, DO-178C is analyzed to understand avionics software verification
activities and needs. TMMI is taken as the base maturity model since its “process area &amp; practice”
structure is similar to the process structure in DO-178C. In the first step, TMMI practices and
DO-178C activities are analyzed for bi-directional mapping with respect to the needs of avionics
software testing. Then, a document complementary to the DO-178C handbook is proposed to help
test organizations in applying TMMI to improve their integration testing processes within
avionics software development, focusing on the characteristics of the airborne software domain as
described in DO-178C.</p>
      <p>As the next step, empirical validation studies are planned to review the complementary
(guidance) document and improve it as necessary. DO-178C characteristics and links are defined
in this study as a guidance approach.</p>
      <p>First, expert opinions will be gathered to finalize the scope and content of the proposed guidance
document considering links and directions mentioned in it. Then, case study research method
will be used to investigate the applicability and usefulness of the final version of the document.
Also, maturity level for the case study will be defined and reported according to this approach.</p>
    </sec>
    <sec id="sec-6">
      <title>6. References</title>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>
          [1]
          <source>DO-178: Software Considerations in Airborne Systems and Equipment Certification</source>
          , Washington, DC: RTCA Inc.,
          <year>1981</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>
          [2]
          <source>DO-178C: Software Considerations in Airborne Systems and Equipment Certification</source>
          , Washington, DC: RTCA Inc.,
          <year>2011</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>
          [3]
          <string-name>
            <surname>Ericson</surname>
            ,
            <given-names>T.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Subotic</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          and
          <string-name>
            <surname>Ursing</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          (
          <year>1997</year>
          ),
          <article-title>TIM-a test improvement model</article-title>
          .
          <source>Softw. Test. Verif.</source>
        </mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>
          [4]
          <string-name>
            <given-names>J.</given-names>
            <surname>Andersin</surname>
          </string-name>
          , “
          <article-title>TPI -a model for Test Process Improvement</article-title>
          ,”
          <year>2004</year>
          . Accessed: Jun. 01,
          <year>2023</year>
          . [Online]. Available: https://www.cs.helsinki.fi/u/paakki/Andersin.pdf
        </mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>
          [5]
          TMMi Foundation
          , “
          <article-title>Test Maturity Model integration (TMMi®) Guidelines for Test Process Improvement Release 1.3 Produced by the TMMi Foundation</article-title>
          .”
          <source>Accessed: Jun. 01</source>
          ,
          <year>2023</year>
          . [Online]. Available: https://www.tmmi.org/tmmi-documents/
        </mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>
          [6]
          <string-name>
            <given-names>D. M.</given-names>
            <surname>Karr</surname>
          </string-name>
          , “
          <article-title>The Unit Test Maturity Model</article-title>
          ,”
          <year>2013</year>
          . Accessed: Jun. 01,
          <year>2023</year>
          . [Online]. Available: http://davidmichaelkarr.blogspot.com/2013/01/the-unit-test-maturity-model.html
        </mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>
          [7]
          <string-name>
            <given-names>S.</given-names>
            <surname>Reid</surname>
          </string-name>
          , “Personal Test Maturity Matrix”. Accessed: Jun. 01,
          <year>2023</year>
          . [Online]. Available: https://www.stureid.info/stuart-reid-software-testing/software-testing-whitepapers/personal-test-maturity-matrix/
        </mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>
          [8]
          <string-name>
            <given-names>V.</given-names>
            <surname>Garousi</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Felderer</surname>
          </string-name>
          and
          <string-name>
            <given-names>T.</given-names>
            <surname>Hacaloğlu</surname>
          </string-name>
          , "
          <article-title>What We Know about Software Test Maturity and Test Process Improvement</article-title>
          ," in
          <source>IEEE Software</source>
          , vol.
          <volume>35</volume>
          , no.
          <issue>1</issue>
          , pp.
          <fpage>84</fpage>
          -
          <lpage>92</lpage>
          , January/February 2018, doi: 10.1109/MS.2017.4541043.
        </mixed-citation>
      </ref>
      <ref id="ref9">
        <mixed-citation>
          [9]
          <source>ISO/IEC/IEEE 29119-1:2022, Software and systems engineering - Software testing - Part 1: General concepts</source>
          ,
          <year>2022</year>
          . [Online]. Available: https://www.iso.org/standard/81291.html
        </mixed-citation>
      </ref>
      <ref id="ref10">
        <mixed-citation>
          [10]
          <string-name>
            <given-names>A.</given-names>
            <surname>Ferreirós</surname>
          </string-name>
          and
          <string-name>
            <given-names>L. A. V.</given-names>
            <surname>Dias</surname>
          </string-name>
          ,
          "
          <article-title>Evaluation of Accomplishment of DO-178C Objectives by CMMI-DEV 1.3</article-title>
          ," 2015 12th International Conference on Information Technology - New Generations, Las Vegas, NV, USA,
          <year>2015</year>
          , pp.
          <fpage>759</fpage>
          -
          <lpage>760</lpage>
          , doi: 10.1109/ITNG.2015.132.
        </mixed-citation>
      </ref>
      <ref id="ref11">
        <mixed-citation>
          [11]
          <string-name>
            <given-names>G.</given-names>
            <surname>Güngör</surname>
          </string-name>
          , “TMMI &amp; DO-178C Mapping”. Accessed: Jun.
          <year>2023</year>
          . [Online]. Available: https://doi.org/10.5281/zenodo.8002215
        </mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>