Do We Need to Teach Testing Skills in Courses on Requirements Engineering and Modelling?

Gayane Sedrakyan, Monique Snoeck
Dept. of Decision Sciences and Information Management, K.U. Leuven, Leuven, Belgium
gayane.sedrakyan@kuleuven.be, monique.snoeck@kuleuven.be

Abstract—It is commonly accepted that quality testing is an integral part of systems engineering. Recent research highlights the need to shift the testing of a system to the earliest phases of engineering in order to reduce the number of errors resulting from miscommunicated and/or wrongly specified requirements. Information and computer science education may need to adapt to such needs. This paper explores the perspectives and benefits of testing-based teaching of requirements engineering. Model Driven Engineering (MDE) is known to promote the early testing perspective through fast prototyping of a prospective system, contributing in this way to the semantic validation of requirements. Our previous research presented empirically validated positive results on the learning effectiveness of model-based requirements engineering combined with an MDE-prototyping method adapted to an educational context, used both to test requirements and to test their testability. Despite these positive results, our observations of the prototype-testing patterns of novice analysts suggest that combining this prototype-based learning with the teaching of testing skills can result in even better learning outcomes.

Index Terms—Requirements, analysis, conceptual modelling quality, testing, validation, prototyping, feedback, technology-enhanced learning.

I. INTRODUCTION

A. Problem domain

In the early project phases the functionality of the prospective system is not yet understood precisely enough for formalization, which makes requirements elicitation not only a refinement, but also a learning process. This process is complicated by at least two problems present in natural language: ambiguity and inaccuracy. Formalization of requirements through models enables quality control at a level that is impossible to reach with requirements articulated in natural language. While experienced requirements engineers manage to mentally picture the prospective system when transforming requirements into formal models, such ability to truly understand the consequences of modelling choices can only be achieved through extensive experience. However, the tacit knowledge experts have developed over time is difficult to transfer to junior analysts. While teaching such knowledge and skills to novice analysts is already a challenging task, considering that system analysis is by nature an inexact skill, transferring academic knowledge and skills to real-world businesses is yet another concern, as classroom and real-world situations are not identical [1]. In their early careers, the error-prone problem-solving patterns of novices and their lack of capability to identify the information relevant for requirements verification lead to incomplete, inaccurate, ambiguous, and/or incorrect specifications [2]. When detected later in the engineering process, such requirements errors can be expensive and time-consuming to resolve [3]. This significant gap between the knowledge and skills of novices and experts triggers the question of how analysis skills can be trained to facilitate the fast progression of novice analysts to advanced levels of expertise.

B. Testing perspective contributes to improved knowledge

Testing is known as an integral part of software engineering. Recent research highlights the need to shift the testing of a system to the earliest phases of engineering [4]. The term early testing denotes a line of test research oriented towards enhancing the systematic implementation of test cases based on system requirements and business models [5]. Several approaches (such as the V-model [6] or Business Driven Test Management [7]) focus on early testing of business requirements within the system development process. Testing of requirements includes the following perspectives: 1. requirements must be tested and validated, 2. test cases must be defined early, 3. requirements must be specified in a way that makes them testable [8]. Teaching testing knowledge and skills is, however, largely neglected in Requirements Engineering courses. While testing has been refined into a more exact discipline using well-established standards, processes and document artefacts to integrate software and requirements [9], knowledge of requirements analysis is inexact by nature and mostly reliant on experience. This suggests that teaching requirements engineering using a test-based approach may contribute to improved requirements engineering skills.
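The third perspective can be made concrete for students by rephrasing a natural-language requirement as an executable check. The following is a minimal sketch in plain Python; the Invoice stub and requirement label R1 are hypothetical illustrations of our own, not material from any cited course or tool:

```python
class Invoice:
    """Hypothetical domain stub: an invoice must be issued to a customer."""
    def __init__(self, customer):
        if customer is None:
            raise ValueError("R1 violated: an invoice requires a customer")
        self.customer = customer

def test_r1_invoice_requires_customer():
    """Requirement R1: 'Every invoice is issued to exactly one customer.'
    Rephrased as a test: creating an invoice without a customer is refused."""
    try:
        Invoice(customer=None)      # step: attempt the forbidden action
        assert False, "R1 is violated: invoice created without a customer"
    except ValueError:
        pass                        # expected outcome: creation is refused

test_r1_invoice_requires_customer()
print("R1 is testable and holds in this stub")
```

Writing the requirement in this form forces the pre-conditions, steps and expected outcome to be made explicit, which is exactly what makes the requirement testable.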
C. Prototyping supports testing-based learning

Model Driven Engineering (MDE) [10] is known to promote early testing of software requirements through fast prototyping of a prospective system, contributing in this way to the semantic validation of requirements (see Fig. 1).

Fig. 1. Prototyping-based testing of requirements (a. classical development cycle; b. prototyping-based development cycle using requirements testing; both over the phases analysis, design, development, testing).

The learning context of prototyping as a type of simulation (e.g. learning by experiencing [11], [12]) suggests that, when adapted to the educational context, MDE prototyping can support the testing-based teaching of requirements engineering skills. In this work we explore the effectiveness of testing-based teaching of requirements analysis and validation using conceptual modeling and an MDE prototyping method. We posit that testing-based teaching of conceptual modeling can contribute to improved skills of novice business analysts for the analysis, verification and validation of requirements. This then raises the question of how the testing perspective can be integrated in the educational context.
II. EDUCATIONAL CONTEXT AND CONCEPTS

The proposed method (an adapted MDE environment) has been developed by the Management Informatics research group at the Faculty of Business and Economics, K.U. Leuven. The approach has subsequently been tested and validated within the course "Architecture and Modeling of Information Systems"¹ over a 5-year period of teaching, with participation and constant feedback from 500 students overall. The course targets master-level students with heterogeneous backgrounds from the Management Information Systems program. The goal of the course is to familiarize the students with modern methods and techniques of Object-Oriented Analysis and Design for Enterprise Information Systems.

¹ The course's page can be found at http://onderwijsaanbod.kuleuven.be/syllabi/e/D0I71AE.htm

Within the course the specific focus is on functional requirements. We motivate this choice by several reasons. When propagated to the later stages of development, requirements errors incur a high cost to repair. Empirical studies show that more than half the errors that occur during system development are requirements errors, and that requirements errors are the most common cause of failure of development projects [3]. The software development process involves the translation of information from one form to another (e.g. from customer needs to requirements to architecture to design to code). Because this process is human-based, mistakes are likely to occur during the translation steps [13]. Formalization of requirements through models enables quality control at a level that is impossible to reach with requirements articulated less formally in natural language. Formalizing requirements involves the transformation of informally represented knowledge into a formal specification, a transformation step affecting all three dimensions of requirements engineering: specification, representation and agreement [14]. Because they target a high-level functional view of the prospective system, functional requirements can be formalized by means of highly abstract design representations: conceptual models. As a sub-discipline of requirements engineering, conceptual modeling is described as the process of formally describing a problem domain for the purpose of understanding and communicating system requirements [15], thus making it easier to integrate business domain and ICT expertise in the system design process. In particular, conceptual models are an essential instrument to capture and formalize the domain-assumption part of requirements [16]. Furthermore, being a sub-discipline of both requirements engineering (communicating requirements) and software engineering (providing a foundation for building information systems) [17] makes conceptual models the earliest formally testable artefact. Conceptual modeling also supports the MDE approach, which, in addition to its testing potential, brings forward additional requirements towards models, such as a sufficient level of preciseness and detail to provide executable specifications, contributing in this way to improved quality of design artefacts. Thus we focus on the conceptualization of functional requirements as a basis for producing formally testable artefacts, to facilitate the process of domain understanding and requirements elicitation.

Fig. 2. Testing a prototype requires a skill

III. RELATED WORK

Despite the considerable amount of work devoted to simulation methodologies, and to prototyping in particular, to our knowledge no research publications describe courses that use prototyping in the context of requirements engineering, nor have empirically proven learning benefits been reported for a particular tool. One reason is that existing standards for simulation/prototyping technologies introduce a number of shortcomings, chief among them (1) being too complex and time-consuming for novice analysts whose technical expertise is limited, and (2) the difficulty of interpreting the simulation results. Among the different types of simulation, prototyping is capable of achieving the most concrete form of a prospective system. In our previous works we proposed a lightweight MDE-based prototyping method adapted to the learning context. The effectiveness of a prototype in a learning context was enhanced by the use of textual and graphical feedback on when and why the execution of a triggered business event is refused, thus making the links between a prototype and its model explicit [18], [19], [20]. The methodology used (a rapid prototyping method enabled by executable conceptual models) is based on the concepts of MERODE [21]. A sample screenshot is shown in Fig. 3. The prototyping method was also maximally adapted to novice analysts whose technical expertise is limited.
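To illustrate the kind of feedback involved, the sketch below reconstructs in plain Python how an executable conceptual model may refuse a triggered business event and explain why. It is an illustration of the idea only, not code produced by the MERODE tool chain; the Service/PromotionPackage names anticipate the example discussed in Section V:

```python
class PromotionPackage:
    def __init__(self):
        self.state = "suspended"   # lifecycle state of the business object

class Service:
    def __init__(self, package):
        self.package = package

    def subscribe(self):
        """Business event EV_subscribe; refused with a textual explanation
        when a sequence constraint on a related object is violated."""
        if self.package.state != "active":
            return ("REFUSED: EV_subscribe on Service - the associated "
                    f"PromotionPackage is in state '{self.package.state}', "
                    "which does not accept this event (sequence constraint)")
        return "ACCEPTED: EV_subscribe"

print(Service(PromotionPackage()).subscribe())
```

The explanatory message is what makes the link between the prototype's behavior and the underlying model explicit for the learner.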
The effects of feedback-enabled simulation on the learning outcomes of novice learners were observed by means of empirical studies. Extensive experimental testing with the participation of 114 students demonstrated the positive effect of prototype-based simulation on the requirements analysis and validation skills of junior modelers [19]. Despite the significant improvement of learning outcomes, we also observed several difficulties in students' testing cycles (see the following sections).

IV. TEACHING EXPERIENCES WITH FEEDBACK-ENABLED PROTOTYPING

Throughout the semester, testing-based analysis and validation cycles are stimulated by a problem-based learning method. In parallel with the theoretical sessions, students are requested to participate in computer-lab exercise sessions in which they are given analysis tasks such as validating given conceptualized specifications (usually a conceptual model solution of their peers) against given business requirements. The proposed solutions usually contain erroneous models which students need to read, understand and validate against the requirements, proposing improvements in case design errors are detected. Validation cycles are supported by MDE-prototyping as described in this paper. During the semester students are also assigned a group project (a real-world case with a requirements document of approximately 5-15 pages). At the end of the semester the solution is scored, and students are then orally examined to determine the final score as a correction on the model score. In the cohort of January 2012, students were asked to demonstrate their solution by manually inspecting the model using a test case provided by the teacher. Less than half of the students in this cohort were able to identify mistakes in their solution, not even when manually simulating it through a mental execution with a given test scenario. In the cohort of January 2013, the same type of evaluation was performed, but this time students had to execute the given test scenario using the prototype. By means of the dynamic testing approach in this cohort, more than half of the students were not only able to see mistakes but were also able to correct them. Although this result is positive, we nevertheless observed students' inability to develop their own adequate test scenarios [19].

Fig. 3. Validation through prototyping using feedback

To assess the effectiveness of the feedback-enabled simulation cycle on the learning outcomes of novice learners, three studies were conducted in the context of two master-level courses from two different study programs, spanning two academic years, with the participation of 104 students overall.
During the experiments students were asked to assess whether or not the model reflected a particular requirement statement correctly by responding to a set of true/false questions (requirements rephrased into test questions), e.g. "in this model solution invoicing is required to buy a retail product (TRUE/FALSE?)". They were also asked to motivate their answers. For each correct answer 1 point was attributed, and 0 for each wrong answer. In total 8 questions had to be answered (min. score = 0; max. score = 8). The results were analyzed by comparing the test scores of students using the simulated model while validating the proposed model solutions to the scores of the tests in which they performed manual inspection. The statistical analysis showed a significant improvement in students' capability to validate conceptual specifications against given requirements: a relative advantage (positive correction) of approximately 2.33 points out of 8 was observed (without prototype: 3.1; with: 5.43; p < 0.001) [19]. The 2013 evaluation by students of the improved tool extended with feedback resulted in an average of 4.58 on perceived usefulness for the prototyping tool and 4.52 for the incorporated feedback, on a five-point Likert scale.

A. Observations of testing patterns

As stated above, while the findings of the experiments showed a significant improvement in students' model-based validation capabilities when using feedback-enabled simulation, we still observed difficulties in students' testing. In this work we report our findings on the testing approaches of novice analysts, obtained by exploring students' wrong answers. The motivations for the answers provided by students were qualitatively analyzed, and the scenarios that occurred most frequently were generalized into patterns.

B. Testing patterns

The major problems generalized from students' motivations resulted in the following error patterns: (1) an omitted prototyping cycle; (2) partial testing with the prototype, characterized by incomplete testing scenarios. In their motivations for the answer when a simulation cycle was omitted, students referred to a modeling construct that according to them was already obvious from manual inspection (e.g. the relationship is optional), failing to consider another constraint that rendered the relationship mandatory (e.g. the cardinality constraint was omitted). The following frequent patterns were found in the motivations where a partial test was performed:

• Pattern 1: Confirmative rather than explorative testing (approximately 20% of wrong answers)
Sample requirement: "Each request can be processed by exactly one reviewer."
Testing approach: The testing scenario is limited to a confirmation scenario. While the requirement is tested for the positive case ("can be processed by a reviewer"), testing the constraint ("by not more than one") was omitted (a concrete contrast is sketched after this list).

• Pattern 2: Insufficient examination of path dependencies to identify related instances through transitive paths of dependencies (approximately 50% of wrong answers)
Sample requirement: "Ordering is not required for selling Retail Products to Walk-in Customers."
Testing approach: The testing scenario is limited to the first level of dependency, e.g. the student's motivation refers to the need to create an invoice line, which only requires an instance of invoice, thus rejecting the dependency on order. Testing the next-level dependency between invoice and order was omitted (i.e. the creation of the invoice was not executed, so the dependency was not discovered).

Fig. 4. Transitive path of dependencies

• Pattern 3: Insufficient examination of path dependencies to identify related instances through parallel paths of dependencies (approximately 30% of wrong answers)
Sample requirement: "If a business customer A orders some products, then it is possible that business customer B pays the invoice for these products."
Testing approach: The testing scenario is limited to one of the parallel paths, e.g. when the direct relationship between invoice and customer was examined, the examination of a hidden relationship through the order object, linked to both the invoice and customer objects, was omitted.

Fig. 5. Parallel paths of dependencies
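Pattern 1 can be demonstrated to students directly. The sketch below, in plain Python with a hypothetical Request stub of our own (not course material), contrasts the confirmative test students performed with the explorative counterpart they omitted:

```python
class Request:
    """Hypothetical stub: a request is processed by exactly one reviewer."""
    def __init__(self):
        self.reviewer = None

    def assign(self, reviewer):
        if self.reviewer is not None:    # cardinality constraint: exactly one
            raise ValueError("refused: request already has a reviewer")
        self.reviewer = reviewer

request = Request()
request.assign("reviewer A")             # confirmative case: assignment accepted
print("positive case holds")

try:                                     # explorative case: 'not more than one'
    request.assign("reviewer B")
    print("constraint violated: a second reviewer was accepted")
except ValueError:
    print("negative case holds: second assignment refused")
```

Only the second, explorative attempt exercises the part of the requirement that distinguishes a correct model from an erroneous one.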
V. PROPOSED SOLUTION: BORROWING A TESTING ARSENAL

An example of testing an erroneous model is shown in Fig. 3 by means of a model about (mobile phone) services to which customers can subscribe, and for which promotion packages are offered regularly. Testing the prototype reveals a semantic mismatch (design error): trying to subscribe to a service results in an execution failure due to a sequence constraint violation (the state of the "promotionPackage" object with which the chosen service is associated is "suspended"). The scenario fails because of a behavioral constraint, but it actually reveals a hidden dependency from "service" on "promotionPackage": it seems a service depends on the availability of a promotion, which is incorrect. The explanation can be extended with a graphical visualization linking to the specific part of the model that causes the error.

While in the example above the testing results can be interpreted subjectively by students depending on their analytical skills, teaching a more systematic testing approach would contribute to improved verification skills. To stimulate test-based requirements validation we propose borrowing the concept of the acceptance test, the goal of which is to ensure the testability of requirements [6]. This requires teaching knowledge of how to write/reformulate requirements as tests with the use of testing artifacts such as the Test Case (purpose, assumptions, pre-conditions, steps, expected outcome, actual outcome, post-conditions) and the Test Scenario (process flows, i.e. a sequence of test cases to execute). Next, the concept of coverage testing can be included to ensure the completeness of execution (each requirement should be exercised at least once). To ensure better results, peer expertise can be exploited through peer reviews of group projects in which one group of students acts as testers for another group. A simple example demonstrating an improved validation cycle for an erroneous model (see Fig. 6) with the use of a testing artefact is presented below.

Fig. 6. Sample erroneous model

For the requirement statement "Ordering is not required for selling Retail Products to Walk-in Customers" a student would have to specify a test scenario (in the model solution of a student, selling requires registering an invoice) … In a random, blind verification of this requirement, a student's attempt to create an invoice line will reveal the need for an invoice first: a popup window of the prototype would suggest creating an instance of invoice (or choosing from existing instances) to be associated with the newly created instance of invoice line. This will lead to the conclusion that invoicing is required (but not ordering) and hence to the erroneous conclusion that the requirement is satisfied. A systematic approach to test plan development would stipulate defining a complete test scenario, including the creation of the invoice, which would then reveal a dependency from invoice to the order object (a popup window of the prototype requiring the creation or choice of an existing order instance to be associated with the instance of invoice), leading to the conclusion that the above requirement is not satisfied.
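As an illustration of how these artefacts could be operationalized in the course, the sketch below encodes the Test Case fields listed above and a complete Test Scenario for the Fig. 6 requirement as plain Python dataclasses. The domain steps and outcomes are a hypothetical transcription of the popup dialogue described above, not output of the prototyping tool:

```python
from dataclasses import dataclass, field

@dataclass
class TestCase:
    purpose: str
    assumptions: str
    pre_conditions: str
    steps: list
    expected_outcome: str
    actual_outcome: str = "not executed"
    post_conditions: str = ""

@dataclass
class TestScenario:                # process flow: ordered sequence of test cases
    requirement: str
    cases: list = field(default_factory=list)

scenario = TestScenario(
    requirement="Ordering is not required for selling Retail Products "
                "to Walk-in Customers",
    cases=[
        TestCase(purpose="create an invoice line for a walk-in sale",
                 assumptions="prototype generated from the model under test",
                 pre_conditions="a walk-in customer and a retail product exist",
                 steps=["trigger create(InvoiceLine)"],
                 expected_outcome="prototype demands an Invoice first"),
        TestCase(purpose="create the demanded invoice (next dependency level)",
                 assumptions="previous case executed",
                 pre_conditions="invoice-creation popup is shown",
                 steps=["trigger create(Invoice)"],
                 expected_outcome="prototype demands an Order: the requirement "
                                  "is NOT satisfied by this model"),
    ],
)

for case in scenario.cases:        # a complete flow covers every dependency level
    print(case.purpose, "->", case.expected_outcome)
```

Executing the cases in order forces the tester through every dependency level, which is precisely the step the blind verification described above skips.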
Furthermore, teaching regression-testing knowledge can contribute to improved skills for integrating changes in requirements (identifying the test scenarios to be repeated because of a change). To stimulate such analytical skills, assignments that require integrating modifications into the requirements can be used.
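A minimal sketch of the underlying selection idea, assuming a hypothetical mapping from test scenarios to the requirements they exercise (the scenario and requirement names are invented for illustration):

```python
coverage = {                       # scenario name -> requirements it exercises
    "walk_in_sale":      {"R1: ordering not required for walk-in sales"},
    "business_order":    {"R2: customer B may pay customer A's invoice"},
    "invoice_lifecycle": {"R1: ordering not required for walk-in sales",
                          "R3: every invoice is issued to one customer"},
}

def scenarios_to_repeat(changed_requirements):
    """Select the test scenarios affected by a change in requirements."""
    return sorted(name for name, reqs in coverage.items()
                  if reqs & changed_requirements)

print(scenarios_to_repeat({"R1: ordering not required for walk-in sales"}))
# -> ['invoice_lifecycle', 'walk_in_sale']
```

Students could maintain such a mapping as part of their group-project test plans, making the impact of each requirements change explicit.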
VI. CONCLUSIONS

We compared the results of oral examinations with and without testing scenarios provided by the teacher. Two conclusions were obtained from this comparison: 1. the results demonstrate that testing by means of a working prototype improves model understanding compared to a paper exercise by 2.33 points out of 8; paper exercises limit the scope of understanding to a static view of a model, whereas dynamic testing fosters a more thorough understanding; 2. validation cycles supported with test scenarios provided by the teacher resulted in better model-understanding indicators than unassisted testing cycles. The results of experiments from our previous studies also confirmed the effectiveness of testing-based learning of the analysis and validation of requirements over traditional methods of learning, allowing a student to build deeply understood knowledge developed from one's own practice. The observations of students' testing patterns also suggest that, when combined with the teaching of high-level testing knowledge and skills, the method will result in even better learning outcomes. The results of this work contribute to innovative teaching practices by means of computer-enhanced learning [22] in the domain of requirements engineering, thus promoting better skill preparedness of novice analysts.

The work presented in this paper can be extended in several ways. One direction relates to data collection by means of the logs of the prototyping tool, which might provide new insights into the testing approaches and patterns of novices. While our observations were limited to a single prototyping cycle within the context of oral exams and experiments, another possibility is the investigation of testing patterns over longer periods of observation, e.g. prototyping logs of testing activities for group projects. Examination of testing patterns where a combination of structural and behavioral constraints is involved could be interesting as well. Based on the findings, tool support to enable automated assistance or the generation of test scenarios can be investigated as well.

REFERENCES

[1] Damassa, D. A., & Sitko, T. (2010). Simulation Technologies in Higher Education: Uses, Trends, and Implications. EDUCAUSE Center for Analysis and Research (ECAR), Research Bulletins.
[2] Schenk, K. D., Vitalari, N. P., & Davis, K. S. (1998). Differences between Novice and Expert Systems Analysts: What Do We Know and What Do We Do? Journal of Management Information Systems, 15(1), 9-50.
[3] Endres, A., & Rombach, H. D. (2003). A Handbook of Software and Systems Engineering: Empirical Observations, Laws and Theories. Addison-Wesley, Reading, MA, USA.
[4] Robertson, S. (2000). Requirements testing: Creating an effective feedback loop. FEAST 2000.
[5] Gutiérrez, J. J., Escalona, M. J., Mejías, M., & Torres, J. (2006). Generation of test cases from functional requirements: A survey. 4th Workshop on System Testing and Validation, Potsdam, Germany.
[6] V-Model Lifecycle Process Model. http://v-modell.iabg.de/
[7] Roodenrijs, E., van der Aalst, L., Baarda, R., Visser, B., & Vink, J. (2008). TMap NEXT® – Business Driven Test Management. UTN, ISBN 9789072194930.
[8] Pohl, K. (2010). Requirements Engineering: Fundamentals, Principles, and Techniques. Springer, Berlin, ISBN 978-3-642-12577-5.
[9] ANSI/IEEE Std 829-1983, IEEE Standard for Software Test Documentation.
[10] OMG. Model-Driven Architecture. http://www.omg.org/mda/
[11] Kluge, A. (2007). Experiential Learning Methods, Simulation Complexity and their Effects on Different Target Groups. Journal of Educational Computing Research, 36(3), 323-349.
[12] Barjis, J., Gupta, A., Sharda, R., Bouzdine-Chameeva, T., Lee, P. D., & Verbraeck, A. (2012). Innovative Teaching Using Simulation and Virtual Environments. Interdisciplinary Journal of Information, Knowledge, and Management, 7, 237-255.
[13] Walia, G., & Carver, J. (2009). A systematic literature review to identify and classify software requirement errors. Information and Software Technology, 51(7), 1087-1109, ISSN 0950-5849.
[14] Pohl, K. (1994). The three dimensions of requirements engineering: A framework and its applications. Information Systems, 19(3), 243-258, ISSN 0306-4379, http://dx.doi.org/10.1016/0306-4379(94)90044-2
[15] Siau, K. (2004). Informational and computational equivalence in comparing information modeling methods. Journal of Database Management (JDM), 15(1), 73-86.
[16] Jureta, I. J., Mylopoulos, J., & Faulkner, S. (2008). Revisiting the Core Ontology and Problem in Requirements Engineering. IEEE International Requirements Engineering Conference, 71-80.
[17] Moody, D. L. (2005). Theoretical and practical issues in evaluating the quality of conceptual models: current state and future directions. Data & Knowledge Engineering, 55(3), 243-276, ISSN 0169-023X.
[18] Sedrakyan, G., & Snoeck, M. (2013). A PIM-to-Code requirements engineering framework. In Proceedings of MODELSWARD 2013 – 1st International Conference on Model-Driven Engineering and Software Development, 163-169.
[19] Sedrakyan, G., & Snoeck, M. (2013). Feedback-enabled MDA-prototyping effects on modeling knowledge. In Enterprise, Business-Process and Information Systems Modeling (pp. 411-425). Springer.
[20] Sedrakyan, G., Snoeck, M., & Poelmans, S. (2014). Assessing the effectiveness of feedback enabled simulation in teaching conceptual modeling. Computers & Education (accepted).
[21] Snoeck, M., Dedene, G., Verhelst, M., & Depuydt, A. (1999). Object-Oriented Enterprise Modelling with MERODE. Leuvense Universitaire Pers, Leuven.
[22] European Commission. (2013). Opening up education: Innovative teaching and learning for all through new technologies and open educational resources. http://eur-lex.europa.eu/legal-content/EN/TXT/?qid=1389115469384&uri=CELEX:52013DC0654