Tenth International Workshop Modelling and Reasoning in Context (MRC) – 13.07.2018 – Stockholm, Sweden

Formalizing and Verifying Natural Language System Requirements using Petri Nets and Context based Reasoning

Aishwarya Chhabra, Amit Sangroya, C. Anantaram
TCS Innovation Labs, Gurgaon, India
{aishwarya.chhabra, amit.sangroya, c.anantaram}@tcs.com

Abstract

Natural language descriptions are generally used to describe requirements in reactive systems. Translating natural language requirements into a more formal specification is a challenging task. One possible approach to handling complex natural language requirements is to convert them into an intermediary formal representation, which is then converted into a more formal representation such as EDT (Expressive Decision Tables). In this paper, we use Petri nets in combination with domain-based context reasoning as a tool to model natural language requirements. We have also built a tool, NatEDT, to generate EDT specifications. In a case study consisting of natural language requirements across three domains, our experimental results show that Petri nets provide an efficient way of formalizing natural language requirements.

Table 1: Examples of natural language specifications

Automobile Domain:
If the ignition is on, and switch 1 is on for 2 seconds, then operation 1 becomes required.

Turn Indicator System:
When the turn indicator lever becomes left position, and the emergency flashing is off, then the flashing mode component shall assign left flashing to the flashing mode, set 0 at the flashing timer.

Air Conditioning Domain:
When swing mode is vertical, and operation mode is cooling, then operation type becomes high speed.

1 Introduction

With the rapid growth of the Internet of Things (IoT), smart homes, smart cars, and smart factories have become a reality. These smart systems are reactive systems that interact via various sensors. It is a challenging task for system designers to conceptualize systems that can take complex natural language sentences as input and test/verify the requirements. Table 1 shows some examples of natural language specifications from multiple domains of reactive systems. The NL requirements can belong to a system as simple as switching on a light or as complex as controlling a fire detection system remotely. It is important for system designers to have an engineering approach to formalize NL requirement specifications. In this paper, we focus on system requirements that are primarily descriptions of how a system (i.e. a smart home, automobile, or IoT-based system) is expected to perform in a real environment. An example of such a description in the automobile domain is: "If the ignition is on for more than 45 seconds, and seat belt is not engaged then alarm should beep". In this example, the alarm should beep if the ignition is on and the seat belt is not engaged.

The specifications of such systems are often written as natural language sentences by designers, and they tend to change for different system configurations and over time. To make it easier for system designers to define and update such specifications, it is important to have a way to convert them from natural language sentences into formal specifications. In general, a requirement broadly consists of two parts: a condition part and an action part. In our approach we take an NL requirement specification as input and produce Expressive Decision Tables as the final output. EDT is a formal way of representing an NL specification that can be tested and verified [Venkatesh et al., 2014]. EDT is a regular-expression-based, tabular notation for reactive systems; an example is shown in Table 2.

Table 2: Example of EDT (Expressive Decision Table)

in IGN     | in SEAT_BELT | out ALARM
ON {> 45s} | NOT ENGAGED  | BEEP

For reactive systems, EDT representations can be verified for functional testing more easily than NL specifications, because natural language specifications can be ambiguous or incomplete. System designers often use natural language because it can be written without getting into the complexity of the problem. However, to verify these reactive systems we need test cases, and it is better to generate test cases from a formal language. Hence, automating the translation from natural language to a formal language is necessary to bridge the gap between natural language and test case generation.

To this end, we designed a tool called NatEDT (Natural Language to EDT) that translates natural language specifications into a formal notation using Petri nets as an intermediary and verification mechanism. Our approach consists of several steps: pre-processing, NL parsing, an intermediary Petri net representation, and final output as EDT. In addition, a verification step helps to test properties of the requirements such as completeness and consistency. The overall approach is domain agnostic and can easily be adapted to new domains.

The remainder of this paper is structured as follows. Section 2 offers an overview of existing approaches to formalizing natural language requirement specifications. Section 3 introduces the preliminaries essential for our work. Section 4 presents the proposed architecture of the system, which takes natural language requirements as input and produces a formal specification as output. Section 5 discusses the experimental evaluation. Finally, Section 6 concludes with an outlook on future work.

2 Related Work

We classify the existing approaches into two categories.

Approaches using Restrictive Natural Language
One way is to restrict the language used in writing the requirements specifications in order to make semantic parsing easier [Yan et al., 2015]. Carvalho et al. use a Controlled Natural Language (CNL) for writing the specifications and define a grammar for that CNL structure to perform syntactic analysis [Carvalho et al., 2014]. They also discuss the trade-off between removing restrictions from the grammar and the extent of automation: they aim at fully automating the formalization process by restricting the language and providing a fixed format for the specifications. Since the NL is controlled, a lot of manual effort is required to convert a specification into that controlled format. Böschen et al. [Böschen et al., 2016] use a preprocessing approach to enrich the natural language requirements using a knowledge base. In this respect, our approach is complementary to theirs, since we also use a domain ontology to contextualize the natural language requirements.

Approaches using Less Restrictive Natural Language
Less restrictive NL specifications are more natural than restricted ones, which makes it easier for users to write requirements. Bajwa et al. propose an approach of scanning the specifications for relevant relationships and extracting them [Bajwa et al., 2012; 2010]. They also use intermediary models from which they extract the final concepts. However, their approach is primarily based on information extraction. Other NL processing techniques, such as parsing and the use of patterns, regular expressions, and rules, can also extract concepts and relationships from unrestricted natural language requirement specifications, but the output model must then be validated against the business specification, which can be laborious for large specifications. Our approach is based on parsing, which reduces the manual validation effort involved in approaches based on information extraction [Ghosh et al., 2016]. Ghosh et al. proposed ARSENAL, which also works for less restrictive grammar, but our approach additionally verifies correctness using a domain ontology: the specification is fully parsed and its semantic interpretation is performed in the context of the domain knowledge. Selway et al. also take advantage of parsing, but their approach differs in that they use SBVR (Semantics of Business Vocabulary and Business Rules) for semantic understanding [Selway et al., 2015]. Sadoun et al. make use of extraction rules automatically acquired from a training corpus and identify concepts using a domain ontology [Sadoun et al., 2013]. We also use domain knowledge in the form of an ontology, to validate all the identified variables and their values. Petri nets have been used as a verification mechanism in various domains [Lee et al., 2001; Sarmiento et al., 2015]. Our approach adds an explicit verification step: using Petri nets as an intermediary model gives us a more robust verification mechanism and a visual representation.

The primary advantage of our approach over the state of the art is that requirements can be written in less restrictive natural language. We use a domain dictionary that allows NL requirements to be written with a rich vocabulary. Petri nets as an intermediary help validate the NL specification while preserving its context. Additionally, NatEDT provides a process for formal verification of NL specifications.

3 Preliminaries

3.1 Expressive Decision Tables (EDT)

An EDT specification [Venkatesh et al., 2014] consists of one or more tables where the column headers specify the input and output ports and the rows specify the relationship between input and output values. Each cell in a row contains a regular expression that is used to match the input stream at that port. Input values arrive as a stream at input ports at discrete time units, and output values are generated as a stream at output ports at discrete time units. An example of EDT is shown in Table 2, where "in" stands for input and "out" stands for output.

3.2 Ontology

In the context of knowledge sharing, the term ontology means a specification of a conceptualization. That is, an ontology is a description of the concepts and relationships that can exist for an agent or a community of agents. This definition is consistent with the usage of ontology as a set of concept definitions, but is more general, and it is certainly a different sense of the word than its use in philosophy. We use a domain ontology that covers the concepts and their relationships, together with their attributes and values. We use Protege to construct the ontology in OWL format [Musen, 2015].

The domain ontology lets us specialize the approach to a particular domain, which helps produce better results. We use the ontology to check the identified concepts, so that concept extraction does not yield incorrect concepts: verified concepts are processed further, and concepts that do not belong to the domain are dropped. The ontology also helps us identify the correct relationships between the values and the variables.
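As a concrete illustration of this concept check, the following is a minimal sketch. The mini-ontology, the variable names, and the helper function are hypothetical stand-ins for the OWL domain ontology and the lookup code used in the actual tool.

```python
# Toy sketch of ontology-based concept validation (Section 3.2).
# The dict below is an illustrative stand-in for the Protege/OWL ontology:
# it maps each domain variable to its admissible values.
ONTOLOGY = {
    "IGN": {"ON", "OFF"},        # ignition
    "SW_1": {"ON", "OFF"},       # switch 1
    "OP_1": {"REQ", "NOT_REQ"},  # operation 1
}

def validate_concepts(extracted):
    """Keep only (variable, value) pairs that the domain ontology admits."""
    verified, dropped = [], []
    for var, val in extracted:
        if var in ONTOLOGY and val in ONTOLOGY[var]:
            verified.append((var, val))
        else:
            dropped.append((var, val))  # not a domain concept: discard
    return verified, dropped

verified, dropped = validate_concepts(
    [("IGN", "ON"), ("SW_1", "ON"), ("RADIO", "LOUD")])
print(verified)  # [('IGN', 'ON'), ('SW_1', 'ON')]
print(dropped)   # [('RADIO', 'LOUD')]
```

The same membership test doubles as the relationship check: a value that the ontology does not list for a variable is rejected even if both terms occur somewhere in the domain.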
3.3 Petri Nets

Petri nets, also known as Place/Transition nets, are used to verify workflows. Petri nets are classical models of concurrency, non-determinism, and control flow, first proposed by Carl Adam Petri in 1962. A Petri net is a collection of arcs connecting places and transitions. Places refer to the current state of the system, whereas transitions are the events that take place and may cause a change in the state of the system. Places may hold tokens, which enable transitions; when a transition fires, the tokens are distributed as per the weights given on the arcs. Places of Petri nets usually represent states or resources in the system, while transitions model the activities of the system. Petri nets are bipartite graphs and provide an elegant and mathematically rigorous modeling framework for discrete-event dynamic systems. They are a simple but effective method of analysing manufacturing systems [Murata, 1989; van der Aalst, 1998].

Definition 3.1. Petri Nets [Petri, 1962]
A Petri net is a four-tuple (P, T, IN, OUT) where
  P = {p1, p2, p3, ..., pn} is a finite set of places,
  T = {t1, t2, t3, ..., tn} is a finite set of transitions,
  IN is an input function that defines directed arcs from places to transitions, and
  OUT is an output function that defines directed arcs from transitions to places.

Graphically, places are represented by circles and transitions by horizontal or vertical bars (see Figure 1). If IN(pi, tj) = k, where k > 1 is an integer, a directed arc from place pi to transition tj is drawn with the label k. If k = 1 we include an unlabeled arc, and if k = 0 no arc is drawn.
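Definition 3.1 and the firing rule can be sketched in a few lines of plain Python. This is an illustrative stand-in only (the tool itself builds its nets with the SNAKES library, Section 4.3), and all place and transition names here are made up.

```python
# Minimal Place/Transition net per Definition 3.1: a four-tuple (P, T, IN, OUT)
# plus a marking that assigns a token count to each place.
P = {"p1", "p2", "p3"}
T = {"t1"}
IN = {("p1", "t1"): 1, ("p2", "t1"): 1}  # arc weights, place -> transition
OUT = {("t1", "p3"): 1}                  # arc weights, transition -> place
marking = {"p1": 1, "p2": 1, "p3": 0}

def enabled(t, marking):
    """A transition is enabled when every input place holds enough tokens."""
    return all(marking[p] >= w for (p, tt), w in IN.items() if tt == t)

def fire(t, marking):
    """Consume tokens from input places, produce tokens on output places."""
    assert enabled(t, marking), f"{t} is not enabled"
    m = dict(marking)
    for (p, tt), w in IN.items():
        if tt == t:
            m[p] -= w
    for (tt, p), w in OUT.items():
        if tt == t:
            m[p] += w
    return m

print(fire("t1", marking))  # {'p1': 0, 'p2': 0, 'p3': 1}
```

Firing t1 consumes one token each from p1 and p2 and produces one on p3, which is exactly the token redistribution described above.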
Figure 1: A simple example of a Petri net

4 System Architecture

Figure 2 shows the overall architecture. System designers specify the requirement in natural language. First, we perform syntactic analysis, which involves POS tagging, chunking, and dependency parsing. Thereafter, we perform semantic analysis using semantic parsing techniques, making use of a domain ontology to identify and confirm the context of a given specification. Petri nets are then used for verification of the NL specification. The output is a formal EDT that can be further processed for test case generation [Venkatesh et al., 2016].

Figure 2: Overview of the system architecture. An input requirement (e.g. "When power saving mode is on, swing mode is off, the controller shall assign off to ventilation function") passes through syntactic analysis using NLP techniques, semantic analysis using semantic parsing, domain-ontology-based context validation, and Petri-net-based verification, yielding a formal EDT (in power_saving_mode = on, in swing_mode = off, out ventilation_function = off).

We designed a tool, NatEDT, that takes a natural language sentence as input and generates an equivalent EDT specification while preserving the original context. The current version of the prototype not only generates the corresponding formal specification but also verifies the NL specifications. The overall workflow of NatEDT consists of the following main steps.

4.1 Preprocessing

Each system requirement consists of mainly two parts: a condition clause and an action clause. In preprocessing, the first step is to split the system requirement based on its syntactic structure. Specifications written in natural language may use synonyms of domain-specific terms (terms present in the domain ontology) instead of using them directly. To overcome this, we assume a domain dictionary built by domain experts; using it, synonyms of domain-specific words are replaced with the actual domain-specific words. For example, the NL specification "If ignition is ON, and switch 1 is ON for 2 seconds then operation 1 becomes REQ." is changed to "If IGN is ON, and SW 1 is ON for 2 seconds then OP 1 becomes REQ.". Since in this domain we consider the n-gram "switch 1" as one domain term, we replace it with SW 1. Moreover, some general n-grams, such as "greater than" and "less than or equal to", are likewise each treated as a single term.

4.2 Extraction of Domain Variables

In requirement specifications, both the condition clause and the action clause contain variables and values associated with those variables. We use the term input variables for the terms used in the condition clause and output variables for the terms used in the action clause. Extracting these variables from the requirement requires an intricate approach, which is described below.

Dependency Parsing
We use the Stanford CoreNLP dependency parser, which gives us a set of triples (dependent, dependency, governor) [Manning et al., 2014]. For example, (IGN, nsubj, ON) means that IGN is related to ON and is the nominal subject of ON. We parse our specification using this Stanford typed dependency parser and obtain a list of such triples with various dependencies [Marneffe et al., 2006]. Consider an example from the automobile domain that also has time attributes: "If IGN is ON, and SW1 is ON for 2 seconds". Figure 3 shows the dependency tree for this specification.

Figure 3: Dependency graph for the NL specification "If IGN is ON, and SW1 is ON for 2 seconds", with POS tags and typed dependencies (nsubj, cop, cc, conj:and, nummod, nmod:for).

Semantic Analysis and Validation using Ontology
Now that we have all the dependencies, which we can also call grammatical relationships, we need to make sure that in mapping them to the EDT we do not lose the context of the specification, since these are general grammar relationships. So for all the variables (concepts) we have some attributes associated with them: value, type, time, and modifier. We define rules associated with the dependencies to fill the slots for these attributes; using such rules, we preserve the context of the original requirement while modeling the Petri nets and generating the formal EDT specifications. For example, from (SW 1, nsubj, ON) we get that SW 1 is the variable and ON is its value, and we can validate the semantic relationship between SW 1 and ON using the domain ontology so that the context is preserved. The triples (2, nummod, seconds) and (seconds, nmod:for, ON) together help us fill the slot for the time attribute. For the "type" attribute there are two options, numeric and non-numeric: variables such as voltage or timer, whose values are always numeric, fall into the first category, and the remaining variables fall into the other; we extract this using the POS tags of the values. The last attribute is the modifier, which takes values such as greater than, less than, greater than or equal to, and equal to; in this example, the absence of any such relation implies 'equal to'.

4.3 Deriving Petri Net Representations

In this step we derive a Petri-net-based intermediate representation from the natural language requirement. To model this information as Petri nets we use the Python SNAKES library [Pommereau, 2008; 2015]. We assign each variable a place, and the values of the variables are considered as tokens. The expression which satisfies the condition is given on the arcs: when a fired token satisfies the expression on the arcs, the transition assigns tokens to the output place. Example: "If IGN is ON, and SW 1 is ON for 2 seconds then OP 1 becomes REQ.". In this step we can fire tokens and visualize the workflow of the requirement specification. Figure 4 shows the network before and after the firing of tokens.

4.4 Generating EDT Specifications

The last step is the generation of EDT specifications. As described earlier, EDT is a tabular format, so we use Python libraries such as pandas to create the table for a specification. By now we have identified the concepts and their attributes and validated them in the above steps, so we can map them to the EDT format. The places in the Petri net holding tokens denote the values at the current state of the system, so we take the places as the column names, where input places are prefixed with 'in' and output places with 'out'. The tokens are represented as the values in the corresponding rows. Table 3 shows the table generated for the given example.

Table 3: EDT for the representative example

in IGN | in SW 1 | out OP 1
ON     | ON      | REQ

4.5 Verifying Specifications for Contextual Inconsistencies

We verify the extracted domain variables and the values associated with them using the ontology during the transformation process itself. In addition, we have added a verification step to the tool that checks newly given requirements for consistency with the existing requirements. We process a finite number of specifications S (where S = S1, S2, S3, ..., Sn) as explained in the above sections and store the information extracted at each step in a file. When a user provides a new input condition C, it is processed as explained in the previous sections. The information extracted from C is matched against the information extracted for the specifications (S1, S2, S3, ..., Sn) using the verification algorithm, which returns the complete specification with a Petri net for the given input conditions if an exact match is found. If no exact match is found, it looks for the best match, i.e., the specification with the highest number of matching input conditions, and returns the Petri net and table stored for that specification.

For example, suppose the set S originally contains the specification "If screen is unlock and power button is pressed for less than 1 second and released, then turn screen to lock", and the user describes the input condition "If screen is unlock and power button is pressed for less than 3 second and released". The tool processes this condition and extracts two input variables: SCREEN with value UNLOCK, and POWER BUTTON with value PRESSED for less than 3s, then changing to RELEASED. The verification algorithm cannot find an exact match in the set S, so it looks for the best match, highlighting the differences; in this example, it highlights that the time attribute differs from the existing best match for the given input. Thereafter, the user is given the option either to correct the condition, if it was an error, or to add it as a new requirement specification in the set S by providing the output action for the corresponding input.

5 Experimental Evaluation

The NatEDT tool was tried on two different sets of requirements from the automobile domain: a Turn Indicator System set (17 requirements) and another automobile sample set [Venkatesh et al., 2014] (29 requirements). We also tested toy examples from the air conditioning domain (see Table 5). We tested on different domains to check the adaptability of the approach and found it to be generic: to adapt to a new domain, one needs an external domain dictionary for preprocessing and a domain ontology for verifying the specifications. Our evaluation criteria were precision and recall: we calculated the total number of concepts to be identified in each sample set, and then the variables that were correctly identified, incorrectly identified, and missed. The largest requirement in English was composed of 48 words and the smallest sentence of 12 words. The Turn Indicator System (TIS) sample consisted mostly of state-based examples, while the other set had state-based examples as well as examples involving time and streams of inputs. The results are compiled in Table 4. We obtain a precision of 94.36% and a recall of 91.78% in the automobile domain. We cannot directly compare our results with other work, as the formal notation and approach used in our paper differ from the formal notations and approaches in prior work; nevertheless, our results are quite promising for the transformation to a relatively new formal notation.

Table 4: Precision and recall for various domains

Domain                | # Samples | Recall % | Precision %
Turn Indicator System | 76        | 85.52%   | 100%
Automobile            | 49        | 91.78%   | 94.36%
Air Conditioning      | 17        | 100%     | 100%

6 Conclusions and Future Work

We have developed a tool called NatEDT that takes a natural language sentence as input and generates an EDT specification. We make use of domain knowledge in the form of a dictionary and an ontology to preserve the context of the NL specification. We are also able to verify new NL requirements against the existing requirements for consistency and completeness. In the future, this work could be extended toward more robust verification and validation of the requirements in the system.
Table 5: Examples of natural language specifications and their corresponding EDTs

1. If Mist Remover Switch is on and Mist Remover controller is NO req then Mist Remover Request becomes on and Mist remover relay is on.
   in Mist Remover SW | in Mist Remover Ctrl | out Mist Remover Request | out Mist Remover Relay
   ON                 | NO REQ               | ON                       | ON

2. If Mist Remover Switch is off and Mist Remover controller is NO req then Mist Remover Request becomes off and Mist remover relay is off.
   in Mist Remover SW | in Mist Remover Ctrl | out Mist Remover Request | out Mist Remover Relay
   OFF                | NO REQ               | OFF                      | OFF

3. If alarm is ON, and panicsw is pressed for more than 3 seconds and released then the flash should be NO REQ, and alarm should be OFF.
   in alarm | in panicSw              | out flash | out alarm
   ON       | pressed > 3s released   | NO REQ    | OFF

4. When the turn indicator lever becomes left position, and the emergency flashing is off, then the flashing mode component shall assign left flashing to the flashing mode, reset the flashing timer.
   In turn indicator lever | In emergency flashing | Out flashing mode | Out flashing timer
   Left position           | off                   | left flashing     | 0

5. When the emergency flashing is off, and the turn indicator lever becomes the right position, and the flashing mode is not right flashing, the flashing mode component shall assign right flashing to the flashing mode, reset the flashing timer.
   In turn indicator lever | In emergency flashing | Out flashing mode | Out flashing timer
   Right position          | off                   | right flashing    | 0

Figure 4: The Petri net before and after the firing of tokens. Before firing, places IGN and SW_1 each hold the token 'ON' and OP_1 is empty; after the transition fires, OP_1 holds the token 'REQ'.

References

[Bajwa et al., 2010] I. S. Bajwa, B. Bordbar, and M. G. Lee. OCL constraints generation from natural language specification. In 2010 14th IEEE International Enterprise Distributed Object Computing Conference, pages 204–213, Oct 2010.

[Bajwa et al., 2012] Imran Bajwa, Behzad Bordbar, and Mark Lee. NL2Alloy: A tool to generate Alloy from NL constraints. 10:365–372, 12 2012.

[Böschen et al., 2016] Martin Böschen, Ralf Bogusch, Anabel Fraga, and Christian Rudat. Bridging the gap between natural language requirements and formal specifications. In Joint Proceedings of REFSQ-2016 Workshops, Doctoral Symposium, Research Method Track, and Poster Track co-located with the 22nd International Conference on Requirements Engineering: Foundation for Software Quality (REFSQ 2016), Gothenburg, Sweden, March 14, 2016.

[Carvalho et al., 2014] Gustavo Carvalho, Ana Carvalho, Eduardo Rocha, Ana Cavalcanti, and Augusto Sampaio. A formal model for natural-language timed requirements of reactive systems. In Stephan Merz and Jun Pang, editors, Formal Methods and Software Engineering, pages 43–58, Cham, 2014. Springer International Publishing.

[Ghosh et al., 2016] Shalini Ghosh, Daniel Elenius, Wenchao Li, Patrick Lincoln, Natarajan Shankar, and Wilfried Steiner. ARSENAL: Automatic requirements specification extraction from natural language. In Sanjai Rayadurgam and Oksana Tkachuk, editors, NASA Formal Methods, pages 41–46, Cham, 2016. Springer International Publishing.

[Lee et al., 2001] Jonathan Lee, Jiann-I Pan, and Jong-Yih Kuo. Verifying scenarios with time Petri-nets. Information and Software Technology, 43(13):769–781, 2001.

[Manning et al., 2014] Christopher D. Manning, Mihai Surdeanu, John Bauer, Jenny Finkel, Steven J. Bethard, and David McClosky. The Stanford CoreNLP natural language processing toolkit. In Association for Computational Linguistics (ACL) System Demonstrations, pages 55–60, 2014.

[Marneffe et al., 2006] M. Marneffe, B. Maccartney, and C. Manning. Generating typed dependency parses from phrase structure parses. In Proceedings of the Fifth International Conference on Language Resources and Evaluation (LREC-2006), Genoa, Italy, May 2006. European Language Resources Association (ELRA). ACL Anthology Identifier: L06-1260.

[Murata, 1989] Tadao Murata. Petri nets: Properties, analysis and applications. Proceedings of the IEEE, 77(4):541–580, April 1989.

[Musen, 2015] Mark A. Musen. The Protege project: A look back and a look forward. AI Matters, 1(4):4–12, June 2015.

[Petri, 1962] Carl Adam Petri. Kommunikation mit Automaten. PhD thesis, Universität Hamburg, 1962.

[Pommereau, 2008] Franck Pommereau. Quickly prototyping Petri nets tools with SNAKES. Petri Net Newsletter, (10-2008):1–18, 10 2008.

[Pommereau, 2015] Franck Pommereau. SNAKES: a flexible high-level Petri nets library. In Proceedings of PETRI NETS'15, volume 9115 of LNCS, pages 254–265. Springer, 06 2015.

[Sadoun et al., 2013] D. Sadoun, C. Dubois, Y. Ghamri-Doudane, and B. Grau. From natural language requirements to formal specification using an ontology. In 2013 IEEE 25th International Conference on Tools with Artificial Intelligence, pages 755–760, Nov 2013.

[Sarmiento et al., 2015] E. Sarmiento, J. C. S. d. P. Leite, and E. Almentero. Analysis of scenarios with Petri-net models. In 2015 29th Brazilian Symposium on Software Engineering, pages 90–99, Sept 2015.

[Selway et al., 2015] Matt Selway, Georg Grossmann, Wolfgang Mayer, and Markus Stumptner. Formalising natural language specifications using a cognitive linguistic/configuration based approach. 54, 04 2015.

[van der Aalst, 1998] Wil M. P. van der Aalst. The application of Petri nets to workflow management. Journal of Circuits, Systems, and Computers, 8(1):21–66, 1998.

[Venkatesh et al., 2014] R. Venkatesh, U. Shrotri, G. M. Krishna, and S. Agrawal. EDT: A specification notation for reactive systems. In 2014 Design, Automation and Test in Europe Conference and Exhibition (DATE), pages 1–6, March 2014.

[Venkatesh et al., 2016] R. Venkatesh, Ulka Shrotri, Amey Zare, and Supriya Agrawal. On generating test cases from EDT specifications. In Leszek A. Maciaszek and Joaquim Filipe, editors, Evaluation of Novel Approaches to Software Engineering, pages 1–20, Cham, 2016. Springer International Publishing.

[Yan et al., 2015] R. Yan, C. H. Cheng, and Y. Chai. Formal consistency checking over specifications in natural languages. In 2015 Design, Automation and Test in Europe Conference and Exhibition (DATE), pages 1677–1682, March 2015.