EARS-CTRL: Generating Controllers for Dummies

Levi Lúcio (Model-Based Software Engineering, fortiss GmbH, Munich, Germany, lucio@fortiss.org)
Salman Rahman (Technische Universität München, Munich, Germany, salman.rahman@tum.de)
Saad Bin Abid (Model-Based Software Engineering, fortiss GmbH, Munich, Germany, abid@fortiss.org)
Alistair Mavin (Rolls-Royce, Derby, UK, alistair.mavin@rolls-royce.com)

Abstract—In this paper we present the EARS-CTRL tool for synthesizing and validating controller software for embedded systems. EARS-CTRL has as starting point requirements written in (English) natural language, more specifically in the EARS (Easy Approach to Requirements Syntax) language invented at Rolls-Royce and currently in use at numerous organizations around the world. After expressing the requirements in English, the requirements engineer can produce the controller code at the press of a button. EARS-CTRL then provides facilities for validating the generated controller that allow step-by-step simulation or test-case generation using MATLAB Simulink.

Index Terms—Software Controllers, Natural Language, Code Synthesis, MATLAB Simulink

I. INTRODUCTION

In this paper we describe the EARS-CTRL tool for building and verifying discrete-event software controllers. EARS-CTRL has as starting point the EARS (Easy Approach to Requirements Syntax) language. EARS was created at Rolls-Royce to improve the expression of natural language requirements [12] and can be seen as a way to "gently" constrain English. The application of EARS produces requirements in a small number of patterns. EARS copes well with large specifications of requirements for several domains [10], [11]. EARS is also an effective way of reducing many of the problems that plague requirements documents written using unconstrained natural language [12].

With the EARS-CTRL tool we make a step in the direction of controller construction using natural language as the central specification.
After specifying the vocabulary to be used in the specification, a requirements engineer writes the specification using EARS templates. Then, at the press of a button, the controller is synthesized. Simulation and test case generation panels allow the requirements engineer to immediately experiment with and validate the controller.

This paper builds on a previous article [9]. Our new contributions are a revision of the requirements language of EARS-CTRL, which now has exactly the same syntax as the EARS language presented in [12]. We do so by improving the coverage of the semantic gap between EARS and the underlying logical formalism used by the controller synthesizer. We now also offer the possibilities of simulating requirements specifications as well as of generating test cases. The EARS-CTRL tool is freely available as a GitHub project at [1].

The work presented in this paper was developed for the "IETS3" project, funded by the German Federal Ministry of Education and Research under code 01IS15037B.

II. HIGHLIGHTS

A. "Real" EARS

EARS was not originally built to describe requirements at a level where they can automatically be transposed into a computer application. As such, an effort had to be made in order to overcome the semantic gap between the structured but non-formal nature of EARS and the strictly formal nature of the Linear Temporal Logic (LTL) formalism needed by the automated synthesis mechanisms. In particular, our work uses the GXW subset of LTL, as that subset is supported by the autoCode4 tool we use for synthesizing controllers.

Figure 1 illustrates a set of EARS-CTRL requirements for the software controller for a sliding door. By remaining as close as possible to the original EARS syntax, our editor allows building requirements as correct English sentences that can easily be written and understood by humans. In fact, given the requirements stated in figure 1, no additional explanations are necessary for a human to understand the behavior of the sliding door controller that should be generated. In [9] we have presented a previous version of EARS-CTRL which included templates that, although not part of the original EARS, had been introduced to simplify translation into LTL. In particular, we had introduced the possibility of adding an until segment at the end of requirements, which is not standard EARS and which we have removed in the current version of the tool. The work of matching the syntax of EARS-CTRL more closely to "real" EARS while preserving a semantically meaningful translation into LTL was done together with Alistair Mavin, the co-author of this paper who is also the lead author of EARS [12].

Our rationale is that, by remaining as faithful as possible to the original EARS syntax, we: 1) benefit from all the advantages of using EARS already investigated and described in the literature [11], [12]; and 2) provide to Rolls-Royce and potentially other companies a tool that can immediately be used by engineers trained in the use of EARS.

Fig. 1: EARS-CTRL Requirements for a sliding door controller

B. A Push-Button Approach

EARS-CTRL can synthesize software controllers directly from EARS requirements, at the push of a button. Such syntheses are produced by the autoCode4 [5] tool in the form of a synchronous dataflow (SDF) diagram, which our tool can display graphically.

C. Validation

1) Well-Formedness by Construction: In order to build the requirements model illustrated in figure 1 it is necessary to first build a glossary for the controller. Glossaries are specific to EARS-CTRL and are not part of the EARS language described in [12]. An EARS-CTRL glossary identifies the components of the system to be controlled. Each of those components contains actuators and (possibly) sensors that will be used by the controller logic as outputs to and inputs from the real system. The vocabulary defined in the glossary is proposed to the requirements engineer by the EARS-CTRL IDE in order to fill in the placeholders of an EARS template when a new requirement is being written. Because well-formedness is enforced by construction, requirement specifications written in EARS-CTRL are always syntactically correct.
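To make the well-formedness-by-construction idea concrete, the following is a purely illustrative Java sketch of a glossary whose declared vocabulary is the only material from which an event-driven requirement can be built. The class, record and method names are our own invention and do not mirror EARS-CTRL's actual MPS meta-model.

    import java.util.List;
    import java.util.Set;

    /** Purely illustrative sketch of glossary-constrained requirements; names and
     *  structure are hypothetical and do not mirror EARS-CTRL's actual meta-model. */
    public class EventDrivenRequirement {

        /** A controlled component declares the sensors (inputs) and actuators (outputs)
         *  that requirements are allowed to mention. */
        record Component(String name, Set<String> sensors, Set<String> actuators) {}

        record Glossary(List<Component> components) {
            boolean declaresSensorEvent(String event) {
                return components.stream().anyMatch(c -> c.sensors().contains(event));
            }
            boolean declaresActuatorAction(String action) {
                return components.stream().anyMatch(c -> c.actuators().contains(action));
            }
        }

        private final String trigger;   // e.g. "object proximity sensor is activated"
        private final String response;  // e.g. "open door"

        /** Well-formedness by construction: template placeholders can only be filled
         *  with vocabulary that the glossary actually declares. */
        EventDrivenRequirement(Glossary glossary, String trigger, String response) {
            if (!glossary.declaresSensorEvent(trigger)
                    || !glossary.declaresActuatorAction(response)) {
                throw new IllegalArgumentException("placeholder is not part of the glossary");
            }
            this.trigger = trigger;
            this.response = response;
        }

        @Override
        public String toString() {
            return "When " + trigger + " the controller shall " + response + ".";
        }
    }

In the actual tool this constraint is enforced interactively by the projectional editor, which only proposes glossary entries for the template placeholders, rather than by a runtime check as in the sketch above.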
Once a controller has been synthesized from a set of EARS requirements, it becomes important to understand whether it behaves as expected. In order to do so we have used the Simulink engine [2] as a simulation back-end. In figure 2 we display the EARS-CTRL panel that allows "playing" the controller by providing a sequence of inputs manually. Outputs are incrementally added to the panel as new inputs are provided by the requirements engineer. Note that the simulation panel dynamically displays the sensors of the controller being simulated, as can be seen in figure 2 for the sliding doors example.

Fig. 2: EARS-CTRL specification simulator

2) Generation of Test Cases: EARS-CTRL allows generating test cases directly from the EARS requirements. A test case consists of a sequence of ⟨input, output⟩ pairs, where each input is a vector of sensor states and each output a vector of actuator states. Note that individual sensors and actuators can assume two states: ON or OFF. Test case generation is configured by three parameters:

• Maximum test case length: defines the maximum length of the ⟨input, output⟩ pair sequences to be generated.
• Allow parallel inputs: enables or disables the possibility of having more than one sensor being active for inputs in the test case.
• Allow repeated inputs: enables or disables having repeated inputs in a test.

Test cases generated by EARS-CTRL can serve two purposes: firstly, they are traces of execution of the synthesized controller and can be used as witnesses of correct/incorrect behavior; secondly, it may be that the synthesizer is not trusted for generating controllers used in production: in this case the synthesized controller can behave as an oracle to generate test cases for a controller implemented using alternative means.
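The sketch below illustrates the shape of such a test case and its second use as an oracle: a sequence of ⟨input, output⟩ steps over ON/OFF signals, replayed against an alternative controller implementation. All names are hypothetical and the code is not part of EARS-CTRL.

    import java.util.List;
    import java.util.Map;

    /** Illustrative sketch of the test-case shape described above; class and method
     *  names are hypothetical. */
    public class TestCaseReplay {

        /** One step: the sensor states fed to the controller and the actuator states
         *  that the synthesized controller (acting as oracle) produced for them. */
        record Step(Map<String, Boolean> sensorInputs, Map<String, Boolean> expectedActuators) {}

        /** Any alternative controller implementation under test. */
        interface Controller {
            Map<String, Boolean> step(Map<String, Boolean> sensorInputs);
        }

        /** Replays the generated test case and returns the index of the first diverging
         *  step, or -1 if the candidate agrees with the oracle on every step. */
        static int firstMismatch(List<Step> testCase, Controller candidate) {
            for (int i = 0; i < testCase.size(); i++) {
                Step s = testCase.get(i);
                if (!candidate.step(s.sensorInputs()).equals(s.expectedActuators())) {
                    return i;
                }
            }
            return -1;
        }
    }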
D. Code Generation

Although it is not possible to generate C code for the controller directly from EARS-CTRL, this can be achieved by directly running Simulink's code generator on the Simulink model obtained from an EARS-CTRL requirements specification. This possibility is particularly important for the next steps in our collaboration with Rolls-Royce, as we wish to experiment with running the synthesized controllers in real or simulated execution environments.

III. ARCHITECTURE

In figure 3 we depict the architecture of the EARS-CTRL tool, its main components and the artifacts those components exchange. The paragraphs below are numbered such that each description can be matched to the process-related components of the tool depicted in figure 3. Letter labels are used in figure 3 to refer to data artifacts.

Fig. 3: The EARS-CTRL Tool Chain

a) Editors and Control Panels: The requirements editor, the glossary editor, the simulation and test generation control panels and the synchronous data-flow diagram visualizer (respectively noted (a), (b), (c) and (d) in figure 3) have all been built as domain-specific languages (DSLs) in the Meta Programming System (MPS) tool [4]. MPS is both a projectional editor and a domain-specific language workbench. Domain-specific languages in MPS are composed of an abstract syntax, also known as a meta-model, and a concrete syntax. The concrete syntax allows displaying and/or editing the information present in a model, as depicted for instance in figures 1 and 2. Note that, because MPS is a projectional editor, the abstract syntax is directly edited, which avoids the explicit or implicit intermediate step where the concrete syntax is parsed. A consequence of this is, for example, the fact that when a component's name is updated in an EARS-CTRL glossary, that change will immediately be reflected in any requirements that refer to that component name.

b) From EARS to Linear Temporal Logic: Let us consider the requirement Req1, which is part of the specification of the sliding doors controller in figure 1:

    When object proximity sensor is activated then the automatic door controller shall open door.

This requirement, taken in isolation, translates to the following LTL formula:

    [] (object proximity sensor is activated → door open)

which, if one takes into consideration the semantics of the → operator as "implies", is the expected logical meaning of Req1. All EARS templates, when taken in isolation, can be directly translated into LTL and propositional logic in such a straightforward manner. However, when one translates the whole set of requirements for the automatic door in figure 1 into LTL, the result for Req1 will be as follows:

    [] (object proximity sensor is activated → (door open W (door opening limit sensor is activated ∨ timer expires ∨ door closing limit sensor is activated)))

This is due to the fact that the requirements specify behaviors that are intertwined during execution. For example, from Req1 in figure 1 we know that if the object proximity sensor is activated, the doors will open. We also know from Req2 that, when the opening limit reached sensor is activated, the doors will stop. Without additional information, the autoCode4 synthesis tool identifies a contradiction in these two requirements since, if the two sensors are activated during the same execution, the doors will logically simultaneously open and close. In order to avoid such contradictions it becomes necessary to establish a temporal dependency between the behaviors specified by the requirements. To achieve this, our tool performs a static analysis of the requirements in order to identify such dependencies and to add this information to the generated LTL specification. This additional contextual information in the generated LTL is clear from the second translation above: the door will only open until (denoted by the "W" operator) the door opening limit reached sensor is activated or other events stated in door-related requirements occur.
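As an illustration of the translation rule just described (its shape only, not the concrete input syntax expected by autoCode4), the sketch below builds the isolated translation G(trigger → response) and the context-enriched one G(trigger → (response W (e1 ∨ … ∨ en))) from a trigger, a response and the release events found by the static analysis. Proposition names are simplified placeholders, "G" stands for the [] (globally) operator and "|" for disjunction.

    import java.util.List;

    /** Sketch of the translation rule for event-driven EARS requirements;
     *  it only shows the shape of the generated formulas. */
    public class EarsToLtl {

        /** Returns G(trigger -> response) in isolation, or
         *  G(trigger -> (response W (e1 | ... | en))) once the static analysis
         *  has found the events that end the response. */
        static String translate(String trigger, String response, List<String> releaseEvents) {
            if (releaseEvents.isEmpty()) {
                return "G(" + trigger + " -> " + response + ")";
            }
            return "G(" + trigger + " -> (" + response + " W ("
                    + String.join(" | ", releaseEvents) + ")))";
        }

        public static void main(String[] args) {
            // Req1 of the sliding door example, with the release events the analysis would add.
            System.out.println(translate(
                    "object_proximity_sensor_is_activated",
                    "door_open",
                    List.of("door_opening_limit_sensor_is_activated",
                            "timer_expires",
                            "door_closing_limit_sensor_is_activated")));
        }
    }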
c) Synthesizing a Controller using autoCode4: Controller synthesis is achieved via autoCode4's Java API. The LTL specification obtained as explained in paragraph b) above is passed into the synthesizer, which returns a synchronous data-flow (SDF) diagram as a Java object instance. The SDF diagram is then rebuilt as a visual model which is an instance of the synchronous data-flow diagram visualizer DSL (identified by label (e) in figure 3). Such a visual model provides the requirements engineer with a graphical and technical view of the synthesized controller as a set of blocks and wires, which can be used as a debugging artifact.

d) Simulation and Test Generation using Simulink: The SDF diagram obtained from autoCode4 consists, in short, of a set of synchronized blocks that perform arithmetic, logical or other functions on input signals and return the result as output signals. The controller's inputs and outputs are themselves represented as blocks. The fashion in which blocks are synchronized is declared by connecting those blocks' inputs and outputs via wires. In order to simulate EARS-CTRL specifications we have built a translator from such SDF diagrams onto Simulink models (label (e) in figure 3). Given that the SDF formalism is very similar to the Simulink formalism, the structural translation is one-to-one. However, only a subset of all blocks present in the SDF specifications that are produced by autoCode4 is available off-the-shelf in Simulink. As such, a number of stateful Simulink blocks had to be built by us to mimic the semantics of some of the blocks present in SDF specifications. The EARS-CTRL IDE communicates with Simulink via the matlabcontrol [3] Java API.
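For readers unfamiliar with matlabcontrol [3], the fragment below is a minimal sketch of how a Java program can drive a Simulink model of the kind produced by our translator: obtain a proxy, load the model, set workspace variables for sensor inputs, run the simulation and read back an actuator signal. The model name, variable names and the way inputs and outputs are exchanged are hypothetical placeholders, not EARS-CTRL's actual integration code.

    import matlabcontrol.MatlabProxy;
    import matlabcontrol.MatlabProxyFactory;

    /** Minimal sketch of driving a generated Simulink model through matlabcontrol.
     *  All model and variable names are hypothetical placeholders. */
    public class SimulinkDriver {

        public static void main(String[] args) throws Exception {
            MatlabProxy proxy = new MatlabProxyFactory().getProxy();
            try {
                // Load the Simulink model produced from the SDF diagram (hypothetical name).
                proxy.eval("load_system('sliding_door_controller')");
                // Provide one sensor input through a workspace variable (assumes the model reads it).
                proxy.setVariable("object_proximity_sensor", 1.0);
                // Run the simulation and read back an actuator signal (assumes the model logs it).
                proxy.eval("sim('sliding_door_controller')");
                Object doorOpen = proxy.getVariable("door_open");
                System.out.println("door open signal: " + doorOpen);
            } finally {
                proxy.disconnect();
            }
        }
    }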
IV. RELATED WORK

Given the recent fast-paced development of Artificial Intelligence relying on increasingly powerful hardware, a number of projects have devoted effort to the generation of controllers from requirements. The ARSENAL project [6] has as starting point specifications written in arbitrary natural language and uses the GR-1 [13] synthesizer for automatically building controllers. In [8] the authors also use the GR-1 synthesizer to automatically build robot controllers. The work of Yan et al. [15] takes as inputs full LTL specifications and includes features such as the use of dictionaries for automatically deriving relations between terms, or guessing the I/O partitioning that allows detecting inconsistencies in the specifications. The commercial Argosim STIMULUS tool [7], while not based on AI algorithms for controller synthesis, is a platform that allows specifying requirements in a formal language using a close-to-natural-language syntax. Requirements expressed in STIMULUS can be simulated and test cases can also be directly generated from the requirements.

Our approach differs from the GR-1-based projects mentioned above in the sense that we do not aim at applying pure natural language parsing to arbitrary requirements. Using EARS allows us to provide the readability of the English language while gently constraining it to fit the domain of expressing requirements. Also, rather than using the full expressiveness of LTL, we have restricted our approach to the GXW subset of LTL, which is handled by the autoCode4 tool. We then directly generate controllers as SDF diagrams, which are easy to inspect and to simulate. GR-1- or bounded synthesis [14]-based tools typically produce controllers as BDDs or explicit state machine structures that can be very large and difficult to inspect or simulate.

Regarding the STIMULUS tool, our approach was conceptually thought of starting from an opposite direction: while STIMULUS essentially uses as its central formalism state machines wrapped by a syntactic-sugar English-like specification language, EARS-CTRL uses a constrained version of the English language. We have purposefully placed EARS at the center of our tool: the goal has been to adapt the subset of LTL used by autoCode4 to EARS and to remain unbiased towards the formalisms "under the hood". STIMULUS relies on the state machines underlying the requirements to allow simulation as, in fact, the approach's goal is to verify requirements and not to synthesize usable controllers.

REFERENCES

[1] EARS-CTRL GitHub project. https://github.com/saadbinabid1/EARS-CTRLReqAnalysis/.
[2] MATLAB Simulink. https://de.mathworks.com/products/simulink.html/.
[3] matlabcontrol Java API. https://code.google.com/archive/p/matlabcontrol/.
[4] Meta Programming System. https://www.jetbrains.com/mps/.
[5] C.-H. Cheng, E. Lee, and H. Ruess. autoCode4: Structural Reactive Synthesis. In TACAS, 2017. Tool available at: http://autocode4.sourceforge.net.
[6] S. Ghosh, D. Elenius, W. Li, P. Lincoln, N. Shankar, and W. Steiner. ARSENAL: Automatic Requirements Specification Extraction from NL. In NFM, 2016.
[7] B. Jeannet and F. Gaucher. Debugging Embedded Systems Requirements with STIMULUS: an Automotive Case-Study. In 8th European Congress on Embedded Real Time Software and Systems (ERTS 2016), Toulouse, France, Jan. 2016.
[8] H. Kress-Gazit, G. E. Fainekos, and G. J. Pappas. Translating Structured English to Robot Controllers. Advanced Robotics, 22(12):1343–1359, 2008.
[9] L. Lúcio, S. Rahman, C. Cheng, and A. Mavin. Just Formal Enough? Automated Analysis of EARS Requirements. In NASA Formal Methods (NFM 2017), Moffett Field, CA, USA, May 16-18, 2017.
[10] A. Mavin and P. Wilkinson. Big Ears (The Return of "Easy Approach to Requirements Engineering"). In RE. IEEE, 2010.
[11] A. Mavin, P. Wilkinson, S. Gregory, and E. Uusitalo. Listens Learned (8 Lessons Learned Applying EARS). In RE. IEEE, 2016.
[12] A. Mavin, P. Wilkinson, and M. Novak. Easy Approach to Requirements Syntax (EARS). In RE. IEEE, 2009.
[13] N. Piterman, A. Pnueli, and Y. Saar. Synthesis of Reactive(1) Designs. In VMCAI. Springer, 2006.
[14] S. Schewe and B. Finkbeiner. Bounded Synthesis. In ATVA. Springer, 2007.
[15] R. Yan, C. Cheng, and Y. Chai. Formal Consistency Checking Over Specifications in Natural Languages. In DATE, 2015.