Automated Specification-based Testing of Interactive Components with AsmL

Ana C. R. Paiva, João C. P. Faria, and Raul F. A. M. Vidal

All authors are with the Electrical Engineering Department, Engineering Faculty of Porto University. E-mail: apaiva@fe.up.pt; jpf@fe.up.pt; rmvidal@fe.up.pt.

Abstract — A promising approach to the testing of interactive components, supporting the automatic generation of test cases from a specification, is presented. The relevance and the difficulties (issues and challenges) of testing interactive components are presented first. It is then shown that a formal specification with certain characteristics allows the automatic generation of test cases while solving some of those issues. The approach is illustrated with an example of automatically testing the conformity between the implementation of a button in the .Net framework and a specification written in the AsmL language, using the AsmL Tester tool. The conclusion discusses the characteristics of the tool and gives directions for future work.

Index Terms — Formal Methods, Interactive Systems Testing.

1 INTRODUCTION

The development of high-quality interactive systems and applications is a difficult and time-consuming task, requiring expertise from diverse areas (software engineering, psychology). Current IDEs are not powerful enough to specify/model, build and test those systems in an effective way. The development of interactive systems and applications based on reusable interactive components is the key to achieving higher quality and productivity levels. Improving the quality of interactive components should have a major impact on the quality of the interactive systems and applications built from them, and should contribute to their increased reuse.

This paper presents a promising approach to the testing of interactive components. By interactive components we mean reusable controls, widgets or interactors, capable of both input from the user and output to the user, written in a general-purpose object-oriented language such as Java or C#. Interactive components range from the more basic ones (buttons, text boxes, combo boxes, list boxes, etc.) to the more sophisticated ones (calendars, data grids, interactive charts, etc.) built from simpler ones. The overhead incurred in testing reusable interactive components pays off because of their wider usage and longevity, when compared to special-purpose and short-lived "final" user interfaces.

The paper is organized as follows: the next section (section 2) presents some important issues and challenges of testing interactive components. Section 3 explains the type of test automation that is envisioned (automated specification-based testing), discusses the type of formal specification required, and discusses its costs and benefits. Section 4 presents an example of performing automated specification-based tests using the AsmL language and the AsmL Tester tool. Some conclusions and future work can be found in the final section.

2 ISSUES AND CHALLENGES OF TESTING INTERACTIVE COMPONENTS

Testing interactive components is particularly difficult because it shares and combines the issues and challenges of testing object-oriented systems [1], component-based systems [2], and interactive systems. Some of the main issues and challenges are identified and described next.

2.1 Complex Event-driven Behaviour

Interactive components (and interactive applications and systems in general) have complex event-driven behaviour, difficult to analyze and predict and, consequently, also difficult to test and debug. Even basic interactive components, such as buttons and text boxes, may react to and generate dozens of events. Most of us have already experienced "strange" behaviours (blocked interfaces, dirty displays, etc.), apparently at random, when using widespread interactive applications and systems. This should not be a surprise given their complex event-driven behaviour.

2.2 Highly-configurable (or Customizable) Behaviour

Reusable interactive components usually have highly-configurable (or customizable) behaviour. This can be done statically or dynamically by setting configuration properties or attributes, by adding event handlers, or by defining subclasses and overriding methods.
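To make the configurability issue concrete, the sketch below is a minimal event-driven component, written in Python rather than C# and with all names hypothetical (it is not the .Net Button API). It only illustrates why the configuration space explodes: every registered handler and every property change alters the observable behaviour.

```python
class SketchButton:
    """Minimal event-driven component (hypothetical; not the .Net Button API)."""

    def __init__(self, text=""):
        self.text = text
        self.enabled = True
        self.handlers = {"Click": []}  # event name -> registered handlers

    def add_handler(self, event, handler):
        # dynamic customization: each registration changes the behaviour
        self.handlers.setdefault(event, []).append(handler)

    def click(self):
        # raising an event = calling back every registered handler, in order
        if not self.enabled:
            return
        for handler in self.handlers["Click"]:
            handler(self)

log = []
button = SketchButton("OK")
button.add_handler("Click", lambda b: log.append("validate"))
button.add_handler("Click", lambda b: log.append("submit"))
button.click()           # notifies both handlers, in registration order
button.enabled = False
button.click()           # a configuration change silently alters behaviour
```

Even in this toy, the behaviour under test depends on which handlers were added, in what order, and on the current property values, which is exactly the combinatorial problem described above.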
Testing an interactive component in all the allowed configurations is almost impossible because of the huge set of possible configurations and the difficulty of predicting the customized behaviour.

QUATIC'2004 PROCEEDINGS

2.3 Multiple Interfaces

Interactive components have both a user interface (GUI) and an application interface (API). The application interface is used for customizing and composing them, and for linking them with the underlying application logic. Different kinds of inputs and outputs occur via these different interfaces. Adequate testing of an interactive component cannot look at just one of these interfaces in isolation; it has to take all these kinds of inputs and outputs into account in the definition of test cases and in test execution.

2.4 GUI Testing is Difficult to Automate

Automating the testing of graphical user interfaces poses well-known challenges:

1. How to properly simulate inputs from the user (mouse, keyboard and other higher-level events generated by the user)?
2. How to check the outputs to the user without excessive sensitivity to formatting and rendering details?

2.5 APIs with Callbacks and Reentrance

The designer of a reusable interactive component defines its methods but does not know in advance which kinds of applications will make use of them. Method calls between an interactive component and an application occur in both directions:

1. The application (or test driver) may call methods of the interactive component. From the testing perspective, inputs are the methods invoked with their parameters, while outputs are the values returned by those methods. This is the traditional situation in unit testing.
2. The interactive component may generate events (originated from the user or generated internally) that cause the invocation of methods in the application (or test stub), by some kind of callback mechanism (event handlers, or subclassing and method overriding). Again from the testing perspective, the outputs are the events and parameters passed to the application, while the inputs are the returned parameters.

Testing the second kind of interaction (callbacks) poses specific issues and challenges, as already noted in [3]:

1. An application method invoked in a callback may, in turn, invoke methods of the interactive component (a reentrancy situation) and access or change its intermediate state. Hence, the internal state of the interactive component when it issues a callback is not irrelevant. Moreover, some restrictions may have to be imposed on the state changes that an application may request when processing a callback.
2. During testing, one has to check that: (1) the appropriate callbacks are issued; (2) when a callback is issued, the interactive component is in the appropriate internal state; (3) during the processing of a callback, the application doesn't try to change the state of the interactive component in ways that are not allowed.

2.6 Operating System Interference

Interaction with the user is mediated by the operating system in non-trivial ways (often, several layers are involved), introducing more dimensions of configurability and complicating the analysis and prediction of a component's behaviour, as well as the testing and debugging tasks.

2.7 Insufficient Documentation

The documentation supplied with interactive components is usually scarce and not rigorous enough for more advanced uses, such as advanced customization and thorough testing. For example, from the documentation it is difficult to know precisely:

1. when events are signalled and in what order;
2. what the internal state of a component is when it signals an event;
3. what it is safe for an event handler to do;
4. what interactions exist between events.

This usually leads to a trial-and-error style of application programming and poor application quality, and also complicates the design of test cases.

2.8 Poor Testability

Testing of interactive components is usually difficult and time-consuming due to:

1. the lack of rigorous, unambiguous and comprehensive documentation;
2. the reduced observability (the capability to observe the internal state, the display produced, and the events raised);
3. the deficient controllability (the capability to simulate user input).

Some of the issues and challenges described in this section will be addressed by our testing approach and discussed in the next sections.

3 AUTOMATED SPECIFICATION-BASED TESTING

Manual testing of GUIs and interactive components is labour-intensive, frequently monotonous, time-consuming and costly. Among the reasons are the varied possibilities for user interaction, the large number of possible configurations for each component, and the other issues described in section 2, which make it impractical to satisfy adequate coverage criteria by manual testing. It is necessary to use some type of automation to perform those tests.

3.1 Degree of Automation Envisioned

The degree of automation we envision is the automatic generation of test cases (inputs and expected outputs) from a specification, and not just the type of automation provided by unit testing frameworks and tools, such as JUnit (www.junit.org) or NUnit (www.nunit.org), or the type of automation provided by capture and replay tools, such as WinRunner (www.mercure.com) and other tools (www.stlabs.com/marick/faqs/t-gui.htm).

Unit testing frameworks and tools are of great help in organizing and executing test cases, particularly for API testing, but not in generating test cases from a specification.

Capture and replay tools are probably the most popular tools for GUI testing, but they don't support the automatic generation of test cases. With these tools, it is possible to record the user's interactions with a graphical user interface (mouse input, keyboard input, etc.) and replay them later. Capture and replay tools are of great help in several scenarios, but they also have widely recognized limitations (see, e.g., the lesson "capture replay fails" in [4]). In this type of test automation, there is no guarantee of test coverage, and there is an excessive dependency on the "physical" details of the user interface.

From a higher perspective, these different approaches and types of automation are complementary rather than competing.
The automatic generation of test cases from a specification requires some sort of formal specification of the software to be tested, one that can be used to generate concrete input values and sequences, as well as the expected outputs (as a test oracle). It is possible to design test cases (for black-box testing) from informal specifications, but not in an automated way. At most, inputs can be generated automatically based on the signatures of methods and events (their calling syntax), but expected outputs can only be generated based on a formal specification of their semantics.

3.2 Type of Specification Needed

A popular type of specification of object-oriented systems is based on the principles of design by contract [5], by means of invariants and pre- and post-conditions, as found in Eiffel (www.eiffel.com), ContractJava [6] or JContract (www.parasoft.com). An invariant is a condition that restricts the valid states of an object, at least on the boundaries of method calls. A pre-condition of a method is a condition on the input parameters and the internal state of the object that should hold when the method is called. Conversely, a post-condition of a method is a condition on the input parameters, the initial state of the object (when the method is called), the final state of the object (when the method returns), and the value returned, that should hold at the end of the method's execution.

Although with limitations, some test tools, such as JTest (www.parasoft.com), have the capability of generating unit tests based on the specification of pre- and post-conditions. While pre- and post-conditions are a good means to restrict the allowed behaviours of an object, they are not adequate, in general, to fully specify its intended behaviour, particularly when callbacks are involved. As already noted by Szyperski in his book [3], the semantics of components that issue callbacks, as is the case of interactive components (see section 2), cannot be captured only by means of pre- and post-conditions (at least with the meaning presented above).

The Object Constraint Language (OCL) (see www.uml.org) goes a step further by allowing the specification, in the post-condition of a method, of the messages that must have been sent (method calls and the sending of signals) during its execution. However, in general, it is not possible to specify the order in which messages are sent or the state of the object when each message is sent (important because of reentrance, as explained in section 2). The definition of post-conditions in OCL has another advantage over their definition in Java or Eiffel: OCL is a higher-level formal language supporting formal reasoning and automation.

AsmL [7] (http://research.microsoft.com/fse/asml), a formal specification language developed at Microsoft Research and tightly integrated with the .Net framework, overcomes the limitations found in OCL by means of "model programs". A "model program" is an executable specification of a method. A model program may be organized as a sequence of steps. For example, if a method issues a callback in the middle of its execution, three steps should be defined: a first step to lead the object to the appropriate state before issuing the callback; a second step where the callback is issued; and a third step to lead the object to the appropriate final state and return. These steps facilitate the definition of restrictions on sequences of actions/events that are common in user interface modelling and are not easy to express using just post-conditions. Each step comprises one or more non-contradictory "model statements" that are executed simultaneously. Model statements are written in a high-level action language with primitives to create new objects, assign new values to the attributes of an object, and call other methods. Model programs may be used in combination with pre- and post-conditions, usually dispensing with the latter. Examples of specifications written in AsmL are presented in section 4.

3.3 Conformity Checks

With appropriate tool support (as is the case of the AsmL Tester tool), model programs can be used as executable specification oracles [1]. That is, the results and state changes produced by the execution of the model programs (executable specifications written in AsmL) can be compared with the results produced by the execution of the corresponding implementation under test (written, in this case, in any .Net compliant language). Any discrepancies found are reported by the tool. Mappings between the actions and states in the specification and in the implementation have to be defined, either explicitly or implicitly (based on name equality). Although this is not the only way of performing conformity checks between a specification and an implementation (see [8] for a discussion of other possible ways), it is a feasible way.

3.4 Finite State Machine Model and Test Case Generation

For the generation of test cases, the AsmL Tester tool first generates an FSM (Finite State Machine) from the AsmL specification, and then generates a test suite (with one or more test cases) from the FSM, according to criteria provided by the user. Since the number of possible object states (possible combinations of values of the instance variables) is usually huge, the states in the FSM are an abstraction of the possible object states, according to criteria provided by the user.

It is well known that state machine models are appropriate for describing the behaviour of interactive systems (and reactive systems in general), and a good basis for the generation of test cases, but usually there is not a good integration between the object model and the state machine model. AsmL and the AsmL Tester tool solve this problem by generating the FSM from the specification (the formal object model).
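The FSM extraction just described can be sketched as an exhaustive exploration of a model's abstract state space. The toy model below is hypothetical Python, far simpler than an AsmL model program: the abstract state is reduced to two flags, and an action whose pre-condition fails (returning `None`) simply contributes no transition.

```python
from collections import deque

# Abstract state = (mouse_down, key_down); each action returns the successor
# state, or None when its pre-condition fails in the current state.

def mouse_down(s):
    md, kd = s
    return (True, kd) if not md else None   # require: mouse currently up

def mouse_up(s):
    md, kd = s
    return (False, kd) if md else None      # require: mouse currently down

def key_down(s):
    md, kd = s
    return (md, True) if not kd else None   # require: key currently up

def key_up(s):
    md, kd = s
    return (md, False) if kd else None      # require: key currently down

actions = {"MouseDown": mouse_down, "MouseUp": mouse_up,
           "KeyDown": key_down, "KeyUp": key_up}

def build_fsm(start):
    """Breadth-first exploration of all reachable abstract states."""
    states, transitions = {start}, []
    frontier = deque([start])
    while frontier:
        s = frontier.popleft()
        for name, act in actions.items():
            t = act(s)                      # None = pre-condition fails
            if t is None:
                continue
            transitions.append((s, name, t))
            if t not in states:
                states.add(t)
                frontier.append(t)
    return states, transitions

states, transitions = build_fsm((False, False))
# this toy model yields 4 reachable states and 8 transitions
```

In the real tool the abstraction function and the action domains are supplied by the user, but the principle is the same: only the transitions whose pre-conditions hold appear in the generated FSM.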
3.5 Advantages of the Formal Specification of Interactive Components

When compared to other testing techniques, automated specification-based testing has the disadvantage of requiring a formal specification (to achieve a higher degree of automation). But the investment in the formal specification of reusable interactive components may be largely compensated by the multiple benefits it can bring:

1. Formal specifications and models are an excellent complement to informal specifications and documentation, because ambiguities are removed and inconsistencies are avoided.
2. Formal specifications allow the automation of specification-based testing, as described in this paper.
3. Besides being useful as the basis for the generation of test cases, FSMs can also be used to automatically prove required properties of a system, with model-checking tools that exhaustively search the state space. The properties are written in temporal logic. For example, Campos, in [9], uses model-checking tools to prove usability properties of user interfaces.
4. Desired properties of a system (with a finite or infinite state space) may be proved in a semi-automated way, given a formal specification or model of the system and a formal description of those properties [10].
5. Executable specifications (or models) of user interfaces and interactive systems may be used as fully functional prototypes. Problems in specification and design can be discovered and corrected before implementation begins.
6. In restricted domains, and with appropriate tool support (see for example [11]), formal specifications or models of user interfaces can be used as the basis for the automatic generation of an implementation on some target platform, according to refinement or translation rules. The generated implementations are correct by construction, and conformity tests are not needed.

Overall, higher rigour in the description and verification of interactive components is important to gain confidence in their correctness and to encourage their reuse [10].

4 EXAMPLE

4.1 Specification of the Button

The instance variables (Specification 1) model the internal state of the button, and correspond to properties existing in the Button class in the .Net framework. The instance variables MouseDownFlag and KeyDownFlag represent event flags that were added to check that appropriate sequences of mouse and keyboard events are received by a button (MouseUp after MouseDown, KeyUp after KeyDown). The instance variable ClickedFlag was added to tell whether the Click event was generated by the button in response to the last keyboard or mouse event received. This instance variable is reset each time an event is received (method ResetGeneratedEventFlags).

    class Button extends Object
        var Text as String
        var Enabled as Boolean
        var Focused as Boolean = false
        var MouseDownFlag as Boolean = false
        var KeyDownFlag as Boolean = false
        var ClickedFlag as Boolean = false

Specification 1 - Button instance variables.

Events are defined in a way similar to the .Net framework (Specification 2). For each event named EventName, there is an instance variable named EventNameHandlers that stores the event handlers registered with the event. Operations add and remove are responsible for registering and unregistering event handlers. In an AsmL specification, it is possible to use types defined in the .Net framework. That is the case of the type EventHandler in Specification 2, and of the types EventArgs, MouseEventArgs and KeyEventArgs in Specification 3.

Specification 2 – Button events. (Listing garbled in the source.)

Specification 3 – Button methods. (Listing garbled in the source.)

4.2 Extension of the Button Implementation Class for Testability

In order to overcome the limited testability of the Button class in the .Net framework, a TButton (Testable Button) class, inheriting from Button, was created in C# (Implementation 1). The event flag ClickedFlag is similar to the one used in the specification. This flag was made public, to facilitate the mapping of states between the specification and the implementation. Method OnClick overrides a protected method with the same name defined in the Button class, in order to manipulate the added event flag and record the occurrence of the Click event.

Implementation 1 – Implementation of a testable button class (TButton). (Listing garbled in the source.)

4.3 Specification and Implementation of a Test Container

In order for a user to interact with a button, it must be made visible by putting it inside some window or container. For this purpose, a class TBForm was created both at the specification level (Specification 4) and at the implementation level (Implementation 2). Due to limitations of the test tool, auxiliary methods TBMouseDown, TBMouseUp, TBKeyDown and TBKeyUp had to be created to simulate the user events that are sent to the button contained in the form. These methods were selected to trigger the transitions in the state transition diagram (Figure 1). Each test case will be constructed as a sequence of calls to these methods.

Specification 4 – Container class TBForm. (Listing garbled in the source.)

Implementation 2 – Container class TBForm. (Listing garbled in the source.)

Figure 1 – Finite State Machine (FSM) diagram.

4.4 Generation of the Finite State Machine Model

The AsmL Tester tool was used to automatically generate the finite state machine (FSM) from the AsmL specification, based on the following configuration information:

1. List of state variables – all the event flags defined in the specification of the Button class (MouseDownFlag, KeyDownFlag and ClickedFlag);
2. List of actions that trigger transitions – the constructor and the methods defined in the TBForm class (TBMouseDown, TBMouseUp, TBKeyDown and TBKeyUp);
3. Domains (values to consider) for the previous state variables and for the actions' arguments. For each state variable the domain is {true, false}. In the case of the mouse actions, a single argument was provided, specifying the left mouse button and a position inside the form button. In the case of the keyboard actions, a single argument was provided, specifying the 'A' key.

The FSM obtained is shown in Figure 1. Each state corresponds to a combination of values of the state variables. Apart from the start state (greyed), the FSM has 6 states. This means that two of the possible combinations of values of the three state variables cannot occur. The possible transitions departing from each state are constrained by the methods' pre-conditions (require clause).

4.5 Definition of Mappings between the Specification and the Implementation

To perform conformity tests, it is necessary to define mappings (conformance relations) between specification and implementation methods and data (state) [8]. These relations can be established manually or in an automated way. A relation was defined between the classes Button (specification) and TButton (implementation). After this, the methods with the same names and arguments in both classes are automatically related. In the current version of the test tool, data relations have to be defined manually. In this example, only the ClickedFlag instance variable was mapped, as shown in Figure 2. Conformance tests will execute the related methods at both levels (specification and implementation), compare the results obtained from both, and also compare the related data.

Figure 2 – Conformance relations configuration.
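A lock-step version of this conformance check can be sketched in Python. The classes below are hypothetical stand-ins for the AsmL model and the .Net implementation, and the seeded bug mimics the discrepancy discussed in section 4.7: the same action is executed at both levels, and the mapped state variable is then compared.

```python
class ModelButton:
    """Stands in for the AsmL specification (hypothetical)."""
    def __init__(self):
        self.clicked_flag = False
    def key_up(self, key):
        # per the documented behaviour, only the spacebar generates Click
        self.clicked_flag = (key == " ")

class ImplButton:
    """Stands in for the .Net implementation under test (hypothetical)."""
    def __init__(self):
        self.clicked_flag = False
    def key_up(self, key):
        # seeded bug: any key generates Click
        self.clicked_flag = True

def conformance_step(model, impl, action, arg):
    """Run one related action at both levels, then compare the mapped data."""
    getattr(model, action)(arg)
    getattr(impl, action)(arg)   # methods related by name equality
    if model.clicked_flag != impl.clicked_flag:
        return ("inconsistency after %s(%r): spec=%s impl=%s"
                % (action, arg, model.clicked_flag, impl.clicked_flag))
    return None

error = conformance_step(ModelButton(), ImplButton(), "key_up", "A")
# error describes the discrepancy: spec says False, impl says True
```

A real conformance run iterates this step over a whole generated test sequence and stops at the first reported inconsistency, as the AsmL Tester tool does.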
4.6 Generation of the Test Suite

After generating the FSM, it is possible to generate a test suite automatically (Figure 3), based on coverage criteria selected by the user. In this case, the default criterion, which ensures full coverage of states and transitions, was used. A test suite with a single test case (sequence) was sufficient to cover all the transitions. This test suite will be used as the input to conformance testing.

Figure 3 – Test suite generated.
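One way to obtain such a single covering sequence is sketched below. The FSM here is a hypothetical four-state stand-in for the one in Figure 1, not the tool's actual algorithm: starting from the initial state, the walk is repeatedly extended with the shortest path that ends in a not-yet-covered transition, until every transition has been traversed.

```python
from collections import deque

# (source state, action, target state) - a hypothetical four-state FSM
transitions = [
    ("idle", "MouseDown", "m"), ("m", "MouseUp", "idle"),
    ("idle", "KeyDown", "k"),   ("k", "KeyUp", "idle"),
    ("m", "KeyDown", "mk"),     ("mk", "KeyUp", "m"),
    ("k", "MouseDown", "mk"),   ("mk", "MouseUp", "k"),
]

def covering_sequence(start):
    """One action sequence traversing every transition at least once."""
    uncovered = set(transitions)
    seq, state = [], start
    while uncovered:
        # BFS for the shortest path whose last step is an uncovered transition
        frontier, seen, path = deque([(state, [])]), {state}, None
        while frontier:
            s, p = frontier.popleft()
            for t in transitions:
                if t[0] != s:
                    continue
                if t in uncovered:
                    path = p + [t]
                    frontier.clear()
                    break
                if t[2] not in seen:
                    seen.add(t[2])
                    frontier.append((t[2], p + [t]))
        for t in path:          # extend the walk and mark coverage
            seq.append(t[1])
            uncovered.discard(t)
            state = t[2]
    return seq

seq = covering_sequence("idle")
# seq is a single action sequence covering all 8 transitions
```

For a strongly connected FSM like this one, the result is one test case (a sequence of action names) achieving full transition coverage, analogous to the single-sequence suite reported above.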
4.7 Test Execution and Results

As soon as the conformity relations are defined and the FSM and the test suite are generated, it is possible to execute the conformance tests. Every time there is an inconsistency, the tool stops and reports the error.

The tool reports a conformance error when the sequence of events MouseDown, KeyDown, and KeyUp is executed (Figure 4), with key 'A'. The error is an inconsistency between the value of the ClickedFlag variable at the implementation level (where the value is true) and at the specification level (where the value is false). This means that the implementation (the Button class in the .Net framework) generates a Click event when it receives from the user the sequence of events MouseDown, KeyDown, and KeyUp. According to the documentation of the .Net framework, this should only happen when the key pressed is the spacebar (which is not the case here).

To reproduce this abnormal behaviour manually, it is necessary to press the left mouse button over a .Net button, and then press and release a keyboard key without releasing the mouse button. This will have the effect of selecting the button and executing the action associated with it. According to the documentation, this should only happen with the spacebar key.

Figure 4 – Conformance test inconsistency (the path to the error is shown in red).

5 CONCLUSION

An approach to the testing of interactive components, with the automatic generation of test cases from a specification, was described. In comparison with others, the approach presented in this paper requires a formal specification, with demonstrated benefits in the development and verification of interactive components. In the past, formal specification and verification techniques have been used mainly in the development of critical systems but, from our point of view, they also have a major role to play in the development and verification of reusable components, as is the case of interactive components.

An example was presented of automatically testing the conformity between the implementation of a button in the .Net framework and a specification written in the AsmL language, using the AsmL Tester tool. Some test code was needed to overcome testability limitations of the target code. Although only a small part of the behaviour of a button was specified and tested, the tests were successful, that is, a bug was detected. A larger example could have been used, since the approach scales easily, but it would be difficult to explain such an example in a few pages.

However, in its current state, the AsmL Tester tool also has some limitations:

1. It still requires too much user intervention.
2. While the tight integration with the .Net framework has some advantages, one of its shortcomings is that the level of abstraction of the specification is not as high as it should be.
3. Interactive components can have lots of states and actions or events, which can be hard to manipulate and test. The AsmL Tester tool allows the selection of which actions should appear in the FSM diagram (and in the test cases generated from the FSM). Consequently, it is possible to test parts of the behaviour of the object or component under test separately. But a rigorous method is needed to define those parts and to "sum" the results obtained for each part in order to draw coverage conclusions.

The approach presented in this paper has to be extended and matured in several directions:

1. Use the approach in larger examples.
2. Explore other ways to generate test cases from the FSM model – some criteria to generate specification-based tests can be found in [12].
3. Define additional check points – for instance, when a callback is issued and on return.
4. Model checking – integrate the approach with model-checking techniques to prove properties about the model.
5. Verification of the user interface contract – particularly challenging is the problem of checking that the outputs sent by an interactive component to the user obey some kind of specification or contract. For example, the user interface contract of a textbox is to allow the user to insert and visualize a string through a small window.

The above points, and possibly others, will be the subject of future work.

ACKNOWLEDGMENT

The authors wish to thank the anonymous reviewers for their comments and suggestions.

REFERENCES

[1] R. V. Binder, Testing Object-Oriented Systems: Models, Patterns and Tools. Addison-Wesley, 2000.
[2] J. Z. Gao, H.-S. J. Tsao, and Y. Wu, Testing and Quality Assurance for Component-Based Software. Artech House Publishers, 2003.
[3] C. Szyperski, Component Software: Beyond Object-Oriented Programming. Addison-Wesley, 1999.
[4] C. Kaner, J. Bach, and B. Pettichord, Lessons Learned in Software Testing: A Context-Driven Approach. John Wiley & Sons, 2002.
[5] B. Meyer, "Applying Design by Contract," IEEE Computer, pp. 40-51, 1992.
[6] F. Findler, "Contract Soundness for Object-Oriented Languages," presented at Object-Oriented Programming Systems, Languages and Applications (OOPSLA), 2001.
[7] Microsoft, "Introducing AsmL: A Tutorial for the Abstract State Machine Language," Foundations of Software Research, 2002.
[8] A. C. Paiva, J. P. Faria, and R. M. Vidal, "Specification-based Testing of User Interfaces," presented at the 10th DSV-IS Workshop – Design, Specification and Verification of Interactive Systems, Funchal, Madeira, 2003.
[9] J. Campos and M. D. Harrison, "Model Checking Interactor Specifications," in Automated Software Engineering, vol. 8, 2001.
[10] I. MacColl and D. Carrington, "User Interface Correctness," presented at Human Computer Interaction – HCI'97, 1997.
[11] M. D. Lozano, "Entorno Metodológico Orientado a Objectos para la Especificación y Desarrollo de Interfaces de Usuario," in Sistemas Informáticos y Computación. Valencia: Universidad Politécnica de Valencia, 2001.
[12] J. Offutt, S. Liu, A. Abdurazik, and P. Ammann, "Generating test data from state-based specifications," Software Testing, Verification and Reliability, vol. 13, pp. 25-53, 2003.

Ana C. R. Paiva received an M.Sc. degree in Electrical and Computer Engineering from the Engineering Faculty of Porto University (FEUP) and a degree in Information Systems Engineering from Minho University, Portugal, in 1997 and 1995 respectively. She is currently pursuing her doctorate in formal methods applied to user interfaces at FEUP, Electrical and Computer Engineering Department, where she has been an Assistant Lecturer since 1999.

João C. P. Faria received a Ph.D. in Electrical and Computer Engineering from the Engineering Faculty of Porto University (FEUP) in 1999, and a degree in Electrical Engineering from FEUP in 1985. He is an Assistant Professor at FEUP, Electrical and Computer Engineering Department, Informatics.

Raul F. A. M. Vidal received a Ph.D. in Digital Electronics from UMIST in 1978, an M.Sc. in Communication Engineering from UMIST in 1974 and a degree in Electrical Engineering from the Engineering Faculty of Porto University (FEUP) in 1972. He is an Associate Professor at FEUP, Electrical and Computer Engineering Department, Informatics.