<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta />
    <article-meta>
      <title-group>
        <article-title>Process Engine Benchmarking with Betsy - Current Status and Future Directions</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Matthias Geiger</string-name>
          <email>matthias.geiger@uni-bamberg.de</email>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Simon Harrer</string-name>
          <email>simon.harrer@uni-bamberg.de</email>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Jörg Lenhard</string-name>
          <email>joerg.lenhard@uni-bamberg.de</email>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <aff id="aff0">
          <label>0</label>
          <institution>Distributed Systems Group, University of Bamberg</institution>
          ,
          <country country="DE">Germany</country>
        </aff>
      </contrib-group>
      <fpage>37</fpage>
      <lpage>44</lpage>
      <abstract>
<p>Business process management and automation has been the focus of intense research for a long time. Today, a plethora of process languages for specifying and implementing process models has evolved. Examples of such languages are established international standards, such as the Web Services Business Process Execution Language 2.0 or, more recently, the Business Process Model and Notation 2.0. Implementations of these standards which are able to execute models, so-called process engines, differ in the quality of service they provide, e.g., in performance or usability, but also in the degree to which they actually implement a given standard. Selecting the “best” engine for a particular usage scenario is hard, as none of the existing process standards features an objective certification process to assess the quality of its implementations. To fill this gap, we present our work on process engine benchmarking. We discuss what has been achieved so far and point out future directions that deserve further investigation.</p>
      </abstract>
      <kwd-group>
        <kwd>business process management</kwd>
        <kwd>process engine</kwd>
        <kwd>BPEL</kwd>
        <kwd>BPMN</kwd>
        <kwd>benchmarking</kwd>
      </kwd-group>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>Introduction</title>
      <p>
        The field of business process management (BPM) forms an umbrella for a
variety of research areas, ranging from managerial challenges to application
engineering [
        <xref ref-type="bibr" rid="ref1">1</xref>
        ]. Among these fields are the modeling and automation of processes
using process-aware technologies specifically dedicated to this task. This has led
to the development of a multiplicity of process languages and standards [
        <xref ref-type="bibr" rid="ref21">21</xref>
        ] that
can be used for specifying process models. A subset of these languages allows
the specification of models that are intended for execution in dedicated runtime
environments, called process engines. Typically, multiple alternative engines are
available for a given process language. Users of a language can implement process
models as defined in the language specification, but then have to select the
best-fitting engine for execution. Naturally, a variety of properties can form the
basis for this selection, such as pricing, performance, usability, or actual language
support.
      </p>
      <p>
        The problem in this setting is that it is hard for a potential user to meaningfully
judge these properties for a given set of engines, due to the inherent complexity of
such software products. In general, this selection problem is not new, and exists
in similar fashion for any sufficiently sophisticated software tooling or technology,
such as application servers or ERP systems. To support such a decision, a
plethora of methods is available [
        <xref ref-type="bibr" rid="ref29">29</xref>
        ], one being the analytic hierarchy process
(AHP) [
        <xref ref-type="bibr" rid="ref25">25</xref>
        ]. But to apply these methods, the properties of the different alternatives
need to be known. One technique to reveal these properties is benchmarking [
        <xref ref-type="bibr" rid="ref26">26</xref>
        ],
which in this case amounts to process engine benchmarking. Enabling the
benchmarking of state-of-the-art engines for widely used process standards and
for a comprehensive set of quality properties is the long-term goal of our work.
To this end, we are developing the BPEL/BPMN engine test system (betsy), which
implements a comprehensive benchmark for process engines. Betsy has been under
development for more than three years and, by now, more than a dozen engines in
a variety of revisions are integrated into a fully automated and reproducible
benchmarking process.
      </p>
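<p>To illustrate how a selection method such as AHP consumes benchmark results, the following sketch computes an AHP ranking of two engines from pairwise comparison matrices, using the common row-geometric-mean approximation of the priority vector. All numbers, criteria, and engine indices are purely hypothetical and serve only as an illustration:</p>

```python
from math import prod

def priorities(matrix):
    """Approximate the AHP priority vector via row geometric means."""
    n = len(matrix)
    gmeans = [prod(row) ** (1.0 / n) for row in matrix]
    total = sum(gmeans)
    return [g / total for g in gmeans]

# Pairwise comparison of two criteria: conformance is judged 3x as
# important as performance (illustrative judgment, not a measured value).
criterion_weights = priorities([[1, 3],
                                [1 / 3, 1]])

# Pairwise comparisons of two hypothetical engines per criterion,
# e.g., derived from benchmark results.
engine_scores = {
    "conformance": priorities([[1, 5], [1 / 5, 1]]),
    "performance": priorities([[1, 1 / 2], [2, 1]]),
}

# Aggregate: weighted sum of per-criterion priorities for each engine.
overall = [sum(w * engine_scores[c][i]
               for w, c in zip(criterion_weights, engine_scores))
           for i in range(2)]
```

<p>In practice, the pairwise judgments per criterion would be derived from measured benchmark results rather than set by hand.</p>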
      <p>In this paper, we briefly discuss related approaches for process engine
benchmarking in Sect. 2. Next, we detail the current status of betsy and how it has
evolved since its first public release in 2012 in Sect. 3 and, in Sect. 4, how we plan
to evolve betsy even further in the future. The paper is summed up in Sect. 5.</p>
    </sec>
    <sec id="sec-2">
      <title>Related Work</title>
      <p>
        Benchmarking of IT products is not a new phenomenon, and a large body of
related work already exists on this topic (e.g., [
        <xref ref-type="bibr" rid="ref11 ref18 ref27 ref5">5, 11, 18, 27</xref>
        ]).
Particularly interesting is [
        <xref ref-type="bibr" rid="ref18">18</xref>
        ], which defines general requirements that a valid, “good” benchmark has
to fulfill: above all, a benchmark should measure relevant aspects in order to give
substantial answers to the investigated research questions. Benchmarking workflow
engines is such a relevant topic, as there are no certification authorities that
check claimed compliance. Hence, each vendor can claim that its product is
BPMN [
        <xref ref-type="bibr" rid="ref19">19</xref>
        ] or BPEL 2.0 [
        <xref ref-type="bibr" rid="ref22">22</xref>
        ] conformant without
having to prove it. Moreover, other aspects are relevant for users of BPM
products, such as ease of installation, portability, and conformance to static
analysis rules, which can likewise be compared across products.
However, [
        <xref ref-type="bibr" rid="ref18">18</xref>
        ] lists further requirements for benchmarks that may conflict with
relevance: a benchmark should also be repeatable, fair, verifiable, and economical.
As betsy focuses on standard conformance testing, these four requirements are
fulfilled: betsy is open source and fully automated, which allows for repeated
test execution. Moreover, most tested engines are freely available and directly
integrated into our approach, which allows every interested party to execute
the tests on standard developer hardware without economic barriers. As the
standard documents define all relevant aspects to be fulfilled by the
implementing engines, and we build upon the same documents, betsy does not give
an advantage to any engine, but is fair. Due to the openness of the standard
texts and of our implementation, the correctness of betsy is open to scrutiny,
fostering verifiability. The tool itself is available at
https://github.com/uniba-dsg/betsy.
      </p>
      <p>
        Apart from those general works, there are some approaches regarding process
engine benchmarking, which are more closely related to our work. In [
        <xref ref-type="bibr" rid="ref4">4</xref>
        ], BPEL 2.0
engines are assessed regarding their performance using the SOABench testbed.
Another approach dedicated to performance benchmarking of workflow engines is
the BenchFlow project (see
http://www.iaas.uni-stuttgart.de/forschung/projects/benchflow.php), which
focuses on benchmarking BPMN 2.0 engines [
        <xref ref-type="bibr" rid="ref23 ref28 ref8">8, 23,
28</xref>
        ]. Their latest work [
        <xref ref-type="bibr" rid="ref8">8</xref>
        ] evaluates the performance of two anonymized open source
BPMN 2.0 engines within a container-based environment. By using container-based
environments, the authors follow the recommended approach for achieving
reproducible research and benchmarks [
        <xref ref-type="bibr" rid="ref5">5</xref>
        ]. Directly reusing the concepts and
artifacts generated by those two approaches is not useful within the scope of our
tests, as measuring performance requires a far more complex infrastructure beyond
the actual engines under test, both to generate sensible workloads and to ensure
the validity of the results [
        <xref ref-type="bibr" rid="ref18">18</xref>
        ]. Our tool betsy should be able to reproduce results
without economic and technological barriers, i.e., it should be executable on
standard developer machines without any complex installation and configuration
steps. However, as both approaches automatically execute tests on workflow
engines, at least their usage of virtualization techniques, such as virtual
machines (e.g., with Oracle VirtualBox, https://www.virtualbox.org/) or
containers (e.g., with Docker, https://www.docker.com/), to store and restore
working engine installations is also relevant for our work.
      </p>
      <p>
        A third notable approach [
        <xref ref-type="bibr" rid="ref7">7</xref>
        ] presents a method to evaluate BPM systems
(BPMS) with the aim of selecting the best-fitting BPMS for a list of requirements.
In a series of case studies, the authors evaluate a large list of open source and
proprietary BPMS implementing three different process languages (the XML
Process Definition Language (XPDL) 2.2 [
        <xref ref-type="bibr" rid="ref31">31</xref>
        ], BPEL 2.0 [
        <xref ref-type="bibr" rid="ref22">22</xref>
        ] and BPMN 2.0 [
        <xref ref-type="bibr" rid="ref19">19</xref>
        ]).
In contrast to our work, this evaluation is on a more abstract level, and the actual
engine evaluation is not automated.
      </p>
    </sec>
    <sec id="sec-3">
      <title>The Current State of betsy</title>
      <p>Betsy 2.1.0, the most recent version, was published on September 29, 2015
(see https://github.com/uniba-dsg/betsy/releases for all releases). The tool is
freely available and licensed under the LGPL v3. Currently, it is capable of
benchmarking three BPMN engines in thirteen different versions with 135 tests,
and seven BPEL engines in 16 different versions, two of them also in an
in-memory configuration, with 1110 tests.</p>
      <p>
        The current state of betsy can be described according to four dimensions: 1)
process languages, 2) process engine capabilities, 3) process engine types and
4) process engine environments. The dimension process language is reflected
in the betsy acronym. Although the acronym never changed, its meaning has
evolved from BPEL engine test system to BPEL/BPMN engine test system, since
betsy is able to evaluate process engines implementing the process languages
BPEL [
        <xref ref-type="bibr" rid="ref22">22</xref>
        ] or BPMN [
        <xref ref-type="bibr" rid="ref19">19</xref>
        ]. The dimension process engine capabilities describes
the features of the process engine that are tested. At first, betsy was used to
evaluate feature conformance, but over time it was extended to assess the static
analysis conformance, expressiveness, robustness, and installability of process
engines. The third dimension, process engine types, captures which type of
process engine is put under scrutiny, i.e., an open source or a proprietary
engine. The last dimension, process engine environments, refers to the ability
to benchmark process engines in a bare-metal environment or in a virtual
environment, such as a virtual machine or a container.
      </p>
    </sec>
    <sec id="sec-4">
      <title>Future Directions</title>
      <p>
        To support a more meaningful selection of process engines, we aim to extend
betsy into a process engine benchmarking platform, making it faster, more
flexible, more powerful, and extensible. Our plans are detailed along the four
dimensions.
Dimension process language: The field of process standards is vast [
        <xref ref-type="bibr" rid="ref21">21</xref>
        ] and in
constant evolution. The relevance of a process engine benchmarking system
depends on the relevance of the languages it supports. Currently, betsy
supports BPEL 2.0 [
        <xref ref-type="bibr" rid="ref22">22</xref>
        ] and BPMN 2.0 [
        <xref ref-type="bibr" rid="ref19">19</xref>
        ]. Arguably, these two languages
are sufficient at the moment, since there is no competing standard that
equally targets process engine execution. XPDL [
        <xref ref-type="bibr" rid="ref31">31</xref>
        ] is also a process
standard that allows for the specification of executable process models, but it is
primarily meant as an interchange format. Although it is used as execution
format in some engines, it is expected to be replaced for this purpose by
BPMN 2.0 [
        <xref ref-type="bibr" rid="ref6">6</xref>
        ]. Therefore, there is no reason to include XPDL in the
benchmark directly. Furthermore, academic process languages, such
as Yet Another Workflow Language (YAWL) [
        <xref ref-type="bibr" rid="ref2">2</xref>
        ], do exist. However, YAWL
is neither standardized, nor do competing implementations of it exist apart
from the reference implementation. As a result, there is no selection or
comparison problem, and no reason to consider the language.
      </p>
      <p>
        Dimension process engine capability: For BPEL 2.0 engines, betsy already covers
a large variety of engine capabilities [
        <xref ref-type="bibr" rid="ref12 ref13 ref15 ref16 ref17 ref20">12, 13, 15–17, 20</xref>
        ]. With the emergence
of BPMN 2.0, we have started to benchmark the feature conformance of
BPMN 2.0 engines as well [
        <xref ref-type="bibr" rid="ref9">9</xref>
        ]. Our current goal is to fill the remaining gaps by
benchmarking BPMN 2.0 engines for the same set of capabilities that we
benchmarked for BPEL 2.0 engines, including static analysis conformance,
expressiveness, installability, and robustness. The challenge here is how the
BPEL 2.0 benchmarks can be ported to BPMN 2.0, effectively reusing the
benchmarks to some extent. An interesting aspect is static analysis
conformance, i.e., whether the engines perform the static analysis of models as
defined in the specification. Whereas the BPEL standard [
        <xref ref-type="bibr" rid="ref22">22</xref>
        ] directly
lists the relevant static analysis checks, this is not the case for BPMN. As
shown in preliminary work [
        <xref ref-type="bibr" rid="ref10">10</xref>
        ], this raises issues for BPMN modeling tools,
which are also to be expected for BPMN engines.
      </p>
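<p>As a rough illustration of the intended reuse, a conformance test case can be represented language-agnostically, so that the inputs and expected observable behavior of a BPEL 2.0 test carry over to a BPMN 2.0 model of the same feature. All names and structures below are a hypothetical sketch, not betsy's actual API:</p>

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class ConformanceTest:
    """A language-agnostic conformance test case (hypothetical structure)."""
    language: str          # "BPEL" or "BPMN"
    feature: str           # feature under test, e.g., "sequence"
    model: str             # path to the process model under test
    inputs: tuple          # messages/variables fed to the started instance
    expected_trace: tuple  # observable behavior the engine must produce

def port_to_bpmn(test: ConformanceTest, bpmn_model: str) -> ConformanceTest:
    """Reuse the inputs and expected behavior of a BPEL test for a BPMN model."""
    return ConformanceTest("BPMN", test.feature, bpmn_model,
                           test.inputs, test.expected_trace)

# A BPEL sequence test and its ported BPMN counterpart (illustrative paths).
seq_bpel = ConformanceTest("BPEL", "sequence", "models/Sequence.bpel",
                           (1,), ("started", "task-a", "task-b", "completed"))
seq_bpmn = port_to_bpmn(seq_bpel, "models/Sequence.bpmn")
```

<p>Only the process model is language-specific; the test logic around it stays shared, which is the extent of reuse aimed at above.</p>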
      <p>
        In addition, it would be desirable to increase the set of already covered
engine capabilities by also benchmarking performance. Performance has
always been an important criterion for software selection and evaluation [
        <xref ref-type="bibr" rid="ref30">30</xref>
        ].
In a preliminary work, we evaluated existing benchmarking approaches of
BPEL 2.0 engines [
        <xref ref-type="bibr" rid="ref24">24</xref>
        ] and revealed that most of them test a very small
number of engines, use a limited workload model, and focus mostly on one
or two metrics. Moreover, as stated in [
        <xref ref-type="bibr" rid="ref24">24</xref>
        ] for BPEL, additional challenges
arise as the process engines do not support the same set of features. The same
holds true for BPMN engines as well. Hence, either the benchmark’s workload
can only be executed on a few engines or it must be reduced to using only the
features that all engines support. Apart from extending betsy, our current
results can be used to improve the related work presented in Sect. 2: The
conformance results of betsy can be used to determine a sensible workload
leading to a benchmark which produces fair and reproducible results for all or
at least the most important engines. What is more, existing test suites, e.g.,
for the control-flow patterns, can be used as workloads for micro-performance
benchmarks. Thus, this area calls for further investigation.
      </p>
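<p>The idea of deriving a sensible workload from conformance results can be sketched as a simple set intersection: only features that all (or all selected) engines support enter the performance workload, so every engine can execute it. The engine names and feature sets below are purely illustrative:</p>

```python
# Hypothetical conformance results: per engine, the set of features that
# the conformance benchmark found to be supported.
conformance = {
    "engineA": {"sequence", "parallelGateway", "exclusiveGateway", "timer"},
    "engineB": {"sequence", "exclusiveGateway", "timer"},
    "engineC": {"sequence", "exclusiveGateway", "subProcess"},
}

def fair_workload(results):
    """Features every engine supports: a workload all engines can execute."""
    return set.intersection(*results.values())

def workload_for(results, engines):
    """Workload restricted to a chosen subset of (important) engines."""
    return set.intersection(*(results[e] for e in engines))

common = fair_workload(conformance)
ab_only = workload_for(conformance, ["engineA", "engineB"])
```

<p>Restricting the workload this way trades coverage for fairness; the alternative, running the full workload, excludes engines that lack some features.</p>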
      <p>
        Dimension process engine type: The market of process engines can currently be
separated into proprietary and open source engines. In academic research,
the usage of open source tooling is much more common, due to more permissive
access that does not involve costs. As a result, most analyses of
process engines focus primarily on open source engines, e.g., [
        <xref ref-type="bibr" rid="ref12 ref15 ref16 ref17 ref20 ref9">9, 12, 15–17, 20</xref>
        ].
In contrast, work that explicitly compares these two types of process engines
is rare, e.g., [
        <xref ref-type="bibr" rid="ref13">13</xref>
        ]. This is problematic, since, to the best of our knowledge,
there is no indication that the usage of process engines is dominated by open
source solutions. Instead, there are plenty of proprietary engines available,
including products by large multi-national enterprises with a huge customer
base world-wide. A blind spot regarding the evaluation of proprietary engines
in research is problematic, as, potentially, the quality of such engines might
be vastly different. An omission of these tools could result in wrong and
unfounded conclusions that are not generalizable. This danger is especially
valid for practical studies or case studies that depend on particular engines.
It is our intention to extend betsy to support the benchmarking of more
proprietary engines. This is most important for BPMN engines, where no
proprietary implementations are supported so far. The biggest obstacle in
this endeavor is the licensing strategy of many vendors. Pseudonymization of
research results, as used in [
        <xref ref-type="bibr" rid="ref13">13</xref>
        ], is a way to ease these restrictions, provided academic
licenses are available, but this is not always the case. By working together
with the vendors, we see a possibility to publish the results nonetheless.
What also complicates benchmarking proprietary engines is that most
proprietary tools are not simple BPMN engines but full-fledged BPM suites.
This heavily affects both the installation and startup procedures, which are
complex and take a long time. We already provide an approach that uses virtual
machines with snapshots to easily restore a started process engine within a
virtual machine [
        <xref ref-type="bibr" rid="ref17">17</xref>
        ]. Currently, this is quite cumbersome to use. Therefore,
we aim to replace it with Docker and its light-weight containers, as the Docker
developers are working on including similar snapshot functionality as well.
Each engine, be it open source or proprietary, has to fulfill certain criteria
so that it can be tested by betsy. For BPEL 2.0, we already created an API
to handle engines uniformly in [
        <xref ref-type="bibr" rid="ref14">14</xref>
        ], making it easier to add new engines or
new versions of existing engines. In the future, we plan to extend this API
to include BPMN 2.0 engines as well. This is especially important for the
proprietary engines, as they have more complex APIs, resulting in a higher
entry barrier to actually benchmarking them.
      </p>
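<p>The idea of a uniform engine API can be sketched as a small lifecycle interface that the benchmarking harness programs against, regardless of whether the concrete adapter wraps an open source engine or a proprietary BPM suite. The interface and names below are a hypothetical illustration, not the actual API from our prior work:</p>

```python
from abc import ABC, abstractmethod

class Engine(ABC):
    """Hypothetical uniform lifecycle API for an engine under test."""

    @abstractmethod
    def install(self): ...
    @abstractmethod
    def start(self): ...
    @abstractmethod
    def deploy(self, process_model: str): ...
    @abstractmethod
    def stop(self): ...

class RecordingEngine(Engine):
    """Toy adapter that records lifecycle calls, standing in for a real
    open source or proprietary engine adapter."""
    def __init__(self):
        self.log = []
    def install(self):
        self.log.append("install")
    def start(self):
        self.log.append("start")
    def deploy(self, process_model):
        self.log.append(f"deploy:{process_model}")
    def stop(self):
        self.log.append("stop")

def run_benchmark(engine: Engine, models):
    """The harness sees only the uniform API, regardless of engine type."""
    engine.install()
    engine.start()
    for m in models:
        engine.deploy(m)
    engine.stop()

engine = RecordingEngine()
run_benchmark(engine, ["Sequence.bpmn"])
```

<p>A new engine or engine version is then integrated by writing one adapter behind this interface, which is what lowers the entry barrier mentioned above.</p>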
      <p>
        Dimension process engine environment: For reproducible research and
reproducible benchmarks alike, it is paramount that results are correct and their
computation is repeatable [
        <xref ref-type="bibr" rid="ref5">5</xref>
        ]. Currently, we use a fresh engine installation for
every test, ensuring test isolation and an absence of side-effects. Furthermore,
betsy is fully automated and therefore provides repeatable results. Again, the
usage of container technology is promising for achieving an even higher degree of
isolation by fixing the benchmark environment, which makes it easier to repeat
the benchmark.
      </p>
      <p>
        What is more, we showed in [
        <xref ref-type="bibr" rid="ref17">17</xref>
        ] that virtualization helps to circumvent the
installation and startup times of the engines, drastically reducing the time to
compute the benchmark results and thus leading to a significantly lower
turnaround time [
        <xref ref-type="bibr" rid="ref3">3</xref>
        ]. This helps to integrate our benchmark into contemporary
continuous integration infrastructures, which can be used by engine vendors
to improve the quality of their implementations. To reduce the execution
time even further, we suggest cutting down unnecessary waiting time by
calibrating the timeouts required during testing to better match the actual
system performance. Parallel and distributed test execution also forms a
promising area of future work.
      </p>
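<p>Timeout calibration can be sketched as follows: time a cheap, engine-independent probe operation on the benchmarking machine, compare it to a measurement taken on a reference machine, and scale all configured timeouts by that factor. All numbers and names below are illustrative assumptions, not values from betsy:</p>

```python
import time

# Seconds the probe took on the (assumed) reference machine.
REFERENCE_BASELINE = 0.10

def measure_baseline(probe, runs=3):
    """Time a cheap, engine-independent probe operation on this machine."""
    start = time.perf_counter()
    for _ in range(runs):
        probe()
    return (time.perf_counter() - start) / runs

def calibrate(timeouts, baseline, reference=REFERENCE_BASELINE):
    """Scale configured timeouts by how much slower or faster this
    machine is compared to the reference machine."""
    factor = max(baseline / reference, 0.1)  # never shrink below 10%
    return {name: t * factor for name, t in timeouts.items()}

# Hypothetical timeouts (seconds) used while waiting on an engine.
timeouts = {"deploy": 30.0, "startup": 120.0}
# On a machine measured to be twice as fast, all timeouts are halved.
calibrated = calibrate(timeouts, baseline=0.05)
```

<p>A faster machine thus waits less, while a slower machine gets proportionally longer timeouts instead of spurious test failures.</p>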
    </sec>
    <sec id="sec-5">
      <title>Conclusion</title>
      <p>In this paper, we have presented a roadmap for process engine benchmarking using
the betsy system. We delineated important dimensions for engine benchmarking
and outlined what has been achieved so far in these dimensions with betsy. This
identifies gaps in current work and outlines potential areas for future work in the
area of process engine benchmarking, including a) putting more focus on testing
proprietary engines, b) porting benchmarks for BPEL to BPMN engines, and c)
speeding up process engine benchmarks through parallelization and virtualization
technologies. By filling these gaps in the future, we hope to support process
engine users in making a meaningful decision when selecting an engine. To help users
with such decisions, we are planning to publish all benchmark results as an
interactive website. Furthermore, our work could help process engine vendors to
enhance the quality of their products, e.g., by integrating the conformance test
features of betsy into their continuous integration processes. This should reduce
the occurrence of test regressions we were able to reveal in our results. Because of
this, we aim to get engine vendors on board, fostering and validating our results.</p>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>
          1.
          <string-name>
            <surname>van der Aalst</surname>
            ,
            <given-names>W.M.P.</given-names>
          </string-name>
          :
          <article-title>Business Process Management: A Comprehensive Survey</article-title>
          .
          <source>ISRN Software Engineering</source>
          pp.
          <fpage>1</fpage>
          -
          <lpage>37</lpage>
          (
          <year>2013</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>
          2.
          <string-name>
            <surname>van der Aalst</surname>
          </string-name>
          , W., ter
          <string-name>
            <surname>Hofstede</surname>
          </string-name>
          , A.:
          <article-title>YAWL: yet another workflow language</article-title>
          .
          <source>Information Systems</source>
          <volume>30</volume>
          (
          <issue>4</issue>
          ),
          <fpage>245</fpage>
          -
          <lpage>275</lpage>
          (
          <year>June 2005</year>
          ), http://www.sciencedirect.com/science/article/pii/S0306437904000304
        </mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>
          3.
          <string-name>
            <surname>Banker</surname>
          </string-name>
          , R.D.,
          <string-name>
            <surname>Datar</surname>
            ,
            <given-names>S.M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Kemerer</surname>
            ,
            <given-names>C.F.</given-names>
          </string-name>
          :
          <article-title>Factors affecting software maintenance productivity: An exploratory study</article-title>
          .
          <source>In: Proc. of the 8th Intl. Conf. on Information Systems</source>
          . pp.
          <fpage>160</fpage>
          -
          <lpage>175</lpage>
          . University of Pittsburgh Pittsburgh, PA (
          <year>1987</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>
          4.
          <string-name>
            <surname>Bianculli</surname>
            ,
            <given-names>D.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Binder</surname>
            ,
            <given-names>W.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Drago</surname>
            ,
            <given-names>M.L.:</given-names>
          </string-name>
          <article-title>Automated Performance Assessment for Service-oriented Middleware: A Case Study on BPEL Engines</article-title>
          .
          <source>In: Proc. of the 19th Int. Conf. on World Wide Web</source>
          . pp.
          <fpage>141</fpage>
          -
          <lpage>150</lpage>
          . ACM, New York, USA (
          <year>2010</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>
          5.
          <string-name>
            <surname>Boettiger</surname>
            ,
            <given-names>C.</given-names>
          </string-name>
          :
          <article-title>An introduction to Docker for reproducible research</article-title>
          .
          <source>ACM SIGOPS Operating Systems Review</source>
          <volume>49</volume>
          (
          <issue>1</issue>
          ),
          <fpage>71</fpage>
          -
          <lpage>79</lpage>
          (
          <year>2015</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>
          6.
          <string-name>
            <surname>Chinosi</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Trombetta</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          :
          <article-title>BPMN: An introduction to the standard</article-title>
          .
          <source>Computer Standards &amp; Interfaces</source>
          <volume>34</volume>
          (
          <issue>1</issue>
          ),
          <fpage>124</fpage>
          -
          <lpage>134</lpage>
          (
          <year>January 2012</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>
          7.
          <string-name>
            <surname>Delgado</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Calegari</surname>
            ,
            <given-names>D.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Milanese</surname>
            ,
            <given-names>P.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Falcon</surname>
            ,
            <given-names>R.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>García</surname>
          </string-name>
          , E.:
          <article-title>A Systematic Approach for Evaluating BPM Systems: Case Studies on Open Source and Proprietary Tools</article-title>
          .
          <source>In: Open Source Systems: Adoption and Impact</source>
          , pp.
          <fpage>81</fpage>
          -
          <lpage>90</lpage>
          . Springer (
          <year>2015</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>
          8.
          <string-name>
            <surname>Ferme</surname>
            ,
            <given-names>V.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Ivanchikj</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Pautasso</surname>
            ,
            <given-names>C.</given-names>
          </string-name>
          :
          <article-title>A Framework for Benchmarking BPMN 2.0 Workflow Management Systems</article-title>
          .
          <source>In: 13th Intl. Conf. on Business Process Management (BPM</source>
          <year>2015</year>
          ). Springer, Innsbruck, Austria (
          <year>August 2015</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref9">
        <mixed-citation>
          9.
          <string-name>
            <surname>Geiger</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Harrer</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Lenhard</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Casar</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Vorndran</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Wirtz</surname>
          </string-name>
          , G.:
          <article-title>BPMN Conformance in Open Source Engines</article-title>
          .
          <source>In: IEEE Intl. Symp. on Service-Oriented System Engineering</source>
          . IEEE, San Francisco Bay, CA, USA (March 30 - April 3
          <year>2015</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref10">
        <mixed-citation>
          10.
          <string-name>
            <surname>Geiger</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Wirtz</surname>
          </string-name>
          , G.:
          <article-title>BPMN 2.0 Serialization - Standard Compliance Issues and Evaluation of Modeling Tools</article-title>
          .
          <source>In: 5th Int. Workshop on Enterprise Modelling and Information Systems Architectures</source>
          . St. Gallen, Switzerland
          (
          <year>September 2013</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref11">
        <mixed-citation>
          11.
          <string-name>
            <surname>Gray</surname>
          </string-name>
          , J.:
          <article-title>Benchmark Handbook: For Database and Transaction Processing Systems</article-title>
          . Morgan Kaufmann Publishers Inc., San Francisco, CA, USA (
          <year>1992</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref12">
        <mixed-citation>
          12.
          <string-name>
            <surname>Harrer</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Lenhard</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Wirtz</surname>
            ,
            <given-names>G.</given-names>
          </string-name>
          :
          <article-title>BPEL Conformance in Open Source Engines</article-title>
          .
          <source>In: IEEE SOCA</source>
          , Taipei, Taiwan. pp.
          <fpage>237</fpage>
          -
          <lpage>244</lpage>
          . IEEE (17-19
          <year>December 2012</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref13">
        <mixed-citation>
          13.
          <string-name>
            <surname>Harrer</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Lenhard</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Wirtz</surname>
            ,
            <given-names>G.</given-names>
          </string-name>
          :
          <article-title>Open Source versus Proprietary Software in Service-Orientation: The Case of BPEL Engines</article-title>
          .
          <source>In: Proc. of the 11th Intl. Conf. on Service-Oriented Computing (ICSOC'13)</source>
          . pp.
          <fpage>99</fpage>
          -
          <lpage>113</lpage>
          . Berlin, Germany (
          <year>2013</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref14">
        <mixed-citation>
          14.
          <string-name>
            <surname>Harrer</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Lenhard</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Wirtz</surname>
            ,
            <given-names>G.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>van Lessen</surname>
            ,
            <given-names>T.</given-names>
          </string-name>
          :
          <article-title>Towards Uniform BPEL Engine Management in the Cloud</article-title>
          .
          <source>In: Proceedings des CloudCycle14 Workshops auf der 44. Jahrestagung der Gesellschaft für Informatik e.V.</source>
          . GI
          (
          <year>September 2014</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref15">
        <mixed-citation>
          15.
          <string-name>
            <surname>Harrer</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Nizamic</surname>
            ,
            <given-names>F.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Wirtz</surname>
            ,
            <given-names>G.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Lazovik</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          :
          <article-title>Towards a Robustness Evaluation Framework for BPEL Engines</article-title>
          .
          <source>In: IEEE Intl. Conf. on Service-Oriented Computing and Applications</source>
          . pp.
          <fpage>199</fpage>
          -
          <lpage>206</lpage>
          . IEEE, Matsue, Japan (17-19 November
          <year>2014</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref16">
        <mixed-citation>
          16.
          <string-name>
            <surname>Harrer</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Preißinger</surname>
            ,
            <given-names>C.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Wirtz</surname>
            ,
            <given-names>G.</given-names>
          </string-name>
          :
          <article-title>BPEL Conformance in Open Source Engines: The Case of Static Analysis</article-title>
          .
          <source>In: IEEE Intl. Conf. on Service-Oriented Computing and Applications</source>
          . pp.
          <fpage>33</fpage>
          -
          <lpage>40</lpage>
          . IEEE, Matsue, Japan (17-19 November
          <year>2014</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref17">
        <mixed-citation>
          17.
          <string-name>
            <surname>Harrer</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Röck</surname>
            ,
            <given-names>C.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Wirtz</surname>
            ,
            <given-names>G.</given-names>
          </string-name>
          :
          <article-title>Automated and Isolated Tests for Complex Middleware Products: The Case of BPEL Engines</article-title>
          .
          <source>In: Proc. of the 7th IEEE Intl. Conf. on Software Testing, Verification and Validation Workshops (ICSTW'14)</source>
          . pp.
          <fpage>390</fpage>
          -
          <lpage>398</lpage>
          . Cleveland, Ohio, USA (April
          <year>2014</year>
          ), Testing Tools Track
        </mixed-citation>
      </ref>
      <ref id="ref18">
        <mixed-citation>
          18.
          <string-name>
            <surname>Huppler</surname>
            ,
            <given-names>K.</given-names>
          </string-name>
          :
          <article-title>The Art of Building a Good Benchmark</article-title>
          .
          <source>In: Performance Evaluation and Benchmarking</source>
          . Springer Berlin Heidelberg (
          <year>2009</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref19">
        <mixed-citation>
          19. ISO/IEC:
          <source>ISO/IEC 19510:2013 - Information technology - Object Management Group Business Process Model and Notation</source>
          (
          <year>November 2013</year>
          ), v2.0.2
        </mixed-citation>
      </ref>
      <ref id="ref20">
        <mixed-citation>
          20.
          <string-name>
            <surname>Lenhard</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Harrer</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Wirtz</surname>
            ,
            <given-names>G.</given-names>
          </string-name>
          :
          <article-title>Measuring the Installability of Service Orchestrations Using the SQuaRE Method</article-title>
          .
          <source>In: IEEE Intl. Conf. on Service-Oriented Computing and Applications (SOCA)</source>
          . IEEE, Kauai, HI, USA (December 16-18
          <year>2013</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref21">
        <mixed-citation>
          21.
          <string-name>
            <surname>Mili</surname>
            ,
            <given-names>H.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Tremblay</surname>
            ,
            <given-names>G.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Jaoude</surname>
            ,
            <given-names>G.B.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Lefebvre</surname>
            ,
            <given-names>E.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Elabed</surname>
            ,
            <given-names>L.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Boussaidi</surname>
            ,
            <given-names>G.E.</given-names>
          </string-name>
          :
          <article-title>Business Process Modeling Languages: Sorting Through the Alphabet Soup</article-title>
          .
          <source>ACM Comput. Surv</source>
          .
          <volume>43</volume>
          (
          <issue>1</issue>
          ),
          <fpage>4:1</fpage>
          -
          <lpage>4:56</lpage>
          (
          <year>December 2010</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref22">
        <mixed-citation>
          22.
          OASIS:
          <source>Web Services Business Process Execution Language</source>
          (
          <year>April 2007</year>
          ), v2.0
        </mixed-citation>
      </ref>
      <ref id="ref23">
        <mixed-citation>
          23.
          <string-name>
            <surname>Pautasso</surname>
            ,
            <given-names>C.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Ferme</surname>
            ,
            <given-names>V.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Roller</surname>
            ,
            <given-names>D.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Leymann</surname>
            ,
            <given-names>F.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Skouradaki</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          :
          <article-title>Towards Workflow Benchmarking: Open Research Challenges</article-title>
          .
          <source>In: Conf. on Database Systems for Business, Technology, and Web</source>
          . pp.
          <fpage>1</fpage>
          -
          <lpage>20</lpage>
          . Hamburg, Germany (March
          <year>2015</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref24">
        <mixed-citation>
          24.
          <string-name>
            <surname>Röck</surname>
            ,
            <given-names>C.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Harrer</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Wirtz</surname>
            ,
            <given-names>G.</given-names>
          </string-name>
          :
          <article-title>Performance Benchmarking of BPEL Engines: A Comparison Framework, Status Quo Evaluation and Challenges</article-title>
          .
          <source>In: Proc. of the 26th Intl. Conf. on Software Engineering and Knowledge Engineering (SEKE'14)</source>
          . pp.
          <fpage>31</fpage>
          -
          <lpage>34</lpage>
          . Vancouver, Canada (
          <year>July 2014</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref25">
        <mixed-citation>
          25.
          <string-name>
            <surname>Saaty</surname>
            ,
            <given-names>T.L.</given-names>
          </string-name>
          :
          <article-title>How to make a decision: The Analytic Hierarchy Process</article-title>
          .
          <source>European Journal of Operational Research</source>
          <volume>48</volume>
          (
          <issue>1</issue>
          ),
          <fpage>9</fpage>
          -
          <lpage>26</lpage>
          (
          <year>1990</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref26">
        <mixed-citation>
          26.
          <string-name>
            <surname>Sim</surname>
            ,
            <given-names>S.E.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Easterbrook</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Holt</surname>
            ,
            <given-names>R.C.</given-names>
          </string-name>
          :
          <article-title>Using Benchmarking to Advance Research: A Challenge to Software Engineering</article-title>
          .
          <source>In: 25th International Conference on Software Engineering</source>
          . Portland, Oregon, USA (May
          <year>2003</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref27">
        <mixed-citation>
          27.
          <string-name>
            <surname>Sim</surname>
            ,
            <given-names>S.E.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Easterbrook</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Holt</surname>
            ,
            <given-names>R.C.</given-names>
          </string-name>
          :
          <article-title>Using benchmarking to advance research: A challenge to software engineering</article-title>
          .
          <source>In: Proceedings of the 25th International Conference on Software Engineering</source>
          . pp.
          <fpage>74</fpage>
          -
          <lpage>83</lpage>
          . IEEE Computer Society (
          <year>2003</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref28">
        <mixed-citation>
          28.
          <string-name>
            <surname>Skouradaki</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Roller</surname>
            ,
            <given-names>D.H.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Leymann</surname>
            ,
            <given-names>F.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Ferme</surname>
            ,
            <given-names>V.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Pautasso</surname>
            ,
            <given-names>C.</given-names>
          </string-name>
          :
          <article-title>On the Road to Benchmarking BPMN 2.0 Workflow Engines</article-title>
          .
          <source>In: Proc. of the 6th ACM/SPEC Intl. Conf. on Performance Engineering</source>
          . pp.
          <fpage>301</fpage>
          -
          <lpage>304</lpage>
          . ACM (
          <year>2015</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref29">
        <mixed-citation>
          29.
          <string-name>
            <surname>Triantaphyllou</surname>
            ,
            <given-names>E.</given-names>
          </string-name>
          :
          <source>Multi-criteria decision making methods: a comparative study</source>
          , vol.
          <volume>44</volume>
          . Springer Science &amp; Business Media
          (
          <year>2013</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref30">
        <mixed-citation>
          30.
          <string-name>
            <surname>Weyuker</surname>
            ,
            <given-names>E.J.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Vokolos</surname>
            ,
            <given-names>F.I.</given-names>
          </string-name>
          :
          <article-title>Experience with Performance Testing of Software Systems: Issues, an Approach, and Case Study</article-title>
          .
          <source>IEEE Trans. Softw. Eng</source>
          .
          <volume>26</volume>
          (
          <issue>12</issue>
          ),
          <fpage>1147</fpage>
          -
          <lpage>1156</lpage>
          (
          <year>December 2000</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref31">
        <mixed-citation>
          31.
          WfMC:
          <source>XML Process Definition Language</source>
          (
          <year>August 2012</year>
          ), v2.2
        </mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>