Keywords: Business Process Management, Workflow Engine, Benchmarking, Performance Evaluation, Service Oriented Architectures, Middleware, WS-BPEL, Web Service Composition
Vincenzo Ferme, Ana Ivanchikj, Cesare Pautasso (2016), Estimating the Cost for Executing Business Processes in the Cloud, in Business Process Management Forum, Rio de Janeiro. Springer, Cham.
Marigianna Skouradaki, Vincenzo Ferme, Cesare Pautasso, Frank Leymann, Andre van Hoorn (2016), Micro-Benchmarking BPMN 2.0 Workflow Management Systems with Workflow Patterns, in 28th International Conference on Advanced Information Systems Engineering (CAiSE), Ljubljana, Slovenia. Springer, Cham.
Vincenzo Ferme, Ana Ivanchikj, Cesare Pautasso, Marigianna Skouradaki, Frank Leymann (2016), A Container-centric Methodology for Benchmarking Workflow Management Systems, in 6th International Conference on Cloud Computing and Service Science (CLOSER 2016), Rome. SciTePress, Setubal, Portugal.
Vincenzo Ferme, Cesare Pautasso (2016), Integrating Faban with Docker for Reliable, Repeatable and Reusable Performance Benchmarking, in 7th ACM/SPEC International Conference on Performance Engineering (ICPE 2016 Demo), Delft, NL. ACM, USA.
Ana Ivanchikj, Vincenzo Ferme, Cesare Pautasso (2015), BPMeter: Web Service and Application for Static Analysis of BPMN 2.0 Collections, in 13th International Conference on Business Process Management (BPM Demos 2015), Innsbruck. CEUR-WS.org, Aachen.
Vincenzo Ferme, Ana Ivanchikj, Cesare Pautasso (2015), A Framework for Benchmarking BPMN 2.0 Workflow Management Systems, in 13th International Conference on Business Process Management (BPM 2015), Innsbruck. Springer, Heidelberg.
Marigianna Skouradaki, Katharina Görlach, Michael Hahn, Frank Leymann (2015), Application of Sub-Graph Isomorphism to Extract Reoccurring Structures from BPMN 2.0 Process Models, in IEEE Symposium on Service-Oriented System Engineering (SOSE 2015), San Francisco. IEEE, USA.
Marigianna Skouradaki, Dieter Roller, Frank Leymann, Vincenzo Ferme, Cesare Pautasso (2015), On the Road to Benchmarking BPMN 2.0 Workflow Engines, in 6th ACM/SPEC International Conference on Performance Engineering (ICPE 2015), Austin, Texas. ACM, USA.
Cesare Pautasso, Dieter Roller, Vincenzo Ferme, Frank Leymann, Marigianna Skouradaki (2015), Towards Workflow Benchmarking: Open Research Challenges, in 16. Fachtagung Datenbanksysteme für Business, Technologie und Web (BTW 2015), Hamburg. Gesellschaft für Informatik, Bonn.
Marigianna Skouradaki, Dieter Roller, Frank Leymann, Vincenzo Ferme, Cesare Pautasso (2015), "BPELanon": Protect Business Processes on the Cloud, in 5th International Conference on Cloud Computing and Service Science (CLOSER 2015), Lisbon. SciTePress, Setubal, Portugal.
The goal of the BenchFlow project is to design the first benchmark for assessing and comparing the performance of workflow management systems (also known as business process execution engines). Given the large number of heterogeneous systems and languages that have been proposed for modeling and executing workflows, and to ensure the feasibility of the project, we initially focus our efforts on a benchmark for standard-compliant WS-BPEL Web service composition engines.

Due to the inherent complexity of workflow engine architectures and the very large number of parameters affecting their performance, benchmarking workflow systems poses a number of scientific research challenges that require investigating a novel set of performance evaluation techniques. These will complement the well-understood techniques used to benchmark database management systems (e.g., TPC) or compilers and processors (e.g., SPEC), which have helped drive very large performance improvements in the database, compiler, and processor architecture industries.

Workflow systems have become the platform for building composite service-oriented applications, whose performance depends on two factors: the performance of the workflow engine itself and the performance of the composed services (which may lie outside the control of the workflow engine). We plan to use a model-driven, self-/recursive testing approach that eliminates the impact of external services by implementing them as workflow processes themselves. The processes selected for the benchmark will be synthesized from real-world WS-BPEL processes.

Given the very large number of metrics that can be used to observe the performance of a workflow system, we aim to distill a reduced set of performance indicators that make it possible to compare different engines, as well as different configurations of the same engine.
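To illustrate what distilling raw measurements into a reduced set of performance indicators could look like, here is a minimal Python sketch. The indicator names, the choice of percentiles, and the timestamp format are illustrative assumptions, not the metric set defined by the BenchFlow benchmark.

```python
from statistics import quantiles

def performance_indicators(start_times, end_times):
    """Reduce raw per-instance timings to a few summary indicators.

    start_times / end_times: per-process-instance start and end timestamps
    (seconds). The returned indicators (throughput, median and tail latency)
    are an illustrative assumption of what a distilled indicator set might
    contain, not BenchFlow's actual definition.
    """
    durations = [e - s for s, e in zip(start_times, end_times)]
    # Wall-clock span of the whole run, from first start to last completion.
    makespan = max(end_times) - min(start_times)
    # Percentiles of instance duration: index 49 is the 50th cut point
    # (median), index 94 the 95th, out of 99 cut points for n=100.
    cuts = quantiles(durations, n=100)
    return {
        "throughput_per_s": len(durations) / makespan,
        "latency_p50_s": cuts[49],
        "latency_p95_s": cuts[94],
    }

# Synthetic example: 10 workflow instances starting 0.1 s apart,
# each taking 0.5 s to complete.
starts = [0.1 * i for i in range(10)]
ends = [s + 0.5 for s in starts]
print(performance_indicators(starts, ends))
```

Collapsing many low-level metrics into a handful of indicators like these is what makes results from different engines, or different configurations of the same engine, directly comparable.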