Agile Processes; Continuous Software Development; DevOps; Microservices; Performance Testing
Rosinosky Guillaume, Labba Chahrazed, Ferme Vincenzo, Youcef Samir, Charoy François, Pautasso Cesare (2018), Evaluating Multi-tenant Live Migrations Effects on Performance, in On the Move to Meaningful Internet Systems. OTM 2018 Conferences, 61-77, Springer International Publishing, Malta.
Avritzer Alberto, Ferme Vincenzo, Janes Andrea, Russo Barbara, Schulz Henning, van Hoorn André (2018), A Quantitative Approach for the Assessment of Microservice Architecture Deployment Alternatives by Automated Performance Testing, in Software Architecture, 159-174, Springer International Publishing, Madrid.
Bezemer C.-P., Eismann S., Ferme V., Grohmann J., Heinrich R., Jamshidi P., Shang W., van Hoorn A., Villaviencio M., Walter J., Willnecker F. (2018), How is Performance Addressed in DevOps? A Survey on Industrial Practices, arXiv.
Klinaku Floriment, Ferme Vincenzo, Towards Generating Elastic Microservices: A Declarative Specification for Consistent Elasticity Configurations, in Euromicro Conference on Software Engineering and Advanced Applications, IEEE, Prague.
Modern software development processes, such as DevOps, and Microservice architectures shorten the time from development to production. Integrating performance testing in this context is important, and several techniques for continuously verifying software performance have already been introduced. However, these techniques usually provide only the specification and automation of basic performance tests, and do not leverage the software performance knowledge generated in continuous lifecycles to speed up performance testing. During my PhD, I worked on automated benchmarking of Workflow Management Systems, developing a declarative domain-specific language (DSL) in which users specify their performance testing goals, and a model-driven framework for automated performance testing of such systems. In this proposal we plan to extend both the DSL and the framework to enable automated benchmarking of Microservices. We intend to enhance the expressiveness of the DSL by adding new performance goals suited to the new domain. These goals will be automated by the framework, which we will integrate with the DevOps continuous development lifecycle. Through this integration, we also plan to enable the framework to collect, analyze, and model relevant lifecycle data, so that it can be leveraged to speed up and enhance performance testing. To assess to what extent the framework can perform automated goal-driven performance testing for Microservices in DevOps, we will evaluate the framework on real-world use cases provided by our hosts at the University of Stuttgart.
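To illustrate the idea of a declarative, goal-driven performance specification as described above, the following is a minimal sketch in Python. All names here (`Goal`, the metric identifiers, the example thresholds) are hypothetical assumptions for illustration, not the actual DSL developed in the PhD work:

```python
# Hypothetical sketch: a performance goal is declared (metric, threshold,
# direction), and the framework checks measured values against each goal.
from dataclasses import dataclass

@dataclass(frozen=True)
class Goal:
    metric: str       # e.g. "latency_p95_ms" or "throughput_rps" (assumed names)
    threshold: float  # target value the service under test must meet
    comparison: str   # "<=" for upper bounds, ">=" for lower bounds

    def satisfied_by(self, measured: float) -> bool:
        # Declarative goal semantics: compare the measurement to the threshold.
        if self.comparison == "<=":
            return measured <= self.threshold
        return measured >= self.threshold

# A (hypothetical) test plan for one microservice: a named set of goals.
plan = {
    "checkout-service": [
        Goal("latency_p95_ms", 250.0, "<="),   # 95th-percentile latency bound
        Goal("throughput_rps", 100.0, ">="),   # minimum sustained throughput
    ],
}

# Measurements that an automated test run might produce (illustrative values).
measured = {"latency_p95_ms": 180.0, "throughput_rps": 120.0}

# The framework would evaluate every declared goal against the measurements.
results = {g.metric: g.satisfied_by(measured[g.metric])
           for g in plan["checkout-service"]}
```

In such a design, the user states only *what* must hold (the goals), while the framework decides *how* to drive the load and collect the metrics, which is what makes the specification declarative.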