
Workshop Agenda

09:00 - 09:05 Welcome & Introduction
 
09:05 - 10:00 Towards Just-in-time Mitigation of Performance Regression Introducing Changes
Weiyi (Ian) Shang
 
10:00 - 10:30 CC4CS: an Off-the-Shelf Unifying Statement-Level Performance Metric for HW/SW Technologies
Luigi Pomante, Giacomo Valente, Vittoriano Muttillo, Fabio Salice, Fausto D'Antonio and Vincenzo Stoico
 
Coffee Break
 
11:00 - 11:30 Better Early Than Never: Performance Test Acceleration by Regression Test Selection
David Georg Reichelt and Stefan Kühne
 
11:30 - 12:00 A Workload-Dependent Performance Analysis of an In-Memory Database in a Multi-Tenant Configuration
Dominik Paluch, Harald Kienegger and Helmut Krcmar
 
12:00 - 12:30 JMeter vs. Gatling
Marius Oehler and Mario Mann
 
Lunch Break
(The afternoon program will be shared with the QUDOS workshop.)
 
13:30 - 14:30 Model-Driven Software Performance Engineering - Challenges and Ways Ahead
Manoj Nambiar
 
14:30 - 15:00 Towards Automating Representative Load Testing in Continuous Software Engineering
Henning Schulz, Tobias Angerstein and André van Hoorn
 
Coffee Break
 
15:30 - 16:00 A Cloud Benchmark Suite Combining Micro and Applications Benchmarks
Joel Scheuner and Philipp Leitner
 
16:00 - 17:00 Joint panel discussion


Call for papers

Software systems (e.g., smartphone apps, desktop applications, e-commerce systems, IoT infrastructures, big data systems, and enterprise systems) have strict performance requirements. Failure to meet these requirements causes customer dissatisfaction and negative news coverage. In addition to conventional functional testing, the performance of these systems must be verified through load testing or benchmarking to ensure quality of service. Load testing examines the behavior of a system by simulating hundreds or thousands of users performing tasks at the same time. Benchmarking evaluates a system's performance, making it possible to optimize system configurations or to compare the system with similar systems in its domain.
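To make the distinction concrete, a load test in its simplest form spawns many concurrent simulated users against the system under test and records per-request latencies. The following is a minimal, hypothetical sketch in Python; handle_request is a stand-in for the system under test (in practice, a tool such as JMeter or Gatling would drive real requests against a deployed system):

```python
import statistics
import time
from concurrent.futures import ThreadPoolExecutor

def handle_request(user_id: int) -> float:
    """Stand-in for the system under test; returns observed latency in seconds."""
    start = time.perf_counter()
    time.sleep(0.01)  # simulated service time of the system under test
    return time.perf_counter() - start

def run_load_test(num_users: int = 50) -> dict:
    """Simulate num_users concurrent users and summarize their latencies."""
    with ThreadPoolExecutor(max_workers=num_users) as pool:
        latencies = list(pool.map(handle_request, range(num_users)))
    return {
        "requests": len(latencies),
        "mean_latency_s": statistics.mean(latencies),
        "max_latency_s": max(latencies),
    }

summary = run_load_test(50)
print(summary)
```

A benchmark differs mainly in intent: rather than checking behavior under concurrent load, it would run a fixed, representative workload repeatedly to compare configurations or systems against each other.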

Load testing and benchmarking software systems are difficult tasks that require a deep understanding of the system under test and of customer behavior. Practitioners face many challenges, such as tooling (choosing and implementing the testing tools), environments (software and hardware setup), and time (limited time to design, test, and analyze). This one-day workshop brings together software testing researchers, practitioners, and tool developers to discuss the challenges and opportunities of conducting research on load testing and benchmarking software systems.

We solicit submissions in two tracks: research papers (maximum 4 pages) and a presentation track for industry or experience talks (maximum 700-word extended abstract). Technical papers should follow the standard ACM SIG proceedings format and must be submitted electronically via EasyChair. Short abstracts for the presentation track must be submitted as "abstract only" submissions via EasyChair. Accepted technical papers will be published in the ICPE 2018 proceedings. Materials from the presentation track will not be published in the ICPE 2018 proceedings, but will be made available on the workshop website. Submitted papers can be research papers, position papers, case studies, or experience reports addressing issues including but not limited to the following:


Important Dates

Research papers: Jan. 20, 2018 (extended from Jan. 13, 2018)
Presentation track: Mar. 2, 2018
Paper notification: Feb. 8, 2018
Presentation notification: Mar. 9, 2018
Workshop date: Apr. 10, 2018


Organization

Organizers

Johannes Kroß fortiss GmbH, Germany
Cor-Paul Bezemer Queen's University, Canada

Program Committee

Adams, Bram Polytechnique Montreal, Canada
Bergel, Alexandre University of Chile, Chile
Brunnert, Andreas RETIT GmbH, Germany
Csallner, Christoph University of Texas at Arlington, USA
Eichelberger, Holger University of Hildesheim, Germany
Franks, Greg Carleton University, Canada
Garousi, Vahid University of Luxembourg, Luxembourg
Ghaith, Shadi IBM, Ireland
Heinrich, Robert Karlsruhe Institute of Technology, Germany
van Hoorn, André University of Stuttgart, Germany
Horrox, Robert EMC Isilon, USA
Jamshidi, Pooyan Carnegie Mellon University, USA
Jiang, Zhen Ming (Jack) York University, Canada
Podelko, Alexander Oracle, USA
Shang, Weiyi Concordia University, Canada
Sunyé, Gerson University of Nantes, France

Steering Committee

Ahmed E. Hassan Queen’s University, Canada
Marin Litoiu York University, Canada
Zhen Ming (Jack) Jiang York University, Canada


Past LT Workshops