ICSE 2020
Wed 24 June - Thu 16 July 2020
Wed 8 Jul 2020 16:08 - 16:16 at Goguryeo - A11-Performance and Analysis Chair(s): Pooyan Jamshidi

Automated test case generation is an effective technique for producing high-coverage test suites. While the majority of research effort has been devoted to satisfying coverage criteria, a recent trend has emerged towards optimizing other, non-coverage aspects. In this regard, runtime and memory usage are two essential dimensions: less expensive tests reduce the resource demands of both the generation process and later regression testing phases. This study shows that performance-aware test case generation requires solving two main challenges: providing a good approximation of resource usage with minimal overhead, and avoiding detrimental effects on both final coverage and fault detection effectiveness. To tackle these challenges, we conceived a set of performance proxies (inspired by previous work on performance testing) that provide a reasonable estimation of the test execution costs (i.e., runtime and memory usage). Building on these proxies, we propose an adaptive strategy, called aDynaMOSA, which extends DynaMOSA, a state-of-the-art evolutionary algorithm for unit test generation. Our empirical study, involving 110 non-trivial Java classes, reveals that our adaptive approach generates test suites with statistically significant improvements in runtime (-25%) and heap memory consumption (-15%) compared to DynaMOSA. Additionally, aDynaMOSA achieves results comparable to DynaMOSA over seven different coverage criteria and similar fault detection effectiveness. Our empirical investigation also highlights that using the performance proxies alone (i.e., without the adaptiveness) is not sufficient to generate more performant test cases without compromising the overall coverage.
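To illustrate the idea behind the abstract, here is a minimal, hypothetical sketch of how performance proxies might be combined with a coverage-based fitness in an adaptive way. All names (`runtime_proxy`, `memory_proxy`, `adaptive_fitness`) and the weighting scheme are illustrative assumptions, not the actual aDynaMOSA implementation: the point is only that cheap counters stand in for real measurements, and the cost penalty is activated only when coverage stops improving, so it cannot hurt final coverage.

```python
# Hypothetical sketch, NOT the aDynaMOSA API: performance proxies
# approximate execution cost without actually timing or profiling tests.

def runtime_proxy(executed_statements: int) -> float:
    """Runtime proxy: count covered statements instead of measuring wall time."""
    return float(executed_statements)

def memory_proxy(allocations: int) -> float:
    """Memory proxy: count object instantiations instead of sampling the heap."""
    return float(allocations)

def adaptive_fitness(coverage_distance: float,
                     executed_statements: int,
                     allocations: int,
                     coverage_stagnated: bool,
                     weight: float = 1e-3) -> float:
    """Adaptive combination (simplified): penalize expensive tests only
    once coverage has stagnated, so proxies never trade away coverage."""
    cost = 0.5 * runtime_proxy(executed_statements) + 0.5 * memory_proxy(allocations)
    return coverage_distance + (weight * cost if coverage_stagnated else 0.0)
```

Under this sketch, two tests with equal coverage distance are ranked identically while the search still gains coverage; once it stagnates, the cheaper test wins, which mirrors the abstract's finding that the proxies alone (always-on penalty) are not enough and the adaptiveness is what preserves coverage.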

Wed 8 Jul

Displayed time zone: (UTC) Coordinated Universal Time

16:05 - 17:05
A11-Performance and Analysis (New Ideas and Emerging Results / Journal First / Technical Papers / Demonstrations) at Goguryeo
Chair(s): Pooyan Jamshidi University of South Carolina
16:05
3m
Talk
Nimbus: Improving the Developer Experience for Serverless Applications
Demonstrations
Robert Chatley Imperial College London, Thomas Allerton Starling Bank
Pre-print
16:08
8m
Talk
Testing with Fewer Resources: An Adaptive Approach to Performance-Aware Test Case Generation
Journal First
Giovanni Grano University of Zurich, Christoph Laaber University of Zurich, Annibale Panichella Delft University of Technology, Sebastiano Panichella Zurich University of Applied Sciences
Link to publication DOI Pre-print
16:16
8m
Talk
What's Wrong with My Benchmark Results? Studying Bad Practices in JMH Benchmarks
Journal First
Diego Costa Concordia University, Canada, Cor-Paul Bezemer University of Alberta, Canada, Philipp Leitner Chalmers University of Technology & University of Gothenburg, Artur Andrzejak Heidelberg University
16:24
12m
Talk
Towards the Use of the Readily Available Tests from the Release Pipeline as Performance Tests. Are We There Yet? (ACM SIGSOFT Distinguished Paper Award)
Technical Papers
Zishuo Ding University of Waterloo, Canada, Jinfu Chen Concordia University, Canada, Weiyi Shang Concordia University
Pre-print
16:36
8m
Talk
ModGuard: Identifying Integrity & Confidentiality Violations in Java Modules
Journal First
Andreas Dann Paderborn University, Ben Hermann Paderborn University, Eric Bodden Heinz Nixdorf Institut, Paderborn University and Fraunhofer IEM
Link to publication DOI
16:44
6m
Talk
Program Debloating via Stochastic Optimization
New Ideas and Emerging Results
Qi Xin Georgia Institute of Technology, Myeongsoo Kim Georgia Institute of Technology, Qirun Zhang Georgia Institute of Technology, USA, Alessandro Orso Georgia Tech
16:50
8m
Talk
The ORIS Tool: Quantitative Evaluation of Non-Markovian Systems
Journal First
Marco Paolieri University of Southern California, Marco Biagi University of Florence, Laura Carnevali University of Florence, Enrico Vicario University of Florence