The performance of software systems (such as speed, memory usage, or correct identification rate) is an increasingly important concern, nowadays often on par with functional correctness for critical systems. Systematically testing these performance concerns is, however, extremely difficult, in particular because there exists no theory underpinning the evaluation of a performance test suite, i.e., no way to tell a software developer whether such a test suite is “good enough” or whether one test suite is better than another. This paper proposes to apply Multimorphic testing and empirically assess the effectiveness of performance test suites of software systems coming from various domains. By analogy with mutation testing, our core idea is to leverage the typical configurability of these systems and to check whether it makes any difference in the outcome of the tests: i.e., are some tests able to “kill” underperforming system configurations? More precisely, we propose a framework for defining and evaluating the coverage of a test suite with respect to a quantitative property of interest. Such properties include execution time, memory usage, or the success rate of tasks performed by a software system. This framework can be used to assess whether a new test case is worth adding to a test suite, or to select an optimal test suite with respect to a property of interest. We evaluate several aspects of our proposal through three empirical studies carried out in different fields: object tracking in videos, object recognition in images, and code generators.
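To make the “kill” analogy concrete, here is a minimal sketch (not from the paper) of the coverage idea described in the abstract. It assumes execution time as the property of interest (lower is better) and a simple relative-slowdown criterion for when a test “kills” a configuration; all names (`perf`, `kills`, `coverage`, `worth_adding`) and the threshold are illustrative placeholders, not the paper's actual definitions.

```python
# Illustrative sketch of multimorphic-testing coverage.
# The kill criterion below (relative slowdown vs. the best configuration)
# is an assumption for this example; the paper defines coverage more
# generally for any quantitative property of interest.

from typing import Dict, List

# perf[test][config] = measured value of the property of interest
# (here: execution time in seconds) for one system configuration.
PerfMatrix = Dict[str, Dict[str, float]]

THRESHOLD = 1.5  # a configuration "underperforms" if >= 50% worse than the best


def kills(perf: PerfMatrix, test: str, config: str) -> bool:
    """A test 'kills' a configuration if, on that test, the configuration
    performs much worse than the best-performing configuration."""
    best = min(perf[test].values())  # lower is better for execution time
    return perf[test][config] > THRESHOLD * best


def coverage(perf: PerfMatrix, suite: List[str], configs: List[str]) -> float:
    """Fraction of configurations killed by at least one test in the suite."""
    killed = {c for c in configs for t in suite if kills(perf, t, c)}
    return len(killed) / len(configs)


def worth_adding(perf: PerfMatrix, suite: List[str],
                 new_test: str, configs: List[str]) -> bool:
    """A new test case is worth adding if it increases suite coverage."""
    return coverage(perf, suite + [new_test], configs) > coverage(perf, suite, configs)
```

Under this reading, selecting an optimal test suite amounts to picking the smallest set of tests whose combined coverage matches that of the full suite.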
Wed 8 Jul (displayed time zone: UTC, Coordinated Universal Time)
16:05 - 17:05 | A12-Testing (Journal First / New Ideas and Emerging Results / Demonstrations / Technical Papers) at Silla | Chair(s): Sasa Misailovic (University of Illinois at Urbana-Champaign)
16:05 | 12m Talk | Practical Fault Detection in Puppet Programs (Technical Papers) | Thodoris Sotiropoulos, Dimitris Mitropoulos, Diomidis Spinellis (Athens University of Economics and Business)
16:17 | 8m Talk | Empirical Assessment of Multimorphic Testing (Journal First) | Paul Temple (PReCISE, NaDi, UNamur), Mathieu Acher (Univ Rennes, Inria, IRISA), Jean-Marc Jézéquel (Univ Rennes, IRISA)
16:25 | 3m Talk | RTj: a Java framework for detecting and refactoring rotten green test cases (Demonstrations) | Matias Martinez (Université Polytechnique Hauts-de-France), Anne Etien (Université de Lille, CNRS, Inria, Centrale Lille, UMR 9189 CRIStAL), Stéphane Ducasse (INRIA Lille), Christopher Fuhrman (École de technologie supérieure) | Pre-print, Media Attached
16:28 | 6m Talk | A Container-Based Infrastructure for Fuzzy-Driven Root Causing of Flaky Tests (New Ideas and Emerging Results) | Valerio Terragni (Università della Svizzera Italiana), Pasquale Salza (University of Zurich), Filomena Ferrucci (University of Salerno) | Pre-print, Media Attached
16:34 | 12m Talk | Learning from, Understanding, and Supporting DevOps Artifacts for Docker (Technical Papers) | Jordan Henkel (University of Wisconsin–Madison), Christian Bird (Microsoft Research), Shuvendu K. Lahiri (Microsoft Research), Thomas Reps (University of Wisconsin–Madison)
16:46 | 8m Talk | Improving Change Prediction Models with Code Smell-Related Information (Journal First) | Gemma Catolino (Delft University of Technology), Fabio Palomba (University of Salerno), Francesca Arcelli Fontana (University of Milano-Bicocca), Andrea De Lucia (University of Salerno), Andy Zaidman (TU Delft), Filomena Ferrucci (University of Salerno) | DOI, Pre-print
16:54 | 3m Talk | SMRL: A Metamorphic Security Testing Tool for Web Systems (Demonstrations) | Phu X. Mai (University of Luxembourg), Arda Goknil (SnT, University of Luxembourg), Fabrizio Pastore (University of Luxembourg), Lionel C. Briand (SnT Centre, University of Luxembourg)