Biblio

Filters: Author is Rothberg, Valentin
2017-09-26
Rothberg, Valentin, Dietrich, Christian, Ziegler, Andreas, Lohmann, Daniel. 2016. Towards Scalable Configuration Testing in Variable Software. Proceedings of the 2016 ACM SIGPLAN International Conference on Generative Programming: Concepts and Experiences. 156–167.

Testing a software product line such as Linux implies building the source with different configurations. Manual approaches to generating configurations that enable code of interest are doomed to fail due to the large number of variation points distributed over the feature model, the build system, and the source code. Research has proposed various approaches to generate covering configurations, but the algorithms show many drawbacks related to run time, exhaustiveness, and the number of generated configurations. Hence, analyzing an entire Linux source tree can yield more than 30 thousand configurations and thereby exceed the limited budget and resources for build testing. In this paper, we present an approach to fill the gap between a systematic generation of configurations and the necessity to fully build software in order to test it. By merging previously generated configurations, we reduce the number of necessary builds and enable global variability-aware testing. We reduce the problem of merging configurations to finding maximum cliques in a graph. We evaluate the approach on the Linux kernel, compare the results to common practices in industry, and show that our implementation scales even when facing graphs with millions of edges.
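The clique reduction can be illustrated with a small, self-contained sketch. The snippet below is a hypothetical illustration rather than the authors' tooling: partial configurations are modelled as dicts of option assignments, two configurations are considered mergeable when they never assign conflicting values to a shared option, and networkx is used to enumerate maximal cliques in the resulting compatibility graph. The real approach must additionally respect Kconfig constraints, which this toy example ignores.

```python
# Hypothetical sketch: merge pairwise-compatible configurations by
# finding maximal cliques in a compatibility graph (not the paper's tooling).
from itertools import combinations
import networkx as nx

# Toy partial configurations: option name -> value ('y', 'n', 'm', ...).
configs = {
    "A": {"CONFIG_NET": "y", "CONFIG_USB": "y"},
    "B": {"CONFIG_NET": "y", "CONFIG_PCI": "m"},
    "C": {"CONFIG_USB": "n", "CONFIG_PCI": "m"},
}

def compatible(c1, c2):
    """Two configurations conflict if they assign different values
    to the same option; otherwise they can be merged."""
    shared = set(c1) & set(c2)
    return all(c1[opt] == c2[opt] for opt in shared)

# Build the compatibility graph: nodes are configurations,
# edges connect pairs that can be merged.
g = nx.Graph()
g.add_nodes_from(configs)
for a, b in combinations(configs, 2):
    if compatible(configs[a], configs[b]):
        g.add_edge(a, b)

# Each maximal clique corresponds to a set of configurations
# that can be collapsed into a single build.
for clique in nx.find_cliques(g):
    merged = {}
    for name in clique:
        merged.update(configs[name])
    print(sorted(clique), "->", merged)
```

Because mergeability is not transitive, chaining pairwise merges naively can produce conflicting assignments; enumerating cliques captures exactly the sets of configurations that are pairwise mergeable, which is why the problem maps naturally onto clique search.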

2017-03-07
Ziegler, Andreas, Rothberg, Valentin, Lohmann, Daniel. 2016. Analyzing the Impact of Feature Changes in Linux. Proceedings of the Tenth International Workshop on Variability Modelling of Software-intensive Systems. 25–32.

In a software project as large and as rapidly evolving as the Linux kernel, automated testing systems are an integral component of the development process. Extensive build and regression tests can catch potential problems in changes before they appear in a stable release. Current systems, however, do not systematically incorporate the configuration system Kconfig. In this work, we present an approach to identify relationships between configuration options. These relationships allow us to find source files which might be affected by a change to a configuration option and hence require retesting. Our findings show that the majority of configuration options affect only a few files, while very few options influence almost all files in the code base. We further observe that developers sometimes value usability over clean dependency modelling, leading to counterintuitive outliers in our results.
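To give a concrete picture of the option-to-file relationships such an approach relies on, the sketch below scans C sources for CONFIG_ identifiers and inverts the result, so that a changed option can be looked up to obtain candidate files for retesting. The option name, the source path, and the grep-style extraction are illustrative assumptions; the paper's analysis also takes Kconfig dependencies and the Kbuild makefiles into account, which this sketch does not.

```python
# Hypothetical sketch: map CONFIG_ options to the source files that
# reference them, then look up files affected by a changed option.
# Only preprocessor-level references are considered here.
import re
from collections import defaultdict
from pathlib import Path

CONFIG_RE = re.compile(r"\bCONFIG_[A-Z0-9_]+\b")

def build_option_map(source_root):
    """Return a dict: option name -> set of files referencing it."""
    option_to_files = defaultdict(set)
    for path in Path(source_root).rglob("*.[ch]"):
        try:
            text = path.read_text(errors="ignore")
        except OSError:
            continue
        for option in set(CONFIG_RE.findall(text)):
            option_to_files[option].add(str(path))
    return option_to_files

if __name__ == "__main__":
    mapping = build_option_map("linux")          # source tree path is an assumption
    affected = mapping.get("CONFIG_USB", set())  # candidate files to retest after a change
    print(f"{len(affected)} files reference CONFIG_USB")
```

Inverting the mapping in this way reflects the finding quoted above: most options resolve to a small set of files, while a handful of pervasive options touch nearly the entire tree and therefore imply near-complete retesting.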