CfP: CPS-IoTBench 2021
CALL FOR PAPERS
The 4th Workshop on Benchmarking Cyber-Physical Systems and Internet of Things (CPS-IoTBench)
CPS-IoTBench 2021 is a satellite workshop of the Cyber-Physical Systems and Internet-of-Things Week (CPS-IoT Week 2021), the premier event on Cyber-Physical Systems and IoT.
IMPORTANT DATES
- Submission deadline - Feb 15, 2021 (AoE)
- Author notification - March 17, 2021
- Camera-ready deadline - March 26, 2021
- Workshop - May 18, 2021
Research on Cyber-Physical Systems (CPS) and the Internet of Things (IoT) has flourished over the last decade. This research has led to the development of smart systems at different scales and in diverse environments, from smart homes and cities to smart grids and factories. Significant progress has been made through contributions in areas as diverse as control, embedded and real-time systems, wireless communication, and networking.
Despite these advances, a lack of standard evaluation criteria and methodologies makes it difficult to measure and compare the utility of this progress. This problem concerns the evaluation against the state of the art in an individual area, the comparability of different integrated designs that span multiple areas (e.g., control and networking), and the applicability of tested scenarios to present and future real-world CPS/IoT applications and deployments. This state of affairs is alarming, as it significantly hinders further progress in CPS and IoT research.
Since 2018, the Workshop on Benchmarking Cyber-Physical Systems and Internet of Things (CPS-IoTBench) has been a forum that brings together researchers from different communities to engage in a lively debate on all facets of rigorously evaluating and comparing CPS and IoT solutions. CPS-IoTBench provides a venue for learning about each other's challenges and evaluation methodologies and for debating future research agendas, with the goal of jointly defining the performance metrics and benchmarking scenarios that matter from an overall system perspective.
Call for Papers and Reproducibility Studies
For the workshop's 4th edition, which will take place virtually as part of CPS-IoT Week, we invite researchers and practitioners from academia and industry to submit papers that focus on one of the following:
- Presenting exemplary benchmarking systems and approaches from any of the relevant communities (e.g., embedded systems, real-time systems, networking, wireless communication and control, robotics, and machine learning).
- Identifying fundamental challenges and open questions in rigorous benchmarking and evaluation of CPS and IoT solutions.
- Offering a constructive critique of current practices and the state of experimental comparisons.
- Designing and evaluating benchmarking methodologies, infrastructures, or tools for cyber-physical and IoT systems.
- Benchmarking industrial standardized solutions against each other and against academic solutions.
- Presenting novel performance comparisons in the context of cyber-physical and IoT systems.
- Reporting on success stories or failures when using standard evaluation criteria.
- Describing techniques for improving the reproducibility of real-world testing of wireless systems.
- Proposing new research directions, methodologies, or tools to increase the level of reproducibility and comparability of evaluation results.
Well-reasoned arguments or preliminary evaluations are sufficient to support a paper's claims.
As a novelty of this year's edition, we also seek contributions describing the reproduction of experimental results from published work in any of the relevant communities. Examples of this include:
- Papers that repeat prior experiments (ideally using the original source code) to show how, why, and when the methods work (or not).
- Papers that repeat prior experiments in new contexts (e.g., different application domains or using different evaluation methodologies and metrics) to further generalize and validate (or not) previous work.
- Reproducibility studies of a given scientific work for which some key aspect(s) of the experimental setup were left unspecified, showing how the results vary as a function of those aspect(s).
- Evaluations carried out by undergraduate and graduate students, experienced researchers, or industry practitioners looking to validate another piece of work before comparing their own to it.
Reports should carefully describe the experimental setup and explain how the authors ensured that the conditions were as close as possible to those of the original study. If ambiguous conditions that are not specified in the original study may affect the outcome (e.g., whether the environment is an empty room or a room full of people), authors are encouraged to highlight, discuss, and empirically explore the impact of changing such conditions. Regardless of the study's results, the tone of the report must be constructive and not offensive to the original authors. Prospective authors are invited to contact the original authors of the examined study and to offer them the opportunity to provide a short discussion paragraph to be included in the final section of the report. Submissions authored by the same researchers who carried out the reproduced experiments will not be accepted.
Submission Instructions
Submitted papers must not exceed 6 pages (US letter, 9pt font size, double-column format, following the ACM master article template), including all figures, tables, and references. All submissions must be written in English and should contain the authors' names, affiliations, and contact information.
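For orientation only, a minimal LaTeX skeleton based on the acmart class (the ACM master article template) is sketched below; the sigconf option produces the double-column, 9pt layout, and all titles, names, and file names are placeholders. Authors should consult the official ACM template documentation for authoritative instructions.

  % Minimal sketch only; not a substitute for the official ACM template instructions.
  \documentclass[sigconf]{acmart}    % sigconf: double-column, 9pt, US letter
  \begin{document}
  \title{Your Paper Title}           % placeholder
  \author{First Author}              % placeholder
  \affiliation{\institution{Example University} \country{Country}}
  \email{first.author@example.org}   % placeholder
  \maketitle
  % Paper body: at most 6 pages including all figures, tables, and references.
  \bibliographystyle{ACM-Reference-Format}
  \bibliography{references}          % placeholder .bib file
  \end{document}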
Accepted papers will be published in the ACM Digital Library as part of the CPS-IoT Week 2021 proceedings. Authors of accepted papers are expected to present their work in a plenary session as part of the main workshop program.
The submission website is available at https://cps-iotbench2021.hotcrp.com/.