Biblio

Filters: Keyword is runtime verification
2020-11-02
Pinisetty, Srinivas, Schneider, Gerardo, Sands, David.  2018.  Runtime Verification of Hyperproperties for Deterministic Programs. 2018 IEEE/ACM 6th International FME Workshop on Formal Methods in Software Engineering (FormaliSE). :20-29.
In this paper, we consider the runtime verification problem of safety hyperproperties for deterministic programs. Several security and information-flow policies such as data minimality, non-interference, integrity, and software doping are naturally expressed formally as safety hyperproperties. Although there are monitoring results for hyperproperties, the algorithms are very complex since these are properties over sets of traces, not over single traces. For the deterministic input-output programs that we consider, and the specific safety hyperproperties we are interested in, the problem can be reduced to monitoring of trace properties. In this paper, we present a simpler monitoring approach for safety hyperproperties of deterministic programs. The approach involves transforming the given safety hyperproperty into a trace property, extracting a characteristic predicate for the given hyperproperty, and providing a parametric monitor taking such a predicate as parameter. For any hyperproperty in the considered subclass, we show how runtime verification monitors can be synthesised. We have implemented our approach in the form of a parameterised monitor for the given class, and have applied it to a number of hyperproperties including data minimisation, non-interference, integrity and software doping. We show results concerning both offline and online monitoring.
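
The abstract describes a monitor parameterised by a characteristic predicate over input-output observations of a deterministic program. The following is a minimal Python sketch of that idea under assumed names (ParametricMonitor, non_interference); it is an illustration, not the authors' implementation.

    # Sketch: a monitor parameterised by a characteristic predicate over pairs of
    # (input, output) observations of a deterministic program. A 2-safety
    # hyperproperty violation shows up as a pair of observations falsifying the
    # predicate, so the hyperproperty check reduces to a trace-level check.
    from typing import Callable, List, Tuple

    Event = Tuple[dict, dict]                 # (input valuation, output valuation)
    Predicate = Callable[[Event, Event], bool]

    class ParametricMonitor:
        """Online monitor taking a characteristic predicate as parameter."""

        def __init__(self, predicate: Predicate):
            self.predicate = predicate
            self.history: List[Event] = []

        def observe(self, event: Event) -> bool:
            """Feed one input-output observation; return False on a violation."""
            for past in self.history:
                # Check the predicate in both orders for non-symmetric predicates.
                if not (self.predicate(past, event) and self.predicate(event, past)):
                    return False
            self.history.append(event)
            return True

    # Example predicate (hypothetical): non-interference for a deterministic
    # program whose inputs/outputs are split into low ("lo") and high ("hi") parts.
    def non_interference(e1: Event, e2: Event) -> bool:
        (in1, out1), (in2, out2) = e1, e2
        return in1["lo"] != in2["lo"] or out1["lo"] == out2["lo"]

    monitor = ParametricMonitor(non_interference)
    assert monitor.observe(({"lo": 1, "hi": 7}, {"lo": 42}))
    assert not monitor.observe(({"lo": 1, "hi": 9}, {"lo": 43}))  # low output varies with hi

Because the monitored programs are deterministic, each (input, output) observation stands in for a whole execution, which is what makes this reduction from a property over sets of traces to a trace-level, pairwise predicate check possible.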
2019-03-06
Aniculaesei, Adina, Grieser, Jörg, Rausch, Andreas, Rehfeldt, Karina, Warnecke, Tim.  2018.  Towards a Holistic Software Systems Engineering Approach for Dependable Autonomous Systems. Proceedings of the 1st International Workshop on Software Engineering for AI in Autonomous Systems. :23-30.
Autonomous systems are gaining momentum in various application domains, such as autonomous vehicles, autonomous transport robotics and self-adaptation in smart homes. Product liability regulations impose high standards on manufacturers of such systems with respect to dependability (safety, security and privacy). Today's conventional engineering methods are not adequate for providing guarantees with respect to dependability requirements in a cost-efficient manner; e.g., road tests in the automotive industry accumulate millions of miles before a system can be considered sufficiently safe. System engineers will no longer be able to test or formally verify autonomous systems at development time in order to guarantee the dependability requirements in advance. In this vision paper, we introduce a new holistic software systems engineering approach for autonomous systems, which integrates development-time methods with operation-time techniques. With this approach, we aim to give users a transparent view of the confidence level of the autonomous system in use with respect to the dependability requirements. We present already obtained results and point out research goals to be addressed in the future.