Biblio
A critical need exists for collaboration and action by government, industry, and academia to address cyber weaknesses and vulnerabilities inherent to embedded and cyber physical systems (CPS). These vulnerabilities are introduced as we leverage technologies, methods, products, and services from the global supply chain throughout a system's lifecycle. Because adversaries exploit these weaknesses as access points for malicious purposes, solutions for system security and resilience have become a priority call for action. The SAE G-32 Cyber Physical Systems Security Committee has been convened to address this complex challenge. The SAE G-32 will take a holistic systems engineering approach, integrating system security considerations into the development of a Cyber Physical System Security Framework. This framework is intended to bring together multiple industries and to develop a method and common language that enable risk, cost, and performance trade spaces to be communicated more effectively, efficiently, and consistently. The standard will allow system integrators to make decisions within a common framework and language in order to develop affordable, trustworthy, resilient, and secure systems.
When employing biometric recognition systems, we must take into account that biometric data are considered sensitive. This raises privacy issues, and secure systems providing template protection are therefore required. Using homomorphic encryption, permanent protection can be ensured, since templates are stored and compared in the encrypted domain; in addition, the accuracy of the unprotected system is preserved. To address the computational overhead introduced by the encryption scheme, we present an early decision making strategy for iris-codes. To improve recognition accuracy, the most consistent bits of the iris-code are moved to the beginning of the template. This allows an accurate block-wise comparison and thereby reduces the execution time. Hence, the resulting system grants template protection in a computationally efficient way. More specifically, in the experimental evaluation in identification mode, the block-wise comparison achieves a 92% speed-up on the IITD database with 300 enrolled templates.
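The early decision idea can be illustrated with a short sketch. The following Python fragment is a minimal illustration, not the paper's implementation: it compares two binary iris-codes block by block, assuming the most consistent bits have already been moved to the front, and rejects a non-matching probe as soon as the partial Hamming distance exceeds a threshold. In the paper the comparison runs on homomorphically encrypted templates; here it operates on plaintext bits purely to show the comparison logic, and all names, block sizes, and thresholds are hypothetical.

```python
import numpy as np

def blockwise_match(probe, template, block_size=512, threshold=0.32):
    """Block-wise iris-code comparison with early rejection (illustrative only).

    probe, template: 1-D numpy arrays of 0/1 bits, most consistent bits first.
    After each block the partial normalized Hamming distance is checked; because
    the leading bits are the most reliable, a non-matching probe can usually be
    rejected after only a few blocks instead of comparing the full template.
    """
    n = len(probe)
    mismatches = 0
    for start in range(0, n, block_size):
        end = min(start + block_size, n)
        mismatches += int(np.count_nonzero(probe[start:end] != template[start:end]))
        if mismatches / end > threshold:
            return False  # early decision: the leading blocks already disagree too much
    return True  # full comparison stayed below the threshold
```

In identification mode the same early rejection is applied against every enrolled template, which is where a speed-up of the kind reported above comes from.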
Software systems nowadays communicate via a number of complex languages. This is often the cause of security vulnerabilities such as arbitrary code execution or injections. Injections such as cross-site scripting are widely known from textual languages such as HTML and JSON, which continue to gain popularity. These systems use parsers to read input and unparsers to write output, and it is there that these security vulnerabilities arise. Correct parsing and unparsing of messages is therefore of the utmost importance when developing secure and reliable systems. Part of the challenge developers face is to correctly encode data during unparsing and decode it during parsing. This paper presents McHammerCoder, an (un)parser and encoding generator supporting textual and binary languages. The generated (un)parsers automatically apply the generated encoding, which is derived from the language's grammar. Manually defining and applying the encoding is therefore not required to effectively prevent injections when using McHammerCoder. By specifying the communication language within a grammar, McHammerCoder provides developers with correct input and output handling code for their custom language.
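To make the role of grammar-derived encoding concrete, here is a small, generic sketch; it is not McHammerCoder's generated code or API, only an illustration of the principle that data is encoded for the grammatical context it is unparsed into, so attacker-controlled values cannot break out of that context. The fragment structure and function names are hypothetical.

```python
import html
import json

def unparse_comment_html(username: str, comment: str) -> str:
    """Unparse a tiny HTML fragment; every data value is encoded for its context."""
    return (
        '<div class="comment">'
        f'<span class="user">{html.escape(username)}</span>'
        f'<p>{html.escape(comment)}</p>'
        '</div>'
    )

def unparse_comment_json(username: str, comment: str) -> str:
    """Unparse the same data as JSON; json.dumps applies JSON string encoding."""
    return json.dumps({"user": username, "comment": comment})

if __name__ == "__main__":
    # Payloads that would be injections if the encoding step were forgotten.
    print(unparse_comment_html("mallory", "<script>alert(1)</script>"))
    print(unparse_comment_json("mallory", '"},"admin":true,"x":"'))
```

An (un)parser and encoding generator automates exactly this step: because the grammar states where each value is embedded, the matching encoding can be applied without the developer having to remember it.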
Building secure systems used to mean ensuring a secure perimeter, but that is no longer the case. Today's systems are ill-equipped to deal with attackers who are able to pierce perimeter defenses. Data provenance is a critical technology for building resilient systems that can recover from attackers who manage to overcome these "hard-shell" defenses. In this paper, we provide background information on data provenance and detail the techniques and challenges of provenance collection, analysis, and storage. Data provenance is well situated to address the challenging problem of allowing a system to "fight through" an attack, and we help to identify the work needed to ensure that future systems are resilient.
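As a minimal illustration of what provenance collection and analysis provide, the sketch below (an example constructed here, not from the paper) records, for each data object, the process that produced it and the inputs it was derived from, then walks the resulting lineage graph backwards; this backward traversal is the kind of analysis that lets a system decide what to distrust or roll back after an intrusion. All record and object names are hypothetical.

```python
from dataclasses import dataclass, field
from typing import Dict, List, Set

@dataclass
class ProvenanceRecord:
    output_id: str                                    # data object that was produced
    process: str                                      # process/activity that produced it
    inputs: List[str] = field(default_factory=list)   # data objects it was derived from

def ancestors(records: List[ProvenanceRecord], target: str) -> Set[str]:
    """Walk the lineage graph backwards to find everything 'target' depends on."""
    by_output: Dict[str, ProvenanceRecord] = {r.output_id: r for r in records}
    seen: Set[str] = set()
    stack = [target]
    while stack:
        rec = by_output.get(stack.pop())
        if rec is None:
            continue
        for src in rec.inputs:
            if src not in seen:
                seen.add(src)
                stack.append(src)
    return seen

if __name__ == "__main__":
    log = [
        ProvenanceRecord("report.pdf", "render", ["summary.txt"]),
        ProvenanceRecord("summary.txt", "summarize", ["download.tmp"]),
        ProvenanceRecord("download.tmp", "curl", ["http://attacker.example/payload"]),
    ]
    # If report.pdf turns out to be corrupted, trace which upstream objects to distrust.
    print(ancestors(log, "report.pdf"))
```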
Sensors of diverse capabilities and modalities, carried by us or deeply embedded in the physical world, have invaded our personal, social, work, and urban spaces. Our relationship with these sensors is a complicated one. On the one hand, these sensors collect rich data that are shared and disseminated, often initiated by us, with a broad array of service providers, interest groups, friends, and family. Embedded in this data is information that can be used to algorithmically construct a virtual biography of our activities, revealing intimate behaviors and lifestyle patterns. On the other hand, we, and the services we use, increasingly depend directly and indirectly on information originating from these sensors for making a variety of decisions, both routine and critical, in our lives. The quality of these decisions and our confidence in them depend directly on the quality of the sensory information and our trust in the sources. Sophisticated adversaries, benefiting from the same technology advances as the sensing systems, can manipulate sensory sources and analyze data in subtle ways to extract sensitive knowledge, cause erroneous inferences, and subvert decisions. The consequences of these compromises will only amplify as our society builds increasingly complex human-cyber-physical systems with increased reliance on sensory information and real-time decision cycles. Drawing upon examples of this two-faceted relationship with sensors in applications such as mobile health and sustainable buildings, this talk will discuss the challenges inherent in designing a sensor information flow and processing architecture that is sensitive to the concerns of both producers and consumers. For the pervasive sensing infrastructure to be trusted by both, it must be robust to active adversaries who deceptively extract private information, manipulate beliefs, and subvert decisions. While completely solving these challenges would require a new science of resilient, secure, and trustworthy networked sensing and decision systems, combining the hitherto separate disciplines of distributed embedded systems, network science, control theory, security, behavioral science, and game theory, this talk will provide some initial ideas. These include an approach to enabling privacy-utility trade-offs that balance the tension between the risk of information sharing to the producer and the value of information sharing to the consumer, and methods to secure systems against physical manipulation of sensed information.