Biblio
The problems of applying random numbers to the information security of data, communication lines, computer units, and automated driving systems are considered. The possibilities for constructing quantum random number generators and existing solutions for obtaining sufficiently random sequences are analyzed. The authors present a method for creating quantum generators on the basis of semiconductor electronic components. An electron-quantum generator based on electron tunneling is experimentally demonstrated. It is shown to be able to create random sequences of a high security level that satisfy the known NIST statistical tests (P-value > 0.9). The generator can be used to form both private and public cryptographic keys in computer systems and other platforms, and it has great potential for the realization of random walks and probabilistic computing based on neural networks, as well as other IT problems.
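As a rough illustration of the kind of check such a generator must pass, here is a minimal sketch of the NIST SP 800-22 frequency (monobit) test, the simplest test in the suite. The function name and the sample input are ours; the formula follows the published test definition.

```python
import math
import random

def monobit_p_value(bits):
    """NIST SP 800-22 frequency (monobit) test p-value for a 0/1 sequence."""
    n = len(bits)
    s = sum(1 if b else -1 for b in bits)    # map 0/1 to -1/+1 and sum
    s_obs = abs(s) / math.sqrt(n)
    return math.erfc(s_obs / math.sqrt(2))   # p >= 0.01 => test passed

# Illustrative run on Python's PRNG output, not on the paper's generator.
sample = [random.getrandbits(1) for _ in range(100000)]
print(monobit_p_value(sample))
```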
Despite the benefits offered by smart grids, energy producers, distributors and consumers are increasingly concerned about possible security and privacy threats. These threats typically manifest themselves at runtime as new usage scenarios arise and vulnerabilities are discovered. Adaptive security and privacy promise to address these threats by increasing awareness and automating prevention, detection and recovery from security and privacy requirements' failures at runtime by re-configuring system controls and perhaps even changing requirements. This paper discusses the need for adaptive security and privacy in smart grids by presenting some motivating scenarios. We then outline some research issues that arise in engineering adaptive security. We particularly scrutinize published reports by NIST on smart grid security and privacy as the basis for our discussions.
Part of our team proposed a new steganalytic method based on NIST tests at MMM-ACNS 2017 [1] and was encouraged to investigate cipher modifications that could prevent this type of steganalysis. In the current paper, we propose one cipher modification based on decompression by arithmetic source-compression coding. The experiments show that the proposed method protects stegosystems against steganalysis based on NIST tests while preserving the security of the encrypted embedded messages. Protection of contemporary image steganography based on edge detection and modified LSB against NIST-test steganalysis is also presented.
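A hedged sketch of the underlying idea (our illustration, not the paper's algorithm): feeding uniform ciphertext bits into an arithmetic *decoder* for a biased binary source yields output whose bit statistics match the source model rather than looking uniformly random. The source model and parameters below are assumptions for illustration.

```python
from fractions import Fraction
import random

def arithmetic_decode(cipher_bits, p_one, n_out):
    """Decode uniform bits as if they had compressed a Bernoulli(p_one) source."""
    x = Fraction(0)
    for i, b in enumerate(cipher_bits):        # interpret cipher bits as a
        x += Fraction(b, 2 ** (i + 1))         # binary fraction in [0, 1)
    lo, hi, out = Fraction(0), Fraction(1), []
    for _ in range(n_out):
        mid = lo + (hi - lo) * (1 - p_one)     # split interval: P(0) = 1 - p
        if x < mid:
            out.append(0); hi = mid
        else:
            out.append(1); lo = mid
    return out

bits = [random.getrandbits(1) for _ in range(64)]
decoded = arithmetic_decode(bits, Fraction(1, 4), 40)
print(sum(decoded) / len(decoded))             # roughly 0.25 ones on average
```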
Generally, the methods of authentication and identification used to assert users' credentials directly affect the security of offered services. In a federated environment, service owners must trust external credentials and make access control decisions based on Assurance Information received from remote Identity Providers (IdPs). Communities (e.g., NIST and the IETF) have tried to provide a coherent and justifiable architecture for evaluating Assurance Information and defining Assurance Levels (AL). Expensive deployment, service owners' limited authority to define their own requirements, and the lack of compatibility between heterogeneous existing standards are some of the unsolved concerns that hinder developers from openly accepting published works. By assessing the advantages and disadvantages of well-known models, a comprehensive, flexible, and compatible solution is proposed for evaluating and deploying assurance levels through a central entity called a Proxy.
Several operational and economic factors impact the patching decisions of critical infrastructures. The constraints imposed by such factors could prevent organizations from fully remedying all of the vulnerabilities that expose their (critical) assets to risk. Therefore, an involved decision maker (e.g. security officer) has to strategically decide on the allocation of possible remediation efforts towards minimizing the inherent security risk. This, however, involves the use of comparative judgments to prioritize risks and remediation actions. Throughout this work, the security risk is quantified using the security metric Time-To-Compromise (TTC). Our main contribution is to provide a generic TTC estimator to comparatively assess the security posture of computer networks taking into account interdependencies between the network components, different adversary skill levels, and characteristics of (known and zero-day) vulnerabilities. The presented estimator relies on a stochastic TTC model and Monte Carlo simulation (MCS) techniques to account for the input data variability and inherent prediction uncertainties.
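A minimal Monte Carlo sketch of such an estimate follows. The exponential exploit-time model, the difficulty values, and the function names are our assumptions for illustration; the paper's stochastic TTC model is more elaborate.

```python
import random

def sample_ttc(vuln_difficulties, skill):
    """One trial: time until the attacker exploits the easiest vulnerability."""
    # Model assumption: exploit time is exponential with mean difficulty/skill.
    return min(random.expovariate(skill / d) for d in vuln_difficulties)

def estimate_ttc(vuln_difficulties, skill, trials=100000):
    """Monte Carlo average of the sampled Time-To-Compromise."""
    return sum(sample_ttc(vuln_difficulties, skill) for _ in range(trials)) / trials

print(estimate_ttc([10.0, 25.0, 40.0], skill=1.0))   # novice adversary
print(estimate_ttc([10.0, 25.0, 40.0], skill=4.0))   # skilled adversary: faster
```

Running many trials rather than plugging means into a formula is what lets the estimator absorb input-data variability, as the abstract describes.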
With the advancement of technology, existing electric grids are shifting towards smart grids. Smart grids are meant to be effective in power management, secure and safe in communication, and, importantly, favourable to the environment. The smart grid has a huge architecture that includes various stakeholders, who encounter authorisation and authentication challenges. The smart grid must also deal with another important issue: securing its communication against a variety of cyber-attacks. In this paper, we first discuss the challenges in smart grid data communication, then survey the existing cryptographic algorithms and present a comparison of working cryptographic algorithms on certain factors. This work provides insight into improving working schemes for data security and privacy preservation for the customer, who is one of the stakeholders. Finally, based on the comparative work, we suggest directions for future work on improving working algorithms for secure and safe data communication in a smart grid.
When implemented on real systems, cryptographic algorithms are vulnerable to attacks that observe their execution behavior, such as cache-timing attacks. Protected implementations must be designed with knowledge and validation tools as early as possible in the development cycle. In this article we propose a methodology to assess the robustness of the candidates in the NIST post-quantum standardization project against cache-timing attacks. To this end we have developed a dedicated vulnerability research tool. It performs a static analysis with taint propagation of sensitive variables across the source code and detects leakage patterns. We use it to assess the security of the NIST post-quantum cryptography project submissions. Our results show that more than 80% of the analyzed implementations have at least one potential flaw, and three submissions total more than 1000 reported flaws each. Finally, this comprehensive study of the candidates' security allows us to identify the most frequent weaknesses among candidates and how they might be fixed.
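To make the technique concrete, here is a toy sketch of taint propagation over a simplified statement list, flagging the two classic cache-timing leakage patterns (secret-dependent branches and secret-indexed memory accesses). This is our illustration of the general method, not the authors' tool; the statement encoding is invented.

```python
def analyze(statements, secrets):
    """Propagate taint from secret variables and report leakage patterns."""
    tainted, flaws = set(secrets), []
    for line, stmt in enumerate(statements, 1):
        kind = stmt[0]
        if kind == "assign":                    # ("assign", dst, src_vars)
            _, dst, srcs = stmt
            if tainted & set(srcs):
                tainted.add(dst)                # taint flows to destination
        elif kind == "branch":                  # ("branch", cond_vars)
            if tainted & set(stmt[1]):
                flaws.append((line, "secret-dependent branch"))
        elif kind == "load":                    # ("load", dst, index_vars)
            _, dst, idx = stmt
            if tainted & set(idx):              # e.g. S-box lookup keyed on
                flaws.append((line, "secret-dependent memory access"))
                tainted.add(dst)                # the loaded value is tainted
    return flaws

prog = [("assign", "k2", ["key"]),
        ("load", "t", ["k2"]),                  # table lookup indexed by key
        ("branch", ["t"])]
print(analyze(prog, secrets={"key"}))
```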
In this work, the NTRU-like cryptosystem NTRU Prime IIT Ukraine, created on the basis of existing cryptographic transformations of the end-to-end encryption type, is considered. A description of this cryptosystem is given and its analysis is carried out. The features of its implementation, a comparison of its main characteristics and indicators, and its differences from existing NTRU-like cryptographic algorithms are also presented. Conclusions are drawn and recommendations are given.
Cyber-attacks and intrusions in cyber-physical control systems are currently difficult to reliably prevent. Knowing a system's vulnerabilities and implementing static mitigations is not enough, since threats are advancing faster than the pace at which static cyber solutions can counteract. Accordingly, the practice of cybersecurity needs to ensure that intrusion and compromise do not result in system or environment damage or loss. In a previous paper [2], we described the Cyberspace Security Econometrics System (CSES), which is a stakeholder-aware and economics-based risk assessment method for cybersecurity. CSES allows an analyst to assess a system in terms of the estimated loss resulting from security breakdowns. In this paper, we describe two new related contributions: 1) we map the CSES method to the evaluation and mitigation steps described by the NIST Guide to Industrial Control Systems (ICS) Security, Special Publication 800-82r2, thereby presenting an economics-based and stakeholder-aware risk evaluation method for implementing the NIST SP 800-82 guide; and 2) we describe the application of this tailored method through a fictitious example of a critical infrastructure system of an electric and gas utility.
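A minimal sketch of a stakeholder-aware expected-loss computation in the spirit of CSES is shown below. The matrix structure, names, and numbers are our simplified assumptions, not the method's actual tables.

```python
def expected_losses(stakes, impact, threat_prob):
    """stakes[s][r]: cost to stakeholder s if requirement r fails;
       impact[r][t]: P(requirement r fails | threat t materializes);
       threat_prob[t]: P(threat t materializes over the review period)."""
    return {s: sum(cost * impact[r][t] * p
                   for r, cost in reqs.items()
                   for t, p in threat_prob.items())
            for s, reqs in stakes.items()}

# Hypothetical single-stakeholder example for an electric/gas utility.
stakes = {"utility": {"integrity": 500000.0, "availability": 200000.0}}
impact = {"integrity":    {"intrusion": 0.3},
          "availability": {"intrusion": 0.6}}
threat_prob = {"intrusion": 0.05}
print(expected_losses(stakes, impact, threat_prob))   # {'utility': 13500.0}
```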
This paper presents a true random number generator that exploits the subthreshold properties of the jitter of events propagating in a self-timed ring and of events propagating in an inverter-based ring oscillator. The design was implemented in a 180 nm CMOS flash process. The devices provide high-quality random bit sequences that pass the FIPS 140-2 and NIST SP 800-22 statistical tests, which guarantees uniform distribution and unpredictability thanks to the physics-based entropy source.
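For context, the FIPS 140-2 monobit test the devices must pass is simple enough to sketch in a few lines; the pass thresholds below are the ones given in the standard, while the sample input is ours.

```python
import random

def fips_monobit(bits):
    """FIPS 140-2 monobit test: a 20,000-bit block passes iff the count of
    ones lies strictly between 9,725 and 10,275."""
    assert len(bits) == 20000
    ones = sum(bits)
    return 9725 < ones < 10275

# Illustrative run on software-generated bits, not the hardware TRNG.
block = [random.getrandbits(1) for _ in range(20000)]
print(fips_monobit(block))
```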
In this paper, machine learning attacks are performed on a novel hybrid delay-based Arbiter Ring Oscillator PUF (AROPUF). The AROPUF exhibits improved results compared to the traditional Arbiter Physical Unclonable Function (APUF). The challenge-response pairs (CRPs) from both PUFs are fed to a multilayer perceptron (MLP) model with one hidden layer. The results show that the CRPs generated by the proposed AROPUF yield higher training and prediction errors than those of the APUF, making it more difficult for an adversary to predict the CRPs.
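The sketch below shows why a plain APUF is vulnerable to such attacks: under the standard linear additive-delay model, an MLP learns the responses almost perfectly. The simulation, parameters, and the use of scikit-learn are our assumptions for illustration; the paper's hybrid AROPUF is precisely designed to resist this kind of modeling.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
n_stages, n_crps = 64, 20000
w = rng.normal(size=n_stages + 1)                 # secret per-stage delays

def features(challenges):
    # Standard parity transform: phi_i = prod_{j>=i} (1 - 2*c_j), plus bias.
    phi = np.cumprod(1 - 2 * challenges[:, ::-1], axis=1)[:, ::-1]
    return np.hstack([phi, np.ones((len(challenges), 1))])

C = rng.integers(0, 2, size=(n_crps, n_stages))   # random challenges
X = features(C)
y = (X @ w > 0).astype(int)                       # simulated APUF responses

clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=300)
clf.fit(X[:15000], y[:15000])
print("prediction accuracy:", clf.score(X[15000:], y[15000:]))
```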
The paper considers the general structure of a pseudo-random binary sequence generator based on the numerical solution of chaotic differential equations. The proposed generator architecture divides the generation process into two stages: numerical simulation of the chaotic system and conversion of the resulting sequence to binary form. A new method for calculating the normalization factor is applied to convert the state-variable values to the binary sequence. The numerical solution of the chaotic ODEs is implemented using a semi-implicit symmetric composition D-method. The experimental study considers the Thomas and Rössler attractors as test chaotic systems. The properties of the generators' output sequences are verified using correlation analysis methods and the NIST statistical test suite. It is shown that the output sequences of the investigated generators have statistical and correlation characteristics specific to random sequences. The obtained results can be used in cryptography applications as well as in the design of secure communication systems.
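A sketch of the two-stage scheme follows. We substitute classical RK4 for the paper's semi-implicit symmetric D-method and a fixed sign threshold for its computed normalization factor, so this illustrates only the architecture, not the paper's exact pipeline.

```python
def rossler(state, a=0.2, b=0.2, c=5.7):
    """Rössler system right-hand side."""
    x, y, z = state
    return (-y - z, x + a * y, b + z * (x - c))

def rk4_step(f, s, h):
    """One classical Runge-Kutta 4 step (stand-in for the D-method)."""
    k1 = f(s)
    k2 = f(tuple(si + h / 2 * ki for si, ki in zip(s, k1)))
    k3 = f(tuple(si + h / 2 * ki for si, ki in zip(s, k2)))
    k4 = f(tuple(si + h * ki for si, ki in zip(s, k3)))
    return tuple(si + h / 6 * (a + 2 * b + 2 * c + d)
                 for si, a, b, c, d in zip(s, k1, k2, k3, k4))

state, bits = (0.1, 0.0, 0.0), []
for i in range(120000):
    state = rk4_step(rossler, state, 0.01)        # stage 1: simulation
    if i > 20000:                                 # skip the transient
        bits.append(1 if state[0] > 0 else 0)     # stage 2: binarization
print(bits[:32])
```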
The extremely rapid development of the Internet of Things brings growing attention to the information security issue. The realization of cryptographically strong pseudo-random number generators (PRNGs) is crucial for securing sensitive data; they play an important role in cryptography and in network security applications. In this paper, we carry out a comparative study of two pseudo-chaotic number generators (PCNGs). The first pseudo-chaotic number generator (PCNG1) is based on two nonlinear recursive filters of order one using a skew tent map (STmap) and a piece-wise linear chaotic map (PWLCmap) as nonlinear functions. The second pseudo-chaotic number generator (PCNG2) consists of four coupled chaotic maps, namely PWLC maps, an STmap, and a logistic map, coupled by means of a binary diffusion matrix [D]. A comparative analysis of the performance, in terms of computation time (generation time, bit rate, and number of cycles needed to generate one byte) and security, of the two PCNGs is carried out.
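As a minimal sketch of one building block, here is a skew tent map iterated to emit bytes. The parameters, iteration count, and byte extraction are our illustrative choices; the paper's PCNGs couple such maps with recursive filters and a diffusion matrix.

```python
def skew_tent(x, p):
    """Skew tent map on (0, 1) with break point p."""
    return x / p if x < p else (1 - x) / (1 - p)

def pcng_bytes(x0=0.37, p=0.61, n=16):
    """Emit n bytes, iterating the map several times per output sample."""
    x, out = x0, []
    for _ in range(n):
        for _ in range(8):
            x = skew_tent(x, p)
        out.append(int(x * 256) & 0xFF)   # quantize state to one byte
    return out

print(pcng_bytes())
```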
The smart grid is an evolving new power system framework with a massively layered structure of ICT-driven power equipment. New-generation sensors, smart meters, and electronic devices are integral components of the smart grid. However, the upcoming deployment of smart devices at different layers, followed by their integration with communication networks, may introduce cyber threats. The interdependent subsystems functioning in the smart grid, if affected by a cyber-attack, may be vulnerable, and efficiency and reliability may be greatly reduced if any one device fails to respond in a real-time frame. The cyber security vulnerabilities become even more evident given the existing superannuated cyber infrastructure. This paper presents a critical review of expected cyber security threats in this complex environment and addresses the grave concern of a secure cyber infrastructure and related developments. An extensive review of the cyber security objectives and requirements, along with the risk evaluation process, has been undertaken. The paper analyses the confidentiality and privacy issues of all components of the smart power system. A critical evaluation of upcoming challenges, with innovative research concerns, is highlighted to achieve a roadmap towards an immune smart grid infrastructure. This will further facilitate R&D and associated developments.
The American National Standards Institute (ANSI) has standardized an access control approach, Next Generation Access Control (NGAC), that enables the simultaneous instantiation of multiple access control policies. For large, complex enterprises this is critical to limiting the authorized access of insiders. However, the specifications describe the required access control capabilities but not the related algorithms. While appropriate, this leaves open the important question of whether or not NGAC is scalable. Existing reference implementations, which run in cubic time, indicate that it is not. For example, the primary NGAC reference implementation took several minutes simply to display the set of files accessible to a user on a moderately sized system. To solve this problem we provide an efficient access control decision algorithm, reducing the overall complexity from cubic to linear. Our other major contribution is a novel mechanism for administrators and users to review allowed access rights. We provide an interface that appears to be a simple file directory hierarchy but is in reality an automatically generated structure abstracted from the underlying access control graph, and it works with any set of simultaneously instantiated access control policies. Our work thus provides the first efficient implementation of NGAC while enabling user privilege review through a novel visualization approach. These capabilities help limit insider access to information (and thereby limit information leakage) by enabling the efficient simultaneous instantiation of multiple access control policies.
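The intuition behind a linear-time decision can be sketched as graph reachability: each node and assignment edge is visited at most once. The toy model below (our illustration; real NGAC graphs also carry prohibitions, obligations, and policy classes) grants access when the user's and object's reachable attribute sets meet an association that carries the requested operation.

```python
from collections import deque

def reachable(graph, start):
    """BFS over assignment edges: linear in the size of the policy graph."""
    seen, queue = {start}, deque([start])
    while queue:
        for nxt in graph.get(queue.popleft(), ()):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

# Hypothetical policy graph: user/object assignments and one association.
assignments = {"alice": ["staff"], "staff": ["employees"],
               "report.doc": ["project-files"]}
associations = {("employees", "project-files"): {"read"}}

def decide(user, op, obj):
    ua, oa = reachable(assignments, user), reachable(assignments, obj)
    return any(op in ops for (u, o), ops in associations.items()
               if u in ua and o in oa)

print(decide("alice", "read", "report.doc"))   # True
```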
Information security management is time-consuming and error-prone. Apart from day-to-day operations, organizations need to comply with industrial regulations or government directives. Thus, organizations are looking for security tools that automate security management tasks and daily operations. The Security Content Automation Protocol (SCAP) is a suite of specifications that help to automate security management tasks such as vulnerability measurement and policy compliance evaluation. A SCAP benchmark provides detailed guidance on setting the security configuration of network devices, operating systems, and applications, and organizations can use it to perform automated configuration compliance assessment on those systems. This paper discusses SCAP benchmark components and the development of a SCAP benchmark for automating Cisco router security configuration compliance.
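To show the flavor of an automated compliance check, here is a toy sketch in the spirit of a benchmark rule. Real SCAP benchmarks are expressed in XCCDF/OVAL rather than ad hoc code, and the two required configuration lines below are merely example hardening settings, not rules from the paper.

```python
# Hypothetical hardening requirements for a Cisco IOS running configuration.
REQUIRED_LINES = ["service password-encryption", "no ip http server"]

def check_compliance(config_text):
    """Report pass/fail for each required configuration line."""
    lines = {line.strip() for line in config_text.splitlines()}
    return {rule: rule in lines for rule in REQUIRED_LINES}

running_config = """
service password-encryption
ip http server
"""
for rule, ok in check_compliance(running_config).items():
    print(("PASS" if ok else "FAIL"), rule)
```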
EPC Gen2 tags work as an international RFID standard for supply chain use worldwide. Such tags are computationally weak devices, unable to perform even basic symmetric-key cryptographic operations. For this reason, implementing robust and secure pseudo-random number generators (PRNGs) on low-cost radio-frequency identification (RFID) tags is a challenging issue. In this paper, we study the security of LFSR-based PRNGs implemented on EPC Gen2 tags and exploit LFSR-based PRNGs to provide better constructions. We present a cryptanalysis, using a distinguishing attack, of J3Gen, an LFSR-based PRNG proposed by Sugei et al. [1], [2] for EPC Gen2 tags, and make observations on it using the NIST randomness tests. We also test the PRNG in EPC Gen2 RFID tags using NIST SP 800-22. As a countermeasure, we propose two modified models based on the security analysis results. We show that our constructions perform better than J3Gen in terms of computational and statistical properties.
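A minimal LFSR sketch illustrates why such PRNGs invite distinguishing attacks: every output bit is a linear (XOR) function of the state, so outputs satisfy the feedback recurrence exactly. The tap positions below are chosen only for illustration; J3Gen specifically rotates among several feedback polynomials to blunt this.

```python
def lfsr(seed, taps, n):
    """Fibonacci LFSR over GF(2); taps are 0-based state positions."""
    state, out = list(seed), []
    for _ in range(n):
        out.append(state[-1])          # emit the last state bit
        fb = 0
        for t in taps:                 # feedback = XOR of tapped bits
            fb ^= state[t]
        state = [fb] + state[:-1]      # shift, inserting feedback at front
    return out

print(lfsr([1] + [0] * 15, taps=(0, 2, 3, 5), n=32))
```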
The strength of the security and privacy of any cryptographic mechanism that uses random numbers requires that the random numbers generated have two important properties: (1) uniform distribution and (2) independence. With the growth of the Internet, many devices hosting sensors are connected to it. One proposed idea is to use sensor data as the seed for a random number generator (RNG), since sensors measure physical phenomena that exhibit randomness over time. The random numbers generated from sensor data can be used by cryptographic algorithms in Internet activities. However, sensor data also pose a weakness: sensors may be under adversarial control, which may lead to a predictable random sequence and break security and privacy. This paper proposes a wash-rinse-spin approach to processing the raw sensor data that increases the randomness in the seed value. The sequences generated from two sensors are combined by a decimation method to improve unpredictability. This makes sensor-based random number generation more secure, preventing attackers from learning the random sequence through adversarial control.
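One common way to combine two bit sequences by decimation, in the style of a shrinking generator, is sketched below. This is our illustration of the general idea; the paper's wash-rinse-spin preprocessing happens before this step, and the sample bits are invented.

```python
def decimate(selector_bits, data_bits):
    """Keep a data bit only when the paired selector bit is 1."""
    return [d for s, d in zip(selector_bits, data_bits) if s == 1]

sensor_a = [1, 0, 1, 1, 0, 1, 0, 1]    # e.g., bits derived from sensor 1
sensor_b = [0, 1, 1, 0, 1, 0, 0, 1]    # e.g., bits derived from sensor 2
print(decimate(sensor_a, sensor_b))    # [0, 1, 0, 0, 1]
```

Because an attacker who controls only one sensor cannot predict which positions survive, the combined output is harder to steer than either stream alone.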
The need for cyber security professionals continues to grow, and education systems are responding in a variety of ways. The US government has weighed in with two efforts: NICE, led by NIST, and the CAE effort, jointly led by the NSA and DHS. Industry has unfilled needs, and the CAE program is changing to meet both NICE and industry needs. This paper analyzes these efforts and examines several critical, yet unaddressed, issues facing school programs as they adapt to new criteria and guidelines. Technical issues are easy to enumerate, yet it is the programmatic and student success factors that will define successful programs.
- App vetting is the process of approving or rejecting an app prior to deployment on a mobile device.
- The decision to approve or reject an app is based on the organization's security requirements and the type and severity of security vulnerabilities found in the app.
- Security vulnerabilities, including cross-site scripting (XSS), information leakage, authentication and authorization flaws, session management flaws, and SQL injection, can be exploited to steal information or control a device.
In recent years, Attribute Based Access Control (ABAC) has evolved as the preferred logical access control methodology in the Department of Defense and Intelligence Community, as well as many other agencies across the federal government. Gartner recently predicted that "by 2020, 70% of enterprises will use attribute-based access control (ABAC) as the dominant mechanism to protect critical assets, up from less than 5% today." A definition of and introduction to ABAC can be found in NIST Special Publication 800-162, Guide to Attribute Based Access Control (ABAC) Definition and Considerations, and Intelligence Community Policy Guidance (ICPG) 500.2, Attribute-Based Authorization and Access Management. Within ABAC, attributes are used to make critical access control decisions, yet standards for attribute assurance have only just started to be researched and documented. This presentation outlines the factors influencing attributes that an authoritative body must address when standardizing attribute assurance, and it proposes some notional implementation suggestions for consideration.

Attribute assurance brings a level of confidence to attributes that is similar to levels of assurance for authentication (e.g., the guidelines specified in NIST SP 800-63 and OMB M-04-04). There are three principal areas of interest when considering factors related to attribute assurance. Accuracy establishes the policy and technical underpinnings for semantically and syntactically correct descriptions of subjects, objects, or environmental conditions. Interoperability considers the different standards and protocols used for the secure sharing of attributes between systems, in order to avoid compromising the integrity and confidentiality of the attributes or exposing vulnerabilities in provider or relying systems or entities. Availability ensures that the update and retrieval of attributes satisfy the application to which the ABAC system is applied; in addition, the security and backup capability of attribute repositories need to be considered.

Similar to a Level of Assurance (LOA), a Level of Attribute Assurance (LOAA) assures a relying party that the attribute value received from an Attribute Provider (AP) is accurately associated with the subject, resource, or environmental condition to which it applies. An Attribute Provider (AP) is any person or system that provides subject, object (or resource), or environmental attributes to relying parties, regardless of transmission method. The AP may be the original, authoritative source (e.g., an Applicant). The AP may also receive information from an authoritative source for repackaging or store-and-forward (e.g., an employee database) to relying parties, or it may derive the attributes from formulas (e.g., a credit score). Regardless of the source of the AP's attributes, the same standards should apply to determining the LOAA.

As ABAC is implemented throughout government, attribute assurance will be a critical, limiting factor in its acceptance. With this presentation, we hope to encourage dialog between attribute relying parties, attribute providers, and the federal agencies that will be defining standards for ABAC in the immediate future.
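For readers new to the model, here is a minimal ABAC decision sketch phrased in NIST SP 800-162 terms (subject, object/resource, and environmental attributes evaluated against policy rules). The attribute names and the single rule are hypothetical.

```python
def abac_decide(subject, resource, environment, rules):
    """Grant access if any policy rule evaluates to True over the attributes."""
    return any(rule(subject, resource, environment) for rule in rules)

rules = [
    # Allow if clearance dominates classification and access is on-network.
    lambda s, r, e: s["clearance"] >= r["classification"]
                    and e["network"] == "internal",
]
subject = {"clearance": 3, "role": "analyst"}
resource = {"classification": 2}
environment = {"network": "internal"}
print(abac_decide(subject, resource, environment, rules))   # True
```

The attribute assurance question raised above is exactly about how much the decision function can trust the values in `subject`, `resource`, and `environment`.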