Biblio
Applications of true random number generators (TRNGs) span from art to numerical computing and system security. In cryptographic applications, TRNGs are used for generating new keys, nonces and masks. For this reason, a TRNG is an essential building block and often a point of failure for embedded security systems. One type of primitive widely used as a source of randomness is the ring oscillator. For a ring-oscillator-based TRNG, the true randomness originates from its timing jitter. Therefore, determining the jitter strength is essential to estimate the quality of a TRNG. In this paper, we propose a method to measure the jitter strength of a ring oscillator implemented on an FPGA. A fast tapped delay chain is utilized to perform the on-chip measurement with high resolution. The proposed method is implemented on both a Xilinx FPGA and an Intel FPGA. Fast carry logic components on the different FPGAs are used to implement the fast delay line. This carry logic component is designed to be fast and has dedicated routing, which enables a precise measurement. The differential structure of the delay chain is used to thwart the influence of undesirable noise on the measurement. The proposed methodology can be applied to other FPGA families and to ASIC designs.
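As a minimal sketch of the post-processing step, the jitter strength can be taken as the standard deviation of the measured oscillator period. The tap counts below are synthetic and the per-tap delay is an assumed value that would have to be calibrated for the device; this is not the paper's tool flow.

    # Illustrative post-processing of tapped-delay-chain samples (not the paper's tool
    # flow). Each sample is the number of carry-chain taps traversed in one oscillator
    # period; in practice these come from the on-chip measurement, here they are synthetic.
    import numpy as np

    TAP_DELAY_PS = 15.0      # assumed per-tap delay; must be calibrated for the device
    tap_counts = np.random.default_rng(0).normal(200, 1.5, size=100_000).round()

    periods_ps = tap_counts * TAP_DELAY_PS     # convert tap counts to time
    jitter_ps = periods_ps.std(ddof=1)         # jitter strength = std. dev. of the period
    print(f"mean period {periods_ps.mean():.1f} ps, jitter {jitter_ps:.2f} ps")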
Multifactor authentication presents a robust security method, but typically requires multiple steps on the part of the user, resulting in a high cost to usability and limiting adoption. Furthermore, a truly usable system must be unobtrusive and inconspicuous. Here, we present a system that provides all three factors of authentication (knowledge, possession, and inherence) in a single step in the form of an earpiece which implements brain-based authentication via custom-fit, in-ear electroencephalography (EEG). We demonstrate its potential by collecting EEG data using manufactured custom-fit earpieces with embedded electrodes. Across 7 participants, we are able to achieve perfect performance, a mean 0% false acceptance rate (FAR) and 0% false rejection rate (FRR), using participants' best performing tasks collected in one session by one earpiece with three electrodes. Our results indicate that a single earpiece with embedded electrodes could provide a discreet, convenient, and robust method for secure one-step, three-factor authentication.
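For readers unfamiliar with the reported metrics, the sketch below shows how FAR and FRR are typically computed from match scores at a decision threshold; the score arrays and threshold are hypothetical, not the study's data.

    # Toy FAR/FRR computation from match scores (illustrative, not the authors' pipeline).
    import numpy as np

    genuine_scores = np.array([0.91, 0.88, 0.95, 0.90])   # hypothetical same-user scores
    impostor_scores = np.array([0.31, 0.42, 0.28, 0.55])  # hypothetical other-user scores
    threshold = 0.7                                        # accept if score >= threshold

    far = np.mean(impostor_scores >= threshold)  # impostors wrongly accepted
    frr = np.mean(genuine_scores < threshold)    # genuine users wrongly rejected
    print(f"FAR={far:.0%}, FRR={frr:.0%}")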
Emerging zero-day vulnerabilities in information and communications technology systems make cyber defense very challenging. In particular, the defender faces uncertainties about, e.g., system states and the locations and impacts of vulnerabilities. In this paper, we study the defense problem on a computer network that is modeled as a partially observable Markov decision process on a Bayesian attack graph. We propose online algorithms which allow the defender to identify effective defense policies when utility functions are unknown a priori. The algorithms' performance is verified via numerical simulations based on real-world attacks.
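The abstract does not specify the online algorithm; as a simplified illustration of learning defense policies when utilities are unknown a priori, the sketch below uses an epsilon-greedy bandit over a few candidate defense actions. The action names and reward function are assumptions, and this is not the paper's POMDP method.

    # Illustrative epsilon-greedy online selection of defense actions when utilities are
    # unknown a priori; a simplification, not the paper's POMDP algorithm.
    import random

    actions = ["patch_host", "isolate_subnet", "reset_credentials"]
    value = {a: 0.0 for a in actions}   # running estimate of each action's utility
    count = {a: 0 for a in actions}
    EPSILON = 0.1

    def observe_reward(action):
        # Placeholder for feedback from the monitored network (hypothetical).
        return random.random()

    for step in range(1000):
        if random.random() < EPSILON:
            a = random.choice(actions)                  # explore
        else:
            a = max(actions, key=lambda x: value[x])    # exploit best estimate
        r = observe_reward(a)
        count[a] += 1
        value[a] += (r - value[a]) / count[a]           # incremental mean update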
Although connecting a microgrid to modern power systems can alleviate issues arising from a large penetration of distributed generation, it can also cause severe voltage instability problems. This paper presents an online method to analyze voltage security in a microgrid using convolutional neural networks. To transform the traditional voltage stability problem into a classification problem, three steps are considered: 1) creating data sets using offline simulation results; 2) training the model with dimensionality reduction and convolutional neural networks; 3) testing on the online data set and evaluating performance. A case study on the modified IEEE 14-bus system shows that the accuracy of the proposed analysis method is 6% higher than that of a back-propagation neural network, and that it performs better than decision tree and support vector machine classifiers. The proposed algorithm has great potential for future applications.
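A compact sketch of the three-step pipeline is shown below, with PCA standing in for the dimensionality reduction and a small 1D convolutional network as the classifier. The data, shapes, and hyperparameters are placeholders, not those used in the paper.

    # Minimal sketch of the three-step pipeline (offline data -> reduce -> CNN classify);
    # shapes and hyperparameters are assumptions, not those of the paper.
    import torch
    import torch.nn as nn
    from sklearn.decomposition import PCA

    # 1) offline simulation results: 1000 snapshots x 50 raw features, binary labels
    X_raw = torch.randn(1000, 50).numpy()
    y = torch.randint(0, 2, (1000,))

    # 2) dimensionality reduction, then a small 1D CNN
    X = torch.tensor(PCA(n_components=16).fit_transform(X_raw), dtype=torch.float32)
    X = X.unsqueeze(1)                      # (N, channels=1, length=16)

    model = nn.Sequential(
        nn.Conv1d(1, 8, kernel_size=3, padding=1), nn.ReLU(),
        nn.Flatten(), nn.Linear(8 * 16, 2))
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.CrossEntropyLoss()
    for epoch in range(20):
        opt.zero_grad()
        loss = loss_fn(model(X), y)
        loss.backward()
        opt.step()

    # 3) evaluate on an online data set (here reusing X for brevity)
    accuracy = (model(X).argmax(dim=1) == y).float().mean()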
Information-Centric Networking (ICN) is one of the most promising network architectures for handling the rapid increase of data traffic because it allows in-network caching. ICNs with Linear Network Coding (LNC) can greatly improve the performance of content caching and delivery. In this paper, we propose a Secure Content Caching and Routing (SCCR) framework based on Software Defined Networking (SDN) to find the optimal cache management and routing for secure content delivery, which aims firstly to minimize the total cost of cache and bandwidth consumption and then to minimize the usage of random chunks to guarantee information theoretical security (ITS). Specifically, we first propose the SCCR problem and introduce the main ideas of the SCCR framework. Next, we formulate the SCCR problem as two Linear Programming (LP) formulations and design the SCCR algorithms based on them to optimally solve the SCCR problem. Finally, extensive simulations are conducted to evaluate the proposed SCCR framework and algorithms.
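To illustrate the kind of LP involved, the toy program below minimizes a combined cache-plus-bandwidth cost for a single content item with scipy. The coefficients, variables, and capacity bounds are invented for illustration; the paper's two-stage SCCR formulation is richer and additionally enforces the ITS constraints.

    # Toy LP in the spirit of "minimize cache + bandwidth cost" (illustrative only; the
    # paper's two-stage SCCR formulation is richer and adds ITS constraints).
    from scipy.optimize import linprog

    # variables: x = [cached_fraction_node1, cached_fraction_node2, remote_fraction]
    cost = [2.0, 3.0, 5.0]          # assumed per-unit cache/cache/bandwidth costs
    A_eq = [[1, 1, 1]]              # every request is served from cache or remotely
    b_eq = [1.0]
    bounds = [(0, 0.6), (0, 0.6), (0, 1)]   # assumed per-node cache capacity limits

    res = linprog(c=cost, A_eq=A_eq, b_eq=b_eq, bounds=bounds, method="highs")
    print(res.x, res.fun)           # optimal placement and total cost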
We introduce a novel mathematical model that treats network security as a game between cyber attackers and network administrators. The model takes the form of a zero-sum repeated game where each sub-game corresponds to a possible state of the attacker. Our formulation views state as the set of compromised edges in a graph, as opposed to the more traditional node-based view. This provides a more expressive model, since it allows the defender to anticipate the direction of attack. Both players move independently and in continuous time, allowing for the possibility of one player moving several times before the other does. This model shows that defense-in-depth is not always a rational strategy for budget-constrained network administrators. Furthermore, a defender can dissuade a rational attacker from attempting to attack a network if the defense budget is sufficiently high. This means that a network administrator does not need to make their system completely free of vulnerabilities; they only need to ensure that the penalties for being caught outweigh the potential rewards gained.
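For context, a single zero-sum sub-game can be solved for the defender's maximin mixed strategy with the textbook linear-programming construction sketched below. The 2x2 payoff matrix is purely hypothetical and the sketch ignores the repeated, continuous-time structure of the paper's model.

    # Solving a small zero-sum sub-game for the defender's maximin mixed strategy via LP
    # (a textbook construction; the payoff matrix below is purely hypothetical).
    import numpy as np
    from scipy.optimize import linprog

    A = np.array([[3.0, -1.0],     # defender action 0 vs attacker actions 0/1
                  [-2.0, 4.0]])    # defender action 1 vs attacker actions 0/1
    m, n = A.shape

    # variables z = (x_1..x_m, v): maximize v s.t. A^T x >= v, sum(x) = 1, x >= 0
    c = np.concatenate([np.zeros(m), [-1.0]])            # minimize -v
    A_ub = np.hstack([-A.T, np.ones((n, 1))])            # v - (A^T x)_j <= 0
    b_ub = np.zeros(n)
    A_eq = np.concatenate([np.ones(m), [0.0]]).reshape(1, -1)
    b_eq = [1.0]
    bounds = [(0, None)] * m + [(None, None)]

    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
    x, game_value = res.x[:m], res.x[m]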
Advances in communication technology permit a malware author to introduce code obfuscation techniques, for example Application Programming Interface (API) hooking, to make detecting the footprints of their code more difficult. Signature-based models such as antivirus software are not effective against such attacks. In this paper, an API graph-based model is proposed with the objective of detecting hook attacks during malicious code execution. The proposed model incorporates techniques such as graph generation, graph partitioning and graph comparison to distinguish legitimate system calls from malicious ones. The simulation results confirm that the proposed model outperforms existing approaches.
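The graph-comparison step can be pictured with the simplified sketch below: an observed API call graph is scored against a known-good baseline, here with edge-set Jaccard similarity. The graphs, the similarity measure, and the threshold are illustrative stand-ins; the paper's partitioning and comparison are more elaborate.

    # Simplified graph-comparison step: compare an observed API call graph against a
    # known-good baseline using edge-set Jaccard similarity (names are illustrative).
    import networkx as nx

    baseline = nx.DiGraph([("app", "CreateFileW"), ("app", "WriteFile")])
    observed = nx.DiGraph([("app", "hook_stub"), ("hook_stub", "CreateFileW"),
                           ("app", "WriteFile")])

    def jaccard_edges(g1, g2):
        e1, e2 = set(g1.edges()), set(g2.edges())
        return len(e1 & e2) / len(e1 | e2)

    similarity = jaccard_edges(baseline, observed)
    if similarity < 0.8:                       # assumed decision threshold
        print(f"possible API hook: similarity {similarity:.2f}")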
This article addresses enterprise information protection within the Internet of Things concept. The aim of the research is to develop a set of arrangements to ensure the secure operation of an enterprise IPv6 network. The object of the research is the enterprise IPv6 network. The subject of the research is modern switching equipment as a tool to ensure network protection. The research tasks are to prioritize the functioning of switches in production and corporate enterprise networks, to develop a network host protection algorithm, and to test the developed algorithm on the Cisco Packet Tracer 7 software emulator. The result of the research is a proposed approach to IPv6 network security based on an analysis of modern switch functionality: a developed and tested enterprise network host protection algorithm for the IPv6 protocol with automated control of the network's SLAAC configuration, and a set of arrangements for resisting attacks on the default enterprise gateway using ACL, VLAN, SEND, and RA Guard security technologies, which together allow a sufficiently high level of network security to be achieved.
Due to the increasing threat of network attacks, firewalls have become crucial elements in network security and have been widely deployed in most businesses and institutions for securing private networks. The function of a firewall is to examine each packet that passes through it and decide whether to let it pass or halt it based on preconfigured rules and policies, so the firewall is now the first line of defense against cyber attacks. However, most people do not know how a firewall works, and most users of the Windows operating system do not know how to use the embedded Windows firewall. This paper explains how firewalls work, the types of firewalls, and what you need to know about firewall policies, then presents a novel application (QudsWall), developed by the authors, that manages the embedded Windows firewall and makes it easy to use.
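The examine-and-decide loop described above can be reduced to the minimal rule-matching sketch below. The rules, packet fields, and deny-by-default policy are hypothetical and conceptual; this is not the Windows Firewall API that QudsWall wraps.

    # Minimal rule-matching loop of a packet filter (conceptual; not the Windows
    # Firewall API that QudsWall wraps).
    RULES = [
        {"proto": "tcp", "dst_port": 22,  "action": "block"},
        {"proto": "tcp", "dst_port": 443, "action": "allow"},
    ]
    DEFAULT_ACTION = "block"    # deny-by-default policy

    def decide(packet):
        for rule in RULES:      # first matching rule wins
            if (packet["proto"] == rule["proto"]
                    and packet["dst_port"] == rule["dst_port"]):
                return rule["action"]
        return DEFAULT_ACTION

    print(decide({"proto": "tcp", "dst_port": 443}))   # allow
    print(decide({"proto": "udp", "dst_port": 53}))    # block (default)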
Malicious applications have become increasingly numerous. This demands adaptive, learning-based techniques for constructing malware detection engines, instead of the traditional manual-based strategies. Prior work in learning-based malware detection engines primarily focuses on dynamic trace analysis and byte-level n-grams. Our approach in this paper differs in that we use compiler intermediate representations, i.e., the call graph representation of binaries. Using graph-based program representations for learning provides the structure of the program, which can be used to learn more advanced patterns. We use the Shortest Path Graph Kernel (SPGK) to identify similarities between call graphs extracted from binaries. The output similarity matrix is fed into a Support Vector Machine (SVM) algorithm to construct highly accurate models that predict whether a binary is malicious or not. However, SPGK is computationally expensive due to the size of the input graphs. Therefore, we evaluate different parallelization methods for CPUs and GPUs to speed up this kernel, allowing us to continuously construct up-to-date models in a timely manner. Our hybrid implementation, which leverages both CPU and GPU, yields the best performance, achieving up to a 14.2x improvement over our already optimized OpenMP version. We compared our generated graph-based models to previous state-of-the-art feature vector 2-gram and 3-gram models on a dataset consisting of over 22,000 binaries. We show that our classification accuracy using graphs is over 19% higher than either n-gram model and gives a false positive rate (FPR) of less than 0.1%. We are also able to consider large call graphs and dataset sizes because of the reduced execution time of our parallelized SPGK implementation.
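The "kernel matrix into SVM" step can be sketched as follows: a precomputed graph-kernel matrix is passed to scikit-learn's SVC with kernel="precomputed". The crude shortest-path-length histogram kernel and the tiny labeled graphs below are stand-ins for the optimized SPGK and the real call-graph dataset.

    # Feeding a precomputed graph-kernel matrix to an SVM (the kernel below is a crude
    # shortest-path-length histogram dot product, standing in for the optimized SPGK).
    import networkx as nx
    import numpy as np
    from collections import Counter
    from sklearn.svm import SVC

    def sp_histogram(g):
        lengths = dict(nx.all_pairs_shortest_path_length(g))
        return Counter(d for targets in lengths.values() for d in targets.values() if d)

    def kernel(g1, g2):
        h1, h2 = sp_histogram(g1), sp_histogram(g2)
        return sum(h1[k] * h2[k] for k in h1.keys() & h2.keys())

    graphs = [nx.path_graph(4), nx.cycle_graph(4), nx.star_graph(3), nx.path_graph(5)]
    labels = [0, 1, 0, 1]                                 # hypothetical benign/malicious
    K = np.array([[kernel(a, b) for b in graphs] for a in graphs], dtype=float)

    clf = SVC(kernel="precomputed").fit(K, labels)
    print(clf.predict(K))     # prediction on the training kernel, for illustration only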
There has been much recent research on parameter estimation for canonical dynamic systems. The extended Kalman filter (EKF) is a popular parameter estimation method by virtue of its ease of application. This paper focuses on parameter estimation for a class of canonical dynamic systems using the EKF. By constructing an associated differential equation, the convergence of EKF parameter estimation for canonical dynamic systems is analyzed. Simulations demonstrate the good performance of the method.
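A textbook illustration of EKF parameter estimation is sketched below: the unknown parameter a of the scalar system x_{k+1} = a*x_k + w, y_k = x_k + v is appended to the state and estimated jointly. This is a generic example of the technique, not the paper's canonical system class or its convergence analysis.

    # Textbook EKF with state augmentation to estimate the unknown parameter a of the
    # scalar system x_{k+1} = a*x_k + w, y_k = x_k + v (generic illustration only).
    import numpy as np

    rng = np.random.default_rng(0)
    a_true, Q, R = 0.8, 1e-4, 1e-2

    # augmented state z = [x, a]; start with a deliberately wrong parameter guess
    z = np.array([0.0, 0.3])
    P = np.eye(2)
    x = 1.0
    for k in range(200):
        x = a_true * x + rng.normal(0, np.sqrt(Q))          # true system
        y = x + rng.normal(0, np.sqrt(R))                   # noisy measurement

        # predict
        F = np.array([[z[1], z[0]], [0.0, 1.0]])            # Jacobian of [a*x, a]
        z = np.array([z[1] * z[0], z[1]])
        P = F @ P @ F.T + np.diag([Q, 1e-6])

        # update with H = [1, 0]
        H = np.array([[1.0, 0.0]])
        S = H @ P @ H.T + R
        K = P @ H.T / S
        z = z + (K * (y - z[0])).ravel()
        P = (np.eye(2) - K @ H) @ P

    print("estimated a:", z[1])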
Side-channels have been increasingly demonstrated as a practical threat to the confidentiality of private user information. Being able to statically detect these kinds of vulnerabilities is a key challenge in current computer security research. We introduce a new technique, path-cost analysis (PCA), for the detection of side-channels. Given a cost model for a type of side-channel, path-cost analysis assigns a symbolic cost expression to every node and every back edge of a method's control flow graph that gives an over-approximation of all possible observable values at that node or after traversing that cycle. Queries to a satisfiability solver on the maximum distance between specific pairs of nodes allow us to detect the presence of imbalanced paths through the control flow graph. When combined with taint analysis, we are able to answer the following question: does there exist a pair of paths in the method's control flow graph, differing only on branch conditions influenced by the secret, that differs in observable value by more than some given threshold? In fact, we are able to state specifically which sets of secret-sensitive conditional statements introduce a side-channel detectable given some noise parameter. We extend this approach to an interprocedural analysis, resulting in an over-approximation of the number of true side-channels in the program according to the given cost model. Greater precision can be obtained by combining our method with predicate abstraction or symbolic execution to eliminate a subset of the infeasible paths through the control flow graph. We propose evaluating our method on a set of sizeable Java server-client applications.
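A toy version of the imbalance query is shown below: two secret-dependent branches are given symbolic costs and an SMT solver is asked whether their difference can exceed a noise threshold. The costs and variable names are invented for illustration, the z3 package is assumed to be installed, and the actual analysis operates over whole control flow graphs and across procedures.

    # Toy version of the imbalance query: can two secret-dependent paths differ in
    # observable cost by more than a threshold? (z3 assumed installed.)
    from z3 import Int, Solver, If, sat

    n = Int("n")                     # public input, e.g. length of the processed data
    cost_true = 5 + 3 * n            # symbolic cost of the branch taken when secret is 1
    cost_false = 5                   # symbolic cost of the branch taken when secret is 0
    THRESHOLD = 10                   # noise floor below which the attacker sees nothing

    diff = cost_true - cost_false
    s = Solver()
    s.add(n >= 0, If(diff >= 0, diff, -diff) > THRESHOLD)
    if s.check() == sat:
        print("side-channel possible, e.g. n =", s.model()[n])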
Similar to criminals in the physical world, cyber-criminals use a variety of illegal and immoral means to achieve monetary gains. Recently, malware known as ransomware started to leverage strong cryptographic primitives to hold victims' computer files "hostage" until a ransom is paid. Victims, with no way to defend themselves, are often advised to simply pay. Existing defenses against ransomware rely on ad-hoc mitigations that target the incorrect use of cryptography rather than generic live protection. To fill this gap in the defender's arsenal, we describe the approach, prototype implementation, and evaluation of a novel, automated, and most importantly proactive defense mechanism against ransomware. Our prototype, called PayBreak, effectively combats ransomware, and keeps victims' files safe. PayBreak is based on the insight that secure file encryption relies on hybrid encryption where symmetric session keys are used on the victim computer. PayBreak observes the use of these keys, holds them in escrow, and thus, can decrypt files that would otherwise only be recoverable by paying the ransom. Our prototype leverages low overhead dynamic hooking techniques and asymmetric encryption to realize the key escrow mechanism which allows victims to restore the files encrypted by ransomware. We evaluated PayBreak for its effectiveness against twenty hugely successful families of real-world ransomware, and demonstrate that our system can restore all files that are encrypted by samples from twelve of these families, including the infamous CryptoLocker, and more recent threats such as Locky and SamSam. Finally, PayBreak performs its protection task at negligible performance overhead for common office workloads and is thus ideally suited as a proactive online protection system.
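The escrow idea can be shown in miniature: a symmetric session key observed at encryption time is wrapped with an escrow RSA public key, so files encrypted under it stay recoverable without paying. The hooking that would capture the key on a real system is omitted; the sketch only assumes the widely available Python 'cryptography' package.

    # The escrow idea in miniature: wrap an observed symmetric session key with an
    # escrow RSA public key so files stay recoverable (hooking omitted; requires the
    # 'cryptography' package).
    import os
    from cryptography.hazmat.primitives.asymmetric import rsa, padding
    from cryptography.hazmat.primitives import hashes

    escrow_private = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    escrow_public = escrow_private.public_key()

    session_key = os.urandom(32)            # key the ransomware would use for AES
    oaep = padding.OAEP(mgf=padding.MGF1(hashes.SHA256()), algorithm=hashes.SHA256(),
                        label=None)
    wrapped = escrow_public.encrypt(session_key, oaep)      # stored in the key vault

    recovered = escrow_private.decrypt(wrapped, oaep)       # later, during file recovery
    assert recovered == session_key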
Untrusted third-party vendors and manufacturers have raised security concerns in the hardware supply chain. Among existing solutions, formal verification methods provide powerful means of detecting malicious behaviors at the pre-silicon stage. However, little work has been done towards built-in hardware runtime verification at the post-silicon stage. In this paper, a runtime formal verification framework is proposed to evaluate the trust of hardware during its execution. This framework combines symbolic execution and SAT solving methods to validate user-defined properties. The proposed framework has been demonstrated on an FPGA platform using an SoC design with untrusted IPs. The experimental results show that the proposed approach can provide high-level security assurance for hardware at runtime.
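A small illustration of property checking with a solver is given below: the property "a write to a protected register implies the privilege bit is set" is checked against a toy symbolic model of the write logic, and a counterexample is reported if one exists. The design model and property are invented, z3 is assumed installed, and this is not the paper's framework.

    # Tiny illustration of property checking with an SMT solver: a write to a protected
    # register must imply that the privilege bit is set (toy property and toy model).
    from z3 import Bools, Solver, Implies, And, Not, sat

    priv, write_en, addr_is_protected = Bools("priv write_en addr_is_protected")

    # symbolic model of the (toy) design's write logic; it omits the privilege check,
    # so the solver should find a violation
    write_happens = And(write_en, addr_is_protected)

    # property: write_happens -> priv ; search for a violating assignment
    s = Solver()
    s.add(Not(Implies(write_happens, priv)))
    if s.check() == sat:
        print("property violated, counterexample:", s.model())
    else:
        print("property holds for this model")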
Peer-to-peer (P2P) botnets have become one of the major threats in network security, serving as the infrastructure responsible for various cyber-crimes. Though a few existing works claim to detect traditional botnets effectively, the problem of detecting P2P botnets involves more challenges. In this paper, we present PeerHunter, a community behavior analysis based method, which is capable of detecting botnets that communicate via a P2P structure. PeerHunter starts from a P2P host detection component. Then, it uses mutual contacts as the main feature to cluster bots into communities. Finally, it uses community behavior analysis to detect potential botnet communities and further identify bot candidates. Through extensive experiments with real and simulated network traces, PeerHunter achieves a very high detection rate and low false positives.
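The mutual-contacts idea can be condensed as follows: two hosts are linked when they share enough contacted destinations, and the resulting graph is clustered into communities. The contact data, threshold, and community-detection routine below are illustrative and do not reproduce PeerHunter's full pipeline.

    # Condensed sketch of the mutual-contacts idea: link hosts that share enough
    # contacted destinations, then cluster them into communities (illustrative data).
    import networkx as nx
    from networkx.algorithms.community import greedy_modularity_communities

    contacts = {                        # host -> set of destinations it talked to
        "h1": {"a", "b", "c"}, "h2": {"a", "b", "d"},
        "h3": {"x", "y"},      "h4": {"x", "y", "z"},
    }
    MIN_MUTUAL = 2

    g = nx.Graph()
    g.add_nodes_from(contacts)
    hosts = list(contacts)
    for i, u in enumerate(hosts):
        for v in hosts[i + 1:]:
            if len(contacts[u] & contacts[v]) >= MIN_MUTUAL:
                g.add_edge(u, v)

    communities = greedy_modularity_communities(g)
    print([sorted(c) for c in communities])   # candidate bot communities to analyze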
In March 2016, several online news media reported on the inadequate emotional capabilities of interactive virtual assistants. While significant progress has been made in the general intelligence and functionality of virtual agents (VA), the emotionally intelligent (EI) VA has not yet been thoroughly explored. We examine users' perception of the EI of virtual agents through Zara The Supergirl, a virtual agent that conducts question-and-answer style conversational testing and counseling online. The results show that overall users perceive an emotion-expressing VA (EEVA) to be more EI than a non-emotion-expressing VA (NEEVA). However, simple affective expression may not be sufficient for EEVA to be perceived as fully EI.
The underlying element that supports device communication in a MANET is the wireless connection capability. Each node has the ability to communicate with other nodes via the creation of routing paths. However, because nodes in a MANET are autonomous and the routing paths created are based only on the current condition of the network, some of the paths are extremely unstable. In light of these shortcomings, many research works emphasize the improvement of routing path algorithms. Regardless of the application the MANET supports, the MANET possesses unique characteristics which enable mobile nodes to form dynamic communication irrespective of the availability of a fixed network. However, the inherent nature of MANETs has left their nodes vulnerable to denial of service. A typical Denial of Service (DoS) attack in a MANET is the Black Hole attack, caused by a malicious node, or a set of nodes, advertising false routing updates. Typically, the malicious nodes are difficult to detect. Each node is equipped with a particular type of routing protocol and voluntarily participates in relaying packets. However, some nodes may not be genuine and may have been tampered with to behave maliciously, which causes the Black Hole attack. Several on-demand routing protocols, e.g. Ad hoc On Demand Distance Vector (AODV) and Dynamic Source Routing (DSR), are susceptible to such attacks. In principle, the attack exploits the Route Request (RREQ) discovery operation and falsifies the sequence number and the shortest path information. The malicious nodes are able to utilize this loophole in the RREQ discovery process due to the absence of a validation process. As a result, genuine RREQ packets are exploited and erroneously relayed to a false node(s). This paper highlights the effect of Black Hole nodes on network performance and thereby substantiates previous work [1]. In this paper, several simulation experiments are iterated using NS-2, employing various scenarios and traffic loads. The simulation results show that the presence of Black Hole nodes in a network can substantially affect the packet delivery ratio and throughput, by as much as 100%.
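The exploited fields can be pictured in miniature: a Black Hole node answers a route request with a forged reply advertising a fresher destination sequence number and a one-hop path, which the route-selection rule then prefers. The field names and selection logic below are heavily simplified for illustration and are not the NS-2 models used in the paper.

    # The exploited fields in miniature: a Black Hole node answers every RREQ with a
    # forged RREP advertising a fresher sequence number and a one-hop route, which the
    # (simplified) AODV selection rule then prefers.
    from dataclasses import dataclass

    @dataclass
    class RouteReply:
        destination: str
        dest_seq_no: int     # "freshness" of the route
        hop_count: int       # advertised path length

    def forge_blackhole_rrep(rreq_dest, known_seq_no):
        # advertise a much higher sequence number and the shortest possible path
        return RouteReply(rreq_dest, known_seq_no + 100, hop_count=1)

    def prefer(current, candidate):
        # pick the fresher route, breaking ties on hop count
        if candidate.dest_seq_no != current.dest_seq_no:
            return candidate if candidate.dest_seq_no > current.dest_seq_no else current
        return candidate if candidate.hop_count < current.hop_count else current

    honest = RouteReply("nodeD", dest_seq_no=12, hop_count=4)
    forged = forge_blackhole_rrep("nodeD", known_seq_no=12)
    print(prefer(honest, forged))   # the forged route wins and traffic is then dropped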
In this work we investigate existing and new metrics for evaluating transient stability of power systems to quantify the impact of distributed control schemes. Specifically, an energy storage system (ESS)-based control scheme that builds on feedback linearization theory is implemented in the power system to enhance its transient stability. We study the value of incorporating such ESS-based distributed control on specific transient stability metrics that include critical clearing time, critical control activation time, system stability time, rotor angle stability index, rotor speed stability index, rate of change of frequency, and control power. The stability metrics are evaluated using the IEEE 68-bus test power system. Numerical results demonstrate the value of the distributed control scheme in enhancing the transient stability metrics of power systems.
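Two of the listed metrics can be computed directly from simulated traces, as sketched below: the rate of change of frequency as a numerical derivative, and one commonly used transient stability index based on the maximum rotor-angle separation. The traces are synthetic placeholders and the index formula is one common convention, not necessarily the exact definitions adopted in the paper.

    # Computing two of the listed metrics from simulated traces: ROCOF as a numerical
    # derivative of frequency, and one commonly used transient stability index based
    # on the maximum rotor-angle separation (synthetic traces).
    import numpy as np

    t = np.linspace(0, 5, 501)                              # s
    freq = 60 + 0.2 * np.exp(-t) * np.sin(6 * t)            # Hz, synthetic bus frequency
    delta = np.column_stack([30 * np.sin(2 * t), 10 * np.sin(2 * t + 1)])  # deg, 2 rotors

    rocof = np.gradient(freq, t)                            # Hz/s
    max_separation = np.max(np.abs(delta[:, 0] - delta[:, 1]))     # deg
    tsi = (360 - max_separation) / (360 + max_separation) * 100    # >0 suggests stability

    print(f"max |ROCOF| = {np.abs(rocof).max():.2f} Hz/s, TSI = {tsi:.1f}")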
Microcavity polaritons are a hybrid photonic system that arises from the strong coupling of confined photons to quantum-well excitons. Due to their light-matter nature, polaritons possess a Kerr-like nonlinearity while being easily accessible by standard optical means. The ability to engineer confinement potentials in microcavities makes polaritons a very convenient system for studying spatially localized bosonic populations, which might have great potential for the creation of novel photonic devices. Careful engineering of this system is predicted to induce Gaussian squeezing, a phenomenon that lies at the heart of the so-called unconventional photon blockade associated with single-photon emission. This paper reveals a manifestation of the predicted squeezing by measuring the ultrafast time-dependent second-order correlation function g(2)(0) by means of a streak camera acting as a single-photon detector. The light emitted by the microcavity oscillates between Poissonian and super-Poissonian in phase with the Josephson dynamics. This behavior is remarkably well explained by quantum simulations, which predict such a dynamical evolution of the squeezing parameters. The paper shows that a crucial prerequisite for squeezing is the presence of a weak but non-zero nonlinearity. These results open the way towards the generation of nonclassical light in solid-state systems possessing a single-particle nonlinearity, such as microwave Josephson junctions or silicon-on-chip resonators.
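For reference, the measured quantity is the standard equal-time second-order correlation in terms of photon-number moments; values of 1 correspond to Poissonian and values above 1 to super-Poissonian statistics:

    g^{(2)}(0) = \frac{\langle \hat{n}(\hat{n}-1) \rangle}{\langle \hat{n} \rangle^{2}},
    \qquad
    g^{(2)}(0) = 1 \ \text{(Poissonian)}, \quad g^{(2)}(0) > 1 \ \text{(super-Poissonian)}.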
Authentication is one of the key aspects of securing applications and systems alike. While in most existing systems this is achieved using usernames and passwords, it has been continuously shown that this authentication method is not secure. Studies have shown that these systems have vulnerabilities which lead to cases of impersonation and identity theft; thus there is a need to improve such systems to protect sensitive data. In this research, we explore combining the user's location with traditional usernames and passwords as a multi-factor authentication system to make authentication more secure. The idea involves comparing a user's mobile device location with that of the browser and comparing the device's Bluetooth key with the key used during registration. We believe that by leveraging existing technologies such as Bluetooth and GPS we can reduce implementation costs whilst improving security.
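The two additional checks can be sketched as follows: the device and browser geolocations must agree to within a radius (haversine distance), and the presented Bluetooth key must match the one stored at registration (constant-time comparison). The radius, coordinates, and key values are assumptions, not the system's actual parameters.

    # Sketch of the two additional checks: device/browser locations must agree within a
    # radius, and the presented Bluetooth key must match the registered one.
    import hmac
    from math import asin, cos, radians, sin, sqrt

    def haversine_km(lat1, lon1, lat2, lon2):
        lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
        a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
        return 2 * 6371 * asin(sqrt(a))

    def location_and_token_ok(phone_loc, browser_loc, presented_key, registered_key,
                              max_km=0.5):
        close_enough = haversine_km(*phone_loc, *browser_loc) <= max_km
        key_matches = hmac.compare_digest(presented_key, registered_key)
        return close_enough and key_matches

    print(location_and_token_ok((47.3769, 8.5417), (47.3771, 8.5420), b"bt-key", b"bt-key"))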
Among many research efforts devoted to automated art investigations, the problem of quantification of literary style remains current. Meanwhile, linguists and computer scientists have tried to sort out texts according to their types or authors. We use the recently-introduced p-leader multifractal formalism to analyze a corpus of novels written for adults and young adults, with the goal of assessing if a difference in style can be found. Our results agree with the interpretation that novels written for young adults largely follow conventions of the genre, whereas novels written for adults are less homogeneous.
Homomorphic encryption technology can address data privacy concerns in cloud environments, but many problems arise when accessing data that has been encrypted with a homomorphic algorithm in the cloud. In this paper, building on attribute-based encryption, we propose a fully homomorphic encryption scheme based on attribute-based encryption with an LSSS matrix. The scheme supports fine-grained and flexible access control along with a "Query-Response" mechanism that enables users to efficiently retrieve desired data from cloud servers. In addition, the scheme supports considerable flexibility in revoking system privileges from users without updating the client's keys, which greatly reduces the burden on the client. Finally, security analysis illustrates that the scheme can resist collusion attacks. A performance comparison with an existing CP-ABE scheme indicates that our scheme greatly reduces the computation cost for users.
As robots are applied in more and more fields, security issues have attracted increasing attention. In this paper, we propose that a key center in the cloud send polynomial information to each robot component; each component substitutes its own information into the polynomial to obtain a group of new keys for use in the next slot, then checks and updates its key groups after a successful hash verification. Because the degree of the polynomial is higher than the number of components, even if an attacker obtained all the key values of the components, they still could not reconstruct the polynomial. The information about the keys is discarded immediately after use, so an attacker cannot obtain previously used session keys by compromising a component. This article addresses the security problems of robotic systems caused by cyber attacks and physical attacks.
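A toy version of the key-update idea is sketched below: each component derives its next-slot key by evaluating a secret polynomial at its own point modulo a prime, and because the degree exceeds the number of components, all evaluations together still underdetermine the polynomial. The prime, coefficients, and component names are illustrative and far too small for real use.

    # Toy version of the key-update idea: each component derives its next-slot key by
    # evaluating a secret polynomial at its own point mod a prime; the degree exceeds
    # the number of components, so all evaluations underdetermine the polynomial.
    P = 2_147_483_647                       # prime modulus
    coeffs = [123456, 98765, 4321, 777, 42] # degree 4 > number of components (3)
    component_points = {"arm": 11, "lidar": 29, "controller": 53}

    def eval_poly(x):
        acc = 0
        for c in reversed(coeffs):          # Horner's rule mod P
            acc = (acc * x + c) % P
        return acc

    next_slot_keys = {name: eval_poly(x) for name, x in component_points.items()}
    print(next_slot_keys)                   # 3 evaluations cannot fix 5 coefficients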
Existing access control mechanisms are based on the concept of identity enrolment and recognition and assume that a recognized identity is a synonym for ethical actions, yet statistics over the years show that the most severe security breaches are the result of trusted, identified, and legitimate users who turned into malicious insiders. Insider threat damages vary from intellectual property loss and fraud to information technology sabotage. As insider threat incidents evolve, there is demand for a non-identity-based authentication measure that rejects access to authorized individuals who have malicious intent. In this paper, we study the possibility of using the user's intention as an access control measure, using involuntary electroencephalogram reactions toward visual stimuli. We propose intent-based access control (IBAC), which detects the intention of access based on the existence of knowledge about an intention. IBAC takes advantage of the robustness of the concealed information test to assess access risk. We use the intent and intent motivation level to compute the access risk. Based on the calculated risk and the accepted risk threshold, the system decides whether to grant or deny access requests. We assessed the model using experiments with 30 participants, which demonstrated the robustness of the proposed solution.
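The final decision step can be written schematically as below: the detected intent probability and the motivation level are combined into a risk score that is compared against the accepted threshold. The weighting and threshold are assumptions for illustration, not the paper's risk model.

    # Schematic version of the final decision step: combine intent detection with the
    # motivation level into a risk score and compare it against the accepted threshold.
    def access_decision(intent_prob, motivation_level, risk_threshold=0.5):
        """intent_prob in [0,1] from the EEG concealed-information test;
        motivation_level in [0,1] estimated for the requested resource."""
        risk = intent_prob * (0.5 + 0.5 * motivation_level)   # assumed weighting
        return "deny" if risk >= risk_threshold else "grant"

    print(access_decision(intent_prob=0.9, motivation_level=0.8))   # deny
    print(access_decision(intent_prob=0.1, motivation_level=0.3))   # grant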