Biblio
In this paper, we develop a statistical framework for image steganography in which the cover and stego messages are modeled as multivariate Gaussian random variables. By minimizing the detection error of an optimal detector within the adopted statistical model, we propose a novel Gaussian embedding method. Furthermore, we extend the formulation to cost-based steganography, resulting in a universal embedding scheme that works with embedding costs as well as variance estimators. Experimental results show that the proposed approach avoids embedding in smooth regions and significantly improves upon the security of state-of-the-art methods such as HILL, MiPOD, and S-UNIWARD.
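The cost-based formulation lends itself to a compact simulation. Below is a minimal sketch of a payload-constrained embedding simulator, assuming ternary (+/-1) changes, a cost field derived from estimated pixel variances (the 1/sigma^2 cost proxy and all parameter values are illustrative stand-ins, not the paper's exact construction), and bisection on the Lagrange multiplier to hit the target payload.

```python
import numpy as np

def ternary_entropy(beta):
    """Bits embedded per pixel when each +/-1 change has probability beta."""
    b = np.clip(beta, 1e-12, 1.0 / 3.0)
    return -2 * b * np.log2(b) - (1 - 2 * b) * np.log2(1 - 2 * b)

def change_rates(rho, payload_bits, iters=60):
    """Gibbs-form change rates beta_i = e^{-lam*rho_i} / (1 + 2 e^{-lam*rho_i}),
    with the multiplier lam found by bisection so that the total entropy of
    the changes matches the target payload."""
    lo, hi = 1e-6, 1e3
    for _ in range(iters):
        lam = 0.5 * (lo + hi)
        beta = np.exp(-lam * rho) / (1 + 2 * np.exp(-lam * rho))
        if ternary_entropy(beta).sum() > payload_bits:
            lo = lam  # capacity too high: increase lam to shrink change rates
        else:
            hi = lam
    return beta

# Illustrative usage with a stand-in variance field: smooth regions (small
# sigma^2) receive large cost, so embedding there is effectively avoided.
rng = np.random.default_rng(0)
sigma2 = np.maximum(rng.random(256 * 256), 1e-3)
rho = 1.0 / sigma2                      # hypothetical cost proxy
beta = change_rates(rho, payload_bits=0.4 * sigma2.size)
```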
Named Data Networking (NDN) is the most mature proposal of the Information Centric Networking paradigm, a clean-slate approach for the Future Internet. Although NDN was designed to natively tackle security issues inherent to IP networks, security attacks newly introduced in its transitional phase threaten NDN's practical deployment. Therefore, a security monitoring plane for NDN is indispensable before any potential deployment of this novel architecture in an operating context by any provider. We propose an approach for monitoring and anomaly detection in NDN nodes leveraging Bayesian network techniques. A list of monitored metrics is introduced as a quantitative means of characterizing the behavior of an NDN node. Leveraging hypothesis testing theory, a micro detector is developed to raise an alarm whenever a metric deviates significantly from its normal behavior. A Bayesian network structure that correlates the alarms from these micro detectors is designed based on expert knowledge of the NDN specification and the NFD implementation. The relevance and performance of our security monitoring approach are demonstrated on the Content Poisoning Attack (CPA), one of the most critical attacks in NDN, using extensive experimental data collected from a real NDN deployment.
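As a concrete illustration of the micro-detector idea, the sketch below implements a simple two-sided Gaussian hypothesis test on a single monitored metric; the metric name, baseline model, and false-alarm rate are illustrative assumptions rather than the paper's exact detectors.

```python
import numpy as np
from scipy.stats import norm

class MicroDetector:
    """Two-sided Gaussian test on one monitored metric.

    H0: the metric follows its learned normal behavior N(mu, sigma^2);
    an alarm is raised when the standardized deviation exceeds the
    critical value for a target false-alarm rate alpha."""

    def __init__(self, baseline, alpha=1e-3):
        self.mu = float(np.mean(baseline))
        self.sigma = float(np.std(baseline)) + 1e-12
        self.threshold = norm.ppf(1 - alpha / 2)  # two-sided critical value

    def alarm(self, x):
        return abs(x - self.mu) / self.sigma > self.threshold

# Illustrative usage on a hypothetical Interest-satisfaction-ratio metric:
rng = np.random.default_rng(0)
det = MicroDetector(baseline=rng.normal(0.8, 0.05, 1000))
print(det.alarm(0.45))   # a large drop from normal behavior -> True
```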
We present the essential concepts of a model-based testing framework for probabilistic systems with continuous time. Markov automata are used as the underlying model. The key result of this work is the solid core of a probabilistic test theory that incorporates real-time stochastic behaviour. We connect ioco theory and hypothesis testing to infer trace probabilities. We show that our conformance relation conservatively extends ioco and discuss the meaning of quiescence in the presence of exponentially distributed time delays.
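To make the trace-probability inference concrete, the sketch below checks an observed trace frequency against a model probability using a Hoeffding-style confidence interval; it abstracts away the Markov-automata machinery and is only a plausible instance of such a frequency test, not the paper's exact statistic.

```python
import math

def trace_prob_consistent(k, n, p, alpha=0.05):
    """Accept 'trace sigma has probability p in the model' iff the observed
    frequency k/n lies within a Hoeffding confidence interval of half-width
    sqrt(ln(2/alpha) / (2n)) around p."""
    eps = math.sqrt(math.log(2.0 / alpha) / (2.0 * n))
    return abs(k / n - p) <= eps

# e.g. a trace observed 480 times in 1000 runs vs. model probability 0.5:
print(trace_prob_consistent(480, 1000, 0.5))   # True: within the interval
```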
The diverse views of the science of security have opened up several avenues for applying the methods of science to security. We pursue a different kind of connection between science and security. This paper explores the idea that security is not just a suitable subject for science, but that the process of security is also similar to the process of science. This similarity arises from the fact that both science and security depend on the methods of inductive inference. Because of this dependency, a scientific theory can never be definitively proven, but can only be disproved by new evidence, and improved into a better theory. Because of the same dependency, every security claim and method has a lifetime, and always eventually needs to be improved.
In this general framework of security-as-science, we explore ways to apply the methods of scientific induction to the process of trust. The process of trust building and updating is viewed as hypothesis testing. We propose to formulate trust hypotheses by the methods of algorithmic learning, and to build more robust trust testing and vetting methodologies on the solid foundations of statistical inference.
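One plausible instantiation of trust building as hypothesis testing is Wald's sequential probability ratio test over a stream of interaction outcomes; the sketch below is our own illustration, and its parameters (p0, p1, error rates) are hypothetical, not taken from the paper.

```python
import math

def sprt_trust(outcomes, p0=0.6, p1=0.9, alpha=0.05, beta=0.05):
    """Wald's SPRT for a trust hypothesis.

    H1: the principal behaves correctly with probability p1 (trustworthy);
    H0: with probability p0 (untrustworthy). outcomes is a stream of
    1 (kept commitment) / 0 (violation)."""
    lower = math.log(beta / (1 - alpha))   # accept H0 below this log-ratio
    upper = math.log((1 - beta) / alpha)   # accept H1 above this log-ratio
    llr = 0.0
    for x in outcomes:
        llr += math.log((p1 if x else 1 - p1) / (p0 if x else 1 - p0))
        if llr <= lower:
            return "distrust"
        if llr >= upper:
            return "trust"
    return "undecided"   # keep collecting evidence

print(sprt_trust([1, 1, 0, 1, 1, 1, 1, 1, 1, 1, 1, 1]))   # -> "trust"
```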
A novel physical layer authentication scheme is proposed in this paper by exploiting the time-varying carrier frequency offset (CFO) associated with each pair of wireless communication devices. In realistic scenarios, radio frequency oscillators in each transmitter-receiver pair always exhibit device-dependent biases relative to the nominal oscillating frequency. The combination of these biases and mobility-induced Doppler shift, characterized as a time-varying CFO, can be used as a radiometric signature for wireless device authentication. In the proposed authentication scheme, the variable CFO values at different communication times are first estimated. Kalman filtering is then employed to predict the current value by tracking the past CFO variation, which is modeled as an autoregressive random process. To perform authentication, the current CFO estimate is compared with the Kalman-predicted CFO using hypothesis testing to determine whether the signal has followed a consistent CFO pattern. An adaptive CFO variation threshold is derived for device discrimination according to the signal-to-noise ratio and the Kalman prediction error. In addition, a software-defined radio (SDR) based prototype platform has been developed to validate the feasibility of using the CFO for authentication. Simulation results further confirm the effectiveness of the proposed scheme in multipath fading channels.
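The prediction-and-test loop can be sketched with a scalar Kalman filter over an AR(1) CFO model. The AR coefficient, noise values, and the fixed multiple of the innovation standard deviation used as a threshold below are illustrative placeholders for the paper's SNR-dependent adaptive threshold.

```python
import numpy as np

def cfo_authenticate(cfo_estimates, a=0.99, q=1e-4, r=1e-3, k=3.0):
    """Track the CFO with a scalar Kalman filter over an AR(1) model
    f_t = a * f_{t-1} + w_t, and flag a frame when the innovation
    |z - prediction| exceeds k standard deviations of the innovation."""
    x, P = cfo_estimates[0], r          # initialize from the first estimate
    flags = [False]
    for z in cfo_estimates[1:]:
        x_pred = a * x                  # AR(1) prediction of the current CFO
        P_pred = a * a * P + q
        S = P_pred + r                  # innovation variance
        flags.append(abs(z - x_pred) > k * np.sqrt(S))
        K = P_pred / S                  # Kalman gain, standard update
        x = x_pred + K * (z - x_pred)
        P = (1 - K) * P_pred
    return flags

# Illustrative stream: a slowly drifting legitimate CFO, then one frame
# whose CFO is inconsistent with the tracked pattern.
rng = np.random.default_rng(0)
z = np.append(1.0 + np.cumsum(rng.normal(0, 0.005, 50)), 1.6)
print(cfo_authenticate(z)[-1])   # True: flagged as a different device
```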
In some applications, the locations of events reported by a sensor network need to remain anonymous. That is, unauthorized observers must be unable to detect the origin of such events by analyzing the network traffic. The authors analyze two types of problems: communication overhead and computational load. In this paper, the authors provide a new framework for modeling, analyzing, and evaluating anonymity in sensor networks. The novelty of the proposed framework is twofold: first, it introduces the notion of "interval indistinguishability" and provides a quantitative measure to model anonymity in wireless sensor networks; second, it maps source anonymity to a statistical problem: the authors show that existing approaches for designing statistically anonymous systems introduce correlation in the intervals between real events, whereas the intervals between fake events are uncorrelated. The authors show how mapping source anonymity to hypothesis testing with nuisance parameters leads to converting the problem of exposing private source information into that of finding an appropriate data transformation that removes or minimizes the effect of the nuisance information, using a strong cryptographic algorithm. By doing so, the authors transform the problem of analyzing real-valued sample points into one of analyzing binary codes, which opens the door for coding theory to be incorporated into the study of anonymous networks. Existing work is unable to detect an unauthorized observer in the network traffic; this work, however, mainly focuses on enhancing source anonymity against correlation tests, since the main goal of source location privacy is to hide the existence of real events.
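An adversary-side correlation test of the kind the framework defends against can be sketched as a lag-1 autocorrelation test on inter-transmission intervals: under the null hypothesis the intervals are i.i.d., hence indistinguishable from fake-only traffic. This is a simplified stand-in for the paper's hypothesis test with nuisance parameters, not its exact statistic.

```python
import numpy as np

def intervals_look_real(intervals, crit=2.5758):
    """Lag-1 autocorrelation test. Under H0 (fake traffic only) the
    inter-transmission intervals are i.i.d., so r1 * sqrt(n) is roughly
    standard normal; reject H0 (i.e., suspect real events) when
    |r1| * sqrt(n) exceeds the two-sided 1% critical value."""
    x = np.asarray(intervals, dtype=float)
    x = x - x.mean()
    r1 = np.dot(x[:-1], x[1:]) / np.dot(x, x)
    return abs(r1) * np.sqrt(len(x)) > crit

# i.i.d. exponential (fake-only) intervals should not be flagged:
rng = np.random.default_rng(0)
print(intervals_look_real(rng.exponential(1.0, 2000)))   # False
```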