Bibliography
Recent approaches have proven the effectiveness of local-outlier-factor-based outlier detection applied over traffic flow probability distributions. However, these approaches used distance metrics based on the Bhattacharyya coefficient to measure the similarity of probability distributions, and the limited expressiveness of this coefficient restricted their accuracy. The crucial deficiency of the Bhattacharyya distance is its inability to compare distributions with non-overlapping sample spaces over the domain of natural numbers. Traffic flow intensity varies greatly, which produces numerous non-overlapping sample spaces and renders metrics based on the Bhattacharyya coefficient inappropriate. In this work, we address this issue by exploring alternative distance metrics and demonstrating their applicability on a massive real-life traffic flow data set from 26 vital intersections in The Hague. The results on these data, collected from 272 sensors over more than two years, show advantages of the Earth Mover's distance in both effectiveness and efficiency.
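The degeneracy is easy to reproduce: for histograms with disjoint support the Bhattacharyya coefficient is zero, so the Bhattacharyya distance is infinite regardless of how far apart the supports lie, whereas the Earth Mover's distance grows with the separation. A minimal sketch with NumPy/SciPy, using made-up flow-count histograms (the bins and weights are illustrative, not taken from the Hague data):

```python
import numpy as np
from scipy.stats import wasserstein_distance

# Two hypothetical flow-count histograms with disjoint support:
# a quiet period (mass on bins 0-2) vs. a busy period (mass on bins 7-9).
bins = np.arange(10)
p = np.array([0.2, 0.5, 0.3, 0, 0, 0, 0, 0, 0, 0])
q = np.array([0, 0, 0, 0, 0, 0, 0, 0.3, 0.5, 0.2])

# Bhattacharyya coefficient BC = sum(sqrt(p*q)); distance D_B = -ln(BC).
bc = np.sum(np.sqrt(p * q))
d_bhat = -np.log(bc) if bc > 0 else np.inf

# Earth Mover's (1-Wasserstein) distance stays finite and grows with
# the separation between the two supports.
d_emd = wasserstein_distance(bins, bins, u_weights=p, v_weights=q)

print(d_bhat)  # inf: all disjoint histogram pairs look "maximally far"
print(d_emd)   # 6.8: the mass must travel 6.8 bins on average
```

Because every disjoint pair collapses to the same infinite Bhattacharyya distance, the k-nearest-neighbor ranking inside LOF becomes arbitrary for such pairs; the EMD keeps that ranking informative.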
The design of optimal energy management strategies that trade off consumers' privacy against expected energy cost by using an energy storage device is studied. The Kullback-Leibler divergence rate is used to assess the privacy risk of unauthorized testing of consumers' behavior. We further show how this design problem can be formulated as a belief-state Markov decision process (MDP), so that standard tools of the MDP framework can be utilized and the optimal solution can be obtained via Bellman dynamic programming. Finally, we illustrate the privacy enhancement and cost savings with numerical examples.
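The solution machinery is standard Bellman dynamic programming. The sketch below runs value iteration on a generic finite MDP whose stage cost mixes an energy cost with a privacy penalty through a weight lam; the states, transition kernel, costs, and lam are all illustrative placeholders, not the paper's belief-state construction:

```python
import numpy as np

# Value-iteration sketch for a finite MDP; a stand-in for the paper's
# belief-state MDP. All quantities below are randomly generated placeholders.
n_states, n_actions, gamma = 4, 2, 0.95
rng = np.random.default_rng(0)

# P[a, s, t]: transition kernel; C[s, a]: stage cost combining an expected
# energy cost and a (hypothetical) KL-based privacy penalty.
P = rng.dirichlet(np.ones(n_states), size=(n_actions, n_states))
energy_cost = rng.uniform(0.0, 1.0, size=(n_states, n_actions))
privacy_penalty = rng.uniform(0.0, 1.0, size=(n_states, n_actions))
lam = 0.5  # trade-off weight between cost saving and privacy
C = energy_cost + lam * privacy_penalty

V = np.zeros(n_states)
for _ in range(500):  # Bellman backups until (approximate) convergence
    Q = C + gamma * np.einsum("ast,t->sa", P, V)
    V_new = Q.min(axis=1)
    if np.max(np.abs(V_new - V)) < 1e-8:
        break
    V = V_new

policy = Q.argmin(axis=1)  # greedy policy w.r.t. the converged values
print(V, policy)
```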
Gaussian random attacks that jointly minimize the amount of information obtained by the operator from the grid and the probability of attack detection are presented. The attack construction is posed as an optimization problem whose utility function captures two effects: first, minimizing the mutual information between the measurements and the state variables; second, minimizing the probability of attack detection via the Kullback-Leibler (KL) divergence between the distributions of the measurements with and without an attack. Additionally, a lower bound is obtained on the utility achieved by attacks constructed with imperfect knowledge of the second-order statistics of the state variables. The performance of the attack construction using the sample covariance matrix of the state variables is numerically evaluated. The above results are tested on the IEEE 30-bus test system.
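For zero-mean Gaussian measurements, the detection term has a closed form. The sketch below computes the KL divergence between the attacked and attack-free measurement distributions under a generic linear observation model y = Hx + a + z; the mutual-information term is omitted, and the matrix H, the dimensions, and all covariances are illustrative placeholders rather than values from the paper:

```python
import numpy as np

# Stealthiness term of a Gaussian data-injection attack under a linear
# observation model y = Hx + a + z (all numbers below are placeholders).
rng = np.random.default_rng(1)
n_meas, n_state = 6, 3
H = rng.normal(size=(n_meas, n_state))
Sigma_xx = np.eye(n_state)          # state covariance (assumed known)
sigma2 = 0.1                        # measurement noise variance
Sigma_aa = 0.2 * np.eye(n_meas)     # attack covariance chosen by attacker

Sigma0 = H @ Sigma_xx @ H.T + sigma2 * np.eye(n_meas)  # without attack
Sigma1 = Sigma0 + Sigma_aa                             # with attack

def kl_gauss(S_p, S_q):
    """KL(N(0, S_p) || N(0, S_q)) in closed form for zero-mean Gaussians."""
    k = S_p.shape[0]
    _, logdet_p = np.linalg.slogdet(S_p)
    _, logdet_q = np.linalg.slogdet(S_q)
    return 0.5 * (np.trace(np.linalg.inv(S_q) @ S_p) - k + logdet_q - logdet_p)

# Divergence between the attacked and attack-free measurement
# distributions: the quantity the attack construction seeks to keep small.
print(kl_gauss(Sigma1, Sigma0))
```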
Although several methods have been proposed for detecting resampling operations in multimedia signals and estimating the resampling factor, the fundamental limits of this forensic task remain open research questions. In this work, we explore the effects that a downsampling operation introduces in the statistics of a 1D signal as a function of the parameters used. We quantify the statistical distance between an original signal and its downsampled version by means of the Kullback-Leibler divergence (KLD) for a wide-sense stationary first-order autoregressive (AR(1)) signal model. Values of the KLD are derived for different signal parameters, resampling factors, and interpolation kernels, thus predicting the achievable hypothesis distinguishability in each case. Our analysis reveals unexpected detectability in the case of strong downsampling, owing to the local correlation structure of the original signal. Moreover, since existing detection methods generally leverage the cyclostationarity of resampled signals, we also address the case where the autocovariance values are estimated directly from the signal under investigation by means of the sample autocovariance. Under the considered assumptions, the sample covariance matrix of a signal segment follows a Wishart distribution, and the KLD under the different hypotheses is derived.
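Under the Gaussian model, the KLD between the two hypotheses reduces to a closed form in the Toeplitz covariance matrices of the signal segments. A minimal sketch, assuming ideal sample-and-discard decimation of a unit-variance AR(1) process (rho, N, and M are illustrative choices; the paper additionally treats interpolation kernels and the Wishart-distributed sample covariance):

```python
import numpy as np
from scipy.linalg import toeplitz

def ar1_cov(rho, n):
    """Toeplitz covariance of a unit-variance WSS AR(1): R[k] = rho**|k|."""
    return toeplitz(rho ** np.arange(n))

def kl_gauss(S_p, S_q):
    """KL(N(0, S_p) || N(0, S_q)) for zero-mean Gaussian segments."""
    k = S_p.shape[0]
    _, logdet_p = np.linalg.slogdet(S_p)
    _, logdet_q = np.linalg.slogdet(S_q)
    return 0.5 * (np.trace(np.linalg.inv(S_q) @ S_p) - k + logdet_q - logdet_p)

rho, N, M = 0.9, 64, 2          # AR(1) correlation, segment length, factor
S_orig = ar1_cov(rho, N)        # original segment statistics
S_down = ar1_cov(rho ** M, N)   # keeping every M-th sample of an AR(1)
                                # yields an AR(1) with correlation rho**M

# A larger KLD means the two hypotheses are easier to distinguish.
print(kl_gauss(S_down, S_orig))
```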