Biblio

Filters: Author is Pasquini, Cecilia
2018-03-05
Pasquini, Cecilia, Böhme, Rainer.  2017.  Information-Theoretic Bounds of Resampling Forensics: New Evidence for Traces Beyond Cyclostationarity. Proceedings of the 5th ACM Workshop on Information Hiding and Multimedia Security. :3–14.

Although several methods have been proposed for detecting resampling operations in multimedia signals and estimating the resampling factor, the fundamental limits of this forensic task remain an open research question. In this work, we explore the effects that a downsampling operation introduces into the statistics of a 1D signal as a function of the parameters used. We quantify the statistical distance between an original signal and its downsampled version by means of the Kullback-Leibler Divergence (KLD) for a wide-sense stationary first-order autoregressive signal model. Values of the KLD are derived for different signal parameters, resampling factors, and interpolation kernels, thus predicting the achievable hypothesis distinguishability in each case. Our analysis reveals unexpected detectability in the case of strong downsampling, due to the local correlation structure of the original signal. Moreover, since existing detection methods generally leverage the cyclostationarity of resampled signals, we also address the case where the autocovariance values are estimated directly via the sample autocovariance of the signal under investigation. Under the considered assumptions, the sample covariance matrix of a signal segment follows a Wishart distribution, and the KLD under the different hypotheses is derived.
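The core quantity of the abstract, the KLD between the original-signal and downsampled-signal hypotheses, can be sketched numerically. The snippet below is an illustrative assumption, not the paper's derivation: it models both hypotheses as zero-mean Gaussians, builds the AR(1) covariance sigma^2 * rho^|i-j|, and approximates downsampling by a factor M as plain sample dropping (under which the decimated AR(1) signal is again AR(1) with correlation rho^M), ignoring the interpolation kernels analyzed in the paper.

```python
import numpy as np

def ar1_cov(n, rho, sigma2=1.0):
    # Covariance matrix of a wide-sense stationary AR(1) signal:
    # Cov[i, j] = sigma^2 * rho^|i - j|
    idx = np.arange(n)
    return sigma2 * rho ** np.abs(idx[:, None] - idx[None, :])

def kld_gauss(S1, S2):
    # KL divergence D(N(0, S1) || N(0, S2)) between zero-mean Gaussians
    n = S1.shape[0]
    S2_inv = np.linalg.inv(S2)
    _, logdet1 = np.linalg.slogdet(S1)
    _, logdet2 = np.linalg.slogdet(S2)
    return 0.5 * (np.trace(S2_inv @ S1) - n + logdet2 - logdet1)

# H0: original AR(1) segment; H1: same model decimated by M (sample dropping),
# which is again AR(1) with correlation rho**M. Parameters are illustrative.
n, rho, M = 16, 0.9, 3
S0 = ar1_cov(n, rho)
S1 = ar1_cov(n, rho ** M)
print(kld_gauss(S1, S0))  # hypothesis distinguishability for this segment length
```

A larger KLD indicates that the two hypotheses are easier to distinguish for a detector observing a segment of this length; sweeping `rho` and `M` mimics the parameter study described in the abstract.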

2017-05-30
Pasquini, Cecilia, Schöttle, Pascal, Böhme, Rainer, Boato, Giulia, Pérez-González, Fernando.  2016.  Forensics of High Quality and Nearly Identical JPEG Image Recompression. Proceedings of the 4th ACM Workshop on Information Hiding and Multimedia Security. :11–21.

We address the known problem of detecting a previous compression in JPEG images, focusing on the challenging cases of high and very high quality factors (≥ 90) and of repeated compression with identical or nearly identical quality factors. We first revisit the approaches based on Benford–Fourier analysis in the DCT domain and on block convergence analysis in the spatial domain, both originally conceived for specific scenarios. Leveraging decision tree theory, we then design a combined approach that complements their discriminatory capabilities. We obtain a set of novel detectors targeted to high-quality grayscale JPEG images.
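As an illustration of the first ingredient mentioned above, Benford-style analyses compare the first-digit statistics of block-DCT coefficients against a generalized Benford law. The sketch below is a hedged, NumPy-only approximation: the DCT matrix is the standard orthonormal DCT-II, while the magnitude threshold and the generalized-law parameters `N`, `s`, `q` are illustrative assumptions, not the detectors derived in the paper.

```python
import numpy as np

def dct_matrix(N=8):
    # Orthonormal DCT-II basis matrix (the standard 8x8 JPEG block transform)
    k = np.arange(N)[:, None]
    n = np.arange(N)[None, :]
    C = np.sqrt(2.0 / N) * np.cos(np.pi * (2 * n + 1) * k / (2 * N))
    C[0, :] = np.sqrt(1.0 / N)
    return C

def first_digit_hist(img, block=8):
    # Empirical first-digit distribution of AC block-DCT coefficient magnitudes
    C = dct_matrix(block)
    h = np.zeros(9)
    H, W = img.shape
    for i in range(0, H - block + 1, block):
        for j in range(0, W - block + 1, block):
            d = C @ img[i:i + block, j:j + block] @ C.T
            ac = np.abs(d.ravel()[1:])   # drop the DC coefficient
            ac = ac[ac >= 1]             # illustrative magnitude threshold
            digits = (ac / 10 ** np.floor(np.log10(ac))).astype(int)
            for dig in digits:
                h[dig - 1] += 1
    return h / max(h.sum(), 1)

def benford(d, N=1.0, s=0.0, q=1.0):
    # Generalized Benford law p(d) = N * log10(1 + 1/(s + d^q));
    # defaults reduce to the classical Benford distribution
    return N * np.log10(1 + 1.0 / (s + d ** q))
```

A detector along these lines would fit the generalized-law parameters to the observed histogram and flag images whose DCT statistics deviate from the model, since recompression perturbs the first-digit distribution.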