Biblio

Filters: Keyword is temporal consistency
Carpent, Xavier, ElDefrawy, Karim, Rattanavipanon, Norrathep, Tsudik, Gene.  2018.  Temporal Consistency of Integrity-Ensuring Computations and Applications to Embedded Systems Security. Proceedings of the 2018 Asia Conference on Computer and Communications Security (ASIACCS '18). :313–327.
Assuring integrity of information (e.g., data and/or software) is usually accomplished by cryptographic means, such as hash functions or message authentication codes (MACs). Computing such integrity-ensuring functions can be time-consuming if the amount of input data is large and/or the computing platform is weak. At the same time, in real-time or safety-critical settings, it is often impractical or even undesirable to guarantee atomicity of computing a time-consuming integrity-ensuring function. Meanwhile, standard correctness and security definitions of such functions assume that input data (regardless of its size) remains consistent throughout computation. However, temporal consistency may be lost if another process interrupts execution of an integrity-ensuring function and modifies portions of the input that (1) were already processed, (2) were not yet processed, or both. Lack of temporal consistency might yield an integrity result that is nonsensical or simply incorrect. Such subtleties and discrepancies between (implicit) assumptions in definitions and implementations can be a source of inconsistencies, which might lead to vulnerabilities. In this paper, we systematically explore the notion of temporal consistency of cryptographic integrity-ensuring functions. We show that its lack in implementations of such functions can lead to inconsistent results and security violations in protocols and systems using them, e.g., remote attestation, remote updates and secure resets. We consider several mechanisms that guarantee temporal consistency of implementations of integrity-ensuring functions in embedded systems, with a focus on remote attestation. We also assess the performance of the proposed mechanisms on two commodity hardware platforms: I.MX6-SabreLite and ODROID-XU4.
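To make the failure mode and one countermeasure concrete, here is a minimal Python sketch (illustrative only, not the paper's implementation): the first function hashes a buffer chunk by chunk and is exposed to concurrent modification, while the second snapshots the buffer first. The function names, the SHA-256 choice, and the atomicity assumption on the copy are all assumptions; copying is just one of several possible consistency mechanisms of the kind the paper evaluates.

```python
import hashlib

def hash_in_place(memory: bytearray, chunk_size: int = 4096) -> str:
    """Hash `memory` chunk by chunk, as a resource-constrained device would.
    If a concurrent process modifies chunks already processed, or chunks
    not yet reached, the digest may correspond to no state that `memory`
    ever held as a whole (a temporal-consistency violation)."""
    h = hashlib.sha256()
    for i in range(0, len(memory), chunk_size):
        h.update(bytes(memory[i:i + chunk_size]))
    return h.hexdigest()

def hash_with_snapshot(memory: bytearray) -> str:
    """Copy-then-hash: snapshot the input before hashing, so the digest
    reflects one well-defined state. Trades extra memory for temporal
    consistency, and assumes the copy itself runs atomically with
    respect to writers (e.g., with interrupts disabled)."""
    snapshot = bytes(memory)
    return hashlib.sha256(snapshot).hexdigest()
```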
Huang, H., Wang, H., Luo, W., Ma, L., Jiang, W., Zhu, X., Li, Z., Liu, W.  2017.  Real-Time Neural Style Transfer for Videos. 2017 IEEE Conference on Computer Vision and Pattern Recognition (CVPR). :7044–7052.
Recent research endeavors have shown the potential of using feed-forward convolutional neural networks to accomplish fast style transfer for images. In this work, we go one step further and explore the possibility of exploiting a feed-forward network to perform style transfer for videos while simultaneously maintaining temporal consistency among stylized video frames. Our feed-forward network is trained by enforcing the outputs of consecutive frames to be both well stylized and temporally consistent. More specifically, a hybrid loss is proposed to capitalize on the content information of input frames, the style information of a given style image, and the temporal information of consecutive frames. To calculate the temporal loss during the training stage, a novel two-frame synergic training mechanism is proposed. Compared with directly applying an existing image style transfer method to videos, our method employs the trained network to yield temporally consistent stylized videos that are much more visually pleasing. In contrast to the prior video style transfer method, which relies on time-consuming on-the-fly optimization, our method runs in real time while generating competitive visual results.
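To make the hybrid loss concrete, here is a minimal PyTorch-style sketch of one common formulation: a content loss on encoder features, a style loss on Gram matrices, and a temporal loss penalizing deviation of the current stylized frame from the flow-warped previous one under an occlusion mask. The weights, the flow-warped temporal term, and all names here are illustrative assumptions rather than the authors' exact loss.

```python
import torch
import torch.nn.functional as F

def gram_matrix(feat: torch.Tensor) -> torch.Tensor:
    # feat: (B, C, H, W) feature map from a pretrained encoder (e.g., VGG)
    b, c, h, w = feat.shape
    f = feat.reshape(b, c, h * w)
    return f @ f.transpose(1, 2) / (c * h * w)  # (B, C, C)

def hybrid_loss(out_t, warped_out_prev, occlusion_mask,
                feat_out, feat_content, feat_style,
                w_content=1.0, w_style=10.0, w_temporal=100.0):
    """Hybrid loss = weighted content + style + temporal terms.
    warped_out_prev: previous stylized frame warped to frame t by optical
    flow; occlusion_mask (B, 1, H, W) zeros out unreliable pixels."""
    l_content = F.mse_loss(feat_out, feat_content)
    l_style = F.mse_loss(gram_matrix(feat_out), gram_matrix(feat_style))
    l_temporal = F.mse_loss(occlusion_mask * out_t,
                            occlusion_mask * warped_out_prev)
    return w_content * l_content + w_style * l_style + w_temporal * l_temporal
```

In two-frame training of this kind, frames t-1 and t are both passed through the stylization network; the temporal term ties their outputs together, while the content and style terms constrain each frame individually.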