SoS Newsletter - Advanced Book Block

Information Theoretic Security


A cryptosystem is said to be information-theoretically secure if its security derives purely from information theory and cannot be broken even by an adversary with unlimited computing power. The classic example is the one-time pad, which Claude Shannon, the founder of information theory, proved to be perfectly secure. Information-theoretically secure cryptosystems are often used for the most sensitive communications, such as diplomatic cables and high-level military traffic, precisely because adversary governments expend great effort trying to break them. For the same reason, interest in the theory, methods, and practice of information-theoretic security remains high. The works cited here address quantum computing, steganography, DNA-based security, cyclic elliptic curves, algebraic coding theory, and other approaches.
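As a concrete illustration of the one-time pad's information-theoretic security, here is a minimal sketch (an illustrative example, not drawn from any of the works cited below): XOR-ing a message with a uniformly random, equally long, single-use key yields a ciphertext statistically independent of the message, which is the core of Shannon's proof.

```python
import secrets

def otp(data: bytes, key: bytes) -> bytes:
    """One-time pad: XOR each byte with a key byte. Encryption and
    decryption are the same operation, since XOR is its own inverse."""
    if len(key) != len(data):
        raise ValueError("key must be exactly as long as the message")
    return bytes(d ^ k for d, k in zip(data, key))

message = b"ATTACK AT DAWN"
key = secrets.token_bytes(len(message))  # uniformly random, used once
ciphertext = otp(message, key)
assert otp(ciphertext, key) == message
```

The guarantee holds only if the key is uniformly random, as long as the message, kept secret, and never reused; violating any of these conditions voids the information-theoretic security.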

  • Thapa, D.; Harnesk, D., "Rethinking the Information Security Risk Practices: A Critical Social Theory Perspective," System Sciences (HICSS), 2014 47th Hawaii International Conference on, pp. 3207-3214, 6-9 Jan. 2014. (ID#:14-1684) Available at: http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=6758999&isnumber=6758592 There is a lack of theoretical understanding of information security risk practices. For example, the literature on information security risk is dominated by an instrumental approach to protecting information assets. This approach, however, often fails to acknowledge the ideologies and consequences of risk practices. In this paper, through critical analysis, we suggest various perspectives to advance understanding in this regard. In doing so, we present our argument by reviewing the security risk literature using Habermas's concept of four orientations: instrumental, strategic, communicative and discursive. The contribution of this paper is to develop conceptual clarity about risk-related ideologies and their consequences for emancipation. Keywords: asset management; risk management; security of data; Habermas concept; communicative orientation; critical analysis; critical social theory perspective; discursive orientation; information asset protection; information security risk practices; instrumental approach; instrumental orientation; risk practices; risk related ideologies; strategic orientation; Context; Information security; Instruments; Organizations; Risk management; Information security risk practices; critical social theory; emancipation
  • Baheti, Ankita; Singh, Lokesh; Khan, Asif Ullah, "Proposed Method for Multimedia Data Security Using Cyclic Elliptic Curve, Chaotic System, and Authentication Using Neural Network," Communication Systems and Network Technologies (CSNT), 2014 Fourth International Conference on, pp. 664-668, 7-9 April 2014. (ID#:14-1685) Available at: http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=6821481&isnumber=6821334 As multimedia applications are used increasingly, the security of images becomes an important issue. The combination of chaos theory and cryptography forms an important field of information security. In the past decade, chaos-based image encryption has received much attention in information security research, and many image encryption algorithms based on chaotic maps have been proposed. Most of them, however, degrade system performance or security and suffer from a small key space. This paper introduces an efficient symmetric encryption scheme based on a cyclic elliptic curve and a chaotic system that can overcome these disadvantages. The cipher encrypts a 256-bit plain image block to a 256-bit cipher image block within eight 32-bit registers. The scheme generates pseudorandom bit sequences for round keys based on a piecewise nonlinear chaotic map. The generated sequences are then mixed with key sequences derived from the cyclic elliptic curve points. The proposed algorithm has a good encryption effect, a large key space, and high sensitivity to small changes in secret keys, and is fast compared to competing algorithms. (A toy chaotic-keystream sketch appears after this list.) Keywords: Authentication; Chaotic communication; Elliptic curves; Encryption; Media; Multimedia communication; authentication; chaos; decryption; encryption; neural network
  • Abercrombie, R.K.; Schlicher, B.G.; Sheldon, F.T., "Security Analysis of Selected AMI Failure Scenarios Using Agent Based Game Theoretic Simulation," System Sciences (HICSS), 2014 47th Hawaii International Conference on, pp. 2015-2024, 6-9 Jan. 2014. (ID#:14-1686) Available at: http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=6758853&isnumber=6758592 Information security analysis can be performed using game theory implemented in dynamic Agent Based Game Theoretic (ABGT) simulations. Such simulations can be verified against the results of game theory analysis and then used to explore larger scale, real world scenarios involving multiple attackers, defenders, and information assets. We concentrated our analysis on the Advanced Metering Infrastructure (AMI) functional domain, for which the National Electric Sector Cybersecurity Organization Resource (NESCOR) working group has documented 29 failure scenarios. The strategy for the game was developed by analyzing five representative electric sector failure scenarios in the AMI functional domain. We characterize these five scenarios into three threat categories affecting confidentiality, integrity, and availability (CIA). The analysis using our ABGT simulation demonstrates how to model the AMI functional domain using a set of rationalized game theoretic rules decomposed from the failure scenarios in terms of how those scenarios might impact the AMI network with respect to CIA. Keywords: game theory; security of data; ABGT simulations; AMI failure scenarios; CIA; NESCOR; advanced metering infrastructure functional domain; agent based game theoretic simulation; confidentiality integrity and availability; information security analysis; national electric sector cyber security organization resource; Analytical models; Availability; Computational modeling; Computer security; Game theory; Games; AMI Failure Scenarios; Advanced Metering Infrastructure; Agent Based Simulation; Availability; Confidentiality; Integrity; Risk Management; Simulation; Smart Grid Cyber Security Analysis
  • Mukherjee, A.; Fakoorian, S.; Huang, J.; Swindlehurst, A., "Principles of Physical Layer Security in Multiuser Wireless Networks: A Survey," Communications Surveys & Tutorials, IEEE, vol. PP, no. 99, pp. 1-24, February 2014. (ID#:14-1687) Available at: http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=6739367&isnumber=5451756 This paper provides a comprehensive review of the domain of physical layer security in multiuser wireless networks. The essential premise of physical layer security is to enable the exchange of confidential messages over a wireless medium in the presence of unauthorized eavesdroppers, without relying on higher-layer encryption. This can be achieved primarily in two ways: without the need for a secret key by intelligently designing transmit coding strategies, or by exploiting the wireless communication medium to develop secret keys over public channels. The survey begins with an overview of the foundations dating back to the pioneering work of Shannon and Wyner on information-theoretic security. We then describe the evolution of secure transmission strategies from point-to-point channels to multiple-antenna systems, followed by generalizations to multiuser broadcast, multiple-access, interference, and relay networks. Secret-key generation and establishment protocols based on physical layer mechanisms are subsequently covered. Approaches for secrecy based on channel coding design are then examined, along with a description of interdisciplinary approaches based on game theory and stochastic geometry. The associated problem of physical layer message authentication is also briefly introduced. The survey concludes with observations on potential research directions in this area. (The wiretap secrecy-capacity formula underlying this line of work is restated after this list.) Keywords: Information-theoretic security; Physical layer security; artificial noise; cooperative jamming; secrecy; secret-key agreement; wiretap channel
  • Kodovsky, J.; Fridrich, J., "Effect of Image Downsampling on Steganographic Security," Information Forensics and Security, IEEE Transactions on, vol. 9, no. 5, pp. 752-762, May 2014. (ID#:14-1688) Available at: http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=6750732&isnumber=6776454 The accuracy of steganalysis in digital images primarily depends on the statistical properties of neighboring pixels, which are strongly affected by the image acquisition pipeline as well as any processing applied to the image. In this paper, we study how the detectability of embedding changes is affected when the cover image is downsampled prior to embedding. This topic is important for practitioners because the vast majority of images posted on websites, image sharing portals, or attached to e-mails are downsampled. It is also relevant to researchers, as the security of steganographic algorithms is commonly evaluated on databases of downsampled images. In the first part of this paper, we investigate empirically how the steganalysis results depend on the parameters of the resizing algorithm: the choice of the interpolation kernel, the scaling factor (resize ratio), antialiasing, and the downsampled pixel grid alignment. We report on several novel phenomena that appear valid universally across the tested cover sources, steganographic methods, and steganalysis features. The paper continues with a theoretical analysis of the simplest interpolation kernel, the box kernel. By fitting a Markov chain model to pixel rows, we analytically compute the Fisher information rate for any mutually independent embedding operation and derive the proper scaling of the secure payload with resizing. For least significant bit (LSB) matching and a limited range of downscaling, the theory fits experiments rather well, which indicates the existence of a new scaling law expressing the length of the secure payload when the cover size is modified by subsampling. (An LSB-matching sketch appears after this list.) Keywords: Markov processes; image matching; sampling methods; steganography; Fisher information rate; LSB matching; Markov chain model; Web sites; antialiasing; cover image; digital images; downsampled pixel grid alignment; e-mail attachment; electronic mail; embedding changes; image acquisition pipeline; image downsampling; image processing; image sharing portals; interpolation kernel; least significant bit matching; mutually independent embedding operation; resize ratio; scaling factor; statistical properties; steganalysis; steganographic algorithms; steganographic security; subsampling; Databases; Electronic mail; Interpolation; Kernel; Payloads; Security; Testing; Steganographic security; image downsampling; steganalysis
  • Stanisavljevic, Z.; Stanisavljevic, J.; Vuletic, P.; Jovanovic, Z., "COALA - System for Visual Representation of Cryptography Algorithms," Learning Technologies, IEEE Transactions on, vol. PP, no. 99, pp. 1, April 2014. (ID#:14-1689) Available at: http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=6784486&isnumber=4620077 Educational software systems have an increasingly significant presence in the engineering sciences. They aim to improve students' attitudes and knowledge acquisition, typically through visual representation and simulation of complex algorithms, mechanisms, or hardware systems that are often not available to educational institutions. This paper presents a novel software system for CryptOgraphic ALgorithm visuAl representation (COALA), developed to support a Data Security course at the School of Electrical Engineering, University of Belgrade. The system allows users to follow the execution of several complex algorithms (DES, AES, RSA, and Diffie-Hellman) on real world examples in a step by step detailed view, with forward and backward navigation. Benefits of the COALA system for students are observed through increases in both the percentage of students who passed the exam and the average exam grade during one school year. Keywords: Algorithm design and analysis; Cryptography; Data visualization; Software algorithms; Visualization
  • Threepak, T.; Watcharapupong, A., "Web attack detection using entropy-based analysis," Information Networking (ICOIN), 2014 International Conference on, pp. 244-247, 10-12 Feb. 2014. (ID#:14-1690) Available at: http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=6799699&isnumber=6799467 Web attacks are increasing in both magnitude and complexity. In this paper, we use Shannon entropy analysis to detect these attacks. Our approach examines web access logs on the principle that attack scripts usually have more sophisticated request patterns than legitimate ones. The risk level of an incident is indicated by the average (AVG) and standard deviation (SD) of entropy in each period, via the Alpha and Beta lines defined as AVG - SD and AVG - 2*SD, respectively; these represent the boundaries of the detection scheme. As a result, our technique is not only a highly accurate procedure for investigating anomalous web request behavior, but is also useful for pruning huge application access log files to focus on potentially intrusive events. Experiments show that the proposed process detects anomalous requests in web application systems with good effectiveness and a low false alarm rate. (An entropy-scoring sketch appears after this list.) Keywords: entropy; security of data; AVG-SD; Shannon entropy analysis; Web access logging text; Web attack detection; Web attacking scripts; Web request anomaly behaviors; entropy-based analysis; intrusive events; standard deviation; Complexity theory; Entropy; Equations; Intrusion detection; Mathematical model; Standards; Anomaly Detection; Entropy Analysis; Information Security
  • Jiantao Zhou; Xianming Liu; Au, O.C.; Yuan Yan Tang, "Designing an Efficient Image Encryption-Then-Compression System via Prediction Error Clustering and Random Permutation," Information Forensics and Security, IEEE Transactions on, vol. 9, no. 1, pp. 39-50, Jan. 2014. (ID#:14-1691) Available at: http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=6670767&isnumber=6684617 In many practical scenarios, image encryption has to be conducted prior to image compression. This raises the problem of how to design a pair of image encryption and compression algorithms such that compressing the encrypted images can still be performed efficiently. In this paper, we design a highly efficient image encryption-then-compression (ETC) system that considers both lossless and lossy compression. The proposed image encryption scheme, operating in the prediction error domain, is shown to provide a reasonably high level of security. We also demonstrate that an arithmetic coding-based approach can be exploited to efficiently compress the encrypted images. More notably, the proposed compression approach applied to encrypted images is only slightly worse, in terms of compression efficiency, than state-of-the-art lossless/lossy image coders, which take original, unencrypted images as inputs. In contrast, most existing ETC solutions incur a significant penalty in compression efficiency. (A toy prediction-error and permutation sketch appears after this list.) Keywords: arithmetic codes; data compression; image coding; pattern clustering; prediction theory; random codes; ETC; arithmetic coding-based approach; image encryption-then-compression system design; lossless compression; lossless image coder; lossy compression; lossy image coder; prediction error clustering; random permutation; security; Bit rate; Decoding; Encryption; Image coding; Image reconstruction; Compression of encrypted image; encrypted domain signal processing
  • Portmann, C., "Key Recycling in Authentication," Information Theory, IEEE Transactions on, vol. 60, no. 7, pp. 4383-4396, July 2014. (ID#:14-1692) Available at: http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=6797875&isnumber=6832684 In their seminal work on authentication, Wegman and Carter propose that to authenticate multiple messages, it is sufficient to reuse the same hash function as long as each tag is encrypted with a one-time pad. They argue that because the one-time pad is perfectly hiding, the hash function used remains completely unknown to the adversary. Since their proof is not composable, we revisit it using a composable security framework. It turns out that the above argument is insufficient: if the adversary learns whether a corrupted message was accepted or rejected, information about the hash function is leaked, and after a bounded number of rounds it is completely known. We show, however, that this leak is very small: Wegman and Carter's protocol is still ε-secure if ε-almost strongly universal₂ hash functions are used. This implies that the secret key corresponding to the choice of hash function can be reused in the next round of authentication with no additional error beyond this ε. We also show that if the players have a mild form of synchronization, namely that the receiver knows when a message should be received, the key can be recycled for any arbitrary task, not only new rounds of authentication. (A Wegman-Carter tagging sketch appears after this list.) Keywords: Abstracts; Authentication; Computational modeling; Cryptography; Protocols; Recycling; Cryptography; authentication; composable security; information-theoretic security
  • Jain, S.; Bhatnagar, V., "Analogy of various DNA based security algorithms using cryptography and steganography," Issues and Challenges in Intelligent Computing Techniques (ICICT), 2014 International Conference on, pp. 285-291, 7-8 Feb. 2014. (ID#:14-1693) Available at: http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=6781294&isnumber=6781240 Information technology, and with it the volume of information stored and transmitted, is growing rapidly, so information security is becoming ever more important. Everyone wants to protect their information from attackers and hackers. Traditional cryptography and steganography provide various algorithms for securing information, and the new field of DNA cryptography has emerged to secure data stored in DNA, exploiting the biomolecular computational abilities of DNA. In this paper the authors compare various DNA cryptographic algorithms on key parameters. These parameters should also help future researchers design or improve DNA storage techniques for more efficient and reliable secure data storage. The authors also explain the different biological and arithmetic operators used in DNA cryptographic algorithms. (A bits-to-nucleotide encoding sketch appears after this list.) Keywords: DNA; cryptography; information storage; steganography; DNA based security algorithms; DNA cryptography; DNA storage techniques; arithmetic operators; biological operators; biomolecular computational abilities; information security; information storage; information technology; information transformation; steganography; Biological information theory; DNA; Encryption; Facsimile; Arithmetic; Biological; Cryptography; DNA; Steganography
  • Arya, A.; Kumar, S., "Information theoretic feature extraction to reduce dimensionality of Genetic Network Programming based intrusion detection model," Issues and Challenges in Intelligent Computing Techniques (ICICT), 2014 International Conference on, pp. 34-37, 7-8 Feb. 2014. (ID#:14-1694) Available at: http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=6781248&isnumber=6781240 Intrusion detection techniques must examine a high volume of audit records, so it is always challenging to extract a minimal set of features that reduces the dimensionality of the problem while maintaining efficient performance. Previous researchers analyzed the Genetic Network Programming framework using all 41 features of the KDD Cup 99 dataset and found an efficiency of more than 90% at the cost of high dimensionality. We propose a new technique for the same framework with low dimensionality, using an information-theoretic approach to select a minimal set of features; the result is six attributes with accuracy very close to theirs. Feature selection is based on the hypothesis that not all features are equally relevant to a specific class. Simulation results on the KDD Cup 99 dataset indicate that our solution gives accurate results while minimizing additional overhead. (A mutual-information scoring sketch appears after this list.) Keywords: feature extraction; feature selection; genetic algorithms; information theory; security of data; KDD cup 99 dataset; audit records; dimensionality reduction; feature selection; genetic network programming based intrusion detection model; information theoretic feature extraction; Artificial intelligence; Correlation; Association rule; Discretization; Feature Selection; GNP
  • Hyunho Kang; Hori, Y.; Katashita, T.; Hagiwara, M.; Iwamura, K., "Cryptographic key generation from PUF data using efficient fuzzy extractors," Advanced Communication Technology (ICACT), 2014 16th International Conference on, pp. 23-26, 16-19 Feb. 2014. (ID#:14-1695) Available at: http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=6778915&isnumber=6778899 Physical unclonable functions (PUFs) and biometrics are inherently noisy. When used in practice as cryptographic key generators, they need to be combined with an extraction technique to derive reliable bit strings (i.e., a cryptographic key). An approach based on an error correcting code was proposed by Dodis et al. and is known as a fuzzy extractor. However, this method appears to be difficult for non-specialists to implement. In our recent study, we reported the results of some example implementations using PUF data and presented a detailed implementation diagram. In this paper, we describe a more efficient implementation method that replaces the hash function output with the syndrome from the BCH code. The experimental results show that the Hamming distance between two keys varies according to the key size, and that information-theoretic security has been achieved. (A code-offset fuzzy-extractor sketch appears after this list.) Keywords: Hamming codes; cryptography; error correction codes; fuzzy set theory; BCH code; Hamming distance; PUF data; biometrics; cryptographic key generation; efficient fuzzy extractors; error correcting code; information-theoretic security; physical unclonable functions; reliable bit strings; Cryptography; Data mining; Entropy; Hamming distance; High definition video; Indexes; Reliability; Arbiter PUF; Fuzzy Extractor; Physical Unclonable Functions
  • Joshua D. Guttman, "Establishing and Preserving Protocol Security Goals," Journal of Computer Security - Foundational Aspects of Security, Volume 22, Issue 2, March 2014, (pages 203-267). (ID#:14-1696) Available at: http://dl.acm.org/citation.cfm?id=2595841.2595843 We take a model-theoretic viewpoint on security goals and how to establish them. The models are possibly fragmentary executions. Security goals such as authentication and confidentiality are geometric sequents, i.e. implications Φ ⇒ Ψ, where Φ and Ψ are built from atomic formulas without negations, implications, or universal quantifiers. Security goals are then statements about homomorphisms, where the source is a minimal fragmentary model of the antecedent Φ. If every homomorphism to a non-fragmentary, complete execution factors through a model in which Ψ is satisfied, then the goal is achieved. One can validate security goals via a process of information enrichment. We call this approach enrich-by-need protocol analysis. This idea also clarifies protocol transformation. A protocol transformation preserves security goals when it preserves the form of the information enrichment process. We formalize this idea using simulation relations between labeled transition systems. These labeled transition systems formalize the analysis of the protocols, i.e. the information enrichment process, not the execution behavior of the protocols. Keywords: Authentication, Cryptographic Protocol Analysis, Security Properties, Strand Spaces
  • Guomin Yang, Chik How Tan, Yi Mu, Willy Susilo, Duncan S. Wong, "Identity Based Identification From Algebraic Coding Theory," Theoretical Computer Science, Volume 520, February 2014, (pages 51-61). (ID#:14-1697) Available at: http://dl.acm.org/citation.cfm?id=2567013.2567369 or http://dx.doi.org/10.1016/j.tcs.2013.09.008 Cryptographic identification schemes allow a remote user to prove his/her identity to a verifier who holds some public information of the user, such as the user's public key or identity. Most existing cryptographic identification schemes are based on number-theoretic hard problems such as Discrete Log and Factorization. This paper focuses on the design and analysis of identity based identification (IBI) schemes based on algebraic coding theory. We first revisit an existing code-based IBI scheme derived by combining the Courtois-Finiasz-Sendrier signature scheme and the Stern zero-knowledge identification scheme. Previous results have shown that this IBI scheme is secure under passive attacks. In this paper, we prove that the scheme can in fact resist active attacks. However, whether the scheme can be proven secure under concurrent attacks (the most powerful attacks against identification schemes) remains open. In addition, we show that it is difficult to apply the conventional OR-proof approach to this particular IBI scheme to obtain concurrent security. We then construct a special OR-proof variant of the scheme and prove that the resulting IBI scheme is secure under concurrent attacks. Keywords: Error-correcting codes, Identification, Identity based cryptography, Syndrome decoding
  • Yi-Kai Liu, "Building One-Time Memories From Isolated Qubits (extended abstract)," Proceedings of the 5th Conference on Innovations in Theoretical Computer Science, January 2014, (pages 269-286). (ID#:14-1698) Available at: http://dx.doi.org/10.1145/2554797.2554823 One-time memories (OTMs) are simple tamper-resistant cryptographic devices, which can be used to implement one-time programs, a very general form of software protection and program obfuscation. Here we investigate the possibility of building OTMs using quantum mechanical devices. It is known that OTMs cannot exist in a fully-quantum world or in a fully-classical world. Instead, we propose a new model based on isolated qubits - qubits that can only be accessed using local operations and classical communication (LOCC). This model combines a quantum resource (single-qubit measurements) with a classical restriction (on communication between qubits), and can be implemented using current technologies, such as nitrogen vacancy centers in diamond. In this model, we construct OTMs that are information-theoretically secure against one-pass LOCC adversaries that use 2-outcome measurements. Our construction resembles Wiesner's old idea of quantum conjugate coding, implemented using random error-correcting codes; our proof of security uses entropy chaining to bound the supremum of a suitable empirical process. We conjecture that our random codes can be replaced by some class of efficiently-decodable codes, to get computationally-efficient OTMs that are secure against computationally-bounded LOCC adversaries. We also construct data-hiding states, which allow an LOCC sender to encode an (n - O(1))-bit message into n qubits, such that at most half of the message can be extracted by a one-pass LOCC receiver, but the whole message can be extracted by a general quantum receiver. Keywords: conjugate coding, cryptography, data-hiding states, local operations and classical communication, oblivious transfer, one-time programs, quantum computation
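Illustrative sketches follow for several of the entries above; each is a simplified example under the assumptions stated in its lead-in, not code from the cited paper.

For the Baheti et al. entry, the sketch below shows how a chaotic iteration can be turned into a keystream. The paper's piecewise nonlinear map and cyclic elliptic curve mixing are not reproduced; the classic logistic map stands in purely for illustration, and a floating-point chaotic map alone is not cryptographically secure.

```python
def logistic_keystream(x0: float, r: float, n: int) -> bytes:
    """Generate n pseudorandom bytes by iterating the logistic map
    x -> r*x*(1-x), a stand-in for the paper's piecewise nonlinear map."""
    x, out = x0, bytearray()
    for _ in range(n):
        x = r * x * (1.0 - x)
        out.append(int(x * 256) & 0xFF)  # quantize the state to one byte
    return bytes(out)

block = b"example image block"
stream = logistic_keystream(x0=0.3579, r=3.99, n=len(block))
cipher = bytes(p ^ s for p, s in zip(block, stream))
```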
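The Mukherjee et al. survey builds on Wyner's wiretap channel, whose central quantity is the secrecy capacity. For the degraded discrete memoryless wiretap channel it takes the standard form below (a textbook formula, not specific to the survey):

```latex
% X: channel input; Y: legitimate receiver's observation;
% Z: eavesdropper's observation (Wyner, 1975).
C_s = \max_{p(x)} \left[ I(X;Y) - I(X;Z) \right]
% C_s > 0 exactly when the main channel is statistically better than
% the eavesdropper's; general non-degraded channels require an
% auxiliary prefix variable (Csiszar-Korner, 1978).
```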
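Kodovsky and Fridrich's theoretical analysis covers least significant bit (LSB) matching, whose embedding operation is sketched below (a textbook description assuming an 8-bit grayscale pixel array, not the authors' code):

```python
import random

def lsb_match_embed(pixels: list[int], bits: list[int]) -> list[int]:
    """LSB matching: when a pixel's LSB disagrees with the message bit,
    randomly add or subtract 1 rather than overwriting the LSB."""
    out = list(pixels)
    for i, bit in enumerate(bits):
        if out[i] & 1 != bit:
            step = random.choice((-1, 1))
            if out[i] == 0:      # stay inside the valid
                step = 1         # 8-bit range [0, 255]
            elif out[i] == 255:
                step = -1
            out[i] += step
    return out

stego = lsb_match_embed([128, 37, 254, 0], [1, 1, 0, 1])
```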
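Threepak and Watcharapupong score log entries by Shannon entropy and set boundaries from the mean and standard deviation of entropy per period. The sketch below implements that scoring; the sample requests and the choice to flag entries below the Beta line are illustrative assumptions, since the abstract leaves the decision rule to the detection scheme.

```python
import math
from collections import Counter

def shannon_entropy(s: str) -> float:
    """Shannon entropy of a string, in bits per character."""
    n, counts = len(s), Counter(s)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

def alpha_beta(entropies: list[float]) -> tuple[float, float]:
    """Alpha = AVG - SD and Beta = AVG - 2*SD, as in the abstract."""
    n = len(entropies)
    avg = sum(entropies) / n
    sd = math.sqrt(sum((e - avg) ** 2 for e in entropies) / n)
    return avg - sd, avg - 2 * sd

requests = ["GET /index.html", "GET /img/logo.png",
            "GET /search?q=%27%20OR%201=1%20--"]
ents = [shannon_entropy(r) for r in requests]
alpha, beta = alpha_beta(ents)
review = [r for r, e in zip(requests, ents) if e < beta]  # flag outliers
```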
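Zhou et al. encrypt in the prediction error domain so that a keyed permutation hides content while leaving statistics an arithmetic coder can still compress. Their clustering and coding stages are omitted here; this toy sketch shows only a left-neighbor predictor and a key-seeded shuffle, with all names hypothetical and random.Random standing in for a proper key-driven generator.

```python
import random

def prediction_errors(pixels: list[int]) -> list[int]:
    """Left-neighbor predictor: error_i = pixel_i - pixel_{i-1}."""
    return [pixels[0]] + [p - q for p, q in zip(pixels[1:], pixels)]

def keyed_permutation(values: list[int], key: int) -> list[int]:
    """Hide content by shuffling positions with a key-seeded permutation.
    (The paper permutes within prediction-error clusters instead.)"""
    rng = random.Random(key)
    idx = list(range(len(values)))
    rng.shuffle(idx)
    return [values[i] for i in idx]

errors = prediction_errors([100, 101, 99, 99, 180])
encrypted = keyed_permutation(errors, key=0xC0FFEE)
```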
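Portmann's subject is the Wegman-Carter scheme: tag each message with an ε-almost strongly universal₂ hash of the message, encrypted with a fresh one-time pad, so that the hash key can be recycled. The sketch below uses a polynomial hash over a prime field as the universal family; the family and the parameter values are illustrative, not those from the paper.

```python
P = 2**61 - 1  # a Mersenne prime; hashes and pads live in GF(P)

def poly_hash(key: int, message: bytes) -> int:
    """Polynomial universal hash h_k(m) = sum_i m_i * k^i mod P,
    evaluated by Horner's rule."""
    acc = 0
    for byte in reversed(message):
        acc = (acc * key + byte) % P
    return acc

def wegman_carter_tag(hash_key: int, pad: int, message: bytes) -> int:
    """Tag = universal hash of the message, masked by a one-time pad.
    The pad must be fresh per message; hash_key may be recycled."""
    return (poly_hash(hash_key, message) + pad) % P

tag = wegman_carter_tag(hash_key=123456789, pad=987654321,
                        message=b"transfer 100 to alice")
```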
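Most of the DNA-based algorithms Jain and Bhatnagar compare start from a binary-to-nucleotide encoding at two bits per base. The mapping table below is one common choice, shown purely as an example; each cited algorithm fixes its own table.

```python
BASE = {"00": "A", "01": "C", "10": "G", "11": "T"}
INV = {v: k for k, v in BASE.items()}

def to_dna(data: bytes) -> str:
    """Encode bytes as a DNA strand, two bits per nucleotide."""
    bits = "".join(f"{b:08b}" for b in data)
    return "".join(BASE[bits[i:i + 2]] for i in range(0, len(bits), 2))

def from_dna(strand: str) -> bytes:
    """Invert the encoding: four bases yield one byte."""
    bits = "".join(INV[base] for base in strand)
    return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))

strand = to_dna(b"key")
assert from_dna(strand) == b"key"
```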
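Arya and Kumar select features by information-theoretic relevance to the class label. The core measure is mutual information, sketched below from co-occurrence counts over discrete features; the toy data and the exact selection rule are assumptions, as the paper's criterion may differ in detail.

```python
import math
from collections import Counter

def mutual_information(xs: list, cs: list) -> float:
    """I(X;C) = sum_{x,c} p(x,c) * log2( p(x,c) / (p(x) * p(c)) )."""
    n = len(xs)
    pxc = Counter(zip(xs, cs))
    px, pc = Counter(xs), Counter(cs)
    return sum((k / n) * math.log2((k / n) / ((px[x] / n) * (pc[c] / n)))
               for (x, c), k in pxc.items())

# Rank a candidate feature by relevance to the attack/normal label.
labels  = ["attack", "normal", "attack", "normal"]
feature = ["tcp", "udp", "tcp", "udp"]
score = mutual_information(feature, labels)  # 1.0 bit: fully informative
```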
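Kang et al. build their fuzzy extractor around a BCH syndrome, which is beyond a short sketch; the code-offset construction below substitutes a 5x repetition code as the error-correcting code purely for illustration (a real design would use BCH and add privacy amplification). Helper data is published once at enrollment, and a noisy re-reading of the PUF recovers the same key as long as each block has at most two flipped bits.

```python
import secrets

REP = 5  # repetition factor; majority vote corrects <= 2 flips per block

def enroll(puf_bits: list[int]) -> tuple[list[int], list[int]]:
    """Gen: draw a random key, publish helper = puf XOR encode(key)."""
    key = [secrets.randbelow(2) for _ in range(len(puf_bits) // REP)]
    code = [b for b in key for _ in range(REP)]  # repetition encoding
    helper = [p ^ c for p, c in zip(puf_bits, code)]
    return key, helper

def reproduce(noisy_bits: list[int], helper: list[int]) -> list[int]:
    """Rep: XOR off the helper, then majority-decode each block."""
    word = [p ^ h for p, h in zip(noisy_bits, helper)]
    return [int(sum(word[i:i + REP]) > REP // 2)
            for i in range(0, len(word), REP)]

puf = [secrets.randbelow(2) for _ in range(40)]
key, helper = enroll(puf)
noisy = list(puf)
noisy[3] ^= 1; noisy[17] ^= 1  # simulate two noisy re-read bits
assert reproduce(noisy, helper) == key
```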

Note:

Articles listed on these pages have been found on publicly available internet pages and are cited with links to those pages. Some of the information included herein has been reprinted with permission from the authors or data repositories. Direct any requests via Email to SoS.Project (at) SecureDataBank.net for removal of the links or modifications to specific citations. Please include the ID# of the specific citation in your correspondence.