Bibliography
A new framework is presented in this paper for proving coding theorems for linear codes, in which the systematic bits and the corresponding parity-check bits play different roles. Precisely, the noisy systematic bits are used to limit the list size of typical codewords, while the noisy parity-check bits are used to select the maximum-likelihood codeword from the list. This framework allows the systematic bits and the parity-check bits to be transmitted in different ways and over different channels; in particular, it unifies the source coding theorems and the channel coding theorems. With this framework, we prove that Bernoulli generator matrix codes (BGMCs) are capacity-achieving over binary-input output-symmetric (BIOS) channels and entropy-achieving for Bernoulli sources.
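As an illustration of the decoding roles the abstract assigns to the two kinds of bits, here is a minimal, hypothetical Python sketch (not the paper's proof apparatus): a random Bernoulli generator matrix produces the parity part, the noisy systematic bits bound a list of candidate messages, and the noisy parity bits pick the most likely candidate. The block lengths, the BSC channel, and the Hamming-radius list rule are all illustrative assumptions.

```python
import itertools
import numpy as np

rng = np.random.default_rng(0)

k, m, p = 8, 8, 0.05            # info bits, parity bits, BSC crossover (toy)
G = rng.integers(0, 2, (k, m))  # Bernoulli(1/2) generator for the parity part

def encode(u):
    return np.concatenate([u, u @ G % 2])       # systematic code: [u, uG]

def bsc(x):
    return x ^ (rng.random(x.size) < p).astype(int)

def decode(y, r=2):
    """Noisy systematic bits define a small list (all u within Hamming
    radius r of the received systematic part); noisy parity bits select
    the list member with the fewest parity disagreements (ML for a BSC)."""
    ys, yp = y[:k], y[k:]
    best, best_d = None, np.inf
    for flips in itertools.chain.from_iterable(
            itertools.combinations(range(k), t) for t in range(r + 1)):
        u = ys.copy()
        u[list(flips)] ^= 1                     # candidate message in the list
        d = np.sum((u @ G % 2) ^ yp)            # parity disagreement count
        if d < best_d:
            best, best_d = u, d
    return best

u = rng.integers(0, 2, k)
print(u, decode(bsc(encode(u))))
```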
ISSN: 2157-8117
In this work, we consider the application of non-stationary channel polarization theory to the wiretap channel model with non-stationary blocks. In particular, we present a time-bit coding scheme: a secure polar code constructed on virtual bit blocks using non-stationary channel polarization theory. We prove that this time-bit coding scheme achieves reliability, strong security, and the secrecy capacity. Moreover, compared with regular secure polar coding methods, our scheme has lower coding complexity for non-stationary channel blocks.
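The time-bit construction itself is not spelled out in the abstract, so the sketch below shows only the standard polar transform x = u F^(kron n) with F = [[1,0],[1,1]] that any polar coding scheme, stationary or not, builds on; treat it as background, not as the authors' scheme.

```python
import numpy as np

def polar_transform(u):
    """Arikan's polar transform x = u * (F kron ... kron F) over GF(2),
    computed in place with butterfly stages; len(u) must be a power of two."""
    x = np.array(u, dtype=int) % 2
    n = x.size
    step = 1
    while step < n:
        for i in range(0, n, 2 * step):
            x[i:i + step] ^= x[i + step:i + 2 * step]   # (a, b) -> (a^b, b)
        step *= 2
    return x

print(polar_transform([1, 0, 1, 1]))   # encode one length-4 block
```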
With the advent of massive machine-type communications, security protection has become more important than ever. Efforts have been made to build security protection into physical-layer signal design, so-called physical-layer security (PLS). The purpose of this paper is to evaluate the performance of PLS schemes for multiple-input multiple-output (MIMO) systems with space-time block coding (STBC) under imperfect channel estimation. Three PLS schemes for STBC are modeled, their bit error rate (BER) performance is evaluated under various channel-estimation-error environments, and their performance characteristics are analyzed.
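As a hedged illustration of BER evaluation under imperfect channel estimation (the paper's three PLS schemes are not reproduced), the following sketch simulates a 2x1 Alamouti STBC link with BPSK over Rayleigh fading, where the receiver combines with a noisy channel estimate h_hat = h + e; the SNR definition and error-variance values are assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def alamouti_ber(snr_db, sigma_e2, n_blocks=100_000):
    """BER of 2x1 Alamouti/BPSK over Rayleigh fading; the combiner uses
    h_hat = h + e with e ~ CN(0, sigma_e2) modeling estimation error."""
    snr = 10 ** (snr_db / 10)
    bits = rng.integers(0, 2, (n_blocks, 2))
    s = 1 - 2 * bits                                    # BPSK symbols s1, s2
    h = (rng.standard_normal((n_blocks, 2)) +
         1j * rng.standard_normal((n_blocks, 2))) / np.sqrt(2)
    n = (rng.standard_normal((n_blocks, 2)) +
         1j * rng.standard_normal((n_blocks, 2))) / np.sqrt(2 * snr)
    # two received slots: r1 = h1*s1 + h2*s2, r2 = -h1*conj(s2) + h2*conj(s1)
    r1 = h[:, 0] * s[:, 0] + h[:, 1] * s[:, 1] + n[:, 0]
    r2 = -h[:, 0] * np.conj(s[:, 1]) + h[:, 1] * np.conj(s[:, 0]) + n[:, 1]
    e = (rng.standard_normal((n_blocks, 2)) +
         1j * rng.standard_normal((n_blocks, 2))) * np.sqrt(sigma_e2 / 2)
    hh = h + e                                          # imperfect estimate
    s1 = np.conj(hh[:, 0]) * r1 + hh[:, 1] * np.conj(r2)   # Alamouti combiner
    s2 = np.conj(hh[:, 1]) * r1 - hh[:, 0] * np.conj(r2)
    dec = np.stack([s1.real < 0, s2.real < 0], axis=1).astype(int)
    return np.mean(dec != bits)

for se2 in (0.0, 0.01, 0.1):                # growing estimation error
    print(se2, alamouti_ber(10, se2))       # BER degrades with sigma_e2
```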
ISSN: 2163-0771
Cutting-edge biometric recognition systems extract distinctive feature vectors from biometric samples using deep neural networks to measure the amount of (dis-)similarity between two biometric samples. Studies have shown that personal information (e.g., health condition, ethnicity, etc.) can be inferred from those feature vectors, and biometric samples can be reconstructed from them, making their protection an urgent necessity. State-of-the-art biometric protection solutions are based on homomorphic encryption (HE), performing recognition over encrypted feature vectors and hiding the features and their processing while releasing the outcome only. However, this comes at a cost in efficiency, since HE is expensive when many multiplications are required; for (dis-)similarity measures, the number of multiplications is proportional to the vector's dimension. In this paper, we tackle the HE performance bottleneck by freeing the two common (dis-)similarity measures, the cosine similarity and the squared Euclidean distance, from multiplications. Assuming normalized feature vectors, our approach pre-computes and organizes those (dis-)similarity measures into lookup tables, which transforms their computation into simple table lookups and summations. We study quantization parameters for the values in the lookup tables and evaluate performance on both synthetic and facial feature vectors, achieving recognition performance identical to the non-tabularized baseline systems. We then assess efficiency under HE and record runtimes between 28.95 ms and 59.35 ms for the three security levels, demonstrating the enhanced speed.
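A plaintext sketch of the table-lookup idea (the paper's HE-side organization of the tables is not reproduced, and the number of quantization levels is an assumption): products of quantized feature values are precomputed once, so each comparison needs lookups and additions only.

```python
import numpy as np

rng = np.random.default_rng(2)

def build_table(n_levels=256, lo=-1.0, hi=1.0):
    """Precompute products of all pairs of quantized feature values so the
    inner product (cosine similarity for normalized vectors) needs no
    multiplications at comparison time."""
    q = np.linspace(lo, hi, n_levels)
    return q, np.outer(q, q)                 # table[i, j] = q[i] * q[j]

def quantize(v, q):
    return np.abs(v[:, None] - q[None, :]).argmin(axis=1)  # nearest level

q, table = build_table()
x = rng.standard_normal(512); x /= np.linalg.norm(x)
y = rng.standard_normal(512); y /= np.linalg.norm(y)
ix, iy = quantize(x, q), quantize(y, q)

approx = table[ix, iy].sum()                 # lookups + summation only
print(approx, x @ y)                         # close to the exact cosine
```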
ISSN: 2474-9699
Face recognition is a biometric technique that uses a computer or machine to recognize human faces. Its advantage is that it can detect faces without direct contact with the device. In practice, the security of face recognition data systems still receives little attention. Therefore, this study proposes a technique for securing data stored in the face recognition system database. It implements the Viola-Jones algorithm, the Kanade-Lucas-Tomasi (KLT) algorithm, and principal component analysis (PCA), and secures the database using XOR encryption. Several tests and analyses have been performed with this method. The histogram analysis shows that the encrypted images reveal no visual information about the plain images. In addition, the correlation between the encrypted and plain images is weak, and an entropy value of around 7.9 indicates high security against statistical attacks. The average time required to carry out the recognition process is 0.7896 s.
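A minimal sketch of the database-security step and two of the reported analyses, assuming a toy keystream-XOR cipher and a random stand-in array in place of a real face image:

```python
import numpy as np

rng = np.random.default_rng(3)

def xor_encrypt(img, key_seed):
    """XOR each pixel with a pseudorandom keystream (toy stream cipher);
    decryption is the same operation with the same seed."""
    ks = np.random.default_rng(key_seed).integers(0, 256, img.shape,
                                                  dtype=np.uint8)
    return img ^ ks

def entropy(img):
    p = np.bincount(img.ravel(), minlength=256) / img.size
    p = p[p > 0]
    return -(p * np.log2(p)).sum()           # good cipher image: close to 8

plain = rng.integers(0, 256, (64, 64), dtype=np.uint8)  # stand-in face image
cipher = xor_encrypt(plain, key_seed=42)

print("cipher entropy:", entropy(cipher))
print("plain/cipher correlation:",
      np.corrcoef(plain.ravel(), cipher.ravel())[0, 1])  # should be near 0
```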
Cancelable biometrics is an emerging technology for protecting the privacy of a person's biometric content and thereby the person's identity. Instead of being stored directly in the authentication database, the biometric information is transformed into a non-invertible coded format that is used for granting access. The conversion into an encrypted code requires an encryption key provided by the user. Both invertible and non-invertible coding techniques exist, but non-invertible ones provide additional security to the user. In this paper, a non-invertible cancelable biometric method is proposed in which the biometric image information is canceled and encoded using a user-provided encryption key. The code is generated from the image histogram after iteratively updating bins to the maximal value, and it is then encrypted with the Hill cipher. This code is stored in the database instead of the biometric information. The technique is applied to a set of retinal images taken from the Indian Diabetic Retinopathy database.
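A hedged sketch of the encode-then-encrypt step: the paper's iterative bin-updating rule is only loosely paraphrased, and the key matrix here is an illustrative invertible-mod-256 choice, not one from the paper.

```python
import numpy as np

K = np.array([[3, 3], [2, 5]])            # det = 9 (odd) => invertible mod 256
K_inv = np.array([[29, 85], [142, 171]])  # K @ K_inv = I (mod 256)

def hill_apply(code, key):
    """Apply a Hill cipher block-wise (2-byte blocks) modulo 256; the same
    routine decrypts when given the inverse key matrix."""
    c = np.asarray(code, dtype=int).reshape(-1, 2)
    return (c @ key.T % 256).ravel()

def histogram_code(img, bins=16):
    # Derive a fixed-length code from the image histogram; the paper's
    # "bin updating to the maximal value" step is paraphrased as a simple
    # rescaling of bins to the 0..255 range.
    h, _ = np.histogram(img, bins=bins, range=(0, 256))
    return (h * 255 // max(h.max(), 1)).astype(int)

rng = np.random.default_rng(4)
retina = rng.integers(0, 256, (32, 32))   # stand-in for a retinal image
code = histogram_code(retina)
enc = hill_apply(code, K)                 # stored in the database
dec = hill_apply(enc, K_inv)              # recoverable only with the key
print(np.array_equal(dec, code))          # True
```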
In healthcare 4.0 ecosystems, authentication of healthcare information assures health stakeholders that data originates from the correct source. Biometric-based authentication has recently become a preferred choice, but as the templates are stored on central servers, there is a high risk of copying and generating fake biometrics. An adversary can forge the biometric pattern and gain access to critical health systems. To address this limitation, the paper proposes a scheme, PHBio, in which an encryption-based biometric system is applied before the template is stored on the server. Once a user provides his or her biometrics, the authentication process does not decrypt the data; rather, it uses a homomorphic Paillier cryptosystem. The scheme presents the encryption and comparison parts, the latter based on the Euclidean distance (EUD) between the user input and the template stored on the server. We take the minimum distance and compare it with a predefined threshold distance to confirm a biometric match and authenticate the user. The scheme is evaluated on parameters such as accuracy, false rejection rate (FRR), and execution time. The results indicate the validity of the scheme in real-time health setups.
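A sketch of a homomorphic Euclidean-distance comparison in the spirit of the abstract, using the python-paillier (phe) package; the feature values, dimension, and threshold are illustrative assumptions. Additive homomorphism suffices here because Enc(sum (x_i - y_i)^2) can be assembled from Enc(x_i), Enc(x_i^2), and the plaintext template y held by the server.

```python
import numpy as np
from phe import paillier          # pip install phe (python-paillier)

pub, priv = paillier.generate_paillier_keypair(n_length=2048)

template = np.array([7, 2, 9, 4])  # stored plaintext template y (assumed)
probe = np.array([6, 2, 10, 4])    # user's fresh biometric x (assumed)

# Client side: encrypt x_i and x_i^2; the server never sees x in the clear.
enc_x = [pub.encrypt(int(v)) for v in probe]
enc_x2 = [pub.encrypt(int(v * v)) for v in probe]

# Server side: Enc(sum x_i^2) + sum (-2*y_i)*Enc(x_i) + sum y_i^2
# equals Enc(sum (x_i - y_i)^2), using only additions and scalar products.
enc_dist = sum(enc_x2[1:], enc_x2[0])
for c, y in zip(enc_x, template):
    enc_dist = enc_dist + c * int(-2 * y)
enc_dist = enc_dist + int(np.sum(template ** 2))

THRESHOLD = 4                      # assumed decision threshold
dist = priv.decrypt(enc_dist)      # decrypted only for the final decision
print("match" if dist <= THRESHOLD else "reject", dist)
```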
The Advanced Encryption Standard (AES) algorithm plays an important role in data security applications. The S-box module in AES provides the bulk of the confusion and diffusion during encryption and causes significant path-delay overhead. In most cases, either LUTs or embedded memories are used for S-box computation, which are vulnerable to attacks that pose a serious risk to real-world applications. In this paper, composite-field-arithmetic-based SubBytes and inverse SubBytes operations for AES are implemented. The proposed work includes an efficient multi-round AES cryptosystem with higher-order transformation and a composite-field S-box formulation, with possible inner-stage pipelining schemes that can be used for throughput enhancement along with path-delay optimization. Finally, biometric-driven key generation schemes are used to formulate the cipher key dynamically, providing a higher degree of security for the computing devices.
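The paper's composite-field GF((2^4)^2) datapath is hardware-oriented; the sketch below instead computes the same SubBytes values directly in GF(2^8) (multiplicative inverse followed by the AES affine map), which is what any composite-field S-box implementation must reproduce.

```python
def gf_mul(a, b):
    """Multiply in GF(2^8) with the AES polynomial x^8+x^4+x^3+x+1 (0x11B)."""
    r = 0
    while b:
        if b & 1:
            r ^= a
        a <<= 1
        if a & 0x100:
            a ^= 0x11B
        b >>= 1
    return r

def pow_gf(a, e):
    r = 1
    while e:                       # square-and-multiply in GF(2^8)
        if e & 1:
            r = gf_mul(r, a)
        a = gf_mul(a, a)
        e >>= 1
    return r

def gf_inv(a):
    return pow_gf(a, 254) if a else 0   # a^254 = a^(-1); AES maps 0 -> 0

def sbox(x):
    """SubBytes: multiplicative inverse, then the AES affine transform."""
    y = gf_inv(x)
    res = 0
    for i in range(8):
        bit = ((y >> i) ^ (y >> ((i + 4) % 8)) ^ (y >> ((i + 5) % 8)) ^
               (y >> ((i + 6) % 8)) ^ (y >> ((i + 7) % 8)) ^ (0x63 >> i)) & 1
        res |= bit << i
    return res

print(hex(sbox(0x00)), hex(sbox(0x53)))  # expected: 0x63 0xed (FIPS-197)
```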
Efficient large-scale biometric identification is a challenging open problem in biometrics today. Adding biometric information protection by cryptographic techniques increases the computational workload even further. Therefore, this paper proposes an efficient and improved use of coefficient packing for homomorphically protected biometric templates, allowing for the evaluation of multiple biometric comparisons at the cost of one. In combination with feature dimensionality reduction, the proposed technique facilitates a quadratic computational workload reduction for biometric identification, while long-term protection of the sensitive biometric data is maintained throughout the system. In previous works on using coefficient packing, only a linear speed-up was reported. In an experimental evaluation on a public face database, efficient identification in the encrypted domain is achieved on off-the-shelf hardware with no loss in recognition performance. In particular, the proposed improved use of coefficient packing allows for a computational workload reduction down to 1.6% of a conventional homomorphically protected identification system without improved packing.
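A plaintext simulation of the coefficient-packing idea (no actual HE library involved): packing K gallery templates into one polynomial lets a single polynomial product, standing in for one homomorphic multiplication, yield all K inner-product scores at once.

```python
import numpy as np

rng = np.random.default_rng(5)
d, K = 8, 4                                # feature dim, templates per pack

probe = rng.integers(-3, 4, d)
gallery = rng.integers(-3, 4, (K, d))

# Pack all K gallery templates, each reversed inside its length-d slot.
packed = np.concatenate([g[::-1] for g in gallery])

# np.polymul expects highest-degree-first coefficients, so feed the arrays
# reversed; this one product plays the role of one homomorphic multiplication.
prod = np.polymul(probe[::-1], packed[::-1])[::-1]

# The inner product with template k lands at coefficient index k*d + d - 1.
packed_scores = [int(prod[k * d + d - 1]) for k in range(K)]
print(packed_scores)
print([int(probe @ g) for g in gallery])   # matches: K comparisons, one product
```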
Considered sensitive information by ISO/IEC 24745, biometric data should be stored and used in a protected way; otherwise, the privacy and security of end-users can be compromised. The advent of quantum computers also demands quantum-resistant solutions. This work proposes the use of the Kyber and Saber public-key encryption (PKE) algorithms together with homomorphic encryption (HE) in a face recognition system. Kyber and Saber, both based on lattice cryptography, were finalists in the third round of the NIST post-quantum cryptography standardization process, after which Kyber was selected as the PKE algorithm to be standardized. Experimental results show that the recognition performance of the unprotected face recognition system is preserved under protection, with smaller protected-template and key sizes and shorter execution times than other lattice-based HE schemes reported in the literature. The parameter sets considered achieve security levels of 128, 192, and 256 bits.
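Kyber and Saber themselves (module-LWE/LWR with compression) are not reproduced here; as background, a toy single-bit LWE encryption sketch shows the lattice principle they rest on, with Kyber's modulus but otherwise illustrative toy parameters.

```python
import numpy as np

rng = np.random.default_rng(6)
q, n, m = 3329, 16, 64          # Kyber's modulus; toy dimensions (insecure)

def keygen():
    s = rng.integers(-2, 3, n)                 # small secret
    A = rng.integers(0, q, (m, n))
    e = rng.integers(-2, 3, m)                 # small error
    return (A, (A @ s + e) % q), s             # pk = (A, b), sk = s

def encrypt(pk, bit):
    A, b = pk
    r = rng.integers(0, 2, m)                  # random subset selector
    u = r @ A % q
    v = (r @ b + bit * (q // 2)) % q           # message in the "high bit"
    return u, v

def decrypt(sk, ct):
    u, v = ct
    d = (v - u @ sk) % q                       # = bit*(q//2) + small noise
    return int(q // 4 < d < 3 * q // 4)        # round to 0 or q//2

pk, sk = keygen()
for bit in (0, 1):
    print(bit, decrypt(sk, encrypt(pk, bit)))  # round-trips correctly
```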
ISSN: 1617-5468
In this paper, an overall introduction to fingerprint encryption algorithms is given, and a fingerprint encryption algorithm with error correction is then designed by adding an error-correction mechanism. The new algorithm produces a stochastic key in the form of polynomial coefficients using the binary system sequencer, encrypts the fingerprint, and uses Lagrange interpolation to reconstruct the polynomial during authentication. Because a cyclic redundancy check code is used to identify the correct key, the accuracy of the algorithm can be ensured. Experimental results indicate that the fuzzy vault algorithm with error correction achieves template protection and meets the requirements of biometric information security. They also indicate that the system's security can be enhanced by changing the key length.
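A toy fuzzy-vault sketch of the described pipeline, with illustrative parameters: the key (polynomial coefficients) plus a CRC coefficient is locked under genuine feature points and chaff, and unlocking Lagrange-interpolates candidate point subsets and keeps the candidate whose CRC checks. The paper's key-generation step is not reproduced.

```python
import itertools, random, zlib

P = 65537                                   # prime field for the vault (toy)
DEG = 3                                     # secret polynomial degree
random.seed(7)

def poly_eval(coeffs, x):                   # coeffs lowest-degree first
    r = 0
    for c in reversed(coeffs):
        r = (r * x + c) % P
    return r

def lagrange(points):
    """Coefficients (lowest first) of the polynomial through the points."""
    coeffs = [0] * len(points)
    for i, (xi, yi) in enumerate(points):
        num, den = [1], 1
        for j, (xj, _) in enumerate(points):
            if i != j:                       # multiply num by (x - xj)
                num = [(-xj * a + b) % P for a, b in zip(num + [0], [0] + num)]
                den = den * (xi - xj) % P
        w = yi * pow(den, P - 2, P) % P
        coeffs = [(c + w * v) % P for c, v in zip(coeffs, num)]
    return coeffs

def lock(secret, features, n_chaff=20):
    key = secret + [zlib.crc32(bytes(secret)) % P]   # append CRC coefficient
    vault = [(x, poly_eval(key, x)) for x in features]
    used = set(features)
    while len(vault) < len(features) + n_chaff:      # add chaff points
        x, y = random.randrange(P), random.randrange(P)
        if x not in used and y != poly_eval(key, x):
            vault.append((x, y)); used.add(x)
    random.shuffle(vault)
    return vault

def unlock(vault, query):
    cand = [p for p in vault if p[0] in set(query)]  # points matching query
    for pts in itertools.combinations(cand, DEG + 2):
        c = lagrange(list(pts))
        if (all(v < 256 for v in c[:-1])             # CRC identifies true key
                and zlib.crc32(bytes(c[:-1])) % P == c[-1]):
            return c[:-1]
    return None

secret = [11, 22, 33, 44]                   # DEG+1 key coefficients (toy)
features = [5, 17, 29, 41, 53, 60]          # enrolled minutiae values (toy)
vault = lock(secret, features)
print(unlock(vault, [5, 17, 29, 41, 53]))   # genuine query recovers the key
```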
When face biometric samples are stored as JPEG2000-encoded images in accordance with ISO/IEC 19794, they must be encrypted for the sake of users' privacy. The literature suggests selective encryption of JPEG2000 images as a fast and efficient encryption method; the trade-off is that some information is left in plaintext, which an attacker could exploit if the encrypted biometric samples are leaked. In this work, we attempt to use a convolutional neural network to perform cryptanalysis of the encryption scheme. That is, we assess whether any information left in plaintext in the selectively encrypted face images can be used to identify the person. The chosen approach is to train CNNs for biometric face recognition not only with plaintext face samples but additionally to conduct refinement training with partially encrypted data. If such a system can successfully use encrypted face samples for biometric matching, then the information left in encrypted biometric face samples is actually usable for biometric recognition. The method works, and we show that a supposedly secure biometric sample still contains identifying information on average over the whole database.
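A hedged PyTorch sketch of the refinement-training step; the backbone, the number of identities, and the loader of selectively encrypted samples are all placeholders, since the abstract does not specify them.

```python
import torch
import torch.nn as nn

# Placeholder backbone: any CNN pretrained for face recognition on plaintext
# images would be used here; this tiny stand-in is not the paper's network.
backbone = nn.Sequential(
    nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(1),
    nn.Flatten(), nn.Linear(16, 100))       # 100 identities (assumed)

def refine(model, enc_loader, epochs=5, lr=1e-4):
    """Refinement training on partially encrypted face samples: if the
    refined model's accuracy stays above chance, the plaintext residue in
    the selectively encrypted images is usable for recognition."""
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(epochs):
        for x, y in enc_loader:             # (encrypted image, identity)
            opt.zero_grad()
            loss = loss_fn(model(x), y)
            loss.backward()
            opt.step()
    return model
```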
ISSN: 2831-7475
Biometric security is a fast-growing area that has received considerable attention over the past few years. Digital hiding and encryption technologies provide an effective solution for securing biometric information against intentional or accidental attacks. Visual cryptography is an approach for encrypting information that takes the form of visual data, such as images. Since the biometric templates stored in databases are generally images, visual cryptography can be employed effectively to protect the templates from attack. This study develops a share creation with improved encryption process for secure biometric verification (SCIEP-SBV) technique. The presented SCIEP-SBV technique aims to attain security via encryption and a share creation (SC) procedure. First, the biometric images undergo the SC process to produce several shares. For the encryption process, homomorphic encryption (HE) is utilized. To further improve secrecy, an improved bald eagle search (IBES) approach is exploited. The simulation values of the SCIEP-SBV technique are tested on biometric images. An extensive comparison study demonstrates the improved outcomes of the SCIEP-SBV technique over compared methods.
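The abstract does not detail the SC procedure, so the sketch below shows the basic 2-out-of-2 XOR share-creation primitive commonly used in visual cryptography for grayscale images; the HE and IBES stages are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(8)

def make_shares(template):
    """2-out-of-2 XOR share creation: each share alone is uniform noise,
    and XOR-ing both shares restores the biometric template exactly."""
    share1 = rng.integers(0, 256, template.shape, dtype=np.uint8)
    share2 = template ^ share1
    return share1, share2

template = rng.integers(0, 256, (64, 64), dtype=np.uint8)  # stand-in image
s1, s2 = make_shares(template)
print(np.array_equal(s1 ^ s2, template))    # True: reconstruction is lossless
```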
Code optimization is an essential feature of compilers, and almost all software products are released after compiler optimization. Consequently, bugs in code optimization inevitably have a significant impact on the correctness of software systems. Locating optimization bugs in compilers is challenging, as compilers typically support a large number of optimization configurations. Although prior studies have proposed locating compiler bugs by generating witness test programs, they remain time-consuming and insufficiently effective. To address these limitations, we propose ODFL, an automatic bug localization approach that locates compiler optimization bugs by differentiating finer-grained options. Specifically, we first disable, one at a time, the fine-grained options that are enabled by default under the bug-triggering optimization level, to identify bug-free and bug-related fine-grained options. We then configure several effective passing and failing optimization sequences based on these fine-grained options to collect coverage from multiple failing and passing compilations. Finally, the generated coverage information is fed to spectrum-based fault localization formulae to rank the suspicious compiler files. We ran ODFL on 60 buggy GCC compilers from an existing benchmark. The experimental results show that ODFL significantly outperforms RecBi, the state-of-the-art compiler bug isolation approach, in terms of all evaluated metrics, demonstrating its effectiveness. In addition, ODFL is much more efficient than RecBi, saving more than 88% of bug-localization time on average.
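As a sketch of the last step, here is spectrum-based fault localization with the Ochiai formula (one common SBFL formula; the abstract does not name the one ODFL uses) ranking compiler files from passing/failing coverage; the file and run names are made up.

```python
import math

def ochiai_rank(coverage, outcomes):
    """Rank program elements (here: compiler files) by the Ochiai formula
    computed from pass/fail coverage spectra.

    coverage: {run_id: set of files exercised}
    outcomes: {run_id: 'pass' or 'fail'}
    """
    files = set().union(*coverage.values())
    total_fail = sum(1 for o in outcomes.values() if o == 'fail')
    scores = {}
    for f in files:
        ef = sum(1 for r, cov in coverage.items()
                 if f in cov and outcomes[r] == 'fail')  # failing runs hitting f
        ep = sum(1 for r, cov in coverage.items()
                 if f in cov and outcomes[r] == 'pass')  # passing runs hitting f
        denom = math.sqrt(total_fail * (ef + ep))
        scores[f] = ef / denom if denom else 0.0
    return sorted(scores.items(), key=lambda kv: -kv[1])

coverage = {'r1': {'gcse.c', 'tree.c'}, 'r2': {'gcse.c'}, 'r3': {'tree.c'}}
outcomes = {'r1': 'fail', 'r2': 'fail', 'r3': 'pass'}
print(ochiai_rank(coverage, outcomes))   # gcse.c ranked most suspicious
```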
ISSN: 1534-5351