Ternary Convolutional LDGM Codes with Applications to Gaussian Source Compression

Title: Ternary Convolutional LDGM Codes with Applications to Gaussian Source Compression
Publication Type: Conference Paper
Year of Publication: 2022
Authors: Zhu, Tingting; Liang, Jifan; Ma, Xiao
Conference Name: 2022 IEEE International Symposium on Information Theory (ISIT)
Date Published: June
Keywords: block Markov superposition transmission (BMST), coding theory, composability, compositionality, Convolutional codes, convolutional LDGM codes, cryptography, distortion, Entropy, Gaussian source, Generators, Markov processes, Metrics, pubcrawl, Quantization (signal), resilience, Resiliency, security, source coding, ternary compression
Abstract: We present a ternary source coding scheme in this paper, which is a special class of low-density generator matrix (LDGM) codes. We prove that a ternary linear block LDGM code, whose generator matrix is randomly generated with each element independent and identically distributed, is universal for source coding in terms of the symbol-error rate (SER). To circumvent high-complexity maximum-likelihood decoding, we introduce a special class of convolutional LDGM codes, called block Markov superposition transmission of repetition (BMST-R) codes, which are iteratively decodable by a sliding window algorithm. The presented BMST-R codes are then applied to construct a tandem scheme for Gaussian source compression, where a dead-zone quantizer is introduced before the ternary source coding. The main advantages of this scheme are its universality and flexibility: the dead-zone quantizer can choose a proper quantization level according to the distortion requirement, while the LDGM codes can adapt the code rate to approach the entropy of the quantized sequence. Numerical results show that the proposed scheme performs well for ternary sources over a wide range of code rates, and that the distortion introduced by quantization dominates provided the code rate is slightly greater than the discrete entropy.
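The abstract describes a tandem pipeline: a dead-zone quantizer maps the Gaussian source to discrete symbols, and the code rate is then matched to the empirical entropy of the quantized sequence. As a minimal illustrative sketch (not the authors' implementation; the function names, the step size, and the dead-zone width are assumptions for illustration), a generic dead-zone quantizer and an empirical-entropy estimate can be written as:

```python
import numpy as np

def dead_zone_quantize(x, step, dead_zone_width):
    """Dead-zone quantizer sketch: samples whose magnitude falls inside the
    dead zone around zero map to index 0; the rest are uniformly quantized
    with the given step size. (Illustrative only, not the paper's design.)"""
    half_dz = dead_zone_width / 2.0
    idx = np.sign(x) * np.floor((np.abs(x) - half_dz) / step + 1)
    return np.where(np.abs(x) <= half_dz, 0, idx).astype(int)

def empirical_entropy(symbols):
    """Empirical (discrete) entropy in bits of a symbol sequence, which the
    scheme's code rate would aim to approach."""
    _, counts = np.unique(symbols, return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

# Example: quantize a Gaussian sample sequence and estimate its entropy.
rng = np.random.default_rng(0)
q = dead_zone_quantize(rng.standard_normal(10000), step=1.0, dead_zone_width=1.0)
rate_target = empirical_entropy(q)  # bits/symbol the source code should approach
```

Widening the dead zone or the step trades distortion for a lower discrete entropy, which mirrors the flexibility claim in the abstract: pick the quantizer for the distortion target, then pick the code rate to approach the resulting entropy.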
Notes: ISSN 2157-8117
DOI: 10.1109/ISIT50566.2022.9834500
Citation Key: zhu_ternary_2022