Graph Based Transforms based on Graph Neural Networks for Predictive Transform Coding

Title: Graph Based Transforms based on Graph Neural Networks for Predictive Transform Coding
Publication Type: Conference Paper
Year of Publication: 2021
Authors: Roy, Debaleena; Guha, Tanaya; Sanchez, Victor
Conference Name: 2021 Data Compression Conference (DCC)
Date Published: March 2021
Keywords: Artificial neural networks, composability, Compression, Cyber-physical systems, DCT, Decoding, DST, GBST, GBT, GBT L, GBT NN, graph neural networks, KLT, Laplace equations, mean square error methods, network coding, Neural Network, Predictive Metrics, Predictive Transform Coding, pubcrawl, Resiliency, Transform coding, Transforms
Abstract: This paper introduces the GBT-NN, a novel class of Graph-based Transform within the context of block-based predictive transform coding using intra-prediction. The GBT-NN is constructed by learning a mapping function to map a graph Laplacian representing the covariance matrix of the current block. Our objective in learning such a mapping function is to design a GBT that performs as well as the KLT without requiring the covariance matrix to be explicitly computed for each residual block to be transformed. To avoid signalling any additional information required to compute the inverse GBT-NN, we also introduce a coding framework that uses a template-based prediction to predict residuals at the decoder. Evaluation results on several video frames and medical images, in terms of the percentage of preserved energy and mean square error, show that the GBT-NN can outperform the DST and DCT.
DOI: 10.1109/DCC50243.2021.00079
Citation Key: roy_graph_2021
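
For readers unfamiliar with graph-based transforms, the following is a minimal sketch of the generic GBT construction the abstract builds on: the eigenvectors of a graph Laplacian serve as an orthonormal transform basis for a residual block. This is not the paper's GBT-NN (which learns the Laplacian via a neural network and avoids per-block covariance estimation); the path-graph example and all function names below are illustrative assumptions only.

import numpy as np

def gbt_from_laplacian(L: np.ndarray) -> np.ndarray:
    """Return a GBT basis: eigenvectors of the symmetric graph Laplacian L,
    ordered by ascending eigenvalue (low graph frequencies first)."""
    eigvals, eigvecs = np.linalg.eigh(L)
    return eigvecs[:, np.argsort(eigvals)]

def path_graph_laplacian(n: int) -> np.ndarray:
    """Combinatorial Laplacian D - A of an unweighted path graph with n nodes.
    Its eigenvectors coincide with the DCT-II basis, a common sanity check."""
    A = np.diag(np.ones(n - 1), 1) + np.diag(np.ones(n - 1), -1)  # adjacency
    D = np.diag(A.sum(axis=1))                                     # degrees
    return D - A

# Example: transform a length-8 residual vector and reconstruct it exactly.
n = 8
U = gbt_from_laplacian(path_graph_laplacian(n))
residual = np.random.default_rng(0).standard_normal(n)
coeffs = U.T @ residual        # forward GBT
reconstructed = U @ coeffs     # inverse GBT (U is orthonormal)
assert np.allclose(residual, reconstructed)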