Channel Coding Theorems in Non-stochastic Information Theory

Title: Channel Coding Theorems in Non-stochastic Information Theory
Publication Type: Conference Paper
Year of Publication: 2021
Authors: Rangi, Anshuka; Franceschetti, Massimo
Conference Name: 2021 IEEE International Symposium on Information Theory (ISIT)
Date Published: July 2021
Keywords: (ε, δ)-capacity, channel coding, Channel estimation, coding theorem, composability, Computer science, Estimation, Kolmogorov capacity, Learning systems, Metrics, Mutual information, pubcrawl, Resiliency, security, Shannon capacity, sufficient conditions, zero-error capacity
Abstract: Recently, the d-mutual information between uncertain variables has been introduced as a generalization of Nair's non-stochastic mutual information functional [1], [2]. Within this framework, we introduce four different notions of capacity and present corresponding coding theorems. Our definitions include an analogue of Shannon's capacity in a non-stochastic setting, and a generalization of the zero-error capacity. The associated coding theorems hold for stationary, memoryless, non-stochastic uncertain channels. These results establish the relationship between the d-mutual information and our operational definitions, providing a step towards the development of a complete non-stochastic information theory.
DOI: 10.1109/ISIT45174.2021.9518008
Citation Key: rangi_channel_2021
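
For convenience, the fields above can be assembled into a BibTeX entry. This is a minimal sketch based only on the record as listed (using the citation key given above and omitting fields, such as page numbers, that the record does not provide):

@inproceedings{rangi_channel_2021,
  author    = {Rangi, Anshuka and Franceschetti, Massimo},
  title     = {Channel Coding Theorems in Non-stochastic Information Theory},
  booktitle = {2021 IEEE International Symposium on Information Theory (ISIT)},
  year      = {2021},
  month     = jul,
  doi       = {10.1109/ISIT45174.2021.9518008}
}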