
Probl. Peredachi Inf., 2007, Volume 43, Issue 1, Pages 15–27 (Mi ppi2)

This article is cited in 8 papers

Information Theory

On Inequalities between Information and Variation

V. V. Prelov

A. A. Kharkevich Institute for Information Transmission Problems, Russian Academy of Sciences

Abstract: We continue the study of the relationship between mutual information and variational distance initiated in Pinsker's paper [1], where an upper bound on the mutual information in terms of the variational distance was obtained. We present a simple lower bound, which in some cases is optimal or asymptotically optimal. A uniform upper bound on the mutual information in terms of the variational distance is also derived for random variables with a finite number of values. For such random variables, the asymptotic behaviour of the maximum of mutual information is also investigated in the cases where the variational distance tends either to zero or to its maximum value.
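For context, the two quantities compared in the abstract can be written in their standard form (the paper's own notation is not reproduced on this page, so the following is only an assumed sketch): for discrete random variables $X$ and $Y$ with joint distribution $p(x,y)$ and marginals $p(x)$, $p(y)$,
\[
I(X;Y) = \sum_{x,y} p(x,y)\,\log\frac{p(x,y)}{p(x)\,p(y)},
\qquad
V(X;Y) = \sum_{x,y} \bigl|\,p(x,y) - p(x)\,p(y)\,\bigr|,
\]
and the bounds discussed in the abstract relate $I(X;Y)$ to $V(X;Y)$ from above and from below.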

UDC: 621.391.1

Received: 21.11.2006


 English version:
Problems of Information Transmission, 2007, 43:1, 12–22
