
Probl. Peredachi Inf., 2009 Volume 45, Issue 4, Pages 3–17 (Mi ppi1995)

This article is cited in 5 papers

Information Theory

Mutual information of several random variables and its estimation via variation

V. V. Prelov

A. A. Kharkevich Institute for Information Transmission Problems, Russian Academy of Sciences

Abstract: We obtain upper and lower bounds for the maximum of the mutual information of several random variables in terms of the variational distance between the joint distribution of these random variables and the product of its marginal distributions. In this connection, we derive some properties of the variational distance between probability distributions of this type. We show that in some special cases the estimates of the maximum of mutual information obtained here are optimal or asymptotically optimal. Some results of this paper generalize the corresponding results of [1–3] to the multivariate case.
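The two quantities related by the bounds in the abstract can be computed directly for small finite alphabets. The sketch below, with hypothetical names not taken from the paper, evaluates the mutual information of several random variables (the Kullback–Leibler divergence between the joint distribution and the product of its marginals, in bits) together with the variational distance between those same two distributions:

```python
import itertools
import math

def total_correlation_and_variation(joint):
    """Given a joint pmf as a dict mapping value tuples -> probability,
    return (I, V): the mutual information of the component variables
    (KL divergence of the joint from the product of its marginals)
    and the variational distance between those two distributions.
    Illustrative sketch only; not the paper's notation."""
    n = len(next(iter(joint)))
    # marginals[i] maps each value of the i-th variable to its probability
    marginals = [dict() for _ in range(n)]
    for x, p in joint.items():
        for i in range(n):
            marginals[i][x[i]] = marginals[i].get(x[i], 0.0) + p
    # enumerate the full product support so V also counts mass
    # that the product distribution places outside the joint's support
    supports = [sorted(m) for m in marginals]
    I = 0.0
    V = 0.0
    for x in itertools.product(*supports):
        q = 1.0
        for i in range(n):
            q *= marginals[i][x[i]]
        p = joint.get(x, 0.0)
        if p > 0:
            I += p * math.log2(p / q)  # KL divergence term, in bits
        V += abs(p - q)                # variational distance term
    return I, V

# Example: three fully dependent fair binary variables (X1 = X2 = X3)
joint = {(0, 0, 0): 0.5, (1, 1, 1): 0.5}
I, V = total_correlation_and_variation(joint)
# I = 2 bits; V = 2*|0.5 - 0.125| + 6*0.125 = 1.5
```

On this example the two distributions differ maximally on the diagonal points, which is the kind of extremal configuration for which bounds of this type tend to be tight.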

UDC: 621.391.1+519.2

Received: 12.05.2009


 English version:
Problems of Information Transmission, 2009, 45:4, 295–308

