
Probl. Peredachi Inf., 2010 Volume 46, Issue 2, Pages 24–29 (Mi ppi2013)


Information Theory

On computation of information via variation and inequalities for the entropy function

V. V. Prelov

A. A. Kharkevich Institute for Information Transmission Problems, Russian Academy of Sciences

Abstract: This paper supplements the author's paper [1]. We obtain an explicit formula that, in a special case, allows us to calculate the maximum of the mutual information of several random variables via the variational distance between their joint distribution and the product of their marginal distributions. We also establish two new inequalities for the binary entropy function that are related to the problem considered here.

UDC: 621.391.1+519.7

Received: 26.01.2010


 English version:
Problems of Information Transmission, 2010, 46:2, 122–126

