Probl. Peredachi Inf., 2016, Volume 52, Issue 4, Pages 3–13 (Mi ppi2218)

Information Theory

On some extremal problems for mutual information and entropy

V. V. Prelov

Kharkevich Institute for Information Transmission Problems, Russian Academy of Sciences, Moscow, Russia

Abstract: The problem of determining the maximum mutual information $I(X;Y)$ and the minimum entropy $H(X,Y)$ of a pair of discrete random variables $X$ and $Y$ is considered under the condition that the probability distribution of $X$ is fixed and the error probability $\mathrm{Pr}\{Y\ne X\}$ takes a given value $\varepsilon$, $0\le\varepsilon\le1$. Precise values for these quantities are found, which in several cases makes it possible to obtain explicit formulas for both the maximum mutual information and the minimum entropy in terms of the probability distribution of $X$ and the parameter $\varepsilon$.
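The paper derives exact values for these extrema; the formulas themselves are not reproduced here. As an illustration of the problem setup only, the following is a minimal numerical sketch (not from the paper) that approximates both extrema by random search over admissible joint distributions, i.e., joint pmfs $P(i,j)=\mathrm{Pr}\{X=i,Y=j\}$ whose $X$-marginal equals a fixed $p$ and whose off-diagonal mass equals $\varepsilon$. The example distribution p and the value eps below are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def entropy(probs):
    """Shannon entropy in bits, ignoring zero entries."""
    probs = probs[probs > 0]
    return -np.sum(probs * np.log2(probs))

def random_joint(p, eps):
    """Random joint pmf P[i, j] = Pr{X=i, Y=j} whose row sums equal p
    and whose total off-diagonal mass (the error probability) is eps."""
    n = len(p)
    while True:
        e = eps * rng.dirichlet(np.ones(n))  # error mass assigned to each row
        if np.all(e <= p):                   # row i can give up at most p[i]
            break
    P = np.zeros((n, n))
    for i in range(n):
        P[i, i] = p[i] - e[i]                # mass kept on the diagonal (Y = X)
        off = e[i] * rng.dirichlet(np.ones(n - 1))
        P[i, [j for j in range(n) if j != i]] = off
    return P

p = np.array([0.5, 0.3, 0.2])  # fixed distribution of X (example values)
eps = 0.1                      # given error probability Pr{Y != X}

best_I, best_H = -np.inf, np.inf
for _ in range(20000):
    P = random_joint(p, eps)
    H_joint = entropy(P.ravel())                             # H(X,Y)
    I = entropy(p) + entropy(P.sum(axis=0)) - H_joint        # I(X;Y)
    best_I = max(best_I, I)
    best_H = min(best_H, H_joint)

print(f"approx max I(X;Y) = {best_I:.4f} bits")
print(f"approx min H(X,Y) = {best_H:.4f} bits")
```

The sketch uses the identity $I(X;Y)=H(X)+H(Y)-H(X,Y)$; since $H(X)$ is fixed by the constraint on the distribution of $X$, maximizing $I(X;Y)$ and minimizing $H(X,Y)$ are related but distinct problems, which is why both are tracked separately. The random search only approximates the extrema; the paper's closed-form results are exact.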

UDC: 621.391.1+519.72

Received: 01.12.2015
Revised: 14.10.2016


English version:
Problems of Information Transmission, 2016, 52:4, 319–328
