
Probl. Peredachi Inf., 2014 Volume 50, Issue 3, Pages 3–18 (Mi ppi2141)


Information Theory

On one extreme value problem for entropy and error probability

V. V. Prelov

Kharkevich Institute for Information Transmission Problems, Russian Academy of Sciences, Moscow, Russia

Abstract: The problem of determining both the maximum and minimum entropy of a random variable $Y$, as well as the maximum absolute value of the difference between the entropies of $Y$ and another random variable $X$, is considered under the condition that the probability distribution of $X$ is fixed and the error probability (i.e., the probability that the values of $X$ and $Y$ do not coincide) is given. An exact expression for the minimum entropy of $Y$ is found. Conditions under which the entropy of $Y$ attains its maximum value are pointed out. In the remaining cases, lower and upper bounds are obtained for the maximum entropy of $Y$, as well as for the maximum absolute value of the difference between the entropies of $Y$ and $X$.
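To make the setup concrete, the following minimal Python sketch computes the quantities involved: the entropies $H(X)$ and $H(Y)$ and the error probability $\Pr\{X \neq Y\}$ for one admissible joint distribution of $(X, Y)$. The distribution of $X$ and the cyclic-shift coupling are arbitrary choices for illustration only; they are not the paper's extremal constructions.

```python
from math import log2

def entropy(p):
    """Shannon entropy (in bits) of a probability vector."""
    return -sum(pi * log2(pi) for pi in p if pi > 0)

# Fixed distribution of X (chosen arbitrarily for illustration).
p_x = [0.7, 0.2, 0.1]
n = len(p_x)

# One admissible joint distribution of (X, Y): with probability 1 - eps,
# Y copies X; with probability eps, Y moves to the next symbol (mod n).
eps = 0.1
joint = [[0.0] * n for _ in range(n)]
for i, pi in enumerate(p_x):
    joint[i][i] = pi * (1 - eps)
    joint[i][(i + 1) % n] = pi * eps

# Error probability P(X != Y) is the total off-diagonal mass;
# the marginal of Y is obtained by summing each column.
p_err = sum(joint[i][j] for i in range(n) for j in range(n) if i != j)
p_y = [sum(joint[i][j] for i in range(n)) for j in range(n)]

print(f"P(X != Y) = {p_err:.3f}")
print(f"H(X) = {entropy(p_x):.4f} bits, H(Y) = {entropy(p_y):.4f} bits")
print(f"|H(Y) - H(X)| = {abs(entropy(p_y) - entropy(p_x)):.4f} bits")
```

The extremal problem studied in the paper asks, over all such couplings with the given error probability, how small or large $H(Y)$ (and hence $|H(Y) - H(X)|$) can be made.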

UDC: 621.391.1+519.72

Received: 21.01.2014
Revised: 19.06.2014


English version:
Problems of Information Transmission, 2014, 50:3, 203–216


