Probl. Peredachi Inf., 2017 Volume 53, Issue 3, Pages 16–22 (Mi ppi2239)

This article is cited in 3 papers

Information Theory

On coupling of probability distributions and estimating the divergence through variation

V. V. Prelov

Kharkevich Institute for Information Transmission Problems, Russian Academy of Sciences, Moscow, Russia

Abstract: Let $X$ be a discrete random variable with a given probability distribution. For any $\alpha$, $0\le\alpha\le1$, we obtain the exact maximum and minimum values of the variational distance between $X$ and another random variable $Y$ under which an $\alpha$-coupling of these random variables is possible. We also obtain the maximum and minimum values of the coupling of $X$ and $Y$ when the variational distance between these random variables is fixed. As a consequence, we derive a new lower bound on the divergence in terms of the variational distance.
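The paper's exact extremal values are not reproduced on this page, but the two standard quantities it relates can be illustrated numerically. The sketch below (function names are hypothetical) computes the variational distance $V(P,Q)=\sum_i|p_i-q_i|$, the classical maximal-coupling identity $\max\Pr\{X=Y\}=\sum_i\min(p_i,q_i)=1-V/2$, and checks the well-known Pinsker-type lower bound $D(P\|Q)\ge V^2/2$ (in nats), of which the paper's result is a refinement:

```python
from math import log

def variational_distance(p, q):
    # V(P, Q) = sum_i |p_i - q_i|, the L1 distance; takes values in [0, 2]
    return sum(abs(pi - qi) for pi, qi in zip(p, q))

def max_coupling_prob(p, q):
    # Maximal coupling: max over joint laws of Pr{X = Y}
    # equals sum_i min(p_i, q_i) = 1 - V(P, Q) / 2
    return sum(min(pi, qi) for pi, qi in zip(p, q))

def kl_divergence(p, q):
    # D(P || Q) in nats; assumes q_i > 0 wherever p_i > 0
    return sum(pi * log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]
V = variational_distance(p, q)        # ≈ 0.2
print(max_coupling_prob(p, q))        # ≈ 0.9, i.e. 1 - V/2
print(kl_divergence(p, q) >= V**2 / 2)  # Pinsker-type bound holds
```

An $\alpha$-coupling additionally constrains $\Pr\{X=Y\}$ by the parameter $\alpha$; the precise definition and the sharp bounds are given in the paper itself.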

UDC: 621.391.1+519.2

Received: 22.11.2016
Revised: 10.02.2017


 English version:
Problems of Information Transmission, 2017, 53:3, 215–221
