Abstract:
A generalization of Pinsker's problem [1] of estimating mutual information via variational distance is considered. We obtain upper and lower bounds on the maximum of the absolute value of the difference between the mutual informations of several random variables in terms of the variational distance between the probability distributions of these random variables. In some cases, these bounds are optimal.
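For concreteness, the extremal quantity described above can be written schematically as follows; the notation (the symbols $I$, $V$, $\varepsilon$, and $\Delta(\varepsilon)$) is our own illustration under assumed definitions and is not fixed by the abstract itself.
% Schematic form of the quantity studied (notation assumed, not taken from the abstract):
\[
  \Delta(\varepsilon)
  \;=\;
  \sup_{V\bigl(P_{X_1\cdots X_n},\,P_{Y_1\cdots Y_n}\bigr)\,\le\,\varepsilon}
  \bigl|\, I(X_1;\ldots;X_n) - I(Y_1;\ldots;Y_n) \,\bigr|,
\]
% where $I(\,\cdot\,;\ldots;\,\cdot\,)$ denotes the mutual information of several
% random variables, $V(\,\cdot\,,\,\cdot\,)$ the variational distance between their
% joint distributions, and the paper provides upper and lower bounds on $\Delta(\varepsilon)$.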