Abstract:
The main result of this paper amounts to the following statement: if a sequence of pairs of random variables
$(\xi_n,\eta_n)$ converges in variation to a pair of random variables $(\xi,\eta)$, then $\lim_{n\to\infty} I(\xi_n,\eta_n)=I(\xi,\eta)$, where $I(\xi,\eta)$ is the information of the pair $(\xi,\eta)$, if and only if the sequence of corresponding information densities is uniformly integrable. A similar result is proved for entropies and for a new concept, the information of events within a probability $E$. Conditions for the convergence of these quantities are found.
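As a hedged illustration (not part of the paper): for a jointly Gaussian pair with correlation $\rho$, the information density $i(x,y)=\log\frac{f(x,y)}{f(x)f(y)}$ has a closed form, and its expectation is the information $I(\xi,\eta)=-\tfrac12\log(1-\rho^2)$. The sketch below checks numerically that the sample mean of the information density approaches $I(\xi,\eta)$; all function names are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def info_density(x, y, rho):
    # log of the joint density over the product of marginals for a
    # standard bivariate normal with correlation rho (natural log, nats)
    return (-0.5 * np.log(1 - rho**2)
            - (x**2 - 2 * rho * x * y + y**2) / (2 * (1 - rho**2))
            + (x**2 + y**2) / 2)

def mutual_information(rho):
    # closed form for the bivariate normal: I = -1/2 log(1 - rho^2)
    return -0.5 * np.log(1 - rho**2)

rho = 0.6
n = 200_000
cov = [[1.0, rho], [rho, 1.0]]
x, y = rng.multivariate_normal([0.0, 0.0], cov, size=n).T

# Monte Carlo estimate of I(xi, eta) = E[information density]
est = info_density(x, y, rho).mean()
print(est, mutual_information(rho))
```

Here the information density is uniformly integrable along any sequence $\rho_n\to\rho$ with $|\rho|<1$, which is the situation in which the paper's theorem guarantees convergence of the informations.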