Abstract:
This paper is devoted to the mathematical study of divergences based on
mutual information that are well suited to categorical random vectors. These divergences
generalize the “entropy distance” and the “information distance.” Their main characteristic
is that they combine a complexity term with the mutual information. We then introduce the
notion of (normalized) information-based divergence, propose several examples, and discuss
their mathematical properties, in particular in a prediction framework.
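For the reader's convenience, a sketch of the two classical quantities being generalized, using the standard Shannon-theoretic definitions (the notation below is ours; in particular, the choice of the joint entropy $H(X,Y)$ as normalizing complexity term is one common convention, not necessarily the one adopted in the paper): for categorical random variables $X$ and $Y$,
\[
  d_{\mathrm{ent}}(X,Y) = H(X \mid Y) + H(Y \mid X),
  \qquad
  d_{\mathrm{inf}}(X,Y) = \max\bigl(H(X \mid Y),\, H(Y \mid X)\bigr),
\]
and a normalized variant of the entropy distance is obtained by dividing by the joint entropy:
\[
  \tilde d_{\mathrm{ent}}(X,Y) = \frac{H(X \mid Y) + H(Y \mid X)}{H(X,Y)}
  = 1 - \frac{I(X;Y)}{H(X,Y)},
\]
where the last equality follows from $H(X \mid Y) + H(Y \mid X) = H(X,Y) - I(X;Y)$. Both forms combine a complexity term with the mutual information, which is the pattern the paper abstracts.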