Abstract:
A minimax approach to the problem of optimal information coding, differing from Shannon's probabilistic approach, is formulated, and the construction of codes that are optimal in this sense is given. It is shown that, under a certain natural choice of the estimate of the usefulness of messages, such codes compress messages in proportion to their redundancy (in the Shannon probabilistic sense) simultaneously for a broad class of message sources, regarded as random sequences of symbols. These universal coding methods are simpler to construct, and achieve better efficiency for short block lengths, than the universal quasi-entropic coding method, not taking correlation into account, previously proposed by the author.