
Probl. Peredachi Inf., 1996 Volume 32, Issue 1, Pages 35–40 (Mi ppi299)

Inequalities and Algorithms for Universal Data Compression

J. Ziv


Abstract: We describe nonasymptotic uniform bounds on the performance of data-compression algorithms in cases where the length $N$ of the training sequence (“history”) that is available to the encoder is not large enough to yield the ultimate compression, namely the entropy of the source. Two characteristic ultimate goals are considered: the $\ell$th-order entropy $H(X_1^\ell)$ and the associated conditional entropy $H(X_1^{\ell-k}|X_{-k+1}^0)$. The bounds are based on classical information-theoretic convexity arguments. Nevertheless, it is demonstrated that convexity arguments that work for some cases are totally useless for others. Furthermore, these classical convexity arguments, when properly used, lead to efficient universal data-compression algorithms for each of the cases considered. We limit the discussion to fixed-to-variable, uniquely decipherable codes.
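The $\ell$th-order entropy $H(X_1^\ell)$ that serves as the first compression target can be illustrated by its empirical counterpart, estimated from overlapping length-$\ell$ blocks of a training sequence. The following sketch is purely illustrative (it is not the paper's algorithm, and the function name is our own); it shows the quantity a short history of length $N$ lets the encoder approximate:

```python
from collections import Counter
from math import log2

def empirical_block_entropy(seq, ell):
    """Empirical estimate of the ell-th order entropy H(X_1^ell),
    from overlapping length-ell blocks of a training sequence.
    Illustrative sketch only, not the paper's coding algorithm."""
    blocks = [tuple(seq[i:i + ell]) for i in range(len(seq) - ell + 1)]
    n = len(blocks)
    counts = Counter(blocks)
    # Plug-in (maximum-likelihood) entropy of the block distribution, in bits.
    return -sum((c / n) * log2(c / n) for c in counts.values())

# When N is small relative to ell, the block statistics are undersampled
# and this plug-in estimate is biased -- the regime the paper's
# nonasymptotic bounds address.
h = empirical_block_entropy("01101001100101101001011001101001", 3)
```

For binary data the estimate always lies between $0$ and $\ell$ bits; with a short history it systematically underestimates $H(X_1^\ell)$, which is why nonasymptotic bounds rather than asymptotic entropy limits are the relevant benchmark.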

UDC: 621.391.1:519.28


 English version:
Problems of Information Transmission, 1996, 32:1, 28–32

