
Trudy Mat. Inst. Steklova, 2011 Volume 274, Pages 103–118 (Mi tm3316)

This article is cited in 4 papers

On joint conditional complexity (entropy)

Nikolay K. Vereshchagin^a, Andrej A. Muchnik

^a Department of Mathematical Logic and Theory of Algorithms, Faculty of Mechanics and Mathematics, Moscow State University, Moscow, Russia

Abstract: The conditional Kolmogorov complexity of a word $a$ relative to a word $b$ is the minimum length of a program that prints $a$ given $b$ as an input. We generalize this notion to quadruples of strings $a,b,c,d$: their joint conditional complexity $K((a\to c)\land(b\to d))$ is defined as the minimum length of a program that transforms $a$ into $c$ and transforms $b$ into $d$. In this paper, we prove that the joint conditional complexity cannot be expressed in terms of the usual conditional (and unconditional) Kolmogorov complexity. This result provides a negative answer to the following question, asked by A. Shen at a session of the Kolmogorov seminar at Moscow State University in 1994: Is there a problem of information processing whose complexity is not expressible in terms of the conditional (and unconditional) Kolmogorov complexity? We show that a similar result holds for the classical Shannon entropy. We provide two proofs of each result: an effective one and a “quasi-effective” one. Finally, we present a quasi-effective proof of a strong version of the following statement: there are two strings whose mutual information cannot be extracted. Previously, only a noneffective proof of that statement had been known.
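Kolmogorov complexity is uncomputable, so the quantities in the abstract cannot be evaluated exactly. As an informal illustration only, a common practical heuristic replaces $K(\cdot)$ with the length of a compressed string, and approximates the conditional complexity $K(a\mid b)$ by how much *extra* it costs to compress $a$ once $b$ is available. The sketch below uses Python's `zlib` for this purpose; the function names `C` and `C_cond` are my own illustrative choices, not anything from the paper.

```python
import zlib

def C(s: bytes) -> int:
    """Compressed length: a crude, computable stand-in for K(s)."""
    return len(zlib.compress(s, 9))

def C_cond(target: bytes, given: bytes) -> int:
    """Heuristic proxy for K(target | given): the extra compressed
    length needed to describe `target` when `given` is already known."""
    return max(C(given + target) - C(given), 0)

a = b"abracadabra" * 20          # highly redundant string
b = b"abracadabra" * 20          # identical to a
c = bytes(range(256)) * 2        # unrelated, high-variety data

# a is almost free given b (they coincide), while c is not:
print(C_cond(a, b), C_cond(c, b))
```

This mirrors the intuition behind the definitions above: a short program (here, a short compressed delta) transforms $b$ into $a$, whereas no short description of $c$ relative to $b$ exists. The joint conditional complexity $K((a\to c)\land(b\to d))$ asks for a *single* short program handling both transformations, which is exactly what the paper shows cannot be reduced to pairwise conditional complexities.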

UDC: 510.5

Received in June 2011

English version:
Proceedings of the Steklov Institute of Mathematics, 2011, 274, 90–104



© Steklov Math. Inst. of RAS, 2024