SEMINARS

Mathematical Seminar of the HSE Faculty of Computer Science
October 3, 2025, 18:10, Moscow, Pokrovsky Boulevard 11, room R306


Algorithmic probability and the information distance

Bruno Bauwens


Video: https://www.youtube.com/watch?v=70UZQjRfIss

Abstract: The conditional Kolmogorov complexity of a string x given a string y is the minimal length of a program that on input y prints x and halts. Andrei Kolmogorov promoted the study of this notion in 1964 in the hope of finding a theory that could somehow justify his famous axioms of probability. To connect it with probability, however, one should use a variant of complexity based on self-delimiting programs. This notion can be defined in four different ways, one of which is the logarithm of algorithmic probability (in discrete form). This probability was introduced by Solomonoff in 1960 to describe learning in a very general way.
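For concreteness, here is a minimal sketch of the standard definitions behind this paragraph; the notation (U, K, m) is the usual one from the literature, not taken from the announcement. Writing U for a universal prefix-free (self-delimiting) machine,
\[
K(x \mid y) = \min\{\, |p| : U(p, y) = x \,\}, \qquad
\mathbf{m}(x) = \sum_{p \,:\, U(p) = x} 2^{-|p|},
\]
so the discrete algorithmic probability \(\mathbf{m}(x)\) is the chance that a randomly chosen self-delimiting program prints x. The coding theorem mentioned below states that \(K(x) = -\log \mathbf{m}(x) + O(1)\), which is the sense in which one of the four definitions "is" the logarithm of algorithmic probability.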
In various applications there is a need for a symmetric notion of conditional complexity. The first proposal, from 1998, is to consider the minimal length of a program that prints x on input y and also prints y on input x. The authors also prove that the other symmetrized definitions of conditional complexity are close to each other, but conjecture that they do not coincide. Recently, I proved this conjecture and also showed that the four definitions differ only in strange corner cases (for example, when one string needs to be exponentially more complex than the other).
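A hedged sketch of this symmetrized notion, again in standard notation rather than that of the talk: the 1998 proposal (presumably the information distance paper of Bennett, Gács, Li, Vitányi, and Zurek) defines
\[
E(x, y) = \min\{\, |p| : U(p, y) = x \text{ and } U(p, x) = y \,\},
\]
and shows that this coincides with \(\max\{K(x \mid y), K(y \mid x)\}\) up to an additive logarithmic term, so the conjectured (and now proven) differences between the symmetrized variants must sit within that logarithmic slack.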
In this talk, I will briefly discuss applications of algorithmic probability and the algorithmic information distance to machine learning. Then I will prove the coding theorem and its approximate bidirectional variant. Finally, I will discuss recent results.

Website: https://vk.com/cshse?from=groups&z=video-69306530_456240151

