
Colloquium of the Faculty of Computer Science
September 11, 2014 16:40, Moscow


Algorithmic Information Theory and Randomness of Individual Objects

Alexander Shen a,b

a A. A. Kharkevich Institute for Information Transmission Problems, Russian Academy of Sciences, Moscow
b Laboratoire d'Informatique Fondamentale de Marseille



Abstract: In the mid-20th century, Shannon introduced the concept of entropy, which can be intuitively described as the “average number of bits of information in a single value of a random variable.” However, this concept is not applicable to individual objects, such as the text of a novel or a DNA molecule: there is no random variable without an ensemble of similar objects.
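For reference, the standard formula (not part of the original abstract): if a random variable X takes the value x_i with probability p_i, its Shannon entropy is

\[ H(X) = -\sum_i p_i \log_2 p_i, \]

measured in bits; it is the expected number of bits needed to encode one value of X under an optimal code.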
In the mid-1960s several people (Kolmogorov, Solomonoff, Levin, Chaitin, …) realized that the amount of information in, or the complexity of, an individual object can be defined as the minimal length of a computer program that generates this object, under some natural assumptions on the programming language. Thus algorithmic information theory appeared; it proved to be connected to various domains such as the philosophical foundations of probability theory (when can a statistical hypothesis be rejected?), combinatorics (inequalities between the cardinalities of sets and their projections), and the theory of computation.
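In the standard notation of the theory (added here for reference, not part of the original abstract), the complexity of a binary string x with respect to a universal machine U is

\[ C_U(x) = \min \{\, |p| : U(p) = x \,\}, \]

and the invariance theorem guarantees that switching from one universal machine U to another universal machine V changes the complexity by at most an additive constant, C_U(x) \le C_V(x) + O(1); this is why the choice of programming language is inessential.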
I will present the basic definitions and some fundamental results of this theory and, depending on the audience, give a glimpse of current work.

Website: https://www.youtube.com/watch?v=X0Lo5IWLjko

