SEMINARS
Causal inference and Kolmogorov complexity

Bruno Bauwens, Faculty of Computer Science, National Research University "Higher School of Economics"
Abstract: It is often said that "correlation does not imply causation": a dependency between the observed values of two random variables need not imply a causal connection between the underlying processes (and even if such a connection exists, its direction may be unknown). In Shannon information theory, this is reflected by the law of "symmetry of information": H(X) - H(X|Y) = H(Y) - H(Y|X), i.e., the mutual information between X and Y is symmetric. This law remains valid if Shannon entropy is replaced by Kolmogorov complexity. However, there exists a subtler setting in which this law is violated, and one might speculate that this asymmetry can be used to reconstruct causality. In the second part of the talk, we discuss the postulate of independence of conditionals for inferring causal relations from observed data. This postulate was introduced by D. Janzing and B. Schölkopf in 2010; in the two-variable case it states that, if X causes Y, then the marginal distribution P(X) and the conditional distribution P(Y|X) are algorithmically independent.

Language: English
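As a small illustration of the Shannon version of symmetry of information, the sketch below checks numerically that H(X) + H(Y|X) = H(Y) + H(X|Y) = H(X,Y) for a joint distribution. The distribution itself is a hypothetical example chosen only to make the identity concrete; any joint distribution would do.

```python
from math import log2

# Hypothetical joint distribution p(x, y) over two binary variables,
# chosen only to illustrate the identity.
p = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.2, (1, 1): 0.3}

def H(dist):
    """Shannon entropy (in bits) of a distribution {outcome: probability}."""
    return -sum(q * log2(q) for q in dist.values() if q > 0)

# Marginal distributions of X and Y
px = {x: sum(q for (a, _), q in p.items() if a == x) for x in (0, 1)}
py = {y: sum(q for (_, b), q in p.items() if b == y) for y in (0, 1)}

# Conditional entropies via the chain rule: H(Y|X) = H(X,Y) - H(X), etc.
Hxy = H(p)
H_y_given_x = Hxy - H(px)
H_x_given_y = Hxy - H(py)

# Symmetry of information: both decompositions recover the joint entropy,
assert abs((H(px) + H_y_given_x) - Hxy) < 1e-12
assert abs((H(py) + H_x_given_y) - Hxy) < 1e-12
# and the mutual information is symmetric: I(X:Y) = I(Y:X).
I_xy = H(px) - H_x_given_y
I_yx = H(py) - H_y_given_x
assert abs(I_xy - I_yx) < 1e-12
```

For Kolmogorov complexity the analogous statement, K(x) + K(y|x) = K(y) + K(x|y) up to logarithmic terms, is not computable, which is why the talk's subtler violation of this law is a theoretical rather than an empirical observation.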