On Parallels Between Shannon’s and Kolmogorov’s Information Theories (where the parallelism fails and why)
Two versions of information theory, the theory of Shannon entropy and the theory of Kolmogorov complexity, have manifest similarities both in their basic definitions and in deep technical theorems. The interplay between these two theories often leads to remarkable insights. In this talk we will present several examples of this interplay, concerning information inequalities and conditional encoding theorems, and discuss the limits of this parallelism.
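A standard illustration of this parallelism (not part of the abstract itself, stated here in the usual notation, with H for Shannon entropy and C for plain Kolmogorov complexity) is the chain rule, which holds exactly in Shannon's theory and only up to a logarithmic error term in Kolmogorov's theory (the Kolmogorov--Levin theorem):

\[
H(X, Y) = H(X) + H(Y \mid X),
\qquad
C(x, y) = C(x) + C(y \mid x) + O(\log C(x, y)).
\]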