

Appears in the collection: 2016 - T1 - WS1 - Distributed computation and communication theme
Proving an impossibility result in information theory typically boils down to quantifying a tension between information measures that naturally emerge in an operational setting, and then showing that the extremal points satisfy a single-letterization (tensorization) property. In contrast, studies on ‘strong data processing’ essentially dispense with the operational aspect and directly investigate functionals of information measures and the properties they enjoy (e.g., tensorization). In this talk, we adopt the latter approach and prove a strengthening of Shannon's Entropy Power Inequality (EPI) that is closely tied to strong data processing for Gaussian channels. The new inequality generalizes Costa's EPI and leads to short converse proofs for Gaussian multiterminal source coding problems where none were previously known.
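For reference, the two inequalities named in the abstract can be stated as follows; these are the standard formulations, not a statement of the new result presented in the talk. Shannon's EPI says that for independent random vectors $X$ and $Y$ in $\mathbb{R}^n$ with densities,

\[
  e^{2h(X+Y)/n} \;\ge\; e^{2h(X)/n} + e^{2h(Y)/n},
\]

where $h(\cdot)$ denotes differential entropy, with equality iff $X$ and $Y$ are Gaussian with proportional covariance matrices. Costa's EPI asserts that the entropy power along the Gaussian perturbation path,

\[
  N\!\left(X + \sqrt{t}\,Z\right) \;=\; \frac{1}{2\pi e}\, e^{\frac{2}{n} h\left(X + \sqrt{t}\,Z\right)},
\]

is a concave function of $t \ge 0$, where $Z$ is a standard Gaussian vector independent of $X$; this sharpens Shannon's EPI in the special case where one summand is Gaussian.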