

Appears in the collection: 2016 - T1 - WS5 - Secrecy and privacy theme
If you see a cryptographic hash of my password, how can you quantify your uncertainty about the password? Entropy – a traditional measure of uncertainty – is of little help: conditioned on your knowledge of the hash value, the distribution of my passwords has small support, and yet your knowledge should not be of much help to you in guessing my password. Conversely, if you see a cryptographic hash of a long document, how can you quantify my certainty about the document? Entropy is again of little help: there are many possible documents that hash to the same value, and yet you can be fairly certain that I know at most one of them. Computational analogues of entropy provide answers to these questions and enable a host of cryptographic applications. In fact, the right notion of entropy and a toolbox of lemmas can make for beautifully simple proofs. This talk will survey some of the notions, lemmas, and applications.
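The password example can be made concrete with a small numerical sketch (my own illustration, not from the talk; the 4-digit PIN space and the specific secret are assumptions). Before the hash is revealed, a uniform PIN has about 13.3 bits of min-entropy; once the hash is known, the conditional distribution typically has support of size one, so its information-theoretic min-entropy collapses to roughly zero, even though actually inverting the hash is the computationally hard part:

```python
import hashlib
import math

def sha256_hex(s: str) -> str:
    """Hex digest of SHA-256 of a string."""
    return hashlib.sha256(s.encode()).hexdigest()

# Uniform prior over all 4-digit PINs (a toy password space).
pins = [f"{i:04d}" for i in range(10000)]

secret = "4831"                 # hypothetical secret PIN
observed = sha256_hex(secret)   # what the observer sees

# Conditional distribution of the PIN given the observed hash:
# only PINs consistent with the hash keep nonzero probability.
consistent = [p for p in pins if sha256_hex(p) == observed]
posterior_prob = 1 / len(consistent)

min_entropy_prior = math.log2(len(pins))          # ~13.29 bits
min_entropy_posterior = -math.log2(posterior_prob)  # ~0 bits: tiny support

print(f"prior min-entropy:     {min_entropy_prior:.2f} bits")
print(f"posterior min-entropy: {min_entropy_posterior:.2f} bits")
print(f"support given hash:    {len(consistent)}")
```

With overwhelming probability no two 4-digit PINs collide under SHA-256, so the posterior support has size one and its min-entropy is zero; the residual hardness of guessing lives entirely in the cost of inverting the hash, which is exactly what computational notions of entropy are designed to capture.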