Appears in collection: Nexus Trimester - 2016 - Secrecy and Privacy Theme
Common information aims to characterize the common randomness between two random variables. Two distinct, but related, definitions of common information were proposed in the 1970s, one by Gács and Körner and one by Wyner, each with operational significance for a particular coding problem. In this talk, we will show how these classical measures are useful tools in the characterization and derivation of several recent results in security and privacy. In particular, common information characterizes which channels are complete for secure two-party computation, and it plays a key role in the simplified converse proof. Common information also characterizes which joint distributions are trivial to sample securely, which has applications in realizing correlated equilibria for games preceded by communication. Finally, in the problem of privacy-preserving data release, common information characterizes whether the best privacy-utility tradeoff attainable by an output perturbation mechanism is optimal in general or strictly suboptimal.
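For reference, the two classical definitions alluded to above (not stated in the abstract itself) are standard. For a pair of random variables $(X,Y)$ with joint distribution $P_{XY}$:

```latex
% Gács–Körner common information: the largest entropy of a random
% variable that both parties can extract deterministically.
K(X;Y) = \max_{\substack{f,g:\; f(X) = g(Y) \text{ a.s.}}} H(f(X))

% Wyner common information: the minimum rate of a shared variable W
% that renders X and Y conditionally independent (X - W - Y).
C(X;Y) = \min_{\substack{P_{W|XY}:\; X - W - Y}} I(X,Y;W)
```

These satisfy $K(X;Y) \le I(X;Y) \le C(X;Y)$, with both inequalities strict in general; the gap between them is one way to quantify how "non-trivial" the correlation in $(X,Y)$ is, which is the lens the results in the talk exploit.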