

Wasserstein gradient flows and applications to sampling in machine learning - lecture 1
By Anna Korba


Wasserstein gradient flows and applications to sampling in machine learning - lecture 2
By Anna Korba
Appears in collection: Shannon 100
This year, we celebrate Shannon's 100th birthday, and it has been 68 years since he laid the foundations of communications. To realize his number one goal of error-free communication, we use error-correcting codes. Every time we make a call, connect to WiFi, download a movie, or store a file, they help us get things right. The journey began with codes based on algebraic structures, such as Reed-Muller and Reed-Solomon codes. Then lattices helped convey continuous-valued signals. Slowly, deterministic codes made way for random sparse graph codes with low-complexity message-passing decoding, such as Turbo codes and LDPC codes. The new millennium brought us Polar codes, which use the chain rule of mutual information to achieve capacity, and spatially-coupled codes, which exploit the physical mechanism that makes crystals grow to simultaneously achieve the capacity of a large family of communication channels. Recently, the story has come full circle, and the symmetry inherent in algebraic constructions has brought the focus back to Reed-Muller codes. I will describe how ideas from such diverse areas as abstract algebra, number theory, probability, information theory, and physics slowly made it from the blackboard into products, and outline the main challenges we face today.
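
For context, the chain rule of mutual information mentioned in connection with Polar codes is the standard textbook identity (sketched here for reference, not taken from the talk itself): for two channel inputs X_1, X_2 and an output Y,

    I(X_1, X_2; Y) = I(X_1; Y) + I(X_2; Y \mid X_1).

In the polar-code construction this identity is applied to two uses of a channel W, splitting them into a degraded synthesized channel W^- and an upgraded one W^+ that conserve capacity,

    I(W^-) + I(W^+) = 2\, I(W), \qquad I(W^-) \le I(W) \le I(W^+).

Recursing this transform drives the synthesized channels toward being either nearly noiseless (carrying information bits) or nearly useless (frozen), which is the mechanism by which Polar codes achieve capacity.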