

Wasserstein gradient flows and applications to sampling in machine learning - lecture 1
By Anna Korba


Wasserstein gradient flows and applications to sampling in machine learning - lecture 2
By Anna Korba
Appears in the collection: 2016 - T1 - WS4 - Inference problems theme
This talk will present some constructions of iteratively decoded sparse-graph codes over various erasure channel models arising from distributed storage systems. Although the state of the art in coding for distributed storage is built on short algebraic block codes, there have been several attempts to use sparse-graph codes, with the aim of improving the decoding complexity and the scalability of the storage system as a whole. The talk will introduce the existing code constructions and discuss the use of graph-based codes in the framework of distributed storage.
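
Over erasure channels, the iterative decoding referred to above is typically an instance of the classical peeling decoder: any parity check that involves exactly one erased symbol determines that symbol, and resolving it may in turn unlock further checks. The following minimal Python sketch illustrates that idea only and is not material from the talk; the toy parity-check structure H and the example codeword are invented for the example.

# Minimal sketch of peeling decoding for a sparse-graph code over a
# binary erasure channel (illustrative toy example, not from the talk).

def peel_decode(H, received):
    """H: list of parity checks, each a list of bit indices whose XOR must be 0.
    received: list of bits in {0, 1}, with None marking an erased position.
    Returns the recovered word; unresolved erasures (a stopping set) stay None."""
    bits = list(received)
    progress = True
    while progress and any(b is None for b in bits):
        progress = False
        for check in H:
            erased = [i for i in check if bits[i] is None]
            if len(erased) == 1:
                # Exactly one unknown in this check: the XOR of the known
                # bits gives the erased one.
                i = erased[0]
                bits[i] = sum(bits[j] for j in check if j != i) % 2
                progress = True
    return bits


if __name__ == "__main__":
    # Toy sparse checks over 7 bits; real distributed-storage designs use
    # much larger, carefully optimised graphs.
    H = [[0, 1, 2, 4], [1, 2, 3, 5], [0, 2, 3, 6]]
    codeword = [1, 0, 1, 1, 0, 0, 1]          # satisfies all checks in H
    erased = [1, None, 1, None, 0, 0, 1]      # two positions lost by the channel
    print(peel_decode(H, erased))             # -> [1, 0, 1, 1, 0, 0, 1]

Because each check in a sparse graph touches only a few symbols, this decoding loop stays low-complexity, which is the motivation cited in the abstract for preferring graph-based codes over the algebraic decoding of short block codes.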