

Wasserstein gradient flows and applications to sampling in machine learning - lecture 1
By Anna Korba


Wasserstein gradient flows and applications to sampling in machine learning - lecture 2
By Anna Korba

Appears in collection : 2016 - T1 - WS2 - Fundamental inequalities and lower bounds theme

In this mini-course, we survey the main techniques developed for proving data structure lower bounds.

On the dynamic data structures side, we cover the Chronogram technique of Fredman and Saks for proving log(n)/log(log(n)) lower bounds, the Information Transfer technique of Patrascu and Demaine for proving log(n) lower bounds, and Larsen's combination of the Chronogram technique and the Cell Sampling technique for proving (log(n)/log(log(n)))^2 lower bounds.

On the static data structures side, we first see Miltersen et al.'s reduction to communication complexity, which yields log(n)/log(S) lower bounds, where S denotes the space usage. We then see the refined reduction of Patrascu and Thorup for proving log(n)/log(S·log(n)/n) lower bounds, and finally the Cell Sampling technique, first introduced by Panigrahy et al. and later refined by Larsen, for proving log(n)/log(S/n) lower bounds.
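The shape of the log(n)/log(S/n) Cell Sampling bound can be motivated by a short counting argument. The following heuristic sketch is not taken from the lectures; the symbols Δ (sample size), m (number of queries), and w (cell width) are introduced here purely for illustration, and constants are suppressed. Assume a static data structure storing n items in S cells of w = O(log n) bits each, answering each of m = n^{O(1)} queries in t cell probes.

```latex
% Sample a uniformly random set C of \Delta = 2n cells out of the S cells.
% A fixed query probing t cells is fully resolved by C with probability
\[
  \Pr[\text{query resolved by } C]
  \;=\; \frac{\binom{S-t}{\Delta-t}}{\binom{S}{\Delta}}
  \;\ge\; \left(\frac{\Delta-t}{S}\right)^{t}
  \;\ge\; \left(\frac{\Delta}{2S}\right)^{t}
  \qquad (\text{using } t \le \Delta/2),
\]
% so in expectation C resolves at least $m\,(\Delta/2S)^{t}$ queries.
% If the query time were too small, namely
\[
  t \;\le\; \frac{\log m}{2\log(2S/\Delta)},
\]
% then $(\Delta/2S)^{t} \ge m^{-1/2}$, so some fixed set of \Delta cells
% resolves at least $\sqrt{m}$ queries, all answerable from only
% $\Delta w = O(n \log n)$ bits. An encoding argument shows that the
% answers to $\sqrt{m}$ queries must carry more information than that,
% a contradiction; hence
\[
  t \;=\; \Omega\!\left(\frac{\log n}{\log(S/n)}\right).
\]
```

The threshold comes out as claimed because with Δ ≈ n and m = n^{O(1)}, the bound log(m)/log(S/Δ) is, up to constants, log(n)/log(S/n).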