

Wasserstein gradient flows and applications to sampling in machine learning - lecture 1
By Anna Korba


Wasserstein gradient flows and applications to sampling in machine learning - lecture 2
By Anna Korba
Appears in the collection: 2016 - T1 - WS2 - Fundamental inequalities and lower bounds theme
It has become possible in recent years to prove unconditional lower bounds on the time needed to perform a number of basic computational operations. I will discuss some of the main techniques involved and show how one in particular, the information transfer method, can be exploited to give unconditional time lower bounds for computation on streaming data. I will then discuss the main open problem in this area, which stems from the fact that existing techniques fail when the arriving stream consists of single bits. I will set out a recently developed method that allows us to make some progress in this setting, and then introduce the mathematical hurdles we still need to overcome.
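The abstract does not name a concrete streaming problem, but a classic target of information-transfer lower bounds is *online convolution*: given a fixed vector of n coefficients, after each value arrives in the stream we must output the inner product of the coefficients with the n most recent values. A minimal sketch of this problem (the naive O(n)-per-arrival algorithm, purely for illustration; the function name and interface are assumptions, not from the talk):

```python
from collections import deque

def online_convolution(coeffs, stream):
    """Yield, after each arriving value, the dot product of `coeffs`
    with the window of the len(coeffs) most recent stream values
    (positions not yet filled are treated as 0)."""
    n = len(coeffs)
    window = deque([0] * n, maxlen=n)  # n most recent values, newest last
    for x in stream:
        window.append(x)  # the oldest value falls off the left end
        # Naive O(n) work per arrival; the information transfer method
        # gives unconditional per-arrival time lower bounds for problems
        # of this kind in the cell-probe model.
        yield sum(c * w for c, w in zip(coeffs, reversed(window)))
```

For example, `list(online_convolution([1, 2], [3, 4, 5]))` returns `[3, 10, 13]`. Note that when each arriving item is a single bit rather than a full word, the standard information-transfer argument breaks down, which is the open problem the abstract refers to.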