

Statistical learning in biological neural networks
By Johannes Schmidt-Hieber


Wasserstein gradient flows and applications to sampling in machine learning - lecture 1
By Anna Korba
Appears in the collection: Schlumberger workshop - Computational and statistical trade-offs in learning
In large-scale data settings, randomized 'sketching' has become an increasingly popular tool. In the numerical linear algebra literature, randomized sketching based on either random projections or sub-sampling has been shown to achieve optimal worst-case error. In particular, the sketched ordinary least-squares (OLS) solution and the CUR decomposition have been shown to achieve optimal approximation error bounds in a worst-case setting. However, until recently there has been limited work considering the performance of the OLS estimator under a statistical model, evaluated with statistical metrics. In this talk I present some recent results addressing the performance of sketching in the statistical setting: assuming an underlying statistical model, we show that many of the existing intuitions and results differ substantially from those in the worst-case algorithmic setting.
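As a rough illustration of the sketched OLS estimator mentioned in the abstract, the following is a minimal sketch using a Gaussian random projection (the abstract covers both projection- and sub-sampling-based sketches; the problem sizes and noise level here are illustrative assumptions, not from the talk):

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative linear model: y = X @ beta + noise.
n, d, m = 2000, 10, 200            # n samples, d features, sketch size m << n
X = rng.standard_normal((n, d))
beta = rng.standard_normal(d)
y = X @ beta + 0.1 * rng.standard_normal(n)

# Gaussian random-projection sketch: compress (X, y) from n rows to m rows.
S = rng.standard_normal((m, n)) / np.sqrt(m)
X_s, y_s = S @ X, S @ y

# Sketched OLS: solve least squares on the small m-row system.
beta_sketch, *_ = np.linalg.lstsq(X_s, y_s, rcond=None)

# Full OLS on the original data, for comparison.
beta_ols, *_ = np.linalg.lstsq(X, y, rcond=None)

# Relative distance between the sketched and full OLS solutions.
rel_err = np.linalg.norm(beta_sketch - beta_ols) / np.linalg.norm(beta_ols)
```

The worst-case guarantees referenced in the abstract control how well `beta_sketch` approximates `beta_ols` in objective value; the statistical results discussed in the talk instead ask how well the sketched estimator recovers the underlying `beta`, which can behave quite differently.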