


Statistical learning in biological neural networks
By Johannes Schmidt-Hieber
Appears in the collection: 2022 - T3 - WS1 - Non-linear and high dimensional inference
Although the application of deep neural networks to real-world problems has become ubiquitous, the question of why they are so effective has not yet been satisfactorily answered. However, some progress has been made in establishing an underlying mathematical foundation. This talk surveys results on statistical risk bounds for deep neural networks. In particular, we focus on the question of when neural networks bypass the curse of dimensionality. We discuss results for vanilla feedforward and convolutional neural networks, in both regression and classification settings.
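
As an illustrative sketch (not part of the original abstract), the "curse of dimensionality" is usually made precise through rates of the following form for estimating a $\beta$-smooth regression function on $[0,1]^d$ from $n$ samples; the notation and the compositional-structure bound below follow the standard formulation in this literature and are stated here only as an indicative example.

% Illustration only: classical minimax rate for a beta-Hölder regression
% function f on [0,1]^d, showing how the exponent deteriorates as the
% ambient dimension d grows (the curse of dimensionality).
\[
  \inf_{\hat f}\ \sup_{f \in \mathcal{C}^{\beta}([0,1]^d)}
  \mathbb{E}\,\bigl\|\hat f - f\bigr\|_{L^2}^{2}
  \;\asymp\; n^{-\frac{2\beta}{2\beta + d}} .
\]
% If f instead has a compositional structure f = g_q \circ \dots \circ g_0,
% where each component g_i is \beta_i-smooth and depends on only t_i of its
% arguments, risk bounds for deep ReLU networks are governed (up to
% logarithmic factors) by the effective dimensions t_i rather than by d:
\[
  \max_{0 \le i \le q}\; n^{-\frac{2\beta_i^{*}}{2\beta_i^{*} + t_i}},
  \qquad
  \beta_i^{*} \;:=\; \beta_i \prod_{\ell = i+1}^{q} \min(\beta_\ell, 1).
\]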