
Statistical theory for deep neural networks - lecture 1

By Johannes Schmidt-Hieber

Appears in collection: CEMRACS 2021: Data Assimilation and Model Reduction in High Dimensional Problems / CEMRACS 2021: Assimilation de données et réduction de modèle pour des problèmes en grande dimension

Recently, a lot of progress has been made on the theoretical understanding of machine learning methods, in particular deep learning. One of the most promising directions is the statistical approach, which interprets machine learning as a collection of statistical methods and builds on existing techniques in mathematical statistics to derive theoretical error bounds and to understand phenomena such as overparametrization. The lecture series surveys this field and describes future challenges.
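
As a rough illustration of the statistical framing mentioned in the abstract (a minimal sketch of the standard nonparametric regression setup often used in this line of work, assumed here rather than taken from the lecture itself), the setting and the quantity that error bounds control can be written as follows:

% Minimal compilable LaTeX sketch; the setup and notation are illustrative
% assumptions, not a transcript of the lecture content.
\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}
We observe $n$ i.i.d.\ pairs $(X_i, Y_i)$ generated from
\begin{equation*}
  Y_i = f_0(X_i) + \varepsilon_i ,
\end{equation*}
with unknown regression function $f_0$ and noise $\varepsilon_i$. A network
estimator $\widehat{f}_n$ is chosen from a class $\mathcal{F}$ of deep ReLU
networks, for example by (approximate) empirical risk minimization,
\begin{equation*}
  \widehat{f}_n \in \operatorname*{arg\,min}_{f \in \mathcal{F}}
  \frac{1}{n} \sum_{i=1}^{n} \bigl( Y_i - f(X_i) \bigr)^2 ,
\end{equation*}
and the statistical theory bounds the prediction risk
\begin{equation*}
  R\bigl(\widehat{f}_n, f_0\bigr)
  = \mathbb{E}\Bigl[ \bigl( \widehat{f}_n(X) - f_0(X) \bigr)^2 \Bigr]
\end{equation*}
in terms of the smoothness and structure of $f_0$, the network architecture,
and the sample size $n$.
\end{document}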

Information about the video

Citation data

  • DOI: 10.24350/CIRM.V.19781503
  • Cite this video: Schmidt-Hieber, Johannes (22/07/2021). Statistical theory for deep neural networks - lecture 1. CIRM. Audiovisual resource. DOI: 10.24350/CIRM.V.19781503
  • URL: https://dx.doi.org/10.24350/CIRM.V.19781503
