Appears in collection: Imaging and machine learning

Projecting data into low dimensions is often key to scaling machine learning to large, high-dimensional datasets. In this talk we will take a statistical learning tour of classic as well as recent projection methods, from classical principal component analysis to sketching and random subsampling. We will show that, perhaps surprisingly, there are a number of settings where it is possible to substantially reduce data dimensions, and hence computational costs, without losing statistical accuracy. As a byproduct, we derive a massively scalable kernel/Gaussian process solver with optimal statistical guarantees and excellent performance on a number of large-scale problems.
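
To make the random-subsampling idea concrete, the sketch below shows kernel ridge regression restricted to a small set of randomly subsampled landmark points (a Nyström-style approximation). It is a minimal illustration, not the speaker's implementation; the kernel choice and all names and parameters (gaussian_kernel, nystrom_krr, n_landmarks, lam, sigma) are assumptions made for this example.

```python
# Minimal sketch: kernel ridge regression on m << n randomly subsampled landmarks.
# Illustrative only; not the method presented in the talk.
import numpy as np

def gaussian_kernel(A, B, sigma=1.0):
    """Gaussian (RBF) kernel matrix between rows of A and rows of B."""
    sq = np.sum(A**2, axis=1)[:, None] + np.sum(B**2, axis=1)[None, :] - 2 * A @ B.T
    return np.exp(-sq / (2 * sigma**2))

def nystrom_krr(X, y, n_landmarks=100, lam=1e-3, sigma=1.0, seed=0):
    """Fit kernel ridge regression using only randomly subsampled landmark points."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    m = min(n_landmarks, n)
    idx = rng.choice(n, size=m, replace=False)
    Z = X[idx]                                # landmark (inducing) points
    K_nm = gaussian_kernel(X, Z, sigma)       # n x m cross-kernel
    K_mm = gaussian_kernel(Z, Z, sigma)       # m x m landmark kernel
    # Solve (K_nm^T K_nm + n * lam * K_mm) alpha = K_nm^T y
    A = K_nm.T @ K_nm + n * lam * K_mm + 1e-10 * np.eye(m)
    alpha = np.linalg.solve(A, K_nm.T @ y)
    # Prediction only needs kernels against the m landmarks, not all n points.
    return lambda X_test: gaussian_kernel(X_test, Z, sigma) @ alpha

# Toy usage: regression on noisy sine data with 50 landmarks out of 2000 points.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(2000, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(2000)
predict = nystrom_krr(X, y, n_landmarks=50, lam=1e-4, sigma=0.5)
print(predict(np.linspace(-3, 3, 5)[:, None]))
```

The point of the sketch is the cost profile: training solves an m x m linear system instead of an n x n one, which is the sense in which subsampling reduces computation while, in favorable settings, preserving statistical accuracy.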

Information about the video

  • Date of recording 03/04/2019
  • Date of publication 09/05/2019
  • Institution IHP
  • Language English
  • Format MP4
  • Venue Institut Henri Poincaré
