
Projections, Learning, and Sparsity for Efficient Data Processing

By Remi Gribonval, Renaud Dessalles

Appears in collection: Schlumberger workshop - Computational and statistical trade-offs in learning

The talk will discuss recent generalizations of sparse recovery guarantees and compressive sensing to the context of machine learning. Assuming some "low-dimensional model" on the probability distribution of the data, we will see that in certain scenarios it is indeed (empirically) possible to compress a large data collection into a reduced representation whose size is driven by the complexity of the learning task, while preserving the essential information necessary to process it. Two case studies will be given: compressive clustering and compressive Gaussian Mixture Model estimation, with an illustration on large-scale model-based speaker verification. Time permitting, some recent results on compressive spectral clustering will also be discussed.
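For readers unfamiliar with this line of work, the "reduced representation" mentioned above is often instantiated as a sketch of the empirical data distribution, for example an average of random Fourier features of the samples. The snippet below is a minimal illustrative sketch in NumPy; the function name, toy data, frequency scale, and dimensions are assumptions made for illustration, not the exact pipeline presented in the talk.

```python
import numpy as np

def compute_sketch(X, Omega):
    """Average of random Fourier features exp(i * x^T omega_j) over the dataset.

    X:     (n, d) data matrix
    Omega: (d, m) matrix of random frequencies
    Returns a complex vector of length m: a fixed-size summary of X.
    """
    # Project each sample on the random frequencies, take complex exponentials,
    # then average over the n samples. The result no longer depends on n.
    return np.exp(1j * X @ Omega).mean(axis=0)

rng = np.random.default_rng(0)
n, d, m = 100_000, 2, 50  # sketch size m chosen independently of n (illustrative values)

# Toy data: a mixture of two well-separated Gaussian clusters
X = np.concatenate([
    rng.normal(loc=-2.0, scale=0.5, size=(n // 2, d)),
    rng.normal(loc=+2.0, scale=0.5, size=(n // 2, d)),
])

Omega = rng.normal(scale=1.0, size=(d, m))  # random frequencies (assumed Gaussian here)
z = compute_sketch(X, Omega)                # m complex numbers summarize the n points
print(z.shape)                              # (50,)
```

In the compressive learning setting described in the abstract, clustering or Gaussian Mixture Model estimation would then be carried out from a vector like `z` alone, rather than from the full data collection.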

Information about the video

  • Date of recording: 23/03/2016
  • Date of publication: 28/03/2016
  • Institution: IHES
  • Format: MP4
