
Rapture of the deep: highs and lows of sparsity in a world of depths

By Remi Gribonval

Appears in the collection: A Multiscale tour of Harmonic Analysis and Machine Learning - To Celebrate Stéphane Mallat's 60th birthday

Promoting sparse connections in neural networks is a natural way to control their complexity. Moreover, given its thoroughly documented role in inverse problems and variable selection, sparsity also has the potential to give rise to learning mechanisms endowed with certain interpretability guarantees. Through an overview of recent explorations around this theme, I will compare and contrast classical sparse regularization for inverse problems with multilayer sparse approximation. During our journey, I will notably highlight the role of rescaling invariances in deep parameterizations, which come with their curses and blessings. Along the way we will also be reminded that there is life beyond gradient descent, as illustrated by an algorithm that brings speedups of up to two orders of magnitude when learning certain fast transforms via multilayer sparse factorization.
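The rescaling invariances mentioned in the abstract can be illustrated with a small numerical sketch (not from the talk itself): in a two-layer ReLU network, multiplying a hidden neuron's incoming weights by any positive factor and dividing its outgoing weights by the same factor leaves the network function unchanged, since relu(lambda * t) = lambda * relu(t) for lambda > 0. The code below is a minimal illustration of this fact; all variable names are chosen for the example.

```python
# Minimal sketch (illustrative only): rescaling invariance of a ReLU layer.
import numpy as np

rng = np.random.default_rng(0)
W1 = rng.standard_normal((5, 3))   # first-layer weights (hidden x input)
W2 = rng.standard_normal((2, 5))   # second-layer weights (output x hidden)
x = rng.standard_normal(3)

def relu(t):
    return np.maximum(t, 0.0)

def net(W1, W2, x):
    return W2 @ relu(W1 @ x)

# Rescale one hidden neuron (index 2) by lam and compensate on the next layer.
lam = 7.3
W1s, W2s = W1.copy(), W2.copy()
W1s[2, :] *= lam
W2s[:, 2] /= lam

# The two parameterizations define exactly the same function.
print(np.allclose(net(W1, W2, x), net(W1s, W2s, x)))  # True
```

Because infinitely many weight configurations realize the same function, such invariances complicate both optimization and the interpretation of learned weights, which is one reason they come with "curses" as well as "blessings".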

Video information

  • Recording date: 21/04/2023
  • Publication date: 26/04/2023
  • Institute: IHES
  • Language: English
  • Audience: Researchers
  • Format: MP4
