Mathematical Methods of Modern Statistics 2

Organizers: Bogdan, Malgorzata; Graczyk, Piotr; Panloup, Fabien; Proïa, Frédéric; Roquain, Etienne
Dates: 15/06/2020 - 19/06/2020
Associated URL: https://www.cirm-math.com/cirm-virtual-event-2146.html

The smoothed multivariate square-root Lasso: an optimization lens on concomitant estimation

By Joseph Salmon

In high-dimensional sparse regression, pivotal estimators are estimators for which the optimal regularization parameter is independent of the noise level. The canonical pivotal estimator is the square-root Lasso, formulated, along with its derivatives, as a "non-smooth + non-smooth" optimization problem. Modern techniques for solving these include smoothing the data-fitting term, so as to benefit from fast and efficient proximal algorithms. In this work we focus on minimax sup-norm convergence rates for non-smoothed and smoothed, single-task and multi-task square-root Lasso-type estimators. We also provide guidelines on how to set the smoothing hyperparameter, and illustrate their benefit on synthetic data. This is joint work with Quentin Bertrand (INRIA), Mathurin Massias, Olivier Fercoq, and Alexandre Gramfort.
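
For concreteness, here is a minimal sketch (an editorial illustration, not part of the original abstract) of the estimators the abstract refers to, following the formulations of Belloni et al. (2011), Owen (2007), and Ndiaye et al. (2017) listed in the bibliography below. The notation (design X in R^{n x p}, observations y in R^n, regularization lambda, smoothing level sigma_0) is supplied here for illustration:

    % Square-root Lasso: both the data-fitting term and the penalty are non-smooth.
    \hat{\beta} \in \operatorname*{arg\,min}_{\beta \in \mathbb{R}^p}
        \frac{\lVert y - X\beta \rVert_2}{\sqrt{n}} + \lambda \lVert \beta \rVert_1

    % Smoothed concomitant formulation: the noise level sigma is estimated
    % jointly with beta, and clipped below at sigma_0 > 0.
    \min_{\beta \in \mathbb{R}^p,\ \sigma \ge \sigma_0}
        \frac{\lVert y - X\beta \rVert_2^2}{2 n \sigma} + \frac{\sigma}{2}
        + \lambda \lVert \beta \rVert_1

Without the constraint (sigma_0 = 0), minimizing over sigma gives sigma = ||y - X beta||_2 / sqrt(n) and recovers the square-root Lasso objective, so the optimal lambda stays independent of the noise level; with sigma_0 > 0 the data-fitting term becomes smooth in beta, which is what makes fast proximal algorithms applicable. Below is a minimal numerical sketch of this idea using alternating updates (the function name smoothed_sqrt_lasso and all parameter choices are hypothetical, not the authors' implementation):

    import numpy as np

    def smoothed_sqrt_lasso(X, y, lam, sigma0=1e-2, n_iter=200):
        """Sketch of the (single-task) smoothed concomitant square-root Lasso.

        Alternates a closed-form update of the noise estimate sigma, clipped
        at sigma0, with one proximal-gradient (ISTA) step on beta.
        Illustration only; not the reference implementation.
        """
        n, p = X.shape
        beta = np.zeros(p)
        # Lipschitz constant of beta -> ||y - X beta||_2^2 / (2 n):
        L = np.linalg.norm(X, ord=2) ** 2 / n
        for _ in range(n_iter):
            resid = y - X @ beta
            # Closed-form sigma update, clipped below at sigma0:
            sigma = max(sigma0, np.linalg.norm(resid) / np.sqrt(n))
            step = sigma / L  # valid step: the data fit is (L / sigma)-smooth in beta
            z = beta + step * (X.T @ resid) / (n * sigma)  # gradient step on the data fit
            beta = np.sign(z) * np.maximum(np.abs(z) - lam * step, 0.0)  # soft-thresholding
        return beta, sigma

    # Hypothetical usage on synthetic data:
    rng = np.random.default_rng(0)
    X = rng.standard_normal((50, 100))
    y = X[:, :5] @ np.ones(5) + 0.5 * rng.standard_normal(50)
    beta_hat, sigma_hat = smoothed_sqrt_lasso(X, y, lam=0.2)

The multivariate (multi-task) estimator discussed in the talk replaces the scalar sigma by a covariance matrix, following Massias et al. (2018) and Bertrand et al. (2019) below; the sketch above only covers the single-task case.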

Video information

Citation data

  • DOI 10.24350/CIRM.V.19643003
  • Cite this video: Salmon, Joseph (04/06/2020). The smoothed multivariate square-root Lasso: an optimization lens on concomitant estimation. CIRM. Audiovisual resource. DOI: 10.24350/CIRM.V.19643003
  • URL https://dx.doi.org/10.24350/CIRM.V.19643003

Bibliography

  • BECK, Amir and TEBOULLE, Marc. Smoothing and first order methods: a unified framework. SIAM Journal on Optimization, 2012, vol. 22, no 2, p. 557-580. - https://doi.org/10.1137/100818327
  • BELLONI, Alexandre, CHERNOZHUKOV, Victor, and WANG, Lie. Square-root lasso: pivotal recovery of sparse signals via conic programming. Biometrika, 2011, vol. 98, no 4, p. 791-806. - https://www.jstor.org/stable/23076172
  • BERTRAND, Quentin, MASSIAS, Mathurin, GRAMFORT, Alexandre, et al. Handling correlated and repeated measurements with the smoothed multivariate square-root Lasso. In: Advances in Neural Information Processing Systems. 2019, p. 3961-3972. - https://arxiv.org/abs/1902.02509
  • BICKEL, Peter J., RITOV, Ya'acov, and TSYBAKOV, Alexandre B. Simultaneous analysis of Lasso and Dantzig selector. The Annals of Statistics, 2009, vol. 37, no 4, p. 1705-1732. - http://dx.doi.org/10.1214/08-AOS620
  • CANDÈS, Emmanuel J., WAKIN, Michael B., and BOYD, Stephen P. Enhancing sparsity by reweighted ℓ1 minimization. Journal of Fourier Analysis and Applications, 2008, vol. 14, p. 877-905. - https://doi.org/10.1007/s00041-008-9045-x
  • CHEN, Scott S., DONOHO, David L., and SAUNDERS, Michael A. Atomic decomposition by basis pursuit. SIAM Journal on Scientific Computing, 1998, vol. 20, no 1, p. 33-61. - https://doi.org/10.1137/S1064827596304010
  • DALALYAN, Arnak S., HEBIRI, Mohamed, and LEDERER, Johannes. On the prediction performance of the lasso. Bernoulli, 2017, vol. 23, no 1, p. 552-581. - http://dx.doi.org/10.3150/15-BEJ756
  • DAUBECHIES, Ingrid. Ten lectures on wavelets. CBMS-NSF Regional Conference Series in Applied Mathematics, 1992, vol. 61. - https://doi.org/10.1137/1.9781611970104
  • DELORME, Arnaud, PALMER, Jason, ONTON, Julie, et al. Independent EEG sources are dipolar. PLoS ONE, 2012, vol. 7, no 2. - http://dx.doi.org/10.1371/journal.pone.0030135
  • MASSIAS, Mathurin, FERCOQ, Olivier, GRAMFORT, Alexandre, et al. Generalized concomitant multi-task lasso for sparse multimodal regression. In: AISTATS. 2018, vol. 84, p. 998-1007.
  • MASSIAS, Mathurin, BERTRAND, Quentin, GRAMFORT, Alexandre, et al. Support recovery and sup-norm convergence rates for sparse pivotal estimation. In: AISTATS. 2020. - https://arxiv.org/abs/2001.05401
  • NDIAYE, Eugene, FERCOQ, Olivier, GRAMFORT, Alexandre, et al. Efficient smoothed concomitant Lasso estimation for high dimensional regression. Journal of Physics: Conference Series, IOP Publishing, 2017, vol. 904, p. 012006. - https://doi.org/10.1088/1742-6596/904/1/012006
  • NESTEROV, Yurii. Smooth minimization of non-smooth functions. Mathematical Programming, 2005, vol. 103, no 1, p. 127-152. - https://doi.org/10.1007/s10107-004-0552-5
  • OBOZINSKI, Guillaume, TASKAR, Ben, and JORDAN, Michael I. Joint covariate selection and joint subspace selection for multiple classification problems. Statistics and Computing, 2010, vol. 20, no 2, p. 231-252. - https://doi.org/10.1007/s11222-008-9111-x
  • OLSHAUSEN, Bruno A. and FIELD, David J. Sparse coding with an overcomplete basis set: a strategy employed by V1? Vision Research, 1997, vol. 37, no 23, p. 3311-3325. - https://doi.org/10.1016/S0042-6989(97)00169-7
  • OWEN, Art B. A robust hybrid of lasso and ridge regression. Contemporary Mathematics, 2007, vol. 443, p. 59-72. - http://dx.doi.org/10.1090/conm/443
  • TIBSHIRANI, Robert. Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society: Series B (Methodological), 1996, vol. 58, no 1, p. 267-288. - https://www.jstor.org/stable/2346178
  • VAN DE GEER, Sara. Estimation and testing under sparsity. Lecture Notes in Mathematics, 2016, vol. 2159. - http://dx.doi.org/10.1007/978-3-319-32774-7
