Room 611B, 6th floor in Bldg.2, NITech
Speaker: Joseph Salmon (Professor, TELECOM ParisTech, France)
Organizer: Prof. Ichiro Takeuchi (Frontier Research Institute for Information Science, NITech)
In high-dimensional settings, sparse structures are crucial for efficiency, in terms of memory, computation, and statistical performance. It is customary to use l1 penalties to enforce sparsity in such scenarios.
These estimators rely on tuning a parameter that trades off data fitting against sparsity. For the Lasso theory to hold, this tuning parameter should be proportional to the noise level, yet the noise level is often unknown in practice. A possible remedy is to jointly optimize over the regression parameter and the noise level. This approach has appeared under several names in the literature, for instance the Scaled Lasso, the Square-root Lasso, and Concomitant Lasso estimation, and could be of interest for confidence sets or uncertainty quantification.
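As an illustration of the joint optimization mentioned above (a standard statement of the concomitant objective, not quoted from the talk), the regression vector $\beta$ and the noise level $\sigma$ are estimated together:

\[
(\hat\beta, \hat\sigma) \in \operatorname*{arg\,min}_{\beta \in \mathbb{R}^p,\ \sigma > 0}
\frac{\|y - X\beta\|_2^2}{2 n \sigma} + \frac{\sigma}{2} + \lambda \|\beta\|_1 ,
\]

where $n$ is the number of observations. Minimizing over $\sigma$ alone gives $\hat\sigma = \|y - X\beta\|_2 / \sqrt{n}$, which degenerates as the residual vanishes; the smoothed variant discussed in the talk addresses this instability.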
In this work, after illustrating numerical difficulties with the Concomitant Lasso formulation, we propose a modification, which we call the Smoothed Concomitant Lasso, aimed at increasing numerical stability. We propose an efficient and accurate solver whose computational cost is no more expensive than that of the Lasso. We leverage standard ingredients behind the success of fast Lasso solvers: a coordinate descent algorithm combined with safe screening rules, which achieve speed by eliminating irrelevant features early.
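A minimal sketch of the idea, assuming a smoothing lower bound sigma0 on the noise level and alternating between coordinate descent on the weights and a closed-form noise update (an illustrative implementation, not the authors' solver; names and parameters are hypothetical):

```python
import numpy as np

def smoothed_concomitant_lasso(X, y, lam0=0.1, sigma0=1e-2, n_iter=100):
    """Sketch: alternate a concomitant update of the noise level sigma,
    clipped below at sigma0 for stability, with coordinate descent on a
    Lasso whose penalty is proportional to sigma."""
    n, p = X.shape
    w = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0)          # per-feature squared norms
    residual = y.copy()                    # maintains y - X @ w
    sigma = max(np.linalg.norm(residual) / np.sqrt(n), sigma0)
    for _ in range(n_iter):
        for j in range(p):
            if col_sq[j] == 0.0:
                continue
            w_old = w[j]
            rho = X[:, j] @ residual + col_sq[j] * w_old
            # soft-thresholding step; threshold scales with current sigma
            thresh = lam0 * sigma * n
            w[j] = np.sign(rho) * max(abs(rho) - thresh, 0.0) / col_sq[j]
            if w[j] != w_old:
                residual -= X[:, j] * (w[j] - w_old)
        # concomitant noise-level update, smoothed by the sigma0 floor
        sigma = max(np.linalg.norm(residual) / np.sqrt(n), sigma0)
    return w, sigma
```

For fixed sigma this is a plain Lasso with penalty lam0 * sigma, so the inner loop could equally be handed to any fast Lasso solver with screening rules; the sigma0 floor is what prevents the degenerate behavior of the unsmoothed concomitant formulation.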
This is joint work with Eugene Ndiaye, Olivier Fercoq, Alexandre Gramfort, and Vincent Leclère.