3. Kálmán Lecture with Sara van de Geer

Sara van de Geer, ETH Zürich, Switzerland, 10:15 - 11:15

The Lasso revisited: entropy bounds and dual certificates

The Lasso is least squares estimation with an ℓ1-penalty on the coefficients. In this talk we consider the case where the design matrix is structured and its columns are highly correlated. For this setting we study the theory of non-adaptive (minimax) rates and adaptive (oracle) rates.
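
To fix ideas, a minimal sketch of the standard Lasso formulation (the symbols Y, X, n, p, λ and the exact normalization are our notation, not taken from the talk):

    \hat{\beta} \in \arg\min_{\beta \in \mathbb{R}^p} \Big\{ \tfrac{1}{n}\|Y - X\beta\|_2^2 + 2\lambda\|\beta\|_1 \Big\},

where Y is the n-vector of responses, X the n × p design matrix, and λ > 0 a tuning parameter governing the strength of the ℓ1-penalty.
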
The ℓ1-penalty leads to studying the entropy of the signed convex hull of the columns of the design matrix. We will present a bound on this entropy based on projection theory. It generalizes a result in Ball and Pajor [1990], which bounds the entropy of a convex hull in terms of the covering number of its extreme points. The entropy bound is applied to obtain non-adaptive rates.
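
For context, a brief reminder of the objects involved (standard definitions; the normalization is our assumption). Writing ψ_1, …, ψ_p for the columns of X, the signed (absolute) convex hull and its entropy are

    \mathrm{absconv}(\psi_1, \dots, \psi_p) := \{ X\beta : \|\beta\|_1 \le 1 \}, \qquad H(u) := \log N(u),

where N(u) is the minimal number of balls of radius u, in the norm \|f\|_n := \|f\|_2/\sqrt{n}, needed to cover the set.
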
For the adaptive rates, we will combine the approach of Dalalyan et al. [2017] with dual certificates as given in Candes and Plan [2011]. This will be applied to the trend filtering problem, to higher-dimensional extensions, and to total variation on graphs.
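
As an illustration of the penalties in question, a hedged sketch under common conventions (the operators and normalization used in the talk may differ): k-th order trend filtering penalizes the ℓ1-norm of discrete derivatives of the signal,

    \hat{\beta} \in \arg\min_{\beta \in \mathbb{R}^n} \Big\{ \tfrac{1}{n}\|Y - \beta\|_2^2 + 2\lambda\|D^{(k+1)}\beta\|_1 \Big\}, \qquad (D^{(1)}\beta)_i = \beta_{i+1} - \beta_i,

with D^{(k+1)} := D^{(1)} D^{(k)}; total variation on a graph G = (V, E) instead uses the penalty \sum_{(i,j)\in E} |\beta_i - \beta_j|.
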

Joint work with Francesco Ortelli

References
K. Ball and A. Pajor. The entropy of convex bodies with "few" extreme points. London Math. Soc. Lecture Note Series, 158:25-32, 1990.
E. J. Candès and Y. Plan. A probabilistic and RIPless theory of compressed sensing. IEEE Transactions on Information Theory, 57(11):7235-7254, 2011.
A. Dalalyan, M. Hebiri, and J. Lederer. On the prediction performance of the Lasso. Bernoulli, 23(1):552-581, 2017.

***This year's Kálmán Lecture will be held online due to the coronavirus pandemic. We will send out an invitation for a Zoom meeting via our email list. If you are not already on the mailing list, please send an email in advance to Hendrik.Gessner[at]uni-potsdam.de ***