Abstract:
We study a bilevel optimization framework for hyperparameter learning in variational models, focusing on sparse regression and classification. Specifically, we use a weighted elastic-net regularizer, where feature-wise penalties are learned through a bilevel formulation. Our main contribution is a Forward–Backward (FB) reformulation of the nonsmooth lower-level problem that preserves its minimizers. This yields a bilevel objective composed with a locally Lipschitz solution map, enabling the use of generalized subdifferential calculus and efficient subgradient-based methods. Experiments on synthetic data show that our approach significantly improves prediction accuracy and support recovery compared to scalar regularization, demonstrating the benefits of feature-wise tuning and bilevel learning.
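As a rough illustration only: the abstract does not spell out the exact lower-level problem, so the least-squares data-fit term, the feature-wise weights `alpha` and `beta`, and all function names below are assumptions. The sketch shows a forward-backward (proximal gradient) iteration for a weighted elastic-net problem with per-feature penalties, the kind of lower-level solve the talk's FB reformulation refers to.

```python
import numpy as np

def prox_weighted_elastic_net(v, tau, alpha, beta):
    # Proximal operator of sum_i alpha_i |x_i| + (beta_i / 2) x_i**2:
    # component-wise soft-thresholding followed by a shrinkage factor.
    return np.sign(v) * np.maximum(np.abs(v) - tau * alpha, 0.0) / (1.0 + tau * beta)

def forward_backward(A, b, alpha, beta, n_iter=500):
    # Forward-backward iteration for an assumed lower-level problem
    #   min_x 0.5 * ||A x - b||^2 + sum_i alpha_i |x_i| + (beta_i / 2) x_i**2,
    # where alpha, beta are the feature-wise hyperparameters that the
    # upper level would learn in the bilevel formulation.
    tau = 1.0 / np.linalg.norm(A, 2) ** 2      # step size from the gradient's Lipschitz constant
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)               # forward (gradient) step on the smooth data-fit term
        x = prox_weighted_elastic_net(x - tau * grad, tau, alpha, beta)  # backward (proximal) step
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((50, 100))
    x_true = np.zeros(100)
    x_true[:5] = 1.0                            # sparse ground truth
    b = A @ x_true + 0.01 * rng.standard_normal(50)
    alpha = 0.1 * np.ones(100)                  # feature-wise l1 weights (assumed values)
    beta = 0.01 * np.ones(100)                  # feature-wise l2 weights (assumed values)
    print(forward_backward(A, b, alpha, beta)[:10])
```

In the bilevel setting described above, an outer (upper-level) method would adjust `alpha` and `beta` per feature, using the solution map defined by this iteration, rather than tuning a single scalar regularization parameter.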
Venue: John Von Neumann Seminar Room, CMM, Beauchef 851, North Tower, 7th Floor
Speaker: David Villacís
Affiliation: Universidad de Loyola, Spain
Coordinator: Pedro Pérez
Posted on Jul 22, 2025 in Optimization and Equilibrium, Seminars


