Abstract: We analyze Douglas-Rachford splitting techniques for solving weakly convex optimization problems. Under mild regularity assumptions, and by means of a suitable merit function, we show convergence to critical points and local linear rates of convergence. The merit function, comparable to the Moreau envelope in variational analysis, generates a descent sequence, a feature that allows us to extend to the nonconvex setting arguments employed in convex optimization. A by-product of our approach is an ADMM-like method for constrained problems with weakly convex objective functions. When specialized to multistage stochastic programming, the proposal yields a nonconvex version of the Progressive Hedging algorithm that converges at a linear rate. A numerical assessment on a battery of phase retrieval problems shows the promising performance of our method compared to existing algorithms in the literature.
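As background for the talk, the classical (convex) Douglas-Rachford iteration for minimizing f(x) + g(x) alternates proximal steps on f and g. Below is a minimal illustrative sketch, not the speaker's method, applied to a toy l1-regularized least-squares problem where both proximal maps have closed forms; all names and the problem choice are assumptions for illustration:

```python
import numpy as np

def prox_quad(z, b, gamma):
    # Proximal map of f(x) = 0.5*||x - b||^2 with step gamma:
    # argmin_x 0.5*||x - b||^2 + (1/(2*gamma))*||x - z||^2
    return (gamma * b + z) / (1.0 + gamma)

def prox_l1(z, t):
    # Proximal map of t*||x||_1 (soft-thresholding)
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def douglas_rachford(b, lam, gamma=1.0, iters=200):
    # DR splitting for min_x 0.5*||x - b||^2 + lam*||x||_1:
    #   x^k = prox_{gamma f}(z^k)
    #   y^k = prox_{gamma g}(2 x^k - z^k)
    #   z^{k+1} = z^k + y^k - x^k
    z = np.zeros_like(b)
    for _ in range(iters):
        x = prox_quad(z, b, gamma)
        y = prox_l1(2 * x - z, gamma * lam)
        z = z + y - x
    return prox_quad(z, b, gamma)

b = np.array([3.0, -0.5, 1.0])
x = douglas_rachford(b, lam=1.0)
# The closed-form solution is soft-thresholding of b: [2, 0, 0]
```

For this strongly convex toy problem the governing sequence z^k contracts geometrically, matching the kind of linear rate the talk establishes (via a merit function) in the weakly convex setting.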
Venue: Sala de Seminario John Von Neumann, CMM, Beauchef 851, Torre Norte, Piso 7.
Speaker: Felipe Atenas
Affiliation: Universidade Estadual de Campinas, Brazil
Coordinator: Emilio Vilches
Posted on Apr 5, 2023 in Optimization and Equilibrium, Seminars