Convergence of descent methods in non-smooth non-convex optimization


Abstract:

Following recent advances, we present a large class of descent
methods that converge strongly to a critical point of a non-smooth
non-convex function, provided the function behaves 'well' around its
critical points.
More precisely, we consider proper lower semi-continuous functions
satisfying the so-called Kurdyka-Lojasiewicz inequality. This inequality
holds in finite dimensions for tame functions (such as semi-algebraic or
bounded sub-analytic functions), while in infinite dimensions some useful
examples are known in PDEs, for instance the energies of elliptic PDEs
with analytic non-linearities.
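
For reference, a standard formulation of the Kurdyka-Lojasiewicz
inequality, as it is commonly stated in the literature (the precise form
used in the talk may differ), is the following: a proper lower
semi-continuous function $f$ satisfies the inequality at a critical point
$x^*$ if there exist $\eta > 0$, a neighborhood $U$ of $x^*$, and a
concave function $\varphi : [0,\eta) \to [0,+\infty)$, continuous,
vanishing at $0$, and continuously differentiable with $\varphi' > 0$ on
$(0,\eta)$, such that

\[
\varphi'\bigl(f(x) - f(x^*)\bigr)\,\operatorname{dist}\bigl(0, \partial f(x)\bigr) \;\ge\; 1
\quad \text{for all } x \in U \text{ with } f(x^*) < f(x) < f(x^*) + \eta,
\]

where $\partial f$ denotes the limiting subdifferential.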

This class of descent methods covers the gradient, proximal and
forward-backward algorithms, as well as the alternating proximal method,
which are all particular cases of what we call the alternating
forward-backward algorithm. We also consider the possibility of changing
the ambient metric during the computation: this should speed up the
convergence of the sequence, by adapting the ambient space metric to the
geometry of the function, as in Newton's method. It opens the way to
non-convex versions of the Levenberg-Marquardt algorithm and of
Newton-like projected methods.
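
As an illustration, the following is a minimal sketch of the plain
forward-backward (proximal gradient) iteration belonging to this class,
applied to a LASSO-type problem; the function names, the step-size choice
and the example itself are illustrative assumptions, not taken from the
talk.

    import numpy as np

    def forward_backward(grad_g, prox_h, x0, step, n_iter=500):
        # Iterate x_{k+1} = prox_{step*h}(x_k - step * grad_g(x_k)).
        x = x0
        for _ in range(n_iter):
            x = prox_h(x - step * grad_g(x), step)
        return x

    # Example: g(x) = 0.5*||A x - b||^2  (smooth part),
    #          h(x) = lam*||x||_1        (non-smooth, prox-friendly part).
    rng = np.random.default_rng(0)
    A = rng.standard_normal((20, 50))
    b = rng.standard_normal(20)
    lam = 0.1

    grad_g = lambda x: A.T @ (A @ x - b)
    # The prox of t*lam*||.||_1 is component-wise soft-thresholding.
    prox_h = lambda v, t: np.sign(v) * np.maximum(np.abs(v) - t * lam, 0.0)

    L = np.linalg.norm(A, 2) ** 2  # Lipschitz constant of grad_g
    x_sol = forward_backward(grad_g, prox_h, np.zeros(50), step=1.0 / L)

Changing the ambient metric, as described above, would amount to
replacing the Euclidean proximal step by one computed in a variable norm
(a preconditioned step), in the spirit of the Newton-like methods
mentioned in the abstract.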
Date: Sep 25, 2013
Date of closure: Sep 25, 2013
Venue: Avda. Blanco Encalada 2120, 7mo Piso, Sala de Seminarios CMM.
Speaker: Guillaume Garrigos
Affiliation: Université de Montpellier, France
Coordinator: Abderrahim Hantoute
