An SDE perspective on stochastic convex optimization

Abstract: We analyze the global and local behavior of gradient-like flows under stochastic errors, with the aim of solving convex optimization problems with noisy gradient input. We first study the unconstrained differentiable convex case via a stochastic differential equation whose drift term is minus the gradient of the objective function and whose diffusion term is either bounded or square-integrable. In this context, under Lipschitz continuity of the gradient, our first main result shows almost sure weak convergence of the trajectory process towards a minimizer of the objective function. We also provide a comprehensive complexity analysis by establishing several new pointwise and ergodic convergence rates in expectation for the convex, strongly convex, and (local) Łojasiewicz cases. The latter, which involves a local analysis, is challenging and requires non-trivial arguments from measure theory. We then extend our study to certain nonsmooth situations, showing that several of our results extend naturally when the gradient of the objective function is replaced by a cocoercive monotone operator. Finally, we show that a time rescaling and averaging technique yields results for a stochastic version of the Inertial System with Implicit Hessian Damping (ISIHD).
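
As a purely illustrative sketch (not taken from the talk), the dynamics described above can be simulated with an Euler-Maruyama discretization of dX_t = -grad f(X_t) dt + sigma(t) dW_t. The quadratic objective f(x) = ||x||^2 / 2 and the square-integrable diffusion sigma(t) = 1/(1+t) below are assumed for the example only.

    import numpy as np

    def euler_maruyama(grad_f, sigma, x0, T=50.0, dt=1e-2, seed=0):
        """Simulate dX_t = -grad_f(X_t) dt + sigma(t) dW_t (Euler-Maruyama)."""
        rng = np.random.default_rng(seed)
        n_steps = int(T / dt)
        x = np.array(x0, dtype=float)
        path = [x.copy()]
        for k in range(n_steps):
            t = k * dt
            dw = rng.normal(scale=np.sqrt(dt), size=x.shape)  # Brownian increment
            x = x - grad_f(x) * dt + sigma(t) * dw             # drift + diffusion step
            path.append(x.copy())
        return np.array(path)

    # Illustrative choices (not from the talk): quadratic objective and
    # square-integrable diffusion sigma(t) = 1 / (1 + t).
    grad_f = lambda x: x                   # gradient of f(x) = ||x||^2 / 2
    sigma = lambda t: 1.0 / (1.0 + t)
    path = euler_maruyama(grad_f, sigma, x0=np.ones(2))
    print(path[-1])                        # approaches the minimizer x* = 0

With the decaying diffusion term, the simulated trajectory settles near the minimizer, consistent with the kind of convergence behavior the abstract describes.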

Date: Dec 20, 2023 at 16:15 h
Venue: Sala de Seminario John von Neumann, CMM, Beauchef 851, Torre Norte, 7th floor.
Speaker: Rodrigo Maulén
Affiliation: École Nationale Supérieure d'Ingénieurs de Caen (ENSICAEN), France
Coordinator: Emilio Vilches

Posted on Dec 12, 2023 in Optimization and Equilibrium, Seminars