Stochastic incremental mirror descent algorithms with Nesterov smoothing.

Abstract: We propose a stochastic incremental mirror descent algorithm constructed by means of the Nesterov smoothing for minimizing a sum of finitely many proper, convex and lower semicontinuous functions over a nonempty closed convex set in a Euclidean space. The algorithm can be adapted in order to minimize (in the same setting) a sum of finitely many proper, convex and lower semicontinuous functions composed with linear operators. Another modification of the scheme leads to a stochastic incremental mirror descent Bregman-proximal scheme with Nesterov smoothing for minimizing the sum of finitely many proper, convex and lower semicontinuous functions with a prox-friendly proper, convex and lower semicontinuous function. Unlike previous contributions from the literature on mirror descent methods for minimizing sums of functions, we do not require these to be (Lipschitz) continuous or differentiable. Applications in Logistics, Tomography and Machine Learning modeled as optimization problems illustrate the theoretical achievements. The talk is based on joint work with Sandy Bitterlich.
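To give a flavor of the approach described in the abstract, here is a minimal, illustrative sketch (not the authors' actual algorithm): each nonsmooth summand |t| is replaced by its Nesterov (Huber-type) smoothing, and a stochastic incremental mirror descent step with the entropic mirror map is taken over the probability simplex. All names (a, b, mu, eta) and the toy problem instance are assumptions made for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy instance (illustrative only): minimize F(x) = sum_i |a_i . x - b_i|
# over the probability simplex; b is built so the optimal value is 0.
m, n = 20, 5
a = rng.normal(size=(m, n))
x_true = rng.dirichlet(np.ones(n))
b = a @ x_true

def smoothed_grad(t, mu):
    # Gradient of the Nesterov smoothing of |t|:
    # f_mu(t) = t^2 / (2 mu) if |t| <= mu, else |t| - mu/2
    return np.clip(t / mu, -1.0, 1.0)

def objective(x):
    return np.abs(a @ x - b).sum()

x = np.full(n, 1.0 / n)     # start at the barycenter of the simplex
mu, eta = 0.1, 0.05         # smoothing parameter and step size (assumed values)
for _ in range(5000):
    i = rng.integers(m)     # sample one summand: the stochastic/incremental part
    g = smoothed_grad(a[i] @ x - b[i], mu) * a[i]
    x = x * np.exp(-eta * g)  # entropic mirror step (multiplicative update)
    x = x / x.sum()           # Bregman projection back onto the simplex
```

With the entropic mirror map the update stays inside the simplex by construction, which is the main practical appeal of mirror descent over projected (sub)gradient methods in this geometry.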

Date: Oct 20, 2021 at 10:00 h
Venue: Online
Speaker: Professor Sorin-Mihai Grad.
Affiliation: Department of Applied Mathematics, ENSTA Paris, France
Coordinators: Fabián Flores-Bazán & Abderrahim Hantoute

Posted on Oct 13, 2021 in Optimization and Equilibrium, Seminars