Abstract: Randomized block-coordinate algorithms are known to provide efficient iterative schemes for tackling large-scale problems, especially when the computation of full derivatives entails substantial memory requirements and computational effort. Classically, the convergence analysis of these methods relies on the assumption that the partial gradients of the differentiable function are globally Lipschitz continuous. This compromises their applicability in settings where gradient Lipschitz continuity fails, for instance, in nonnegative matrix factorization or in the recovery of signals from quadratic measurements. In this talk, we present a randomized block proximal gradient algorithm for minimizing the sum of a separable (nonsmooth) proper lower semicontinuous function and a differentiable function whose partial gradients are assumed to be Lipschitz continuous only locally. At each iteration, the method adaptively selects a proximal stepsize satisfying a sufficient decrease condition, without prior knowledge of the local Lipschitz moduli of the partial gradients of the differentiable function. We conduct a thorough convergence analysis of the method and illustrate its performance on an image compression experiment.
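The abstract does not spell out the algorithm's details, but the mechanism it describes (sample a block at random, take a proximal gradient step on that block, and shrink the stepsize until a sufficient decrease condition holds) can be sketched as follows. This is a minimal Python illustration under assumed choices: uniform block sampling, halving backtracking with per-block stepsize memory, an l1 block prox, and a toy quartic objective whose gradient is only locally Lipschitz. None of the names or parameters come from the talk.

import numpy as np

def soft_threshold(v, tau):
    # Proximal map of tau * ||.||_1 (componentwise soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def rand_block_prox_grad(x0, f, grad_block, prox_block, blocks,
                         gamma0=1.0, shrink=0.5, grow=2.0,
                         max_iter=5000, seed=0):
    # Randomized block proximal gradient with a backtracked stepsize:
    # a hypothetical sketch, not the speaker's exact scheme.
    rng = np.random.default_rng(seed)
    x = x0.copy()
    gammas = [gamma0] * len(blocks)          # per-block trial stepsizes
    for _ in range(max_iter):
        i = int(rng.integers(len(blocks)))   # sample one block uniformly
        idx = blocks[i]
        g = grad_block(x, idx)               # partial gradient on block i
        gamma = gammas[i] * grow             # optimistic stepsize restart
        fx = f(x)
        while True:
            x_trial = x.copy()
            x_trial[idx] = prox_block(x[idx] - gamma * g, gamma)
            d = x_trial[idx] - x[idx]
            # Sufficient decrease test: accept once the smooth part is
            # majorized by its quadratic model at stepsize gamma, which
            # holds as soon as gamma <= 1/L_i for the *local* Lipschitz
            # modulus L_i; no modulus is supplied in advance.
            if f(x_trial) <= fx + g @ d + (0.5 / gamma) * (d @ d) + 1e-12:
                break
            gamma *= shrink                  # backtrack on failure
        gammas[i] = gamma
        x = x_trial
    return x

# Toy instance: f(x) = 0.25 * sum(x**4) - b @ x has partial gradients
# x_i**3 - b_i, locally but not globally Lipschitz; the nonsmooth
# separable part is lam * ||x||_1.
n, lam = 8, 0.1
b = np.linspace(-1.0, 1.0, n)
f = lambda x: 0.25 * np.sum(x**4) - b @ x
grad_block = lambda x, idx: x[idx]**3 - b[idx]
prox_block = lambda v, gamma: soft_threshold(v, gamma * lam)
blocks = [np.arange(k, k + 2) for k in range(0, n, 2)]  # four blocks of size 2
x_star = rand_block_prox_grad(np.zeros(n), f, grad_block, prox_block, blocks)

Because the backtracked stepsize is stored per block and re-expanded at the next visit, each block's stepsize tracks its own local Lipschitz behavior rather than a single global constant.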
Date: Apr 30, 2025, 16:15 h
Speaker: David Torregrosa Belén
Affiliation: Centro de Modelamiento Matemático, Universidad de Chile
Coordinator: Pedro Pérez