Differentially Private Stationary Points in Stochastic Nonconvex Optimization.

Abstract: Differentially private (DP) stochastic nonconvex optimization (SNCO) is a fundamental problem, where the goal is to approximate stationary points (i.e., points where the gradient has small norm) of the population loss, given a dataset of i.i.d. samples from a distribution, while satisfying differential privacy with respect to the dataset. Most existing work in the literature addresses either the convex version of this problem (DP-SCO) or the closely related problem of private nonconvex empirical risk minimization, where one seeks to approximate stationary points of the empirical loss.
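For concreteness, a standard formalization of this goal (the notation here is illustrative, not necessarily the one used in the talk) is the following: given a dataset S = (z_1, ..., z_n) of i.i.d. samples from a distribution D, and population loss F(w) = E_{z ~ D}[f(w, z)], an algorithm A should output \hat{w} = A(S) satisfying

    E ||\nabla F(\hat{w})|| <= \alpha,

while being (\epsilon, \delta)-differentially private: for all datasets S, S' differing in a single sample and all events E,

    Pr[A(S) \in E] <= e^{\epsilon} Pr[A(S') \in E] + \delta.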

In the first part of this talk I will provide an overview of differential privacy and of (non-private) stochastic nonconvex optimization. Then, I will show how to privatize the well-known SPIDER algorithm for SNCO, which relies on variance-reduction techniques, and how to prove privacy and accuracy guarantees for its private version. The private version of SPIDER converges to stationary points of the population loss at the rate O(d^{1/4}/(n\epsilon)^{1/2}).
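To make the variance-reduction idea concrete, below is a minimal Python sketch of one way a SPIDER-style estimator can be privatized with the Gaussian mechanism. Here grad_f, the clipping radii C1 and C2, and the noise multipliers sigma1 and sigma2 are illustrative placeholders; this is a sketch under standard assumptions, not the exact algorithm or noise calibration from the talk.

    import numpy as np

    def clip(g, C):
        # Project the gradient onto the L2 ball of radius C to bound sensitivity.
        norm = np.linalg.norm(g)
        return g if norm <= C else g * (C / norm)

    def private_spider(grad_f, data, w0, T, q, b, lr, C1, C2, sigma1, sigma2, rng):
        # Sketch of a privatized SPIDER loop (illustrative, not the talk's exact method).
        # grad_f(w, z): per-sample gradient oracle. Every q steps the estimator is
        # reset with a noisy batch gradient; in between, noisy variance-reduced
        # gradient differences are accumulated.
        w_prev, w = w0.copy(), w0.copy()
        v = np.zeros_like(w0)
        n = len(data)
        for t in range(T):
            batch = [data[i] for i in rng.choice(n, size=b, replace=False)]
            if t % q == 0:
                # Reset step: noisy average of clipped per-sample gradients
                # (Gaussian mechanism on a mean with per-sample norm at most C1).
                g = np.mean([clip(grad_f(w, z), C1) for z in batch], axis=0)
                v = g + rng.normal(0.0, sigma1 * C1 / b, size=w.shape)
            else:
                # Variance-reduced step: noisy average of clipped gradient
                # differences between consecutive iterates.
                d = np.mean([clip(grad_f(w, z) - grad_f(w_prev, z), C2)
                             for z in batch], axis=0)
                v = v + d + rng.normal(0.0, sigma2 * C2 / b, size=w.shape)
            w_prev, w = w, w - lr * v
        return w

The reason this structure helps is that, for smooth losses, the gradient differences between consecutive iterates have much smaller norm than the gradients themselves, so the difference estimator has low sensitivity and needs less noise; in a full analysis, sigma1 and sigma2 would be set by a privacy accountant over all T rounds.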

Date: Jul 13, 2022 at 15:00 h
Venue: Sala de Seminario John Von Neumann, CMM, Beauchef 851, Torre Norte, Piso 7.
Speaker: Tomás González
Affiliation: IMC, UC.
Coordinator: José Verschae

Posted on Jul 11, 2022 in ACGO, Seminars