Optimization and Equilibrium

Lipschitz-free spaces

Event Date: Apr 13, 2016 in Optimization and Equilibrium, Seminars

Abstract: Let M be a pointed metric space and Lip0(M) the space of Lipschitz functions on M vanishing at the base point 0. Endowed with the Lipschitz norm, this space is a Banach space. Denote by F(M) the closed subspace of Lip0(M)* spanned by the evaluation functionals, and call it the Lipschitz-free space over M. After an introduction explaining how one can use these spaces in the context of nonlinear classification of Banach spaces, we will be more particularly interested in dual Lipschitz-free spaces and will briefly explain their link with optimal transportation.
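
For orientation, here is a minimal LaTeX rendering of the objects named in the abstract; the notation δ_x for the evaluation functional is standard but is an addition not spelled out in the abstract itself.

```latex
% Lipschitz norm on Lip_0(M), the Lipschitz functions vanishing at the base point:
\[
  \|f\|_{\mathrm{Lip}} \;=\; \sup_{\substack{x,y \in M \\ x \neq y}} \frac{|f(x)-f(y)|}{d(x,y)} .
\]
% The Lipschitz-free space over M: the closed linear span of the evaluation
% functionals \delta_x, defined by <\delta_x, f> = f(x), inside Lip_0(M)^*:
\[
  \mathcal{F}(M) \;=\; \overline{\mathrm{span}}\,\{\delta_x : x \in M\} \subseteq \mathrm{Lip}_0(M)^* .
\]
```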

Recent advances on the acceleration of first-order methods in convex optimization

Event Date: Apr 06, 2016 in Optimization and Equilibrium, Seminars

Abstract: Let f : H → R ∪ {+∞} be a proper lower-semicontinuous convex function defined on a Hilbert space H (think of R^N), and let (x_k) be a sequence in H generated by means of a "typical" first-order method, and intended to minimize f. If argmin(f) ≠ ∅, then (x_k) will converge weakly, as k → +∞, to a minimizer of f, with a worst-case theoretical convergence rate of f(x_k) − min(f) = O(1/k). In the 1980s, Y. Nesterov came up with a revolutionary, yet remarkably simple, idea: modify the computation of the iterates, at essentially the same computational cost, in order to...
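
As a concrete illustration, here is a minimal sketch of Nesterov's accelerated scheme for a smooth convex f, which improves the worst-case rate from O(1/k) to O(1/k²). The quadratic test problem and all names below are illustrative, not taken from the talk.

```python
import numpy as np

def nesterov_accelerated_gradient(grad_f, x0, step, num_iters=500):
    """Nesterov's accelerated gradient method (a common 1983-style variant).

    grad_f    : callable returning the gradient of f at a point
    x0        : starting point (numpy array)
    step      : stepsize, typically 1/L for an L-Lipschitz gradient
    num_iters : number of iterations
    """
    x_prev = x0.copy()
    y = x0.copy()
    t_prev = 1.0
    for _ in range(num_iters):
        # Gradient step taken from the extrapolated point y, not from x.
        x = y - step * grad_f(y)
        # Update the momentum coefficient (Nesterov's t-sequence).
        t = (1.0 + np.sqrt(1.0 + 4.0 * t_prev**2)) / 2.0
        # Extrapolate: push the new iterate further along the direction of travel.
        y = x + ((t_prev - 1.0) / t) * (x - x_prev)
        x_prev, t_prev = x, t
    return x_prev

# Illustrative use on the strongly convex quadratic f(x) = 0.5 * ||A x - b||^2.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, -1.0])
grad = lambda x: A.T @ (A @ x - b)
L = np.linalg.norm(A.T @ A, 2)          # Lipschitz constant of grad f
x_approx = nesterov_accelerated_gradient(grad, np.zeros(2), step=1.0 / L)
```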

Incremental Proximal and Augmented Lagrangian Methods for Convex Optimization: A Survey

Event Date: Mar 30, 2016 in Optimization and Equilibrium, Seminars

Abstract: Incremental methods deal effectively with an optimization problem of great importance in machine learning, signal processing, and large-scale and distributed optimization: the minimization of the sum of a large number of convex functions. We survey these methods and propose incremental aggregated and nonaggregated versions of the proximal algorithm. Under cost function differentiability and strong convexity assumptions, we show linear convergence for a sufficiently small constant stepsize. This result also applies to distributed asynchronous variants of the method, involving...
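
To make the incremental proximal idea concrete, here is a minimal sketch of a nonaggregated version on a sum of quadratic components, where each proximal step has a closed form; the constant stepsize mirrors the setting mentioned in the abstract. The problem instance and all names are illustrative, not the survey's own code.

```python
import numpy as np

def incremental_proximal(A, b, alpha, num_passes=50, seed=0):
    """Nonaggregated incremental proximal method for
    f(x) = sum_i 0.5 * (a_i . x - b_i)^2,
    processing a single component f_i per inner iteration.
    """
    rng = np.random.default_rng(seed)
    m, n = A.shape
    x = np.zeros(n)
    for _ in range(num_passes):
        for i in rng.permutation(m):        # visit components in random order
            a, bi = A[i], b[i]
            # Closed-form proximal step on the single component f_i:
            #   x <- argmin_z  0.5*(a.z - bi)^2 + (1/(2*alpha)) * ||z - x||^2
            x = x - alpha * a * (a @ x - bi) / (1.0 + alpha * (a @ a))
    return x

# Illustrative use: a strongly convex sum of m = 100 quadratic components.
rng = np.random.default_rng(1)
A = rng.standard_normal((100, 5))
b = rng.standard_normal(100)
x_hat = incremental_proximal(A, b, alpha=0.05)
```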

What is… the inverse function theorem

Event Date: Mar 16, 2016 in Optimization and Equilibrium, Seminars

Abstract: After a gentle introduction to the paradigm of the inverse function theorem, advanced versions of this theorem will be presented for nonsmooth functions and set-valued mappings.
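
For reference, the classical statement that these generalizations extend, in a minimal LaTeX sketch added here for orientation:

```latex
% Classical inverse function theorem: if F : R^n -> R^n is C^1 near x_0 and
% the Jacobian DF(x_0) is invertible, then F has a C^1 local inverse near
% F(x_0), with derivative
\[
  D\bigl(F^{-1}\bigr)\bigl(F(x_0)\bigr) \;=\; \bigl(DF(x_0)\bigr)^{-1} .
\]
```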

Recovering a function from a subdifferential

Event Date: Mar 23, 2016 in Optimization and Equilibrium, Seminars

Abstract: The problem of recovering a function from its derivative, or from one of its directional derivatives, is a central one that dates back to the seminal work of Lebesgue (1904). The question of recovering a function from its subdifferential is more recent and has been the subject of intensive research in recent years, from the foundational work of Moreau and Rockafellar (1970) on convex functions up to the many successive works of Thibault and his co-authors on increasingly broad classes of functions beyond the convex ones. The talk focuses on this question. We divided...
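
The prototype result behind the question, sketched in LaTeX for orientation (this is the classical determination theorem for convex functions going back to Moreau and Rockafellar, not a statement from the talk itself):

```latex
% Two proper, lower-semicontinuous convex functions on a Banach space with
% the same subdifferential differ only by an additive constant:
\[
  \partial f = \partial g
  \;\Longrightarrow\;
  f = g + c \quad \text{for some } c \in \mathbb{R}.
\]
```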

Optimization Colloquium: "A Midday of Optimization and Variational Analysis"

Event Date: Mar 09, 2016 in Optimization and Equilibrium, Seminars

SPEAKERS
– Prof. Dimitri Bertsekas, Department of Electrical Engineering and Computer Science, School of Engineering, Massachusetts Institute of Technology (MIT), Cambridge, Massachusetts.
– Prof. Assen Dontchev, Mathematical Reviews and the University of Michigan.
– Prof. R. Tyrrell Rockafellar, Departments of Mathematics and Applied Mathematics, University of Washington, Seattle.
