Progressive Decoupling of Linkages in Optimization with Elicitable Convexity



A method called the Progressive Decoupling Algorithm is described for solving variational inequalities and optimization problems in which a subspace captures “linkages” that can be relaxed. The approach is inspired by the Progressive Hedging Algorithm in convex stochastic programming and resembles Spingarn's Partial Inverse Method, but retains more parametric flexibility than the latter. It can even work when monotonicity or convexity is not directly present but can be “elicited”. The role of elicitation mimics the role of “augmentation” in augmented Lagrangian methods of multipliers. Applications can be made to problem decomposition and splitting.
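To make the idea of relaxing subspace linkages concrete, the following is a minimal sketch, not the speaker's formulation: it applies the generic decouple-then-relink pattern (a decoupled proximal step, projection back onto the linkage subspace S, and a dual update on its orthogonal complement) to a hypothetical toy consensus problem. The problem instance, the proximal parameter r, and the iteration count are all illustrative assumptions.

```python
import numpy as np

# Hypothetical toy instance: minimize sum_i (x_i - a_i)^2 / 2 subject to the
# linkage subspace S = {x : x_1 = ... = x_n} (full consensus).
a = np.array([1.0, 3.0, 5.0])
r = 2.0                        # proximal parameter (assumed choice)
x = np.zeros_like(a)           # primal iterate, kept in S
y = np.zeros_like(a)           # multiplier iterate, kept in S-perp

def proj_S(v):
    """Projection onto the consensus subspace S (replace every entry by the mean)."""
    return np.full_like(v, v.mean())

for _ in range(100):
    # Decoupled step: each coordinate solves its own proximal subproblem
    #   argmin_u (u - a_i)^2 / 2 + (r/2)(u - x_i)^2 - y_i * u,
    # which has the closed form below for this quadratic objective.
    x_hat = (a + r * x + y) / (1.0 + r)
    x = proj_S(x_hat)           # restore the linkage
    y = y - r * (x_hat - x)     # dual update, stays in S-perp

print(x)  # approaches the consensus minimizer, mean(a)
```

At a fixed point the relations x_hat = x and y = x - a hold, which are exactly the optimality conditions for the linked problem; the decoupled step itself never needs to see the linkage, which is what enables problem splitting.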

Date: Mar 13, 2019 at 16:00 h
Venue: CMM Seminar Room John Von Neumann, Seventh Floor, North Tower, Beauchef 851.
Speaker: Prof. Ralph Tyrrell Rockafellar
Affiliation: University of Washington, USA
Coordinator: Prof. Juan Peypouquet

Posted on Mar 4, 2019 in Optimization and Equilibrium, Seminars