Optimization and Equilibrium

An approach to optimality in convex optimization via some new Moreau-Rockafellar type formulas for the subdifferential of the supremum function.

Event Date: Nov 07, 2018 in Optimization and Equilibrium, Seminars

Abstract: We present different characterizations of the subdifferential of the supremum function of finitely and infinitely indexed families of convex functions under weak continuity assumptions. The resulting formulas are given in terms of the exact subdifferential of the data functions at the reference point, and not at nearby points. Based on these characterizations, we give new Fritz John and KKT-type optimality conditions for semi-infinite convex optimization, dropping the continuity/closedness assumptions that are usual in the literature. The presentation is a selection of...
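For orientation, here is the classical formula of this kind, stated as a hedged sketch under the usual compactness/continuity assumptions (the notation is ours, not taken from the talk). For

    f(x) = \sup_{t \in T} f_t(x),   T(\bar x) = \{ t \in T : f_t(\bar x) = f(\bar x) \},

one has, e.g. when T is compact, each f_t is continuous and t \mapsto f_t(\bar x) is upper semicontinuous,

    \partial f(\bar x) = \overline{\mathrm{co}} \Big( \bigcup_{t \in T(\bar x)} \partial f_t(\bar x) \Big).

The formulas in the talk are of this exact type (subdifferentials of the data functions at the reference point itself) but dispense with such continuity requirements.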


Second-order characterizations of C^1-smooth robustly quasiconvex functions

Event Date: Oct 24, 2018 in Optimization and Equilibrium, Seminars

Abstract: Our aim in this talk is to investigate the possibility of using the Fréchet and Mordukhovich second-order subdifferentials to characterize the robust quasiconvexity of C^1-smooth functions. We set up a necessary condition for the robust quasiconvexity of C^{1,1}-smooth functions and univariate C^1-smooth ones. We also show that the established necessary condition is indeed a sufficient one for the robust quasiconvexity of C^1-smooth functions.
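For reference, the notion at stake is usually defined as quasiconvexity that is stable under small linear perturbations (the standard definition, restated here for orientation rather than quoted from the talk):

    f is robustly quasiconvex \iff \exists\, \delta > 0 such that x \mapsto f(x) + \langle v, x \rangle is quasiconvex whenever \|v\| \le \delta.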


A Study of the Difference-of-Convex Approach for Solving Linear Programs with Complementarity Constraints

Event Date: Sep 05, 2018 in Optimization and Equilibrium, Seminars

Abstract: This work studies the difference-of-convex (DC) penalty formulations and the associated difference-of-convex algorithm (DCA) for computing stationary solutions of linear programs with complementarity constraints (LPCCs). We focus on two such formulations and establish connections between their stationary solutions and those of the LPCC. Improvements of the DCA are proposed to remedy some drawbacks in a straightforward adaptation of the DCA to these formulations. Extensive numerical results, including comparisons with an existing nonlinear programming solver and the...
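To fix ideas about the DCA itself, here is a minimal, self-contained sketch of the generic iteration on a toy one-dimensional DC function of our own choosing; it is not the LPCC penalty formulation studied in the talk.

import numpy as np

# DCA sketch: minimise f(x) = g(x) - h(x) with g, h convex.
# Toy choice (ours): g(x) = x**4, h(x) = x**2, so f(x) = x**4 - x**2,
# a nonconvex function with local minima at x = +/- 1/sqrt(2).
# Generic DCA step: take y_k in the subdifferential of h at x_k,
# then let x_{k+1} minimise the convex function g(x) - y_k * x.

def dca(x0, iters=50):
    x = x0
    for _ in range(iters):
        y = 2.0 * x                              # y_k = h'(x_k)
        # argmin_x x**4 - y*x  <=>  4*x**3 = y
        x = np.sign(y) * (abs(y) / 4.0) ** (1.0 / 3.0)
    return x

print(dca(0.3))    # -> approx  0.7071, a stationary point of f
print(dca(-0.2))   # -> approx -0.7071

Each step only requires a subgradient of h and the minimisation of a convex surrogate, which is what the penalty formulations for LPCCs exploit on a much larger scale.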


Convergence of projection algorithms: some results and counterexamples.

Event Date: May 30, 2018 in Optimization and Equilibrium, Seminars

Abstract: Projection methods can be used for solving a range of feasibility and optimisation problems. Whenever the constraints are represented as the intersection of closed (convex) sets with readily implementable projections onto each of these sets, a projection-based algorithm can be employed to force the iterates towards the feasible set. Some versions of projection methods employ approximate projections; one can also consider under- and over-relaxed iterations (such as in the Douglas-Rachford method). In this talk I will focus on the convergence of projection methods. This...
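As a minimal, runnable sketch of the simplest such scheme (plain alternating projections onto two hand-picked convex sets in R^2; the sets and starting point are illustrative assumptions, not taken from the talk):

import numpy as np

# Alternating projections: find a point in C1 ∩ C2 by projecting back and
# forth, where both projections have closed forms.
#   C1 = closed ball of radius 3 centred at the origin
#   C2 = halfspace { x : x[0] + x[1] >= 3 }

def proj_ball(x, centre=np.zeros(2), radius=3.0):
    d = x - centre
    n = np.linalg.norm(d)
    return x if n <= radius else centre + radius * d / n

def proj_halfspace(x, a=np.array([1.0, 1.0]), b=3.0):
    viol = b - a @ x                  # how far x is from satisfying a.x >= b
    return x if viol <= 0 else x + viol * a / (a @ a)

x = np.array([5.0, -4.0])             # arbitrary starting point
for _ in range(100):
    x = proj_halfspace(proj_ball(x))
print(x)                               # approximately a point of C1 ∩ C2

For two closed convex sets with nonempty intersection this iteration converges to a point of the intersection; the talk concerns what can and cannot be said once such assumptions are relaxed.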


A Game-Theoretic Model for Optimizing Electricity Consumers' Flexibilities in the Smart Grid.

Event Date: May 16, 2018 in Optimization and Equilibrium, Seminars

Abstract: With the evolution of electricity usage (electric vehicles, smart appliances) and the development of communication infrastructure (the smart grid), new optimization opportunities have emerged for the actors of the electrical network. Aggregators can send signals to enrolled consumers to exploit their demand flexibility and to optimize supply costs and social welfare. Game theory has proved to be a valuable tool for studying strategic electricity consumers participating in such demand-side management programs. We propose a simple billing mechanism where the aggregator...
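The generic shape of such a demand-side management game (in our own illustrative notation; the specific billing mechanism of the talk is not reproduced here) is roughly the following: each consumer i = 1, ..., N chooses a load profile x_i = (x_i^t)_{t \in \mathcal{T}} subject to its own constraints, e.g. \sum_t x_i^t = E_i and \underline{x}_i^t \le x_i^t \le \overline{x}_i^t, and minimises its bill b_i(x_i, x_{-i}) given the other consumers' profiles, where the per-period price depends on the aggregate load \sum_j x_j^t. The solution concept is a Nash equilibrium of this game, which can then be compared with the profile optimising the aggregator's cost or the social welfare.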


On regularization/convexification of functionals including an l2-misfit term

Event Date: Mar 14, 2018 in Optimization and Equilibrium, Seminars

Abstract: A common technique for solving ill-posed inverse problems is to include a sparsity or low-rank constraint and to pose the problem as a convex optimization problem, as is done, e.g., in compressive sensing. The corresponding functional to be minimized often includes an l2 data fidelity term plus a convex term forcing sparsity. However, for many applications a non-convex term would be more suitable, although such terms are usually discarded since they lead to issues with algorithm convergence, local minima, etc. I will introduce a new transform to (partially) convexify non-convex functionals of the above...
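For context, the standard convex objective alluded to above reads (in our notation; the talk's transform concerns non-convex variants of the regularizer):

    \min_x \; \tfrac{1}{2} \|Ax - b\|_2^2 + \lambda\, \mathcal{R}(x),

with the convex choices \mathcal{R}(x) = \|x\|_1 (sparsity) or the nuclear norm (low rank) serving as surrogates for the non-convex penalties, such as \|x\|_0 or the rank, that one would often prefer to use.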
