Complexity in Convex Optimization: Oracles, Algorithms and Applications

Abstract

The need to solve large-scale optimization problems is ubiquitous. Modern applications in signal processing, machine learning, and big data require convex optimization solvers at their core. In this respect, understanding the performance limits of such methods is a crucial task.

 

In this talk we introduce the theory of oracle complexity as a standpoint for understanding the efficiency of optimization algorithms. We will see how state-of-the-art algorithms can be viewed as oracle-based, and how the classical theory does not address current applications. Motivated by the latter issue, we extend the theory of oracle complexity in two directions. First, we establish worst-case lower bounds for smooth convex optimization over non-Euclidean domains. We derive nearly optimal lower bounds for smooth convex minimization over the L^1 ball, widely used in sparse signal processing, and over the L^∞ ball, establishing the near optimality of the Frank-Wolfe method over this domain. Second, we extend the complexity analysis to the distributional setting. We provide a unifying information-theoretic approach to derive lower bounds for average-case oracle complexity, based on a String-Guessing Problem, which is combinatorial in nature.
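The Frank-Wolfe (conditional gradient) method mentioned above is a prototypical oracle-based algorithm: each iteration queries a first-order oracle for a gradient and a linear minimization oracle over the feasible domain. As a rough illustration (not taken from the talk), the following minimal Python sketch runs Frank-Wolfe over the L^∞ ball, where the linear minimization oracle reduces to a coordinate-wise sign; the function name and the quadratic test objective are illustrative assumptions.

```python
import numpy as np

def frank_wolfe_linf(grad, x0, radius=1.0, iters=100):
    """Conditional gradient (Frank-Wolfe) over the L-infinity ball of given radius.

    grad : callable returning the gradient of a smooth convex objective (first-order oracle).
    x0   : starting point inside the ball.
    """
    x = np.array(x0, dtype=float)
    for t in range(iters):
        g = grad(x)                # query the first-order oracle
        s = -radius * np.sign(g)   # linear minimization oracle over the box: a vertex of the ball
        gamma = 2.0 / (t + 2.0)    # standard open-loop step size
        x = x + gamma * (s - x)    # convex combination stays inside the ball
    return x

# Illustrative use: minimize 0.5 * ||A x - b||^2 over the unit L-infinity ball.
rng = np.random.default_rng(0)
A, b = rng.standard_normal((20, 5)), rng.standard_normal(20)
x_hat = frank_wolfe_linf(lambda x: A.T @ (A @ x - b), x0=np.zeros(5), radius=1.0, iters=200)
```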

Finally, time permitting, I will present future directions of research in convex optimization and related areas.

Date: Mar 11, 2015 at 16:30 h
Venue: Beauchef 851, Sala Multimedia CMM, 6th Floor
Speaker: Cristóbal Guzmán
Affiliation: Ph.D. Candidate in Algorithms, Combinatorics & Optimization, ISyE, Georgia Institute of Technology
Coordinator: Abderrahim Hantoute
Abstract: PDF
