Tue.1 10:30–11:45 | H 3007 | SPA

Methods and Theory for Nonconvex Optimization

Chair: Nadav Hallak Organizer: Nadav Hallak
10:30

Armin Eftekhari

Non-Convex Augmented Lagrangian Method for Learning and Inference

We present efficient ALM and ADMM variants for solving nonconvex optimization problems, with local sublinear convergence rates. We show how the ubiquitous Slater's condition of convex optimization gives way to a geometric condition reminiscent of the Mangasarian-Fromovitz constraint qualification. We apply these new algorithms to a variety of applications, including clustering, generalized eigenvalue decomposition, and even robustness in deep nets. We also show that improved linear rates can be obtained using the concepts of restricted strong convexity and smoothness.
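As a hedged illustration of the inexact-ALM template described above (approximate primal minimization, multiplier update, penalty increase), the following Python sketch applies it to the toy nonconvex problem min 0.5 x'Qx subject to ||x||^2 = 1, whose solutions are eigenvectors of Q. The test problem, step-size rule, and penalty schedule are illustrative assumptions, not the speakers' implementation.

import numpy as np

def alm_sphere(Q, x0, beta=1.0, beta_growth=2.0, outer_iters=15, inner_iters=200):
    # Sketch of an inexact augmented Lagrangian method (illustrative only).
    x = x0 / np.linalg.norm(x0)    # start (approximately) feasible
    y = 0.0                        # multiplier for the single equality constraint
    for _ in range(outer_iters):
        # crude curvature bound for a safe gradient step (assumption)
        L_est = np.linalg.norm(Q, 2) + 2 * abs(y) + 12 * beta
        for _ in range(inner_iters):                 # inexact primal minimization
            c = x @ x - 1.0                          # constraint value A(x)
            g = Q @ x + 2.0 * (y + beta * c) * x     # gradient of the augmented Lagrangian
            x -= g / L_est
        y += beta * (x @ x - 1.0)                    # dual (multiplier) update
        beta *= beta_growth                          # increase the penalty parameter
    return x

rng = np.random.default_rng(0)
M = rng.standard_normal((5, 5))
Q = 0.5 * (M + M.T)                                  # symmetric indefinite: nonconvex objective
x = alm_sphere(Q, rng.standard_normal(5))
print("constraint violation:", abs(x @ x - 1.0))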

10:55

Peter Ochs

Model Function Based Conditional Gradient Method with Armijo-like Line Search

The Conditional Gradient Method is generalized to a class of non-smooth non-convex optimization problems with many applications in machine learning. The proposed algorithm iterates by minimizing so-called model functions over the constraint set. As special cases of the abstract framework, we develop an algorithm for additive composite problems and an algorithm for non-linear composite problems, which yields a Gauss-Newton-type method. Moreover, the framework yields, for free, a hybrid of Conditional Gradient and Proximal Minimization schemes.
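As a hedged illustration of the simplest special case, where the model function is the usual linearization, the following Python sketch runs conditional gradient (Frank-Wolfe) with an Armijo-like backtracking line search on a nonconvex quadratic over the box [-1, 1]^n. The test problem and the constants c1 and shrink are illustrative assumptions, not the general model-function algorithm of the talk.

import numpy as np

def cg_armijo(f, grad, lmo, x, iters=100, c1=1e-4, shrink=0.5):
    for _ in range(iters):
        g = grad(x)
        s = lmo(g)                 # minimize the linear model over the feasible set
        d = s - x                  # feasible direction
        decrease = g @ d           # predicted decrease of the model (nonpositive)
        if decrease > -1e-12:      # approximate stationarity
            break
        gamma = 1.0
        while f(x + gamma * d) > f(x) + c1 * gamma * decrease:   # Armijo-like test
            gamma *= shrink
        x = x + gamma * d
    return x

rng = np.random.default_rng(1)
n = 10
M = rng.standard_normal((n, n))
Q = 0.5 * (M + M.T)                            # indefinite: nonconvex objective
b = rng.standard_normal(n)
f = lambda x: 0.5 * x @ Q @ x + b @ x
grad = lambda x: Q @ x + b
lmo = lambda g: -np.sign(g)                    # linear minimization oracle for the box
x = cg_armijo(f, grad, lmo, np.zeros(n))
print("objective:", f(x), " FW gap:", -(grad(x) @ (lmo(grad(x)) - x)))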

11:20

Nadav Hallak

joint work with Marc Teboulle

Convergence to Second-Order Stationary Points: A Feasible Direction Approach

We focus on the problem of minimizing a smooth nonconvex objective over a closed convex set. Finding second-order stationary points for this class of problems is usually a hard task. Relying on basic feasible-descent ideas, this talk presents a surprisingly simple second-order feasible-directions scheme for computing such points, with proven convergence guarantees to second-order stationarity under mild assumptions. Numerical examples illustrate the practicability of the proposed method.
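As a hedged, generic illustration of the feasible-direction idea (not the Hallak-Teboulle scheme itself), the following Python sketch combines projected-gradient steps with a feasible negative-curvature step over the box [-1, 1]^n; the test problem, step size, and stopping rules are assumptions made for illustration.

import numpy as np

def proj(x):                                        # projection onto the box C = [-1, 1]^n
    return np.clip(x, -1.0, 1.0)

def feasible_descent_2nd_order(f, grad, hess, x, step=0.1, iters=500, tol=1e-8):
    for _ in range(iters):
        x_pg = proj(x - step * grad(x))             # projected-gradient (first-order) step
        if np.linalg.norm(x_pg - x) > tol:
            x = x_pg
            continue
        lam, V = np.linalg.eigh(hess(x))            # near first-order stationarity
        if lam[0] >= -tol:                          # no negative curvature detected: stop
            break
        v = V[:, 0]
        if grad(x) @ v > 0:                         # orient v as a descent direction
            v = -v
        x_trial = proj(x + step * v)                # feasible second-order step
        if f(x_trial) < f(x):
            x = x_trial
        else:
            break
    return x

rng = np.random.default_rng(2)
n = 6
M = rng.standard_normal((n, n))
Q = 0.5 * (M + M.T)                                 # indefinite Hessian: nonconvex objective
f = lambda x: 0.5 * x @ Q @ x
grad = lambda x: Q @ x
hess = lambda x: Q
x = feasible_descent_2nd_order(f, grad, hess, np.zeros(n))
print("f(x) =", f(x))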