Wed.1 11:30–12:45 | H 1012 | NON

Nonlinear Optimization Methods and Their Global Rates of Convergence

Chair: Geovani Grapiglia | Organizer: Geovani Grapiglia
11:30

Oliver Hinder

joint work with Yair Carmon, John Duchi, Aaron Sidford

The complexity of finding stationary points of nonconvex functions

We investigate the complexity of finding stationary points of smooth, potentially nonconvex functions. We provide both new algorithms, which adapt Nesterov's accelerated methods to the nonconvex setting and converge faster than gradient descent, and fundamental algorithm-independent lower bounds on the complexity of finding stationary points. Our bounds imply that gradient descent and cubic-regularized Newton's method are optimal within their natural function classes.
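
For reference, the gradient-descent baseline mentioned above follows from the standard descent lemma (a textbook argument, not specific to this talk): if $f$ is $L$-smooth and the step size is $1/L$, then
\[ f(x_{k+1}) \le f(x_k) - \frac{1}{2L}\,\|\nabla f(x_k)\|^2, \]
and summing over $k = 0, \dots, K-1$ gives
\[ \min_{k < K} \|\nabla f(x_k)\| \le \sqrt{\frac{2L\,\bigl(f(x_0) - f_*\bigr)}{K}}, \]
so $O(\epsilon^{-2})$ gradient evaluations suffice to reach a point with $\|\nabla f\| \le \epsilon$; this is the rate the accelerated methods of the talk improve upon.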

11:55

Philippe L. Toint

joint work with Coralia Cartis, Nick Gould, Benedetta Morini, Stefania Bellavia, Gianmarco Gurioli

Recent results in worst-case evaluation complexity for smooth and non-smooth, exact and inexact, nonconvex optimization

We present a review of results on the worst-case complexity of minimization algorithms for nonconvex problems using potentially high-degree models. Global complexity bounds are presented that are valid for any model degree and any order of optimality, thereby generalizing known results for first- and second-order methods. An adaptive regularization algorithm using derivatives up to degree p produces an $\epsilon$-approximate q-th order minimizer in at most $O(\epsilon^{-(p+1)/(p-q+1)})$ evaluations. We also extend these results to the case of inexact objective functions and derivatives.
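
As a reading aid (not part of the abstract), instantiating the exponent $(p+1)/(p-q+1)$ recovers the familiar special cases:
\[ p = q = 1:\; O(\epsilon^{-2}) \text{ (steepest descent)}, \qquad p = 2,\; q = 1:\; O(\epsilon^{-3/2}) \text{ (cubic regularization)}, \]
which are exactly the first- and second-order results the general bound is said to generalize.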

12:20

Ernesto G. Birgin

joint work with J. M. Martínez

A Newton-like method with mixed factorizations and cubic regularization and its usage in an Augmented Lagrangian framework

A Newton-like method for unconstrained minimization is introduced that uses a single cheap factorization per iteration. Convergence and complexity results are presented, even in the case in which the subproblems' Hessians are far from the Hessians of the objective function. When the Hessian is Lipschitz-continuous, the proposed method enjoys $O(\varepsilon^{-3/2})$ evaluation complexity for first-order optimality and $O(\varepsilon^{-3})$ for second-order optimality. The use of the introduced method for bound-constrained minimization and for nonlinear programming is reported.
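
For illustration only, a minimal sketch of one adaptive cubic-regularization step in the spirit of this talk. This is a generic variant, not the mixed-factorization method of Birgin and Martínez: the subproblem is solved by an off-the-shelf minimizer rather than a cheap factorization, and the test function and parameter values below are hypothetical.

import numpy as np
from scipy.optimize import minimize

def cubic_reg_step(f, grad, hess, x, sigma):
    # Cubic-regularized model m(s) = g's + 0.5 s'Hs + (sigma/3)||s||^3.
    g, H = grad(x), hess(x)
    m = lambda s: g @ s + 0.5 * s @ (H @ s) + (sigma / 3.0) * np.linalg.norm(s) ** 3
    s = minimize(m, np.zeros_like(x)).x            # inexact subproblem minimization
    rho = (f(x) - f(x + s)) / max(-m(s), 1e-16)    # actual vs. predicted reduction
    if rho >= 0.1:                                 # successful: accept step, relax sigma
        return x + s, max(0.5 * sigma, 1e-8)
    return x, 2.0 * sigma                          # unsuccessful: keep x, increase sigma

# Hypothetical two-dimensional nonconvex test problem.
f    = lambda x: 0.25 * x[0] ** 4 + 0.5 * x[1] ** 2 - x[0] * x[1]
grad = lambda x: np.array([x[0] ** 3 - x[1], x[1] - x[0]])
hess = lambda x: np.array([[3.0 * x[0] ** 2, -1.0], [-1.0, 1.0]])

x, sigma = np.array([2.0, 1.0]), 1.0
for _ in range(100):
    x, sigma = cubic_reg_step(f, grad, hess, x, sigma)
print(x, np.linalg.norm(grad(x)))

The ratio test and sigma update mirror the standard adaptive-regularization mechanism; the $O(\varepsilon^{-3/2})$ guarantee quoted above applies to the method of the talk, not to this sketch.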