joint work with Yair Carmon, John Duchi, Aaron Sidford
We investigate the complexity of finding stationary points of smooth, potentially nonconvex functions. We provide both new algorithms, adapting Nesterov’s accelerated methods to the nonconvex setting (achieving faster convergence than gradient descent), and fundamental algorithm-independent lower bounds on the complexity of finding stationary points. Our bounds imply that gradient descent and cubic-regularized Newton’s method are optimal within their natural function classes.
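As a point of reference, the Python sketch below illustrates what finding an $\epsilon$-approximate stationary point means, using plain fixed-step gradient descent, the baseline whose well-known $O(\epsilon^{-2})$ rate the accelerated methods improve on and which the lower bounds show is optimal within its function class. The test function and step size are illustrative assumptions, and this is not the accelerated method of the talk.

```python
import numpy as np

def gradient_descent_to_stationarity(grad, x0, step, eps, max_iters=1_000_000):
    """Fixed-step gradient descent, stopped once ||grad f(x)|| <= eps.

    For an L-smooth (possibly nonconvex) f bounded below, step = 1/L yields
    an eps-stationary point within O(eps**-2) gradient evaluations.
    """
    x = np.asarray(x0, dtype=float)
    for k in range(max_iters):
        g = grad(x)
        if np.linalg.norm(g) <= eps:
            return x, k                # eps-approximate stationary point found
        x = x - step * g
    return x, max_iters

# Illustrative nonconvex objective: f(x, y) = x^2 + y^2 + 3 sin(x) sin(y),
# whose gradient is 8-Lipschitz, so step = 0.1 < 1/8 is a valid choice.
grad_f = lambda v: np.array([2 * v[0] + 3 * np.cos(v[0]) * np.sin(v[1]),
                             2 * v[1] + 3 * np.sin(v[0]) * np.cos(v[1])])
x_hat, iters = gradient_descent_to_stationarity(grad_f, [2.0, -1.5], step=0.1, eps=1e-6)
```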
joint work with Coralia Cartis, Nick Gould, Benedetta Morini, Stefania Bellavia, Gianmarco Gurioli
We present a review of results on the worst-case complexity of minimization algorithms for nonconvex problems using potentially high-degree models. Global complexity bounds are presented that are valid for models of any degree and for any order of optimality, thereby generalizing known results for first- and second-order methods. An adaptive regularization algorithm using derivatives up to degree $p$ produces an $\epsilon$-approximate $q$-th-order minimizer in at most $O(\epsilon^{-(p+1)/(p-q+1)})$ evaluations. We also extend these results to the case of inexact evaluations of the objective function and its derivatives.
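For concreteness, specializing the exponent $(p+1)/(p-q+1)$ recovers the familiar first- and second-order rates that the general bound extends:
\[
\begin{aligned}
p=1,\ q=1: &\quad O\!\left(\epsilon^{-2}\right) \quad \text{(first-order models, first-order points),}\\
p=2,\ q=1: &\quad O\!\left(\epsilon^{-3/2}\right) \quad \text{(cubic regularization, first-order points),}\\
p=2,\ q=2: &\quad O\!\left(\epsilon^{-3}\right) \quad \text{(cubic regularization, second-order points).}
\end{aligned}
\]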
joint work with J. M. Martínez
A Newton-like method for unconstrained minimization is introduced that uses a single cheap factorization per iteration. Convergence and complexity results are presented, including the case in which the subproblems' Hessians are far from the Hessians of the objective function. When the Hessian is Lipschitz-continuous, the proposed method enjoys $O(\varepsilon^{-3/2})$ evaluation complexity for first-order optimality and $O(\varepsilon^{-3})$ for second-order optimality. The use of the method for bound-constrained minimization and for nonlinear programming is also reported.
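As a rough illustration of the single-factorization idea, the following Python sketch performs one shifted-Newton step; it is a minimal sketch assuming a Gershgorin-based shift so that exactly one Cholesky factorization is needed, and it is not the regularization scheme or convergence-controlled method of the talk.

```python
import numpy as np

def shifted_newton_step(grad, hess, x, delta=1e-8):
    """One Newton-like step built on a single Cholesky factorization.

    The shift sigma comes from a Gershgorin bound so that H + sigma*I is
    positive definite a priori; this shift rule is an illustrative
    assumption, not the regularization used in the method described above.
    """
    x = np.asarray(x, dtype=float)
    g, H = grad(x), hess(x)
    n = x.size
    # Gershgorin lower bound on the eigenvalues of H.
    gersh = np.diag(H) - (np.abs(H).sum(axis=1) - np.abs(np.diag(H)))
    sigma = max(0.0, -gersh.min()) + delta
    L = np.linalg.cholesky(H + sigma * np.eye(n))       # the single factorization
    d = np.linalg.solve(L.T, np.linalg.solve(L, -g))    # two triangular solves
    return x + d
```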