Mon.1 11:00–12:15 | H 0105 | NON

Nonlinear and Stochastic Optimization (1/4)

Chair: Albert Berahas
Organizers: Albert Berahas, Geovani Grapiglia
11:00

Jorge Nocedal

joint work with Yuchen Xie, Richard Byrd

Analysis of the BFGS Method with Errors

The classical convergence analysis of quasi-Newton methods assumes that the function and gradients employed at each iteration are exact. Here we consider the case when there are (bounded) errors in both computations and study the complex interaction between the Hessian update and step computation. We establish convergence guarantees for a line search BFGS method.
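The abstract does not spell out the safeguards used in the analysis. Purely as an illustration, the sketch below runs a standard line-search BFGS iteration against a gradient oracle with bounded additive error and skips the inverse Hessian update whenever the measured curvature s^T y is too small, a common heuristic in the noisy setting; the helper names, noise model, and tolerances are assumptions, not the authors' algorithm.

```python
import numpy as np

def noisy_grad(grad, x, eps, rng):
    """Exact gradient plus a bounded perturbation of norm <= eps (illustrative noise model)."""
    e = rng.standard_normal(x.size)
    return grad(x) + eps * e / np.linalg.norm(e)

def bfgs_with_errors(f, grad, x0, eps=1e-3, max_iter=100, seed=0):
    """Line-search BFGS run on noisy gradients (sketch only, not the authors' method)."""
    rng = np.random.default_rng(seed)
    n = x0.size
    H = np.eye(n)                      # inverse Hessian approximation
    x = x0.astype(float).copy()
    g = noisy_grad(grad, x, eps, rng)
    for _ in range(max_iter):
        p = -H @ g                     # quasi-Newton direction
        # Backtracking (Armijo) line search on the noisy function values.
        t, fx = 1.0, f(x)
        while f(x + t * p) > fx + 1e-4 * t * (g @ p) and t > 1e-12:
            t *= 0.5
        x_new = x + t * p
        g_new = noisy_grad(grad, x_new, eps, rng)
        s, y = x_new - x, g_new - g
        sy = s @ y
        # Skip the update if the (noisy) curvature is not safely positive.
        if sy > 1e-10 * np.linalg.norm(s) * np.linalg.norm(y):
            rho = 1.0 / sy             # standard BFGS inverse update
            V = np.eye(n) - rho * np.outer(s, y)
            H = V @ H @ V.T + rho * np.outer(s, s)
        x, g = x_new, g_new
        if np.linalg.norm(g) <= 2 * eps:   # cannot resolve below the noise level
            break
    return x
```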

11:25

Fred Roosta

joint work with Yang Liu, Peng Xu, Michael Mahoney

Newton-MR: Newton's Method Without Smoothness or Convexity

Establishing global convergence of the classical Newton's method has long required restrictive assumptions such as (strong) convexity and Lipschitz continuity of the gradient/Hessian. We show that two simple modifications of the classical Newton's method result in an algorithm, called Newton-MR, which is almost indistinguishable from its classical counterpart yet can readily be applied to invex problems. By introducing a weaker notion of joint regularity of the Hessian and gradient, we show that Newton-MR converges even in the absence of the traditional smoothness assumptions.
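As a rough illustration of the minimum-residual idea (not the authors' implementation), the sketch below replaces the exact Newton solve with a MINRES solve of H p = -g, which remains well defined for indefinite or singular Hessians, and backtracks on the squared gradient norm instead of the objective; the function name, tolerances, and line-search constants are assumptions.

```python
import numpy as np
from scipy.sparse.linalg import minres

def newton_mr_sketch(grad, hess, x0, max_iter=50, tol=1e-8):
    """Newton-MR-style iteration: MR direction + line search on ||grad||^2 (sketch only)."""
    x = x0.astype(float).copy()
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) <= tol:
            break
        H = hess(x)
        # Minimum-residual solution of H p = -g; H must be symmetric,
        # but it may be indefinite or singular.
        p, _ = minres(H, -g)
        # Backtracking line search measured on ||grad||^2 rather than f.
        t, gn2 = 1.0, g @ g
        dd = 2.0 * (p @ (H @ g))       # directional derivative of ||grad||^2
        while t > 1e-12:
            g_new = grad(x + t * p)
            if g_new @ g_new <= gn2 + 1e-4 * t * dd:
                break
            t *= 0.5
        x = x + t * p
    return x
```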

11:50

Frank E. Curtis

NonOpt: Nonsmooth Optimization Solver

We discuss NonOpt, an open-source C++ solver for nonconvex, nonsmooth minimization problems. The code implements methods based on both gradient-sampling and bundle approaches, all incorporating self-correcting BFGS inverse Hessian approximations. It comes equipped with its own specialized QP solver for the subproblems that arise. It is also extensible, allowing users to add their own step computation strategies, line search routines, and so on. Numerical experiments show that the software is competitive with other available codes.
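NonOpt itself is a C++ code with its own interface; purely to illustrate the kind of QP subproblem mentioned above, the Python sketch below performs one generic gradient-sampling step, finding the minimum-norm element of the convex hull of sampled gradients. The helper name, sampling radius, and use of SciPy's SLSQP are assumptions for illustration, not NonOpt's API.

```python
import numpy as np
from scipy.optimize import minimize

def gradient_sampling_step(grad, x, eps=1e-2, m=None, rng=None):
    """One generic gradient-sampling step (sketch only, not NonOpt's interface)."""
    rng = np.random.default_rng() if rng is None else rng
    n = x.size
    m = n + 1 if m is None else m
    # Gradients at x and at m random points within distance eps of x.
    pts = [x] + [x + eps * rng.uniform(-1, 1, n) for _ in range(m)]
    G = np.array([grad(p) for p in pts])            # shape (m+1, n)
    k = G.shape[0]
    # QP over the simplex: min_lambda ||G^T lambda||^2, lambda >= 0, sum(lambda) = 1.
    obj = lambda lam: float(np.sum((G.T @ lam) ** 2))
    res = minimize(obj, np.full(k, 1.0 / k), method="SLSQP",
                   bounds=[(0.0, 1.0)] * k,
                   constraints={"type": "eq", "fun": lambda lam: lam.sum() - 1.0})
    # Negative of the minimum-norm convex combination is the search direction.
    return -(G.T @ res.x)
```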