Tue.2 13:15–14:30 | H 0105 | NON

Nonlinear and Stochastic Optimization (4/4)

Chair: Albert Berahas
Organizers: Albert Berahas, Geovani Grapiglia
13:15

Michael Overton

joint work with Azam Asl

Behavior of L-BFGS on Nonsmooth Problems in Theory and in Practice

When applied to minimize nonsmooth functions, the "full" BFGS method works remarkably well, typically converging linearly to Clarke stationary values; no counterexamples to this behavior are known in the convex case, although its theoretical behavior is not yet well understood. In contrast, limited-memory BFGS (L-BFGS) may converge to non-stationary values, even on a simple convex function, but this appears to be rare. We summarize our experience with L-BFGS in both theory and practice.
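[Editorial illustration, not part of the talk.] A minimal sketch of the kind of experiment the abstract alludes to: running an off-the-shelf L-BFGS implementation on a simple nonsmooth convex function, supplying a subgradient where a gradient is requested. The choice of test function, starting point, and solver options below are illustrative assumptions.

```python
# Illustrative only: apply scipy's L-BFGS-B to the nonsmooth convex
# function f(x) = max_i |x_i|, returning a subgradient in place of the
# gradient.  Function, starting point, and options are assumptions,
# not taken from the talk.
import numpy as np
from scipy.optimize import minimize

def f_and_subgrad(x):
    i = int(np.argmax(np.abs(x)))        # coordinate achieving the max
    g = np.zeros_like(x)
    g[i] = np.sign(x[i])                 # a subgradient of max_i |x_i|
    return np.max(np.abs(x)), g

x0 = np.array([1.0, -2.0, 0.5])
res = minimize(f_and_subgrad, x0, jac=True, method="L-BFGS-B",
               options={"maxcor": 5, "maxiter": 200})
print(res.x, res.fun)                    # how close to the minimizer 0?
```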

13:40

Anton Rodomanov

Greedy Quasi-Newton Method with Explicit Superlinear Convergence

We propose a new quasi-Newton method for unconstrained minimization of smooth functions. Our method is based on the famous BFGS scheme, but at each iteration it updates the Hessian approximation along a greedily selected coordinate vector instead of the previous search direction. We prove that the proposed method has local superlinear convergence and establish a precise bound for its rate. To our knowledge, this result is the first explicit non-asymptotic rate of superlinear convergence for quasi-Newton methods. All our conclusions are confirmed by numerical experiments.
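[Editorial note, for context only.] The classical BFGS update of the Hessian approximation that the abstract refers to is sketched below; the greedy variant replaces the displacement s_k with a greedily chosen coordinate vector (the exact selection rule is given in the talk and is not reproduced here).

```latex
% Classical BFGS Hessian-approximation update (background, not the
% talk's greedy rule), with s_k = x_{k+1} - x_k and
% y_k = \nabla f(x_{k+1}) - \nabla f(x_k):
\[
  B_{k+1} \;=\; B_k
  \;-\; \frac{B_k s_k s_k^{\top} B_k}{s_k^{\top} B_k s_k}
  \;+\; \frac{y_k y_k^{\top}}{y_k^{\top} s_k}.
\]
% In the greedy method described above, the update direction s_k is
% replaced by a greedily selected coordinate vector.
```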

14:05

Albert Berahas

joint work with Frank E. Curtis, Baoyu Zhou

Limited-Memory BFGS with Displacement Aggregation

We present a displacement aggregation strategy for the curvature pairs stored in a limited-memory BFGS method such that the resulting (inverse) Hessian approximations are equal to those derived from a full-memory BFGS method. Using this strategy, an algorithm employing the limited-memory method can achieve the same convergence properties as when full-memory Hessian approximations are employed. We illustrate the performance of an L-BFGS algorithm that employs the aggregation strategy.
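[Editorial illustration, not part of the talk.] For background, a sketch of the standard L-BFGS two-loop recursion over the stored curvature pairs (s_i, y_i) that the abstract refers to; the displacement-aggregation strategy operates on which pairs are stored, and that step is not reproduced here. Names and the default scaling below are assumptions.

```python
# Background sketch only: standard L-BFGS two-loop recursion over a
# history of curvature pairs (s_i, y_i).  The talk's displacement
# aggregation modifies the stored pairs; that step is not shown.
import numpy as np

def lbfgs_direction(grad, pairs, gamma=1.0):
    """Return -H*grad, where H is the L-BFGS inverse-Hessian approximation
    built from pairs = [(s_0, y_0), ..., (s_{m-1}, y_{m-1})] (oldest first);
    gamma is the initial scaling H_0 = gamma*I (assumed here)."""
    q = grad.copy()
    alphas = []
    for s, y in reversed(pairs):                 # first loop: newest to oldest
        rho = 1.0 / np.dot(y, s)
        a = rho * np.dot(s, q)
        q -= a * y
        alphas.append((rho, a))
    r = gamma * q
    for (s, y), (rho, a) in zip(pairs, reversed(alphas)):  # second loop: oldest to newest
        b = rho * np.dot(y, r)
        r += (a - b) * s
    return -r
```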