Tue.4 16:30–17:45 | H 3002 | CNV

Performance Estimation of First-Order Methods

Chair: Adrien Taylor
Organizers: François Glineur, Adrien Taylor
16:30

Yoel Drori

joint work with Ohad Shamir

Performance estimation of stochastic first-order methods

We present an extension of the performance estimation methodology that allows a computer-assisted analysis of stochastic first-order methods. The approach applies in both the convex and the non-convex settings, and to various suboptimality criteria, including the standard minimal gradient norm of the iterates. We describe the approach, present new numerical and analytical bounds for the stochastic gradient method, and give a construction of a worst-case example establishing the tightness of the analyses.
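As a purely illustrative aside (not the authors' analysis), the "minimal gradient norm of the iterates" criterion mentioned above can be sketched on a toy problem: run stochastic gradient descent and track min over k of the true gradient norm at the iterates. The objective, stepsize, and noise model below are arbitrary choices for illustration.

```python
import numpy as np

# Toy sketch of the suboptimality criterion min_{0<=k<=N} ||grad f(x_k)||
# for the stochastic gradient method. All problem data are illustrative.

rng = np.random.default_rng(0)
A = np.array([[2.0, 0.0], [0.0, 0.5]])  # f(x) = 0.5 * x^T A x, smooth and convex


def grad(x):
    return A @ x


x = np.array([1.0, 1.0])
min_grad_norm = np.linalg.norm(grad(x))
for k in range(200):
    noise = 0.1 * rng.standard_normal(2)  # stochastic gradient oracle: grad + noise
    x = x - 0.1 * (grad(x) + noise)       # fixed stepsize 0.1
    min_grad_norm = min(min_grad_norm, np.linalg.norm(grad(x)))

print(min_grad_norm)  # best (smallest) true gradient norm seen along the trajectory
```

The criterion is natural in the non-convex setting, where function-value suboptimality is unavailable but a small gradient norm certifies an approximate stationary point.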

16:55

Donghwan Kim

Accelerated Proximal Point Method for Maximally Monotone Operators

This work proposes an accelerated proximal point method for maximally monotone operators. The proof is computer-assisted, via the performance estimation problem approach. The proximal point method encompasses many well-known convex optimization methods, such as the alternating direction method of multipliers and the proximal method of multipliers, so the proposed acceleration has wide applications. Numerical experiments will be presented to demonstrate the accelerated behavior.
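For readers unfamiliar with the base scheme, here is a minimal sketch of the (unaccelerated) proximal point iteration x_{k+1} = (I + λA)^{-1} x_k for a maximally monotone operator A. The operator below, a skew-symmetric linear map, is monotone but not a gradient, so gradient-type methods do not apply; the talk's accelerated variant adds extrapolation terms not shown here.

```python
import numpy as np

# Plain proximal point iteration x_{k+1} = J_{lam A}(x_k) = (I + lam A)^{-1} x_k
# for a maximally monotone operator A. Here A is skew-symmetric (<Ax, x> = 0),
# a standard example of a monotone operator with no underlying objective.
A = np.array([[0.0, -1.0], [1.0, 0.0]])
lam = 1.0
I = np.eye(2)

x = np.array([1.0, 1.0])
for k in range(50):
    x = np.linalg.solve(I + lam * A, x)  # resolvent step

print(np.linalg.norm(x))  # the iterates approach the zero of A (the origin)
```

For a linear A the resolvent is a single linear solve; in the applications named in the abstract (ADMM, proximal method of multipliers) the resolvent step instead amounts to the corresponding subproblem solve.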

17:20

François Glineur

joint work with Antoine Daccache, Julien Hendrickx

On the Worst-Case of the Fixed-Step Gradient Method for Arbitrary Stepsizes

The problem of computing the worst-case performance of a given optimization algorithm over a given class of objective functions can in some cases be formulated exactly as a semidefinite optimization problem, whose solution provides both the sought worst-case performance and a description of the corresponding objective function. We use this methodology to study the gradient method performing N steps with fixed but arbitrary stepsizes, applied to a smooth convex function. Results suggest the existence of an exponential (in N) number of distinct worst-case behaviours depending on the stepsizes.
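The exact worst case in the talk comes from a semidefinite program over all L-smooth convex functions, which is beyond a short snippet. A hedged sketch of the underlying question: restricting to one-dimensional quadratics f(x) = (λ/2)x², after N fixed steps one has x_N = x_0 · ∏ᵢ(1 − γᵢλ), so scanning λ over [0, L] yields an easily computed lower bound on the true worst case, and already shows how the attaining curvature depends on the stepsizes. The stepsizes below are an arbitrary example.

```python
import numpy as np

# Lower bound on the worst case of N fixed-step gradient steps, restricted to
# one-dimensional quadratics f(x) = (lam/2) * x^2 with curvature lam in [0, L].
# On such f, N steps with stepsizes gamma_i give x_N = x_0 * prod_i (1 - gamma_i*lam).
L = 1.0
gammas = [1.5, 0.5, 2.0]  # fixed but arbitrary stepsizes (illustrative)
lams = np.linspace(1e-6, L, 100001)

# Final objective value f(x_N) with |x_0 - x*| = 1, as a function of lam:
final = 0.5 * lams * np.prod([(1 - g * lams) ** 2 for g in gammas], axis=0)

worst = lams[np.argmax(final)]
print(worst, final.max())  # worst-case curvature and worst-case objective value
```

Varying the stepsizes moves the maximizing curvature between λ = L and interior values, a one-dimensional hint of the many distinct worst-case regimes the abstract describes; the semidefinite formulation certifies the worst case over the full smooth convex class, not just quadratics.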