Tue.2 13:15–14:30 | H 0104 | NON

Recent Advances in First-Order Methods (1/6)

Chair: Yuyuan Ouyang
Organizers: Yuyuan Ouyang, Guanghui Lan, Yi Zhou
13:15

Renato Monteiro

An Average Curvature Accelerated Composite Gradient Method for Nonconvex Smooth Composite Optimization Problems

This talk discusses an accelerated composite gradient (ACG) variant for solving nonconvex smooth composite minimization problems. In contrast to well-known ACG variants, which rely on either a known Lipschitz constant of the gradient or the sequence of maximum observed curvatures, the variant presented here is based on the sequence of average observed curvatures.
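To make the idea concrete, here is a minimal, hedged sketch of a composite gradient step whose stepsize is driven by the running average of observed curvatures. It is illustrative only, not the authors' method: the actual ACG scheme adds acceleration and convergence safeguards, and all function and parameter names below are hypothetical.

```python
import numpy as np

def avg_curvature_prox_grad(f, grad_f, prox_h, x0, C0=1.0, n_iter=500):
    """Toy sketch: composite gradient steps for min f(x) + h(x) with the
    stepsize taken from the running AVERAGE of observed curvatures,
    rather than a known Lipschitz constant or the maximum observed
    curvature.  prox_h(v, t) is the proximal map of t*h at v."""
    x = x0.copy()
    curvatures = [C0]
    for _ in range(n_iter):
        M = np.mean(curvatures)               # average observed curvature
        g = grad_f(x)
        x_new = prox_h(x - g / M, 1.0 / M)    # prox step with stepsize 1/M
        d = x_new - x
        dn2 = float(d @ d)
        if dn2 == 0.0:
            break
        # curvature observed along the step, via the descent-lemma ratio
        C = 2.0 * (f(x_new) - f(x) - g @ d) / dn2
        curvatures.append(max(C, 1e-12))      # keep the estimate positive
        x = x_new
    return x
```

Because the average of observed curvatures is never larger than their maximum, this rule tends to allow larger steps than max-curvature schemes, which is the intuition the talk builds on.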

13:40

William Hager

joint work with Hongchao Zhang

Inexact alternating direction multiplier methods for separable convex optimization

Inexact alternating direction multiplier methods (ADMMs) are developed for solving general separable convex optimization problems with a linear constraint and an objective that is the sum of smooth and nonsmooth terms. The approach involves linearized subproblems, a back-substitution step, and either gradient or accelerated gradient techniques. Global convergence is established, and convergence rate bounds are obtained for both ergodic and non-ergodic iterates.
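The following sketch illustrates the general flavor of solving ADMM subproblems inexactly through linearization. It shows a generic linearized ADMM for min f(x) + g(Ax), rewritten with a splitting variable z = Ax; it is not the specific scheme of the talk (which treats general separable problems and adds a back-substitution step and accelerated-gradient options), and all names are illustrative.

```python
import numpy as np

def linearized_admm(prox_f, prox_g, A, rho=1.0, n_iter=500):
    """Hedged sketch: linearized ADMM for min_x f(x) + g(A x),
    i.e. min f(x) + g(z) s.t. A x - z = 0.  The x-subproblem is solved
    inexactly by a single linearized (proximal-gradient) step on the
    augmented Lagrangian.  prox_f(v, t), prox_g(v, t) are the proximal
    maps of t*f and t*g at v."""
    m, n = A.shape
    x, z, u = np.zeros(n), np.zeros(m), np.zeros(m)  # u: scaled multiplier
    tau = 1.0 / (rho * np.linalg.norm(A, 2) ** 2)    # ensures majorization
    for _ in range(n_iter):
        # inexact x-update: one prox-gradient step instead of an exact solve
        x = prox_f(x - tau * rho * (A.T @ (A @ x - z + u)), tau)
        # exact z-update: proximal map of g (assumed cheap)
        z = prox_g(A @ x + u, 1.0 / rho)
        # dual (multiplier) update
        u = u + A @ x - z
    return x, z
```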

14:05

Dmitry Grishchenko

joint work with Franck Iutzeler, Jerome Malick

Identification-based first-order algorithms for distributed learning

In this talk, we present a first-order optimization algorithm for distributed learning problems. We propose an efficient sparsification of proximal-gradient updates to alleviate the communication bottleneck of exchanging high-dimensional updates between the master machine and the worker machines. Using a sketch-and-project technique with projections onto near-optimal low-dimensional subspaces, we obtain a significant improvement in convergence relative to the total volume of communication.
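Below is a toy single-process simulation of the identification idea: with an l1 regularizer, the proximal map eventually identifies a sparse support, so communicated updates can be restricted to that support plus a few random exploratory coordinates. This is a hedged illustration of identification-based sparsification, not the authors' sketch-and-project scheme; all names are hypothetical.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal map of t * ||.||_1 (coordinatewise soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def sparsified_prox_grad(grads, x0, lam, step, n_iter=200, rng=None):
    """Toy simulation: proximal gradient for an l1-regularized finite sum,
    where each aggregated update is sparsified to the support identified
    by the current iterate (plus random coordinates so the support can
    grow).  grads is a list of per-worker gradient callables."""
    rng = rng or np.random.default_rng(0)
    x = x0.copy()
    n = x.size
    for _ in range(n_iter):
        mask = np.zeros(n, dtype=bool)
        mask[np.flatnonzero(x)] = True                # identified support
        mask[rng.choice(n, size=max(1, n // 20), replace=False)] = True
        g = np.mean([grad(x) for grad in grads], axis=0)
        g = np.where(mask, g, 0.0)                    # sparsified update
        x = soft_threshold(x - step * g, step * lam)  # prox-gradient step
    return x
```

In an actual master-worker deployment, only the masked coordinates of each update would be transmitted, which is where the communication savings come from.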