This talk discusses an accelerated composite gradient (ACG) variant for solving nonconvex smooth composite minimization problems. In contrast to well-known ACG variants, which rely on either a known Lipschitz gradient constant or the sequence of maximum observed curvatures, the variant presented here uses the sequence of average observed curvatures.
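A minimal sketch of the idea, assuming a smooth term with gradient `grad_f` and a nonsmooth term with proximal operator `prox_h`; the names, the curvature estimate, and the extrapolation schedule are illustrative assumptions, not the speaker's exact scheme. The inverse step size is taken as the average of the curvatures observed along past steps rather than a known Lipschitz constant or the running maximum.

```python
import numpy as np

def acg_avg_curvature(grad_f, prox_h, x0, n_iters=200, eps=1e-12):
    """Accelerated composite gradient sketch for min f(x) + h(x).

    The step size 1/M uses the average of the local curvatures
    observed so far (illustrative rule, not the talk's exact one).
    """
    x = x0.copy()
    y = x0.copy()
    t = 1.0
    curvatures = []          # observed local curvature estimates
    M = 1.0                  # current curvature estimate (initial guess)
    g_y = grad_f(y)
    for _ in range(n_iters):
        # proximal gradient step from the extrapolated point y
        x_new = prox_h(y - g_y / M, 1.0 / M)
        # curvature observed along the step y -> x_new
        g_x = grad_f(x_new)
        step = x_new - y
        denom = np.dot(step, step)
        if denom > eps:
            curvatures.append(abs(np.dot(g_x - g_y, step)) / denom)
            M = max(np.mean(curvatures), eps)   # average observed curvature
        # Nesterov-style extrapolation
        t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)
        g_y = grad_f(y)
        x, t = x_new, t_new
    return x
```

With h = 0 (so `prox_h` is the identity in its first argument) this reduces to an accelerated gradient method whose step size adapts to the averaged curvature instead of a fixed 1/L.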
joint work with Hongchao Zhang
Inexact alternating direction methods of multipliers (ADMMs) are developed for solving general separable convex optimization problems with a linear constraint and an objective that is the sum of smooth and nonsmooth terms. The approach involves linearized subproblems, a back substitution step, and either gradient or accelerated gradient techniques. Global convergence is established, and convergence rate bounds are obtained for both ergodic and non-ergodic iterates.
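A rough two-block sketch of the linearized-subproblem idea for min f(x) + g(z) subject to Ax + Bz = b, assuming the nonsmooth term f is prox-friendly via `prox_f` and the smooth term g is handled inexactly by a single gradient step; the step sizes, the known Lipschitz constant `L_g`, and the omission of the back substitution and accelerated variants are illustrative simplifications, not the authors' method.

```python
import numpy as np

def linearized_admm(prox_f, grad_g, A, B, b, x0, z0,
                    rho=1.0, L_g=1.0, n_iters=500):
    """Two-block linearized ADMM sketch for
        min_{x,z}  f(x) + g(z)   s.t.  A x + B z = b.

    The x-subproblem is linearized so that only prox_f is needed, and the
    z-subproblem is replaced by one gradient step (an inexact subproblem).
    """
    x, z = x0.copy(), z0.copy()
    lam = np.zeros_like(b)                       # multiplier for A x + B z = b
    tau = 1.0 / (rho * np.linalg.norm(A, 2) ** 2)        # x step size
    sigma = 1.0 / (L_g + rho * np.linalg.norm(B, 2) ** 2)  # z step size
    for _ in range(n_iters):
        r = A @ x + B @ z - b
        # linearized x-update: prox step against the augmented term's gradient
        x = prox_f(x - tau * rho * A.T @ (r + lam / rho), tau)
        r = A @ x + B @ z - b
        # inexact z-update: one gradient step on g plus the augmented term
        z = z - sigma * (grad_g(z) + rho * B.T @ (r + lam / rho))
        # multiplier update
        lam = lam + rho * (A @ x + B @ z - b)
    return x, z
```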
joint work with Franck Iutzeler, Jerome Malick
In this talk, we present a first-order optimization algorithm for distributed learning problems. We propose an efficient sparsification of proximal-gradient updates to lift the communication bottleneck of exchanging large updates between the master machine and the worker machines. Using a sketch-and-project technique with projections onto near-optimal low-dimensional subspaces, we obtain a significant improvement in convergence with respect to the total amount of communication.
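A toy sketch of the communication-reduction idea, with a plain random coordinate sketch standing in for the talk's near-optimal low-dimensional projections; the names `grads` and `prox_r`, the unbiased rescaling, and the fixed step size are illustrative assumptions rather than the proposed algorithm.

```python
import numpy as np

def sparsified_prox_grad(grads, prox_r, x0, step=0.1, k=10,
                         n_rounds=100, seed=None):
    """Distributed proximal-gradient sketch with sparsified communication.

    Each worker sends only k randomly chosen coordinates of its local
    gradient (the sparse message it would communicate to the master);
    the master averages the sparse messages, which are rescaled to keep
    the aggregate unbiased, and applies a proximal step on the regularizer r.
    """
    rng = np.random.default_rng(seed)
    x = x0.copy()
    d = x.size
    for _ in range(n_rounds):
        agg = np.zeros(d)
        for grad_i in grads:                     # one callable per worker
            g = grad_i(x)
            idx = rng.choice(d, size=k, replace=False)  # keep only k coordinates
            sparse = np.zeros(d)
            sparse[idx] = g[idx] * (d / k)       # rescale so the sketch is unbiased
            agg += sparse
        agg /= len(grads)
        x = prox_r(x - step * agg, step)         # master's proximal-gradient update
    return x
```

Here only k of the d coordinates travel over the network per worker and per round, which is the communication saving the sketch-and-project projections are designed to deliver with much better subspaces than uniform coordinate sampling.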