Wed.3 16:00–17:15 | H 2038 | SPA

Block Alternating Schemes for Nonsmooth Optimization at a Large Scale

Chair: Jean-Christophe Pesquet
Organizer: Emilie Chouzenoux
16:00

Krzysztof Rutkowski

joint work with Ewa Bednarczuk

On a projected dynamical system related to a best approximation primal-dual algorithm for solving maximally monotone inclusion problems

In this talk we discuss the projected dynamical system (PDS) associated with the proximal primal-dual algorithm for solving maximally monotone inclusion problems. We investigate a discretized version of the PDS with the aim of giving insight into the relationship between the trajectories solving the PDS and the sequences generated by the proximal primal-dual algorithm. We relate the discretization of the PDS to Haugazeau-based proximal primal-dual algorithms and discuss their block versions.
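
To fix ideas, the following minimal Python sketch shows an explicit Euler discretization of a projected dynamical system. It is an illustration only, not the authors' Haugazeau-based scheme: the operator F, the constraint set, and all step sizes are assumed for the example.

    import numpy as np

    def project_box(x, lo=-1.0, hi=1.0):
        # Euclidean projection onto the illustrative constraint set C = [lo, hi]^n
        return np.clip(x, lo, hi)

    def pds_euler(F, x0, gamma=0.5, step=0.1, n_iter=200):
        # Explicit Euler discretization of the projected dynamical system
        #   x'(t) = P_C(x(t) - gamma * F(x(t))) - x(t),
        # whose equilibrium points solve the monotone inclusion 0 in F(x) + N_C(x).
        x = x0.copy()
        for _ in range(n_iter):
            x = x + step * (project_box(x - gamma * F(x)) - x)
        return x

    # Example: F is the gradient of a strongly convex quadratic,
    # hence a maximally monotone operator.
    A = np.array([[2.0, 0.5], [0.5, 1.0]])
    b = np.array([3.0, -2.0])
    x_star = pds_euler(lambda x: A @ x - b, x0=np.zeros(2))

The fixed points of the discrete iteration coincide with the equilibria of the continuous-time system; the correspondence between such trajectories and algorithmic sequences is the kind of relationship the talk examines.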

16:25

Marco Prato

joint work with Silvia Bonettini, Simone Rebegoldi

A variable metric nonlinear Gauss-Seidel algorithm for nonconvex nonsmooth optimization

We propose a block coordinate forward-backward algorithm aimed at minimizing functions defined on disjoint blocks of variables. The proposed approach allows several variable metric forward-backward steps to be performed on each block of variables, in combination with an Armijo backtracking linesearch ensuring sufficient decrease of the objective function. The forward-backward step may be computed approximately, according to an implementable stopping criterion, and the parameters defining the variable metric may be chosen according to flexible adaptive rules.
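
As a rough illustration, a single block update of this kind of scheme might look as follows in Python; this is a sketch under assumed choices (a blockwise l1 nonsmooth term, a placeholder diagonal metric, placeholder parameters), not the method of the talk.

    import numpy as np

    def soft_threshold(v, t):
        # Proximity operator of t * ||.||_1 (illustrative nonsmooth term)
        return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

    def block_fb_step(f, grad_f, x, idx, lam=0.1, alpha=1.0, beta=0.5, sigma=1e-4):
        # One variable metric forward-backward step on the block idx,
        # followed by an Armijo backtracking linesearch along y - x.
        g = grad_f(x)[idx]
        D = 1.0 / (1.0 + np.abs(g))                # illustrative diagonal metric
        y = soft_threshold(x[idx] - alpha * D * g, alpha * D * lam)
        d = y - x[idx]                             # blockwise descent direction
        F0 = f(x) + lam * np.abs(x[idx]).sum()
        t = 1.0
        while t > 1e-10:
            x_try = x.copy()
            x_try[idx] = x[idx] + t * d
            # Armijo-type sufficient decrease test on the full objective
            if f(x_try) + lam * np.abs(x_try[idx]).sum() <= F0 - sigma * t * (d @ d):
                break
            t *= beta
        x_new = x.copy()
        x_new[idx] = x[idx] + t * d
        return x_new

Sweeping the blocks cyclically, e.g. for idx in [np.arange(0, 5), np.arange(5, 10)]: x = block_fb_step(f, grad_f, x, idx), gives the nonlinear Gauss-Seidel structure referred to in the title.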

16:50

Jean-Christophe Pesquet

joint work with Emilie Chouzenoux, Giovanni Chierchia, Luis Briceño-Arias

A Random Block-Coordinate Douglas-Rachford Splitting Method with Low Computational Complexity for Binary Logistic Regression

In this talk, we introduce a new optimization algorithm for sparse logistic regression based on a stochastic version of the Douglas-Rachford splitting method. Our algorithm sweeps the training set by randomly selecting a mini-batch of data at each iteration, which allows us to update the variables in a block coordinate manner. Our approach leverages the proximity operator of the logistic loss, which can be expressed in closed form using the generalized Lambert W function. Experiments carried out on standard datasets demonstrate the efficiency of our approach with respect to stochastic gradient-like methods.
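
The closed-form expression of this proximity operator via the generalized Lambert W function is developed in the underlying work; since that special function is not available in standard numerical libraries, the Python sketch below computes the same operator by an elementwise Newton solve instead.

    import numpy as np
    from scipy.special import expit  # numerically stable sigmoid

    def prox_logistic(v, gamma, n_newton=20):
        # Proximity operator of gamma * log(1 + exp(.)), applied elementwise:
        # minimizes  gamma * log(1 + exp(p)) + 0.5 * (p - v)^2  over p.
        # The paper evaluates this in closed form with the generalized
        # Lambert W function; here we solve the optimality condition
        #   p - v + gamma * sigmoid(p) = 0
        # by a safeguarded Newton iteration (the objective is 1-strongly
        # convex, so the root is unique).
        lo, hi = v - gamma, v                  # the root always lies in this bracket
        p = 0.5 * (lo + hi)
        for _ in range(n_newton):
            s = expit(p)
            p = p - (p - v + gamma * s) / (1.0 + gamma * s * (1.0 - s))
            p = np.clip(p, lo, hi)             # keep the iterates in the bracket
        return p

In a random block-coordinate Douglas-Rachford iteration, a prox of this type would be applied to the logistic-loss terms of the randomly selected mini-batch, while the proximity operator of the sparsity-inducing regularizer handles the remaining term.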