Wed.2 14:15–15:30 | H 2038 | SPA

Primal-Dual Methods for Structured Optimization

Chair: Mathias Staudigl Organizer: Mathias Staudigl
14:15

Robert Ernö Csetnek

The proximal alternating minimization algorithm for two-block separable convex optimization problems with linear constraints

The Alternating Minimization Algorithm was proposed by Paul Tseng to solve convex programming problems with two-block separable linear constraints and objectives, whereby (at least) one of the components of the latter is assumed to be strongly convex. The fact that one of the subproblems to be solved within the iterations of this method does not usually correspond to the evaluation of a proximal operator through a closed formula affects the implementability of the algorithm. In this talk, we allow a further smooth convex function in each block of the objective and propose a proximal version of the algorithm, obtained by equipping the iterations with proximal terms induced by variable metrics. For suitable choices of the latter, solving the two subproblems of the iterative scheme reduces to the computation of proximal operators. We investigate the convergence of the proposed algorithm in a real Hilbert space setting and illustrate its numerical performance on two applications in image processing and machine learning.
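
A minimal NumPy sketch of the basic alternating minimization scheme on a toy two-block instance (an illustrative example, not taken from the talk): the strongly convex block is 0.5*||x - a||^2, the second block is lam*||y||_1, and the linear constraint is x - y = 0, so the solution is the soft-thresholding of a. The proximal variant discussed in the talk would additionally equip both subproblems with variable-metric proximal terms, which this sketch omits.

import numpy as np

# Toy two-block problem: min 0.5*||x - a||^2 + lam*||y||_1  s.t.  x - y = 0.
# The first block is strongly convex, as the method requires; the known solution
# x = y = soft_threshold(a, lam) is used to check the iterates.

def soft_threshold(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

rng = np.random.default_rng(0)
a = rng.normal(size=10)
lam, rho = 0.3, 1.0          # dual step size rho kept below 2*mu/||A||^2 = 2

x = y = p = np.zeros(10)
for k in range(300):
    x = a - p                                    # argmin_x 0.5*||x - a||^2 + <p, x>
    y = soft_threshold(x + p / rho, lam / rho)   # prox step: argmin_y lam*||y||_1 - <p, y> + (rho/2)*||x - y||^2
    p = p + rho * (x - y)                        # dual update on the constraint x = y

print(np.max(np.abs(y - soft_threshold(a, lam))))   # residual to the known solution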

14:40

Darina Dvinskikh

Complexity rates for an accelerated primal-dual gradient method for stochastic optimisation problems

We study the optimal complexity bounds of first-order methods for the convex finite-sum minimization problem. To address this problem, we appeal to distributed optimization and consider a network of multiple agents (processors) which cooperatively minimize the target function. Depending on the properties of the functions, we compare two approaches to finding the minimum in terms of oracle calls and communication rounds of the distributed system: a primal approach and a dual approach, the latter based on the assumption that the Fenchel-Legendre transform of the objective exists.
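
As a rough illustration of the dual approach on a toy decentralized instance (a hypothetical example; it does not reproduce the complexity bounds of the talk): agents on a path graph each hold a quadratic local function, the consensus constraint is dualized edge by edge, and Nesterov-accelerated gradient ascent is run on the smooth dual, so that every iteration costs one local oracle call per agent and one communication round along the edges.

import numpy as np

# Toy decentralized problem: n agents on a path graph, each holding
#   f_i(x) = 0.5*(x - a_i)^2   (1-strongly convex, so its Fenchel conjugate exists).
# Consensus reformulation: min sum_i f_i(x_i)  s.t.  x_i = x_{i+1} on every edge.

rng = np.random.default_rng(0)
n = 8
a = rng.normal(size=n)                # local data; the consensus optimum is a.mean()

# Incidence matrix of the path graph: E @ x returns the edge residuals x_i - x_{i+1}
E = np.zeros((n - 1, n))
for e in range(n - 1):
    E[e, e], E[e, e + 1] = 1.0, -1.0

L_dual = np.linalg.norm(E @ E.T, 2)   # smoothness constant of the dual (mu = 1)

def primal_from_dual(lmbd):
    # x_i(lmbd) = argmin_x f_i(x) + (E^T lmbd)_i x = a_i - (E^T lmbd)_i  (closed form)
    return a - E.T @ lmbd

# Nesterov-accelerated gradient ascent on the concave, smooth dual (one multiplier per edge)
lmbd = z = np.zeros(n - 1)
t = 1.0
for k in range(1000):
    x = primal_from_dual(z)           # local oracle calls
    grad = E @ x                      # dual gradient = consensus violation (one communication round)
    lmbd_next = z + grad / L_dual     # ascent step
    t_next = 0.5 * (1 + np.sqrt(1 + 4 * t * t))
    z = lmbd_next + (t - 1) / t_next * (lmbd_next - lmbd)
    lmbd, t = lmbd_next, t_next

x = primal_from_dual(lmbd)
print("consensus error:", np.max(np.abs(x - a.mean())))   # should shrink toward zero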

15:05

Kimon Antonakopoulos

joint work with Panayotis Mertikopoulos

Primal-dual methods for stochastic variational inequalities with unbounded operators

Motivated by applications in machine learning and imaging, we study a class of variational inequality problems with possibly unbounded operators. In contrast to their classical counterparts, these problems could exhibit singularities near the boundary of the problem domain, thus precluding the use of standard first-order methods. To circumvent this difficulty, we introduce a family of local norms adapted to the problem’s singularity landscape, and we derive a class of Bregman proximal methods that exploit this structure and provide optimal convergence rates in this setting.
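
For intuition only, a minimal sketch of a Bregman proximal (entropic mirror) iteration on a toy variational inequality over the simplex whose operator blows up near the boundary (a hypothetical instance; it uses a fixed entropy kernel and conservative vanishing steps rather than the local norms constructed in the talk).

import numpy as np

# Toy monotone operator on the interior of the simplex with a boundary singularity:
#   V_i(x) = -w_i / x_i  (the gradient of -sum_i w_i*log(x_i)), unbounded as x_i -> 0.
# The unique solution of the associated VI over the simplex is x*_i = w_i / sum_j w_j.
# Entropic Bregman proximal step:
#   x_{k+1} = argmin_x gamma*<V(x_k), x> + KL(x || x_k)  =  x_k * exp(-gamma*V(x_k)) / normalisation.

rng = np.random.default_rng(1)
w = rng.uniform(0.5, 2.0, size=5)
x_star = w / w.sum()

def V(x):
    return -w / x                       # singular as any coordinate approaches 0

x = np.full(5, 1.0 / 5)                 # start at the barycentre of the simplex
for k in range(1, 2001):
    gamma = 0.1 / np.sqrt(k)            # conservative vanishing step size
    y = x * np.exp(-gamma * V(x))       # entropic Bregman proximal step
    x = y / y.sum()

print("distance to solution:", np.linalg.norm(x - x_star))   # should shrink toward zero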