Mon.1 11:00–12:15 | H 1029 | CNV

Recent Advances in First-Order Methods for Constrained Optimization and Related Problems (1/4)

Chair: Quoc Tran-Dinh | Organizers: Quoc Tran-Dinh, Ion Necoara
11:00

Dirk Lorenz

Extensions of the randomized Kaczmarz method: sparsity and mismatched adjoints

The randomized Kaczmarz method is an incremental method for solving linear systems of equations. It uses only a single row of the system in each step and hence needs very little storage. In the underdetermined case, the standard Kaczmarz method approximates the minimum-norm solution. In this talk we study two extensions of the method: first, we show how to adapt the method to produce sparse solutions of underdetermined systems, and second, we investigate how a mismatch in the adjoint matrix affects convergence.
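For readers unfamiliar with the baseline algorithm, the following is a minimal Python/NumPy sketch of the standard randomized Kaczmarz iteration with the usual squared-row-norm sampling; the sparsity and mismatched-adjoint extensions discussed in the talk are not reproduced here.

```python
import numpy as np

def randomized_kaczmarz(A, b, n_iter=1000, seed=0):
    """Approximately solve A x = b by random row projections.

    Rows are sampled with probability proportional to their squared norm;
    starting from x = 0, the iterates approach the minimum-norm solution
    of a consistent underdetermined system.
    """
    rng = np.random.default_rng(seed)
    m, n = A.shape
    row_norms_sq = np.einsum("ij,ij->i", A, A)
    probs = row_norms_sq / row_norms_sq.sum()
    x = np.zeros(n)
    for _ in range(n_iter):
        i = rng.choice(m, p=probs)            # use only a single row per step
        a_i = A[i]
        # project x onto the hyperplane {z : <a_i, z> = b_i}
        x += (b[i] - a_i @ x) / row_norms_sq[i] * a_i
    return x
```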

11:25

Angelia Nedich

A Smooth Inexact Penalty Reformulation of Convex Problems with Linear Constraints

We consider a constrained convex problem with linear inequalities and provide an inexact penalty reformulation. The novelty lies in the choice of the penalty functions, which are smooth and can induce a non-zero penalty at some points in the feasible region of the original problem. For problems with a large number of constraints, a particular advantage of such a smooth penalty-based reformulation is that it renders the penalized problem suitable for fast incremental gradient methods (such as SAGA), which require only one sample from the inequality constraints at each iteration.
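To illustrate the structure being exploited, here is a hedged sketch of incremental gradient steps on a penalized objective f(x) + beta * (1/m) * sum_i phi(a_i^T x - b_i), sampling one inequality constraint per iteration. The smoothed hinge used as phi is a stand-in assumption (the talk's specific smooth penalties may differ), grad_f is a hypothetical gradient oracle for the objective, and plain stochastic gradient steps are used rather than a variance-reduced method such as SAGA.

```python
import numpy as np

def smoothed_hinge_grad(t, mu=0.1):
    """Gradient of a Huberized hinge penalty: zero for t <= 0, then a
    quadratic transition of width mu, then linear growth (illustrative choice)."""
    return np.where(t <= 0, 0.0, np.where(t <= mu, t / mu, 1.0))

def incremental_penalty_sgd(grad_f, A, b, beta=10.0, step=1e-2,
                            n_iter=10_000, mu=0.1, seed=0):
    """Stochastic gradient on f(x) + beta * (1/m) * sum_i phi(a_i^T x - b_i),
    touching a single inequality constraint a_i^T x <= b_i per iteration."""
    rng = np.random.default_rng(seed)
    m, n = A.shape
    x = np.zeros(n)
    for _ in range(n_iter):
        i = rng.integers(m)                   # one sampled constraint only
        slack = A[i] @ x - b[i]
        g = grad_f(x) + beta * smoothed_hinge_grad(slack, mu) * A[i]
        x -= step * g
    return x
```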

11:50

Ion Necoara

Minibatch stochastic first-order methods for convex optimization

We present convergence rates for minibatch stochastic first-order methods with constant or variable stepsizes for convex optimization problems, expressed in terms of the expectation operator. We show that these methods can achieve a linear convergence rate up to a constant proportional to the stepsize, and, under a strong stochastic bounded-gradient condition, even pure linear convergence. Moreover, when a variable stepsize is chosen, we derive sublinear convergence rates that show an explicit dependence on the minibatch size. Applications of these results to convex feasibility problems will also be discussed.
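As a sketch of the setting only, the following shows a minibatch stochastic gradient loop with a constant stepsize for an objective of the form (1/N) * sum_i f_i(x). The oracle grad_fi is a hypothetical user-supplied function, and none of the rates stated in the abstract are derived here.

```python
import numpy as np

def minibatch_sgd(grad_fi, N, dim, batch_size=32, step=1e-2,
                  n_epochs=50, seed=0):
    """Minibatch stochastic gradient descent with a constant stepsize.

    grad_fi(x, idx) is assumed to return the averaged gradient of the
    component functions f_i indexed by idx (hypothetical oracle).
    """
    rng = np.random.default_rng(seed)
    x = np.zeros(dim)
    for _ in range(n_epochs):
        perm = rng.permutation(N)
        for start in range(0, N, batch_size):
            idx = perm[start:start + batch_size]   # one minibatch of samples
            x -= step * grad_fi(x, idx)
        # a diminishing (variable) stepsize, e.g. shrinking `step` each epoch,
        # trades the constant-stepsize error floor for a sublinear rate
    return x
```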