Wed.2 14:15–15:30 | H 1028 | CNV

New Frontiers in Splitting Algorithms for Optimization and Monotone Inclusions (2/3)

Chair: Patrick L. Combettes | Organizer: Patrick L. Combettes
14:15

Calvin Wylie

joint work with Adrian S. Lewis

Partial smoothness, partial information, and fast local algorithms

The idea of partial smoothness unifies active-set and constant-rank properties in optimization and variational inequalities. We sketch two local algorithmic schemes in partly smooth settings. One promising heuristic solves finite minimax subproblems using only partial knowledge of the component functions. The second, more conceptual, approach relies on splitting schemes for variational inequalities.
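
For reference (this formulation is standard and is not quoted from the abstract), the finite minimax subproblems mentioned above have the form

    \min_{x \in \mathbb{R}^n} \ \varphi(x), \qquad \varphi(x) = \max_{1 \le i \le m} f_i(x), \qquad I(\bar{x}) = \{\, i : f_i(\bar{x}) = \varphi(\bar{x}) \,\}.

Near a candidate solution \bar{x}, the objective \varphi coincides with \max_{i \in I(\bar{x})} f_i, so only the active components need to be modeled accurately; this is the kind of partial information a local active-set scheme can exploit.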

14:40

Isao Yamada

joint work with Masao Yamagishi

Global optimization of sum of convex and nonconvex functions with proximal splitting techniques

In sparsity-aware signal processing and machine learning, ideal optimization models are often formulated as the minimization of a cost function defined as the sum of convex and possibly nonconvex terms, whose globally optimal solution is in general hard to compute. We present a unified translation of a certain class of such convex optimization problems into monotone inclusion problems and propose iterative algorithms with guaranteed convergence to a globally optimal solution.
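
As a generic illustration only (this is not the algorithm proposed in the talk; the composite model, function names, and data below are hypothetical), proximal splitting methods of the kind referred to above alternate a gradient step on a smooth term with a proximal step on a nonsmooth term. A minimal forward-backward sketch for a sparse least-squares model:

    import numpy as np

    def soft_threshold(v, tau):
        # Proximity operator of tau * ||.||_1 (componentwise soft-thresholding).
        return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

    def forward_backward(A, b, lam, step, iters=500):
        # Forward-backward splitting for  minimize 0.5*||A x - b||^2 + lam*||x||_1,
        # i.e. the monotone inclusion  0 in A^T(A x - b) + lam * d||.||_1(x).
        x = np.zeros(A.shape[1])
        for _ in range(iters):
            grad = A.T @ (A @ x - b)                         # forward (gradient) step
            x = soft_threshold(x - step * grad, step * lam)  # backward (proximal) step
        return x

    # Hypothetical data; a step size at most 1/||A||_2^2 ensures convergence for this convex model.
    rng = np.random.default_rng(0)
    A = rng.standard_normal((20, 50))
    x_true = rng.standard_normal(50) * (rng.random(50) < 0.1)
    b = A @ x_true
    x_hat = forward_backward(A, b, lam=0.1, step=1.0 / np.linalg.norm(A, 2) ** 2)

Replacing the soft-thresholding step with the proximity operator of another regularizer reuses the same template for other composite models.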

15:05

Christian L. Müller

joint work with Patrick L. Combettes

Perspective maximum likelihood-type estimation via proximal decomposition

We introduce an optimization model for general maximum likelihood-type estimation (M-estimation). The model leverages the observation that convex M-estimators with concomitant scale as well as various regularizers are instances of perspective functions. Such functions are amenable to proximal analysis and solution methods via splitting techniques. Using a geometrical approach based on duality, we derive novel proximity operators for several perspective functions. Numerical examples are provided.
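
For reference (a standard definition, not quoted from the abstract), the perspective of a proper lower semicontinuous convex function f on \mathbb{R}^n is

    \widetilde{f}(\eta, x) =
    \begin{cases}
      \eta\, f(x/\eta), & \text{if } \eta > 0,\\
      (\operatorname{rec} f)(x), & \text{if } \eta = 0,\\
      +\infty, & \text{if } \eta < 0,
    \end{cases}

where \operatorname{rec} f denotes the recession function of f. For example, with f = \|\cdot\|^2 one obtains \widetilde{f}(\eta, x) = \|x\|^2/\eta for \eta > 0, the type of term that arises in least-squares estimation with a concomitant scale parameter.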