Wed.3 16:00–17:15 | H 3025 | CNV

Algorithms for Large-Scale Convex and Nonsmooth Optimization (2/2)

Chair: Daniel Robinson
16:00

Masoud Ahookhosh

joint work with Andreas Themelis, Panagiotis Patrinos

A superlinearly convergent Bregman forward-backward method for nonsmooth nonconvex optimization

A superlinearly convergent Bregman forward-backward method is proposed for minimizing the sum of two nonconvex functions, where the first function is relatively smooth and the second is (possibly nonsmooth) nonconvex. We first introduce the Bregman forward-backward envelope (BFBE), verify its first- and second-order properties under mild assumptions, and develop a Newton-type forward-backward method based on the BFBE that requires only a first-order black-box oracle. After establishing global convergence and local superlinear convergence of the method, we present encouraging numerical results from several applications.
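
As a rough illustration of the forward-backward template underlying the talk (not the authors' Newton-type BFBE method), the sketch below performs Bregman forward-backward steps with the Shannon-entropy kernel, for which the step has the closed form x+ = x * exp(-gamma * grad f(x)); the quadratic objective, data, and step size are illustrative assumptions.

import numpy as np

def bregman_fb_step(x, grad_f, gamma):
    # Forward step on f, backward (mirror) step w.r.t. the entropy kernel
    # h(x) = sum(x*log(x) - x): solve grad_h(x_next) = grad_h(x) - gamma*grad_f(x),
    # with grad_h = log, which keeps the iterates strictly positive.
    return x * np.exp(-gamma * grad_f(x))

# Illustrative smooth term f(x) = 0.5 * ||A x - b||^2 with a positive solution.
rng = np.random.default_rng(0)
x_true = np.array([1.0, 2.0, 0.5])
A = rng.standard_normal((5, 3))
b = A @ x_true
grad_f = lambda x: A.T @ (A @ x - b)

x = np.ones(3)
for _ in range(500):
    x = bregman_fb_step(x, grad_f, gamma=0.02)
print(x)  # approximate nonnegative solution for this toy instance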

16:25

Avinash Dixit

joint work with Tanmoy Som

A new accelerated algorithm to solve convex optimization problems

We consider convex minimization problems involving the sum of two convex functions, one of which is differentiable. In this work, we use an iterative technique to solve the minimization problem; the recent trend is to develop techniques with greater convergence speed. First, we introduce an accelerated algorithm for approximating a fixed point of a nonexpansive mapping and a corresponding proximal gradient algorithm to solve convex optimization problems. The proposed algorithm is then compared with existing algorithms in terms of convergence speed and accuracy.
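
As a point of reference for the comparison mentioned above, the sketch below implements the classical accelerated proximal gradient (FISTA-type) iteration for an l1-regularized least-squares instance; this is a standard baseline, not the new algorithm of the talk, and the problem data and step size are illustrative assumptions.

import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t*||.||_1
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

rng = np.random.default_rng(1)
A, b, lam = rng.standard_normal((20, 10)), rng.standard_normal(20), 0.1
L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of grad f
grad_f = lambda x: A.T @ (A @ x - b)

x = y = np.zeros(10)
t = 1.0
for _ in range(300):
    x_new = soft_threshold(y - grad_f(y) / L, lam / L)   # proximal gradient step
    t_new = (1 + np.sqrt(1 + 4 * t * t)) / 2             # momentum parameter
    y = x_new + ((t - 1) / t_new) * (x_new - x)          # extrapolation step
    x, t = x_new, t_new
print(x)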

16:50

Ewa Bednarczuk

joint work with Krzysztof Rutkowski

[moved] On Lipschitzness of projections onto convex sets given by systems of nonlinear inequalities and equations under relaxed constant rank condition

When approaching convex optimization problems from a continuous perspective, the properties of the related dynamical systems often depend on the behavior of projections. In particular, when investigating continuous variants of primal-dual best approximation methods, projections onto moving convex sets appear. We investigate the Lipschitzness of projections onto moving convex sets. To this end we use the relaxed constant rank condition, introduced by Minchenko and Stakhovski.
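
As a toy illustration of the kind of question addressed in the talk, the sketch below numerically projects a fixed point onto two slightly perturbed convex sets given by a nonlinear inequality and a linear equation, and reports how far the projection moves; the particular set, the perturbation of its radius, and the use of SciPy's SLSQP solver are illustrative assumptions, not part of the talk.

import numpy as np
from scipy.optimize import minimize

def project(y, r):
    # Euclidean projection of y onto {x : ||x||^2 <= r^2, x[0] + x[1] = 1},
    # computed by solving the projection problem with a generic NLP solver.
    cons = [
        {"type": "ineq", "fun": lambda x: r ** 2 - x @ x},       # ||x||^2 <= r^2
        {"type": "eq",   "fun": lambda x: x[0] + x[1] - 1.0},    # linear equation
    ]
    x0 = np.array([0.5, 0.5])                                    # feasible start
    res = minimize(lambda x: 0.5 * np.sum((x - y) ** 2), x0,
                   method="SLSQP", constraints=cons)
    return res.x

# Project the same point onto the set and a slightly "moved" set (radius 1 vs 1.05);
# the distance between the two projections hints at Lipschitz-like dependence.
y = np.array([2.0, 0.0])
p1, p2 = project(y, 1.0), project(y, 1.05)
print(np.linalg.norm(p1 - p2))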