joint work with Andreas Themelis, Panagiotis Patrinos
A superlinearly convergent Bregman forward-backward method is proposed for minimizing the sum of two nonconvex functions, where the first function is relatively smooth and the second is (nonsmooth) nonconvex. We first introduce the Bregman forward-backward envelope (BFBE), establish its first- and second-order properties under mild assumptions, and develop a Newton-type forward-backward method based on the BFBE that requires only a first-order black-box oracle. After establishing global convergence and local superlinear convergence of the method, we present encouraging numerical results from several applications.
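As a point of reference for the method above, the following is a minimal sketch of a plain (Euclidean) forward-backward step, i.e. the special case of a Bregman forward-backward step with kernel h = ½‖·‖², applied to an illustrative composite problem f(x) = ½‖Ax − b‖² plus g(x) = λ‖x‖₁. It is not the authors' Newton-type BFBE method, and all problem data are assumptions for illustration.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t*||.||_1 (closed form for the l1 norm).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def forward_backward(A, b, lam, gamma, x0, iters=500):
    # Classical forward-backward (proximal gradient) iteration
    #   x+ = prox_{gamma*g}(x - gamma*grad f(x)),
    # which is the Bregman step specialized to h = 0.5*||x||^2.
    x = x0.copy()
    for _ in range(iters):
        grad = A.T @ (A @ x - b)                            # forward (gradient) step
        x = soft_threshold(x - gamma * grad, gamma * lam)   # backward (prox) step
    return x
```

With a relatively smooth f, the quadratic kernel above would be replaced by a Bregman distance D_h adapted to f; the talk's contribution is a Newton-type acceleration of this basic scheme via the BFBE.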
joint work with Tanmoy Som
We consider convex minimization problems in which the objective is the sum of two convex functions, one of which is differentiable. In the present work, we use an iterative technique to solve the minimization problem; the recent trend is to introduce techniques with greater convergence speed. First, we introduce an accelerated algorithm for finding a fixed point of a nonexpansive mapping and a corresponding proximal gradient algorithm for solving convex optimization problems. The proposed algorithm is then compared with existing algorithms in terms of convergence speed and accuracy.
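For context, a standard way to accelerate the proximal gradient method is Nesterov/FISTA-style inertial extrapolation; the sketch below shows that classical scheme on an illustrative l1-regularized least-squares problem. The talk's own algorithm and parameter choices may differ, and the problem data here are assumptions.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t*||.||_1.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def accelerated_prox_grad(A, b, lam, gamma, x0, iters=500):
    # FISTA-style accelerated proximal gradient:
    # a prox-gradient step at the extrapolated point y,
    # followed by an inertial update of y.
    x, y, t = x0.copy(), x0.copy(), 1.0
    for _ in range(iters):
        x_next = soft_threshold(y - gamma * (A.T @ (A @ y - b)), gamma * lam)
        t_next = (1 + np.sqrt(1 + 4 * t * t)) / 2
        y = x_next + ((t - 1) / t_next) * (x_next - x)  # inertial extrapolation
        x, t = x_next, t_next
    return x
```

The extrapolation step is what improves the worst-case rate from O(1/k) to O(1/k²) in function value for the classical method; comparisons of this kind are presumably the baseline for the talk's experiments.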
joint work with Krzysztof Rutkowski
When approaching convex optimization problems from a continuous perspective, the properties of the related dynamical systems often depend on the behavior of projections. In particular, when investigating continuous variants of primal-dual best approximation methods, projections onto moving convex sets appear. We investigate the Lipschitz continuity of the projection onto moving convex sets. To this end, we use the relaxed constant rank condition introduced by Minchenko and Stakhovski.
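To make the Lipschitz question concrete, here is a minimal numerical probe in a much simpler setting than the talk's relaxed-constant-rank framework: projecting a fixed point z onto a ball whose center translates along a line, C(t) = {x : ‖x − c₀ − tv‖ ≤ r}. For a translating set the projection is Lipschitz in t (with constant at most 2‖v‖, since the projection onto a fixed convex set is nonexpansive); the function names and data are assumptions for illustration.

```python
import numpy as np

def proj_ball(z, c, r):
    # Euclidean projection of z onto the closed ball of center c, radius r.
    d = z - c
    n = np.linalg.norm(d)
    return z.copy() if n <= r else c + r * d / n

def lipschitz_ratio(z, c0, v, r, t, s):
    # Difference quotient ||P_{C(t)}(z) - P_{C(s)}(z)|| / |t - s|
    # for the moving ball C(t) with center c0 + t*v.
    p_t = proj_ball(z, c0 + t * v, r)
    p_s = proj_ball(z, c0 + s * v, r)
    return np.linalg.norm(p_t - p_s) / abs(t - s)
```

For sets defined by parameter-dependent inequality and equality constraints, no such elementary bound is available in general; this is where constraint qualifications like the relaxed constant rank condition enter.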