joint work with Pavel Dvurechensky, Alexander Gasnikov
In this paper, we propose a new class of convex optimization problems whose objective is a sum of m+1 functions with different orders of smoothness: a non-smooth but simple function with known structure, a function with Lipschitz-continuous gradient, a function with Lipschitz-continuous second-order derivative, ..., and a function with Lipschitz-continuous m-th order derivative. To solve this problem, we propose a high-order optimization method that takes into account information about all of these functions. We prove the convergence rate of this method and obtain faster convergence rates than those known in the literature.
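The composite objective described in the abstract can be sketched as follows; the symbols $f_0, \dots, f_m$ and the constants $L_i$ are illustrative notation, not taken from the paper itself:

```latex
\min_{x \in \mathbb{R}^n} \; F(x) \;=\; f_0(x) + f_1(x) + \dots + f_m(x),
```

where $f_0$ is non-smooth but simple with known structure, and each $f_i$, $i = 1, \dots, m$, has a Lipschitz-continuous $i$-th order derivative, i.e. $\|\nabla^i f_i(x) - \nabla^i f_i(y)\| \le L_i \|x - y\|$ for all $x, y$.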
joint work with Juan Carlos De los Reyes
In this talk we present a second-order method for solving composite sparse optimization problems, which consist of minimizing the sum of a differentiable, possibly nonconvex function and a nondifferentiable convex term. The nondifferentiable convex penalty is given by the $\ell_1$-norm of a matrix times the coefficient vector. Our proposed method generalizes the three main ingredients of the previous OESOM algorithm to the generalized composite setting: orthant directions, the projection step and, in particular, the full second-order information.
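The problem class described above can be written schematically as follows; the symbols $f$, $B$, and $\beta$ are illustrative notation introduced here, not taken from the talk:

```latex
\min_{u \in \mathbb{R}^n} \; f(u) + \beta \, \| B u \|_1,
```

where $f$ is differentiable (possibly nonconvex), $B$ is a given matrix applied to the coefficient vector $u$, and $\beta > 0$ weights the sparsity-inducing penalty. For $B = I$ this reduces to the standard $\ell_1$-regularized setting.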