joint work with Quoc Tran-Dinh, Kim-Chuan Toh
This paper proposes a new class of proximal variable-metric methods for solving a class of composite convex problems. The key idea is to utilise a new parameterization of the optimality condition to design homotopy proximal variable-metric methods that achieve linear convergence and finite global iteration-complexity bounds. Rigorous proofs of the convergence results are provided. Numerical experiments on several applications illustrate our theoretical and computational advancements compared with other state-of-the-art algorithms.
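For orientation, here is a minimal sketch of one generic proximal variable-metric step for a composite problem $\min_x \{F(x) := f(x) + g(x)\}$, with $f$ smooth convex, $g$ proper closed convex, and $H_k \succ 0$ a variable metric; the symbols $f$, $g$, $H_k$, $x_k$ are illustrative notation rather than the paper's, and the paper's homotopy strategy and optimality-condition parameterization refine this template:
$$x_{k+1} \;=\; \operatorname*{argmin}_{x}\Big\{ g(x) + \langle \nabla f(x_k),\, x - x_k\rangle + \tfrac{1}{2}\|x - x_k\|_{H_k}^2 \Big\} \;=\; \mathrm{prox}_{g}^{H_k}\!\big(x_k - H_k^{-1}\nabla f(x_k)\big),$$
where $\mathrm{prox}_{g}^{H_k}$ denotes the proximal mapping of $g$ in the norm $\|\cdot\|_{H_k}$.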
joint work with Defeng Sun, Kim-Chuan Toh, Yancheng Yuan
The exclusive lasso regularization based on the $\ell_{1,2}$ norm has become popular owing to its superior performance, as it induces both inter-group and intra-group sparsity. In this paper, we derive a closed-form solution for the proximal mapping of the $\ell_{1,2}$ norm and its generalized Jacobian. We then design efficient first- and second-order algorithms for machine learning models involving the exclusive lasso regularization.
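For reference, a sketch of the regularizer in commonly used notation; the group structure $\mathcal{G}$ and parameter $\rho$ below are illustrative assumptions, not necessarily the paper's exact setup. For a partition $\mathcal{G}$ of the coordinates of $x$, the exclusive lasso penalty based on the $\ell_{1,2}$ norm is typically written as
$$\|x\|_{1,2}^2 \;=\; \sum_{g \in \mathcal{G}} \|x_g\|_1^2, \qquad \mathrm{prox}_{\rho\|\cdot\|_{1,2}^2}(v) \;=\; \operatorname*{argmin}_{x}\Big\{ \tfrac{1}{2}\|x - v\|_2^2 + \rho \sum_{g \in \mathcal{G}} \|x_g\|_1^2 \Big\},$$
and the closed-form proximal mapping and generalized Jacobian mentioned in the abstract concern a group-separable mapping of this form (possibly up to a different scaling or normalization of the penalty).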
joint work with Vincent Roulet, Sham Kakade, Zaid Harchaoui
We consider the problem of optimizing the parameters of a maximum-margin structured prediction model at large scale. We show how an appropriate smoothing of such non-smooth objectives by infimal convolution paves the way for large-scale smooth optimization algorithms that make calls to smooth inference oracles, comparable in complexity to the regular inference oracles. We present such an algorithm, called Casimir, establish its worst-case information-based complexity, and demonstrate its effectiveness on several machine learning applications.
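As background, a brief sketch of infimal-convolution smoothing in generic notation; the symbols $h$, $\omega$, $\mu$ are illustrative and not taken from the paper. A non-smooth convex function $h$ can be smoothed with a strongly convex function $\omega$ and a parameter $\mu > 0$ via
$$h_\mu(z) \;=\; \max_{u}\big\{ \langle u, z\rangle - h^*(u) - \mu\,\omega(u) \big\} \;=\; \big(h \,\square\, (\mu\omega)^*\big)(z),$$
which is differentiable with a Lipschitz-continuous gradient. When $h$ is a maximum over structured outputs, choosing $\omega$ to be a negative-entropy term turns the max (inference) oracle into a log-sum-exp, that is, a smooth inference oracle of comparable cost; the paper makes this construction precise for maximum-margin structured prediction.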