joint work with Ebrahim Sarabi
In this talk we present applications of second-order variational analysis and generalized differentiation to establish the superlinear convergence of SQP methods for general problems of conic programming. For the basic SQP method, primal-dual superlinear convergence is derived under an appropriate second-order condition together with a stability/calmness property that holds automatically for nonlinear programs. For quasi-Newton SQP methods, primal superlinear convergence is obtained under the Dennis-Moré condition for constrained optimization.
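As a rough illustration of a single basic SQP step (on a toy equality-constrained problem chosen here for illustration only, not the general conic-programming setting of the talk), the following Python sketch solves each QP subproblem through its KKT linear system; the problem data and names such as lag_hess are our own assumptions.

import numpy as np

# Toy problem (illustrative only): minimize (x1 - 2)^2 + x2^2
# subject to x1^2 + x2^2 - 1 = 0; solution is x* = (1, 0), lam* = 1.
f_grad = lambda x: np.array([2.0 * (x[0] - 2.0), 2.0 * x[1]])
c = lambda x: np.array([x[0] ** 2 + x[1] ** 2 - 1.0])
c_jac = lambda x: np.array([[2.0 * x[0], 2.0 * x[1]]])

def lag_hess(x, lam):
    # Hessian of the Lagrangian L(x, lam) = f(x) + lam * c(x) for this toy problem
    return np.diag([2.0, 2.0]) + lam[0] * np.diag([2.0, 2.0])

x, lam = np.array([0.5, 0.5]), np.array([0.0])
for k in range(10):
    H, A = lag_hess(x, lam), c_jac(x)
    # The QP subproblem amounts to one Newton step on the KKT system; its
    # solution gives the primal step d and the next multiplier estimate.
    KKT = np.block([[H, A.T], [A, np.zeros((1, 1))]])
    rhs = np.concatenate([-f_grad(x), -c(x)])
    sol = np.linalg.solve(KKT, rhs)
    d, lam = sol[:2], sol[2:]
    x = x + d
    print(k, x, np.linalg.norm(d))

Run near a nondegenerate solution with full steps, the printed step lengths shrink superlinearly; the talk's results concern when such behaviour can be guaranteed in the far more general conic setting.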
joint work with Clarice Poon
In this talk, we will present a geometry-based adaptive acceleration framework for speeding up first-order methods applied to non-smooth optimisation. The framework yields algorithms that automatically adjust themselves to the underlying geometry of the optimisation problem and to the structure of the first-order method. State-of-the-art performance is obtained on various non-smooth optimisation problems and first-order methods, particularly non-descent-type methods.
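To convey the flavour of trajectory-based acceleration, here is a minimal sketch on a toy LASSO instance: proximal gradient steps with an occasional safeguarded extrapolation along the iterate trajectory. This crude scheme and all constants in it are our own simplification, not the adaptive framework of the talk.

import numpy as np

# Toy LASSO (illustrative only): minimize 0.5*||Ax - b||^2 + mu*||x||_1
rng = np.random.default_rng(0)
A, b, mu = rng.standard_normal((40, 100)), rng.standard_normal(40), 0.1
step = 1.0 / np.linalg.norm(A, 2) ** 2          # 1/L for the smooth part
soft = lambda v, t: np.sign(v) * np.maximum(np.abs(v) - t, 0.0)
obj = lambda z: 0.5 * np.sum((A @ z - b) ** 2) + mu * np.abs(z).sum()

x, x_prev = np.zeros(100), np.zeros(100)
for k in range(300):
    # one proximal gradient (forward-backward) step
    x_prev, x = x, soft(x - step * A.T @ (A @ x - b), step * mu)
    if k > 0 and k % 10 == 0:
        # extrapolate along the recent trajectory direction, with a safeguard
        cand = x + 5.0 * (x - x_prev)
        if obj(cand) < obj(x):
            x = cand
print(obj(x))

The adaptive framework of the talk goes much further, exploiting the local geometry of the problem to choose the extrapolation automatically.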
Splitting methods are an important and widely used class of approaches for solving feasibility problems. The behaviour of these methods is reasonably well understood in the convex setting, and they have recently been applied with success to various nonconvex instances, although the theoretical justification in this latter setting is far from complete. We will examine the global convergence of several popular splitting methods (including the Douglas-Rachford and Peaceman-Rachford splitting methods) for general nonconvex nonsmooth optimization problems.
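For readers unfamiliar with the iteration, the following sketch applies Douglas-Rachford splitting to a small nonconvex feasibility problem of our own choosing (a circle and a line); it only illustrates the update rule and makes no claim about the convergence theory discussed in the talk.

import numpy as np

# Toy nonconvex feasibility problem (illustrative only): find a point in
#   A = unit circle {z : ||z|| = 1}   and   B = line {z : z[1] = 0.5}.
proj_A = lambda z: z / np.linalg.norm(z)      # nearest point on the circle
proj_B = lambda z: np.array([z[0], 0.5])      # nearest point on the line

z = np.array([2.0, -1.0])
for k in range(50):
    a = proj_A(z)
    # Douglas-Rachford update: z <- z + P_B(2*P_A(z) - z) - P_A(z);
    # the Peaceman-Rachford variant instead composes the two reflections.
    z = z + proj_B(2.0 * a - z) - a
print(proj_A(z))   # the "shadow" point, expected to lie near the intersection of A and B

In the convex case such iterations are well understood; the talk addresses what can still be said globally when, as here, one of the sets (the circle) is nonconvex.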