joint work with Guoyin Li and Ting Kei Pong
The Kurdyka-{\L}ojasiewicz (KL) exponent plays an important role in determining the local convergence rates of many first-order methods, which are widely used to solve large-scale optimization problems arising in fields such as compressed sensing and machine learning. However, the KL exponent is hard to estimate explicitly, and few explicit estimates are known. In this talk, we study when inf-projection preserves the KL exponent. This enables us to deduce the KL exponent of some SDP-representable functions and of some nonconvex models such as least squares with a rank constraint.
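For context, one common formulation of the KL property with exponent $\alpha \in [0,1)$ at a point $\bar{x} \in \operatorname{dom}\partial f$ (the symbols $c$, $\epsilon$ and $\nu$ below are illustrative notation, not fixed by the abstract): there exist $c, \epsilon > 0$ and $\nu \in (0,\infty]$ such that
\[
  \operatorname{dist}\bigl(0, \partial f(x)\bigr) \;\ge\; c\,\bigl(f(x) - f(\bar{x})\bigr)^{\alpha}
\]
whenever $\|x - \bar{x}\| \le \epsilon$ and $f(\bar{x}) < f(x) < f(\bar{x}) + \nu$. Roughly speaking, the exponent translates into a local rate: $\alpha = 1/2$ typically yields local linear convergence for many first-order methods.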
joint work with Duan Li
We present a new solution framework for the generalized trust region subproblem (GTRS) of minimizing a quadratic objective subject to a quadratic constraint. More specifically, for the cases under our investigation, we derive a convex reformulation of the GTRS as the minimization of the maximum of two convex quadratic functions. We develop two algorithms based on different line search rules and prove global sublinear convergence rates for both. The first algorithm further admits a local linear convergence rate, which we obtain by estimating the Kurdyka-{\L}ojasiewicz exponent at any optimal solution.
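For reference, the GTRS in a standard form (the matrices and vectors below are illustrative notation; the abstract itself does not fix symbols):
\[
  \min_{x \in \mathbb{R}^n}\; x^{T}A x + 2a^{T}x
  \quad \text{s.t.} \quad x^{T}B x + 2b^{T}x + c \le 0,
\]
where $A$ and $B$ are symmetric and possibly indefinite, so neither the objective nor the constraint need be convex. The reformulation referred to above takes the form $\min_{x} \max\{q_1(x), q_2(x)\}$ for two convex quadratic functions $q_1$ and $q_2$.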
Metric subregularity of subdifferentials is an important property for studying the linear convergence of many first-order methods for solving convex and nonconvex optimization problems. This talk reviews recent achievements and presents some new developments in this direction. Moreover, we will show how metric subregularity of subdifferentials can play a significant role in investigating the well-posedness of optimization problems via second-order variational analysis. Applications to structured optimization problems such as $\ell_1$-regularized, nuclear norm regularized, and Poisson regularized problems will be discussed.
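For context, one standard formulation of the property (the symbols $\kappa$ and $\epsilon$ below are illustrative notation): the subdifferential mapping $\partial f$ is metrically subregular at $\bar{x}$ for $\bar{v} \in \partial f(\bar{x})$ if there exist $\kappa, \epsilon > 0$ such that
\[
  \operatorname{dist}\bigl(x, (\partial f)^{-1}(\bar{v})\bigr)
  \;\le\; \kappa\, \operatorname{dist}\bigl(\bar{v}, \partial f(x)\bigr)
  \quad \text{whenever } \|x - \bar{x}\| \le \epsilon.
\]
With $\bar{v} = 0$, this bounds the distance to the set of stationary points by a subgradient residual, which is the usual mechanism behind linear convergence results for first-order methods.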