Mon.2 13:45–15:00 | H 1012 | SPA

Nonlinear Optimization Algorithms and Applications in Data Analysis (1/3)

Chair: Xin Liu
Organizers: Cong Sun, Xin Liu
13:45

Yaxiang Yuan

joint work with Xin Liu, Zaiwen Wen

Efficient Optimization Algorithms For Large-scale Data Analysis

This talk first reviews two efficient Newton-type methods that exploit problem structure: 1) semi-smooth Newton methods for composite convex programs and their application to large-scale semidefinite programming problems and machine learning; 2) an adaptive regularized Newton method for Riemannian optimization. Next, a few parallel optimization approaches are discussed: 1) a parallel subspace correction method for a class of composite convex programs; 2) a parallelizable approach for linear eigenvalue problems; 3) a parallelizable approach for optimization problems with orthogonality constraints.
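To make the regularized Newton idea concrete, here is a minimal, hypothetical sketch (not the speakers' implementation): it solves the shifted system $(H+\sigma I)d = -g$ and adapts the regularization $\sigma$ with a trust-region-style ratio test. All names and the adaptation constants are illustrative assumptions.

```python
import numpy as np

def adaptive_regularized_newton(f, grad, hess, x, sigma=1.0, tol=1e-8, max_iter=100):
    """Sketch of an adaptive regularized Newton iteration (illustrative only).

    Solves (H + sigma*I) d = -g at each step and adjusts sigma by
    comparing the actual decrease to the decrease predicted by the
    quadratic model (a trust-region-style ratio test).
    """
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        H = hess(x)
        d = np.linalg.solve(H + sigma * np.eye(len(x)), -g)
        pred = -(g @ d + 0.5 * d @ (H @ d))   # predicted decrease of the model
        actual = f(x) - f(x + d)
        rho = actual / pred if pred > 0 else -1.0
        if rho > 0.75:        # model is good: accept step, relax regularization
            x = x + d
            sigma = max(0.5 * sigma, 1e-12)
        elif rho > 0.1:       # acceptable: accept step, keep sigma
            x = x + d
        else:                 # poor model: reject step, strengthen regularization
            sigma *= 4.0
    return x
```

In the Riemannian setting of item 2), the step would additionally be computed in the tangent space and mapped back to the manifold by a retraction; that machinery is omitted here.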

14:10

Liping Wang

Robust ordinal regression induced by $l_p$-centroid

We propose a novel type of class centroid derived from the $l_p$-norm (coined the $l_p$-centroid) to overcome the drawbacks of conventional class centroids, and provide an optimization algorithm and a corresponding convergence analysis for computing the $l_p$-centroid. To evaluate the effectiveness of the $l_p$-centroid against noise in the ordinal regression (OR) context, we then combine the $l_p$-centroid with two representative class-center-induced OR methods. Finally, extensive OR experiments on synthetic and real-world datasets demonstrate the effectiveness and superiority of the proposed methods over related existing methods.
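The abstract does not spell out the definition, but assuming the $l_p$-centroid of a class is the minimizer of $\sum_i \|x_i - c\|_p^p$ (a common choice for $1 \le p < 2$), a minimal iteratively-reweighted sketch, not the paper's algorithm, could look like this:

```python
import numpy as np

def lp_centroid(X, p=1.5, max_iter=100, eps=1e-8, tol=1e-10):
    """Sketch: approximate argmin_c sum_i ||x_i - c||_p^p by
    iteratively reweighted least squares (IRLS).  Each coordinate's
    update is a weighted mean with per-entry weights |x_ij - c_j|^(p-2);
    eps guards the weights when a residual is near zero.  Illustrative
    only -- the paper's algorithm and convergence analysis may differ.
    """
    c = X.mean(axis=0)                            # start from the ordinary mean
    for _ in range(max_iter):
        w = (np.abs(X - c) + eps) ** (p - 2)      # IRLS weights
        c_new = (w * X).sum(axis=0) / w.sum(axis=0)
        if np.linalg.norm(c_new - c) < tol:
            break
        c = c_new
    return c
```

For $p \to 1$ the iterates tend toward the coordinate-wise median, which is robust to outliers, while $p = 2$ keeps the ordinary mean; this interpolation is what makes an $l_p$-type centroid attractive in the presence of noise.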

14:35

Cong Sun

joint work with Jinpeng Liu, Yaxiang Yuan

New gradient method with adaptive stepsize update strategy

In this work, a new stepsize update strategy for the gradient method is proposed. On one hand, the stepsizes are updated in a cyclic way, where Cauchy steps are combined with fixed step lengths; on the other hand, the number of iterations in each cycle is adjusted adaptively according to the gradient residual. Theoretically, the new method terminates in 7 iterations for 3-dimensional convex quadratic minimization problems; for general high-dimensional problems, it converges R-linearly. Numerical results show the superior performance of the proposed method over the state of the art.
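As a rough illustration of the cyclic structure (the paper's precise update and adaptation rules may differ), the sketch below applies the idea to a convex quadratic $f(x) = \frac{1}{2}x^\top Ax - b^\top x$: each cycle takes one exact Cauchy step and then reuses that stepsize for a few fixed-step iterations, with the cycle length adapted from the observed gradient reduction. The adaptation thresholds are hypothetical.

```python
import numpy as np

def cyclic_gradient(A, b, x, m=3, tol=1e-8, max_iter=10000):
    """Sketch of a cyclic gradient method for min 0.5*x'Ax - b'x.

    Each cycle starts with one exact Cauchy (steepest-descent) step,
    then reuses that stepsize as a fixed step length for m further
    iterations; m is adapted from the gradient residual.  Conveys the
    structure only, not the paper's exact method.
    """
    g = A @ x - b
    it = 0
    while np.linalg.norm(g) > tol and it < max_iter:
        alpha = (g @ g) / (g @ (A @ g))        # Cauchy stepsize
        prev_norm = np.linalg.norm(g)
        for _ in range(1 + m):                 # one Cauchy step + m fixed steps
            x = x - alpha * g
            g = A @ x - b
            it += 1
            if np.linalg.norm(g) <= tol or it >= max_iter:
                break
        # adapt the cycle length: lengthen it when the gradient residual
        # drops quickly, shorten it otherwise (thresholds are assumptions)
        m = min(m + 1, 10) if np.linalg.norm(g) < 0.1 * prev_norm else max(m - 1, 1)
    return x
```

Reusing a Cauchy stepsize as a fixed step length is what breaks the zigzagging of plain steepest descent; the adaptive cycle length then balances the cost of exact line searches against that acceleration.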