Tue.1 10:30–11:45 | H 0105 | NON

Nonlinear and Stochastic Optimization (3/4)

Chair: Albert Berahas | Organizers: Albert Berahas, Geovani Grapiglia
10:30

Clément Royer

joint work with Stephen Wright, Michael O'Neill

Complexity guarantees for practical second-order algorithms

We discuss methods for smooth unconstrained optimization that converge to approximate second-order necessary points with guaranteed complexity. These methods perform line searches along two types of directions: approximate Newton directions and directions of negative curvature of the Hessian. They require only evaluations of the function, its gradient, and products of the Hessian with an arbitrary vector. The methods have good practical performance on general problems as well as best-in-class complexity guarantees.
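A minimal sketch of the step-selection logic the abstract describes, not the authors' implementation: take a Newton-type step when the Hessian appears sufficiently positive definite, otherwise follow a negative curvature direction, each followed by a backtracking line search. The tolerances and the dense eigendecomposition below are illustrative assumptions; a practical matrix-free variant would extract curvature information from Hessian-vector products via Lanczos or conjugate gradients.

```python
import numpy as np

def second_order_step(f, grad, hess_vec, x, eps=1e-6, beta=0.5, c=1e-4):
    """One iteration: Newton-type step if the Hessian is sufficiently
    positive definite, else a negative curvature step, each with a
    backtracking line search enforcing sufficient decrease."""
    g = grad(x)
    n = x.size
    # Form the Hessian column by column from Hessian-vector products
    # (illustration only; matrix-free methods would avoid this).
    H = np.column_stack([hess_vec(x, e) for e in np.eye(n)])
    lams, V = np.linalg.eigh(H)
    lam_min, v = lams[0], V[:, 0]
    if lam_min > eps:
        d = np.linalg.solve(H, -g)     # approximate Newton direction
        slope = g @ d                  # linear decrease model
    else:
        d = v if g @ v <= 0 else -v    # negative curvature direction
        slope = -abs(lam_min)          # expect O(t^2) decrease from curvature
    t = 1.0
    while f(x + t * d) > f(x) + c * (t * slope if lam_min > eps
                                     else t**2 * slope):
        t *= beta
        if t < 1e-12:
            break
    return x + t * d
```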

10:55

Katya Scheinberg

New effective nonconvex approximations to expected risk minimization

In this talk we will present a novel approach to empirical risk minimization. We will show that in the case of linear predictors, the expected error and the expected ranking loss can be effectively approximated by smooth functions whose closed-form expressions, and those of their first (and second) order derivatives, depend on the first and second moments of the data distribution, which can be precomputed. This results in a surprisingly effective approach to linear classification. We compare our method to state-of-the-art approaches such as logistic regression.
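A hypothetical illustration of the idea, under an assumption the abstract does not state: if the data are modeled through their first and second moments as class-conditional Gaussians, the expected misclassification error of a linear predictor has a smooth closed form via the normal CDF, so it can be minimized directly from precomputed moments. The formula below is a standard Gaussian computation, not necessarily the authors' exact approximation.

```python
import numpy as np
from scipy.stats import norm
from scipy.optimize import minimize

def expected_error(w, mu_pos, cov_pos, mu_neg, cov_neg):
    """Smooth expected error of the classifier sign(w @ x), assuming
    class-conditional Gaussian data with the given precomputed moments."""
    m_pos = w @ mu_pos / np.sqrt(w @ cov_pos @ w)  # normalized margin, +1 class
    m_neg = w @ mu_neg / np.sqrt(w @ cov_neg @ w)  # normalized margin, -1 class
    # P(error) = P(margin < 0 | +1) and P(margin > 0 | -1), equal priors.
    return 0.5 * (norm.cdf(-m_pos) + norm.cdf(m_neg))

# Usage: estimate the moments once, then run any smooth (nonconvex) solver.
rng = np.random.default_rng(0)
Xp = rng.normal(1.0, 1.0, size=(200, 5))    # synthetic +1 class
Xn = rng.normal(-1.0, 1.5, size=(200, 5))   # synthetic -1 class
args = (Xp.mean(0), np.cov(Xp.T), Xn.mean(0), np.cov(Xn.T))
res = minimize(expected_error, x0=np.ones(5), args=args)
print(res.x, expected_error(res.x, *args))
```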

11:20

Daniel Robinson

Accelerated Methods for Problems with Group Sparsity

I discuss an optimization framework for solving problems with sparsity-inducing regularization. Such regularizers include the Lasso (L1), the group Lasso, and the latent group Lasso. The framework computes iterates by optimizing over low-dimensional subspaces, thus keeping the cost per iteration relatively low. Theoretical convergence results and numerical tests on various learning problems will be presented.
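A minimal sketch of the mechanism behind the low per-iteration cost, not the speaker's algorithm: the group-Lasso proximal operator (block soft-thresholding) zeroes out whole groups of variables, and subsequent computation can then be restricted to the low-dimensional subspace spanned by the surviving groups. The variable names and the toy data are illustrative.

```python
import numpy as np

def prox_group_lasso(v, groups, lam):
    """Prox of lam * sum_g ||v_g||_2: shrink each group's norm by lam,
    setting the whole group to zero when its norm is below lam."""
    out = np.zeros_like(v)
    for g in groups:
        norm_g = np.linalg.norm(v[g])
        if norm_g > lam:
            out[g] = (1.0 - lam / norm_g) * v[g]
    return out

# Usage: after a gradient step, the prox identifies the active subspace.
v = np.array([0.1, -0.2, 3.0, 1.0, 0.05])
groups = [np.array([0, 1]), np.array([2, 3]), np.array([4])]
z = prox_group_lasso(v, groups, lam=0.5)
active = [g for g in groups if np.any(z[g] != 0)]  # subspace to optimize over
print(z, active)
```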