Mon.2 13:45–15:00 | H 0105 | NON

Nonlinear and Stochastic Optimization (2/4)

Chair: Albert Berahas | Organizers: Albert Berahas, Geovani Grapiglia
13:45

Alejandra Peña-Ordieres

joint work with James Luedtke, Andreas Wächter

A Nonlinear Programming Reformulation of Chance Constraints

In this talk, we introduce a method for solving nonlinear continuous optimization problems with chance constraints. Our method is based on a reformulation of the probabilistic constraint as a smooth sample-based quantile function. We provide theoretical statistical guarantees of the approximation. Furthermore, we propose an Sl1QP-type trust-region method to solve instances with joint chance constraints. We show that the method scales well in the number of samples and that the smoothing can be used to counteract the bias in the chance constraint approximation induced by the sample.
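The core idea of the reformulation is that the chance constraint P(g(x, ξ) ≤ 0) ≥ 1 − α holds exactly when the (1 − α)-quantile of g(x, ξ) is at most zero, and that an empirical quantile built from samples can be smoothed to make it differentiable. The sketch below illustrates one such smoothing (a sigmoid in place of the empirical step function); the specific smoothing, function names, and parameters here are illustrative assumptions, not the construction from the talk.

```python
import numpy as np

def smoothed_quantile(values, alpha, eps=0.1):
    """Smooth approximation of the (1 - alpha)-quantile of `values`.

    The empirical CDF's step function is replaced by a sigmoid of
    width `eps`, making the quantile differentiable in the sample
    values. (Illustrative smoothing choice, not the paper's exact one.)
    """
    lo, hi = values.min() - 1.0, values.max() + 1.0
    target = 1.0 - alpha
    # Bisect for t such that the smoothed fraction of values <= t is 1 - alpha.
    for _ in range(100):
        t = 0.5 * (lo + hi)
        frac = np.mean(1.0 / (1.0 + np.exp(-(t - values) / eps)))
        if frac < target:
            lo = t
        else:
            hi = t
    return t

rng = np.random.default_rng(0)
samples = rng.normal(size=10_000)      # plays the role of g(x, xi_i) at fixed x
q = smoothed_quantile(samples, alpha=0.05)
# Enforcing q <= 0 as a smooth constraint then approximates
# P(g(x, xi) <= 0) >= 0.95 at this x.
```

For a standard normal sample the smoothed quantile lands near the true 0.95-quantile (about 1.64), with a small bias introduced by the smoothing width — the bias the abstract refers to counteracting.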

14:10

Stefan Wild

joint work with Matt Menickelly

Manifold Sampling for Robust Calibration

We adapt a manifold sampling algorithm for the composite nonsmooth, nonconvex optimization formulations of model calibration that arise when imposing robustness to outliers present in training data. We demonstrate the approach on objectives based on trimmed loss and highlight the challenges resulting from the nonsmooth, nonconvex paradigm needed for robust learning. Initial results demonstrate that the method has favorable scaling properties. Savings in time on large-scale problems arise at the expense of not certifying global optimality in empirical studies, but the method also extends to cases where the loss is determined by a black-box oracle.
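A trimmed loss keeps only the smallest residuals, which is what makes the objective both robust to outliers and nonsmooth/nonconvex: the set of trimmed points changes discontinuously with the model parameters. A minimal sketch, with a function name and interface of our own choosing rather than the paper's:

```python
import numpy as np

def trimmed_loss(residuals, k):
    """Sum of the k smallest squared residuals.

    Discarding the largest residuals makes calibration robust to
    outliers, but the objective is nonsmooth and nonconvex because
    which points are trimmed depends on the parameters.
    (Illustrative sketch, not the paper's implementation.)
    """
    sq = np.sort(residuals ** 2)
    return sq[:k].sum()

r = np.array([0.1, -0.2, 0.05, 8.0])   # last residual is an outlier
full = (r ** 2).sum()                  # dominated by the outlier
trimmed = trimmed_loss(r, k=3)         # outlier discarded
```

Here the full least-squares loss is swamped by the single outlier, while the trimmed loss reflects only the inliers.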

14:35

Nikita Doikov

Complexity of cubically regularized Newton method for minimizing uniformly convex functions

In this talk we discuss the iteration complexity of the cubically regularized proximal Newton method for solving composite minimization problems with a uniformly convex objective. We introduce the notion of a second-order condition number of a certain degree and establish a linear rate of convergence in the nondegenerate case. The algorithm automatically achieves the best possible global complexity bound among the problem classes of functions whose smooth part has a Hölder continuous Hessian. This research was supported by ERC Advanced Grant 788368.
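For context, the classical cubically regularized Newton step of Nesterov and Polyak, which the proximal variant discussed here builds on, minimizes a second-order model augmented with a cubic penalty (with M a regularization parameter tied to the Hessian's continuity modulus):

```latex
x_{k+1} = \arg\min_{y} \;
  \Big\{ \langle \nabla f(x_k), y - x_k \rangle
  + \tfrac{1}{2} \langle \nabla^2 f(x_k)(y - x_k), y - x_k \rangle
  + \tfrac{M}{6} \, \| y - x_k \|^3 \Big\}.
```

In the composite setting of the talk, the nonsmooth part of the objective is kept inside this subproblem in proximal fashion rather than linearized.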