Tue.1 10:30–11:45 | H 2038 | DER

Derivative-Free Optimization Under Uncertainty (1/2)

Chair: Stefan Wild
Organizers: Stefan Wild, Ana Luisa Custodio, Francesco Rinaldi, Margherita Porcelli, Sébastien Le Digabel
10:30

Matt Menickelly

joint work with Stefan Wild

Formulations and Methods for Derivative-Free Optimization Under Uncertainty

In this talk, we will discuss progress in incorporating derivative-free techniques into various paradigms of optimization under uncertainty. In particular, we will consider a novel outer-approximation method for derivative-free robust optimization, with particular application to so-called implementation-error problems. We will also investigate the application of a particular derivative-free method for nonsmooth optimization (manifold sampling) to robust data fitting with least trimmed estimators, as well as to a feasibility restoration procedure for chance-constrained optimization.
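To make the least trimmed estimator mentioned above concrete, here is a minimal sketch of its objective: the sum of the h smallest squared residuals, which discards the n - h largest residuals as potential outliers. The function name and the trim count h are illustrative choices, not from the talk; the sorting step is also what makes this loss nonsmooth, motivating nonsmooth derivative-free methods such as manifold sampling.

```python
import numpy as np

def least_trimmed_loss(residuals, h):
    """Least trimmed squares objective: sum the h smallest squared
    residuals, ignoring the n - h largest (potential outliers).
    Sorting makes this robust but nonsmooth in the model parameters."""
    r2 = np.sort(np.square(np.asarray(residuals, dtype=float)))
    return float(np.sum(r2[:h]))

# A fit with one gross outlier: trimming h = 3 of 4 residuals
# ignores the outlier, while the ordinary sum of squares does not.
r = np.array([0.1, -0.2, 0.15, 10.0])
trimmed = least_trimmed_loss(r, h=3)   # small: outlier dropped
full = least_trimmed_loss(r, h=4)      # dominated by the outlier
```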

10:55

Zaikun Zhang

joint work with Serge Gratton, Luis Nunes Vicente

Trust region methods based on inexact gradient information

We discuss the behavior of trust-region methods assuming that the objective function is smooth but the available gradient information is inaccurate. We show that the trust-region method is quite robust with respect to gradient inaccuracy: it converges even if the gradient is evaluated with only one correct significant digit and encounters random failures with positive probability. The worst-case complexity of the method is essentially the same as when the gradient evaluation is accurate.
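The robustness claimed above can be illustrated with a toy sketch, not the authors' algorithm or analysis: a basic trust-region loop taking Cauchy (steepest-descent) steps, where every gradient evaluation carries relative error and occasionally fails completely, returning a random direction. All names, the error model, and the acceptance thresholds below are illustrative assumptions; the ratio test rejects steps that the corrupted gradient mispredicts, and the radius update absorbs the failures.

```python
import numpy as np

def noisy_grad(grad, x, rel_err=0.05, fail_prob=0.1, rng=None):
    """Illustrative inexact-gradient oracle: componentwise relative error,
    plus occasional total failures returning a random direction."""
    rng = rng if rng is not None else np.random.default_rng()
    g = grad(x)
    if rng.random() < fail_prob:
        return rng.standard_normal(g.shape)  # complete failure
    return g * (1.0 + rel_err * rng.uniform(-1.0, 1.0, g.shape))

def trust_region(f, grad, x0, delta0=1.0, max_iter=200, rng=None):
    """Basic trust-region method using only inexact gradients.
    Takes the Cauchy step of a linear model; accepts or rejects via the
    actual-vs-predicted reduction ratio and adapts the radius."""
    rng = rng if rng is not None else np.random.default_rng(0)
    x, delta = np.asarray(x0, dtype=float), delta0
    for _ in range(max_iter):
        g = noisy_grad(grad, x, rng=rng)
        gnorm = np.linalg.norm(g)
        if gnorm < 1e-8:
            break
        step = -(delta / gnorm) * g        # Cauchy step to the boundary
        pred = delta * gnorm               # linear model's predicted decrease
        rho = (f(x) - f(x + step)) / pred  # actual vs. predicted reduction
        if rho > 0.1:                      # accept the step
            x = x + step
        # expand on very successful steps, shrink on rejected ones
        delta = 2 * delta if rho > 0.75 else (0.5 * delta if rho < 0.1 else delta)
    return x
```

Despite the corrupted gradients, running this on a smooth convex test function (e.g. f(x) = ||x||^2) drives the iterates toward the minimizer, because rejected steps shrink the radius until even an inaccurate gradient predicts the decrease well enough.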

11:20

Morteza Kimiaei

joint work with Arnold Neumaier

Efficient black box optimization with complexity guarantees

I will report on recent joint work with Arnold Neumaier on derivative-free unconstrained optimization algorithms that both have high-quality complexity guarantees for finding a local minimizer and are highly competitive with state-of-the-art derivative-free solvers.