joint work with Stefan Wild
In this talk, we will discuss progress made in incorporating derivative-free techniques into various paradigms of optimization under uncertainty. In particular, we will consider a novel outer approximations method for derivative-free robust optimization, with application to so-called implementation error problems. We will also investigate the application of a derivative-free method for nonsmooth optimization (manifold sampling) to robust data fitting with least trimmed estimators, as well as to a feasibility restoration procedure for chance-constrained optimization.
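For context, the least trimmed squares estimator solves $\min_x \sum_{i=1}^{h} r_{(i)}(x)^2$, where $r_{(1)}(x) \le \cdots \le r_{(n)}(x)$ are the residuals ordered by absolute value and $h < n$ is the trimming parameter; discarding the $n-h$ largest residuals makes the fit robust to outliers but renders the objective nonsmooth, which is where a method such as manifold sampling comes into play.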
joint work with Serge Gratton, Luis Nunes Vicente
We discuss the behavior of trust-region methods assuming the objective function is smooth yet the available gradient information is inaccurate. We show that the trust-region method is quite robust with respect to gradient inaccuracy: it converges even if the gradient is evaluated with only one correct significant digit and gradient evaluations fail at random with positive probability. The worst-case complexity of the method is essentially the same as when the gradient evaluation is accurate.
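As a toy illustration of this setting (not the method analyzed in the talk), the sketch below runs a simple gradient-based trust-region loop on the Rosenbrock function with an inexact gradient oracle: each evaluation carries up to 50% relative error, roughly one correct significant digit, and fails outright with probability 0.1. The test function, noise model, and step rule are all illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)

def f(x):  # smooth test objective (Rosenbrock)
    return 100.0 * (x[1] - x[0]**2)**2 + (1.0 - x[0])**2

def grad(x):  # exact gradient, used only inside the inexact oracle
    return np.array([-400.0 * x[0] * (x[1] - x[0]**2) - 2.0 * (1.0 - x[0]),
                     200.0 * (x[1] - x[0]**2)])

def noisy_grad(x, rel_err=0.5, fail_prob=0.1):
    # inexact oracle: ~one correct significant digit, occasional failure
    if rng.random() < fail_prob:
        return rng.standard_normal(2)  # failed evaluation returns junk
    return grad(x) * (1.0 + rel_err * rng.uniform(-1.0, 1.0, size=2))

x = np.array([-1.2, 1.0])
delta = 1.0  # trust-region radius
for k in range(2000):
    g = noisy_grad(x)
    s = -delta * g / np.linalg.norm(g)  # steepest-descent step to the boundary
    if f(x + s) < f(x):  # accept on simple decrease, expand the region
        x, delta = x + s, min(2.0 * delta, 1.0)
    else:  # reject the step and shrink the region
        delta *= 0.5
    if delta < 1e-12:
        break
print(f"final f(x) = {f(x):.3e} at x = {x}")

Despite the crude oracle, simple decrease tests and radius updates of this kind typically keep driving the objective down, which is the kind of robustness the talk quantifies.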
joint work with Arnold Neumaier
I'll report on recent joint work with Morteza Kimiaei on derivative-free unconstrained optimization algorithms that both have high-quality complexity guarantees for finding a local minimizer and are highly competitive with state-of-the-art derivative-free solvers.