joint work with Jianyuan Zhai
The use of high-fidelity simulations is common in computer-aided design, but the lack of analytical equations creates challenges. One main challenge in derivative-free optimization is the variability of the incumbent solution when (a) the problem is stochastic or (b) different initial samples or surrogate models are used to guide the search. We propose to mitigate this lack of robustness in globally convergent behavior through a surrogate-based spatial branch-and-bound method. We derive bounds for several surrogate models and use them to partition and prune the search space for faster convergence.
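The abstract does not spell out the implementation, but the partition-and-prune idea can be illustrated with a small sketch: given a (here fixed, quadratic) surrogate, compute an interval-arithmetic lower bound of the surrogate over each box and discard any box whose bound cannot improve on the incumbent. The quadratic surrogate, the bounding scheme, and the tolerance below are illustrative assumptions in Python, not the method presented in the talk.

    import numpy as np

    def quad_surrogate(x, A, b, c):
        """Quadratic surrogate model q(x) = x^T A x + b^T x + c."""
        return x @ A @ x + b @ x + c

    def quad_lower_bound(lo, hi, A, b, c):
        """Valid (if loose) interval lower bound of the surrogate over the box [lo, hi]."""
        lb = c + np.sum(np.minimum(b * lo, b * hi))            # linear terms, bounded exactly
        for i in range(len(lo)):                               # quadratic terms, bounded corner-wise
            for j in range(len(lo)):
                corners = A[i, j] * np.array([lo[i] * lo[j], lo[i] * hi[j],
                                              hi[i] * lo[j], hi[i] * hi[j]])
                lb += corners.min()
        return lb

    def branch_and_bound(lo, hi, A, b, c, tol=1e-3):
        """Partition [lo, hi] into boxes, pruning those whose bound exceeds the incumbent."""
        x_inc = (lo + hi) / 2.0
        f_inc = quad_surrogate(x_inc, A, b, c)
        stack = [(lo, hi)]
        while stack:
            lo_k, hi_k = stack.pop()
            if quad_lower_bound(lo_k, hi_k, A, b, c) > f_inc - tol:
                continue                                       # prune: no better point in this box
            mid = (lo_k + hi_k) / 2.0
            f_mid = quad_surrogate(mid, A, b, c)
            if f_mid < f_inc:
                x_inc, f_inc = mid, f_mid                      # update incumbent
            if np.max(hi_k - lo_k) > tol:                      # branch on the longest edge
                d = int(np.argmax(hi_k - lo_k))
                left_hi, right_lo = hi_k.copy(), lo_k.copy()
                left_hi[d] = right_lo[d] = mid[d]
                stack += [(lo_k, left_hi), (right_lo, hi_k)]
        return x_inc, f_inc

    # Example: minimize a convex quadratic surrogate over the box [-2, 2]^2
    A = np.array([[2.0, 0.3], [0.3, 1.0]])
    b = np.array([-1.0, 0.5])
    print(branch_and_bound(np.array([-2.0, -2.0]), np.array([2.0, 2.0]), A, b, 0.0))

In the actual method the surrogate is refit from simulation data and the bounds depend on the surrogate class; the sketch only shows the pruning mechanism on a fixed model.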
joint work with Gianluca Geraci, Michael S. Eldred, Hans-Joachim Bungartz, Youssef M. Marzouk
We present new developments for near-optimal noise control in the stochastic nonlinear derivative-free constrained optimization method SNOWPAC. SNOWPAC estimates measures of robustness or risk, appearing in the optimization objective and/or constraints, using Monte Carlo sampling. It also employs a derivative-free trust region approach that is regularized by the resulting sampling noise. We describe how to use Gaussian processes as control variates to decrease this noise in a nearly optimal fashion, thus improving the trust region approximation and the convergence of the method.
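As a rough illustration of the control-variate mechanism (not SNOWPAC's implementation), the Python sketch below fits a small Gaussian process to pilot samples of a toy response and corrects a plain Monte Carlo mean with it, using many cheap surrogate evaluations to estimate the GP's expectation. The toy response, kernel, and sample sizes are assumptions for illustration.

    import numpy as np

    rng = np.random.default_rng(0)

    def f(xi):
        """Toy stochastic response; we want E[f(xi)] for xi ~ N(0, 1)."""
        return np.sin(3 * xi) + 0.5 * xi**2

    def gp_fit(X, y, length=0.5, jitter=1e-6):
        """Fit a minimal RBF-kernel Gaussian process regressor (zero prior mean)."""
        K = np.exp(-0.5 * (X[:, None] - X[None, :])**2 / length**2) + jitter * np.eye(len(X))
        alpha = np.linalg.solve(K, y)
        def mean(Xs):
            Ks = np.exp(-0.5 * (Xs[:, None] - X[None, :])**2 / length**2)
            return Ks @ alpha
        return mean

    # Pilot samples used only to train the GP control variate
    xi_pilot = rng.normal(size=20)
    g = gp_fit(xi_pilot, f(xi_pilot))

    # Expensive Monte Carlo samples whose estimator noise we want to reduce
    xi = rng.normal(size=50)
    f_s, g_s = f(xi), g(xi)

    # E[g] approximated with many cheap evaluations of the GP mean
    E_g = g(rng.normal(size=100_000)).mean()

    # Control-variate coefficient chosen to (approximately) minimize the estimator variance
    beta = np.cov(f_s, g_s)[0, 1] / np.var(g_s, ddof=1)

    print("plain MC:          ", f_s.mean())
    print("GP control variate:", f_s.mean() - beta * (g_s.mean() - E_g))

The better the GP tracks the response, the larger the variance reduction, which is what tightens the noise estimates entering the trust region.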
joint work with Arnold Neumaier
We present a new algorithm for unconstrained noisy black-box optimization in high dimensions, based on low-rank quadratic models for efficiency and on techniques that guarantee good complexity results. A comparison of our algorithm with solvers from the literature will be given.
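The abstract leaves the model construction open; as one possible reading of "low-rank quadratic models", the Python sketch below fits a quadratic model only in a random low-dimensional subspace of a 100-dimensional noisy problem by regression and takes a regularized, radius-limited step of that model. The toy objective, subspace dimension, and step rule are assumptions for illustration, not the algorithm from the talk.

    import numpy as np

    rng = np.random.default_rng(1)

    def noisy_f(x, noise=1e-3):
        """Noisy black-box objective (toy: shifted sphere plus Gaussian noise)."""
        return np.sum((x - 1.0)**2) + noise * rng.standard_normal()

    def low_rank_quadratic_step(x, f, r=5, radius=1.0):
        """Fit a quadratic model in a random r-dimensional subspace and step towards its minimizer."""
        n = len(x)
        Q, _ = np.linalg.qr(rng.standard_normal((n, r)))        # orthonormal basis of the subspace
        n_par = 1 + r + r * (r + 1) // 2                        # constant, gradient, symmetric Hessian
        Z = radius * rng.uniform(-1, 1, size=(2 * n_par, r))    # regression points (subspace coordinates)
        y = np.array([f(x + Q @ z) for z in Z])
        # Least-squares fit of c + g^T z + 0.5 z^T H z to the noisy samples
        idx = [(i, j) for i in range(r) for j in range(i, r)]
        cols = [np.ones(len(Z))] + [Z[:, i] for i in range(r)]
        cols += [(0.5 if i == j else 1.0) * Z[:, i] * Z[:, j] for i, j in idx]
        coef, *_ = np.linalg.lstsq(np.stack(cols, axis=1), y, rcond=None)
        g, H = coef[1:1 + r], np.zeros((r, r))
        for k, (i, j) in enumerate(idx):
            H[i, j] = H[j, i] = coef[1 + r + k]
        # Regularize the model Hessian to be positive definite, take a radius-limited Newton step
        shift = 1e-2 + max(0.0, -np.linalg.eigvalsh(H).min())
        z = -np.linalg.solve(H + shift * np.eye(r), g)
        z *= min(1.0, radius / (np.linalg.norm(z) + 1e-12))
        return x + Q @ z

    x = np.zeros(100)
    print("initial objective:", noisy_f(x))
    for _ in range(40):
        x = low_rank_quadratic_step(x, noisy_f)
    print("final objective:  ", noisy_f(x))

A quadratic model with a rank-r Hessian needs only on the order of n*r coefficients instead of n^2, which is typically where the efficiency of low-rank models in high dimension comes from.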