joint work with Adilet Otemissov
We show that the scalability challenges of global optimisation (GO) algorithms can be overcome for functions with low effective dimensionality, which are constant along certain subspaces. We propose the use of random subspace embeddings within a(ny) global optimisation algorithm, extending Wang et al. (2013). Using tools from random matrix theory and conic integral geometry, we investigate the success rates of our low-dimensional embeddings of the original problem, in both a static and an adaptive formulation. We illustrate our algorithmic proposals and theoretical findings numerically.
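As a rough illustration of the embedding idea (a sketch, not the authors' implementation): draw a Gaussian matrix $A \in \mathbb{R}^{D \times d}$ and hand the reduced objective $g(y) = f(Ay)$ to an off-the-shelf global solver. The solver choice, box size, and test function below are illustrative assumptions.

    import numpy as np
    from scipy.optimize import differential_evolution

    def random_embedding_min(f, D, d, box=5.0, seed=0):
        # Draw a random Gaussian embedding A in R^{D x d} (Wang et al.-style).
        rng = np.random.default_rng(seed)
        A = rng.standard_normal((D, d))
        # Reduced objective on the d-dimensional subspace: g(y) = f(A y).
        g = lambda y: f(A @ y)
        # Any global optimisation algorithm can be used here; differential
        # evolution is just an off-the-shelf stand-in.
        res = differential_evolution(g, bounds=[(-box, box)] * d, seed=seed)
        return A @ res.x, res.fun

    # Toy problem: 100 ambient dimensions, effective dimensionality 2.
    f = lambda x: (x[0] - 1.0) ** 2 + (x[3] + 2.0) ** 2
    x_best, f_best = random_embedding_min(f, D=100, d=3)

When $d$ is at least the effective dimensionality, the reduced problem retains a global minimiser with probability one; restricting $y$ to a box makes success probabilistic, which is what the success-rate analysis quantifies.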
joint work with Matt Menickelly, Stefan Wild, Kamil Khan
We present and analyze a manifold sampling method for minimizing $f(x) = h(F(x))$ when $h$ is nonsmooth and nonconvex but has known structure, and $F$ is smooth, expensive to evaluate, and has relatively unknown structure. Manifold sampling seeks to exploit the known structure of $h$ in order to use past evaluations of $F$ efficiently.
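A toy sketch of how knowledge of $h$ enters, assuming the common special case $h(z) = \|z\|_1$ (this is a drastic simplification, not the method analyzed in the talk): the sign pattern of $F(x)$ identifies the smooth manifold of $h$ active at $x$, and composing it with the Jacobian of $F$ yields a generalized gradient. The Jacobian oracle and fixed step size are illustrative.

    import numpy as np

    def manifold_sampling_step(F, jac_F, x, step=1e-2):
        # For h(z) = ||z||_1 the manifolds of h are the sign patterns;
        # sign(F(x)) selects the manifold active at the current point.
        s = np.sign(F(x))
        # Generalized gradient of the composition h(F(.)) at x.
        g = jac_F(x).T @ s
        return x - step * g

    # Toy usage: minimize |x0 - 1| + |x0 * x1| over R^2.
    F = lambda x: np.array([x[0] - 1.0, x[0] * x[1]])
    jac_F = lambda x: np.array([[1.0, 0.0], [x[1], x[0]]])
    x = np.array([3.0, 2.0])
    for _ in range(500):
        x = manifold_sampling_step(F, jac_F, x)

The actual method instead builds smooth models of $F$ from past evaluations and combines generalized gradients from several nearby manifolds within a trust-region framework.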
joint work with Felix Lieder
We discuss a derivative-free solver for nonlinear programs with nonlinear equality and inequality constraints. The algorithm aims at finding a local minimizer using finite differences and sequential quadratic programming (SQP). Since the derivative approximations are expensive compared to the numerical computations that usually dominate the effort of NLP solvers, a comparatively elaborate trust-region SQP subproblem, formulated as a second-order cone program, is solved at each iteration. The public-domain implementation in Matlab/Octave is suitable for small- to medium-sized problems.
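A minimal sketch of the two ingredients (finite-difference derivatives plus a trust-region SQP subproblem posed as a second-order cone program), assuming equality constraints only, an identity Hessian stand-in, and cvxpy as the conic solver; this is an illustration, not the actual implementation.

    import numpy as np
    import cvxpy as cp

    def fd_grad(f, x, h=1e-6):
        # Forward-difference gradient approximation (the expensive part).
        fx, g = f(x), np.zeros_like(x)
        for i in range(x.size):
            e = np.zeros_like(x)
            e[i] = h
            g[i] = (f(x + e) - fx) / h
        return g

    def sqp_tr_step(g, J, c, delta):
        # Trust-region SQP subproblem as a second-order cone program:
        #   min  g^T d + 0.5 ||d||^2   (identity Hessian placeholder)
        #   s.t. c + J d = 0,  ||d||_2 <= delta
        d = cp.Variable(g.size)
        prob = cp.Problem(cp.Minimize(g @ d + 0.5 * cp.sum_squares(d)),
                          [c + J @ d == 0, cp.norm(d, 2) <= delta])
        prob.solve()
        return d.value

    # One step on: min x0^2 + x1^4  s.t.  x0 + x1 - 1 = 0, from x = (1, 1).
    x = np.array([1.0, 1.0])
    f = lambda z: z[0] ** 2 + z[1] ** 4
    c = lambda z: np.array([z[0] + z[1] - 1.0])
    g = fd_grad(f, x)
    J = np.array([fd_grad(lambda z: c(z)[0], x)])
    x = x + sqp_tr_step(g, J, c(x), delta=1.0)

The solver discussed in the talk additionally handles inequality constraints and infeasible linearisations; the point of the conic formulation is that the subproblem cost stays negligible next to the finite-difference evaluations.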