joint work with Layne Watson, Thomas Lux
The Delaunay triangulation is an unstructured simplicial mesh, which is often used to perform piecewise linear interpolation. In this talk, the Delaunay triangulation is presented as a surrogate model for black-box optimization. If the objective function is smooth, then as the mesh is refined the estimated gradient in each simplex is guaranteed to converge to the true gradient, and the proposed algorithm will emulate gradient descent. This algorithm is useful when paired with an effective global optimizer, particularly in situations where function evaluations are expensive and gradient information is unavailable.
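The gradient estimate referred to above can be sketched as follows: within a simplex, the piecewise linear interpolant is the unique affine function through the d+1 vertices, and its (constant) gradient is obtained by solving a small linear system. A minimal NumPy illustration, with a hypothetical test function (not from the talk):

```python
import numpy as np

def simplex_gradient(vertices, values):
    """Gradient of the unique linear interpolant through the
    d+1 vertices of a simplex in R^d (rows of `vertices`)."""
    V = np.asarray(vertices, dtype=float)
    f = np.asarray(values, dtype=float)
    # The affine interpolant satisfies (V_i - V_0) @ g = f_i - f_0.
    A = V[1:] - V[0]
    b = f[1:] - f[0]
    return np.linalg.solve(A, b)

# Illustrative example: f(x, y) = x^2 + y^2, whose true gradient
# at (1, 2) is (2, 4); a small simplex near that point recovers it.
h = 1e-3
verts = np.array([[1.0, 2.0], [1.0 + h, 2.0], [1.0, 2.0 + h]])
vals = np.array([x**2 + y**2 for x, y in verts])
g = simplex_gradient(verts, vals)
print(g)  # close to [2, 4]
```

For a smooth objective, shrinking the simplex drives this estimate to the true gradient, which is the convergence property the abstract invokes.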
joint work with Stefan Ulbrich, Tristan Gally, Peter Groche, Florian Hoppe, Anja Kuttich, Marc Pfetsch, Martin Rakowitsch
In this talk we propose an approach to identify model uncertainty using both parameter estimation and optimal design of experiments. The key idea is to optimize sensor locations and input configurations for the experiments such that the identifiable model parameters can be determined with minimal variance. This enables us to determine confidence regions in which the true parameters, or the parameter estimates from different test sets, must lie. We test the proposed method on examples from load-carrying mechanical structures, and numerical results are presented.
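One classical way to make "minimal variance" concrete is A-optimal design: for a linearized model y = Xθ + noise, the least-squares covariance is proportional to (XᵀX)⁻¹, so one selects the sensor configuration minimizing its trace. The sketch below, with made-up sensitivity data (not the structures from the talk), enumerates candidate sensor subsets:

```python
import itertools
import numpy as np

def a_optimality(X):
    """Trace of (X^T X)^{-1}: the sum of parameter variances
    (up to the noise level) of the least-squares estimate."""
    return np.trace(np.linalg.inv(X.T @ X))

# Hypothetical candidates: row i is the sensitivity of measurement i
# to the two unknown model parameters.
candidates = np.array([
    [1.0, 0.1],
    [0.1, 1.0],
    [1.0, 1.0],
    [0.5, 0.5],
    [1.0, -1.0],
])

# Choose the 3 sensor locations giving the smallest total variance.
best = min(itertools.combinations(range(len(candidates)), 3),
           key=lambda idx: a_optimality(candidates[list(idx)]))
print(best)
```

The resulting covariance matrix also yields the confidence regions mentioned above; in practice the discrete enumeration is replaced by a continuous or mixed-integer design optimization.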
joint work with Adriano Lisboa, Matheus de Oliveira Mendonça
We present line search optimization methods within a mathematical framework based on the simple concept of a v-pattern and its properties, which provides theoretical guarantees that the localizing interval preserves a local optimum no worse than the starting point. The framework can be applied to arbitrary unidimensional functions, including multimodal and infinitely valued ones. Enhanced versions of the golden section and Brent's methods are proposed and analyzed within this framework: both inherit the guarantee of improving local optimality.
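For reference, the baseline that these enhanced variants build on is the classical golden section search, which shrinks a bracketing interval by a factor of 1/φ per iteration. A minimal sketch (the plain method, without the v-pattern machinery of the talk):

```python
import math

def golden_section(f, a, b, tol=1e-8):
    """Classical golden-section search on [a, b]: for a unimodal f,
    the minimizer stays bracketed while the interval shrinks by 1/phi."""
    invphi = (math.sqrt(5) - 1) / 2  # 1/phi, about 0.618
    c, d = b - invphi * (b - a), a + invphi * (b - a)
    fc, fd = f(c), f(d)
    while d - c > tol:
        if fc < fd:
            # Minimum lies in [a, d]; reuse c as the new right probe.
            b, d, fd = d, c, fc
            c = b - invphi * (b - a)
            fc = f(c)
        else:
            # Minimum lies in [c, b]; reuse d as the new left probe.
            a, c, fc = c, d, fd
            d = a + invphi * (b - a)
            fd = f(d)
    return (a + b) / 2

x = golden_section(lambda t: (t - 1.3) ** 2, 0.0, 3.0)
print(x)  # close to 1.3
```

On multimodal or extended-valued functions this plain version carries no guarantee about which local optimum survives in the interval; that is precisely the gap the v-pattern framework addresses.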