An adaptive regularization algorithm using inexact evaluations of the objective function and its derivatives is proposed for the solution of composite nonsmooth nonconvex optimization problems. It is shown that this algorithm needs at most $O(|\log(\epsilon)|\,\epsilon^{-2})$ evaluations of the problem's functions and their derivatives to find an $\epsilon$-approximate first-order stationary point. This complexity bound is within a factor $|\log(\epsilon)|$ of the optimal bound known for smooth and nonsmooth nonconvex minimization with exact evaluations.
joint work with Francois Gallard, Serge Gratton, Benoit Pauwels, Philippe L. Toint
The classical L-BFGS-B algorithm, which minimizes an objective function whose variables must remain within given bounds, is generalized to the case where the algorithm accepts a general preconditioner. We will explain the main ingredients of the preconditioned algorithm, focusing in particular on the efficient computation of the so-called generalized Cauchy point. The algorithm is implemented in Python and applied to the aerodynamic shape design problem, which requires solving several optimization problems, with the aim of reducing the number of expensive objective function evaluations.
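The classical (unpreconditioned) L-BFGS-B algorithm that this talk generalizes is available in SciPy. As a minimal illustrative sketch of the kind of bound-constrained problem it handles (the quadratic objective below is a made-up example, not the aerodynamic application):

```python
import numpy as np
from scipy.optimize import minimize

# Toy bound-constrained problem: minimize ||x - c||^2 subject to 0 <= x <= 1.
c = np.array([2.0, -1.0, 0.5])

def f(x):
    return np.sum((x - c) ** 2)

def grad(x):
    return 2.0 * (x - c)

res = minimize(f, x0=np.zeros(3), jac=grad, method="L-BFGS-B",
               bounds=[(0.0, 1.0)] * 3)
# The minimizer is c clipped onto the box: [1.0, 0.0, 0.5]
print(res.x)
```

The classical method scales its search implicitly through the limited-memory Hessian approximation only; the preconditioned variant described in the talk additionally injects problem-specific curvature information.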
joint work with Henri Calandra, Serge Gratton, Xavier Vasseur
We investigate the use of multilevel methods to solve problems that do not have an underlying geometrical structure. These methods are recursive procedures that exploit a sequence of approximations to the original objective function, defined on spaces of reduced dimension, to build alternative models to the standard Taylor model that are cheaper to minimize. Specifically, we focus on an important class of such problems, those arising in the training of artificial neural networks, and propose a strategy based on algebraic multigrid techniques to build the sequence of coarse problems.