Joint work with Paul Goulart and Nikitas Rontsis.
Eigenvalue perturbation theory is a classical subject in matrix analysis and numerical linear algebra, but it has seemingly had little impact in optimization. This talk presents recent progress in eigenvalue perturbation theory, highlighting its role in optimization. We first derive sharp perturbation and error bounds for Ritz vectors (approximate eigenvectors) and approximate singular vectors, which are used, e.g., in semidefinite programming (SDP), principal component analysis (PCA), and trust-region subproblems. We also present bounds on the accuracy of approximate semidefinite projection, motivated by operator splitting methods in optimization.
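As background for the semidefinite projection discussed above, the following minimal sketch (not the authors' method) shows the standard Frobenius-norm projection of a symmetric matrix onto the positive semidefinite cone via a full eigendecomposition; the talk's bounds concern what happens when this eigendecomposition is computed only approximately. The function name `psd_projection` is illustrative.

```python
import numpy as np

def psd_projection(S):
    """Frobenius-norm projection of a symmetric matrix S onto the PSD cone:
    keep the eigenvectors, clip negative eigenvalues at zero."""
    w, V = np.linalg.eigh(S)
    return (V * np.maximum(w, 0.0)) @ V.T  # V diag(max(w,0)) V^T

# Example: project a random symmetric (generally indefinite) matrix.
rng = np.random.default_rng(0)
B = rng.standard_normal((5, 5))
S = (B + B.T) / 2
P = psd_projection(S)
print(np.linalg.eigvalsh(P).min())  # nonnegative up to rounding error
```

In operator splitting methods for SDP this projection is applied at every iteration, which is why bounds on an *approximate* projection (e.g., from a truncated or inexact eigensolver) matter in practice.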
This talk explores the numerical implications of using variable elimination in nonlinear problems versus introducing intermediate variables, examining the consequences for feasibility, formula size, and the size of the automatic-differentiation stacks. For bounded cases where elimination is not applicable, the talk presents cascading, a technique originating in classical refinery optimization models but recently applied successfully to financial problems. Methods for detecting elimination candidates and cascading structures will also be presented.
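The formula-size trade-off mentioned above can be illustrated with a toy expression tree (a hypothetical sketch, not the talk's detection machinery): substituting a definition for an intermediate variable duplicates its subtree at every use site, so the eliminated formula can be much larger than the two pieces kept separately.

```python
def size(expr):
    """Count nodes in a nested-tuple expression tree, e.g. ('+', 'x', 'y')."""
    if isinstance(expr, tuple):
        return 1 + sum(size(a) for a in expr[1:])
    return 1

def subst(expr, name, repl):
    """Replace every occurrence of variable `name` by the subtree `repl`."""
    if expr == name:
        return repl
    if isinstance(expr, tuple):
        return (expr[0],) + tuple(subst(a, name, repl) for a in expr[1:])
    return expr

# Intermediate variable: y = x*x + sin(x), used five times in f.
g = ('+', ('*', 'x', 'x'), ('sin', 'x'))
f_interm = ('+', ('*', 'y', ('*', 'y', 'y')), ('*', 'y', 'y'))  # y^3 + y^2

f_elim = subst(f_interm, 'y', g)  # eliminate y by substitution
print(size(f_interm), size(g), size(f_elim))  # 9 6 34
```

With the intermediate variable the model stores 9 + 6 = 15 nodes; after elimination it stores 34, since the size-6 subtree for y is copied into each of the five use sites. The same duplication inflates automatic-differentiation stacks, which is one reason elimination is not always a win.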
In this talk we consider the effective numerical solution of PDE-constrained optimization problems with additional box constraints on the state and control variables. Upon discretization, a sensible solution strategy is to apply an interior point method, provided one can solve the large and structured matrix systems that arise at each Newton step. We therefore consider fast and robust preconditioned iterative methods for these systems, examining two cases: (i) with L2 norms arising within the cost functional; (ii) with an additional L1 norm term promoting sparsity in the control variable.
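To give a flavor of the preconditioned iterative methods mentioned above, here is a generic sketch of a preconditioned conjugate gradient solve on a simple SPD model problem (a 1D Laplacian with a Jacobi preconditioner). This is purely illustrative: the interior point systems in the talk are larger saddle-point systems, which call for indefinite solvers such as MINRES with block preconditioners rather than this toy setup.

```python
import numpy as np

def pcg(A, b, M_inv, tol=1e-10, maxiter=500):
    """Preconditioned conjugate gradients for SPD A.
    M_inv is a callable applying the inverse of the preconditioner."""
    x = np.zeros_like(b)
    r = b - A @ x
    z = M_inv(r)
    p = z.copy()
    rz = r @ z
    for _ in range(maxiter):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol:
            break
        z = M_inv(r)
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x

# Model problem: 1D discrete Laplacian (SPD, tridiagonal).
n = 100
A = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
b = np.ones(n)
d = np.diag(A)
x = pcg(A, b, lambda r: r / d)  # Jacobi (diagonal) preconditioning
print(np.linalg.norm(A @ x - b))  # residual norm, small at convergence
```

The point of the talk's preconditioners is to make the iteration count of such Krylov solvers robust to mesh size and regularization parameters, which a naive diagonal preconditioner does not achieve for the saddle-point systems arising at each Newton step.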